convert performance large matrix

jmagyari
Posts: 22
Joined: 2010-01-19T18:09:32-07:00

convert performance large matrix

Post by jmagyari »

OS: Windows XP, 2 GB memory, Pentium 4 2.56 GHz processor.

I have 1,000,000 PNG images (~60 KB per image) that I would like to place in a 1000x1000 matrix to form one large TIFF image.
I would also like to create smaller subsets of the above tiles (e.g. 500x500) as single TIFFs as well.

1) convert (1000 png images) -append columname.mpc <------- this goes pretty fast to get 1000 columns (I assumed that using the MPC format might help the next step)

Now taking those 1000 columns and placing them next to each other takes forever.

2) convert (1000 mpc columns) +append mybig.tif <- after 8 hours I aborted the process and have been trying different combinations to see if I can speed things up.

Question
A) Are there arguments that would help step 2 combine these columns quickly?

B) Should I convert to a different format in step 1 (TIFF, PNG, or another lossless format) to help speed up combining the columns in step 2?
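
For what it's worth, the one-step montage form of the same job would look something like this (the file names and tile layout are just placeholders); I don't know whether it avoids whatever is slowing step 2 down:

montage tile_*.png -mode concatenate -tile 1000x1000 mybig.tif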
fmw42
Posts: 25562
Joined: 2007-07-02T17:14:51-07:00
Location: Sunnyvale, California, USA

Re: convert performance large matrix

Post by fmw42 »

I know little about this. But I suspect you are running out of memory and thrashing to disk. See the following about processing large images:

http://www.imagemagick.org/Usage/files/#massive
http://www.imagemagick.org/script/comma ... .php#limit
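
For example, the limits described on that page can be set on the command itself; the values below are only rough guesses for a 2 GB machine and the file names are placeholders, not a tested recommendation:

convert -limit memory 512MB -limit map 1GB -limit disk 80GB column_*.mpc +append mybig.tif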
snibgo
Posts: 12159
Joined: 2010-01-23T23:01:33-07:00
Location: England, UK

Re: convert performance large matrix

Post by snibgo »

1,000,000 x 60kB = 60 GB.

And that's the compressed size, but I gather IM stores the image uncompressed in memory. I don't think Windows XP allows you to have 60 GB of virtual memory.

So I think you are asking the impossible.

Some other software might work line-by-line, without storing the whole image in RAM. Creating one line of 1000 images with IM, then the next line, and eventually joining the lines together (with some other software) to make the 60 GB file might work; a rough sketch of that idea follows.
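
Here is a hedged sketch of the joining step, assuming the row strips already exist as row_0001.png ... row_1000.png, all the same width and 8-bit RGB with no alpha; W and H are placeholder pixel dimensions of the finished mosaic, and it is written as a Unix-style shell loop for brevity. The idea is that a binary PPM is just a small text header followed by raw rows, so each strip can be dumped as raw RGB and concatenated:

W=10000    # placeholder: total mosaic width in pixels
H=10000    # placeholder: total mosaic height in pixels (sum of all strip heights)

printf 'P6\n%d %d\n255\n' "$W" "$H" > mosaic.ppm

for r in $(seq -w 1 1000); do
  # append each strip as raw 8-bit RGB, top strip first
  convert "row_$r.png" -depth 8 rgb:- >> mosaic.ppm
done

The result is a PPM rather than a TIFF, and turning that 60 GB PPM into a TIFF would still need a tool that works in strips, so this only solves half the problem.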
snibgo
Posts: 12159
Joined: 2010-01-23T23:01:33-07:00
Location: England, UK

Re: convert performance large matrix

Post by snibgo »

Sorry for the quick semi-correction. I'm a newbie.

The architecture document (http://www.imagemagick.org/script/architecture.php) is worth reading. If there's not enough memory, IM will store pixels on disk.
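
You can see the limits that apply on your machine with:

identify -list resource

and, if the pixel cache is going to spill to disk anyway, MAGICK_TMPDIR can point it at a drive with plenty of free space (the path below is only an example):

set MAGICK_TMPDIR=D:\im_scratch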

I tried this out by creating a one billion pixel file:

convert -size 1000x1000000 xc:red x.png

It created a temporary file in %TEMP% of 8,000,000,000 bytes, then spent 30 minutes with the disk light permanently on, and my poor old 4GB dual-core laptop wouldn't respond to Ctrl-Alt-Del or anything.

The arch doc suggests IM is better at accessing rows than columns. It seems one can write a program with ReadStream/WriteStream that might not try to store the entire image in memory.
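
For what it's worth, the command-line "stream" utility sits on top of that streaming interface. Something like the following (file names are placeholders) copies raw 8-bit RGB pixels out of a large image sequentially, and for formats that can be read row by row it should not need the whole image in memory:

stream -map rgb -storage-type char mybig.tif mybig.rgb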
snibgo's IM pages: im.snibgo.com