Posted: 2006-05-12T11:04:48-07:00
by magick
Use the tiffinfo program to identify large TIFF images.
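For example, something like this prints the image dimensions, bits per sample, and compression straight from the TIFF header without decoding the pixels (the filename is just a placeholder):

  tiffinfo image.tif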

Posted: 2006-05-12T12:26:43-07:00
by magick
Ok, ImageMagick can be slow for large images, so another program may be a better choice for your particular workflow. For details, see http://www.imagemagick.org/script/architecture.php.
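If you stay with ImageMagick, one option worth trying is -limit, which caps the pixel cache resources so a huge image spills to disk rather than exhausting memory. A sketch (the values are illustrative only; check the documentation for the units your release expects):

  convert -limit memory 256 -limit map 512 large.tif large.png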

Posted: 2006-05-12T13:48:43-07:00
by magick
We converted your image to PNG in 17 seconds on our Red Hat Opteron system. It took 11 seconds with the Q8 version of ImageMagick 6.2.7-5, the current release. The default build is Q16 (16 bits per quantum, i.e. per color channel).
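To reproduce the measurement, something along these lines works (filenames are placeholders):

  time convert large.tif large.png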

Posted: 2006-05-12T17:09:20-07:00
by magick
ImageMagick seems to perform better on 64-bit architectures. We have a dual-processor Opteron 844 system with 2 GB of memory.

We received a 12000x16814 TIFF image from the ImageMagick mailing list.

Version 4 of ImageMagick used run-length encoding, which turned out to be efficient for images with runs of similar colors. This reduced the memory requirements of the image, but for version 5 we developed the pixel cache because RLE encoding did not lend itself to the access requirements of the thousands of algorithms within ImageMagick.

You can set the image depth at run time but not the internal quantum depth. However, we have seen installations that keep both the Q8 and Q16 versions of ImageMagick. ImageMagick puts the libraries in separate paths (e.g. lib/ImageMagick-6.2.7-Q16 and lib/ImageMagick-6.2.7-Q8), and you can give the programs different names (e.g. convert8 and convert16).
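A rough sketch of such a side-by-side build from source (the install paths and names here are illustrative, not a prescription):

  ./configure --with-quantum-depth=8
  make
  make install
  mv /usr/local/bin/convert /usr/local/bin/convert8

At run time, a Q16 build can still write 8-bit output with the -depth option:

  convert -depth 8 image.tif image.png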

You could try Script-Fu from GIMP for batch image processing. GIMP uses tile-based access to the image, so it should be memory efficient.