www.PANCROMA.com

Pan Sharpening Huge Satellite Image Files using the Windows O/S

Pan sharpening full size Landsat data sets can be a challenge. The three band files plus the panchromatic image require around 600MB of storage space. Processing the images takes considerably more storage space than this. A pan sharpened Landsat image is around 800MB in size. The storage requirements for processing satellite imagery like Landsat can easily exceed the capabilities of a Windows operating system unless the programmer is very mindful of the storage expense. Problems can occur even with a lot of effort to avoid them.

There are a lot of misconceptions about computer memory. One line of erroneous thinking says that you can increase processing capability indefinitely by adding more and more RAM to your computer. Another misguided notion is that virtual memory is limited only by hard disk space, thanks to the memory paging capability of the operating system.

By definition, a 32-bit processor uses 32 bits to refer to the address of each memory location (byte). 2^32 = 4,294,967,296, which means a memory address that is 32 bits long can only refer to about 4.3 billion unique locations (i.e. 4GB). In the 32-bit Windows world, each application has its own virtual 4GB memory space.
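A minimal C++ sketch (illustrative only, not from any particular application) makes the arithmetic concrete: a 32-bit build reports 4-byte pointers, and shifting 1 left by 32 counts every byte address such a pointer can form.

    #include <cstdio>
    #include <cstdint>

    int main()
    {
        // On a 32-bit build this prints 4: pointers are 32 bits wide.
        std::printf("pointer size: %u bytes\n", (unsigned)sizeof(void*));

        // 2^32 distinct byte addresses: the entire 4GB space.
        std::uint64_t addresses = 1ULL << 32;
        std::printf("addressable bytes: %llu (about 4GB)\n",
                    (unsigned long long)addresses);
        return 0;
    }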

This 4GB space is (normally) evenly divided into two parts, with 2GB dedicated to kernel usage (i.e. the operating system, which all processes have to share) and 2GB left for application usage. So the reality is that the amount of addressable memory is limited by the address space, not by the amount of RAM or the virtual storage space, and the maximum amount of addressable storage for any process is 2GB.
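You can watch this ceiling directly. The hypothetical probe below (a sketch; the 100MB chunk size is arbitrary) keeps requesting blocks until the allocator refuses. On a 32-bit Windows process it typically gives out near the 2GB mark no matter how much RAM is installed, and often sooner because of address space fragmentation.

    #include <cstdio>
    #include <cstdlib>
    #include <vector>

    int main()
    {
        const std::size_t chunk = 100 * 1024 * 1024;   // 100MB per request
        std::vector<void*> blocks;
        std::size_t totalMB = 0;

        // Keep allocating until the process address space is exhausted.
        while (void* p = std::malloc(chunk))
        {
            blocks.push_back(p);
            totalMB += 100;
        }
        std::printf("allocation failed after about %u MB\n", (unsigned)totalMB);

        for (std::size_t i = 0; i < blocks.size(); ++i)   // clean up
            std::free(blocks[i]);
        return 0;
    }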

It is easy to see how problems might arise when processing large satellite images. Pan sharpening a full Landsat data set requires 600MB of input files and another 1000MB of intermediate files, and produces an 800MB output file. Just the data files alone add up to 2400MB, already past the 2GB limit before counting the program's own code, heap and bitmap objects. When processing such large files the programmer must have a high degree of memory "situational awareness" and in fact must employ a host of techniques to avoid running out of memory.

Sharing the OS kernel among processes can be a problem as well. Applications that process huge data sets can tie up the thread that services the message queue, starving other tasks of any attention at all, including the GUI window of the application itself. The result is a "frozen" GUI window that stops repainting and no longer responds to user commands. Special programming techniques are required to address this problem.
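One common remedy (sketched below with standard C++ threads; a Win32 application might instead use CreateThread or post messages back to the window, and the function names here are hypothetical) is to move the heavy processing onto a worker thread so the GUI thread remains free to service its message queue.

    #include <atomic>
    #include <thread>

    std::atomic<bool> jobDone(false);

    // Hypothetical long-running job, e.g. pan sharpening a full scene.
    void panSharpenScene()
    {
        // ... heavy per-pixel processing ...
        jobDone = true;
    }

    // Hypothetical GUI event handler.
    void onStartButtonClicked()
    {
        // Run the job off the GUI thread; the window keeps repainting
        // and responding to the user while the worker does the work.
        std::thread worker(panSharpenScene);
        worker.detach();

        // The GUI can poll jobDone from a timer, or the worker can
        // notify the window when it finishes.
    }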

Let's take a look at a real example: loading an 800MB pan sharpened image and displaying it on the monitor. One approach is as follows: read the file from disk into a memory buffer, consuming 800MB of storage. Copy the image into the image bitmap object, consuming another 800MB. Delete the memory buffer. The net effect is 800MB of storage required; however, during processing twice that amount is demanded, i.e. 1.6GB. This is how many graphics applications load image data, because it is the fastest way to load image files of "normal" size.
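In sketch form (C++ with a stand-in Bitmap type and no real file format parsing; illustrative only), the naive load looks like this. Note the window in step 2 where both full copies are alive at once.

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Bitmap { std::vector<unsigned char> pixels; };  // stand-in type

    Bitmap loadWholeFile(const char* path, std::size_t imageBytes)
    {
        Bitmap bmp;

        // 1. Read the entire file into a buffer: imageBytes of RAM.
        std::vector<unsigned char> buffer(imageBytes);
        FILE* fp = std::fopen(path, "rb");
        if (!fp) return bmp;
        std::fread(&buffer[0], 1, imageBytes, fp);
        std::fclose(fp);

        // 2. Copy into the bitmap: a second imageBytes of RAM.
        //    For an 800MB image, peak demand is now 1.6GB.
        bmp.pixels = buffer;

        // 3. The buffer is released when it goes out of scope, but
        //    the 1.6GB peak has already occurred.
        return bmp;
    }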

A better approach when dealing with very large files might go like this: read one row of the image file from disk into a memory buffer. Write the buffer to the image bitmap object. Go back and get the second row, loading it into the buffer and overwriting the first row. Write that to the bitmap. Repeat until the entire image is in the image bitmap object. The net is still 800MB of storage required, but peak storage is only 800MB plus one row, essentially half that required by the naive approach.
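The same idea in sketch form (again with the stand-in Bitmap type, and with hypothetical row dimensions supplied by the caller): only a single row is ever buffered outside the bitmap itself.

    #include <algorithm>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Bitmap { std::vector<unsigned char> pixels; };  // stand-in type

    Bitmap loadByRows(const char* path, std::size_t rowBytes, std::size_t rows)
    {
        Bitmap bmp;
        bmp.pixels.resize(rowBytes * rows);        // the one big allocation

        std::vector<unsigned char> row(rowBytes);  // a single row buffer
        FILE* fp = std::fopen(path, "rb");
        if (!fp) return bmp;

        for (std::size_t r = 0; r < rows; ++r)
        {
            // Read row r from disk, overwriting the previous row...
            std::fread(&row[0], 1, rowBytes, fp);
            // ...then copy it straight into its final place in the bitmap.
            std::copy(row.begin(), row.end(),
                      bmp.pixels.begin() + r * rowBytes);
        }
        std::fclose(fp);

        // Peak demand: one full bitmap plus one row, not two full images.
        return bmp;
    }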

You might think that compression is the answer. It is true that the JPEG and PNG compression algorithms can reduce a Landsat pan sharpened file from 800MB to around 200MB. The problem is that in order for an image processing application to use a compressed image, it must inflate it back to its original size: a 200MB PNG holding a 16,621 by 14,521 pixel RGB image still expands to 16,621 x 14,521 x 3 bytes, about 724MB, once decoded. As a result, compression only decreases the amount of disk space required to archive the file, not the addressable memory required to process it.

Other factors also affect the ability of a computer to process large image files. My tests on a variety of computers show some remarkable differences in their ability (or not) to process graphics files around 800MB in size, with some very capable computers with lots of installed RAM unable to process files that very limited computers are able to handle (albeit slowly). In an extreme case, a Dell Optiplex DX620 running Windows XP Professional, with 2048MB of installed memory and a 3.8GHz Intel Pentium 4 processor, was unable to load an 812MB file into Paintshop Pro or Windows Paint. An HP Pavilion ze1115 with 256MB of memory, also running XP at 1.1GHz, was able to load the same file into both applications with no problem (except the glacial speed).

One reason in this case was the large amount of RAM consumed by memory-resident programs on the more capable computer. Firewall and virus screening programs can be particularly insidious, consuming large quantities of RAM needed by applications processing satellite data. Another major culprit is the modern operating system. Operating systems have become extremely large and complex and are designed to utilize as much RAM as they possibly can. This lets them process a large number of smaller tasks very quickly, to the detriment of applications that want a lot of RAM themselves.

Another culprit is the graphics file library. Library utilities designed to read and write graphics files such as TIFF, JPEG and PNG operate with varying degrees of memory overhead. The libPNG library is relatively efficient, allowing the developer to configure it for efficient memory buffering. JPEGlib, the common JPEG read/write library, is on the other hand inflexible and expensive from a RAM utilization perspective. The PANCROMA GeoTiff and BMP read/write utilities are the most memory efficient of all, as they were built from the ground up for maximum memory efficiency, trading a bit of operational speed for it.
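With libPNG, for example, the row-at-a-time pattern described above is directly supported through png_read_row. The condensed sketch below omits the setjmp error handling, interlace handling and transform setup that production libPNG code requires:

    #include <png.h>
    #include <cstdio>
    #include <cstdlib>

    void readPngByRows(const char* path)
    {
        FILE* fp = std::fopen(path, "rb");
        png_structp png = png_create_read_struct(PNG_LIBPNG_VER_STRING,
                                                 NULL, NULL, NULL);
        png_infop info = png_create_info_struct(png);
        png_init_io(png, fp);
        png_read_info(png, info);

        png_uint_32 height = png_get_image_height(png, info);
        size_t rowBytes = png_get_rowbytes(png, info);

        // One row buffer serves the whole image: memory use stays flat
        // no matter how large the decoded picture is.
        png_bytep row = (png_bytep)std::malloc(rowBytes);
        for (png_uint_32 y = 0; y < height; ++y)
        {
            png_read_row(png, row, NULL);
            // ... hand the row to the consumer, e.g. a bitmap ...
        }

        std::free(row);
        png_destroy_read_struct(&png, &info, NULL);
        std::fclose(fp);
    }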

A few simple tests can gauge the ability of your own computer to handle large files. If you have a graphics program that allows you to resize an image, load a small 24-bit RGB file into the application. Try resizing the file to 16000 by 14500 pixels; that is 16000 x 14500 x 3 = 696 million bytes of uncompressed image data, roughly the size of a pan sharpened Landsat scene. If the application protests, your computer may have some issues with handling full size Landsat pan sharpened files. If it resizes successfully, try saving it to disk in an uncompressed format like BMP and then try to reload it. This is another stress test that may identify issues.

The PANCROMA satellite image processing utility has lightweight TIFF, PNG and BMP readers and writers, so you should always be able to create, save and reload your images. However, if you plan, for example, to create images in PANCROMA and then import them into other applications, make sure your computer has the ability to handle your processed files. I have attached a compressed PNG that will expand to a 16,621 by 14,521 pixel image if you wish to experiment.

[750MB Test File]
Web site and all contents © Copyright TERRAINMAP Earth Imaging LLC 2010, All rights reserved.