
ASU camera creates stunning mosaic of moon's polar region


March 18, 2014

Today, the Lunar Reconnaissance Orbiter Camera (LROC) team, based at Arizona State University and led by Professor Mark Robinson, released what may well be the largest image mosaic available on the web. The map offers a complete picture of the moon’s northern polar region in stunning detail.

On December 11, 2011, after two and a half years in a near-circular polar orbit, NASA’s Lunar Reconnaissance Orbiter (LRO) entered an elliptical polar orbit, with the periapsis (the point where LRO is closest to the surface) near the south pole and the apoapsis (the point where LRO is farthest from the surface) near the north pole of the moon. The increased altitude over the northern hemisphere allows the two narrow angle cameras and the wide angle camera to capture more terrain in each image acquired there.

The resulting LROC northern polar mosaic comprises 10,581 narrow angle camera images, collected over four years, and covers latitudes from 60 degrees north to the pole.

The LROC team produced its first mosaic of the moon’s northern polar region in the fall of 2010, but the new mosaic far surpasses it, with 50 times higher resolution and over 680 gigapixels of valid image data covering a region of the moon slightly larger than the combined area of Alaska and Texas, at a resolution of 2 meters per pixel.
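For readers who want to check those figures, a few lines of arithmetic tie the pixel count, the pixel scale, and the quoted coverage together (the Alaska and Texas areas below are approximate values not given in the article):

```python
# Rough cross-check of the mosaic's quoted coverage (illustrative figures only).
pixels = 680e9                # "over 680 gigapixels" of valid image data
pixel_scale_m = 2.0           # 2 meters per pixel

area_km2 = pixels * pixel_scale_m**2 / 1e6   # m^2 -> km^2
alaska_km2 = 1.72e6           # approximate land areas, not from the article
texas_km2 = 0.70e6

print(f"mosaic coverage: {area_km2 / 1e6:.2f} million km^2")                 # ~2.72
print(f"Alaska + Texas:  {(alaska_km2 + texas_km2) / 1e6:.2f} million km^2") # ~2.42
```

The mosaic's valid coverage does indeed come out slightly larger than the two states combined.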

To create the mosaic, each LROC narrow angle camera image was map projected onto a 30-meter-per-pixel digital terrain model derived from Lunar Orbiter Laser Altimeter data, using a software package written by the United States Geological Survey called Integrated Software for Imagers and Spectrometers (ISIS).
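As a rough illustration of what that processing looks like for a single image, the sketch below strings together standard ISIS applications; the file names and map template are placeholders, and the exact parameters and steps the LROC team used are not described in the article:

```python
# Sketch of a single-image ISIS workflow (file names and the map template are
# placeholders; this is not the LROC team's actual pipeline).
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Ingest a raw NAC image into ISIS cube format.
run(["lronac2isis", "from=NAC_IMAGE.IMG", "to=nac.cub"])

# Attach SPICE geometry (spacecraft position and pointing) to the cube.
run(["spiceinit", "from=nac.cub"])

# Radiometric calibration of the NAC image.
run(["lronaccal", "from=nac.cub", "to=nac_cal.cub"])

# Map-project the calibrated image onto a polar map grid defined in a map
# template; the article says the projection was done on a LOLA-derived
# terrain model, whose configuration is omitted here.
run(["cam2map", "from=nac_cal.cub", "to=nac_proj.cub", "map=polar.map"])
```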

The northern polar mosaic was assembled from individual “collar” mosaics. Each collar mosaic was acquired by imaging the same latitude once every two-hour orbit for a month, during which time the rotation of the moon steadily brought every longitude into view. Each collar mosaic has very similar lighting from start to end, and covers 1 to 3 degrees of latitude.
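The numbers behind that cadence work out neatly; taking a nominal two-hour orbit and the moon's sidereal rotation period of roughly 27.3 days (the latter is not stated in the article):

```python
# How a month of two-hour orbits sweeps every longitude under the camera.
sidereal_month_hours = 27.32 * 24   # moon's rotation period, ~27.32 days (not from the article)
orbit_period_hours = 2.0            # nominal LRO orbit period quoted in the article

deg_per_orbit = 360.0 * orbit_period_hours / sidereal_month_hours
orbits_per_month = sidereal_month_hours / orbit_period_hours

print(f"moon rotates ~{deg_per_orbit:.2f} deg of longitude per orbit")  # ~1.1 deg
print(f"~{orbits_per_month:.0f} orbits bring all 360 deg into view")    # ~328 orbits
```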

The mosaic was originally assembled as 841 large tiles because of the sheer volume of data involved. Had the mosaic been processed as a single file, it would have been approximately 3.3 terabytes in size. Part of the large size is due to the incredible dynamic range of the narrow angle cameras: the raw images are recorded as 12-bit data (4,096 gray levels), then processed to normalized reflectance (a quantitative measure of the percentage of light reflected from each spot on the ground).
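A couple of those figures are easy to sanity-check (the terabyte here is taken as 10^12 bytes, which the article does not specify):

```python
# Quick arithmetic on the tiling and dynamic-range figures above.
single_file_bytes = 3.3e12   # "approximately 3.3 terabytes" (assuming 1 TB = 1e12 bytes)
large_tiles = 841

print(f"average large-tile size: ~{single_file_bytes / large_tiles / 1e9:.1f} GB")  # ~3.9 GB

raw_bits = 12
print(f"gray levels in {raw_bits}-bit raw data: {2**raw_bits}")  # 4096
```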

To preserve the subtle shading gradations of the raw images during processing, the narrow angle camera images are stored as 32-bit floating-point values (millions of gray levels). The 32-bit values take up four times the disk space of the finalized 8-bit (256 gray levels) representation most computers use to display grayscale images. Converting from 32-bit to 8-bit pixels results in saturation (groups of pixels all sharing the maximum value of 255) in the brightest areas.
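A minimal NumPy sketch shows the kind of 32-bit to 8-bit conversion being described; the display stretch of 0.0 to 1.0 is an assumption for illustration, not the stretch the team actually used:

```python
import numpy as np

# Minimal sketch of a float32 -> uint8 conversion with saturation.
rng = np.random.default_rng(0)
reflectance = rng.uniform(0.0, 1.2, size=(4, 4)).astype(np.float32)  # simulated values

scaled = reflectance / 1.0 * 255.0                       # map the assumed 0.0-1.0 range onto 0-255
display = np.clip(np.rint(scaled), 0, 255).astype(np.uint8)

# Pixels brighter than the top of the stretch all collapse to 255 -- the
# "saturation" in the brightest areas that the article describes.
print(display)
print("saturated pixels:", int((display == 255).sum()))
print("bytes per pixel: float32 =", reflectance.itemsize, ", uint8 =", display.itemsize)
```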

Even with the conversion, the compressed JPEG images that make up the final product take up almost a terabyte of disk space.

In total, 17,641,035 small tiles went into producing the final product.
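Dividing the roughly one terabyte of compressed JPEG data by that tile count gives a feel for the size of each tile (both figures are approximate, and the terabyte is again taken as 10^12 bytes):

```python
# Average size of the small JPEG tiles that make up the web product.
jpeg_bytes = 1e12            # "almost a terabyte" of compressed JPEG images
small_tiles = 17_641_035

print(f"average tile: ~{jpeg_bytes / small_tiles / 1e3:.0f} KB")  # roughly 57 KB
```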

“The (northern polar mosaic) is another example of LRO observations paving the way for science discoveries and future missions of exploration. Creation of this giant mosaic took four years and a huge team effort across the LRO project. We now have a nearly uniform map to unravel key science questions and find the best landing spots for future exploration,” says Robinson, a professor in the School of Earth and Space Exploration in ASU’s College of Liberal Arts and Sciences.