At the University of California, Berkeley (www.berkeley.edu), researchers have developed a portable laser backpack capable of automatically producing realistic 3-D maps of building interiors. The backpack incorporates an InterSense InertiaCube inertial measurement unit to assist with indoor, GPS-denied image mapping.
Funded by the Air Force Office of Scientific Research (Arlington, VA, USA; www.wpafb.af.mil) and the Army Research Office (Adelphi, MD, USA; www.aro.army.mil), Avideh Zakhor and her colleagues have also developed sensor fusion algorithms that use cameras, laser range finders, and inertial measurement units to generate textured, photorealistic 3-D models without requiring GPS.
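The article does not describe the fusion algorithms themselves, but the core geometric step in any such pipeline is applying the estimated orientation and position to sensor-frame measurements. The sketch below is a minimal, hypothetical illustration (not the Berkeley team's code): it builds a rotation matrix from roll, pitch, and yaw estimates and uses it to map a sensor-frame point into the world frame.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from Euler angles in radians,
    using the common Z-Y-X (yaw-pitch-roll) convention:
    R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def transform_point(R, t, p):
    """Map a sensor-frame point p into the world frame: R @ p + t."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))
```

For example, a 90-degree yaw applied to the point (1, 0, 0) with zero translation yields (0, 1, 0), which is how a pose estimate georeferences each laser return as the operator walks through a building.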
To generate these models in a realistic manner, image data from the cameras and laser range scanners must be fused with positional and temporal information. To capture image data, the backpack is equipped with three 1.3-Mpixel FireWire cameras from Point Grey Research (Richmond, BC, Canada; www.ptgrey.com). By mounting these cameras orthogonally, a 3-D image of the scene can be captured as the user traverses a building.
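Fusing image data with positional and temporal information implies that each camera frame must be matched to the pose estimate nearest to it in time, since the cameras, scanners, and IMU sample at different rates. A minimal sketch of that association step, under the assumption of timestamped pose samples sorted in ascending order (the function and variable names here are illustrative, not from the Berkeley system):

```python
import bisect

def nearest_pose(pose_times, poses, image_time):
    """Return the pose whose timestamp is closest to image_time.
    pose_times must be sorted ascending; poses is a parallel list."""
    i = bisect.bisect_left(pose_times, image_time)
    if i == 0:
        return poses[0]
    if i == len(pose_times):
        return poses[-1]
    before, after = pose_times[i - 1], pose_times[i]
    # Pick whichever neighbouring sample is closer in time.
    return poses[i] if after - image_time < image_time - before else poses[i - 1]
```

A production pipeline would typically interpolate between the two bracketing poses rather than snap to the nearest one, but the binary-search lookup is the same.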
Because these cameras provide only visual information about the scene, depth information must be gathered separately. To accomplish this, three 2-D laser scanners are also mounted orthogonally to provide 3-D point-cloud data maps by computing the pitch, yaw, and roll of the backpack. While two vertically mounted 10-Hz URG-04LX 2-D laser scanners from Hokuyo (Osaka, Japan; www.hokuyo-aut.jp) capture pitch and roll data at a 4-m range over a field of view of 240°, a horizontally mounted 40-Hz Hokuyo UTM-30LX 2-D scanner with a 30-m range and a field of view of 270° captures yaw information by applying scan-matching algorithms.
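Scan matching recovers yaw by finding the rotation that best aligns one horizontal laser scan with the previous one. The sketch below is a deliberately simplified, hypothetical version of that idea: it converts a polar scan to Cartesian points, then brute-force searches candidate yaw offsets for the one minimizing a nearest-neighbour alignment cost. Real systems use far more efficient methods such as ICP and also estimate translation; this only illustrates the rotational principle.

```python
import math

def scan_to_points(ranges, fov_deg=270.0):
    """Convert a polar laser scan (equally spaced range readings across
    fov_deg, e.g. 270 degrees for the UTM-30LX) to Cartesian (x, y) points."""
    n = len(ranges)
    half = math.radians(fov_deg) / 2.0
    step = math.radians(fov_deg) / (n - 1)
    return [(r * math.cos(-half + k * step), r * math.sin(-half + k * step))
            for k, r in enumerate(ranges)]

def estimate_yaw(prev_pts, curr_pts, search_deg=10.0, step_deg=0.5):
    """Brute-force search for the yaw offset (degrees) that best aligns
    curr_pts onto prev_pts, scored by summed nearest-neighbour distance."""
    best, best_cost = 0.0, float("inf")
    d = -search_deg
    while d <= search_deg:
        a = math.radians(d)
        c, s = math.cos(a), math.sin(a)
        cost = 0.0
        for x, y in curr_pts:
            rx, ry = c * x - s * y, s * x + c * y  # rotate candidate scan
            cost += min((rx - px) ** 2 + (ry - py) ** 2
                        for px, py in prev_pts)
        if cost < best_cost:
            best, best_cost = d, cost
        d += step_deg
    return best
```

Rotating a scan by 3° and feeding both copies to `estimate_yaw` recovers an offset of about -3°, i.e. the backpack's incremental yaw between the two scans.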