CSIRO: Moving Mobile Mapping Indoors


Imagine a scenario in which you arrive at a client’s site for the first time. Within five minutes you have your gear unpacked and ready to scan. The client shows you around offices and meeting rooms, up and down stairways to access multiple floors, through their warehouse and unstructured outdoor areas. After forty minutes, you arrive in the cafeteria, where you instantly present a high-resolution 3D model of the site on your tablet to your guide over a cup of coffee. You hand over a USB stick with the data, shake hands, and head over to an industrial site across town for your next data acquisition session. Rather than spending days with a team of people moving heavy equipment around a building, a 3D point cloud is now generated in the same time it takes a single person to walk through a building.

How is this scenario possible without the use of high-grade positioning from GPS/INS or the time-consuming setup and surveying of target markers? While there has been rapid progress in the development of commercial outdoor mobile mapping solutions, the transition to indoor environments has proven to be a far more challenging problem. Central to the development of mobile mapping solutions is the ability to determine the trajectory of the LiDAR sensor in order to project range measurements into a common coordinate frame. In outdoor systems, this trajectory can be determined directly from GPS/GNSS measurements, and short periods of poor or absent GPS/GNSS coverage can generally be bridged by using local inertial measurements to estimate location.

In indoor environments, however, there is typically little or no positioning information available. While technologies for indoor localization using RF or radar are emerging, they typically have fairly poor precision (>1m error) and require significant additional infrastructure to be installed within buildings. Likewise, while inertial measurements are effective for estimating position over short time periods, their error drift over longer durations makes them an unsatisfactory means of obtaining accurate trajectory information indoors.

Researchers at the CSIRO ICT Centre in Australia have been tackling this very problem of achieving accurate 3D mapping in environments where GPS is unreliable or unavailable over extended time periods. The challenge is not only to derive maps without the aid of additional external infrastructure for localization, but also to acquire the required measurements with a hardware platform that is lightweight and portable, enabling a user to map while walking through any indoor environment.

The CSIRO solution that enables this vision is a handheld 3D mapping system called Zebedee. Zebedee consists of a lightweight LiDAR scanner with 30m (100ft) maximum range and an industrial-grade MEMS inertial measurement unit (IMU) mounted on a simple spring mechanism (Figures 1a and 1b). As an operator holding the device moves through the environment, the scanner loosely oscillates on the spring, producing a rotation that converts the LiDAR’s inherent 2D scanning plane into a local 3D field of view. With the use of proprietary software, the range measurements can be projected into a common coordinate frame to generate an accurate 3D point cloud in real time. To achieve this result, the software must estimate the six-degree-of-freedom (6DoF) trajectory of the scanner on top of the spring solely from the available range and inertial data.
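To illustrate the projection step conceptually, the following Python sketch maps a single 2D range/bearing return into a world coordinate frame once a 6DoF pose (a rotation and a translation) has been estimated for the moment the measurement was taken. The function name, pose values, and scan geometry here are illustrative assumptions, not the interface of the actual Zebedee software.

import numpy as np

def scan_point_to_world(range_m, bearing_rad, R_world_sensor, t_world_sensor):
    # range_m: measured distance to the surface (meters)
    # bearing_rad: beam angle within the scanner's 2D scan plane
    # R_world_sensor: 3x3 rotation of the sensor in the world frame
    # t_world_sensor: 3-vector position of the sensor in the world frame
    # The raw return lies in the scanner's own 2D plane (z = 0).
    p_sensor = np.array([range_m * np.cos(bearing_rad),
                         range_m * np.sin(bearing_rad),
                         0.0])
    # The estimated 6DoF pose carries the point into the common frame.
    return R_world_sensor @ p_sensor + t_world_sensor

# Hypothetical example: sensor pitched 20 degrees by the spring motion,
# held 1.2 m above the ground, observing a surface 3.5 m away.
pitch = np.deg2rad(20.0)
R = np.array([[np.cos(pitch), 0.0, np.sin(pitch)],
              [0.0,           1.0, 0.0],
              [-np.sin(pitch), 0.0, np.cos(pitch)]])
t = np.array([0.0, 0.0, 1.2])
print(scan_point_to_world(3.5, np.deg2rad(10.0), R, t))

Because the scanner is in continuous motion while each scan line is acquired, in practice a pose would need to be associated with every individual return along the estimated trajectory.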

An example point cloud produced and demonstrated at the recent SPAR International 2012 Conference venue in Houston, Texas, is depicted in Figure 2. Data collection occurred over three twenty-minute sessions during normal conference hours, with the operator walking through variously sized rooms and hallways, going up and down stairways and escalators, and transitioning between indoor and outdoor environments. The high level of robustness built into the system ensures that the presence of moving people and traffic in the busy conference environment does not adversely affect the solution. In the zoomed-in detail, one can see the unique scanning pattern from the oscillatory spring motion that allows the operator to capture areas with minimal shadowing in the data. A short video demonstrating the operation of Zebedee is available at http://youtu.be/Uj9BKcnXOyo.

Simultaneous Localization and Mapping
The challenge of concurrently building a map and estimating motion in an unknown environment without an external reference system is a well-known problem in the robotics community called Simultaneous Localization and Mapping (SLAM). The CSIRO team has been developing large-scale SLAM solutions for several years across a range of applications, from mining and industrial sites to forestry and natural environments. The general operating principle behind Zebedee can be understood by considering the simplified example of a range sensor measuring the surface of a flat wall. As the sensor moves towards the wall, the range measurements associated with that surface decrease in magnitude. The motion towards the wall can therefore be inferred from the observed change in distance to the wall. By integrating thousands of similar relative observations of many surfaces over time, and making reasonable assumptions about the platform’s dynamics, the 6DoF trajectory of the sensor can be estimated with considerable accuracy. The oscillatory behavior of Zebedee ensures that surfaces are re-observed at a sufficient frequency to estimate the sensor motion while the operator moves at a walking pace.
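As a toy illustration of this principle, the sketch below recovers a small sensor translation by least squares from changes in the perpendicular distance to a few planar surfaces. The surfaces, noise-free measurements, and solver are simplified assumptions chosen for illustration; the actual software estimates a full 6DoF trajectory from thousands of such surface constraints combined with the inertial data.

import numpy as np

# Toy example: perpendicular distances to three planar surfaces
# (e.g. two walls and the floor), measured before and after the sensor moves.
# Each unit normal points from the sensor toward its surface.
normals = np.array([[1.0, 0.0, 0.0],    # wall directly ahead
                    [0.0, 1.0, 0.0],    # wall to the side
                    [0.0, 0.0, -1.0]])  # floor below

dist_before = np.array([4.00, 2.50, 1.20])  # meters
dist_after  = np.array([3.85, 2.50, 1.20])  # wall ahead is now 15 cm closer

# A translation t changes the distance to each plane by -(n . t), so the
# observed changes give a linear system: normals @ t = dist_before - dist_after.
t_est, *_ = np.linalg.lstsq(normals, dist_before - dist_after, rcond=None)
print(t_est)  # approximately [0.15, 0.0, 0.0]: the sensor moved 15 cm forward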

Zebedee’s ability to self-localize enables its use in indoor, underground, and otherwise covered environments such as dense forest. In contrast, traditional mobile mapping systems utilize independent positioning systems, which in most cases consist of a combination of GPS and high-quality inertial sensors, or tripod-mounted tracking systems. Systems relying on GPS will not function indoors for long; even the available high-end inertial systems will drift significantly over time without an external reference, and tripod-based trackers are limited by range and line of sight.

Mobility
As a handheld sensor, Zebedee can go almost anywhere a human can. Its simple mechanical design results in a lightweight package of less than 400g (well under one pound). In addition to the handheld hardware, a data acquisition computer and small battery (sufficient for a full day of operation) can be stored in a wearable unit, such as a backpack. The handheld device can also be attached to a pole, which can extend the sensor’s view beyond the operator’s reach.

Unlike wheeled mobile platforms, Zebedee can be operated on stairways and on rough terrain, allowing for seamless mapping between levels, interiors and exteriors, and multiple buildings. Our proprietary SLAM algorithm does not require any particular artificial structure such as flat walls, floors, or ceilings; therefore, both natural and built environments can be mapped.

Versatility
The mathematical framework behind the Zebedee solution is derived from an earlier system the team developed for both actuated (e.g., rotating or nodding) and trawling (i.e., rigidly mounted and "painted" across the environment) 2D LiDAR sensors. These systems had previously been demonstrated in a variety of environments, including underground mines, city streets, building interiors, and industrial sites. Even in the case of Zebedee, where the input sensor motion is non-deterministic, the software is robust to anomalies and variations in operator style. The mechanical design intentionally regulates the range of possible output motions, resulting in a surface re-observation rate suitable for accurate estimation of the scanner trajectory by the software. The Zebedee system has been trialled with multiple new operators, all of whom, after only a few minutes of basic instruction, were able to produce quality 3D maps.

Data Quality
The Zebedee software is capable of estimating the scanner location to sub-centimeter precision; however, the point cloud precision is currently limited by the centimeter-scale range error of the Hokuyo scanner used on the device. The resulting point clouds are therefore less crisp than those produced by typical high-end terrestrial scanners or mobile mapping vehicles, but they come at orders of magnitude lower cost, both in hardware and in acquisition time. Post-processing can further improve the point cloud quality relative to the raw values. As scanner technology improves, more accurate, lightweight, and low-cost LiDAR scanners are expected to come on the market and can readily be incorporated into future Zebedee models.
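As a rough illustration of why the range noise dominates, independent error sources combine approximately in quadrature; the figures below are illustrative assumptions rather than measured Zebedee specifications.

import math

pose_error_mm = 5.0    # assumed sub-centimeter trajectory error (illustrative)
range_error_mm = 30.0  # assumed centimeter-scale scanner range noise (illustrative)

# Independent errors add in quadrature, so the per-point error is dominated
# by the larger of the two contributions.
point_error_mm = math.sqrt(pose_error_mm**2 + range_error_mm**2)
print(round(point_error_mm, 1))  # ~30.4 mm, barely worse than the range noise alone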

Computation
While the current data acquisition system can operate with fairly minimal computing power, the processing software is run post-acquisition on a standard consumer laptop. Efforts are currently underway to migrate the processing software from a prototype phase to a more efficient productized version capable of running in real time on the acquisition computer while data are collected. This improvement would allow the system to provide instant feedback to the user, helping to improve operator performance and to ensure that the coverage and data density meet the application requirements. The full realization of the vision is for the entire system to run on a small embedded processor in the device itself, producing maps in real time as the data are collected.

Mobile Mapping Anywhere
The ability to rapidly generate 3D maps in any environment, via a lightweight, low-cost hardware platform, has the potential to dramatically increase the use cases for 3D mapping. While many applications still require survey-grade (millimeter) precision, there are many emerging applications where centimeter precision is more than adequate. Many tasks that are currently carried out manually, such as infrastructure asset audits, damage assessment, or monitoring of construction progress, could be achieved semi-autonomously with the type of technology described in this article.

Likewise, the move to lower-cost platforms means that service-based models could become more viable for a range of emerging markets around 3D mapping. As more of the built infrastructure around the globe is mapped from the inside and out, this information could provide enormous value for the future planning, design, and management of urban environments.

Robert Zlot, Michael Bosse, Tim Wark, Paul Flick, and Elliot Duff are with the CSIRO ICT Centre’s Autonomous Systems Laboratory in Brisbane, Australia. The research team has decades of combined experience in field robotics. The technology behind Zebedee was derived from real-time mapping systems originally developed for autonomous vehicle perception.
