Software Automation: The Bridge between Data Collection and Information Product

The bridge clearance diagram shown at right is a deceptively straightforward information product. The diagram provides a set of clearance measurements (i.e., the distance between the road surface and the underside of the bridge) at each lane marker, with the minimum clearance designated in red. Geometric properties of the diagram convey the bridge shape, while the arrangement of the measurements designates the lane layout. What is not conveyed is the host of operations, and the millions of range measurements, required to create an image so easily consumed by the observer.

The diagram shown at right is one of many information products derived from mobile scanning technologies. The advent of mobile scanning, in combination with affordable mass data storage, has made it possible to document our world at unprecedented scale and speed. The applications fueling these technologies are also defined in terms of volume: hundreds of roadway kilometers, thousands of bridges and tens of thousands of measurements. Production and management of the information derived from this raw data have therefore become a major challenge affecting the growth and adoption of mobile scanning.

Production of information from point cloud data, as it occurs today, is largely a manual, time-consuming process. Processing requires the storage and organization of many large, disparate files, simultaneous visualization of data across multiple software tools, hand manipulation of data and customized generation of deliverables in a variety of formats. In small quantities, manual workflows are manageable; at volume, however, these methods become intractable. Large projects require either significant staff or long periods of time to complete, thus limiting the utility of mobile LiDAR. To realize the full benefit of mobile scanning, companies must learn to leverage automated methods of processing to achieve scale, reduce complexity and maintain the accuracy of information deliverables.

For example, to process the data for a single bridge clearance analysis, the bridge is first identified and separated from the point cloud. Environmental noise is removed and the paint lines indicating lane markings are identified. Cross sections are extracted at each lane, and the minimum distance between the road surface and the bridge is determined for each lane. On average, five or more operations are necessary to generate each of the 6 to 18 clearance measurements per bridge.
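To make the final step concrete, the following is a minimal sketch of the per-lane clearance computation. It assumes the bridge has already been isolated and de-noised, that the point cloud is an N x 3 array of (x, y, z) coordinates in meters with z up, and that lane positions are given as cross-road offsets; the function names and the crude midpoint split between road and bridge points are illustrative assumptions, not the workflow of any particular product.

```python
# Minimal sketch: per-lane clearance from an already-isolated bridge cloud.
# All names and the simple road/soffit split are assumptions for illustration.
import numpy as np

def lane_clearances(points, lane_offsets, slice_width=0.25):
    """Return the minimum road-to-bridge clearance at each lane offset."""
    clearances = {}
    for offset in lane_offsets:
        # Extract a narrow cross section of points centered on the lane line.
        in_slice = np.abs(points[:, 0] - offset) <= slice_width / 2.0
        section = points[in_slice]
        if section.size == 0:
            clearances[offset] = None  # flag for manual review
            continue
        # Split the section into road surface (low z) and bridge soffit (high z)
        # using the vertical midpoint -- a crude stand-in for real classification.
        z = section[:, 2]
        mid = (z.min() + z.max()) / 2.0
        road_top = z[z < mid].max()
        bridge_bottom = z[z >= mid].min()
        clearances[offset] = bridge_bottom - road_top
    return clearances

if __name__ == "__main__":
    # Synthetic example: flat road near z = 0, bridge soffit near z = 5.1 m.
    rng = np.random.default_rng(0)
    road = np.column_stack([rng.uniform(-6, 6, 5000), rng.uniform(0, 20, 5000),
                            rng.normal(0.0, 0.01, 5000)])
    deck = np.column_stack([rng.uniform(-6, 6, 5000), rng.uniform(0, 20, 5000),
                            rng.normal(5.1, 0.01, 5000)])
    cloud = np.vstack([road, deck])
    print(lane_clearances(cloud, lane_offsets=[-3.5, 0.0, 3.5]))
```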

Now consider the implications of a project consisting of 600 bridges with the workflow previously described. The anticipated deliverable will contain approximately 8,000 measurements, all of which must be formatted, stored, and visualized in reports; GIS information must be correlated to each bridge; extra data (such as camera images) must be added to the report; each deliverable must undergo QA/QC; and the entire result set must be properly organized and archived. Consequently, the time budgeted to manually generate reports for 600 bridge analyses spans three to four man-months of effort. Considering that scanning 600 bridges takes approximately two weeks, processing can absorb 80-85 percent of the total project time.
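A quick back-of-envelope check, using the figures quoted above and an assumed average of 13 measurements per bridge, shows why hand processing breaks down at this scale:

```python
# Rough volume check using the article's own figures; the 13-measurement
# average per bridge is an assumption chosen to land near the quoted ~8,000.
bridges = 600
measurements = bridges * 13           # ~7,800 clearance measurements
manual_operations = measurements * 5  # at least five hand operations each
print(measurements, manual_operations)  # ~7,800 measurements, ~39,000 operations
```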

With some states conducting surveys on a scale of 10,000 or more bridges, alternative processing methods are required. Software automation can provide a viable solution to manage this workflow. Everything from data filtering and measurement selection to GIS correlation and result formatting can be handled with minimal user interaction. Automated methods have been shown to reduce processing time to 40-45 percent of the total project time.
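As an illustration of what "minimal user interaction" can look like, the sketch below pushes every bridge through the same extract, correlate and report chain, setting failures aside for later review instead of stopping the run. The three step functions are trivial stand-ins with made-up names and behavior, not Allpoint's actual pipeline.

```python
# Minimal sketch of batch processing with failures routed to manual review.
import logging

def extract_bridge(bridge_id):   # stand-in for data filtering / measurement
    return {"id": bridge_id, "clearances": {"lane_1": 5.12, "lane_2": 5.08}}

def correlate_gis(bridge):       # stand-in for GIS correlation
    bridge["route"] = "unknown"
    return bridge

def write_report(bridge):        # stand-in for result formatting
    low = min(bridge["clearances"].values())
    print(f"{bridge['id']}: minimum clearance {low:.2f} m")

def run_batch(bridge_ids):
    failed = []
    for bridge_id in bridge_ids:
        try:
            write_report(correlate_gis(extract_bridge(bridge_id)))
        except Exception:
            # Failures are logged and collected rather than halting the batch,
            # so a reviewer can handle them as a controlled exception list.
            logging.exception("bridge %s needs manual review", bridge_id)
            failed.append(bridge_id)
    return failed

run_batch(["BR-001", "BR-002"])
```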

Automation offers capabilities beyond bridges; however, for automation to show benefit, the input 3D and geospatial data must be reliable and consistent, and the output must be accurate and repeatable. Allpoint’s team has a decade of experience building automated 3D software tools and has derived the following principles for successful automation:

1. Incoming data has standardized formats, logging procedures, file naming conventions and file organization (a simple intake check along these lines is sketched after this list).
2. The method for data collection is repeatable and predictable.
3. Objects of interest in the data should have distinguishable properties in the context of the final result (e.g. road surfaces).
4. Final deliverables are well defined to eliminate ambiguity or uncertainty.
5. There exists a quick and efficient process to verify deliverable integrity.
6. Automated failure points are identifiable and exceptions can be handled in a controlled and predictable manner.
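To illustrate principles 1 and 6, here is a minimal sketch of an intake check that validates incoming scan files against an assumed naming convention before any processing starts, reporting the misfits as exceptions rather than letting them into the pipeline. The BR####_YYYYMMDD.las/.laz convention is purely illustrative.

```python
# Minimal intake check against an assumed file naming convention.
import re
from pathlib import Path

# Illustrative convention: BR0001_20130415.las or .laz
NAME_PATTERN = re.compile(r"^BR\d{4}_\d{8}\.(las|laz)$")

def validate_intake(folder):
    """Split incoming files into convention-matching files and exceptions."""
    accepted, rejected = [], []
    for path in sorted(Path(folder).glob("*")):
        (accepted if NAME_PATTERN.match(path.name) else rejected).append(path)
    return accepted, rejected

# Example: accepted, rejected = validate_intake("incoming_scans")
# Rejected files go to a reviewer instead of entering the automated pipeline.
```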

In Allpoint’s experience as a software developer and data processor, these principles have been demonstrated across a host of applications such as tunnel mapping, robotic wastewater pipe inspection and outdoor mobile LiDAR. Common to all of these applications are large amounts of logged data, consistent acquisition processes, and target environments that lend themselves to modeling. The key to bridging data to a deliverable then becomes clearly defining the information products and ensuring the results can be efficiently verified.

Aaron Morris is the CEO and founder of Allpoint Systems, LLC, a software development and data processing company with roots in robotics and automation. Allpoint’s software solution, the Perception engine, was developed to make possible data deliverables that would be difficult to produce through manual processing. The company's experience is in creating hybrid software toolsets that mix automation (for extremely tedious tasks) with streamlined user interaction (for efficient high-level decision making).
