Establishing Requirements, Extracting Metrics and Evaluating Quality of LiDAR Data

Certainty 3D LLC has been on the receiving end of many LiDAR projects. The data typically shows up as a hard drive delivered via FedEx, collected by a mobile LiDAR system with which we have had no direct contact. All we typically receive is the data, the requirements for extracting a 3D model from the data, and assurances that the data is correct.

With a typical laser scanning project requiring several to tens of man-days to extract high-quality models, the Certainty 3D team has developed practical tools and workflows to first establish project requirements, then extract metrics and finally evaluate LiDAR data quality, prior to modeling the first break line. These processes have proven extremely successful. Thus we have documented them and made them available to the entire LiDAR community in our C3D University collection of "TechNote" whitepapers at http://www.certainty3d.com/pdf/technotes/TopoDOT_TechNote 1021.pdf.

This TechNote (#1021) is fundamentally written from the perspective of a "Customer" acquiring LiDAR data from the "Consultant" who is running the system, field operations and data acquisition. Having received data from many data providers over the years, Certainty 3D has essentially been in the same position as the Customer. Thus the tools and workflows we developed over the years have been focused on answering the Customer's fundamental questions: "How do I place requirements on LiDAR data?" and "How do I know if the data I receive meets those requirements?"

The motivation behind TechNote #1021 is to support the LiDAR community by increasing market demand. Educating potential Customers and providing intuitive, well-defined tools and workflows will serve to increase demand for LiDAR data throughout the downstream design, engineering and construction processes.

Thus, TechNote #1021 reduces the seemingly overwhelming prospect of placing requirements on LiDAR data quality to establishing just ten (10) parameters over five (5) distinct categories. Once these requirements are established, proven processes are provided demonstrating how the corresponding metrics are extracted and compared against each parameter in order to assess quality. Well-educated Customers with the capability to define and assess data quality will bring more confidence to LiDAR services procurement.

Origins of Data Uncertainty
TechNote #1021 starts out with an introduction to the basics of LiDAR technology and the sources of uncertainty in LiDAR data received from the provider. The analyses are not rigorous; rather, the intention is to provide the reader with an intuitive, basic understanding of the numerous potential sources of data uncertainty.

We begin by defining the uncertainty of a specific point within a "point cloud" as being composed of two components: a random component and a systematic component. So the uncertainty, ε_total, of any point is given by:

ε_total = ε_random + ε_systematic

Thus ε_total is the expected distance between the data point and that point's actual location in the scene. One should think of random error as being caused by anything resulting in oscillation of the data about a relatively consistent mean value. Systematic error, such as a rangefinder bias, can be thought of as an "offset" in the data. A discussion provides insight into the sources of both types of error, thereby providing the starting point for establishing LiDAR data requirements.
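As a concrete, purely illustrative sketch of this decomposition, one simple way to separate the two components is to look at residuals against surveyed control: the mean signed offset approximates the systematic component, while the scatter about that mean approximates the random component. The function below and its numbers are hypothetical and not taken from TechNote #1021.

```python
import numpy as np

def decompose_uncertainty(residuals):
    """Split control-point residuals (LiDAR minus survey control, in meters)
    into an approximate systematic offset and a random (noise) component.
    Illustrative only; not the TechNote's prescribed procedure."""
    residuals = np.asarray(residuals, dtype=float)
    systematic = residuals.mean()              # persistent offset ("bias")
    random_component = residuals.std(ddof=1)   # scatter about that offset
    total = abs(systematic) + random_component
    return systematic, random_component, total

# Example with five hypothetical vertical residuals (meters)
sys_err, rand_err, total_err = decompose_uncertainty([0.031, 0.027, 0.035, 0.024, 0.029])
print(f"systematic ~ {sys_err:.3f} m, random ~ {rand_err:.3f} m, total ~ {total_err:.3f} m")
```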

Establishing LiDAR System Data Requirements
TechNote #1021 provides the Customer with a clear roadmap to establish data requirements consistent with meeting overall project objectives. These requirements are clearly defined along with the intended process for their respective evaluation.

A LiDAR data set is generally composed of three components: 1) point cloud data, 2) survey control data and 3) calibrated images (if available). One of the highlights of TechNote #1021 is that it reduces LiDAR data analysis to only six characteristics. They are:
Scan (static) or Flightline (mobile) alignment
Survey control alignment
Calibrated image alignment
Random noise
Point density
Coverage

The first three characteristics (Scan/Flightline alignment, Survey control alignment and Calibrated image alignment) collectively establish a traceable lineage between the images, point cloud and survey control coordinates. The latter three (Random noise, Point density and Coverage) pertain to LiDAR data characteristics necessary to assure extracted features, measurements and/or models will be of sufficient fidelity to meet project requirements. Moreover, TechNote #1021 demonstrates that these characteristics are easily extracted, quantified, and interpreted.
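To make that compact requirements set concrete, a Customer might tabulate the six characteristics along with project-specific acceptance thresholds in a simple structure such as the hypothetical Python sketch below. The field names and threshold values are illustrative placeholders, not figures from TechNote #1021.

```python
# Hypothetical requirements record for the six data characteristics; all
# threshold values and field names below are illustrative placeholders.
lidar_requirements = {
    "scan_or_flightline_alignment_m": 0.02,  # max misalignment between overlapping passes
    "survey_control_alignment_m":     0.03,  # max residual against survey control points
    "calibrated_image_alignment_px":  2.0,   # max image-to-point-cloud reprojection error
    "random_noise_m":                 0.01,  # max 1-sigma "fuzziness" on hard surfaces
    "point_density_pts_per_m2":       400,   # min density over the corridor of interest
    "coverage":                       "full corridor within the project boundary",
}

def check_metric(name, measured):
    """Compare an extracted metric against its requirement (illustrative only)."""
    required = lidar_requirements[name]
    higher_is_better = name == "point_density_pts_per_m2"
    passed = measured >= required if higher_is_better else measured <= required
    print(f"{name}: measured {measured}, required {required} -> {'PASS' if passed else 'FAIL'}")

check_metric("random_noise_m", 0.008)             # PASS
check_metric("point_density_pts_per_m2", 350)     # FAIL
```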

Extraction of Data Characteristic Metrics
TechNote #1021 provides a detailed description of workflows, software tools and other relevant information to extract quantifiable data metrics for comparison against the requirements for each characteristic.

Assessing Lineage from LiDAR Data to Survey Control Data
Quality assessment begins with three alignment evaluations: Image, Scan/Flightline and Control Survey, establishing a lineage from each LiDAR data component back to the reference survey control data. This lineage is of critical importance since survey control data is the only data meeting acceptance criteria long established within the Federal Geographic Data Committee standards (https://www.fgdc.gov/standards) for spatial data accuracy. Thus TechNote #1021 organizes the evaluation of LiDAR data in a way that is clearly traceable back to the control survey data.
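One simple way to quantify that lineage, offered here only as an illustrative sketch rather than the TechNote's prescribed workflow, is to compare the point cloud directly against the survey control coordinates: for each control point, gather the LiDAR returns within a small horizontal radius and report the elevation residual. The radius and the purely vertical comparison are assumptions made for brevity.

```python
import numpy as np

def control_residuals(cloud_xyz, control_xyz, radius=0.25):
    """For each survey control point, compare its elevation to the mean elevation
    of nearby LiDAR points (within `radius` meters in plan view). Returns the
    per-point residuals and their RMS. Illustrative sketch only."""
    cloud_xyz = np.asarray(cloud_xyz, float)
    residuals = []
    for cx, cy, cz in np.asarray(control_xyz, float):
        d2 = (cloud_xyz[:, 0] - cx) ** 2 + (cloud_xyz[:, 1] - cy) ** 2
        nearby = cloud_xyz[d2 <= radius ** 2]
        if len(nearby) == 0:
            residuals.append(np.nan)           # no LiDAR data near this control point
        else:
            residuals.append(nearby[:, 2].mean() - cz)
    residuals = np.array(residuals)
    valid = residuals[~np.isnan(residuals)]
    rms = np.sqrt(np.mean(valid ** 2)) if len(valid) else np.nan
    return residuals, rms
```

Comparing the resulting RMS (and any missing-data flags) against the survey control alignment requirement gives an immediate pass/fail indication for this leg of the lineage.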

Assessing Data Characteristics
After establishing methods to produce a traceable lineage between LiDAR data and survey control, the focus of TechNote #1021 moves to the remaining characteristics: Random noise, Point density and Coverage. The additional tools and workflows provided in TechNote #1021 are designed to assure that these data characteristics will support feature identification, measurement and model extraction consistent with project requirements.

For example, TechNote #1021 provides simple and effective methods for quantifying the random noise, or "fuzziness", of the LiDAR point cloud data. Moreover, the method for establishing the random noise requirement is provided, along with the intuitive reasoning behind choosing an appropriate level of acceptable random noise.
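A common way to quantify that fuzziness, presented here as an illustration rather than the TechNote's exact procedure, is to sample a patch of points from a flat, hard surface such as pavement, fit a plane, and report the one-sigma scatter of the points about it:

```python
import numpy as np

def planar_noise(patch_xyz):
    """Estimate random noise as the 1-sigma scatter of points about a best-fit
    plane through a patch sampled from a flat surface. Illustrative sketch only."""
    pts = np.asarray(patch_xyz, float)
    centroid = pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    normal = vt[-1]
    distances = (pts - centroid) @ normal      # signed point-to-plane distances
    return distances.std(ddof=1)

# Synthetic check: a 1 m x 1 m patch with ~8 mm of added noise
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(500, 2))
z = 0.008 * rng.standard_normal(500)
print(f"estimated random noise: {planar_noise(np.column_stack([xy, z])):.4f} m")
```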

Point cloud density is another critical data characteristic. The correct density for a particular project depends roughly on the size of the smallest feature to be extracted. More specifically, the surface area of the feature must be sampled at a sufficiently high density such that this feature may be extracted to an accuracy meeting project requirements.
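As a rough, hypothetical illustration of that sampling argument (the feature size and "points across" rule of thumb below are assumptions, not values from TechNote #1021), the required point spacing and density follow directly from the smallest feature dimension:

```python
# Hypothetical sizing: feature dimension and sampling rule of thumb are assumed.
smallest_feature_m = 0.10   # e.g., width of the smallest feature to be extracted
points_across = 3           # desired number of samples across that dimension

required_spacing_m = smallest_feature_m / points_across   # ~0.033 m between points
required_density = 1.0 / required_spacing_m ** 2          # ~900 pts per square meter
print(f"spacing <= {required_spacing_m:.3f} m  ->  density >= {required_density:.0f} pts/m^2")

# The measured density of a sampled patch is simply its point count over its area.
points_in_patch, patch_area_m2 = 4200, 5.0
print(f"measured density: {points_in_patch / patch_area_m2:.0f} pts/m^2")
```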

TechNote #1021 recognizes the complexity of establishing the "correct" level of point density. The appropriate point cloud density depends on the geometrical structural cues of the feature as well as the techniques employed in identifying and extracting the feature from the LiDAR data. Complex and rigorous analyses for calculating the point density are avoided in TechNote #1021 through reasoned practical approaches balanced with the current expectations of modern LiDAR system performance.

Because the tools and methods described in TechNote #1021 focus on the effective and "practical" assessment of LiDAR data quality, the document recognizes that a certain flexibility within each evaluation is necessary. For example, within the context of Point Cloud Density the topic of range "shadowing" arises and is discussed in detail. "Shadowing" is caused by a fixed object along the corridor blocking the beam, leaving a shadow behind it in which the point cloud density is greatly diminished or the data is missing altogether. TechNote #1021 provides effective means of combining readily quantifiable concepts, such as point cloud density, with requirements and evaluations that account for effects such as shadowing.
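One illustrative way to make shadowing visible alongside a density requirement, again a sketch under assumed parameters rather than the TechNote's method, is to bin the points into a plan-view grid and flag any cells whose density falls below the project threshold:

```python
import numpy as np

def low_density_cells(xy, cell=0.5, min_density=100.0):
    """Bin points into a plan-view grid and flag cells whose point density falls
    below `min_density` (points per square meter), e.g. in the shadow behind a
    fixed object. Cell size and threshold are illustrative placeholders."""
    xy = np.asarray(xy, float)
    ix = np.floor((xy[:, 0] - xy[:, 0].min()) / cell).astype(int)
    iy = np.floor((xy[:, 1] - xy[:, 1].min()) / cell).astype(int)
    counts = np.zeros((ix.max() + 1, iy.max() + 1))
    np.add.at(counts, (ix, iy), 1)
    density = counts / cell ** 2                  # points per square meter per cell
    flagged = np.argwhere(density < min_density)  # candidate shadowed or missing areas
    return density, flagged
```

Flagged cells can then be reviewed against the coverage and shadowing expectations agreed in the requirements, rather than being treated automatically as failures.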

TechNote #1021 ends with a discussion of the data "coverage" requirement. The coverage requirement serves as a single collective summary of the LiDAR project requirements. Fundamentally, coverage is defined as the data boundaries of the LiDAR project together with the individual requirements placed on the data characteristics and alignment within those boundaries.

As discussed at the beginning of this article, TechNote #1021 was written from the Customer's perspective. Thus it should be noted that throughout TechNote #1021 an attempt has been made to provide recommended text for inclusion in a Request for Proposal (RFP). Everything in this forty-plus-page document is designed to assure that prospective LiDAR data customers have the understanding and tools necessary to place requirements on LiDAR data and assess data quality against those requirements. Certainty 3D has made the tools and methods developed and proven over many successful projects available to the entire LiDAR community.

Ted Knaak founded Riegl USA in 1993 and in 2011 he founded Certainty 3D, a company focused on data processing software and technology solutions for the laser scanning industry.
