Airborne LiDAR Sensors: Increased Repetition Rates Present New Data Quality Challenges

As LiDAR sensor technologies have evolved, the profession has realized a number of benefits. The most noticeable is the increase in repetition rates. When I started in the commercial LiDAR industry in 1996, the company I worked for had a 5 kHz LiDAR sensor. Now we routinely use sensors capable of collecting at rates of 500 kHz.

It should be noted that just because a company has a 500 kHz sensor does not mean the sensor will be operated at that rate; there are altitude limitations that need to be considered. Currently, Quantum owns Leica, Optech, and Riegl LiDAR sensors. All of these manufacturers have made tremendous advances in their designs over the past 10 years, and we are able to use our sensors for a wide range of applications. As with any technology, there are real-world challenges and limitations of these sensors that, as professionals, we would rather not have to deal with.

One of these challenges is range jitter. It occurs when the ranges within a LiDAR scan do not agree with corresponding ranges measured to the same surface. In my experience, range jitter is a developing challenge: it is not present in a brand-new sensor but develops over time as the laser degrades. It can vary based on how the sensor is used and should be monitored on a regular basis.

Range jitter can occur within a scan, between scans, or between the channels of a split-beam sensor. Its severity can vary between sensors of the same make. The problem should not be considered fatal as long as it is addressed and monitored. Range jitter will not be noticeable or very apparent in low-density collections with a wide nominal point spacing (NPS), but you can be assured that it is still in the data if the issue has not been addressed. For this discussion, a low-density collection is any collection of 4 points per square meter (ppsm) or less.
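To tie these density figures back to point spacing, the short Python sketch below uses the common approximation that NPS is the inverse square root of point density, assuming a roughly uniform point distribution; the function name is illustrative only.

```python
import math

def nominal_point_spacing(points_per_sq_m: float) -> float:
    """Approximate nominal point spacing (m) from point density,
    assuming a roughly uniform distribution (NPS ~ 1/sqrt(density))."""
    return 1.0 / math.sqrt(points_per_sq_m)

# At 4 ppsm the spacing is about 0.5 m; at 32 ppsm it tightens to about 0.18 m,
# which is why small range errors become far more visible at higher densities.
for density in (2, 4, 8, 32):
    print(f"{density:>3} ppsm -> NPS ~ {nominal_point_spacing(density):.2f} m")
```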

It is easier to identify range jitter at densities of 8 ppsm or higher. We discovered range jitter by accident when we flew a project at 32 ppsm. There was excessive noise in our LiDAR data, which led us to investigate what was going wrong and why it was happening.

Once we identified the problem, we flew a series of test flights in several different configurations to determine what was causing it. We wanted to fix the data already collected and to determine how to keep the problem from recurring. What we found was enlightening and concerning, because we had never seen this before, and it differed from similar range jitter issues that had been caused by bad intensity tables over very low-reflectance surfaces.

What we were seeing was a difference in return information outside the tolerances of most of the specifications required in our contracts. We were told that the noise we were seeing was well within the manufacturer's specifications. The issue was that those manufacturer specifications fell outside our clients' specifications, resulting in an extremely messy data set that would cause heartburn for almost any client.

For example, the USGS Lidar Base Specification (LBS) v1.0 requires a relative accuracy of 7 cm within a scan and 10 cm between scans. According to the draft USGS LBS v1.1, the quality level (QL) 1 and QL 2 deliverables will require 6 cm within a scan, 8 cm between scans, and a maximum allowable difference in overlapping data of 16 cm. We were seeing differences much greater than these.
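As a rough illustration of how such thresholds can be checked during QC, the following Python sketch compares samples of elevation differences against the figures cited above. The function and variable names are hypothetical, and it assumes the difference samples have already been extracted from smooth, hard surfaces; it is not an official USGS procedure.

```python
import numpy as np

# Thresholds cited above (draft USGS LBS v1.1, QL1/QL2), in meters.
INTRASWATH_MAX_RMSD = 0.06   # within a scan (swath)
INTERSWATH_MAX_RMSD = 0.08   # between scans (swaths)
OVERLAP_MAX_DIFF    = 0.16   # maximum allowable difference in overlap

def check_relative_accuracy(intra_dz: np.ndarray, inter_dz: np.ndarray) -> dict:
    """Compare elevation-difference samples from smooth, hard surfaces
    against the spec. intra_dz: dz within a single scan; inter_dz: dz in overlap."""
    rmsd = lambda d: float(np.sqrt(np.mean(np.square(d))))
    return {
        "intraswath_rmsd_ok": rmsd(intra_dz) <= INTRASWATH_MAX_RMSD,
        "interswath_rmsd_ok": rmsd(inter_dz) <= INTERSWATH_MAX_RMSD,
        "overlap_max_ok": float(np.max(np.abs(inter_dz))) <= OVERLAP_MAX_DIFF,
    }
```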

The data collected for the project in question showed the following symptoms. We saw range jitter in one of the channels (or scans) of a split-beam sensor. The range jitter had no bias and varied between 10 and 17 cm within the scan, with no detectable systematic pattern. Additionally, we saw range jitter between the channels (or scans). After further research, we determined that Channel A (scan A) had no significant detectable range jitter, or was well within acceptable tolerances. The range jitter in Channel B (scan B) appeared to be causing the issue between channels as well. Figure 1 is a representation of the range jitter between channels and within the channel.

Figure 1 shows the range jitter across the LiDAR scan by comparing the differences between Channel A and Channel B. The color scale indicates the magnitude of the difference in meters, from +15 cm to −15 cm. You can see that most of the data is not of concern. The areas of water are not an issue either, because of the variance in return frequency in those areas.

The areas of red are of great concern and need to be addressed. The areas of bright yellow are also areas we wanted to address by tuning the sensor, because they sit right on the edge of acceptability. The profile represents the worst area, with the largest magnitude of range jitter.
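A difference map like the one in Figure 1 can be approximated by gridding each channel separately and differencing the grids. The NumPy sketch below is one way to do that, assuming the ground or hard-surface returns have already been extracted per channel; it is not the exact procedure we used, and the names are illustrative.

```python
import numpy as np

def channel_difference_grid(pts_a: np.ndarray, pts_b: np.ndarray, cell: float = 1.0):
    """Grid mean elevations from each channel on a common raster and
    difference them (B - A) to visualize channel-to-channel range jitter.
    pts_a, pts_b: (N, 3) arrays of x, y, z ground/hard-surface returns."""
    xy = np.vstack([pts_a[:, :2], pts_b[:, :2]])
    x0, y0 = xy[:, 0].min(), xy[:, 1].min()
    nx = int(np.ceil((xy[:, 0].max() - x0) / cell)) + 1
    ny = int(np.ceil((xy[:, 1].max() - y0) / cell)) + 1

    def mean_grid(pts):
        ix = ((pts[:, 0] - x0) / cell).astype(int)
        iy = ((pts[:, 1] - y0) / cell).astype(int)
        z_sum = np.zeros((ny, nx))
        count = np.zeros((ny, nx))
        np.add.at(z_sum, (iy, ix), pts[:, 2])
        np.add.at(count, (iy, ix), 1)
        with np.errstate(invalid="ignore"):
            return np.where(count > 0, z_sum / count, np.nan)

    # Cells near zero are healthy; large positive/negative cells flag jitter.
    return mean_grid(pts_b) - mean_grid(pts_a)
```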

As discussed previously, the range jitter is a result of degradation in the laser or its performance. This does not mean the laser is bad or will not provide the required results. It means the sensor should be recalibrated to adjust for the change in laser performance.

As for whether we could fix this so it does not happen again: we found that by adjusting the desired signal-to-noise ratio (SNR), adjusting the gain setting, and tweaking the threshold discrimination settings, we could reduce the range jitter to an acceptable level. Running the result through our standard processing procedure, we produced a data set that well exceeded our clients' specifications and expectations. It should be noted that in extreme cases the manufacturer can adjust the voltage bias to help correct this issue.

That answered the question of whether we could adjust the sensor and get good data for future projects, but could we fix the data we had already collected? The short answer was yes. Since the Channel A data was good and showed no significant range jitter, we could use it as a reference for the noisy Channel B data and fix only that data. This is similar to creating a localized geoid model for transforming the data, except that the number of points involved is orders of magnitude larger than in a geoid model adjustment. Figures 2 and 3 show two surface models, one before the correction (Figure 2) and one after the correction (Figure 3).
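To make the geoid-model analogy concrete, the sketch below shows one possible form of such a correction: sample the A-minus-B differences on stable open surfaces, interpolate them into a correction surface, and apply that surface to the Channel B points. This is an illustration of the general approach under those assumptions, not our exact production workflow, and the helper names are hypothetical.

```python
import numpy as np
from scipy.interpolate import griddata

def correct_channel_b(pts_b: np.ndarray, diff_xy: np.ndarray, diff_dz: np.ndarray):
    """Apply a localized correction surface to Channel B, analogous to a
    localized geoid-style adjustment. diff_xy/diff_dz are the x, y locations
    and (A - B) elevation differences sampled on stable, open surfaces."""
    # Interpolate the sparse difference samples at every Channel B point;
    # fall back to nearest-neighbor outside the convex hull of the samples.
    dz = griddata(diff_xy, diff_dz, pts_b[:, :2], method="linear")
    dz_nn = griddata(diff_xy, diff_dz, pts_b[:, :2], method="nearest")
    dz = np.where(np.isnan(dz), dz_nn, dz)

    corrected = pts_b.copy()
    corrected[:, 2] += dz   # shift Channel B toward the Channel A reference
    return corrected
```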

What I have learned from this experience, and several others like it, is that not every challenge will be vetted by the manufacturer. As users of the sensors, we will continue to encounter challenges from new technology, and we will need to be vigilant to make sure that, in the final analysis, we meet our clients' requirements and specifications. The manufacturers and the profession continue to push for the latest technology, which is good, but we should proceed with caution, knowing that this technology has limitations. LiDAR sensor manufacturers are beginning to understand that we work in the real world, where conditions are not as controlled as they are in the test labs. By identifying issues like range jitter, we can provide the manufacturers with the data they need to improve the technology.

About the Author

Guest Contributors: Articles from experts in the LiDAR community.