LiDAR Data Processing Drives Innovation and Increases Complexity

My early coursework in GIS and remote sensing took place during my undergraduate years at the University of New Hampshire in the mid-90s. The computers, cutting-edge Pentium workstations, could only process what we would now consider ridiculously small data sets. The software was all command line, and I often marveled at the patience our professor showed as we pestered him with syntax questions. Yet in many ways it was a simpler time. For GIS you ran one software package and for remote sensing you ran another. Practically everyone in the geospatial industry used the same packages, and if you could even spell GIS you were practically guaranteed a job.

When I first came to the Spatial Analysis Lab (SAL) at the University of Vermont as a graduate student in 2000, we were running those same two software packages, and only those two. Fast forward eleven years and we now run twelve different geospatial software packages in the SAL, a six-fold increase! LiDAR has been the driving factor behind this growth, primarily for two reasons: 1) we needed to work with point clouds, a format poorly supported by mainstream geospatial software, and 2) the massive data sets produced by LiDAR meant that single-threaded, 32-bit software packages, which were the norm until recently, no longer cut it. The rapid adoption of LiDAR as a remote sensing technology clearly caught some of the established geospatial software companies off guard. I won't mention any names, but suffice it to say that support for viewing and processing LiDAR data is still severely lacking in many well-established geospatial software packages.

The gap in capabilities served as a catalyst for innovation. Quick Terrain Modeler (Applied Imagery) grew out of a project at the Johns Hopkins Applied Physics Lab. LiDAR Analyst (Overwatch) and TIFFS (Global LiDAR) can trace their roots to academia. Merrick, a company that specializes in LiDAR acquisition, found existing solutions so lacking that it ended up developing its own product, MARS, which it now markets and sells. In a similar fashion, the USDA Forest Service developed a LiDAR software package called FUSION, which is made available to the general public at no cost. On the open source side, LAStools, developed by Martin Isenburg, offers some of the most well-respected point cloud processing tools around. The pace of innovation that LiDAR software has undergone in the past five years is something to marvel at.

Overall, the number of options one has with respect to LiDAR software has been of great benefit to the industry as a whole. A moderately experienced analyst can visualize and process LiDAR data sets that not many years ago would have required a team of programmers. Because most LiDAR software packages tend to carve out a unique niche, there is really no one package that does it all. Of the twelve geospatial software packages we run in the SAL, three are LiDAR-specific, and each has a unique place in our workflow. This creates a bit of a challenge for an end user like me, as I have to learn and stay abreast of all of these wonderful tools. For someone who also teaches at the college level, the fragmentation of the geospatial software landscape poses a much, much greater challenge. There are simply not enough hours in the day to teach our students everything they need to know about LiDAR, never mind the ins and outs of the host of LiDAR-specific software packages they might use.

How will we meet this challenge, particularly as LiDAR is just one of the many new technologies that the well-rounded geospatial professional must become familiar with? I don't have all of the answers, but I do believe that closer collaboration between industry and academia is crucial. We need to hear from private industry what technical skills they value most in their employees. We need to work with software companies to make it easier to give all interested students access to cutting-edge software, and to take advantage of open source development. Finally, we need to constantly update our curriculum to reflect the rapid pace of change in the industry.

About the Author

Jarlath O'Neil-Dunne

Jarlath O'Neil-Dunne is a researcher with the University of Vermont's (UVM) Spatial Analysis Laboratory (SAL) and also holds a joint appointment with the USDA Forest Service's Northern Research Station. He has over 15 years' experience with GIS and remote sensing and is recognized as a leading expert on the design and application of Object-Based Image Analysis (OBIA) systems for automated land cover mapping. His team at the SAL has generated billions of pixels' worth of high-resolution land cover data from a variety of aerial, satellite, and LiDAR sensors in support of urban forestry planning, ecosystem service estimation, and water quality modeling. In addition to his research duties, he teaches introductory and advanced courses in GIS and remote sensing using ArcGIS, ERDAS IMAGINE, eCognition, and QT Modeler. He earned a Bachelor of Science in Forestry from the University of New Hampshire, a Master of Science in Water Resources from the University of Vermont, and certificates in hyperspectral image exploitation and joint GIS operations from the National Geospatial Intelligence College. He is a former officer in the United States Marine Corps, where he commanded infantry, counter-terrorism, and geospatial intelligence units.