No topic in the remote sensing community was hotter in 2014 than Unmanned Aerial Systems (UAS), and it appears that will hold true for 2015 as well. Arguably the biggest UAS news of the past few months has been the commercial release of LiDAR systems for UAS, with a number of manufacturers succeeding in shrinking their LiDAR sensors to the point where they can be mounted on UAS platforms. Now that LiDAR-equipped UAS exist, why would one even consider a photogrammetric UAS solution, in which topographic data are produced from overlapping images?
Despite the advances made in shrinking LiDAR sensors, they are still larger and heavier than the cameras used for photogrammetric UAS solutions. LiDAR sensors thus require a larger UAS platform with greater lift capacity than a comparable photogrammetric UAS, and larger platforms carry inherently greater risk. There is a big difference between a 1.5 lb. Styrofoam fixed-wing UAS losing power and gliding to the ground and a 17 lb. hexacopter with a LiDAR sensor falling from the sky.
LiDAR UAS solutions are also substantially more expensive and require a higher level of technical expertise to operate. With a lower purchase price, a smaller and safer platform, and ease of operation, photogrammetric UAS are an attractive option for those seeking on-demand topographic mapping. This raises another question: are UAS photogrammetric solutions accurate enough? I will get to that answer later, but first a bit of backstory.
Commercial photogrammetric UAS have been around for a number of years. I first became interested in these systems when Hurricane Irene devastated Vermont’s transportation network, dumping inches of rain in a matter of hours, flooding roads, and leaving one town completely cut off from the outside world. Despite the vast network of satellite and aerial systems capable of delivering remotely sensed data, cloud cover and the challenge of compiling a collection deck in a timely manner confounded acquisition.
Even if these issues had been resolved, it was not clear that the data could have been delivered to the requisite specifications in a timely manner. Fixing roads required mapping-grade, or in some cases survey-grade, data to support detailed measurements. Traditional field survey techniques were not possible in many cases due to the inherent dangers involved, and even where they were, the extent of the damage would have exhausted the available pool of surveyors. In the absence of data, decisions, such as estimating fill volume for washed-out roads, were often made using the tried, but certainly not true, "ocular estimation" approach.
UAS certainly seemed to be the remote sensing solution for these types of situations. With funding from the U.S. Department of Transportation, we set out to evaluate the ability of commercial UAS photogrammetric solutions to deliver GIS-ready data suitable for accurate measurement and mapping. We purchased a senseFly eBee, a lightweight (1.5 lb.) UAS with a 38-inch wingspan that uses a digital camera to produce orthoimages and photogrammetric point clouds in LAS format.
Generating these products requires accurate flight planning, to ensure sufficient image overlap, and photogrammetric post-processing software. The eBee accomplishes this through a tightly integrated workflow in which eMotion software is used to plan the flights (Figure 1) and Pix4D’s Postflight software is used to produce the 2-D and 3-D products (Figure 2). While any system requires some level of expertise, it is hard to imagine making it any easier to generate photogrammetric point clouds than the eBee does.
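eMotion and Postflight hide this geometry from the user, but the overlap planning being automated reduces to pinhole-camera arithmetic. Here is a minimal sketch in Python, assuming a hypothetical compact camera; the parameter values are illustrative, not the eBee’s actual specifications:

```python
def footprint_m(agl_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Ground footprint (width, height) in meters of a single frame at a
    given altitude above ground level, from the pinhole camera model."""
    return (agl_m * sensor_w_mm / focal_mm,
            agl_m * sensor_h_mm / focal_mm)

def spacing_m(agl_m, forward_overlap, sidelap,
              focal_mm=4.3, sensor_w_mm=6.17, sensor_h_mm=4.55):
    """Trigger distance (along-track) and flight-line spacing (across-track)
    needed to achieve the requested overlaps. The default camera values
    approximate a small compact sensor and are illustrative only."""
    w, h = footprint_m(agl_m, focal_mm, sensor_w_mm, sensor_h_mm)
    trigger = h * (1.0 - forward_overlap)  # distance between exposures
    line = w * (1.0 - sidelap)             # distance between flight lines
    return trigger, line

# Example: ~150 m AGL (roughly the proposed 500 ft ceiling), 75%/60% overlap
trigger, line = spacing_m(150, forward_overlap=0.75, sidelap=0.60)
print(f"trigger every {trigger:.0f} m, flight lines {line:.0f} m apart")
```

For a 75% forward overlap and 60% sidelap at roughly the proposed 500-foot ceiling, this works out to an exposure every ~40 m along track and flight lines ~86 m apart.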
Before I get into the accuracy of products from photogrammetric UAS, it is important to discuss their capabilities and limitations. The eBee and similar systems are small, lightweight, and battery powered. Onboard systems track location using GPS and monitor key flight parameters such as wind speed. Under the FAA’s proposed 500-foot ceiling for small UAS, such a system could map several hundred acres in a 30-40 minute flight. With a few batteries you can map a good-sized area, but you are not going to map your whole town (at least not in a day) with one of these small UAS.
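As a rough sanity check on that coverage figure, assuming a ~12 m/s cruise speed and the ~86 m line spacing from the sketch above (both assumed values, not senseFly’s published performance numbers):

```python
# Back-of-envelope coverage estimate: area swept per second is cruise
# speed times flight-line spacing. This ignores turns at the end of each
# flight line, so real-world coverage will be somewhat lower.
cruise_m_s = 12.0   # assumed cruise speed for a small fixed-wing UAS
swath_m = 86.0      # assumed flight-line spacing (see the sketch above)
minutes = 30
area_acres = cruise_m_s * swath_m * minutes * 60 / 4046.86  # m2 per acre
print(f"~{area_acres:.0f} acres in {minutes} minutes")      # ~460 acres
```

which lands comfortably in the "several hundred acres" range.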
Now let’s get to the fun stuff: point clouds! Figure 3 shows a point cloud produced using the eBee workflow. As mentioned before, the point cloud is in LAS format, which opens up traditional workflows developed for LiDAR data. Quick Terrain Modeler, Applied Imagery’s popular terrain analysis software, was used to display the point cloud and generate cross-section profiles of the stream.
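Because the output is standard LAS, it also drops straight into open-source tooling. A minimal sketch of extracting a stream cross-section with the laspy library; the file name and section endpoints are hypothetical:

```python
import laspy
import numpy as np

# Read the UAS point cloud (hypothetical file name)
las = laspy.read("ebee_stream.las")
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

# Cross-section: keep points within half a meter of the line between two
# endpoints, then inspect distance-along-line versus elevation.
p0 = np.array([442100.0, 4923400.0])  # hypothetical UTM endpoints
p1 = np.array([442180.0, 4923460.0])
d = (p1 - p0) / np.linalg.norm(p1 - p0)
rel = np.column_stack([x, y]) - p0
along = rel @ d                                  # distance along the line
across = np.abs(rel @ np.array([-d[1], d[0]]))   # perpendicular offset
sel = across < 0.5
profile = np.column_stack([along[sel], z[sel]])  # the cross-section profile
print(f"{sel.sum()} points in profile")
```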
The point cloud looks stunning, but there are some obvious differences when compared to LiDAR. There are data voids caused by shadows, and there are no points under the tree canopy. The points are not as data rich as LiDAR points (e.g., there is no return information), but they are colorized automatically from the imagery, and, unlike LiDAR, there are points on the water.
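These differences are visible directly in the LAS dimensions. A quick check with laspy (hypothetical file name; the single-return assumption reflects how photogrammetric packages typically populate the field):

```python
import laspy
import numpy as np

las = laspy.read("ebee_stream.las")  # hypothetical file name

# Photogrammetric output carries RGB dimensions populated from the imagery.
print(list(las.point_format.dimension_names))

# LiDAR files carry meaningful multi-return information; photogrammetric
# clouds typically record a single "return" per point.
print(np.unique(np.asarray(las.return_number)))  # expect just [1]
```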
There are some obvious errors, but given that Figure 3 shows only one-sixth of a dataset gathered in a 30-minute flight, followed by two hours of post-processing with little user intervention, the results are impressive. Getting a point cloud for a couple of miles of river the same day you fly would have been unheard of a few years ago.
How do photogrammetric point clouds stack up against LiDAR? Figures 4a and 4b show a UAS photogrammetric point cloud and a traditional LiDAR point cloud, from a manned fixed-wing collect processed to USGS QL2 specifications, for the same area. If you are having second thoughts about whether the point clouds really cover the same area, you are not to blame: the traffic circle seen in the UAS point cloud was constructed after the LiDAR was acquired.
This gets to one of the chief advantages of UAS: the ability to gather high-resolution topographic data when you need it, at a low cost. The UAS point cloud obviously has a higher point density, averaging nearly 50 points per square meter, compared to nearly 3 points per square meter for the LiDAR data. Point density is, of course, not a definitive measure of quality. Even though ground control points (GCPs) were not used, the UAS data were within half a meter of the LiDAR data horizontally, but the absolute vertical difference exceeded 50 meters.
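Density figures like these can be approximated straight from the LAS headers with laspy; a minimal sketch (file names are hypothetical):

```python
import laspy

def density_per_m2(path):
    """Average point density over a LAS file's bounding box. Crude: the
    bounding box overstates the area of an irregular footprint, so this
    slightly underestimates the true density."""
    las = laspy.read(path)
    mins, maxs = las.header.mins, las.header.maxs
    area = (maxs[0] - mins[0]) * (maxs[1] - mins[1])
    return las.header.point_count / area

print(density_per_m2("uas_photogrammetric.las"))  # ~50 pts/m2 in our data
print(density_per_m2("manned_lidar_ql2.las"))     # ~3 pts/m2
```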
Adding GCPs narrowed the differences to tens of centimeters, both horizontally and vertically. GCPs do add another layer of complexity to UAS data collection, however, and in disaster response laying out GCPs might not be feasible due to time or safety constraints. Fortunately, for a number of use cases relative vertical measurements are all that is needed.
With this in mind, we measured a number of buildings that remained consistent between the two datasets (Figures 5a and 5b). Differences in the height measurements were always less than 30 centimeters, indicating that rapid, accurate relative vertical measurements are possible using photogrammetric UAS workflows.
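One simple way to make such a relative measurement is to difference the elevation of points on a roof against points on the adjacent ground, in each dataset. A minimal sketch; the file names and coordinates are hypothetical, and the approach assumes points actually exist near both sample locations:

```python
import laspy
import numpy as np

def height_above_ground(path, roof_xy, ground_xy, radius=1.0):
    """Relative building height: median elevation of points near a roof
    location minus median elevation of points near a nearby ground
    location. Medians resist outliers; radius is in horizontal units (m)."""
    las = laspy.read(path)
    xy = np.column_stack([np.asarray(las.x), np.asarray(las.y)])
    z = np.asarray(las.z)

    def median_z(pt):
        near = np.linalg.norm(xy - np.asarray(pt), axis=1) < radius
        return np.median(z[near])  # assumes points exist near pt

    return median_z(roof_xy) - median_z(ground_xy)

roof, ground = (442150.0, 4923420.0), (442160.0, 4923410.0)  # hypothetical
h_uas = height_above_ground("uas_photogrammetric.las", roof, ground)
h_lidar = height_above_ground("manned_lidar_ql2.las", roof, ground)
print(f"UAS {h_uas:.2f} m, LiDAR {h_lidar:.2f} m, "
      f"diff {abs(h_uas - h_lidar):.2f} m")
```

Because the roof and ground samples come from the same flight, any systematic vertical offset cancels in the difference, which is why relative heights can agree closely even when absolute elevations are tens of meters apart.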
The title of this article was slightly provocative, and one could be tempted to slide into the LiDAR-versus-photogrammetry debate. Photogrammetric point clouds are certainly not LiDAR point clouds, but when factors such as cost, safety, timeliness, and acquisition area are considered, they might offer a superior solution. I believe that photogrammetric UAS will offer attractive solutions for many mapping and survey projects for years to come. Recent advances, such as the eBee RTK, which yields data with horizontal and vertical accuracies of less than 5 centimeters, will further blur the lines between LiDAR and photogrammetric point clouds.
Disclaimer: The views, opinions, findings and conclusions reflected in this presentation are the responsibility of the authors only and do not represent the official policy or position of the USDOT/OST-R, or any State or other entity.
Jarlath O’Neil-Dunne is the Director of the University of Vermont Spatial Analysis Laboratory. He specializes in solutions that provide actionable information from high-resolution remotely sensed data.