Random Points: Fly Me to the Gravel Pile!

A 988Kb PDF of this article as it appeared in the magazine complete with images is available by clicking HERE

My company, GeoCue Corporation, has been heavily investigating the use of small unmanned aerial systems (sUAS) for metric mapping operations. In fact, we will soon be launching a subsidiary company devoted to this business.

In my parlance, an sUAS is an autonomous aerial vehicle with complete sensor payload, fully fueled and coming in at under 1 kg of mass. I am particularly interested in this class of sUAS because (fingers crossed!) I think the Federal Aviation Administration (FAA) will carve out these very light platforms from the more onerous rules likely to be imposed on the more massive platforms. I used to call these "micro" UAVs, but now I see a newly emerging class of UAVs with a mass of just a few dozen grams, fitting in the palm of the hand, which will define this category!

A Styrofoam sUAS in this sub 1 kg mass range will suffer much more damage than the human it might hit in an inadvertent collision–think Nerf plane. Figure 1 depicts an eBee sUAS from senseFly. This "foamie" is a complete aerial data capture system designed for serious mapping operations. It can execute approximately an hour of mapping on a "fuel" load (a charged battery).

We first began investigating these platforms in 2011 as an economical way to obtain certain types of measurements. For me, the most exciting example is stockpile measurements for very localized areas such as gravel quarries and timber yards. In anticipation of this need, we added volumetric analysis tools to our LP360 product line.
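The core of a stockpile volume measurement is simple once you have a gridded surface model of the pile: sum the height of each grid cell above a base surface, scaled by the cell footprint. The sketch below is not the LP360 algorithm–just a minimal illustration of the idea, assuming a flat base plane and a regularly gridded DSM (real piles need a base surface fit to the surrounding toe):

```python
import numpy as np

def stockpile_volume(dsm, base_elevation, cell_size):
    """Estimate stockpile volume (cubic meters) from a gridded DSM.

    dsm: 2D array of surface elevations (meters) over the pile footprint.
    base_elevation: elevation of the surrounding ground; a flat base
        plane is assumed here for simplicity.
    cell_size: grid spacing in meters.
    """
    # Height of each grid column above the base; ignore cells below it.
    heights = np.clip(dsm - base_elevation, 0.0, None)
    # Volume is the sum of column volumes (height * cell area).
    return float(heights.sum() * cell_size ** 2)

# A synthetic 10 m x 10 m "pile" that is 2 m tall everywhere, on a 1 m grid:
dsm = np.full((10, 10), 102.0)
print(stockpile_volume(dsm, base_elevation=100.0, cell_size=1.0))  # 200.0
```

In practice the accuracy of the answer is dominated by the accuracy of the point cloud and the choice of base surface, not by this arithmetic.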

The question remained–can a very light sUAS with a non-metric camera produce results with sufficient accuracy to support creating accurate orthomosaics and point clouds? The short answer is an unequivocal yes! The longer answer is, "you have to know what you are doing."

Our industry has yet to name the point clouds that we can obtain by correlating homologous object space points in disparate images. In preparation for a talk I gave in 2012 at the European SPAR conference, Michael Gruber (Microsoft) and I decided to call them "Photo Correlated Digital Surface Models" or PCDSMs. I hope someone else comes up with a much nicer name! A previous Random Points article was devoted to a comparison of PCDSM point clouds to LIDAR data.

In the previous issue of LIDAR News, John Raugust and Michael Olsen discussed the process of obtaining point clouds from multiple, disparate images ("Emerging Technology–Structure from Motion"). Their focus was on the use of open source software for obtaining the Exterior Orientation (EO) of the camera stations and subsequently generating a dense point cloud.

If you are serious about investigating this as an augmentation to your business, I recommend you purchase a copy of PhotoScan by Agisoft (www.agisoft.ru). This $3,500 software package will save you considerable aggravation as compared to trying to "roll your own" from open source. If you are not comfortable purchasing from a Russian web site, you can purchase this software from GeoCue–just send a note to suas@geocue.com.

Collecting and generating data involves the following steps:
1. Plan the mission–this is usually done on-site with mission planning software mated to your sUAS. Site planning rather than office planning is recommended since the sUAS flight plan must accommodate local weather conditions (e.g. the currently prevailing wind).
2. Signalize the project area–place image identifiable ground control point targets and find their precise location using surveying techniques.
3. Upload the mission plan
4. Fly!–Well, it’s not that much fun. You will never collect a serious mapping project with joystick in hand. Sit back, relax and let the onboard autopilot do the work!
5. Load data into your favorite processing software (examples include PhotoScan, Pix4D, Correlator 3D and others)
6. Measure the signalized control (ground control points) in the images
7. Solve for the camera stations–This is analogous to classic Exterior Orientation and includes X, Y, Z (position) as well as Pitch, Yaw and Roll (attitude). In photogrammetry, we would call this the "AT" step. This is done via matching interest points in overlapping images and then saying "what possible 3D object points could have produced these 2D image points?" The result of this process is a "sparse" point cloud. These are actually the surviving matching points, projected to object space.
8. Solve for intrinsic camera parameters such as focal length, principal point offset and so forth. In practice, this can occur simultaneously with step 7
9. Create a dense point cloud. This is not a densification of the sparse point cloud (which is typically discarded). The dense point cloud is generated by an entirely different algorithm such as Semi-Global Matching (SGM). Most processing software will output the dense point cloud in LAS format with color values (RGB).
10. Create derivative products such as a "true" ortho. Note that, under the hood, this step is quite involved, requiring seamline generation and tonal balancing.
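The heart of step 7 is asking "what possible 3D object points could have produced these 2D image points?" For a single matched point seen from two resolved camera stations, that question has a classic linear answer–triangulation by the Direct Linear Transform. The toy example below (all camera numbers are hypothetical, and real software solves for thousands of points and the cameras simultaneously) shows the mechanics:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (intrinsics times [R|t]).
    x1, x2: (u, v) pixel observations of the same object point.
    Returns the least-squares 3D point in object space.
    """
    # Each observation contributes two homogeneous linear equations.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The null vector of A (last right singular vector) is the point.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point through camera P to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras 1 m apart, both looking down the Z axis:
K = np.array([[1000.0, 0, 500], [0, 1000.0, 500], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 5.0])  # an object point 5 m from the cameras
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_est, X_true))  # True
```

With noiseless synthetic observations the point is recovered exactly; with real image measurements the same least-squares machinery absorbs the noise, which is why redundancy (many overlapping images) matters so much.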

There are a number of commercially available software packages that combine all of the above steps into an integrated process. Examples include:
SimActive: Correlator3D for UAS
Pix4D: Pix4D Mapper
Agisoft: PhotoScan

For those with much more time on their hands than money, a solution can be assembled with various "open source" software programs. This is still in its infancy so you will have to have a very high skill level and a tremendous amount of patience to "roll your own." Also note that the use of available "open source" is a bit risky since numerous patents and non-standard license agreements abound.

Of course, the question at the end of the day is, "how good is the solution?" When everything goes right, it is beyond fabulous! We have produced results using a $250 Canon Elph camera on a sub $1,000 quad copter that defy belief, with accuracies on the order of 4 cm horizontal and 6 cm vertical over a 250 acre area. On the other hand, we have had models delivered to us for analysis that exhibit phenomena such as model tilt, resulting in gross vertical error. Keep in mind that even if a dense point cloud and orthomosaic look absolutely stunning, they may exhibit gross relative and absolute error.
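The only way to make accuracy claims like the ones above is to compare the model against independently surveyed check points. The conventional summary statistics are horizontal and vertical root mean square error (RMSE), computed as sketched below (the four check-point residuals are made-up numbers for illustration):

```python
import numpy as np

def rmse_xyz(measured, surveyed):
    """Horizontal and vertical RMSE of model points vs. surveyed truth.

    measured, surveyed: (N, 3) arrays of X, Y, Z coordinates in meters.
    Returns (horizontal_rmse, vertical_rmse) in meters.
    """
    d = np.asarray(measured, float) - np.asarray(surveyed, float)
    # Horizontal error combines the X and Y components of each residual.
    horiz = np.sqrt(np.mean(d[:, 0] ** 2 + d[:, 1] ** 2))
    vert = np.sqrt(np.mean(d[:, 2] ** 2))
    return float(horiz), float(vert)

# Hypothetical residuals on four check points (meters):
measured = np.array([[0.03, 0.00, 0.05],
                     [0.00, 0.04, -0.06],
                     [-0.03, 0.00, 0.07],
                     [0.00, -0.04, -0.06]])
surveyed = np.zeros((4, 3))
h, v = rmse_xyz(measured, surveyed)
print(round(h, 3), round(v, 3))  # 0.035 0.06
```

Note that check points must be withheld from the adjustment–points used as control in step 6 will always agree suspiciously well with the model.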

We have found that the software packages designed to perform the steps necessary to generate output products vary widely in both their performance and results. The fastest applications make use of NVIDIA Graphics Processing Units (GPUs) to accelerate processing. One of the applications that we have tested can generate a relative model using no a priori image locations at all, while another application needs fairly accurate a priori X, Y, Z (from the GPS receiver on board the aircraft) as well as a seed elevation surface.

A flaw of all of the packages that we have tested is the lack of good orientation results analysis tools. Several of the packages have nice graphic displays that indicate tie lines between images and "error" residuals between a priori position estimates (what we call "given EO" in photogrammetry) and the resolved EO. Such plots and numbers are not very useful in determining the overall accuracy of the results. I think this lack of diagnostic information may be an artifact of the influence of robotics and computer vision on the algorithms; in computer vision, relative accuracy is often much more important than absolute.

There is also a significant difference in the quality of the final dense point cloud generation software. The extreme outlier points are easily identified in the resultant model. A more subtle consideration is the amount of noise in the data. Noise essentially means that the correlator is "drifting" above and below the true surface. Figure 2 illustrates the use of LP360 to analyze noise. I cut a series of cross-sections across a flat surface of the model (Map View in Figure 2).

There are two layers of LAS data in the project, one represented by red and the other by green. The LAS data are turned off in all views for clarity. In the Profile View of Figure 2 there is one pair of cross sections. Note the very high noise in the red profile (generated by software package #2) as compared to the much lower noise exhibited by the model generated by software package #1 (the green profile). It does little good to have absolute error of less than 5 cm vertical if the noise excursions exceed 12 cm (as is the case in our example).
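The cross-section comparison above can be reduced to numbers: fit a trend line through a profile cut over a nominally flat surface, then look at the spread of the residuals. The sketch below does exactly that on two synthetic profiles (the 1 cm and 6 cm noise levels are made-up stand-ins for the green and red packages):

```python
import numpy as np

def profile_noise(x, z):
    """Noise statistics for a cross-section over a nominally flat surface.

    x: along-profile distance (meters); z: elevations (meters).
    Fits a line (allowing a gentle slope) and returns the standard
    deviation and peak-to-peak excursion of the residuals.
    """
    slope, intercept = np.polyfit(x, z, 1)       # remove any real slope
    resid = z - (slope * x + intercept)
    return float(resid.std()), float(np.ptp(resid))

# Synthetic profiles: "package #1" with ~1 cm noise, "package #2" with ~6 cm:
rng = np.random.default_rng(0)
x = np.linspace(0.0, 50.0, 500)
z_quiet = 100.0 + 0.002 * x + rng.normal(0.0, 0.01, x.size)
z_noisy = 100.0 + 0.002 * x + rng.normal(0.0, 0.06, x.size)
print(profile_noise(x, z_quiet)[0] < profile_noise(x, z_noisy)[0])  # True
```

The peak-to-peak number is the one to watch: it is the "noise excursion" that can swamp an otherwise excellent absolute accuracy figure.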

The takeaways here are:
sUAS is a game changer. Don’t be caught unprepared.
It is very easy to generate bad data and not know it. Generating very high quality data is not difficult. Just learn how to run the software and, most importantly, how to analyze the results.
For any serious mapping project, you will need ground control points. Hire a professional land surveyor for this part if you are not qualified or do not have one on staff.
There is a big difference in the output quality of the various commercial software tools, and these differences are not necessarily correlated with price. (I have not looked in too much detail at "open source" packages. I don't have that much spare time!)
Recognize that it is technically illegal to fly an sUAS commercially in the United States. The rules are very, very fuzzy on the legality of your kid flying one as a hobby (perfectly legal) and giving you the images for free (the fuzzy part). We do expect the sUAS rules to be in place sometime in 2015. However, the genie is out of the bottle and sUAS will be introducing a big change in local area mapping.

In a future issue of Random Points, we will investigate, in detail, the use of sUAS imagery for performing volumetric analysis. This is a metric analysis area where an sUAS may provide the best possible solution.

Lewis Graham is the President and CTO of GeoCue Corporation. GeoCue is North America’s largest supplier of LIDAR production and workflow tools and consulting services for airborne and mobile laser scanning.
