A 1.946Mb PDF of this article as it appeared in the magazine complete with images is available by clicking HERE
The year was 1981. The Dodgers had taken the World Series from the Yankees in Game Six at Yankee Stadium. The Go-Go's topped the Billboard charts with "Our Lips Are Sealed," and I became an aerial photographer. My father had decided that my summer was better spent working for him than lying around on the beach…and I kinda owed him for driving his van through a fence and into a cabbage patch while learning to drive. His business was Pacific Aerographics, and I was now employee number 3.
It actually was a great job and I was hooked after seeing my first stereo pair of Disneyland using a pocket stereoscope.
So I learned the profession flying up and down the coast of California, inland and occasionally over into Arizona and Nevada. Years later, after gaining experience as a surveyor in the Army and my father's passing in an accident, I was offered an opportunity to learn photogrammetry from a client of ours, Rattray and Associates. Tom Rattray, the owner, sent me off to Cal State Fresno for a six-week course sponsored by one of the equipment manufacturers, WILD, which made analytical plotters for photogrammetry.
Photogrammetry is fascinating to me. The ASPRS defines it as "the art, science, and technology of obtaining reliable information about physical objects and the environment through processes of recording, measuring and interpreting photographic images and patterns of recorded radiant electromagnetic energy and other phenomena." Without all the fluff, it's accurate measurements from imagery.
What made it really interesting to me was the use of mathematical principles, formulas and processes to yield a visible three-dimensional result. Until then my experience with math in the real world revolved around turning angles, measuring heights and recording data in a notebook. Surveying in the Army was a great opportunity, but it did not provide the visual connection that one side of my brain craves. It did, however, lay the groundwork for my understanding of and interest in photogrammetry.
At that time aerial mapping cameras had to be calibrated to ensure all errors from lens distortion were accounted for. The USGS gave them an Area Weighted Average Resolution, or AWAR, rating–the higher the better. After film processing, diapositives (positive images made from the negatives) were created on glass plates and, later, Mylar film. This was done to minimize stretching and warping due to heat, misuse and time. Error in both the camera system and the resultant image processing had to be minimized and accounted for.
The images were then placed in optical-mechanical, or analog, plotters used to reconstruct the physical geometry with which they were captured. In Image 2 you can see the glass plates above and the rods coming down to the platen that the operator uses to draw the projected information. It is a good example of this mechanical reconstruction. In the 1970s, analytical plotters such as the one shown in Image 3 combined early computing and more refined optics to shrink things down quite a bit.
Later, in the 1990s, computing power and quality scanners allowed us to move to the digital photogrammetric workstations, aka softcopy workstations, we now use.
Through all of this advancement, nothing really changed except the size and space requirements of the equipment needed to reconstruct the physical geometry of the images at the time of acquisition.
In parallel, advances in computing power led to the development of software to do some of the heavy lifting. Software could calculate interior orientations and relative orientations…then do bundle adjustments to correct flight lines and blocks of imagery. The amount of time and human effort needed to take imagery to map-ready stereo models was drastically reduced. Taking this leap allowed the map compiler to concentrate on the extraction of information.
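At the heart of all of those orientation calculations is the collinearity condition: the perspective center of the camera, a point on the image and the corresponding point on the ground all lie on one straight line. As a rough illustration only–not any particular package's implementation, and with made-up values for focal length, flying height and a perfectly level camera–the projection of a ground point into the photo plane can be sketched like this:

```python
import numpy as np

# Collinearity condition, minimal sketch. All values below are assumptions
# for illustration, not from any real project.
f = 0.153                            # focal length in meters (a common 153 mm mapping camera)
X0 = np.array([0.0, 0.0, 1000.0])    # assumed perspective center: flying height 1000 m
R = np.eye(3)                        # assumed perfectly level flight: no pitch/roll/heading

def ground_to_image(X, X0=X0, R=R, f=f):
    """Project a ground point into photo coordinates via the collinearity equations."""
    d = R @ (X - X0)                 # vector from perspective center, in the camera frame
    x = -f * d[0] / d[2]             # photo x coordinate (meters)
    y = -f * d[1] / d[2]             # photo y coordinate (meters)
    return x, y

# A ground point 100 m east and 50 m north of nadir, at sea level:
print(ground_to_image(np.array([100.0, 50.0, 0.0])))
```

Bundle adjustment is, in essence, a least-squares solve over many such equations at once, tuning the camera positions and attitudes until all the projections agree with the measured image points.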
"Y2K!" was going to be the end of the world. To quote Bill Murray in Ghostbusters "Human sacrifice! Dogs and cats, living together! Mass hysteria!"– nothing happened. Well not "nothing" we , began to see the widespread incorporation of airborne GPS to aid in positioning which reduced the amount of ground control needed and Inertial Measurement Units to help solve for the pitch, roll and heading of the aircraft during flight. Using these navigation tools it is possible to reduce some of the equations needed to reconstruct the aircraft and camera position. Remember while computers existed, huge processing workhorses did not and anything that could be done to reduce processing workload resulted in time and cost savings.
Large format digital aerial mapping cameras were introduced around 2000 and became widely accepted in 2005. This brought about a big change in our profession. The large format made digital cameras comparable to traditional analog cameras. Resolution became comparable as well. Rather than the single lens of analog cameras, we were introduced to multiple-lens configurations–3-, 4-, 5-, and 6-lens combinations–or pushbroom sensors that rely heavily on GPS and the IMU. The UltraCAM D could capture 86 megapixels per exposure in 2003. In 2009 the UltraCAM XP increased that to 196 megapixels. While analog mapping cameras continue to operate in large numbers, they are no longer produced.
LiDAR also really caught on in the 2000s and began to give photogrammetry a run for its money, at least for the creation of surface models. Back then it was meters per point, not the points per meter we have today. Elise MacPherson, one of the original data processors at Optech and former Director of Business Development at Airborne 1, recalls in 2004 having to make 5 or more passes over an area to achieve 8 points per square meter with the equipment they owned. Today's airborne LiDAR systems are capable of far greater density than that in a single pass.
Of course altitude, speed and beam width all affect point density. With this greater density came the ability to extract features similar to what we do with aerial imagery and stereo models, only with point clouds, in what Lewis Graham of GeoCue described as "LiDARGrammetry".
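That multi-pass figure is easy to sanity-check with the usual back-of-the-envelope density formula: points per square meter is roughly the pulse rate divided by swath width times ground speed. The numbers below are illustrative assumptions, not the actual specifications of any Optech system:

```python
import math

# Back-of-the-envelope LiDAR point density (all numbers assumed for illustration):
#   density (pts/m^2 per pass) ~ pulse_rate / (swath_width * ground_speed)
pulse_rate = 30_000      # pulses per second (assumed, in the range of mid-2000s units)
swath_width = 300.0      # meters on the ground (assumed)
ground_speed = 60.0      # meters per second (assumed)

density_per_pass = pulse_rate / (swath_width * ground_speed)
passes_for_8 = math.ceil(8 / density_per_pass)
print(density_per_pass, passes_for_8)   # ~1.67 pts/m^2 per pass, so 5 passes for 8 pts/m^2
```

The same formula shows why altitude and speed matter: fly lower (narrower swath) or slower and the density per pass climbs; modern pulse rates in the hundreds of kilohertz are what make single-pass high density possible.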
Thanks to advances in computing hardware, to LiDAR and the processing algorithms developed to take points and create surface models, and to some outside-the-box thinking from within and outside our profession, we are seeing a renaissance in photogrammetry. After all, as Dr. Ricardo Passini of BAE Systems once pointed out to me in the early 2000s, pixels (from scanned or digital imagery) are just points, and with enough computing power we can solve for the location of all of them.
At the time we were a little short on computing power, but within a couple of years BAE released NGate, a tool that did just this. NGate is the predecessor of tools such as Pix4D and other image-based point cloud rendering tools. Software developers such as Pix4D are taking photogrammetry to the masses with simple, user-friendly processing and workflows bundled with many small UAS systems. Output can be orthophotos, image-based point clouds and now even NDVI-classified data. What it took was a spark to bring development toward this pixel-based mapping. Those working in close-range photogrammetry and 3D modeling really took this on and made it happen. Rendering and modeling with imagery has really taken off.
All of this is being done without the use of GPS or IMU data. Original image positions are now calculated from the positions and overlap of the other images. Basically, the software is back-calculating the original camera positions.
Doing all of this without the benefit of GPS and an IMU is not new to those of us in photogrammetry, because that's how it was done prior to their invention. Getting into the math is not the point of this article…only that it was done that way then, and with today's computing power it can be done much more quickly and accurately now.
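To give a flavor of that math without dwelling on it: once the relative orientation of two overlapping photos has been recovered, each new ground point is located by forward intersection of the two image rays. Here is a minimal linear-triangulation sketch with made-up camera parameters; production software adds lens distortion modeling and rigorous least-squares refinement on top of this:

```python
import numpy as np

# Assumed interior orientation: focal length and principal point in pixels.
f, cx, cy = 1000.0, 500.0, 500.0
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])

# Two assumed camera stations: the second offset one unit along the baseline.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    """Project a homogeneous 3D point into pixel coordinates."""
    x = P @ X
    return x[:2] / x[2]

def triangulate(P1, P2, x1, x2):
    """Forward intersection of two image rays by linear triangulation (DLT)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Simulate a point seen in both photos, then recover its 3D position.
X_true = np.array([0.2, 0.3, 5.0, 1.0])
x1, x2 = project(P1, X_true), project(P2, X_true)
print(triangulate(P1, P2, x1, x2))   # recovers approximately [0.2, 0.3, 5.0]
```

With thousands of tie points matched automatically across overlapping frames, the same machinery scales up to solving for every camera station and every pixel at once.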
This is one of the reasons we are seeing this renaissance of photogrammetry. The other is a new acquisition platform–the Unmanned Aerial System (UAS). Instead of walking around an object or structure and taking tons of photos that need to be registered, then turned into a surface and rendered, the camera can now easily and affordably be maneuvered above and around objects giving greater freedom of coverage through the use of a UAS.
So we are coming full circle with photogrammetry, and while it may not be nearly as hands-on, the principles remain the same. Computing power and software do the heavy lifting, generating products that combine the best of photogrammetry–imagery, orthos, and so on–as well as point clouds similar to LiDAR's. With the introduction of UASs into our profession we will likely see even more use of photogrammetry as a means of producing data and deliverables without the added cost, weight and complexity of hardware such as LiDAR and IMUs.
A UAS suitable for mapping arguably can cost as little as $1,200, the DJI Phantom being an example. The Phantom, although low priced, should not be considered a toy. It has full waypoint capability, a 14-megapixel camera with a live feed for viewing what you are capturing, and GPS positioning. With 20 minutes of flight capability, it may not be practical for traditional mapping projects, but you will see it used for modeling and mapping very small areas. There are other offerings in the multirotor space that are fully capable of small to medium scale mapping projects.
Then there are the delta-winged systems, such as the SenseFly eBee and Trimble's Gatewing product, capable of fully autonomous flight. You program in the path, make sure the area is clear and let it go do its thing. These cost a bit more, but are certainly "out of the box ready" to do some serious mapping. The DJI, the SenseFly and the Gatewing all weigh 5 lbs. or less. They all land on their belly, either softly or in a skid.
Riegl and Velodyne offer LiDAR solutions for UASs, but the added cost and weight of a LiDAR unit, high-end GPS and an IMU remain prohibitive for most early entrants into this market. In addition there is the "pucker factor." How confident are you that your UAS isn't going to descend abruptly and jar or completely destroy this delicate equipment? A $20,000 UAS might be an acceptable risk. A UAS with $40,000 to $200,000 of LiDAR sensor hanging beneath it is a whole different story. Simultaneous Localization and Mapping (SLAM) technology is a promising alternative to the traditional IMU, and it may help usher the next wave of LiDAR sensors into the small UAS space.
Of course all of this is moot at the moment while we wait for the FAA’s ruling on UAS integration later this year. However, the desire for an alternative to expensive aircraft mobilization costs as well as new needs based on the low flying capabilities of UAS will push this technology along regardless. It will also help drive reduction in costs and miniaturization of LiDAR and IMU technology, at least until someone other than the government can affordably and legally fly a large UAS.
Eric Andelin, CP, GISP, brings 28 years of experience to the mapping profession. He is the manager of Geospatial Business Development for Surveying And Mapping, Inc. (SAM, Inc.) in Austin, TX. Andelin also serves on the MAPPS Board of Directors and on the MAPPS Transportation and Infrastructure Committee.