LIDAR Magazine

Improving Airport Safety with Obscurant-Penetrating Lidar


It’s the airport equivalent of looking for a needle in a haystack, and the stakes are incredibly high. Get it wrong and there’s a risk of serious accidents, with people injured or killed, possibly millions of dollars of damage to aircraft or airport facilities, and untold indirect costs for affected airlines.

No wonder airports are turning to space-age technology for help.

In June, the Technische Universität Dresden (TU Dresden) in Germany selected Neptec Technologies Corporation’s OPAL-360 3D laser scanners, which are based on technologies originally developed for the International Space Station and the U.S. Space Shuttle, for an airport tarmac safety project. The research project, led by Professor Dr. Hartmut Fricke, Dean of the Faculty of Transportation and Traffic Sciences at TU Dresden, is looking at innovative ways to improve the safety of airport operations with a real-time 3D point cloud surveillance and visualization system for the airport apron controller.

The apron, or ramp, of an airport is where aircraft park and are refueled, loaded with cargo and boarded by passengers. It’s an extremely congested and highly dynamic area, shared by many different objects and participants that vary widely in size and behavior, all concentrated in a very limited space and operating in all types of weather. The risk of an accident, whether between two aircraft, between an aircraft and a ground vehicle, or caused by foreign object debris (FOD)1, is high. So, too, are the potential costs.

The Flight Safety Foundation estimated in 2007 that ramp accidents were costing major airlines at least US$10 billion every year in direct and indirect costs, and injuring about 243,000 people.2 Meanwhile, Boeing has estimated that FOD causes at least US$4 billion in damage a year, although a more recent analysis by consulting group Insight SRI suggests the figure could be considerably higher.3 It’s easy to see why the U.S. National Transportation Safety Board has included airport surface operations on its 2013 most-wanted list4 of transportation industry improvements. The Federal Aviation Administration has likewise placed improving airport surface operations among the top four priorities of its NextGen program.5

Apron management is usually performed by the airport ground or apron controller, who relies on a direct line-of-sight to the apron, sometimes enhanced by video cameras or high-precision radar. The controller’s view, however, is sensitive to weather and lighting conditions, while radar suffers from shadowing effects, multipath propagation and unwanted reflections.

The researchers at TU Dresden propose to use 3D point clouds and real-time processing to automatically classify and track objects on the apron. A key requirement of any such system is the ability to generate these high-resolution 3D point clouds in real time without significant gaps in the data. Multi-sensor integration and a range of 200m are also needed to cover the airport’s core zones, where the widest range of safety-critical activities usually takes place (such as aircraft turnaround, aircraft movements and apron taxiing). Conventional 360-degree LiDAR sensors targeted at the autonomous car market do not have this level of performance, but the Neptec OPAL-360 scanner selected by TU Dresden does.
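To make the multi-sensor integration requirement concrete, the following minimal Python sketch (using NumPy, with hypothetical sensor poses and randomly generated point clouds) merges scans from two scanners into a common apron frame and crops the result to a 200m zone of interest. It is illustrative only, not code from Neptec or TU Dresden.

```python
import numpy as np

def to_apron_frame(points, rotation, translation):
    """Transform an N x 3 array of sensor-frame points into the common apron frame."""
    return points @ rotation.T + translation

def merge_scans(scans, max_range=200.0):
    """Stack per-sensor scans (already in the apron frame) and keep only
    returns within the horizontal zone of interest around the apron origin."""
    merged = np.vstack(scans)
    horizontal_range = np.linalg.norm(merged[:, :2], axis=1)
    return merged[horizontal_range <= max_range]

# Hypothetical example: two scanners mounted 25 m up, 300 m apart.
rng = np.random.default_rng(0)
scan_a = rng.uniform(-150, 150, size=(1000, 3))            # sensor A, sensor frame
scan_b = rng.uniform(-150, 150, size=(1000, 3))            # sensor B, sensor frame
pose_a = (np.eye(3), np.array([0.0, 0.0, 25.0]))           # (rotation, translation)
pose_b = (np.eye(3), np.array([300.0, 0.0, 25.0]))

cloud = merge_scans([to_apron_frame(scan_a, *pose_a),
                     to_apron_frame(scan_b, *pose_b)])
print(cloud.shape)
```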

Automated airport surveillance systems also need to be able to operate in harsh environments, extreme temperatures and poor visibility caused by all types of weather conditions, including snow, rain, fog and dust.6 While other vision systems using conventional video cameras and laser scanners cannot penetrate very far in these obscurants, the OPAL-360 scanner can.

OPAL ("Obscurant Penetrating Auto-synchronous LiDAR") is a dust-penetrating technology originally developed for helicopters landing in the desert where dust or sand thrown up by the rotors can quickly reduce pilot visibility to zero, but OPAL also works for other obscurants such as fog, rain, snow and smoke. The OPAL uses a patented detection method based on LiDAR waveform and advanced temporal and spatial filtering and is the only true "see through dust" LiDAR technology in the market today that operates in real-time with no post-processing.

As a rule of thumb, the OPAL sensor’s performance in obscurants typically corresponds to a penetration distance of two to three times the visibility range of the naked eye within the area of interest. The advantage of the sensor over the eye or a video camera is that it provides 3D data that can then be used for higher-level processing, such as measuring the dimensions of an object, identifying changes in an environment and tracking an object with high precision based on its 3D shape. These advantages are key for the apron controller, who needs to be able to identify potential hazards that could result in either the collision of aircraft with vehicles or pedestrians, or damage caused by FOD left on the tarmac.
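As a simple illustration of that kind of higher-level processing, the hypothetical Python sketch below derives an object’s dimensions from an axis-aligned bounding box around its points, and applies the two-to-three-times rule of thumb to a naked-eye visibility figure. The object size and visibility value are invented for the example.

```python
import numpy as np

def object_dimensions(points):
    """Approximate length, width and height of a detected object from an
    axis-aligned bounding box around its 3D points."""
    extents = points.max(axis=0) - points.min(axis=0)
    return dict(zip(("length", "width", "height"), extents.round(2)))

def estimated_penetration(visibility_m):
    """Apply the article's rule of thumb: obscurant penetration of roughly
    two to three times the naked-eye visibility range."""
    return 2.0 * visibility_m, 3.0 * visibility_m

# Hypothetical baggage-cart-sized cluster of points (metres).
rng = np.random.default_rng(2)
cart = rng.uniform([0.0, 0.0, 0.0], [3.0, 1.5, 1.8], size=(400, 3))
print(object_dimensions(cart))
print(estimated_penetration(40.0))   # e.g. 40 m visibility in fog -> (80.0, 120.0)
```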

The TU Dresden research project is developing methods to provide the apron controller with a 3D visualization of the apron along with alerts of potential hazards using real-time 3D LiDAR data.

In a first step, the data collected will be compared to a 3D map of the apron surface and associated terminal buildings in order to detect the 3D changes seen in the current scan. In a second step, the detected 3D changes will be assigned to a class. This is done by comparing the 3D characteristics of the extracted changes with a knowledge database containing dimension and contour information for objects typical of the area of interest (AOI), such as vehicles and aircraft. For each detected object entering the AOI, a timeline is initiated and the object is tracked until it leaves the AOI.

The surveillance concept envisaged for FOD detection will, in part, use a 3D LiDAR scanner mounted on a mast or on the roof of a terminal building. A major criterion for attaining a high probability of FOD detection in the AOI is a thorough analysis of the optimal field of view: a LiDAR can only detect objects that are within its line-of-sight, so the field of view should be chosen to minimize shadowing, including shadowing caused by aircraft.
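A minimal sketch of the two-step detection, classification and tracking concept described above, assuming NumPy, SciPy and scikit-learn and an invented knowledge database of object dimensions, might look like the following. It is an illustration of the described approach, not the project’s actual software; DBSCAN clustering stands in for whatever segmentation the researchers use, and tracking across scans is only indicated in a comment.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

# Hypothetical knowledge database: nominal length/width/height in metres.
KNOWLEDGE_DB = {
    "baggage cart": np.array([3.0, 1.5, 1.8]),
    "fuel truck": np.array([9.0, 2.5, 3.5]),
    "narrow-body aircraft": np.array([38.0, 34.0, 12.0]),
}

def detect_changes(scan, reference_map, threshold=0.3):
    """Step 1: keep points farther than `threshold` metres from the reference
    3D map of the empty apron and terminal buildings."""
    distances, _ = cKDTree(reference_map).query(scan, k=1)
    return scan[distances > threshold]

def classify(cluster_points):
    """Step 2: compare the cluster's bounding-box dimensions with the
    knowledge database and return the closest class."""
    dims = np.sort(cluster_points.max(axis=0) - cluster_points.min(axis=0))[::-1]
    best_name, best_err = "unknown / possible FOD", np.inf
    for name, nominal in KNOWLEDGE_DB.items():
        err = np.linalg.norm(dims - np.sort(nominal)[::-1])
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err < 5.0 else "unknown / possible FOD"

def process_scan(scan, reference_map, timestamp):
    """One surveillance cycle: change detection, clustering, classification.
    A full system would also associate these detections with those from
    earlier scans to maintain a per-object timeline (tracking)."""
    changed = detect_changes(scan, reference_map)
    if len(changed) == 0:
        return []
    labels = DBSCAN(eps=1.5, min_samples=10).fit_predict(changed)
    detections = []
    for label in set(labels) - {-1}:              # -1 marks DBSCAN noise
        cluster = changed[labels == label]
        detections.append({"time": timestamp,
                           "class": classify(cluster),
                           "centroid": cluster.mean(axis=0).round(1)})
    return detections

# Hypothetical demo: a flat apron reference map plus one cart-sized object.
rng = np.random.default_rng(3)
reference = rng.uniform(0, 200, size=(5000, 3)) * np.array([1.0, 1.0, 0.0])
scan = np.vstack([reference,
                  rng.uniform([50, 50, 0], [53, 51.5, 1.8], size=(300, 3))])
print(process_scan(scan, reference, timestamp=0.0))
```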

TU Dresden selected Neptec’s OPAL-360 sensor because of its large panoramic field of view, its longer range options and, in particular, its non-overlapping scan pattern, which, unlike conventional 360° laser scanners designed for autonomous vehicles, avoids creating "blind spots" when the scanner is stationary.
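The "blind spot" issue can be seen with a toy simulation: a conventional spinning lidar with a fixed set of beam elevations only ever revisits those elevations when the platform is stationary, whereas a non-repeating pattern progressively fills in the field of view. The Python sketch below uses an invented rosette-style pattern purely for illustration; it is not a model of the actual OPAL-360 scan geometry.

```python
import numpy as np

def coverage(elevations_deg, fov=(-15.0, 15.0), bin_deg=0.5):
    """Fraction of elevation bins inside the field of view that receive
    at least one return."""
    bins = np.arange(fov[0], fov[1] + bin_deg, bin_deg)
    hits, _ = np.histogram(elevations_deg, bins=bins)
    return np.count_nonzero(hits) / len(hits)

t = np.linspace(0.0, 10.0, 200_000)          # 10 s of samples

# Conventional spinning lidar: 16 fixed beam elevations, repeated every rotation,
# so a stationary sensor only ever samples those 16 rings.
fixed_beams = np.linspace(-15.0, 15.0, 16)
spinning = np.tile(fixed_beams, len(t) // 16)

# Invented rosette-style pattern: elevation driven by two incommensurate
# frequencies, so successive sweeps do not retrace themselves and the
# field of view fills in over time.
rosette = 15.0 * np.sin(2 * np.pi * 7.3 * t) * np.cos(2 * np.pi * 1.1 * t)

print("stationary spinning lidar coverage:", round(coverage(spinning), 2))
print("stationary non-repeating coverage: ", round(coverage(rosette), 2))
```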

The importance of having automated scanning systems for FOD cannot be overstated. According to Insight SRI, airports conducting only mandatory visual inspections of runways tend to find one piece of FOD on the runways every 60-70 days, while those that use automated systems find one piece every two days.7 That’s because FOD moves, particularly if it’s blown about by the wash from a jet engine, so it’s essential that airports are able to locate FOD in real time, track it and remove it before it has a chance to cause an accident or injury. And they need to be able to do that at any time, in any weather.

That’s what the TU Dresden researchers will be testing at Dresden airport, using a sensor suite including the Neptec OPAL-360 3D LiDAR. It’s another example of how 3D laser scanners are increasingly finding new applications beyond traditional survey and mapping. And it’s another example of how Neptec Technologies is adapting mission-critical space technologies for terrestrial applications.

Endnotes
1 "FOD is any foreign object that does not belong on the runway, taxiway, or ramp area. FOD can cause damage to aircraft, and in rare instances, cause an accident. Typical FOD items are aircraft parts, tire fragments, mechanics’ tools, nails, luggage parts, broken pavement and stones." ("Fact Sheet Foreign Object Debris (FOD)" Federal Aviation Administration, November 15, 2013)
2 "Defusing the Ramp" by Mark Lacagnina, Aerosafety World Magazine, May 2007, pp. 20-24
3 "Runway Safety: FOD, Birds, and the Case for Automated Scanning" by Iain McCreary, Insight SRI, 2010, pp. 146-157
4 www.ntsb.gov/safety/mwl1_2012.html
5 "The Four New Priorities of NextGen" by Woodrow Bellamy III, Avionics Today, June 5, 2014 www.aviationtoday.com/av/ nextgen/The-Four-New-Priorities-of-NextGen_82336.html#.U8A7GyfvjFI
6 "Airport Foreign Object Debris (FOD) Detection Equipment" Advisory Circular AC No: 150/5220-24, Section 3.3(a) (20), Federal Aviation Administration, September 30, 2009 www.faa.gov/ documentLibrary/media/Advisory_ Circular/150_5210_24.pdf
7 "Runway Safety: FOD, Birds, and the Case for Automated Scanning" by Iain McCreary, Insight SRI, 2010, pp. 181-183

Michael Jamieson is an Applications Engineer at Neptec Technologies where he supports end-users in the design, development, integration and support of solutions based on Neptec’s real-time 3D vision sensors and 3DRi software.

Michael Dunbar is Director of Business Development for Neptec Technologies. A veteran of several successful startups in the sensors market, he leads Neptec’s business development activities relating to the new OPAL LiDARs and associated 3DRi software tools.

