Real-Time, Full Motion 3D Color LiDAR Imagery

To get you in the right frame of mind, let’s start by considering the following real-world scenarios:

Imagine: major damage from an earthquake or tornado, with visual cues destroyed, roads blocked, buildings toppled, rubble everywhere. The disaster response team needs a georeferenced image of the scene immediately. To that end, a plane is dispatched equipped with an integrated LiDAR, visible and IR camera system capable of relaying multispectral, georeferenced 3D full motion imagery of the situation back to the command center. The command center can immediately direct emergency response personnel to the critical locations via clear routes, and preliminary building and infrastructure safety can be rapidly assessed from abnormal tilts or shifts.

Imagine: a forest fire displacing people who want to know whether their house is still standing or has burned down. Rather than the command center chief repeatedly fielding these questions, interested parties can access a touch-screen kiosk, find their property on a color 3D image of the area, and zoom in, rotate the view or check other areas. Being familiar with the area, they can immediately find their house. This real-time 3D color imagery is provided by a plane equipped with integrated LiDAR and visible cameras transmitting color 3D imagery back to the command center.

Imagine: a military squad receives an order for a quick-turn operation in an area where terrain knowledge is incomplete. To compensate for the lack of information, they send out a UAV equipped with a multispectral pod that includes a real-time LiDAR system. The fused 3D imagery is relayed back in real time, supporting mission planning: determination of ingress/egress routes, lines of fire, obstacle heights and accurate target point locations.

The real-time fusing of multispectral imagery with Flash LiDAR (Light Detection and Ranging) provides a new capability: georeferenced 3D full motion imagery. All of this is possible now with high frame rate Flash LiDAR technology. Linear-mode Flash LiDAR cameras, which operate at 30 fps and provide up to 490,000 points per second from a 128×128 array, are available from Advanced Scientific Concepts (ASC). Other companies, such as Voxtel and Raytheon Vision Systems, are building low noise, large format Flash LiDAR sensors that can be integrated into a camera system.
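
As a quick sanity check on those figures, the arithmetic below (illustrative only, simply multiplying the stated array size by the stated frame rate) shows how a 128×128 array at 30 fps yields roughly 490,000 points per second.

    # Illustrative arithmetic on the figures quoted above (not additional data from the article).
    ARRAY_WIDTH = 128        # pixels
    ARRAY_HEIGHT = 128       # pixels
    FRAME_RATE_HZ = 30       # frames per second

    points_per_frame = ARRAY_WIDTH * ARRAY_HEIGHT          # 16,384 range samples per frame
    points_per_second = points_per_frame * FRAME_RATE_HZ   # 491,520, i.e. "up to 490,000"
    print(points_per_frame, points_per_second)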

These LiDAR sensors typically operate in a first return, multiple return, or gated mode. In first return mode, every pixel or small cluster of pixels independently triggers once the return signal crosses a threshold; that information can be used to determine both range and intensity values. In gated mode, all pixels begin "looking" at a preset range for a return within a range gate, and in this mode the cameras typically capture multiple temporal samples of the return pulse. The current ASC system provides up to 20 range slices. Both ASC and Voxtel anticipate releasing low noise, larger format sensors with more waveform samples early next year. Early testing indicates that the increased sensitivity of the new devices will more than compensate for the increase in pixel count, allowing much higher data rates with the same laser power.
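
To make the two modes concrete, here is a minimal sketch of the underlying range arithmetic, assuming an idealized pixel timer and evenly spaced range slices; the function names and parameters are illustrative, not ASC's interface.

    # Minimal sketch of Flash LiDAR range arithmetic (illustrative assumptions only).
    C = 299_792_458.0  # speed of light, m/s

    def first_return_range(time_of_flight_s: float) -> float:
        """First return mode: convert a round-trip threshold-crossing time (s) to one-way range (m)."""
        return C * time_of_flight_s / 2.0

    def gate_slice_range(gate_start_m: float, slice_index: int, slice_depth_m: float) -> float:
        """Gated mode: range to the center of a given slice within the range gate."""
        return gate_start_m + (slice_index + 0.5) * slice_depth_m

    # Example: a return detected 6.67 microseconds after the laser pulse
    print(first_return_range(6.67e-6))   # ~1000 m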

Ball Aerospace has developed the Total Sight LiDAR system, which integrates the ASC camera with a visible and/or MWIR (mid-wavelength infrared) camera, depending on the mission. The system fuses the context camera’s pixel values (RGB, temperature) with the associated LiDAR camera pixel in real time, providing fused, georeferenced 3D full motion imagery or video (3DFMV) to the user.
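
The article does not spell out the fusion math, but conceptually each LiDAR return is projected into the boresighted, frame-synced context camera and assigned that pixel's value. A minimal sketch under those assumptions (a simple pinhole camera model with placeholder calibration matrices, not Total Sight's actual processing) might look like this:

    import numpy as np

    # Hedged sketch of LiDAR/context-camera fusion: project each LiDAR point into a
    # boresighted, frame-synced context camera (pinhole model) and sample its color.
    # K, R and t are illustrative placeholders, not Total Sight calibration values.

    def colorize_points(points_xyz, rgb_image, K, R, t):
        """Return an array of x, y, z, r, g, b for the points that project inside the image."""
        cam = (R @ points_xyz.T + t.reshape(3, 1)).T          # LiDAR frame -> camera frame
        in_front = cam[:, 2] > 0
        cam = cam[in_front]
        uv = (K @ (cam / cam[:, 2:3]).T).T[:, :2]             # perspective projection
        u, v = np.round(uv[:, 0]).astype(int), np.round(uv[:, 1]).astype(int)
        h, w = rgb_image.shape[:2]
        valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        colors = rgb_image[v[valid], u[valid]]
        return np.hstack([points_xyz[in_front][valid], colors])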

Real-time downlinking of this fused imagery from the plane to the ground using standard RF downlinks has been demonstrated by Ball. This is achievable because the metadata is recorded per frame rather than per point, as in scanning LiDAR systems, greatly reducing the data volume. This capability allows LiDAR data to be disseminated instantly and to serve real-time missions, opening the door to new applications of LiDAR in the field. Total Sight has been successfully tested at altitudes up to 10,000 ft and supports operations on both rotary and fixed wing aircraft. Figure 1 shows the Total Sight sensor suspended underneath a Jet Ranger, with the associated 3U electronics box strapped to a seat in the helicopter.
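
To see why per-frame metadata matters for downlink bandwidth, the rough comparison below uses an assumed record size purely to show the scaling; the numbers are illustrative, not measured Total Sight figures.

    # Rough, illustrative comparison of metadata overhead (assumed record size,
    # not from the article): one pose/time record per frame versus one per point.
    POINTS_PER_FRAME = 128 * 128
    FRAME_RATE_HZ = 30
    METADATA_BYTES = 64      # assumed size of a timestamp + position/attitude record

    per_frame_overhead = METADATA_BYTES * FRAME_RATE_HZ                      # ~1.9 kB/s
    per_point_overhead = METADATA_BYTES * POINTS_PER_FRAME * FRAME_RATE_HZ   # ~31 MB/s
    print(per_frame_overhead, per_point_overhead)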

The value of LiDAR imagery has been established, but with raw LiDAR data being at best a 3D intensity product, an experienced user or analyst is required to process it: coloring by height, classifying features, extracting buildings, and so on. Once actual color is added, however, any user can immediately understand the 3D image and intuitively manipulate it for the information relevant to them.

An example of this is the Android tablet LiDAR viewer developed by Ball. The tablet displays LAS files with embedded color, and the user can manipulate the image as in any touch-screen application: zoom with two fingers, touch one point for its GPS position, touch two points for vector information, color by height, or place the point of view anywhere in the image and pan around to "see" what can be seen. Just like a 3D game. No training required.
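
As an illustration of the kind of two-point vector measurement such a viewer can report, the sketch below computes slant range, horizontal distance and bearing between two selected georeferenced points in a local east-north-up frame; it is not Ball's viewer code.

    import math

    # Illustrative two-point vector measurement (not Ball's implementation).
    def vector_between(p1, p2):
        """p1, p2 are (east, north, up) coordinates in meters."""
        de, dn, du = (p2[i] - p1[i] for i in range(3))
        horizontal = math.hypot(de, dn)
        slant = math.sqrt(de**2 + dn**2 + du**2)
        bearing_deg = math.degrees(math.atan2(de, dn)) % 360.0   # 0 deg = north
        return slant, horizontal, bearing_deg

    print(vector_between((0.0, 0.0, 0.0), (30.0, 40.0, 10.0)))  # (~50.99 m, 50.0 m, ~36.9 deg)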

Imagine: a user in the field needing surrounding 3D imagery. A request is sent back to the command center, which sends out a LAS data file of the immediate area, allowing the user to interrogate a fully georeferenced, color 3D image of the area of concern. Once again, this is achievable because of the real-time capability of the Flash LiDAR system flying nearby.

Due to the high frame rate of the Flash LiDAR sensor, up to 30 fps, and the frame-to-frame overlap at typical flight speeds and altitudes, very high point densities can be achieved, since an object or area on the ground is sampled in multiple frames. As with any camera, the system field of view (FOV) can be changed depending on the ground spatial sampling resolution desired. In addition, the FOV can be panned in both the in-track and cross-track directions to increase the overall field of regard or area coverage. The Ball Total Sight system in its standard configuration has a 3 degree FOV lens and can scan up to 20 degrees cross track and in track while the LiDAR camera continues to run at 30 fps. The LiDAR and visible cameras are frame synced and boresighted, allowing accurate, real-time fusing of the camera data. Each frame of the LiDAR sensor array is fused with the context camera and georeferenced using the internal Applanix POS AV sensor. Frames are accumulated, or stitched, in real time, creating an accurate, color LAS point cloud. The OmniSTAR service is used to produce the most accurate real-time POS measurements.
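
A back-of-the-envelope calculation shows how frame overlap drives point density. The altitude and ground speed below are assumptions chosen only to illustrate the geometry; they are not Total Sight performance figures.

    import math

    # Illustrative nadir-looking footprint and overlap estimate (assumed values).
    ALTITUDE_M = 1000.0        # assumed flight altitude above ground
    FOV_DEG = 3.0              # full field of view, per the standard configuration
    ARRAY_SIZE = 128           # pixels per side
    FRAME_RATE_HZ = 30.0
    GROUND_SPEED_MPS = 50.0    # assumed aircraft ground speed

    footprint_m = 2.0 * ALTITUDE_M * math.tan(math.radians(FOV_DEG / 2.0))  # swath width on the ground
    gsd_m = footprint_m / ARRAY_SIZE                                        # ground sample distance
    advance_per_frame_m = GROUND_SPEED_MPS / FRAME_RATE_HZ
    frames_seeing_a_point = footprint_m / advance_per_frame_m               # overlap factor

    print(f"footprint {footprint_m:.1f} m, GSD {gsd_m:.2f} m, "
          f"each ground point seen in ~{frames_seeing_a_point:.0f} frames")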

Imagery from the Ball Total Sight system is shown in Figures 2 and 3. Figure 2a shows a quarry near Morrison, Colo. All data was collected in a single pass with 20 degrees of cross-track coverage. Note the reflection from the water: because color is fused only where there is a LiDAR range return, the 1.06 µm laser was apparently reflected by some contamination at the water surface. Figure 2b shows a close-up view of the work area. This 3D data immediately provides volume information on the quarry tailings; it could also provide information on water flow for environmental planning.

Figure 3 shows scenes from Denver, Colo.: Figure 3a of the downtown area and 3b of Sports Authority Field. Again, the data was collected in a single pass with 20 degrees of cross-track coverage. The value of fused color in image interpretation is quickly apparent.

To summarize, real-time, full motion 3D color LiDAR is available now, opening multiple opportunities in the area of georeferenced visual communication. These opportunities range from emergency management to civil planning to defense to entertainment. The real-time fusing of context imagery, whether visible, SWIR or M/LWIR, significantly enhances the intelligence value and interpretability of the LiDAR point cloud. And with the ongoing development of lower noise, larger format Flash LiDAR arrays, fused 3D LiDAR imagery will undoubtedly find many new applications.

Note: Videos of the Ball Total Sight system can be viewed at http://www.youtube.com/watch?v=JwoEfJeHWs8 and http://www.youtube.com/watch?v=vJ-rQJQ1qOM

Roy Nelson is the Sr. Business Area Manager for Laser Applications within the Ball Aerospace Tactical Solutions Group and can be reached at rnelson@ball.com.
