Tech Report: Elon Musk is Right—Lidar is a Crutch

Elon Musk has been right about a lot of things. That the web would enable new forms of payment (PayPal). That the rocket-launch industry was ripe for disruption (SpaceX). That the combination of falling hardware prices and innovative financing could open up many more roofs to solar energy (SolarCity). That there was a market for electric cars (Tesla). Heck, he may even be right that electric skates hurtling through tunnels can slice the Gordian knot of metropolitan traffic jams (The Boring Company).

Musk was also right when he called lidar a “crutch” for self-driving cars. Where he’s not right, though, is that the technology’s crutchitude is somehow a reason not to employ lidar—which uses lasers to scan the world around it, identify objects, and characterize surroundings to centimeter accuracy—as a fundamental sensor on self-driving cars. Musk and almost everyone else in the business recognize self-driving cars and trucks as the future of automotive transportation. Every major automaker, not to mention the likes of Apple, Uber, and Google spinoff Waymo, has an autonomous vehicle program going. Billions of dollars a year are pouring into dozens of efforts involving thousands of wickedly smart people around the world.
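
How does a lidar get to centimeter accuracy? It times the round trip of a laser pulse, and light covers about thirty centimeters per nanosecond. Here is a back-of-the-envelope sketch in Python; the numbers are illustrative, not the specs of any particular sensor.

```python
# A rough look at lidar ranging, with illustrative numbers:
# the sensor times a laser pulse's round trip, so range = c * t / 2.

C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a target, given the pulse's round-trip time."""
    return C * t_seconds / 2.0

# A return arriving ~334 nanoseconds after the pulse left means a target
# roughly 50 meters away.
print(f"{range_from_round_trip(333.6e-9):.2f} m")  # -> 50.01 m

# Centimeter accuracy demands picosecond-class timing: one centimeter of
# range corresponds to about 67 picoseconds of round-trip time.
print(f"{2 * 0.01 / C * 1e12:.0f} ps per cm")  # -> 67 ps
```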

These systems build on decades of work in academia and corporate research labs. The biggest challenge with vehicle autonomy has been, and remains, developing computer brains capable of handling whatever side streets, dirt roads, parking lots, six-lane highways, roundabouts, construction zones, railroad crossings, and so on throw at them (Amish buggies, kids leaping out from between cars in pursuit of bouncing balls, snowstorms on country roads, idiots darting across four lanes to exit ramps, school-bus drop-offs…). The second-biggest hurdle—and one inextricably enmeshed with the computer-brain challenge—has been to make sure the computer brains are acting on accurate representations of the world they’re attempting to navigate.

That’s where sensors come in. Cameras can tell a red light from a green one, see the difference between a UPS truck and an ambulance, and capture the dancing lights of a police car parked on the shoulder. Radar bounces microwaves off objects near and far, day and night. Lidar does that, too, but using light with wavelengths thousands of times shorter than those of microwaves. Shorter wavelengths mean sharper resolution. So where radar can tell you something’s there, lidar can tell you if a surprise moving across the road is a trash bag caught in the wind or a boy on a Big Wheel.
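
To put rough numbers on that claim: diffraction limits how tightly any sensor can focus its beam, to roughly the wavelength divided by the aperture. The sketch below assumes typical automotive figures (a 77 GHz radar and a 905 nm lidar, each with a 5-centimeter aperture); the specific values are assumptions for illustration, not numbers from this article.

```python
# Back-of-the-envelope diffraction math behind "shorter wavelengths mean
# sharper resolution." Beam divergence is roughly wavelength / aperture,
# so the spot size at range R scales directly with wavelength. The 77 GHz
# and 905 nm figures are typical automotive values, assumed here.

C = 299_792_458.0  # speed of light, m/s

def spot_size_m(wavelength_m: float, aperture_m: float, range_m: float) -> float:
    """Approximate diffraction-limited spot diameter at a given range."""
    divergence_rad = wavelength_m / aperture_m  # small-angle approximation
    return divergence_rad * range_m

APERTURE = 0.05  # a 5 cm antenna/optic for both sensors
RANGE = 50.0     # meters

radar_wavelength = C / 77e9  # 77 GHz automotive radar -> ~3.9 mm
lidar_wavelength = 905e-9    # 905 nm near-infrared lidar

print(f"radar spot at 50 m: {spot_size_m(radar_wavelength, APERTURE, RANGE):.1f} m")
print(f"lidar spot at 50 m: {spot_size_m(lidar_wavelength, APERTURE, RANGE) * 1000:.2f} mm")
```

Real beams are wider than this diffraction floor, but the gap of three to four orders of magnitude is what separates “something is there” from “that’s a kid on a Big Wheel.”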

In daylight, cameras can do that, too, but not so much in the dark. That’s why the autonomous vehicle development world has by and large settled on sensor fusion: a combination of lidar, radar, and cameras in which the strengths of one sensor compensate for the weaknesses of the others (the sketch after this paragraph puts rough numbers on the idea). Make no mistake, however: of these increasingly fused sensors, lidar has been the most vital in moving self-driving transportation forward. Consider the difference between the first DARPA Grand Challenge in 2004, in which there were no finishers despite a relatively straightforward desert course, and the 2007 DARPA Urban Challenge, which featured the first commercial Velodyne lidars. The majority of the 11 finalists finished a far more difficult and chaotic course. The computers had become faster and the software, better, certainly, yet the new lidar was decisive.
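
What does “strengths compensating for weaknesses” look like in practice? One standard approach, and the heart of a Kalman filter update, is to weight each sensor’s estimate inversely to its noise. Below is a minimal sketch; the noise figures are invented for illustration, not published specs.

```python
# A minimal sketch of sensor fusion's core idea: weight each sensor's
# range estimate inversely to its variance (the inverse-variance weighted
# average at the heart of a Kalman filter update). All noise figures
# below are invented for illustration.

def fuse(estimates):
    """Inverse-variance weighted average of (value, variance) pairs."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    variance = 1.0 / sum(weights)
    return value, variance

# Range to an obstacle at night: the camera's depth estimate degrades in
# the dark, so its variance balloons and the fused answer leans on the
# lidar and radar instead.
lidar  = (42.10, 0.02 ** 2)  # centimeter-class ranging
radar  = (42.60, 0.50 ** 2)  # coarse range, but works in rain and fog
camera = (40.00, 5.00 ** 2)  # poor depth from a dark image

value, variance = fuse([lidar, radar, camera])
print(f"fused range: {value:.2f} m (std dev {variance ** 0.5:.3f} m)")
```

At night the camera’s weight collapses, the fused estimate leans almost entirely on the lidar, and the answer stays good to a couple of centimeters.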

The issue with lidar has been cost. Prices are falling, but outfitting a mass-market vehicle would still run to thousands of dollars. Lidar makers say the price will come down as manufacturing scales up, but automotive lidar will never be headlight-cheap. This would be a problem if everybody were buying their own self-driving car. But autonomous vehicles will mostly end up in fleets zipping around doing pickups and drop-offs at all hours. For many, owning a car to commute will make as much sense as owning a cell tower to scroll Instagram. We’ll just subscribe and let the automotive equivalent of Sprint or Verizon deal with the infrastructure. If this seems far-fetched, consider that Uber has spent more than a billion dollars on autonomous vehicle research because it envisions itself as an automated, driverless service. It’s all but certain that, in the not-so-distant future, the human Uber driver will seem as archaic as a Netflix DVD in a Tyvek mailer.

Musk is right, therefore, that lidar is a crutch. But so are Google (brain crutch), contact lenses (eye crutch), and, indeed, crutches (crutch-crutch). Crutches are assistive, enabling technologies. Perhaps one day cameras and radar alone will be enough for safe self-driving. But lidar has been a fundamental enabler of the rapid advance of autonomous vehicle technology, and it will make self-driving cars and trucks safer than they’d otherwise be. It’s the sort of crutch I’ll want to lean on when my daughters—and, maybe one day, my grandchildren—ride off in a car with no driver.

Todd Neff is the author of The Laser That’s Changing the World [1], a history of lidar.

[1] Neff, T., 2018. The Laser That’s Changing the World: The Amazing Stories behind Lidar from 3D Mapping to Self-Driving Cars, Prometheus Books, Amherst, New York, 314 pp.