Self-driving cars are nearly ready for prime time, and so are the laser sensors that help them see the world. Lidar, which builds a 3-D map of a car’s surroundings by firing millions of laser pulses a second and measuring how long they take to bounce back, has been in development since 2005, when Velodyne founder Dave Hall built one for the Darpa Grand Challenge, an autonomous vehicle contest. In the decade-plus since then, if you wanted a lidar for your self-driving car, Velodyne was your only choice.
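The math behind that measurement is simple enough to sketch: distance is the speed of light multiplied by the round-trip time, divided by two. The snippet below is a generic illustration of that time-of-flight arithmetic, not anything from Velodyne’s or Luminar’s actual firmware.

```python
# Generic time-of-flight math behind any lidar (illustrative, not vendor code):
# a pulse goes out, bounces off something, and distance = speed of light * round trip / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the target, in meters, from the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A target 250 meters away sends the pulse back in well under two microseconds,
# which is why the timing electronics have to resolve tiny fractions of a second.
print(distance_from_round_trip(1.668e-6))  # ~250 meters
```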
Yet Velodyne’s one-time monopoly has eroded in recent years, as dozens of lidar startups have sprung up and robocar makers have gone their own way. Google’s sister company Waymo put years and millions of dollars into developing a proprietary system. General Motors bought a lidar startup called Strobe. Argo AI, which is making a robo-driving system for Ford, acquired one called Princeton Lightwave.
The latest challenger is Luminar, the Silicon Valley-based startup that already has a deal with Toyota, plus three more manufacturers it declines to name. Today, Luminar is introducing its newest lidar unit, with a 120-degree field of view (enough to see what’s ahead of the car, though you’d need three of them for a full 360-degree view). After a first production run of just 100 units, it’s ready to start cranking them out by the thousand, more than enough to meet today’s demand. And maybe enough to make self-driving cars cheaper for everybody.
“By the end of this year, we’ll have enough capacity to equip pretty much every autonomous test and development vehicle on the road, globally,” says CEO Austin Russell, who dropped out of Stanford in 2012 when he was 17 years old to make Luminar his full-time gig. “This is no longer being built by optics PhDs in a handcrafted process. This is a proper automotive serial product.”
In its 136,000-square-foot facility in Orlando (an optics industry hub), the company has dropped the build time for a single unit from about a day to eight minutes. In the past year, it has doubled its staff to about 350. It hired Motorola product guru Jason Wojack to head its hardware team. Alejandro Garcia came over from major auto industry supplier Harman to run manufacturing.
Luminar is playing catch-up. Last year, Velodyne opened a “megafactory” to ramp up production and built 10,000 laser sensors. President Marta Hall says it could build a million a year if it wanted to. But the ability to build lots of lidars isn’t enough to win here.
Lidar is a fantastic sensor, more precise than radar and able to work in more conditions than cameras, but it’s way too expensive. Velodyne’s top-shelf unit, which sees in 360 degrees with a 300-meter range, costs about $75,000 apiece. Buying in bulk will drop that cost, but that’s still a hard price tag to bear, even on a fleet vehicle that can amortize it over years of service.
Luminar made the cost question harder by making its lidar’s receiver (the part that acts like your eye’s retina) out of indium gallium arsenide (InGaAs) instead of silicon. Why is this important? Well, to make your lidar “see” farther, you have to fire more powerful pulses of light, strong enough to reach faraway objects and make it all the way back. Most lidars use lasers at the 905-nanometer wavelength. That’s invisible to humans. But if it hits an actual eyeball, like yours, with enough power, it can damage the retina. If you want to fire more powerful pulses (and have your lidar “see” farther) without blinding actual people, you can use the 1550-nanometer wavelength, which is further into the infrared part of the spectrum and thus can’t penetrate a human eyeball.
Which brings us back to silicon. Receivers made of silicon, which is cheap, can’t detect light at the 1550-nanometer wavelength. InGaAs can, but it’s far more expensive. So the industry standard is to use silicon, run at 905 nanometers, and accept that you just can’t send your lasers all that far.
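A back-of-the-envelope calculation shows why silicon tops out short of 1550 nanometers: a detector can only absorb photons carrying at least as much energy as its bandgap, and photon energy in electron volts is roughly 1240 divided by the wavelength in nanometers. The sketch below uses approximate textbook bandgap values, not anything Luminar has disclosed.

```python
# Approximate textbook bandgaps (in electron volts); not Luminar's figures.
BANDGAP_EV = {"silicon": 1.12, "InGaAs": 0.74}

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    """Longest wavelength the material can absorb: lambda (nm) ~ 1240 / bandgap (eV)."""
    return 1240 / bandgap_ev

for material, gap in BANDGAP_EV.items():
    print(f"{material}: detects light out to roughly {cutoff_wavelength_nm(gap):.0f} nm")

# silicon -> ~1107 nm, so it is blind at 1550 nm
# InGaAs  -> ~1676 nm, which comfortably covers 1550 nm
```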
But Russell insisted on the extra power, which meant 1550 nanometers, which meant a receiver made of InGaAs. As a result, he can fire pulses 40 times more powerful than what his competitors shoot, so his lidar can see extremely dark objects (the kind that absorb 95 percent of the light that hits them) even from 250 meters away. He says no one else’s lidar can see so well at such a distance.
But seriously, InGaAs, as the French say, coûte la peau des fesses (it costs the skin off your backside). A receiver array about the size of a big potato chip can cost tens of thousands of dollars, Russell says. So Luminar built its own. The result, now in its seventh iteration, is about the size of a strawberry seed. (The entire unit, including the laser and accompanying electronics, is about half a foot square and three inches deep.) That includes the chip that calculates, down to a tiny fraction of a second, how long the photon has been out in the world. It costs a piddling $3, obliterating Luminar’s cost concerns while allowing for that extra range and resolution. Russell wouldn’t reveal an exact price for the lidar as a whole, but says his customers are quite pleased. And when they’re finally ready to start offering you rides in their robo-taxis, maybe they won’t have to charge you as much for that trip home from the bar.
Luminar’s R&D team also managed to increase the “dynamic range” of the receiver. Just as your pupils dilate based on light conditions, lidar receivers are tuned to pick up pulses of a certain strength (the farther a pulse travels before bouncing back, the weaker the return). If you set the receiver to look for faint signals and it gets hit by a much stronger pulse, you can fry it. “We have countless blown-up detectors,” Russell says. The current unit can handle a much greater range of pulse strengths, without even a wisp of smoke.
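To get a feel for how wide that range is, consider a deliberately simplified model in which the return signal scales with the target’s reflectivity and falls off with the square of its distance (a common textbook approximation for targets that fill the beam; the numbers below are illustrative, not Luminar’s specs).

```python
import math

def relative_return(reflectivity: float, range_m: float) -> float:
    """Simplified return strength: reflectivity over range squared (arbitrary units)."""
    return reflectivity / range_m ** 2

strong = relative_return(reflectivity=0.9, range_m=5)    # bright object, 5 meters away
weak = relative_return(reflectivity=0.05, range_m=250)   # dark object, 250 meters out

ratio = strong / weak
print(f"strongest / weakest return ~ {ratio:,.0f}x ({10 * math.log10(ratio):.0f} dB)")
# ~45,000x, about 47 dB, all of which one receiver has to survive without frying.
```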
Meanwhile, Luminar’s already working on the next-generation sensor. That one, Russell says, will be affordable enough to put in consumer cars, making the gift of sight little more than a commodity.