Elon Musk Is Right: LiDAR Is a Crutch (Sort of.)

By Luis Dussan
Tesla CEO Elon Musk recently declared that LiDAR is a “crutch” for autonomous vehicle makers. The comment sparked headlines and raised eyebrows in the industry. Given that this laser-based sensing technology is at the core of many companies’ self-driving car strategies, his view strikes many as anathema or just plain nuts.
But for the moment, let’s ignore the fact that LiDAR is vital to self-driving cars from GM, Toyota and others. Forget that the most advanced autonomous vehicle projects have focused on developing laser-sensing systems.
Even disregard that the alleged theft of LiDAR secrets was at the heart of the legal battle between Uber and Alphabet’s Waymo. Waymo claimed that LiDAR is essential technology for autonomous vehicles and recently won a settlement worth about $245 million.
The truth is: Mr. Musk is right. Relying solely on LiDAR can steer autonomous vehicle companies into innovation cul-de-sacs.
LiDAR is not enough. Autonomous vehicles require a rapid, accurate and complete perception system. It is a system-level problem that requires a system-level solution.
My agreement with Mr. Musk may seem surprising given that our company, AEye, sees LiDAR as playing a significant role in making driverless cars a commercial reality.
But we too have realized that if autonomous vehicles are ever going to be capable of avoiding accidents and saving lives, LiDAR is not the answer. At least not by itself.
Not THE answer, but part of the answer…
At Tesla, Mr. Musk is forsaking LiDAR for a 2D camera-based vision system. While Mr. Musk is known for disruptive thinking, it is hard to escape the fact that autonomous vehicles move through a 3D world, and that navigating that world successfully requires the seamless integration of 2D and 3D data, precisely mapped to both time and space.
At AEye, we believe LiDAR is the foundation of the solution when it seamlessly integrates with a multi-sensor perception system that is truly intelligent and dynamic. Our research has produced an elegant and multi-dimensional visual processing system modeled after the most effective in existence — the human visual cortex.
In fact, AEye’s initial product, iDAR (Intelligent Detection and Ranging), is a robotic perception system that is more reliable than human vision. LiDAR integrates with a low-light camera, embedded artificial intelligence and at-the-edge processing to enable a car’s vision system to replicate how the human visual cortex quickly interprets a scene.
In short, iDAR enables cars to see like people.
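To make that 2D/3D fusion requirement concrete, here is a minimal Python sketch (not AEye’s actual iDAR pipeline) of one common way to pair the two data types: project a timestamped LiDAR point cloud into a camera frame so that each 3D point carries both range and pixel intensity. The calibration matrices, timestamps and function names below are illustrative assumptions.

```python
# Minimal sketch of 2D/3D sensor fusion: project timestamped LiDAR points into a
# camera image so every 3D point also carries a pixel intensity. Calibration
# values and shapes are illustrative assumptions, not AEye's iDAR design.
import numpy as np

# Assumed pinhole camera intrinsics (focal lengths and principal point, in pixels).
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

# Assumed LiDAR-to-camera extrinsics: rotation R and translation t (meters).
R = np.eye(3)
t = np.array([0.0, -0.1, 0.0])

def fuse(points_xyz, image, stamp_lidar, stamp_camera, max_skew=0.01):
    """Attach a camera intensity to each LiDAR point if the two captures are
    close enough in time; returns an (N, 4) array of x, y, z, intensity."""
    if abs(stamp_lidar - stamp_camera) > max_skew:
        raise ValueError("sensor timestamps too far apart to fuse")

    # Transform points into the camera frame and keep only those in front of it.
    cam = points_xyz @ R.T + t
    in_front = cam[:, 2] > 0.1
    cam = cam[in_front]
    kept = points_xyz[in_front]

    # Perspective projection into pixel coordinates.
    uv = cam @ K.T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    h, w = image.shape[:2]
    in_view = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    intensity = image[v[in_view], u[in_view]].astype(float)
    return np.column_stack([kept[in_view], intensity])

# Toy usage: a random point cloud and a synthetic grayscale frame.
cloud = np.random.uniform([-5, -2, 1], [5, 2, 30], size=(1000, 3))
frame = np.random.randint(0, 255, size=(720, 1280), dtype=np.uint8)
fused = fuse(cloud, frame, stamp_lidar=0.000, stamp_camera=0.004)
print(fused.shape)  # (N, 4): x, y, z, pixel intensity
```

A production system would add motion compensation, sub-pixel interpolation and per-sensor latency modeling, but the core idea is the alignment described above: in space via calibration, and in time via timestamps.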
Why is this the superior approach?
In his skepticism of LiDAR, Mr. Musk has curiously bet on a “camera-mostly” strategy when building a vision system for autonomous Tesla vehicles. He has previously made bold (many say unrealistic) predictions that Tesla would achieve full Level 5 autonomous driving with camera-mostly vision in 2019. Navigant Research, in its annual ranking of self-driving vehicle makers, says this is “unlikely to ever be achievable” and rates Tesla at the back of the pack.
The company’s Autopilot system relies on cameras, some radar, and GPS. It has suffered setbacks due to a split with its camera supplier in 2016 after a fatal accident that investigators have blamed partly on Autopilot. Last month, a Tesla smashed into a firetruck in Culver City, California, and the driver said it was “on autopilot.”
The evidence strongly argues against Mr. Musk’s decision to bet on passive optical image processing systems. Existing 2D image processors and 2D-to-3D image conversion concepts have serious flaws that can only be addressed with massive computing power and, more importantly, algorithms that have not yet been invented and are many years from becoming a reality. This makes the approach too costly, inefficient and cumbersome to achieve Level 5 autonomous driving at commercial scale.
At AEye we know that integrating cameras, agile LiDAR, and AI yields a perception system that is better than the sum of its parts. It surpasses both the human eye and the camera alone, which is essential as long as no machine can replicate the sophistication of the human brain.
In his “crutch” comments, Mr. Musk predicted that LiDAR-based systems will make cars “expensive, ugly and unnecessary,” adding: “I think they will find themselves at a competitive disadvantage.” The truth is that size, weight, power, and cost are decreasing for vehicle navigation grade LiDAR. And they will fall further. AEye, and maybe others, will see to that.
We respect Musk’s innovations and are grateful to him for shedding light on where LiDAR needs to go to reach full autonomy. But in the end, because we see LiDAR as a lever rather than a crutch, we can only give him partial credit for his understanding of the way forward.