
Published on November 28th, 2018 by Michael Barnard

Tesla Autopilot Hits 1 Billion Miles! & Why Tesla Autopilot Is The Top Approach To Autonomy



Editor’s note: Tesla just tweeted that Tesla owners have now driven 1 billion miles on Autopilot.

In honor of the milestone (no pun intended), I’m reposting one of my favorite Autopilot articles of all time, a 2015 article by Mike Barnard. Enjoy.

Tesla recently released its Autopilot mode for its cars. Autopilot embodies a fundamentally different intellectual approach to autonomy than Google’s, and it’s superior.

One of my backgrounds is robotics. I spent a year digging my way through PhD theses from robotics programs around the world as I worked on a startup idea for specific applications of swarm-based robots. We got as far as software architecture, simple simulations, 3D modelling of physical robots, and specific applications that had commercial value. I have some depth here without pretending to be a roboticist, and I’ve continued to pay attention to the field from the outside.

So I feel comfortable in saying that, in general, there are two approaches for robots getting from Point A to Point B.

→ The first is the world map paradigm, in which the robot or a connected system has a complete and detailed map of the world, and a route is planned over that map in advance, accounting for obstacles. Basically, the robot has to think its way past or over every obstacle, which makes for a lot of programming.
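To make the contrast concrete, here is a minimal sketch of world-map planning: a breadth-first search over a pre-built occupancy grid. The grid encoding and the `plan_route` helper are invented for illustration and do not reflect Google’s actual stack.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a pre-built occupancy grid.

    The planner needs the entire map up front: every obstacle must
    already be encoded as a 1 in `grid` before planning begins.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and nxt not in came_from:
                came_from[nxt] = cur
                frontier.append(nxt)
    return None  # unmapped or blocked: the planner simply gives up

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan_route(grid, (0, 0), (2, 0)))  # detours around the wall
```

Note the failure mode: with a stale or missing map, the planner returns nothing at all; it has no way to stumble through.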

Yes, that’s a cat in a shark costume riding a Roomba.

→ The second is the subsumption architecture paradigm, in which a robot is first made so that it can survive environments it will find itself in, then equipped with mechanisms to seek goals. The robot then, without any idea of the map of the world, navigates toward Point B. The robot is robust and can stumble its way through obstacles without any thinking at all. The original Roomba vacuum cleaner was a pure subsumption beast.
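The subsumption idea can be caricatured in a few lines: a fixed priority list of reflexes, each looking only at the current sensor readings, with no map anywhere. The behavior names and thresholds below are made up for illustration, not drawn from any shipping system.

```python
# Higher-priority behaviors subsume lower ones; the first that claims
# control wins. No behavior consults a map, only live sensor readings.
def avoid_collision(sensors):
    if sensors.get("front_distance", 99.0) < 0.5:
        return "brake"
    return None

def stay_in_lane(sensors):
    offset = sensors.get("lane_offset", 0.0)
    if abs(offset) > 0.3:
        return "steer_left" if offset > 0 else "steer_right"
    return None

def seek_goal(sensors):
    return "cruise"  # lowest layer: just keep heading toward the goal

BEHAVIORS = [avoid_collision, stay_in_lane, seek_goal]  # priority order

def control_step(sensors):
    for behavior in BEHAVIORS:
        action = behavior(sensors)
        if action is not None:
            return action

# The survival reflex outranks lane keeping when both fire at once:
print(control_step({"front_distance": 0.3, "lane_offset": 0.4}))  # brake
```

The robot never plans; it just survives each instant, which is exactly how the original Roomba cleaned a room it had never seen.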

Obviously, both paradigms have strengths and limitations, and obviously, at least to me, a combination is the best choice. But it’s worth assessing Tesla’s vs Google’s choices against this divide.

Google is starting from the full world map paradigm. For one of its cars to work, it needs an up-to-date centimetre-scale, 3D model of the entirety of the route it will take. Google’s cars are ridiculously non-robust — by design — and when confronted with something unusual will stop completely. Basically, all intelligence has to be provided by people in the lab writing better software.

Why would Google start with this enormous requirement? In my opinion (without having spoken to any of the principals in the decision), it’s likely because the approach fits their biases and blind spots. Google builds massive data sets and solves problems on top of that data with intelligent algorithms; it doesn’t build real-world objects. And the split I highlighted above between world map and subsumption paradigms is a very real dividing line in robotics academia and research. It was very easy for Google and world-map robotics researchers to find one another and confirm each other’s biases. Others assert that Google is taking a risk-averse approach by leaping straight to Level 4 autonomy, and while I’m sure that’s a component of the decision-making process, I suspect it’s partly a rationalization of those biases. It’s also being proven wrong by the lack of Tesla crashes to date, though it is early days.

To be clear, Google cars can do things Teslas currently can’t, at least under the controlled prototype conditions in which they are tested. They can drive from Point A to Point B in towns and regions that Google has mapped to centimetre scale, which is basically the area south of San Francisco plus a few demo areas. You can’t get in a Tesla, give it an address, and sit back. These are clear performance advantages of the Google model over current Tesla capabilities, and while not trivial, they are enabled by the world map model.

Tesla, on the other hand, is starting with the subsumption model. First, the car is immensely capable of surviving on roads: great acceleration, great deceleration, great lateral turning speed and precision, great collision survivability. Then it is made progressively more capable. All the car needs to drive on the freeway is knowledge of the lane lines and the cars around it; cameras then give it a hint about appropriate speed. It has only a handful of survivability goals: don’t hit cars in front of you, don’t let other cars hit you, stay in your lane, change lanes when requested. Meet those, and it’s safe. Because of its great maneuverability (which is to say, survivability), it can tolerate suboptimal software, since it is better able to get out of the way of bad situations. And it has human backup.

And if that’s where Tesla was stopping, everyone who is pooh-poohing its autonomy would be basically correct. But Tesla isn’t stopping there.

Tesla is leveraging intelligent real-world research assistants to put focused, experienced instincts into its cars. They are called the drivers of the Teslas. Every action the Autopilot makes and every intervention a driver makes is uploaded to the Tesla Cloud, where it’s combined with all of the other decisions cars and drivers are making. And every driver passing along a piece of road is automatically granted the knowledge of what the cars and drivers before them have done. In real time.

So, for example, within a couple of days of downloading, Teslas were already automatically slowing for corners that they took at speed before. And not trying to take confusingly marked offramps. And not exceeding the speed limits in places where the signs are obscured.
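That feedback loop amounts to fleet-wide aggregation of interventions keyed by road segment. The sketch below is purely hypothetical (the report format, segment IDs, and min-speed rule are assumptions, not Tesla’s pipeline), but it shows the shape of the mechanism:

```python
from collections import defaultdict

# Each (segment, speed) pair is a hypothetical report: the speed a car
# was doing when its driver intervened on that stretch of road.
reports = [
    ("I-280:mile42", 72), ("I-280:mile42", 68), ("I-280:mile42", 70),
    ("CA-17:curve9", 55), ("CA-17:curve9", 52),
]

def aggregate_speed_caps(reports):
    """Fold all interventions into one advisory speed cap per segment."""
    caps = defaultdict(lambda: float("inf"))
    for segment, speed in reports:
        caps[segment] = min(caps[segment], speed)
    return dict(caps)

caps = aggregate_speed_caps(reports)
print(caps)  # every car downloading this now slows before CA-17:curve9
```

The point is that each driver’s correction becomes a rule every other car inherits, without anyone in a lab writing it.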

Within a couple of days of Autopilot becoming available, drivers Cannonballed across the USA in under 59 hours, with roughly 96% of the driving done by the car. Under Google’s requirements, they would have had to send out at least two cars: one or more with hyper-accurate mapping capability, then, a day or a week later once the data was integrated, the actual autonomous car. And there would have been no chance of side trips or detours for the Google car; it literally couldn’t drive on a route that wasn’t pre-mapped at centimetre scale. But the Tesla drivers could just go for it.

People are driving Teslas on back roads and city streets with Autopilot, definitely not the optimal, mapped-only situations that others claim Tesla is limited to. And Teslas haven’t hit anything; in fact, they have been recorded avoiding accidents the driver was unaware of. Survivability remains very high.

Tesla cars are driving themselves autonomously in a whole bunch of places where Google cars can’t and won’t be able to for years or possibly decades. That’s because Teslas don’t depend on perfect centimetre scale maps that are up-to-date in order to do anything. Subsumption wins over world maps in an enormous number of real-world situations.

Finally, Teslas do have a world map. It’s called Google Maps. And Tesla is doing more accurate mapping with its sensors for more accurate driving maps. But Teslas don’t require centimetre-scale accuracy in their world map to get around. They are just fine with much coarser-grained maps, which are much easier to build, store, manipulate, and layer with intelligence as needed. These simpler maps combined with subsumption will enable Teslas to drive from Point A to Point B easily. They can already drive to the parkade and return by themselves in controlled environments; the rest is just liability and regulation.
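A combination of coarse maps and subsumption might look like the following sketch, in which a turn-by-turn waypoint list feeds the goal-seeking layer while reactive rules keep priority. The route names and sensor fields are invented for illustration.

```python
# Coarse route: think Google Maps turns, not centimetre-scale geometry.
COARSE_ROUTE = ["hwy_101_north", "exit_24", "main_st", "destination"]

def drive(route, sensor_stream):
    leg = 0
    actions = []
    for sensors in sensor_stream:
        if sensors.get("obstacle"):        # reactive layer: no map needed
            actions.append("evade")
            continue
        if sensors.get("at_waypoint"):     # map layer: advance the goal
            leg += 1
            if leg == len(route):
                actions.append("arrived")
                break
        actions.append("follow:" + route[leg])
    return actions

stream = [{}, {"obstacle": True}, {"at_waypoint": True}, {},
          {"at_waypoint": True}, {"at_waypoint": True}, {"at_waypoint": True}]
print(drive(COARSE_ROUTE, stream))
```

The map only has to be good enough to name the next leg; everything the map doesn’t know about is handled reactively, in the moment.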

The rapid leaps in capability of the Autopilot in just a few days after release should be giving Google serious pause. By the time its software geniuses get the Google car ready for prime time on a large subset of roads, Teslas will be able to literally drive circles around them.


About the Author

Michael Barnard is a C-level technology and strategy consultant who works with startups, existing businesses and investors to identify opportunities for significant bottom line growth in the transforming low-carbon economy. He is editor of The Future is Electric, a Medium publication. He regularly publishes analyses of low-carbon technology and policy on sites including Newsweek, Slate, Forbes, Huffington Post, Quartz, CleanTechnica and RenewEconomy, with some of his work included in textbooks. Third-party articles on his analyses and interviews have been published in dozens of news sites globally and have reached #1 on Reddit Science. Much of his work originates on Quora.com, where Mike has been a Top Writer annually since 2012. He’s available for consulting engagements, speaking engagements and board positions.


Tesla’s Autopilot racks up 1B miles driven

Tesla Inc. says owners of its electric vehicles have driven 1 billion miles using the company’s Autopilot driver-assistance feature, a significant milestone for the automaker, which uses the collected data to improve the software as a competitive advantage.
Tesla, which announced the mark in a Tweet today, has installed Autopilot hardware on every car it’s produced since October 2014. Autopilot is designed for use on highways, but the vehicles are operating under diverse road and weather conditions around the world. The resulting trove of real-world miles acts as a feedback loop to the algorithms that are constantly training the fleet of Tesla vehicles on the road how to behave.
In the race toward full autonomy, not all miles are created equal. There are semi-autonomous as well as fully self-driving ones; real-world versus simulated; and those racked up on highways versus those in trickier urban environments. Chief Executive Officer Elon Musk has promised to demonstrate a fully self-driving cross-country road trip from Los Angeles to New York, but the timeline for when that may happen has continually slipped.
Tesla tells drivers they must keep their hands on the steering wheel and monitor the system at all times, but Autopilot has come under scrutiny from regulators and consumer advocacy groups, including after a fatal crash in March. In May, Musk dismissed the notion that Autopilot users involved in accidents have the mistaken belief that the system is capable of fully-autonomous driving.
Read or Share this story: https://www.detroitnews.com/story/business/autos/mobility/2018/11/28/teslas-autopilot-racks-miles-driven/38633889/

Byton reveals self-driving living-room on wheels, the K-Byte, in LA

Byton K-Byte concept
Byton aims to reinvent the car by turning it into the next-generation iPhone. Other automakers have talked about turning their center screen displays into iPhone-like interfaces, but not the whole car.

That seems to be the goal of Byton, which says it is creating the “next generation smart device,” “spaceship Byton,” “your self-cruising living space.”

In a two-hour press conference streamed from Shanghai in English and Chinese to coincide with its press-conference slot at the LA Auto Show, the startup electric-carmaker laid out plans to bring its first SUV to market late next year, and showed a concept version of its next model, the K-Byte luxury sedan.

READ THIS: Byton begins road testing its electric car in China

Both cars are fully electric, based on the same “skateboard” chassis, with batteries under the floor.

Byton was scant on actual specs, but said the base version of the M-Byte SUV will have about 280 miles of range, while a more expensive trim level with a bigger battery will get about 325 miles. Batteries will be supplied by Chinese manufacturer CATL.

The K-Byte will be a full-size sedan, 195 inches long, on a 118-inch wheelbase (about 2 inches longer than the SUV), making it about the size of a Toyota Avalon. It has small flying buttresses covering the rear window pillars, which Byton says improve aerodynamics to stretch the car's range.


Byton executives made a bigger deal about the cars’ 49-inch “coast-to-coast,” “shared-experience display,” which it displayed in the M-Byte, and which we had a chance to sample at the Consumer Electronics Show in Las Vegas last January. The single, curved screen stretches the width of what would normally be the dashboard. If the car is to be the next-generation iPhone, this screen is the heart of the car.

The interface isn’t that different from the one we sampled, with three sections to the screen, one on the left for driving displays, a center section for navigation and audio, and an entertainment screen for the passenger on the right.

It will offer augmented reality for the driver, virtual reality, a simulator, and video conferencing. Reading and sending emails should be comparatively simple. A wide-angle, digital rear-view camera screen appears at the bottom left behind the steering wheel.


Byton says the screen will comply with all international automotive standards, which means that while driving, either the screen can’t display video or the driver can’t see the section of it that does. The company also says the screen will be coated in silicone for crash safety.

The screen can be reconfigured, with the navigation system, for example, taking up the right two-thirds of the display, complete with, Byton says, “all” points of interest, which will be integrated with navigation and search, much like Google Maps—only larger.

Driving controls are on a separate tablet in the center of the steering wheel, which gets covered by the airbag in an accident to protect the driver. Byton says there are no other controls.

CHECK OUT: Chinese electric-car startup Byton reveals its second concept: a Level 4 self-driving sedan

The idea, however, is for drivers not to have to use the system to drive. Byton says the M-Byte will come with full Level 4 self-driving capability starting in 2020, which will allow the car to drive itself on limited-access highways.


The K-Byte uses a set of lidar sensors integrated on the front and rear of the panoramic glass roof, and retractable lidar sensors that the company calls LiGuards on the front fenders. They deploy when the self-driving system is activated.

Both the self-driving system and the “shared-experience display” can be controlled using gesture commands or voice commands provided by Byton partner Baidu.


The company says it has finished its global factory in Nanjing, China, and has produced the first handful of its planned 100 M-Byte prototypes.

Byton says this “self-cruising living space” will be the new standard of automotive luxury when the M-Byte goes on sale next year. The question we can’t help but ask is: how will this new iPhone on wheels drive?


Top Automotive Industry News for Week of November 5 – November 11, 2018

Here is the most important news associated with the automotive industry identified by the AEA for the week of November 5 – November 11, 2018.

We hope it helps you stay up to speed on the key developments in our industry:

-Automotive Manufacturing News-

First ex-UAW official sentenced in FCA-related scandal; gets 1 year

(Detroit Free Press)

Ford plans construction on Michigan Central Depot by year's end

(Detroit Free Press)

Former Tesla employee charged with embezzling $9.3 million from Elon
Musk's company

(MarketWatch)

Is Toyota Next to Pare Back its Passenger Car Line?

(The Detroit Bureau)

Tesla picks an insider to be chairwoman, fueling doubt Elon Musk will
be reined in

(LA Times)

The gas engine still has a long life to live, Aston Martin CEO says

(CNBC)

VW planning $21K EV to challenge Tesla

(The Detroit News)

VW takes another shot at compact pickup market

(The Detroit News)

White House, California to discuss vehicle emissions rules next week

(autoblog)

Why GM is moving 3,000 workers from Pontiac to Warren

(Detroit Free Press)

-Automotive Evolution News-

8 concept cars that show how technology will dominate the drive of the
future

(CNBC)

Autonomous Cars Face Big Hurdles; They Will Succeed, But When?

(Forbes)

Daimler And Bosch Choose San Jose For Their Silicon Valley Robo-Taxi
Service

(Forbes)

GM's future lineup will run on electricity, drive itself — and fly

(Detroit Free Press)

Mercedes-Benz, Bosch to offer self-driving car rides in San Jose,
California

(USA Today)

Tesla Drivers Report Autopilot Disengaging While Driving Due To
Software Bug

(Forbes)

This Robot Truck Startup May Have An Edge Over Waymo In Bad-Weather
Driving

(Forbes)

Uber rival Taxify says it can grow 100 times bigger in the scooter and
ride-hailing market

(CNBC)

Uber ups its driver perks with 'Pro' program, including free college
education

(USA Today)

-Automotive Retail News-

Better inventory listings, lead management anchor more sales

(Auto Remarketing)

CarGurus Helps Dealerships Solve for Attribution with More Insight

(PR Newswire)

Dealertrack Looks to Speed Car Buying Process

(Auto Finance News)

Jumpstart: Car Buyers Want to Negotiate

(Auto Dealer Monthly)

Lithia and Shift to operate separately and share technology

(Auto Remarketing)

Luxury car owners trade up for American pickups as Ford, GM and Ram
trucks dominate market

(CNBC)

Millennials Spending Big on Cars — With Auto Loans to Match

(The Detroit Bureau)

Used car payments hit record $400 per month as prices top $20,000

(USA Today)

Used-Car Prices Reach 13-Year High in Third Quarter

(Vehicle Remarketing)

Used Vehicle Prices Rising, Pushing Buyers to Look at Leasing

(The Detroit Bureau)

-Automotive Wholesale News-

Compact Van Values Dip at Start of November

(Vehicle Remarketing)

-Automotive Ownership News-

Where your car is most likely to be stolen in every state

(USA Today)

-Automotive Enthusiast News-

Inside the World's Most Valuable Hot Wheels Collection

(Car and Driver)

With millions at stake, car collectors scour Earth for lost classics

(Detroit Free Press)

-Automotive Servicing News-

Mazda to recall 640,000 vehicles globally over diesel engine issue

(Reuters)

Subaru recalls nearly 400K vehicles to fix stalling problems

(Detroit Free Press)

U.S. agency probes 1.7 million GM SUVs over wiper failures

(Reuters)

-General Business & Executive News-

CDK Global Names Brian Krzanich President and Chief Executive Officer

(CDK)

Ford buys electric scooter startup Spin, joining competitors Bird and
Lime

(USA Today)

Harley-Davidson's electric motorcycle signals a big change for the
legendary, but troubled, company

(CNBC)

Tesla’s Booming Model 3 Sales and More This Week In The Future Of Cars

(Wired)

Time Dealer Of The Year

(Automotive News)

VW Considers Investing in Ford-Backed Autonomous Unit Argo

(Bloomberg)

-AEA Reminder-

Did we miss something? Let us know via our Contact Us Page >>. If you have specific important news going public soon that you would like to share with your fellow AEA Members, submit your PR Distribution Request >>.
Have a great week,

Member Services

memberservices@automotiveexecutives.com

Automotive Executives Association

www.automotiveexecutives.com

AEye Advisory Board Profile: Willie Gault

We sat down with each of our Advisory Board Members to ask them why they’re excited about working with AEye…
Willie Gault is a former NFL wide receiver and Olympic athlete. Gault was an All-American at the University of Tennessee from 1979 to 1982 and played in the National Football League for 11 seasons for the Chicago Bears and Los Angeles Raiders. Considered one of the fastest NFL players of all time, Gault was a member of the Chicago Bears team that won Super Bowl XX and was a member of both the summer and winter U.S. Olympic teams. Gault is currently an investor, remains active, and holds several world records in masters track and field.

Q: What in your past experience ultimately drew you to the autonomous vehicle arena?
As a professional athlete, I have always been fascinated and amazed by human perception and the role it plays in athletic performance. The brain’s ability to sense the details in the world around you and then accurately calculate where your body needs to be in space and time is remarkable. I have been curious about how these capabilities might be replicated with technology and artificial intelligence. Recently, I have been tracking the application of artificial intelligence in autonomous vehicles which led me to AEye.

Q: Why AEye?
What AEye is doing aligns with my interests in biomimicry, which uses knowledge of natural processes found in humans, plants, and animals to better inform technology and design. After I found out AEye was pursuing research in this field, I knew I had to be part of it.

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond? How do you see AEye playing a pivotal role in this vision?
I live in Southern California where traffic has a major impact on quality of life. Autonomous vehicles will not only improve safety and efficiency on the roads, but will greatly improve quality of life around the world. I would like to see this technology adopted quickly and widely. However, one of the barriers to its adoption is cost. I believe that AEye’s iDAR system can be manufactured at tremendous scale, efficiently, and at a price point that encourages rapid adoption.


Elon Musk Is Right: LiDAR Is a Crutch (Sort of.)

By Luis Dussan

Tesla founder Elon Musk recently declared that LiDAR is a “crutch” for autonomous vehicle makers. The comment sparked headlines and raised eyebrows in the industry. Given that this vision technology is the core of many companies’ self-driving car strategies, his view strikes many as anathema or just plain nuts.

But for the moment, let’s ignore the fact that LiDAR is vital to self-driving cars from GM, Toyota and others. Forget that the most advanced autonomous vehicle projects have focused on developing laser-sensing systems.

Even disregard that the alleged theft of LiDAR secrets was at the heart of the legal battle between Uber and Alphabet’s Waymo. Waymo claimed that LiDAR is essential technology for autonomous vehicles and recently won a settlement worth about $245 million.

The truth is: Mr. Musk is right. Relying solely on LiDAR can steer autonomous vehicle companies into innovation cul-de-sacs.

LiDAR is not enough. Autonomous vehicles require a rapid, accurate and complete perception system. It is a system-level problem that requires a system-level solution.

My agreement with Mr. Musk may seem surprising given that our company, AEye, sees LiDAR as playing a significant role in making driverless cars a commercial reality.

But we too have realized that if autonomous vehicles are ever going to be capable of avoiding accidents and saving lives, LiDAR is not the answer. At least not by itself.

Not THE answer, but part of the answer…
At Tesla, Mr. Musk is forsaking LiDAR for a 2D camera-based vision system. While Mr. Musk is known for disruptive thinking, it is hard to escape the fact that autonomous vehicles move through a 3D world and successful navigation of that world requires the seamless integration of both 2D and 3D data precisely mapped to both time and space.

At AEye, we believe LiDAR is the foundation of the solution when it seamlessly integrates with a multi-sensor perception system that is truly intelligent and dynamic. Our research has produced an elegant and multi-dimensional visual processing system modeled after the most effective in existence — the human visual cortex.

In fact, AEye’s initial perception system, called iDAR (Intelligent Detection and Ranging), offers a robotic perception system that is more reliable than human vision. LiDAR integrates with a low-light camera, embedded artificial intelligence and at-the-edge processing to enable a car’s vision system to replicate how the human visual cortex quickly interprets a scene.
In short, iDAR enables cars to see like people.
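As a rough illustration of the kind of fusion being described, here is a sketch that projects lidar returns into a camera frame with a pinhole model, so each range can be paired with the camera’s per-pixel classification. The camera parameters, data shapes, and labels are assumptions for the example, not AEye’s actual iDAR design.

```python
import math

FOCAL = 500.0          # assumed focal length, in pixels
CX, CY = 320.0, 240.0  # assumed principal point of a 640x480 image

def project(point):
    """Pinhole projection of a lidar point (x right, y down, z forward)."""
    x, y, z = point
    if z <= 0:
        return None  # behind the camera plane
    return (FOCAL * x / z + CX, FOCAL * y / z + CY)

def fuse(lidar_points, camera_labels):
    """Pair each lidar return's range with the camera label at its pixel."""
    fused = []
    for p in lidar_points:
        uv = project(p)
        if uv is None:
            continue
        pixel = (int(uv[0]), int(uv[1]))
        label = camera_labels.get(pixel, "unknown")
        rng = math.sqrt(sum(c * c for c in p))
        fused.append((label, round(rng, 2)))
    return fused

points = [(0.0, 0.0, 10.0), (2.0, 0.0, 10.0)]
labels = {(320, 240): "car", (420, 240): "cyclist"}
print(fuse(points, labels))  # [('car', 10.0), ('cyclist', 10.2)]
```

The camera alone gives labels without depth; the lidar alone gives depth without labels; the pairing is what makes the scene interpretable.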

Why is this the superior approach?
In his skepticism of LiDAR, Mr. Musk has curiously bet on a “camera-mostly” strategy when building a vision system for autonomous Tesla vehicles. He has previously made bold (many say unrealistic) predictions that Tesla would achieve full Level 5 autonomous driving with camera-mostly vision in 2019. Navigant Research, in their annual ranking of self-driving vehicle makers, says this is “unlikely to ever be achievable” and rates Tesla at the back of the pack.

The company’s Autopilot system relies on cameras, some radar, and GPS. It has suffered setbacks due to a split with its camera supplier in 2016 after a fatal accident that investigators have blamed partly on Autopilot. Last month, a Tesla smashed into a firetruck in Culver City, California, and the driver said it was “on autopilot.”

The evidence strongly argues against Mr. Musk’s decision to bet on passive optical image processing systems. Existing 2D image processors and 2D-to-3D image conversion concepts have serious flaws that can only be addressed with massive computing power and, more importantly, with algorithms that have not yet been invented and are many years away from becoming a reality. This makes the approach too costly, inefficient, and cumbersome to achieve Level 5 autonomous driving at commercial scale.

At AEye we know that integrating cameras, agile LiDAR, and AI yields a perception system that is better than the sum of its parts. It surpasses both the human eye and the camera alone, which is necessary as long as the sophistication of the human brain has not yet been replicated.

In his “crutch” comments, Mr. Musk predicted that LiDAR-based systems will make cars “expensive, ugly and unnecessary,” adding: “I think they will find themselves at a competitive disadvantage.” The truth is that size, weight, power, and cost are decreasing for vehicle navigation grade LiDAR. And they will fall further. AEye, and maybe others, will see to that.

We respect Musk’s innovations and are grateful to him for shedding light on where LiDAR needs to go to reach full autonomy. But in the end, since we see LiDAR as a lever rather than a crutch, we can only give him partial credit for his understanding of the way forward.
