AEye Redefines the Three “Rs” of LiDAR – Rate, Resolution, and Range

Company’s Intelligent Sensor Extends Metrics for Evaluating Automotive LiDAR System Performance

Pleasanton, CA – July 18, 2019 – In a new white paper released today, artificial perception pioneer AEye proposes newly extended metrics for evaluating advanced LiDAR system performance. Industry leaders recognize that the conventional metrics of frame rate, angular resolution, and detection range no longer, on their own, adequately measure a sensor's effectiveness in solving the real-world use cases that underlie autonomous driving. In response, AEye has proposed three new extended metrics for LiDAR evaluation: intra-frame object revisit rate, instantaneous enhanced resolution, and object classification range. The AEye white paper describes these capabilities and why they matter within the context of real-world automotive applications.

“Current metrics used for evaluating LiDAR systems designed for autonomous driving often fail to adequately address how a system will perform in real-world conditions,” said AEye co-founder and CEO, Luis Dussan. “These extended metrics are more apropos to measuring advanced LiDAR performance, and are key to evaluating systems that will solve the most challenging use cases.”

First-generation LiDAR sensors passively search a scene and detect objects using scan patterns that are fixed in both time and space, with no ability to enhance performance with a faster revisit or to apply extra resolution to high-interest areas like the road surface or intersections. A new class of advanced solid-state LiDAR sensors enables intelligent information capture that expands the capabilities of LiDAR, moving from passive "search," or detection, of objects to active search and, in many cases, to the real-time acquisition and classification of object attributes. This makes perception and path planning software safer and more effective.

Extended Metric #1: From Frame Rate to Object Revisit Rate
It is universally accepted that a single interrogation point, or shot, does not deliver enough confidence to verify a hazard. Therefore, passive LiDAR systems need multiple interrogations/detects on the same object or position over multiple frames to validate an object. New, intelligent LiDAR systems, such as AEye’s iDAR™, can revisit an object within the same frame. These agile systems can accelerate the revisit rate by allowing for intelligent shot scheduling within a frame, with the ability to interrogate an object or position multiple times within a conventional frame.

In addition, existing LiDAR systems are limited by the physics of fixed laser pulse energy, fixed dwell time, and fixed scan patterns. Next-generation systems such as iDAR are software-definable by perception, path, and motion planning modules, so they can dynamically adjust their data collection approach to best fit those modules' needs. Therefore, Object Revisit Rate, the time between two shots at the same point or set of points, is a more important and relevant metric than Frame Rate alone.
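The distinction between the two metrics can be illustrated with a back-of-the-envelope sketch. All numbers below are hypothetical, chosen only to show the arithmetic; they are not AEye specifications:

```python
# Hypothetical comparison of frame rate vs. object revisit rate.
# A fixed-pattern scanner can only revisit a point once per frame,
# while an agile scanner can schedule extra shots on the same object
# within a single frame, shortening the worst-case revisit interval.

def fixed_pattern_revisit_s(frame_rate_hz: float) -> float:
    """Revisit interval when a point is interrogated once per frame."""
    return 1.0 / frame_rate_hz

def agile_revisit_s(frame_rate_hz: float, extra_shots_per_frame: int) -> float:
    """Revisit interval when N extra shots on the object are scheduled
    within each frame (illustrative model: shots evenly spaced)."""
    return 1.0 / (frame_rate_hz * (1 + extra_shots_per_frame))

FRAME_RATE_HZ = 10  # illustrative full-frame scan rate
print(fixed_pattern_revisit_s(FRAME_RATE_HZ))   # 0.1 s between looks
print(agile_revisit_s(FRAME_RATE_HZ, 4))        # 0.02 s between looks
```

At the same nominal 10 Hz frame rate, the agile scanner in this sketch sees the object five times as often, which is exactly what the frame-rate metric fails to capture.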

Extended Metric #2: From Angular Resolution to Instantaneous (Angular) Resolution
The assumption behind the use of resolution as a conventional LiDAR metric is that the entire field of view will be scanned with a constant pattern and uniform power. However, AEye’s iDAR technology, based on advanced robotic vision paradigms like those utilized in missile defense systems, was developed to break this assumption. Agile LiDAR systems enable a dynamic change in both temporal and spatial sampling density within a region of interest, creating instantaneous resolution. These regions of interest can be fixed at design time, triggered by specific conditions, or dynamically generated at run-time.
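A simple shot-budget sketch shows why instantaneous resolution differs from static angular resolution. The budget, field of view, and region-of-interest (ROI) sizes below are assumed round numbers for illustration, not vendor figures:

```python
# Illustrative sketch: trading a fixed per-frame shot budget between a
# uniform background scan and a denser region of interest (ROI).
import math

def angular_spacing_deg(shots: int, area_deg2: float) -> float:
    """Approximate point-to-point spacing for shots spread evenly
    over a solid angle of the given size (small-angle approximation)."""
    return math.sqrt(area_deg2 / shots)

SHOT_BUDGET = 100_000        # hypothetical shots available per frame
FOV_DEG2 = 120 * 25          # hypothetical 120 x 25 degree field of view

# Uniform scan: every part of the scene gets the same spacing.
uniform = angular_spacing_deg(SHOT_BUDGET, FOV_DEG2)

# Agile scan: divert 30% of the budget into a 10 x 5 degree ROI.
roi = angular_spacing_deg(int(SHOT_BUDGET * 0.3), 10 * 5)

print(f"uniform spacing: {uniform:.3f} deg")   # ~0.17 deg everywhere
print(f"ROI spacing:     {roi:.3f} deg")       # ~0.04 deg where it matters
```

With the same laser power and shot count, the agile scan in this sketch is roughly four times finer inside the region of interest, which is the "resolution on demand" the metric is meant to capture.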

“Laser power is a valuable commodity. LiDAR systems need to be able to focus their defined laser power on objects that matter,” said Allan Steinhardt, Chief Scientist at AEye. “Therefore, it is beneficial to measure how much more resolution can be applied on demand to key objects, in addition to merely measuring static angular resolution over a fixed pattern. If you are not intelligently scanning, you are either oversampling or undersampling the majority of a scene, wasting precious power with no gain in information value.”

Extended Metric #3: From Detection Range to Classification Range
The traditional metric of detection range may work for simple applications, but for autonomy the more critical performance measurement is classification range. It has generally been assumed that LiDAR manufacturers need not know or care how the domain controller classifies objects, or how long it takes; in practice, this assumption can add latency and leave the vehicle vulnerable to dangerous situations. The more classification attributes a sensor can provide, the faster the perception system can confirm and classify. Measuring classification range, in addition to detection range, provides a better assessment of an automotive LiDAR’s capabilities, since it eliminates unknowns in the perception stack and pinpoints salient information faster.
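The gap between the two ranges follows from simple geometry: a target subtends fewer laser returns as range grows, and classifying a shape needs more returns than merely detecting a presence. The resolution, target size, and return thresholds below are assumed for illustration only:

```python
# Illustrative sketch: why classification range is shorter than
# detection range at a fixed angular resolution.
import math

def points_across(target_m: float, range_m: float, res_deg: float) -> int:
    """Laser returns spanning a target of the given width at the given
    range (small-angle approximation)."""
    subtended_deg = math.degrees(target_m / range_m)
    return int(subtended_deg / res_deg)

RES_DEG = 0.1        # assumed static angular resolution
PEDESTRIAN_M = 0.5   # assumed target width

# Detection may only need a single return; classification typically
# requires enough returns to resolve shape (>= 5 here, an assumed threshold).
for r in (50, 100, 200, 300):
    n = points_across(PEDESTRIAN_M, r, RES_DEG)
    verdict = "classify" if n >= 5 else "detect" if n >= 1 else "miss"
    print(f"{r:3d} m: {n} returns -> {verdict}")
```

In this sketch the sensor can still detect the target well past 200 m, but only classify it out to about 50 m; quoting detection range alone hides that difference.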

Unlike first-generation LiDAR sensors, AEye’s iDAR is an integrated, responsive perception system that mimics the way the human visual cortex focuses on and evaluates potential driving hazards. Using a distributed architecture and edge processing, iDAR dynamically tracks objects of interest while always critically assessing general surroundings. Its software-configurable hardware enables vehicle control system software to selectively customize data collection in real-time, while edge processing reduces control loop latency. By combining software-definability, artificial intelligence, and feedback loops with smart, agile sensors, iDAR is able to capture more intelligent information with less data, faster, for optimal performance and safety.

AEye’s iDAR system is uniquely architected to scale from modular ADAS solutions to fully integrated mobility/robot-taxi implementations. In order to deliver automotive-grade ADAS solutions at scale, AEye has partnered with top Tier 1 global automotive suppliers such as Hella, LG Electronics, and Aisin to design and manufacture best-in-class ADAS systems for global automakers. In addition, the company is engaged in pilots with more than a dozen undisclosed OEMs and mobility companies.

“To create an effective sensing system, two things matter most – the quality of the data and the speed at which you can make it actionable,” said AEye Co-Founder and SVP of Engineering, Barry Behnken. “Performance metrics matter because they determine how designers and engineers approach problem-solving. These extended metrics help the industry focus on what matters most.”

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


It is once again a “pure BMW product” that the supervisory board of the Bavarian manufacturer has chosen to take the lead of the group, while the challenges accumulate with the slowdown in the global market and expensive investments related to the car from the future. Oliver Zipse will succeed Harald Krüger on August 16th.… Continue reading After Harald Krüger, Oliver Zipse takes over driving at BMW