Deconstructing Two Conventional LiDAR Metrics, Part 2

Executive Summary
Conventional metrics for evaluating LiDAR systems designed for autonomous driving are problematic because they often fail to adequately or explicitly address real-world scenarios. Therefore, AEye, the developer of iDAR™ technology, proposes a number of new metrics to better assess the safety and performance of advanced automotive LiDAR sensors.

In Part 1 of this series, two metrics (frame rate and fixed [angular] resolution over a fixed Field-of-View) were discussed in relation to the more meaningful metrics of object revisit rate and instantaneous (angular) resolution. Now in Part 2, we’ll explore the metrics of detection range and velocity, and propose two new corresponding metrics for consideration: object classification range and time to true velocity.

Download “Deconstructing Two Conventional LiDAR Metrics, Part 2” [pdf]

Introduction
How is the effectiveness of an autonomous vehicle’s perception system measured? Performance metrics matter because they ultimately determine how designers and engineers approach problem-solving. Defining problems accurately makes them easier to solve, saving time, money, and resources.

When it comes to measuring how well automotive LiDAR systems perceive the space around them, manufacturers commonly agree that it’s valuable to determine their detection range. To optimize safety, the on-board computer system should detect obstacles as far ahead as possible. The speed with which they can do so theoretically determines whether control systems can plan and perform timely, evasive maneuvers. However, AEye believes that detection range is not the most important measurement in this scenario. Ultimately, it’s the control system’s ability to classify an object (here we refer to low level classification [e.g., blob plus dimensionality]) that enables it to decide on a basic course of action.

What matters most, then, is how quickly an object can be identified and classified, and how quickly a decision can be made about it so that an appropriate response can be calculated. In other words, it is not enough simply to quantify the distance at which a potential object can be detected at the sensor. One must also quantify the latency from the actual event to the sensor detection, plus the latency from the sensor detection to the CPU decision.

Similarly, the conventional metric of velocity has limitations. Today, some lab-prototype frequency modulated continuous wave (FMCW) LiDAR systems can determine the radial velocity of nearby objects by interrogating them continuously for a period of time sufficient to observe a discernible change in position. However, this approach has two disadvantages: 1) the beam must remain locked on a fixed position for a certain period of time, and 2) only velocity in the radial direction can be discerned. Lateral velocity must still be calculated with the standard position-update method. Exploring these disadvantages will illustrate why, to achieve the highest degree of safety, time to true velocity is a much more useful metric. In other words, how long does it take a system to determine the velocity, in any direction, of a newly identified or newly appearing object?
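To make this metric concrete, the sketch below (our own illustration, not a description of any particular product) shows how a full two-dimensional velocity is recovered from two successive detections of the same object, and how little of that motion a radial-only measurement captures for a laterally moving target. The positions, revisit interval, and speeds are assumed values chosen only for illustration.

```python
import math

def velocity_from_revisits(p0, p1, dt):
    """Estimate (vx, vy) from two (x, y) positions of the same object, dt seconds apart."""
    return (p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt

def radial_component(position, velocity):
    """Project the velocity onto the line of sight from a sensor at the origin."""
    r = math.hypot(position[0], position[1])
    return (position[0] * velocity[0] + position[1] * velocity[1]) / r

# Assumed scenario: a vehicle 50 m ahead, cutting across the lane at 15 m/s.
p0, p1, dt = (0.0, 50.0), (0.15, 50.0), 0.01   # the object is revisited 10 ms later
vx, vy = velocity_from_revisits(p0, p1, dt)
print(f"true speed:            {math.hypot(vx, vy):5.1f} m/s")              # ~15 m/s
print(f"radial-only component: {radial_component(p1, (vx, vy)):5.2f} m/s")  # ~0 m/s
```

In this framing, the time to true velocity is essentially the revisit interval plus any processing delay: the sooner a sensor can revisit a newly appearing object, the sooner the full velocity vector is available to the planner.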

Both object classification range and time to true velocity are more relevant metrics for assessing what a LiDAR system can and should achieve in tomorrow’s autonomous vehicles. In this white paper, we examine how these new metrics better measure and define the problems solved by more advanced LiDAR systems, such as AEye’s iDAR (Intelligent Detection and Ranging).

Conventional Metric #1: Detection Range
A single point detection, where the LiDAR registers one detect on a new object or person entering the scene, is indistinguishable from noise. Therefore, we will use a common industry definition of detection that requires persistence across adjacent shots within a frame and/or across frames. For example, we might require 5 detects on an object per frame (5 points at the same range) and/or from frame to frame (a single related point in each of 5 consecutive frames) before declaring a valid object.
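As an illustration of this persistence rule, the two forms of the check might look like the sketch below. The threshold of 5 detects comes from the example above; the range tolerance is our own assumption, not an industry constant.

```python
K = 5                 # required number of detects, per the example above
RANGE_TOL_M = 0.5     # assumed tolerance for treating detects as "the same range"

def valid_within_frame(ranges, k=K, tol=RANGE_TOL_M):
    """ranges: measured ranges (m) from adjacent shots within a single frame."""
    clustered = [r for r in ranges if abs(r - ranges[0]) <= tol]
    return len(clustered) >= k

def valid_across_frames(hits, k=K):
    """hits: one bool per frame, True if a related point was seen in that frame."""
    return len(hits) >= k and all(hits[-k:])

print(valid_within_frame([41.2, 41.3, 41.1, 41.2, 41.4]))   # True: 5 points at the same range
print(valid_across_frames([True, True, True, True, True]))  # True: 5 consecutive frames
```

Either path adds latency before an object can be declared, which is exactly the latency cost discussed below.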

It is widely held that a detection range of 200+ meters is required for vehicles traveling at highway speeds to react effectively to changing road conditions and surroundings. Conventional LiDAR sensors scan and collect data about the occupancy grid in a uniform pattern, without discretion. This forms part of a constant stream of gigabytes of data sent to the vehicle’s on-board controller in order to detect objects, and the design puts a massive strain on resources: anywhere from 70 to 90+ percent of the data is redundant or useless, and is simply discarded.

Under these conditions, even a system able to operate at a 10-30 Hz frame rate will struggle to deliver low latency while also sustaining high frame rates and high performance. And if the latency for newly appearing objects is even 0.25 seconds, the frame rate hardly matters: by the time the data reaches the central compute platform, it can be practically worthless. On the road, driving conditions can change dramatically in a tenth of a second. After 0.1 seconds, two cars closing at a mutual speed of 200 km/hour are 18 feet closer. While predictive algorithms work well to counter this latency in structured, well-behaved environments, there are several cases where they do not. One such case is a small, fast, head-on approaching object: it first appears with a single LiDAR point, and N consecutive single-point detects are required before it can be classified as an object. In this example, it is easy to see that detection range and object classification range are two vastly different things.
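The closing-distance figures above follow directly from distance closed = closing speed × latency. A quick check of the 200 km/h example, using our own arithmetic:

```python
def meters_closed(closing_speed_kmh, latency_s):
    """Distance closed between two vehicles during a given sensing/processing latency."""
    return closing_speed_kmh / 3.6 * latency_s

for latency in (0.10, 0.25):
    m = meters_closed(200, latency)
    print(f"{latency:.2f} s latency at 200 km/h closing speed: {m:.1f} m ({m * 3.281:.0f} ft)")
# 0.10 s -> 5.6 m (18 ft);  0.25 s -> 13.9 m (46 ft)
```

At a 0.25-second latency, the gap closes by roughly 14 meters before the data is even available to the planner.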

With a variety of factors influencing the domain controller’s processing speed, measuring the efficacy of a system by its detection range alone is problematic. Without knowledge of latency or other pertinent factors, unwarranted trust is placed in the controller’s ability to manage competing priorities. It is generally assumed that LiDAR manufacturers need not know or care how the domain controller classifies objects (or how long classification takes); we propose that, ultimately, this assumption leaves designers vulnerable to very dangerous situations.

AEye’s Metric
Object Classification Range
Currently, classification takes place somewhere in the domain controller. It is at this point that objects are labeled as such and, eventually, more clearly identified. At some level of identification, this data is used to predict known behavior patterns or trajectories. Because this step is so important, AEye argues that a better measurement for assessing an automotive LiDAR’s capability is its object classification range. This metric resolves the unknowns, such as the latency associated with noise suppression (e.g., N of M detections), early in the perception stack, pinpointing the salient question of whether a LiDAR system is capable of operating at optimal safety.

Because automotive LiDAR is a relatively new field, how much data is necessary for classification has not yet been standardized. Thus, AEye proposes that adopting perception standards used in video classification provides a valuable provisional definition. According to video standards, classification becomes possible starting with a 3×3 pixel grid on an object. Under this definition, an automotive LiDAR system might be assessed by how quickly it can generate a high-quality, high-resolution 3×3 point cloud that enables the domain controller to comprehend objects and people in a scene.
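One way to see why this provisional 3×3 criterion changes the conversation is to ask at what range a given angular sampling pattern can still place three samples across a target. The sketch below is a back-of-the-envelope calculation of our own; the 0.5 m target width and the angular spacings are illustrative assumptions, not specifications of any particular sensor.

```python
import math

def max_classification_range(target_width_m, sample_spacing_deg, points_across=3):
    """Farthest range at which `points_across` samples still span the target width."""
    spacing_rad = math.radians(sample_spacing_deg)
    # points_across samples span (points_across - 1) gaps, each R * spacing_rad wide.
    return target_width_m / ((points_across - 1) * spacing_rad)

# Assumed 0.5 m wide target (roughly a pedestrian torso):
print(f"fixed 0.10 deg grid:   {max_classification_range(0.5, 0.10):5.0f} m")  # ~143 m
print(f"0.01 deg ROI sampling: {max_classification_range(0.5, 0.01):5.0f} m")  # ~1432 m
```

Under these assumptions, a sensor that can concentrate finer angular sampling on a region of interest supports 3×3 coverage, and therefore classification, at far greater range than a fixed uniform grid, even if both report the same single-point detection range.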

Generating a 3×3 point cloud is a struggle for conventional LiDAR systems. While many tout an ability to produce point clouds of half a million or more points per second, these images lack uniformity. Point clouds created by most LiDAR systems show dense horizontal lines coupled with very coarse vertical spacing, or, in general, low overall density. Regardless, these fixed angular sampling patterns can be difficult for classification routines because the domain controller has to grapple with half a million points per second that are, in many cases, out of balance with the resolution required for critical sampling of the object in question. Such a skewed mish-mash of points forces the controller to do additional interpretation, putting extra strain on CPU resources.

A much more efficient approach is to gather about 10 percent of this data, focusing on special Regions of Interest (e.g., moving vehicles and pedestrians) while keeping tabs on the background scene (trees, parked cars, buildings, etc.). Collecting only the salient data in the scene significantly speeds up classification. AEye’s agile iDAR is a LiDAR system integrated with AI that can intelligently accelerate its shot rate within a Region of Interest (ROI). This comes from its ability to selectively revisit points twice within tens of microseconds, an improvement of three orders of magnitude over conventional 64-line systems that can only hit an object once per frame (every 100 milliseconds). Future white papers will discuss methods of using iDAR to ensure that important background information is not discounted, by correctly employing the concepts of Search, Acquisition, and Tracking, much as humans perceive.
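The three-orders-of-magnitude figure falls out of the ratio of revisit intervals, assuming a conventional scanner revisits an object once per 10 Hz frame and an agile revisit lands on the order of 100 microseconds later:

```python
conventional_revisit_s = 0.100   # one revisit opportunity per 10 Hz frame
agile_revisit_s = 100e-6         # assumed agile revisit, on the order of 100 microseconds
print(f"{conventional_revisit_s / agile_revisit_s:.0f}x faster revisit")  # ~1000x
```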

In summary, low-level object detection can be moved to the sensor level by, for example, triggering a dense 3×3 voxel grid in near real time every time a significant detection occurs. This happens before the data is sent to the central controller, allowing for higher instantaneous resolution than a fixed-pattern system can offer and, ultimately, better object classification ranges when using video detection range analogies.
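A hypothetical sketch of that sensor-level loop is shown below; the function name, angles, and ROI spacing are illustrative assumptions rather than a description of actual firmware. A sparse background scan runs continuously, and any significant detect immediately schedules a dense 3×3 revisit around that point before the frame is handed to the central controller.

```python
def schedule_roi(detect, shot_queue, roi_size=3, step_deg=0.05):
    """On a significant sparse detect, queue a dense roi_size x roi_size revisit
    centered on the detect's azimuth/elevation (angles and spacing are illustrative)."""
    az, el = detect["az_deg"], detect["el_deg"]
    half = roi_size // 2
    for i in range(-half, half + 1):
        for j in range(-half, half + 1):
            shot_queue.append((az + i * step_deg, el + j * step_deg))

queue = []
schedule_roi({"az_deg": 1.2, "el_deg": -0.3}, queue)
print(len(queue), "ROI shots scheduled")  # 9 shots: a 3x3 grid centered on the new detect
```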

Real-World Applications: Imagine that an autonomous vehicle is driving on a desolate highway. Ahead, the road appears empty. Suddenly, the sensor per..

Leading Global Automotive Supplier Aisin Invests in AEye through Pegasus Tech Ventures

Pleasanton, CA – May 14, 2019 – AEye, a world leader in artificial perception systems and the developer of iDAR™, today announced a strategic investment from Japanese components and systems manufacturer Aisin through their corporate venture capital fund managed by Pegasus Tech Ventures. Aisin is ranked the sixth largest Tier 1 OEM parts supplier globally, and a premier developer of information and communications technology and electronics for the automotive market. Aisin’s investment paves the way for collaboration and expansion as the companies scale to meet the demands of OEMs deploying ADAS and mobility solutions.

“AEye is a breakthrough automotive technology startup with a world-class management and technical team,” said Anis Uzzaman, General Partner & CEO at Pegasus Tech Ventures. “Their AI-based perception solution, iDAR, is a superior technology, rendering the other LiDAR approaches obsolete and creating what we believe is a new standard for ADAS and autonomous solutions in the Japanese market and beyond.”

“We are delighted that Aisin and Pegasus Tech Ventures have joined as partners and as investors,” said Blair LaCorte, president of AEye. “The addition of Aisin as an investor makes AEye uniquely positioned as the only LiDAR company to have both multiple OEMs and multiple Tier 1s as investors. This validation from automotive companies that are seeking to deploy solutions in the mobility and ADAS markets is a great honor and responsibility.”

Pegasus Tech Ventures, as the manager of Aisin’s corporate venture capital fund, has its global headquarters in Silicon Valley and invests in promising technology companies that align with Aisin’s mission to respond swiftly to the increasingly sophisticated and diverse needs of OEM customers with best-in-class products.

AEye’s iDAR combines software extensibility, artificial intelligence and smart, agile sensors to deliver intelligent data collection at the sensor level. The company’s AE110 product features the industry’s only software-definable LiDAR, creating an open platform for perception innovation that lets software engineers optimize data collection to best meet their needs. The AE200 system is designed to address the needs of modular, high-performance sensors, and is optimized for size, weight, power, and cost. Both will be available commercially during the second half of this year.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin/Pegasus Tech Ventures, Intel Capital, Airbus Ventures, and others.

About Pegasus Tech Ventures
Pegasus Tech Ventures is a global venture capital firm based in Silicon Valley. Pegasus manages over 20 parallel investment funds for its corporate limited partners, which include some of the largest multinational technology companies. With eight offices around the world, Pegasus invests in emerging technology companies of strategic interest to its limited partners and facilitates connections between its portfolio companies and its global corporate partners to accelerate the growth and competitiveness of both portfolio companies and corporate partners.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


AEye Expands Business Development and Customer Success Team to Support Growing Network of Global Partners and Customers

Pleasanton, CA – May 14, 2019 – Artificial perception pioneer AEye today announced the expansion of its business development team to address the global market opportunity for AEye’s iDAR™ system. The company has hired BD directors in the Americas, Europe, and Japan to support the growing network of AEye automotive partners and customers worldwide. In addition, AEye announced that former Quanergy executive Akram Benmbarek has joined AEye as VP of strategic initiatives.

Meeting Global Market Demand
AEye’s new global team of BD directors will provide in-region, full lifecycle support for the company’s OEM and tier 1 partners deploying ADAS and autonomous vehicle systems in the U.S., Europe and Asia.

In the Americas, AEye has hired Ashika Schroll. Previously the senior manager of technology and business development for automotive programs at Flex (formerly Flextronics), Ashika worked with automotive OEMs, startups and Tier 1s to enable autonomous, connectivity, electrification and smart technologies. She holds a BS in Electrical Engineering and an MBA from Northwestern.

Peter Szelei will oversee AEye’s European business from Berlin. Previously, he managed business development and led the global automotive business team at navigation solution provider NNG, where he worked closely with OEMs and Tier 1s to create best-in-class solutions. He holds master’s degrees in Management and Entrepreneurship, as well as Management and Leadership.

In Asia, Itohru Iwama will be focused on Japan. He comes to AEye from Denso, where he was a business development manager. Iwama is a veteran tech and automotive executive with 20+ years of product/project management experience in automotive, electronics and software. He holds a BA in Foreign Studies and an MS in Management, and is fluent in Japanese, Chinese, German and English.

“Our unique opportunity to define the market has attracted an exceptional group of professionals who know the automotive industry, are excited about the technology, and have a proven track-record in helping Tier 1s and OEMs integrate leading-edge technologies into their products,” said Jim Robnett, AEye’s VP of automotive business development.

Extending into New Markets
In response to market demand and to facilitate commercialization of AEye’s iDAR technology into markets beyond automotive, AEye also announced today that former Quanergy executive Akram Benmbarek has joined the executive team as VP of strategic initiatives. Benmbarek is responsible for responding to growing demand for AEye’s iDAR platform in industries ranging from transportation automation to industrial automation, IoT and mapping.

Benmbarek comes to AEye from Quanergy, where he was charged with growing the company’s application of LiDAR across non-automotive verticals, including robotics, IoT smart spaces and security. Prior to Quanergy, Akram spent 16 years in Silicon Valley as both an entrepreneur and investment banker. He has a BA in Applied Economics, and an MBA from USC.

“I am very excited to join such an experienced and focused team,” said Benmbarek. “There is real value in leveraging the unique software-definable features of the iDAR platform to markets that require intelligent robotic vision, and I look forward to leading that charge.”

“While our near-term priorities are automotive ADAS and mobility, we have seen increasing interest in our iDAR platform from other industries and markets,” said Luis Dussan, co-founder and CEO of AEye. “Akram has deep experience commercializing LiDAR technology across different verticals, and he’ll be a great asset as he helps AEye’s customers do the same.”

2019 Sees Continued Success for AEye
AEye’s iDAR Artificial Perception platform combines software extensibility, artificial intelligence and smart, agile sensors to deliver intelligent data collection at the sensor level. The company’s AE110 product features the industry’s only software-definable LiDAR, creating an open platform for perception innovation that allows software engineers to optimize data collection to best meet their needs. The AE200 system is designed to address the needs of modular, high-performance sensors, and is optimized for size, weight, power and cost. Both will be available commercially during the second half of this year.

AEye and iDAR have been increasingly recognized by technology, automotive and other industry organizations. In 2019 alone, AEye has been honored as “Best Up-and-Coming Company” by Image Sensors, “Outstanding Innovation in LiDAR” by ILMF and LiDAR Magazine, “Innovation Award for AI and Machine Learning” by SXSW and “ACES” award for Autonomy and Sensing by Autonomous Vehicle Technology. In addition, AEye was named to CB Insights AI 100 list as well as their Top Startups Revolutionizing Auto with AI for 2019.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin/Pegasus Tech Ventures, Intel Capital, Airbus Ventures, and others.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


Forbes Learns how AEye Teaches Autonomous Vehicles to Perceive Like a Human

AEye asks: “What is the best way to deliver artificial perception for robotic and autonomous vehicles?” “In Teaching An Autonomous Car To Perceive Like A Human,” Forbes learns that it involves mimicking the advanced processes of the human visual cortex. “Intelligence begins with how you collect data,” says AEye President, Blair LaCorte, “[and] iDAR captures and processes environmental data – just as the human visual cortex does.”


AEye Team Profile: Indu Vijayan

On April 11, 2019, AEye’s Technical Product Manager, Indu Vijayan, will speak on “AI & Machine Learning” at SAE World Congress in Detroit, Michigan.
Indu Vijayan is a specialist in systems, software, algorithms and perception for self-driving cars. As the Technical Product Manager at AEye, she leads software development for the company’s leading-edge artificial perception system for autonomous vehicles. Prior to AEye, Indu spent five years at Delphi/Aptiv, where, as a senior software engineer on the Autonomous Driving team, she played a major role in bridging ADAS sensors and algorithms, and extending them for mobility. She holds a Bachelor of Technology in Computer Science from India’s Amrita University, and an MS in Computer Engineering from Stony Brook University.

We sat down with Indu to learn more about why the advancement of edge computing and AI is so critical to the rollout of safe and efficient autonomous vehicles…
Q: What does it mean to implement Artificial Intelligence “at the sensor level”?
AEye’s iDAR is the only artificial perception system that pushes data capture and processing to the edge of the network. We achieve this by fusing LiDAR and camera at the sensor to create the highest quality data collection. Traditional LiDAR scanning methods attribute the same amount of importance to every aspect of a given scene. However, as we know from our own experiences driving, not all objects are perceived with equal priority. When driving, we pay much more attention to the pedestrian standing near a crosswalk than to a tree. In this same sense, cars must be able to perceive like a human would in order to drive safely and efficiently. That means enabling the sensor to treat different regions or objects with varying degrees of priority, and collecting only the most situationally relevant information.

Q: Why is this favorable to the development of advanced artificial perception systems?
Since iDAR is intelligent, it can efficiently cycle and prioritize sensory information, meaning it sends only the most relevant data to the vehicle’s path-planning system. In a conventional sensor system, layers upon layers of algorithms are needed to extract relevant, actionable data, which creates too much latency for the vehicle to navigate safely at highway speeds. Say you are driving 60 mph along a highway when, suddenly, you hear the siren of an ambulance coming from behind you, quickly closing in. In this instance, you are left with two choices: either stay in your lane and maintain your speed, or safely slow down and/or pull over to the side of the road. Whichever decision you choose is determined by the auditory and visual cues you are receiving from the environment, such as the speed of the ambulance, or the density of the traffic around you.

Just like in human perception, our iDAR system creates feedback loops that can efficiently cycle and prioritize sensory information. When humans gather information through the visual cortex, it creates a feedback loop that helps make each step of visual perception more efficient. Because we mimic this process in our system, we enable similar behavior to be learned and trained in autonomous vehicles so that they can make better, more accurate decisions, faster. The system is therefore able to continually learn and adapt, so that, over time, it becomes even better at identifying and tracking potential hazards.

And because we only scan for and retrieve the most relevant information in a scene, this ultimately allows for cost and power optimization. For instance, we don’t need high-end, mega-powerful processors to run our AI algorithms, because when we emphasize data quality over data quantity, we reduce the need for a highly powerful processor hiding in the trunk of the car. This not only makes us more cost effective, but it could allow for the redistribution of the power budget inside an electric vehicle to enable longer range performance, as an example. But most importantly, this allows us to make systems that are scalable and optimized for the full value chain.

Q: You will be speaking at SAE World Congress in Detroit, one of the largest gatherings of automotive industry engineers. Why is it so important for advanced automotive systems developers to regularly meet and discuss new ideas and innovations in the industry?
Ultimately, autonomous vehicles will spark a radical shift in our society. Not only will they make safer and more efficient public transportation accessible to the masses, they will allow us to have the time to accomplish meaningful tasks which would otherwise be lost to a long commute. Engineers are the leaders in bringing about this societal change. The dream of safe, fully automated vehicles is a herculean and challenging task to take on, but it’s one that is desperately needed to move society forward. Opportunities like SAE World Congress allow engineers to brainstorm and put the foundational stones together for a safer tomorrow.


AEye Team Profile: Dr. Allan Steinhardt

We sat down with AEye’s Chief Scientist, Dr. Allan Steinhardt, to learn about the challenges of using publicly available defense technologies in autonomous vehicles, the current state of automotive LiDAR, and the technology that most excites him today…
An IEEE fellow, Dr. Allan Steinhardt is a sought-after expert on radar, missile defense, GMTI and space surveillance. He was Chief Scientist for DARPA, co-author of a book on adaptive radar, and assistant professor in Electrical Engineering and Applied Mathematics at Cornell University, where he performed funded research on sensor arrays and optimal detection capabilities. Dr. Steinhardt is a member of the National Academy of Sciences Naval Studies Board, and a recipient of the US Defense Medal for Exceptional Public Service. He has also served as chief scientist at Booz Allen, the radar project lead at MIT Lincoln Laboratory, and director of signal processing for the defense industry with BAE/Alphatech.

Q: What technologies developed by DARPA, and other agencies, have been adapted for autonomous vehicle use?
The technologies developed by DARPA that have been of value to both LiDAR and other kinds of sensors for autonomous vehicles are: the solid-state laser, Micro-Electro-Mechanical Systems (MEMS), and the computer chips that are able to do all the processing. So, all the building blocks required for the development of autonomous vehicles have their humble beginnings at DARPA. But there’s the system point of view as well. Many of the systems that are being developed now were started by the government. Biomimicry was also a big investment at DARPA. When I was there, we went on to create a separate office on biomimicry, looking at different biological systems and emulating them for various purposes.

However, LiDAR actually has its origins in the relatively distant past. When I first began working for AEye as a consultant, some of the best papers I came across about LiDAR were written in the early 1960s. And this research has really not been improved upon since then in terms of the basic science. The actual concept of the laser was first developed by Albert Einstein, so it has been around for a lot longer than people realize.

Q: What challenges have we faced trying to adapt defense technology for autonomous vehicle use?
One obvious challenge the industry currently faces is bringing down the cost of these technologies for commercial use. Another is miniaturizing these technologies to fit inside a vehicle, as opposed to a fighter jet or tank. However, one issue that we never really considered in the government that is coming to the fore today is the amount of power it takes to do the processing. We generally weren’t thinking about green vehicles that don’t use a lot of gasoline. But nowadays, the processing is becoming so sophisticated in autonomous vehicles that it’s literally eating into fuel efficiency.

Another one, surprisingly, is the whole issue of cyber security. Back when we were beginning these projects, we never imagined that there would be such deep connections between the Internet and wireless systems and lasers. Back in the early 80’s, there was no worry about a potential external entity accessing the internals of a laser system.

Q: How are we mitigating these challenges?
AEye has a perspective that we hope the industry will adopt more widely, which is to use biomimicry to focus energy on things that matter, which is how most biological systems operate. AEye also creatively adopts off-the-shelf technology and uses them to optimize for size, weight, power, and cost. However, cyber security is still very much an open issue and hasn’t, in my opinion, been adequately addressed yet.

Q: Is automotive LiDAR where you thought it would be today?
I’m surprised at how far along we are. Yet, we are still less developed than I would have thought. I am surprised at how much advancement has taken place in the actual sensor itself (i.e., how well can we sense light, the efficiency of the lasers). On the other hand, I think we are way behind in terms of understanding and addressing what to do with the information gathered. So, I find that where we are today is surprisingly primitive in that sense.

Q: What new technology are you most excited about?
There are the obvious candidates: A.I., silicon technology, and basic laser components. And then there are the less obvious, more interesting candidates, such as the way the medical and biological research communities are looking for ways to adopt how biological systems sense and respond to various stimuli. There’s also a telecommunications revolution currently taking place. I’d call it “the next wave in fiber optic communications,” where the bandwidths are going to be even higher and efficiencies even better. There are a lot of interesting and non-obvious ways that we can leverage these technologies in autonomous vehicle development.

Q: What have you seen as the biggest difference between working in Silicon Valley and a US Defense Agency?
I thought DARPA was fast paced, but that’s nothing compared to Silicon Valley. The velocity of innovation happening is truly breathtaking. It’s very satisfying to be here. I can’t wait to see what’s next.

Join us on April 18, 2019, when AEye’s Chief Scientist, Dr. Allan Steinhardt, gives a keynote address entitled “Life in the Fast Lane: What’s different about transportation AI?” at Bootstraps Labs Applied Artificial Intelligence Conference in San Francisco. Learn more here.


Unique iDAR Features That Drive SAE’s 5 Levels of Autonomy

In 2014, the Society of Automotive Engineers (SAE International) first published their classification system for the levels of vehicle autonomy, called “SAE J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems”, which has since been widely adopted across the automotive industry. Here, AEye presents how the unique features of its iDAR perception system for autonomous vehicles enable SAE’s 5 Levels of Autonomy.


The Register and AEye President, Blair LaCorte, Predict the Future of Self-Driving Cars

At the 2019 Intel Capital Global Summit in Phoenix, AEye President, Blair LaCorte, details to The Register how he envisions the future of the self-driving car industry and demonstrates AEye’s advanced artificial perception technology.


Smarter Cars Podcast Talks LiDAR and Perception Systems with AEye President, Blair LaCorte

Smarter Cars host, Michele Kyrouz, sits down with AEye President, Blair LaCorte, to discuss AEye’s cutting-edge technology for autonomous vehicles, which fuses cameras and LiDAR to mimic human perception.


AEye Advisory Board Profile: Luke Schneider

We sat down with each of our Advisory Board Members to ask them why they’re excited about working with AEye…
Mr. Schneider was most recently the CEO of Silvercar. Acquired by Audi in 2017, Silvercar has offerings in the rental car segment (Silvercar by Audi), auto dealership fleet management (Dealerware) and vehicle subscriptions (Audi select). Prior to joining Silvercar in early 2012, Schneider served as CTO of Zipcar, the world’s largest car-sharing company. He came to Zipcar by way of Flexcar, the United States’ first car-sharing company, where he served as CTO and VP of Strategy. Schneider conceived and drove development of new products, including the award-winning Zipcar iPhone app, which he debuted during a keynote presentation at Apple’s 2009 Worldwide Developer Conference.

Schneider began his career at Ford Motor Company in 1992. Luke earned a bachelor’s degree in Mechanical Engineering from the University of Texas at Austin and an MBA with a specialization in Operations and Strategy from the Tepper School of Business at Carnegie Mellon University.

Q: What in your past experience ultimately drew you to the autonomous vehicle arena?
For the better part of 25 years, I have worked at the intersection of transportation and technology. Starting as a powertrain engineer at Ford Motor Company, and through executive tenures at Flexcar, Zipcar and Silvercar, I have seen the industry begin the most profound, tectonic shift in its 120 year history. You need to do little more than look at the accelerating pace of change in vehicle product development — beginning with the shift from vacuum systems and mechanical linkages, to semiconductors and electronics — to appreciate how dramatically personal transportation is changing. Add to that the evolution of the consumer model, consistent with what we’ve seen in countless other categories (buy what you want, pay for what you need, and do it on your phone), and it’s clear the revolution is not coming, it is upon us!

For me personally, as I seek a convergence point for the many disparate aspects of the automotive ecosystem, I am certain that the future will be indelibly shaped by 4 primary drivers: autonomous, electric, shared, and connected. Of all of those, the one that inspires the most hope, excitement, and wonder is autonomous. Autonomy has the power to all but eliminate 40,000 fatalities per year in the US alone, and hundreds of thousands of injuries. It will make our journeys faster, less stressful, and more enjoyable. And, it will make our ever more crowded cities more livable, walkable, and sustainable. I want to live in a world like that.

Q: Why AEye?
AEye has a fantastic set of technologies that they’ve combined in a new way to deliver breakthroughs in perception. I’m also very impressed with the unique history of the leadership team. They have a tremendous amount of experience with LiDAR from their work in aerospace. It is unusual to find a start up in the United States with this kind of experience, and a team that has worked with LiDAR for decades.

The first thing anybody notices about AEye is the exceptional caliber of people who comprise the place. They have attracted such a talented, diverse team — and not only scientists, engineers and developers. It’s clear to me that the staff is drawn in by a brilliant central concept at the heart of the company: recasting a daunting problem in an entirely new light.

Successful technology companies take real-world problems, apply fresh, innovative thinking to them, and turn those problems into business opportunities. The rarest of the rare are able to not only conceive and theorize, but also build and grow. It is harder than it looks to take a complicated technology concept and properly characterize it in a way that accurately describes it without oversimplifying it. But, at the same time, to paraphrase Dr. Richard Feynman: “If you can’t explain something in simple terms, you don’t understand it.”

One of the things I admire most about AEye is the way the company lives this statement, commanding the respect of customers, employees, and industry veterans. With its compelling technology case, dedicated team, and vast reach into the expertise of advisors, investors, and customers, how not AEye?

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond? How do you see AEye playing a pivotal role in this vision?
Already, ADAS is penetrating the automotive world at a pace not seen by any technology in my career. As cars get safer, whole industries change or are disrupted. The auto insurance and car rental industries are obvious early examples. I am excited, encouraged, and hopelessly optimistic about the direction we are headed, led by AEye and other kindred spirits. Besides making personal transportation safer, artificial perception — coupled with machine learning, AI, and a dozen other technologies — will begin to re-shape an industry model that is desperately in need of evolving.

 As a society, we have solved the personal transportation problem in the most expensive way imaginable — financially, socially, and environmentally. It may have worked for the first 100 years, but it won’t work for the next. Our cities have become less livable, even as their populations continue to swell.

In this next decade, I believe we will begin to see the first concrete examples of artificial vehicle perception accelerating the pace of change for the benefit of all. Imagine what happens to profitability in the ride share business when the cars can drive themselves. Think about what choices we will have during our morning commutes when our bandwidth isn’t fully consumed in the act of driving. Imagine how much cleaner, more walkable, less congested our cities will be using vehicles equipped with this innovative technology.

AEye finds itself defining, even accelerating, the arrival of this future state. I can’t think of a more exciting place to be (and a better equipped group of people) to make it a reality.
