Lidar Leader Award Winner 2019 – Outstanding Innovation in Lidar

On January 29, 2019, AEye was awarded “Outstanding Innovation in Lidar” for the creation of the AE110 Artificial Perception System at the second annual Lidar Leader Awards ceremony, held at the International LiDAR Mapping Forum (ILMF) in cooperation with Spatial Media’s Lidar Magazine.

The AE110 is built on AEye’s iDAR (Intelligent Detection and Ranging) perception system, which delivers more accurate and more intelligent information faster to a self-driving car’s path planning system. Learn why iDAR is better than LiDAR here.

Pictured below: AEye co-founder & SVP of Engineering, Dr. Barry Behnken (left) accepting the prestigious award on AEye’s behalf, with Lisa Murray (middle), Group Director at Diversified Communications (ILMF), and Dr. Stewart Walker (right), Managing Editor at Lidar Magazine.


AEye Advisory Board Profile: Scott Pfotenhauer

We sat down with each of our Advisory Board Members to ask them about their vision for the future of self-driving cars…
Mr. Pfotenhauer began his career at Intel in the mid-1970s and has since been involved with numerous technology companies. Since joining Morgan Stanley in 1996, he has developed and leveraged an expansive network of investment banking and wealth management resources to help clients formulate exit strategies for their businesses. He is a Senior Investment Management Consultant with Morgan Stanley, advising private clients and corporate executives. Pfotenhauer obtained both a BA and an MBA in Business from California Coast University.

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond?
I was fortunate enough to work for Intel from the mid-1970s to the mid-1990s, where I participated in the advent of desktop and mobile computing, which was built on Intel products.

In the last 25 years, I’ve continued to look around the corner to try to spot the next “big thing.” We now have the advanced computing tools that allow companies to apply Artificial Intelligence (AI) in their decision making and take advantage of big data. These trends converge around the auto industry and its next inflection points: EVs and autonomy.

Q: What do you see as the next logical step for the auto industry?
When I think about the auto industry, it’s remarkable how many inflection points it has gone through. The internal combustion engine of 1876 made cars feasible, while mass production, which began in the early 1900s, made them affordable. The electric starter of 1912 rendered hand cranks obsolete, and the first transcontinental highway in 1913 opened up the nation. The pickup truck of the early 1930s made the vehicle more functional, while the automatic transmission, power steering, and power brakes of the ’40s and ’50s made it easier and safer for just about anyone to drive.

Seat belts, anti-lock brakes, and airbags increased safety, while GPS and maps made us all more efficient. Now we have begun the climb up the five levels of automation, and we won’t get anywhere without a complete set of artificial eyes: eyes that are always on, don’t blink, don’t get distracted, and don’t tire.


Deconstructing Two Conventional LiDAR Metrics

Executive Summary
Conventional metrics (such as frame rate and resolution) used for evaluating LiDAR systems don’t adequately or explicitly address real-world problems facing autonomous driving. Therefore, AEye, the developer of iDAR™ technology, proposes two new corresponding metrics for evaluation: object revisit rate and instantaneous resolution. These additional metrics are necessary to better describe the safety and performance of more advanced LiDAR sensors in real-world scenarios.

Download “Deconstructing Two Conventional LiDAR Metrics” [pdf]

Introduction
How is the effectiveness of an intelligent detection system measured? Conventional metrics used for evaluating LiDAR systems rely on frame rate and resolution (as well as range, which we will discuss at a later time) as the touchstones of success. However, AEye believes that these measurements are inadequate for evaluating the effectiveness of more advanced LiDAR systems for autonomous vehicles. In this white paper, we discuss why object revisit rate and instantaneous resolution are more meaningful metrics for assessing the capabilities of our iDAR system, and why these metrics are ultimately more advantageous for autonomous vehicle development.

Deconstructing the Metrics
Makers of automotive LiDAR systems are frequently asked about their frame rate, and whether their technology can detect objects with 10 percent reflectivity at some range, at some frame rate, and with some arbitrary resolution. While most manufacturers can readily answer these questions, we believe that this description is insufficient and that the industry must adopt a more holistic approach when assessing LiDAR systems for automotive use. Additionally, we must think of these systems as they relate to a perception system in general, rather than as individual point sensors. Below, we outline two conventional LiDAR metrics and AEye’s proposed alternatives.

Conventional Metric #1: Frame rate of xx Hz

AEye’s Metric
Object revisit rate (the time between two shots at the same point or set of points)
Defining single-point detection range alone is insufficient because a single interrogation point (shot) rarely delivers enough confidence; it is only suggestive. Multiple interrogations/detections at the same point, or across the same object, are needed to validate or comprehend an object or scene. The time it takes to detect an object depends on many variables, such as distance, interrogation pattern, resolution, reflectivity, and the shape of the object being interrogated, and can traditionally take several full frames to achieve. What is missing from the conventional metric, therefore, is a finer definition of time. Thus, AEye proposes object revisit rate as a new, more critical metric for automotive LiDAR, because an agile LiDAR such as AEye’s iDAR can achieve an object revisit rate vastly faster than its classic frame rate.
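To make the confidence argument concrete, here is a minimal sketch (a textbook independence model, not AEye’s actual detection logic) of how cumulative detection confidence grows with repeated interrogations of the same point, assuming each shot independently detects with a hypothetical single-shot probability:

```python
# Illustrative only: cumulative detection confidence from repeated shots,
# assuming independent shots with single-shot detection probability p.

def cumulative_detection_prob(p_single: float, n_shots: int) -> float:
    """P(at least one detection in n independent shots)."""
    return 1.0 - (1.0 - p_single) ** n_shots

p = 0.5  # hypothetical single-shot detection probability at long range
for n in (1, 2, 3, 4):
    # confidence climbs: 0.500, 0.750, 0.875, 0.938
    print(f"{n} shot(s): {cumulative_detection_prob(p, n):.3f}")
```

This is why a fast revisit matters: a second shot moments later can more than halve the residual uncertainty of the first.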

The time between the first measurement of an object and the second is critical, as shorter object revisit times help keep processing times low for advanced algorithms that must correlate multiple moving objects in a scene. Additionally, at high velocities, too long an object revisit time could be the difference between detecting an object in time and loss of life, since even the best algorithms used to associate/correlate multiple moving objects can be confused when many objects are in the scene and the time elapsed between samples is long.

The agile AEye platform accelerates revisit rate by allowing for intelligent shot scheduling within a frame, including the capability to interrogate a target position or object multiple times before the classic frame is completed. For example, an iDAR sensor can schedule two repeated shots on a point or points of interest in quick succession. These multiple interrogations can then be used according to the scene context and the needs of the user (either a human or another computer) to increase confidence, or even extend ranging performance.

These interrogations can also be data dependent. For example, an object can be revisited if a low-confidence detection occurs and it is desirable to quickly validate or reject that detection with a secondary measurement, as seen in Figure 1. A typical complete full-frame rate for conventional sensors is approximately 10 Hz, i.e., 100 ms per frame. For such conventional sensors, this is also the “object revisit rate.” With AEye’s flexible iDAR technology, the object revisit rate is decoupled from the frame rate and can be as short as tens of microseconds between revisits to key points or objects, as the user/host requires: easily 3 to 4 orders of magnitude faster than fixed-scan sensors.
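The gap between the two regimes is easy to quantify. A short sketch, using the 10 Hz frame rate cited above and an assumed 50 µs agile revisit interval (illustrative, not a product specification):

```python
# Revisit-interval comparison: fixed-scan LiDAR vs. agile shot scheduling.
# The 10 Hz frame rate comes from the text; the agile interval is assumed.

FIXED_FRAME_RATE_HZ = 10          # typical conventional sensor
AGILE_REVISIT_S = 50e-6           # illustrative intra-frame revisit (50 µs)

fixed_revisit_s = 1.0 / FIXED_FRAME_RATE_HZ   # 0.1 s between revisits
speedup = fixed_revisit_s / AGILE_REVISIT_S

print(f"fixed-scan revisit: {fixed_revisit_s * 1e3:.0f} ms")
print(f"agile revisit:      {AGILE_REVISIT_S * 1e6:.0f} us")
print(f"speedup:            {speedup:.0f}x")   # ~2000x at these numbers
```

At these assumed values the agile revisit is roughly three orders of magnitude faster, consistent with the 3–4 orders of magnitude claimed in the text.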

Figure 1. “Foveation in Time” Intra-Frame Revisit Interval and random scan pattern of iDAR (B) compared to Revisit Interval on a typical fixed pattern LiDAR (A)

What this means is that an effective perception engineering team using dynamic object revisit capabilities can create a perception system that is at least an order of magnitude faster than what conventional LiDAR can deliver. We believe this capability is invaluable for delivering Level 4/5 autonomy, as the vehicle will need to handle significantly more complex corner cases.

Real-World Application: When you’re driving, the world can change dramatically in a tenth of a second. In fact, two cars closing at a mutual speed of 200 km/hour are 18 feet closer after 0.1 seconds. By having an accelerated revisit rate, we increase the likelihood of hitting the same target with a subsequent shot due to the decreased likelihood that the target has moved significantly in the time between shots. This helps the user solve the “Correspondence Problem” (determining which parts of one “snapshot” of a dynamic scene correspond to which parts of another snapshot of the same scene), while simultaneously enabling the user to quickly build statistical measures of confidence and generate aggregate information that downstream processors might require (such as object velocity and acceleration). While the “Correspondence Problem” will always be a challenge for autonomous systems, the ability to selectively increase revisit rate on points of interest can significantly aid higher level inferencing algorithms, allowing them to more quickly determine correct solutions.
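The closing-speed arithmetic above can be checked directly:

```python
# How much closer two vehicles get in a tenth of a second, per the example
# in the text: mutual closing speed of 200 km/h over 0.1 s.

closing_kmh = 200.0
dt_s = 0.1

closing_ms = closing_kmh * 1000.0 / 3600.0   # ~55.6 m/s
distance_m = closing_ms * dt_s               # ~5.56 m
distance_ft = distance_m / 0.3048            # ~18.2 ft, matching the text

print(f"{distance_m:.2f} m ({distance_ft:.1f} ft) closer after {dt_s} s")
```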

Furthermore, only allocating shots to extract velocity and acceleration when detections have occurred (part of the acquisition chain) rather than allocating repeat shots everywhere in the frame vastly reduces the required number of shots per frame. For example, even in dense traffic, only 1% of the occupancy grid may contain detections. Adding a second detection, via iDAR, to build a velocity estimate on each detection increases the overall number of shots by only 1%, whereas obtaining velocity everywhere, as mandated by fixed scan systems, doubles the required shots (100%, i.e., 2x increase). This speed and shot saliency ultimately makes autonomous driving much safer because it eliminates ambiguity and allows for more efficient use of downstream processing resources. Solving other “Correspondence Problems” (think: camera/LiDAR) with iDAR is the subject of a future paper.
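The shot-budget arithmetic can be sketched with an assumed per-frame shot count (illustrative only; the 1% detection fraction comes from the text):

```python
# Shot-budget comparison: revisiting only detected cells vs. doubling
# every shot to obtain velocity everywhere, as a fixed-scan system must.

shots_per_frame = 100_000        # assumed shot budget for one frame
detection_fraction = 0.01        # ~1% of occupancy-grid cells hold detections

agile_extra = shots_per_frame * detection_fraction   # revisit detections only
fixed_extra = shots_per_frame                        # full second pass (2x total)

print(f"agile revisit overhead: {agile_extra / shots_per_frame:.0%}")  # 1%
print(f"fixed-scan overhead:    {fixed_extra / shots_per_frame:.0%}")  # 100%
```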

The AEye Advantage: Whereas other LiDAR systems are limited by the physics of fixed laser pulse energy, fixed dwell time, and fixed scan patterns, AEye’s iDAR technology is a software-definable system that allows downstream processors to tailor their data collection strategy to best suit their information processing needs at design time and/or run time. Physics, of course, remains the ultimate arbiter, the primary constraints being the photon budget (laser average power) and the round-trip flight time imposed by the speed of light, but AEye’s software agility allows us to approach the limits of physics in a tailored (as opposed to global) fashion. The achievable object revisit rate of AEye’s iDAR system for points of interest (not just the exact point just visited) is microseconds to a few milliseconds, compared to conventional LiDAR systems that require many tens or hundreds of milliseconds between revisits and therefore suffer a high degree of object correspondence ambiguity. This gives iDAR the unprecedented ability to calculate quantities like object velocity in any direction faster than any other system.
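The speed-of-light constraint mentioned above sets a hard floor on any revisit interval: a pulse must complete its round trip before its return can be measured. A quick sketch:

```python
# Round-trip time-of-flight: the physics floor on how soon a second shot's
# return can arrive, regardless of how agile the scanner is.

C = 299_792_458.0  # speed of light, m/s

def round_trip_time_s(range_m: float) -> float:
    """Minimum time between firing a pulse and receiving its return."""
    return 2.0 * range_m / C

for r in (100, 300, 1000):
    print(f"{r:>5} m target: {round_trip_time_s(r) * 1e6:.2f} us round trip")
```

Even at 1,000 m the round trip is under 7 µs, which is why microsecond-scale revisits remain physically achievable.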

The ability to define this new metric, object revisit rate, decoupled from the traditional “frame rate,” is also important for the next metric we introduce. This second metric helps segregate the basic idea of “search” algorithms from “acquisition” algorithms: two algorithm types that should never be confused. Separating these two basic types of algorithms provides insight into the heart of iDAR, which is the Principle of Information Quality as opposed to Data Quantity. Or, in other words: “more information, less data.”

Conventional Metric #2: Fixed (angular) resolution over a fixed Field-of-View

AEye’s Metric
Instantaneous (angular) resolution
The use of resolution as a conventional metric assumes that the Field-of-View will be scanned with a constant pattern. This makes perfect sense for less intelligent traditional sensors that have limited or no ability to adapt their collection capabilities. Additionally, the conventional metric assumes that the salient information within the scene is uniform in space and time, which we know is not true. Because of these assumptions, conventional LiDAR systems indiscriminately collect gigabytes of data from a vehicle’s surroundings, sending those inputs to the CPU for decimation and interpretation (wherein an estimated 70 to 90 percent of this data is found to be useless or redundant, and thrown out). It’s an incredibly inefficient process. Note this is doubly inefficient: the active collection of…

AEye Adds VP of AI and Software to Executive Team

Abhijit Thatte to Lead Software Product Development as AEye Continues to Expand and Enhance iDAR Artificial Perception Platform

Pleasanton, CA – February 26, 2019 – Artificial perception pioneer AEye today announced the addition of Abhijit Thatte as VP of AI and Software to its executive team. Thatte is an accomplished leader with more than 20 years of software product development experience across industries from robotics to industrials. At AEye, he is charged with leveraging the company’s core artificial intelligence capabilities to ensure the delivery of better perception data, faster to autonomous vehicle perception systems.

“Abhijit brings both the big picture vision of what AEye can do with our unique architecture and software stack, and the skillset to execute and bring software products to the market,” said AEye co-founder and CEO Luis Dussan. “This is a critical role as we look to extend the capabilities of our software products, enable seamless integration and interoperability, and deliver smarter data that drives actionable information faster to vehicle path-planning systems, for improved safety, efficiency, and performance.”

As the head of software product engineering at AEye, Thatte will be responsible for the development of the entire software product suite, including 3D perception, visualization, device drivers, embedded software, and the SDK. Prior to AEye, he led Artificial Intelligence at Aricent, Data Science at GE, and Software Engineering at Varian Medical Systems. Thatte is a member of the Forbes Technology Council and a sought-after speaker on artificial intelligence, deep learning, and machine learning. He has a bachelor’s degree in Electrical Engineering and a Master of Information and Data Science, Artificial Intelligence, from UC Berkeley.

“AEye is the only LiDAR company that has built its company with intelligent data as the guiding principle,” said Thatte. “This gives me a great framework to build from, as I look to enhance the existing software product suite, while delivering a powerful platform for perception innovation via a versatile and powerful toolset for our engineering development customers. It’s exciting to be at the helm of such transformational technology, and I’m thrilled to join AEye, a leader in revolutionizing transportation.”

AEye’s iDAR is a new form of intelligent data collection that fuses 1550 nanometer (nm), solid-state agile LiDAR with a low-light HD camera and embedded AI to intelligently capture data at the sensor level. The only software-definable intelligent agile LiDAR, AEye’s iDAR artificial perception system leads the industry in range and scan rate performance for automotive-grade LiDAR.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Intel Capital and Airbus Ventures.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


AEye Advisory Board Profile: Tim Shiple

We sat down with each of our Advisory Board Members to ask them why they’re excited about working with AEye…

Mr. Shiple has 30 years of experience in roles such as Chief Operations Officer and Chief Supply Chain Officer. His experience includes portfolio management, procurement, finance, operations, M&A, ERP, and product hardware and software development. At Google, he led Supply Chain, Quality, and Customer Care initiatives. As the senior executive in charge of M&A at Ecolab, Mr. Shiple helped improve the company’s performance and identify acquisition synergies. Mr. Shiple earned a bachelor’s degree in business at Spring Arbor University and received Executive Management and Leadership education at Columbia University.

Q: What in your past experience ultimately drew you to the autonomous vehicle arena?
With over 15 years of experience in the automotive industry and another 12 years in technology, I’ve found that the development of autonomous vehicles is the perfect intersection of my experience and interests.

Q: Why AEye?
Because of my experience in automotive and technology, I’m able to spot a lucrative startup in those fields. AEye is one of those.

Talking to many key players in the automotive industry and the autonomous space — such as Tier 1s and OEMs who are trying to assess their positioning and their technology — has given me the unique ability to find a promising startup and help it grow and mature, while also providing resources and connections on both sides of the industry.

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond? How do you see AEye playing a pivotal role in this vision?
As perception layers become more important to automotive companies, Tier 1s will certainly include AEye’s advanced artificial perception system, iDAR, in the ADAS systems they’re developing. In the short term, I see that the industry will first make better safety systems for ADAS, which will eventually lead us through the 5 levels of autonomy.

But I’m not just referring to self-driving cars. This will include planes, trains, and other vehicles, many of which will usher us into a new dimension of transportation. We’ll begin to see self-driving vehicles not only on the road, but in the air, getting us where we need to go faster and more economically. The promise of flying vehicles will be fulfilled sooner than we think, and AEye will play a pivotal role in perception for these vehicles every step of the way.


AEye Extends Patent Portfolio, Creating Industry’s Most Comprehensive Library of Solid-State Lidar Intellectual Property

Patented technology gives iDAR system the ability to create superior perception data, radically improving safety, reliability, and efficiency of autonomous vehicles.

Pleasanton, CA – February 20, 2019 – Artificial perception pioneer AEye today announced a significant expansion of its global patent portfolio, demonstrating its ongoing commitment to innovative solutions that promote safe and reliable vehicle autonomy. In addition to three patents previously announced, AEye has been awarded eight additional patents covering core components of AEye’s iDAR™ (Intelligent Detection and Ranging) perception system, including its solid-state MEMs-based agile LiDAR, fused HD camera, and software definable AI technology.

Six of the new patents relate to iDAR’s unique ability to enable software-defined frames and dynamic scan patterns. iDAR eliminates the constraints of the typical static point cloud by introducing dynamic scan modes designed to improve information acuity and minimize latency. With iDAR, scan modes can be combined and their data aggregated, enabling an AEye iDAR-powered device to be configured according to situational demands, such as when a vehicle moving at speed encounters congestion. This results in improved perception and time-to-reaction, while significantly reducing resource overhead.

Two of AEye’s recent patents focus on how the iDAR platform monitors and manages the laser energy of its agile LiDAR. iDAR deploys adaptive energy control on a pulse-by-pulse basis, allowing the system to automatically adjust the energy, tuning and shaping of each laser pulse. This advanced architecture allows for both the dynamic adjustment of scan patterns and the ability to adapt the laser energy for each pulse — enabling complete interference mitigation, eye safety and retro mitigation.

“AEye set out to create the optimal architecture for artificial perception, reinventing the core technologies that enable it,” said Aravind Ratnam, vice president of Products at AEye. “These patents represent AEye innovations that allow us to intelligently process data at its source, while using minimal laser power to actively interrogate each frame. By optimizing and adjusting pulse power and direction in real time, we are able to simultaneously focus on regions of interest, filter irrelevant data, and eliminate interference and spoofing, while delivering the highest levels of eye safety and retro mitigation.”

AEye’s iDAR Brings Intelligence to LiDAR
Conventional LiDAR systems are designed to provide basic search capabilities — passively capturing limited data and treating it all equally without distinguishing critical threats or objects in the scene. In contrast, AEye’s iDAR performs multi-modal intelligent search which can then acquire, pre-classify and track multiple objects quickly. Putting this intelligence in the sensor at the edge of the network produces better quality perception data that is more quickly, efficiently and accurately processed into actionable information. In addition, iDAR is a software-definable artificial perception system designed to be interoperable with existing LiDAR sensors and can be customized and extended as needed.

“The difference is our fundamental system architecture, which allows us to produce mobility solutions with industry leading range, resolution and update rate performance, while at the same time delivering ADAS solutions that are optimized for size, weight, power, and price,” said Joel Benscoter, head of Customer Success for AEye. “We can do this while ensuring both products will be under ASIL-B functional safety certification, to ensure they meet the automotive industry’s stringent safety standards. Nobody else can do this.”

AEye recently announced the AE110 for mobility applications and the AE200 for ADAS solutions. The features enabled by the technologies covered in these patents will be available in both product lines.



This Is AEye

Solving the Challenges of Artificial Perception for Autonomous Vehicles
We are the creators of iDAR, the world’s leading artificial perception platform, which combines solid-state, agile LiDAR with a low-light HD camera and embedded artificial intelligence to mimic the advanced data structure of the human visual cortex, allowing for safer, more efficient autonomous driving. iDAR’s unique architecture enables industry-leading performance in range, scan rate, and resolution.

AEye has demonstrated a range of more than 1,000 m, a scan rate of greater than 100 Hz, and the ability to achieve 1 mm scale resolution with true color. And that’s only the beginning! As a company, AEye values integrity, innovation, and a dedication to changing the world by bringing sensible, problem-solving ADAS and Mobility perception solutions to market.

What groundbreaking achievements will AEye reach next?


IEEE Spectrum Examines LiDAR Wavelength Safety for the Human Eye and Digital Cameras

IEEE Spectrum details why the 1550 nanometer wavelength for LiDAR is safe for the human eye, and calls the industry to work together to ensure multimodal sensor compatibility and interoperability.

Article >

Motor Trend Features AEye’s iDAR System for Self Driving Cars

In its “Innovations in Enviro-Sensing for Robocars” feature, Motor Trend details how AEye’s iDAR embeds microelectromechanical systems into solid-state LiDAR, allowing it to fire photons randomly rather than in a preset pattern, making it easier for the computer to process, thus reducing latency and power consumption five to ten times over competitors.

Article >

Reuters Rides Along with AEye at CES 2019

AEye takes Reuters on a drive down the Las Vegas Strip to show off how its artificial perception technology can detect up to 1000 meters and mimic human perception by focusing on important objects in a scene.

News Video >