Digital Journal Asks: “Will agile sensor technology surpass LiDAR?”

New sensor technology presented by AEye, which specializes in artificial perception, aims to combat key safety concerns regarding the development and mainstream adoption of autonomous vehicles.



AEye’s $40M Series B Includes Numerous Automotive Leaders Including Subaru, Hella, LG, and SK

Strategic Automotive Investors Support AEye’s iDAR™ Technology Approach in Addressing the Perception Needs of Both ADAS and Mobility Markets
“The AEye investment aligns perfectly with SUBARU’s aim to invest in leading-edge technology to advance our assisted and autonomous driving efforts. AEye is a front-runner in developing accurate, low-latency, low cost perception systems, and we anticipate great synergy in working with AEye towards safe, reliable vehicle autonomy.”

Pleasanton, CA – December 19, 2018 – AEye, a world leader in artificial perception systems and the developer of iDAR™, today announced the second close of its Series B financing, bringing the company’s total funding to over $60 million. AEye’s Series B included global automotive OEM, Tier 1, and Tier 2 strategic investors Hella Ventures, SUBARU-SBI Innovation Fund, LG Electronics, and SK Hynix. AEye previously announced that the round was led by Taiwania Capital along with existing investors Kleiner Perkins, Intel Capital, Airbus Ventures, R7 Partners, and an undisclosed OEM. The Series B funding will be used to scale AEye’s operations to meet global demand for its artificial perception systems for both ADAS and Mobility solutions.

AEye takes a disruptive approach to vehicle perception by putting intelligence at the sensor layer and making it extensible and controllable via a software-driven architecture. AEye’s iDAR is the first and only perception system that can easily scale from modular ADAS solutions to complete 360-degree vision for Mobility applications. iDAR physically fuses the industry’s only 1550nm, solid-state, agile LiDAR with a high-resolution camera to create a new data type called Dynamic Vixels. This real-time integration occurs in the iDAR sensor itself, rather than by fusing separate camera and LiDAR data after the scan. By capturing both geometric and true color (x, y, z and r, g, b) data, Dynamic Vixels uniquely mimic the data structure of the human visual cortex, capturing better data for vastly superior performance and accuracy.
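
For readers who want a concrete picture of the data type, here is a minimal sketch of what a single fused sample could look like, assuming a simple per-point pairing of a LiDAR return with its co-registered camera pixel. The field names and structure are illustrative assumptions, not AEye’s actual (proprietary) format.

```python
from dataclasses import dataclass

# Illustrative sketch only: the real Dynamic Vixel format is proprietary.
# Each fused sample carries both geometry (from LiDAR) and color (from the camera).
@dataclass
class DynamicVixel:
    x: float  # meters, lateral position from the LiDAR return
    y: float  # meters, vertical position from the LiDAR return
    z: float  # meters, range/depth from the LiDAR return
    r: int    # 0-255, red channel of the co-registered camera pixel
    g: int    # 0-255, green channel
    b: int    # 0-255, blue channel
    t: float  # seconds, shared capture timestamp for both modalities

def fuse(lidar_point, camera_pixel, timestamp):
    """Combine one LiDAR return with its co-registered camera pixel."""
    x, y, z = lidar_point
    r, g, b = camera_pixel
    return DynamicVixel(x, y, z, r, g, b, timestamp)

# Example: one return at 42 m range paired with a mid-gray pixel.
print(fuse((1.2, 0.4, 42.0), (128, 128, 130), timestamp=0.010))
```

The point of the structure is that geometry and color travel together from the moment of capture, rather than being stitched together downstream from two separately timestamped streams.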

“This funding marks an inflection point for AEye, as we scale our staff, partnerships and investments to align with our customers’ roadmap to commercialization,” said Luis Dussan, AEye founder and CEO. “The support we have received from major players in the automotive industry validates that we are taking the right approach to addressing the challenges of artificial perception. Their confidence in AEye and iDAR will be borne out by the automotive-specific products we will be bringing to market at scale in Q2 of 2019. These products will help OEMs and Tier 1s accelerate their products and services by delivering market-leading performance at the lowest cost.”

“The AEye investment aligns perfectly with SUBARU’s aim to invest in leading-edge technology to advance our assisted and autonomous driving efforts,” said Itaru Ueda, Manager of SUBARU-SBI Innovation Fund. “AEye is a front-runner in developing accurate, low-latency, low cost perception systems, and we anticipate great synergy in working with AEye towards safe, reliable vehicle autonomy.”

Built on a robust and rapidly growing patent portfolio, AEye’s iDAR recently set a new benchmark for solid-state LiDAR range. In performance tests validated by VSI Labs, iDAR acquired and tracked targets at more than 1000 meters and at scan rates exceeding 100 Hz – a major breakthrough in speed and long-range threat detection.
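
As a rough, back-of-the-envelope illustration of why those two numbers matter (using generic highway-speed assumptions, not AEye test data), range buys warning time and scan rate bounds how far a threat can move between looks:

```python
# Rough illustration only: speeds below are generic assumptions, not AEye test data.
detection_range_m = 1000.0   # reported iDAR acquisition range
closing_speed_mps = 31.0     # ~70 mph / ~112 km/h closing speed (assumption)
scan_rate_hz = 100.0         # reported scan rate

time_to_contact_s = detection_range_m / closing_speed_mps        # warning time, ~32 s
revisit_interval_s = 1.0 / scan_rate_hz                          # 10 ms between scans
travel_between_scans_m = closing_speed_mps * revisit_interval_s  # ~0.3 m of motion per scan

print(f"Warning time at first detection: {time_to_contact_s:.1f} s")
print(f"Revisit interval: {revisit_interval_s * 1000:.0f} ms "
      f"({travel_between_scans_m:.2f} m traveled between scans)")
```

Under those assumptions, a 1000 m detection gives roughly half a minute of warning, and a 100 Hz revisit rate means a fast-closing object moves only fractions of a meter between successive scans.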

AEye has also recently announced the appointment of Blair LaCorte as president. An accomplished executive and strategist, LaCorte brings extensive leadership experience to the company as it scales global operations.

For more information about AEye and its innovative approach to artificial perception, please visit us at CES booth #2100 at the Westgate Convention Center, Las Vegas.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


eeNews Europe Examines AEye’s Ground-Breaking, New Sensor Data Type: Dynamic Vixels

Dynamic Vixels strengthen AEye’s biomimetic approach to visual perception, enabling vehicles to see and perceive like humans do to better evaluate potential driving hazards and adapt to changing conditions.



TechCrunch Reports on AEye’s $40M Series B Funding Led by Taiwania Capital

TechCrunch announces AEye’s close of Series B funding. Led by Taiwania Capital and including returning investors Kleiner Perkins, Intel Capital, Airbus Ventures, and Tyche Partners, the artificial perception pioneer and creator of iDAR raised $40 million, bringing its total funding to roughly $61 million.



Forbes Features AEye’s iDAR 1000 Meter Detection Range and 100 Hz Scan Rate Achievement

Forbes details why conventional solid-state LiDAR systems won’t be enough to cultivate the future of autonomous vehicles. Instead, what will catapult autonomous vehicles into the mainstream market is faster, smarter detection systems like AEye’s iDAR, which fuses agile LiDAR with a high-resolution, low-light camera to replicate the advanced processes of the human visual cortex.



Blair LaCorte Named President of AEye

AEye’s former Chief of Staff will lead the company’s growth and expansion as its iDAR™ system is set to scale production for the global automotive market
“Blair is an invaluable asset to AEye. His unparalleled leadership skills, insight and intellect, as well as his experience successfully leading high-growth businesses will be critical to AEye as we set to scale our business to address the needs of our automotive customers.”

Pleasanton, CA – December 12, 2018 – AEye, a leader in artificial perception systems and the creator of iDAR™, today announced the appointment of Blair LaCorte as president. LaCorte initially invested in AEye’s Series A in 2016 and was named chairman of the advisory board. He joined the management team as chief of staff in August 2017. An accomplished executive and strategist, LaCorte will now bring to bear his extensive leadership experience as the company’s new president.

LaCorte has run large, fast-growing global corporations and held numerous executive and general management positions in both private and public technology companies. Prior to AEye, he was the global president of Production Resource Group (PRG), the world’s largest live event technology company; the CEO of XOJET, which under his leadership became the fastest-growing company in private aviation; and an operating partner at TPG, a premier private equity firm with over $91B in global investments, where he co-founded the TPG growth fund. LaCorte has also held leadership roles at companies including Savi Technology, Autodesk, VerticalNet and Sun Microsystems. He is a current participant in the Virgin Galactic astronaut program, which is expected to launch in late 2019.

“Blair is an invaluable asset to AEye,” said AEye founder and CEO, Luis Dussan. “His unparalleled leadership skills, insight and intellect, as well as his experience successfully leading high-growth businesses will be critical to AEye as we set to scale our business to address the needs of our automotive customers.”

Ransom Wuller, AEye co-founder and former president, has joined the company’s board of directors and will continue in his role as chief financial officer.

“The team at AEye is exceptional,” said LaCorte. “They uniquely combine humility with burning curiosity and sincere desire to solve challenges for their partners and customers. I am grateful to be a part of it, and for the chance to work with Luis and the rest of the AEye team to build an extraordinary company.”

AEye takes a disruptive approach to vehicle perception by putting intelligence at the sensor layer and making it extensible and controllable via a software-driven architecture. AEye’s iDAR
is an artificial perception system that physically fuses a unique, agile LiDAR with a high-resolution camera to create a new data type called Dynamic Vixels. This real-time integration occurs in the iDAR sensor itself, rather than by fusing separate camera and LiDAR data after the scan. By capturing x, y, z, r, g, b data, Dynamic Vixels uniquely mimic the data structure of the human visual cortex, capturing better data for vastly superior performance and accuracy.

AEye recently set a new benchmark for solid-state LiDAR range. In performance tests validated by VSI Labs, AEye’s iDAR acquired and tracked targets at more than 1000 meters and at scan rates exceeding 100 Hz – a major breakthrough in speed and long-range threat detection. AEye’s use of biomimicry is more fully explored by LaCorte and Dr. James Doty, a world-renowned neurosurgeon and clinical professor in the Department of Neurosurgery at Stanford University, in a white paper available at aeye.ai.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


AEye Advisory Board Profile: Willie Gault

We sat down with each of our Advisory Board Members to ask them why they’re excited about working with AEye…
Willie Gault is a former NFL wide receiver and Olympic athlete. Gault was an All-American at the University of Tennessee from 1979 to 1982. He played in the National Football League for 11 seasons for the Chicago Bears and Los Angeles Raiders. Considered one of the fastest NFL players of all time, Gault was a member of the Chicago Bears team that won Super Bowl XX and also competed on both the summer and winter U.S. Olympic teams. Gault is currently an investor, remains active, and holds several world records in masters track and field.

Q: What in your past experience ultimately drew you to the autonomous vehicle arena?
As a professional athlete, I have always been fascinated and amazed by human perception and the role it plays in athletic performance. The brain’s ability to sense the details in the world around you and then accurately calculate where your body needs to be in space and time is remarkable. I have been curious about how these capabilities might be replicated with technology and artificial intelligence. Recently, I have been tracking the application of artificial intelligence in autonomous vehicles which led me to AEye.

Q: Why AEye?
What AEye is doing aligns with my interests in biomimicry, which uses knowledge of natural processes found in humans, plants, and animals to better inform technology and design. After I found out AEye was pursuing research in this field, I knew I had to be part of it.

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond? How do you see AEye playing a pivotal role in this vision?
I live in Southern California where traffic has a major impact on quality of life. Autonomous vehicles will not only improve safety and efficiency on the roads, but will greatly improve quality of life around the world. I would like to see this technology adopted quickly and widely. However, one of the barriers to its adoption is cost. I believe that AEye’s iDAR system can be manufactured at tremendous scale, efficiently, and at a price point that encourages rapid adoption.


Elon Musk Is Right: LiDAR Is a Crutch (Sort of.)

By Luis Dussan

Tesla founder Elon Musk recently declared that LiDAR is a “crutch” for autonomous vehicle makers. The comment sparked headlines and raised eyebrows in the industry. Given that this vision technology is the core of many companies’ self-driving car strategies, his view strikes many as anathema or just plain nuts.

But for the moment, let’s ignore the fact that LiDAR is vital to self-driving cars from GM, Toyota and others. Forget that the most advanced autonomous vehicle projects have focused on developing laser-sensing systems.

Even disregard that the alleged theft of LiDAR secrets was at the heart of the legal battle between Uber and Alphabet’s Waymo. Waymo claimed that LiDAR is essential technology for autonomous vehicles and recently won a settlement worth about $245 million.

The truth is: Mr. Musk is right. Relying solely on LiDAR can steer autonomous vehicle companies into innovation cul-de-sacs.

LiDAR is not enough. Autonomous vehicles require a rapid, accurate and complete perception system. It is a system-level problem that requires a system-level solution.

My agreement with Mr. Musk may seem surprising given that our company, AEye, sees LiDAR as playing a significant role in making driverless cars a commercial reality.

But we too have realized that if autonomous vehicles are ever going to be capable of avoiding accidents and saving lives, LiDAR is not the answer. At least not by itself.

Not THE answer, but part of the answer…
At Tesla, Mr. Musk is forsaking LiDAR for a 2D camera-based vision system. While Mr. Musk is known for disruptive thinking, it is hard to escape the fact that autonomous vehicles move through a 3D world and successful navigation of that world requires the seamless integration of both 2D and 3D data precisely mapped to both time and space.

At AEye, we believe LiDAR is the foundation of the solution when it seamlessly integrates with a multi-sensor perception system that is truly intelligent and dynamic. Our research has produced an elegant and multi-dimensional visual processing system modeled after the most effective in existence — the human visual cortex.

In fact, AEye’s initial perception system, called iDAR (Intelligent Detection and Ranging), is a robotic perception system that is more reliable than human vision. It integrates LiDAR with a low-light camera, embedded artificial intelligence, and at-the-edge processing to enable a car’s vision system to replicate how the human visual cortex quickly interprets a scene.
In short, iDAR enables cars to see like people.

Why is this the superior approach?
In his skepticism of LiDAR, Mr. Musk has curiously bet on a “camera-mostly” strategy when building a vision system for autonomous Tesla vehicles. He has previously made bold (many say unrealistic) predictions that Tesla would achieve full Level 5 autonomous driving with camera-mostly vision in 2019. Navigant Research, in their annual ranking of self-driving vehicle makers, says this is “unlikely to ever be achievable” and rates Tesla at the back of the pack.

The company’s Autopilot system relies on cameras, some radar, and GPS. It has suffered setbacks due to a split with its camera supplier in 2016 after a fatal accident that investigators have blamed partly on Autopilot. Last month, a Tesla smashed into a firetruck in Culver City, California, and the driver said it was “on autopilot.”

The evidence strongly argues against Mr. Musk’s decision to bet on passive optical image processing systems. Existing 2D image processors and 2D-to-3D image conversion concepts have serious flaws that can only be addressed with massive computing power and, more importantly, algorithms that have not yet been invented and are many years away from becoming a reality. This makes the approach too costly, inefficient and cumbersome to achieve Level 5 autonomous driving at commercial scale.
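
A small sketch of the underlying geometry helps explain the flaw (this is standard pinhole-camera math, not AEye’s algorithm): a single 2D image fixes only the direction to a point, not its distance, so camera-only systems must estimate depth, whereas a LiDAR return measures that missing range directly.

```python
import numpy as np

# Standard pinhole-camera back-projection, shown only to illustrate the ambiguity:
# a pixel by itself defines a ray, not a 3D point. Depth must be estimated
# (camera-only) or measured (LiDAR). Intrinsics below are for a hypothetical camera.
K = np.array([[1000.0,    0.0, 640.0],   # focal length (px) and principal point
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def back_project(u, v, depth_m):
    """Return the 3D point (camera frame) for pixel (u, v) at a known depth."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # direction, defined only up to scale
    return depth_m * ray                            # scale is fixed only when depth is known

# The same pixel with two different assumed depths yields two very different 3D points;
# a direct LiDAR range measurement removes that ambiguity.
print(back_project(800, 400, depth_m=10.0))
print(back_project(800, 400, depth_m=50.0))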

At AEye we know that integrating cameras, agile LiDAR, and AI yields a perception system that is better than the sum of its parts. It surpasses both the human eye and the camera alone, which is essential until the sophistication of the human brain can itself be replicated.

In his “crutch” comments, Mr. Musk predicted that LiDAR-based systems will make cars “expensive, ugly and unnecessary,” adding: “I think they will find themselves at a competitive disadvantage.” The truth is that size, weight, power, and cost are decreasing for vehicle navigation grade LiDAR. And they will fall further. AEye, and maybe others, will see to that.

We respect Musk’s innovations and are grateful to him for shedding light on where LiDAR needs to go to reach full autonomy. But in the end, because we see LiDAR as a lever rather than a crutch, we can only give him partial credit for his understanding of the way forward.


AEye Advisory Board Profile: Elliot Garbus

We sat down with each of our Advisory Board Members to ask them why they’re excited about working with AEye…
Elliot Garbus is a strategy and management consultant working with startups, established companies, and venture capital firms. Elliot retired from Intel in May of 2017, where he was the Vice President and General Manager of the Transportation Solutions Division, responsible for delivering Intel’s vision for connected cars, autonomous driving, and intelligent transportation systems. Having worked directly with technology companies, automotive manufacturers, and automotive suppliers, Mr. Garbus has a unique perspective on the coming disruptive transformation that will occur as self-driving vehicles become reality.

Q: What in your past experience ultimately drew you to the autonomous vehicle arena?
I was formerly the Vice President and General Manager of the Transportation Solutions Division at Intel. In that role, I had a front-row seat as autonomous driving went from research to a race to commercialization.

The opportunity for autonomous vehicles excites me. The main reason is the positive social impact. Today, about 1.3 million people die every year from traffic accidents globally, and an additional 50 million are injured. Over 94% of collisions are caused by human error. I believe we have an opportunity to largely eliminate these fatalities and injuries, making the world a much better and much safer place.

Q: Why AEye?
AEye has a fantastic set of technologies that they’ve combined in a new way to deliver breakthroughs in perception. I’m also very impressed with the unique history of the leadership team. They have a tremendous amount of experience with LiDAR from their work in aerospace. It is unusual to find a start-up in the United States with this kind of experience, and a team that has worked with LiDAR for decades.

Q: Where do you see ADAS solutions, autonomous vehicles, and/or artificial perception, heading within the next few years? The next decade? Beyond? How do you see AEye playing a pivotal role in this vision?
In biological evolution, when sight emerged, it led to a stunning acceleration of biological diversity. It’s my expectation that the integration of visual perception and computing, in the way that AEye is pioneering, will lead to a similar explosion of innovation across many industries. Ultimately, autonomous driving is going to change the way our cities are put together. It will change the way products are delivered. It will address congestion, air pollution, and will have a dramatic impact to the insurance and healthcare industries — all while making the future a better and brighter place.


The Future of Autonomous Vehicles: Part II – Blind Technology without Compassion Is Ruthless

By James R. Doty, MD

The automotive industry is on a quest to create fully functioning, safe, and responsive autonomous vehicles enhanced by artificial intelligence (AI). Today’s AI is a function of machine learning and neural networks that rely almost completely on training and repetition. This makes sense for repeatable tasks, but what about for higher-order skills, such as driving, that require more than mere mechanized logic? Humans do not drive purely from an autonomic response to input. We add emotional intelligence, intuition, and morality as decisive factors while we’re behind the wheel.

At an early age, I was introduced to four mindfulness techniques: relaxation, taming the mind, opening the heart, and clarifying the intent. Over the years, I’ve spoken often about the importance of the third lesson, which I didn’t quite grasp until later in my life: opening the heart. What I learned is that nurturing compassion and its connectedness to all of life clarifies the last practice of setting intention. For when the heart is open, we are able to think more about longer lasting and purpose-driven goals, such as cultivating relationships, helping others, and growing more aware of the intricate beauty in our shared experiences. By opening the heart, we are capable of thinking beyond cost benefit analysis, beyond selfish desires, and into support for others and the greater good.

Today, artificial intelligence faces the same challenge. As a neurosurgeon and founder of the Center for Compassion and Altruism Research and Education at Stanford University, I am fascinated by the possibility that one day AI may not only mimic human sensory perception, but also human compassion. Currently, AI has been trained to relax, focus and clarify — but it hasn’t been trained to open its heart. Together with AEye — a company that mimics human perception to create an artificial perception platform for autonomous vehicles — we are leading the discussion to change that. But we need your help. It is our collective responsibility, as compassionate humans, to initiate a dialogue with those at the helm of this technology, so we may consider the “behaviors” that will ultimately be written into self-driving software programs for potentially dangerous situations. In other words: what will be the car’s guiding ethics? To make truly safe, autonomous vehicles, will we need to “teach” them empathy and compassion?

Can we train AI to have compassion?
As I outline in Part I of this article, Think Like a Robot, Perceive Like a Human, we have been able to give autonomous vehicles human-like perception. But can we give them a heart? And should we? This is where the debate begins. Self-driving vehicles are achieving the ability to identify relevant information, process it, and respond accordingly, as a human would. Now, we must consider how to incorporate human-like empathy and compassion into their decision-making, for blind technology without compassion is ruthless. If we want computers to have compassion, we must allow the space and time to build the software appropriately, to prevent processes that blind us from our humanity.

The first way to train computers to perceive with compassion is to give them enough time to infer intent of movement. Is that a child standing on the sidewalk with potentially unpredictable behavior? Is there a ball in the road and a person following it? Is a blind person crossing the intersection, unaware of the approaching vehicle? As humans, we process these evaluations first, allowing us to not only see an object, but a person. We are compassionate in our understanding that a child may run into the street, or an individual may be unaware of a vehicle approaching the intersection. This ultimately allows us to drive with intention, taking on responsibility for the safety of other people, in addition to ourselves. This is more advanced than conventional AI, which is programmed and trained to track objects (or “blobs”), in a cold, repetitive way.

The second way to give computers compassion is to develop AI with situational awareness. Situational awareness means that a driver understands the need to approach an intersection with caution, as people may be crossing the street. Conventional AI in autonomous vehicles lacks this type of perception. However, innovative companies like AEye build sensors to have situational awareness, allowing autonomous vehicles to not only have capabilities that we take for granted in human perception (like differentiating between how we maneuver our vehicles through an urban area versus driving along a country road), but to have intuition, compassion and understanding of possible intent. For example, if the system’s LiDAR sensor identifies a large object in front of the vehicle, the camera and computer vision algorithms work in unison to more fully investigate the scene to identify whether it is a truck, an ambulance, or a school bus and, therefore, initiate an appropriate response (such as slowing down in anticipation of the school bus stopping). Building situational awareness into self-driving systems inherently builds in social morals.
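
The cueing loop described above can be expressed as a few lines of decision logic. The sketch below is a hypothetical illustration of that described behavior; the threshold, class names, and responses are invented for the example and are not AEye’s implementation.

```python
# Hypothetical sketch of the LiDAR-cues-camera loop described above.
# The threshold, class list, and responses are illustrative assumptions.
LARGE_OBJECT_M2 = 4.0  # minimum cross-section (m^2) that triggers a closer look

def respond_to_detection(lidar_object, classify_with_camera):
    """LiDAR flags a large object; the camera classifies it; the planner reacts."""
    if lidar_object["cross_section_m2"] < LARGE_OBJECT_M2:
        return "continue"                          # small return: keep normal tracking
    label = classify_with_camera(lidar_object["bounds"])  # revisit region with camera + CV
    if label == "school_bus":
        return "slow_and_prepare_to_stop"          # anticipate the bus stopping
    if label == "ambulance":
        return "yield_right_of_way"
    return "maintain_safe_following_distance"      # e.g. an ordinary truck

# Example use with a stubbed classifier:
print(respond_to_detection({"cross_section_m2": 6.5, "bounds": (120, 40, 260, 180)},
                           classify_with_camera=lambda bounds: "school_bus"))
```

The design point is that the coarse LiDAR detection decides where to spend camera and compute attention, so the more expensive classification runs only where the scene demands it.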

Third, if we look at our own behavior (as a species and as individuals), we see that we continually act upon our assumptions (rules of thumb or shared knowledge), which are guided by feedback and effects witnessed from past behavior. Choosing the most compassionate decision is determined by the context of a single moment, and this context is determined by our unique ability to efficiently assess our environment. Therefore, in the situation of driving a vehicle, for our own survival and for the empathetic survival of others, we take calculated risks. As examples: turning left at an intersection, estimating the required speed to merge with oncoming traffic, predicting the best moment to change lanes on the highway, even simply driving in one city versus another. Each of these scenarios requires knowledge of different contexts (situational awareness) and rules guided by previous experiences (assumptions). Our perspective of the world is built by our unique history, which in turn leads to better, more compassionate perception.

An AI ‘Trolley Problem’
The AI algorithm that coordinates self-driving car responses must make decisions which, at times, may not be easy, even for a human. Consider a scenario where a cat runs into the path of an autonomous vehicle on a crowded city street. What is the vehicle programmed to do? Option 1) it hits the cat. This decision doesn’t impact the safety of the humans in the vehicle, thus the AI optimizes for the safety of humans overall (sorry, Kitty). Option 2) it brakes hard or swerves to avoid hitting the cat. Although this decision would spare the cat’s life, it could potentially cause a serious accident, which would harm humans and cause traffic delays. Option 3) it develops a sophisticated algorithm that calculates the potential risk of stopping/swerving for the cat and determines the optimal outcome before deciding. But in this scenario, how many dimensions can be considered simultaneously? My choice would be Option 3, as I would opt (if I can) to save the cat. But this poses another ethical conundrum: who determines the programmed decision the vehicle would make?
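
Option 3 is essentially an expected-harm calculation over the first two choices. A toy sketch of that comparison is below; every probability and cost weight in it is an invented placeholder, because deciding those numbers is precisely the ethical question being raised.

```python
# Toy expected-harm comparison for the scenario above. All probabilities and
# cost weights are invented placeholders; choosing them is the ethical question.
options = {
    "hit_cat":         {"p_human_harm": 0.00, "p_animal_harm": 1.00},
    "brake_or_swerve": {"p_human_harm": 0.05, "p_animal_harm": 0.10},
}
weights = {"human_harm": 100.0, "animal_harm": 1.0}  # who sets these, and how?

def expected_harm(option):
    o = options[option]
    return (o["p_human_harm"] * weights["human_harm"]
            + o["p_animal_harm"] * weights["animal_harm"])

for name in options:
    print(f"{name}: expected harm = {expected_harm(name):.2f}")
print("chosen:", min(options, key=expected_harm))
```

With these particular placeholder weights the calculation sacrifices the cat; change the weights and the answer flips, which is exactly the “who decides” conundrum posed here.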

As hundreds of thousands of autonomous vehicles enter our streets, will we need a standard definition of compassion shared by all vehicles so they can predict behavior based on situational awareness? Or will compassion be a feature that is offered to differentiate one service from another? Should the vehicle owner, the car company, or an independent entity define a vehicle’s ethics? How about its level of empathy and compassion? These are all questions that have yet to be answered.

‘A man without ethics is a wild beast loosed upon this world.’
The danger of our machinery mimicking human perception without compassion is that, if AI works correctly, it eventually won’t require human tinkering. Thus, if we don’t know how to open AI’s heart and do not prioritize certain contextual data now, we will create a world in which we get from Point A to Point B by ruthless calculations, which could potentially result in immense destruction along the way. Therefore, I amend the Camus quote, above: “A machine without ethics is a wild beast loosed upon this world.”

Unlike machines, we humans use our minds and our hearts to make decisions, cultivated by knowledge and perspective based on personal experience. The AI technology being developed today must be sensitive to that, and we must consider setting clear intentions when writing these programs from the beginning. While typical software programs have been calibrated to identify a cost-benefit to many decisions, we will soon face challenges that pose new moral issues where the cost may be a life.

Artificial intelligence will only be as sensitive, compassionate and aware as we design it to be. Humans are caring and kind, and we must incorporate the best parts of humanity (our compassion and empathy) into our artificial intelligence. Otherwise, we risk blindly building a future full of cold, calculating, and ruthless technology. Now is the time to recognize this need for the lesson I learned late in life: to open the heart and make compassion the priority — to make compassion our culture. It’s our responsibility to design our computers in this same light. Therefore, we must move beyond artificial intelligence. We must create artificial humanity.

_____

James R. Doty, MD, is a clinical professor in the Department of Neurosurgery at Stanford University School of Medicine. He is also the founder and director of the Center for Compassion and Altruism Research and Education at Stanford University of which His Holiness the Dalai Lama is the founding benefactor. He works with scientists from a number of disciplines examining the neural bases for compassion and altruism. He holds multiple patents and is the former CEO of Accuray. Dr. D..