AEye Team Profile: Indu Vijayan

On April 11, 2019, AEye’s Technical Product Manager, Indu Vijayan, will speak on “AI & Machine Learning” at SAE World Congress in Detroit, Michigan.

Indu Vijayan is a specialist in systems, software, algorithms, and perception for self-driving cars. As the Technical Product Manager at AEye, she leads software development for the company's leading-edge artificial perception system for autonomous vehicles. Prior to AEye, Indu spent five years at Delphi/Aptiv, where, as a senior software engineer on the Autonomous Driving team, she played a major role in bridging ADAS sensors and algorithms and extending them for mobility. She holds a Bachelor of Technology in Computer Science from India's Amrita University and an MS in Computer Engineering from Stony Brook University.

We sat down with Indu to learn more about why the advancement of edge computing and AI is so critical to the rollout of safe and efficient autonomous vehicles…

Q: What does it mean to implement Artificial Intelligence “at the sensor level”?

AEye’s iDAR is the only artificial perception system that pushes data capture and processing to the edge of the network. We achieve this by fusing LiDAR and camera at the sensor, which produces the highest-quality data collection. Traditional LiDAR scanning methods assign the same importance to every part of a given scene. However, as we know from our own experience behind the wheel, not all objects deserve equal priority. When driving, we pay much more attention to the pedestrian standing near a crosswalk than to a tree. In the same way, cars must be able to perceive like a human would in order to drive safely and efficiently. That means enabling the sensor to treat different regions or objects with varying degrees of priority, and to collect only the most situationally relevant information.
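To make the idea concrete, here is a minimal sketch in Python of what priority-weighted scanning could look like in principle. It is purely illustrative: the Region type and allocate_scan_budget function are hypothetical, not AEye's iDAR API. A fixed per-frame budget of scan points is divided among regions of a scene in proportion to their assigned priority, so a pedestrian near a crosswalk is sampled far more densely than a roadside tree.

```python
# Hypothetical sketch of priority-weighted scan scheduling (not AEye's
# actual iDAR API): a fixed per-frame budget of scan points is divided
# among regions of interest in proportion to their assigned priority.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    priority: float  # higher = more situationally relevant

def allocate_scan_budget(regions: list[Region], total_points: int) -> dict[str, int]:
    """Split the per-frame scan-point budget proportionally by priority."""
    total_priority = sum(r.priority for r in regions)
    return {
        r.name: round(total_points * r.priority / total_priority)
        for r in regions
    }

# A pedestrian near a crosswalk gets far more scan points than a roadside tree.
scene = [Region("pedestrian_at_crosswalk", 8.0),
         Region("oncoming_lane", 4.0),
         Region("roadside_tree", 0.5)]
print(allocate_scan_budget(scene, total_points=10_000))
# -> {'pedestrian_at_crosswalk': 6400, 'oncoming_lane': 3200, 'roadside_tree': 400}
```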

Q: Why is this favorable to the development of advanced artificial perception systems?

Since iDAR is intelligent, it can efficiently cycle and prioritize sensory information, meaning it sends only the most relevant data to the vehicle’s path-planning system. In a conventional sensor system, layers upon layers of algorithms are needed to extract relevant, actionable data, which creates too much latency for the vehicle to navigate safely at highway speeds. Say you are driving 60 mph along a highway when, suddenly, you hear the siren of an ambulance behind you, quickly closing in. In this instance, you are left with two choices: either stay in your lane and maintain your speed, or safely slow down and/or pull over to the side of the road. Your decision is determined by the auditory and visual cues you are receiving from the environment, such as the speed of the ambulance or the density of the traffic around you.

Just like human perception, our iDAR system creates feedback loops that efficiently cycle and prioritize sensory information. When the human visual cortex gathers information, it creates feedback loops that make each step of visual perception more efficient. Because we mimic this process in our system, similar behavior can be learned and trained in autonomous vehicles so that they make better, more accurate decisions, faster. As a result, the system continually learns and adapts, so that, over time, it becomes even better at identifying and tracking potential hazards.
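As a rough illustration of the feedback-loop idea (again, a hypothetical sketch rather than AEye's implementation), detections from one frame can raise the scan priority of the regions where they were found, so the next frame samples potential hazards more densely while unremarkable regions gradually decay back toward a baseline:

```python
# Hypothetical perception feedback loop: confident detections in the current
# frame boost the scan priority of their regions for the next frame, while
# all priorities decay slightly so stale interest fades over time.
def update_priorities(priorities: dict[str, float],
                      detections: dict[str, float],
                      boost: float = 2.0,
                      decay: float = 0.9) -> dict[str, float]:
    """Decay all region priorities, then boost regions with confident detections."""
    updated = {region: p * decay for region, p in priorities.items()}
    for region, confidence in detections.items():
        updated[region] = updated.get(region, 1.0) + boost * confidence
    return updated

priorities = {"crosswalk": 1.0, "horizon": 1.0}
for frame_detections in [{"crosswalk": 0.6}, {"crosswalk": 0.9}]:
    priorities = update_priorities(priorities, frame_detections)
    print(priorities)  # crosswalk priority climbs frame over frame
```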

And because we scan for and retrieve only the most relevant information in a scene, we can optimize for cost and power. For instance, we don’t need high-end, power-hungry processors to run our AI algorithms: by emphasizing data quality over data quantity, we reduce the need for a massive processor hiding in the trunk of the car. This not only makes us more cost-effective, but it could also allow the power budget inside an electric vehicle to be redistributed to enable longer-range performance, as an example. Most importantly, it allows us to make systems that are scalable and optimized for the full value chain.

Q: You will be speaking at SAE World Congress in Detroit, one of the largest gatherings of automotive industry engineers. Why is it so important for advanced automotive systems developers to regularly meet and discuss new ideas and innovations in the industry?

Ultimately, autonomous vehicles will spark a radical shift in our society. Not only will they make safer and more efficient public transportation accessible to the masses, they will give us back time for meaningful tasks that would otherwise be lost to a long commute. Engineers are the leaders in bringing about this societal change. The dream of safe, fully automated vehicles is a herculean task to take on, but it’s one that is desperately needed to move society forward. Opportunities like SAE World Congress allow engineers to brainstorm and lay the foundation stones for a safer tomorrow.
