AEye Team Profile: Ove Salomonsson

On October 30th, AEye’s Sr. Director, LiDAR Product Architect, Ove Salomonsson, will speak at two sessions at SAE Innovations in Mobility in Novi, Michigan: “Using Intelligent Sensing to Achieve Accurate, Fast Perception” and “Bringing Intelligence to the Edge.”

Ove Salomonsson has 30+ years of experience in automotive safety electronics engineering. He came to AEye from Lucid, where he was director of Autonomous Driving and ADAS. Before that, he led long-range ADAS system development at Magna Electronics and directed technology development at Autoliv Electronics, where he was also General Manager of the Night Vision camera division. Salomonsson was also VP of Traffic Systems at Saab Systems and worked on DSRC (V2V) at Saab Combitech. He began his career at Volvo, where he managed safety technology projects. Salomonsson holds a BSc in Innovation Engineering from the University of Halmstad.

We sat down with Ove to learn about how ADAS, self-driving technologies, and perception sensors have evolved over the years, and what he misses most about Sweden.

Q: You’ve been working in ADAS and autonomy for quite some time – how have you seen these technologies evolve over the years?

Radar has been around for quite some time and continues to evolve into smaller, better, and less costly sensors. In fact, cost has come down so significantly that radar is now on close to every new car delivered – either providing standalone applications like cross-traffic alert or blind spot detection, or serving as part of a larger ADAS system. However, radar is still quite limited by its relatively low resolution and multipath problems.

It was certainly a big deal when cameras became automotive-qualified and low cost enough to make it onto vehicles in greater volume, such as backup cameras and forward-facing cameras for collision alerts. A camera image captures far more information than radar, and resolution has now increased to 8 megapixels. However, the costly and energy-consuming AI compute portion of ADAS and AV systems will still need time to catch up with that amount of information rushing into the AI algorithms.

Recent test results from AAA show that performance is still flawed in automatic emergency braking (AEB) systems, which highlights the importance of LiDAR. LiDAR is the final sensor modality needed to make ADAS systems (and eventually full autonomy) work effectively in all conditions. LiDAR is more deterministic by nature, as it can detect and measure the distance to all objects. And with an agile LiDAR, such as AEye’s iDAR, this can be done incredibly fast, with the added ability to classify objects and determine their velocity.
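To make the “deterministic” point concrete: a LiDAR range measurement follows directly from the round-trip time of a laser pulse. The short sketch below is a generic time-of-flight calculation with an assumed pulse timing value, not a description of AEye’s iDAR internals.

```python
# Generic time-of-flight range calculation (illustrative only; not AEye's iDAR).
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(round_trip_s: float) -> float:
    """Return distance in meters for a measured round-trip pulse time."""
    return C * round_trip_s / 2.0  # divide by 2: the pulse travels out and back

# Example: a pulse returning after ~1 microsecond corresponds to ~150 m.
print(f"{range_from_round_trip(1.0e-6):.1f} m")  # -> 149.9 m
```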

Q: How have perception sensors for AVs evolved during this time?

I have seen this evolution take place from both the OEM and Tier 1 points of view. However, the most important part is the end-customer and societal benefit, such as a reduction in automotive accidents. Lower costs and increased capability in resolution and field of view have meant that new applications have been created, expanded upon, and deployed in the market. The list of driver assistance systems keeps growing: from lane departure warning, forward collision warning, and automatic high beam assist to newer features like front and rear automatic emergency braking (AEB) and adaptive cruise control (ACC) with lane following.

With improved sensors and perception algorithms, the focus has now shifted to allowing “hands off” the wheel and, more recently, “eyes off” the road under certain conditions. This is the first (and most challenging) step towards true autonomy, since responsibility is transferred to the vehicle for at least some amount of time. Any time we allow the driver to hand over driving tasks to the vehicle, it also becomes important to constantly monitor the driver’s awareness in case control needs to be transferred back. This is why driver monitoring cameras have been introduced in some vehicles.

However, perception sensors still need to achieve enough redundancy for autonomous driving systems to be able to “fail operationally.” For example, in a truly autonomous vehicle, the passenger may be sleeping, and the vehicle will have to be able to continue to drive by itself, say, if the camera loses power or a bird hits the windshield right in front of the camera. The vehicle still needs to be able to operate for a certain time period (or until it reaches a safe place to stop) using LiDAR and radar together.

The last and most important piece of the puzzle needed to provide enough redundancy for these systems to “fail operationally” (and also cover additional edge cases) is indeed LiDAR. LiDAR’s deterministic range measurements, high resolution, and low-light capability make it a great complement to both radar and camera.

Q: You grew up in Sweden! What Swedish traditions (holidays, foods, activities) do you miss most here in the States?

I moved to the US almost 25 years ago, but I still go back to Sweden at least once a year to celebrate Midsummer.

What I miss the most is Swedish chocolate and fresh seafood, including Sweden’s wide variety of marinated herring. And, believe it or not, Sweden is home to an exquisite kebab pizza (not a Viking tradition, but rather, a new delicacy).

By the way, did you know that one of Sweden’s biggest exports is music? ABBA, of course, ruled the seventies; Roxette the eighties; Robyn and the Cardigans the nineties; and, more recently, Tove Lo and Zara Larsson. Even Spotify is Swedish!

—–

Connect with AEye at SAE Innovations in Mobility.
