Elaine Wu is a Sensor Systems Engineer at AEye, where she develops sensor systems and tests self-driving technology in R&D to advance current LiDAR technology. Previously, Elaine worked at Aurora, where she did extensive software testing for perception and classification. During her graduate studies, she specialized in vehicle systems and robotics, completing projects involving indoor vehicle navigation, robotic arm manipulation, and sensor integration with microcontrollers, which ultimately inspired her career in vehicle autonomy. Elaine holds a Bachelor of Science in Environmental Engineering from the University of California, San Diego and a Master of Science in Mechanical Engineering from the University of Southern California.

We sat down with Elaine to learn about why she loves working on autonomous vehicles, her experience as an Experimental Researcher at the USC Biodynamics Research Lab, and what she loves most about living in the Bay Area.
Q: In honor of Valentine’s Day, what do you love most about working on autonomous vehicles?

One of the reasons why I like working on autonomous vehicles is because I am working with innovative technology. A lot of people may shrug off that sense of ambiguity – when you’re working with technology that has not yet been built or actually utilized out in the field. But I enjoy that sense of uncertainty. It means that everything can be further tuned and customized. I enjoy the process of going from proof-of-concept all the way to a completed product. To have the opportunity to have an idea and really push through to production is really exciting.
Q: As a Sensor Systems Engineer, what are you responsible for?

I’m responsible for testing the architecture and contributing to the overarching development of the sensor system. I work across all the interdisciplinary engineering teams. I verify and test different concepts, as well as execute research and development that is implemented for each team and then iterated across all the sensor systems. I then ensure that everything is collaborative and that the system is still cohesive. I also test all the new features, validating that everything is working as designed and that it meets AEye’s and the automotive industry’s rigorous standards.
Q: You were an Experimental Researcher at the USC Biodynamics Research Lab. Can you tell us about the work you did there?

One of my research projects was on bio-inspired navigation and localization. For this project, we were interested in identifying the path taken by marine animals that do not have visual sight but are still able to navigate their environments intuitively. We were trying to identify how they navigate underwater during instances with “disturbances” and classify them. These “disturbances” usually come from a nearby object or from other moving objects, and they usually create trails, which we call “eddies,” or underwater vortices. They are easily identifiable because when an object moves through the water, it creates contrasting eddies – or vortices – and from their relative speed and dissipation, you can then identify the trajectory of the object’s path.
For this project, we had an underwater tank with floating particles illuminated by a laser sheet, along with a few miniature pressure sensors, to simulate how these disturbances are classified underwater. After post-processing footage from a high-speed camera, we identified some key characteristics of these disturbances and then interpolated how these marine animals were able to navigate through them.
Q: You’re a San Francisco native! What do you love most about the Bay Area?

I enjoy the diversity and all the different cultural aspects of the Bay Area. I think that people in the Bay Area are very colorful in the sense that not many are narrow-minded and everyone’s very open to all beliefs, ethnic backgrounds, and gender orientations and identities across the spectrum. I like that people here generally don’t have prejudices against you, and it’s nice to be accepted wherever you traverse in the city.
AEye’s iDAR Leverages Infineon AURIX as Host for Communication with AUTOSAR, Functional Safety and Embedded Software
Pleasanton, CA, December 18, 2019 – Today, artificial perception pioneer AEye announced it has integrated Infineon’s AURIX™ TC35xx microcontroller into AEye’s iDAR™ platform to ensure a robust, software-definable platform that is functionally safe for automated and autonomous vehicle initiatives. The companies will showcase the sensor fusion at the Infineon Westgate Pavilion booth #1700 and AEye booth #7538 in LVCC North Hall at CES, January 7-10, 2020.
iDAR (Intelligent Detection and Ranging) is a groundbreaking 2D/3D in-sensor perception system, which combines software extensibility, artificial intelligence and smart, agile sensors to deliver better information faster to self-driving vehicles. AURIX is a 32-bit Infineon microcontroller designed to deliver performance and safety to the automotive industry.
AURIX acts as a bridge between AEye’s embedded and perception software. For any software-defined configuration within AEye’s iDAR platform, AURIX is responsible for performance monitoring, time synchronization and other embedded software functions, ultimately standardizing communication over AUTOSAR with AEye’s perception platform and with other OEM or automotive perception software developers.
“We are pleased to partner with AEye to demonstrate the unique capability of AURIX for automotive-grade LiDAR systems,” said Ritesh Tyagi, Head of the Automotive Silicon Valley Innovation Center (SVIC) at Infineon. “iDAR from AEye uniquely enables us to demonstrate the full capabilities of AURIX, from hardware safety methods such as lockstep and monitoring, which in turn enable deterministic and safe sensing with redundant fallback, to security including secure boot, key update management and secure communication.”
“The integration of AURIX is a natural progression as we move toward commercial automotive deployments,” said John Stockton, SVP of Operations and Tech Strategy at AEye. “By using AURIX as a key element of our overall system performance monitoring, AEye ensures we meet automotive functional safety requirements while offering the most robust, software-definable perception platform for the automotive market.”
iDAR is designed to run within the sensor for autonomous vehicles. It enables basic perception to be distributed to the edge of the sensor network, so autonomous designers can use sensors to not only search and detect objects, but also to acquire, and ultimately to classify and track these objects. The ability to collect this information in real-time both enables and enhances existing centralized perception software platforms by reducing latency, lowering costs and ensuring functional safety.
Infineon’s AURIX multicore architecture, based on up to six independent 32-bit TriCore™ CPUs, has been developed according to an audited ISO 26262-compliant process and designed to meet ASIL-D at the application level. The platform uses a lockstep CPU architecture combined with safety technology such as safe internal communication buses or distributed memory protection systems. Hardware-level encapsulation techniques allow integration of software with various safety levels (QM to ASIL-D) from different sources, reducing the system complexity of implementing those safety levels.
For more information or to schedule a meeting at CES, contact [email protected].
About AEye

AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, and Airbus Ventures.
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-400-4366
AEye Hires Veteran Financial Exec as VP of Finance
Andrea Haviley to Lead AEye’s Finance, Accounting and Growth Strategies as Company Scales Operations
Pleasanton, CA, December 12, 2019 – Today artificial perception pioneer AEye announced it has hired veteran financial executive Andrea Haviley as the company’s vice president of finance. Reporting to president Blair LaCorte, Haviley is responsible for driving AEye’s long-term and annual operating plans, capital financing, M&A, and strategic initiatives and processes, as well as building the finance team and implementing best practices across Finance, HR and IT to drive operational excellence as the company continues to expand and grow its business.
“We are thrilled that Andrea is bringing her deep financial expertise and acumen to AEye at this pivotal time in our growth,” said Blair LaCorte, president of AEye. “The complexity of our business is increasing as we expand our global OEM, tier 1 and automobility customer base, while moving into new markets. Andrea has a track record of excellence across corporate, operational and strategic finance roles, and she will be an asset to our maturing business.”
Haviley is a 25-year veteran in the field with a history of strong financial management. Prior to AEye, she served as director of operations finance at Veeva Systems, where she built out the finance team as the company grew from a private company with approximately $50 million in subscription revenues to a public company with more than $1 billion in projected total revenues. Prior to Veeva, Haviley held corporate and operational finance roles at Taleo, PeopleSoft, Netscape, and Ernst & Young. She graduated from Cal Poly San Luis Obispo with a BS in Business Accounting, received her MBA from UC Davis, and earned a Fintech Premier Certification from Harvard University.
“I am excited to join AEye, a leader and disruptor in perception systems for autonomous vehicles,” said Haviley. “AEye has a powerful mission, vision and values, with a laser focus on innovation and execution, and executive leadership with depth of experience in both product and industry. I’m delighted to come on board at this pivotal time to prepare for high growth in 2020 and beyond.”
About AEye

AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, and Airbus Ventures.
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-400-4366
AEye and ANSYS Accelerate Autonomous Driving Safety
ANSYS enables virtual prototyping of AEye solutions to speed design, testing and validation of automotive perception technologies in challenging real-world scenarios
Pittsburgh, PA, and Dublin, CA, January 6, 2020 – The next generation of autonomous vehicles will mimic how human eyes focus on and evaluate road conditions by leveraging AEye and ANSYS (NASDAQ: ANSS) technologies. AEye is incorporating ANSYS’ industry-leading simulation solutions into the design of its Intelligent Detection and Ranging (iDAR™) platform — enabling customers to reduce physical prototyping and improve the safety and reliability of autonomous systems.
Safeguarding autonomous driving requires next-generation sensors to quickly and correctly interpret certain hazardous road scenarios that cannot be reliably detected by conventional perception platforms. To validate the sensors’ effectiveness, exhaustive road testing must be successfully completed – demanding significant development time and expenses. With ANSYS, AEye empowers automotive manufacturers to potentially simulate driving situations across millions of miles in just days, minimizing physical prototyping.
AEye is implementing ANSYS® SPEOS® and ANSYS® VRXPERIENCE®, a state-of-the-art driving simulation tool with physics-based sensor models, into the design of AEye’s iDAR – empowering customers to quickly test and certify iDAR designs within a realistic virtual driving environment. AEye’s automotive-grade iDAR combines deterministic and AI-driven perception to deliver detection and classification at high speed and long range not possible with conventional LiDAR or camera sensors. Through the integration, automotive customers deploying autonomous vehicle and advanced driver assistance systems (ADAS) will be able to virtually prototype AEye’s software-definable, agile LiDAR to simulate exactly how they want to sense their environment.
“Addressing use cases systematically will eventually allow AEye and its OEM and Tier 1 customers to drive more intelligence from the edge and achieve higher autonomous capabilities, a concept we refer to as autonomy on-demand,” said Luis Dussan, co-founder and CEO at AEye. “By collaborating with ANSYS, we are helping to accelerate customer and partner innovation – bringing safer, more reliable autonomous features to the market.”
“iDAR will substantially advance autonomous vehicles and advanced driver assistance systems’ reliability, enabling improved autonomous perception and, in turn, safer roads,” said Eric Bantegnie, vice president and general manager, Systems Business Unit at ANSYS. “ANSYS helps automotive manufacturers test scenarios that are nearly impossible to physically test, fully validating iDAR’s performance. As OEMs and Tier 1 manufacturers adopt iDAR, our simulation solutions will reduce development time and optimize implementation.”
AEye and ANSYS will showcase their ability to detect driving scenarios using SPEOS and VRXPERIENCE at CES on Jan. 7-10 in Las Vegas at ANSYS Booth 3310 and AEye Booth 7538 in the Las Vegas Convention Center, North Hall.
About ANSYS, Inc.

If you’ve ever seen a rocket launch, flown on an airplane, driven a car, used a computer, touched a mobile device, crossed a bridge or put on wearable technology, chances are you’ve used a product where ANSYS software played a critical role in its creation. ANSYS is the global leader in engineering simulation. Through our strategy of Pervasive Engineering Simulation, we help the world’s most innovative companies deliver radically better products to their customers. By offering the best and broadest portfolio of engineering simulation software, we help them solve the most complex design challenges and create products limited only by imagination. Founded in 1970, ANSYS is headquartered south of Pittsburgh, Pennsylvania, U.S.A. Visit www.ansys.com for more information.
About AEye

AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others.
ANSYS and any and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks or trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries.
ANSS-G
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-400-4366
IWPC: Next Generation ADAS, Autonomous Vehicles and Sensor Fusion, December 4-6, 2019
December 4–6, 2019 | Next Generation ADAS, Autonomous Vehicles and Sensor Fusion | San Jose, CA
December 6th, 10:30am | Safe, Cost-Effective and Efficient Perception Systems for Autonomous Vehicles
Speaker: Dr. Barry Behnken – Co-founder and SVP of Engineering, AEye
AEye Unveils New Headquarters in Dublin, California
Larger Headquarters to House Growing Workforce for Disruptive Autonomous Vehicle Technology Provider
Dublin, CA, December 16, 2019 – Today Bay Area-based artificial perception pioneer AEye announced the opening of its new, state-of-the-art headquarters at 1 Park Place in Dublin, California. The building accommodates the company’s growing workforce, while furthering its mission to pioneer breakthroughs in intelligent sensing that pave the way for safe, reliable vehicle autonomy. Employees are moving into the 50,000 square-foot facility in stages, as progress is made on the building’s extensive remodel.
AEye commemorated the opening with an evening sign-lighting ceremony led by the company’s founders, the mayor of Dublin, and AEye president Blair LaCorte.
“Our business and employee base has grown in concert with the maturing automated vehicle space, necessitating a larger headquarters that meets our needs both today and in the future,” said Luis Dussan, CEO of AEye. “This location checks all of the boxes, providing a top-notch facility that’s close to public transit in a location that enables us to attract and retain the very best workforce.”
AEye employs approximately 100 people across the U.S., Germany and Japan, and has been growing rapidly – doubling its size in 2019. That growth trajectory is expected to continue into 2020, as customers ramp up commercialization of their automated and autonomous vehicle initiatives. The Dublin headquarters will house AEye’s executive, engineering, product, business development and marketing teams. The facility will also be home to the company’s research and development labs and its indoor testing facilities, as well as a garage for AEye’s test vehicles.
“On behalf of the City of Dublin, we are thrilled to officially welcome AEye to the community,” said Dublin Mayor David Haubert. “AEye is working on exciting, world-changing technology, and we are proud to welcome such an innovative tech company to our city.”
About AEye

AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since its demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others.
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-400-4366
AEye Team Profile: Umar Piracha
AEye’s very own Umar Piracha will be chairing the Optical Technologies for Autonomous Cars and Mobility symposium at CLEO 2020 in San Jose this spring.
Umar Piracha is a Staff LiDAR Systems Engineer at AEye. He has a master’s degree from the University of Southern California and a PhD from the College of Optics and Photonics (CREOL) at the University of Central Florida, where he developed a LiDAR system using a mode-locked laser for high-resolution ranging at distances of tens of kilometers. He has experience working for large and small companies, including Intel, Imec, and Luminar Technologies, and has successfully co-founded a fiber sensing startup. He is a Senior Member of the IEEE, an Associate Editor for SPIE’s Optical Engineering journal, and serves as a reviewer for NSF’s SBIR program. Dr. Piracha has 32 conference and journal publications and 3 patents.

We sat down with Umar to discuss chairing a session at CLEO 2020, why LiDAR is critical for autonomous vehicle perception, and why he’s known around town as “Dr. Laser”.
Q: Congratulations on being named Chair of Optical Technologies for Autonomous Cars and Mobility at CLEO 2020! Can you tell us a bit about what this role entails?

Thank you! It will be a very rewarding and fun experience. As Chair, I’ll review the latest results from different research groups in academia and industry around the world and select the ones that are making the most impact in the field of self-driving cars. These research groups will then have the opportunity to present their results to an audience of technical leaders from around the world during my session at CLEO.
Q: Why is LiDAR so imperative to the overall safety and reliability of artificial perception systems for self-driving cars?

LiDAR is enabling self-driving cars to become a reality by ensuring that they drive with the least amount of risk to other drivers and those around them. Humans are emotional, can be easily distracted, and are prone to mistakes. However, the human visual cortex is the most advanced perception engine on the planet. Since the processing and perceptive power of the human visual cortex and brain is far beyond the fastest supercomputer in the world, it is not an easy task to safely replace human drivers with automation and artificial intelligence. Therefore, the inclusion of LiDAR is necessary for autonomous vehicles because it reduces the burden of real-time perception and prediction, which is not possible using AI and stereo cameras alone. The reality is: multiple sensors, including LiDAR, radar, cameras, etc., coupled with advanced data processing and AI, will be required to make self-driving cars safe and reliable.
Q: You’re known around town as “Dr. Laser.” Care to elaborate on that nickname?

I used to teach at CREOL (The University of Central Florida) as an Adjunct Instructor, where my students would call me Dr. Umar. But since I love lasers, when I purchased my new car, I wanted to get a fun license plate to go with it, so I got one that says “Dr. Laser”. I think it’s very appropriate since I love cars – and lasers too.
Q: You used to do standup comedy! How would you describe your act? Who are a few of your favorite comedians?

After getting a double master’s degree, a PhD in lasers, and the nickname of “Dr. Laser”, I wanted to involve myself in something totally opposite of all that! So I took improv and stand-up comedy classes and performed a few times in Orlando. Unlike most stand-up comics, I always made sure my humor was suitable for all audiences. Personally, I think Maz Jobrani is an extremely hilarious comic!
DesignNews Spotlights AEye President Blair LaCorte’s Press Conference at AutoMobility LA
Chris Wiltz of DesignNews chronicles AEye President Blair LaCorte’s AutoMobility LA 2019 press conference where he discusses how iDAR enables autonomous vehicle sensing technology to “out-perceive the human eye” using biomimicry, iDAR’s motion forecasting ability, and how AEye’s technology is similar to the Indoraptor from “Jurassic World: Fallen Kingdom.”
LA AutoMobility – November 18-21, 2019
November 18–21, 2019 | LA AutoMobility | Los Angeles, CA
Press Conference: AEye | Tuesday, November 19th, 3:55-4:10pm PST | Technology Pavilion: Structure A
Tech.AD – Autonomous Vehicle Technology Innovation of the Year