“Is LiDAR ready for mainstream security use?” asks Security Magazine. At CES 2020, the magazine sat down with Akram Benmbarek, AEye’s VP of Business Development and Strategic Initiatives, who says “the most suitable customers for security applications today are critical infrastructures and government agencies.”
AEye Team Profile: Viktoria Parker
Viktoria Parker is Supply Chain Manager at AEye. She is APICS-certified with over 15 years of experience in supply chain management, inventory control, project management, strategic supplier management, contract negotiation, supplier selection, forecasting, and planning. Prior to AEye, she worked at the medical device company Fresenius as a Senior Inventory Analyst. She also has semiconductor experience from working for Nikon as a Key Account Spare Parts Planner (Nikon Europe) and as a Senior Materials Planner (US). Originally from Belarus, Viktoria is fluent in four languages and came to the Bay Area in 2006. She holds a BA in Linguistics from Minsk State Linguistics University and a Business Administration Certificate from UCSC.

We sat down with Viktoria to learn about her role as Supply Chain Manager, the differences among the medical, semiconductor, and automotive industries, and why she loves dancing the tango!
Q: What are your responsibilities as Supply Chain Manager at AEye?
As Supply Chain Manager, I’m responsible for purchasing, warehousing, inventory control, shipping/receiving, and production planning. It is exciting to be involved and fully engaged in so many areas of the company and to have the unique opportunity to set up processes and establish rules to help make it succeed. AEye has a very inspiring environment with brilliant minds and experts in their fields. It is just a fun crowd to be around!
Q: You’re fluent in four languages: English, German, French, and Russian. How have your language skills been of value to you and your role?
English has been my passion since I was 14 years old. I love linguistics–a fascinating science with its own laws, rules, and exceptions. My first job was working for an American company while still living in Belarus, and most of my business communication was conducted in English. Knowing the language at that time, after the Soviet Union collapsed, was the leading factor in finding a decent job. Knowing English opened up so many doors! Then I moved to Germany–I mastered German pretty quickly and was hired by a leading Japanese semiconductor company, which brought me to America 7 years later. Sadly, I almost forgot my French along the way. However, a French colleague at AEye is inspiring me to refresh it, and I have started listening to a French language podcast on my lunch walks. Russian is my mother tongue and it will never be forgotten. I was always lucky to have Russian-speaking colleagues at every company where I worked, both in Germany and the US. The same goes for AEye: I have the opportunity to share a Russian joke with my colleague and to talk about the good old days.
Q: You’ve worked in the medical and semiconductor industries – how do they differ from automotive?
The medical industry is highly regulated by the FDA. There are lots of procedures to follow and approvals to collect. The semiconductor industry gives you more flexibility to be creative in the workplace. The automotive industry is regulated even more heavily than medical. The difference is that it is self-regulated: there is no governing body that would shut a company down for violating the rules. However, the rules, established by the industry itself, are even more stringent, and a company needs to follow them to stay in business. Self-regulation is quite a remarkable factor, showing that automotive companies take safety very seriously, and that’s a good thing for us all in the end.
Q: You’ve been taking tango classes! What sparked your interest in tango and what about it do you love?
I’ve been fascinated by tango for a very long time, watching pretty much every episode of Dancing with the Stars (DWTS). When my husband expressed interest in learning it with me, I was thrilled to try. We’ve been taking lessons on and off for over 5 years now. I can tell you that learning tango is a lifetime journey. Tango is a passionate communication without words, an intimate connection between two strangers, where no language skills are necessary.
AEye Reveals Advanced MEMS That Delivers Solid-State Performance and Reliability Needed to Advance Adoption of Low-Cost LiDAR
Independent tests validate that the new 4Sight™ long range intelligent MEMS-based sensor exceeds automotive and industrial standards for shock and vibration
Dublin, CA – June 9, 2020 – AEye, Inc., an artificial perception pioneer, today announced that its 4Sight™ M sensor, based on the company’s patented intelligent perception system design, has established a new standard for sensor reliability. In testing completed at NTS, one of the most respected testing, inspection, and certification companies in the US, the 4Sight M scan block surpassed automotive qualification requirements for both shock and vibration. AEye also announced the availability of 4Sight, a new family of advanced 1550nm LiDAR vision systems.
“The LiDAR industry has struggled to attain solid-state performance, especially when trying to achieve sufficiently high resolution at long range,” said Luis Dussan, Co-Founder and CEO of AEye. “With the very public failures of several Optical Phased Array and Flash LiDAR concepts, MEMS-based systems were seen as a potential solution, but the inability of FMCW or other TOF systems to effectively reduce the size of, or harden, their MEMS continues to be a stumbling block. Our non-arrayed micro-MEMS is at the core of our unique system design and helps us solve this challenge, providing the automotive industry the combination of reliability, performance, and price it has been seeking.”
4Sight is the fifth-generation sensor from AEye and is based on AEye’s powerful iDAR™ platform. AEye’s unique patented system design is elegant in its simplicity, with one laser, one MEMS, one receiver, and one SoC. Driven by extensible software, 4Sight is designed from the ground up to identify and deliver salient information while exceeding all industry quality and reliability standards, and it can be manufactured at scale at low cost. To prove its reliability, AEye recently engaged NTS to conduct extensive shock and vibration testing on the 4Sight sensor. The results showed that a 4Sight sensor can sustain a mechanical shock of over 50G, random vibration of over 12Grms (5-2000Hz), and sustained vibration of over 3G.
“Having funded hundreds of radar and LiDAR projects over the years, I appreciate the power of simple, elegant design and how it impacts system reliability,” said Allan Steinhardt, AEye’s Chief Scientist and former Chief Scientist at DARPA. “In real terms, our 4Sight remains fully operational significantly past the acceleration point where airbags deploy and passengers black out.”
AEye is changing the calculus for adding long-range, high-performance LiDAR to a vehicle. Now automotive OEMs and Tier 1s, along with the trucking, construction, transit, rail, ITS, aerospace, and defense markets, can implement LiDAR with confidence that it can withstand forces similar to those generated by the recent historic launch of a Falcon rocket.
Size Matters – Not all MEMS are created equal
The size of the mirror in a MEMS largely determines its reliability. Larger mirrors have greater inertia, generating 10x to 600x more torque from shock and vibration events. In addition, larger mirrors do not allow the fast, quasi-static movement needed for agile scanning, which is key to intelligent and reliable artificial perception.
The unique patented system design of AEye’s MEMS allows a mirror that is less than 1mm in size. Other LiDAR systems use 3mm to 25mm mirrors, which equates to 10x to 600x larger surface area. In addition, lacking intelligence-driven agility, these systems are forced to rely on those larger mirrors, increasing both complexity and cost. Combined with a 1550nm amplifiable laser and a sophisticated receiver, the small mirrors in AEye’s custom-designed MEMS are produced in volume using standard processes and deliver the unique high performance of iDAR with groundbreaking reliability.
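The mirror-size figures above can be sanity-checked with a couple of lines of arithmetic. A minimal sketch (the 1mm reference diameter follows the text; the circular-mirror model is our simplifying assumption):

```python
# Surface area of a circular mirror grows with the square of its diameter,
# which reproduces the ~10x and ~600x figures quoted for 3mm and 25mm
# mirrors relative to a 1mm mirror.

def area_ratio(d_mm: float, d_ref_mm: float = 1.0) -> float:
    """Area ratio of two circular mirrors scales with diameter squared."""
    return (d_mm / d_ref_mm) ** 2

for d in (3.0, 25.0):
    print(f"{d:4.0f} mm vs 1 mm mirror: {area_ratio(d):6.0f}x larger area")
# 3 mm  ->   9x (~10x); 25 mm -> 625x (~600x)
```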
“The philosophy of AEye’s unique iDAR approach is to use resources where they have the greatest leverage for the application, making it the highest-resolution, longest-range, and most reliable LiDAR engine on the market,” said Jan Grahmann, Head of Microoptical Devices and Systems at the Fraunhofer Institute for Photonic Microsystems. “Unlike competitors that have to work with large MEMS mirrors for long distances, we have worked with AEye to prove that a solid-state, MEMS-scanning-based LiDAR engine with mirrors at their design-space sweet spot provides exceptional performance and extreme shock and vibration resistance.”
“The cumulative fatigue data from these tests is used to determine the lifespan, quality, and reliability of 4Sight,” said Indu Vijayan, 4Sight Product Manager. “Passing these tests demonstrates that 4Sight will endure the vibrations of a vehicle for its rated lifetime, as well as the shock of 50G impact events such as collisions or massive potholes. From an automotive perspective, it also shows that 4Sight sensors are on track to pass shock and vibration standards such as GMW3172 and LV124.”
Built on unique patented technologies
AEye’s iDAR platform and 4Sight sensors are backed by over 27 granted patents, 10 more patents in process, and over 1,300 claims covering system and component design and implementation. This broad patent portfolio includes several groundbreaking innovations, such as the only scanning LiDAR patent granted for a camera and LiDAR sharing the same optical axis (co-boresighted), eliminating the need for enormous parallax correction; MEMS agile control, feedback, and intraframe sampling, allowing for edge processing and low-latency feedback; and advanced perception, enabling real-time capabilities such as accurate intraframe calculation of object velocity.
Given the current constraints on travel, AEye is also announcing another industry-first innovation with the launch of Raptor, a unique high-performance web-based remote demo platform. Raptor will enable participants to engage in a real-time interactive test drive with an AEye engineer. From the comfort of their own home or office, AEye’s customers and partners will have the ability to see what a truly software-defined sensor can do, witness the record-breaking 4Sight M performance in real time, and customize the demo to meet their specific use cases. Please contact [email protected] to schedule a demo.
About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since the demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others.
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-400-4366
AEye Unveils 4Sight™, a Breakthrough LiDAR Sensor That Delivers Automotive Solid-State Reliability and Record-Breaking Performance
Available in July 2020, 4Sight provides automotive-grade ruggedness combined with a software-definable range of up to 1,000 meters and 0.025° resolution, at a fraction of the price of current LiDAR sensors.
Dublin, CA – June 9, 2020 – AEye, Inc., an artificial perception pioneer, today announced 4Sight™, a groundbreaking new sensor family built on its unique iDAR™ platform. 4Sight redefines LiDAR performance while establishing a benchmark for the next generation of LiDAR sensors and intelligent robotic vision systems. Debunking previous assumptions that high-performing, long-range 1550nm LiDAR could not achieve both solid-state reliability and lower cost, 4Sight delivers on all three: performance, reliability, and price.
The first 4Sight sensor to be released, in July, is the 4Sight M, designed to meet the diverse performance and functional requirements of autonomous and partially automated applications. The 4Sight family of sensors has been developed and tested over the last 18 months in conjunction with a wide range of customers and integrators in the automotive, trucking, transit, construction, rail, intelligent traffic systems (ITS), aerospace, and defense markets. 4Sight leverages the complete iDAR software platform, which incorporates an upgraded visualizer (allowing users to model various shot patterns) and a comprehensive SDK, making it fully extensible and customizable.
“The primary issue that has delayed broad adoption of LiDAR has been the industry’s inability to produce a high-performance deterministic sensor with solid-state reliability at a reasonable cost,” said Blair LaCorte, President of AEye. “We created a more intelligent, agile sensor that is software definable to meet the unique needs of any application – the result is 4Sight.”
Some of the unique features of the 4Sight M are:
LiDAR Performance
- Software-definable range optimization of up to 1,000 meters (eye- and camera-safe)
- Up to 4 million points per second, with horizontal and vertical resolution of less than 0.1°
- Instantaneous addressable resolution of 0.025° (see the sketch below for what these angles mean at range)

Integrated Intelligence
- A library of functionally safe, deterministic scan patterns that can be customized and fixed, or triggered to adjust to changing environments (highway, urban, weather, etc.)
- An integrated automotive camera, boresight-aligned with AEye’s agile LiDAR, instantaneously generating true-color point clouds; a parallel camera-only feed can provide a cost-effective redundant camera sensor
- Enhanced ground plane detection to determine topology at extended ranges

Advanced Vision Capabilities
- Detection and classification of objects with advanced perception features such as intraframe radial and lateral velocity
- Detection through foliage and adverse weather conditions such as rain and fog, through the use of dynamic-range, full-waveform processing of multiple returns
- Detection of pedestrians at over 200 meters
- Detection of small, low-reflectivity objects such as tire fragments, bricks, or other road debris (10x50cm at 10% reflectivity) at ranges of over 120 meters

Reliability
- Shock and vibration: designed and tested for solid-state performance and reliability. Third-party testing has proven that 4Sight can sustain a mechanical shock of over 50G, random vibration of over 12Grms (5-2000Hz), and sustained vibration of over 3G on each axis.
- Automotive-grade production:
  - An automotive-qualified supply chain utilizing standard production processes, overseen by global manufacturing partners
  - Designed for manufacturability, using a simple solid-state architecture consisting of only 1 scanner, 1 laser, 1 receiver, and 1 SoC
  - A common hardware architecture and software/data structures across all fully autonomous to partially automated (ADAS) applications, leveraging R&D and economies of scale

Price
- 4Sight can be configured for high-volume Mobility applications with SOP 2021 at an estimated 2x-5x lower price than any other high-performance LiDAR; for ADAS applications with SOP 2023, 4Sight is designed to be priced 1.5x-3x lower than any other long- or medium-range LiDAR
- 4Sight series-production packaging options include roof, grill, and behind the windshield, with software optimized for the placement

“In my work with automotive OEMs, the value delivered by the 4Sight sensor surpasses anything I have seen in the automotive industry,” said Sebastian Bihari, Managing Director at Vektor Partners. “By starting with an understanding of the data a perception system needs, AEye has developed a simpler, more responsive design for efficiently capturing and processing perception data. In doing this, 4Sight establishes standards for the industry that will take years for others to achieve.”
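For context on the angular figures in the feature list above, a minimal sketch (the ranges chosen are illustrative, not AEye test points) converting angular resolution into lateral point spacing at distance:

```python
import math

# Lateral spacing subtended by an angular step at a given range: a rough
# indication of point-to-point spacing on a distant target.
def lateral_spacing_m(range_m: float, angle_deg: float) -> float:
    return range_m * math.tan(math.radians(angle_deg))

for r in (100, 300, 1000):
    print(f"{r:5d} m: 0.1 deg -> {lateral_spacing_m(r, 0.1):.3f} m, "
          f"0.025 deg -> {lateral_spacing_m(r, 0.025):.3f} m")
# At 1,000 m, 0.1 deg spacing is ~1.75 m; 0.025 deg is ~0.44 m.
```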
In addition to setting new standards for performance, reliability, and price, the 4Sight M is also extremely power efficient. 4Sight’s groundbreaking system attributes, such as time-to-detect, agile scanning, shot-energy optimization, power optimization of returns, and boresighted camera/LiDAR data generation, greatly reduce the processing and bandwidth typically required for full perception stacks. When implemented in a vehicle, AEye expects 4Sight sensors to be nearly power neutral relative to the vehicle’s perception-stack power budget.
Given the current constraints on travel, AEye is also announcing another industry-first innovation with the launch of Raptor, a unique high-performance web-based remote demo platform. Raptor will enable participants to engage in a real-time interactive test drive with an AEye engineer. From the comfort of their own home or office, AEye’s customers and partners will have the ability to see what a truly software-defined sensor can do, witness the record-breaking 4Sight M performance in real time, and customize the demo to meet their specific use cases. Please contact [email protected] to schedule a demo.
About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since the demonstration of its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area, and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Hella Ventures, LG Electronics, Subaru-SBI, Aisin, Intel Capital, Airbus Ventures, and others.
Media Contact:
AEye, Inc.
Jennifer Deitsch
[email protected]
925-596-3945
Time of Flight vs. FMCW LiDAR: A Side-by-Side Comparison
Introduction
Recent papers [1–5] have presented a number of marketing claims about the benefits of Frequency Modulated Continuous Wave (FMCW) LiDAR systems. As might be expected, there is more to the story than the headlines claim. This white paper examines these claims and offers a technical comparison of Time of Flight (TOF) vs. FMCW LiDAR for each of them. We hope this serves to outline some of the difficult system trade-offs a successful practitioner must overcome, thereby stimulating robust, informed discussion, competition, and ultimately, improvement of both TOF and FMCW offerings to advance perception for autonomy.
Competitive Claims
Below is a summary of our views and a side-by-side comparison of TOF and FMCW LiDAR claims.
Download “Time of Flight vs. FMCW LiDAR: A Side-by-Side Comparison” [pdf]
Claim #1: FMCW is a (new) revolutionary technology
This is untrue.
Contrary to recent news articles, FMCW LiDAR has been around for a very long time, with its beginnings stemming from work done at MIT Lincoln Laboratory in the 1960s [8], only seven years after the laser itself was invented [9]. Many of the lessons learned about FMCW over the years, while unclassified and in the public domain, have unfortunately been long forgotten. What has changed in recent years is the greater availability of long-coherence-length lasers. While this has justifiably rejuvenated interest in the established technology, as it can theoretically provide extremely high signal gain, there are still several limitations, identified long ago, that must be addressed to make this LiDAR viable for autonomous vehicles. If they are not addressed, the claim that “new” FMCW will cost-effectively solve the automotive industry’s challenges with both scalable data collection and long-range, small-object detection will prove untrue.
Claim #2: FMCW detects/tracks objects farther, faster
This is unproven.
TOF LiDAR systems can offer very fast laser shot rates (several million shots per second in the AEye system), agile scanning, increased return salience, and the ability to apply high-density Regions of Interest (ROIs), yielding a factor of 2x-4x better information from returns versus other systems. By comparison, many low-complexity FMCW systems are only capable of shot rates in the tens to hundreds of thousands of shots per second (~50x slower). So, in essence, we are comparing nanosecond dwell times and high repetition rates against tens-of-microsecond dwell times and low repetition rates (per laser/receiver pair), as the sketch below illustrates.
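A back-of-the-envelope sketch of the dwell-time arithmetic implied here; the specific rates are illustrative values drawn from the ranges quoted above, not measured figures:

```python
# Per laser/receiver pair, the time budget per shot is roughly the
# reciprocal of the shot rate.
tof_shot_rate = 2_000_000   # shots/s ("several million" for the AEye system)
fmcw_shot_rate = 40_000     # shots/s (tens to hundreds of thousands)

tof_dwell_us = 1e6 / tof_shot_rate    # -> 0.5 us per shot
fmcw_dwell_us = 1e6 / fmcw_shot_rate  # -> 25 us per shot

print(f"TOF dwell:  ~{tof_dwell_us:.1f} us/shot")
print(f"FMCW dwell: ~{fmcw_dwell_us:.1f} us/shot "
      f"({tof_shot_rate / fmcw_shot_rate:.0f}x fewer shots per second)")
```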
Detection, acquisition (classification), and tracking of objects at long range are all heavily influenced by laser shot rate, because higher laser shot density (in space and/or time) provides more information, allowing faster detection times and better noise filtering. AEye has demonstrated a system capable of multi-point detections of low-reflectivity targets: small objects and pedestrians at over 200m, vehicles at 300m, and a Class 3 truck at 1km range. This speaks to the ranging capability of TOF technology. Indeed, virtually all laser rangefinders use TOF, not FMCW, for distance ranging (e.g., the Voxtel rangefinder products [10], some with a 10+km detection range). Although recent articles claim that FMCW has superior range, we haven’t seen an FMCW system that can match the range of an advanced TOF system.
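For reference, the basic TOF ranging relation behind these figures is simply half the round-trip travel time multiplied by the speed of light; a minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range is half the round-trip time multiplied by the speed of light."""
    return C * round_trip_s / 2.0

# A return from ~1 km arrives ~6.7 us after the shot; from ~300 m, ~2 us.
print(tof_range_m(6.67e-6), tof_range_m(2.0e-6))
```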
Claim #3: FMCW measures velocity and range more accurately and efficiently
This is misleading.
TOF systems, including AEye’s LiDAR, do require multiple laser shots to determine target velocity. This might seem like extra overhead compared to FMCW’s single-shot claims. More important, however, is the understanding that not all velocity measurements are equal. While radial velocity between two cars moving head-on is urgent (one reason longer detection range is so desirable), so too is lateral velocity, which is involved in over 90% of the most dangerous edge cases. Cars running a red light, swerving vehicles, and pedestrians stepping into a street all require lateral velocity for evasive decision making. FMCW cannot measure lateral velocity simultaneously in one shot, and has no benefit whatsoever over TOF systems in finding lateral velocity.
Consider a car moving at between 30 and 40 meters/second (~67 to 89 MPH) detected by a laser shot. If a second laser shot is taken a short period later, say 50us after the first, the target will only have moved ~1.75mm during that interval. To establish a statistically significant velocity, the target should have moved at least 2cm, which takes about 500us (while requiring sufficient SNR to interpolate range samples). With that second measurement, a statistically significant range and velocity can be established within a time frame that is negligible compared to the frame rate. With an agile scanner, such as the one AEye has developed, that 500us is not solely dedicated, or “captive,” to velocity estimation: many other shots can be fired at other targets in the interim, using the time wisely before returning to the original target for a high-confidence velocity measurement. An FMCW system, by contrast, is captive for its entire dwell time.
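The arithmetic in this paragraph, restated as a minimal sketch (the 2cm significance threshold is the text’s working assumption):

```python
# Displacement accumulated by a moving target between two laser shots,
# and the revisit interval needed for that displacement to be measurable.

def displacement_mm(speed_mps: float, dt_us: float) -> float:
    return speed_mps * dt_us * 1e-6 * 1e3

def revisit_time_us(speed_mps: float, min_disp_m: float = 0.02) -> float:
    """Time between shots for the target to move min_disp_m (here 2 cm)."""
    return min_disp_m / speed_mps * 1e6

print(displacement_mm(35.0, 50.0))   # ~1.75 mm after 50 us at 35 m/s
print(revisit_time_us(40.0))         # 500 us to move 2 cm at 40 m/s

# Two-shot radial velocity estimate from the resulting pair of ranges:
def radial_velocity(r1_m: float, r2_m: float, dt_s: float) -> float:
    return (r2_m - r1_m) / dt_s

print(radial_velocity(100.000, 99.980, 500e-6))  # -40 m/s (closing)
```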
Compounding the captive time is the fact that FMCW often requires a minimum of two laser frequency sweeps (up and down) to form an unambiguous detection, with the down sweep providing the information needed to resolve the ambiguity arising from the mixed range + Doppler shift. This doubles the dwell time required per shot, above and beyond that already described in the previous paragraph. The motion of a target over 10us is typically only ~0.5mm, a level of displacement that enters the regime where it is difficult to separate vibration from real linear motion. Again, in the case of lateral velocity, no FMCW system will instantly detect lateral speed at all without multi-position estimates such as those used by TOF systems, but with the additional baggage of long FMCW dwell times.
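To make the up/down sweep requirement concrete, here is a textbook-style sketch of how a triangular FMCW chirp separates range from Doppler; the chirp slope and sign conventions are illustrative assumptions, not any vendor’s implementation:

```python
# On the up-sweep the range-induced beat and the Doppler shift subtract;
# on the down-sweep they add. One sweep alone mixes the two; the pair
# lets the receiver solve for both, doubling the required dwell time.
C = 299_792_458.0      # m/s
WAVELENGTH = 1550e-9   # m (1550 nm laser, as discussed in this document)

def beat_frequencies(range_m, radial_v_mps, chirp_slope_hz_per_s):
    f_range = 2.0 * range_m * chirp_slope_hz_per_s / C  # range-induced beat
    f_doppler = 2.0 * radial_v_mps / WAVELENGTH         # Doppler shift
    return f_range - f_doppler, f_range + f_doppler     # up-sweep, down-sweep

def solve(f_up, f_down, chirp_slope_hz_per_s):
    f_range = (f_up + f_down) / 2.0
    f_doppler = (f_down - f_up) / 2.0
    return f_range * C / (2.0 * chirp_slope_hz_per_s), f_doppler * WAVELENGTH / 2.0

slope = 1e9 / 10e-6                 # assumed: 1 GHz swept over a 10 us dwell
f_up, f_down = beat_frequencies(100.0, 30.0, slope)
print(solve(f_up, f_down, slope))   # -> (100.0, 30.0): range m, velocity m/s
```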
Lastly, as an extreme TOF example, the AEye system has demonstrated detection of objects at 1km. Even if it required two consecutive shots to get velocity on a target at 1km, it is easy to see how that would be superior to a single shot at 100m, given a common frame rate of 20Hz and typical vehicle speeds.
Claim #4: FMCW has less interference
Quite the opposite, actually!
Spurious reflections arise in both TOF and FMCW systems. These can include retroreflector anomalies like “halos” and “shells,” first-surface reflections (even worse behind windshields), off-axis spatial sidelobes, multipath, and clutter. The key to any good LiDAR is to suppress sidelobes in both the spatial domain (with good optics) and the temporal/waveform domain. TOF and FMCW are comparable in spatial behavior, but where FMCW truly suffers is in the time/waveform domain when high-contrast targets are present.
Clutter
FMCW relies on window-based sidelobe rejection to address self-interference (clutter), which is far less robust than TOF, which has no such sidelobes to begin with. To provide context, a 10us FMCW pulse spreads light radially across 1.5km of range; any objects within this range extent will be caught in the FFT (time) sidelobes. Even a shorter 1us FMCW pulse can be corrupted by high-intensity clutter 150m away. The first sidelobe of a rectangular-window FFT is well known to be -13dB, far above the levels needed for a consistently good point cloud (unless no two returns within a shot differ in intensity by more than about 13dB, something unlikely in operational road conditions).
Of course, deeper sidelobe taper can be applied, but at the sacrifice of pulse broadening. Furthermore, nonlinearities in the receiver front end (so-called spurious-free dynamic range) will limit the effective overall system sidelobe levels achievable, due to compression and ADC spurs (third-order intercepts), phase noise [6], and atmospheric phase modulation, which no amount of window taper can mitigate. Aerospace and defense systems can and do overcome such limitations, but we are unaware of any low-cost, automotive-grade system capable of the time-instantaneous >100dB dynamic range required to sort out long-range small objects from near-range retroreflectors, as arises in FMCW.
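The -13dB rectangular-window figure and the taper trade-off can be verified numerically; a minimal sketch using a Hann window as the example taper (the window choice and lengths are ours, for illustration):

```python
import numpy as np

def spectrum_db(window: np.ndarray, nfft: int = 1 << 16) -> np.ndarray:
    """Normalized magnitude spectrum of a window function, in dB."""
    spec = np.abs(np.fft.rfft(window, nfft))
    spec = np.maximum(spec, spec.max() * 1e-12)  # avoid log(0) at exact nulls
    return 20.0 * np.log10(spec / spec.max())

n = 256
for name, w in [("rectangular", np.ones(n)), ("hann", np.hanning(n))]:
    db = spectrum_db(w)
    first_null = int(np.argmax(db[1:] > db[:-1]))  # first local minimum
    print(f"{name:12s} mainlobe ends near bin {first_null}, "
          f"peak sidelobe {db[first_null:].max():6.1f} dB")
# rectangular: peak sidelobe ~ -13.3 dB, narrow mainlobe
# hann:        peak sidelobe ~ -31.5 dB, mainlobe roughly twice as wide
#              (i.e., deeper taper is bought with pulse/mainlobe broadening)
```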
In contrast, a typical Gaussian TOF system with a 2ns pulse duration has no time-based sidelobes whatsoever beyond the spatial extent of the pulse itself. No amount of dynamic range between small and large offset returns has any effect on the light incident on the photodetector when the small target return is captured. We invite anyone evaluating LiDAR systems to carefully inspect the point cloud quality of TOF vs. FMCW under various driving conditions for themselves. The multitude of potential sidelobes in FMCW leads to artifacts that impact not just local range samples, but the entire returned waveform for a given pulse!
First surface (e.g., FMCW behind a windshield or other first surface)
A potentially stronger interference source is the reflection caused by a windshield or other first surface placed in front of the LiDAR system. Because the transmit beam is on nearly continuously, these reflections are continuous and very strong relative to distant objects, representing a similar kind of low-frequency component that creates undesirable FFT sidelobes in the transformed data. The result can be a significant reduction of usable dynamic range. Furthermore, windshields, being multilayer glass under mechanical stress, have complex, inhomogeneous polarization. This randomizes the electric field of the signal return on the photodetector surface, complicating (decohering) optical mixing.
Lastly, due to the nature of time-domain versus frequency-domain processing, the handling of multiple echoes, even with high dynamic range, is a straightforward process in TOF systems, whereas it requires significant disambiguation in FMCW systems. Multi-echo processing is…
AEye Team Profile: Steven Wong
Steven Wong is a critical member of the AEye Operations Team. His responsibilities include checking products for defects, recording all materials into NetSuite, working with buyers and the purchasing team to ensure that invoices and sales orders are received, and much, much more. Previously, he worked at BioVision as a Shipping Specialist, where he coordinated domestic and international shipments, fulfilled materials for various departments, and resolved discrepancies. Steve also worked in Technical Support at Cisco, where he analyzed and built systems per customers’ requests, troubleshot and diagnosed hardware and software, and programmed and assembled special orders.

We sat down with Steven to learn more about his professional credentials and achievements, why he wanted to work in the AV industry, and his favorite San Francisco 49ers memory.
Q: For your role, you’ve accumulated an impressive amount of professional credentials and achievements. What are they and how are they of value to AEye and its customers?
Most of my career has been in warehouse operation logistics. Over the years, I’ve worked at many companies and developed a unique skill set that helps move AEye forward. Because I’ve worked in a variety of industries, like semiconductors, biotech, and now automotive, I’ve had to complete a lot of different trainings, from “Dangerous Goods” training, to IATA certifications, leadership compliance, and more, which all play a key role in my work here at AEye.
Q: What made you want to be a part of the autonomous vehicle industry?
I decided to join the automotive industry because I wanted to learn more about it. I had heard a lot of great things about what vehicles will be like in the future, and working hands-on in the industry is a great way to learn about the future of automotive! I watch a lot of Sci-Fi movies – and to be a part of something that’s like straight out of Sci-Fi is very exciting. I’m looking forward to seeing what’s next.
Q: You’re a huge San Francisco 49ers fan! What is your favorite 49ers memory?
My favorite 49ers memory is going to this past season’s Division Title game against the Green Bay Packers. This was the first time I’d seen the Niners play in a Title game and it was so electrifying. And then they won and went to the Super Bowl. I’m glad I was there and able to witness that exhilarating moment.
The celebration at Levi’s Stadium after the San Francisco 49ers clinched the NFC Title against the Green Bay Packers, 37-20, on January 19, 2020. Video taken by Steve Wong.
Auto.AI – February 24, 2020
February 24, 2020 | Auto.AI | San Francisco, CA
February 24th, 4:40-5:30pm | “Challenge Your Peers” Session: Advanced Perception for Safe Vehicle Autonomy
Speaker: Dr. Barry Behnken – Co-founder and SVP of Engineering, AEye
AEye and Infineon: Perception, Safety, and Performance
AEye’s growing ecosystem of global partners is leveraging AEye’s artificial perception platform to advance safe, reliable autonomy. AEye has integrated Infineon’s AURIX microcontroller into its iDAR platform to ensure a robust, software-definable platform that is functionally safe for automated and autonomous vehicle initiatives. Watch how Infineon leverages AEye’s iDAR platform to create sensor fusion in the AURIX microcontroller.
AEye and Tata Elxsi: Autonomous Vehicle
AEye’s partner ecosystem is embracing and extending iDAR to accelerate innovation and the availability of autonomous features. AEye and Tata Elxsi have unveiled RoboTaxi, Tata Elxsi’s in-house concept demonstrator vehicle developed using AEye’s iDAR platform and Tata Elxsi’s autonomous stack. Watch a demonstration of the fully autonomous RoboTaxi vehicle, fitted with AEye’s iDAR, successfully handling various scenarios, such as detecting cross-traffic at a junction and a roundabout, following the road ahead, and cueing the sensor with HD maps and V2X information.
AEye and ANSYS: Simulation and Prototyping
AEye and ANSYS are accelerating autonomous driving safety by enabling virtual prototyping of iDAR™ to speed design, testing and validation of autonomous systems. By working with ANSYS, AEye is empowering partners and customers to simulate driving situations across millions of miles in just days, minimizing physical prototyping. Watch the demonstration of AEye’s iDAR using the VRXPERIENCE and SPEOS elements of ANSYS Autonomy, showcasing hazard detection in a virtual world.