Affalterbach. With the new Performance models based on the GLE, Mercedes-AMG is continuing a tradition that is exactly 20 years old: in 1999, the ML 55, based on the then M-Class, was the progenitor and the top model in the range with its 255 kW (347 hp) V8 engine. The same applies to its current successor in the guise… Continue reading @Daimler: More powerful, efficient and suitable for day-to-day use than ever before: The new Mercedes-AMG GLE 63 4MATIC+
Category: Official Press Release
@Hyundai: Refreshed Niro Hybrid debuts at Los Angeles Auto Show
- New headlights, grille and wheels highlight exterior changes
- Added tech upgrades and more Advanced Driver Assistance Systems

Today Kia Motors America (KMA) unveiled the enhanced Niro Hybrid. The popular and award-winning compact crossover receives several exterior enhancements for its mid-cycle refresh, including new projector-type headlights and fog lamps, a new diamond-pattern grille and dual chevron-shaped… Continue reading @Hyundai: Refreshed Niro Hybrid debuts at Los Angeles Auto Show
A new camouflaged Maserati spotted on the streets of Modena. The engine is 100% Maserati.
The experimental vehicles are equipped with a new powertrain entirely developed and built by Maserati, which will be the forerunner of a new family of engines used exclusively in the Brand's vehicles. The data acquired through the kilometres covered by the mules will be integrated with the experience gathered in the Maserati Innovation… Continue reading A new camouflaged Maserati spotted on the streets of Modena. The engine is 100% Maserati.
Flatbed Trailer Across Roadway
Human drivers confront and handle an incredible variety of situations and scenarios—terrain, roadway types, traffic conditions, weather conditions—all of which autonomous vehicle technology must also navigate safely and efficiently. These are edge cases, and they occur with surprising frequency. In order to achieve advanced levels of autonomy or breakthrough ADAS features, these edge cases must be addressed. In this series, we explore common, real-world scenarios that are difficult for today’s conventional perception solutions to handle reliably. We’ll then describe how AEye’s software-definable iDAR™ (Intelligent Detection and Ranging) successfully perceives and responds to these challenges, improving overall safety.
Download AEye Edge Case: Flatbed Trailer Across Roadway [pdf]
Challenge: Flatbed Trailer Across Roadway

A vehicle equipped with an advanced driver assistance system (ADAS) is traveling 45 mph down a four-lane road that passes through a sparsely populated town. Relying on the vehicle to navigate, the driver has largely stopped paying attention. Ahead, a semi-truck towing a flatbed trailer slowly traverses the road. As the distance between the vehicle and the trailer shrinks rapidly, it’s up to the perception system to detect and classify the trailer, as well as measure its velocity and distance. At SAE Level 3 and beyond, where the car is assumed to be in control, the vehicle’s path planning software must make a critical decision about whether to swerve or slam on the brakes before it’s too late.
How Current Solutions Fall Short

Today’s advanced driver assistance systems (ADAS) will experience great difficulty recognizing this threat or reacting appropriately. Depending on its sensor configuration and perception training, the system may fail to register the trailer due to its very thin profile.
Camera. A perception system based on camera sensors will be prone to misinterpreting the threat, registering a false positive, or missing the threat entirely. In the distance, the trailer will appear as little more than a two-dimensional line across the roadway. If the vehicle is turning, those same pixels could also be interpreted as a guardrail. To be accurate in all scenarios, the perception system must be trained on every possible light condition in combination with all color and size permutations. This poses an immense challenge, as there will always be instances that haven’t been foreseen, creating a potentially deadly combination for perception systems that depend primarily on camera data.
Radar. Approached from the side, the profile of a flatbed trailer is very thin. With no better than a few degrees of angular resolution, radars are ill-equipped to detect such narrow horizontal objects. In this case, a majority of the radar’s radio waves will miss the slim profile of the trailer.
Camera + Radar. A perception system that relies only on camera and radar would likely be unable to detect the flatbed trailer and react in time. The camera data would be insufficiently detailed to classify the trailer and would likely lead the perception system to mistake it for one of several common roadway features. Radar, unlikely to accurately detect the full length of the trailer, would mislead the perception system as well. In this instance, the combination of camera and radar does little to improve the odds of accurately classifying the trailer.
LiDAR. Today’s conventional LiDAR produces very dense horizontal scan lines coupled with very poor vertical density. This scan pattern creates a challenge for detection when objects are horizontal, thin, and narrow—it’s easy for LiDAR’s laser shots to miss them entirely. Some LiDAR shots will hit the trailer. However, it takes time to gather the requisite number of detections to register any object. Depending on the vehicle’s speed, this process may take too much time to prevent a collision.
Successfully Resolving the Challenge with iDAR

A vehicle that enters a scene laterally is very difficult to track. iDAR overcomes this difficulty with its ability to selectively allocate LiDAR shots to Regions of Interest (ROIs). As soon as the LiDAR registers a single detection of the trailer, iDAR dynamically changes both the LiDAR’s temporal and spatial sampling density to comprehensively interrogate the trailer, thus gaining critical information like its size and distance ahead.
iDAR can schedule LiDAR shots to revisit Regions of Interest in a matter of microseconds to milliseconds. This means that iDAR can interrogate an object up to 3000x faster than conventional LiDAR systems, which typically require hundreds of milliseconds to revisit an object. As a result, iDAR has an unprecedented ability to calculate valuable attributes, including object distance and velocity (both lateral and radial), faster than any other system.
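To make the scheduling idea concrete, the sketch below models shot allocation as a priority queue in which ROI revisits are queued microseconds apart, far ahead of a fixed ~100 ms raster. This is a minimal sketch under stated assumptions; the class, method, and parameter names (`ScanScheduler`, `add_roi`) and all numbers are hypothetical, not AEye's API:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class ScheduledShot:
    due_us: int                           # when the shot fires, in microseconds
    region: tuple = field(compare=False)  # (az_min, az_max, el_min, el_max), degrees

class ScanScheduler:
    """Toy priority queue: microsecond-spaced ROI revisits are queued
    far ahead of a conventional ~100 ms full-frame raster."""
    def __init__(self):
        self.queue = []
        self.now_us = 0

    def add_roi(self, region, revisit_us=100, revisits=5):
        # Queue a burst of revisits spaced microseconds apart (assumed timing).
        for i in range(1, revisits + 1):
            heapq.heappush(self.queue, ScheduledShot(self.now_us + i * revisit_us, region))

    def next_shot(self):
        shot = heapq.heappop(self.queue)
        self.now_us = shot.due_us
        return shot

scheduler = ScanScheduler()
# First LiDAR return from the trailer: interrogate that region densely.
scheduler.add_roi(region=(-5.0, 5.0, 0.0, 2.0))
while scheduler.queue:
    shot = scheduler.next_shot()
    print(f"t={shot.due_us} us -> fire at region {shot.region}")
```

In this toy model, five revisits of the trailer's region complete in half a millisecond, whereas a fixed raster would not return to the same point for another full frame.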
Software Components

Computer Vision. iDAR combines 2D camera pixels with 3D LiDAR voxels to create Dynamic Vixels. This data type helps the system’s AI refine the LiDAR point cloud around the trailer edges, effectively eliminating all the irrelevant points. As a result, iDAR is able to clearly distinguish the trailer from other roadway features, like guardrails and signage.
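Conceptually, building a fused pixel-plus-voxel data type starts with projecting each LiDAR return into the camera image and attaching the color found there. Below is a minimal sketch of that projection step under an assumed pinhole camera model; the calibration values and function name are hypothetical, and this is not AEye's implementation of Dynamic Vixels:

```python
import numpy as np

# Hypothetical calibration: 3x3 intrinsics K, rotation R, translation t.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)

def fuse_pixels_and_voxels(points_xyz, image):
    """Attach an RGB value to every LiDAR point that projects into the
    camera image -- one way to think about a fused pixel/voxel record."""
    cam = (R @ points_xyz.T).T + t              # LiDAR frame -> camera frame
    cam = cam[cam[:, 2] > 0]                    # keep points ahead of the camera
    uv = (K @ cam.T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)   # perspective divide to pixels
    h, w = image.shape[:2]
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    # Columns of the result: x, y, z, r, g, b
    return np.hstack([cam[valid], image[uv[valid, 1], uv[valid, 0]]])

points = np.array([[0.5, 0.0, 12.0], [-0.4, 0.1, 12.2]])  # trailer returns (m)
frame = np.zeros((720, 1280, 3), dtype=np.uint8)          # dummy camera frame
print(fuse_pixels_and_voxels(points, frame))
```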
Cueing. For safety purposes, it’s essential to classify threats at range because their identities determine the vehicle’s specific and immediate response. To generate a dataset rich enough to apply perception algorithms for classification, as soon as the LiDAR detects an object, it cues the AI camera for deeper real-time analysis of the object’s color, size, and shape. The camera then reviews the pixels, running algorithms to define the object’s possible identities. To gain additional insights, the camera can in turn cue the LiDAR, which allocates more shots to gather additional data.
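One way to picture this cueing chain is as a control loop: a detection cues the camera, and a low-confidence camera result cues the LiDAR back for denser sampling. The `Camera` and `Lidar` classes below are stand-ins invented for this example, not a real sensor API:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    label: str
    confidence: float

class Camera:
    def classify_region(self, bounds, extra_points=0):
        # Stub: confidence improves as the LiDAR supplies more points (assumed).
        conf = min(0.5 + 0.01 * extra_points, 0.99)
        return [Hypothesis('flatbed_trailer', conf), Hypothesis('guardrail', 1 - conf)]

class Lidar:
    def allocate_shots(self, bounds, n):
        print(f'LiDAR: +{n} shots on {bounds}')
        return n

def cueing_chain(bounds, camera, lidar, threshold=0.9):
    """LiDAR detection cues the camera; an uncertain camera result cues
    the LiDAR back for denser sampling of the same region."""
    points = 0
    hyps = camera.classify_region(bounds)
    while max(h.confidence for h in hyps) < threshold:
        points += lidar.allocate_shots(bounds, n=20)
        hyps = camera.classify_region(bounds, extra_points=points)
    return max(hyps, key=lambda h: h.confidence)

print(cueing_chain(bounds=(120, 40, 200, 60), camera=Camera(), lidar=Lidar()))
```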
Feedback Loops. A feedback loop is triggered when an algorithm needs additional data from sensors. In this scenario, a feedback loop will be triggered between the camera and the LiDAR. The camera can cue the LiDAR, and the LiDAR can cue additional interrogation points, or a Dynamic Region of Interest, to determine the trailer’s true velocity. This information is sent to the domain controller so that it can decide whether to apply the brakes or swerve to avoid a collision.
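A hedged sketch of how a velocity estimate could fall out of those rapid revisits: track the ROI centroid across revisits and fit a slope per axis. The timing, data, and function below are invented for illustration and only show the idea:

```python
import numpy as np

def estimate_velocity(centroids_m, times_s):
    """Least-squares velocity of an ROI centroid tracked across several
    rapid LiDAR revisits; returns (lateral, radial) in m/s."""
    t = np.asarray(times_s)
    c = np.asarray(centroids_m)   # rows: (x_lateral, y_radial) in meters
    # Fit c = v * t + c0 per axis; the slope is the velocity component.
    return tuple(np.polyfit(t, c[:, axis], 1)[0] for axis in range(c.shape[1]))

# Trailer centroid measured on five revisits 2 ms apart (hypothetical data):
times = [0.000, 0.002, 0.004, 0.006, 0.008]
centroids = [(-1.00, 60.00), (-0.99, 59.96), (-0.98, 59.92),
             (-0.97, 59.88), (-0.96, 59.84)]
lat, rad = estimate_velocity(centroids, times)
print(f'lateral {lat:.1f} m/s, radial {rad:.1f} m/s')  # ~5 m/s across, ~-20 m/s closing
```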
The Value of AEye’s iDAR

LiDAR sensors embedded with AI for intelligent perception are very different from those that passively collect data. As soon as iDAR registers a single detection of the flatbed trailer, it dynamically modifies the LiDAR scan pattern, scheduling a rapid series of shots to cover the trailer with a dense pattern of laser pulses and extract information about its distance and velocity. Flexible shot allocation vastly reduces the required number of shots per frame while extracting the most valuable information in every scenario. This not only enables the vehicle’s perception system to track objects through time and space more accurately, it also makes autonomous driving much safer, because it eliminates ambiguity, accelerates the perception process, and allows for more efficient use of processing resources.
Obstacle Avoidance
Download AEye Edge Case: Obstacle Avoidance [pdf]
Challenge: Black Trash Can on Roadway

A vehicle equipped with an advanced driver assistance system (ADAS) is cruising down a city street at 35 mph. Its driver is somewhat distracted and also driving too close to the vehicle ahead. Suddenly, the vehicle ahead swerves out of the lane, narrowly avoiding a black trash can that has fallen off a garbage truck. To avoid a collision, the ADAS must make a quick series of assessments: it must not only detect the trash can, it must also classify it and gauge its size and threat level. Then it can decide whether to brake quickly or plan a safe path around the can while avoiding a collision with parallel traffic.
How Current Solutions Fall Short

Today’s advanced driver assistance systems (ADAS) will experience great difficulty detecting the trash can and/or classifying it fast enough to react in the safest way possible. Typically, ADAS vehicle systems are trained to avoid activating the brakes for every anomaly on the road. As a result, in many cases they will simply drive into objects. In contrast, Level 4 or 5 self-driving vehicles are biased toward avoiding collisions. In this scenario, they’ll either undertake evasive maneuvers or slam on the brakes, which could create a nuisance or cause an accident.
Camera. A perception system must be comprehensively trained to interpret all pixels of an image. In order to solve this edge case, the perception system would need to be trained on every possible permutation of objects lying in the road under every possible lighting condition. Achieving this goal is particularly difficult because objects can appear in an almost infinite array of shapes, forms, and colors. Moreover, the black trash can on black asphalt will further challenge the camera, especially at night and during low visibility and glare conditions.
Radar. Radar performance is poor when objects are made of plastic, rubber, and other non-metallic materials. As such, a black plastic trash can is difficult for radar to detect.
Camera + Radar. In many cases, a system using camera and radar would be unable to detect the black trash can at all. Moreover, a vehicle that constantly brakes for every road anomaly creates a nuisance and can cause a rear-end accident. So an ADAS equipped with camera plus radar would typically be trained to ignore the trash can in an effort to avoid false positives when encountering objects like speed bumps and small debris.
LiDAR. LiDAR would detect the trash can regardless of perception training, lighting conditions, or its position on the road. At issue here is the low resolution of today’s LiDAR systems. A four-channel LiDAR completes a scan of the surroundings every 100 milliseconds. At this rate, LiDAR would not be able to achieve the required number of shots on the trash can to register a valid detection: it would take 0.5 seconds before the trash can was even considered an object of interest. Even a 16-channel LiDAR would struggle to get five points fast enough.
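A back-of-the-envelope calculation makes the timing concrete, assuming roughly one return on the can per 100 ms frame and a five-point threshold for registering an object (both figures implied by the text):

```python
frame_period_s = 0.100      # one full scan per 100 ms (from the text)
hits_per_frame = 1          # ~1 return on a small object per frame (assumed)
points_needed = 5           # detections required to register an object (from the text)

time_to_detect_s = points_needed / hits_per_frame * frame_period_s
speed_mps = 35 * 0.44704    # 35 mph converted to meters per second
distance_m = speed_mps * time_to_detect_s

print(f'{time_to_detect_s:.1f} s to register the object')   # 0.5 s
print(f'{distance_m:.1f} m of closing distance meanwhile')  # ~7.8 m
```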
Successfully Resolving the Challenge with iDAR

As soon as the trash can appears in the road ahead, iDAR’s first priority is classification. One of iDAR’s biggest advantages is that it is agile in nature: it can adjust laser scan patterns in real time, selectively targeting specific objects in the environment and dynamically changing scan density to learn more about them. This ability to instantaneously increase resolution is critical to classifying the trash can quickly. During this process, iDAR simultaneously keeps tabs on everything else. Once the trash can is classified, the domain controller uses what it already knows about the surrounding environment to respond in the safest way possible.
Software Components

Computer Vision. iDAR is designed with computer vision that creates a smarter, more focused LiDAR point cloud. In order to effectively “see” the trash can, iDAR combines the camera’s 2D pixels with the LiDAR’s 3D voxels to create Dynamic Vixels. This combination helps the AI refine the LiDAR point clouds around the trash can, effectively eliminating all the irrelevant points and leaving only its edges.
Cueing. For safety purposes, it’s essential to classify objects at range because their identities determine the vehicle’s specific and immediate response. To generate a dataset that is rich enough to apply perception algorithms for classification, as soon as LiDAR detects the trash can, it will cue the camera for deeper real-time analysis about its color, size, and shape. The camera will then review the pixels, running algorithms to define its possible identities. If it needs more information, the camera may then cue the LiDAR to allocate additional shots.
Feedback Loops. Intelligent iDAR sensors are capable of cueing themselves. If the camera lacks data, the LiDAR will generate a feedback loop that tells itself to “paint” the trash can with a dense pattern of laser pulses. This enables the LiDAR to gather enough information to run algorithms that effectively determine what the object is. At the same time, it can also collect information about the intensity of the laser light reflecting back. Because a plastic trash can is more reflective than the road, the laser light bouncing off of it will be more intense, so the perception system can better distinguish it.
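As a rough illustration of that intensity cue, returns inside the ROI can be separated with a simple reflectance threshold; the intensity values and threshold below are invented for the example:

```python
import numpy as np

# Hypothetical normalized return intensities inside the ROI: asphalt
# returns cluster low, while the plastic can reflects more strongly.
intensities = np.array([0.08, 0.10, 0.09, 0.31, 0.35, 0.33, 0.11, 0.34])
ASPHALT_MAX = 0.20   # assumed reflectance ceiling for the road surface

object_mask = intensities > ASPHALT_MAX
print(f'{object_mask.sum()} of {len(intensities)} returns flagged as object')
```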
The Value of AEye’s iDAR

LiDAR sensors embedded with AI for intelligent perception are very different from those that passively collect data. When iDAR registers a single detection of an object in the road, its priority is to determine the object’s size and identity. iDAR will schedule a series of LiDAR shots in that area and combine that data with camera pixels. iDAR can flexibly adjust point cloud density around objects, using classification algorithms at the edge of the network before anything is sent to the domain controller. This ensures greatly reduced latency and that only the most important data is used to determine whether the vehicle should brake or swerve.
NVIDIA and Microsoft Team Up to Aid AI Startups
NVIDIA and Microsoft are teaming up to provide the world’s most innovative young companies with access to their respective accelerator programs for AI startups. Members of NVIDIA Inception and Microsoft for Startups can now receive all the benefits of both programs — including technology, training, go-to-market support and NVIDIA GPU credits in the Azure cloud… Continue reading NVIDIA and Microsoft Team Up to Aid AI Startups
@FCA: FCA Statement in Response to GM Lawsuit
Faraday Future Reveals Its New Concept of the Third Internet Living Space
- Revolutionary user experience designed to create a mobile, connected and luxury third internet living space
- Significant product innovations, including an all-in-one car with smart mobility and advanced artificial intelligence
- Integrated internet and AI applications including voice controls, predictive interfaces and autonomous driving capabilities

LOS ANGELES, Nov. 19, 2019 (GLOBE NEWSWIRE) — Faraday Future (FF), a California-based global shared… Continue reading Faraday Future Reveals Its New Concept of the Third Internet Living Space
@Groupe PSA: Groupe PSA: The Trémery Plant in France’s Grand Est Region Is at the Forefront of Groupe PSA’s Energy Transition
RUEIL-MALMAISON, France–(BUSINESS WIRE)–Regulatory News: Yann Vincent, Executive Vice President, Manufacturing & Supply Chain for Groupe PSA (Paris:UG) said, “Years ago we made the decision to invest in the energy transition and make our plants more flexible, as illustrated by the Trémery plant. We are very proud of all our plant employees in the Grand Est… Continue reading @Groupe PSA: Groupe PSA: The Trémery Plant in France’s Grand Est Region Is at the Forefront of Groupe PSA’s Energy Transition
Further milestones of the Daimler sustainability initiative: Daimler Sustainability Dialogue
Stuttgart, November 20, 2019
Press Contact (6)
Silke Mockert
Integrity and Legal Affairs Communications
silke.mockert@daimler.com
Tel: +49 711 17-25518
Fax: +49 711 17790-42238
Heike Rombach
Manager International Business Communications Mercedes-Benz Cars
Procurement & Supplier Quality
heike.rombach@daimler.com
Tel: +49 711 17-35012
Fax: +49 711 1779031880
Birgit Zaiser
Manager International Business Communication Mercedes-Benz Cars Production & Supply Chain Management
birgit.zaiser@daimler.com
Tel: +49 160 8614753
Fax: +49 711 1779012269
Christoph Johannes Sedlmayr
Manager Communications Vehicle R&D and Sustainable Mobility
christoph.sedlmayr@daimler.com
Tel: +49 711 17-91404
Peter Smodej
Daimler Trucks Technologies & Regulations
peter.smodej@daimler.com
Tel: +49 711 17-53230
Fax: +49 711 17-79078088
Vera Pfister
Spokesperson Van Technology Communications – Sustainability and Environment
vera.pfister@daimler.com
Tel: +49 (0)711 17-54029
Fax: +49 (0)711 17-52030
Creating lasting values with sustainable mobility: Short version: 12th Daimler Sustainability Dialogue
Stuttgart, Nov 20, 2019
Science Based Targets initiative (SBTi) defines CO2 targets that support the Paris Climate Accord: Transparency initiative in climate protection & prevention of air pollution
Stuttgart, Nov 20, 2019
From 2022 European Daimler plants will produce on a CO2-neutral basis: Production
Stuttgart, Nov 20, 2019
The EQC as the trailblazer for the electric car initiative: 360° environmental check
Stuttgart, Nov 20, 2019
The path to sustainable global transport of goods and passengers: Daimler Trucks & Buses
Stuttgart, Nov 20, 2019
Smart mobility for a better quality of life in the cities: Mercedes-Benz Vans
Stuttgart, Nov 20, 2019