Volvo’s autonomous trucks tackle mining operations

Six autonomous trucks will move limestone about 5 km from the pit to the crusher, through outdoor and enclosed spaces. It might not be as flashy as a supercharged Tesla racing up the motorway with no hands on the wheel, but Volvo Trucks’ new contract to haul limestone at a mine in Norway could be more impactful.

Ghosn’s arrest casts doubt on Renault-Nissan alliance

Detroit – For years, France’s Renault and Japan’s Nissan struggled to make money in the global auto business.
Then came Carlos Ghosn, a Renault executive who helped to orchestrate an unprecedented transcontinental alliance, combining parts of both companies to share engineering and technology costs.
Now Ghosn’s arrest in Japan for alleged financial improprieties at Nissan could put the nearly 20-year-old alliance in jeopardy.
Ghosn, 64, born in Brazil, schooled in France and of Lebanese heritage, is set to be ousted this week from his spot as Nissan chairman. He could also lose his roles as CEO and chairman of Renault, threatening the alliance formed in 1999 that’s now selling more than 10 million automobiles a year.
He’s been “the glue that holds Renault and Nissan together,” Bernstein analyst Max Warburton wrote in a note to investors. “It is hard not to conclude that there may be a gulf opening up between Renault and Nissan.”
Nissan has said it will dismiss Ghosn after he was arrested for allegedly abusing company funds and misreporting his income. That opens up a leadership void at the entire alliance, for which Ghosn officially still serves as CEO and chairman.
Ghosn added Mitsubishi to the alliance two years ago after the tiny automaker was caught in a gas-mileage cheating scandal. He had even floated the idea of a full merger between the three companies.
“Today’s events throw any prospect of that up in the air,” Michael Hewson, chief market analyst at CMC Markets in London, wrote in a note to investors.
Nissan CEO Hiroto Saikawa has publicly resisted the idea of an outright merger. So with Ghosn out at Nissan and probably Renault as well, the companies are unlikely to get any closer.
The companies now share technology, and they save money by jointly purchasing components.
While there could be some scrutiny of the relationships between the companies, they’re so intertwined now that cutting them apart would be difficult, said Kelley Blue Book analyst Michelle Krebs. “I would not predict its demise,” Krebs said of the alliance.
She said she sees further consolidation in an industry that faces unprecedented research costs for autonomous and electric vehicles, while at the same time continuing to develop cars and trucks powered by internal combustion engines.
“The last thing one of the world’s biggest automakers needs is the disruption caused by an investigation into the behavior of a man who has towered over the global auto sector,” said Hewson.
Nissan’s board is to meet Thursday to consider Ghosn’s fate. Renault, where Ghosn is also CEO, said its board will hold an emergency meeting soon, and experts say it is unlikely that he will be able to stay at the company or the broader alliance.
The brash Ghosn was once viewed as a savior in the auto business with the ability to turn around the two struggling companies. In 2006 he even proposed an alliance with global giant General Motors.
Bernstein’s Warburton wrote that Ghosn’s once-mighty reputation has been declining for years, while Krebs said Nissan never could meet Ghosn’s goal of 10 percent U.S. market share even though it has relied on “bad behavior” such as heavy discounts and sales to rental car companies.
Saikawa reiterated Nissan’s commitment to the venture, while a Renault statement expressed “dedication to the defense of Renault’s interest in the alliance.”
––––
Charlton reported from Paris. News Researcher Rhonda Shafner contributed from New York.

Congress considers extending electric vehicle tax credits, approval of self-driving cars

Washington — With Congress returning to Washington on Tuesday for a flurry of legislative activity before the end of the year, transportation advocates are hoping to win support for a pair of measures that would allow carmakers to sell thousands of self-driving cars and extend tax credits for electric vehicles.
Supporters of a U.S. Senate bill championed by U.S. Sen. Gary Peters, D-Bloomfield Township, that would allow automakers to sell more than 80,000 self-driving cars each per year are hoping to finally pass the measure in the upcoming so-called lame duck session after a year-long wait. They note that the current Republican-led House passed a similar measure with relative ease in 2017.
Additionally, General Motors Co., Nissan Motor Co. and Tesla Inc. have joined forces with environmental groups to form a new coalition that is pushing to remove a cap on a federal tax credit that provides up to $7,500 to buyers of electric cars. GM, Nissan and Tesla, makers of the Chevrolet Bolt, the Nissan Leaf and Tesla's all-electric lineup, are among the biggest electric car producers in the U.S. Current rules cap the credit at 200,000 electric vehicles per manufacturer.
Republican senators may be more likely to compromise with their Democratic colleagues on the self-driving legislation instead of waiting to have to negotiate a new deal with the House after Democrats take control of that chamber in January.
A spokeswoman for Peters said he “continues to work with his colleagues on both sides of the aisle” to get the bill signed into law before the end of the year, noting that major companies have already begun testing autonomous vehicles at several sites around the U.S., including at the American Center for Mobility in Ypsilanti Township.
“As companies move forward with their self-driving vehicle plans, Sen. Peters is focused on ensuring there is a federal regulatory framework in place to oversee the safe deployment of self-driving vehicles,” Peters' office said.
But critics of the bill argue that not enough attention is being paid to safety concerns, and that there isn't enough oversight on the road-readiness of the technology.
The picture is slightly more complicated for supporters of lifting the cap on electric car tax credits. A measure by U.S. Sen. John Barrasso, R-Wyo., would eliminate the tax credit for electric cars and institute a new tax on electric cars and alternative fuel vehicles to boost the coffers of the federal Highway Trust Fund that pays for construction projects.
A separate measure by U.S. Sen. Dean Heller, R-Nev., would keep the electric vehicle tax credit in place and lift the cap. A similar measure was also introduced by U.S. Sens. Dianne Feinstein, D-Calif., Jeff Merkley, D-Ore., Martin Heinrich, D-N.M., and Catherine Cortez Masto, D-Nev.
Heller lost his seat in last week's election to Democratic U.S. Rep. Jacky Rosen, who has also co-sponsored legislation in the House to extend the electric car tax credit for 10 years. Nevada is home to Tesla's Gigafactory 1 lithium-ion battery factory.
When carmakers hit the 200,000-vehicle ceiling, they face a phasing-out process of the $7,500 tax credit offered to buyers of full-electric vehicles — reducing that credit by half every six months. At least one automaker, Tesla, has already hit the limit, and GM is also expected to hit the mark during the fourth quarter of 2018.
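As described above, once a manufacturer crosses the 200,000-vehicle ceiling, the $7,500 credit is halved every six months. A minimal sketch of that schedule (simplified from the article's description; the actual IRS phase-out works in calendar quarters and eventually ends the credit entirely, and the helper below is hypothetical):

```python
def phased_credit(months_since_cap: int, full_credit: float = 7500.0) -> float:
    """Approximate the EV credit during phase-out, per the schedule above.

    Simplified, illustrative model only: after a manufacturer passes
    200,000 qualifying sales, the credit is halved every six months.
    (The real IRS rules work in calendar quarters and terminate the
    credit at the end of the phase-out.)
    """
    if months_since_cap < 0:               # cap not yet reached: full credit
        return full_credit
    halvings = months_since_cap // 6 + 1   # first halving begins the phase-out
    return full_credit / (2 ** halvings)

# A buyer eight months into a manufacturer's phase-out:
print(phased_credit(8))   # 7500 / 4 = 1875.0
```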
GM sold 23,297 all-electric Chevrolet Bolts and 20,349 plug-in hybrid Chevrolet Volts in the U.S. in 2017.
Dan Turton, vice president of public policy at GM, said in announcing a new group known as the EV Drive Coalition that includes GM, Nissan and Tesla: “A federal tax credit to help make electric vehicles more affordable for all consumers is integral to reaching a zero-emissions future and establishing the U.S. as the leader in electrification. We feel that the tax credit should be modified so all customers continue to receive the full benefit going forward.”
Advocates for the self-driving bill are hoping for favorable action. Scott Hall, director of communications and public affairs of the Washington, D.C.-based Alliance of Automobile Manufacturers, which lobbies for major U.S. and foreign-owned automakers, said automakers “remain optimistic the Senate will take action on this bipartisan legislation, given the tremendous promise of this technology to make our roadways safer and provide greater mobility options to persons with disabilities and seniors.”
But critics of the self-driving bill are on high alert.
John Simpson, privacy and technology project director at the Los Angeles-based Consumer Watchdog group, which has raised concerns about the safety of self-driving cars after recent high-profile crashes, said he is “concerned there will be a mad rush to try to slam it through” now that the contentious election season has passed.
“It's simply insanity to rush through a bad bill just to say you've got a bill,” Simpson said, adding that Congress has done little to address concerns that have been raised by safety groups about giving automakers wide latitude to sell self-driving cars.
Groups that represent trial lawyers have complained about a lack of protections that would ensure the right to sue if someone is hurt or killed in a self-driving car.
Peter Knudsen, director of communications for the Washington, D.C.-based American Association for Justice, which lobbies for trial lawyers who typically represent plaintiffs, added that his group is also still “strongly opposed” to the Senate's self-driving bill.
“We remain hopeful that proponents of AV START will adopt the vital changes necessary to ensure that the bill brings transparency and accountability to the driverless car industry,” Knudsen said.
The arguments appear to have held sway with some U.S. senators thus far. At least five have publicly expressed concerns about the measure, pointing to accidents this year that involved Uber and Tesla vehicles that were operating autonomously or semi-autonomously. The opposition prevented the self-driving bill from being quickly passed in the notoriously deliberate upper chamber.
klaing@detroitnews.com
(202) 662-8735
Twitter: @Keith_Laing

Top Automotive Industry News for Week of November 19 – November 25, 2018

Here is the most important news associated with the automotive industry identified by the AEA for the week of November 19 – November 25, 2018.

We hope it helps you stay up to speed on the key developments in our industry:

-Automotive Manufacturing News-

Ford, VW could announce electric, driverless-car collaborations: analyst

(MarketWatch)

Ford wants to get rid of that new-car smell. Here's why.

(USA Today)

General Motors buyouts likely to fall short and layoffs loom

(USA Today)

German court rules Volkswagen must reimburse owner full price of car

(Reuters)

Ghosn scandal could trigger a series of crises for Nissan, Renault, Mitsubishi

(CNBC)

GM under investigation for faulty brake vacuum pumps

(Detroit Free Press)

Mazda Toyota Manufacturing kicks off construction on $1.6B Alabama plant

(Made In Alabama)

Nissan board votes to remove Carlos Ghosn as chairman

(CNBC)

Renault taps interim chairman, COO to replace Ghosn: sources

(Reuters)

Tesla will cut prices in China in response to import tariffs: Reuters

(MarketWatch)

These are the best cars we tested in 2018

(CNBC)

-Automotive Evolution News-

AEye Raises $40M To Build Autonomous Car Sensor That Sees Better Than Humans

(Forbes)

China Is Leading the World to an Electric Car Future

(Bloomberg)

Congress considers extending electric vehicle tax credits, approval of self-driving cars

(The Detroit News)

Electric vehicle sales to 'see a big lift' over the next 2 to 3 years, BlackRock says

(CNBC)

Needing Growth, Uber Returns to Germany. This Time on Best Behavior.

(The New York Times)

-Automotive Retail News-

3 straight quarters of more than 10 million used-car sales

(Auto Remarketing)

Analysts Expect First November Car Sales Slide in 9 Years

(The Detroit Bureau)

AutoNation and Scott Painter patch things up

(Automotive News)

Black Friday is breathing life back into the 0% auto loan

(Automotive News)

Digital Crystal Ball Gives Auto Dealers A View To Future Sales

(Forbes)

Every Plug-In-Hybrid Vehicle Available in America Today

(Car and Driver)

What's the Best New-Car Deal for Black Friday?

(Cars.com)

Where the deals are for Black Friday car shopping

(CNBC)

-Automotive Wholesale News-

J.D. Power’s wholesale price projection through 2019

(Auto Remarketing)

Update on late-model auction volume

(Auto Remarketing)

Used cars with the least depreciation in 2018

(Autoblog)

-Automotive Enthusiast News-

23 hot cars we can't wait to see at the 2018 LA Auto Show

(Business Insider)

-Automotive Servicing News-

Citing Brake Concern, Feds Investigate 2.7M Pickups, Sport Utes

(Forbes)

-General Business & Executive News-

An early holiday gift: Lower gas, oil prices could boost spending, economy

(USA Today)

Billionaire threatens to tackle the nation’s most expensive auto insurance

(Insurance Business)

On Black Friday, more U.S. shoppers chose the computer over the mall

(Reuters)

PureCars launches an attribution platform just for car dealerships

(MarTech)

Tesla is turning to partners to help with a growing used-car business

(CNBC)

U.S. retail sales rebound, but consumer spending slowing

(Reuters)

-AEA Reminder-

Did we miss something? Let us know via our Contact Us Page >>. If you have specific important news going public soon that you would like to share with your fellow AEA Members, submit your PR Distribution Request >>.

Have a great week,

Member Services

memberservices@automotiveexecutives.com

Automotive Executives Association

www.automotiveexecutives.com

Mercedes-Benz FutureInsight: “Human first”: empathy as anchor in the digital transformation

Stuttgart/Berlin. What does a desirable future that is worth living look like? How can individuality and digital transformation be reconciled? How can trust be established between humans and machines? A series of “FutureInsight” debates from Mercedes-Benz addresses questions like these. In these debates, Mercedes-Benz experts discuss such questions around the theme of mobility with academics…

Milton Keynes, the Model Town Building Itself Around Self-Driving Cars 21 Nov

An Airplane With No Moving Parts 21 Nov

The Future of Autonomous Vehicles: Part I – Think Like a Robot, Perceive Like a Human

By James R. Doty, MD and Blair LaCorte

For over three decades, I’ve studied and performed surgery on the human brain. I have always been fascinated by the power, plasticity and adaptability of the brain, and by how much of this amazing capacity is dedicated to processing and interpreting data we receive from our senses. With the rapid ascension of Artificial Intelligence (AI), I began to wonder how developers would integrate the complex, multiple layers of human perception to enhance AI’s capabilities. I have been especially interested in how this integration would be applied to robots and autonomous vehicles. It became clear that the artificial intelligence needed to drive these vehicles will require artificial perception modeled after the greatest perception engine on the planet — the human visual cortex. These vehicles will need to think like a robot, but perceive like a human.

To learn more, and to better understand how this level of artificial perception will be created, I recently became an advisor to AEye, a company developing cutting-edge artificial perception and self-driving technologies, to help them use knowledge of the human brain to better inform their systems. This is known as biomimicry: the concept of learning from and then replicating natural strategies from living systems and beings (plants, animals, humans, etc.) to better adapt design and engineering. Essentially, biomimicry allows us to fit into our existing environment and evolve in the way life has successfully done for the past few billion years. But why is incorporating biomimicry and aspects of human perception integral to the development and success of autonomous vehicles?

Because nothing can take in more information and process it faster and more accurately than the human perception system. Humans classify complex objects at speeds up to 27 Hz, with the brain processing 580 megapixels of data in as little as 13 milliseconds. If we continue using conventional sensor data collection methods, we are more than 25 years away from having AI achieve the capabilities of the human brain in robots and autonomous vehicles. Therefore, to enable self-driving cars to move safely and independently in crowded urban environments or at highway speeds, we must develop new approaches and technologies to meet or exceed the performance of the human brain. The next question is: how?
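Taking the article's figures at face value, the implied throughput is worth making explicit; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the figures quoted above.
pixels = 580e6        # 580 megapixels per "frame"
latency_s = 0.013     # processed in as little as 13 milliseconds
throughput = pixels / latency_s
print(f"{throughput:.2e} pixels/s")   # ~4.46e10: roughly 45 gigapixels per second
```

That rough 45-gigapixel-per-second figure is the bar the author argues conventional sensor pipelines are decades away from clearing.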

Orthogonal data matters
(Creating an advanced, multi-dimensional data type)
Orthogonal data refers to complementary data sets which ultimately give you more quality information about an object or situation than each would alone, allowing us to make efficient judgments about what in our world is important, and what is not. Orthogonality concepts for high information quality are well understood and rooted in disciplines such as quantum physics, where linear algebra is employed and orthogonal basis sets are the minimum pieces of information one needs to represent more complex states without redundancy. When it comes to perception of moving objects, two types of critical orthogonal data sets are often required — spatial and temporal. Spatial data specifies where an object exists in the world, while temporal data specifies where an object exists in time. By integrating these data sets along with other complementary data sets such as color, temperature, sound, and smell, our brains generate a real-time model of the world around us, defining how we experience it.

The human brain takes in all kinds of orthogonal data naturally, decoupling and reassembling information instantaneously, without us even realizing it. For example, if you see that a baseball is flying through the air towards you, your brain is gathering all types of information about it, such as spatial (the direction the ball is headed) and temporal (how fast it’s moving). While this data is being processed by your visual cortex “in the background,” all you’re ultimately aware of is the action you need to take, which might be to duck. The AI perception technology that is able to successfully adopt the manner by which the human brain captures and processes these types of data sets will dominate the market.
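The spatial/temporal split is easy to make concrete in code. A minimal sketch (my own illustration, not AEye's data model): hold the two orthogonal sets side by side and combine them to extrapolate, which is essentially the "duck" decision above.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    # Spatial data: where the object is in the world (meters).
    x: float
    y: float
    # Temporal data: how that position changes over time (meters/second).
    vx: float
    vy: float

    def predict(self, dt: float) -> tuple[float, float]:
        """Combine the orthogonal sets to extrapolate position dt seconds ahead."""
        return (self.x + self.vx * dt, self.y + self.vy * dt)

# A baseball 20 m away, closing at 30 m/s: where will it be in half a second?
ball = TrackedObject(x=20.0, y=2.0, vx=-30.0, vy=0.0)
print(ball.predict(0.5))   # (5.0, 2.0) -- close enough that ducking is the right call
```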

Existing robotic sensory data acquisition systems have focused only on single sensor modalities (camera, LiDAR, radar), and only with fixed scan patterns and intensity. Unlike humans, these systems have neither learned nor acquired the ability to efficiently process and optimize 2D and 3D data in real time while both the sensor and detected objects are in motion. Therefore, they cannot use real-time orthogonal data to learn, prioritize, and focus. Effectively replicating the multi-dimensional sensory processing power of the human visual cortex will require a new approach to thinking about how to capture and process sensory data.

AEye is pioneering one such approach. AEye calls its unique biomimetic system iDAR (Intelligent Detection and Ranging). AEye’s iDAR is an intelligent artificial perception system that physically fuses a unique, agile LiDAR with a hi-res camera to create a new data type they call Dynamic Vixels. These Dynamic Vixels are one of the ways in which AEye acquires orthogonal data. By capturing x, y, z, r, g, b data (along with SWIR intensity), these patented Dynamic Vixels are uniquely created to biomimic the data structure of the human visual cortex. Like the human visual cortex, the intelligence of the Dynamic Vixels is then integrated in the central perception engine and motion planning system, which is the functional brain of the vehicle. They are dynamic because they actively interrogate a scene and adjust to changing conditions, such as by increasing the power level of the sensor to cut through rain, or by revisiting suspect objects in the same frame to identify obstacles. Better data drives more actionable information.
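The article names the per-sample fields but not the actual layout; a sketch of what such a fused sample might look like (field names are my guesses for illustration, since the real Dynamic Vixel format is proprietary):

```python
from dataclasses import dataclass

@dataclass
class DynamicVixel:
    """Illustrative sketch of the fused LiDAR+camera sample described above.

    Field names are guesses; AEye's actual Dynamic Vixel layout is not
    published in the article.
    """
    # Spatial return from the agile LiDAR
    x: float
    y: float
    z: float
    swir_intensity: float   # SWIR return intensity
    # Co-registered pixel from the hi-res camera
    r: int
    g: int
    b: int
    # Timestamp, so the perception engine can revisit and track over time
    t: float
```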

Not all objects are created equal
(See everything, and focus on what is important)
Humans continuously analyze their environment, always scanning for new objects, then, in parallel and as appropriate, focusing in on elements that are either interesting, engaging, or potentially a threat. The visual cortex does this processing fast, with incredible accuracy, and with very little of the brain’s immense processing power. If a human brain functioned as autonomous vehicles do today, we would not have survived as a species.

In his book The Power of Fifty Bits, Bob Nease writes of the ten million bits of information the human brain processes each second, of which only fifty bits are devoted to conscious thought. This is due to multiple evolutionary factors, including our adaptation to ignore autonomic processes like our heart beating, and our visual cortex’s screening out of less relevant information in our surroundings (like the sky) to survive. It is an intelligent system design.

This is the nature of our intelligent vision. So, while our eyes are always scanning and searching to identify new objects entering a scene, we focus our attention on objects that matter as they move into areas of concern, allowing us to track them over time. In short, we search a scene, consciously acquire the objects that matter, and track them as required.

As discussed, current autonomous vehicle sensor configurations utilize a combination of LiDAR, cameras, ultrasonics, and radar as their “senses.” These collect serially (one way) and are limited to fixed patterns of search. They gather as much data as possible, which is then aligned, processed, and analyzed long after the fact. This post-processing is slow and does not allow for situational changes to how sensory data is captured in real time. Because these sensors don’t intelligently interrogate, up to 95% of the sensory data currently being collected is thrown out as either irrelevant or redundant at the time it is processed. This triage itself comes with a latency penalty: at highway speeds, the car moves more than 20 feet before the sensor data has been fully processed. Throwing away data you don’t need with the goal of being efficient is inefficient. A better approach exists.
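The 20-foot figure implies a substantial processing delay. A quick check, assuming roughly 65 mph (the article doesn't state the speed):

```python
# Rough check of the 20-foot claim above; the 65 mph speed is an assumption.
speed_mph = 65
speed_fps = speed_mph * 5280 / 3600   # ~95.3 feet per second
distance_ft = 20
latency_s = distance_ft / speed_fps
print(f"~{latency_s * 1000:.0f} ms of end-to-end processing latency")   # ~210 ms
```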

The overwhelming task of sifting through this data — every tree, curb, parked vehicle, the sky, the road, leaves on trees, and other static objects — also requires immense power and data processing resources, which slows down the entire system significantly, and introduces risk. These systems’ goal is to focus on everything and then try to analyze each item in their environment, after the fact, at the expense of timely action. This is the exact opposite of how humans process spatial and temporal data in situations that we associate with driving.

AEye’s iDAR teaches autonomous vehicles to “search, acquire, and track” objects as we do, by defining new data and sensor types that more efficiently communicate actionable information while maintaining the intelligence to analyze this data as quickly and accurately as possible. AEye’s iDAR enables this through its unique foundational solid-state agile LiDAR. Unlike standard LiDAR, AEye’s agile LiDAR is situationally adaptive, so it can modify scan patterns and trade off resources such as update rate, resolution, and maximum detection range. This enables iDAR to dynamically adjust as it optimally searches a scene, conserve power and apply that power to efficiently identify and acquire critical objects, and track these objects over time. iDAR’s unique ability to intelligently use power to search, acquire, and track scenes helps identify that the object is a child walking into the street, or that it is a car entering the intersection and accelerating to high speed. Doing this in real time is the difference between a safe journey and an avoidable tragedy.
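The article describes this behavior but not the control logic. A minimal sketch of what a search/acquire/track cycle with selective spending of sensor resources might look like (all class and field names here are hypothetical, not AEye's API):

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    position: tuple          # where the object was seen
    threat_score: float      # e.g. a child near the roadway scores high

@dataclass
class SearchAcquireTrack:
    """Hypothetical sketch of the cycle described above; not AEye's actual API."""
    tracks: list = field(default_factory=list)
    threat_threshold: float = 0.7

    def step(self, detections):
        # 1. Search: every detection from the low-res sweep is considered once.
        for det in detections:
            # 2. Acquire: spend extra resolution/power only on what matters.
            if det.threat_score >= self.threat_threshold:
                self.tracks.append(det)
        # 3. Track: revisit acquired objects at a high rate instead of
        #    re-scanning the entire scene uniformly.
        return [t.position for t in self.tracks]

# A child stepping into the street vs. a parked car far from the lane:
scene = [Detection((5, 2), 0.9), Detection((40, 10), 0.1)]
print(SearchAcquireTrack().step(scene))   # only the high-threat object is tracked
```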

Humans Learn Intuitively
(Feedback loops enable intelligence)
As we have discussed, the human visual cortex can scan at 27 Hz (much faster than current sensors on autonomous vehicles, which average around 10 Hz). The brain naturally gathers…

AEye’s iDAR Shatters Both Range and Scan Rate Performance Records for Automotive Grade LiDAR

Company Simultaneously Closes $40 Million Series B Funding to Fuel Global Expansion
“The test conducted by AEye delivered impressive results…This is an outstanding achievement that demonstrates the true potential of perception systems to reliably and accurately detect and track objects at great range.”

Pleasanton, CA – November 19, 2018 – AEye, a world leader in artificial perception systems and the developer of iDAR™, today announced a major breakthrough in long-range threat detection and safety. In performance specification tests monitored and validated by VSI Labs, one of the nation’s leading automated vehicle technology advisors, AEye’s iDAR system detected and tracked a truck at 1,000 meters, or one kilometer – four to five times the distance current LiDAR systems are able to detect. AEye’s test sets a new benchmark for solid-state LiDAR range, and comes one month after AEye announced a 100Hz scan rate – setting a new speed record for the industry.

The company simultaneously announced $40M in Series B funding, led by Taiwania Capital. The round was significantly oversubscribed and includes multiple global automotive OEMs, Tier 1s, and Tier 2s to be formally announced at CES in January. In addition to Taiwania Capital, existing investors Kleiner Perkins, Intel Capital, Airbus Ventures and Tychee Partners also participated.

New Range and Scan Rate Records Key to Autonomous Automotive and Trucking Safety
Using AEye’s standard iDAR sensor, the company set up a formal test, monitored by VSI Labs, the leading research and development resource for active safety and automated vehicle technologies. The test was structured to establish and verify the range and scan rates of the iDAR system.

The test was conducted on the runway of an airport in Byron, California, in order to isolate targets to better measure and calibrate iDAR’s performance. To test range, a standard 20-foot moving truck was tracked and continuously scanned down the length of the 914-meter runway. At the end of the runway, the iDAR system was still able to continuously detect and track the movements of the vehicle, as well as detect runway signs and markers en route. The AEye sensor vehicle was then taken off the runway to extend the available test range to over 1,000 meters, where iDAR continued to track the truck without difficulty.

“The test conducted by AEye delivered impressive results,” said Sara Sargent, senior engineer at VSI Labs. “We monitored the performance and the truck was clearly identifiable and visible at 1 kilometer. We were also able to verify that AEye’s iDAR system achieves scan rates of 100Hz and that the fusion of the camera and LiDAR in the iDAR sensor produces accurate true color real-time point clouds in the form of Dynamic Vixels. This is an outstanding achievement that demonstrates the true potential of perception systems to reliably and accurately detect and track objects at great range.”

iDAR and Biomimicry
AEye’s iDAR is an intelligent artificial perception system that physically fuses an agile, solid-state LiDAR with a hi-res camera to create a new data type called Dynamic Vixels. These Dynamic Vixels are the result of real-time integration of iDAR’s agile LiDAR and a low-light camera in the iDAR sensor, rather than post-fusion of a separate camera and LiDAR system after the scan. By capturing x, y, z, r, g, b data, Dynamic Vixels are uniquely created to “biomimic” the data structure of the human visual cortex. Better data drives vastly superior performance and delivers more accurate information. AEye’s use of biomimicry is more fully explored by Dr. James Doty, world-renowned neurosurgeon and clinical professor in the Department of Neurosurgery at Stanford University, in an article he recently published on Medium.

“After establishing a new standard for LiDAR scan speed, we set out to see just how far we could accurately search, acquire and track an object such as a truck,” said Blair LaCorte, Chief of Staff at AEye. “The iDAR system performed as we expected. We detected the truck with plenty of signal to identify it as an object of interest, and then easily tracked it as it moved over 1,000 meters away. We now believe that with small adaptations, we can achieve range performance of 5 km to 10 km or more. These results have significant implications for the autonomous trucking and Unmanned Aircraft Systems (UAS) markets, where sensing distance needs to be as far as possible and potential threats identified as early as possible to achieve safe, reliable vehicle autonomy.”

New Funds Fuel Company’s Global Expansion
In addition, AEye announced the close of its Series B round, bringing the company’s total funding to over $61 million. The funds will be used to scale AEye’s operations to meet global demand for the company’s artificial perception systems for autonomous vehicles. AEye is uniquely structured to effectively scale through partnerships with contract manufacturers and Tier 1s on a global basis. This has allowed the company to focus on its core design and innovation competencies, avoiding the costs of building manufacturing capacity while optimizing investment dollars on higher-value activities. AEye’s growth has been fueled by its ability, as a software-driven platform, to provide artificial perception systems that address both ADAS and Mobility solutions, and by engagements with customers and partners in Europe, North America, and Asia.

“This funding marks an inflection point for AEye, as we scale our staff, partnerships and investments to align with our customers’ roadmap to commercialization,” said Luis Dussan, AEye founder and CEO. “Our strategic relationship with Taiwania will serve as a gateway to Asia, with valuable manufacturing, logistics and technology resources that will accelerate our ability to address the needs of a global market. We intend to launch our next generation product at CES, which we believe will help OEMs and Tier 1s accelerate their products and services by delivering market leading performance at the lowest cost.”

“We see AEye as the foremost innovator in this space, whose systems deliver highly precise, actionable information at speeds and distances never seen in commercially available LiDAR sensors,” said Huang Lee, Managing Partner at Taiwania. “We look forward to working closely with AEye’s team to explore and pursue growth opportunities in this burgeoning space.”

AEye takes a disruptive approach to vehicle perception by putting intelligence at the sensor layer and making it extensible and controllable via a software-driven architecture. The company’s iDAR system is an intelligent artificial perception system that physically fuses an agile, solid-state LiDAR with a hi-res camera to create a new data type called Dynamic Vixels, with integrated software-definable feedback control loops. This enables the iDAR sensor to dynamically assess and prioritize what’s most relevant in a scene, then process this data at the edge. This goes well beyond the function of legacy fixed-pattern LiDAR systems and standalone cameras with 2D computer vision algorithms. This unique approach enables rapid, dynamic perception and path planning, for drastically improved autonomous vehicle safety and performance.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Intel Capital, and Airbus Ventures.

About Taiwania Capital
Taiwania Capital is a venture capital firm sponsored by the Taiwan government and large private enterprises. Founded in 2017, Taiwania Capital is focused on ICT-related sectors and startups in fields including: enterprise IT infrastructure and software, AI, IoT, network security, industrial automation, drones and robotics, next-gen semiconductors, autonomous vehicle technology, and digital devices. With offices in both Taiwan and Silicon Valley, Taiwania Capital exclusively backs startups that will turn the promises of technological advancement into scalable applications.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


TomTom navigation for motorcyclists now available on the BMW Motorrad Connected app

MILAN, 07-Nov-2018 — /EuropaWire/ — TomTom (TOM2) today announced that BMW Motorrad owners can now experience the best of TomTom navigation for motorcyclists running on the BMW Motorrad Connected app. The smartphone app stays safely in the rider’s pocket, while visual directions are shown on the bike’s integrated handlebar display. Audio directions are provided via Bluetooth® into the rider’s compatible helmet.

Features have been motorcycle-optimized, with one of the most requested – the option to choose winding routes – being introduced.

The new functionality is available from today, with app users needing only to update their app, free of charge, before their next ride.

Antoine Saucier, Managing Director, TomTom Automotive, said: “The combination of TomTom’s maps, software and services provides a fantastic motorbike navigation experience for BMW Motorrad riders.”

TomTom’s navigation components are provided to BMW Motorrad via TomTom’s navigation software, NavKit, alongside TomTom’s NDS maps and services including TomTom Traffic, weather, and speed cameras.

TomTom is at EICMA 2018 – Pavilion 13, Booth N72.

The Bluetooth® word mark and logos are registered trademarks owned by Bluetooth SIG, Inc. and any use of such marks by TomTom is under license. Other trademarks and trade names are those of their respective owners.

ENDS

About TomTom
TomTom is the leading independent location technology specialist, shaping mobility with highly accurate maps, navigation software, real-time traffic information and services.

To achieve our vision of a safer world, free of congestion and emissions, we create innovative technologies that keep the world moving. By combining our extensive experience with leading business and technology partners, we power connected vehicles, smart mobility and, ultimately, autonomous driving.

Headquartered in Amsterdam with offices in 37 countries, TomTom’s technologies are trusted by hundreds of millions of people worldwide.

www.tomtom.com

SOURCE: TomTom International BV

MEDIA CONTACT
tomtom.pr@tomtom.com
+ 31 (0) 20 7574730
