Introducing an Open Standard for Autonomous Vehicle Visualizations
By: Xiaoji Chen, Joseph Lisee, Tim Wojtaszek, and Abhishek Gupta
Understanding what autonomous vehicles perceive as they navigate urban environments is essential to developing safe and reliable autonomous systems. Just as we have standards for street signs and traffic infrastructure to help human drivers, autonomous vehicle developers would be well-served by a standard visualization platform to represent input from sensors, image classification, motion inference, and all other techniques used to build an accurate image of the immediate environment.
That’s why we’re so excited to open source the redesigned and expanded Autonomous Visualization System (AVS), a new way for the industry to understand and share its data.
AVS is a new standard for describing and visualizing autonomous vehicle perception, motion, and planning data. It offers a powerful web-based toolkit for building applications that explore and interact with that data and, most critically, use it to make key development decisions.
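To make the idea of a shared data standard concrete, here is a minimal, illustrative sketch of what an AVS-style message might look like. It loosely follows the shape of XVIZ, the data protocol underlying AVS; the specific field names and stream paths used here (such as `/lidar/points`) are illustrative assumptions, not the normative schema.

```python
import json

def make_state_update(timestamp, points):
    """Build an XVIZ-style snapshot message carrying a point-cloud primitive.

    NOTE: field names are illustrative assumptions modeled on the XVIZ
    protocol; consult the actual XVIZ specification for the real schema.
    """
    return {
        "type": "xviz/state_update",
        "data": {
            "update_type": "snapshot",
            "updates": [
                {
                    "timestamp": timestamp,
                    # Primitives are keyed by stream name; a viewer can
                    # subscribe to streams it knows how to render.
                    "primitives": {
                        "/lidar/points": {
                            "points": [{"points": points}],
                        }
                    },
                }
            ],
        },
    }

# Two example lidar returns as [x, y, z] coordinates in meters.
message = make_state_update(1001.3, [[1.0, 2.0, 0.5], [1.1, 2.1, 0.5]])

# In practice a message like this would be serialized and streamed
# (e.g., over a WebSocket) to a browser-based visualization client.
encoded = json.dumps(message)
decoded = json.loads(encoded)
```

The design point this sketch illustrates is separation of concerns: because the message is self-describing, any tool that speaks the standard can render it, and the autonomy stack producing the data never needs to know which visualization client is on the other end.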
As a stand-alone, standardized visualization layer, AVS frees developers from having to build custom visualization software for their autonomous vehicles. With AVS abstracting visualization, developers can focus on core autonomy capabilities for drive systems, remote assistance, mapping, and simulation.
Building the most capable self-driving system requires collaboration across disciplines, teams, and partners. With AVS, we hope to inspire a more consistent way for self-driving technology to be visualized throughout the industry, and ultimately bring the future into the present.
Read on at Uber Engineering.