Nearly three years ago, Aurora was founded with one mission: to deliver the benefits of self-driving safely, quickly, and broadly. From the earliest days of the company, we knew that achieving this mission would require a deeply experienced team and a meticulously designed Aurora Driver consisting of the hardware, software, and data services required to navigate vehicles safely through the world.
Given hardware’s foundational role in powering the Aurora Driver, we’ve made several key investments in the last three years to build a world-class hardware engineering team and technology. This post briefly highlights who that team is, how we approach self-driving hardware development, what we’re building, and why this matters.
Who we are
Aurora’s hardware engineering team brings together a unique group of technologists and product designers with deep experience in building self-driving and other high-performance products at scale. Members of our hardware engineering team have shipped some of the most advanced automotive systems on the road today, architected high-performance consumer electronics, developed cutting-edge defense robotics, designed advanced lidar systems, and launched a number of successful networking and telecom infrastructure products at scale.
How we approach self-driving hardware development
As you may have heard, self-driving is a challenging problem. Solving it safely, quickly, and at scale requires an elegant combination of carefully crafted sensors, computers, and networking hardware, custom-fit to the software it serves.
Understanding the hardware requirements for self-driving is challenging. It takes experience, for instance, to know what combination of sensors will give the perception system the best chance of detecting, classifying, localizing, and tracking all objects of interest in the world. “First principles” from physics are a necessary but ultimately insufficient consideration for this design task. Does a single pixel on a box 300 meters away provide enough information to take action? If not, how many data points will our perception system ultimately require?
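As a rough illustration of this kind of first-principles check, the short sketch below estimates how many horizontal pixels an object subtends at range under a simple pinhole-camera model. The field of view and resolution are hypothetical values chosen for illustration, not our production camera specs.

```python
import math

def pixels_on_target(target_size_m: float, range_m: float,
                     horizontal_fov_deg: float, horizontal_pixels: int) -> float:
    """Estimate how many horizontal pixels a target subtends at a given range.

    Assumes a simple pinhole camera with pixels spread evenly across the
    horizontal field of view (a reasonable first-order approximation).
    """
    # Angle subtended by the target, in degrees.
    target_angle_deg = math.degrees(2 * math.atan(target_size_m / (2 * range_m)))
    # Pixels available per degree of horizontal field of view.
    pixels_per_degree = horizontal_pixels / horizontal_fov_deg
    return target_angle_deg * pixels_per_degree

# Hypothetical wide-view camera: 120-degree FOV, 1920-pixel-wide imager.
# A 0.5 m-wide box at 300 m covers only about 1.5 pixels, which is exactly
# the kind of number the perception team must judge as actionable or not.
print(f"{pixels_on_target(0.5, 300.0, 120.0, 1920):.1f} pixels")
```

Numbers like these frame the question; whether a given handful of pixels is enough to act on is a judgment only the perception system, and the team behind it, can make.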
Our hardware design starts with a “first principles” evaluation of the most extreme cases we expect the system to handle and ends with simulation, experiments, and evaluation by a team of perception experts who are uniquely suited to predict future software capabilities.
From here, our perception team builds boundary scenarios, a few examples of which are illustrated above. Boundary scenarios are the “edge cases,” the difficult situations our vehicles must be equipped to handle to operate safely on the road. These scenarios help us develop a list of required sensing capabilities, which in turn informs where we should place each sensor. Before committing to a sensor configuration, we also virtually test candidate layouts against a suite of detection scenarios derived from on-road testing. Throughout, we challenge our hardware and software teams to work together to find optimal solutions.
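To make that workflow concrete, here is a minimal, hypothetical sketch of how a boundary scenario might be written down and checked against a candidate sensor layout. The field names, thresholds, and coverage test are illustrative stand-ins, not our actual tooling or requirements.

```python
from dataclasses import dataclass

@dataclass
class BoundaryScenario:
    """A difficult situation the vehicle must handle (illustrative fields)."""
    name: str
    target_range_m: float        # distance to the critical object
    target_bearing_deg: float    # direction relative to the vehicle's heading
    min_detections_per_s: float  # how often perception must observe it

@dataclass
class Sensor:
    """A candidate sensor placement (illustrative fields)."""
    name: str
    max_range_m: float
    fov_center_deg: float
    fov_width_deg: float
    update_rate_hz: float

def covers(sensor: Sensor, scenario: BoundaryScenario) -> bool:
    """True if this sensor alone can observe the scenario's critical object."""
    in_range = scenario.target_range_m <= sensor.max_range_m
    in_fov = abs(scenario.target_bearing_deg - sensor.fov_center_deg) <= sensor.fov_width_deg / 2
    fast_enough = sensor.update_rate_hz >= scenario.min_detections_per_s
    return in_range and in_fov and fast_enough

# A merging-truck scenario checked against a two-sensor candidate layout.
scenario = BoundaryScenario("truck merging at highway speed", 280.0, 5.0, 10.0)
layout = [
    Sensor("front long-range lidar", 300.0, 0.0, 30.0, 10.0),
    Sensor("front wide camera", 120.0, 0.0, 120.0, 30.0),
]
print([s.name for s in layout if covers(s, scenario)])  # ['front long-range lidar']
```

In practice the evaluation is far richer (occlusion, weather, detection confidence over time), but the shape is the same: scenarios generate requirements, and candidate layouts are tested against them before any hardware is committed.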
High-precision self-driving software also requires carefully crafted hardware to power, synchronize, and ingest the data from dozens of high-bandwidth sensors in and around the vehicle. Designing such a system requires a deep understanding of the software architecture and its dependencies, as well as close collaboration between experienced hardware designers and software engineers. In our experience, meeting the unique needs of these systems requires custom design.
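For a sense of scale, here is a back-of-the-envelope bandwidth estimate for a hypothetical sensor suite; the sensor counts and per-sensor data rates are assumptions for illustration, not our actual configuration.

```python
# Hypothetical sensor suite and per-sensor data rates (illustrative only).
sensor_suite = {
    # name: (count, megabits per second per sensor)
    "long-range camera": (4, 2000),
    "wide-angle camera": (8, 1000),
    "lidar": (4, 400),
    "imaging radar": (6, 100),
}

total_mbps = sum(count * rate for count, rate in sensor_suite.values())
print(f"Aggregate sensor bandwidth: {total_mbps / 1000:.1f} Gbit/s")
# => roughly 18 Gbit/s that must be powered, time-stamped, moved, and
#    processed continuously, far more than traditional in-vehicle networks
#    were built to carry.
```

Even with made-up numbers, the conclusion holds: moving and synchronizing this much data reliably is a system-design problem, not something solved by bolting off-the-shelf parts together.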
Our approach to design customization in hardware mirrors the approach we’ve taken in software, in that we find ways to incorporate critical elements of rapidly evolving technology developed across multiple industries into our product. For example, rolling shutter imagers have received a tremendous amount of focus and investment over the last several years as consumer products like cell phones drive improvements in resolution, cost, and signal-to-noise ratio (a measure of a camera’s sensitivity). Our camera system rides this wave of innovation by incorporating these imagers into custom-designed modules peering through Aurora-designed lenses. Similarly, the performance and efficiency of central and general processing silicon continue to advance, driven by trends across a variety of use cases. We leverage these advances by incorporating this silicon into our own custom computing platform tailored specifically to the needs of self-driving.
What we’re building
Below are a few key elements of our hardware design.
The Aurora computer is based on an enterprise-class server architecture and processors designed specifically for machine learning acceleration and camera signal processing. This computer acts as the central hub across all Driver hardware. It conditions and distributes power, measures and controls the power and thermal loads of all elements of the system, and ingests, synchronizes, and processes all sensor and vehicle telemetry data required to operate the vehicle.
Inside this computer, a custom Aurora networking switch uses an advanced networking chip and a unique combination of next-generation, high-bandwidth automotive physical layers to efficiently move data between nodes, duplicate data packets, provide redundant pathways, and synchronize our sensors to the microsecond.
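As a simplified picture of what that synchronization provides, the sketch below maps two sensors’ local timestamps onto a shared master clock using a fixed offset-and-drift model. A real time-sync protocol such as IEEE 1588 PTP estimates these terms continuously; the numbers here are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class ClockModel:
    """Maps a sensor's local clock onto the central computer's master clock.

    A time-sync protocol (e.g. IEEE 1588 PTP) continuously estimates these
    terms in a real system; here they are fixed constants for illustration.
    """
    offset_us: float  # constant offset between sensor clock and master clock
    drift_ppm: float  # sensor clock drift, in parts per million

    def to_master_us(self, local_us: float) -> float:
        return local_us + self.offset_us + local_us * self.drift_ppm * 1e-6

# Two sensors observe the same event but report different local timestamps.
lidar_clock = ClockModel(offset_us=42.0, drift_ppm=3.0)
camera_clock = ClockModel(offset_us=-17.5, drift_ppm=-1.0)

lidar_local_us = 1_000_000.0   # the lidar's local view of the event
camera_local_us = 1_000_063.5  # the camera's local view of the same event

# After correction, both land on (nearly) the same master-clock time, so
# detections from different sensors can be fused on a common timeline.
print(lidar_clock.to_master_us(lidar_local_us))   # ~1000045.0 microseconds
print(camera_clock.to_master_us(camera_local_us)) # ~1000045.0 microseconds
```

Without that shared timeline, a fast-moving object seen by a camera and a lidar a few milliseconds apart would appear at measurably different positions at “the same time.”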
- Cameras: We’ve custom-designed the lensing, layout, and cleaning solution for our cameras to meet the demands of the broad set of use cases in which our Driver will operate. This ensures we have sufficient range to drive safely on high-speed highways and sufficiently broad visibility to operate in congested urban settings.
- Radar: We’re working with partners to develop custom imaging radar solutions that provide far greater range and resolution than traditional automotive radar.
- Lidar: Our lidar design is focused on the development and productization of next-generation Frequency Modulated Continuous Wave (FMCW) technology. The inherent precision, range, and interference rejection of this lidar make it a game-changer in high-speed applications that require long-range sensing to operate safely; the sketch below shows how FMCW recovers range and velocity in a single measurement.
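Here is a small sketch of the math behind FMCW sensing with a triangular chirp: the round-trip time of flight shifts the up- and down-chirp beat tones by the same amount, while the Doppler shift moves them in opposite directions, so range and radial velocity separate cleanly. The chirp parameters and target below are illustrative, not our sensor’s actual design values.

```python
# Illustrative FMCW parameters (not actual sensor design values).
C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 1550e-9     # a common FMCW lidar operating wavelength, m
CHIRP_BANDWIDTH = 1.0e9  # frequency sweep of one chirp ramp, Hz
CHIRP_DURATION = 10e-6   # duration of one chirp ramp, s

def range_and_velocity(f_beat_up_hz: float, f_beat_down_hz: float):
    """Recover range and radial velocity from up- and down-chirp beat tones."""
    slope = CHIRP_BANDWIDTH / CHIRP_DURATION         # Hz per second
    f_range = (f_beat_up_hz + f_beat_down_hz) / 2    # delay-induced component
    f_doppler = (f_beat_down_hz - f_beat_up_hz) / 2  # Doppler-induced component
    range_m = C * f_range / (2 * slope)
    velocity_mps = f_doppler * WAVELENGTH / 2        # positive = closing target
    return range_m, velocity_mps

# A target at 300 m closing at 30 m/s, expressed as the beat tones it produces.
slope = CHIRP_BANDWIDTH / CHIRP_DURATION
f_range = 2 * 300.0 * slope / C     # ~200 MHz from the round-trip delay
f_doppler = 2 * 30.0 / WAVELENGTH   # ~38.7 MHz from the Doppler shift
print(range_and_velocity(f_range - f_doppler, f_range + f_doppler))  # ~(300.0, 30.0)
```

Because velocity comes directly from the Doppler shift of the returned light rather than from differencing positions over time, each measurement carries both where an object is and how fast it is approaching, which is exactly what long-range, high-speed driving demands.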
Finally, in keeping with the “broadly” part of our mission statement, we are building a vehicle-agnostic hardware platform. That means we are designing our computer and sensors as a tightly integrated and self-sufficient suite that maintains its safety and performance guarantees even when incorporated into vehicles of various makes, models, and classes. From a small battery-electric sedan to a large, diesel-powered truck, our hardware will enable our Driver to operate any vehicle, provided the platform meets a minimum set of interface requirements.
While the rewards of such a platform are tremendous (cost and learning economies of scale for all vehicles that use it), building it requires a substantial up-front investment. Rather than depending on many interfaces throughout the vehicle, our hardware operates as a central, largely self-sufficient hub. It conditions and distributes its own power, coordinates and synchronizes its own sensors, communicates with the vehicle over a simple umbilical, and communicates with transportation networks over a common network. This centralization of critical functions makes Aurora’s hardware system highly adaptable and allows us to realize many of the benefits of vertical integration, despite the fact that the vehicle ecosystem it powers is anything but vertical in nature.
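One way to picture that “simple umbilical” is as a narrow, well-defined contract that every vehicle platform must implement. The sketch below is a hypothetical minimal interface for illustration; it is not our actual vehicle API, and a production interface would carry much more (redundant actuation paths, health monitoring, fault handling).

```python
from typing import Protocol

class VehiclePlatform(Protocol):
    """A hypothetical minimal contract a vehicle must satisfy to host the Driver.

    Whether the platform is a battery-electric sedan or a diesel truck, the
    Driver-side code depends only on this narrow interface; power conditioning,
    sensor synchronization, and compute all live in the central hub.
    """
    def apply_steering(self, angle_rad: float) -> None: ...
    def apply_acceleration(self, accel_mps2: float) -> None: ...  # negative = brake
    def vehicle_state(self) -> dict: ...  # speed, gear, actuator health, etc.

class ElectricSedan:
    """One concrete platform; a diesel truck would implement the same methods."""
    def apply_steering(self, angle_rad: float) -> None:
        print(f"sedan: steering to {angle_rad:.3f} rad")
    def apply_acceleration(self, accel_mps2: float) -> None:
        print(f"sedan: accelerating at {accel_mps2:.1f} m/s^2")
    def vehicle_state(self) -> dict:
        return {"speed_mps": 12.0, "gear": "D"}

def drive_one_step(vehicle: VehiclePlatform) -> None:
    """Driver-side logic is identical regardless of make, model, or class."""
    state = vehicle.vehicle_state()
    if state["speed_mps"] > 10.0:
        vehicle.apply_acceleration(-1.0)  # ease off
    vehicle.apply_steering(0.02)

drive_one_step(ElectricSedan())
```

The value of keeping that contract narrow is exactly the adaptability described above: the same central hub, sensors, and software can move from one vehicle class to another without being redesigned around each vehicle’s internals.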
Where we’re going from here
We’re proud of the teams we’ve built and the progress we’ve made over the last three years in software, hardware, and data services. And we’re not done. In the coming months, we’ll continue developing our hardware systems, increasing their capability and maturity and laying the groundwork for scaling this hardware across a larger number of vehicles operating on public roadways.
If you’re inspired by our team and our mission and think you have what it takes to build the world’s Driver, check out the Hardware Engineer openings on our careers page and reach out!