- Software
- Smart Technology
- EVs
- Vehicle Subscriptions
- Advanced Driver Assist Systems
Computing Efficiency Will Be Essential to Software-Defined Vehicles
Apart from the shift to electrification, one of the hottest topics in the automotive industry of late has been the software-defined vehicle (SDV). Computing and software are nothing new to vehicles, with the first embedded electronic controls appearing in the 1970s. However, the effort has shifted to a whole new level in the decade since the production debut of the Tesla Model S. Automakers increasingly see the SDV as a pathway to securing new recurring revenue streams, while semiconductor suppliers see it as an area to diversify their revenues into new markets. But it all needs to be done while consuming as little power as possible.
Those early electronic controls, such as engine management and antilock brakes, are typically referred to as software-enabled. Deeply embedded code, often written in assembly language on the lowest-power microcontrollers that engineers could get away with, is tightly coupled to the hardware. Every feature, from sequential turn signals to adaptive cruise control, typically has its own dedicated electronic control unit (ECU), with many modern vehicles having 100 or more ECUs.
Bring On the Software-Defined Vehicle
The SDV approach tends to rely on a more centralized compute architecture with a software abstraction layer that separates the hardware from the applications. Instead of small, low-power microcontroller chips that each handle one or a few functions, the SDV generally relies on a high-performance system-on-a-chip (SoC).
In the early 1990s, when I worked on antilock braking systems (ABS) powered by an Intel 80C196 microcontroller, we used only integer math and shift operations rather than multiplication and division to save a few clock cycles. Today, companies like NVIDIA and Qualcomm are providing the auto industry with SoCs like the Orin and Snapdragon Ride that combine general-purpose computing cores, graphics cores, and neural processing cores to deliver hundreds of trillions of integer and floating-point operations per second. This performance is necessary for driver-assist and automated driving systems to perceive the world around the vehicle and make control decisions, as well as for rendering the graphical interfaces on massive touchscreens.
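The shift-and-add trick from that era is simple to illustrate. This is a minimal sketch of the general technique, not the actual ABS code; the function names and the wheel-speed framing are mine:

```python
def scale_by_10(x: int) -> int:
    """Multiply by 10 without a multiply instruction: 10x = 8x + 2x."""
    return (x << 3) + (x << 1)

def divide_by_8(x: int) -> int:
    """Divide a non-negative value by 8 with a right shift (truncating)."""
    return x >> 3

# Example: scale a raw wheel-speed counter value
raw_count = 123
print(scale_by_10(raw_count))  # 1230
print(divide_by_8(raw_count))  # 15
```

On a chip like the 80C196, shifts and adds completed in far fewer cycles than the multiply and divide instructions, which is why control loops were written this way.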
Highly automated vehicles like the prototype robotaxis that Cruise, Waymo, Motional, and Argo AI are using to pick up passengers in multiple cities use customized compute platforms with even higher performance. While these state-of-the-art chips enable vehicles to do things I only dreamed of when I began my engineering career more than 30 years ago, they also consume a lot of power. The compute platform in a Chevrolet Bolt EV robotaxi operated by Cruise in San Francisco consumes as much as 4 kW. Over a 10-hour shift, the computing alone drains 40 kWh; with a 65 kWh battery pack, that Bolt would have only enough energy left for about 50 miles of driving.
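The arithmetic behind that range estimate is worth sketching. The pack size, compute draw, and shift length come from the example above; the 2 mi/kWh efficiency figure is my assumption for stop-and-go robotaxi duty, chosen to match the roughly 50-mile result, not a published figure:

```python
# Rough energy budget for the Cruise-operated Bolt EV example
pack_kwh = 65.0      # usable battery capacity
compute_kw = 4.0     # peak compute platform draw
shift_hours = 10.0   # robotaxi shift length

compute_kwh = compute_kw * shift_hours   # 40 kWh burned by compute alone
remaining_kwh = pack_kwh - compute_kwh   # 25 kWh left for propulsion

miles_per_kwh = 2.0  # assumed city/robotaxi efficiency (my estimate)
print(remaining_kwh * miles_per_kwh)     # 50.0 miles
```

Under these assumptions, well over half the pack goes to computing rather than propulsion, which is the core of the efficiency problem.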
Cutting the Power
For consumer electric SDVs, the power draw won't be quite as extreme, probably no more than a few hundred watts even on the most advanced vehicles. On an EV, however, every watt of continuous draw counts and takes away from the available range. Given the cost and mass of batteries, squeezing every bit of propulsion capability from the available energy is just as important today as it was when developing ABS 30 years ago.
If automakers hope to convince consumers to pay for recurring subscriptions to in-vehicle services and features, vehicle purchase costs will have to drop. In the age of EVs, that means reducing energy requirements on the battery. Making the most efficient use of energy in power conversion, heat recovery for climate control, and of course in-vehicle computing will be essential. Chip designers must do their part to improve performance per watt if they want to expand their automotive business.