
Custom Computing Power May Be Apple's Automated Vehicle Ace

Sam Abuelsamid
Oct 11, 2018

Apple is one of the most notoriously secretive companies in Silicon Valley. The company has reportedly seeded misinformation among employees in hopes of ferreting out leakers when stories appear in the media. Despite its penchant for keeping its cards close to the vest, it has been an open secret since at least early 2015 that Apple is working on some sort of automated vehicle (AV) project. While nothing was said directly about the project during the company’s September 2018 iPhone announcement, there were some clues about what could make an Apple AV special.

Creating an automated driving system requires several elements: high-resolution sensors, sophisticated machine learning software to interpret what those sensors are seeing, and enough compute power to process it all. Apple has been among the leaders in computational photography for several years, taking the limited data available from tiny smartphone image sensors and turning it into images that, in many cases, rival those from much larger digital single-lens reflex cameras. Last year, the company added other types of sensors, including infrared, to the iPhone X to enable its Face ID authentication.
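To make that concrete, a rough sketch of those three elements in Swift might look like the following. Every type and function name here is an illustrative assumption, not anything Apple has published.

    // Illustrative pipeline: sensors -> perception model -> compute loop.
    import Foundation

    // 1. High-resolution sensor data (a camera frame, radar return, etc.)
    struct SensorFrame {
        let pixels: [UInt8]
        let timestamp: TimeInterval
    }

    // 2. Machine learning software that interprets what the sensors see
    struct DetectedObject {
        let label: String          // e.g. "pedestrian", "vehicle"
        let distanceMeters: Double
    }

    protocol PerceptionModel {
        func detectObjects(in frame: SensorFrame) -> [DetectedObject]
    }

    // 3. The compute platform must run this loop at sensor frame rate,
    //    within a tight power budget.
    func perceptionLoop(nextFrame: () -> SensorFrame?, model: PerceptionModel) {
        while let frame = nextFrame() {
            let objects = model.detectObjects(in: frame)
            print("t=\(frame.timestamp): detected \(objects.count) objects")
        }
    }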

Some of Apple’s sensing expertise came from the 2013 acquisition of PrimeSense, the Israeli startup responsible for the original Microsoft Kinect sensor. Apple’s expertise in image processing and object recognition has been leveraged in Face ID and in photography using both single and stereo imaging. Apple’s portrait mode can detect an individual in a complex image and then blur the background, replicating the shallow depth of field typically achievable only with far larger optics than a smartphone can accommodate. Similar object detection could be extremely valuable in an AV perception system.
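The underlying idea is simple enough to sketch: given a mask marking where the detected subject is, keep those pixels sharp and swap in pre-blurred pixels everywhere else. The Swift function below is a hypothetical illustration of that idea, not Apple’s actual implementation; an AV perception stack would use the same kind of per-pixel segmentation to decide which objects matter rather than which ones to blur.

    // Hypothetical sketch of the portrait-mode idea: the subject stays sharp,
    // background pixels are replaced with a blurred copy.
    func applyPortraitEffect(original: [Double],
                             blurredBackground: [Double],
                             subjectMask: [Bool]) -> [Double] {
        precondition(original.count == blurredBackground.count &&
                     original.count == subjectMask.count)
        return original.indices.map { i in
            subjectMask[i] ? original[i] : blurredBackground[i]
        }
    }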

Apple’s Chips Are All in the AV Pot

The heart of all of this is the compute platform. Ever since the launch of the A4 system-on-a-chip in 2010, Apple has built a reputation as one of the world’s premier chip design companies. The A12 chip announced in September 2018 takes that capability to new levels. In addition to a six-core central processing unit (CPU), it features a four-core graphics processor (GPU) and an updated neural engine.

In recent years, Nvidia has become one of the dominant players in the field of compute platforms for AVs with its line of Drive PX platforms powered by its GPU technology. Its latest chip, the Xavier, is claimed to provide 30 trillion operations per second (TOPS) while consuming just 30 W. Power consumption is enormously important in AVs, which are expected to be predominantly electrified.

The Apple A12’s neural engine has been optimized for exactly the kind of neural network processing that AVs rely on and that Nvidia’s chips handle so efficiently. The A12 neural engine is claimed to crunch through 5 TOPS, compared with 600 giga operations per second for the version in 2017’s A11. While Apple has not disclosed the neural engine’s power consumption, power efficiency has always been critical in smartphones with their small batteries, and it is another area where Apple is considered an industry leader.

Given the size and power constraints of a phone compared with an AV, it is entirely conceivable that Apple is already running a scaled-up version of the A12 neural engine in its AV test fleet. Six A12 neural engines in parallel, or an automotive-specific version with more neural cores, could match the Xavier.
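Taking the publicly claimed figures at face value, the back-of-the-envelope arithmetic behind that estimate is straightforward; because Apple has not disclosed the neural engine’s power draw, only throughput is compared here.

    // Rough scaling math using the figures quoted above.
    let a11NeuralEngineTOPS = 0.6      // 600 giga operations per second in 2017's A11
    let a12NeuralEngineTOPS = 5.0      // claimed for the A12 neural engine
    let xavierTOPS = 30.0              // Nvidia Xavier, claimed at 30 W

    let generationalGain = a12NeuralEngineTOPS / a11NeuralEngineTOPS   // roughly 8.3x in one year
    let a12EnginesToMatchXavier = xavierTOPS / a12NeuralEngineTOPS     // 6

    print("A11 to A12 neural engine gain: \(generationalGain)x")
    print("A12 neural engines needed to match Xavier throughput: \(a12EnginesToMatchXavier)")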

Partnerships Pave New Roads to the AV Future

While Apple may never get into the low-margin business of building and selling cars, it could still have something incredibly compelling to offer automakers through partnerships. Apple’s perception software capabilities, powered by its custom silicon, could be the winning combination.