
Distance Measurement Is Critical for Automotive Safety Systems

Sam Abuelsamid
Sep 15, 2021

Guidehouse Insights

Cameras are everywhere: on our phones, computers, and doorbells, and on poles around our communities. We use them to capture moments in time for nostalgia, convenience, and security. Cameras have also been a cornerstone of advanced driver assist systems (ADAS) for most of the past two decades. However, they can be used more effectively to improve safety.

As ADAS has become ubiquitous on mainstream vehicles, automakers have begun using cameras in combination with other sensor types, including ultrasonic, radar, and most recently, lidar. Cameras are inexpensive, but there is also a more fundamental difference between them and these other sensors.

Active Versus Passive Sensors

Radar, lidar, and ultrasonic sensors are all active sensors: they send out a signal and look for a reflection from other objects to determine how far away those objects are. Because the propagation speed of the signal is known, whether it's a laser pulse, a radio wave, or a sound wave, the distance to the reflecting object can be measured accurately. Passive camera sensors only capture the ambient light reflected from objects, without knowing where that light originated, and that light lands on a two-dimensional image sensor, which by itself carries no depth information.
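As an illustration, the time-of-flight arithmetic is the same for all three active sensor types; only the propagation speed changes. Here is a minimal sketch; the echo times are illustrative values chosen for the example, not figures from the article:

```python
# Time-of-flight ranging: an active sensor emits a signal and times the echo.
# The signal travels to the object and back, so distance is half of speed * time.

SPEED_OF_LIGHT_M_S = 299_792_458   # radar (radio waves) and lidar (laser pulses)
SPEED_OF_SOUND_M_S = 343           # ultrasonic, in air at roughly 20 degrees C

def tof_distance_m(echo_time_s: float, speed_m_s: float) -> float:
    """Distance to the reflecting object from a round-trip echo time."""
    return speed_m_s * echo_time_s / 2

# A lidar echo arriving ~667 ns after the pulse corresponds to ~100 m.
print(tof_distance_m(667e-9, SPEED_OF_LIGHT_M_S))   # ~100.0
# An ultrasonic echo after 12 ms corresponds to ~2 m (parking-sensor range).
print(tof_distance_m(12e-3, SPEED_OF_SOUND_M_S))    # ~2.06
```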

Most vehicles have either a single forward-facing camera or, where there are multiple cameras of different focal lengths, cameras that are typically clustered together. Although advocates of camera-based ADAS perception point out that humans primarily use our own organic cameras, our eyes, for driving, our eyes are separated by a distance. That gap allows us to perceive depth, because each eye has a slightly different point of view.

Many ADAS implementations use a single camera and estimate distance by recognizing objects and measuring how many pixels of the image they occupy. That approach is prone to significant error, however, because it depends on an assumed object size: vehicles of vastly different sizes but similar appearance cannot be told apart.
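A rough sketch of why this is error-prone, using the standard pinhole-camera relation; the focal length and vehicle heights below are hypothetical values chosen only to illustrate the failure mode:

```python
# Monocular range estimation: assume a known real-world object size, then
# infer distance from how many pixels the object occupies in the image.
# Pinhole model: pixel_height = focal_length_px * real_height_m / distance_m.

FOCAL_LENGTH_PX = 1200  # hypothetical camera focal length, in pixels

def mono_distance_m(assumed_height_m: float, pixel_height: float) -> float:
    return FOCAL_LENGTH_PX * assumed_height_m / pixel_height

# Two similar-looking vehicles of different size, both spanning 36 pixels:
print(mono_distance_m(1.5, 36))  # assumed compact car: ~50 m
print(mono_distance_m(3.0, 36))  # actually a tall truck: ~100 m
# Same image evidence, a 2x disagreement in range: the size assumption
# dominates the estimate, which is the error mode described above.
```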

Spreading multiple cameras apart while facing them in the same direction allows range to be calculated using basic trigonometry. Some vehicles, most notably those from Subaru, use stereo camera configurations, with cameras of the same focal length pointed in the same direction, to provide distance measurements in place of active sensors such as radar. However, those configurations have a gap of only about 30 cm between the cameras, which limits accurate measurement to about 150 m. That's adequate for driver assistance and cruise control but insufficient for automated driving at highway speeds.
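The trigonometry reduces to the standard stereo-disparity relation, which also shows why the gap between the cameras (the baseline) bounds the usable range: for a fixed disparity-measurement error, the depth error grows with the square of the distance. A sketch with illustrative parameters; the focal length and disparity noise are assumptions, not Subaru's actual specifications:

```python
# Stereo ranging: two cameras a baseline B apart see the same object at
# slightly different image positions. That difference (the disparity d,
# in pixels) gives depth:  Z = f * B / d.
# Differentiating shows depth error grows quadratically with range:
#   dZ = Z**2 * dd / (f * B)

FOCAL_LENGTH_PX = 1500    # assumed focal length, in pixels
DISPARITY_NOISE_PX = 0.3  # assumed disparity matching accuracy, in pixels

def stereo_depth_m(baseline_m: float, disparity_px: float) -> float:
    return FOCAL_LENGTH_PX * baseline_m / disparity_px

def depth_error_m(baseline_m: float, range_m: float) -> float:
    return range_m**2 * DISPARITY_NOISE_PX / (FOCAL_LENGTH_PX * baseline_m)

# With a 30 cm baseline, an object at 150 m subtends only ~3 px of disparity...
print(stereo_depth_m(0.30, 3.0))   # ~150 m
# ...and the depth error there is already ~10% of range:
print(depth_error_m(0.30, 150))    # ~15 m
# By 300 m it has quadrupled, which is why a short-baseline stereo pair
# is not enough for automated driving at highway speeds.
print(depth_error_m(0.30, 300))    # ~60 m
```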

Biomimicry with Multiple Cameras

The depth-perception technology company Light has developed an approach that uses three cameras with a 1 m spread, which can measure range out to 1,000 m with little error. Because these sensors are used only for ADAS perception rather than for producing images that replicate what the human eye sees, the infrared cut filter found on consumer cameras can be eliminated, enabling the sensors to work in low-light or foggy conditions.
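Under the same illustrative error model sketched above (same assumed focal length and disparity noise, which are not Light's published specifications), widening the spread between cameras shrinks the depth error proportionally, which is the geometric intuition behind the wider 1 m baseline:

```python
# Depth error in stereo ranging is inversely proportional to the baseline,
# so widening the camera spread directly extends the usable range.
# Same illustrative model as above:  dZ = Z**2 * dd / (f * B)

FOCAL_LENGTH_PX = 1500    # assumed focal length, in pixels
DISPARITY_NOISE_PX = 0.3  # assumed disparity matching accuracy, in pixels

for baseline_m in (0.30, 1.00):
    err = 1000**2 * DISPARITY_NOISE_PX / (FOCAL_LENGTH_PX * baseline_m)
    print(f"baseline {baseline_m:.2f} m -> error at 1,000 m: ~{err:.0f} m")
# baseline 0.30 m -> error at 1,000 m: ~667 m
# baseline 1.00 m -> error at 1,000 m: ~200 m
# Tripling the spread cuts the error by the same factor; multi-camera
# systems also use subpixel matching and a third view to reduce it further.
```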

Low-cost sensing solutions, such as Light's multi-camera system, can enable much more capable ADAS and pedestrian detection than is currently available. Since pedestrians are the fastest-growing segment of traffic fatalities, it's critical to improve these systems for as many vehicles as possible.

Learn more about the importance of accurate distance measurement and the capabilities of different types of sensors in a new Guidehouse Insights white paper, Accurate Ranging Perception for Assisted and Automated Driving, sponsored by Light.