Accurate Ranging Perception for Assisted and Automated Driving

Nov 16, 2021 - 2:00 PM EST
Sponsored By Light

For software to reliably assist a human driver or take over the driving task entirely, it needs accurate, reliable information on the location of other road users and objects in the local environment. Current vision-based solutions, whether single-camera or clustered multi-focal-length designs, must rely on machine learning to estimate the range and trajectory of road users, an inherently error-prone approach. Active sensors such as lidar and radar can improve the data, but they are costly and consume additional power.

Wide-baseline multi-camera solutions that measure range through physics can provide accurate range detection at distances that even the more expensive active sensors cannot match. These solutions can also complement other sensors as part of a multi-modal sensing suite for fail-operational capability. This webinar discusses how multi-view cameras with physics-based measurement can improve advanced driver assistance system (ADAS) and automated driving performance, reliability, and safety.
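The physics behind multi-view ranging is stereo triangulation: for a rectified camera pair, depth is focal length times baseline divided by pixel disparity, so a wider baseline directly reduces range error at long distances. The sketch below is illustrative only; the focal length, baselines, and disparity-error values are assumptions for the example, not figures from the webinar or from Light's product.

```python
# Illustrative stereo triangulation: z = f * B / d for a rectified pair.
# f (focal length) in pixels, B (baseline) in meters, d (disparity) in pixels.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth (meters) for a rectified stereo camera pair."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float, depth_m: float,
                disparity_err_px: float = 0.25) -> float:
    """First-order depth uncertainty: dz ~ z^2 / (f * B) * delta_d."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_err_px

f, d = 1000.0, 10.0
narrow = depth_from_disparity(f, 0.12, d)  # 12 cm baseline -> 12 m at 10 px disparity
wide = depth_from_disparity(f, 1.20, d)    # 1.2 m baseline -> 120 m at 10 px disparity

# At 100 m, the same quarter-pixel matching error costs ~20.8 m of depth
# uncertainty with the narrow baseline but only ~2.1 m with the wide one.
print(narrow, wide)
print(depth_error(f, 0.12, 100.0), depth_error(f, 1.20, 100.0))
```

The quadratic growth of error with depth (z squared in the numerator) is why wide baselines matter most at long range, where machine-learning depth estimates from a single camera degrade fastest.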

Topics covered in this webinar include:

  • The sensor landscape, including redundant and complementary sensor applications
  • How cameras have historically been used to gather depth in advanced driver assistance systems (ADAS) and automated vehicles (AVs)
  • How multi-view perception works
  • The benefits of multi-view perception (range, detail, design, cost)

Key questions addressed:

  • What are the redundant and complementary applications of ADAS/AV sensor technology?
  • How do cameras gather depth in ADAS/AV applications?
  • What is multi-view perception and how does it work?
  • What are the benefits of multi-view perception in the short and long term?
Who should attend:

  • Automakers
  • Automotive suppliers and auto component makers
  • AV stakeholders
  • Startup technology firms
  • Safety groups