Mobileye Continues to Champion Camera-First Approach for Assisted and Autonomous Driving

What’s Happening: Mobileye, a pioneer in camera-based advanced driver-assistance systems (ADAS), is reaffirming its commitment to a camera-first approach for both assisted and autonomous driving technologies. The company’s core competence in computer vision and its use of low-cost cameras have been central to its success.

Why It Matters: Mobileye’s camera-first approach has revolutionized automotive safety, making ADAS a mainstream feature in millions of vehicles. As camera technology continues to improve, it will play an increasingly vital role in the future of transportation and vehicle safety.

Key Points:

  • Mobileye was founded by computer science professor Amnon Shashua, who demonstrated the effectiveness of a single camera system for measuring the distance to the vehicle ahead.
  • Unlike radar or lidar, cameras most closely resemble the human eye, providing rich semantic understanding of driving environments, including lane markings, traffic light colors, and traffic sign text.
  • Low-cost cameras enable the integration of ADAS technology into more vehicles without significantly affecting purchase prices. Over 135 million vehicles have been equipped with Mobileye technologies to date.
  • Mobileye’s advanced solutions, such as SuperVision™ and Chauffeur, incorporate multiple cameras for 360-degree surround coverage. Mobileye Drive, a driverless solution for autonomous commercial vehicles, also employs a camera-first approach with a secondary radar/lidar suite for redundancy.
  • Mobileye’s REM™ mapping technology crowdsources map data from the growing fleet of camera-equipped vehicles, supporting a range of applications for both self-driving and human-driven vehicles.
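The single-camera ranging result mentioned in the first point can be sketched with standard flat-road pinhole geometry: given the camera's focal length and its height above the road, the distance to the lead vehicle follows from how far below the horizon its tires touch the road in the image. This is a simplified illustration of the general principle, not Mobileye's actual algorithm; the function name and all values are hypothetical.

```python
def estimate_distance(focal_length_px: float, camera_height_m: float,
                      contact_row_px: float, horizon_row_px: float) -> float:
    """Estimate distance to the vehicle ahead using a flat-ground
    pinhole camera model: Z = f * H / (y_contact - y_horizon),
    where y_contact is the image row at which the lead vehicle's
    tires meet the road and y_horizon is the horizon row.
    """
    dy = contact_row_px - horizon_row_px
    if dy <= 0:
        # The road contact point must appear below the horizon.
        raise ValueError("contact point must lie below the horizon line")
    return focal_length_px * camera_height_m / dy

# Illustrative numbers: a 1000 px focal length, camera mounted 1.5 m
# above the road, contact point 60 px below the horizon -> 25 m.
print(estimate_distance(1000.0, 1.5, 460.0, 400.0))  # 25.0
```

In practice the horizon row drifts with vehicle pitch and road slope, which is one reason production systems fuse this geometric cue with learned detection and tracking over time.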

Bottom Line: Mobileye’s camera-first approach has been a driving force behind the company’s success in revolutionizing automotive safety through ADAS technology. As cameras continue to offer low-cost, scalable solutions with rich semantic understanding of driving environments, they will remain central to the future of assisted and autonomous driving.