Lessons from Uber: A Deep Dive into Sensor Technology

Amnon Shashua, the Chief Executive Officer and Chief Technology Officer of Mobileye, reflects on sensor technology and advanced driver-assistance systems (ADAS) in the wake of the fatal Uber self-driving accident.

Following the tragic death of Elaine Herzberg, who was struck and killed by a self-driving Uber test vehicle in Tempe, Arizona, it is crucial to discuss what safety means for sensing and decision-making technology.

One primary challenge is interpreting sensor information. The video of the accident released by the police highlights that even the most basic task of an autonomous vehicle, accurately detecting and classifying objects, is difficult. Yet this capability is fundamental to today's ADAS, which includes features like automatic emergency braking (AEB) and lane-keeping support.
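
To make the link between detection and a driving decision concrete, here is a minimal, hypothetical sketch of how a confirmed detection could feed an AEB trigger via time-to-collision. All names and thresholds are illustrative assumptions, not any production system's logic:

```python
# Minimal sketch: a detected object feeds an AEB decision via time-to-collision.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float          # longitudinal distance to the object (m)
    closing_speed_mps: float   # relative speed toward the object (m/s, > 0 when closing)

def time_to_collision(det: Detection) -> float:
    """Seconds until impact at the current closing speed."""
    if det.closing_speed_mps <= 0.0:
        return float("inf")    # not closing, so no collision is predicted
    return det.distance_m / det.closing_speed_mps

def should_brake(det: Detection, ttc_threshold_s: float = 1.5) -> bool:
    """Trigger emergency braking when time-to-collision drops below a threshold."""
    return time_to_collision(det) < ttc_threshold_s

# A pedestrian 15 m ahead while closing at 17 m/s (about 61 km/h) gives a
# TTC of roughly 0.9 s, which would trigger braking in this sketch.
print(should_brake(Detection(distance_m=15.0, closing_speed_mps=17.0)))  # True
```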

The high-accuracy sensing systems used in ADAS are saving lives today, validated by billions of kilometers of real-world driving. The same technology is also a foundational element of the fully autonomous vehicles of the future.

To demonstrate the capability of current ADAS technology, we analyzed the police video of the incident using our software. Despite the less-than-ideal conditions, our system detected Herzberg approximately one second before impact. The images show three snapshots with bounding-box detections of Herzberg and her bicycle. The detections were produced by a combination of pattern recognition and a free-space detection module, indicating the presence of a road user.
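
As an illustration of how such corroboration might work, here is a hedged sketch that accepts a bounding box only when a free-space map also marks the object's footprint as occupied. The function names, mask format, and thresholds are assumptions for illustration, not Mobileye's implementation:

```python
# Hedged sketch: keep a bounding box only if the free-space map marks the
# strip where the object meets the road as occupied. Mask format, strip
# height, and threshold are assumptions for illustration.
import numpy as np

Box = tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixel coordinates

def corroborate(boxes: list[Box], free_space: np.ndarray,
                min_occupied: float = 0.5) -> list[Box]:
    """free_space is a boolean image mask, True where the road is drivable."""
    confirmed = []
    for (x1, y1, x2, y2) in boxes:
        # Examine the strip at the bottom of the box, where object meets road.
        strip = free_space[max(y2 - 10, y1):y2, x1:x2]
        occupied = 1.0 - strip.mean() if strip.size else 0.0
        if occupied >= min_occupied:
            confirmed.append((x1, y1, x2, y2))
    return confirmed
```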

A separate module, using structure from motion, confirmed the 3D presence of the detected object. The reported confidence, "fcvValid: Low," was low because of the video's poor quality and the absence of information that would normally be available in a production vehicle.
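
A rough sketch of this kind of gating, assuming hypothetical thresholds on structure-from-motion feature tracks, might look like the following; the string values simply mirror the "fcvValid" flag mentioned above:

```python
# Hedged sketch: accept a detection as a real 3D object only when enough
# structure-from-motion feature tracks inside its box behave consistently
# with a physical obstacle. Thresholds are illustrative assumptions.
from enum import Enum

class Confidence(Enum):
    LOW = "fcvValid: Low"
    HIGH = "fcvValid: High"

def validate_3d(tracked_points: int, inlier_ratio: float) -> Confidence:
    """tracked_points: feature points tracked across frames inside the box.
    inlier_ratio: fraction of those points consistent with a rigid 3D object."""
    if tracked_points >= 30 and inlier_ratio >= 0.8:
        return Confidence.HIGH
    return Confidence.LOW  # e.g. degraded video, missing production-grade data

print(validate_3d(tracked_points=12, inlier_ratio=0.6).value)  # fcvValid: Low
```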

The software used for this analysis is the same software deployed in today's ADAS-equipped vehicles, which have collectively and reliably logged billions of kilometers in real-world conditions.

Recent advances in artificial intelligence, particularly deep neural networks, have led many to believe that developing a highly accurate object-detection system is straightforward, and this has attracted new entrants to the field. However, the extensive experience embodied in established ADAS programs, from identifying and resolving numerous edge cases to annotating vast data sets and rigorously testing preproduction systems, cannot be overlooked. Experience counts, particularly in safety-critical areas.

Transparency is the next key point. Everyone claims that safety is their top priority, but to earn public trust we need to be clear about what that means. Decision-making systems should follow the common-sense judgment of human drivers. To that end, we have developed a formal, mathematical model (Responsibility-Sensitive Safety, or RSS) of notions like "dangerous situation" and "proper response," and built our system to adhere strictly to those definitions.
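
As a concrete example of such a definition, the published RSS model specifies the minimum safe longitudinal gap a following car must keep so that it can always stop in time, even if the lead car brakes at full force. The sketch below implements that published formula; the parameter values are illustrative assumptions:

```python
# Sketch of the RSS minimum safe longitudinal distance: the gap the rear car
# must keep so it can always stop, even if the front car brakes at full force.
# The formula follows the published RSS model; parameter values are
# illustrative assumptions.
def rss_min_safe_distance(v_rear: float, v_front: float,
                          rho: float = 1.0,          # response time (s)
                          a_max_accel: float = 3.0,  # rear car's max accel (m/s^2)
                          a_min_brake: float = 4.0,  # rear car's guaranteed braking
                          a_max_brake: float = 8.0   # front car's max braking
                          ) -> float:
    v_after_response = v_rear + rho * a_max_accel  # rear speed after response time
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_after_response ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)  # a non-positive result means any gap is already safe

# Two cars at 20 m/s (72 km/h) need a gap of about 63 m under these parameters.
print(round(rss_min_safe_distance(20.0, 20.0), 1))  # 62.6
```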

Lastly, true redundancy in perception systems requires independent sources of information from cameras, radar, and lidar. Combining these different technologies enhances reliability in detecting and responding to various driving conditions.
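
One simple way to picture such redundancy, purely as an illustrative sketch and not any production architecture, is a voting scheme in which an object is confirmed only when at least two independent modalities report it at roughly the same position:

```python
# Illustrative voting sketch for sensor redundancy: camera, radar, and lidar
# each run an independent pipeline, and an object is confirmed when at least
# two modalities report it within a gating distance. Structure, coordinates,
# and thresholds are assumptions for illustration.
from itertools import combinations
from math import dist

Point = tuple[float, float]  # (x, y) position in metres, vehicle frame

def fuse(camera: list[Point], radar: list[Point], lidar: list[Point],
         gate_m: float = 2.0) -> list[Point]:
    channels = [camera, radar, lidar]
    confirmed: list[Point] = []
    for i, j in combinations(range(3), 2):   # every pair of modalities
        for pa in channels[i]:
            for pb in channels[j]:
                if dist(pa, pb) <= gate_m:
                    mid = ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)
                    # Avoid re-adding an object already confirmed by another pair.
                    if all(dist(mid, c) > gate_m for c in confirmed):
                        confirmed.append(mid)
    return confirmed

# Example: camera and lidar agree on a pedestrian at ~(10, 1); radar misses it.
print(fuse(camera=[(10.0, 1.0)], radar=[], lidar=[(10.4, 1.2)]))
```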
