A Tesla Model S operating in “Full Self-Driving” (FSD) mode struck and killed a 28-year-old motorcyclist in the Seattle area. The 56-year-old driver, who was arrested on suspicion of vehicular homicide, admitted to looking at his cell phone while using the driver-assistance feature. Tesla maintains that its “Full Self-Driving (Supervised)” software requires active driver supervision and does not make vehicles fully autonomous. The crash is at least the second fatal accident involving FSD, a technology CEO Elon Musk has heavily promoted.
The National Highway Traffic Safety Administration (NHTSA) had previously reported one fatal accident involving a Tesla running FSD software between August 2022 and August 2023. The agency is now investigating the Seattle crash, gathering information from local law enforcement and from Tesla. Despite Tesla’s claims about the system’s capabilities, experts point to the limitations of its camera-based approach, which can struggle to detect objects reliably across varied driving conditions. Tesla’s reliance on cameras and artificial intelligence contrasts with competitors such as Waymo, which also use costly sensors like lidar to perceive the environment.
Guidehouse Insights analyst Sam Abuelsamid emphasizes the potential inaccuracies of Tesla’s camera-only system, citing the difficulty of measuring object distances precisely without other sensors. Raj Rajkumar, a professor at Carnegie Mellon University, notes how hard it is to collect and curate representative data on real-world road users such as motorcycles and bicycles across diverse weather and road conditions. Musk, meanwhile, has redirected Tesla’s focus toward self-driving vehicles, postponing the release of more affordable models and expressing confidence that full self-driving capability will arrive by next year. He envisions future Tesla vehicles as “tiny mobile lounges” where occupants can watch movies, play games, work, and even drink and sleep.
Musk’s pursuit of self-driving capability has drawn regulatory and legal scrutiny. NHTSA opened a probe into Tesla’s Autopilot system in August 2021 after a series of crashes in which Teslas struck stationary emergency vehicles, and in December 2023 Tesla recalled more than two million vehicles, nearly all of its cars on U.S. roads, to add safeguards to the software. The Seattle crash underscores the risks of deploying driver-assistance systems that still depend on human attention, and it sharpens the ongoing debate over the safety and effectiveness of autonomous vehicles. As Tesla continues to push the boundaries of self-driving technology, regulatory oversight and public safety remain central concerns.