Tesla’s ‘Full Self-Driving’ Software Raises Safety Concerns in Recent Test
A recent test ride of Tesla’s Full Self-Driving (FSD) software has renewed concerns about the company’s reliance on camera-based technology for autonomous driving. The experience, documented by Rolling Stone’s Miles Klee, highlighted several shortcomings of the $15,000 optional add-on that Tesla CEO Elon Musk has long touted as a way to eliminate human driver error.
Klee’s test drive took place in a Model 3 owned by Dan O’Dowd, a vocal Tesla critic and founder of the Dawn Project, an organization advocating for the ban of unsafe software from critical infrastructure. During the ride, Arthur Maltin, a chauffeur working for a PR firm representing the Dawn Project, pointed out numerous inaccuracies in Tesla’s FSD system.
Unlike most autonomous driving systems, which pair cameras with LiDAR and radar, Tesla’s FSD relies solely on cameras. The system feeds video from those cameras into a neural network that interprets the vehicle’s surroundings. However, Maltin demonstrated that the system often misrepresents the environment, showing one person where there are two, or rendering parked cars as if they were in motion.
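To make that failure mode concrete, here is a minimal, purely illustrative Python sketch. This is not Tesla’s code; the function names, bounding boxes, and merge threshold are invented for the example. It shows how a vision-only pipeline that merges overlapping camera detections can collapse two nearby pedestrians into a single object on the display.

```python
# Illustrative sketch only -- NOT Tesla's pipeline. It shows how greedily
# merging overlapping detections can turn two close pedestrians into one.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def merge_detections(boxes, threshold=0.3):
    """Greedily merge any boxes whose overlap exceeds the threshold."""
    merged = []
    for box in boxes:
        for i, kept in enumerate(merged):
            if iou(box, kept) > threshold:
                # Collapse both detections into one covering box.
                merged[i] = (min(kept[0], box[0]), min(kept[1], box[1]),
                             max(kept[2], box[2]), max(kept[3], box[3]))
                break
        else:
            merged.append(box)
    return merged

# Two pedestrians standing close together appear as overlapping boxes:
detections = [(100, 50, 160, 200), (125, 52, 185, 202)]
print(len(merge_detections(detections)))  # prints 1 -- two people shown as one
```

The point is not that Tesla uses this exact logic, but that any perception stack which resolves ambiguity from 2D camera views alone has to make merge-or-split judgments like this one, and a wrong call shows up on the driver’s screen as a missing pedestrian or a phantom moving car.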
Lighting and weather conditions also have a significant effect on the software’s performance. Tesla warns that poor visibility, including direct sunlight, can impair the system’s functionality. Maltin noted that sunrise and sunset are particularly problematic, often triggering urgent warnings for the driver to take over.
Despite its name, Tesla’s FSD requires drivers to remain alert and ready to take control at any moment. This contradiction has drawn scrutiny from regulators, including the National Highway Traffic Safety Administration (NHTSA), which is investigating numerous injuries and deaths linked to the software. The agency has found that drivers using FSD often become complacent and less engaged in driving.
During Klee’s test ride, Maltin had to intervene multiple times to prevent accidents. The vehicle nearly collided with a recycling bin and a plastic bollard, and it ran a stop sign on a highway on-ramp. In a previous demonstration by the Dawn Project, the FSD-equipped vehicle reportedly ignored a school bus stop sign and red flashing lights, striking a child-sized mannequin.
These incidents raise questions about the safety of Tesla’s camera-only approach on public roads. Evidence suggests that drivers may be lulled into a false sense of security, potentially compromising road safety. As Tesla moves to roll out its controversial FSD technology in China, the company faces mounting pressure to prove the safety and reliability of its self-driving vision.
As regulatory scrutiny intensifies and real-world tests continue to reveal flaws, the future of Tesla’s Full Self-Driving software remains uncertain. The company must address these concerns to maintain public trust and ensure the safety of all road users.