Tesla recalls over 2 million cars in US and Canada over Autopilot safety concerns
Tesla's recall filing said that Autopilot's software system controls "may not be sufficient to prevent driver misuse" and could increase the risk of a crash.
Tesla is recalling over 2 million vehicles in the US to install new safeguards in its Autopilot advanced driver-assistance system, after a federal safety regulator cited safety concerns. When Tesla introduced the Autopilot software in 2015, CEO Elon Musk heralded it as a profound experience for drivers. Other automakers such as Mercedes, Audi and Volvo were already offering what amounted to fancy cruise control: keeping cars in their lanes and at a set distance from the traffic in front of them.
But Musk had an innovation: Autopilot, he said, could change lanes on its own. “It will change people’s perception of the future quite drastically,” Musk said while cautioning that drivers still have to pay attention.
Eight years later, US auto safety regulators pressured Tesla into recalling nearly all the vehicles it has sold in the country because its driver monitoring system is too lax. The fix, with more alerts and limits on where the system can operate, will be done with a software update.
Here’s how Autopilot has evolved over the past eight years and why it’s being recalled:
How Tesla's Autopilot works
Basic Autopilot can steer, accelerate and brake automatically in its lane using two features called "Autosteer" and "Traffic Aware Cruise Control". Another level, called "Navigate on Autopilot", suggests lane changes and makes adjustments to stop drivers from getting stuck behind slow traffic. Autosteer is intended to be used on limited-access highways, but there is another feature called Autosteer on City Streets. Tesla owners are also testing what the company calls "Full Self-Driving" software. Despite their names, the company says these systems are there to assist drivers: none can drive itself, and human drivers must be ready to intervene at all times.
Why Tesla's system failed to keep drivers engaged
Studies show that once humans start using automated technology, they tend to trust it too much and zone out. Crashes started to happen. The first fatality came in June 2016, when a Tesla Model S in Williston, Florida, drove beneath a tractor-trailer that was crossing in front of it, killing the driver.
The National Highway Traffic Safety Administration (NHTSA) investigated and concluded that neither the driver nor Autopilot had spotted the truck. It closed the probe without seeking a recall, but criticized the way Tesla marketed Autopilot. Tesla's monitoring system checked for hands on the steering wheel, but some drivers found it easy to fool, and more Teslas kept crashing into emergency vehicles parked on highways.
In 2021, NHTSA opened a new investigation focusing on 322 crashes involving Tesla’s Autopilot. The agency sent investigators to at least 35 Tesla crashes in which 17 people were killed.
Why Tesla is recalling its so-called 'futuristic' cars
The agency announced that Tesla had agreed to recall more than 2 million vehicles dating back to 2012. It said Tesla's driver monitoring system is defective and "can lead to foreseeable misuse of the system."
Tesla disagreed with that conclusion but agreed to issue a software update to strengthen driver monitoring.
The added controls include more prominent visual alerts, a simpler way to turn Autosteer on and off, and additional checks on whether Autosteer is being used outside controlled-access roads or when approaching traffic-control devices.
In some cases, the update could limit where the system can operate. Critics say detecting hands on the steering wheel isn't enough, and that all Teslas should have cameras that monitor a driver's eyes.
(With inputs from agencies)