On May 7th, 2016, 40-year-old Joshua Brown was killed when his Tesla Model S sedan collided with a tractor-trailer crossing his path on US Highway 27A near Williston, Florida. Nearly three years later, another Tesla owner, 50-year-old Jeremy Beren Banner, was also killed on a Florida highway under eerily similar circumstances: his Model 3 collided with a tractor-trailer crossing its path, shearing off the car's roof in the process.
There was another grim similarity: investigators found that both drivers were using Tesla's advanced driver assistance system, Autopilot, at the time of their crashes.
Autopilot is what the Society of Automotive Engineers describes as a Level 2 semi-autonomous system. It combines adaptive cruise control, lane-keeping assistance, self-parking, and, most recently, the ability to change lanes automatically. Tesla bills it as one of the safest systems on the road today, but Brown's and Banner's deaths raise questions about those claims and suggest that Tesla has failed to address a major shortcoming of its flagship technology.
There are some major differences between the two crashes. For one, Brown's and Banner's cars were equipped with completely different driver assistance technologies, even though both are called "Autopilot." Brown's version of Autopilot was based on technology from Mobileye, an Israeli startup later acquired by Intel. Brown's death was partly responsible for the two companies parting ways in 2016.
This suggests that Tesla, after overhauling Autopilot, had an opportunity to address this so-called "edge case," or unusual circumstance, but has yet to do so. After Brown's death, Tesla said its camera had failed to recognize the white truck against a bright sky. The US National Highway Traffic Safety Administration (NHTSA) found that Brown had not been paying attention to the road and exonerated Tesla. It determined that Brown had set his car's cruise control at 74 mph about two minutes before the crash, and that he should have had at least seven seconds to notice the truck before striking it.
Federal investigators have yet to make a determination in Banner's death. In a preliminary report released on May 15th, the National Transportation Safety Board (NTSB) said that Banner had engaged Autopilot about 10 seconds before the collision. "From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver's hands on the steering wheel," the NTSB said. The car was traveling at 68 mph when it crashed.
In a statement, a Tesla spokesperson framed it differently, turning the passive "the vehicle did not detect the driver's hands on the steering wheel" into the more active "the driver immediately removed his hands from the wheel." The spokesperson declined to answer further questions about what the company has done to address the problem.
Tesla CEO Elon Musk has previously blamed Autopilot-related crashes on driver overconfidence. "When there is a serious accident, it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency," Musk said last year.
The latest crash comes as Musk touts Tesla's plan to deploy a fleet of autonomous taxis in 2020. "A year from now, we'll have over a million cars with full self-driving, software, everything," he said at a recent Autonomy Day event for investors.
If federal regulators conclude that Autopilot is defective, those plans could be derailed. Consumer advocates are calling on the government to open an investigation into the advanced driver assistance system. "Either Autopilot can't see the broad side of an 18-wheeler, or it can't react safely to it," said David Friedman, vice president of advocacy at Consumer Reports. "This system can't dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed."