Tesla “Full Self-Driving” safety complaint investigated by NHTSA

DETROIT – U.S. auto safety regulators are looking into a complaint from a Tesla driver that the company’s “Full Self-Driving” software caused a crash.

The driver was beta testing the “Full Self-Driving” software when the Tesla SUV veered into the wrong lane and was hit by another vehicle, according to a complaint the driver filed with the National Highway Traffic Safety Administration.

“The car entered the wrong lane and I was hit by another driver in the lane next to mine,” wrote the driver.

The vehicle, a 2021 Tesla Model Y small SUV, alerted the driver halfway through the turn, and the driver tried to turn the steering wheel to avoid other traffic, according to the complaint. But the car took control and “forced itself into the wrong lane, creating a dangerous maneuver that put everyone involved at risk,” the driver wrote.

No one was injured in the crash, but the Model Y was severely damaged on the driver’s side, according to the complaint, which was filed online with the agency on Monday and posted in its public complaint database.

The crash occurred on November 3. The driver is located in Brea, California, but the location of the crash was not identified. NHTSA does not release the names of those who file complaints.

It is likely the first complaint filed with the agency alleging that “Full Self-Driving” software caused a crash. A message was left Friday seeking comment from Tesla, which has disbanded its media relations department.

An NHTSA spokesperson said Friday night that the agency is aware of the complaint and is communicating with Tesla to gather more information. The spokesperson said people should report safety concerns to the agency.

The investigation is another sign that NHTSA is becoming more aggressive in scrutinizing autonomous and partially automated driving systems under President Joe Biden. In the past, the agency had been reluctant to regulate the systems, saying it did not want to delay potentially life-saving technology.

Tesla says that, despite their names, “Autopilot” and “Full Self-Driving” are driver-assistance systems and cannot drive the cars themselves. The automaker says drivers must be ready to intervene at any time.

A select group of Tesla drivers has been beta testing the software on public roads, a practice that critics say endangers others because the software has flaws and the drivers are untrained. Other companies that test on public roads have human safety drivers on board who are ready to intervene.

Beta testing is a field test of software performed by users before the full commercial version is ready.

Critics have called on NHTSA to take action after multiple videos were posted online showing Tesla’s software making mistakes and drivers having to take over.

“Hopefully this gives @NHTSAgov the ammunition it needs to act on FSD now rather than waiting while Tesla takes its time releasing partial data,” wrote Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University.

In June, NHTSA ordered automakers to report any crashes involving fully autonomous vehicles or partially automated driver-assist systems. It is unclear whether Tesla reported the crash involving the California driver. Two months later, the agency opened a formal investigation into Tesla’s Autopilot partially automated driver-assist system after a series of collisions with parked emergency vehicles.

NHTSA has also questioned Tesla about its beta testing, including a requirement that testers not disclose information. The agency said nondisclosure agreements could hinder its ability to investigate.
