Self-driving truck startup TuSimple is reportedly under investigation for trying to blame “human error” for self-driving truck accidents. Self-driving car researchers at Carnegie Mellon University say blaming humans is misleading, and common safeguards could prevent accidents.
In April, a self-driving semi truck equipped with TuSimple technology veered left and hit a concrete barrier while driving on a highway in Tucson, Arizona, according to dash cam footage leaked to YouTube.
TuSimple blamed “human error” for the accident, but an internal report reviewed by the Wall Street Journal shows that pinning the crash on humans is an oversimplification.
According to the internal report, the accident occurred because “a person in the cab did not properly restart the autonomous driving system before activating it, causing it to execute an outdated command,” the Wall Street Journal reported.
Essentially, the left-turn command was 2.5 minutes old and should have been erased from the system, but wasn’t.
But self-driving car researchers at Carnegie Mellon University say it would be misleading to place the blame on humans, when common safeguards could have prevented the incident.
Researchers told the Wall Street Journal that trucks shouldn’t respond to commands that are even a few hundredths of a second old, and that the system should never allow a self-driving truck to make such a sharp turn while traveling at 65 miles per hour.
“This information suggests that their testing on public roads is very unsafe,” said Phil Koopman, an associate professor at Carnegie Mellon University.
On Tuesday, TuSimple said in a blog post that it takes “the responsibility to identify and address all safety issues very seriously,” adding that after the April accident it “immediately ground[ed] our entire autonomous fleet and launched an independent review to determine the cause of the incident.”
The company added: “Learning from this review, we upgraded all systems with new automated system checks to prevent this kind of human error from happening again, and we reported the incident to NHTSA and the Arizona Department of Transportation.”
Still, the National Highway Traffic Safety Administration (NHTSA) is investigating the San Diego-based company, along with the Federal Motor Carrier Safety Administration (FMCSA).
In a letter, the FMCSA said it had launched a “safety compliance investigation” into TuSimple, referring to the April incident.
TuSimple isn’t the only self-driving car company under investigation by the NHTSA.
Federal agencies have also launched an investigation into a fatal crash involving Tesla’s Autopilot driver-assistance system; three people died in the latest Tesla crash under federal investigation.
In June, the federal investigation into Tesla’s Autopilot feature escalated, and NHTSA is now examining whether the feature has potential design flaws. The agency, which is reviewing data from roughly 200 Tesla crashes, said that “on average, in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”