Tesla Model X owner tries to replicate Autopilot error at crash site

With the NTSB still investigating the fatal crash involving a Tesla Model X near the Highway 101 and Highway 85 interchange in Mountain View, many questions remain unanswered. Tesla did confirm that Autopilot was engaged when the Model X struck the highway barrier. San Jose State University mechanical engineering professor Fred Barez wanted to see whether Autopilot would replicate the error in his own Model X. The results were strikingly similar to last week's fatal accident: Autopilot did indeed get confused at the same spot, improperly steering the SUV into the median and toward the highway divider barrier.

“The car definitely started swerving left without giving me any early warnings, right into that divider,” Barez said.

Meanwhile, Tesla claims the victim of the fatal accident ignored warnings to take back control of the vehicle prior to the crash. The driver's hands were not detected on the steering wheel for six seconds before the collision. Tesla also stated that "the driver had five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

Tesla has updated its Autopilot system to discourage drivers from completely ignoring the hands-on-wheel alerts. This raises questions about autonomous cars, which several tech firms are trying to shove down consumers' throats. The Tesla Model X's Autopilot system is semi-autonomous and functions more as a driver's assist. If this kind of error happens with a fully autonomous vehicle that has no steering wheel, then we are in trouble. Autonomous cars are far from perfect and will never replace human judgment. We cannot rely on technology all the time; we must also use our own judgment and pay attention to the road.

Video credit: Shantanu Joshi
