Tesla Autopilot safety shows improvement as Full Self Driving fail videos rise

Tesla has released its latest vehicle safety report, and it shows a slight improvement in the numbers recorded for its Autopilot driver assistance technology. Here’s the latest update:

In the third quarter, we recorded one accident for every 4.59 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we recorded one accident for every 2.42 million miles driven. For those driving without Autopilot and without our active safety features, we recorded one accident for every 1.79 million miles driven. By comparison, the most recent data from NHTSA shows that there is a car accident every 479,000 miles in the United States.

In comparison, there was one accident for every 4.53 million miles driven with Autopilot in the previous quarter. That is a slight gain, and on the surface an impressive performance suggesting the cars crash less often when Autopilot is engaged. But a lot of useful context is missing, such as which stretches of road drivers are most likely to engage the driver assist on, or how often drivers switch the system off ahead of areas where they expect to need to take over.
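For a rough sense of scale, here is a quick back-of-the-envelope calculation using only the figures quoted above (a minimal sketch; it ignores the selection effects just mentioned):

```python
# Rough comparison of the crash rates quoted above.
# Values are miles driven per reported accident; selection effects
# (where and when drivers choose to engage Autopilot) are ignored.

miles_per_accident = {
    "Autopilot engaged (Q3)": 4_590_000,
    "Active safety features only (Q3)": 2_420_000,
    "No Autopilot, no active safety (Q3)": 1_790_000,
    "US average (NHTSA)": 479_000,
}

baseline = miles_per_accident["US average (NHTSA)"]
for label, miles in miles_per_accident.items():
    print(f"{label}: {miles / baseline:.1f}x the US-average distance between accidents")

# Quarter-over-quarter change for Autopilot: 4.53M -> 4.59M miles per accident,
# roughly a 1.3% improvement.
print(f"Q2 -> Q3 Autopilot change: {(4.59 / 4.53 - 1) * 100:.1f}%")
```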

It’s also important to note that this data covers Autopilot miles from the previous quarter, so it is not a reflection of Tesla’s recently launched Full Self-Driving beta. However, there is an interesting parallel that we think is likely to emerge over the coming months and years. Tesla’s software is designed to ‘learn’ as it goes, meaning it should improve as more vehicles hit the road and more human drivers interact with it. Based on Tesla’s numbers, that appears to have been borne out with Autopilot, and we hope it will remain true as more features are added to the system.

Meanwhile, a Stef Schrader story in The Drive chronicles a series of driving failures caused by Tesla’s Full Self-Driving technology, which has been released in beta form to a limited number of owners. If you don’t know the term, “beta” means it is not a finished, fully polished version of the technology. As we previously reported, Tesla owners are required to accept terms and conditions that include a disclaimer warning the system can “do the wrong thing at the worst time”. But – and this is a huge “but” – even if a Tesla owner is willing to agree to participate in the beta, the other drivers that owner shares the road with were never given the same choice. And that’s a big problem.

🚨 Omar Qazi nearly crashes his #Tesla while using the beta software “Full Self Driving”. None of the other cars agreed to his experiment. $TSLA $TSLAQ pic.twitter.com/uU2RT9l5ZI

– Greta Musk (@GretaMusk) October 25, 2020

🚨🚨🚨
Take 41 seconds and watch this video.
DISGUSTING!!

H/T @StultusVox
cc @Tweetermeyer @PAVECampaign @AlexRoy144 $TSLAQ pic.twitter.com/4neqQxJPwr

– TC (@TESLAcharts) October 25, 2020

Not surprisingly, the National Highway Traffic Safety Administration says it is monitoring Tesla’s rollout “and will not hesitate to take action to protect the public from unreasonable safety risks.”
