Tesla ‘Full Self-Driving’ Beta video shows it has a long way to go

Tesla’s “Full Self-Driving” Beta (which we put in quotes because, despite its name, it is not fully self-driving and still requires the driver’s attention) has been testing on public streets since at least October. The company has been expanding the prototype software’s rollout to small groups of Tesla owners, as CEO Elon Musk stated on Twitter, with further expansion planned for April. We’re not so sure that rapid expansion of the program is a good idea after seeing how poorly the feature performs in the video above, which was also covered by Road & Track and Jalopnik.

During the sunny drive that YouTube user AI Addict takes, the car shows many signs that the advanced driver-assistance system has a long, long way to go before it is ready for prime time. Some of the more minor issues include hesitation and confusion when choosing a lane, stopping well short of the line, and getting stuck behind parked cars. Much more worrying are a few near-collisions: one while crossing an intersection where cross traffic had no stop sign, and another moment when the car appeared to try to drive through a gate. In many of these situations, the driver had to take over manually to keep going or to avoid a crash. In another video taken a few days earlier, AI Addict encountered many of the same issues on an evening drive.

Of course, no one expects a beta version of anything to be perfect, and getting people to test it is one way to diagnose and fix the issues. As shown in the video, users of the “Full Self-Driving” Beta report when something goes wrong, so Tesla has that information and can work to correct it. But a beta version of self-driving software differs from most betas in that it can cause serious damage if something goes wrong and the driver doesn’t catch it in time. It seems to us that this software needs more time in the hands of Tesla employees rather than being unleashed on the public, even in the currently limited numbers. If something goes wrong with what are essentially amateur testers behind the wheel, the consequences could be severe: someone being killed or seriously injured could set back the cause of automated driving altogether.
