Rand study shows it’s hard to gauge self-driving car safety


Measuring the safety of self-driving vehicles has become increasingly difficult.

Manufacturers often tout automated vehicles as technological marvels that will drastically reduce the human toll of accidents. But that is a broad promise, and a new Rand Corp. study says a safety framework is needed as automakers prepare for commercial deployment.

"What safety means with regard to AVs is surprisingly unclear – there is no standard definition," says the study from the nonprofit think tank, published last week. The study, "Measuring Automated Vehicle Safety," was sponsored by Uber.

Rand's findings suggest that a common basis is needed to build trust among government officials and the public as the era of automated driving draws closer. The study is also critical of the closed nature of vehicle development.

Even as companies such as Waymo drive millions of miles autonomously, Rand researchers say the number of vehicle miles traveled during industry testing is insufficient to provide meaningful data on the safety of self-driving cars.

Instead of waiting to deploy self-driving vehicles until statistics can prove they are safer than human-driven vehicles, forgoing their benefits in the meantime, Rand says industry leaders, academics and policymakers need to develop leading indicators: measures that correlate with positive safety outcomes, rather than waiting for crashes to occur and counting them.


"We know that there can be benefits with self-driving cars, and we do not want the perfect to become the enemy of the good," says Marjory Blumenthal, the study's project leader. "The challenge is to measure."

New barometers

One of the few existing benchmarks is the annual disengagement reports that manufacturers must submit to the California Department of Motor Vehicles. Manufacturers must list and describe circumstances in which their self-driving systems disengage or encounter unsafe situations that require a human driver to intervene.

But the disengagement reports are at best snapshots rather than a comprehensive tool. They provide no context about the nature of the tests: whether they occur in complex urban environments or on rural highways, or in what weather conditions. Operators themselves say that what constitutes a disengagement, and therefore triggers a reporting requirement, is open to interpretation.

Rand introduced the idea of "roadmanship" as a potential measure of the competence of self-driving systems. Instead of counting crashes or disengagements, a system could measure events related to safety outcomes, such as whether a vehicle's actions would have violated a traffic law.

Beyond legality, roadmanship can benchmark how a self-driving system interacts with the traffic around it. It can measure safety envelopes and lateral distances, and distinguish between vehicles that cause unsafe conditions and those that merely respond to them. Rand notes that the Responsibility-Sensitive Safety model proposed by computer vision supplier Mobileye, which takes human notions of safe driving and encodes them as mathematical rules for vehicles to follow, is an example of roadmanship.
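To give a sense of what "mathematical rules for safe driving" look like in practice, the published Responsibility-Sensitive Safety papers define a minimum safe following distance roughly along the lines below. This is an illustrative sketch, not the Rand study's methodology, and the parameter values (response time, acceleration and braking bounds) are assumptions chosen for illustration:

```python
def rss_safe_distance(v_rear, v_front,
                      response_time=0.5,   # assumed reaction time of rear car (s)
                      a_accel_max=3.0,     # assumed max acceleration during reaction (m/s^2)
                      a_brake_min=4.0,     # assumed minimum braking the rear car applies (m/s^2)
                      a_brake_max=8.0):    # assumed maximum braking of the front car (m/s^2)
    """Minimum following distance (metres) such that the rear vehicle can
    always brake to a stop without striking the front vehicle, even if the
    front vehicle brakes as hard as possible (sketch of the RSS rule)."""
    # Worst case: rear car accelerates during its response time...
    v_after_response = v_rear + response_time * a_accel_max
    # ...then brakes at its minimum rate, while the front car brakes at its maximum.
    d = (v_rear * response_time
         + 0.5 * a_accel_max * response_time ** 2
         + v_after_response ** 2 / (2 * a_brake_min)
         - v_front ** 2 / (2 * a_brake_max))
    return max(d, 0.0)

# Example: both vehicles travelling at 20 m/s (~72 km/h)
print(round(rss_safe_distance(20.0, 20.0), 1))  # → 43.2
```

A vehicle that never lets its actual gap fall below this bound is, under the model's assumptions, never the cause of a rear-end collision; logging how often the bound is violated is one example of a leading safety indicator rather than a crash count.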


"We need indicators before we come to unwanted events," Blumenthal said.

Less confidentiality

To achieve safety benefits, companies need to share data about the circumstances in which their systems are experiencing problems on the road. Rand proposes that data from crashes and unexpected events be shared with crash researchers, government officials and academics. That does not happen now.

While transport officials have talked about data sharing as a way to improve automated vehicle safety, the Rand study says: "The industry appears to be ambivalent about when, where and how to coordinate on important aspects related to development. There seems to be more talk about sharing than actual sharing."

Even organizations such as the National Transportation Safety Board, the federal agency that investigates accidents, must rely on manufacturers' assistance to access the data streams needed to describe and understand their self-driving operations.

By sharing information and agreeing on safety parameters, manufacturers can build support for self-driving vehicles among a skeptical public, the study says. Nearly three-quarters of drivers are afraid to ride in fully self-driving vehicles, according to an AAA survey this year. But beyond helping to win future customers, Rand argues that manufacturers have a responsibility to share information with the public.

"On public roads, the road becomes a living laboratory," the study says. "Other road users become involved in an investigation in which they have not agreed to participate and cannot refuse."

Like it or not, road users are already part of the self-driving experiment, one undertaken in pursuit of dramatic safety gains but still lacking details on how that progress will be achieved.
