California is re-evaluating Tesla’s Full Self-Driving (FSD) testing program to determine whether the electric-car maker’s software should be subject to the DMV’s autonomous vehicle regulations, the Los Angeles Times reported Tuesday.
FSD is an advanced driver assistance system that performs some driving tasks, but Tesla says it does not make vehicles fully autonomous and that its features “require a fully attentive driver.”
If California deems Tesla’s cars autonomous, state law would require the company to report all crashes on public roads, even those that occur while the vehicle is under manual control. Those reports would be made public, as would data on instances in which the self-driving system is disengaged.
The California Department of Motor Vehicles (DMV) informed Tesla of the regulator’s review last week, according to the Los Angeles Times.
“Recent software updates, videos showing dangerous uses of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in the field have prompted a re-evaluation,” the DMV said, according to the report.
The DMV and Tesla did not immediately respond to Reuters’ requests for comment.
In October last year, Tesla vehicles running the then-latest FSD beta software, version 10.3, repeatedly issued forward collision warnings when there was no immediate danger, according to video posts from beta users. Tesla fixed the issue within a day.