Argo AI worked with advocacy group the League of American Bicyclists (LAB) to come up with guidelines for how self-driving vehicles should identify and interact with cyclists. The goal is to set a standard for other AV companies in the industry to follow, especially as self-driving technology moves out of testing and into commercialization and becomes more commonplace in the coming years.
The World Health Organization estimates that 41,000 cyclists are killed in traffic-related incidents each year. While self-driving vehicles are expected to significantly reduce the number of collisions, much of that expected safety depends on getting the software right at the outset. Self-driving cars learn from huge datasets that categorize and identify objects and situations they may encounter, and Argo’s guidelines emphasize training models in a way that specifically accounts for cyclists, cycling infrastructure and cycling laws.
“The preparation of these guidelines is part of Argo’s commitment to building trust among community members and developing a self-driving system that provides cyclists with a level of comfort by behaving consistently and safely,” Peter Rander, president and co-founder of Argo AI, said in a statement. “We encourage other developers of autonomous vehicles to also use them to further build trust among vulnerable road users.”
Argo, which currently operates self-driving test vehicles in the US and parts of Germany, said it worked with LAB’s community to learn about common cyclist behaviors and interactions with vehicles. Together, Argo and LAB came up with six technical guidelines for self-driving systems to detect cyclists, predict cyclist behavior and drive consistently around them.
Cyclists should be a separate object class
Treating cyclists as a separate class and labeling them as such creates a diverse set of cycling images from which a self-driving system can learn. Systems must be trained on images of cyclists from different positions, orientations, viewpoints and speeds. Argo said this will also help the system take into account the different shapes and sizes of bikes and riders.
“Due to the unique behavior of cyclists that distinguishes them from scooter users or pedestrians, a self-driving system (or ‘SDS’) should designate cyclists as a core object representation within its perception system to accurately detect cyclists,” a statement from Argo said.
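Argo doesn’t publish its label taxonomy, but the idea translates naturally into a perception stack’s object classes. The Python sketch below is purely illustrative – the class names and fields are invented, not Argo’s – and simply shows cyclists as a first-class label rather than a subtype of pedestrian or generic “vulnerable road user.”

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    """Hypothetical perception label set; cyclists get their own class
    rather than being lumped in with pedestrians or scooter riders."""
    VEHICLE = auto()
    PEDESTRIAN = auto()
    SCOOTER_RIDER = auto()
    CYCLIST = auto()  # separate class, per the Argo/LAB guideline
    UNKNOWN = auto()


@dataclass
class Detection:
    object_class: ObjectClass
    confidence: float
    # Bounding box in the ego-vehicle frame (x, y, length, width), meters.
    box: tuple[float, float, float, float]


def cyclists(detections: list[Detection], min_conf: float = 0.5) -> list[Detection]:
    """Filter detections down to the cyclists the planner should reason about."""
    return [d for d in detections
            if d.object_class is ObjectClass.CYCLIST and d.confidence >= min_conf]
```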
Typical cycling behavior is to be expected
Cyclists can be quite unpredictable. They may split lanes, make quick, jerky movements to avoid obstacles in the road, treat stop signs as yields, or hop off the sidewalk and into the street. A good self-driving system should not only be able to predict their intentions, but also be prepared to respond to them.
“An SDS should use specialized, cyclist-specific motion prediction models that take into account a variety of cycling behaviors, so that when the self-driving vehicle encounters a cyclist, it generates multiple possible trajectories capturing the potential options of a cyclist’s path, allowing the SDS to better predict and respond to the cyclist’s actions.”
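To make the “multiple possible trajectories” idea concrete, here is a toy Python sketch. The maneuver hypotheses and the constant-speed rollout are invented for illustration; a production motion-forecasting model would be learned from data rather than hand-coded.

```python
import math
from dataclasses import dataclass


@dataclass
class CyclistState:
    x: float        # position in the ego frame, meters
    y: float
    heading: float  # radians
    speed: float    # m/s


# Hypothetical maneuver hypotheses a cyclist-specific predictor might cover.
MANEUVERS = {
    "continue_straight": 0.0,
    "swerve_left": math.radians(20),    # e.g. dodging a pothole or a door zone
    "swerve_right": math.radians(-20),
    "merge_into_lane": math.radians(35),
}


def predict_trajectories(state: CyclistState, horizon_s: float = 3.0,
                         dt: float = 0.5) -> dict[str, list[tuple[float, float]]]:
    """Roll out one simple constant-speed trajectory per maneuver hypothesis."""
    trajectories = {}
    for name, heading_offset in MANEUVERS.items():
        h = state.heading + heading_offset
        pts, x, y, t = [], state.x, state.y, 0.0
        while t < horizon_s:
            x += state.speed * math.cos(h) * dt
            y += state.speed * math.sin(h) * dt
            pts.append((round(x, 2), round(y, 2)))
            t += dt
        trajectories[name] = pts
    return trajectories


if __name__ == "__main__":
    rider = CyclistState(x=10.0, y=2.0, heading=0.0, speed=4.5)
    for name, path in predict_trajectories(rider).items():
        print(name, path[:3], "...")
```

The planner would then check the ego vehicle’s own plan against every one of these candidate paths, not just the most likely one.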
Map cycling infrastructure and local laws
Self-driving systems often rely on high-definition 3D maps to understand their environment. Part of that environment should be cycling infrastructure and local and state cycling laws, Argo said. With that information, the self-driving system can anticipate the movements of cyclists – such as merging into vehicle traffic when parked cars block the bike lane, or riding through red lights when there is no traffic – and keep a safe distance from the bike lane.
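One way to picture this is as extra attributes on the map data the planner already consumes. The sketch below is hypothetical – the field names and law flags are assumptions, not Argo’s map schema – but it shows how bike-lane geometry and local cycling laws could sit alongside the rest of the HD map.

```python
from dataclasses import dataclass, field


@dataclass
class CyclingLaws:
    """Hypothetical per-jurisdiction flags a planner could consult."""
    min_passing_distance_m: float = 1.0  # e.g. a local "3-foot" passing law
    idaho_stop_allowed: bool = False     # cyclists may treat stop signs as yields


@dataclass
class BikeLaneSegment:
    segment_id: str
    centerline: list[tuple[float, float]]  # map-frame polyline, meters
    protected: bool                        # physically separated vs. painted only
    laws: CyclingLaws = field(default_factory=CyclingLaws)


def expect_merge_into_traffic(lane: BikeLaneSegment, blocked: bool) -> bool:
    """If a parked car blocks an unprotected bike lane, plan for the cyclist
    to merge into vehicle traffic rather than assuming they stay in the lane."""
    return blocked and not lane.protected
```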
The system should behave consistently, understandably and extra cautiously around cyclists
Self-driving technology should operate in a way that appears natural, so that the AV’s intentions are clearly understood by cyclists – including things like using turn signals and adjusting vehicle position within the lane when preparing to pass, merge or turn.
In addition, when driving near cyclists, the system should “target conservative and appropriate speeds in accordance with local speed limits, and margins that meet or exceed local laws, and pass a cyclist only if those margins and speeds can be maintained for the entire maneuver,” Argo said.
The self-driving system should also give cyclists enough space that, if a rider were to fall, the vehicle could swerve or stop in time.
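In planner terms, the passing rule reads like a gate on the maneuver: the plan is approved only if the margin and speed constraints hold at every point, not just at the start. Here is a minimal sketch with placeholder thresholds rather than any real legal values:

```python
from dataclasses import dataclass


@dataclass
class PassPlanPoint:
    speed_mps: float      # planned ego speed at this point of the maneuver
    lateral_gap_m: float  # predicted clearance to the cyclist at this point


def can_pass(plan: list[PassPlanPoint],
             max_speed_mps: float,
             min_gap_m: float = 1.0) -> bool:
    """Approve a passing maneuver only if speed and lateral-margin constraints
    hold at every point of the maneuver. Thresholds here are placeholders;
    real values would come from local law and the operator's safety case."""
    return all(p.speed_mps <= max_speed_mps and p.lateral_gap_m >= min_gap_m
               for p in plan)
```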
Prepare for uncertain situations and slow down proactively
Self-driving systems must account for uncertainty in a cyclist’s intention, direction and speed, Argo said. The company gave the example of a cyclist traveling in the opposite direction of the vehicle, but in the same lane, and suggested that the vehicle should be trained to slow down in those conditions.
In the most uncertain conditions, the self-driving system should reduce the vehicle’s speed and, where possible, leave more space between the vehicle and the cyclist. Slowing down when the system is uncertain is already fairly standard practice among AV developers, even if it’s not always aimed specifically at cyclists.
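A crude way to express that behavior is to scale the target speed by how uncertain the prediction is. The function below is an invented heuristic, not Argo’s planner logic:

```python
def cautious_speed(base_speed_mps: float, uncertainty: float,
                   min_speed_mps: float = 2.0) -> float:
    """Scale the target speed down as prediction uncertainty (0..1) rises,
    never dropping below a crawl speed. A real planner would also widen the
    lateral buffer; this only illustrates the speed side."""
    uncertainty = min(max(uncertainty, 0.0), 1.0)
    return max(min_speed_mps, base_speed_mps * (1.0 - 0.6 * uncertainty))


# Example: an oncoming cyclist in the ego lane with a very uncertain path.
print(cautious_speed(base_speed_mps=11.0, uncertainty=0.9))  # ~5.06 m/s
```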
Continue testing cycling scenarios
The best way to build the safety case for AVs is to keep testing them. Argo and LAB propose that developers of self-driving technology continue with both virtual and physical tests specifically aimed at cyclists.
“A virtual testing program should consist of three major testing methodologies: simulation, resimulation and playforward to test an exhaustive permutation of autonomous vehicle and bicycle interactions on a daily basis,” the company said. “These scenarios should capture varying vehicle and cycling behavior as well as changes in social context, road structure and visibility.”
Physical testing, usually done first on closed courses and then on public roads, allows developers to validate simulations and ensure the technology behaves the same way in the real world as it does in the virtual one. Argo says developers should test AVs for likely scenarios as well as “edge cases,” or rare situations. Testing on public roads in many cities gives the system a diverse set of urban environments to learn from and can surface both common and rare cases.
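The “exhaustive permutation” part of virtual testing can be pictured as sweeping a matrix of scenario dimensions. The dimensions and names below are hypothetical, just to show the shape of such a sweep; a real test catalogue would be far larger and include resimulated real-world logs alongside synthetic scenes.

```python
import itertools

# Hypothetical scenario dimensions for a nightly simulation sweep.
CYCLIST_BEHAVIORS = ["ride_in_bike_lane", "merge_into_traffic", "roll_stop_sign"]
ROAD_LAYOUTS = ["protected_bike_lane", "painted_bike_lane", "shared_lane"]
VISIBILITY = ["clear_day", "night", "heavy_rain"]


def scenario_matrix():
    """Enumerate every permutation of the scenario dimensions above."""
    for behavior, layout, vis in itertools.product(
            CYCLIST_BEHAVIORS, ROAD_LAYOUTS, VISIBILITY):
        yield {"cyclist_behavior": behavior, "road_layout": layout,
               "visibility": vis}


if __name__ == "__main__":
    scenarios = list(scenario_matrix())
    print(f"{len(scenarios)} scenarios")  # 27 in this toy matrix
    print(scenarios[0])
```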
On the hunt for public acceptance… and safety, of course
Public acceptance is one of the major hurdles to getting more AVs on the road, and many people are still not convinced of the safety of autonomous vehicles. In fact, nearly half of those surveyed by market research firm Morning Consult said AVs are either somewhat less safe or much less safe than human-driven cars.
Making a vehicle safe for all road users is only half the battle. Companies like Argo AI also need to make sure people believe their vehicles are safe, and standardizing safety practices across the industry could be one way to do that.