A self-driving car relies on a network of sensors placed throughout the vehicle to gather data about its location and surroundings. Artificial intelligence algorithms use this data to identify objects in the immediate vicinity, plan a course to avoid them and ultimately convert that plan into the actions of steering, accelerating and braking. The vehicles that Roborace is developing feature five light detection and ranging (LiDAR) sensors, which use pulsed lasers to measure distance; two radars that bounce high-frequency electromagnetic waves off objects; 18 ultrasonic sensors that rely on sound waves; two optical speed sensors; six cameras; and GPS. Roborace’s machines—called Robocars—rely on high-end onboard computers capable of performing up to 24 trillion operations per second to make split-second decisions based on the sensor data.
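The perceive-plan-act cycle described above can be sketched in a few lines of code. This is a toy illustration only, assuming fused obstacle readings as input; the class and function names are hypothetical and do not reflect Roborace's actual software.

```python
# A minimal, hypothetical sketch of the perceive-plan-act loop; names
# are illustrative, not Roborace's API.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # fused from LiDAR, radar and ultrasonic readings
    bearing_deg: float  # angle relative to the car's heading (positive = right)

def plan(obstacles: list[Obstacle]) -> dict:
    """Steer away from the nearest obstacle; brake hard if it is close."""
    if not obstacles:
        return {"steer_deg": 0.0, "throttle": 1.0, "brake": 0.0}
    nearest = min(obstacles, key=lambda o: o.distance_m)
    # Steer toward the side opposite the obstacle.
    steer = -5.0 if nearest.bearing_deg >= 0 else 5.0
    if nearest.distance_m < 20:
        return {"steer_deg": steer, "throttle": 0.0, "brake": 1.0}
    return {"steer_deg": steer, "throttle": 0.5, "brake": 0.0}

# An obstacle 15 m ahead, slightly to the right: brake and steer left.
print(plan([Obstacle(15.0, 3.0)]))
```

A real pipeline replaces the obstacle list with the output of perception algorithms running over raw LiDAR, radar and camera streams; the control commands would likewise be filtered and rate-limited before reaching the actuators.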
Roborace wants to start running races with as many as 20 Robocars—10 teams each fielding two vehicles—within two years. With all of the teams using essentially a Robocar clone, the winners will be those whose software is deftest at handling speeds of many meters per second. “When you go from [50 kilometers] per hour to [320 kilometers] per hour, it doesn’t mean you’re just increasing the speed by six times; it means you need to have a totally different approach to everything,” Roborace CEO Denis Sverdlov says.
The technical challenges start with the hardware. A sensor’s range—a limiting factor for autonomous driving—“has a very different meaning when you’re traveling at higher speed, because the same…distance gives you much less time to react,” says Chris Gerdes, a Stanford University professor of mechanical engineering whose lab studies racing to improve autonomous driving. Additionally, University of Michigan mechanical engineering professor Huei Peng notes, when a vehicle moves at very high speed, the sensor computation requirement—processing raw physical data into computer-friendly coordinates—is much higher. A stationary LiDAR can easily figure out the absolute position of every reflecting object, but on a fast-moving vehicle the software must account for how the world will appear blurred due to the distance traveled between measurements. And although the control algorithms that execute the vehicle’s plans operate quickly, the algorithms interpreting camera and sensor data take much longer—which means the car has moved nearly the length of a bus by the time it makes sense of what it saw.
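The stakes of that processing delay are easy to quantify. The arithmetic below uses an assumed 150-millisecond perception latency purely for illustration; the actual figure for any given vision pipeline will differ.

```python
# Illustrative only: the 150 ms latency is an assumption, not a Roborace figure.
def distance_during_latency(speed_kph: float, latency_s: float) -> float:
    """Distance (meters) a car covers while its perception pipeline runs."""
    speed_ms = speed_kph / 3.6  # convert km/h to m/s
    return speed_ms * latency_s

# A highway-speed car with an assumed 150 ms camera-processing delay:
print(distance_during_latency(100, 0.150))  # ~4.2 m

# A 320 km/h race car with the same delay covers far more ground:
print(distance_during_latency(320, 0.150))  # ~13.3 m, roughly a bus length
```

The same formula explains why sensor range matters more at speed: a fixed detection range buys a 320 km/h car less than a third of the reaction time it buys a 100 km/h car.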
Sverdlov’s hope is that faster sensors and AI algorithms developed for racing will find their way into consumer and commercial vehicles. But whether any algorithmic innovations impact those sectors will depend on how well they generalize to street driving, says David Held, who will join Carnegie Mellon University next month as an assistant professor of robotics. If the racing algorithms speed up processing by making strong assumptions—for example, that all moving objects around a Robocar are other race cars—then that obviously will not translate to public roadways. But improvements to general-purpose vision algorithms could also shave time off of passenger vehicles’ perceptual processing. That would allow cars either to dedicate more time to other tasks such as optimizing their trajectories or to simply shorten their response time to reduce total braking or swerving distance, Held says.
The strongest case for racing as a way to improve regular driving is with respect to safety. “The best race car drivers are able to really use all of the tire friction to do some pretty extraordinary maneuvers and to push the car to its very limits,” Gerdes says. “We want to understand how to duplicate that, not to go fast but to be safe.” (Gerdes spoke with Scientific American via telephone over the noise of his lab’s own autonomous DeLorean screeching around cones on a racetrack.) Racing driver–inspired algorithms might take into account the limited amount of friction available to a vehicle, and use that information to set the optimal speed for a particular turn or for staying on the road in ice or snow, Gerdes says: “By studying racing, we can learn a lot about how you would control a car through adverse conditions that you might get in everyday driving.”
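The friction-limited cornering speed Gerdes alludes to follows from basic physics: on a flat corner of radius r, the tires can supply at most μmg of centripetal force, so the maximum speed is v = √(μgr). The sketch below applies that formula with assumed friction coefficients for dry asphalt and ice, chosen for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_corner_speed_kph(mu: float, radius_m: float) -> float:
    """Maximum speed through a flat corner of a given radius before the
    tires exceed their friction limit: v = sqrt(mu * g * r)."""
    return math.sqrt(mu * G * radius_m) * 3.6  # m/s -> km/h

# Assumed friction coefficients, for illustration, on a 50 m radius corner:
print(round(max_corner_speed_kph(0.9, 50)))  # dry asphalt (mu ~ 0.9)
print(round(max_corner_speed_kph(0.1, 50)))  # ice (mu ~ 0.1)
```

An algorithm with a good estimate of the available friction could cap its speed for each turn the way the function above does, which is exactly the kind of adverse-condition reasoning Gerdes describes.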
It is also plausible that seeing high-performing autonomous racers will make consumers more confident in autonomous driving tech—although a lack of crashes in robot races might disappoint a few race fans. Like the Roborace founders, Held thinks such confidence could increase public support for self-driving cars, which most people have yet to ride in or even see on the street. “Seeing with their own eyes an autonomous car and what it actually can do, instead of just hearing about it—I think that could help change people’s ideas,” he says.
One indication of how far the Robocar has to go before it is ready for the starting line is that Roborace tends to run its public demonstrations with the company’s “Devbot” vehicles. The Devbots feature the same sensors as the Robocar but also include a cockpit for a human driver who can monitor the vehicle’s performance and take over if needed. Devbots have performed high-speed demonstrations at several Formula E “ePrix” races, which feature Formula 1–like electric cars powered entirely by batteries. In February Roborace raced two Devbots on a custom-built city street track at Formula E’s ePrix in Buenos Aires, becoming the first company to exhibit two driverless race cars simultaneously. One of the Devbots finished the race at a top speed of 186 kph (and managed to avoid hitting a dog that ran onto the track) whereas the other crashed and could not finish.
Robocar’s only on-course appearance to date was one tentative lap in May at the Paris ePrix. More recently, Roborace opted not to demonstrate the vehicle on a 1.95-kilometer Formula E track constructed in Brooklyn, N.Y. Sverdlov says that race organizers did not finalize the course of the 10-turn track until late in the race’s planning stages. “The racetrack built here is quite a challenge for us because of the sharp corners,” he said prior to the event, where Roborace instead tested one of its Devbots. “We made the decision that we need to spend more time on developing the algorithms and fine-tuning the hardware.” Sverdlov also noted that GPS signal strength was poor in Brooklyn’s Red Hook neighborhood, which hosted the race. Forced to rely more on its cameras for positioning information, the Robocar would have had to negotiate the course at slower speeds.
Even under the best conditions, autonomous racers still are not quite at the level of the best human professionals. “We’ve been working with J. R. Hildebrand,” Gerdes says, “and he’s still faster than we are, because of the amazing human ability…to really understand…: ‘What is the best path to take around this corner in this car at this exact moment, having heated up the tires exactly as much as I have?’ Being able to feel that, sense that and come up with the best solution in the moment is still something we’re inspired by.” If Roborace has its way, it will not be long before that ability will be inspiring many other teams of engineers as well.