According to Ford and others, fully autonomous robo-taxis could be deployed on public streets by the start of the next decade. And while self-driving technology is progressing faster than many imagined, humans are still better drivers in certain situations. Autonomous cars have some catching up to do.
The sensors used in self-driving cars—radar, Lidar, cameras—and the accompanying software still can't match the human brain's ability to gather and process the same kinds of information. That's the premise of a recent whitepaper by Brandon Schoettle, a project manager at the University of Michigan's Transportation Research Institute.
"Machines/computers are generally well suited to perform tasks like driving, especially in regard to reaction time (speed), power output and control, consistency, and multichannel information processing," Schoettle writes. But he adds: "Human drivers still generally maintain an advantage in terms of reasoning, perception, and sensing when driving."
A person, for example, may be able to perceive the difference between a gray sky and the trailer of a semi that's roughly the same color. But a self-driving car cannot, as demonstrated in the deadly Tesla Model S crash.
There are countless other limitations for self-driving sensors and software. Humans can stay in their lane even if road markings are faded or completely gone. They can also quickly and easily tell whether a person crossing the street ahead is staring at a phone instead of paying attention to the car heading towards them.
While machine learning will eventually be able to decipher hand gestures that human drivers use to communicate, such as a wave-through at a four-way stop, decoding subtle but critical eye contact that's common between drivers and pedestrians may take longer. Researchers at the University of Washington recently showed how self-driving cars can even be fooled by slight modifications to traffic signs, whereas a human can instantly recognize, say, a stop sign or even the speed limit at a glance. Whether they obey is another matter.
Driving Like a Bat at Night
But as good as (some) humans are at driving, machines have distinct advantages.
"While no single sensor completely equals human-sensing capabilities, some offer capabilities not possible for a human driver," according to Schoettle, who points to radar sensors that can "see" much better than humans at night.
According to the AAA Foundation for Traffic Safety, the chance of a fatal crash is four times higher at night. The human eye can see about 250 feet at night, and headlights can only illuminate the area about 350 feet ahead of a vehicle. Much like an echolocating bat, radar sensors detect objects up to 820 feet ahead, even in low or no light. Lidar sensors have a similar range, and can detect objects 360 degrees around a vehicle.
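To put those detection distances in perspective, a quick back-of-the-envelope calculation shows how much warning time each range buys. The 70 mph highway speed is an assumption chosen for illustration; the distances are the figures above.

```python
# Warning time afforded by each detection range at a highway speed.
# Distances (feet) come from the article; the 70 mph speed is an
# illustrative assumption, not a figure from the whitepaper.

MPH_TO_FPS = 5280 / 3600  # one mph in feet per second

def warning_time(range_ft, speed_mph=70):
    """Seconds between detecting a stationary object and reaching it."""
    return range_ft / (speed_mph * MPH_TO_FPS)

ranges = {"human eye at night": 250, "headlights": 350, "radar/Lidar": 820}
for name, feet in ranges.items():
    print(f"{name}: {warning_time(feet):.1f} s of warning at 70 mph")
```

At that speed, 250 feet of visibility gives a driver about 2.4 seconds of warning, while an 820-foot radar range gives roughly 8 seconds.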
Combining these capabilities via sensor fusion and software will eventually make self-driving technology superior to humans. But for now, "you're probably safer in a self-driving car than with a 16-year-old, or a 90-year-old. But you're probably significantly safer with an alert, experienced, middle-aged driver than in a self-driving car," Schoettle tells Wired.
Sensors and software don't get distracted, tired, drunk, or stoned, and they can see better than humans at night and in low-light conditions. So I'd bet that with self-driving technology improving by leaps and bounds, autonomous vehicles will quickly surpass human capability. Maybe even in the next decade.