In the world of autonomous vehicles, Pittsburgh and Silicon Valley are bustling hubs of development and testing. But ask those involved in self-driving vehicles when we might actually see them carrying passengers in every city, and you’ll get an almost universal answer: Not anytime soon.
An optimistic assessment is 10 years. Many others say it will take decades as researchers work to overcome a number of obstacles. The vehicles will debut in limited, well-mapped areas within cities and gradually spread outward.
The fatal crash in Arizona involving an Uber autonomous vehicle in March slowed progress, largely because it hurt the public’s perception of the vehicles’ safety. Companies responded by proceeding more cautiously with their research. Google’s Waymo, for instance, decided not to launch a fully autonomous ride-hailing service in the Phoenix area and will rely on human backup drivers to ferry passengers, at least for now.
Here are the problems that researchers must overcome to start giving rides without humans behind the wheel:
SNOW AND WEATHER
When it’s heavy enough to cover the pavement, snow blocks the view of lane lines that vehicle cameras use to find their way. Researchers so far haven’t figured out a way around this. That’s why much of the testing is done in warm-weather climates such as Arizona and California.
Heavy snow, rain, fog and sandstorms can obstruct the cameras’ view. Light beams sent out by laser sensors can bounce off snowflakes, fooling the vehicles into treating them as obstacles. Radar can see through the weather, but it doesn’t capture the shape of an object, which computers need to figure out what it is.
“It’s like losing part of your vision,” says Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University.
Researchers are working on laser sensors that use a different light beam wavelength to see through snowflakes, said Greg McGuire, director of the MCity autonomous vehicle testing lab at the University of Michigan. Software also is being developed so vehicles can differentiate between real obstacles and snowflakes, rain, fog, and other conditions.
But many companies are still trying to master the difficult task of driving on a clear day with steady traction.
“Once we are able to have a system reliably perform in those, then we’ll start working toward expanding to those more challenging conditions,” said Noah Zych, Uber’s head of system safety for self-driving cars.
PAVEMENT LINES AND CURBS
Across the globe, roadway markings differ, and sometimes they don’t exist at all. Because lane lines aren’t standardized, vehicles have to learn how to drive differently in each city. Sometimes there aren’t any curbs to help them judge lane width.
For instance, in Pittsburgh’s industrial “Strip District,” where many self-driving vehicles are tested, the city draws lines across the narrow lanes to mark where vehicles should stop for stop signs. Sometimes the lines are so far back and buildings are so close to the street that autonomous cars can’t see traffic on the cross street if they stop at the line. One workaround is to program vehicles to stop for the line and creep forward.
“Is it better to do a double stop?” asked Pete Rander, president of Argo AI, an autonomous vehicle company in which Ford has invested heavily. “Since intersections vary, it’s not that easy.”
DEALING WITH HUMAN DRIVERS
For many years, autonomous vehicles will have to deal with humans who don’t always play by the rules. They double-park or walk in front of cars. Recently in Pittsburgh, an Argo backup driver had to take over when his car stopped during a right turn, blocking an intersection when it couldn’t immediately decide whether to go around a double-parked delivery truck.
“Even if the car might eventually figure something out, it’s shared space, and it’s socially unacceptable” to block traffic, Rander said.
Humans also make eye contact with other drivers to confirm they’re paying attention, an ability that is still being developed for autonomous vehicles.
Add to that the antagonism that some feel toward robots. People have reportedly been harassing Waymo’s autonomous test vehicles near Phoenix. The Arizona Republic reported in December that police in suburban Chandler have documented at least 21 cases in the past two years, including a man waving a gun at a Waymo van and people slashing tires and throwing rocks. The driver of one Jeep forced the vans off the road six times.
LEFT TURNS
Deciding when to turn left in front of oncoming traffic without a green arrow is one of the more difficult tasks for human drivers and one that causes many crashes. Autonomous vehicles have the same trouble.
Waymo CEO John Krafcik said in a recent interview that his company’s vehicles are still encountering occasional problems at intersections.
“I think the things that humans have challenges with, we’re challenged with as well,” he said. “So sometimes unprotected lefts are super challenging for a human, sometimes they’re super challenging for us.”
CONSUMER ACCEPTANCE
The fatal Uber crash near Phoenix last year did more than push the pause button on testing. It also rattled consumers who someday will be asked to ride in self-driving vehicles.
Surveys taken after the Uber crash showed that drivers are reluctant to give up control to a computer. One by AAA found that 73 percent of American drivers would be too fearful to ride in a fully self-driving vehicle. That’s up from 63 percent in late 2017.
Autonomous vehicle companies are showing test passengers information on screens about where the vehicles are headed and what their sensors are seeing. The more people ride, the more they trust the vehicles, says Waymo’s Krafcik.
“After they become more and more confident they rarely look at the screens, and they’re on their phones or relaxing or sleeping,” he said.