An autonomous systems defense company contends it has successfully spoofed the GPS receiver of a Tesla Model 3 running the automaker’s latest Autopilot technology, sending the vehicle off its intended route.
Regulus Cyber said it used commercially available hardware and software to wirelessly divert the electric car using Navigate on Autopilot, a Tesla feature that—with driver supervision—guides a car along the highway, from on-ramp to off-ramp, executing lane changes and navigating interchanges along the way.
According to Haifa, Israel-based Regulus, the car was 3 miles from a planned exit, traveling at a steady speed and in the middle of the lane with the Navigate feature activated when its test began. The car reacted as if the exit were 500 feet away, according to Regulus, slowing “abruptly,” flicking on the turn signal and turning off the road.
Now, to get this to work, the company said it had to install a 10-centimeter-long antenna on the roof of the target car. And Tesla, responding to questions about the security firm’s test, dismissed it as a sales ploy.
“These marketing claims are simply a for-profit company’s attempt to use Tesla’s name to mislead the public into thinking there is a problem that would require the purchase of this company’s product,” a Tesla spokesperson said. “That is simply not the case. Safety is our top priority, and we do not have any safety concerns related to these claims.”
But the issue of GPS spoofing has hovered over autonomous driving from its inception. Relying on a wonky signal to get to your destination in a normal car may simply mean missing your exit. Relying on it to keep your car on the right path at 60 mph is something else entirely. Now that the general public has awakened to the fact that autonomous driving is getting closer to reality, addressing consumer safety concerns will be critical to facilitating mass adoption.
In a 2018 paper winkingly titled “All Your GPS Are Belong to Us: Towards Stealthy Manipulation of Road Navigation Systems,” researchers demonstrated the possibility that spoofing—substituting pirate signals for those of a GPS satellite—could stealthily send you to the wrong destination.
While they note the threat of GPS spoofing has been discussed as far back as 2001, and that spoofing has been shown to work in other contexts, their experiment was the first to test it against road navigation systems. The researchers used real drivers behind the wheel of a car that was being told to go to the wrong place.
Some 38 out of 40 participants followed the illicit signals, the researchers said.
“The problem is critical, considering that navigation systems are actively used by billions of drivers on the road and play a key role in autonomous vehicles,” wrote the authors, who hail from Virginia Tech, the University of Electronic Science and Technology of China and Microsoft Research.
And while cars with autonomous features have additional tech to protect against spoofing, they cautioned that other studies raised the specter of attacks on other systems, such as ultrasonic sensors, millimeter-wave radar, lidar (light detection and ranging), and wheel speed sensors.
“These new semi-autonomous features offered on new cars place drivers at risk, and provide us with a dangerous glimpse of our future as passengers in driverless cars,” said Roi Mit, chief marketing officer of Regulus Cyber.
Curtis Kexiong Zeng, one of the authors of the 2018 study, said in an interview that successfully spoofing a Tesla Autopilot system depends on what kinds of maneuvers the car can make based on GPS location and without driver participation or permission.
“Generally speaking,” he said, “the threat of GPS spoofing increases as the level of automation goes up.”
So what about this test by Regulus Cyber? “This is entrepreneurial hacking,” said Colin Bird-Martinez, senior analyst in connected car software at IHS Markit. The Regulus attack is both time- and labor-intensive, he said, and relies on someone placing an antenna on the car itself, something any reasonably alert motorist would likely notice.
And the impact was extremely limited, he said—the Model 3 didn’t crash; it just braked and took a different road. “You can hack anything if you put the effort into it,” Bird-Martinez said. But in this case, he said he’s not all that worried.
Another reason spoofing is unlikely to succeed is Tesla’s onboard computer. It funnels data from GPS, radar, maps and cameras into a central processing unit, where a final driving decision is made in a process known as sensor fusion. Tesla vehicles don’t use GPS or maps to steer the car, and the system also learns when to disregard faulty information.
You’ve likely seen the process yourself, if you’ve ever used GPS in a city. Your little blue dot will wander off a street and into the middle of an office building. The computer, knowing you’re likely not driving through a building, realizes the signal is incorrectly placed and uses other data (from cell towers or other known waypoints) to place you back on the road.
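That cross-check can be sketched in a few lines of code. The example below is purely illustrative; the dead-reckoning model, the 30-meter threshold and the simple averaging are assumptions made for the sake of the example, not details of Tesla’s system, which the company has not disclosed. The idea is just that a position fix disagreeing too strongly with what wheel speed and heading predict gets treated as suspect.

```python
import math

# Illustrative sketch only: a simple plausibility gate for GPS fixes.
# The threshold and dead-reckoning model are assumptions for this example,
# not taken from any production autonomous-driving stack.

MAX_JUMP_METERS = 30.0  # reject fixes that imply an implausible jump


def dead_reckon(prev_pos, speed_mps, heading_rad, dt_s):
    """Predict the next position from wheel speed and heading alone."""
    x, y = prev_pos
    return (x + speed_mps * dt_s * math.cos(heading_rad),
            y + speed_mps * dt_s * math.sin(heading_rad))


def fuse_position(prev_pos, gps_pos, speed_mps, heading_rad, dt_s):
    """Accept the GPS fix only if it roughly agrees with dead reckoning;
    otherwise fall back to the dead-reckoned estimate."""
    predicted = dead_reckon(prev_pos, speed_mps, heading_rad, dt_s)
    error = math.dist(predicted, gps_pos)
    if error > MAX_JUMP_METERS:
        # GPS disagrees wildly with the other sensors: treat it as suspect.
        return predicted
    # Otherwise blend the two estimates (naive average for illustration).
    return ((predicted[0] + gps_pos[0]) / 2,
            (predicted[1] + gps_pos[1]) / 2)


# Example: the car is doing ~27 m/s due east, but the "GPS" fix claims it
# suddenly jumped 200 m to the side, so the fix is ignored.
print(fuse_position((0.0, 0.0), (5.0, 200.0), 27.0, 0.0, 1.0))
```

A production stack would do this probabilistically, typically with a Kalman or particle filter fusing many sensors, but the underlying principle is the same: a position claim that cannot be squared with the other inputs is down-weighted or discarded.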
In fact, the researchers behind the 2018 study concluded that any fix for the spoofing risk will likely rely on those other sensors.
The Regulus attack is, however, a good reminder that “nothing is really foolproof at the moment,” said Kevin Mak, principal analyst in the automotive practice at Strategy Analytics. Systems like Autopilot are not fully autonomous—they are, as the company is quick to point out, driver assistance.
Current systems “aren’t capable of driving the vehicle by themselves,” said Mak. “You need the driver to be aware and capable of taking over.”