Research shows Tesla Model 3 and Model S are vulnerable to GPS spoofing attacks

Tesla Model S and Model 3 electric cars are vulnerable to cyberattacks aimed at their navigation systems, according to research from Regulus Cyber.

Staged attack caused the car to veer off the main road

During a test drive using Tesla’s Navigate on Autopilot feature, a staged attack caused the car to suddenly slow down and unexpectedly veer off the main road.

Regulus Cyber initially discovered the Tesla vulnerability during its ongoing study of the threat that easily accessible spoofing technology poses to GNSS (global navigation satellite systems, also known as GPS) receivers.

The researchers found that spoofing attacks on the Tesla GNSS (GPS) receiver could easily be carried out wirelessly and remotely, exploiting security vulnerabilities in mission-critical telematics, sensor fusion, and navigation capabilities.

Tesla Model 3: Navigate on Autopilot

Regulus Cyber experts traveled to Europe last week to test-drive the Tesla Model 3 using Navigate on Autopilot. This active guidance feature, part of Tesla’s Enhanced Autopilot platform, is meant to make following a route to a destination easier, including suggesting and making lane changes and taking interchange exits, all under driver supervision.

While it initially required drivers to confirm lane changes using the turn signals before the car moved into an adjacent lane, current versions of Navigate on Autopilot allow drivers to waive the confirmation requirement if they choose, meaning the car can activate the turn signal and start turning on its own. Tesla emphasizes that “in both of these scenarios until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain ready to take manual control of their car at all times.”

Reaction to spoofing attack

Designed to reveal how the semi-autonomous Model S and Model 3 would react to a spoofing attack, the Regulus Cyber test began with the car driving normally and the autopilot navigation feature activated, maintaining a constant speed and position in the middle of the lane.

Although the car was three miles away from the planned exit when the spoofing attack began, the car reacted as if the exit was just 500 feet away—abruptly slowing down, activating the right turn signal, and making a sharp turn off the main road. The driver immediately took manual control but couldn’t stop the car from leaving the road.
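To make the mechanism concrete, the sketch below (purely illustrative Python, not Tesla’s code; all coordinates, names, and thresholds are hypothetical) shows how navigation logic that compares the current GNSS fix against the exit’s coordinates can be misled: a spoofed fix placed near the exit collapses the computed remaining distance from roughly three miles to about 500 feet, which is enough to trigger an early exit maneuver.

    import math

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters (haversine formula)."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # All coordinates are hypothetical and for illustration only.
    exit_ramp        = (52.5200, 13.4050)  # planned highway exit
    true_position    = (52.4770, 13.3850)  # car is really about 3 miles out
    spoofed_position = (52.5188, 13.4040)  # injected fix roughly 500 feet from the exit

    EXIT_PREP_THRESHOLD_M = 160  # ~520 ft: start slowing and signaling for the exit

    for label, pos in [("true", true_position), ("spoofed", spoofed_position)]:
        d = distance_m(*pos, *exit_ramp)
        action = "begin exit maneuver" if d < EXIT_PREP_THRESHOLD_M else "stay on route"
        print(f"{label:7s} fix -> {d:6.0f} m to exit -> {action}")

With the true fix the car is nearly 5,000 meters from the ramp and stays on route; with the spoofed fix the same logic decides the exit is imminent.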

The testing revealed another unexpected finding that significantly amplified the threat: a link between the car’s navigation and air suspension systems. The height of the car changed unexpectedly while it was moving, because the suspension system “thought” it was passing through different kinds of terrain during the test: smooth roadways, where the car lowers itself for better aerodynamics, or rough “off-road” surfaces, where it raises its undercarriage to clear obstacles on the road.
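The suspension finding can be pictured the same way. The following sketch (again illustrative only, with hypothetical map data and ride heights, not Tesla’s implementation) shows how a ride-height rule keyed to map location would raise or lower the car as soon as the reported position jumps onto a different road-type tile, even though the physical road has not changed.

    # Illustrative only: a location-keyed ride-height rule, not Tesla's actual logic.
    RIDE_HEIGHT_MM = {"smooth_highway": -20, "standard": 0, "rough_road": +30}

    # Hypothetical map tiles keyed by coarse (lat, lon) cells.
    ROAD_TYPE_BY_CELL = {
        (52.52, 13.40): "smooth_highway",
        (52.53, 13.41): "rough_road",
    }

    def road_type(lat, lon):
        cell = (round(lat, 2), round(lon, 2))
        return ROAD_TYPE_BY_CELL.get(cell, "standard")

    def target_height(lat, lon):
        return RIDE_HEIGHT_MM[road_type(lat, lon)]

    # A spoofed fix that "teleports" the car onto a rough-road tile makes the
    # controller raise the undercarriage even though the real road hasn't changed.
    for label, fix in [("true", (52.52, 13.40)), ("spoofed", (52.53, 13.41))]:
        print(f"{label:7s} fix -> {road_type(*fix):14s} -> ride height {target_height(*fix):+d} mm")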

Yoav Zangvil, Regulus Cyber CTO and co-founder, explains that GNSS spoofing is a growing threat to ADAS and autonomous vehicles. “Until now, awareness of cybersecurity issues with GNSS and sensors has been limited in the automotive industry. But as dependency on GNSS is on the rise, there’s a real need to bridge the gap between its tremendous inherent benefits and its potential hazards. It’s crucial today for the automotive industry to adopt a proactive approach towards cybersecurity.”

Using low-cost hardware and software for assessment

The Regulus Cyber testing is designed to assess the impact of spoofing with low-cost, open source hardware and software, the same kind of technology that is accessible to anyone via e-commerce websites and open source projects on GitHub. Taking control of Tesla’s GPS with off-the-shelf tools took less than one minute.

The researchers were able to remotely affect various aspects of the driving experience, including navigation, mapping, power calculations, and the suspension system. Under attack, the GNSS system displayed incorrect positions on the maps, making it impossible to plot an accurate route to the destination.
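One generic class of safeguard against such attacks, not something Tesla has said it uses, is a plausibility check that compares each new GNSS fix with the position implied by the car’s own motion and discards fixes that would require an impossible jump. A minimal sketch with hypothetical names and thresholds:

    import math

    MAX_PLAUSIBLE_SPEED_MPS = 90.0  # ~200 mph: anything faster implies a spoofed jump

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters (haversine formula)."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def is_plausible(prev_fix, new_fix, dt_s):
        """Reject a GNSS fix whose implied speed since the last fix is impossible."""
        implied_speed = distance_m(*prev_fix, *new_fix) / dt_s
        return implied_speed <= MAX_PLAUSIBLE_SPEED_MPS

    # Example: a spoofed fix about 5 km away arriving one second after the last good fix.
    last_good = (52.4770, 13.3850)
    suspect   = (52.5200, 13.4050)
    print(is_plausible(last_good, suspect, dt_s=1.0))  # False -> hold last position, warn driver

In practice such checks would fuse GNSS with inertial and wheel-speed data rather than rely on a single speed cap, but the principle is the same: cross-check the satellite fix against an independent source before acting on it.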

Response from the Tesla Vulnerability Reporting Team

Prior to the Model 3 road test, Regulus Cyber provided its Model S research results to the Tesla Vulnerability Reporting Team, which responded with the following points at that time:

Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn’t demonstrate any Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.

The effect of GPS spoofing on Tesla cars is minimal and does not pose a safety risk, given that it would at most slightly raise or lower the vehicle’s air suspension system, which is not unsafe to do during regular driving, or potentially route a driver to an incorrect location during manual driving.

While these researchers did not test the effects of GPS spoofing when Autopilot or Navigate on Autopilot was in use, we know that drivers using those features must still be responsible for the car at all times and can easily override Autopilot and Navigate on Autopilot at any time by using the steering wheel or brakes, and should always be prepared to do so.

Researcher comments

“This is a distressing answer by a car manufacturer that is the self-proclaimed leader in the autonomous vehicle race,” Zangvil comments. “As drivers and safety/security experts, we’re not comforted by vague hints towards future safeguards and statements that dismiss the threats of GPS attacks.” He offers the following counterpoints in response:

  • Attacks against any GPS system are indeed considered a crime because their effects are dangerous, as we’ve shown, yet the same devices we used to simulate the attacks are legally accessible to anyone online via e-commerce sites
  • Taking steps to “introduce safeguards for the future” indicates that spoofing is, in fact, a major issue for Tesla, which relies heavily on GNSS
  • In the case of cars, a spoofing attack is confusing in the best case, and a threat to safety in more severe scenarios
  • The more GPS data is leveraged in automated driver assistance systems, the stronger and more unpredictable the effects of spoofing become
  • The fact that spoofing causes unforeseen results like unintentional acceleration and deceleration, as we’ve shown, clearly demonstrates that GNSS spoofing raises a safety issue that must be addressed
  • In addition, the spoofing attack made the car engage in a physical maneuver off the road, providing a dire glimpse into the troubled future of autonomous cars that would have to rely on unsecure GNSS for navigation and decision-making
  • Given that the trust of the public still has to be earned as the automotive industry moves towards autonomy, the leading players are accountable for a responsible deployment of new technology
  • As Tesla clearly stated, drivers are responsible for overriding Autopilot under a spoofing attack, which suggests the Autopilot system itself can’t be trusted to function safely while being spoofed
  • Because every GNSS/GPS broadcast system can be affected by GNSS/GPS spoofing, the issue is everyone’s problem and shouldn’t be ignored; furthermore, governments and regulators that have a mandate to protect the public’s safety must engage in proactive measures to ensure only safe GNSS receivers are used in cars

“According to Tesla, they’ll soon be releasing completely autonomous cars utilizing GNSS, which means that, in theory, an attacker could remotely control the car’s route planning and navigation,” Zangvil says. “We’re obligated to ask what steps they’re taking to address this threat, and whether new safeguards will be implemented in its next generation of entirely autonomous cars.”

Although Regulus Cyber researchers tested only the Model S and Model 3, they concluded that the “disturbing vulnerability” of Tesla’s GNSS system is most likely company-wide, as the same chipsets are used across the Tesla fleet.

“Just a few months ago we saw that during a spoofing incident at a car show in Geneva, seven different car manufacturers complained that their cars were being spoofed. This incident proves that many other automotive companies working on the next generation of autonomous cars are also vulnerable to these attacks. As an industry, to win public trust and succeed, every car manufacturer should be proactive and prepare against these threats,” Zangvil says.