Hacking virtual and augmented reality: Short-term FUD, long-term danger

I believe virtual reality (VR) and augmented reality (AR) are on the cusp of mass success, and will dramatically change the way we use technology. However, with new technology come new attack surfaces.

After watching security researchers and attackers pick apart the Internet of Things (IoT) so easily, we would be foolish not to consider the security of new technologies. That raises the question – what are the risks of virtual and augmented reality?

To answer simply, AR and VR pose little risk today, and will likely pose little risk for the next five years. However, these technologies could become seriously dangerous in the next decade. Let’s explore why.

Little new attack surface

Today’s VR and AR technology offers little new attack surface since it’s built upon existing platforms. At the highest level, AR and VR are mostly just new display and input mechanisms added to pre-existing devices. The underlying computers that power the technology—whether a PC, console, or mobile device—haven’t really changed much. AR/VR tracking software doesn’t really need to connect to the Internet and doesn’t pose any greater risk than any other software you might add to your computer; perhaps even less risk than your average network-connected game.

No valuable data

Another way VR/AR could increase your risk profile would be by collecting new data that makes you a more valuable target. VR/AR technology does track the motion of your head (and sometimes your hands), but this data is of little use to a criminal attacker. Other data that VR/AR systems capture, like voice and video, carries risks of its own, but those risks aren’t unique to VR/AR – people have shared voice and video data online for years.

Current motion tracking data is relatively coarse and provides little value to an attacker or criminal. VR motion tracking is accurate enough for gameplay, but the criminal value of knowing where your head or hands are positioned is negligible.

Currently no way to monetize a VR attack

Even with these limitations, attackers would still target VR/AR if they could find some way to monetize their attacks – through social engineering, for example. But fortunately for end users, there’s little criminal money to be made from how most early adopters use AR/VR.

Today, VR is most commonly used for gaming. And since gamers already expect to be in a fantasy world disconnected from reality, there’s little opportunity for attackers to alter the VR experience for social engineering either.

On the other hand, real AR currently exists in the form of experimental, non-consumer products like Google Glass and HoloLens, and novelty uses like Pokémon Go. Until AR becomes a ubiquitous, everyday tool, the public won’t trust it enough for attackers to have much opportunity to trick us. In short, AR/VR doesn’t provide a rich enough target for attackers… yet.

Safe present, potentially grim future

So far, I’ve painted a relatively safe picture for AR/VR that will likely hold true for the next five years or so. However, as these technologies improve and become more commonplace, they will pose a bigger danger—especially AR. Here are some examples of what VR/AR hacks of the future might look like:

1. Finer tracking data allows for more dangerous hacks

Imagine the future of online shopping. It could become an entirely VR experience, where users literally browse a virtual storefront, interact with items, and perhaps even try them on an avatar. Of course, the shopping application would already know a person’s credit or debit card details, so when an item is purchased for shipment, it can be processed and sent. However, for added security, online shops could require a user’s avatar to virtually enter some sort of PIN or code to verify the card being used.

In the virtual world, a user might do this the same way they would in the real world, by using their fingers to type the code on a virtual keypad (floating in the air from the user’s perspective). However, doing this means the system must record and transmit the fine-grained finger tracking data showing the fingers typing a PIN. If an attacker can capture that data, they have all they need to recreate the user’s PIN (and they would presumably have some way to capture the card’s digital data too).
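To make the idea concrete, here’s a minimal sketch in Python of how intercepted fingertip tracking data could be mapped back onto a virtual keypad to recover the digits a user typed. The keypad layout, coordinate system, and capture format are hypothetical assumptions for illustration, not any real VR platform’s API.

```python
# Illustrative sketch only: the keypad layout, coordinate system, and capture
# format are hypothetical assumptions, not any real VR platform's API.

# Virtual keypad: key label -> (x, y) center of the key in tracking space
# (meters), laid out like a standard phone keypad.
KEYPAD = {
    "1": (-0.06, 0.12), "2": (0.00, 0.12), "3": (0.06, 0.12),
    "4": (-0.06, 0.06), "5": (0.00, 0.06), "6": (0.06, 0.06),
    "7": (-0.06, 0.00), "8": (0.00, 0.00), "9": (0.06, 0.00),
    "0": (0.00, -0.06),
}

def closest_key(x, y):
    """Return the keypad key whose center is nearest to a fingertip position."""
    return min(KEYPAD, key=lambda k: (KEYPAD[k][0] - x) ** 2 + (KEYPAD[k][1] - y) ** 2)

def recover_pin(press_events):
    """press_events: list of (x, y) fingertip positions captured at the moment
    the tracking data shows a 'press' gesture. Returns the inferred PIN."""
    return "".join(closest_key(x, y) for x, y in press_events)

# Example: four intercepted press positions reconstruct the typed code.
captured = [(-0.058, 0.121), (0.002, 0.058), (0.061, 0.001), (0.001, -0.059)]
print(recover_pin(captured))  # -> "1590"
```

The point isn’t the specific math – it’s that once fine finger tracking leaves the headset, reconstructing what was typed is trivial.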

The future of AR/VR headsets may also include eye tracking, for a variety of reasons (one being to render a virtual world from the proper perspective). This eye tracking data could provide additional value to malicious actors. Knowing exactly what a user is looking at could reveal valuable information to an attacker. For instance, if a user were browsing an online shop, the eye tracking data would show what the person is most interested in. Marketers already use web-coding tricks to monitor mouse movements and clicks to help figure out buying habits and interests. Attackers who capture eye tracking data could recreate user actions, in the same way the manual PIN entry could be mimicked in the example above.
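As a rough illustration of why gaze data is valuable, here’s a small sketch that totals how long a user’s gaze dwells on each item in a virtual storefront – essentially the VR equivalent of the mouse-tracking analytics marketers already use. The gaze sample format and product regions are invented for demonstration; no real eye-tracking API is assumed.

```python
# Illustrative sketch: gaze samples and product regions are invented for
# demonstration; no real eye-tracking API is assumed.

# Each product occupies a rectangular region on a virtual shelf:
# name -> (x_min, x_max, y_min, y_max) in the store's coordinate space.
PRODUCTS = {
    "headphones": (0.0, 1.0, 0.0, 1.0),
    "smartwatch": (1.0, 2.0, 0.0, 1.0),
    "camera":     (2.0, 3.0, 0.0, 1.0),
}

def dwell_times(gaze_samples, sample_interval=0.02):
    """gaze_samples: list of (x, y) gaze points captured every sample_interval
    seconds. Returns seconds of attention per product."""
    totals = {name: 0.0 for name in PRODUCTS}
    for x, y in gaze_samples:
        for name, (x0, x1, y0, y1) in PRODUCTS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                totals[name] += sample_interval
                break
    return {name: round(t, 2) for name, t in totals.items()}

# A stream of gaze points lingering mostly over the smartwatch reveals
# what the user is actually interested in.
samples = [(1.4, 0.5)] * 150 + [(0.3, 0.4)] * 20 + [(2.6, 0.7)] * 30
print(dwell_times(samples))  # {'headphones': 0.4, 'smartwatch': 3.0, 'camera': 0.6}
```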

2. Warping augmented reality

In the future, users may wear some sort of AR device that overlays a digital heads-up display (HUD) over their real-life field of vision. Hands could be used to gesture and pin computer screens and browser windows to various walls in the real world. Virtual speed limit signs could pop up to warn drivers of excessive speed. Contact information could remind users about acquaintances they meet (or even provide initial info for first-time interactions). A user could even get instant calorie information when handling food items in a grocery store. As the public begins to use AR technology more regularly, this type of information will seem more like reality, causing users to trust it more. But if an attacker can hack these AR devices, it presents a huge opportunity to poison this information and get end users into trouble.

For example, what if an attacker wanted to hurt someone? Imagine a person driving in an unfamiliar town, unaware that they are approaching a tight turn on the freeway. The turn has a physical warning sign telling drivers to drop their speed to 25 mph for the corner. If an attacker could gain control of the driver’s AR system, they could overlay that 25 mph sign with “60 mph,” putting the driver in the dangerous situation of hitting the corner far too fast.
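As a sketch of where in the pipeline such an attack could sit, consider an AR HUD that recognizes road signs and renders an annotation for each one; a single compromised rendering step is all it takes to change what the driver sees. The pipeline stages, class names, and values below are hypothetical, not based on any real AR SDK.

```python
# Illustrative sketch: the recognition output format and rendering hook are
# hypothetical; this is not based on any real AR SDK.

from dataclasses import dataclass

@dataclass
class RecognizedSign:
    kind: str        # e.g. "speed_limit"
    value: str       # e.g. "25 mph", as read from the physical sign
    position: tuple  # where to draw the overlay in the driver's view

def render_overlay(sign: RecognizedSign) -> str:
    """Honest HUD behavior: annotate the sign with what was actually recognized."""
    return f"[HUD @ {sign.position}] SPEED LIMIT {sign.value}"

def compromised_render_overlay(sign: RecognizedSign) -> str:
    """A tampered rendering step silently rewrites the value the driver sees."""
    if sign.kind == "speed_limit" and sign.value == "25 mph":
        sign = RecognizedSign(sign.kind, "60 mph", sign.position)
    return render_overlay(sign)

sign = RecognizedSign("speed_limit", "25 mph", (0.4, 0.2))
print(render_overlay(sign))              # what the HUD should show
print(compromised_render_overlay(sign))  # what a hijacked HUD shows instead
```

The physical sign never changes – only the trusted digital layer between the sign and the driver’s eyes does.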

And that’s just the start of what they could do if humanity becomes “programmed” to trust their AR HUD. Frankly, once AR becomes advanced enough to overlay anything we see with real-time CGI, and we use AR as a second-nature, trusted “sense,” there is no limit to how hackers might alter our realities.

3. Your perfect digital VR clone

The ideal future of VR depends on a few things. First is full-body digital tracking, where literally every movement of every appendage is tracked in fine detail and recreated in digital space. Second is the perfect digital avatar. In the future, the cameras or devices tracking us will quickly map our physical 3D form, apply our texture to that model, and make a perfect replica in a virtual world. This might sound like sci-fi, but it’s closer than you think.

However, imagine the social engineering possibilities if malicious actors got hold of a person’s 3D model and a history of all their movements in VR. Animators and computer scientists have already created many methods to make a person sound like they said something they didn’t, based on previous recordings of their voice. They can even alter video of a person to give them different expressions and lip movements. In fact, you can see a scary example of this on a site created by NPR’s Radio Lab.

While these fake videos haven’t been perfected yet, imagine how VR tracking data and accurate 3D models could change things. One of the things that identifies an individual is their distinctive movements and verbal or physical tics. If compromised, these personal intricacies could allow hackers to socially engineer a person’s friends, or digitally impersonate the user.
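To give a feel for how distinctive movement can be, here’s a toy sketch that boils a user’s tracking history down to a simple signature – the same signature an impostor would need to reproduce to pass as that user. The feature choices, thresholds, and data format are my own assumptions; real behavioral biometrics for VR would be far more sophisticated.

```python
# Toy sketch: features and thresholds are invented for illustration; real
# behavioral biometrics for VR would be far more sophisticated.

import math

def movement_signature(head_positions):
    """head_positions: list of (x, y, z) head positions sampled over a session.
    Returns a crude signature: average step distance and average height."""
    steps = [math.dist(a, b) for a, b in zip(head_positions, head_positions[1:])]
    avg_step = sum(steps) / len(steps)
    avg_height = sum(p[2] for p in head_positions) / len(head_positions)
    return (avg_step, avg_height)

def same_person(sig_a, sig_b, tolerance=0.05):
    """Naive comparison: signatures within tolerance look like the same user."""
    return all(abs(a - b) <= tolerance for a, b in zip(sig_a, sig_b))

# A stolen history of someone's sessions lets an attacker both recognize the
# victim elsewhere and tune an avatar to move the same way.
victim_session  = [(0.0, 0.0, 1.72), (0.1, 0.0, 1.71), (0.2, 0.1, 1.73)]
attacker_replay = [(5.0, 2.0, 1.72), (5.1, 2.0, 1.71), (5.2, 2.1, 1.73)]
print(same_person(movement_signature(victim_session),
                  movement_signature(attacker_replay)))  # True
```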

Conclusion

You really shouldn’t worry about the security of AR/VR today. Most people are only using these technologies for entertainment, and they don’t yet introduce much new data or attack surface. The worst VR risk today is being unaware of your real physical surroundings. That said, as these technologies mature, expect criminals to target them more. I look forward to us realizing AR and VR’s full potential, but we should head into this future with our defenses ready.
