When it comes to protecting your mobile data, certainty is hard to come by. From phishing and fake hotspots to malicious apps and compromised websites, your phone is vulnerable to a variety of remote threats. And unfortunately, hacking techniques only stand to grow more sophisticated with each passing day.
The latest threat to be uncovered by security researchers from Georgetown University and UC Berkeley demonstrates this growing vulnerability rather effectively. Through your phone’s personal assistant – such as Google Now or Siri – a hacker can infiltrate your phone with a slew of unintelligible commands. If you leave your phone in an “always-on” mode, you could potentially be affected by this threat.
What this mode means is that your phone listens for voice input on a continuous basis. In other words, all it takes to issue a command is your voice; you don’t need to press a button or open an app. And to the researchers at Georgetown and Berkeley, this could eventually become a major problem.
Their research consisted of a smartphone positioned more than ten feet away from a speaker issuing a set of commands. The commands did not sound like legitimate language; they were garbled to the point that even an attentive listener could not decipher them. Yet after the speaker issued the commands, the phone’s personal assistant was able to open a webpage. This occurred both with and without background noise present.
While an opened webpage may not seem so bad on its own, the possibilities here are troubling. The website could be malicious, and malware could subsequently be installed on your phone. The researchers even suggest these commands could trigger the download of a malicious app or post embarrassing content to your social media profiles.
These commands could potentially reach you through a popular YouTube video or even over a loudspeaker at a major public event. While the researchers admit you would likely notice the unintelligible sounds and immediately become suspicious, that may not always be the case. They are currently working on ways to hide these sounds entirely, so that not only would you fail to understand the commands, you wouldn’t hear them at all.