Admit it. Siri is great if you don’t have a free hand to send a text or you need a quick answer to the question, “What is the capital of Vermont?” And with every new update to iOS, Apple continues to refine and expand Siri’s voice-recognition capabilities, in turn giving users a whole new set of reasons to love this slightly monotone, snark-enabled assistant. However, there have been a few bumps along the way. In the past year, a pair of exploits was discovered in Siri’s security that allowed criminals to access personal information or remotely gain control of an iPhone’s functionality. Apple was quick to release updates for some of these issues, but it’s important for engaged smartphone users to understand how these incidents occurred so they can better protect themselves in the future. From Siri to Google Now, here is a brief synopsis of two recent voice-recognition-based security breaches.
As you probably know, Siri is accessible through the lock screen, which leaves hackers with a small window into your phone even if the front door is locked. The flaw was first discovered in iOS 6 by a 37-year-old Spanish soldier named Jose Rodriguez, who posted a YouTube clip of himself bypassing the lock screen to post on Facebook, view notes and access the phone’s call history. The video garnered a lot of attention, and by the time iOS 7 was released, a whole new wave of exploits had been found that added up to one simple conclusion: Siri just isn’t a very good gatekeeper. The good news is that these unwanted intrusions mostly fell under the category of prankster behavior. Unlike with a mobile wallet security breach, your private information remained protected as long as your lock screen had a passcode. Still, users have questioned why Apple would ship a default setting that allows for this kind of incident — one that is still present in the latest update. For now, the burden of protection is on us. The only way to keep someone from accessing your personal information through Siri is to head to “Settings” and disable her lock-screen functionality. It’s a minor inconvenience that might save you some embarrassment down the road.
The latest voice-recognition security fumble affects both Siri and her Android counterpart, Google Now. As Wired reports, it was discovered by a research team at ANSSI, the agency that handles information security for the French government. It’s almost too clever to be true. The researchers found that hackers could potentially trigger voice commands from up to 16 feet away, as long as a microphone-enabled pair of headphones was attached to the device. The hack uses the headphone cord as an antenna, transmitting electromagnetic waves that Siri interprets as voice commands. If the phone’s owner leaves it unlocked and unattended, a hacker could remotely dial their own number and use the phone as an eavesdropping device. They could even manipulate the browser or email application to download and send malware or spam, or phish for personal data. In a crowded location like a stadium or airport, this exploit has the potential to affect several users with one attack. Although it hasn’t been remedied with a software update from either Apple or Google, you can take steps to protect your phone in the meantime. Never leave headphones plugged into your phone when you aren’t using it, and disable voice-recognition services when the phone is locked.