You’ve probably heard about the ‘friends’ who, late at night, yell “Alexa, play Slayer at full volume!” through a neighbour’s letterbox. It’s an amusing story, and it has the ring of authenticity because, as a security breach, it’s not only plausible but entirely possible. As a prank it’s annoying but harmless. What it does highlight, though, is one of voice control’s most obvious and easily exploited vulnerabilities: if you don’t train Alexa or Google Assistant to respond to your voice alone, anybody within earshot can make them do things.
For more than two years, Google and Amazon have also been aware of another chink in the armour surrounding voice-responsive technology: the microphones are so sensitive that they pick up frequencies beyond human hearing, meaning that commands loud and clear enough to make smart speakers and home hubs do anything can be broadcast without the owner ever being aware.
While you can train your device to respond only to your voice and the voices of members of your household, without proper security there’s always a risk that your smart home devices can be used to act fraudulently. This can include anything from buying goods and services using your authenticated account, to buying non-existent goods from an online marketplace, to making calls to premium-rate numbers which the hacker owns, funnelling money from your bank account, via your phone bill, PayPal, or credit card, into theirs.
You may have stopped Alexa from paying for things without your consent, either because you’re security conscious or because you don’t trust the kids not to buy extravagant skins and skills on Fortnite, but that won’t stop it from accessing and downloading malware if that’s what a command tells it to do. Once malware or another security violation has been set in motion, it’s entirely possible for anyone so inclined to take over your home security devices.
If you’ve been living with Alexa for any length of time you’ll know that she can’t distinguish between the radio, TV, YouTube, and real voices. If she mishears her name she’ll start answering to the best of her ability, and researchers in China found that inaudible commands are just as effective as those spoken in the human audible range. Those commands included taking photos via webcams, opening spyware, downloading viruses, listening via microphones, and starting phone calls over the net. The researchers found that the attacker had to be very close to the device for the attack to work; however, if the command is incorporated into the media the device is playing, say in a video or an ad during a podcast, and no other security is set up, your smart speaker has no choice but to obey.
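To make the mechanism concrete, here’s a minimal, illustrative sketch in Python of the signal-processing idea behind the inaudible-command research: the audible ‘voice’ is amplitude-modulated onto an ultrasonic carrier, and a microphone’s own nonlinearity (modelled crudely here as squaring plus a low-pass filter) shifts the command back down into the audible band. The function names and parameter values are mine for illustration, not from any published attack code.

```python
import numpy as np

SAMPLE_RATE = 192_000  # a high rate is needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000    # above typical human hearing (~20 kHz)

def modulate_ultrasonic(voice: np.ndarray) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    The result contains no energy in the audible band, so a human hears nothing.
    """
    t = np.arange(len(voice)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    peak = np.max(np.abs(voice))
    voice = voice / peak if peak > 0 else voice  # normalise to [-1, 1]
    return (1.0 + 0.5 * voice) * carrier         # classic AM

def demodulate_nonlinear(signal: np.ndarray) -> np.ndarray:
    """Model a microphone's quadratic nonlinearity.

    Squaring the signal shifts the modulated voice back to baseband; a simple
    moving-average low-pass then strips the remaining ultrasonic components.
    """
    squared = signal ** 2
    kernel = np.ones(64) / 64
    return np.convolve(squared, kernel, mode="same")
```

Running a 400 Hz test tone through `modulate_ultrasonic` and then `demodulate_nonlinear` recovers a strong 400 Hz component, even though the transmitted signal sat entirely above 24 kHz: the “speaker” (an ultrasonic transducer in the real attack) emits nothing audible, yet the microphone hands the assistant an ordinary voice command.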
It’s not only smart assistants, speakers, and the like that are susceptible to high-frequency assault, and high frequencies aren’t the only way to achieve the same ends. Malicious commands can be ‘hidden in plain sight’ within white noise or general hubbub. You might not hear them, but they’re there nevertheless, and many smart devices, including smartphones and tablets as well as home hubs, will pick them up.
This vulnerability has been known about and experimented with by a variety of institutions, including Zhejiang University in China and U.C. Berkeley, since 2017, but the problem still exists. While Amazon don’t talk about their security activities, they don’t appear to have dealt with the issue yet, beyond making voice recognition useful and easy to set up and use.
As well as being a threat to your online security, these attacks can be a threat to your physical security too. If someone can command your smart hub to access websites, they can equally use the same technique to get access to your home: unlocking doors, turning off security cameras, and erasing CCTV recordings just in case they caught anything.
Naturally there are several ways you can take security into your own hands, something that we at Briant Communications recommend whole-heartedly.
If you’re worried about ultrasonic attacks on your home hub (or even just your mates yelling through the door late at night), make sure you’ve trained your Alexa devices and Google Assistant to recognise only your voice and the voices of your family.
Voice recognition isn’t perfect (or it wouldn’t be vulnerable to ultrasonic attack), so it’s a good idea to require verification for everything. That way, if your device tries to spend your money it will need you to confirm, via a password or security code, that you are indeed you and that the purchase is legitimate.
When your Alexa, Google Assistant, or other home hub is trained to obey specified voices only, it’s much harder for outsiders to control the connected devices within your home. Strangers won’t be able to unlock doors or control security cameras, but those who have been added will still enjoy all the controls that come with smart home technology.
If you’re not sure how to apply rigid security to your home network, check out this article, which gives simple instructions on the basics you should apply as soon as you take your smart home devices out of the box.