This new age of Artificial Intelligence is fascinating, and a little terrifying too. From ubiquitous digital assistants like Siri and Alexa to deployments on factory floors, the impact of AI is dizzying.
Your cell phone most likely features a voice assistant. At best, Apple's Siri, for instance, may seem like a harmless helper that can check the weather, order pizza, or even read you a bedtime story. At worst, the assistant can make your device susceptible to data theft and malicious attacks.
Just recently, researchers at Zhejiang University in China revealed that voice assistants are prone to eavesdropping by hackers using inaudible voice commands. Given the gigantic leap that voice assistants have made in the tech sphere, we can't (and must not!) rule out the possibility that this kind of data hacking will keep climbing.
Even more disturbing is the reality that bot attacks spare no device, from smart home appliances to smartphones. So this brings us to the big question: has the popularity of voice assistants come at the cost of security? In this article, we take an in-depth look at how cyberattacks affect users of voice assistants, and how you can leverage SAST solutions from Kiuwan to secure your applications and, more importantly, your code.
Most digital assistants out there today require a wake phrase to activate, such as "Alexa" or "Hey Siri," which tells them the user is ready to ask a question. As such, a similar utterance, on TV or radio, can accidentally wake the assistant. While this might seem harmless, once the voice assistant is awake, it records everything and stores it on its servers. So if you forget to delete a saved audio file, botnets may sniff out and grab confidential data in a jiffy. However, this is just the tip of the iceberg.
Here, we dive into the top vulnerability points you should be aware of when using voice assistants.
A couple in the US recently discovered that their Alexa-based Amazon Echo Home Hub had picked up one of their private conversations and sent it to a person in the husband's contact list. In its defense, Amazon responded that the mishap was due to "an unlikely string of events" and conceded that the device had misinterpreted the speech, as most digital assistants tend to do. The exchange was harmless, apparently about flooring, but imagine if it had been more delicate.
Digital assistants like Google Assistant, Alexa, and Siri use voice recognition as their primary interface. This means they are always listening, even when not in use. According to researcher Mark Barnes, a hacker can turn any voice assistant into a covert listening device to harvest confidential data.
What's worse than waking up to find your most intimate photos flooding social media? Researchers from Washington University recently uncovered a new trick to infiltrate voice assistants through inaudible ultrasonic waves. Dubbed "Surfing Attacks," this method can exploit a variety of smartphone features (on both iPhone and Android), from making phone calls to reading messages, without touching the device.
Using a tapping device, a signal processing module, and an ultrasonic transducer, hackers can propagate voice command signals into your device through mechanical coupling. The assistant "thinks" you are issuing a command and proceeds to give up critical information.
But the worst-case scenario of this technique is yet to unfold. Experts warn that these high-frequency commands (above 20 kHz) could hijack your entire digital system. They could download a virus, add false events to your calendar, or, even worse, send unsolicited messages.
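The physical principle behind such inaudible-command attacks can be illustrated numerically. The sketch below is a deliberately simplified model, not the researchers' actual attack code: a 1 kHz "voice" tone is amplitude-modulated onto a 25 kHz carrier, so the transmitted signal carries no audible energy, yet a square-law term (a stand-in for the slight nonlinearity of a MEMS microphone) demodulates the envelope and the audible command reappears inside the device.

```python
import math

def goertzel_power(x, freq, fs):
    """Signal power at a single frequency bin (Goertzel algorithm)."""
    w = 2 * math.pi * freq / fs
    coeff = 2 * math.cos(w)
    s1 = s2 = 0.0
    for sample in x:
        s1, s2 = sample + coeff * s1 - s2, s1
    return s2 * s2 + s1 * s1 - coeff * s1 * s2

fs = 96_000          # sample rate high enough to represent ultrasound
n = 9_600            # 0.1 s of signal
f_voice = 1_000      # audible "command" tone
f_carrier = 25_000   # inaudible ultrasonic carrier (> 20 kHz)

# Transmitted signal: the voice tone amplitude-modulated onto the
# carrier. All of its energy sits near 25 kHz -- nothing audible.
tx = [(1 + math.cos(2 * math.pi * f_voice * t / fs))
      * math.cos(2 * math.pi * f_carrier * t / fs) for t in range(n)]

# Simplified microphone nonlinearity: a square-law term. Squaring
# demodulates the envelope, so the 1 kHz command reappears.
rx = [s * s for s in tx]

print(goertzel_power(tx, f_voice, fs) < 1e-3)  # True: inaudible in the air
print(goertzel_power(rx, f_voice, fs) > 1e6)   # True: recovered in the mic
```

In other words, the attack doesn't need the speaker to produce audible sound at all; the victim's own microphone hardware does the demodulation.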
Who doesn't love a bit of privacy with their devices? And who enjoys changing passwords every now and then?
Sadly, with hackers breathing down our necks, authentication (and verification) continues to be a pitfall.
Researchers recently highlighted the absence of robust user authentication in most digital assistants. As a result, hackers can meddle with a variety of smart cars, smart home systems, and other devices.
Enter ‘LightCommands’.
LightCommands exploits flaws in MEMS microphones by injecting sound-like signals into the assistant's microphone using laser beams. The attacker doesn't have to be close to the target device; they can aim the laser from a separate building, from as far as 110 m away.
Additionally, many smart devices and programs don't use end-to-end encryption, leaving essential data open to mining by third parties. For instance, the Allo messaging app, a Google product, supports voice assistants but doesn't use end-to-end encryption by default.
A cybersecurity expert at Network Box, Michael Gazeley, shares the following insight:
"Most IoT devices are a hacker's dream; each smart device is potentially another way into your home, to access data, abscond with your money, or steal your identity."
And we couldn’t agree more.
Voice assistants do offer a lot of convenience. Yet, for some reason, products built on the IoT (Internet of Things) end up as honeypots for rogue access.
Smart assistants force us to confront a challenging question: can intelligent assistants be hacked?
The answer is essential, since your privacy and personal information may be at stake.
The answer is yes: under certain conditions, hackers can exploit digital voice assistants' vulnerabilities. But there are a couple of steps you can take to protect your information.
It's important to filter the kind of information you feed your voice assistant; what it doesn't know can't hurt you. Start by learning how to configure its settings. Remember that you may not know who has access to your data, and a voice assistant compiles and stores a great deal of it. Limiting these loopholes is the first vital step to prevent hackers from hauling away your personal data.
Here are more tips to help you stay more secure:
It's imperative not to connect security functions, such as a door lock or surveillance camera, to your voice assistant. You wouldn't want a thief to shout "Open the door," only for your digital assistant to oblige! Moreover, disconnect features that link to your address or your calendar, which are often rich sources of data.
Your credit card info, your passwords, and other credentials are just some things you don’t want your voice assistant to know.
Smart assistants let you delete stored commands and review past recordings. This is an effective way to wipe any critical data you don't want lingering around. Remember, Siri or Alexa can always "re-learn" commands, and quickly too.
Mute your voice assistant the next time you look away, go shopping, or nap. That’s the easiest way to get it to stop listening.
Smart assistants can often run purchasing errands, so any hacker sniffing the device can make a purchase, and that could be disastrous. The solution? Set up purchase credentials and keep them secret.
Rather than an open hotspot, use WPA2-encrypted Wi-Fi. If you occasionally have guests over, create a guests-only Wi-Fi network, and put unsecured IoT devices on that network too.
You may have heard this a gazillion times, but configuring your digital assistant for voice recognition is an effective way to deter hackers. Tune it so that it only responds to your voice.
To prevent remote intrusions like those demonstrated in the "Surfing Attacks" research, use two-factor authentication to beef up your device's security.
Voice assistants carry the risk of regular cyberattacks, and hackers will stop at nothing to exploit any vulnerability. The best way to protect your device is to mind the kind of information you share with it.
For developers who want to secure their code from hacking, sniffing, eavesdropping, or any other type of cyberattack, DevSecOps is the way to go. Kiuwan enables you to leverage SCA and SAST solutions to deliver high-end security to your applications. Our tried-and-tested scanning tool not only detects flaws in software inputs and outputs but also conducts efficient testing. Add to that our capability to provide your software with lightning-fast, scalable shielding within any DevOps ecosystem, and you're sure to avoid recurrent security issues now and into the future. To learn more, request a trial today.
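To give a feel for what static analysis means in practice, here is a toy example (not Kiuwan's actual engine): a SAST tool parses source code rather than running it, then flags patterns known to be risky. The sketch below uses Python's `ast` module to report calls to `eval`, `exec`, and `os.system`, each a classic injection sink; real scanners track data flow and cover far more rules.

```python
import ast

# A toy deny-list; real SAST rule sets are far larger and data-flow aware.
DANGEROUS_CALLS = {"eval", "exec", "os.system"}

def scan_source(source: str) -> list[tuple[int, str]]:
    """Return sorted (line, call-name) pairs for dangerous calls."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            fn = node.func
            if isinstance(fn, ast.Name):                 # e.g. eval(...)
                name = fn.id
            elif (isinstance(fn, ast.Attribute)
                  and isinstance(fn.value, ast.Name)):   # e.g. os.system(...)
                name = f"{fn.value.id}.{fn.attr}"
            else:
                continue
            if name in DANGEROUS_CALLS:
                findings.append((node.lineno, name))
    return sorted(findings)

snippet = """
import os
user_input = input()
eval(user_input)        # injection risk
os.system(user_input)   # command injection risk
"""
for line, name in scan_source(snippet):
    print(f"line {line}: dangerous call to {name}()")
```

Because the analysis never executes the code under review, checks like this can run on every commit in a CI pipeline, which is exactly where DevSecOps puts them.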