  • Smart speaker voice apps built by third-party developers can be used to listen in on users or execute vishing (voice-phishing) attacks to extract their passwords.
  • Amazon Echo and Kindle devices were recently found vulnerable to two old KRACK vulnerabilities that can lead to a user credential leak.

Science fiction has finally arrived, and it is here for good—and for bad. In this automated world, we have graciously surrounded ourselves with smart assistants—such as Google Assistant, Alexa, and Siri—that obey our commands. According to a 2018 report, Amazon Echo and Google Home installations have already crossed 50 million. Over the past few years, there have been multiple instances of these devices being exploited to spy on people.

The privacy and security issues with these devices range from voice recording leaks to abuse by malicious actors, who continue to exploit their vulnerabilities to date.

Threats, Threats, Threats

A report by Security Research Labs discusses both phishing and eavesdropping vectors in smart speakers from Google and Amazon; both are exploitable via the devices' backend.

  • Both platforms give third-party developers the ability to extend device voice capabilities with custom commands.
  • Hackers can abuse this capability to listen in on users or vish (voice-phish) their passwords.

In its disclosure note, the report acknowledges that the vulnerabilities were shared with Amazon and Google through their responsible disclosure process.

In separate research, ESET Smart Home researchers found first-generation Amazon Echo and 8th-generation Kindle devices vulnerable to two KRACK vulnerabilities, which can also lead to a user credential leak. First discovered in 2017, KRACK still threatens many Wi-Fi-enabled devices.

According to the ESET team, the vulnerabilities allow attackers to:

  • Decrypt transmitted data and forge packets;
  • Replay old packets to cause a DoS attack or interruptions;
  • Intercept sensitive details such as passwords and session cookies.

According to the report, Amazon has distributed a new version of the wpa_supplicant software application to fix the vulnerabilities. Devices running the update are safe; you can also check the Echo and Kindle settings to confirm you have the latest firmware.

Amazon, Google, and Apple’s Human Vetting Practices

Earlier this year, we also came across not-so-surprising revelations about how the tech giants themselves, or their contractors, listen to recordings without the knowledge of device owners. After whistle-blower reports on Siri recordings and Alexa activations, many security groups raised concerns about the extent of this human review.

  • According to reports, whistle-blowers admitted to hearing couples having sex and criminals making drug deals via Siri and Alexa, respectively.
  • A Google subcontractor also shared more than 1,000 excerpts from Google recordings, which journalists then used to identify some individuals.

How can you protect yourself from someone listening in?

Amazon and Google have offered straightforward options to disable human vetting for their virtual assistants, whereas Apple plans to release a software update that will let people opt into its quality-control program.

Listed below are some ways to minimize the information shared with the companies through various assistant devices.

For Amazon: Alexa devices include a physical button to disable their microphones.

  • Hit this kill switch whenever you are having sensitive conversations; a red light indicates the microphone is switched off.
  • Some Alexa devices (like the Echo Spot alarm clock) also have a built-in camera. Simply say “Alexa, turn the camera off” and it will stop recording.
  • If you want extra assurance, cover the camera with a webcam cover.

Amazon also provides the Alexa privacy hub, which contains a thorough explanation of the types of data collected by the virtual assistant and how its privacy settings can be changed.

For Siri: You can disable Siri on an iPhone to erase your data and reset your identifier.

  • Disabling Siri will also delete the data associated with it, including your recordings.
  • To reset the random identifier, turn Siri off and then back on.

For Google Home: Google Assistant on Android phones and Google Home smart speakers offers several privacy settings you can tweak.

  • Though the human review program is currently inactive, you can still opt out to be sure.
  • You can also set your Google Assistant requests and recordings to delete automatically after a set period of time.
Cyware Publisher