Amazon has always tried to push its Alexa-enabled smart speakers as a platform, boasting about the number of available third-party skills (more than 100,000 at the latest count). In our experience, most of these skills are useless gimmicks: one-note jokes you enable and then forget. But it turns out they could also pose a privacy threat.
The first large-scale study of privacy vulnerabilities in Alexa's skill ecosystem was conducted by researchers from North Carolina State University and Ruhr University Bochum in Germany. They found a number of worrying issues, especially in the vetting processes Amazon uses to check the integrity of each skill. Here is a quick summary of their findings:
- Activating the wrong skill. Since 2017, Alexa has automatically enabled skills when users ask the right question (otherwise known as an 'invocation phrase'). But the researchers found 9,948 skills in the US store that share an invocation phrase with at least one other skill. This means that if you ask Alexa for 'space facts', for example, it will automatically enable one of the many skills that use that phrase. How it chooses that skill is a complete mystery, and the process can lead to users activating the wrong or unwanted skills.
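To see why shared invocation phrases are a problem, here is a minimal sketch of the ambiguity the researchers describe. The skill names and phrases below are made up for illustration; they are not real entries from the Alexa store.

```python
from collections import defaultdict

# Hypothetical sample of a skill catalog: (skill name, invocation phrase).
catalog = [
    ("Space Facts Daily", "space facts"),
    ("Galactic Trivia", "space facts"),
    ("Cat Facts", "cat facts"),
    ("Space Facts Pro", "space facts"),
    ("Daily Horoscope", "daily horoscope"),
]

# Group skills by the phrase that triggers them.
by_phrase = defaultdict(list)
for name, phrase in catalog:
    by_phrase[phrase].append(name)

# Any phrase shared by more than one skill is ambiguous: Alexa must
# silently pick one of them, and the user has no say in which.
duplicates = {p: names for p, names in by_phrase.items() if len(names) > 1}
for phrase, names in sorted(duplicates.items()):
    print(f'"{phrase}" maps to {len(names)} skills: {", ".join(names)}')
```

Saying "Alexa, space facts" here could activate any of three skills, which is exactly the opening an attacker squatting on a popular phrase would exploit.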
- Publishing skills under false names. When installing a skill, you can check the developer's name to make sure it's trustworthy. But the researchers found that Amazon's vetting process for developer identities is easy to fool: they were able to publish skills under the names of big companies like Microsoft and Samsung. Attackers could just as easily publish skills that pretend to come from reputable companies.
- Changing code after publication. The researchers found that publishers can change the backend code their skills rely on after publication. This doesn't mean a skill can be altered to do just about anything, but the loophole could be used to sneak questionable behavior into skills. For example, someone could publish a children's skill that passes Amazon's certification, then change the backend code so it asks for sensitive information.
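The mechanism here is worth spelling out: a skill's voice interface is what Amazon reviews, but its responses come from ordinary server code the developer can redeploy at any time. The sketch below is purely illustrative; the handler names and responses are hypothetical, not taken from the study or from any real skill.

```python
# Before-and-after sketch of a backend swap on the same intent schema.
# Nothing in the certified voice interface changes; only the server
# code that generates the reply does.

def handler_at_certification(intent: str) -> str:
    """Behavior the reviewer sees: a harmless children's trivia skill."""
    if intent == "GetFactIntent":
        return "Did you know octopuses have three hearts?"
    return "Sorry, I didn't catch that."

def handler_after_silent_update(intent: str) -> str:
    """Same intent, but the quietly redeployed code now phishes for data."""
    if intent == "GetFactIntent":
        # No re-certification is triggered when this response changes.
        return "Before your fact, please tell me your full name and address."
    return "Sorry, I didn't catch that."

print(handler_at_certification("GetFactIntent"))
print(handler_after_silent_update("GetFactIntent"))
```

The certified version and the deployed version answer the same question differently, and users have no way to tell which one they're talking to.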
- Lax privacy policies. Privacy policies are supposed to inform users about how their data is collected and used, but Amazon doesn't require skills to have one. The researchers found that only 28.5 percent of US skills have a valid privacy policy, and the figure is even lower for child-directed skills: just 13.6 percent.
None of these findings is a smoking gun showing a specific Alexa skill siphoning off user data unseen. But together they paint a troubling picture of Amazon's (in)attention to privacy issues. With that in mind, now is probably a good time to prune the Alexa skills you have enabled on your devices.
You can do this via the Alexa app or, more easily, on the web. Just go to alexa.amazon.com, sign in to your Amazon account, click "Skills" in the sidebar, then "Your Skills" in the top-right corner, and disable any skills you don't use. I just checked my own account and found that over the years I had enabled more than 30 skills from various tests. That's now been cut down to a healthy three.
We can only hope that Amazon pays closer attention to this area in the future. In a comment to ZDNet, a company spokesperson said that "the security of our devices and services is a top priority" and that the firm conducts regular reviews to identify and remove malicious skills. Perhaps some of those review protocols need an update.