Why would you ever trust Amazon’s Alexa after this?


Proficient, but not necessarily reliable?

Amazon

I was just wondering the other day if it would be nice to have a cuckoo clock in my kitchen.


An Amazon Alexa-powered cuckoo clock, that is.

I came to the conclusion that the idea was bonkers, just like most Alexa activities.

But we all have our prejudices, and many Americans are very happy to scatter Amazon's Echos and Dots around their homes to make their lives easier.

Alexa can even do your shopping for you if you want.

But perhaps Alexa lovers should be warned that things may not be as enjoyable as they seem.

Skills? Oh, everyone has skills.

New research from concerned academics at the Ruhr University Bochum in Germany, along with colleagues from North Carolina who are equally concerned – and even a researcher who joined Google during the project – may just make Alexa owners wonder about the true meaning of an easy life.

The researchers looked at 90,194 Alexa skills. What they found was a security Emmenthal that would make a mouse wonder whether there was any cheese left at all.

How much do you want to shiver, oh happy Alexa owner?

How about this sentence from Dr. Martin Degeling: "A first problem is that, since 2017, Amazon has partially activated skills automatically. Previously, users had to agree to the use of each skill. Now they hardly have an overview of where the answers Alexa gives come from and who programmed them in the first place."

So the first problem is that you have no idea where your smart answer comes from when you wake Alexa out of her sleep. Or, indeed, how safe it was to ask your question at all.

Ready for another quote from the researchers? Here you go: "When a skill is published in the Skill Store, it also displays the name of the developer. We found that developers can register themselves with any business name when they create their developer account with Amazon. This makes it easy for an attacker to impersonate any well-known manufacturer or service provider."

It's the kind of thing that comes to mind when big companies are hacked – and don't tell anyone for months or even years.

These researchers actually tested the process for themselves. "In an experiment, we were able to publish skills in the name of a large company. Valuable information from users can be tapped here," they said modestly.

This finding was also startling. Yes, Amazon has a certification process for these skills. But "there is no restriction on changing the backend code, which may change at any time after the certification process."

In essence, a malicious developer could modify the code after certification and start siphoning off sensitive personal data.

Security? Yes, it's a priority.

Then, say the researchers, there are the skills developers who publish under a false identity.

Maybe this all sounds too dramatic, though. Surely all of these skills have privacy policies that spell out what they may and may not do.

Please sit down. From the research: "Only 24.2% of skills have a privacy policy." Which means more than three-quarters of them don't.

Don't worry, though, because it gets worse: "For certain categories such as 'children' and 'health and fitness,' only 13.6% and 42.2% of skills, respectively, have a privacy policy. Children's and health-related skills should be held to higher standards of data privacy."

Of course, I asked Amazon what it thought of these somewhat chilling findings.

An Amazon spokesperson told me: "The security of our devices and services is a top priority. We conduct security assessments as part of skills certification and have systems in place to continuously monitor live skills for possible malicious behavior. Any offending skills we identify are blocked during certification or quickly deactivated. We are constantly improving these mechanisms to further protect our customers."

It's nice to know that security is a top priority. Keeping customers amused by as many Alexa skills as possible – so that Amazon can collect as much data as possible – is, perhaps, a higher priority.

The spokesperson nevertheless added: “We appreciate the work of independent researchers who are helping to bring potential issues to our attention.”

Some might translate that as: "Darn it, they're right. But how do you expect us to monitor all these little skills? We're too busy thinking big."

Hey, Alexa. Does anyone really care?

Of course, Amazon believes its monitoring systems work well at identifying actual wrongdoers. Somehow, though, that's not quite the same as expecting developers to stick to the rules in the first place.

I also understand the company believes that skills for children often aren't linked to a privacy policy because they don't collect personal information.

To which one or two parents might mumble, "Uh-huh?"

Finally, like so many tech companies, Amazon prefers that you monitor and change your own permissions, because that's very cost-effective for Amazon. But who really has those monitoring skills?

This research, presented at the Network and Distributed System Security Symposium last Thursday, offers such candid reading that at least one or two Alexa users may stop to consider what they've been doing. And with whom.

Does the majority really care, though? Until something unpleasant happens, most users just want an easy life, entertaining themselves by talking to a machine that can turn off the lights for them.

After all, this isn't even the first time researchers have exposed the vulnerability of Alexa skills. Last year, academics tried to upload 234 policy-breaking Alexa skills. Alexa, tell me, how many were approved? Yes, every single one.

The latest researchers have themselves been in contact with Amazon, to offer a kind of "Hey, look at this."

They say: “Amazon has confirmed some of the issues to the research team and says it is working on countermeasures.”

I wonder what skills Amazon is using to achieve this.
