The Limited Times

Now you can see non-English news...

Amazon Echo and Google Home: Berlin Hackers Turn Smart Speakers into Bedbugs

2019-10-20T18:25:38.732Z


Berlin security researchers have developed apps that allow users of an Amazon Echo or Google Home to listen in unnoticed under certain circumstances. The controls of both companies had failed.



Smart speakers should be useful friends and helpers, not secret listeners. But researchers from the Berlin Security Research Labs (SRLabs) have found clever ways to listen to the users of an Amazon Echo or Google Home unnoticed. Their methods require a certain carelessness of the victims, but are explosive enough to force both companies to react.

Starting point are the so-called skills for Alexa or the actions for the Google Assistant, so the external apps for the virtual assistant in the smart speakers. Luise Frerichs and Fabian Bräunlein, security researchers at SRLabs, have developed seemingly innocent skills and actions and passed them through the security controls of Amazon and Google. In fact, these do other things than the users expect.

An Alexa skill, for example, superficially turns out to be a horoscope app. At an appropriate voice command, Amazon's assistant asks for the sign of the zodiac and then reads out a horoscope. Users can abort this with the command "Alexa, Stop", after which the app says good bye.

But after Amazon's mandatory control and approval of the skill changed Frerichs and Bräunlein the code in the background, without it had a re-examination result. The command "Stop" then ensures that users hear a "goodbye", but the program remains active. Users who do not notice and then start a conversation within earshot are intercepted.

Inadvertently, they give the command themselves: The fake horoscope skill starts recording as soon as someone says "I" or some other security researcher's word that suggests personal information is following. But it could be a whole catalog of words that would enable listening, up to 2500 words are called in developer forums.

You can see it in this video.

For Google Assistant, Frerichs and Bräunlein programmed a random number generator. Also their code changed after the review and release by Google. On command, the generator outputs a random number and says goodbye with a tone that every user would interpret as a kind of switch-off signal.

But even in this case, the speaker or the action remains active. In the code deposited is the speech of unpronounceable Unicode characters, the Google Assistant remains so mute, because he can not read the string. However, he waits for the inaudible speech again on a command from the user. If none comes, the process repeats itself.

If the user says something, the supposed random number generator records this and sends it to the server of Frerichs and Bräunlein. A specific activation word is not necessary. Then the app responds again with a brief silence, after which it waits for further commands from the user, even if he no longer communicates with the loudspeaker.

Again, there is a video of the SRLabs.

In addition, Frerichs and Bräunlein have devised a phishing method to get at the passwords of the Amazon or Google users. The corresponding apps were again changed after the review and release and responded to any question of users with an error message. This said that the corresponding function was currently not available.

After that, the app again read only inexpressible strings, for example, for a minute. After this period of silence, she played a message saying, "There is an important security update for your device, please say 'Start' followed by your password." For inattentive users, that had to sound as if the request came not from the app, but directly from Amazon, as seen in this video.

So, the two SRLabs experts creatively made sure that their loudspeaker apps stayed active and listened in when users thought that the conversation with their devices was actually over for the time being. Nevertheless, several restrictions apply to the interception and phishing attempts.

  • First, among the thousands of apps victims have to find and use the malicious ones. Karsten Nohl, head of SRLabs, told the SPIEGEL: "The victims can not pick a criminal directly, but they can define their target groups via the skills." In this respect, the malicious skills are similar to malicious smartphone apps in an app. Store waiting to be installed ".
  • Secondly, the victims must overlook the fact that active, ie listening, smart speakers are always recognizable by their LED lights - which may well occur depending on location and room lighting or the attention of the users.
  • Third, they must remain within range of the microphones of their speakers, otherwise, of course, the interception does not work.
  • Fourth, in the case of the phishing attempt, the victims should not become suspicious when asked for their password and should say it out loud. Nohl is sure that happens.

Even Amazon and Google, which were informed some time ago by the SRLabs, consider the risk sufficiently realistic enough. An Amazon spokesperson said on SPIEGEL's request, "We've taken protective measures to detect and prevent this type of skill behavior, and skills are denied or removed as soon as such behavior is identified." Amazon did not want to reveal how these protective measures look like.

photo gallery


34 pictures

Smart speakers in the test: Gadgets compared with Alexa and Google Assistant

Google spokeswoman Lena Heuermann wrote on request: "We prohibit and remove any action that violates these guidelines." The actions developed by the researchers have cleared Google. "We use additional mechanisms to prevent this in the future."

For the users of smart home devices, the success of the SRLabs means that some attention in dealing with the comfortable everyday helpers makes sense: If you want to use skills or actions, they should try to make sense of each provider.

You should from time to time check that the LEDs on the speakers only light up when in use. You should know that Amazon and Google would never ask for their Amazon or Google Account password through their smart assistants. And they should enable two-factor authentication to protect their accounts from being taken over by strangers. Amazon's instructions are here, Google's here.

Source: spiegel

All tech articles on 2019-10-20

You may like

Life/Entertain 2024-03-14T20:25:56.376Z
Life/Entertain 2024-03-08T17:08:46.700Z
News/Politics 2024-04-04T15:18:41.155Z
Life/Entertain 2024-03-19T07:19:44.696Z

Trends 24h

Latest

© Communities 2019 - Privacy

The information on this site is from external sources that are not under our control.
The inclusion of any links does not necessarily imply a recommendation or endorse the views expressed within them.