The grandchild trick is used by fraudsters to rob the elderly of their savings. AI could significantly increase the risk for pensioners.
Kassel – The grandchild trick is no longer a new phenomenon. Scammers call elderly people, pretend to be relatives, and try to trick their victims into handing over money. They typically rely on dramatic stories that demand immediate action, such as unpaid hospital bills. To avoid falling for the scam, you should be cautious on the phone – especially if money is demanded from the other end of the line. But what if the caller not only claims to be a relative, but also sounds exactly like one?
As the US daily newspaper The Washington Post reported, AI is increasingly being used for fraud. In Canada, an elderly couple received a call from their supposed grandson. He said he was in jail – without a mobile phone or wallet – and urgently needed money to post bail.
Scammers use AI to reproduce the voices of loved ones, making it difficult to detect scam calls. (Symbolic image) © K. Schmitt/Imago
Fraudsters use AI for deceptively real fake calls
Since the voice on the phone sounded exactly as it always did, the elderly couple suspected nothing and decided to help their grandson out. They withdrew money, but could not get the full amount at a single bank, so they went to a second branch. There, an employee noticed the two and warned them – the bank had already seen an incident of this kind.
AI has advanced so rapidly that a few sentences of audio are now enough to reproduce a voice with simple tools, the Washington Post reported. With the help of AI, fraudsters can imitate almost any voice they choose. According to experts, the authorities are not yet in a position to put a stop to the new scam. Victims often have few clues with which to identify a perpetrator, which makes it difficult for the police to investigate such cases.
Fraud with AI – Voices on the phone sound deceptively real
But what can you do to avoid becoming a victim of AI fraud? There are several ways to protect yourself from scam calls. WDR recommends, first of all, staying calm. A certain amount of suspicion is warranted, and money should never be handed over to strangers. In the case of the elderly couple from Canada, for example, they could have insisted on accompanying the supposed grandson to the police station or jail to pay the bail there in person.
Fraudsters who use AI are not the only ones reaching their victims over the phone. In a particularly perfidious scam, criminals pose as investigators from Europol and Interpol. (Kiba)