News is circulating worldwide that an employee of a multinational company fell victim to unknown fraudsters who, using artificial intelligence, tricked him into transferring a total of 200 million Hong Kong dollars (around 24 million euros) to their bank accounts.
The Hong Kong police disclosed the case and explained how the scam unfolded in several phases. First, the employee (who, like the defrauded company, remains anonymous) received an email from the head of the finance department at the company's UK base.
The message asked him to secretly carry out a series of large transfers to various accounts.
Initially suspicious of what he thought was a phishing email, the man was persuaded to move the funds after being invited to a video conference call apparently attended by both the manager and several colleagues.
The meeting seemed entirely normal, especially since the man recognized the participants, yet every one of them turned out to be a digital clone generated with AI.
It is a disturbing story, which the Hong Kong police chose to recount at a press conference in order to warn the broadest possible public about this type of fraud, which appears to be multiplying in the region.
The operation is striking above all because what convinced the man was precisely the credibility of the clones he encountered on the video call, which he said perfectly resembled the real people in both appearance and voice.
Nor is this an isolated case: Hong Kong authorities report that such scams are on the rise and that at least six arrests have recently been made in connection with crimes of this kind.
In at least 20 other instances, deepfakes were used, together with stolen identity documents, to deceive and bypass facial recognition systems.
Between July and September last year, around 90 loan applications and 54 bank account registrations were made using this technique.
Reproduction reserved © Copyright ANSA