Not in Their Own Voice: Central Bank Warns Russians That Scammers Have Learned to Use Deepfakes

Scammers are increasingly using deepfakes to swindle money out of Russians

Scammers in Russia are increasingly using deepfake technology to steal money. With the help of neural networks, fraudsters create a realistic video likeness of a person and then send messages on that person's behalf to their relatives and friends, the Central Bank's press service reported.

Even famous artists fall for these tricks. Recently, scammers used a deepfake of Larisa Dolina to take out a loan of 50 million rubles.

The regulator emphasized that a message from an acquaintance asking for financial help is often a scammer's ploy, and compiled a list of recommendations to avoid falling for it:

  • be sure to call the person on whose behalf the money is being requested and verify the information
  • if it is not possible to call, ask a personal question in the message that only this acquaintance knows the answer to
  • check the video message for audio defects, monotonous speech, and unnatural facial expressions; these are the most noticeable signs of a deepfake

Earlier it became known that fraudsters have begun disguising malicious files as neural-network-based voice-changing applications. Such a fake AI voice-changing app steals Russians' personal data and gives scammers access to their smartphone or other device.

Read materials on the topic:

AI scammers extort money using fabricated voices of the SUSU leadership

Telegram hacking with AI: how scammers in Russia use your voice and passport. Requests to transfer money will sound in your own voice, and recipients will most likely believe them

The Central Bank and the Ministry of Internal Affairs want to create an anti-fraud platform to combat scammers, while telecom operators fear losing money