The main weapon is critical thinking: how to recognize a fake and keep scammers from using your image to deceive others

Fraudsters are using AI to create fake voices and faces in order to deceive people and take their money

Modern technologies are increasingly becoming a tool for fraudsters. With the help of artificial intelligence, scammers fake voices and faces to deceive people and extort money. Here is how to spot deepfakes and keep scammers from using your image to deceive others.

"Dad, help!"

"Dad, I've been in an accident, help!" — after these words, the connection was interrupted. Although the phone number on the screen was not recognized, the 84-year-old Muscovite had no doubt that he heard his daughter's voice. He didn't have time to recover when a new call came.

A stranger introduced himself as a witness to the accident and said that the pensioner's daughter had caused it, that people had been injured, and that she now faced up to ten years in prison. He refused to put her on the phone, claiming she was unconscious. He insisted on acting quickly: if the damage was compensated before the police report was drawn up, prison could be avoided.

The pensioner said he had almost one and a half million rubles at home. "That will be enough," the voice on the phone replied. Soon an "investigator's assistant" arrived at the pensioner's apartment, and the owner handed him a bag of money.

A few hours later, the pensioner called his daughter, who said she had not been in any accident. Only then did he realize he had been defrauded.

According to a statement from the Moscow Prosecutor's Office, the attackers used a new technology: a deepfake, a voice or image forged with the help of neural networks. The method is used to deceive the elderly and younger victims alike.

A call from the boss

Another popular fraud scheme built on deepfakes is calls made on behalf of management. Anatoly Kozlachkov, President of the Association of Banks of Russia, said that a computer-generated copy of his voice "scared one very solid bank," and that several top managers received calls in his name asking for a loan.

IT market participants have reported similar cases: in several companies, fake "managers" tried to extract money from rank-and-file employees.

Celebrities have fallen victim to such schemes not only in Russia but abroad as well. It recently emerged that attackers tried to gain access to the funds of prominent Italian businessmen, including fashion designer Giorgio Armani and Prada chairman Patrizio Bertelli.

How to recognize a deepfake?

Fakes used to be easy to spot, but as neural networks have improved, it has become much harder. Even so, modern tools are not perfect, and in some cases the deception can be seen with the naked eye.

One of the main signs of a low-quality, hastily made deepfake is speech that is out of sync with lip movements. Unnatural pupil movement and blinking can also give away a video's artificial origin, explained specialists from the verification department of the Ruptly video agency.

Deepfakes may also contain other anomalies: distorted facial anatomy (the spacing of the eyebrows, the position of the nose), malformed hands (a sixth finger, or one of the five missing), shifting birthmarks, or unnatural-looking facial hair.
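The blinking cue in particular lends itself to automation. Below is a minimal sketch of the idea, assuming OpenCV and MediaPipe's face mesh are installed; the eye landmark indices and the blink threshold are illustrative assumptions, not part of any detector mentioned in this article.

```python
import cv2
import mediapipe as mp

# Six landmarks around the left eye in MediaPipe's 468-point face mesh.
# These indices are commonly used for eye-aspect-ratio measurements;
# treat them as an assumption, not an official constant.
LEFT_EYE = [33, 160, 158, 133, 153, 144]

def eye_aspect_ratio(p):
    # Ratio of eye height to eye width; it drops sharply during a blink.
    p1, p2, p3, p4, p5, p6 = p
    vertical = abs(p2.y - p6.y) + abs(p3.y - p5.y)
    horizontal = abs(p1.x - p4.x)
    return vertical / (2.0 * horizontal)

def count_blinks(video_path, ear_threshold=0.2):
    cap = cv2.VideoCapture(video_path)
    blinks, eye_closed, frames = 0, False, 0
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames += 1
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue
            lm = result.multi_face_landmarks[0].landmark
            ear = eye_aspect_ratio([lm[i] for i in LEFT_EYE])
            if ear < ear_threshold and not eye_closed:
                blinks += 1          # eye just closed: count one blink
                eye_closed = True
            elif ear >= ear_threshold:
                eye_closed = False
    cap.release()
    return blinks, frames

# People blink roughly every two to ten seconds, so a clip hundreds of
# frames long with zero blinks is a weak but useful red flag.
```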

Ekaterina Kornyushina, founder of the deepfake detection project www.aIDeepFake.ru, recommends checking whether light reflects correctly off surrounding objects and whether the skin tone looks natural.

If a person suddenly turns their head, covers part of their face with a hand, or has only just entered the frame, the deepfake mask often disappears for a moment or becomes blurred. Sometimes elements of the mask land on other objects, and this is immediately noticeable.
Philipp Shcherbanich, IT expert

The expert also suggests a simple test: if you suspect that the person on the call is not real, ask them to switch to another device, which may not have the face-swapping software installed.

There are also programs that help identify deepfakes, including McAfee Deepfake Detector, Sensity AI, and Deepware for video. These systems are trained on large databases of deepfakes and can judge whether an image or video is fake, although they cannot guarantee 100% accuracy.
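At their core, frame-based video detectors of this kind follow a common pipeline: sample frames, score each with a binary real/fake classifier, and aggregate the scores. Here is a schematic sketch of that pipeline; `score_frame` is a stand-in for a trained model and is not the API of any product named above.

```python
import cv2
import numpy as np

def score_video(path, score_frame, every_nth=10):
    """Return the mean per-frame 'fake' probability for a video file."""
    cap = cv2.VideoCapture(path)
    scores, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % every_nth == 0:                    # sample every Nth frame
            face = cv2.resize(frame, (224, 224))  # typical classifier input size
            scores.append(score_frame(face / 255.0))
        i += 1
    cap.release()
    # No detector is 100% accurate: treat the result as a signal, not a verdict.
    return float(np.mean(scores)) if scores else None

# Usage with a dummy scorer; a real detector would be a neural network
# trained on large datasets of genuine and synthetic faces.
if __name__ == "__main__":
    print(score_video("clip.mp4", lambda face: 0.5))
```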

The main protection is critical thinking

Still, the main tool is critical thinking. Do not make hasty decisions during online communication: fraudsters often manufacture a stressful situation and rush you so that you have no time to notice the forgery. If you feel pressured by the other party, take a break and do not jump to conclusions. Always call the person back yourself through another channel to make sure they really are who they claim to be, Shcherbanich advises.

People whose images are used to create deepfakes can suffer as well. For a deepfake to look convincing, attackers need a large amount of source material, which is why deepfakes of actors and other public figures are so plausible.

To reduce the risk of having your voice and face forged, experts recommend not uploading your photos and videos to social networks, or at least restricting access to them.

In the era of artificial intelligence, then, digital hygiene conflicts with the very purpose of social networks, which were created for sharing photos and videos with others. Users may have to rethink online habits that have become an integral part of their lives.

Deepfake calls have become one of the most dangerous fraud methods. According to VisionLabs, tens of thousands of such attacks were recorded in 2024, and their number is expected to grow tenfold in 2025.

Read more on the topic:

AI fraudsters extort money using fabricated voices of SUSU management

Sber and the Ministry of Internal Affairs will begin a joint fight against deepfakes

A new level of deception: deepfakes have begun to be used to deceive on dating sites

Not in his own voice: the Central Bank warned Russians that fraudsters have learned to use deepfakes

Fraudsters used a deepfake of Larisa Dolina to take out a loan of 50 million rubles
