Stealing your voice: scammers call Russians under the guise of opinion polls to create deepfakes

Scammers pose as employees of research organizations and sociological polling centers to record victims' voices

Scammers are calling Russians while posing as representatives of research organizations and sociological polling centers. Their goal is to record the victim's voice in order to create deepfakes. This was reported by Tatyana Deshkina, Deputy Director for Products at VisionLabs.

Fraudsters may pose as employees of research organizations and sociological polling centers in order to record a person's voice and later clone it to create a deepfake. The attackers' main goal is to keep the victim on the line for as long as possible.
Tatyana Deshkina, Deputy Director for Products at VisionLabs

The expert explained that creating a voice clone requires a recording about 20 seconds long in which the speaker expresses a range of emotions and there is no extraneous noise. The longer the recording, the more accurate the fake.
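As a rough illustration of those criteria, here is a minimal Python sketch that checks whether a WAV file is long enough and quiet enough to serve as cloning material. The 20-second threshold follows the article; the noise ceiling, the helper name usable_for_cloning and the assumption of 16-bit mono PCM are illustrative choices, not VisionLabs requirements.

```python
import wave

import numpy as np

MIN_DURATION_S = 20.0   # minimum usable length named in the article
MAX_NOISE_RMS = 0.01    # assumed ceiling for RMS in the quietest windows


def usable_for_cloning(path: str) -> bool:
    """Rough check: is this recording long and clean enough to clone from?"""
    with wave.open(path, "rb") as wav:
        rate = wav.getframerate()
        frames = wav.readframes(wav.getnframes())
    # Assumes 16-bit mono PCM; normalize samples to [-1, 1].
    samples = np.frombuffer(frames, dtype=np.int16).astype(np.float32) / 32768.0
    duration = len(samples) / rate

    # Estimate the noise floor from the quietest 10% of 100 ms windows,
    # which mostly contain pauses between words rather than speech.
    window = rate // 10
    n_windows = len(samples) // window
    if n_windows == 0:
        return False
    rms = np.array([
        np.sqrt(np.mean(samples[i * window:(i + 1) * window] ** 2))
        for i in range(n_windows)
    ])
    noise_floor = np.percentile(rms, 10)

    return duration >= MIN_DURATION_S and noise_floor <= MAX_NOISE_RMS
```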

Attackers strive to give the recordings a natural feel in order to deceive detectors. To do this, they add the effects of a poor connection and background noise, such as the sounds of passing cars or other people's conversations.
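The degradations the article describes are standard audio post-processing. The sketch below shows, under assumed parameters, how white noise at a chosen signal-to-noise ratio and a telephone-style band-pass filter can be applied to a waveform; the function names and values are hypothetical, chosen only to make the example concrete.

```python
import numpy as np
from scipy.signal import butter, lfilter


def add_noise(speech: np.ndarray, snr_db: float,
              rng=np.random.default_rng()) -> np.ndarray:
    """Mix white noise into the signal at a target signal-to-noise ratio."""
    signal_power = np.mean(speech ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=speech.shape)
    return speech + noise


def telephone_band(speech: np.ndarray, rate: int) -> np.ndarray:
    """Band-limit to roughly 300-3400 Hz, the classic narrowband phone channel."""
    b, a = butter(4, [300, 3400], btype="bandpass", fs=rate)
    return lfilter(b, a, speech)


# Usage: degrade a clean 440 Hz test tone the way a cloned voice might be
# degraded before being played down a phone line.
rate = 16000
t = np.linspace(0, 1.0, rate, endpoint=False)
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
degraded = telephone_band(add_noise(clean, snr_db=15), rate)
```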

Developers take all of these factors into account when building deepfake detectors and continually update such systems so that they can recognize new types of deepfakes.
Tatyana Deshkina, Deputy Director for Products at VisionLabs

The riskiest schemes are combined ones that pair a fake voice with a fake image or video. For example, attackers can create a fake account in a messenger or on a social network and then send forged video and voice messages. This can mislead even people who are well versed in technology. According to the expert, however, such methods are still difficult to pull off technically.

Deshkina added that financial institutions and government services use multi-factor authentication and deepfake detectors to protect users. In the future, mobile phones, social networks and messengers will introduce systems for recognizing content created by neural networks.
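One widely used second factor is the time-based one-time password (TOTP, RFC 6238). The article does not say which mechanisms banks actually deploy, so the sketch below is only a generic illustration of how such a check works, built on Python's standard library; the secret and the clock-skew window are placeholder choices.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, for_time=None, digits: int = 6, step: int = 30) -> str:
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0]
            & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)


def verify(secret_b32: str, submitted: str) -> bool:
    """Accept the current and previous 30 s steps to tolerate clock skew."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now - d), submitted)
               for d in (0, 30))


# Usage with a placeholder base32 secret:
secret = "JBSWY3DPEHPK3PXP"
print(verify(secret, totp(secret)))  # True
```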

To protect yourself, use a caller identification service. Experts also advise never disclosing confidential information or personal data, regardless of who is calling: in most cases the caller has no legitimate need for it.

The State Duma plans to introduce amendments to the Criminal Code articles on libel, theft, fraud, extortion and causing property damage by deception. The amendments are aimed at combating deepfakes, which have become a powerful tool in the hands of cybercriminals: with the help of neural networks and artificial intelligence, attackers create fake videos, audio recordings and images of people and use them for extortion and other illegal activities.

Read more on the topic:

AI fraudsters extort money using fabricated voices of SUSU management

Hacking Telegram with AI — how fraudsters in Russia use your voice and passport. Requests to transfer money will sound in your voice — and they will most likely be believed

Code word to protect against deepfakes: the State Duma called on Russians to create a family password

MTS AI system: fighting deepfakes with 97.5% accuracy

Not in their own voice: the Central Bank warned Russians that fraudsters have learned to use deepfakes