Criminal liability may be introduced in Russia for deepfakes

The punishment for the illegal use of images, voices, and biometric data of citizens could be up to seven years of imprisonment

The State Duma is preparing amendments to several articles of the Criminal Code of the Russian Federation ("Defamation", "Theft", "Fraud", "Extortion", "Causing property damage by deception or abuse of trust") covering the use of deepfakes. For more than a year, cybercriminals have been actively using neural networks and AI to create fake videos, voice recordings, and images of citizens, which are then used for extortion and other crimes.

According to Izvestia, whose editors have reviewed the bill, it proposes supplementing a number of Criminal Code articles with an additional qualifying characteristic: committing a crime using a citizen's image, voice (including a falsified or artificially generated one), or biometric data. Those convicted of such crimes would face fines of up to 1.5 million rubles and up to seven years in prison.

The development of computer technologies has expanded the opportunities for creating video and audio materials from samples of people's images and voices, artificially recreating events that never happened. Modern hardware and software systems, together with neural networks and artificial intelligence (deepfakes, digital-mask technologies, etc.), make it possible to create fakes that a non-specialist can hardly distinguish from reality.
Explanatory note to the draft law on amendments to the Criminal Code of the Russian Federation regarding the illegal use of deepfakes

Russian legislation currently contains no precise definition of a deepfake, nor does it provide for fines or criminal liability for using deepfakes for criminal purposes. As the author of the bill explained to Izvestia, the proposal is intended to close this gap.

We propose introducing more serious liability under a number of criminal articles for the use of this technology. That it is being actively used was already confirmed at the end of last year during the president's Direct Line. There are many examples of people being left with nothing because their voices were forged. We therefore propose bringing the Criminal Code in line with modern realities.
Yaroslav Nilov, author of the bill, head of the State Duma Committee on Labor, Social Policy and Veterans Affairs

Meanwhile, Russians who themselves want to disguise or replace their voices, whether for pranks or for illegal purposes, are already paying the price. It was recently revealed that attackers have begun disguising malicious files as neural-network-based voice-changing applications. One such fake AI voice-changing app steals users' personal data and gives fraudsters access to their smartphone or other device.

Read materials on the topic:

- AI fraudsters extort money using fabricated voices of SUSU management
- Hacking Telegram with AI: how fraudsters in Russia use your voice and passport. Requests to transfer money will sound in your voice, and they will most likely be believed
- The Central Bank and the Ministry of Internal Affairs want to create an anti-fraud platform to combat fraudsters, while telecom operators fear losing money
