Southwest State University
Russian Federation
Kaliningrad, Russian Federation
UDC 343.85
Introduction. The creation of synthetic human images by neural networks and Deepfake technology, based on illicitly obtained authentic biometric personal data and used for the subsequent commission of offences, remains a significant and unresolved problem. In the age of digital transformation, the security of personal data, particularly biometric data, often depends not so much on the potential victim, whose data may be unlawfully misappropriated, as on the third parties that collect and store it, thereby inadvertently facilitating the emergence of criminal scenarios. In developing contemporary crime prevention programmes and introducing new regulations on the use of digital devices and software, it is necessary to acknowledge the profound impact of scientific and technological advancement on the evolution of human communication, which has moved many processes and much of document flow into the virtual space. At the same time, methods for ensuring the security of biometric personal data, which can be falsified and used for the subsequent commission of offences, remain insufficiently developed in contemporary practice.
Methods. The article draws on various methods of cognition: the dialectical, statistical and analytical methods, as well as documentary analysis techniques. The material of the study encompasses normative legal acts, statistical data and scholarly publications examining the use of fake biometric personal data in the creation of audio and video recordings for the subsequent commission of crimes.
Results. The study revealed regulatory, organisational and other problems in controlling the collection, processing and storage of biometric personal data, as well as access to such data during the creation of fake audio and video recordings. It is proposed that the Criminal Code of the Russian Federation be supplemented with a new provision establishing criminal liability for the creation and distribution of fake audio and video recordings produced by neural networks and Deepfake technology on the basis of illicitly obtained biometric personal data.
biometric personal data; illegal access; neural networks; Deepfake; spoofing; human video recording; audio recording