Preprint
Article

This version is not peer-reviewed.

The Weaponisation of Artificial Intelligence (AI): Legal Shortfalls and Regulatory Difficulties in Governing Non-Consensual Intimate Deepfakes (NCIDs)

Submitted: 18 July 2025

Posted: 21 July 2025


Abstract
This article examines the increasingly prevalent threat posed by non-consensual intimate deepfakes (NCIDs), AI-generated sexually explicit content which resembles a real person, and critiques the current legislative framework in the UK, which fails to criminalise the creation of NCIDs. While the Online Safety Act (OSA) 2023 criminalises the distribution of NCIDs, the simple act of creating NCIDs for sexual gratification or future criminal activity remains lawful. Utilising interdisciplinary research, victim testimony, and a comparative analysis with similar legislation within the European Union (EU), this article submits that the current legislation in the UK fails to protect victim-survivors and overlooks the serious harms caused by the creation of NCIDs. Instead, we propose a strict liability model that focuses on a lack of consent rather than a defendant’s mens rea, aligning NCID offending with the broader context of image-based sexual abuse (IBSA). This article concludes that legislative reform is needed immediately to criminalise the creation of NCIDs, close legal loopholes and, most importantly, protect the dignity, privacy and sexual autonomy of victim-survivors.
Subject: Social Sciences - Law

Introduction

In recent years, deepfakes have flooded the internet. At the forefront of media attention was the widely shared explicit deepfake imagery of Taylor Swift, with one video gaining 47 million views before it was removed 17 hours later.1 A deepfake is an ‘AI-generated or manipulated image… that resembles existing persons, … and would falsely appear to a person to be authentic or truthful.’2 Concerningly, deepfakes are often weaponised, allowing users to generate and circulate non-consensual image-based sexual abuse (IBSA).3 The creation of such non-consensual intimate imagery (NCII) can have devastating impacts on a victim’s life,4 with research suggesting a correlation between the dissemination of NCII and the development of depressive and anxiety disorders in adolescents.5 It is therefore clear that the creation and distribution of NCII and non-consensual intimate deepfakes (NCIDs) have a notable impact on victims. Disturbingly, until the introduction of the Online Safety Act (OSA) 2023, no specific offence criminalised the distribution of NCIDs. The OSA has now criminalised the sharing of NCIDs but has failed to address their creation,6 in line with the Law Commission’s recommendation.7 Throughout this article, we will examine NCIDs and the current legal frameworks governing this area before turning to our recommendations for legislative reform.
This article is structured as follows. Chapter 1 tackles the definitions of deepfakes and NCIDs. Here, we will also demonstrate how artificial intelligence (AI) and deepfakes have been weaponised into one of the primary origins of image-based sexual abuse. Naturally, we will consider the devastating impact the creation and dissemination of NCIDs has on victim-survivors. Chapter 2 will examine the current and historical legislative frameworks which regulate NCIDs as well as other forms of IBSA. In particular, we will closely review the OSA and the amendments it makes to the Sexual Offences Act (SOA) 2003, which criminalise the dissemination of NCIDs. Additionally, we will consider legislative advances in the European Union (EU) using a comparative analysis model.8 Finally, we will critically evaluate recent governmental plans to criminalise the creation of NCIDs, raising particular concern about the government’s over-reliance on the motives of the defendant rather than the absence of consent. Building on this, Chapter 3 puts forward our synthesised recommendations for reform. It rejects the Law Commission’s recommendation that it ‘should not be a criminal offence simply to “make” an intimate image without the consent of the person depicted.’9 Instead, we argue that a specific offence for the creation of NCIDs should be enacted, with a particular focus on consent rather than the defendant’s mens rea.
It is pertinent to note that the following article contains no case law, a rarity in legal scholarship. This is due to how current and fast-moving this topic is. During writing, the government announced plans to introduce legislation and subsequently made further announcements and alterations to it. A change in government also caused delays and shifts in policy. Therefore, as of the date of writing, there is no case law directly relevant to this matter. Instead, we have focused our critical evaluation on the real-world experience of victims; past, present and proposed legislation; and scholarly debate. This allows us to provide the most current, informed and relevant discussion on the matter of NCIDs.

Chapter 1: Artificial Intelligence, Deepfakes and the Creation and Dissemination of Non-Consensual Intimate Imagery

Artificial Intelligence and Deepfakes

Artificial intelligence describes programs that allow computers to exhibit human thought processes, such as learning, understanding, problem-solving, and creativity.10 Such programs can be used to create so-called deepfakes, alongside a plethora of other uses. A deepfake is any non-genuine digital media created using deep learning technology; this can include images, videos and audio.11 Deepfakes can also include partially fabricated content;12 for example, a genuine image of an individual may be altered to make them look like they are in an intimate situation. The recently approved EU AI Act provides the first widespread legal definition of a deepfake: an ‘AI-generated or manipulated image, audio or video content that resembles existing persons, objects, places, entities or events and would falsely appear to a person to be authentic or truthful.’13 The move to codify the definition of deepfakes indicates their rapidly increasing prevalence. In 2024, Ofcom reported that 43% of people aged 16+ had seen at least one deepfake in the last 6 months; concerningly, 50% of children aged 8-15 also reported having seen a deepfake within the same timeframe.14 More recently, joint research from the Alan Turing Institute and the University of Oxford found that 15% of people, on average, report exposure to harmful deepfakes, with 91.8% of respondents raising concerns about deepfake technology adding to online child sexual abuse material (CSAM), increasing disinformation and manipulating public opinion.15

The Weaponisation of Deepfakes: Non-Consensual Intimate Deepfakes (NCIDs)

Deepfakes can be used in a wide range of positive ways, benefiting society and individuals alike; for example, deepfakes have been used to help treat those with aphantasia (the inability to create mental images in one’s mind).16 However, as with everything in modern society, deepfakes can be abused and exploited by bad actors. An NCID is created when a perpetrator generates a deepfake of an individual, making them appear to be in an intimate situation without their consent. Criminals can use NCIDs in various ways. For example, NCIDs may be used for sextortion,17 disrupting financial markets18 or creating CSAM.19
The rapid growth of intimate deepfakes online is astounding. Research by Deeptrace in 2019 identified 14,678 deepfake pornography20 videos online, nearly a 100% increase from 2018.21 These videos had amassed 134,364,438 views across the top 4 websites alone, and 96% of deepfake videos online were identified as non-consensual.22 Our concerns have been echoed by both the previous Conservative and now Labour Governments.23 However, the legislation surrounding the ever-prevalent creation and distribution of NCIDs is poor, ineffective and, in some cases, non-existent; this will be discussed in detail in the coming chapters.
Beyond the prevalence of deepfakes and NCIDs, difficulty also arises in selecting appropriate terminology for these sensitive topics. The term deepfake ‘porn’ is a colloquialism frequently used by the media,24 and even academics,25 when describing NCIDs. We opine that the term ‘porn’ is misplaced and inappropriate. Rather, we submit that NCIDs fall into the category of image-based sexual abuse, defined by McGlynn and Rackley as the ‘non-consensual creation and/or distribution of private sexual images.’26 Terminology often sets the framework for debate and the routes for legal redress; it is therefore crucial to acknowledge that the creation and distribution of NCIDs is a form of image-based sexual abuse, not ‘deepfake porn’ or ‘revenge porn’.27 The use of the term ‘pornography’ is problematic, leading legislators to require that images be pornographic in nature or to root culpability in a prerequisite that the images are generated for sexual gratification.28 This focus on the defendant’s mens rea is a stumbling block for prosecutions and for securing closure for victims, as will be discussed in depth in the coming chapter. Additionally, McGlynn and Rackley opine that the moniker ‘porn’ suggests the imagery is legitimate and ‘instils a sense of choice’; they submit this is inappropriate, although they note this does not suggest all pornography is abusive.29 On the contrary, others opine that all pornography is abusive regardless of consent.30
To conclude, we submit that NCIDs are a form of image-based sexual abuse; as such, there is a necessity to stringently regulate this area to protect victim-survivors and hold offenders to account. We argue that due to the abusive nature and prevalence of NCID abuse, the government has a duty of care to the public, particularly women and girls, to introduce legislation that regulates the creation and distribution of NCIDs.

The Impact of Deepfake Technology on Victim Survivors

The creation of NCIDs can have significant psychological effects on victims. Knowing that one’s manipulated, sexualised image is being viewed for the sexual gratification of others without consent can be profoundly disempowering,31 especially in the current climate, where there are no options for redress until the image is shared. Research reveals that 57% of children are worried about becoming a victim of deepfake abuse, and 1 in 10 respondents reported being a victim of such abuse or knowing a victim.32 The impact of the non-consensual sharing of intimate images on adolescents is of particular concern. Research suggests that young victims of the non-consensual creation and/or sharing of intimate images are at particular risk of developing severe psychological distress, in some cases developing depression and anxiety.33 Therefore, it is clear to us that the creation of NCII, including NCIDs, is incredibly harmful and has lasting psychological effects on victims. These effects can be elucidated by countless real-world examples.34
Due to rapid technological advancements, in some cases, NCIDs have become indistinguishable from genuine imagery.35 Therefore, when NCIDs are shared, the images may be difficult to tell apart from genuine images of the individual, which only increases the level of harm to the victim. These so-called ‘hyper-realistic’ images have become a stumbling block for law enforcement, for example, even expertly trained analysts from the Internet Watch Foundation (IWF) are sometimes unable to tell the difference between genuine and artificial CSAM, meaning they are unable to decide whether a child needs to be safeguarded.36 When it comes to investigating NCIDs, this could be problematic, with investigators not knowing if the images reported to them are genuine NCII or an NCID. This may present difficulties during investigations and with subsequent charging decisions.
In sum, it is clear that the creation and distribution of NCIDs can have a devastating and life-changing impact on a victim’s life. Again, we argue that NCIDs are a form of image-based sexual abuse as defined by McGlynn and Rackley and, therefore, should be treated as such.37 Next, we will turn to consider the current legal framework that governs NCIDs in the UK before considering our recommendations for reform.

Chapter 2: Evaluating Current Legal Frameworks

In England and Wales, we have extensive legislation designed to regulate how individuals use the internet to interact with one another. In particular, there is strong legislative footing when it comes to image-based sexual abuse, especially concerning CSAM.38 However, there is a distinct lack of protections offered concerning NCIDs. Recently, we have seen the introduction of the OSA, which criminalises the non-consensual sharing of intimate imagery;39 however, these provisions fail to tackle the prevalent issue of NCID creation. This chapter will explore the existing legislation which relates to NCIDs and other forms of image-based sexual abuse before considering relevant legislation from the EU, deploying a comparative analysis model.40

The Criminal Justice and Courts Act 2015

Before the OSA, sections 33-35 of the Criminal Justice and Courts Act (CJCA) 2015 introduced offences relating to the non-consensual disclosure of ‘private sexual photographs and films with the intent to cause distress’.41 However, Gillespie noted this provision was introduced solely to combat so-called ‘revenge porn’,42 significantly restricting its applicability to other, increasingly frequent, forms of image-based sexual abuse. In addition, further concerns were raised regarding the mens rea requirement of intent to cause distress,43 an issue that will be discussed in detail in the coming chapter. Further, Kira highlighted that the CJCA ‘did not directly apply to fully synthetic content’ and that s.34(6) specifically defined a photograph or film as ‘a still or moving image that was originally captured by photography or filming, or is part of an image originally captured by photography or filming’.44 This stringently limits the CJCA’s applicability to NCIDs due to their artificial nature, leaving victim-survivors with little hope of legal redress or closure where the imagery is wholly artificially generated.

Communications Offences

Communication offences may be applied when NCIDs are distributed using an electronic communications device. However, it is pertinent to note that these offences were not designed to combat such offending, with McGlynn and Rackley arguing that the application of such provisions is haphazard at best.45 Section 1 of the Malicious Communications Act 1988 sets out the offence of sending a communication that is indecent or grossly offensive, threatening, or false; however, there must be evidence of an intention to cause distress or anxiety to the victim.46 Taking the legislation at face value, it seems possible that s.1 could be used to prosecute an individual who shared an NCID, since the imagery could be considered ‘false’ as it is not genuine.47 However, we must note that this is not the application of s.1 that legislators intended.
Additionally, section 127 of the Communications Act 2003 makes it an offence to send, using a public electronic communications network, a message that is ‘grossly offensive or of an indecent, obscene or menacing character’.48 However, this leaves ambiguity as to what would be considered grossly offensive, indecent, obscene or menacing. The House of Commons opined that s.1 and s.127 could not apply to non-consensual intimate imagery since the images are not necessarily grossly offensive; notably, Gillespie argued this is the incorrect element to focus on, and that the ‘indecent’ or ‘obscene’ nature of the imagery is far more pertinent.49 We submit that the use of communications offences by the Crown to tackle NCIDs, whilst creative, is far from ideal, placing yet another hurdle between a victim-survivor and justice.

The Online Safety Act 2023

In April 2019, the government published the Online Harms White Paper50 before the first draft of the OSA was introduced as a Bill in May 2021.51 Following some parliamentary ‘ping-pong,’52 the bill received Royal Assent on 26th October 2023.
The OSA repealed the provisions mentioned above,53 introducing a range of criminal offences relating to image-based sexual abuse. The basic offence is established when person A intentionally shares an intimate photograph or film of person B without B’s consent or without a reasonable belief that B consented.54 Upon conviction, this base offence carries a maximum sentence of imprisonment for a term not exceeding the maximum term for summary offences, or a fine, or both.55 In addition to this baseline offence, the OSA adds a series of more serious, aggravated forms of the s.66B(1) offence. These apply where person A shares an image, without consent, with the intention of causing B alarm, distress or humiliation,56 or A does so to secure sexual gratification for A or another individual.57 Finally, the OSA also tackles the issue of coercion and threatening behaviour. Here, an offence is committed if A threatens to share an intimate photograph or film and does so with the intention that B (or another) will fear the threat will be carried out, or is reckless thereto.58 The increasing seriousness of these offences is demonstrated by the statutory maxima of (1) on summary conviction, a custodial sentence not exceeding the limit in a magistrates’ court, or a fine, or both, and (2) on conviction on indictment, a term of custody not exceeding 2 years.59 As submitted by Kira, the tiered system of penalties reflects the increasing degrees of harm and culpability associated with each offence.60 Critically, the provision adapts previous definitions of ‘photograph’ and ‘film’ to include images ‘made or altered by computer graphics or in any other way, which appears to be a photograph or film’.61 Therefore, it seems the OSA has criminalised the distribution of NCIDs; however, it has completely ignored the issue of creating an NCID in the first place.
Therefore, it would be legal for an individual to create an NCID of a celebrity, family member or political figure so long as they did not distribute it, even if the NCID was generated for their own sexual gratification.
We opine that the OSA’s amendments to the SOA have significantly widened the scope of criminal liability, providing the Police and the Crown with comprehensive legislation to tackle IBSA, including deepfakes and other modified media. However, the OSA fails to criminalise the act of creating sexually explicit deepfakes and focuses far too heavily on the offender’s motives, elucidated by the requirement of an intention to cause alarm, distress or humiliation. This proof-of-motive requirement places yet another statutory hurdle in the way of holding offenders accountable and protecting victim-survivors. Additionally, the failure to criminalise the creation of NCIDs creates a market for their dissemination; we contend that by criminalising creation itself, we will see a reduction in NCID distribution offences in the future, ultimately protecting victims.

Comparative Analysis:62 Legislative Advances in the European Union (EU)

Now that we have considered the current legal frameworks in the UK that can be used to regulate NCIDs, it may be helpful to briefly consider other legal jurisdictions before we turn to our recommendations for reform. We argue the EU is the most suitable territory to examine due to the recent introduction of legislation designed to regulate the spheres of technology, particularly AI.63 Additionally, the EU provided the first widely accepted legal definition of deepfakes.64
In May 2024, the EU implemented a directive which aimed to harmonise legislation and policy in response to violence against women and girls across its Member States.65 The directive contains provisions relating to female genital mutilation,66 forced marriage,67 cyberstalking,68 cyber harassment,69 cyber violence70 and the non-consensual sharing of intimate or manipulated material.71
Article 5 will be the focus of our comparative analysis due to its treatment of non-consensual intimate imagery, including its coverage of artificially manipulated content. Article 5 criminalises the distribution of ‘materials depicting sexually explicit activities’ or ‘intimate parts’ where the person shown does not consent and the material is made public.72 Rigotti et al. opine that the EU’s legislative acknowledgement of this form of IBSA reflects its prevalence and escalation in recent years.73
Article 5(1)(b) specifically ensures that artificially manipulated images and videos, known as deepfakes, are criminalised. It is an offence to distribute manipulated material of an individual that makes them appear to be engaged in sexually explicit activities without their prior consent.74 However, Article 5(1)(b) is only engaged if the individual appears to be ‘engaged in sexually explicit activities’; confusingly, this excludes images of mere nudity. Therefore, it would not be an offence under the directive to create a deepfake of an adult making them appear naked, but not actively participating in a sexual act, and to distribute it without consent, even if the nude image could be considered sexual in itself. This gap in the law was identified; however, the Commission took no action to amend the directive.75
We contend this is inexcusable. The use of so-called ‘nudify’ apps is increasingly prevalent and harmful to victim-survivors,76 whose naked imagery is shared without their consent. For the law in the EU not to criminalise such non-consensual sharing is unacceptable and leaves people, particularly women and girls, in an extremely vulnerable position. This places Article 5(1)(b) significantly behind the OSA in the UK, which criminalises the non-consensual sharing of intimate imagery without the hurdle of the imagery needing to depict sexual activity.77
Another significant limitation is that Article 5 is limited to the distribution or threats to distribute intimate imagery. Therefore, the creation of such non-consensual intimate imagery is not criminalised.78 This mirrors the current legislation in the UK, which also fails to criminalise the creation of NCIDs, instead choosing to focus on its distribution,79 or threats thereto.80 This is problematic since it allows individuals to generate NCIDs of colleagues, friends and family members for their sexual gratification without redress. We submit that this is morally contentious from the perspective of victims having their images digitally manipulated, sexualising them without their consent.81
In sum, the EU is at the forefront of developing legislation concerning artificial intelligence. The introduction of Article 5(1)(b) helpfully specifies that the directive does cover artificially manipulated content, known as deepfakes. So far, the EU has shown a level of commitment to combating gender-based sexual violence and includes technology-facilitated offences in this battle.82 However, as previously noted, it seems to be less developed than the OSA when it comes to criminalising the non-consensual sharing of intimate imagery. We will now turn our attention back to recent UK Government proposals from both the previous Conservative and present Labour governments relating to criminalising the creation of NCIDs.

Recent Government Proposals

In recent months, the issue of the lack of regulation surrounding the creation of NCIDs has been at the forefront of both media83 and government, with the Women’s Equality Committee publishing their enquiry on non-consensual intimate image abuse in March of 2025.84 This is an area of law that has developed rapidly throughout the research and writing of this project.
The government has pledged to introduce a provision which criminalises the creation of NCIDs.85 Under the planned offence, those who create NCIDs could face an unlimited fine and a criminal record; therefore, an individual who creates an NCID with no intention of sharing it, but who intends to cause alarm, humiliation or distress to the victim, would be committing an offence.86 This provision was to be introduced through an amendment to the Criminal Justice Bill. However, when the 2024 general election was called by the then-Prime Minister Rishi Sunak MP, the proposals were dropped.87 As of the date of writing, the creation of NCIDs has not been criminalised in the UK.
A petition started by victim-survivor Jodie and #NotYourPorn demanding that the government implement laws to criminalise image-based abuse, including the creation of NCIDs, has received over 70,500 signatures.88
In January of 2025, the newly formed Labour government announced plans to continue their predecessors’ efforts to criminalise the creation of NCIDs.89 This is partly due to the advocacy efforts of End Violence Against Women Coalition, Jodie, #NotYourPorn, Glamour UK, Baroness Owen and expert Professor Clare McGlynn KC (Hon) of Durham University.
It is understood that the Government plans to make it an offence to create artificial images either for ‘sexual gratification’ or to cause ‘alarm, distress, or humiliation’, as an amendment to the Data (Use and Access) Bill, which is currently being debated before Parliament.90 This is similar to the Conservative government’s initial plans but includes the aspect of sexual gratification rather than solely focusing on the intention to cause alarm, distress or humiliation. Scholars opine that the steps toward criminalising the creation of NCIDs are a ‘necessary and welcome step toward fostering individual accountability.’91
However, the Government’s plans, whilst generally receiving positive support, have attracted some criticism due to their limited scope.92 To elucidate, there have been calls for the Government to introduce a specific offence covering the solicitation of NCIDs, a call that was initially rejected.93 Since then, however, the Government has made a U-turn,94 intending the law to cover asking someone else to create an NCID for you, known as solicitation.95 Here, Professor McGlynn’s evidential submissions96 were noted as being ‘overwhelmingly persuasive’.97 We conclude that this is a step in the right direction, preventing offenders from avoiding liability for the creation of NCIDs by asking someone outside of the UK to create them on their behalf.
In addition to the issue of solicitation, concerns similar to those raised concerning the CJCA may also be pertinent here. Scholars were concerned by the mens rea focus, namely the need to prove an intention to cause distress.98 The government plans to introduce an offence which again relies on the intention to cause ‘alarm, distress, or humiliation’, although we must note the addition of ‘sexual gratification’.99 We submit that this over-reliance on the defendant’s motives and intentions will be problematic and will likely cause prosecutorial delays, with defendants avoiding liability due to the difficulty of proving intention. For example, if a defendant created an NCID of an individual but argued it was simply a hobby or a piece of art, it would be exceedingly difficult to prove otherwise. Unfortunately, the drafting of the planned offence places vulnerable victims in a difficult position. We opine that the government should amend the creation offence to criminalise the non-consensual creation of intimate deepfakes, a point we will discuss in detail in the next chapter.
To conclude, we suggest the Labour government’s inclusion of sexual gratification as an additional mens rea element is a step in the right direction, widening the situations in which the mens rea requirement can be satisfied. In addition, the government’s change of position on the matter of solicitation is welcomed; again, this strengthens the planned legislation, ensuring individuals cannot escape liability by simply asking someone in a different territory to generate an NCID for them. However, whilst the proposed legislation has developed significantly, it remains incomplete. We opine that instead of an offence which hinges on the defendant’s mens rea, the government should introduce a consent-based model. This would prohibit the creation of NCIDs where the person depicted did not consent. In the next chapter, we will introduce our fully synthesised recommendation for the introduction of a consent-focused offence for the creation of NCIDs.

Chapter 3: Recommendations for Reform

In the coming chapter, we will briefly summarise the shortcomings discussed in detail in the previous chapter before moving on to our recommendations for reform. As previously noted, as of the date of submission, the creation of NCIDs has not been criminalised, although we must note that the government has announced plans to do so.100 Image-based sexual abuse in the UK is prolific: in 2019, the Revenge Porn Helpline managed 1,600 cases of non-consensual intimate images; this figure doubled in 2020; and in 2023, we saw a tenfold increase, with the helpline handling nearly 19,000 cases.101 Specifically, NCID abuse has skyrocketed by over 400% since 2017.102 Therefore, it is clear that the current legislative frameworks are not effectively protecting the public. Having critically evaluated current and planned legislation to tackle the creation of NCIDs, we will now put forward our recommendations for legislative reform, which we submit are necessary to reduce the prevalence of NCID abuse and to equip victims with the tools to seek redress.

Criminalising the Creation of NCIDs

As of writing, the law in the UK criminalises the distribution of an ‘intimate photograph or film’ where the individual pictured does not consent.103 Helpfully, the OSA has adopted the definition of a photograph to include images ‘made or altered by computer graphics or in any other way, which appears to be a photograph or film’.104 Therefore, it is clear that s.66B(1) criminalises the non-consensual distribution of an NCID. However, the law does not, at present, criminalise the creation of NCIDs. Therefore, an individual can legally generate non-consensual sexual imagery of another, even if it is intended to be used maliciously in the future.
We recommend that the Government introduce a standalone offence that criminalises the creation of NCIDs. There is already similar legislation designed to prevent the creation of AI-generated CSAM.105 Therefore, we can conclude that this type of legislation is a feasible recommendation. We opine this would reduce the criminalised sharing of NCIDs by deterring the production of such imagery at its source, ultimately protecting the public.
We reject the argument of the Law Commission that harm only arises when an NCID is shared or is threatened to be shared.106 Rather, we agree that the creation of NCIDs is a form of sexual violence, and permitting such behaviour legitimises individuals intruding on the privacy and sexual autonomy of another.107 Additionally, we share the view that deepfake abuse is a major contributor to the systematic degradation and oppression of women.108
When discussing criminalising the creation of NCIDs, a common response is that deepfakes are an expression of an individual’s sexual fantasy and are no different from imagining it in one’s head.109 However, we submit this is not the case; creating an NCID instead produces a digital file that could be shared at any moment, even accidentally or via malicious routes such as unauthorised access.110 Therefore, the level of harm and culpability cannot be compared to mere imagination; the creation of an NCID is simultaneously morally contentious and distinctly wrong.111
We submit that the Law Commission’s recommendation that it should not be a criminal offence to ‘make’ an intimate image without consent is categorically incorrect and represents a level of ignorance when it comes to sexual violence and abuse.112

A Shift from the Requirement of Intention to the Lack of Consent

Current and planned legislation to combat the creation and distribution of NCIDs hinges on the individual’s intention to cause ‘alarm, distress, or humiliation’ or to gain ‘sexual gratification.’113 However, the drafters seem to have completely ignored the issue of consent.
The concept of consent114 is central to defining sexual offending; without consent, the construction of offences such as rape and sexual assault would be impossible.115 Legally recognised consent is what separates an act of intimacy between partners from a serious sexual offence and a violation of the human dignity and personal autonomy of one of those people.116 Creating an NCID should be no different. Recent victims of NCID abuse have spoken about its ‘life-shattering’ effects.117 Additionally, researchers have demonstrated a relationship between the non-consensual dissemination of sexual imagery and depression and low self-esteem in adolescents.118 Consequently, we opine that non-contact, online offending, such as the creation and distribution of NCIDs, can be just as harmful as other types of sexual offending, which hinge on the issue of consent.
To conclude, we argue that the offence of creating NCIDs should be consent-based rather than based on the defendant’s mens rea; this would mitigate the risk of defendants avoiding liability by arguing they did not intend to cause ‘alarm, distress, or humiliation’ and claiming they did not gain ‘sexual gratification.’119 Instead, it would focus on the lack of consent of the person depicted. This is the accepted standard of liability in the vast majority of sexual offences; NCIDs are a form of image-based sexual abuse and should be treated no differently.120

Conclusion

The current legal framework in the UK criminalises the distribution of NCIDs;121 however, it fails to criminalise the simple act of creating them. Therefore, individuals can legally create NCIDs of their sexual fantasies for private sexual gratification without legal consequence. Concerningly, this allows individuals to create NCIDs with future malicious intent; the law will not intervene until after the NCID has been distributed. By not criminalising their creation, the current legislation enables the dissemination of NCIDs. Understandably, the law stringently criminalises the creation and distribution of AI-generated CSAM.122 We argue this demonstrates that legislating against the creation of NCIDs is a feasible endeavour.
We welcome the government’s recent plans to criminalise the creation of NCIDs,123 an announcement made during the writing of this article. However, we contend that the overreliance on the mens rea of offenders will be problematic. To elucidate, the planned requirement to prove the offender intended to cause ‘alarm, distress or humiliation’124 will present a significant sticking point in charging decisions and subsequently in securing a conviction, because it places a high evidential burden on the Crown. Evidential issues aside, we submit that a consent-based approach would be far more appropriate. Consent is central to sexual offences and is what separates intimacy from a violation of human dignity and personal autonomy.125 We opine that a strict liability model should be adopted, shifting the focus from intent to the lack of consent. This would align the law on NCIDs with the foundations of sexual offence legislation.126
The aforementioned gaps in the law place victims in a vulnerable position and indicate how the government has failed to protect them. The creation of NCIDs is not trivial; we argue that NCIDs are a serious form of image-based sexual abuse which violates the dignity, privacy and sexual autonomy of victims.127 The Law Commission heard that NCID abuse is a significant driver of the systematic oppression and degradation of women.128 Concerningly, between 2019 and 2023 there was a tenfold increase in the number of reported NCII incidents.129 Additionally, the 400% increase in NCID abuse since 2017 should be an urgent call for action.130 Unfortunately, legislators seem to underestimate the psychological and physical harm suffered by victims of NCID abuse.131
In sum, we submit there is an urgent need to criminalise the creation of NCIDs, with the imposition of robust penalties to act as a deterrent and to offer victims closure. Any legislation introduced must follow a strict liability approach, focusing on the lack of consent rather than the intentions of the offender. The failure to criminalise the creation of NCIDs legitimises non-contact sexual abuse and enables harmful behaviour.132 We opine that without legislative reform criminalising the creation of NCIDs, the UK risks developing a cohort of offenders whose behaviour will only escalate in severity, possibly leading to a future rise in more serious, contact offences against vulnerable victims.

Notes

1
Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106024-106025; S Geschwindt, ‘Taylor Swift deepfake porn deluge a “wake-up call” for lawmakers’ (TheNextWeb, 1 February 2024) <https://www.proquest.com/blogs-podcasts-websites/taylor-swift-deepfake-porn-deluge-wake-up-call/docview/2920603673/se-2?accountid=14557> accessed 5 January 2025
2
Council Regulation (EC) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L168/1, Art 3(60); See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
3
Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106024-106025; For an in-depth discussion of image-based sexual abuse please see generally Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534
4
Bloomberg UK, ‘Levittown: A Victim of Fake Pornography Hunts for her Harasser’ (Bloomberg, 22 March 2025) <https://www.bloomberg.com/news/articles/2025-03-22/levittown-victim-of-deepfake-porn-hunts-for-her-harasser-online> accessed 25 March 2025; Rachel Bowman, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025; Tiffanie Turnbull, ‘Woman’s Deepfake Betrayal by Close Friend: “Every moment turned into Porn”’ (BBC, 8 February 2025) <https://www.bbc.co.uk/news/articles/cm21j341m31o> accessed 25 March 2025; Cathy Newman, ‘“Deepfake Porn”: How I Became a Victim’ (4 News, 27 December 2024) <https://www.channel4.com/news/deepfake-porn-how-i-became-a-victim> accessed 25 March 2025; Simone Obadia, ‘Survivor Safety: Deepfakes and the Negative Impacts of AI Technology’ (Maryland Coalition Against Sexual Assault, 8 May 2024)
5
Beatrice Sciacca, ‘Nonconsensual Dissemination of Sexual Images Among Adolescents: Associations with Depression and Self-Esteem’ (2023) 38 (15-16) Journal of Interpersonal Violence 9438, 9452-9455
6
The Online Safety Act 2023 amended the Sexual Offences Act 2003 to criminalise the distribution of non-consensual intimate images; this includes digitally modified images known as deepfakes. Please see Sexual Offences Act 2003, s.66B(1)
7
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.220]
8
For an in-depth discussion of comparative analysis methodology please see Dawn Watkins and Mandy Burton, Research Methods in Law (2nd edn, Taylor & Francis) Ch 6; See also generally Michael Salter and Julie Mason, Writing Law Dissertations (Pearson 2007)
9
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.220]
10
Woodrow Barfield and Ugo Pagallo, Advanced Introduction to: Law and Artificial Intelligence (Edward Elgar 2020) 1; See also Fazal Wahab and Others, ‘Artificial Intelligence in the Internet of Things, Recent Challenges and Future Prospects’ in Inam Ullah and Others (eds), Future Communication Systems Using Artificial Intelligence, Internet of Things and Data Science (Routledge 2024) Ch 1, [1.1.1]; Cole Stryker and Esa Kavlakoglu, ‘What is Artificial Intelligence (AI)?’ (IBM, 9 August 2024) <https://www.ibm.com/think/topics/artificial-intelligence> accessed 6 March 2025
11
Felix Juefei-Xu and Others, ‘Countering Malicious Deepfakes: Survey, Battleground, and Horizon’ (2022) 130(7) International Journal of Computer Vision 1678, 1679; Ruben Tolosana and others, ‘An Introduction to Digital Face Manipulation’, in Christian Rathgeb and others (eds), Handbook of Digital Face Manipulation and Detection (Springer Nature 2022); See also Ruben Tolosana and Others, ‘Deepfakes and Beyond: a Survey of Face Manipulation and Fake Detection’ (2020) 64 Information Fusion 131, 131-132; Luisa Verdoliva, ‘Media Forensics and DeepFakes: an Overview’ (2020) 14(5) IEEE Journal of Selected Topics in Signal Processing 910, 910-911
12
Bart van der Sloot and Yvette Wagensveld, ‘Deepfakes: regulatory challenges for the synthetic society’ (2022) 46 Computer Law & Security Review 105716, 105716-105717
13
Council Regulation (EC) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L168/1, Art 3(60); See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
14
Ofcom, ‘A Deep Dive into Deepfakes that Demean, Defraud and Disinform’ (Ofcom, 23 July 2024) <https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/deepfakes-demean-defraud-disinform/> accessed 7 March 2025
15
Tvesha Sippy and Others, ‘Behind the Deepfake: 8% Create; 90% Concerned: Surveying Public Exposure to and Perceptions of Deepfakes in the UK’ [2024] 1, 5-9 <https://arxiv.org/abs/2407.05529> accessed 7 March 2025; See also Europol, ‘Malicious Uses and Abuses of Artificial Intelligence’ (Europol, 6 December 2021) <https://www.europol.europa.eu/cms/sites/default/files/documents/malicious_uses_and_abuses_of_artificial_intelligence_europol.pdf> accessed 6 March 2025
16
See generally University of Bath, ‘Deepfake Shows its Positive Face’ (University of Bath, 20 June 2024) <https://www.bath.ac.uk/announcements/deepfake-shows-its-positive-face/> accessed 10 March 2025; Dominic Lees, ‘Deepfakes are Being Used for Good’ (University of Reading, 8 November 2022) <https://research.reading.ac.uk/research-blog/2022/11/08/deepfakes-are-being-used-for-good-heres-how/> accessed 6 March 2025
17
UK Police, ‘Sextortion’ (Police.UK) <https://www.met.police.uk/advice/advice-and-information/online-safety/online-safety/sextortion/> accessed 10 March 2025; Federal Bureau of Investigation (FBI), ‘Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes’ (FBI – Public Service Announcement – Alert I-060523-PSA, 5 June 2023) <https://www.ic3.gov/PSA/2023/PSA230605> accessed 10 March 2025
18
Sebastian Shamo, ‘The Deepfake and its Impact on Trading Signals’ (2025) Bentley University Working Paper, 10-15 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5070125> accessed 10 March 2025; See also Caroline Dawson, ‘Financial Services Face Up to Deepfake Risks’ (Clifford Chance, 14 October 2024) <https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2024/10/financial-services-face-up-to-deepfake-risks.html> accessed 10 March 2025
19
Devangana Sujay, Vineet Kapoor and Shishir Shandilya, ‘A Comprehensive Survey of Technological Approaches in the Detection of CSAM’ in Shishir Shandilya, Devangana Sujay and VB Gupta (eds), Advancements in Cyber Crime Investigations and Modern Data Analytics (CRC Press 2024) [3.11]; See also IWF, ‘What has Changed in the AI CSAM Landscape?’ (IWF, July 2024) <https://www.iwf.org.uk/media/nadlcb1z/iwf-ai-csam-report_update-public-jul24v13.pdf> accessed 10 March 2025; IWF, ‘Artificial Intelligence (AI) and the Production of Child Sexual Abuse Material’ (IWF, 2024) <https://www.iwf.org.uk/about-us/why-we-exist/our-research/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery/> accessed 10 March 2025
20
For the purposes of this report, we shall use the terminology used by each source. However, we agree that the use of the word ‘pornography’ or ‘porn’ is inappropriate and submit NCIDs are a form of image-based sexual abuse (IBSA).
21
Deeptrace, ‘The State of Deepfakes: Landscape, Threats, and Impact’ (Deeptrace Labs, 1 September 2019) 1 <https://regmedia.co.uk/2019/10/08/deepfake_report.pdf> accessed 7 March 2025
22
Deeptrace, ‘The State of Deepfakes: Landscape, Threats, and Impact’ (Deeptrace Labs, 1 September 2019) 1 <https://regmedia.co.uk/2019/10/08/deepfake_report.pdf> accessed 7 March 2025
23
UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025; UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025
24
Rachel Bowman, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025; Daniel Bates, ‘Melania Trump to Tackle Revenge Porn in Second First Lady Stint’ (The Telegraph, 3 March 2025) <https://www.telegraph.co.uk/us/news/2025/03/03/melania-trump-to-tackle-revenge-porn-in-second-first-lady/> accessed 10 March 2025
25
See generally Karolina Mania, ‘Legal Protection of Revenge and Deepfake Porn Victims in the European Union: Findings From a Comparative Legal Study’ (2024) 25(1) Trauma, Violence & Abuse 117; Adrienne de Ruiter, ‘The Distinct Wrong of Deepfakes’ (2021) 34(4) Philosophy & Technology 1311, 1314-1322; See also Durham University, ‘Deepfake Porn: Why we need to make it a crime to create it, not just share it’ (Durham University, 9 April 2024) <https://www.durham.ac.uk/research/current/thought-leadership/2024/04/deepfake-porn-why-we-need-to-make-it-a-crime-to-create-it-not-just-share-it/> accessed 6 January 2025
26
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 535-536; Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
27
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 535
28
For example, Sexual Offences Act 2003, s.66A(b); Sexual Offences Act 2003, s.67; Criminal Justice and Courts HL Bill (2014-15), cl after 28; See also Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 536
29
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 536
30
See generally Catharine MacKinnon, ‘Not a Moral Issue’ (1984) 2 Yale Law & Policy Review 321; Meagan Tyler, ‘All Porn is Revenge Porn’ (Feminist Current, 24 February 2016) <https://www.feministcurrent.com/2016/02/24/all-porn-is-revenge-porn/> accessed 10 March 2025; See also Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 536
31
Simone Obadia, ‘Survivor Safety: Deepfakes and the Negative Impacts of AI Technology’ (Maryland Coalition Against Sexual Assault, 8 May 2024) <https://mcasa.org/newsletters/article/survivor-safety-deepfakes-and-negative-impacts-of-ai-technology> accessed 25 March 2025
32
ESET, ‘Nearly Two-Thirds of Women Worry about Being a Victim of Deepfake Pornography, ESET UK Research Reveals’ (ESET, 20 March 2024) <https://www.eset.com/uk/about/newsroom/press-releases/nearly-two-thirds-of-women-worry-about-being-a-victim-of-deepfake-pornography-eset-uk-research-reveals/> accessed 25 March 2025
33
Beatrice Sciacca, ‘Nonconsensual Dissemination of Sexual Images Among Adolescents: Associations with Depression and Self-Esteem’ (2023) 38 (15-16) Journal of Interpersonal Violence 9438, 9452-9455; See also American Academy of Paediatrics, ‘The Impact of Deepfakes, Synthetic Pornography, & Virtual Child Sexual Abuse Material’ (AAP, 13 March 2025) <https://www.aap.org/en/patient-care/media-and-children/center-of-excellence-on-social-media-and-youth-mental-health/qa-portal/qa-portal-library/qa-portal-library-questions/the-impact-of-deepfakes-synthetic-pornography--virtual-child-sexual-abuse-material/> accessed 25 March 2025
34
For example please see Bloomberg UK, ‘Levittown: A Victim of Fake Pornography Hunts for her Harasser’ (Bloomberg, 22 March 2025) <https://www.bloomberg.com/news/articles/2025-03-22/levittown-victim-of-deepfake-porn-hunts-for-her-harasser-online> accessed 25 March 2025; Rachel Bowman, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025; Tiffanie Turnbull, ‘Woman’s Deepfake Betrayal by Close Friend: “Every moment turned into Porn”’ (BBC, 8 February 2025) <https://www.bbc.co.uk/news/articles/cm21j341m31o> accessed 25 March 2025; Cathy Newman, ‘“Deepfake Porn”: How I Became a Victim’ (4 News, 27 December 2024) <https://www.channel4.com/news/deepfake-porn-how-i-became-a-victim> accessed 25 March 2025; Helen Bushby, ‘Deepfake Porn Documentary Explores its “Life-Shattering” Impact’ (BBC, 18 June 2023) <https://www.bbc.co.uk/news/entertainment-arts-65854112> accessed 25 March 2025
35
Marco Viola and Cristina Voto, ‘Designed to Abuse? Deepfakes and the non-consensual diffusion of intimate images’ (2023) 201 Synthese 30, 30-31
36
Internet Watch Foundation (IWF), ‘Online Safety Briefing: House of Lords – Briefing for Report Stage’ (IWF, 4 July 2023) <https://www.iwf.org.uk/media/cmubuhzb/osb-lords-report-briefing-04-07-2023.pdf> accessed 6 March 2025
37
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 535-536; Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
38
Protection of Children Act 1978, s.1 (possession, creation, permit to be created or distribution of indecent images of a child); Criminal Justice Act 1988, s.160 (possession of indecent photograph of a child); Coroners and Justice Act 2009, s.62 (possession of a prohibited image of children); See also CPS, ‘Indecent and Prohibited Images of Children’ (CPS – Legal Guidance, 7 May 2024) <https://www.cps.gov.uk/legal-guidance/indecent-and-prohibited-images-children> accessed 1 April 2025
39
When the OSA was introduced it amended the SOA to criminalise the distribution of NCII; Please see Sexual Offences Act 2003, s.66B(1)
40
For an in-depth discussion of comparative analysis methodology please see Dawn Watkins and Mandy Burton, Research Methods in Law (2nd edn, Taylor & Francis) Ch 6; See also generally Michael Salter and Julie Mason, Writing Law Dissertations (Pearson 2007)
41
Criminal Justice and Courts Act 2015, s.33 (repealed by the Online Safety Act 2023, ss.190, 240(1)); See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106028
42
Alisdair Gillespie, ‘“Trust Me, It’s Only for Me”: Revenge Porn and the Criminal Law’ (2015) 11 CLR 866, 866-870; Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 539-540; Found in Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106028
43
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 547; Nicola Henry and Anastasia Powell, ‘Sexual Violence in the Digital Age: The Scope and Limits of Criminal Law’ (2016) 25(4) Social & Legal Studies 397, 402-403; See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106028
44
Criminal Justice and Courts Act 2015, s.34(6); Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106028
45
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 553
46
Alisdair Gillespie, ‘“Trust Me, It’s Only for Me”: Revenge Porn and the Criminal Law’ (2015) 11 CLR 866, 879; See also Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 553
47
Malicious Communications Act 1988, s.1(a)(iii); repealed by the Online Safety Act 2023, ss. 189(2)(b), 240(1)
48
Alisdair Gillespie, ‘“Trust Me, It’s Only for Me”: Revenge Porn and the Criminal Law’ (2015) 11 CLR 866, 879; See also Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 553; Note s.127 has now been repealed by the Online Safety Act 2023, ss.189(1), 240(1)
49
HC Deb 1 December 2014, vol 589, col 120 (Maria Miller MP); Alisdair Gillespie, ‘“Trust Me, It’s Only for Me”: Revenge Porn and the Criminal Law’ (2015) 11 CLR 866, 880
50
Department for Digital, Culture, Media and Sport and Home Office, Online Harms White Paper (CP 57, April 2019)
51
House of Commons Library, ‘Online Safety Bill: progress of the Bill’ (UK Government, 31 October 2023) <https://commonslibrary.parliament.uk/research-briefings/cbp-9579/> accessed 8 January 2024
52
For a definition of parliamentary ping-pong please see UK Parliament, ‘Ping-pong’ (UK Parliament – Glossary) <https://www.parliament.uk/site-information/glossary/ping-pong/> accessed 9 January 2025
53
Malicious Communications Act 1988, s.1 and Communications Act 2003, s.127
54
Sexual Offences Act 2003, s.66B(1)
55
Sexual Offences Act 2003, s.66B(9)
56
Sexual Offences Act 2003, s.66B(2)
57
Sexual Offences Act 2003, s.66B(3)
58
Sexual Offences Act 2003, s.66B(4)
59
Sexual Offences Act 2003, s.66B(10)
60
Those who commit offences involving serious distress, sextortion or coercion may receive harsher punishment than those who intentionally share an intimate image without a specified malicious aim; See Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106029
61
Sexual Offences Act 2003, s.66A(5)(a); Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106029
62
For an in-depth discussion of comparative analysis methodology please see Dawn Watkins and Mandy Burton, Research Methods in Law (2nd edn, Taylor & Francis) Ch 6; See also generally Michael Salter and Julie Mason, Writing Law Dissertations (Pearson 2007)
63
Council Regulation (EC) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L168/1; Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence Against Women and Domestic Violence [2024] OJ L1385/1
64
Council Regulation (EC) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L168/1, Art 3(60); See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
65
Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence Against Women and Domestic Violence [2024] OJ L1385/1; Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1472-1473
66
Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence Against Women and Domestic Violence [2024] OJ L1385/1, Art 3
67
Ibid, Art 4
68
Ibid, Art 6
69
Ibid, Art 7
70
Ibid, Art 8
71
Ibid, Art 5
72
Ibid, Art 5; Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1482-1483
73
Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1483
74
Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence Against Women and Domestic Violence [2024] OJ L1385/1, Art 5(1)(b)
75
Carlotta Rigotti and Clare McGlynn, ‘Towards an EU Criminal Law on Violence against Women: The Ambitions and Limitations of the Commission’s Proposal to Criminalise Image-Based Sexual Abuse’ (2022) 13(4) NJECL 452, 773; See also Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1484
76
HL Deb 13 February 2024 vol 836, col 1269; Emilie Lavinia, ‘“I’ve seen boys request fake nudes of their teachers and mothers”: How Nudify Apps are Violating Women and Girls in the UK’ (Glamour, 24 June 2024) <https://www.glamourmagazine.co.uk/article/nudify-apps-investigation> accessed 22 March 2025; For advice relating to ‘Nudify’ apps please see generally Family Online Safety Institute (FOSI), ‘Understanding “Nudify” Apps’ (FOSI) <https://cdn.prod.website-files.com/5f4dd3623430990e705ccbba/66b4ddcbf6edc08bb9c98e5b_Understanding%20%E2%80%98Nudify%E2%80%99%20Apps%20Resource.pdf> accessed 24 March 2025
77
Sexual Offences Act 2003, s.66B(1)
78
Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1484
79
Sexual Offences Act 2003, s.66B(1)
80
Sexual Offences Act 2003, s.66B(4)
81
For an in-depth discussion on the distinct wrong of deepfakes please see generally Adrienne de Ruiter, ‘The Distinct Wrong of Deepfakes’ (2021) 34(4) Philosophy & Technology 1311
82
Carlotta Rigotti, Clare McGlynn and Franziska Benning, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472, 1493
83
End Violence Against Women, ‘Government Criminalises Creation of Deepfakes, but with a Major Loophole’ (End Violence Against Women, 16 April 2024) <https://www.endviolenceagainstwomen.org.uk/government-criminalises-creation-of-deepfakes-but-with-a-major-loophole/> accessed 2 January 2025; Harriet Line, ‘Crackdown on Deepfake Porn Makers’ (Daily Mail London, 16 April 2024) <https://www-proquest-com.ezproxy.kingston.ac.uk/newspapers/crackdown-on-deepfake-porn-makers/docview/3039086565/se-2?accountid=14557> accessed 5 January 2025; Herbert Smith Freehills, ‘Criminalising Deepfakes – the UK’s new offences following the Online Safety Act’ (Herbert Smith Freehills, 21 May 2024) <https://www.herbertsmithfreehills.com/notes/tmt/2024-05/criminalising-deepfakes-the-uks-new-offences-following-the-online-safety-act> accessed 5 January 2025
84
Women and Equalities Committee, ‘Tackling Non-Consensual Intimate Image Abuse’ (HC 336, 5 March 2025) <https://committees.parliament.uk/publications/46899/documents/241995/default/> accessed 26 March 2025
85
UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025; UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025
86
UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025
87
End Violence Against Women, ‘Government Criminalises Creation of Deepfakes, but with a Major Loophole’ (End Violence Against Women, 16 April 2024) <https://www.endviolenceagainstwomen.org.uk/government-criminalises-creation-of-deepfakes-but-with-a-major-loophole/> accessed 2 January 2025
88
Change.org, ‘Deepfake Sexual Abuse is not ‘Porn’: Demand action to stop image-based abuse!’ (Change.org) <https://www.change.org/p/deepfake-sexual-abuse-is-not-porn-demand-action-to-stop-image-based-abuse> accessed 26 March 2025; See also End Violence Against Women, ‘Government U-Turn on Deepfakes Offence’ (End Violence Against Women, 27 January 2025) <https://www.endviolenceagainstwomen.org.uk/government-u-turn-on-deepfakes-offence/> accessed 26 March 2025
89
UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025
90
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
91
Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106036
92
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025; End Violence Against Women, ‘Campaign Win: Law to Stop Deepfake Abuse’ (End Violence Against Women, 7 January 2025) <https://www.endviolenceagainstwomen.org.uk/campaign-win-law-to-stop-deepfake-abuse/> accessed 26 March 2025
93
Clare McGlynn and Gemma Davies, ‘Soliciting the Creation of Sexually Explicit Deepfakes: Analysis of the Current Criminal Law, Loopholes and Reform Options’ (IIA0012 - Written Evidence to the House of Commons Enquiry, January 2025) <https://committees.parliament.uk/writtenevidence/134382/html/> accessed 14 January 2025
94
HL Deb 28 January 2025 vol 843 col 216; Durham University, ‘Professor Clare McGlynn helps change law on sexually explicit deepfakes’ (Durham Law School, 4 February 2025) <https://www.durham.ac.uk/departments/academic/law/news-and-events/news/2025/02/changing-the-law-on-explicit-deepfakes/?utm_source=chatgpt.com> accessed 26 March 2025
95
Durham University, ‘Professor Clare McGlynn helps change law on sexually explicit deepfakes’ (Durham Law School, 4 February 2025) <https://www.durham.ac.uk/departments/academic/law/news-and-events/news/2025/02/changing-the-law-on-explicit-deepfakes/?utm_source=chatgpt.com> accessed 26 March 2025
96
Clare McGlynn and Gemma Davies, ‘Soliciting the Creation of Sexually Explicit Deepfakes: Analysis of the Current Criminal Law, Loopholes and Reform Options’ (IIA0012 - Written Evidence to the House of Commons Enquiry, January 2025) <https://committees.parliament.uk/writtenevidence/134382/html/> accessed 14 January 2025
97
HL Deb 28 January 2025 vol 843 col 216, Lord Pannick at 8:30pm; See also Durham University, ‘Professor Clare McGlynn helps change law on sexually explicit deepfakes’ (Durham Law School, 4 February 2025) <https://www.durham.ac.uk/departments/academic/law/news-and-events/news/2025/02/changing-the-law-on-explicit-deepfakes/?utm_source=chatgpt.com> accessed 26 March 2025
98
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 547; Nicola Henry and Anastasia Powell, ‘Sexual Violence in the Digital Age: The Scope and Limits of Criminal Law’ (2016) 25(4) Social & Legal Studies 397, 402-403; See also Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106028
99
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
100
UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025; UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025
101
Women and Equalities Committee, ‘Tackling Non-Consensual Intimate Image Abuse’ (HC 336, 5 March 2025) 5-6 <https://committees.parliament.uk/publications/46899/documents/241995/default/> accessed 26 March 2025
102
End Violence Against Women, ‘Parliament Hears Call for Action on Deepfake Sexual Abuse’ (End Violence Against Women, 15 November 2024) <https://www.endviolenceagainstwomen.org.uk/parliament-hears-call-for-action-on-deepfake-sexual-abuse/> accessed 28 March 2025
103
Sexual Offences Act 2003, s.66B(1)
104
Sexual Offences Act 2003, s.66A(5)(a); Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106029
105
Protection of Children Act 1978, s.1(1)(a)
106
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.209]
107
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.210]-[4.211]
108
Carl Öhman, “Introducing the pervert’s dilemma: a contribution to the critique of Deepfake Pornography” (2020) 22 Ethics and Information Technology 133, 137; See also Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.176]
109
Durham University, ‘Deepfake Porn: Why we need to make it a crime to create it, not just share it’ (Durham University, 9 April 2024) <https://www.durham.ac.uk/research/current/thought-leadership/2024/04/deepfake-porn-why-we-need-to-make-it-a-crime-to-create-it-not-just-share-it/> accessed 6 January 2025
110
Ibid
111
Adrienne de Ruiter, ‘The Distinct Wrong of Deepfakes’ (2021) 34(4) Philosophy & Technology 1311, 1322-1328
112
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.220]
113
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
114
Consent is defined by the Sexual Offences Act 2003, s.74: ‘a person consents if he agrees by choice and has the freedom and capacity to make that choice’
115
Terry Thomas, Sex Crime: Sex Offending and Society (2nd edn, Taylor & Francis 2005) 8-9, 62
116
Elisa Hoven and Thomas Weigend (eds), Consent and Sexual Offenses: Comparative Perspectives (Nomos 2022) 7; Tom O’Malley and Elisa Hoven, ‘Consent in the Law Relating to Sexual Offences’ in Kai Ambos and Others (eds) Core Concepts in Criminal Law and Criminal Justice (CUP 2020) Ch 5
117
Bloomberg UK, ‘Levittown: A Victim of Fake Pornography Hunts for her Harasser’ (Bloomberg, 22 March 2025) <https://www.bloomberg.com/news/articles/2025-03-22/levittown-victim-of-deepfake-porn-hunts-for-her-harasser-online> accessed 25 March 2025; Rachel Bowman, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025; Tiffanie Turnbull, ‘Woman’s Deepfake Betrayal by Close Friend: “Every moment turned into Porn”’ (BBC, 8 February 2025) <https://www.bbc.co.uk/news/articles/cm21j341m31o> accessed 25 March 2025; Cathy Newman, ‘“Deepfake Porn”: How I Became a Victim’ (4 News, 27 December 2024) <https://www.channel4.com/news/deepfake-porn-how-i-became-a-victim> accessed 25 March 2025; Simone Obadia, ‘Survivor Safety: Deepfakes and the Negative Impacts of AI Technology’ (Maryland Coalition Against Sexual Assault, 8 May 2024) <https://mcasa.org/newsletters/article/survivor-safety-deepfakes-and-negative-impacts-of-ai-technology> accessed 25 March 2025; Helen Bushby, ‘Deepfake Porn Documentary Explores its “Life-Shattering” Impact’ (BBC, 18 June 2023) <https://www.bbc.co.uk/news/entertainment-arts-65854112> accessed 25 March 2025
118
Beatrice Sciacca, ‘Nonconsensual Dissemination of Sexual Images Among Adolescents: Associations with Depression and Self-Esteem’ (2023) 38 (15-16) Journal of Interpersonal Violence 9438, 9452-9455; See also American Academy of Paediatrics, ‘The Impact of Deepfakes, Synthetic Pornography, & Virtual Child Sexual Abuse Material’ (AAP, 13 March 2025) <https://www.aap.org/en/patient-care/media-and-children/center-of-excellence-on-social-media-and-youth-mental-health/qa-portal/qa-portal-library/qa-portal-library-questions/the-impact-of-deepfakes-synthetic-pornography--virtual-child-sexual-abuse-material/?utm_source=chatgpt.com> accessed 25 March 2025
119
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
120
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 535-536; Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
121
Sexual Offences Act 2003, s.66B(1)
122
Protection of Children Act 1978, s.1(1)(a)
123
UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025; UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025
124
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
125
Elisa Hoven and Thomas Weigend (eds), Consent and Sexual Offenses: Comparative Perspectives (Nomos 2022) 7; Tom O’Malley and Elisa Hoven, ‘Consent in the Law Relating to Sexual Offences’ in Kai Ambos and Others (eds) Core Concepts in Criminal Law and Criminal Justice (CUP 2020) Ch 5
126
For example, the following offences are all constructed around the absence of consent: Sexual Offences Act 2003, s.1 (rape), s.2 (assault by penetration), s.3 (sexual assault) and s.4 (causing a person to engage in sexual activity)
127
Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534, 535-536; Beatriz Kira, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024, 106026
128
Carl Öhman, “Introducing the pervert’s dilemma: a contribution to the critique of Deepfake Pornography” (2020) 22 Ethics and Information Technology 133, 137; See also Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.176]
129
Women and Equalities Committee, ‘Tackling Non-Consensual Intimate Image Abuse’ (HC 336, 5 March 2025) 5-6 <https://committees.parliament.uk/publications/46899/documents/241995/default/> accessed 26 March 2025
130
End Violence Against Women, ‘Parliament Hears Call for Action on Deepfake Sexual Abuse’ (End Violence Against Women, 15 November 2024) <https://www.endviolenceagainstwomen.org.uk/parliament-hears-call-for-action-on-deepfake-sexual-abuse/> accessed 28 March 2025
131
Beatrice Sciacca, ‘Nonconsensual Dissemination of Sexual Images Among Adolescents: Associations with Depression and Self-Esteem’ (2023) 38 (15-16) Journal of Interpersonal Violence 9438, 9452-9455; See here for real world examples of the significant harm caused by NCID creation and dissemination: Bloomberg UK, ‘Levittown: A Victim of Fake Pornography Hunts for her Harasser’ (Bloomberg, 22 March 2025) <https://www.bloomberg.com/news/articles/2025-03-22/levittown-victim-of-deepfake-porn-hunts-for-her-harasser-online> accessed 25 March 2025; Rachel Bowman, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025; Tiffanie Turnbull, ‘Woman’s Deepfake Betrayal by Close Friend: “Every moment turned into Porn”’ (BBC, 8 February 2025) <https://www.bbc.co.uk/news/articles/cm21j341m31o> accessed 25 March 2025; Cathy Newman, ‘“Deepfake Porn”: How I Became a Victim’ (4 News, 27 December 2024) <https://www.channel4.com/news/deepfake-porn-how-i-became-a-victim> accessed 25 March 2025; Helen Bushby, ‘Deepfake Porn Documentary Explores its “Life-Shattering” Impact’ (BBC, 18 June 2023) <https://www.bbc.co.uk/news/entertainment-arts-65854112> accessed 25 March 2025
132
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022) [4.210]-[4.211]

Bibliography

Legislation
Criminal Justice and Courts Act 2015
Criminal Justice and Courts HL Bill (2014-15)
Data (Use and Access) HL Bill (2024-25)
Malicious Communications Act 1988
Online Safety Act 2023
Protection from Harassment Act 1997
Protection of Children Act 1978
Sexual Offences Act 2003
International Legislation
Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L168/1
Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence Against Women and Domestic Violence [2024] OJ L1385/1
Official Government Sources
Department for Digital, Culture, Media and Sport and Home Office, Online Harms White Paper (CP 57, April 2019)
HC Deb 1 December 2014, vol 589, col 120
HL Deb 13 February 2024 vol 836, col 1269
HL Deb 28 January 2025 vol 843 col 216
Law Commission, Intimate Image Abuse: a Final Report (Law Com No. 407, 2022)
UK Government, ‘Better Protection for Victims Thanks to New Law on Sexually Explicit Deepfakes’ (UK Government, 22 January 2025) <https://www.gov.uk/government/news/better-protection-for-victims-thanks-to-new-law-on-sexually-explicit-deepfakes> accessed 26 March 2025
UK Government, ‘Government Crackdown on Explicit Deepfakes’ (Ministry of Justice, 7 January 2025) <https://www.gov.uk/government/news/government-crackdown-on-explicit-deepfakes> accessed 7 January 2025
UK Government, ‘Government Cracks down on “Deepfakes” Creation’ (UK Government, 16 April 2024) <https://www.gov.uk/government/news/government-cracks-down-on-deepfakes-creation> accessed 3 January 2025
UK Parliament, ‘Ping-pong’ (UK Parliament – Glossary) <https://www.parliament.uk/site-information/glossary/ping-pong/> accessed 9 January 2025
UK Police, ‘Sextortion’ (Police.UK) <https://www.met.police.uk/advice/advice-and-information/online-safety/online-safety/sextortion/> accessed 10 March 2025
Women and Equalities Committee, ‘Tackling Non-Consensual Intimate Image Abuse’ (HC 336, 5 March 2025) <https://committees.parliament.uk/publications/46899/documents/241995/default/> accessed 26 March 2025
Books
Henry N and Others, Image-based Sexual Abuse: A Study on the Causes and Consequences of Non-consensual Nude or Sexual Imagery (Routledge 2020)
Hoven E and Weigend T (eds), Consent and Sexual Offenses: Comparative Perspectives (Nomos 2022)
Lyon B and Tora M, Exploring Deepfakes: Deploy Powerful AI Techniques for Face Replacement and More with this Comprehensive Guide (Packt Publishing 2023)
Salter M and Mason J, Writing Law Dissertations (Pearson 2007)
Thomas T, Sex Crime: Sex Offending and Society (2nd edn, Taylor & Francis 2005)
Watkins D and Burton M, Research Methods in Law (2nd edn, Taylor & Francis)
Chapters in Edited Books
O’Malley T and Hoven E, ‘Consent in the Law Relating to Sexual Offences’ in Ambos K and Others (eds) Core Concepts in Criminal Law and Criminal Justice (CUP 2020)
Sujay D, Kapoor V and Shandilya S, ‘A Comprehensive Survey of Technological Approaches in the Detection of CSAM’ in Shandilya S, Sujay D and Gupta VB (eds), Advancements in Cyber Crime Investigations and Modern Data Analytics (CRC Press 2024)
Tolosana R and others, ‘An Introduction to Digital Face Manipulation’, in Rathgeb C and others (eds), Handbook of Digital Face Manipulation and Detection (Springer Nature 2022)
Articles
De Ruiter A, ‘The Distinct Wrong of Deepfakes’ (2021) 34(4) Philosophy & Technology 1311
Finkelhor D and Others, ‘Persisting Concerns About Image Exposure Among Survivors of Image-Based Sexual Exploitation and Abuse in Childhood’ [2024] Psychological Trauma 1
Fisher S, Howard J and Kira B, ‘Moderating Synthetic Content: the Challenge of Generative AI’ (2024) 37(4) Philosophy & Technology 133
Gillespie A, “Trust Me: It’s Only for Me’: Revenge Porn and the Criminal Law’ (2015) 11 CLR 866
Henry N and Powell A, ‘Sexual Violence in the Digital Age: The Scope and Limits of Criminal Law’ (2016) 25(4) Social & Legal Studies 397
Juefei-Xu F and Others, ‘Countering Malicious Deepfakes: Survey, Battleground, and Horizon’ (2022) 130 (7) International Journal of Computer Vision 1678
Kira B, ‘When non-consensual intimate deepfakes go viral: the insufficiency of the UK Online Safety Act’ (2024) 54 Computer Law & Security Review 106024
Kirchengast T, ‘Deepfakes and Image Manipulation: Criminalisation and Control’ (2020) 29(3) Information & Communications Technology Law 308
MacKinnon C, ‘Not a Moral Issue’ (1984) 2 Yale Law & Policy Review 321
Mania K, ‘Legal Protection of Revenge and Deepfake Porn Victims in the European Union: Findings From a Comparative Legal Study’ (2024) 25(1) Trauma, Violence & Abuse 117
Maras MH and Logie K, ‘Countering the Complex, Multifaceted Nature of Nude and Sexually Explicit Deepfakes: an Augean Task?’ (2024) 13(1) Crime Science 1
McGlynn C and Rackley E, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534
McGlynn C and Rackley E, ‘Not Porn, but Abuse: Let’s Call it Image-Based Sexual Abuse’ (Everyday Victim Blaming, 9 March 2016)
Öhman C, “Introducing the pervert’s dilemma: a contribution to the critique of Deepfake Pornography” (2020) 22 Ethics and Information Technology 133
Popova M, ‘Deepfakes: An Introduction’ (2020) 7(4) Porn Studies 350
Rigotti C and McGlynn C, ‘Towards an EU Criminal Law on Violence against Women: The Ambitions and Limitations of the Commission’s Proposal to Criminalise Image-Based Sexual Abuse’ (2022) 13(4) NJECL 452
Rigotti C, McGlynn C and Benning F, ‘Image-Based Sexual Abuse and EU Law: A Critical Analysis’ (2024) 25(9) German Law Journal 1472
Sciacca B, ‘Nonconsensual Dissemination of Sexual Images Among Adolescents: Associations with Depression and Self-Esteem’ (2023) 38 (15-16) Journal of Interpersonal Violence 9438
Shamo S, ‘The Deepfake and its Impact on Trading Signals’ (2025) Bentley University Working Paper <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5070125> accessed 10 March 2025
Sippy T and Others, ‘Behind the Deepfake: 8% Create; 90% Concerned: Surveying Public Exposure to and Perceptions of Deepfakes in the UK’ [2024] <https://arxiv.org/abs/2407.05529> accessed 7 March 2025
Tolosana R and Others, ‘Deepfakes and Beyond: a Survey of Face Manipulation and Fake Detection’ (2020) 64 Information Fusion 131
Van der Nagel E, ‘Verifying Images: Deepfakes, Control, and Consent’ (2020) 7(4) Porn Studies 424
Van der Sloot B and Wagensveld Y, ‘Deepfakes: regulatory challenges for the synthetic society’ (2022) 46 Computer Law & Security Review 105716
Velijović S, Ćurčić M and Gavrilović I, ‘Dark Sides of Deepfake Technology’ (2024) 72 (3)
Viola M and Voto C, ‘Designed to Abuse? Deepfakes and the non-consensual diffusion of intimate images’ (2023) 201 Synthese 30
Yavuz C, ‘Adverse Human Rights Impacts of Dissemination of Nonconsensual Sexual Deepfakes in the Framework of European Convention on Human Rights: A Victim-Centred Perspective’ (2025) 56 Computer Law and Security Report 106108
Websites
Bates D, ‘Melania Trump to Tackle Revenge Porn in Second First Lady Stint’ (The Telegraph, 3 March 2025) <https://www.telegraph.co.uk/us/news/2025/03/03/melania-trump-to-tackle-revenge-porn-in-second-first-lady/> accessed 10 March 2025
Bloomberg UK, ‘Levittown: A Victim of Fake Pornography Hunts for her Harasser’ (Bloomberg, 22 March 2025) <https://www.bloomberg.com/news/articles/2025-03-22/levittown-victim-of-deepfake-porn-hunts-for-her-harasser-online> accessed 25 March 2025
Bowman R, ‘Deepfake Porn Victim Elliston Berry’s harrowing Story as Teen Joins Melania at Trump’s Joint Session Speech’ (Mail Online, 4 March 2025) <https://www.dailymail.co.uk/news/article-14460669/elliston-berry-melania-donald-trump-joint-session-speech.html> accessed 10 March 2025
Bushby H, ‘Deepfake Porn Documentary Explores its “Life-Shattering” Impact’ (BBC, 18 June 2023) <https://www.bbc.co.uk/news/entertainment-arts-65854112> accessed 25 March 2025
Change.org, ‘Deepfake Sexual Abuse is not ‘Porn’: Demand action to stop image-based abuse!’ (Change.org) <https://www.change.org/p/deepfake-sexual-abuse-is-not-porn-demand-action-to-stop-image-based-abuse> accessed 26 March 2025
CPS, ‘Indecent and Prohibited Images of Children’ (CPS – Legal Guidance, 7 May 2024) <https://www.cps.gov.uk/legal-guidance/indecent-and-prohibited-images-children> accessed 1 April 2025
Criddle C, ‘Review Calls for Sharing of “Deepfake Porn” to be illegal: Law Commission’ (Financial Times, 7 July 2022) <https://www.proquest.com/newspapers/review-calls-sharing-deepfake-porn-be-illegal/docview/2698979141/se-2?accountid=14557> accessed 7 January 2025
Davies R, ‘Telegram facing probe over AI-generated deepfake porn shared on app’ (Read Write, 3 September 2024) <https://www.proquest.com/blogs-podcasts-websites/telegram-facing-probe-over-ai-generated-deepfake/docview/3100250046/se-2?accountid=14557> accessed 5 January 2025
Dawson C, ‘Financial Services Face Up to Deepfake Risks’ (Clifford Chance, 14 October 2024) <https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2024/10/financial-services-face-up-to-deepfake-risks.html> accessed 10 March 2025
Deeptrace, ‘The State of Deepfakes: Landscape, Threats, and Impact’ (Deeptrace Labs, 1 September 2019) <https://regmedia.co.uk/2019/10/08/deepfake_report.pdf> accessed 7 March 2025
Desmarais A, ‘Creating Deepfake Porn to be made a Crime in the UK under “first of its kind” law’ (Euro News, 17 April 2024) <https://www.euronews.com/next/2024/04/17/creating-deepfake-porn-to-be-made-a-crime-in-uk-under-first-of-its-kind-law> accessed 6 January 2025
Durham University, ‘Deepfake Porn: Why we need to make it a crime to create it, not just share it’ (Durham University, 9 April 2024) <https://www.durham.ac.uk/research/current/thought-leadership/2024/04/deepfake-porn-why-we-need-to-make-it-a-crime-to-create-it-not-just-share-it/> accessed 6 January 2025
Durham University, ‘Professor Clare McGlynn helps change law on sexually explicit deepfakes’ (Durham Law School, 4 February 2025) <https://www.durham.ac.uk/departments/academic/law/news-and-events/news/2025/02/changing-the-law-on-explicit-deepfakes/?utm_source=chatgpt.com> accessed 26 March 2025
Elliott V, ‘The US Needs Deepfake Porn Laws: These States are Leading the Way’ (Wired, 5 September 2024) <https://www.wired.com/story/deepfake-ai-porn-laws/> accessed 6 January 2025
End Violence Against Women, ‘Campaign Win: Law to Stop Deepfake Abuse’ (End Violence Against Women, 7 January 2025) <https://www.endviolenceagainstwomen.org.uk/campaign-win-law-to-stop-deepfake-abuse/> accessed 26 March 2025
End Violence Against Women, ‘Government Criminalises Creation of Deepfakes, but with a Major Loophole’ (End Violence Against Women, 16 April 2024) <https://www.endviolenceagainstwomen.org.uk/government-criminalises-creation-of-deepfakes-but-with-a-major-loophole/> accessed 2 January 2025
End Violence Against Women, ‘Government U-Turn on Deepfakes Offence’ (End Violence Against Women, 27 January 2025) <https://www.endviolenceagainstwomen.org.uk/government-u-turn-on-deepfakes-offence/> accessed 26 March 2025
End Violence Against Women, ‘Parliament Hears Call for Action on Deepfake Sexual Abuse’ (End Violence Against Women, 15 November 2024) <https://www.endviolenceagainstwomen.org.uk/parliament-hears-call-for-action-on-deepfake-sexual-abuse/> accessed 28 March 2025
ESET, ‘Nearly Two-Thirds of Women Worry about Being a Victim of Deepfake Pornography, ESET UK Research Reveals’ (ESET, 20 March 2024) <https://www.eset.com/uk/about/newsroom/press-releases/nearly-two-thirds-of-women-worry-about-being-a-victim-of-deepfake-pornography-eset-uk-research-reveals/> accessed 25 March 2025
Europol, ‘Malicious Uses and Abuses of Artificial Intelligence’ (Europol, 6 December 2021) <https://www.europol.europa.eu/cms/sites/default/files/documents/malicious_uses_and_abuses_of_artificial_intelligence_europol.pdf> accessed 6 March 2025
Family Online Safety Institute (FOSI), ‘Understanding “Nudify” Apps’ (FOSI) <https://cdn.prod.website-files.com/5f4dd3623430990e705ccbba/66b4ddcbf6edc08bb9c98e5b_Understanding%20%E2%80%98Nudify%E2%80%99%20Apps%20Resource.pdf> accessed 24 March 2025
Federal Bureau of Investigation (FBI), ‘Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes’ (FBI – Public Service Announcement – Alert I-060523-PSA, 5 June 2023) <https://www.ic3.gov/PSA/2023/PSA230605> accessed 10 March 2025
Finnerty N and Trotter A, ‘Tackling the regulation of sexually explicit deepfakes’ (Kingsley Napley, 25 June 2024) <https://www.kingsleynapley.co.uk/insights/blogs/criminal-law-blog/tackling-the-regulation-of-sexually-explicit-deepfakes> accessed 5 January 2025
Geschwindt S, ‘Taylor Swift deepfake porn deluge a “wake-up call” for lawmakers’ (TheNextWeb, 1 February 2024) <https://www.proquest.com/blogs-podcasts-websites/taylor-swift-deepfake-porn-deluge-wake-up-call/docview/2920603673/se-2?accountid=14557> accessed 5 January 2025
Hao K, ‘Deepfake Porn is Ruining Women’s Lives. Now the law may finally ban it’ (MIT Technology Review, 12 February 2021) <https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/> accessed 6 January 2025
Heikkilä M, ‘Three Ways we can fight Deepfake Porn’ (MIT Technology Review, 29 January 2024) <https://www.proquest.com/other-sources/three-ways-we-can-fight-deepfake-porn/docview/2920195062/se-2?accountid=14557> accessed 6 January 2025
Herbert Smith Freehills, ‘Criminalising Deepfakes – the UK’s new offences following the Online Safety Act’ (Herbert Smith Freehills, 21 May 2024) <https://www.herbertsmithfreehills.com/notes/tmt/2024-05/criminalising-deepfakes-the-uks-new-offences-following-the-online-safety-act> accessed 5 January 2025
Hörnle J, ‘Deepfakes and the Law: Why Britain needs Stronger Protections against Technology-Facilitated Abuse’ (Queen Mary University of London, 23 January 2025) <https://www.qmul.ac.uk/media/news/2025/humanities-and-social-sciences/hss/deepfakes-and-the-law-why-britain-needs-stronger-protections-against-technology-facilitated-abuse.html> accessed 25 March 2025
Internet Watch Foundation (IWF), ‘Online Safety Briefing: House of Lords – Briefing for Report Stage’ (IWF, 4 July 2023) <https://www.iwf.org.uk/media/cmubuhzb/osb-lords-report-briefing-04-07-2023.pdf> accessed 6 March 2025
IWF, ‘Artificial Intelligence (AI) and the Production of Child Sexual Abuse Material’ (IWF, 2024) <https://www.iwf.org.uk/about-us/why-we-exist/our-research/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery/> accessed 10 March 2025
IWF, ‘What has Changed in the AI CSAM Landscape?’ (IWF, July 2024) <https://www.iwf.org.uk/media/nadlcb1z/iwf-ai-csam-report_update-public-jul24v13.pdf> accessed 10 March 2025
Laird E, Dwyer M and Woelfel K, ‘In Deep Trouble: Surfacing Tech-Powered Sexual Harassment in K-12 Schools’ (Centre for Democracy & Technology, 1 September 2024) <https://cdt.org/wp-content/uploads/2024/09/2024-09-26-final-Civic-Tech-Fall-Polling-research-1.pdf> accessed 3 January 2025
Langreo L, ‘Students are Sharing Sexually Explicit “Deepfakes”: Are Schools Prepared?’ (Education Week, 26 September 2024) <https://www.edweek.org/leadership/students-are-sharing-sexually-explicit-deepfakes-are-schools-prepared/2024/09> accessed 3 January 2025
Lavinia E, ‘I’ve seen boys request fake nudes of their teachers and mothers’: How Nudify Apps are Violating Women and Girls in the UK’ (Glamour, 24 June 2024) <https://www.glamourmagazine.co.uk/article/nudify-apps-investigation> accessed 22 March 2025
Leake N, ‘Fear of Deepfake Porn puts Women off Political Roles, warns Baroness Owen’ (The Daily Telegraph, 19 December 2024) <https://www-proquest-com.ezproxy.kingston.ac.uk/newspapers/fear-deepfake-porn-puts-women-off-political-roles/docview/3146645713/se-2?accountid=14557> accessed 7 January 2025
Lees D, ‘Deepfakes are Being Used for Good’ (University of Reading, 8 November 2022) <https://research.reading.ac.uk/research-blog/2022/11/08/deepfakes-are-being-used-for-good-heres-how/> accessed 6 March 2025
Line H, ‘Crackdown on Deepfake Porn Makers’ (Daily Mail London, 16 April 2024) <https://www-proquest-com.ezproxy.kingston.ac.uk/newspapers/crackdown-on-deepfake-porn-makers/docview/3039086565/se-2?accountid=14557> accessed 5 January 2025
McGlynn C and Davies G, ‘Soliciting the Creation of Sexually Explicit Deepfakes: Analysis of the Current Criminal Law, Loopholes and Reform Options’ (IIA0012 - Written Evidence to the House of Commons Enquiry, January 2025) <https://committees.parliament.uk/writtenevidence/134382/html/> accessed 14 January 2025
Newman C, ‘“Deepfake Porn”: How I Became a Victim’ (4 News, 27 December 2024) <https://www.channel4.com/news/deepfake-porn-how-i-became-a-victim> accessed 25 March 2025
Obadia S, ‘Survivor Safety: Deepfakes and the Negative Impacts of AI Technology’ (Maryland Coalition Against Sexual Assault, 8 May 2024) <https://mcasa.org/newsletters/article/survivor-safety-deepfakes-and-negative-impacts-of-ai-technology> accessed 25 March 2025
Ofcom, ‘A Deep Dive into Deepfakes that Demean, Defraud and Disinform’ (Ofcom, 23 July 2024) <https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/deepfakes-demean-defraud-disinform/> accessed 7 March 2025
Oxford English Dictionary, ‘Deepfake’ (Oxford English Dictionary) <https://www.oed.com/dictionary/deepfake_n?tab=meaning_and_use#1345352340> accessed 3 January 2025
Rahman-Jones I, ‘Taylor Swift Deepfakes Spark calls in Congress for New Legislation’ (BBC, 27 January 2024) <https://www.bbc.co.uk/news/technology-68110476> accessed 25 March 2025
Stryker C and Kavlakoglu E, ‘What is Artificial Intelligence (AI)?’ (IBM, 9 August 2024) <https://www.ibm.com/think/topics/artificial-intelligence> accessed 6 March 2025
Turnbull T, ‘Woman’s Deepfake Betrayal by Close Friend: “Every moment turned into Porn”’ (BBC, 8 February 2025) <https://www.bbc.co.uk/news/articles/cm21j341m31o> accessed 25 March 2025
Tyler M, ‘All Porn is Revenge Porn’ (Feminist Current, 24 February 2016) <https://www.feministcurrent.com/2016/02/24/all-porn-is-revenge-porn/> accessed 10 March 2025
University of Bath, ‘Deepfake Shows its Positive Face’ (University of Bath, 20 June 2024) <https://www.bath.ac.uk/announcements/deepfake-shows-its-positive-face/> accessed 10 March 2025
Williams R, ‘The Download: How to Combat Deepfake Porn, and Neuralink’s First Implant’ (MIT Technology Review, 30 January 2024) <https://www.proquest.com/other-sources/download-how-combat-deepfake-porn-neuralink-s/docview/2920195035/se-2?accountid=14557> accessed 7 January 2025