
Bodies in Bytes - An Argument for the Explicit Criminalisation of Involuntary Intimate Deepfakes in Aotearoa New Zealand

Bella-Francesca Stuart

A dissertation submitted in partial fulfilment of the degree of Bachelor of Laws (Honours) at the University of Otago – Te Whare Wānanga o Otāgo

October 2023

Acknowledgements

To my supervisor, Alex Latu. Thank you for your invaluable guidance, feedback and insight – I could not have asked for a more dedicated supervisor. Most importantly, thank you for reminding me to trust in myself. You have encouraged me to produce something that is truly “me”, for which I am eternally grateful.

To Caccia, for inspiring this topic. I hope I have done it justice.

To all my wonderful friends, near and far, for your unwavering support of me and everything I do. In particular, thank you to Jade, Carys, Ciaran and Madi. Without your constant encouragement, kindness and excellent proof-reading skills, this dissertation would not have been possible (and would certainly not have been NZLSG compliant).

To Clem, George, Jaiden and Shani for keeping me company on this 15,000-word journey. I am so privileged to consider you my friends, and I am beyond excited to see the wonderful things you will all achieve.

Finally, thank you to my family. To Dad, for inspiring my love of words. To Mum, for being my number one supporter. To Sam, for being my best friend and the sister I never had. Thank you for being proud of me no matter what. Without you, I would not be who or where I am today.

We must not make a scarecrow of the law,
Setting it up to fear the birds of prey,
And let it keep one shape, till custom make it
Their perch and not their terror.

William Shakespeare, Measure for Measure, Act 2, Scene 1.

Introduction

First achieving viral status on Reddit in 2017, the colloquially termed “deepfake porn” depicts real people in intimate scenarios which never occurred in reality. When made without the subject’s consent, this hyper-realistic manipulated content is a revolutionary iteration of image-based sexual abuse.1 Though not yet reported as occurring in New Zealand on a great scale, neither are we immune. Netsafe’s Chief Online Safety Officer Sean Lyons noted an increase in reports of the phenomenon in 2022,2 and the New Zealand Police have described that “incipient signs” make deepfake pornography a “phenomenon of concern... [that] needs to be watched closely.”3

Awareness of deepfake pornography has increased in the domestic public consciousness in recent years.4 Despite this it remains largely ignored by New Zealand’s academic, political and legal communities. Credit must be given to Arran Hunt,5 Curtis Barnes and Tom Barraclough,6 who have been among the few attempting to raise awareness of the phenomenon’s domestic legal status, given that New Zealand has no legislative provisions explicitly addressing the potential harms of deepfake pornography.7 While a “wide range of legal and pseudo-legal

1 The concept of “image based sexual abuse” was developed by Clare McGlynn and Erika Rackley, who use it to describe the wide-ranging continuum of abusive practices involving the involuntary making, taking and distribution of intimate images. See Clare McGlynn and Erika Rackley “Image-Based Sexual Abuse” (2017) 37 OJLS 534.

2 Finn Hogan “Experts Concerned Over the Rise of Deepfake Technology” Newshub (online ed, New Zealand, 29 October 2022).

3 Miriam Lips and Elizabeth Eppel Mapping Media Content Harms: A Report Prepared for Department of Internal Affairs (Victoria University of Wellington Te Herenga Waka, 22 September 2022) at 12.

4 See “NZ’s First Case of Deepfake Pornography Triggering Alarm Bells for Officials” 1News (online ed, New Zealand, 1 August 2020); “AI-Generated Nude Images Spread at Spain School, Parents Outraged” 1News (online ed, New Zealand, 24 September 2023); Sophie Harris “Woman Left Distraught After Stalker Edited Her Face into Porn Video, Put it Online” Stuff (online ed, New Zealand, 1 July 2022); AP “Deepfake Porn Could be a Growing Problem Amid AI Race” NewstalkZB (online ed, New Zealand, 17 April 2023); Simon Shepherd “Digital Experts Worried About Growth of ‘Deep Fakes’ Videos” Newshub (online ed, New Zealand, 2 April 2018); Katie Harris “Calls from MPs and Survivor for Protections for ‘Deepfake Porn’ Victims” NZ Herald (online ed, New Zealand, 7 December 2021); Drew Harwell “Fake-Porn Videos Are Being Weaponised to Harass and Humiliate Women” Stuff (online ed, New Zealand, 31 December 2018); Derek Hawkins “Reddit Bans ‘Deepfakes,’ Pornography Using the Faces of Celebrities” Stuff (online ed, New Zealand, 9 February 2018); Brittany Keogh “Deepfakes: New Zealand Experts on How ‘Face-Swap’ Could Turn Sinister” Stuff (online ed, New Zealand, 22 March 2020); and David Court “Deepfake Videos Coming to a Social Media Platform You Use” Stuff (online ed, New Zealand, 16 June 2019).

5 Arran Hunt is a Partner at McVeagh Fleming, and former member of the Auckland District Law Society’s Technology and Law Committee. I extend my sincere gratitude to Arran for his support of this dissertation.

6 Curtis Barnes and Tom Barraclough are Directors of Brainbox Limited – a New Zealand-based think tank and consultancy on issues of technology, law and public policy.

7 However, academic recognition and contemplation of the phenomenon appears to be slowly increasing, see Nikki Chamberlain “Privacy and Social Media” in Nikki Chamberlain and Stephen Penk (eds) "Privacy - A to Z of New Zealand Law" (online ed, Thomson Reuters, 2023) at [46.7.4.2].

regimes” relate broadly to the harms these hyper-realistic manipulated images can inflict, how these regimes may apply has not yet been considered by the courts, and has been subject to minimal academic discussion.8 This has left a vacuum of ambiguity which, while broadly recognised by a number of commentators, has not yet been subject to comprehensive analysis.9

This dissertation seeks to address this gap in the domestic literature, suggesting that beyond being merely ambiguous, New Zealand’s legal landscape is currently inadequate to address this emergent phenomenon. To resolve this, I will argue that legislative reform is necessary to explicitly criminalise the publication of involuntary intimate deepfakes.10 For present purposes, a distinction has been drawn between the publication and mere creation of deepfakes. For reasons which will be discussed in Chapter Five, the argument to criminalise mere creation of involuntary intimate deepfakes engages several unique considerations which justify a separate analysis. Further, criminalising the publication of involuntary intimate deepfakes is a complex legal issue and raises several collateral questions which fall outside this paper’s scope – such as whether criminalisation should extend to deepfakes which depict deceased individuals, and intersections between the phenomenon and questions of indigenous data sovereignty and Māori data rights. As such, much remains to be discussed. My hope is that these questions attract further scholarly consideration in the near future.

Fundamentally, the argument to criminalise the publication of involuntary intimate deepfakes requires two matters be established – that the harm caused by the publication of involuntary intimate deepfakes justifies their criminalisation, and that New Zealand’s criminal law is currently ill-equipped to impose adequate criminal liability. To justify criminalisation, Chapter

8 Curtis Barnes and Tom Barraclough Perception Inception: Preparing for Deepfakes and the Synthetic Media of Tomorrow (New Zealand, Brainbox, 2019) at [15].

9 See Hogan, above n 2; Rob Vaughan “Glaring Gaps in Harmful Digital Communications Bill” (12 November 2021) Stace Hammond <https://www.stacehammond.co.nz/glaring-gaps-in-harmful-digital-communications-bill/>; Arran Hunt and Kesia Denhardt “The Omnipresence of Online Harm” (paper presented to Legalwise Seminar) at 15-16; Karen Ngan and Michelle Dunlop “Pictures Don’t Lie, or Do They?” (20 May 2022) Simpson Grierson <https://www.simpsongrierson.com/insights-news/legal-updates/pictures-dont-lie-or-do-they>; Antonia M “Does New Zealand Need a Specific Law For Deepfakes” (6 September 2019) LinkedIn <https://www.linkedin.com/pulse/does-new-zealand-need-specific-law-deepfakes-antonia-modkova/>; Tess McClure “New Zealand ‘Revenge Porn’ Laws in Spotlight Amid Accusations Against Former National Candidate” The Guardian (online ed, New Zealand, 3 June 2021); Sara Barker “The Deepfake Dilemma: How it Affects Privacy, Security and Law in Aotearoa” FutureFive (online ed, New Zealand, 17 November 2021); and Sophie Cornish “Law Loopholes Around “Deepfakes” A Threat to Justice, Police and Law Experts Warn” Stuff (online ed, New Zealand, 30 July 2022).

10 The precise meaning and scope of the term “involuntary intimate deepfake” as used in this paper is explained further below at page 5. For present purposes, it is sufficient to note that the term is broader than “deepfake pornography” by capturing content which is intimate but not explicitly sexual (for example, mere nudity).

Two will establish that the phenomenon inflicts harms of a kind which warrant criminal legislative intervention, and Chapter Three will demonstrate that the criminal law is the only regulatory mechanism which can effectively address these harms. Then, Chapter Four will comprehensively consider the (in)ability of New Zealand’s existing criminal legal landscape to address the phenomenon. Finally, in Chapter Five I will provide some recommendations regarding the potential scope and design of a prospective offence, closing with a brief discussion on whether mere creation of involuntary intimate deepfakes could or should also attract criminal liability.

Chapter One – An Introduction to Deepfakes and the Analytical Framework

To provide the necessary context for this analysis, Chapter One will first explain the “deepfake” phenomenon, establish the scope of my argument for legislative reform and introduce the analytical framework I have adopted to mount this argument.

I Defining Deepfakes

A “portmanteau of ‘deep learning’ and ‘fake’”,11 “deepfakes” are hyper-realistic manipulated images produced using artificial intelligence technology.12 To create a deepfake, a “source library” of images is input into a machine learning programme which produces a “malleable likeness” of the source library’s “subject” – the individual who is (ostensibly) identifiable in the product content.13 This likeness can be superimposed into new content with false properties, resulting in “hyper-realistic, digitally falsified”14 imagery which depicts “someone undertaking acts they have not done and saying words they have not said.”15 Deepfakes are but one of numerous emergent media manipulation technologies – technologies which “synthesise new audio and images.”16 Though this paper refers to “deepfakes” throughout, unless a distinction is explicitly drawn, the conclusions reached also apply to other forms of hyper-realistic manipulated imagery.17
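
By way of illustration only, the two-stage pipeline described above can be sketched in simplified Python. Every name in this sketch is hypothetical, and the machine learning stage is reduced to a placeholder; real deepfake tools train deep neural networks (commonly autoencoder or generative adversarial network architectures) over large source libraries, which this stub does not attempt.

```python
# A purely illustrative sketch of the deepfake pipeline described above.
# All names are hypothetical; real systems train deep neural networks
# over thousands of source images, which this stub only gestures at.

from dataclasses import dataclass


@dataclass
class Likeness:
    """The 'malleable likeness' learned from a source library."""
    subject: str


def train_likeness(subject: str, source_library: list[str]) -> Likeness:
    # Stage 1: a machine learning programme ingests the source library
    # (images of the subject) and learns a reusable model of their face.
    # Here the learning step is reduced to a placeholder.
    print(f"learning likeness of {subject} from {len(source_library)} images")
    return Likeness(subject)


def superimpose(likeness: Likeness, target_video: str) -> str:
    # Stage 2: the learned likeness is rendered onto new content with
    # false properties, producing imagery of events that never occurred.
    return f"{target_video} depicting {likeness.subject}"


if __name__ == "__main__":
    library = ["img_001.jpg", "img_002.jpg", "img_003.jpg"]  # scraped images
    model = train_likeness("subject A", library)
    print(superimpose(model, "target_footage.mp4"))
```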

Deepfakes differ from “real” images like photographs or videos. This distinction can be conceptualised through the difference between “capture” and “manipulation” technologies.18 Capture technologies (e.g., cameras) “capture light or sound energy and convert it to digital

11 Taylor Matthews “Deepfakes, Intellectual Cynics, and the Cultivation of Digital Sensibility” (2022) 92 Royal Institute of Philosophy Supplement 67 at 67.

12 Bobby Chesney and Danielle Citron “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security” (2019) 107 CLR 1753 at 1759-1760. See also Carl Öhman “Introducing the Pervert’s Dilemma: A Contribution to the Critique of Deepfake Pornography” (2020) 22 Ethics Inf Technol 133 at 133.

13 In this paper, “subject” refers to the individual who appears to be depicted in the content. In a deepfake, this will usually be the person whose face is shown, though not necessarily always. For further discussion on the difficulties of identifying a deepfake’s subject, see discussion of “the Plaintiff Problem” in Moncarol Wang “Don’t Believe Your Eyes: Fighting Deepfaked Nonconsensual Pornography with Tort Law” (2022) U Chi L F 415 at 417-418 and 432. See also Chesney and Citron, above n 12, at 1758-1759; and Chamberlain, above n 7, at [46.7.4.2].

14 Wang, above n 13, at 415.

15 Chamberlain, above n 7, at [46.7.4.2]. See also Chesney and Citron, above n 12, at 1758.

16 Barnes and Barraclough, above n 8, at [37]-[42].

17 See Federica Celli “Deepfakes are Coming: Does Australia Come Prepared?” (2020) 17 Canb LR 193 at 195.

18 Barnes and Barraclough, above n 8, at [95].

data”, while manipulation technologies “change digital data” in a manner that remains capable of display.19 More simply, capture technologies produce images depicting events that have occurred or are occurring, and manipulation technologies (such as deepfakes) produce images depicting events that never occurred.

II The Analytical Scope

Deepfake technology has both anodyne20 and nefarious21 uses. Without commenting on the potential need for broader deepfake regulation, this paper argues that the publication of a designated sub-category of deepfakes – involuntary intimate deepfakes (IIDs) – should attract criminal liability.22 As identified by Barnes and Barraclough, the regulation of this sub-category is “one of the most pressing policy issues” emerging from the rise of media manipulation technologies.23

I suggest that the standard for involuntariness must cover two scenarios – when a subject has not consented to the deepfake’s making, or when a subject consented to its making, but not its publication.24 Further, I suggest the scope of what is “intimate” should align with the definitions of “intimate visual recording” under the Harmful Digital Communications Act 2015 (HDCA) and the Crimes Act 1961. Specifically, to reflect the fact that it is not only explicitly sexual content that is capable of causing harm, “intimate” content should include that which depicts mere nudity / partial nudity, intimate bodily activities (such as toileting and showering) and the colloquially termed “up-skirting” and “down-blousing”.25

19 Barnes and Barraclough, above n 8, at [96].

20 For example, deepfakes are being used in entertainment, education and healthcare contexts in beneficial ways. See Wang, above n 13, at 418; Chesney and Citron, above n 12, at 1769; and Celli, above n 17, at 196.

21 Deepfakes also pose a significant risk of political disinformation. See BuzzFeedVideo “You Won’t Believe What Obama Says in This Video!” (18 April 2018) YouTube <https://www.youtube.com/watch?v=cQ54GDm1eL0>; and Hannah Hudnall “Fact Check: Deepfake Video Shows Vladimir Putin Discussing Democracy, is from 2020 Ad Campaign” USA Today (online ed, United States, 19 April 2023). See also Wang, above n 13, at 415; and Chesney and Citron, above n 12, at 1769-1770.

22 This term is drawn from Nikki Chamberlain’s use of “Fake Involuntary Pornographic Deepfakes”. However, I suggest referring explicitly to “pornography” is too narrow to address the full scope of potentially harmful intimate content. Further, “fake” is largely redundant given the inherent nature of deepfakes. Compare Chamberlain, above n 7, at [46.7.4.2].

23 Barnes and Barraclough, above n 8, at [10], [643].

24 Compare the definition of intimate visual recording in s 4 of the Harmful Digital Communications Act 2015 with s 216G of the Crimes Act 1961, where the latter does not capture situations when a subject has consented to a video’s making. For further discussion, see below at footnote 176. For the full statutory definitions, see Appendix 1.

25 Harmful Digital Communications Act, s 4; Crimes Act, s 216G.

III The Analytical Framework

Any argument for legislative reform must both justify why reform is necessary, and the type of legislation being proposed. Theoretically, legislation may only be made “when it is necessary and the most appropriate means of achieving [a] policy objective.”26 Further, if legislation is to create an offence, the criminal law must be the only regulatory tool capable of effectively achieving the proposed legislation’s objective.27 This is because the criminal law is “powerful, expensive and invasive”, and a “bluntly coercive, morally loaded sledgehammer” to be used “sparingly”, “with care” and never lightly.28

To justify criminalisation, Andrew Simester and Warren Brookbanks stipulate that both a “positive” and “negative” case must be made out.29 In the present context, the positive case for criminalisation requires establishing that the publication of IIDs can inflict “sufficiently serious” harms which warrant criminalisation. The negative case requires that criminalisation be the least invasive and most effective regulatory mechanism to address these harms.30 Thus, something unique about the criminal law must exist which makes it necessary when compared to regulatory alternatives.31

Chapters Two and Three will argue that both positive and negative cases for criminalising the publication of IIDs can be established.

26 Legislation Guidelines (Legislation Design and Advisory Committee, September 2021) at [2.3].

27 Legislation Guidelines, above n 26, at [22.2], [24.1].

28 Andrew Simester and Warren Brookbanks Principles of Criminal Law (4th ed, Thomson Reuters, Wellington, 2012) at 1003. See also Legislation Guidelines, above n 26, at 121.

29 Simester and Brookbanks, above n 28, at 1004.

30 Simester and Brookbanks, above n 28, at 1004.

31 See Legislation Guidelines, above n 26, at [2.3]; Simester and Brookbanks, above n 28, at 1004.

Chapter Two – The Positive Case for Criminalisation

The positive case for criminalisation aligns largely with John Stuart Mill’s “harm principle”, according to which an individual’s autonomy should only be limited by the state when this would prevent or reduce harm to others.32 However, establishing prima facie harm is insufficient. The principle also requires that harm be sufficiently serious to justify criminalisation, outweighing any conflicting considerations.33 The following analysis will establish that the publication of IIDs is both prima facie harmful and generates sufficiently serious harms to establish the positive case for criminalisation.

I Harm

Whether, and to what extent, the publication of IIDs is harmful are (perhaps surprisingly) contentious issues despite the phenomenon’s increasingly accepted status as image-based sexual abuse (IBSA).34 The fact that IIDs do not depict “real” occurrences is relied on by some (including legislators and law enforcement) to argue that while the conduct may be immoral or disturbing, it is substantially less harmful than capture-based IBSA, or otherwise should not attract criminal liability.35 This point is illustrated in the debate surrounding the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021. Numerous submissions were made to the Justice Committee criticising the fact that the definition of “intimate visual recording” likely excluded hyper-realistic manipulated images, meaning they would not be subject to the proposed offence provision.36 Submitters

32 John Stuart Mill On Liberty (Longman, Roberts & Green, London, 1869) at 9. See also Joel Feinberg The Moral Limits of the Criminal Law: Harm to Others – vol 1 (Oxford University Press, New York, 1984) at 26.

33 Simester and Brookbanks, above n 28, at 1005.

34 See Chesney and Citron, above n 12, at 1758.

35 See discussion in Edvinas Meskys and others “Regulating Deep Fakes: Legal and Ethical Considerations” (2020) 15 JLPLP 24 at 24-27; Danielle Keats Citron and Mary Anne Franks “Criminalizing Revenge Porn” (2014) 49 Wake Forest L Rev 345 at 350; and Nicola Henry and others Image-Based Sexual Abuse: A Study on the Causes and Consequences of Non-Consensual Nude or Sexual Imagery (London, Routledge, 2021) at 72.

36 See Clare McGlynn, Erika Rackley and Kelly Johnson “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at 7; YouthLaw Aotearoa “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at 7-8; Brainbox “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at 2; National Network Ending Sexual Violence Together “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at [3.5]; Office of the Privacy Commissioner “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at [31]; Netsafe “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at [3.12]; and Stace

suggested this was undesirable on the basis that manipulated images caused similar, or the same, harms as “real” intimate images.37 These submissions led the Ministry of Justice to suggest that “the Committee may wish to consider amending the definition of intimate visual recording... to specifically provide for synthetic images.”38 Despite this, and while only discussing the matter briefly in their report, the Justice Committee “were unable to agree” with recommendations that the Bill should explicitly capture manipulated images.39 Thus, despite being made directly aware that the Bill would likely exclude IIDs, the Justice Committee did not consider this fact problematic, demonstrating that they did not think that IIDs should be treated as equivalent to “real” intimate visual recordings. As will be discussed in Chapter Five, I agree with the Justice Committee insofar as it concluded that IIDs should not be shoehorned into the intimate visual recording definition. However, the fact that the Justice Committee did not recommend an alternate solution to criminalise IID publication demonstrates an underlying view that the phenomenon does not justify criminalisation – a point on which we diverge.

The increasing prevalence of IIDs has prompted an emerging body of qualitative research documenting the experiences of victims. This research demonstrates the extensive psychological and external harms being suffered by victims of IID publication. Psychologically, victims have described their experiences as “being fetishised”,40 “digital rape”41 and “so much violation.”42 Others felt “dehumanised”,43 “physically sick, angry [and] degraded,”44 and that the experience was “humiliating, shaming and silencing”.45 Further, victims have suffered ongoing harms including a “visceral fear”46 of when the images might

Hammond “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at 4.

37 See National Network Ending Sexual Violence Together, above n 36, at [3.5] “Considerable harm is caused by this form of I-BSA”; Office of the Privacy Commissioner, above n 36, at [31] “[Deepfakes have] equal potential for manipulated images to cause significant emotional harm and distress”; and Netsafe above n 36, at [3.12] “implicit harm caused by deepfakes and other synthetic media”.

38 Departmental Report: Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill (Ministry of Justice, June 2021) at 20.

39 Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2) (select committee report) at 4-5.

40 Jesslyn Cook “Here’s What It’s Like to See Yourself in a Deepfake Porn Video” Huffington Post (online ed, United States, 23 June 2019).

41 Megan Farokhmanesh “Is It Legal to Swap Someone’s Face into Porn Without Consent?” The Verge (online ed, United States, 31 January 2018).
42 Flynn and others, above n 54, at 1351.
43 Flynn and others, above n 54, at 1351.

44 Danielle Keats Citron “Sexual Privacy” (2019) 128 Yale L J 1870 at 1926.

45 Rana Ayyuub “In India, Journalists Face Slut-Shaming and Rape Threats” The New York Times (online ed, New York, 22 May 2018).

46 Citron, above n 44, at 1925.

reappear,47 and memory reappropriation – where the line between fake and real becomes blurred in the victim’s own mind.48 Managing these psychological harms “takes emotional labour which comes with financial costs”, leading to external consequences.49 For example, one victim described how she was affected “emotionally, physiologically, professionally, in dating and relationships, in bloody every single factor of [my] life”.50 Testimonies like this are reflective of Clare McGlynn, Kelly Johnson and Anastasia Powell’s “social rupture” theory, which posits that IBSA inflicts “all-encompassing devastation or disruption of everyday life and relationships”.51 This can cause victims to withdraw from online spaces, employment, leadership and education, thus negatively impacting their personal development.52 Further, there is substantial risk of reputational harms and diminished career prospects, such as where employers conduct online searches of prospective employees.53 While these harms would be exacerbated by widespread publication, significant harm could clearly still be experienced even if the IID were only published to the subject or a limited audience – for example, as part of a blackmail attempt.

These testimonies demonstrate that prima facie, the publication of IIDs inflicts harm. In this way, a view that the publication of IIDs is not harmful constitutes a misunderstanding. This misunderstanding aligns with the fact that as a wider phenomenon, IBSA is highly misunderstood, and its harms are often minimised or trivialised.54

47 See also Clare McGlynn and others “‘It’s Torture for the Soul’: The Harms of Image-Based Sexual Abuse” (2021) 30 Soc Leg Stud 541 at 550.

48 Helen Busby “Deepfake Porn Documentary Explores its ‘Life Shattering’ Impact” BBC News (online ed, United Kingdom, 18 June 2023).

49 Suzie Dunn Technology-Facilitated Gender-Based Violence: An Overview (Canada, Centre for International Governance Innovation, 2021) at 22.

50 Flynn and others, above n 54, at 1351. See also Samantha Bates “Revenge Porn and Mental Health: A Qualitative Analysis of the Mental Health Effects of Revenge Porn on Female Survivors” (2017) 12 Fem Criminol 22 at 25.

51 Flynn and others, above n 54, at 1351-1352.

52 Report of the Secretary-General S-LXXVII UN Doc A/77/302 (18 August 2022) at 5; and Chesney and Citron, above n 12, at 1773-1774.

53 See Flynn and others, above n 54, at 1352; Chesney and Citron, above n 12, at 1773-1774; Citron, above n 44, at 1926; Wang, above n 13, at 424; and Mary Anne Franks and Ari Ezra Waldman “Sex, Lies, and Videotape: Deep Fakes and Free Speech Delusions” (2019) 78 Md L Rev 892 at 893.

54 See Suzie Dunn “Is It Actually Violence? Framing Technology-Facilitated Abuse as Violence” in Asher Flynn, Nicola Henry and Jane Bailey (eds) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Bingley, Emerald Publishing Ltd, 2021) at 26-27; and Asher Flynn and others “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse” (2021) 62 Brit J Criminol 1341 at 1353.

II Serious Harm

While the publication of IIDs has been established as capable of inflicting harm, these harms must also be sufficiently serious to justify criminalisation.55 As Nils Jareborg and Andrew von Hirsch assert, this requires identifying whether harms infringe upon an interest that the criminal law “should” or does protect.56 I propose that the publication of IIDs infringes significantly upon subjects’ dignity and privacy interests – which are interests widely recognised and vindicated by the criminal law.

The ability of IID publication to affect a subject’s dignity interests is demonstrated clearly in the testimonies of victims who described being “dehumanised”, “degraded”, “digitally rape[d]” and “fetishised”.57 As a normative legal concept, dignity has been critiqued as vacuous and indeterminate.58 Particularly in the sexual violence context, Anna High argues that dignity’s use should be “closely scrutinised”.59 Despite its potential for imprecision, the use of dignity interests reflects the idea that the publication of IIDs is an affront to a subject’s very humanity, which requires they be able to define themselves as they wish and not be used instrumentally as a means to an end.60 When IIDs are published, subjects are “reduced to genitalia, breasts, buttocks and anuses”, rendering them merely as means to a variety of ends including sexual and emotional gratification, financial gain and improved social status.61 Further, by limiting a subject’s ability to present themselves to the world as they see fit, IID publication restricts the subject’s autonomy – “hijacking sexual identities and exercising dominion over stolen sexualities by exhibiting it to others.”62

Beyond these impacts on a subject’s dignity, I argue the publication of IIDs can also be conceptualised as infringing upon a subject’s privacy interests. Classifying these as privacy

55 Simester and Brookbanks, above n 28, at 1004.

56 Andrew von Hirsch and Nils Jareborg “Gauging Criminal Harm: A Living-Standard Analysis” (1991) 11 OJLS 1 at 3.

57 See above discussion at page 8.

58 Anna High “Sexual Dignity and Rape Law” (2022) 33 Yale J L & Feminism 1 at 3, 7. See also Mary Neal “Dignity, Law and Language-Games” (2012) 25 IJSL 107 at 113; and Mirko Bagaric and James Allan “The Vacuous Concept of Dignity” (2006) 5 J Hum Rights 257.

59 High, above n 58, at 7.

60 Immanuel Kant Lectures on Ethics (New York, Cambridge University Press, 1997) at 163. See also High, above n 58, at 6-7; and Martha C. Nussbaum “Objectification” (1995) 24 Phil & Pub Aff 249 at 268.

61 Citron, above n 44, at 1921. See further discussion below at page 34.

62 Rüya Toparlak “Criminalising Deep Fake Pornography: A Gender-Specific Analysis of Image-Based Sexual Abuse” (2023) 1 cognitio 1 at 6.

harms strengthens the positive case for criminalisation because privacy interests are not only generally recognised and vindicated by the criminal law,63 but were specifically used to justify the introduction of New Zealand’s existing IBSA offences under the Crimes Act and HDCA.64 Thus, conceptualising IID publication as infringing on privacy interests aligns these harms with New Zealand’s existing criminal legislative response to IBSA.

However, there is contention among commentators regarding whether IIDs inflict privacy harms, which largely stems from underlying theoretical confusion regarding privacy itself.65 Privacy is a “nebulous”66, “elusive”67 and “multi-faceted”68 concept which has long evaded precise definition, rendering the law of privacy much like “a haystack in a hurricane.”69 While some commentators confidently use privacy interests to justify regulating IIDs,70 others assert with equal confidence that IIDs are not capable of inflicting privacy harms.71 However, neither faction has justified why they consider that IIDs do, or do not, invade privacy. Without seeking to define privacy, I suggest that the publication of IIDs can constitute a privacy invasion if

63 von Hirsch and Jareborg, above n 56, at 5.

64 To demonstrate, privacy interests were identified by the Law Commission as the key justification for introducing the intimate visual recording offences into the Crimes Act, and are referenced in the explanatory notes of both the Crimes (Intimate Covert Filming) Amendment Bill and the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill, as well as in case law. Further, intimate visual recording offences are located in Part 9A of the Crimes Act, “Crimes Against Personal Privacy”. See Law Commission Intimate Covert Filming (NZLC SP15 2004) at [2.2]; Crimes (Intimate Covert Filming) Amendment Bill 2005 (257-1) (explanatory note) at 1; Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill (305-1) (explanatory note) at 1; Seymour v R [2021] NZHC 2322 at [33]; and Diffin v R [2013] NZCA 460, (2013) 26 CRNZ 368 at [5].

65 Edward Bloustein “Privacy as an Aspect of Human Dignity: An Answer to Dean Prosser” (1964) 39 NYUL 962 at 962.

66 Ursula Cheer and Stephen Todd “Invasion of Privacy” in Stephen Todd Todd on Torts (8th ed, Wellington, Thomson Reuters, 2019) at 1020.

67 Nicole Moreham “The Nature of the Privacy Interest” in Nicole Moreham and others (eds) Tugendhat and Christie: The Law of Privacy and the Media (3rd ed, United Kingdom, Oxford University Press, 2016) at [2.01].

68 Chamberlain, above n 7, at [46.1.1.1].

69 Ettore v Philco Television Broadcasting Co [1956] USCA3 69; 229 F 2d 481 (3d Cir 1956) per Biggs CJ at [487]. See generally James Waldo, Herbert Lin and Lynette Millett “Engaging Privacy and Information Technology in a Digital Age: Executive Summary” (2010) 2 J Priv Confidentiality 1; Cheer and Todd, above n 66, at 979; and Chamberlain, above n 7, at [46.1.1.1].

70 For example, Meskys and others, above n 35, at 26; Anne Pechenik Gieseke “‘The New Weapon of Choice’: Law's Current Inability to Properly Address Deepfake Pornography” (2020) 73 Vand L Rev 1479 at 1483; and Citron, above n 44, at 1923-1924.

71 See Rebecca Delfino “Pornographic Deepfakes: The Case for Federal Criminalization of Revenge Porn’s Next Tragic Act” (2019) 88 Fordham L Rev 887 at 923 “In the deepfake context, this element [reasonable expectation of privacy] does not exist and cannot be shown, neither victim would ever have an expectation of privacy in the deepfake”; Öhman, above n 12, at 133 “[T]o clarify, this does not necessitate any privacy infringement”; Matthew Kugler and Carly Pace “Deepfake Privacy: Attitudes and Regulation” (2021) 116 NWULR 611 at 628 the reasonable expectation of privacy is not “generally applicable in the deepfake context”; and Email from Arran Hunt (Partner at McVeagh Fleming) to Bella Stuart regarding Hunt’s views of the applicability of Harmful Digital Communications Act 2015 to IIDs (17 April 2023) “Such privacy would not be expected or experienced in a deepfake situation.”

privacy is conceptualised as stemming from a dignity interest. Intuitively, IID publication does not invade privacy in the same manner as capture-based IBSA, as it does not involve an invasion of a subject’s physical space.72 However, the “best known” definition explains that privacy is “the right to be let alone.”73 This definition conceptualises privacy as a feature of personhood, making the right to an “intrusion-free sphere” an aspect of inviolate personality which is protected by an individual’s dignitary rights.74 As described by Nicole Moreham:75

[P]rivacy is now seen to promote numerous interests... They include individual dignity and autonomy; the development and maintenance of relationships; the promotion of health and well- being; and protection against the judgement of others.

The publication of IIDs appears to infringe upon this broader conception of privacy.

The potential breadth of one’s privacy interest is illustrated in the United States’ four-part tortious privacy taxonomy, initially formulated by Dean Prosser.76 As described in the American Law Institute’s Second Restatement, privacy can be invaded in four discrete ways – by an intrusion into seclusion,77 giving publicity to private facts,78 appropriating another’s name or likeness for personal gain79 and giving publicity to an individual which places them in a false light.80 Pertinently, the latter two invasions address the types of harms inflicted by the publication of IIDs. Prosser’s taxonomy has been criticised, including by Prosser himself, on the basis that these latter invasions are not “really about privacy at all”,81 but rather confer “something analogous to a property right.”82 However, while not a universally accepted conception of privacy, if privacy is taken as protecting dignity interests, these invasions (and

72 McGlynn and Rackley, above n 1, at 545; and Nicole Moreham “Beyond Information: Physical Privacy in English Law” (2014) 73 CLJ 350, at 354.

73 Samuel Warren and Louis Brandeis “The Right to Privacy” (1890) 4 Harvard L Rev 193 at 195. See generally Moreham, above n 67, at [2.06].

74 Chamberlain, above n 7, at [46.7.4.2]. See also Bloustein, above n 65.

75 Moreham, above n 67, at [2.54].

76 American Law Institute Restatement of the Law on Torts (2nd ed, St Paul, Minnesota, 1977) § 662; Dean Prosser “Privacy” (1960) 48 Cal L Rev 383.

77 American Law Institute, above n 76, § 652B.

78 American Law Institute, above n 76, § 652D.

79 American Law Institute, above n 76, § 652C.

80 American Law Institute, above n 76, §§ 652A, 652E. See also Prosser, above n 76, at 385.

81 Cheer and Todd, above n 66, at 1020. See also Raymond Wacks, The Protection of Privacy (London, Sweet & Maxwell, 1980) at 171; and For Your Information: Australian Privacy Law and Practice Report (Australian Law Reform Commission, ALRC 108, May 2008) at [74.120].

82 American Law Institute, above n 76, § 662. See also Stephen Penk and Natalya King “Common Law Privacy Protection in Other Jurisdictions” in Nikki Chamberlain and Stephen Penk (eds) "Privacy - A to Z of New Zealand Law" (online ed, Thomson Reuters, 2023) at [46.5.6.7].

thus the publication of IIDs) are rightly conceptualised as inflicting privacy harms.83 In New Zealand specifically, dignity was employed by both the Court of Appeal and High Court respectively to justify introducing the Hosking84 and Holland85 privacy torts. Similarly, in Brooker v Police, Thomas J noted that “the nexus between human dignity and privacy is particularly close.”86 Most relevantly, Hammond J has described that “[o]ne aspect of privacy is that it is necessary to protect everybody from misrepresentation or misportrayals.”87 Thus, while New Zealand has not adopted all four privacy torts as in the Restatement, it would appear incorrect to claim that the interests they protect are totally inconsistent with New Zealand’s conception of privacy.88 As evocatively described by Nikki Chamberlain, “[d]eepfakes invade an individual’s private realm – ‘a sphere in which to develop her ideas and goals, to let her guard down, and to develop relationships of trust and intimacy.’”89 Thus, I suggest that the harms described by victims can be conceptualised as gravely infringing dignity and privacy interests – two significant legal interests which are recognised and vindicated by the criminal law.

Beyond these infringements to dignity and privacy, the nature of online harms and the status of IID publication as a gendered phenomenon further exacerbate the severity of harm which can be inflicted.

The publication of IIDs primarily occurs online, a context capable of magnifying the harms already described. Put succinctly by Michael Tugendhat and Ian Christie, the internet is “... instantaneous, readily accessible by both recipient and onlookers... cumulative, persistent, viral, potentially global in reach, continuous, and unless arrested, permanent.”90 This increases

83 Chamberlain, above n 7, at [46.1.1.1]. But see Alan Westin Privacy and Freedom (Athenaeum, New York, 1967) at 7; and Julie Inness Privacy, Intimacy and Isolation (Oxford University Press, New York, 1992) at 78.

84 Hosking v Runting [2004] NZCA 34, [2003] 3 NZLR 385 at [239] per Tipping J.

85 C v Holland [2012] NZHC 2155, [2012] 3 NZLR 672 at [27], [67].

86 Brooker v Police [2007] NZSC 30, [2007] 3 NZLR 91 at [182] per Thomas J.

87 Mafart v Television New Zealand Ltd [2006] 3 NZLR 534 (CA) at [60]–[61] per Hammond J.

88 Moreham, above n 67, at [2.39], [2.50]. There is some advocacy for the adoption of a misappropriation tort in New Zealand, see Nikki Chamberlain “Misappropriation of Personality: A Case for Common Law Identity Protection” (2021) 26 TLJ 195. See also Tot Toys Ltd v Mitchell [1992] NZHC 2902; [1993] 1 NZLR 325 (HC) at 60 per Fisher J “There may be a case for going beyond existing causes of action... to North American causes of action for appropriation of personality and/or breach of rights of privacy and publicity.”

89 Chamberlain, above n 7, at [46.7.4.2], citing Lyrissa Lidsky “Prying, Spying and Lying: Intrusive Newsgathering and What the Law Should Do About It” (1999) 73 Tul L Rev 173 at 226-227.

90 Godwin Busuttil, Felicity McMahon and Gervase de Wilde “Privacy, The Internet, and Social Media” in Nicole Moreham and others (eds) Tugendhat and Christie: The Law of Privacy and the Media (3rd ed, United Kingdom, Oxford University Press, 2016) at [15.02]. See also Chamberlain, above n 7, at [46.7.4.2]; and Dunn, above n 49, at 4.

the severity of harm by enabling IIDs to be effectively permanent and unlimited in their reach. Further, IIDs are a gendered phenomenon. This not only means that women are more likely to experience harm, but also that they will experience harm in a more profound manner due to persisting sexual double standards which “enable humiliation, stigma and shame to be visited on women” more readily than men.91 In a 2019 study conducted by European cyber-security company “Deeptrace”,92 100 per cent of nearly 15,000 pornographic deepfakes analysed were found to depict women.93

Further, deepfake technologies are often more effective in generating content of women. For example, the computer application “Deepnude” – which allows users to “strip” uploaded photos – was trained exclusively on images of women, meaning it failed to function when presented with images of a man.94 Suzie Dunn describes that the gendered nature of IBSA generally has “systemic impacts” which reinforce “inequality and maintain discriminatory norms that limit women and transgender people.”95 Further, this solidifies “male dominance of online spaces” by deterring women from using online platforms.96 Of particular concern are the potential impacts of these harms on young girls and their educations. To demonstrate, in September 2023 in the Spanish town of Almendralejo, nearly 30 girls aged between 11 and 17 had IIDs made of them and circulated widely among their community.97 Local authorities believe the suspects – who made the images with a for-purpose application using images found on the girls’ social media accounts – to be equally young, between the ages of 12 and 14.98 While the scholarship has focused on the collateral impacts of IIDs on adult women, this recent example demonstrates the significant risks posed to young women and girls as this form of technology becomes increasingly accessible to youth.

91 McGlynn and Rackley, above n 1, at 544.

92 Deeptrace has since changed its name to Sensity AI.

93 Henry Ajder and others The State of Deepfakes: Landscape, Threats and Impact (Deeptrace, September 2019) at 2.

94 Ajder and others, above n 93, at 8. See also Kristen Thomasen and Suzie Dunn “Reasonable Expectations of Privacy in an Era of Drones and Deepfakes: Expanding the Supreme Court of Canada’s Decision in R v Jarvis” in Jane Bailey, Asher Flynn and Nicola Henry (eds) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Publishing Limited, 2021) at 559.

95 Dunn, above n 49, at 20.

96 McGlynn and Rackley, above n 1, at 551. See generally Report of the Secretary General, above n 52; and Marjan Nadim and Audun Fladmoe “Silencing Women? Gender and Online Harassment” (2019) 39 Soc Sci Comput Rev 245 at 245–46.

97 “Spanish Prosecutor Investigates If Shared AI Images of Naked Girls Constitute a Crime” The Guardian (online ed, United Kingdom, 25 September 2023); and Guy Hedgecoe “AI-Generated Naked Child Images Shock Spanish Town of Almendralejo” BBC News (online ed, United Kingdom, 24 September 2023).

98 “AI-Generated Nude Images Spread at Spain School, Parents Outraged”, above n 4.

III Balancing Requirements

To justify criminalisation, beyond merely establishing these serious dignity and privacy harms, the gravity and probability of these harms must be shown to outweigh how reasonable the risk of harm is, and the need to protect any potential value in the harmful conduct.99 The gravity of harm has largely been demonstrated in the preceding two parts, which also speak to the risk of harm likely being unreasonable (in that it will take a significant competing interest to justify this risk).

It could be suggested that the probability of harm in New Zealand is too low to justify criminal legislative action. However, as discussed earlier, though the phenomenon is not yet occurring domestically at any great scale, New Zealand is far from immune.100 International trends demonstrate that domestic occurrences are likely to increase,101 and the “democratisation of deepfakes” makes their occurrence increasingly likely.102 This democratisation reflects both an increased access to IIDs themselves, and to the tools which generate them. To demonstrate, in 2019, Deeptrace found that the four largest dedicated deepfake porn sites had attracted 134,364,438 views – a number which has since increased exponentially.103 Deepfake pornography sites are ever-increasing in number, making access to IIDs simpler than ever.104 Further, the software needed to create IIDs is highly accessible, affordable and user-friendly. Realistic video-manipulation is no longer “a laborious pursuit” only available to technophiles or Hollywood, but is now available to any person – including school-aged children – capable of a Google search or downloading one of many for-purpose apps to their mobile phone.105 Neither does a person need to manually collate their own source data-set to make an IID, due to the emergence of open-source “scraping” tools which download all images of a person from their social media accounts.106 This further increases the probability of harm, given the number of individuals who have placed their face in the public domain via social media.107

99 See Feinberg, above n 32, at [4.2]; and Simester and Brookbanks, above n 28, at 1012.

100 “NZ’s First Case of Deepfake Pornography Triggering Alarm Bells for Officials”, above n 4.

101 Meskys and others, above n 35, at 24.

102 Flynn and others, above n 54, at 1342.

103 Ajder and others, above n 93, at 1. See generally Celli, above n 17, at 194.

104 Ajder and others, above n 93, at 8.

105 Wang, above n 13, at 420. See also Douglas Harris “Deepfakes: False Pornography is Here and the Law Cannot Protect You” (2018) 17 Duke L & Tech Rev 99 at 100; Gieseke, above n 70, at 1487; Toparlak, above n 62, at 3; and Regina Rini and Leah Cohen “Deepfakes, Deep Harms” (2022) 22 JSEP 143 at 146.

106 Toparlak, above n 62, at 2.

107 Mika Westerlund “The Emergence of Deepfake Technology” (2019) 9 Technol Innov Manag Rev 39 at 45.

Regarding the value of this harmful conduct, the publication of IIDs must be recognised as having expressive value, with expression being “as wide as human thought and imagination.”108 However, as affirmed under the New Zealand Bill of Rights Act 1990, freedom of expression does not operate in absolutes.109 As described by Danielle Keats Citron in the United States context, speech which contributes little or causes significant harms can be subject to regulation.110 Citron suggests that only an “extreme view” would consider threats, nude images and defamation as speech worthy of protection.111 This has been reflected by New Zealand’s Supreme Court, which described that freedom of expression can justifiably be limited where there are “sufficiently serious and reprehensible interferences with the rights of others.”112 The discussion in this chapter has demonstrated that the publication of IIDs constitutes such an interference. Thus, while freedom of expression concerns must be acknowledged, it would be asinine to argue they outweigh the significant harms at play.

IV Conclusion

This discussion demonstrates that the positive case to criminalise the publication of IIDs can be made out, given that the serious dignity and privacy harms at play are not outweighed by competing interests. What remains to be established is whether criminalisation is the most effective and least invasive mechanism to address these serious harms.

108 Moonen v Film and Literature Board of Review [1999] NZCA 329; [2000] 2 NZLR 9 (CA) at [15] per Tipping J.

109 New Zealand Bill of Rights Act 1990, ss 5, 14.

110 Danielle Keats Citron “Addressing Cyber Harassment: An Overview of Hate Crimes in Cyberspace” (2015) 6 JOLTI 1 at 8.

111 Citron, above n 110, at 9.

112 Morse v Police [2011] NZSC 45, [2012] 2 NZLR 1 at [106]. See also New Zealand Bill of Rights Act ss 4-6.

Chapter Three – The Negative Case for Criminalisation

Julia Black defines regulation as “the sustained and focused attempt to alter the behaviour of others according to defined standards or purposes”, which can be achieved in several different ways.113 However, different forms of regulation have differing impacts on those subject to them, and thus serve different purposes. To establish the negative case for criminalisation, Chapter Three will discuss industry self-regulation and the civil law as key regulatory alternatives to criminalisation – ultimately concluding that the criminal law has unique qualities which render it the sole regulatory mechanism capable of addressing the harms caused by the publication of IIDs.

I Adequacy of Industry Self-Regulation

Because IIDs proliferate primarily online, online media platforms can implement and adopt self-regulatory mechanisms to address the phenomenon. This is already being observed on mainstream media platforms – with TikTok requiring clear disclosures of manipulated media;114 Instagram prohibiting impersonation and misleading activity;115 Google implementing tools to facilitate the removal of unwanted images;116 and Reddit,117 X (previously Twitter),118 Pornhub119 and Discord120 all explicitly prohibiting deepfake pornography. Further, Meta, Google, Amazon, Twitch and X have committed to New Zealand’s Code of Practice for Online Safety and Harms – a best-practice self-regulatory framework aimed at reducing harmful online content.121 Though voluntary, these signatories are accountable to administrators and can be subject to sanctions.122

113 Julia Black “Critical Reflections on Regulation” (2002) 27 AJLP 1 at 26.

114 TikTok “Community Guidelines: Integrity and Authenticity” (March 2023) <https://www.tiktok.com/community-guidelines/en/integrity-authenticity/>

115 Instagram “Community Guidelines” <https://help.instagram.com/477434105621119>

116 Google Search Help “Remove Explicit or Intimate Personal Images From Google” <https://support.google.com/websearch/answer/6302812?sjid=7547990195809157247-AP>

117 Reddit “Never Post Intimate or Sexually Explicit Media of Someone Without Their Consent” (July 2023) <https://support.reddithelp.com/hc/en-us/articles/360043513411>

118 X Help Center “Sensitive Media Policy” (March 2023) <https://help.twitter.com/en/rules-and-policies/media-policy>

119 Pornhub “Terms of Service” (17 August 2023) <https://www.pornhub.com/information/terms#terms>

120 Discord “Discord Community Guidelines” (24 February 2023) <https://discord.com/guidelines>

121 The Code “About The Code” <https://thecode.org.nz/about-the-code/>

122 The Code: Aotearoa New Zealand Code of Practice for Online Safety and Harms (July 2022) at [5.3].

However, self-regulation is “no panacea for all problems”.123 Sole reliance on self-regulation places vulnerable individuals at “the mercy of big platforms” whose interests do not always align with society’s, particularly given the profitability of divisive and harmful content.124 This is especially relevant in the context of IIDs, given that 94 per cent of the “deepfake pornography ecosystem” is hosted on dedicated websites “on the fringes of the internet” by those who have no interest in regulating or banning this content.125 Further, industry self-regulation cannot address private peer-to-peer publication when no online platform is used. Thus, while self-regulation is a useful regulatory aide-de-camp, it cannot replace legal regulatory mechanisms.126

II Adequacy of the Civil Law

Several civil regimes touch upon the harms caused by IID publication – including the HDCA, misappropriation of intellectual property,127 character merchandising,128 civil harassment, defamation and privacy.129 To justify criminalisation, it is not necessary to argue that these civil remedies could never vindicate the harms inflicted by IID publication. However, neither is it clear that the civil law could comprehensively or predictably address the publication of IIDs. As such, the absence of an all-encompassing civil law response generally supports the argument for legal reform.

  1. Limitations in Application

Existing civil remedies may not be capable of addressing the full scope of potentially harmful IIDs. For example, IIDs may not always satisfy the statutory definition of harassment under

123 Michael Cusumano, Annabelle Gawer and David Yoffie “Can Self-Regulation Save Digital Platforms?” (2021) 30 Ind Corp Change 1259 at 1263.

124 Samantha Cole “Targets of Fake Porn Are At the Mercy of Big Platforms” Vice (online ed, United States, 6 February 2018). See also Westerlund, above n 107, at 44; and Wang, above n 13, at 430.

125 Ajder and others, above n 93, at 6; and Wang, above n 13, at 430.

126 The need for self-regulation to occur alongside other regulatory mechanisms has been recognised by the Department of Internal Affairs as part of its current Safer Online Services Review and Media Platforms Review. See Department of Internal Affairs Discussion Document: Safer Online Services and Media Platforms (June 2023) at 6.

127 See generally Ursula Cheer Burrows and Cheer: Media Law in New Zealand (8th ed, Wellington, LexisNexis, 2021) at 462.

128 See generally Ursula Cheer, above n 127, at 466.

129 See Russell Spivak “‘Deepfakes’: The Newest Way to Commit One of the Oldest Crimes” (2019) 3 GEO L TECH REV 339 at 368-83.

the Harassment Act 1997.130 Defamation suffers from a “fatal Achilles heel” by requiring publication to a third party, thus not capturing situations where IIDs are distributed only to the subject.131 Whether New Zealand’s privacy torts could apply to IIDs is unclear for two reasons – uncertainty as to whether the torts can apply to false “facts”,132 and an absence of support for a privacy-based tort of misappropriation in line with the United States’ taxonomy.133 The HDCA is also likely unfit for purpose. The HDCA’s civil regime provides remedies for digital communications that breach communication principles and cause significant emotional distress.134 Communications Principle 6 is that “a digital communication should not make a false allegation”, which may well be considered the case regarding IIDs.135 However, before being able to take their claim to the District Court, an applicant must first have laid a complaint with Netsafe.136 While this threshold is admirable in its motivation to prevent court backlog, in reality it prolongs the complaints process for victims and arguably prevents the HDCA from achieving its statutory purpose of providing “a quick and efficient means of redress.”137 Further, even if a complaint reaches the District Court, the prescribed remedies are “primarily technological and... do not include punitive sanctions”. As such, remedial confines limit the HDCA’s ability to vindicate victims’ rights.138

However, even if civil remedies could address the full scope of harmful IID publication, the civil law has inherent limitations (and the criminal law inherent benefits) which render criminalisation the only regulatory tool capable of effectively addressing the phenomenon.

130 Discussed below at page 45.

131 Wang, above n 13, at 442. See also Ursula Cheer, above n 127, at 17; Barnes and Barraclough, above n 8, at [269], [431]; and Kugler and Pace, above n 71, at 613.

132 Both the Hosking and Holland torts require the existence of “facts in which there is a reasonable expectation of privacy”. In New Zealand, it remains unclear whether “facts” must be true, or whether the torts can also protect false facts – Nikki Chamberlain describes this would be “an uncomfortable stretch”. However, this may no longer be the case in light of ZXC v Bloomberg [2022] UKSC 5 at [111], wherein the United Kingdom Supreme Court held that the tort of misuse of private information applied regardless of “whether the information is true or false.” See Chamberlain, above n 7, at [46.7.4.2]; Cheer and Todd, above n 66, at 1013; Barnes and Barraclough, above n 8, at [270]; and A v Hunt [2006] NZAR 577 (HC) at [58].

133 See generally Chamberlain, above n 88.

134 Harmful Digital Communications Act, ss 6-12.

135 Harmful Digital Communications Act, ss 6(1), 12(2). See generally David Harvey internet.law.nz: Selected Legal Issues For The Digital Paradigm (5th ed, Wellington, LexisNexis, 2023) at 361.

136 Harmful Digital Communications Act, s 12(1).

137 Harmful Digital Communications Act, s 3(b). See generally Harvey, above n 135, at 367; and Stephanie Panzic “Legislating for E-Manners: Deficiencies and Unintended Consequences of the Harmful Digital Communications Act” (2015) 21 AULR 225 at 236.

138 Harvey, above n 135, at 361. See also Harmful Digital Communications Act, s 19; and Sean Brennan and Geoff McLay “Defamation, Privacy and the Intentional Infliction of Emotional Distress Meet the Internet and the Harmful Digital Communications Act” in Andrew Barker and Geoff McLay (eds) Torts Update (Wellington, New Zealand Law Society, 2016) at 32.

B Inherent Limitations

Beyond these potential legal limitations, the civil law suffers inherent limitations which render it inadequate to solely regulate IID publication. Inaccessibility to civil justice is an issue of ever-increasing domestic notoriety.139 The fact that civil justice is “beyond the reach of many” means civil remedies are largely only available to well-resourced plaintiffs.140 The most obvious barriers are financial, stemming from the costs of legal representation and the justice system, the unattainability of civil legal aid and fears of adverse costs awards.141 Further, court attendance can incur indirect costs including transport, childcare and time off work.142 These financial issues disproportionately affect the “missing middle” who sit above civil legal aid thresholds but cannot independently finance legal action.143 These concerns are particularly pertinent in a novel case such as this, given the significant uncertainty of outcome. Further, some prospective plaintiffs face cultural and social barriers. The justice system’s institutional racism and cultural incompetence have been identified as creating mistrust in affected individuals which can deter them from pursuing legal remedies.144 Beyond this, inequities in knowledge can create feelings of shame, inferiority and embarrassment which also act as deterrents.145 These issues are “deeply entrenched” and though improvements are being made, remedying the problem will take time, as it requires “significant structural change”.146 Thus, inaccessibility to civil justice substantially restricts the civil law’s capacity to provide equitable redress to victims, empowering only those with adequate resources to vindicate their rights.147

Even if a prospective plaintiff is not barred by access to justice issues, the pursuit of civil actions can be deterred or rendered futile by effectively judgment-proof perpetrators.148 The proliferation of free or highly affordable applications which create deepfakes means that

139 See Helen Winkelmann “Access to Justice: Who Needs Lawyers” [2014] OtaLawRw 2; (2014) 13 Otago LR 229; Rules Committee Improving Access to Civil Justice (November 2022) at [12]; and Ministry of Justice “Key Initiatives: Improving Access to Civil Justice” (20 September 2023) <https://www.justice.govt.nz/justice-sector-policy/key-initiatives/access-to-civil-justice/>

140 Rules Committee, above n 139, at [20]; and Barnes and Barraclough, above n 9, at [436].

141 New Zealand Law Society Access To Justice: Stocktake of Initiatives (December 2020) at [4.17]; and Rules Committee, above n 139, at [23].

142 Rules Committee, above n 139, at [16], [21], [23]; New Zealand Law Society, above n 141, at [4.17]; and Citron, above n 110, at 5.

143 New Zealand Law Society, above n 141, at [4.18]; and Delfino, above n 71, at 902.

144 New Zealand Law Society, above n 141, at 12.

145 New Zealand Law Society, above n 141, at [24].

146 New Zealand Law Society, above n 141, at [31], [33].

147 See Ministry of Justice “Wayfinding for Civil Justice – Imagining a Better Way of Working Together to Improve Access to Civil Justice in Aotearoa New Zealand” (20 September 2023) <https://www.justice.govt.nz/justice-sector-policy/key-initiatives/access-to-civil-justice/>

148 Citron, above n 44, at 1931; Delfino, above n 71, at 902; and Chesney and Citron, above n 12, at 1792.

perpetrators do not need substantial financial resources to create IIDs.149 Consequently, a perpetrator may be unable to pay financial damages even if a civil action were successful, leaving victims uncompensated and perpetrators with little to deter them from acting again.150 Conversely, if perpetrators have substantial resources, their conduct is merely priced by civil liability, rather than prohibited.151 Finally, victims may also be deterred from pursuing civil actions for fear of experiencing the Streisand Effect – where attempts to remove or censor content bring it greater notoriety.152

This is not to say that the civil law has no benefits. Civil actions place victims back in control, do not require that they convince law enforcement of their case’s merits and provide remedies (often financial) which may be of greater practical benefit.153 For these reasons, Moncarol Wang argues the civil law provides the solution for “making whole the victims” of IIDs.154 However, while civil remedies will benefit those who can access them, inaccessibility to justice, judgment-proof perpetrators and the Streisand Effect will likely deter some victims from pursuing such actions. This renders the civil law an inconsistent and inequitable regulatory response.155

149 Jared de Guzman “Saving Face: Lessons from the DMCA for Combatting Deepfake Pornography” (2022) 58 Gonz L Rev 109 at 120; and Taylor Linkous “It’s Time for Revenge Porn to Get a Taste of Its Own Medicine: An Argument for the Federal Criminalization of Revenge Porn” (2014) 20 RICH J L & TECH 14 at 19.

150 Guzman, above n 149, at 120. See also Nancy Kim “Web Site Proprietorship and Online Harassment” (2009) UTAH L REV 993 at 1006.

151 Simester and Brookbanks, above n 28, at 1024.

152 Merriam-Webster “Words We’re Watching: ‘Streisand Effect’: Don’t Try To Keep This Under Wraps” <https://www.merriam-webster.com/wordplay/words-were-watching-streisand-effect-barbra> See also Chesney and Citron, above n 12, at 1793.

153 See generally Nikki Godden-Rasul “Retribution, Redress, and the Harms of Rape: The Role of Tort Law” in Anastasia Powell, Nicola Henry and Asher Flynn (eds) Rape Justice: Beyond The Criminal Law (London, Palgrave Macmillan, 2015) at 112; McGlynn and Rackley, above n 1, at 557; and Joanne Conaghan “Civil Liability: Addressing Police Failures in the Context of Rape, Domestic and Sexual Abuse” (Inaugural Lecture, University of Bristol, 19 February 2015).

154 Wang, above n 13, at 416. Wang’s argument is partially coloured by the United States context in which she writes – specifically, Wang draws on the fact that there is no federal criminalisation to argue that a state-specific patchwork approach is undesirable. However, she also argues more broadly that the criminal law has inherent limitations which prevent it from being an effective regulatory solution.

155 McGlynn and Rackley, above n 1, at 557; Barnes and Barraclough, above n 8, at [318], [264]; and Linkous, above n 149, at 17-18.

III Adequacy of the Criminal Law

Beyond the inherent weaknesses of self-regulation and the civil law, the criminal law also has unique benefits which make it the necessary regulatory response to address the harms of IID publication.

As a coercive mechanism, the criminal law deters, punishes and denounces certain conduct and the individuals who engage in it.156 This addresses several of the civil law’s inherent limitations. For example, fear of punishment and stigmatisation better equips the criminal law to address the conduct of judgment-proof perpetrators.157 Further, because the state bears the burden of initiating, “investigating, prosecuting and punishing criminal conduct”, the criminal law addresses some financial, social and cultural access to justice barriers by releasing victims from the obligation to initiate and fund legal actions.158

Expressively, the criminal law communicates to victims, perpetrators, law enforcement and wider society a cultural standard of what conduct is acceptable.159 Criminalisation can shape “attitudes, beliefs and behaviour through its messages and lessons”, making it a tool for societal change.160 This is reflected in John Rawls’ conception of the “well-ordered society”, in which he suggests that fairness and equality must not only exist, but must also be publicised.161 Similarly, for Jeremy Waldron, visible legal commitments to justice and dignity are necessary in order for a democratic society to flourish.162 Thus, even though the phenomenon is not yet of widespread concern in New Zealand, this does not justify legislative complacency. Even if criminal sanctions are never imposed, they shape “the operative concerns and normative standards of a given community.”163 This is particularly necessary apropos IIDs, both in terms

156 McGlynn and Rackley, above n 1, at 552; Legislation Guidelines, above n 26, at 121; and Simester and Brookbanks, above n 28, at 1025.

157 Delfino, above n 71, at 902; and Chesney and Citron, above n 12, at 1801.

158 Simester and Brookbanks, above n 28, at 1025.

159 Delfino, above n 71, at 903.

160 McGlynn and Rackley, above n 1, at 552. See also Cass Sunstein “On the Expressive Function of Law” (1996) 144 U Pa L Rev 2021 at 2051; Simester and Brookbanks, above n 28, at 12; Citron, above n 44, at 1945-1946; and Citron, above n 153, at 389.

161 John Rawls “Kantian Constructivism in Moral Theory” (1980) 77 J Phil 515 at 520; and John Rawls Political Liberalism (2nd ed, New York, Columbia University Press, 1996) at 68-69.

162 Jeremy Waldron “Dignity and Defamation: The Visibility of Hate” (2010) 123 Harv Law Rev 1596 at 1619.

163 Vanessa E Munro “Dev’l-in Disguise? Harm, Privacy and the Sexual Offences Act 2003” in Vanessa E Munro and Carl F Stychin (eds) Sexuality and the Law: Feminist Engagements (London, Routledge, 2007) at 1; Rawls, “Kantian Constructivism”, above n 161, at 523; and Waldron, above n 162, at 1619, 1626-1627.

of deterrence and in order to raise awareness of a phenomenon which is currently trivialised, tolerated and misunderstood.164

Once they occur, the harms inflicted by the publication of IIDs are irreversible. This fact is exacerbated in the context of online harms due to the internet’s effective permanency.165 Thus, while harms are potentially compensable ex post facto, New Zealand must prioritise proactive regulatory mechanisms to prevent these irreversible harms from occurring in the first place.

IV Conclusion

I suggest the negative case for criminalisation can be made out. Though beneficial, industry self-regulation and the civil law are inherently limited in their abilities to effectively address the publication of IIDs. Comparatively, the criminal law has the capacity to serve the demands of justice more equitably by vindicating victims’ interests regardless of their financial status or willingness to engage with the justice system. Additionally, the criminal law’s coercive and expressive functions serve to deter the conduct, sending a powerful and necessary message to society that the publication of IIDs is seriously harmful and will not be tolerated. Beyond deterrence, this expression will also assist in addressing societal misunderstanding and tolerance of the phenomenon, hopefully giving victims the confidence to report their experiences. Thus, while not the least invasive regulatory path, criminalisation is the only mechanism capable of effectively and equitably addressing the serious harms posed by the publication of IIDs.

164 Dunn, above n 54, at 26; and Citron, above n 153, at 392-404.

165 Rebecca Bonnevie “Privacy and Emerging Technologies” in Nikki Chamberlain and Stephen Penk (eds) Privacy – A to Z of New Zealand Law (online ed, Thomson Reuters, 2023) at [46.8.2].

Chapter Four – (In)Adequacy of New Zealand’s Existing Criminal Law

In Chapters Two and Three, I have argued that criminalising IID publication is justified on the basis that the positive and negative cases for criminalisation can be established. However, “not every risk needs new law” – many technological developments are effectively addressed by existing legal regimes.166 Thus, to justify legislative reform, it must also be established that New Zealand’s criminal law is currently inadequate.

This Chapter will consider the application of four separate criminal regimes to the publication of IIDs – the intimate visual recording offences under the Crimes Act and HDCA; the HDCA s 22 offence of causing harm by posting a digital communication; the objectionability offences under the Films, Videos, and Publications Classification Act 1993 (FVPCA); and criminal harassment under the Harassment Act. These regimes have been selected because they are designed to address speech and communication harms comparable to those inflicted by IID publication, and are thus best suited to address the phenomenon. While other tangential offences could operate in a deepfake context, criminal liability would depend on the IID’s content and usage. For example, if an IID were used to facilitate blackmail, the fact that a deepfake was used would not preclude the offence’s operation.167 However, to effectively address the publication of IIDs, the criminal law must address the harms inherent in the content, not only harms which arise when the content is used in specific ways.

As will be argued, each regime is incapable of comprehensively addressing these inherent harms, rendering New Zealand’s criminal law unfit for purpose and strengthening the argument for legislative reform.

166 Barnes and Barraclough, above n 8, at iv. See generally Roger Brownsword Law 3.0: Rules, Regulation, and Technology (London, Routledge, 2020) at 3; Patrick Phelan “Are the Current Legal Responses to Artificial Intelligence Facilitated ‘Deepfake’ Pornography Sufficient to Curtail the Inflicted Harm?” (2022) 9 NELR 20 at 25; Roger Brownsword and Morag Goodwin Law and the Technologies of the Twenty-First Century: Text and Materials (Cambridge, Cambridge University Press, 2012) at 64; Colin Gavaghan and others Government Use of Artificial Intelligence in New Zealand (Wellington, New Zealand Law Foundation, 2019) at 49-50; and Legislation Guidelines, above n 26, at [3.1], [3.6].

167 Crimes Act, s 237. See also Barnes and Barraclough, above n 8, at [471].

I The Intimate Visual Recording Offences

The intimate visual recording offences are established under Part 9A of the Crimes Act and s 22A of the HDCA. The Crimes Act criminalises the making,168 possessing169 and distributing170 of intimate visual recordings in particular circumstances, while s 22A criminalises the posting of “a digital communication that is an intimate visual recording of a victim” without reasonable excuse or consent.171 Each offence requires satisfaction of the same gateway requirement – that the relevant content be an “intimate visual recording”. If IIDs are not intimate visual recordings, their publication does not attract criminal liability under these offences.

Whether IIDs could be intimate visual recordings has not been addressed by the courts, but has received some academic consideration. Hunt considers that the matter is unclear, describing the lack of explicit reference in s 22A to manipulated images as a “glaring gap”.172 Similarly, Barnes and Barraclough suggest the issue is sufficiently ambiguous to require Parliamentary or judicial clarification.173 However, each has also argued that it would be open to a court to interpret “intimate visual recording” as capturing manipulated images.174

Bringing IIDs within the scope of the intimate visual recording offences is an admirable aspiration insofar as it would provide an immediate legislative solution. However, I argue this could only result from an interpretation so strained it would be at odds with the realities of statutory interpretation. Such an interpretation – though perhaps lexically available – would not be justifiably open to a court, rendering the intimate visual recording offences even less likely to address the phenomenon than has been tentatively suggested.

168 Crimes Act, s 216H.

169 Crimes Act, s 216I.

170 Crimes Act, s 216J.

171 Harmful Digital Communications Act, s 22A(1). See Appendix 1.

172 Hogan, above n 2. See also Vaughan, above n 9; Stace Hammond, above n 9; and Denhardt and Hunt, above n 9, at 15-16.

173 Barnes and Barraclough, above n 8, at [643].

174 Barnes and Barraclough, above n 8, at [505]-[510]; Cornish, above n 9; Denhardt and Hunt, above n 9, at 16; and Email from Arran Hunt to Bella Stuart, above n 71.

A The Intimate Visual Recording Definitions

Section 216G of the Crimes Act and s 4 of the HDCA contain largely “identical” statutory definitions of intimate visual recording, with the HDCA intentionally replicating the earlier Crimes Act definition.175 While the definitions have two minor discrepancies, these do not affect their applicability (or lack thereof) to IIDs, meaning their applications can be considered concurrently.176

The intimate visual recording definitions have three core components:

(1) the relevant content is a “visual recording”;

(2) which is “made in any medium using any device”; and

(3) depicts an intimate subject-matter.177

The broad language of “made in any medium using any device” has not been limited in any way by the courts, meaning an IID is likely capable of meeting this requirement.178 However, the remaining components are substantially more complex, requiring further consideration.

  1. “Visual Recording”

Neither definition exhaustively defines what constitutes a “visual recording”, but both provide that it will include “a photograph, a videotape, or digital image.”179 While a broad interpretation of “digital image” could capture deepfakes, the statutory structure must be acknowledged. Specifically, what may constitute a digital image remains constrained by the overarching

175 Harvey, above n 135, at 363. Compare Harmful Digital Communications Act, s 4; and Crimes Act, s 216G. For the full definitions, see Appendix 1.

176 The Crimes Act refers to the subject of the recording as “person”, while the Harmful Digital Communications Act refers to the “individual”. “Individual” is defined to mean a natural person, while “person” is defined to include non-natural persons. Further, the HDCA definition has a wider scope than the Crimes Act, by virtue of capturing visual recordings “made... with or without” the subject’s knowledge or consent. Conversely, the Crimes Act captures only recordings made without the subject’s knowledge or consent – thus excluding a situation where an intimate visual recording is made with the subject’s consent, but published without. If IIDs were deemed to fall within the intimate visual recording definition, this would mean that an intimate deepfake which was consensually created, but non-consensually published, could only be prosecuted under the HDCA. See generally Law Commission Harmful Digital Communications: The Adequacy of the Current Sanctions and Regimes (August 2012) at [4.32].

177 Crimes Act, s 216G(1); and Harmful Digital Communications Act, s 4(a).

178 See Barnes and Barraclough, above n 8, at [506]; and Harvey, above n 135, at 362.

179 Crimes Act, s 216G(1); and Harmful Digital Communications Act, s 4(a).

requirement that it must still be a visual recording. Therefore, the key inquiry is whether IIDs can be visual recordings.

A comprehensive database search has revealed that, to date, cases prosecuted under the intimate visual recording offences have only involved the products of capture technologies – namely photographs and videos.180 This means the courts have not grappled with what, beyond the obvious, could be a visual recording. Absent any precedent, the question reverts to statutory interpretation, which notoriously requires that legislation’s meaning “be ascertained from its text and in light of its purpose and context.”181 Without purporting to argue for an ideal or orthodox method of statutory interpretation, this analysis will utilise Justice Glazebrook’s four-step “spiral” approach, which requires analysis of the textual language, the provision in the statute’s internal context, the legislative history and the provision in its wider context.182

1 Text

Though “excessive literalism is... perceptible mainly only in the past’s dark days”, legislation’s ordinary grammatical meaning remains central to interpretation.183 Without creating a “fortress” of a dictionary, the Oxford English Dictionary’s definitions of “recording” provide useful insight of the word’s ordinary usage.

“Recording” is defined as:

(1) “The action or an act of setting down in writing or putting on record.”

(2) “The action or process of registering or preserving something (a measurement, event, etc.) by a machine, instrument, or device; the product of such a process.”

(3) “The action or process of recording sound or video images for later reproduction.”184

180 See X v R [2021] NZCA 331 at [3] (video recorded on a cell phone); Benatzky v R [2018] NZCA 413 at [7] (video recorded on a cell phone); Police v B [2017] NZHC 526 (photographs taken on a cell phone); and Pearson v Police [2015] NZHC 410 at [5] (videos recorded on a cell phone).

181 Legislation Act 2019, s 10(1). See also Terminals (NZ) Ltd v Comptroller of Customs [2013] NZSC 139, [2014] 1 NZLR 121 at [74] per Glazebrook J; Commerce Commission v Fonterra Co-Operative Group Ltd [2007] NZSC 36, [2007] 3 NZLR 767 at [22]; Down v R [2012] NZSC 21, [2012] 2 NZLR 585 at [20] per Elias CJ and McGrath J; and Donselaar v Donselaar [1982] NZCA 13; [1982] 1 NZLR 97 (CA) at 114 per Somers J.

182 Susan Glazebrook “Filling the Gaps” in Rick Bigwood (ed), The Statute: Making and Meaning (Wellington, LexisNexis, 2004) 169–76. The spiral approach was used by the Supreme Court in Worldwide NZ LLC v NZ Venue and Event Management Ltd [2014] NZSC 108; [2015] 1 NZLR 1 (SC).

183 Ross Carter and James McHeron Seminar: Statutory Interpretation Update (New Zealand Law Society CLE Ltd, June 2016) at 27. See generally Ross Carter Burrows and Carter Statute Law in New Zealand (6th ed, Wellington, LexisNexis, 2021) at 288; and McKenzie v Attorney General [1991] NZCA 105; [1992] 2 NZLR 14 at 17.

184 Oxford English Dictionary (Online Database) “Recording” accessed 26 June 2023.

Though references to “preserving” and “reproduction” in definitions (2) and (3) appear to limit the meaning of “recording” to the products of capture technologies, it could be argued that definition (1) is broad enough to capture IIDs, in the sense that an IID is something being put down on record. Though this is likely a secondary interpretation, neither would it be correct to say that the ordinary meaning of “recording” totally excludes the products of manipulation technologies. In the absence of a clear ordinary meaning, the interpretative inquiry must turn to context and purpose.185

2 Immediate legislative context

As all law students should know, words “derive colour from those which surround them”.186 Barnes, Barraclough and Hunt have suggested that the immediate legislative context could support an interpretation of “recording” which covers manipulated images. Specifically, they argue the phrases “made in any medium” and “digital image” should or could colour “recording” in a way which enables it to include manipulated images.187 However, it could be argued with equal persuasion that the examples of “photograph” and “videotape” colour “recording” in favour of a definition which restricts it to the products of capture technologies. Thus, the immediate legislative context may not assist in ascertaining meaning as significantly as has been suggested.

3 Legislative history

While neither ordinary meaning nor internal context points to a clear interpretative answer, Parliament’s active contemplation of manipulated imagery when enacting the s 22A offence strongly suggests that Parliament did not intend for the intimate visual recording definition to cover manipulated images.

Because s 216G of the Crimes Act was enacted in 2006, it is unsurprising that deepfakes were not in lawmakers’ direct contemplation. Rather, the Parliamentary materials demonstrate a clear emphasis on capture technologies – reflected most obviously in the name of the “Crimes

185 Commerce Commission v Fonterra Co-Operative Group Ltd [2007] NZSC 36; [2007] 3 NZLR 767 at [22]. See also Carter, above n 183, at 401.

186 Bourne v Norwich Crematorium Ltd [1967] 2 All ER 576 (Ch) at 578.

187 Crimes Act, s 216G; and Harmful Digital Communications Act, s 4 (emphasis added). See Barnes and Barraclough, above n 9, at [506]; Denhardt and Hunt, above n 9, at 16; and Email from Arran Hunt to Bella Stuart, above n 71.

(Intimate Covert Filming) Amendment Bill” itself.188 Further, “filming” and “recording” are used interchangeably throughout the legislative materials and Hansard,189 and explicit references are made to the fact that the offences addressed “up-skirting”, “down-blousing” and “video-voyeurism” – which are all undeniably capture-based activities.190 Put simply by Diane Yates MP and Brian Connell MP, the Bill was “about people taking sneaky pictures in places where they should not” and “putting away modern-day peeping Toms.”191 Of course, it is arguable that this focus is simply attributable to the era’s technological realities, and thus because “legislation applies to circumstances as they arise”, a wider definition of “recording” can be adopted.192 However, as highlighted in Chapter One, the issue of manipulated imagery was actively contemplated by Parliament in enacting the s 22A offence. This was largely prompted by numerous submissions made to the Justice Committee in response to the proposed legislation’s ambiguity and inadequacy in relation to manipulated images.193 Following the Justice Committee’s rejection of recommendations to explicitly incorporate manipulated images into the intimate visual recording definition,194 both Louisa Wall MP and Golriz Ghahraman MP introduced Supplementary Order Papers to include “a visual recording... that has been created or altered” in the definition.195 Both were rejected by the House.196

This explicit contemplation has a significant impact on interpretation. In an analogous interpretative scenario, the Supreme Court in R v Gordon-Smith (No 2) accepted that the Select Committee’s rejection of a recommendation made by the Ministry of Justice demonstrated Parliament’s intention that the legislation not reflect that recommendation.197 Thus, rejected amendments are “just as illustrative of meaning as changes that were made”,198 meaning “caution should prevail if Parliament has rejected opportunities for clearing up known

188 (Emphasis added).

189 See Law Commission, above n 64, at [1.1], [1.4]; (5 May 2005) 635 NZPD 20324; and (14 March 2006) 629 NZPD at 1778.

190 For example, Law Commission, above n 64, at [1.5]; (14 March 2006) 635 NZPD 1778; and (21 November 2006) 635 NZPD at 6689.

191 (16 November 2006) 635 NZPD 6573; and (21 November 2006) 635 NZPD 6689.

192 Legislation Act, s 11. See also Wood-Luxford v Wood [2013] NZSC 153, [2014] 1 NZLR 451 at [37]-[40] per Glazebrook J; and Carter, above n 183, at 274.

193 See above at page 7.

194 Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2) (select committee report) at 4-5.

195 Supplementary Order Paper 2021 (83) Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2); and Supplementary Order Paper 2021 (103) Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2).

196 (15 February 2022) 757 NZPD at 7485.

197 R v Gordon-Smith (No 2) [2009] NZLR 725 (SC) at [69] per McGrath J.

198 Carter and McHeron, above n 183, at 132.

difficulties.”199 Though such legislative intention cannot be ascribed to the Crimes Act definition, to argue that the HDCA and Crimes Act definitions could or should bear such significantly distinct meanings would be to advocate for an undesirable level of legislative inconsistency between the IBSA offences.

Thus, while legislators were aware that manipulated images were likely excluded from the intimate visual recording definition, a conscious decision was made not to explicitly include them.200 This strongly suggests that Parliament did not intend for the intimate visual recording definitions to capture manipulated images.

4 Wider context

The final interpretative consideration requires assessing the definitions within their wider context of legal principles and values. Criminal legislation is subject to unique interpretative principles, including the principles of lenity and fair warning, which stem from the rule of law and “the rights and legitimate expectations of those charged with crime”.201 These principles acknowledge that “criminal law marks the legal boundary of individual liberty” by mitigating judicial activism and the uncertainty it can generate.202 The principle of lenity’s most famous exposition prescribes that “if a penal provision is reasonably capable of two interpretations, that interpretation which is most favourable to the accused must be adopted.”203 Relatedly, the principle of fair warning requires that criminal offences be stated both clearly and in advance, so that an individual’s conduct is “clearly and ascertainably punishable”, thus enabling them to predict the consequences of their actions.204 While these interpretative principles have bowed to purposive interpretation in recent decades,205 they are “very far from dead” and courts remain highly cautious when interpreting criminal statutes.206

199 C v Director of Public Prosecutions [1995] UKHL 15; [1996] AC 1 (HL) at 28.

200 Carter, above n 183, at 264.

201 Julia Tolmie and others Criminal Law in Aotearoa New Zealand (Wellington, LexisNexis, 2022) at 66-67; Simester and Brookbanks, above n 28, at 43; and Carter, above n 183, at 301-302.

202 Legislation Guidelines, above n 26, at [24.2].

203 Sweet v Parsley [1969] UKHL 1; [1970] AC 132 (UKHL) at 149.

204 R v Rimmington [2005] UKHL 63, [2006] 1 AC 459 at [33]. See generally Simester and Brookbanks, above n 28, at 33-34, 43; Tolmie and others, above n 201, at 59; and Glanville Williams Textbook of Criminal Law (4th ed, London, Thomson Reuters, 2015) at [1-019]-[1-026].

205 See discussion in Kirby v Police [2012] NZHC 2397, [2012] NZAR 975 at [12] per Kós J; and R v Pratt [1990] 2 NZLR 129 (CA) at 133. See generally Carter, above n 183, at 301; and Williams, above n 204, at [1-019].

206 Carter, above n 183, at 431-432. See also Chen v R [2019] NZCA 299, (2019) 29 CRNZ 113 at [44].

Both principles favour an interpretation of “recording” which excludes manipulated imagery, this being both the interpretation more favourable to the accused and the more clearly ascertainable primary meaning. Further, this interpretation of “recording” is the most consistent with the right to freedom of expression as affirmed in the New Zealand Bill of Rights Act.207 In the absence of countervailing legislative intent, of which there is none, this interpretation must be preferred.208

5 Conclusion

While a definition of “recording” which captures manipulated images (and thus IIDs) is arguably lexically available, clear Parliamentary intention and the application of interpretative principles suggest that a court would be exceedingly hesitant to adopt this meaning. To do so would require an unrealistically – arguably unacceptably – creative judicial approach, one veering into the realm of Parliamentary usurpation and “curial redrafting”.209 Parliament – with condemning clarity – rejected the opportunity to explicitly incorporate manipulated images into the intimate visual recording definition in the HDCA, despite knowing the proposed definition would likely exclude them. To interpret “recording” in an expansive manner would be for a court to “write in what the legislature has not thought fit to include.”210 I suggest this would be an overly strained interpretation, and one the courts would not deem open to them.

  1. “Intimate”

Even if “recording” can be read expansively, the IID’s content must also be “intimate” in the manner prescribed by the statutory definitions. The definitions provide five exhaustive circumstances when a visual recording will be deemed intimate. The first three require that a subject be “in a place which, in the circumstances, would reasonably be expected to provide privacy” and depict them “naked, or [with] his or her genitals, pubic area, buttocks, or female breasts exposed, partially exposed, or clad solely in undergarments”, “engaged in an intimate sexual activity” or “engaged in showering, toileting, or other personal bodily activity that involves dressing or undressing.”211 The remaining two circumstances capture situations

207 New Zealand Bill of Rights Act, s 14.

208 New Zealand Bill of Rights Act, ss 4-6.

209 Taylor v Attorney General [2014] NZHC 2225, (2014) 10 HRNZ 31 at [30]. See also Northern Milk Ltd v Northern Milk Vendors Association Inc [1988] NZCA 80; [1988] 1 NZLR 537 (CA) at 2; and Simester and Brookbanks, above n 28, at 45.

210 Carter, above n 183, at 298.

211 Crimes Act, s 216G(1)(a); and Harmful Digital Communications Act, s 4(1)(i).

colloquially referred to as “up-skirting” or “down-blousing”, where a recording is made of a subject’s “naked or undergarment-clad genitals, pubic area, buttocks, or female breasts... from beneath or under [their] clothing” or “through [their] outer clothing in circumstances where it is unreasonable to do so.”212 Two substantive issues arise from these definitions.

In the context of capture-based intimate visual recordings, the requirement that a recording be of a subject “in a place which... would reasonably be expected to provide privacy” is a logical qualifier, excluding (for example) the nude streaker at an All Blacks game.213 In the deepfake context, this requirement creates a significant lacuna. Barnes and Barraclough suggest that the requirement “could merely mean depicting someone in that setting”. Thus, it would be sufficient for an IID to be “set” in a location which would reasonably be expected to provide privacy.214 However, this would allow criminal liability to be avoided simply by designing an IID to be “set” in a clearly public location such as a park or supermarket. This demonstrates an inherent difference between capture and manipulation-based intimate images: the harm of the latter does not depend on a public/private distinction. IIDs should attract criminal liability regardless of the scene in which they are set, thus rendering the intimate visual recording offences unable to comprehensively criminalise the harms of IID publication.

An issue also arises regarding what, and to whom, these definitions can apply. In an IID, the face and body depicted will not usually belong to the same subject. Nor, in many (perhaps most) cases, will the subject have engaged in the precise activity being depicted. To demonstrate how this is problematic, one aspect of the definitions requires that “the [subject] is naked or has his or her genitals, pubic area, buttocks or female breasts exposed...”.215 In an IID, the subject will likely not have “his or her” body parts, nor necessarily any intimate details of their own life, exposed.216 This leads Chamberlain to argue conclusively that an IID “is not an ‘intimate visual recording’ as it is not the victim’s genuine nudity.”217 Though the matter is perhaps less certain

212 Crimes Act, s 216G(1)(b); and Harmful Digital Communications Act, s 4(1)(ii).

213 Crimes Act, s 216G(1)(a); and Harmful Digital Communications Act, s 4(1)(i).

214 Barnes and Barraclough, above n 9, at [506].

215 Crimes Act, s 216G(1)(a)(i); and Harmful Digital Communications Act, s 4(1)(a)(i)(A).

216 Stace Hammond, above n 36, at 4. See also Chamberlain, above n 7, at [46.7.4.2]; and Delfino, above n 71, at 902. Barnes and Barraclough suggest there is a transition in drafting in the definitions which could allow their application to a situation where the depicted body and face do not belong to the same subject. Compare Barnes and Barraclough, above n 9, at [509]-[510].

217 Chamberlain, above n 7, at [46.7.4.2].

than Chamberlain asserts, it remains that the application of these intimacy requirements, if possible at all, will be highly unnatural in a deepfake context.218

D Conclusion

I suggest the ability of a court to bring IIDs within the scope of the intimate visual recording definitions, and thus offences, is even less viable than has been tentatively suggested in the scholarship. Though an interpretation of “recording” which would achieve this is lexically available, the wider interpretative context demonstrates it would likely not be adopted. As acknowledged by Hunt himself, this would require “an adventurous court, and the right matter” and would be highly susceptible to appeal.219 Even if an IID were deemed capable of being a “recording”, the inherent differences between capture and manipulation-based images mean the intimate visual recording definitions would apply uncomfortably and inconsistently to IIDs. This renders the intimate visual recording regimes both an unlikely and an undesirable mechanism for criminalising IID publication.

II Section 22 of the Harmful Digital Communications Act

Section 22 of the HDCA criminalises the causing of harm by posting a digital communication. The offence applies exclusively to communications which are not intimate visual recordings, making recourse under s 22 only available if the intimate visual recording definitions do not capture the IID in question.220

Liability under s 22 requires satisfaction of three elements:

(1) the perpetrator “posts a digital communication with intention that it cause harm to a victim”;

(2) “posting the communication would cause harm to an ordinary reasonable person in the position of the victim”; and

(3) “posting the communication causes harm to the victim.”221

218 Stace Hammond, above n 36, at 4.

219 Email from Arran Hunt to Bella Stuart, above n 71.

220 Harmful Digital Communications Act, s 22(4). See Appendix 1.

221 Harmful Digital Communications Act, s 22(1). See also Police v B [2017] NZHC 526, [2017] 3 NZLR 203 at [16].

Two matters can be dealt with relatively briefly. An IID is capable of being a “digital communication”, defined broadly as including “any form of electronic communication” and “any material that is communicated electronically.”222 This broad definition has not been limited by the courts, which have described it as “expansive” and not purporting to “limit the communication to certain media or platforms”.223 The statutory requirement that a digital communication be “posted” is similarly broad, meaning to “transfer, send, publish, disseminate, or otherwise communicate by means of a digital communication... any information, whether truthful or untruthful, about the victim” – described by Judge Doherty as “wide-ranging and non-exhaustive”.224 Thus, if an IID is distributed digitally, these requirements will likely be established.

However, substantial issues arise regarding the requirements that the perpetrator intend to cause harm, and that harm be (reasonably) experienced by the victim. The standard of harm required in all circumstances is “serious emotional distress”. What this specifically includes is somewhat unclear, with Justice Downs describing it as “a broad compendious expression that means what it says.”225 More helpfully, the Law Commission has clarified that serious emotional distress extends “to a full range of serious negative consequences which can result from offensive communication, including physical fear, humiliation, mental and emotional distress.”226

A Intent to Cause Harm

The intention to cause harm requirement – otherwise referred to as a “motivation threshold” – significantly restricts s 22’s ability to effectively address IID publication by placing a disproportionate evidential burden on the prosecution.227

Clare McGlynn and Erika Rackley suggest that, in the IBSA context, motivation thresholds fail to address conduct beyond the “paradigmatic” revenge porn scenario, which far from encapsulates every manifestation of IBSA.228 This is particularly relevant regarding the

222 Harmful Digital Communications Act, ss 4, 22. See Appendix 1.

223 R v Iyer [2016] NZDC 23957, [2017] NZFLR 119 at [28].

224 R v Iyer, above n 223, at [29]; and Harmful Digital Communications Act, s 4.

225 Harmful Digital Communications Act, s 4; and Police v B, above n 221, at [25]. See generally Brittin v Police [2017] NZHC 2410, [2018] 2 NZLR 147 at [32].

226 Law Commission, above n 176, at [1.26].

227 McGlynn and Rackley, above n 1, at 555.

228 McGlynn and Rackley, above n 1, at 555.

publication of IIDs, which is recognised as being motivated by a wide range of factors. While some IIDs “will be nothing less than cruel weapons meant to terrorise and inflict pain”, other reported motivations include sexual and financial gratification, notoriety, bonding among peers and mere entertainment.229 For example, the rise of deepfake commodification creates a purely economic incentive for the publication of IIDs which does not necessarily involve any intention to cause harm.230 Further, research has identified that users who traded in IIDs on private-sharing platforms often did so with “little to no identifying information” of the subjects, demonstrating that “perpetrators appeared more motivated by establishing and maintaining bonds and esteem among peers than other motivations such as revenge.”231 Thus, the motivations behind IID publication can be entirely unrelated to the subject, including situations where perpetrators do not know the subject, or have no intention that they will ever know the IID exists.232 In these circumstances, it may be extremely difficult to prove an intention to cause serious emotional distress beyond a reasonable doubt.

B The Victim’s Experience of Harm

Even if the requisite intention is established, s 22 also requires that a victim both have experienced serious emotional distress and that this experience was reasonable. This composite subjective-objective test has significant implications for s 22’s capacity to meaningfully vindicate a victim’s interests.

The description of harms discussed in Chapter Two demonstrates that the publication of IIDs can cause serious emotional distress. Despite this reality, requiring this to be proven beyond a reasonable doubt is problematic because it promotes a standardisation of victims’ experiences, and makes victims unnecessarily and detrimentally central to criminal proceedings.

Specifically, these elements mandate that “distress is the appropriate and expected impact of these practices”.233 This is particularly detrimental given the gendered nature of the

229 Chesney and Citron, above n 12, at 1774. See also Flynn and others, above n 54, at 1351; and McGlynn and Rackley, above n 1, at 335.

230 Adjer and others, above n 93, at 5; Wang, above n 13, at 420; and Dunn, above n 49, at 12.

231 Nicola Henry and Asher Flynn “Image-Based Sexual Abuse: Online Distribution Channels and Illicit Communities of Support” (2019) 25 Violence Against Wom 1932 at 1943. See also Toparlak, above n 62, at 7.

232 McGlynn and Rackley, above n 1, at 555. See also Jonathan Clough “Revenge Porn: Criminal Law Responses” (2016) 132 Precedent 30 at 31-32.

233 McGlynn and Rackley, above n 1, at 555. See also Alisdair Gillespie “‘Trust Me, It's only for Me’: ‘Revenge Porn’ and the Criminal Law” (2015) 11 Crim LR 866 at 879.

phenomenon.234 While traditional feminism historically emphasised women’s common experiences, modern feminism has challenged this universalisation through Liz Kelly’s “continuum” of sexual violence.235 Kelly argued that sexual violence should be categorised on a spectrum to accurately reflect the fact that how women react to their experiences of sexual violence will vary.236 These variations result from “a complex range of factors” including the relationship between victim and perpetrator, whether the event was isolated or part of a pattern of abuse, the victim’s prior experiences and the victim’s intersecting racial, sexual and social identities.237 Thus, mandating serious emotional distress invalidates and ignores the fact that individuals can and will react differently to their experiences of IBSA.

Further, making the victim’s experience an element of the offence requires them to “expose even more of their private lives to the public” with no guarantee that a court will believe them.238 The early application of s 22 demonstrates this point. In R v Iyer, Judge Doherty concluded that a woman who had had her semi-nude images posted on Facebook had not experienced serious emotional distress despite her testimony that she became depressed, frustrated, angry, anxious and unfit for work.239 Though reversed on appeal, this decision demonstrates the risk that judges may downplay or fundamentally misunderstand a victim’s experience. Though society is developing an increasingly nuanced understanding of sexual violence and gender-based harms, IIDs remain on the periphery of mainstream understanding, exacerbating this risk.240 Thus, an offence which fails to recognise the inherent harms of IID publication not only places a substantial burden on the victim but risks trivialising and invalidating their experience.241

Beyond merely convincing a court that a victim has experienced serious emotional distress, whether this was a reasonable reaction remains open to scrutiny, to be determined using “part fact, part value judgement.”242 This invites defence counsel to challenge the reasonableness of

234 See above discussion at page 14.

235 Liz Kelly “The Continuum of Sexual Violence” in Jalna Hanmer and Mary Maynard (eds) Women, Violence and Social Control (London, Palgrave Macmillan, 1987) at 48.

236 Kelly, above n 235, at 49.

237 Kelly, above n 235, at 48. See generally Natalie Sokoloff and Ida DuPont “Domestic Violence and the Intersections of Race, Class, and Gender” (2005) 11 Violence Against Wom 38 at 41; and Beth Richie “A Black Feminist Reflection on the Antiviolence Movement” (2000) 25 Signs 1133.

238 Delfino, above n 71, at 919.

239 R v Iyer, above n 223, at [13]-[15], [75].

240 See above discussion at page 9.

241 Delfino, above n 71, at 919-922; and Citron, above n 153, at 392-404.

242 Police v B, above n 221, at [23].

a victim’s lived experience and testimony, risking their re-victimisation and re-traumatisation, and potentially deterring them from reporting their experiences in the first place. To demonstrate, the Law Commission described in its 2015 report The Justice Response to Victims of Sexual Violence that:243

A significant number of complainants are ‘opting out’ of the very system that is supposed to recognise their rights and support their needs... [because] they perceive the formal criminal justice system to be alienating, traumatising and unresponsive...

The Law Commission notes that victims felt the criminal justice system “victimises survivors” and imposes a “second sentence on the victim”.244 One victim described their experience as a “horrendous, long, arduous, disempowering, re-traumatising and re-victimising process” that only “superwoman” could cope with.245 Substantially, these experiences stemmed from how victims were treated during cross-examination, for example where their testimonies were challenged by specialist expert evidence or when manipulative questioning techniques were employed.246 Further, there is also a risk that the “reasonable person” test is inherently gendered, given its genesis as the “reasonable man” test “endowed with attributes that are stereotypically and exclusively male.”247 Wendy Parker suggests that a change in name has “not altered the standard itself”, and thus the “reasonable person” continues to exclude women’s perspectives.248 Regarding a highly gendered phenomenon, it is of significant concern that a standard of reasonableness may be imposed which does not reflect the intersecting and diverse female experience.

The Sexual Violence Legislation Act 2021 has addressed some of the Law Commission’s findings, amending trial processes and evidence rules to better support victims. Many of these amendments apply only to physical sexual violence trials.249 However, some operate to benefit

243 Law Commission The Justice Response to Victims of Sexual Violence: Criminal Trials and Alternative Processes (SP23534, 2015) at iv; and Flynn and others, above n 54, at 1352.

244 Law Commission, above n 243, at 26.

245 Law Commission, above n 243, at 26.

246 Law Commission, above n 243, at 79-80. See also Harvey, above n 135, at 383; and Olivia Smith and Tina Skinner “Observing Court Responses to Victims of Rape and Sexual Assault” (2012) 7 Fem Criminol 298 at 320.

247 Wendy Parker “The Reasonable Person: A Gendered Concept Claiming the Law” (1993) 28 VUWLawRw 105 at 108. See also Naomi Cahn “Looseness of Legal Language: The Reasonable Woman Standard in Theory and in Practice” (1991) 77 Cornell L Rev 1398; and Kathleen Kenealy “Sexual Harassment and the Reasonable Woman Standard” (1992) 8 Lab Law 203.

248 Parker, above n 247, at 105.

249 For example, see Sexual Violence Legislation Act 2021, s 8.

victims more broadly, such as by strengthening judges’ obligations to intervene in unacceptable lines of questioning,250 and by increasing obligations on judges to address misconceptions about sexual offending.251 While these are undeniably valuable developments, Elisabeth McDonald notes that years of practical and legislative change in the sexual violence context have not resulted in equivalent changes in the experiences of complainants.252 McDonald argues that this is due to the inherent nature of the adversarial system, which makes victims feel like they are “on trial”.253 Thus, even with improved protections against what can be termed “macho adversarialism”, adversarialism inherently requires that the defence argue the offence’s elements cannot be proven beyond a reasonable doubt.254 When a victim’s lived experience is made part of an offence’s elements, those experiences will be challenged. Thus, it is the adversarial system itself, not only its most extreme manifestations, which causes harm to victims when their experiences and emotions are incorporated into an offence. The limitations of s 22 in its capacity to address capture-based IBSA were recognised by Parliament and addressed through the introduction of the less onerous s 22A offence. I suggest that s 22 is similarly limited in its capacity to address manipulation-based IBSA, such as the publication of IIDs.

C Conclusion

Fundamentally, s 22 “misses the point” by failing to recognise that the harms of IBSA are inherent in the conduct itself, rather than arising from the motivations behind or consequences of that conduct.255 This is entirely understandable given that s 22 was designed to address cyberbullying.256 However, if relied on to address the harms inflicted by IID publication, s 22 will both potentially exclude harmful conduct by virtue of the motivation threshold and pose a significant risk of re-traumatising victims. Therefore, I suggest s 22 of the HDCA does not provide an adequate solution to address the publication of involuntary intimate deepfakes.

250 For example, see Sexual Violence Legislation Act, ss 9, 29.

251 Evidence Act 2006, s 126A. See also Te Kura Kaiwhakawā Institute of Judicial Studies Responding To Misconceptions About Sexual Offending: Example Directions for Judges and Lawyers (August 2023).

252 Elisabeth McDonald Rape Myths as Barriers to Fair Trial Process (Christchurch, Canterbury University Press, 2020) at 39.

253 McDonald, above n 252, at 40; and Smith and Skinner, above n 246, at 304.

254 Smith and Skinner, above n 246, at 304.

255 Delfino, above n 71, at 922. See also Citron and Franks, above n 35, at 387.

256 New Zealand Law Society “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill” at [3.2].

III The Films, Videos, and Publications Classification Act 1993

The FVPCA establishes New Zealand’s content censorship regime, primarily through the regulation of “objectionable” publications.257 The Classification Office – an independent Crown Entity – is responsible for the classification of publications which are submitted to it.258 Under the FVPCA, it is an offence to possess, make, copy, import, distribute, supply or display publications which have been classified as objectionable, with penalties varying depending on the accused’s mens rea.259 Therefore, if IIDs would be classified as objectionable publications under the FVPCA, the broad scope of these offences would comprehensively criminalise their publication.

An objectionable publication is one which “describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence in such a manner that the availability of the publication is likely to be injurious to the public good.”260 The Court of Appeal has explained that this definition serves two purposes – defining the limits of censorship with regard to the publication’s subject-matter (providing the “subject-matter gateway”), and confining censorship only to such publications as are “injurious to the public good”.261 Publications which promote, support, or tend to promote or support certain behaviours, including child sexual exploitation, bestiality or sexual violence, are deemed to be objectionable under the FVPCA.262 Beyond these deemed categories of content, a publication will also be objectionable if it is nonetheless injurious to the public good. This is determined by the Classification Office with reference to the statutory “content criteria”263 and “contextual criteria”264 laid out in ss 3(3) and 3(4) of the FVPCA.265

257 Films, Videos, and Publications Classification Act, ss 3, 23. Materials can also be classified under the lesser standard of restricted publications, but such publications result only in age-restrictions, rather than attracting bans and inherent criminal liability. As such, whether IIDs would be restricted publications is not relevant for the present analysis. See Films, Videos, and Publications Classification Act 1993, ss 3A, 3B.

258 Films, Videos, and Publications Classification Act, ss 76-77.

259 Films, Videos, and Publications Classification Act, ss 123-124. See also Barnes and Barraclough, above n 8, at [421]. See Appendix 1.

260 Films, Videos, and Publications Classification Act, s 3(1). See Appendix 1.

261 Living Word Distributors v Human Rights Action Group [2000] NZCA 179; [2000] 3 NZLR 570 (CA) at [25].

262 Films, Videos, and Publications Classification Act, s 3(2).

263 Films, Videos, and Publications Classification Act, s 3(3).

264 Films, Videos, and Publications Classification Act, s 3(4).

265 Terminology of “content criteria” and “contextual criteria” was first adopted in New Truth & TV Extra (4 November 1994 issue) (1996) 3 HRNZ 162 (FLBR) at [10].

A Practical Limitations

As a preliminary matter, even if IIDs were inherently “objectionable” publications, the FVPCA is limited in its capacity to be an adequate or efficient mechanism to criminalise their publication.

Specifically, the Classification Office is widely recognised as being overwhelmed by the ever-increasing volume of content emerging as a result of technological advancements. This is recognised by the Office itself, which has described New Zealand’s media regulatory system as “fragmented and out of date... based on laws that reflect a pre-internet world...”.266 The model, “originally designed to censor manuscript materials”, has been challenged since its inception, facing difficulties first in classifying films,267 then DVDs,268 video games269 and now the Internet.270 Further, the FVPCA restricts the classes of individuals who may submit publications to the Office for classification, allowing only the chief executive of the New Zealand Customs Service, the Commissioner of Police, the Secretary of Internal Affairs and certain online content hosts to do so.271 Any other person may only submit a publication for classification with the leave of the Chief Censor.272 Thus, to be classified as objectionable, a particular IID would have to be brought to the attention of one of these specified individuals, or the Chief Censor would have to grant leave for its submission. This constitutes a hurdle for victims or other private individuals endeavouring to bring an IID within the FVPCA’s scope.

It is worth noting that the Department of Internal Affairs is currently examining these inadequacies as part of its broader Safer Online Services and Media Platforms content regulatory review.273 At the time of writing, the review remains in its early stages, and thus I

266 Classification Office Statement of Intent (2021 – 2025) (February 2021) at 10; David Harvey and Rosemary Tobin “New Media and the Challenge of Convergence” in David Harvey, Rosemary Tobin and Paul Sumpter Media Law - A to Z of New Zealand Law (online ed, Thomson Reuters, 2017) at [40.1.3]; and Barnes and Barraclough, above n 9, at [425].

267 Harvey and Tobin, above n 266, at [40.1.3].

268 David Wilson “Responding To The Challenges: Recent Developments in Censorship Policy in New Zealand” (2007) 30 Soc Polic J NZ 65 at 74.

269 Harvey and Tobin, above n 266, at [40.1.3].

270 Katie Kenny “Chief Censor David Shanks Says An Entirely New Media Regulator May be Needed” Stuff (online ed, New Zealand, 23 October 2019); and Mediawatch “Battle Against Online Harm Beefs Up Censor’s Power” RNZ (New Zealand, 21 March 2021).

271 Films, Videos, and Publications Classification Act, s 13(1).

272 Films, Videos, and Publications Classification Act, s 13(2) and (3).

273 Department of Internal Affairs “Media and Online Content Regulation: Safer Online Services and Media Platforms Review” <https://www.dia.govt.nz/media-and-online-content-regulation>.

will not dedicate significant time to its discussion. However, it is accepted that prospective reform could remedy some of the FVPCA’s current deficiencies and thus the regime’s practical capacity to effectively address IID publication remains to be seen.

B Legal Limitations

Beyond practical limitations, the more pertinent question is whether IIDs are legally “objectionable” publications under the FVPCA’s requirements. The Department of Internal Affairs has explicitly stated that prospective reforms will not affect what types of content are prohibited. Thus, the following conclusions regarding the legal status of IIDs will not be affected by the content regulatory review.274

Because “publication” is defined broadly in the FVPCA, IIDs are easily capable of meeting this statutory requirement.275 Further, an IID would clearly be objectionable if it depicted content which is deemed objectionable under the FVPCA, such as the promotion or support of sexual exploitation of children or sexual violence.276 Even under these deeming provisions, the requirements of “promoting” or “supporting” (or tending to do so) are a “high threshold” requiring more than a mere depiction to attract criminal liability.277

However, to comprehensively criminalise IID publication, IIDs must be inherently “objectionable” regardless of the severity of the content they depict. For the reasons that follow, I suggest this is likely not the case.

1 The subject-matter gateway

Most problematically, the subject-matter gateway appears to exclude IIDs which are not explicitly sexual from the regime’s scope, as it requires that a publication “describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence...”.278

274 Department of Internal Affairs, above n 126, at 7.

275 Films, Videos, and Publications Classification Act, s 2. See generally Barnes and Barraclough, above n 8, at [399]-[401]; Ursula Cheer, above n 131, at 641; and Harvey, above n 135, at 197.

276 Films, Videos, and Publications Classification Act, s 3(2).

277 Re Codd FLBR, 7 April 2017, 2017 WL 3845824 at [20]. See also Moonen v Film and Literature Board of Review, above n 108.

278 Films, Videos, and Publications Classification Act, s 3(1).

Prior to the introduction of intimate visual recording offences, intimate covert filming was addressed under the FVPCA.279 However, in 2000 the Court of Appeal “expressly excluded from the ambit of censorship law” intimate covert films.280 It did so by narrowing the subject-matter gateway to capture only those publications “that can fairly be described as dealing with matters of the kinds listed... [which] tends to point to activity...”.281 Practically, this excluded intimate covert films which did not depict the activity of sex.282 The effect of this is illustrated in a subsequent decision made by the Classification Office, which found that an intimate covert film of underage boys undressing completely in a changing room “could not be said to deal with matters of sex”.283 The consequences of this decision in the context of child-exploitation were addressed swiftly by Parliament through the Films, Videos, and Publications Classification Amendment Act 2005, which explicitly deemed nude and partially nude images of children and young persons as publications dealing with matters of sex.284

While this left an obvious gap regarding the intimate covert filming of adults, this decision was intentional. In its 2004 report on Intimate Covert Filming, the Law Commission recommended that the intimate covert filming of adults should not be addressed under the FVPCA. First, it suggested this would “depart from [the FVPCA’s] focus on... the nature of images rather than the means by which they were obtained”.285 Second, the Commission considered the requirement that a publication be “injurious to the public good” did not easily operate where harm is caused to an individual. Finally, it suggested that amending the FVPCA would introduce uncertainty to the censorship laws and that having independent legislation was consistent with international developments.286 For these reasons, the intimate visual recording offences were introduced separately in Part 9A of the Crimes Act.287

279 Ursula Cheer, above n 131, at 559.

280 Wilson, above n 268, at 73. See also Living Word Distributors v Human Rights Action Group, above n 261.

281 Living Word Distributors v Human Rights Action Group, above n 261, at [28] per Richardson P.

282 Ursula Cheer, above n 131, at 660.

283 Re Untitled Video Recording Public Swimming Pools At Papatoetoe Office of Film and Literature Classification 300507, 27 May 2003. See also Law Commission, above n 64, at [3.14].

284 Films, Videos, and Publications Classification Amendment Act 2005, s 4. See also Films, Videos and Publications Classification Act, s 3(1A).

285 This statement further supports the capture-based interpretation of the intimate visual recording definitions advanced above from page 25.

286 Law Commission, above n 64, at [3.21].

287 Crimes (Intimate Covert Filming) Amendment Act 2006, s 4. See now the Crimes Act, part 9A.

Thus, while the FVPCA would capture the paradigmatic “deepfake pornography”, it unjustifiably and undesirably excludes IIDs of adults which do not describe, depict, express, or otherwise deal with the activity of sex – such as those depicting mere nudity, intimate bodily activities or up-skirting and down-blousing. While these types of content were intended to be dealt with under the intimate visual recording offences, this leaves a conspicuous gap when dealing with IIDs, based on the previous conclusion that IIDs are likely not intimate visual recordings.288 This would leave a highly problematic loophole if the FVPCA were to be relied on as the sole mechanism to criminalise the publication of IIDs.

2 Injury to the public good

While explicitly sexual IIDs satisfy the subject-matter gateway, they must also be injurious to the public good to attract criminal liability. As a legal test, “injurious to the public good” has been described as “extraordinarily vague... leaving much to the subjective view of the tribunal.”289 A particularly “hard-core” IID could likely satisfy this requirement. However, it is unlikely that IIDs are inherently injurious to the public good, regardless of their extremity.

In determining injury to the public good, the FVPCA’s content criteria require the Classification Office to give weight to the extent and degree to which a publication degrades, dehumanises or demeans any person.290 On this basis, two arguments could be mounted to establish that sexual IIDs are inherently objectionable – that pornography inherently demeans women, or if not, that pornography demeans women when it is non-consensual or involuntary. Without purporting to conclude on pornography’s ethical or moral status,291 the argument that pornography inherently demeans women does not align with the current approach taken by the

288 See above discussion at page 33.

289 Police v News Media Ownership Ltd [1975] 1 NZLR 610 (CA) at 616 per McCarthy P.

290 Films, Videos, and Publications Classification Act, s 3(3)(c).

291 The relationship between pornography (and its potential to demean women) and feminist legal theory is complex. These complexities are closely related to the FVPCA, which was introduced largely in response to the 1987 Ministerial Committee of Inquiry into Pornography’s findings that pornography demeans and degrades women by objectifying them, eroticising their “sexual subordination” and perpetuating myths about their sexuality. For more on the FVPCA’s history, see Films, Videos, and Publications Classification Bill 1992 (select committee report) at [3.1]; (22 June 1993) 536 NZPD at 15992; (29 July 1993) 537 NZPD at 17051, 17057; (17 August 1993) 537 NZPD at 17494; (2 December 1992) 532 NZPD 12777; and Re Baise Moi [2005] NZAR 214 (CA) at [29]. On the relationship between feminist legal theory and pornography generally see Lara Karaian “The Troubled Relationship of Feminist and Queer Legal Theory to Strategic Essentialism: Theory/Praxis, Queer Porn, and Canadian Anti-Discrimination Law” in Martha Fineman, Jack Jackson and Adam Romero (eds) Feminist and Queer Legal Theory: Intimate Encounters, Uncomfortable Conversations (Surrey, Ashgate Publishing Limited, 2009) at 382; and Nicola Lacey “Theory into Practice? Pornography and the Public/Private Dichotomy” (1993) 20 JLS 93 at 93.

Classification Office – as demonstrated by the very existence of mainstream pornography.292 Thus, the Classification Office clearly considers something beyond mere pornographic content necessary to injure the public good.

A more convincing argument is that pornography becomes injurious to the public good when it is non-consensual because this demeans, degrades and dehumanises the subject.293 This would inherently capture IIDs by virtue of the fact they are involuntary. However, the relationship between the FVPCA and intimate visual recording offences suggests this is likely not the case. The intimate visual recording definitions in both the HDCA and the Crimes Act require that some part of the content be non-consensual – whether because it was made and/or distributed without the subject’s consent.294 Therefore, if pornography became objectionable when it was non-consensual, every intimate visual recording would be objectionable under the FVPCA. However, law-makers, law enforcement and the Classification Office itself appear to reject the idea that there is a perfect correlation. In Parliamentary debates regarding the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill, multiple Members of Parliament expressed that intimate visual recordings of adults were not adequately or comprehensively captured by the FVPCA.295 Similarly, the Ministry of Justice has noted that “[i]n some circumstances, an intimate visual recording may fit the definition of an objectionable publication.”296 Further, in considering whether intimate visual recordings could be captured by the FVPCA, Acting Detective Senior Sergeant Aisling Davies said “If the victim is under 18 years old, images could be classified as objectionable...”.297 Finally, in its submission on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill, the Classification Office itself stated explicitly that the intimate visual recording definition “has a

292 Between 2016 and 2019, the only sexual publications classified as objectionable were those involving the sexual exploitation of children, bestiality, sexual violence, urination/excretion and necrophilia. Analysis of the Classification Office Database demonstrates that this trend continues in 2023, with only publications involving the sexual exploitation of children, advice as to the procuring, grooming and abuse of children, sexual violence and bestiality receiving objectionable classifications. See 2016/2017 Annual Report of the Office of Film and Literature Classification (2017) at 12; Office of Film and Literature Classification Annual Report 2017/2018 (2018) at 10; and Office of Film and Literature Classification Annual Report 2018/2019 (2019) at 10.

293 Films, Videos, and Publications Classification Act, s 3(3)(a) and (c).

294 See above discussion at footnote 176.

295 For example, see (10 November 2021) 755 NZPD 6076, 6078; and (2 March 2022) 757 NZPD 7799.

296 Ministry of Justice Initial Briefing: Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill (4 May 2021) at [15] (emphasis added).

297 Breanna Barraclough “A Broken Law: Rise of For-Profit Porn Sites Leaving ‘Revenge Porn’ Victims Stuck” 1News (online ed, New Zealand, 18 June 2021) (emphasis added).

broader application” than the objectionability standard.298 Thus, these comments demonstrate a belief or understanding that there are instances when intimate visual recordings will not also be objectionable under the FVPCA.

This lack of alignment arises from the differing rationales which underlie each regime. Specifically, the FVPCA is designed to mitigate and prevent societal harms, while the intimate visual recording offences address harms suffered by individuals. Former Chief Censor Dr Andrew Jack has described that harm to the public good “is far more than physical or emotional injury to an individual, it includes changes in attitudes, behaviour and beliefs...”.299 While non-consensual pornography, and thus IIDs, will undeniably harm the individual subject, corresponding societal harm is not inevitable.

C Conclusion

Fundamentally, the subject-matter gateway means the FVPCA cannot adequately criminalise IID publication because it would exclude IIDs which are not explicitly sexual but which are nonetheless significantly harmful to the individual victim. Even if this were not the case, based on the current approach of the Classification Office, IIDs would not be deemed inherently objectionable. Rather, this classification would depend on an IID’s particular content. Thus, reliance on the FVPCA would create baseless distinctions between IIDs depending on their particular content, rather than recognising and addressing the phenomenon as inherently harmful in all its manifestations.

IV The Harassment Act 1997

Under the Harassment Act the “most serious types of harassment” attract criminal liability.300 To establish criminal harassment, it must be shown that harassment has occurred, and that the perpetrator either intended the harassment to cause the other person to fear for their own or their family’s safety, or knew that such fear was reasonably likely.301 Harassment

298 Classification Office “Submission to the Justice Committee on the Harmful Digital Communication (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021” at [2].

299 2016/2017 Annual Report of the Office of Film and Literature Classification (2017), above n 292, at 8. See also Police v News Media Ownership [1975] 1 NZLR 610 (CA) at 615.

300 Harassment Act 1997, s 8. See generally R v D [2000] 2 NZLR 641 (CA) at [12].

301 Harassment Act, s 8(1). See Appendix 1.

occurs when a person is subjected to a “pattern of behaviour that includes doing any specified act” either twice within a 12-month period or as a “continuing act” – where one specified act “continues to have effect over a protracted period”.302 The continuing act standard was incorporated to recognise the ongoing harms caused when harassment occurs in the online space, where information has substantially greater persistence.303

I suggest that criminal harassment cannot effectively or comprehensively address IID publication on the basis that, much like s 22 of the HDCA, the offence fails to recognise the harm inherent in the phenomenon.

It is possible that IID publication could constitute a “specified act”, a category exhaustively defined in s 4 of the Harassment Act. Of those acts defined, it is arguable that IID publication could constitute “giving offensive material to [a] person or leaving it where it will be found by, given to, or brought to [their] attention...” under s 4(e),304 or acting in a way which causes the subject to (reasonably) fear for their safety under s 4(f). Regarding s 4(e), what is “offensive” has been interpreted broadly to include aggressive and hurtful behaviour, and that which annoys or insults.305 Regarding s 4(f), “safety” includes mental well-being. Given the significant psychological harms described in Chapter Two, it is likely both specified acts could be easily established in the context of IID publication.306 Despite this, the very existence of the specified act requirement limits the offence’s efficacy in the IBSA context.307 Given the phenomenon’s novelty and the disparity in views as to its harmfulness, a favourable decision by a court is far from inevitable. This places a substantial burden on the prosecution to establish the publication of IIDs as “offensive”, or to otherwise establish that the victim feared for their safety, and that this fear was reasonable. This burden is unnecessary and undesirable in circumstances where conduct inherently causes serious harms, as was argued in Chapter Two. Further, both of these specified acts require that the subject see, or be likely to see, the IID. This would likely fail to capture several harmful manifestations of the phenomenon, such as when an IID is distributed on a private sharing site without any identifying information, and thus where the subject may never become aware of its existence.308 This raises related issues for the application of the

302 Harassment Act, s 3(1)-(3). See generally R v D, above n 300, at [12].

303 See generally Harvey, above n 135, at 326, 330; and Harvey and Tobin, above n 266, at [40.9.2].

304 Harassment Act, s 4(e). See also s 4(ea).

305 H v S [2000] DCR 90 (DC) at 93.

306 Harassment Act, s 2.

307 Barnes and Barraclough, above n 8, at [549]; and Harvey, above n 135, at 331.

308 See above discussion at page 35.

offence’s mens rea requirements, which constitute a motivation threshold. As was discussed in relation to s 22 of the HDCA, IIDs are published for a wide range of reasons which are not limited to intentions to cause harm or fear.309 Thus, requiring intention or knowledge that a victim will (reasonably) fear for their or their family’s safety both fails to adequately recognise the harms inherently caused by IID publication and is likely to exclude situations where perpetrators are economically or socially motivated.

Finally, it is worth noting that criminal harassment will not criminalise IID publication which occurs only once in a 12-month period, and is not a continuing act.310 What constitutes a continuing act – specifically, what will be a “protracted period” – is largely unclear, but would likely exclude a situation where an IID was posted online only for a short period of time and subsequently removed, if it did not have any form of online permanence.311

Thus, criminal harassment suffers from similar deficiencies to s 22 of the HDCA, in that even if IID publication broadly falls within the offence’s scope, the stringent nature of the offence’s elements makes it an inefficient solution in the context of IBSA.

V Conclusion

With reference to relevant offences in the Crimes Act, HDCA, FVPCA and Harassment Act, Chapter Four has comprehensively analysed whether New Zealand’s current criminal law can adequately address the publication of IIDs. I suggest that it cannot. Even where IID publication is not inherently excluded from these regimes, their applications are highly circumstantial and fail to address the harms inherent in IID publication. As New Zealand’s criminal legal landscape currently stands, any prospective prosecution of IID publication would be reliant on a piecemeal and arbitrary regulatory regime which places an unnecessary – and potentially harmful – burden on victims and the prosecution to establish the elements of any such offence.

309 See above discussion at page 35.

310 Harassment Act, s 3(2) and (4).

311 Harvey, above n 135, at 331.

Chapter Five – Recommendations for Legislative Reform

So far, this paper has argued that the publication of IIDs is a phenomenon which requires a comprehensive and effective criminal legal remedy, and that presently this remedy does not exist. While existing criminal, civil and self-regulatory mechanisms play an important role in combatting the publication of IIDs, they do not capture all of the harms inflicted or experienced to the necessary extent. Beyond the obvious issue of legal loopholes, reliance on what is currently a piecemeal patchwork of poorly-fitting offences inhibits the effective operation of the criminal law’s coercive and expressive powers.312 As argued in Chapter Three, both are critical in ensuring an adequate regulatory response to the publication of IIDs. In particular, a failure to clearly and unambiguously criminalise IID publication contributes to the continued minimisation and misunderstanding of the phenomenon and its harms, increasing the likelihood that it falls “through the cracks”.313 It would be undesirable, and frankly stubborn, to rely on these existing offences to regulate IID publication.

Instead, New Zealand must act with legislative vigilance. A targeted, for-purpose offence which explicitly criminalises the publication of IIDs can ensure the phenomenon is comprehensively and effectively addressed while also deterring and stigmatising prospective perpetrators, and expressing to victims and society more generally that this conduct is unacceptable.314

However, neither is criminalisation a panacea. Effective regulation of a novel technological phenomenon such as IIDs will require a combination of legal, non-legal and technological responses.315 Specifically, the government will need to support digital literacy initiatives which educate the public on deepfake identification and misuse,316 stimulate research into anti-

312 Guzman, above n 149, at 120.

313 Barnes and Barraclough, above n 9, at [636]. See also Guzman, above n 149, at 120; and Flynn and others, above n 54, at 1353.

314 For example, one victim, “Sarah”, was “initially reluctant to go to the police... because ‘no actual violence had been committed [and] there weren’t any real pictures of me’”. See Flynn and others, above n 54, at 1353.

315 Delfino, above n 71, at 933; Meskys and others, above n 35, at 30. See generally Legislation Guidelines, above n 26, at 115.

316 Westerlund, above n 107, at 45. For example, see Netsafe “Staying Safe: Understanding Deepfakes” (27 August 2020) <https://netsafe.org.nz/deepfakes/>; and Microsoft “New Steps to Combat Disinformation” (1 September 2020) <https://blogs.microsoft.com/on-the-issues/2020/09/01/disinformation-deepfakes-newsguard-video-authenticator/>.

deepfake technologies to assist with rapid detection and authentication317 and continue to incentivise online platforms to engage with self-regulation and enforceable undertakings.318 This is particularly necessary given that anonymity and attribution issues make the implementation, enforcement and policing of offences more difficult in the online space.319 However, given the value of the criminal law’s expressive function, attribution issues should not prevent the criminalisation of IID publication, in the same way they did not prevent the HDCA’s enactment.

Having mounted the argument for explicit criminal legislative reform in the abstract, I turn to several remaining questions regarding the scope of such an offence. Without purporting to generate a fully drafted legislative proposal, this Chapter provides some high-level recommendations as to how this offence should manifest, and its particular scope. I will then close with a brief discussion on whether, having argued for the criminalisation of IID publication, mere creation of IIDs should also attract criminal liability.

I Criminalising Publication

Because “criminal law marks the legal boundary of individual liberty”, prohibited conduct must be precisely defined.320 Thus, there must be specificity regarding what actions are prohibited (the actus reus) and the required culpability (the mens rea).321 Each will be considered in turn.

A Proposed Actus Reus

As was described in Chapter One, this paper has argued that the publication of involuntary, intimate deepfakes should attract criminal liability. I suggest that what is deemed “involuntary”

317 For example, see the United States Defence Advanced Research Projects Agency MediFor programme – William Corvey “Media Forensics (MediFor)” <https://www.darpa.mil/program/media-forensics>. See generally Chesney and Citron, above n 12, at 1787; Westerlund, above n 107, at 44; and Wang, above n 13, at 430.

318 For example, see Canada’s “multi-component” response to mitigate the risk of Deepfake interference in the 2019 election – B Siekierski Deep Fakes: What Can Be Done About Synthetic Audio and Video? (Economics, Resources and International Affairs Division Parliament of Canada, 4 August 2019) at 4. See generally Meskys and others, above n 35, at 31.

319 Nicola Henry, Asher Flynn and Anastasia Powell “Policing Image-Based Sexual Abuse: Stakeholder Perspectives” (2018) 19 Pol Pract Res 565 at 567; Harvey, above n 135, at 381; Chesney and Citron, above n 12, at 1792; Barnes and Barraclough, above n 9, at [647]; and Meskys and others, above n 35, at 26.

320 Legislation Guidelines, above n 26, at [24.2].

321 Legislation Guidelines, above n 26, at 121.

and “intimate” should be statutorily defined in line with that earlier discussion.322 Specifically, I suggest the intimate visual recording definitions cover the appropriate “types” of content which should also be deemed intimate in an IID context. Further, this would create alignment with the intimate visual recording offences, producing desirable legislative consistency between New Zealand’s IBSA regimes.323 However, given the difficulties identified in Chapter Four, the intimate visual recording definitions cannot be directly uplifted. This is particularly due to the requirement that the depicted content occur “in a place which, in the circumstances, would be reasonably expected to provide privacy”.324 While this qualifier is sensible in a capture-based scenario, it creates potentially harmful loopholes when applied to IIDs.325

This paper has also focused on the argument for criminalising the publication of IIDs. Because the dignity and privacy harms described in Chapter Two are not dependent on an IID’s publication being widespread, publication must be defined broadly. Specifically, the offence should recognise that significant harms can still be inflicted even where an IID is only published between perpetrator and subject. Section 216J of the Crimes Act, which prohibits the publication of intimate visual recordings, provides an expansive definition of publication which serves as a useful archetype – defining publication as displaying, sending, distributing by any means, conveying by electronic medium, or storing electronically in a way that is accessible to others. I suggest this definition would effectively capture all instances of IID publication capable of causing the serious harms which have been identified.

Further, based on my analysis in Chapter Four regarding s 22 of the HDCA, I argue that a prospective offence should not include any elements which relate to the impact the IID publication has on its subjects.326 These requirements would fail to reflect the inherent harms of IID publication on a subject’s sexual autonomy, dignity and privacy, thus unnecessarily and undesirably drawing the victim into the mechanics of a prosecution. This not only risks re-traumatising victims but may render the offence ineffectual by deterring victims from reporting their experiences in the first place.327

322 See above discussion at page 5.

323 See above discussion at page 31.

324 Crimes Act, s 216G(1)(a); and Harmful Digital Communications Act, s 4(a)(i).

325 See above discussion at page 32.

326 See above discussion at page 36.

327 See above discussion from page 35.

A more conceptually difficult consideration is how such an offence can or should define the types of content which are subject to it. As noted in Chapter One, deepfakes are not the only manifestation of hyper-realistic manipulated content, and thus an offence which refers to “deepfakes” or even “AI-generated content” would be too narrow.328 However, this raises a question of how hyper-realism should be defined. Manipulation alone cannot be the yardstick, as this could capture unrealistic cartoons or digital drawings which do not inflict the harms described in Chapter Two. But what is the desirable level of realism? From whose perspective should this be judged? Should a hyper-realistic deepfake which features something obviously false – such as it being set on Mars – attract criminal liability? These questions also have collateral implications in considering whether an IID should attract criminal liability if it is explicitly labelled as fake.329 New York and California have both introduced civil actions to address IIDs, which explicitly state that labelling manipulated content as fake is no defence.330 Further, the United Kingdom Law Commission has considered these questions in some depth, concluding that “the application of [an offence] should be limited to images that appear to be a real video or photo... [but] we do not want to exclude all intimate images that include something that is not realistic.”331 As an example, the Law Commission suggested that a hyper-realistic manipulated intimate image which had an emoji over the subject’s genitals should still attract criminal liability despite featuring something obviously false.332 To achieve this balance, the Law Commission recommended an offence which captured content that “appears to be an intimate image of a person”333 – terminology which was ultimately adopted into the Online Safety Bill 2022 (UK).334 However, this does not necessarily resolve the questions posed above, instead leaving substantial discretion with the courts in how they choose to determine what “appears to be an intimate image”.335 Though this is not necessarily an undesirable approach, it does create ambiguity as to the offence’s precise scope.

328 See above discussion at page 5.

329 See Kugler and Pace, above n, at 639-640.

330 NY Civ Rights Law § 52-C; and Cal Civ Code §§ 1708.85, 1708.86. See Appendix 2.

331 United Kingdom Law Commission Intimate Image Abuse: A Final Report (Law Com No 407, 2022) at [4.240].

332 United Kingdom Law Commission, above n 331, at [4.240].

333 United Kingdom Law Commission, above n 331, at [4.245]. See the Online Safety Bill 2022 (UK) (HL Bill 164) at cl 189: “Shows or appears to show...”. At the time of writing, the Online Safety Bill is awaiting Royal Assent.

334 United Kingdom Law Commission, above n 331, at [4.245]. See Online Safety Bill (UK), above n 333, at cl 189.

335 Online Safety Bill (UK), above n 333, at cl 189(1)-(4). See Appendix 2.

Fundamentally, these issues boil down to one question – is the publication of IIDs only seriously harmful when the viewer believes, or could believe, that the content is real? If so, in order to establish the positive case for criminalisation, criminal liability must turn on the content’s realism. In consultations undertaken by the United Kingdom Law Commission, many consultees expressed that the harms of IID publication were, in fact, tied to them being realistic.336 However, while the fact that an IID is known or suspected to be fake may mitigate reputational harms, I suggest that privacy and dignity harms can still occur. As summarised by Rüya Toparlak, “regardless of their level of realism, all pornographic deep fakes reduce victims to sex objects and violate the depicted person’s rights.”337 This also aligns with the findings of a United States survey conducted by Matthew Kugler and Carly Pace, wherein most participants felt that labelling a deepfake as fake had no significant impact on its “perceived harmfulness or blameworthiness.”338

Ultimately, it is clear the statutory language will need to strike a balance between excluding what is obviously unrealistic, while not requiring the viewer necessarily believe the IID is real. This will not be an easy task and will require significant consideration by drafters.

B Proposed Mens Rea

As a general rule, criminal offences “should include a mental element” to ensure individuals are only subject to criminal liability when they are at fault for prohibited conduct.339 Based on my discussion in Chapter Four, an IID publication offence should not include a motivation threshold, in the sense of requiring an intention to cause harm. Despite increasing recognition of how these thresholds prevent the effective enforcement of IBSA offences, Virginia’s newly enacted offence criminalises the publication of IIDs only if there is intent to “coerce, harass, or intimidate.”340 If a similar requirement were adopted in New Zealand, it would both constitute an undesirable and unnecessary hurdle for prosecutors, and create a lacuna where publication is motivated by one of the numerous other identified factors.341 Not requiring a motivation

336 United Kingdom Law Commission, above n 331, at [4.237].

337 Toparlak, above n 62, at 6-7. See also Citron, above n 44, at 1914.

338 Kugler and Pace, above n, at 673.

339 Legislation Guidelines, above n 26, at [24.3].

340 Va Code Ann § 18.2-386.2. See Appendix 2.

341 See above discussion at page 29. See also Kaylee Williams “Tightening Restrictions on Deepfake Porn: What US Lawmakers Could Learn from the UK” Tech Policy Press (United States, 25 July 2023); and Kaylee Williams “Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography” Tech Policy Press (online ed, United States, 15 May 2023).

threshold is also consistent with s 22A of the HDCA, which was explicitly enacted to address the difficulties of s 22’s motivation threshold in the context of IBSA.

Regarding what the mens rea should be, s 216J of the Crimes Act once again provides a useful archetype – requiring that the perpetrator know, or be reckless as to whether, the content they published was an intimate visual recording. I suggest this could be amended in the present context to require that a perpetrator know, or be reckless as to whether, the content they published was a hyper-realistic manipulated intimate image. This would strike the correct balance between requiring fault and adequately capturing the full scope of seriously harmful conduct.

C Practical Considerations

Finally, I recommend that the criminalisation of IID publication be enacted as a sui generis offence, rather than as an appendage to existing offences (namely, the intimate visual recording offences). The latter approach has been adopted by the United Kingdom,342 New South Wales343 and the Australian Capital Territory,344 where IIDs have been criminalised by amending existing intimate image offences. While this would be the most convenient method of criminalisation, I suggest it would create an offence which applied uncomfortably and unnaturally to IIDs, and which was potentially inconsistent with the rule of law.

Despite recognition by commentators that IBSA requires a coherent “specially tailored” regulatory response, attempts to “keep up” with the phenomenon have often resulted in “ad hoc, piecemeal or misplaced” legislation.345 I suggest incorporating IIDs into the existing intimate visual recording offences would have this effect. Fundamentally, while Chapter Two demonstrated that the serious harms inflicted by IID publications are comparable to those inflicted by capture-based manifestations of IBSA, there are material differences in how these harms are inflicted, differences which are reflected in the already mentioned need to amend the intimacy requirements in the intimate visual recording offences.346 Relatedly, the rule of law requires that criminal offences are differentiated to reflect meaningful distinctions in culpable

342 Online Safety Bill (UK), above n 333, at cl 189.

343 Crimes Act 1900 (NSW), s 91N.

344 Crimes Act 1900 (ACT), s 72A.

345 McGlynn and Rackley, above n 1, at 535. See also Gieseke, above n 70, at 1486.

346 See above discussion at page 50.

wrongdoing, enabling the law to be clear, certain and knowable.347 Thus, the principle of fair labelling prescribes that criminal offences should be named with “adequate precision as to the gist of their wrongdoing.”348 This not only generates certainty regarding which behaviours attract criminal liability, but also ensures wrongdoers’ convictions accurately reflect their wrongdoing.349 Though the two are superficially similar, I suggest that categorising IIDs as intimate visual recordings would not accurately reflect the wrongdoing involved.

D Conclusion

Having established the case for criminal legislative reform, I accept that precisely how the publication of IIDs should be criminalised requires further analysis and consideration. I have laid out some high-level recommendations based on my research which I believe will assist in ensuring a prospective offence is effective in addressing the significant harms at play. Particularly, legislative developments in this area must pay attention to the lessons which have been learned from criminalising IBSA generally – specifically, that the harm inherent in the conduct must be recognised, and that an offence must be designed to minimise the potential for re-traumatisation of victims.

II Criminalising Creation

Until this point, I have argued that the publication of IIDs should attract criminal liability. However, whether mere creation of IIDs should also be subject to criminalisation has not been addressed. Though the matter falls outside this paper’s direct scope, the following discussion serves to demonstrate why the regulation of creation and publication have been – and should be – considered separately.

The mere creation of IIDs is “more prevalent than people are likely aware of.”350 While many people’s moral intuition may be that IID creation is disturbing, immoral or wrong, whether it is seriously harmful in the manner necessary to establish the positive case for criminalisation

347 See generally Legislation Guidelines, above n 26, at [3.4], [4.1]; Simester and Brookbanks, above n 28, at 1026; and Citron, above n 44, at 1948.

348 Simester and Brookbanks, above n 28, at 1026.

349 Simester and Brookbanks, above n 28, at 1026. See generally Glanville Williams “Convictions and Fair Labelling” (1983) 42 CLJ 85.

350 United Kingdom Law Commission, above n 331, at [4.179].

is less clear.351 In 2022, the United Kingdom Law Commission recommended that mere creation should not attract criminal liability, a recommendation which ultimately resulted in its exclusion from the Online Safety Bill (UK).352 This recommendation was made on the basis that mere creation does not inflict serious harms which outweigh freedom of expression concerns, despite the Commission accepting that creation does constitute an invasion of the subject’s sexual autonomy.353

However, this conception is not undisputed. Internationally, politicians,354 academics355 and tech industry leaders356 have advocated that mere IID creation can inflict serious harms which justify criminalisation. Beyond these theoretical discussions, South Korea has already enacted legislation which criminalises both IID publication and creation.357 On this basis, I suggest New Zealand should not blindly follow in the United Kingdom’s footsteps.

It is arguable that the mere creation of IIDs still constitutes a harmful invasion of the individual’s private sphere by intruding upon their sexual autonomy. A potentially stronger argument can be mounted in terms of an IID’s potential for societal, rather than individual, harm. Carl Öhman argues that some form of regulatory intervention is necessary against mere IID creation on the basis that it has a significant potential for societal harm – demonstrated by the fact that mere IID creation still feels intuitively wrong even if it is never shared and the subject never knows of its existence.358 Specifically, Öhman describes that the gendered nature of the phenomenon means that at a societal level, mere IID creation systemically degrades women and perpetuates gender inequality.359 This is also the view of Ann Olivarius, who

351 See Öhman, above n 12, at 134.

352 United Kingdom Law Commission, above n 331, at [4.220].

353 United Kingdom Law Commission, above n 331, at [4.215]-[4.220]; and United Kingdom Law Commission Intimate Image Abuse: A Consultation Paper (Consultation Paper 253, 26 February 2021) at [7.96]-[7.98].

354 For example, see United Kingdom House of Commons Women and Equalities Committee Sexual Harassment of Women and Girls in Public Places (HC 701, 23 October 2018) at [52].

355 McGlynn and Rackley, above n 1, at 539, 556; and Meskys and others, above n 35, at 31. Further academic support was expressed in United Kingdom Law Commission, above n 331, at [4.210]-[4.212].

356 United Kingdom Law Commission Intimate Image Abuse: A Consultation Paper, above n 353, at [7.94].

357 Act On Special Cases Concerning the Punishment of Sexual Crimes 2010 (South Korea), art 14-2. See Appendix 2. China has also prohibited the mere creation of non-consensual deepfakes, and requires all synthetic content be labelled as such. For further discussion on IID creation, see Caroline Quirk “The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology” Princeton Legal Journal (19 June 2023); Emmie Hine and Luciano Floridi “New Deepfake Regulations in China are a Tool for Social Stability, but at What Cost?” (2022) 4 Mat Mach Intell 608 at 608; and Hugo Yeung and Brian Chan “In the News: Draft Trademark Law Reforms; China Ahead of the Curve in Regulating Deepfakes; and Guangdong GI Protection” (2023) China Law & Prac 1.

358 Öhman, above n 12, at 134, 137.

359 Öhman, above n 12, at 134, 138-139.

suggests that mere IID creation can harm society because it constitutes an “unmistakable form of violence” against the subject.360 Because subjects are disproportionately women, Olivarius suggests that this violence detrimentally impacts women as a societal group.361 Beyond the systemic degradation of women, I also suggest that unregulated IID creation can be harmful on the basis that it may normalise the utilitarian use of others and potentially lead to an increase in more severe sexual offending in the physical realm.362 This reasoning also underlies arguments for the criminalisation of fake pornographic material depicting children363 and of the sexual abuse of sex robots.364

Thus, while mere IID creation inflicts different – and arguably lesser – harms than publication, whether it should nonetheless be subject to regulation is more nuanced than the United Kingdom’s approach may lead us to believe. The question of IID creation therefore deserves further consideration in the New Zealand context.

360 United Kingdom Law Commission, above n 331, at [4.210].

361 United Kingdom Law Commission, above n 331, at [4.210].

362 Matthew L Williams and others “Hate in the Machine: Anti-Black and Anti-Muslim Social Media Posts as Predictors of Offline Racially and Religiously Aggravated Crime” (2020) 60 Brit J Criminol 93 at 114.

363 See generally Hadeel Al-Alosi The Criminalisation of Fantasy Material: Law and Sexually Explicit Representations of Fictional Children (New York, Routledge, 2018).

364 See generally John Danaher “Robotic Rape and Robotic Child Sexual Abuse: Should They be Criminalised?” (2017) 11 Crim Law Philos 71.

Concluding Remarks

The emergence of involuntary intimate deepfakes is an alarming reminder of the ever-evolving nature of image-based sexual abuse. In this dissertation, I have argued that to adequately and effectively address the serious harms inflicted by the publication of involuntary intimate deepfakes, this conduct must be criminalised. Currently, New Zealand’s criminal legal landscape does not achieve this clearly or comprehensively. Through a largely novel analysis of New Zealand’s relevant criminal regimes, I argue that the criminal law’s current capacity to address the publication of involuntary intimate deepfakes is not merely ambiguous, but inadequate. To describe the phenomenon’s legal status as ambiguous downplays the extensive issues with existing offences, generating an inaccurate façade that the criminal law could be adequate. Under such an approach, this façade is only lifted when the first victim is told by a court that their interests cannot be vindicated.

Consequently, this dissertation serves as a clarion call for legislative action. The potential risks and harms associated with the publication of involuntary intimate deepfakes demand a prompt and proactive legislative response in the form of an explicit, sui generis offence. In contemplating the path forward, the precise scope and content of this offence are up for debate, and I have set out some recommendations for how I consider it should manifest. In particular, any legislative response must be nuanced, balanced and informed by an understanding of the complexities surrounding this emergent issue.

As New Zealand grapples with incipient signs of involuntary intimate deepfakes, it is crucial for politicians and academics alike to engage in a concerted effort to safeguard New Zealand society from the insidious impacts of hyper-realistic manipulated intimate content. By explicitly criminalising the publication of involuntary intimate deepfakes, New Zealand can take a critical step toward addressing the risks posed by this emergent form of image-based sexual abuse.

Bibliography

A Cases
1 New Zealand

A v Hunt [2006] NZAR 577 (HC).

Benatzky v R [2018] NZCA 413.

Brittin v Police [2017] NZHC 2410, [2018] 2 NZLR 147.

Brooker v Police [2007] NZSC 30, [2007] 3 NZLR 91.

C v Holland [2012] NZHC 2155, [2012] 3 NZLR 672.

Chen v R [2019] NZCA 299, (2019) 29 CRNZ.

Commerce Commission v Fonterra Co-Operative Group Ltd [2007] NZSC 36, [2007] 3 NZLR 767.

Diffin v R [2013] NZCA 460, (2013) 26 CRNZ 368.

Donselaar v Donselaar [1982] NZCA 13; [1982] 1 NZLR 97 (CA).

Down v R [2012] NZSC 21, [2012] 2 NZLR 585.

H v S [2000] DCR 90 (DC).

Hosking v Runting [2004] NZCA 34, [2003] 3 NZLR 385.

Kirby v Police [2012] NZHC 2397, [2012] NZAR 975.

Living Word Distributors v Human Rights Action Group [2000] NZCA 179; [2000] 3 NZLR 570 (CA).

Mafart v Television New Zealand Ltd [2006] 3 NZLR 534 (CA).

McKenzie v Attorney General [1991] NZCA 105; [1992] 2 NZLR 14.

Moonen v Film and Literature Board of Review [1999] NZCA 329; [2000] 2 NZLR 9 (CA).

Morse v Police [2011] NZSC 45, [2012] 2 NZLR 1.

New Truth & TV Extra (4 November 1994 issue) (1996) 3 HRNZ 162 (FLBR).

Northern Milk Ltd v Northern Milk Vendors Association Inc [1988] NZCA 80; [1988] 1 NZLR 537 (CA).

Pearson v Police [2015] NZHC 410.

Police v B [2017] NZHC 526.

Police v News Media Ownership Ltd [1975] 1 NZLR 610 (CA).

R v D [2000] 2 NZLR 641 (CA).

R v Gordon-Smith (No 2) [2009] NZLR 725 (SC).

R v Pratt [1990] 2 NZLR 129 (CA).

Re Baise Moi [2005] NZAR 214 (CA).

Re Codd FLBR, 7 April 2017, 2017 WL 3845824.

Re Untitled Video Recording Public Swimming Pools At Papatoetoe Office of Film and Literature Classification 300507, 27 May 2003.

Seymour v R [2021] NZHC 2322.

Taylor v Attorney General [2014] NZHC 2225, (2014) 10 HRNZ 31.

Terminals (NZ) Ltd v Comptroller of Customs [2013] NZSC 139, [2014] 1 NZLR 121.

Tot Toys Ltd v Mitchell [1992] NZHC 2902; [1993] 1 NZLR 325 (HC).

Wood-Luxford v Wood [2013] NZSC 153, [2014] 1 NZLR 451.

Worldwide NZ LLC v NZ Venue and Event Management Ltd [2014] NZSC 108; [2015] 1 NZLR 1 (SC).

X v R [2021] NZCA 331.

2 England and Wales

Bourne v Norwich Crematorium Ltd [1967] 2 All ER 576 (Ch).

C v Director of Public Prosecutions [1995] UKHL 15; [1996] AC 1 (HL).

R v Rimmington [2005] UKHL 63, [2006] 1 AC 459.

Sweet v Parsley [1969] UKHL 1; [1970] AC 132 (HL).

ZXC v Bloomberg [2022] UKSC 5.

3 United States

Ettore v Philco Television Broadcasting Co [1956] USCA3 69; 229 F 2d 481 (3d Cir 1956).

B Legislation
1 New Zealand

Crimes (Intimate Covert Filming) Amendment Act 2006.

Crimes Act 1961.

Evidence Act 2006.

Films, Videos, and Publications Classification Act 1993.

Films, Videos, and Publications Classification Amendment Act 2005.

Harassment Act 1997.

Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Act 2022.

Harmful Digital Communications Act 2015.

Legislation Act 2019.

New Zealand Bill of Rights Act 1990.

Sexual Violence Legislation Act 2021.

2 Australia

Crimes Act 1900 (NSW).

Crimes Act 1900 (ACT).

3 United Kingdom

Online Safety Bill 2022 (UK) (HL Bill 164).

4 United States

Civil Code of the State of California.

Code of the State of Virginia.

Consolidated Laws of New York, Civil Rights Law.

5 South Korea

Act On Special Cases Concerning the Punishment of Sexual Crimes 2010.

C Books and Chapters in Books

Hadeel Al-Alosi The Criminalisation of Fantasy Material: Law and Sexually Explicit Representations of Fictional Children (New York, Routledge, 2018).

Rebecca Bonnevie “Privacy and Emerging Technologies” in Nikki Chamberlain and Stephen Penk (eds) Privacy - A to Z of New Zealand Law (online ed, Thomson Reuters, 2023).

Sean Brennan and Geoff McLay “Defamation, Privacy and the Intentional Infliction of Emotional Distress Meet the Internet and the Harmful Digital Communications Act” in Andrew Barker and Geoff McLay (eds) Torts Update (Wellington, New Zealand Law Society, 2016).

Roger Brownsword and Morag Goodwin Law and the Technologies of the Twenty-First Century: Text and Materials (Cambridge, Cambridge University Press, 2012).

Roger Brownsword Law 3.0: Rules, Regulation, and Technology (London, Routledge, 2020).

Godwin Busuttil, Felicity McMahon, and Gervase de Wilde “Privacy, The Internet, and Social Media” in Nicole Moreham, Mark Warby, Michael Tugendhat and Iain Christie (eds) Tugendhat and Christie: The Law of Privacy and the Media (3rd ed, United Kingdom, Oxford University Press, 2016).

Andrew Burrows The Hamlyn Lectures: Thinking about Statutes – Interpretation, Interaction, Improvement (Cambridge, Cambridge University Press, 2018).

Ross Carter Burrows and Carter Statute Law in New Zealand (6th ed, Wellington, LexisNexis, 2021).

Nikki Chamberlain “Privacy and Social Media” in Nikki Chamberlain and Stephen Penk (eds) Privacy - A to Z of New Zealand Law (online ed, Thomson Reuters, 2023).

Ursula Cheer and Stephen Todd “Invasion of Privacy” in Stephen Todd Todd on Torts (8th ed, Wellington, Thomson Reuters, 2019).

Ursula Cheer Burrows and Cheer: Media Law in New Zealand (8th ed, Wellington, LexisNexis, 2021).

Suzie Dunn “Is It Actually Violence? Framing Technology-Facilitated Abuse as Violence” in Asher Flynn, Nicola Henry and Jane Bailey (eds) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Bingley, Emerald Publishing Ltd, 2021).

Suzie Dunn Technology-Facilitated Gender-Based Violence: An Overview (Canada, Centre for International Governance Innovation, 2021).

Joel Feinberg The Moral Limits of the Criminal Law: Harm to Others – vol 1 (Oxford University Press, New York, 1984).

Colin Gavaghan, Alastair Knott, James MacLaurin, John Zerilli and Joy Liddicoat Government Use of Artificial Intelligence in New Zealand (Wellington, New Zealand Law Foundation, 2019).

Susan Glazebrook “Filling the Gaps” in Rick Bigwood (ed), The Statute: Making and Meaning (Wellington, LexisNexis, 2004).

Nikki Godden-Rasul “Retribution, Redress, and the Harms of Rape: The Role of Tort Law” in Anastasia Powell, Nicola Henry and Asher Flynn (eds) Rape Justice: Beyond The Criminal Law (London, Palgrave Macmillan, 2015).

David Harvey and Rosemary Tobin “New Media and the Challenge of Convergence” in David Harvey, Rosemary Tobin and Paul Sumpter Media Law - A to Z of New Zealand Law (online ed, Thomson Reuters, 2017).

David Harvey internet.law.nz: Selected Legal Issues For The Digital Paradigm (5th ed, Wellington, LexisNexis, 2023).

Nicola Henry, Clare McGlynn, Asher Flynn, Kelly Johnson, Anastasia Powell and Adrian Scott Image-Based Sexual Abuse: A Study on the Causes and Consequences of Non-Consensual Nude or Sexual Imagery (London, Routledge, 2021).

Julie Inness Privacy, Intimacy and Isolation (Oxford University Press, New York, 1992).

Immanuel Kant Lectures on Ethics (New York, Cambridge University Press, 1997).

Lara Karaian “The Troubled Relationship of Feminist and Queer Legal Theory to Strategic Essentialism: Theory/ Praxis, Queer Porn, and Canadian Anti-Discrimination Law” in Martha Fineman, Jack Jackson and Adam Romero (eds) Feminist and Queer Legal Theory: Intimate Encounters, Uncomfortable Conversations (Surrey, Ashgate Publishing Limited, 2009).

Elizabeth McDonald Rape Myths as Barriers to Fair Trial Process (Christchurch, Canterbury University Press, 2020).

Nicole Moreham “The Nature of the Privacy Interest” in Nicole Moreham, Mark Warby, Michael Tugendhat and Iain Christie (eds) Tugendhat and Christie: The Law of Privacy and the Media (3rd ed, United Kingdom, Oxford University Press, 2016).

Vanessa E Munro “Dev’l-in Disguise? Harm, Privacy and the Sexual Offences Act 2003” in Vanessa E Munro and Carl F Stychin (eds) Sexuality and the Law: Feminist Engagements (London, Routledge, 2007).

Frances E. Olsen Feminist Legal Theory II: Positioning Feminist Theory Within the Law (Dartmouth, Dartmouth Publishing Company Limited, 1995).

Stephen Penk and Natalya King “Common Law Privacy Protection in Other Jurisdictions” in Nikki Chamberlain and Stephen Penk (eds) Privacy - A to Z of New Zealand Law (online ed, Thomson Reuters, 2023).

John Rawls Political Liberalism (2nd ed, New York, Columbia University Press, 1996).

Andrew Simester and Warren Brookbanks Principles of Criminal Law (4th ed, Thomson Reuters, Wellington, 2012).

John Stuart Mill On Liberty (Longman, Roberts & Green, London, 1869).

Madhavi Sunder Gender and Feminist Theory in Law and Society (Hampshire, Ashgate Publishing Limited, 2007).

Kristen Thomasen and Suzie Dunn “Reasonable Expectations of Privacy in an Era of Drones and Deepfakes: Expanding the Supreme Court of Canada’s Decision in R v Jarvis” in Jane Bailey, Asher Flynn and Nicola Henry (eds) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Bingley, Emerald Publishing Ltd, 2021).

Julia Tolmie, Kris Gledhill, Fleur Te Aho and Khylee Quince Criminal Law in Aotearoa, New Zealand (Wellington, LexisNexis, 2022).

Raymond Wacks The Protection of Privacy (London, Sweet & Maxwell, 1980).

Alan Westin Privacy and Freedom (New York, Athenaeum, 1967).

Glanville Williams Textbook of Criminal Law (4th ed, London, Thomson Reuters, 2015).

Lucia Zedner and Julian V Roberts Principles and Values in Criminal Law and Criminal Justice: Essays in Honour of Andrew Ashworth (Oxford, Oxford University Press, 2012).

D Journal Articles

Mirko Bagaric and James Allan “The Vacuous Concept of Dignity” (2006) 5 J Hum Rights 257.

Samantha Bates “Revenge Porn and Mental Health: A Qualitative Analysis of the Mental Health Effects of Revenge Porn on Female Survivors” (2017) 12 Fem Criminol 22.

Julia Black “Critical Reflections on Regulation” (2002) 27 AJLP 1.

Edward Bloustein “Privacy as an Aspect of Human Dignity: An Answer to Dean Prosser” (1964) 39 NYUL 962.

Naomi Cahn “Looseness of Legal Language: The Reasonable Woman Standard in Theory and in Practice” (1991) 77 Cornell L Rev 1398.

Federica Celli “Deepfakes are Coming: Does Australia Come Prepared?” (2020) 17 Canb LR 193.

Nikki Chamberlain “Misappropriation of Personality: A Case for Common Law Identity Protection” (2021) 26 TLJ 195.

Bobby Chesney and Danielle Citron “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security” (2019) 107 CLR 1753.

Jonathan Clough “Revenge Porn: Criminal Law Responses” (2016) 132 Precedent 30.

Michael Cusumano, Annabelle Gawer and David Yoffie “Can Self-Regulation Save Digital Platforms?” (2021) 30 Ind Corp Change 1259.

John Danaher “Robotic Rape and Robotic Child Sexual Abuse: Should They be Criminalised?” (2017) 11 Crim Law Philos 71.

Jared de Guzman “Saving Face: Lessons from the DMCA for Combatting Deepfake Pornography” (2022) 58 Gonz L Rev 109.

Rebecca Delfino “Pornographic Deepfakes: The Case for Federal Criminalization of Revenge Porn’s Next Tragic Act” (2019) 88 Fordham L Rev 887.

Asher Flynn, Anastasia Powell, Adrian Scott and Elena Cama “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse” (2021) 62 Brit J Criminol 1341.

Mary Anne Franks and Ari Ezra Waldman “Sex, Lies, and Videotape: Deep Fakes and Free Speech Delusions” (2019) 78 Md L Rev 892.

Alisdair Gillespie “‘Trust Me, It's only for Me’: ‘Revenge Porn’ and the Criminal Law” (2015) 11 Crim LR 866.

Douglas Harris “Deepfakes: False Pornography is Here and the Law Cannot Protect You” (2018) 17 Duke L & Tech Rev 99.

Nicola Henry, Asher Flynn and Anastasia Powell “Policing Image-Based Sexual Abuse: Stakeholder Perspectives” (2018) 19 Pol Pract Res 565.

Nicola Henry and Asher Flynn “Image-Based Sexual Abuse: Online Distribution Channels and Illicit Communities of Support” (2019) 25 Violence Against Wom 1932.

Anna High “Sexual Dignity and Rape Law” (2022) 33 Yale J L & Feminism 1.

Emmie Hine and Luciano Floridi “New Deepfake Regulations in China are a Tool for Social Stability, but at What Cost?” (2022) 4 Nat Mach Intell 608.

Danielle Keats Citron and Mary Anne Franks “Criminalizing Revenge Porn” (2014) 49 Wake Forest L Rev 345.

Danielle Keats Citron “Addressing Cyber Harassment: An Overview of Hate Crimes in Cyberspace” (2015) 6 JOLTI 1.

Danielle Keats Citron “Sexual Privacy” (2019) 128 Yale L J 1870.

Liz Kelly “The Continuum of Sexual Violence” in Jalna Hanmer and Mary Maynard (eds) Women, Violence and Social Control (Palgrave Macmillan, London, 1987).

Kathleen Kenealy “Sexual Harassment and the Reasonable Woman Standard” (1992) 8 Lab Law 203.

Nancy Kim “Web Site Proprietorship and Online Harassment” (2009) Utah L Rev 993.

Matthew Kugler and Carly Pace “Deepfake Privacy: Attitudes and Regulation” (2021) 116 NWULR 611.

Nicola Lacey “Theory into Practice? Pornography and the Public/Private Dichotomy” (1993) 20 JLS 93.

Lyrissa Lidsky “Prying, Spying and Lying: Intrusive Newsgathering and What the Law Should Do About It” (1999) 73 Tul L Rev 173.

Taylor Linkous “It’s Time for Revenge Porn to Get a Taste of Its Own Medicine: An Argument for the Federal Criminalization of Revenge Porn” (2014) 20 Rich JL & Tech 14.

Taylor Matthews “Deepfakes, Intellectual Cynics, and the Cultivation of Digital Sensibility” (2022) 92 Royal Institute of Philosophy Supplement 67.

Clare McGlynn and Erika Rackley “Image-Based Sexual Abuse” (2017) 37 OJLS 534.

Clare McGlynn, Kelly Johnson, Erika Rackley, Nicola Henry, Nicola Gavey, Asher Flynn and Anastasia Powell “‘It’s Torture for the Soul’: The Harms of Image-Based Sexual Abuse” (2021) 30 Soc Leg Stud 541.

Edvinas Meskys, Julija Kalpokiene, Paulius Jurcys and Aidas Liaudanskas “Regulating Deep Fakes: Legal and Ethical Considerations” (2020) 15 JIPLP 24.

Nicole Moreham “Beyond Information: Physical Privacy in English Law” (2014) 73 CLJ 350.

Marjan Nadim and Audun Fladmoe “Silencing Women? Gender and Online Harassment” (2019) 39 Soc Sci Comput Rev 245.

Mary Neal “Dignity, Law and Language-Games” (2012) 25 IJSL 107.

Martha C Nussbaum “Objectification” (1995) 24 Phil & Pub Aff 249.

Carl Öhman “Introducing the Pervert’s Dilemma: A Contribution to the Critique of Deepfake Pornography” (2020) 22 Ethics Inf Technol 133.

Stephanie Panzic “Legislating for E-Manners: Deficiencies and Unintended Consequences of the Harmful Digital Communications Act” (2015) 21 AULR 225.

Wendy Parker “The Reasonable Person: A Gendered Concept Claiming the Law” (1993) 28 VUWLawRw 105.

Anne Pechenik Gieseke “‘The New Weapon of Choice’: Law's Current Inability to Properly Address Deepfake Pornography” (2020) 73 Vand L Rev 1479.

Patrick Phelan “Are the Current Legal Responses to Artificial Intelligence Facilitated ‘Deepfake’ Pornography Sufficient to Curtail the Inflicted Harm?” (2022) 9 NELR 20.

William L Prosser “Privacy” (1960) 48 Cal L Rev 383.

Caroline Quirk “The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology” Princeton Legal Journal (19 June 2023).

John Rawls “Kantian Constructivism in Moral Theory” (1980) 77 J Phil 515.

Beth Richie “A Black Feminist Reflection on the Antiviolence Movement” (2000) 25 Signs 1133.

Regina Rini and Leah Cohen “Deepfakes, Deep Harms” (2022) 22 JSEP 143.

Olivia Smith and Tina Skinner “Observing Court Responses to Victims of Rape and Sexual Assault” (2012) 7 Fem Criminol 298.

Natalie Sokoloff and Ida DuPont “Domestic Violence and the Intersections of Race, Class, and Gender” (2005) 11 Violence Against Wom 38.

Russell Spivak “‘Deepfakes’: The Newest Way to Commit One of the Oldest Crimes” (2019) 3 Geo L Tech Rev 339.

Cass Sunstein “On the Expressive Function of Law” (1996) 144 U Pa L Rev 2021.

Rüya Toparlak “Criminalising Deep Fake Pornography: A Gender-Specific Analysis of Image-Based Sexual Abuse” (2023) 1 cognitio 1.

Andrew von Hirsch and Nils Jareborg “Gauging Criminal Harm: A Living-Standard Analysis” (1991) 11 OJLS 1.

James Waldo, Herbert Lin and Lynette Millett “Engaging Privacy and Information Technology in a Digital Age: Executive Summary” (2010) 2 J Priv Confidentiality 1.

Jeremy Waldron “Dignity and Defamation: The Visibility of Hate” (2010) 123 Harv L Rev 1596.

Moncarol Wang “Don’t Believe Your Eyes: Fighting Deepfaked Nonconsensual Pornography with Tort Law” (2022) U Chi Legal F 415.

Samuel Warren and Louis Brandeis “The Right to Privacy” (1890) 4 Harv L Rev 193.

Mika Westerlund “The Emergence of Deepfake Technology” (2019) 9 Technol Innov Manag Rev 39.

Glanville Williams “Convictions and Fair Labelling” (1983) 42 CLJ 85.

Matthew L Williams, Pete Burnap, Amir Javed, Han Liu and Sefa Ozalp “Hate in the Machine: Anti-Black and Anti-Muslim Social Media Posts as Predictors of Offline Racially and Religiously Aggravated Crime” (2020) 60 Brit J Criminol 93.

David Wilson “Responding To The Challenges: Recent Developments in Censorship Policy in New Zealand” (2007) 30 Soc Polic J NZ 65.

Helen Winkelmann “Access to Justice: Who Needs Lawyers” [2014] OtaLawRw 2; (2014) 13 Otago LR 229.

Hugo Yeung and Brian Chan “In the News: Draft Trademark Law Reforms; China Ahead of the Curve in Regulating Deepfakes; and Guangdong GI Protection” (2023) China Law & Prac 1.

C Parliamentary and Government Materials

1 New Zealand

(2 December 1992) 532 NZPD.

(22 June 1993) 536 NZPD.

(29 July 1993) 537 NZPD.

(17 August 1993) 537 NZPD.

(5 May 2005) 635 NZPD.

(14 March 2006) 629 NZPD.

(21 November 2006) 635 NZPD.

(10 November 2021) 755 NZPD.

(15 February 2022) 757 NZPD.

(2 March 2022) 757 NZPD.

Brainbox “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Classification Office “Submission to the Justice Committee on the Harmful Digital Communication (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021”.

Crimes (Intimate Covert Filming) Amendment Bill 2005 (257-1) (explanatory note).

Departmental Report: Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill (Ministry of Justice, June 2021).

Films, Videos, and Publications Classification Bill 1992 (select committee report).

For Your Information: Australian Privacy Law and Practice Report (Australian Law Reform Commission, ALRC 108, May 2008).

Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2) (select committee report).

Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-1) (explanatory note).

Law Commission Harmful Digital Communications: The Adequacy of the Current Sanctions and Regimes (August 2012).

Law Commission Intimate Covert Filming (NZLC SP15, 2004).

Law Commission The Justice Response to Victims of Sexual Violence: Criminal Trials and Alternative Processes (NZLC R136, 2015).

Legislation Guidelines (Legislation Design and Advisory Committee, September 2021).

Clare McGlynn, Erika Rackley and Kelly Johnson “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Ministry of Justice “Key Initiatives: Improving Access to Civil Justice” (20 September 2023) <https://www.justice.govt.nz/justice-sector-policy/key-initiatives/access-to-civil-justice/>

Ministry of Justice “Wayfinding for Civil Justice – Imagining a Better Way of Working Together to Improve Access to Civil Justice in Aotearoa New Zealand” (20 September 2023) <https://www.justice.govt.nz/justice-sector-policy/key-initiatives/access-to-civil-justice/>

Ministry of Justice Initial Briefing: Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill (4 May 2021).

National Network Ending Sexual Violence Together “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Netsafe “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

New Zealand Law Society “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Office of the Privacy Commissioner “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Rules Committee Improving Access to Civil Justice (November 2022).

Stace Hammond “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

Supplementary Order Paper 2022 (103) Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2).

Supplementary Order Paper 2021 (83) Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill 2021 (305-2).

YouthLaw Aotearoa “Submission to the Justice Committee on the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Bill”.

2 United Kingdom

United Kingdom House of Commons Women and Equalities Committee Sexual Harassment of Women and Girls in Public Places (HC 701, 23 October 2018).

United Kingdom Law Commission Intimate Image Abuse: A Consultation Paper (Consultation Paper 253, 26 February 2021).

United Kingdom Law Commission Intimate Image Abuse: A Final Report (Law Com No 407, 2022).

D Reports

2016/2017 Annual Report of the Office of Film and Literature Classification (2017).

Henry Ajder and others The State of Deepfakes: Landscape, Threats and Impact (Deeptrace, September 2019).

Curtis Barnes and Tom Barraclough Perception Inception: Preparing for Deepfakes and the Synthetic Media of Tomorrow (Brainbox, New Zealand, 2019).

Miriam Lips and Elizabeth Eppel Mapping Media Content Harms: A Report Prepared for Department of Internal Affairs (Victoria University of Wellington Te Herenga Waka, 22 September 2022).

New Zealand Law Society Access To Justice: Stocktake of Initiatives (December 2020).

Office of Film and Literature Classification Annual Report 2017/2018 (2018).

Office of Film and Literature Classification Annual Report 2018/2019 (2019).

Report of the Secretary-General UN Doc A/77/302 (18 August 2022).

BJ Siekierski Deep Fakes: What Can Be Done About Synthetic Audio and Video? (Economics, Resources and International Affairs Division, Parliament of Canada, 4 August 2019).

Te Kura Kaiwhakawā Institute of Judicial Studies Responding To Misconceptions About Sexual Offending: Example Directions for Judges and Lawyers (August 2023).

The Code: Aotearoa New Zealand Code of Practice for Online Safety Harms (July 2022).

E Newspaper Articles

“AI-Generated Nude Images Spread at Spain School, Parents Outraged” 1News (online ed, New Zealand, 24 September 2023).

Rana Ayyub “In India, Journalists Face Slut-Shaming and Rape Threats” The New York Times (online ed, New York, 22 May 2018).

Sara Barker “The Deepfake Dilemma: How it Affects Privacy, Security and Law in Aotearoa” FutureFive (online ed, New Zealand, 17 November 2021).

Brianna Barraclough “A Broken Law: Rise of For-Profit Porn Sites Leaving ‘Revenge Porn’ Victims Stuck” 1News (online ed, New Zealand, 18 June 2021).

Helen Busby “Deepfake Porn Documentary Explores its ‘Life Shattering’ Impact” BBC News (online ed, United Kingdom, 18 June 2023).

Samantha Cole “Targets of Fake Porn Are At the Mercy of Big Platforms” Vice (online ed, United States, 6 February 2018).

Jesslyn Cook “Here's What It's Like to See Yourself in a Deepfake Porn Video” Huffington Post (online ed, United States, 23 June 2019).

Sophie Cornish “Law Loopholes Around ‘Deepfakes’ A Threat to Justice, Police and Law Experts Warn” Stuff (online ed, New Zealand, 30 July 2022).

David Court “Deepfake Videos Coming to a Social Media Platform You Use” Stuff (online ed, New Zealand, 16 June 2019).

Megan Farokhmanesh “Is It Legal to Swap Someone's Face into Porn Without Consent?” The Verge (online ed, United States, 31 January 2018).

Sophie Harris “Calls from MPs and Survivor for Protections for ‘Deepfake Porn’ Victims” NZ Herald (online ed, New Zealand, 7 December 2021).

Sophie Harris “Woman Left Distraught After Stalker Edited Her Face into Porn Video, Put it Online” Stuff (online ed, New Zealand, 1 July 2022).

Drew Harwell “Fake-Porn Videos Are Being Weaponised to Harass and Humiliate Women” Stuff (online ed, New Zealand, 31 December 2018).

Derek Hawkins “Reddit Bans ‘Deepfakes,’ Pornography Using the Faces of Celebrities” Stuff (online ed, New Zealand, 9 February 2018).

Guy Hedgecoe “AI-Generated Naked Child Images Shock Spanish Town of Almendralejo” BBC News (online ed, United Kingdom, 24 September 2023).

Finn Hogan “Experts Concerned Over the Rise of Deepfake Technology” Newshub (online ed, New Zealand, 29 October 2022).

Hannah Hudnall “Fact Check: Deepfake Video Shows Vladimir Putin Discussing Democracy, is from 2020 Ad Campaign” USA Today (online ed, United States, 19 April 2023).

Katie Kenny “Chief Censor David Shanks Says An Entirely New Media Regulator May be Needed” Stuff (online ed, New Zealand, 23 October 2019).

Brittany Keogh “Deepfakes: New Zealand Experts on How ‘Face-Swap’ Could Turn Sinister” Stuff (online ed, New Zealand, 22 March 2020).

Tess McClure “New Zealand ‘Revenge Porn’ Laws in Spotlight Amid Accusations Against Former National Candidate” The Guardian (online ed, New Zealand, 3 June 2021).

Mediawatch “Battle Against Online Harm Beefs Up Censor’s Power” RNZ (New Zealand, 21 March 2021).

“NZ’s First Case of Deepfake Pornography Triggering Alarm Bells for Officials” 1News (online ed, New Zealand, 1 August 2020).

AP “Deepfake Porn Could be a Growing Problem Amid AI Race” NewstalkZB (online ed, New Zealand, 17 April 2023).

Simon Shepherd “Digital Experts Worried About Growth of ‘Deep Fakes’ Videos” Newshub (online ed, New Zealand, 2 April 2018).

“Spanish Prosecutor Investigates If Shared AI Images of Naked Girls Constitute a Crime” The Guardian (online ed, United Kingdom, 25 September 2023).

Kaylee Williams “Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography” Tech Policy Press (online ed, United States, 15 May 2023).

Kaylee Williams “Tightening Restrictions on Deepfake Porn: What US Lawmakers Could Learn from the UK” Tech Policy Press (online ed, United States, 25 July 2023).

F Internet Resources

BuzzFeedVideo “You Won’t Believe What Obama Says in This Video!” (18 April 2018) YouTube <https://www.youtube.com/watch?v=cQ54GDm1eL0>

Department of Internal Affairs “Media and Online Content Regulation: Safer Online Services and Media Platforms Review” <https://www.dia.govt.nz/media-and-online-content-regulation>

Discord “Discord Community Guidelines” (24 February 2023) <https://discord.com/guidelines>

Google Search Help “Remove Explicit or Intimate Personal Images From Google” <https://support.google.com/websearch/answer/6302812?sjid=7547990195809157247-AP>

Instagram “Community Guidelines” <https://help.instagram.com/477434105621119>

Antonia M “Does New Zealand Need a Specific Law For Deepfakes?” (6 September 2019) LinkedIn <https://www.linkedin.com/pulse/does-new-zealand-need-specific-law-deepfakes-antonia-modkova/>

Merriam-Webster “Words We’re Watching: ‘Streisand Effect’: Don’t Try To Keep This Under Wraps” <https://www.merriam-webster.com/wordplay/words-were-watching-streisand-effect-barbra>

Microsoft “New Steps to Combat Disinformation” (1 September 2020) <https://blogs.microsoft.com/on-the-issues/2020/09/01/disinformation-deepfakes-newsguard-video-authenticator/>

Netsafe “Staying Safe: Understanding Deepfakes” (27 August 2020) <https://netsafe.org.nz/deepfakes/>

Karen Ngan and Michelle Dunlop “Pictures Don’t Lie, or Do They?” (20 May 2022) Simpson Grierson <https://www.simpsongrierson.com/insights-news/legal-updates/pictures-dont-lie-or-do-they>

Pornhub “Terms of Service” (17 August 2023) <https://www.pornhub.com/information/terms#terms>

Reddit “Never Post Intimate or Sexually Explicit Media of Someone Without Their Consent” (July 2023) <https://support.reddithelp.com/hc/en-us/articles/360043513411>

The Code “About The Code” <https://thecode.org.nz/about-the-code/>

TikTok “Community Guidelines: Integrity and Authenticity” (March 2023) <https://www.tiktok.com/community-guidelines/en/integrity-authenticity/>

William Corvey “Media Forensics (MediFor)” United States Defense Advanced Research Projects Agency <https://www.darpa.mil/program/media-forensics>

Rob Vaughan “Glaring Gaps in Harmful Digital Communications Bill” (12 November 2021) Stace Hammond <https://www.stacehammond.co.nz/glaring-gaps-in-harmful-digital-communications-bill/>

X Help Center “Sensitive Media Policy” (March 2023) <https://help.twitter.com/en/rules-and-policies/media-policy>

G Other Resources

Ross Carter and Jason McHerron Seminar: Statutory Interpretation Update (New Zealand Law Society CLE Ltd, June 2016).

Classification Office Statement of Intent (2021 – 2025) (February 2021).

Joanne Conaghan “Civil Liability: Addressing Police Failures in the Context of Rape, Domestic and Sexual Abuse” (Inaugural Lecture, University of Bristol, 19 February 2015).

Email from Arran Hunt (Partner at McVeagh Fleming) to Bella Stuart regarding Hunt’s views of the applicability of Harmful Digital Communications Act 2015 to IIDs (17 April 2023).

Oxford English Dictionary (Online Database) “Recording” accessed 26 June 2023.

APPENDIX 1 – New Zealand Legislation

I Crimes Act 1961

216G Intimate visual recording defined

(1) In sections 216H to 216N, intimate visual recording means a visual recording (for example, a photograph, videotape, or digital image) that is made in any medium using any device without the knowledge or consent of the person who is the subject of the recording, and the recording is of—

(a) a person who is in a place which, in the circumstances, would reasonably be expected to provide privacy, and that person is—
(i) naked or has his or her genitals, pubic area, buttocks, or female breasts exposed, partially exposed, or clad solely in undergarments; or

(ii) engaged in an intimate sexual activity; or

(iii) engaged in showering, toileting, or other personal bodily activity that involves dressing or undressing; or

(b) a person’s naked or undergarment-clad genitals, pubic area, buttocks, or female breasts which is made—

(i) from beneath or under a person’s clothing; or

(ii) through a person’s outer clothing in circumstances where it is unreasonable to do so.

(2) In section 216H, intimate visual recording includes an intimate visual recording that is made and transmitted in real time without retention or storage in—

(a) a physical form; or

(b) an electronic form from which the recording is capable of being reproduced with or without the aid of any device or thing.

216J Prohibition on publishing, importing, exporting, or selling intimate visual recording

(1) Everyone is liable to imprisonment for a term not exceeding 3 years who, knowing that a visual recording is an intimate visual recording, or being reckless as to whether a visual recording is an intimate visual recording,—

(a) publishes in New Zealand the intimate visual recording:

(b) imports into New Zealand the intimate visual recording:

(c) exports from New Zealand the intimate visual recording:

(d) sells the intimate visual recording.

(2) In this section, unless the context otherwise requires,—

publishes means any of the following:

(a) displays by any means:

(b) sends to any person by any means:

(c) distributes by any means:

(d) conveys by electronic medium:

(e) stores electronically in a way that is accessible by any other person or persons

sells means sells in a physical form or by electronic medium, and includes—

(a) offers for sale:

(b) agrees to sell.

II Harmful Digital Communications Act 2015

3 Purpose

The purpose of this Act is to—

(a) deter, prevent, and mitigate harm caused to individuals by digital communications; and

(b) provide victims of harmful digital communications with a quick and efficient means of redress.

4 Interpretation

intimate visual recording—

(a) means a visual recording (for example, a photograph, videotape, or digital image) that is made in any medium using any device with or without the knowledge or consent of the individual who is the subject of the recording, and that is of—

(i) an individual who is in a place which, in the circumstances, would reasonably be expected to provide privacy, and the individual is—

(A) naked or has his or her genitals, pubic area, buttocks, or female breasts exposed, partially exposed, or clad solely in undergarments; or

(B) engaged in an intimate sexual activity; or

(C) engaged in showering, toileting, or other personal bodily activity that involves dressing or undressing; or

(ii) an individual’s naked or undergarment-clad genitals, pubic area, buttocks, or female breasts which is made—

(A) from beneath or under an individual’s clothing; or

(B) through an individual’s outer clothing in circumstances where it is unreasonable to do so.

(b) includes an intimate visual recording that is made and transmitted in real time without retention or storage in—

(i) a physical form; or

(ii) an electronic form from which the recording is capable of being reproduced with or without the aid of any device or thing

22 Causing harm by posting digital communication

(1) A person commits an offence if—

(a) the person posts a digital communication with the intention that it cause harm to a victim; and

(b) posting the communication would cause harm to an ordinary reasonable person in the position of the victim; and

(c) posting the communication causes harm to the victim.

(2) In determining whether a post would cause harm, the court may take into account any factors it considers relevant, including—

(a) the extremity of the language used:

(b) the age and characteristics of the victim:

(c) whether the digital communication was anonymous:

(d) whether the digital communication was repeated:

(e) the extent of circulation of the digital communication:

(f) whether the digital communication is true or false:

(g) the context in which the digital communication appeared.

(3) A person who commits an offence against this section is liable on conviction to,—

(a) in the case of a natural person, imprisonment for a term not exceeding 2 years or a fine not exceeding $50,000:

(b) in the case of a body corporate, a fine not exceeding $200,000.

(4) This section does not apply if the posted digital communication is an intimate visual recording to which the offence in section 22A applies.

22A Posting intimate visual recording without consent

(1) A person commits an offence if the person, without reasonable excuse, posts a digital communication that is an intimate visual recording of a victim—

(a) knowing that the victim has not consented to the posting; or

(b) being reckless as to whether the victim has consented to the posting.

(2) An individual under the age of 16 years cannot consent to the posting of an intimate visual recording of which they are the subject.

(3) A person who commits an offence against this section is liable on conviction to,—

(a) in the case of a natural person, imprisonment for a term not exceeding 2 years or a fine not exceeding $50,000:

(b) in the case of a body corporate, a fine not exceeding $200,000.

III Harassment Act 1997

2 Interpretation

safety, in relation to any person, includes that person’s mental well-being.

3 Meaning of harassment

(1) For the purposes of this Act, a person harasses another person if he or she engages in a pattern of behaviour that is directed against that other person, being a pattern of behaviour that includes doing any specified act to the other person on at least 2 separate occasions within a period of 12 months.

(2) To avoid any doubt,—

(a) the specified acts required for the purposes of subsection (1) may be the same type of specified act on each separate occasion, or different types of specified acts:

(b) the specified acts need not be done to the same person on each separate occasion, as long as the pattern of behaviour is directed against the same person.

(3) For the purposes of this Act, a person also harasses another person if—

(a) he or she engages in a pattern of behaviour that is directed against that other person; and

(b) that pattern of behaviour includes doing any specified act to the other person that is one continuing act carried out over any period.

(4) For the purposes of subsection (3), continuing act includes a specified act done on any one occasion that continues to have effect over a protracted period (for example, where offensive material about a person is placed in any electronic media and remains there for a protracted period).

4 Meaning of specified act

(1) For the purposes of this Act, a specified act, in relation to a person, means any of the following acts:

(a) watching, loitering near, or preventing or hindering access to or from, that person’s place of residence, business, employment, or any other place that the person frequents for any purpose:

(b) following, stopping, or accosting that person:

(c) entering, or interfering with, property in that person’s possession:

(d) making contact with that person (whether by telephone, correspondence, electronic communication, or in any other way):

(e) giving offensive material to that person or leaving it where it will be found by, given to, or brought to the attention of that person:

(ea) giving offensive material to a person by placing the material in any electronic media where it is likely that it will be seen by, or brought to the attention of, that person:

(f) acting in any other way—

(i) that causes that person (person A) to fear for his or her safety; and

(ii) that would cause a reasonable person in person A’s particular circumstances to fear for his or her safety.

(2) To avoid any doubt, subsection (1)(f) includes the situation where—

(a) a person acts in a particular way; and

(b) the act is done in relation to a person (person B) in circumstances in which the act is to be regarded, in accordance with section 5(b), as done to another person (person A); and

(c) acting in that way—

(i) causes person A to fear for his or her safety; and

(ii) would cause a reasonable person in person A’s particular circumstances to fear for his or her safety,—

whether or not acting in that way causes or is likely to cause person B to fear for person B’s safety.

(3) Subsection (2) does not limit the generality of subsection (1)(f).

8 Criminal harassment

(1) Every person commits an offence who harasses another person in any case where—

(a) the first-mentioned person intends that harassment to cause that other person to fear for—
(i) that other person’s safety; or

(ii) the safety of any person with whom that other person is in a family relationship; or

(b) the first-mentioned person knows that the harassment is likely to cause the other person, given his or her particular circumstances, to reasonably fear for—

(i) that other person’s safety; or

(ii) the safety of any person with whom that other person is in a family relationship.

(2) Every person who commits an offence against this section is liable, on conviction, to imprisonment for a term not exceeding 2 years.

IV Films, Videos, and Publications Classification Act 1993

2 Interpretation

publication means—

(a) any film, book, sound recording, picture, newspaper, photograph, photographic negative, photographic plate, or photographic slide:

(b) any print or writing:

(c) a paper or other thing that has printed or impressed upon it, or otherwise shown upon it, 1 or more (or a combination of 1 or more) images, representations, signs, statements, or words:

(d) a thing (including, but not limited to, a disc, or an electronic or computer file) on which is recorded or stored information that, by the use of a computer or other electronic device, is capable of being reproduced or shown as 1 or more (or a combination of 1 or more) images, representations, signs, statements, sounds, or words

(e) a copy of images or sounds that have been livestreamed, but not the livestreaming itself of those images or sounds (livestream has the meaning given in section 119A)

3 Meaning of objectionable

(1) For the purposes of this Act, a publication is objectionable if it describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence in such a manner that the availability of the publication is likely to be injurious to the public good.

(1A) Without limiting subsection (1), a publication deals with a matter such as sex for the purposes of that subsection if—

(a) the publication is or contains 1 or more visual images of 1 or more children or young persons who are nude or partially nude; and

(b) those 1 or more visual images are, alone, or together with any other contents of the publication, reasonably capable of being regarded as sexual in nature.

(1B) Subsection (1A) is for the avoidance of doubt.

(2) A publication shall be deemed to be objectionable for the purposes of this Act if the publication promotes or supports, or tends to promote or support,—

(a) the exploitation of children, or young persons, or both, for sexual purposes; or

(b) the use of violence or coercion to compel any person to participate in, or submit to, sexual conduct; or

(c) sexual conduct with or upon the body of a dead person; or

(d) the use of urine or excrement in association with degrading or dehumanising conduct or sexual conduct; or

(e) bestiality; or

(f) acts of torture or the infliction of extreme violence or extreme cruelty.

(3) In determining, for the purposes of this Act, whether or not any publication (other than a publication to which subsection (2) applies) is objectionable or should in accordance with section 23(2) be given a classification other than objectionable, particular weight shall be given to the extent and degree to which, and the manner in which, the publication—

(a) describes, depicts, or otherwise deals with—
(i) acts of torture, the infliction of serious physical harm, or acts of significant cruelty:

(ii) sexual violence or sexual coercion, or violence or coercion in association with sexual conduct:

(iii) other sexual or physical conduct of a degrading or dehumanising or demeaning nature:

(iv) sexual conduct with or by children, or young persons, or both:

(v) physical conduct in which sexual satisfaction is derived from inflicting or suffering cruelty or pain:

(b) exploits the nudity of children, or young persons, or both:

(c) degrades or dehumanises or demeans any person:

(d) promotes or encourages criminal acts or acts of terrorism:

(e) represents (whether directly or by implication) that members of any particular class of the public are inherently inferior to other members of the public by reason of any characteristic of members of that class, being a characteristic that is a prohibited ground of discrimination specified in section 21(1) of the Human Rights Act 1993.

(4) In determining, for the purposes of this Act, whether or not any publication (other than a publication to which subsection (2) applies) is objectionable or should in accordance with section 23(2) be given a classification other than objectionable, the following matters shall also be considered:

(a) the dominant effect of the publication as a whole:

(b) the impact of the medium in which the publication is presented:

(c) the character of the publication, including any merit, value, or importance that the publication has in relation to literary, artistic, social, cultural, educational, scientific, or other matters:

(d) the persons, classes of persons, or age groups of the persons to whom the publication is intended or is likely to be made available:

(e) the purpose for which the publication is intended to be used:

(f) any other relevant circumstances relating to the intended or likely use of the publication.

13 Submission of publications by others

(1) Any of the following persons may submit a publication to the Classification Office for a decision on that publication’s classification:

(a) the chief executive of the New Zealand Customs Service:

(ab) the Commissioner of Police:

(b) the Secretary:

(ba) subject to subsections (1A) and (1B), an online content host who or that has been issued with a take- down notice relating to an online publication:

(c) subject to subsection (2), any other person.

(1A) A submission by an online content host under subsection (1)(ba) must be submitted within 20 working days after they receive the take-down notice.

(1B) The Chief Censor may determine that an online publication submitted to the Classification Office under subsection (1)(ba) will not be examined or classified by the office if—

(a) the online publication has already been submitted to the Classification Office under this section; or

(b) the online publication has already been the subject of a classification decision; or

(c) the Chief Censor considers that the submitting of the online publication to the Classification Office is frivolous or vexatious.

(2) A publication may be submitted to the Classification Office under subsection (1)(c) only with the leave of the Chief Censor given under section 15.

(3) The Chief Censor may, on his or her own motion, determine that any publication should be received for examination by the Classification Office. In any such case the Chief Censor shall, by notice in writing, direct the chief executive of the New Zealand Customs Service or the Secretary to take all reasonable steps to obtain a copy of the publication and submit it to the Classification Office under paragraph (a) or, as the case requires, paragraph (b) of subsection (1).

123 Offences of strict liability relating to objectionable publications

(1) Every person commits an offence against this Act who—

(a) makes an objectionable publication; or

(b) makes a copy of an objectionable publication for the purposes of supply, distribution, display, or exhibition to any other person; or

(c) imports into New Zealand an objectionable publication for the purposes of supply or distribution to any other person; or

(d) supplies or distributes (including in either case by way of exportation from New Zealand) an objectionable publication to any other person; or

(e) has in that person’s possession, for the purposes of supply or distribution to any other person, an objectionable publication; or

(f) in expectation of payment or otherwise for gain, or by way of advertisement, displays or exhibits an objectionable publication to any other person.

(2) Every person who commits an offence against subsection (1) is liable on conviction,—

(a) in the case of an individual, to a fine not exceeding $10,000:

(b) in the case of a body corporate, to a fine not exceeding $30,000.

(3) It shall be no defence to a charge under subsection (1) that the defendant had no knowledge or no reasonable cause to believe that the publication to which the charge relates was objectionable.

(4) Without limiting the generality of this section, a publication may be—

(a) supplied (within the meaning of that term in section 2) for the purposes of any of paragraphs (b) to (e) of subsection (1); or

(b) distributed (within the meaning of that term in section 122) for the purposes of any of paragraphs (b) to

(e) of subsection (1); or

(c) imported into New Zealand for the purposes of paragraph (c) of subsection (1),—

not only in a physical form but also by means of the electronic transmission (whether by way of facsimile transmission, electronic mail, or other similar means of communication, other than by broadcasting) of the contents of the publication.

124 Offences involving knowledge in relation to objectionable publications

(1) Every person commits an offence against this Act who does any act mentioned in section 123(1), knowing or having reasonable cause to believe that the publication is objectionable.

(2) Every person who commits an offence against subsection (1) is liable on conviction,—

(a) in the case of an individual, to imprisonment for a term not exceeding 14 years:

(b) in the case of a body corporate, to a fine not exceeding $200,000.

APPENDIX 2 – International Legislation

I Australia

A Crimes Act 1900 (NSW)

Division 15C Recording and distributing intimate images

91N Definitions

intimate image means—

(a) an image of a person’s private parts, or of a person engaged in a private act, in circumstances in which a reasonable person would reasonably expect to be afforded privacy, or

(b) an image that has been altered to appear to show a person’s private parts, or a person engaged in a private act, in circumstances in which a reasonable person would reasonably expect to be afforded privacy.

91Q Distribute intimate image without consent

(1) A person who intentionally distributes an intimate image of another person—

(a) without the consent of the person, and

(b) knowing the person did not consent to the distribution or being reckless as to whether the person consented to the distribution,

is guilty of an offence.

Maximum penalty—100 penalty units or imprisonment for 3 years, or both.

(2) A prosecution of a person under the age of 16 years for an offence against this section is not to be commenced without the approval of the Director of Public Prosecutions.

B Crimes Act 1900 (ACT)

Part 3A Intimate image abuse

72A Definitions—Pt 3A

intimate image, of a person—

(a) means a still or moving image, in any form—

(i) of the person’s genital or anal region; or

(ii) for a female or a transgender or intersex person who identifies as a female—of the person’s breasts; or

(iii) of the person engaged in a private act; or

(iv) that depicts the person in a sexual manner or context; and

(b) includes an image, in any form, that has been altered to appear to show any of the things mentioned in paragraph (a).

72C Non-consensual distribution of intimate images

A person (the offender) commits an offence if—

(a) the offender distributes an intimate image of another person; and

(b) the offender—

(i) knows the other person does not consent to the distribution; or

(ii) is reckless about whether the other person consents to the distribution.

Maximum penalty:

(a) for an aggravated offence—400 penalty units, imprisonment for 4 years or both; or

(b) in any other case—300 penalty units, imprisonment for 3 years or both.

II South Korea

Act On Special Cases Concerning the Punishment of Sexual Crimes 2010.

Article 14-2 (Distribution of False Video Products)

(1) A person who edits, synthesizes, or processes a photograph, video, or audio (hereinafter referred to as “photograph, etc.” in this Article) targeting the face, body or voice of a person for the purpose of dissemination, etc., in a form that may cause sexual desire or shame against the will of the person who is the subject of the photograph, etc. (hereinafter referred to as “editing, etc.” in this Article), shall be punished by imprisonment with labor for not more than five years or a fine of not more than 50 million won.

(2) A person who distributes an edited, synthesized, or processed material (hereinafter referred to as “edited material, etc.” in this paragraph), or a duplicate thereof (including a duplicate of a duplicate; hereinafter the same applies in this paragraph), under paragraph (1), or a person who, even though the editing, etc. under paragraph (1) was not contrary to the intention of the person subject to the video material, etc. at the time, thereafter distributes the edited material, etc. against the intention of that person, shall be punished by imprisonment with labor for not more than five years or by a fine not exceeding 50 million won.

(3) A person who commits a crime under paragraph (2) by means of information and communications networks against the will of the person subject to video works, etc. for the purpose of making profits shall be punished by imprisonment with labor for not more than seven years.

(4) A person who habitually commits any of the crimes provided for in paragraph (1) through (3) shall be aggravatingly punished by up to 1/2 of the punishment for each crime.

III United Kingdom

Online Safety Bill 2022 (UK) (HL Bill 164)

Offences to be inserted into Sexual Offences Act 2003

189 Sharing or threatening to share intimate photograph or film

In the Sexual Offences Act 2003, after section 66A (inserted by section 188), insert—

“66B Sharing or threatening to share intimate photograph or film

(1) A person (A) commits an offence if—

(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,

(b) B does not consent to the sharing of the photograph or film, and

(c) A does not reasonably believe that B consents.

(2) A person (A) commits an offence if—

(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,

(b) A does so with the intention of causing B alarm, distress or humiliation, and

(c) B does not consent to the sharing of the photograph or film.

(3) A person (A) commits an offence if—

(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,

(b) A does so for the purpose of A or another person obtaining sexual gratification,

(c) B does not consent to the sharing of the photograph or film, and

(d) A does not reasonably believe that B consents.

(4) A person (A) commits an offence if—

(a) A threatens to share a photograph or film which shows, or appears to show, another person (B) in an intimate state, and

(b) A does so—

(i) with the intention that B or another person who knows B will fear that the threat will be carried out, or

(ii) being reckless as to whether B or another person who knows B will fear that the threat will be carried out

...

IV United States

A Civil Code of the State of California

§ 1708.85 – Invasion of privacy: distribution of sexually explicit materials

(a) A private cause of action lies against a person who intentionally distributes by any means a photograph, film, videotape, recording, or any other reproduction of another, without the other's consent, if

(1) the person knew, or reasonably should have known, that the other person had a reasonable expectation that the material would remain private,

(2) the distributed material exposes an intimate body part of the other person, or shows the other person engaging in an act of intercourse, oral copulation, sodomy, or other act of sexual penetration, and

(3) the other person suffers general or special damages as described in Section 48a.

...

§ 1708.86 – Cause of action for depicted individual

(a) For purposes of this section:

(1) "Altered depiction" means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.

...

(b) A depicted individual has a cause of action against a person who does either of the following:

(1) Creates and intentionally discloses sexually explicit material and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation or disclosure.

(2) Intentionally discloses sexually explicit material that the person did not create and the person knows the depicted individual in that material did not consent to the creation of the sexually explicit material.

...

(4) "Depicted individual" means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.

B Code of the State of Virginia

§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.

A. Any person who, with the intent to coerce, harass, or intimidate, maliciously disseminates or sells any videographic or still image created by any means whatsoever that depicts another person who is totally nude, or in a state of undress so as to expose the genitals, pubic area, buttocks, or female breast, where such person knows or has reason to know that he is not licensed or authorized to disseminate or sell such videographic or still image is guilty of a Class 1 misdemeanor. For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's face, likeness, or other distinguishing characteristic.

...

C Consolidated Laws of New York, Civil Rights Law

§ 52-C Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual

1. For the purposes of this section:

a. "depicted individual" means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.

b. "digitization" means to realistically depict the nude body parts of another human being as the nude body parts of the depicted individual, computer-generated nude body parts as the nude body parts of the depicted individual or the depicted individual engaging in sexual conduct, as defined in subdivision ten of section 130.00 of the penal law, in which the depicted individual did not engage.

...

2.

a. A depicted individual shall have a cause of action against a person who discloses, disseminates or publishes sexually explicit material related to the depicted individual, and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation, disclosure, dissemination, or publication.

b. It shall not be a defense to an action under this section that there is a disclaimer in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.

...

