
University of Otago Law Theses and Dissertations



Mellor, Samuel --- "Regulating Online Conduct: Conundrums and Spatial Metaphors in the Wild West" [2011] UOtaLawTD 18


REGULATING ONLINE CONDUCT: CONUNDRUMS AND SPATIAL METAPHORS IN THE WILD WEST

Samuel T Mellor

An essay submitted in partial fulfilment of the requirements for the degree of Bachelor of Laws (Hons)

University of Otago, New Zealand

14 October 2011

CONTENTS

INTRODUCTION

  1. REALISM AND THE REGULATORY CONUNDRUM
  2. REGULABILITY IN FACT
  3. REGULABILITY: THE NORMATIVE PERSPECTIVES
  4. PERSISTENCE OF THE WILD WEST

CONCLUSION

INTRODUCTION

The genesis of this essay was an announcement made by Justice Minister Simon Power in October 2010. The Minister directed a Law Commission review to consider how the law and the justice system interact with the internet. In particular, Mr Power wished the Commission to consider how to curb the problem of the new media operating as what he called a “Wild West”. Similar enquiries are ongoing in Australia through the Convergence Review and a second, distinct review announced by Senator Conroy in September 2011.1

The implications of the New Zealand review could be far-reaching given the scope of the brief. The Commission was tasked with determining:2

My aim has not been to duplicate the Commission’s work but to take a second look at the assumptions that underlie the review itself and to offer a broader perspective on where internet regulation has been, where it is and where it could go. This matter is of particular interest because recent and forthcoming decisions of the Court of Appeal and the High Court have illuminated the judiciary’s view of regulating the new sphere. Specifically, the timing seems ripe in the wake of the recent Slater v Police decision to examine some of the premises behind Internet regulation.3

1 Stephen Conroy ‘Government announces Independent Media Inquiry’ (press release, 14 September 2011); Department of Broadband, Communications and the Digital Economy, Australian Government ‘Convergence Review’ (2011) <www.dbcde.gov.au/digital_economy/convergence_review>

2 Simon Power ‘Law Commission to review regulatory gaps around “new media”’ (press release, 14 October 2010); Law Commission ‘Review of Regulatory Gaps and the New Media’ (2010) <www.lawcom.govt.nz/home>

3 Slater v Police HC Auckland CRI-2010-404-379, 12 May 2011

This essay thus has three broad goals. The first is to examine the proposition that the internet and new media exist without proper regulation. The bulk of this work elucidates how the legislature and judiciary have used existing legal precepts to regulate the web sphere. From this we can deduce that regulation of online content prima facie does not require a special “webified” quality. Rather, the mature laws that operate in other sectors of society apply almost uniformly to the internet. Simply put, the impetus for specialist ‘internet law’ regulating online content seems dead. The second goal is to illuminate why, in spite of the actual regulability of the Internet, the idea that it is unregulated retains such strong currency. This entails consideration of both the normative framework which supports the notion of the Internet as a separate entity and the powerful cognitive metaphors which underpin it. A final goal is to examine whether these normative arguments and cognitive metaphors have positive or negative impacts on the prospect of regulating for better democracy.

Part I therefore considers why the net is so often thought of as a regulatory conundrum and offers a realist perspective on how this conundrum might be avoided in legal analysis via an appreciation of the fine-grained analysis conducted by both the legislature and judiciary. Part II seeks to elucidate various examples where either the legislature has competently dealt with perceived regulatory conundrums or the judiciary has applied existent laws seemingly without difficulty to online problems. Part III turns to the normative arguments suggesting that we ought to view the Internet as a separate space for the purposes of law based either on its cultural uniqueness or simply for reasons of liberty. This examination of the normative arguments suggests that they are unsustainable. Finally, Part IV examines the powerful role that metaphor and construction of space have on how we view the regulability of the networked sphere.

One further preliminary note is necessary. This essay confines itself to the discussion of ‘unlawful anarchic conduct’ and its regulation. This is conduct which has either an offensive impact (on an individual or group) or an impact on well-being which is ephemeral or difficult to quantify.4 It contrasts with dangerous or fraudulent conduct, which impacts on physical or national safety or on the economic wellbeing of persons, businesses or governments.5 These latter categories are best described loosely as “cyber-crime”, although

4 Stuart Biegel Beyond Our Control? Confronting the Limits of Our Legal System in the Age of Cyberspace (MIT Press, Cambridge, 2003) at 73

5 Ibid, at 55, 65

elements of the criminal jurisdiction are also involved in my analysis. Specifically I will not be discussing areas of law for which there is dedicated legislation.6 Rather, I am concerned with how the ordinary laws that govern anarchic conduct translate into the networked sphere.

To this end my analysis will involve discussion of defamation, hate speech and court-ordered suppression. This is for several reasons. Firstly, this type of behaviour has the most impact on democracy. Its anarchic nature is the most troubling and difficult, and it is therefore a worthwhile recipient of scrutiny. Secondly, the types of issues that arise in this area (jurisdiction, publication, enforcement) are demonstrative of wider issues in other categories. Moreover, the case law and legislative practice in this category has had an outsize influence on the law in general. As we shall see, this is particularly so in regard to the law of defamation.7

6 New Zealand has a large array of such legislation to confront specific issues of online criminality. For example: Accessing/damaging computer systems: Crimes Act 1961, ss 249-252; Spam Email: Unsolicited Electronic Messages Act 2007; Distribution of obscene material: Films, Videos, and Publications Classification Act 1993, s 122; Protection of rights to privacy: Crimes Act 1961, s 216J (publishing intimate covert films); Harassment: Telecommunications Act 2001, s 112 (misuse of a telephone device).

For general discussion of New Zealand cybercrime issues see: Gregor Allen ‘Responding to Cybercrime: A Delicate Blend of the Orthodox and the Alternative’ (2005) 2 New Zealand Law Review 149. Also: David Harvey internet.law.nz (2nd ed, LexisNexis, Wellington, 2005) at 181-344.

7 Lilian Edwards ‘The Fall and Rise of Intermediary Liability Online’ in Lilian Edwards and Charlotte Waelde (eds.) Law and the Internet (3rd ed, Hart Publishing, Oxford, 2009) 47 at 51

PART I— REALISM AND THE REGULATORY CONUNDRUM

Notionally, the Internet presents innumerable conundrums for law. The most forthright, however, lies in the area of enforcement of existing law. Chris Reed notes, for example, that the difference between applicability and enforceability needs to be front and centre when considering law in relation to the Internet. He argues that while it is easy to create law directed at a perceived problem online, these laws have at least the perception of being unenforceable. This creates two major defects for the rule of law. The first is that an unenforceable law is obviously not effective in dealing with the mischief it seeks to target. The second is that unenforceable law threatens or weakens the normative force of other laws.8

The nature of the web itself makes enforcement of national restrictions on freedom of speech difficult. Anonymity is one problem; jurisdiction is another. Tools such as anonymous remailers and public key encryption are widely available. So-called onion routing also allows users to make most online activity anonymous.9 Aside from these varied technical means of circumventing content restrictions, the most ubiquitous is simple information redeployment, i.e. moving the host of material to a country with laws less strict than those of the target state. This method allows dissident groups to continue to disseminate information to a target country, but it also has consequences for enforcing laws against offensive content.10 So, for instance, sites in Canada and the United States are known to host material which breaches European anti-hate speech and holocaust denial laws.11 The simplicity with which national law can be evaded has led Uta Kohl to state:

8 Chris Reed Internet Law: Text and Materials (2nd ed, Cambridge University Press, Cambridge, 2004) at 291

9 Michael Owen ‘Fun with onion routing’ (2007) 4 Network Security 8

10 For example, under threat of prosecution by German authorities, access providers blocked access to a magazine on a Dutch server which promoted terrorist violence. This was over-inclusive, as all material, including inoffensive material, was blocked. It was also ineffective in the sense that the material was almost immediately available on mirrored sites throughout the web and on newsgroups: see European Commission, Illegal and harmful content on the Internet COM(96)0487-C4-0592/96, cited in Reed, above n 8, at 260

11 Viktor Mayer-Schonberger and Teree Foster ‘Free Speech and the Global Information Infrastructure’ (1996) 56 Michigan Telecommunications and Technology Law Review 45 at 54.

Perhaps the most famous case in which the conflicting jurisdiction issue came alive was the sequence of cases involving the sale of Nazi memorabilia in France from an auction site based in the United States. The French court in LICRA and UEJF v Yahoo! Inc. and Yahoo France (Tribunal de Grande Instance de Paris, 20 November 2000) found this breached French law and ordered Yahoo Inc. to make such material unavailable to French users.

Yahoo sought a declaration in the US courts that such an order was unenforceable given the application of the 1st

There is no doubt...much of the regulatory conundrum of the internet is due to states often being unable to give effect to their domestic laws in respect of foreign online activity penetrating their borders—ranging from spam email and misleading advertising, to unauthorised sites offering gambling, pornography and pharmaceuticals.12

Two choices are regularly discussed for addressing the enforcement problem. One is to go it alone. Some techniques to achieve enforcement are already in place in many jurisdictions, for example controlling ISPs or making them liable for policing content. The other choice is to act in concert with other states to create better enforcement. Both these solutions have drawbacks and limitations. Increasing enforcement powers unilaterally may have concomitant effects on the human rights of individuals at home or on the sovereignty of other states. Indeed, the impact on the rights of individuals has been a hot topic of debate in New Zealand in recent times with the advent of new copyright legislation. This legislation retains provision for disconnection as a punishment for file-sharing. It was passed in the face of an outcry and charges that the prospect of disconnection created a potential breach of human rights.13 In the same vein, there are problems with acting in concert internationally. Harmonisation is not always possible between states due to differing cultural expectations or legal norms. For instance, it would be difficult for New Zealand to harmonise its law of defamation with that of the United States given the importance of the First Amendment.

Despite these challenges, however, this essay argues that the traditional legal branches have been mostly up to the challenges posed by the Internet. This is true both in the sense of creating solutions to new problems and in using existing law to regulate activity which may not have been envisioned when that law was created. In Kohl’s estimation, the internet creates both qualitative and quantitative problems for the role of law. In the qualitative sense, these problems arise when a new interaction or activity begins which may have no parallel in the physical world. An example of this may be the practice of linking online and

Amendment doctrine. In the end, the 9th Circuit Court of Appeals declined to make such a declaration but only because by that stage the issue was moot because Yahoo had voluntarily removed the offending material: Yahoo! Inc. v LICRA and UEJF, 169 F Supp 3d 1120 (9th Cir. 2004)

12 Uta Kohl Jurisdiction and the Internet: a study of regulatory competence over online activity (Cambridge University Press, Cambridge, 2007) at 26

13 Frank La Rue Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, GA 17th sess, A/HRC/17/27 (2011) at [49]-[50]; Jordan Cox ‘Access to the Internet as a human right’ (2010) 135 NZ Lawyer

whether this act creates an offence of copyright infringement or defamation. In a quantitative sense, the internet allows users to undertake actions with more efficiency. The increase in speed and scope creates the perception of newness around online actions when realistically the action is not new at all. As an example, in defamation law it has been noted that online defamation is nothing new or especially novel. However ‘the problems of traditional publishing and defamation are so multiplied when applied to a forum as large, as accessible, as cheap and as transnational as the Internet that it is not hard to see why there is a perception that the law of libel has been transformed by its application to the new electronic highway’.14

The value of Kohl’s statement on qualitative and quantitative factors is to remind us that while newness has its challenges, it is worthwhile noting that sometimes the more things change the more they stay the same. What Kohl advocates is that instead of looking into radical projects for reworking law in response to the Internet’s challenges, we should take a conservative approach of reliance on the traditional tools of legislation and stare decisis. In essence, Kohl’s argument is that the staple reasoning processes are adequate to meet the new phenomenon even if this is ignored by those who seek to revolutionise the legal approach to new media.15

The problem with viewing the Internet in the revolutionary sense is that it seems to require immediate action on qualitative or quantitative issues. However, this is a poor method of analysing legal development, as discrete areas of law have their own rates of evolution which do not necessarily correlate with each other. This leads to a multiplicity of differing models upon which to base internet regulation.16 Some of these models are attractive and plausible. One could easily support these new developments based on a state of nature scenario. However, their ability to describe the real-world interaction between the justice system and the Internet is limited. This is because they fail to account for the fact that case law and the judiciary in general are best placed to offer a fine-grained analysis of the legal issues encountered. In the realm of tricky content regulation, this is an important and crucial part of legal evolution. The fact is the diffuseness of the internet can only be managed at this granular level, because the wide dispersal of subject

14 Lilian Edwards ‘Defamation and the Internet’ in Lilian Edwards and Charlotte Waelde (eds.) Law and the Internet— Regulating Cyberspace (Hart Publishing: Oxford, 1997) 183 at 184

15 Uta Kohl ‘Legal Reasoning in the Age of the Internet—Why the Ground Rules are still Valid’ 7 International Journal of Law and Information Technology 123

16 Harvey, above n 6, at 105-125

matter and site content does not lend itself to overall rules. In that case, the judiciary has the role of sifting which sites affect which state and in what way.

It is important to value traditional legal reasoning in the field of Internet law for two reasons. Firstly, it is forum-specific and therefore able to capture different jurisdictions’ methods of assessing good argumentation. At the same time, each legal system uses similar deductive logic. Because this logic depends on extant mandatory rules, each system also has methods for dealing with the balance between continuity and change, including how to assess new phenomena and which considerations to take into account. Continuity is an essential element of law but also requires nuance, because attempting to fit old law over a new phenomenon is destructive to the former. Therefore, all reasoning is both present and future oriented in equal measure. It is this crucial aspect of incrementalism, whereby past rules must always influence new ones, that makes consideration by existing legal institutions valuable.

Legislative reasoning draws on some of the same qualities, though it is not as bound by rigid tools of justification. The legislature too has a duty to show self-restraint in order to promote social and economic stability. Thus legislative decisions are also based on an incrementalist perspective of looking to past experience to inform prospective law. As Atiyah notes:17

Many of the sorts of reasons given for legislative decisions by legislators are not essentially different from those given by judges. Fairness, justice, the rights of individuals, all these enter significantly into political as well as judicial law-making.

These, of course, are obvious points. I do not make them in order to seriously analyse the process of legal reasoning but to demonstrate why it is important to look at actual decisions of both the legislature and the judiciary to determine the path of internet-related law. In the subsequent section, I will seek to demonstrate that Kohl’s conservative approach has value by looking at the fine grain of actual cases and legislative decisions. The manner in which courts either assume or decline jurisdiction, or render previously well understood words meaningful in the online context, shows that the continual evolution of law according to established processes has much merit. In short, I will show that the jurisdictional and enforcement hurdles are not so great nor so

17 P S Atiyah Law and Modern Society (Oxford University Press, Oxford, 1983) at 134

intractable once real-world situations come into focus. Put simply, I will demonstrate the validity of the assumption inherent in Kohl’s statement that:

Judges [and legislatures] have been able to reinvent the law in light of the online scenario because the law has consistently evolved in the past in response to a changing world.18

18 Kohl, above n 12, at 52

PART II—REGULABILITY IN FACT

The first two sections of this analysis are built upon the recent case of Police v Slater and, more broadly, the issue of name suppression. As noted above, the outcome of this case has coincided with a renewed push to look at the way the internet and law interact. Name suppression is a useful case study for our purposes in several respects. Firstly, its scope and relationship with the Internet have undergone recent examination by both the legislative and the judicial branches. It is therefore relevant as the most recent exploration of Internet issues by two major branches of government. Secondly, by its nature, name suppression has elements which are also present in other category three type offences. Elements of freedom of expression, publication and jurisdiction all lend themselves to examination through the lens of name suppression. Finally, name suppression is not difficult law in a technical sense; it is therefore fairly simple to work through the implications of applying it online without being hampered by its own internal logic.

Firstly, I will look at how the legislature has dealt with one aspect of the regulatory conundrum: the enforcement problem. This involves discussion of the recent moves to introduce an intermediary liability scheme as regards name suppression and its rejection as policy in New Zealand. Secondly, I will consider how the District and High Courts have recently approached the name suppression issue in the case of Police v Slater. This involves an examination of how the two courts used and expanded current definitions in order to allow the current law to speak in the online context. Finally, I will use case law surrounding hate speech and defamation to discuss the way in which the judiciary has approached the issue of jurisdiction in the context of the Internet.

ISP Liability: the legislature and the enforcement problem

Background

In response to a wide government overview of the criminal justice system, the Law Commission was charged with preparing a review of the system of judicial suppression of names and

evidence. Included within this review was consideration of how the suppression regime should deal with the challenge of the Internet. In its final report, the Law Commission recommended a form of notice and takedown procedure to overcome the problem of enforcing non-publication orders online. Recommendation 26 of the report stated:

Where an Internet service provider or content host becomes aware that they are carrying or hosting information that they know is in breach of a suppression order, it should be an offence for them to fail to remove the information or to fail to block access to it as soon as reasonably practicable.19

This proposal was adopted and included in the Criminal Procedure (Reform and Modernisation) Bill 2010. Clause 216 created a specific offence for ISPs who allowed material which breached name suppression orders to remain online. However, the Select Committee considering the Bill advocated deleting this clause and making it clear that ISPs bear no responsibility for hosting material which breaches name suppression unless they knew, or were reckless as to whether, the information was suppressed. The Select Committee therefore recommended a tiered system of liability: one tier for knowing or reckless publication in breach of an order, and one tier of strict liability for publication.20 The main difference is that mens rea becomes part of the disputed facts at trial rather than an element to be considered at sentencing. Simply put, ISPs may still be held liable for allowing material to be published, but only if they have knowledge or are reckless. Clause 215(3) of the Bill explicitly ensures that ISPs are immune from the strict liability form of the offence.

This alteration followed consultation and advice pertaining to the difference between ISPs providing routing and transmission of internet traffic and so-called content hosts who actively place content online. The former have responsibility for some limited content related to the provision of their services, but usually this is not extensive. Judge Harvey, submitting in a personal capacity, noted that the aim of regulating online content had to be directed correctly in order to be effective. Thus, there is no benefit in making “infrastructure” ISPs liable for content they do not actually control. This is so even if the ISP provides an amount of storage capacity for a customer to deliver their own content.21 An agglomeration of ISPs under the banner of the

19 Law Commission Suppressing Names and Evidence (NZLC R109, 2009) at 66

20 Criminal Procedure (Reform and Modernisation) Bill 2010 (243-2) (select committee report) at 5 and 177-178

21 David Harvey “Submission to the Justice and Electoral Select Committee on the Criminal Procedure (Reform and Modernisation) Bill” at [6]-[17]

Telecommunications Carriers Forum vehemently rejected any notion that traditional ISPs had either prior legal responsibility or the practical resources for “policing the internet”.22

Intermediary liability overseas

The decision against allowing ISP liability for third party content is in keeping with most overseas jurisprudence, particularly from the United States and the EU. For instance, the US legislature emphatically ruled out ISP liability when it created an explicit exemption via section 230 of the Communications Decency Act (CDA).23 Nonetheless, ISPs have been viewed in the US as part of the distribution chain based on their level of knowledge. Under US law, distributors (as opposed to publishers) could be held liable if they had knowledge of offending material. In Stratton Oakmont, Inc. v Prodigy Services Co., the defendant had performed screening functions on hosted bulletin boards and was thus held to have the requisite knowledge.24 The opposite was found in Cubby, Inc. v CompuServe Inc.,25 as the defendant lacked the type of editorial control exercised in Stratton. Subsequently, ISP liability was expressly exempted under the CDA and upheld in Zeran v

22 Telecommunications Carriers’ Forum “Submission to the Justice and Electoral Select Committee on the Criminal Procedure (Reform and Modernisation) Bill”

The Forum includes most of the larger infrastructure ISPs in New Zealand, e.g. Telecom, Vodafone, Kordia (owner of Orcon) and TelstraClear.

23 “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230 defines “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.” 47 U.S.C. § 230(f)(2)

24 1995 WL 323710 (NY Sup Ct. 1995)

25 776 F. Supp. 135 (SD NY 1991)

America Online.26 The reasoning, as Horton observes, ‘is to promote self-help on the Internet and prevent the potential chilling effect that private regulation may have on Internet speech’.27

In the UK, in Godfrey v Demon Internet28 (heard under the Defamation Act 1996), the court found in a preliminary ruling that ISPs could be held liable for libellous content hosted on their networks. On the facts, Demon could not avail itself of the defence of innocent dissemination under the Act because it had been informed that the content existed. The case was settled.29 In Australia, an express notice and take down procedure deals with the issue of ISP liability.30

The arguments from the initial Bill’s detractors mirror those made by ISPs and others in the 1990s in the US and EU in favour of immunity from content liability. Specifically, immunity is justified on the basis that ISPs lack effective or actual control over the content they host. Reversing this, it is argued, would create undue delay and expense for the consumer. Furthermore, the lack of control is not solved by technology or filtering software, for several reasons. Filtering requires an ex ante judgement about the content that will be allowed through the net. The argument goes that this is simply too hard a task to undertake, as there is a real prospect of being either under- or over-inclusive of material. Related to this is the fact that content such as libel, hate speech and breaches of name suppression by their nature require recognition of semantic meaning and a nuanced appreciation that sometimes only a court can provide. Another argument advanced is that it is morally iniquitous to impose liability on those who merely carry the message. The

26 [1997] USCA4 999; 129 F.3d 327 (4th Cir. 1997). The court briefly flirted with re-imposing the knowledge-based test in Barrett v Rosenthal 112 Cal. App. 4th 749 but this was overturned on appeal: Barrett v Rosenthal 40 Cal.4th 33, 146 P.3d 510, 51 Cal.Rptr.3d 55 (Cal. Sup. Ct., November 20, 2006).

The creation of an exemption for potential “republishers” was designed to avoid the chilling effect of having private parties responsible for content online. In Zeran, the court noted that ‘Congress recognized the threat that tort-based lawsuits pose to freedom of speech in the new and burgeoning Internet medium’ and therefore introduced statutory immunity. This reasoning was endorsed in the subsequent California Supreme Court decision in Barrett, in which the court stated at [62]:

The congressional intent of fostering free speech on the internet supported the extension of Section 230 immunity to active individual users. It is they who provide much of the 'diversity of political discourse,' the pursuit of 'opportunities for cultural development,' and the exploration of 'myriad avenues for intellectual activity' that the statute was meant to protect

27 Allison E. Horton ‘Beyond Control?: The Rise and Fall of Defamation Regulation on the Internet’ (2009) 43 Val. U.L. Rev. 1265

28 [1999] EWHC QB 244; [1999] 4 All ER 342

29 ‘Demon settles net libel case’ BBC News (United Kingdom, 30 March 2000)

<http://news.bbc.co.uk/2/hi/science/nature/695596.stm> accessed 4 October 2011

30 See Broadcasting Services Amendment (Online Services) Act 1999 (Cth), amending the Broadcasting Services Act 1992 (Cth). Also: Harvey, above n 6, at 484; Ronald Deibert et al (eds.) ‘Australia and New Zealand Overview’ in Access Controlled: The Shaping of Power, Rights and Rule in Cyberspace (MIT Press, Cambridge, 2010) at 392

traditional ISP-User arrangement lacks the necessary element of culpable complicity between the service provider and the suppliers of offending material.31

In the US and Europe, these arguments succeeded in batting back the fledgling intermediary liability standards. A general consensus was reached that ISPs should, in principle, be free from liability provided they show a willingness to co-operate and remove identified illegal content. This led to the safe harbour approach in the US as regards copyrighted material under the Digital Millennium Copyright Act 1998 (DMCA). The EU Electronic Commerce Directive (ECD),32 by contrast, features a harmonised regime for ISP liability across many different content issues, including intellectual property, libel and obscenity.

This consensus has been breaking down in recent years as the advent of Web 2.0, the increase in User Generated Content (UGC) and other technological advances put pressure on the original arguments upon which web intermediaries based their immunity. Part of this has been driven by changes to how content is delivered, whereby sites offer content for free but attach advertising to the product. This seems to erode the moral argument that those wronged by internet content should not ‘shoot the messenger’. The argument dissipates once it is realised that the intermediary benefits from the illegal activity via advertising revenue. Indeed, the whole basis of the Viacom v YouTube case was that the former felt the latter ought not to be able to benefit financially from the mountainous traffic built by displaying illegal content.33 In addition to the erosion of the moral basis, technological advancements over the last decade have made the prospect of blocking offending content more realistic and efficient.34

Yet there are good reasons not to shift too far from the notion of ISP immunity. First, although the moral basis has diminished for some third parties, the traditional infrastructure-based ISP still lacks a direct moral nexus between the service it provides and its income stream. This income comes directly from fees for providing network connections and hosting, not from the actual

31 See Edwards, above n 7, at 59-62

32 Directive 2000/31/EC on Electronic Commerce <http://ec.europa.eu/internal_market/e-commerce/directive_en.htm> accessed 1 October 2011

33 See Viacom International v Youtube, Inc No 1:2007-CV-02103 (SDNY 13 March 2007). Viacom’s complaint can be found at: <http://online.wsj.com/public/resources/documents/ViacomYouTubeComplaint3-12-07.pdf> accessed 1 October 2011

34 Edwards, above n 7, at 85

traffic driven to a site. The rejected moral claim is thus inapplicable as they are several layers removed from the content. Secondly, the technological advancements in blocking throw up as many difficulties as they solve. It is now a realistic possibility to establish ISPs as intermediaries with responsibility to block offensive material such as child pornography. The incentive to establish efficient systems to do so would clearly come from removing immunity where such content is found to be hosted by a particular ISP. Nonetheless, the fact that this censorship occurs on a private basis creates very real concerns about protecting accountability, democracy and freedom of expression.35 Additionally, better filtering/blocking still does not solve problems related to name suppression, defamation and hate speech. As Edwards notes:

It will still be important to remember that in areas such as defamation or hate speech, where filtering has to be based substantially on a human understanding of nuances of interpretation and intention, automated filtering will continue to be merely a science-fiction dream.36

In contrast, Reidenberg argues that the democratic problems with blocking can be mitigated by making sure that blocking programs are mandated and discussed, and as narrow, as targeted and as effective as possible.37 Overall, he maintains that technology ought to give effect to public policy and that states ought not to be afraid to turn back the tide of technological determinism in which they seem engulfed. His argument, simply put, is that attacks on state jurisdiction over internet matters are not the proper order of things, as they cede law-making to technology and technologists.38 States cannot afford to cede in such a way because doing so negates their role in securing rights for their citizens. In the ordering of what the internet ought to achieve, the state must set the public policy, with technology to follow, not the other way around.

Reidenberg makes the point that on its own the technological community develops solutions based on commercial viability not public policy. He seeks to change this balance by having states (legislatures) and judiciaries view their decisions as impacting the technological market. He cites the example of the Yahoo v LICRA case in which Yahoo used geo-location technology to target

35 Lilian Edwards ‘Pornography, censorship and the Internet’ in Lilian Edwards and Charlotte Waelde (eds.) Law and the Internet (3rd ed, Hart Publishing: Oxford, 2009) 623 at 628

36 Edwards, above n 7 at 85

37 Joel Reidenberg ‘Technology and Internet Jurisdiction’ (2005) 153 University of Pennsylvania Law Review 1951 at 1965

38 Ibid, at 1969

advertising but did not do so to prevent offending material being accessed by its French users.39 Reidenberg also notes that in Reno v ACLU the court missed an opportunity to change the dynamic of content filtering by assuming that its decision would have no impact on technological development. In Reidenberg’s estimation this was quite wrong; a decision the other way could have spurred technology to better protect children online.40 Overall, the key message from Reidenberg’s piece is to remind us that technology is not static and can respond to public policy. Specifically, he notes that if ISP liability were imposed, better filters would result.41

Despite arguments of this nature, the concerns raised by a number of submitters dissuaded the Committee from recommending that strict liability should attach to these hosts, given the nature of the control exercised over material. Although a defence of total absence of fault is available for the strict liability offence, arguably this leaves content hosts unsure of their liability once informed of infringing content.42 In addition, the increase in penalties where a breach of a non-publication order occurs should logically correlate with the inclusion of some element of mens rea. For instance, NZLII noted that increased penalties without an adequate defence of absence of knowledge may have a negative effect on the provision of useful services.43 In NZLII’s case, this was the provision of useful legal information to self-represented litigants and to the legal community at large.

The argument could extend further, however: raising potential compliance costs could have effects for sites which provide valuable services, such as Citizens Advice Bureau or other sites which provide information to the public. This may seem like a small problem but it could have real effects. For instance, the site Trade Me not only hosts auctions but has an extensive message board system with up to 25,000 postings per day.44 Trade Me moderates these boards and even has a self-imposed alert system which notifies moderators of breaches of non-publication orders. Nonetheless, with a standard of strict liability and the potential for breaches

39 Ibid, at 1970

40 Ibid, at 1973

41 Ibid, at 1974

42 This concern is borne out by the notice-and-take down approach in the Mumsnet case. In that case, a notice of takedown was issued to the moderators of an online childcare forum. The notice alleged defamatory statements. The moderators were forced to settle even though they may have been able to argue defences such as innocent dissemination to the defamation action. See: ‘Parent website settles libel case’ BBC News (United Kingdom, 10 May 2007) <http://news.bbc.co.uk/2/hi/uk_news/6641693.stm> accessed 4 October 2011

43 New Zealand Legal Information Institute ‘Submission to the Justice and Electoral Select Committee on the Criminal Procedure (Reform and Modernisation) Bill 2010’ at 4

44 Trade Me Limited ‘Submission to the Justice and Electoral Select Committee on the Criminal Procedure (Reform and Modernisation) Bill 2010’

to go unnoticed, the fear of inadvertently breaching the law could render such services too risky to operate. This over-inclusiveness of who is caught by the strict liability offence was a theme of many of the submissions, including that of Trade Me itself.

Thus the tiered system is an adequate response to the problem, erring on the side of freedom of expression and limited liability for content hosts. Duly, offences involving these putative publishers require proof of knowledge. The reversal of the proposed Bill’s position on ISP liability shows that the legislature is alive to some of the difficulties and possibilities regarding control of the online sphere. As such, there is no particular concern that the legislature is out of touch with current realities. As in all law, the legislature may take some time to respond to newer problems within the justice system, but the Criminal Procedure Bill is a prime example of the legislative process working in the right direction. A perceived wrong was being committed to which the legislature felt it needed to respond. In turn, submitters seem to have convinced members of the Justice and Electoral Select Committee to pare back liability for ISPs. Knowledge-based liability for breaches of name suppression was evidently seen as the best response to the complications outlined above. As a midway point between immunity and responsibility for content, it probably strikes the right balance in the interests of fairness. It remains to be seen whether the rationale will apply in other areas of law beyond suppression.

Despite these positives, the legislature failed to engage with practical solutions which would have made the issue of ISP liability much more certain. Within the new law, there are no mechanisms for an ISP to accurately assess which names or particulars are suppressed, and there is no formal notice and take-down procedure. This is in contrast to Australia’s regime, which has given the Australian Communications and Media Authority (ACMA), a regulator equivalent to the Broadcasting Standards Authority (BSA), jurisdiction over online content, including the ability to issue take-down notices if the content is hosted in Australia, or to subject the offending content to filtering if it is hosted outside.45 These take-down notices are backed by the force of law.

45 ACMA’s co-regulation regime is established under Schedules 5 and 7 of the Broadcasting Services Act 1992 (Cth). See: ACMA website, ‘Online Regulation’ last updated 13 August 2010

<http://www.acma.gov.au/WEB/STANDARD/pc=PC_90169> accessed 4 October 2011; Deibert above n 30, at 392.

It is difficult to assess why the legislature did not turn its mind to these issues. Official advice from the Ministry of Justice only discusses the substantive issues raised above regarding the difference between strict and knowledge-based liability. Simply put, official advice did not countenance the need for a more official notice regime nor a more formalised take-down process, and the Committee did not ask for such advice.46 The Law Commission report did suggest the creation of a national register of suppression orders, though this has hitherto not occurred.47 Likewise, a formal notice and take-down procedure was adverted to by the Law Commission but no provision was made for one in the Bill.48 Either these have been oversights or deliberate policy choices. Nevertheless, the approach to intermediary liability tells us that, at least as regards name suppression, the legal framework has not been technologically determined. Rather, the legislature has taken steps to enforce traditional law online within the normal legislative process. That the regulation may be imperfect does not detract from the fact that the legislature has felt it within its competence to develop law addressing policy issues about enforcement.

Dealing with definitions: judicial consideration of name suppression:

The “Whaleoil” case

The case of Police v Slater is a prime example of how the courts can adapt to new technology without resorting to strained or unlikely analogies.49 Mr Slater was convicted in the District Court on nine counts of breaching name publication prohibition orders under ss 139 and 140 of the Criminal Justice Act 1985. The charges related to postings on his “Whaleoil” blog which revealed

In comparison to Australia, New Zealand has a rather liberal regime when it comes to content regulation. For instance, media organisations which are subject to Broadcasting Standards Authority jurisdiction in respect of the public airwaves on radio or television are not subject to the same conditions regarding their online content. This is in contrast to the Australian system. As some have noted, Australia’s internet regulation regime is one of the most rigorous in the world. This is evidenced by the existence of both opt-in and mandatory filtering schemes, a supervisory role for the media body ACMA (including the provision for take-down notices) and the criminalisation in most states of the distribution of objectionable material.

46 Ministry of Justice ‘Departmental report on Criminal Procedure (Reform and Modernisation) Bill 2010’ (16 May 2011) at 182-184; Ministry of Justice ‘ Supplementary Briefing on Criminal Procedure (Reform and Modernisation) Bill 2010 (7 June 2011) at [37]-[42]

47 Law Commission, above n 19, at 60-61

48 Ibid, at 66

49 Police v Slater [2011] DCR 6, Slater v Police, above n 3

the details of the subject through a combination of text, hypertext, photographs, pictorial representations and, in one instance, binary code. The convictions were upheld by the High Court in May 2011. Mr Slater has subsequently been granted leave to appeal to the Court of Appeal.50

Mr Slater’s arguments at both trial and appeal reveal an on-going perception among those in the blogosphere that the web is a special medium which may be immune from the consequences of law. Thus, for example, Mr Slater disputed that the concept of a ‘report or account’ in ss 139 and 140 could apply to a blog post by a member of the public, as this phrasing was aimed specifically at reporting by news organisations.

At trial, Harvey J gave the words ‘report or account’ a purposive construction, i.e. the term applied within the context of the provisions’ purpose of preventing publication of names and other details.51 A narrower construction, favoured by the defence, would have required that a ‘report or account’ involve eyewitness reportage from the court hearing itself.52 Harvey J noted that the notion of communication to others was an implicit part of the offence and drew an analogy with ‘publications’ under defamation law, which technically include communication from one person to another. Also, the notion of publication in that sense does not just include formal media organisations. He surmised overall that the narrow interpretation would simply make the current law unworkable.

On appeal, Slater attacked the conflation of ‘report or account’ with any “narrative or information” per Harvey J’s reasoning. This was viewed as too broad and potentially stifling to informal commentary or opinion. Once again, the narrow view was rejected because:

  1. The ordinary meanings of report and account were clearly broader than the narrow one advocated;
  2. In context, the provisions envision some kind of publication to a wide audience but there is no specific requirement that this need be via traditional media;

50 Slater v Police HC Auckland CRI-2010-404-379, 8 July 2011

51 Police v Slater, above n 49, at [17]

52 Ibid, at [83]

  3. The broad interpretation accurately reflects the concern for fair trials and protecting victims.53

The court’s decision is buttressed by the association of the publishing concept with the words ‘report or account’. Indeed, there is no reason to regard the words themselves with any special attention. The Law Commission, in its review of name suppression, has adverted to the fact that the Criminal Justice Act 1985 uses the three terms publication, report and account almost interchangeably.54 Therefore, the Law Commission’s recommendation was not to introduce a statutory definition of publish or of the words ‘report or account’. Instead, the Commission felt comfortable letting the courts determine the meaning on a case by case basis. This indeed seems the appropriate compromise in order for the law to continue to speak on issues surrounding name suppression.

Furthermore, the Slater decision is in line with current authority which gives publication a wide meaning of simply disseminating information in the public arena.55 The wideness of this characterisation seems a deliberate attempt to catch novel methods of publication even if they do not conform to traditional understanding. Additionally, in a different context, the courts have already determined that the notion of “publish” is not restricted to the news media.56 Specifically, publish has also been found to apply to particulars disclosed over the internet.57 This is surely now settled law, so much so that on appeal Slater accepted that publication applies to the internet. Finally, academic writing also supports a wide definition of ‘report or account’ in order to fulfil the purposes of the Criminal Justice Act.58

Judicial jujitsu and gerrymandering jurisdiction

This section will discuss the methods by which the judiciary has addressed the issue of a transnational internet and the putative jurisdictional issues this entails. The issue of jurisdiction is

53 Slater v Police, above n 3, at [67]-[69]

54 Law Commission Suppressing Names and Evidence (NZLC IP13, 2008) at 63; Law Commission, above n 19, at 66

55 Solicitor-General v Smith [2004] 2 NZLR 540.

56 Television New Zealand v Department of Social Welfare [1990] NZHC 299; [1990] 6 FRNZ 303 at 305

57 Re X [2002] NZAR 938

58 John Burrows and Ursula Cheer Media Law in New Zealand (5th ed, Oxford University Press, Melbourne, 2005) at 332

perceived as a vexing one given the nature of a transnational internet. Harvey notes that jurisdiction in the internet realm is a prime intersection point of differing views on the role of sovereignty, the nature of law and the role and flexibility of modern jurisprudence.59 The problem, at a conceptual level, is that we are conditioned to view law at the level of physical presence.60 As we will see, however, the issue is not as complex as a cursory glance suggests, and realist rather than radical solutions are available.

Criminal hate speech

New Zealand does not have express hate-speech legislation, but there is a limited offence under s 131(1) of the Human Rights Act 1993 of publishing material ‘likely to excite hostility or ill-will against, or bring into contempt or ridicule, any such group of persons in New Zealand on the ground of the colour, race, or ethnic or national origins of that group of persons’. The notion of publish is broad and patently covers online material per s 61. Nonetheless, the offence has never been used, given that it requires the Attorney-General’s consent to bring a prosecution per s 132 of the Act. The possibility nevertheless exists that this type of content regulation could apply to the online sphere.

As regards jurisdiction, well-established rules exist to deal with extraterritorial criminality. While s 6 of the Crimes Act 1961 states that nothing done outside New Zealand can be tried as an offence in New Zealand, s 7 provides that where any part of an offence necessary for its completion occurs in New Zealand, the offence will be deemed to have been committed in this country.

The same approach has been applied judicially in regards hate speech legislation in the United Kingdom. In R v Sheppard the accused were charged with the offence of possessing, publishing and distributing racially inflammatory material via the internet under s 19 of the Public Order Act 1986.61 The defendants argued that since the material was routed through a server in

59 Harvey, above n 6, at 45

60 Ibid, at 40

61 R v Sheppard [2010] EWCA Crim 65

Torrance, California, the offence was not committed in the UK and the material came under the protection of the First Amendment in the US.

In doing so, the defence asked the court to adopt a country of origin approach, whereby jurisdiction applies only where the host server is based.62 The defence attempted to get the Court of Appeal to engage in determining which theory of jurisdiction ought to apply in the UK. Instead, the court sidestepped the debate by utilising the available substantive test of jurisdiction. To this end, the court applied the test in R v Smith (Wallace Duncan) (No 4) [2004] CA, which asks whether a substantial measure of the offence took place within the jurisdiction. This was a direct decision not to engage in debates about formal jurisdiction but to decide the case on a practical basis by applying extant, settled law.

In doing so, the court also captured the spirit of the Smith decision in its determination to ‘move away from definitional obsessions and technical formulations aimed at finding a single situs of a crime by locating where the gist of the crime occurred or where it was completed. Rather, they now appear to seek by an examination of relevant policies to apply the English criminal law where a substantial measure of the activities constituting the crime take place in England, and restricts its application in such circumstances solely to cases where it can seriously be argued on a reasonable view that these activities should on the basis of international comity not be dealt with by another country.’63

In terms of the substance of the offence, the court was in no doubt that the major parts (the generation, editing, uploading of material) occurred in England. The role of the server was one of mere transmission, i.e. the server was not a novus actus as it was merely a mechanism for displaying the material collated in England. Thus, on the basis of the substantive test, the jurisdictional requirements were met.64

62 Ibid, at [33]. A similar argument was made in Police v Slater but swiftly dropped.

63 Cited M Dyson ‘Public Order on the Internet’ (2010) 2 Archbold Review 6 (my italics)

64 R v Sheppard, above n 61, at [32]

The simple and practical approach to jurisdiction extended to the definition of publication. Scott Baker LJ confirmed the trial judge’s statement that publication entails giving the public access to material.65 The judge baulked at a more refined definition, preferring the ordinary meaning given in the Shorter Oxford Dictionary and case law which held that “making available” material to the public amounts to publication.66 This notion of publication matches the statutory definition of publish given in s 61 of the Human Rights Act 1993 and the conclusion of Harvey J in Police v Slater discussed above.

As Dyson notes, the application of the substantial measure approach was a useful curb on the prospect of exponential jurisdiction and an effective way for the English courts to direct their resources, i.e. at crimes which have a genuine connection to the jurisdiction in question, without having to assume jurisdiction over vast swathes of internet activity.67 The Court was concerned not to extend itself too far by claiming an overarching jurisdiction over matters perhaps best left to other jurisdictions.

Tort

Tortious jurisdiction has a somewhat more nuanced method of working through jurisdictional problems. This is because of the potentially greater risk that even if ‘the defendant’s tortious activities de facto only occur in one jurisdiction, the global reach of the Internet can be troublesome for the unsuspecting defendant’.68 On the other hand, the jurisdiction minefield can also be troublesome for the plaintiff in terms of whether she can sue where the damage was suffered or has to avail herself of a foreign jurisdiction for relief.

In defamation, EU law states that the place of publication is where the wrong occurred, i.e. where the publisher is established.69 In comparison, the English and Australian approach, since Dow Jones v

65 Ibid, at [34]

66 R v Perrin [2002] EWCA Crim 747 at [22] per Kennedy LJ

67 Dyson, above n 63

68 Julia Hornle ‘The Jurisdictional Challenge of the Internet’ in Lilian Edwards and Charlotte Waelde (eds.) Law and the Internet (3rd ed, Hart Publishing, Oxford, 2009) 121 at 134

69 Ibid, at 136. See Shevill v Presse SA [1995] EUECJ C-68/93; [1995] ECR I-415

Gutnick, has been to assume that publication occurs where the material is downloaded.70 The approach has been sanctioned in obiter comments by the Court of Appeal in New Zealand.71 The plaintiff in Gutnick was affirmed in his right to sue an American newspaper publisher for material originating in that jurisdiction but downloaded in the Australian state of Victoria. Thus, the courts have simply followed the ordinary rules of publication that apply to all broadcast and written media, including international periodicals. In doing so, the High Court of Australia rejected the idea of a targeting approach, which had been the preference of Hedigan J in the lower court.72 The court asserted jurisdiction from the mere fact that the plaintiff had a reputation he wished to protect and vindicate in an Australian court. In contrast, a targeting approach would mean a fine-grained analysis of the nature of the publisher’s intention to publish in Victoria. The targeting approach is displayed in US authority, which assumes that the key question is whether a publisher intends to direct their website to a specific audience.73

The Gutnick rule has a potentially major drawback in relation to freedom of expression. It allows complainants to begin a suit in the country with the most favourable defamation laws, not necessarily their own (libel tourism). This may have a chilling effect as publishers will need to be wary of being held to different standards to those in their home country. Nonetheless, legal realities again step in to prevent a disproportionate impact on freedom of expression.

One of these legal realities is the nature of the tort itself. Defamation requires injury to reputation, which by its nature is place-specific. Thus the prospect that a publisher may face multiple debilitating suits in multiple jurisdictions is far-fetched simply because a claimant must meet this minimum requirement. Another limitation (discussed below) is the doctrine of ‘forum

70 Dow Jones & Co Inc v Gutnick [2002] HCA 56

71 Nationwide News Pty Limited v The University of Newlands [2005] NZCA 317 (9 December 2005)

72 Gutnick v Dow Jones & Co Inc [2001] VSC 305 (28 August 2001)

73 See Diane Rowland ‘Free Expression and Defamation’ in Mathias Klang and Andrew Murray (eds.) Human Rights in the Digital Age (Glasshouse, London, 2005) 55-70. US jurisprudence: on copyright infringement and targeted jurisdiction, see ALS Scan Inc v Digital Service Consultants [2002] USCA4 124; 293 F.3d 707; on defamation and targeted jurisdiction, see Young v New Haven Advocate [2002] USCA4 228; 315 F.3d 256 (4th Cir 2002), which found no jurisdiction because there was no intent to aim website material at an audience from other states. The ECJ has also held that placing material online is not an automatic publication/transfer to another jurisdiction because the website itself does not have the technical means to send information automatically: Case C-101/01 Lindqvist, judgment of 6 November 2003

<http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2004:007:0003:0004:en:PDF>

What is crucial is the act of the person who downloads at the other end; the internet requires data to be sought out. As Rowland points out (p 63), this statement about the non-automatic nature of publication may be of little comfort to the victim of a damaged reputation, but it has plenty of practical benefit in limiting the global reach of publication. In other words, the idea that the mere passive existence of offending material online is not within the cognisance of the court helps to solve the issue of courts being too involved in policing the online sphere.

non conveniens’ preventing complainants from attempting to sue in places to which they have little connection.74 Finally, the courts’ starting point for service is governed by High Court Rule 219. As discussed below, this rule, along with r 131, has been used to temper the extensive jurisdiction which the Gutnick rule creates. In the next part I will discuss how each of these realities helped to deny jurisdiction in the Nationwide News case.

Service and Protest to Jurisdiction

The Nationwide News case began with the publication of potentially defamatory material on the website of The Australian newspaper regarding the activities of a private educational business (University of Newlands) in New Zealand. The New Zealand business was placed on a list of so-called low-quality “degree mills”, but the complainant’s name could not be discovered without visiting a second site. Newlands sued for defamation in New Zealand, but Nationwide News (NN) filed proceedings of protest to jurisdiction under r 131. Newlands had claimed to be able to file suit under r 219, which provides:

Where in any proceeding a statement of claim or counterclaim and the relevant notice of proceeding or third party notice cannot be served in New Zealand under these rules, they may be served out of New Zealand without leave in the following cases:

(a) Where any act or omission for or in respect of which damages are claimed was done or occurred in New Zealand

This is directly relevant to the jurisdiction issue and the application of the Gutnick rule, as NN challenged the filing on the basis that the wrong had been committed in Australia. In the High Court, Associate Judge Gendall answered this question with reference to Gutnick and therefore held that the act occurred in New Zealand:

I agree with the position taken by the High Court of Australia. To my mind, if a defendant chooses to upload information on the internet, being aware of its reach, then they assume the associated risks, including the risk of being sued for defamation. If it

74 Berezovsky v Forbes [1999] EMLR 278; Jameel v Dow Jones & Co Inc [2005] EWCA Civ 75; [2005] QB 946. Also note: Nationwide News v University of Newlands, above n 71, at [50]-[57]

were held otherwise, namely, that publication occurred at the place of uploading, defendants could potentially defame with impunity by uploading all information in countries with relaxed, or no, defamation laws.75

The Court of Appeal, without deciding the point, accepted that Gutnick effectively stated New Zealand law but took a more nuanced view of the jurisdiction question.76 Nominally, this was because the High Court had failed to account for the fact that the plaintiff may have been the only person to download the defamatory material in New Zealand, meaning that harm to reputation could be non-existent. It was argued that no evidence was presented to show that anyone other than the plaintiff knew of the material, and that there was difficulty connecting the comments made on The Australian website with a known person in New Zealand.

The Court of Appeal was therefore using one of the elements of defamation: that for harm to occur there must also be a reputation to protect. In the end, the Court declined to decide the case on the basis of Gutnick alone and instead focussed on the fact that the defamation claim lacked merit. Still, one of the points made in relation to merit was that little damage to reputation could result if no one downloaded the material in the jurisdiction in which it was claimed harm had occurred.

In turning to the merits, the Court noted that the application of r 219 should be determined by two factors: whether there was “a good arguable claim on the merits and a strong probability that the claim falls within the letter and spirit of the rule about service abroad”.77 The facts relating to publication were therefore to be considered as part of this calculus. The Court also stressed another point made in the Kuwait case: “If the dispute has little connection with New Zealand and it could be seen as exorbitant to assert jurisdiction over the foreigner, a stricter standard may well be appropriate”.78 In order to pass the good arguable claim test, the plaintiff

75 University of Newlands v Nationwide News Pty Ltd (2004) 17 PRNZ 206 at [35]

76 Nationwide News v University of Newlands , above n 71, at [22]

77 Kuwait Asia Bank EC v National Mutual Life Nominees Ltd (No 2) [1989] 2 NZLR 50 (CA) at 54. Though note decision overturned on unrelated grounds: Kuwait Asia Bank EC v National Mutual Life Nominees [1990] 3 NZLR 513 (PC)

78 Ibid, at 54

also had to supply sufficient evidence, usually by affidavit; however, it was not the court’s role to determine areas of fact properly dealt with at trial.79

On the facts, the Court held there was no good arguable case on two main grounds. The first was the lack of evidence adduced about the damage to the plaintiff’s reputation. Although this is not strictly a requirement of defamation as such (reputation is assumed absent evidence to the contrary), there was scope to take it into account when considering whether there was an arguable case. Secondly, the court noted that there was a paucity of evidence proving dissemination to a wide audience. Indeed, it may have been possible that the plaintiff was the only person in New Zealand to download the material. This absence of evidence was coupled with the fact that it was difficult to associate the post on the Australian website with the plaintiff unless a second site was viewed.

The court also made a rather fascinating comment in which it claimed that the ordinary rule that ‘publish’ need not involve publication to any particular person may not apply in the case of an internet posting.80 The plaintiff had apparently argued that the issue of publication was non-contentious as the defendant was a mass media entity clearly identifiable as a publisher. The Court noted, however, that this assumption may not necessarily follow because, in order for the plaintiff to be identified from the published piece, a reader needed to seek further information from a second site. The Court then noted that ‘whether the present case naturally falls within the mass media principle is, we think, debatable’.81 This comment implies that for publications other than newspapers, radio and television, the plaintiff may have to prove that the publication was directed at or even reached specific people in order for it to be considered defamation. This is a departure from the standard set in Gutnick, which viewed the online news medium as an almost seamless translation of existing broadcasting and publishing, albeit with some differences in speed and scope.

This tougher standard has also been applied in numerous cases in the UK in which the courts have not simply assumed publication because the material was available online. Thus in Jameel v

79 Bomac Laboratories Ltd v F Hoffman-La Roche Ltd (2002) 7 NZBLC 103,627 (HC) at 28

80 Nationwide News v University of Newlands, above n 71, at [48]

81 Ibid, at [48]

Wall Street Journal, the English Court of Appeal noted the fact that only five people had accessed the offending comment, three of whom were associated with the plaintiff.82 The court dismissed the case on the grounds that damage to reputation would be minimal and damages would be nominal at best; pursuing the case to trial at such cost would therefore essentially be an abuse of process. The same approach was taken in another English case, Al Amoudi v Brisard, in which no proof was shown as to how many readers had accessed the offending material.83 Finally, in Carrie v Tolkien, the court also refused cognisance of the case as the offending statement was only accessible for a matter of hours and there was again no proof of readership.84 As Edwards notes: ‘[t]he lesson here for a libel plaintiff is that the courts want hard log statistics on the number of page views and downloads, will not presume a readership, and that even if the action is allowed, the level of damages will be crucially dependent on the number of readers’.85

In Nationwide News, the Supreme Court rejected an appeal by the plaintiff arguing that the Court of Appeal had misapplied the law of defamation as it related to the internet. The NZSC saw the appeal as an attempt to re-litigate the evidence as to publication in New Zealand and the lack of identification of the plaintiff within the original article. With respect, the Court could have used the opportunity to clarify the New Zealand position regarding online publication in general. Since the Court of Appeal had declined to decide the case on the jurisdictional issues and opted instead for an assessment of the merits, it would have been helpful to better understand the court’s view on how these two aspects intersect. On the strength of the Court of Appeal’s reasoning, it seems difficult to separate the two concepts. The Court of Appeal showed this by referring back to the idea of publication in its discussion on the merits rather than confining itself to the fact that the plaintiff’s case was probably untenable due to a paucity of evidence.

The Court of Appeal treated the claim of publication as an evidentiary matter, concluding:

We seriously doubt that there was sufficient material before the Associate Judge to demonstrate that an act in respect of which damages are claimed had occurred in New Zealand, as required in terms of r 219(a). Typically the occurrence in New Zealand of something which is relied upon to justify service of the proceeding overseas, will not be

82 Jameel v Dow Jones, above n 74

83 Al Amoudi v Brisard [2001] 1 WLR 113

84 Carrie v Tolkien [2009] EWHC 29 (QB)

85 Edwards, above n 7, at 58

contentious. Its occurrence will be a matter of record or at least something of which there is available evidence. But that cannot be said in this instance. It is at most a matter of supposition that someone in addition to the second plaintiff has downloaded the relevant material.

Therefore, it seems that as regards defamation, the courts may develop a nuanced perspective on the jurisdiction issue by not applying the Gutnick rule directly but supplementing it with a targeting approach. Nationwide News appears to be a case of the judiciary fulfilling a gatekeeping function to keep out speculative claims of tortious wrong, defending the integrity of the New Zealand legal system in the process. The approach taken in relation to rules 131 and 219 shows that the judiciary does not want to create a bright-line rule regarding online defamatory publications. Instead, the courts would rather retain the option of looking at the specific factual matrix to determine if the action has merit in the first place. The test adopted is stricter than the threshold of tenability adopted on strike-out applications, meaning the judiciary is also alive to concerns about forum shopping as well as other complications such as republication. It seems therefore that the Supreme Court has approved a test designed to mitigate the damaging effects of a multiple-publication, download-based jurisdiction rule.

In extremis, the court also has recourse to the doctrine of forum non conveniens. This applies where two or more courts have jurisdiction over the same proceedings. The doctrine seeks to establish which jurisdiction is most suitable to try the proceedings in the interests of the parties and the furtherance of justice. The argument was run unsuccessfully in Gutnick. The Australian approach is to assume jurisdiction unless it is shown that Australia is clearly the inappropriate forum. The New Zealand rules differ: the initial burden is on the defendant to persuade the court to grant a stay, but this shifts to the other party once it is established prima facie that there is another appropriate forum. The plaintiff must then show special circumstances establishing that justice is better served by the trial taking place in New Zealand.86

86 Harvey, above n 6, at 80

PART III—REGULABILITY: THE NORMATIVE PERSPECTIVES

Normative arguments from both sceptics and proponents of regulation are an important part of understanding the governability or otherwise of the Internet. Digital liberals argue that the Internet is a separate space which deserves treatment as such. To this end, they advocate a self-regulation or specialist cyber-law approach online, both because this conforms to their ideological program and because, to them, it best matches the cultural expectations of the medium. On the other hand, realists argue that traditional legal frameworks are apt to deal with the new medium and therefore the Internet should be treated no differently from any other space. Other schools of thought exist which privilege other methods of dealing with Internet regulation. Notably, the transnational school advocates the harmonisation of national law and the creation of public international law to deal with issues arising from a transnational Internet.87 In addition, the code school led by Lawrence Lessig advocates a balance between formal (traditional) and informal (self-regulatory) methods by altering the very architecture upon which the Internet sits according to policy preferences about behaviour.88 Both these schools (and others) throw light on aspects of the regulatory conundrum but it is beyond the scope of this essay to discuss them in depth. Instead the focus is on the implications of either a governed or ungoverned internet, which is best examined with reference to the viewpoints of realists and liberals. After all, it is the aim of this essay to examine this fundamental tension and it is therefore not helpful to be side-tracked into other debates.

In this section I will examine both the sceptics’ and the traditionalists’ normative claims about the governability of cyberspace. This will show that the balance of the normative argument has swung in favour of realists who recognise the value of traditional law. Then I will look deeper at some of the arguments about cyber-culture to discover if there is any merit to the claim that cyberspace has cultural characteristics worth protecting from the force of ordinary law. Finally, I will examine how the judiciary has responded to such normative and cultural arguments when they are made in defence against the application of traditional laws.

87 Ibid, at 110-115. Also, Rolf Weber Shaping Internet governance: regulatory challenges (Springer, Berlin, 2010) at 3

88 Lawrence Lessig Code and other laws of Cyberspace (Basic Books, New York, 1999)

For starters, self-regulation is underpinned by notions of digital liberalism and anarchism. The earliest and most vociferous opposition to government regulation online originated in a campaign by the Electronic Frontier Foundation, led by John Perry Barlow. In 1996, Barlow issued an infamous manifesto entitled ‘A Declaration of the Independence of Cyberspace’ which proclaimed that individual state sovereignty did not extend to the realm of the web. In irascible prose Barlow proclaimed:89

I declare the global space we are building to be naturally independent of the tyrannies you seek to impose on us. You have no moral right to rule us nor do you possess any method of enforcement we have true reason to fear.

Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders.

Plainly, the anarchic element of this libertarian program has not come to fruition. While early digital liberals were united by the idea that the net was ungovernable, this view fell away through experience as early legislative and judicial action began to bite. As demonstrated, states including New Zealand have assumed cognisance over elements of online behaviour despite protestations that it ought to be wholly unregulated. However, the softer form of digital liberalism propounded by others has persisted. This is because, instead of focusing on descriptive arguments about the Internet, proponents have shifted to normative ones, stating that even if content is regulable in fact, it should not be.

Prominent theorists maintain that the transnational and de-centralised nature of the net supplies the ingredients which allow the web to prosper. As such, these elements must be preserved and therefore the prospect of state regulatory cognisance is anathema. These theorists also note that the global internet alters the link between geography and the power and legitimacy of local governments to assert law. Additionally, Johnson and Post observe that the notification

89 John P Barlow ‘A Declaration of the Independence of Cyberspace’ (1996) Electronic Frontier Foundation,

<http://w2.eff.org/Censorship/Internet_censorship_bills/barlow_0296.declaration> accessed 1 October 2011. See Aimee Hope Morrison ‘An impossible future: John Perry Barlow’s “Declaration of the Independence of Cyberspace”’ (2009) 11 New Media & Society 53

necessary for effective law is lacking due to the extra-territoriality of the web. In sum, they conclude:

The Net thus radically subverts the system of rule-making based on borders between physical spaces, at least with respect to the claim that Cyberspace should naturally be governed by territorially defined rules.90

These theorists then propose various mechanisms to create a self-regulated or semi-regulated online sphere. Some favour contractual models analogous to the medieval lex mercatoria, and others favour versions of consensual self-regulation.91 Johnson and Post themselves posit the most interesting theory, which is to jettison national territories as units of jurisdiction and instead rely on international organisations or bands of users to create regulations within individual network systems.92 This model keeps the essence of the informal governance model whilst attempting to place a layer of order over the potential anarchy of an unregulated web.93

Against regulatory scepticism

Regulatory realism seeks to counteract the view of the Internet as a place which requires different rules to achieve regulatory aims. The charge was led by Frank Easterbrook when he proclaimed that there is no more a law of cyberspace than there is a ‘Law of the Horse’. Easterbrook’s motto was meant to emphasise that general rules are much better equipped to deal with new technology issues. It will not suffice to select one category—for example, horses—and attempt to discuss law only as applicable to that category. A study of the ‘law of the horse’ is bound to misidentify key issues and pass by unifying concepts. Easterbrook provides an apt summary of the realist argument: the internet presents no significant difficulties for ordinary law.

90 David Johnson and David Post ‘Law and Borders—The Rise of Law in Cyberspace’ (1996) 48 Stan.L.Rev 1367 at 1370

91 Trotter Hardy ‘The Proper Legal Regime for “Cyberspace”’ (1994) 55 U.Pitt.L.Rev 993 at 1015-1025; Joel Reidenberg ‘Lex Informatica: The Formulation of Information Policy: Rules Through Technology’ (1998) 76 Tex.L.Rev. 553; Weber, above n 87, at 17-21

92 See Harvey, above n 6, at 124

93 The early desire for a web unregulated by government had some success in the 1990s in the area of Internet architecture. The arguments of liberals led the US government to divest itself of its power to assign domain names and create a non-governmental body, ICANN, to oversee this aspect of Internet infrastructure. The process by which internet protocols moved from a tiny military research and communication system to a mass consumer-oriented service, and the governance behind this shift, is aptly laid out in Milton Mueller Ruling the Root: Internet Governance and the Taming of Cyberspace (MIT Press, Cambridge, 2002). While the ICANN charter model does not fulfil the most grandiose wishes of cyber-libertarians, it is at least tacit acknowledgement that non-governmental cognisance over internet structure can be supported by states.

Easterbrook’s argument has been fleshed out by other realists. Thus, Jack Goldsmith notes that the underlying basis of the self-regulation argument is wrong. Cyberspace is not a space apart from real space; it mirrors physical space in the sense that both involve real people interacting with each other, with real territorial consequences. That those territorial consequences may be difficult to resolve is not a reason to jettison the traditional doctrine of conflict of laws. Moreover, regulation sceptics underestimate the role of traditional legal tools in order to further their normative argument. Goldsmith also notes that sceptics suffer from a conceptual difficulty because they fail to distinguish between default laws and mandatory ones. The former can be modified to suit circumstances but the latter are paternalistic in nature so as to restrict private legal ordering, the very thing sceptics crave.94

Neil Weinstock Netanel has also taken up the cudgel to attack the normative basis of the sceptics’ claims. He notes that the liberals’ argument rests on two understandings of liberal democratic doctrine. The first is that self-regulation fully realises liberal democratic goals including ‘individual liberty, popular sovereignty and consent of the governed’.95 The second is that a liberal state exists to grant autonomy to those who seek it. Simply put, this claim is that imposing state norms is a colonial intrusion because the community has self-defined as different.96 Netanel refutes both claims as either misunderstandings of the liberal project or insufficiently democratic. An unregulated net would exacerbate problems with counter-majoritarianism and direct democracy. Furthermore, even if not regulated by the state, the net would develop its own institutions to regulate behaviour, but these would simply re-create the issues of legitimacy libertarians have with state governance. In the worst case, self-governance is a recipe for illiberal activities including discrimination in both status and access, invasions of privacy and deep inequality.97

Another group of realists goes further and attacks the very basis of the self-regulatory idea. They claim that the metaphor of cyberspace and the reality simply do not match. It may have been true at the very beginning of the internet’s development that it exhibited elements of other-ness which rendered it difficult to classify. However, this argument has long since outlived its usefulness. As Wu notes, the Internet is not like a place at all but a ‘multiple-use network,

94 Jack Goldsmith ‘Against Cyberanarchy’ (1998) 65 U. Chi.L.Rev. 1199

95 Neil Netanel ‘Cyberspace Self-Governance: A Skeptical View from Liberal Democratic Theory’ (2000) 88

Calif.L.Rev. 395 at 402

96 Ibid, at 402-403

97 Ibid, at 497-498

capable of supporting any kind of application anyone want[s] to run on it’.98 Notwithstanding any sense of community users may feel, there is no adequate justification for completely divorcing virtual space from physical space in the manner desired by regulatory sceptics.99

Cyber-culture

The normative arguments of regulation sceptics manifest themselves in specific points made about the cultural difference of the web. Cyber-libertarians claim, first, that the web has legitimate internal processes and norms for dealing with offensive speech and, secondly, that the culture encourages robust exchanges which may not meet legal guidelines but are part of the online experience.

In support of this, they make five examinable claims:

98 Timothy Wu ‘When Law & the Internet First Met’ (2000) 3 Green Bag 2d 171 at 173

99 See Mark Lemley ‘Place and Cyberspace’ (2003) 91 Cal. L. Rev. 521

These claims, on the whole, are not persuasive. First, rather than being conversational, web dialogue and postings can range from the ephemeral to the permanent and published. It is a continuum, not an absolute, and courts have had no trouble broadening the concept of publishing to encompass the placement of text or other material on a website. As Judge Harvey noted in Slater, the web is different to coffee table conversation.100 Simply put, the web is a publishable medium, not a conversational one, due to the fact it is readable, writable, editable and archival.

Second, the cultural argument assumes a web homogenous in its adherence to these norms and values. In fact, there is no way that all participants adhere to rules about citing material in support of an argument. Participants’ reasons for being on the web differ. For instance, the title ‘blogger’ could apply to a citizen journalist concerned with reporting political information or simply to a personal diarist with a passion for cooking. These citizen journalists do have an interest in adhering to the norms of the culture because in this way they obtain reciprocal treatment, but this does not apply to everyone.101

Thirdly, the culture argument overstates the ability to rebut assertions made. The efficiency of spreading information is both a virtue and a curse in this sense, because information can spread so rapidly that it cannot then be recovered.102 Theoretically, low barriers to entry mean self-help methods to correct erroneous information are available to everyone. Such idealistic assumptions led the US Supreme Court to opine that ‘any person with [an Internet connection] can become a town crier with a voice that resonates farther than it could from any soapbox’.103 In reality, the picture is much less rosy. A putative speaker can set up her own website but the likelihood is that it will receive substantially less traffic than established ones. Only a small number of websites attract the majority of traffic; the rest are read by very small numbers of readers or none at all. This led one commentator to note that expressing thoughts online ‘is less like crying oyez from the central marketplace and more like whispering in a labyrinth’.104

100 Police v Slater, above n 49, at [134]

101 Mark Cenite et al ‘Doing the right thing online: a survey of bloggers’ ethical beliefs and practices’ (2009) 11 New Media & Society 575

102 Sunstein notes the phenomenon of information cascades helps spread rumour in an environment that allows for instant provision and recovery of information, see Cass Sunstein ‘Believing False Rumours’ in Saul Levmore and Martha Nussbaum (eds.) The Offensive Internet (Harvard University Press, Cambridge, 2010) 91

103 Reno v ACLU [1997] USSC 73; 521 US 844 at 870 (1997)

104 Beth Simone Noveck ‘Designing Deliberative Democracy in Cyberspace: The Role of the Cyber Lawyer’ (2003) 9

BU J Sci & Tech L. 1 at 26

Also, the ability to rebut is subject to the vagaries of whichever platform hosts the offending content. Some discussion boards, for instance, only extend commenting privileges to invited members, making instant rebuttal more illusory than real.105 Furthermore, disapproval of posted content may not be enough. Many have noted that such disapproval is much weaker given the Internet’s allowance of ‘anonymous or pseudonymous expression by speakers not physically proximate to—and therefore not at risk of retribution from—those with whom the speaker disagrees’.106

Finally, it is a myth that the Internet is a low authority medium. In fact, search engines promote authority because of the way in which they rank results, i.e. the number of page views and links determines the ranking of pages. Search engines are not designed to seek the truth or non-offensive content but display content according to a complex algorithm based substantively on popularity. Simply put, search engines are not an objective method of determining truth. In fact, by their nature they may make truth harder to come by.107 Furthermore, the Internet’s mechanisms and practices create salience around topics of interest. These mechanisms, such as linking, are authoritative in the sense that they ‘generate a common set of themes, concerns and public knowledge around which a public sphere can emerge...Users self-organise to filter the universe of information that is generated in the network. This self-organisation includes a number of highly salient sites that provide a core of common social and cultural experiences and knowledge that can provide the basis for a common public sphere’.108 The fact these common experiences are possible militates against the view that users do not take online content seriously.

Nonetheless, some litigants have advanced the separatist cyber-culture argument; in particular, defamation cases tend to attract arguments that the kinds of comments made online do not deserve the usual legal scrutiny.

105 Dawn Nunziato ‘The Death of the Public Forum in Cyberspace’ (2005) 20 Berkeley Technology and Law Journal 1115. See also Anthony Varona ‘Toward a Broadband Public Interest Standard’ (2009) 61 Administrative Law Review 1 at 53-58

106 Varona, above n 105, at 59

107 Eric Goldman ‘Search Engine Bias and the End of Search Engine Utopianism’ (2006) 8 Yale J.L. & Tech 188. For a rebuttal to the assertion that search engines widen and equalise political speech see: Mathew Hindman, Kostas Tsioutsiouliklis and Judy Johnson ‘Googlearchy: How a Few Heavily-Linked Sites Dominate Politics on the Web’ (Paper Presented at the Annual Meeting of the Midwest Political Science Association, Chicago, 31 March 2003)

108 Yochai Benkler Wealth of Networks: How Social Production Transforms Markets and Freedom (Yale University Press, New Haven, 2006) at 256

The Gutnick case is important because it appears to mark the nadir of the concept of specialist internet law.109 On the one hand, Kirby J was willing to accept the idea that the internet required a new perspective as advocated by Dow Jones. He noted that the features of the web did make it different to other, extant technology and broadcasting systems in the way it was truly global, continuous, flexible and perhaps even incompatible with law conceived and applied long before the internet was even thought of. In direct terms, Kirby J lamented the fact that ‘this court should solve the present problem by reference to judicial remarks in England in a case, decided more than a hundred and fifty years ago, involving the conduct of a manservant of a Duke, despatched to procure a back issue of a newspaper of miniscule circulation’.110 Nonetheless, Kirby J concluded that the uniqueness of the web was overstated and in fact it incorporated many characteristics of other technology which courts had had little difficulty drawing within their ambit.

Callinan J was less receptive to the contention that the internet should be treated as a special case. He stated bluntly that Dow Jones’ arguments about the uniqueness of the technology were ‘not very convincing’.111 The judge also outright rejected the implication that, because the internet created a ubiquity of information, enforcement of established law or attempts to regulate that ubiquity were futile. This argument could not succeed because to allow it would be to suggest ‘that any attempt to control, regulate or even inhibit [the Internet’s] operation, no matter the irresponsibility or malevolence of a user, would be futile and that therefore no jurisdiction should trouble to try to do so’.112

The culture argument was brought home in the defamation case of O’Brien v Brown.113 Judge Ross refused to accept the argument that the internet required special treatment as a venue where comments did not need to conform to usual rules of defamation. The defamation related to comments posted on an online newsgroup/email list. The defendant claimed hostile, insulting

109 Paul Sumpter ‘The end of Internet law?’ (2003) NZLJ 7

110 Gutnick, above n 70, at [92]

111 Ibid, at [180]

112 Ibid, at [186]

113 O’Brien v Brown [2001] DCR 2001

comments were a normal practice online and on this newsgroup in particular. The judge strongly rejected this evidence, stating:114

I know of no forum in which an individual citizen has the freedom to say what he likes and in any manner he wishes, about another individual citizen with immunity from suit for all consequences. Merely because the publication is being made to cyberspace does not alter this. I am not aware of any precedent for internet-type material deriving protection from action in the tort of defamation. There can be no question that publication on the internet counts as publication for defamation purposes.

Despite these clear expressions, the debate as it relates to defamation may not be over. A very recent defamation action in Canada was dismissed at first instance on the grounds that it occurred in the blogging medium to which special rules might apply. Judge Annis regarded blogs as “a form of public conversation”,115 by which he meant that the medium has a “back and forth character...[which] provides an opportunity for each party to respond to disparaging comments before the same audience in an immediate or a relatively contemporaneous time frame.”116 Put simply, he viewed blogging as akin to live debate not a medium of publication.

There is some debate over which aspects of the judge’s comments amount to the ratio and which are obiter. A conservative reading would separate the judge’s views on the defamatory meaning of the statement from the comments directed at blogs in particular. However, there seems no way to logically separate the two arguments, as the judge’s reflections on the context of the comment were immediately relevant to his determination that it lacked a defamatory meaning. Thus, on a radical reading, Judge Annis appears to create a side rule in the law of defamation to deal with the new phenomenon of online commentary. With respect, the judge probably took his consideration of the online context too far in not considering the defamatory meaning on a standalone basis. Nevertheless, the judge’s reasoning is worth bearing in mind given it could be persuasive in a New Zealand context.

114 Ibid, at [7.13]

115 Baglow v. Smith (2011 ONSC 5131) at [59]

116 Ibid

In applying the cultural argument to the law of defamation, Judge Annis stated that the contextual circumstances were important in determining defamatory meaning, thus: 117

In essence, I am suggesting that the Court, in construing alleged defamatory words in an ongoing debate, should determine whether the context of the comment from the perspective of the reasonable reader or listener is one that anticipates a rejoinder, which would eliminate the possible consequence of a statement lowering the reputation of the plaintiff in their eyes.

The Canadian authority may be supported by case law in the UK. In Smith v ADVFN, Eady J remarked that:118

Bulletin board postings are rather like contributions to a casual conversation (the analogy sometimes being drawn with people chatting in a bar)...they are often uninhibited, casual and ill thought out. Those who participate know this and expect a certain amount of repartee or “give and take”.

The conclusion Eady J drew was that such communication was more akin to impermanent slanders than libels and therefore less likely to draw sanction. The legal rationale for this in defamation terms could be that the statements are not defamatory if they are ‘obviously, in their context, either vulgar abuse or fair comment’.119 Despite these comments, both UK and US courts have been prepared to countenance web 2.0 tools such as Facebook and Twitter within the defamation jurisdiction.120 Obviously these tools are akin to the bulletin board postings envisioned by Eady J.

The idea that the Internet allows for more leniency as regards the ‘give and take’ of open debate has been reinforced in the United States. US law is more concerned than most with striking the right balance between the right to freedom of expression and the right to reputation. Therefore, there is greater amenability to the notion that the Internet has created hitherto unparalleled

117 Ibid, at [65]

118 Smith v ADVFN [2008] EWHC 1797 (QB) at [14].

119 Ibid. at [106]

120 Firsht v Rafael [2008] EWHC 1781 (QB); Andrew Johnson and Ian Griggs ‘Love's online spat sparks first Twitter libel suit’ The Independent (United Kingdom, 29 March 2009)

<http://www.independent.co.uk/news/media/online/loves-online-spat-sparks-first-twitter-libel-suit- 1656621.html> accessed 4 October 2011

methods of self-help to deal with issues of defamation than the law originally envisaged. This is displayed in the use of the ‘retraction’ defence in US case law. Thus in Mathis v. Cannon the Supreme Court of Georgia applied a statutory defence of retraction to allegations of defamation emanating from comments made on a Yahoo! bulletin board.121 This defence bars punitive damages unless the plaintiff has already requested a retraction of the libellous statement. It encourages defamation victims to seek self-help as their first remedy by using available opportunities to contradict the lie or correct the error and thereby minimise its adverse impact on reputation. In short, it strikes a balance in favour of "uninhibited, robust, and wide-open" debate in an age of communications when "anyone, anywhere in the world, with access to the Internet" can address a worldwide audience of readers in cyberspace.122 The Delaware Supreme Court also recognised self-help as a remedy in Doe v Cahill.123 The court summed up the argument thus:

The internet provides a means of communication where a person wronged by statements of an anonymous poster can respond instantly, can respond to the allegedly defamatory statements on the same site or blog, and thus, can, almost contemporaneously, respond to the same audience that initially read the allegedly defamatory statements. The plaintiff can thereby easily correct any misstatements or falsehoods, respond to character attacks, and generally set the record straight. This unique feature of internet communications allows a potential plaintiff ready access to mitigate the harm, if any, he has suffered to his reputation as a result of an anonymous defendant's allegedly defamatory statements made on an internet blog or in a chat room.

As discussed above, the actual success of such self-help measures is limited. Aside from the question of efficacy, the promotion of self-help is probably indefensible on the ground that it gives rogue actors greater control, and potentially the most unreasonable individuals the greatest control, over regulation on the Internet.124 There are obvious quality control issues over the creation and operation of self-help practices and rules, which may not be transparent at inception and may not protect all users equally. Self-help mechanisms in general lack the crucial component of settled rules of recognition; they are therefore much less stable and efficient at correcting wrongs. Furthermore, there is the propensity for free-riders to leech off others’ efforts to develop and

121 Mathis v. Cannon, 573 S.E.2d 376 (Ga. 2002).

122 Ibid, at 385-86 (internal quotation marks and citations omitted)

123 Doe v. Cahill, 884 A.2d 451, 464 (Del. 2005)

124 Horton, above n 27, at 1303

enforce self-help. This in turn creates further efficacy problems, as not everyone in the community shares the burden of keeping conduct within the bounds of the rules.125

Baglow is subject to an appeal in Canada. It is doubtful—even in the event the decision is upheld—that the same conclusion would pertain in New Zealand. There seems little scope under New Zealand law to recognise the reasoning of Judge Annis that a rejoinder would remove the harm done by the initial comment. This is particularly so when the rejoinder may not be read by the same people or necessarily believed by those who do read it. In any event, the overall context seems irrelevant to whether a defamatory meaning exists, given that that meaning is determined not by the expectations of the audience but by the ordinary reasonable person.126 Additionally, the analogy of blog postings and comments with a live debate is at odds with the recent Slater cases, which distinguished that kind of coffee-table talk from the online medium. Blogging can be distinguished on the basis that it is a published medium that maintains a permanence lacking in ordinary conversation. Judge Harvey noted that the internet ‘allows everyone to be a publisher’ and thus everyone must bear the consequences of what this entails.127

125 Weber, above n 87, at 21

126 Capital & Counties Bank Ltd v Henty (1882) 7 App Cas 741, 745. The principles of the reasonable person test were further explicated by the Court of Appeal in New Zealand Magazines Ltd v Hadlee [2005] NZAR 621 (CA)

127 Police v Slater, above n 49, at [38], [135]. See Craig Sisterson ‘Blogs akin to mainstream media, not casual conversation’ 11 NZLawyer extra (24 September 2010)

PART IV—PERSISTENCE OF THE WILD WEST

If, as demonstrated above, the normative and descriptive arguments favouring the non-regulability of the Internet do not stand scrutiny, then why does the sense of cyberspace as a separate space persist? More specifically, why does the notion of the Internet as a Wild West persist if it is plainly demonstrable that this is both normatively and descriptively wrong? Further, what can we learn from this persistence that may lead us to produce better regulation? In this part, I will seek to address these questions with reference to the seductiveness of metaphor in a political sense, and then discuss Dan Hunter’s argument that the spatial metaphor is prevalent because it aids the cognitive process. Noting Hunter’s arguments about other areas of law, I will then briefly examine whether the persistence of the metaphor has implications for online democracy.

Seduction of the Wild West

The concept of the Internet as a Wild West has been a powerful and enduring one since its inception. Early internauts referred to the idea of an “electronic frontier” and viewed themselves much like frontiersmen.128 They were said to abide by a code of the West which amounted to conforming to as much or as little order as deemed necessary. These early attitudes were further displayed in notions of the Internet as a prairie range insistent on being claimed by eager volunteers.129 Jane Tompkins accurately reflects the pull of the metaphor when she says that the West:

functions as a symbol of freedom, and of the opportunity for conquest. It seems to offer escape from the conditions of life in modern industrial society: from a mechanized existence, economic dead ends, social entanglements, unhappy personal relations, political injustice. The desire to change places also signals a powerful need for self-transformation. The desert light and the desert space, the creak of saddle leather and the

128 “Internaut” as defined by the Oxford Online Dictionary is ‘a user of the Internet, especially a habitual or skilled one’. <http://oxforddictionaries.com/definition/internaut> accessed 1 October 2011

129 Jonathan Rusch ‘Cyberspace and the “Devil’s Hatband”’ (2000) 24 Seattle Univ.L.Rev. 577 at 578

sun beating down, the horses' energy and force—these things promise a translation of the self into something purer and more authentic, more intense, more real.130

The seduction of the idea is thus self-evident. On the other hand, policy makers inevitably find the Wild West motif disquieting at best. Such a view is contained in the comment from Simon Power noted at the beginning of this essay. These opposing views of the metaphor are deeply ingrained. Indeed, though libertarians celebrate the metaphor for its innocent offer of hope, they also fear it may motivate anxious politicians and law-makers to reverse loose internet governance at the expense of their liberal vision.131

We can clearly see that a nightmarish dystopia of governmental interference has not come to fruition. As Morrison puts it, the Internet is neither a ‘digital gulag of governmental interference and surveillance, nor...[a] bodiless utopia free from the constraining influence of history, politics and other messy materialities of human culture’.132 There is a consciousness among policymakers that gradual evolution and care are needed as regards regulation of cyberspace. A recent example is the advent of the so-called copyright ‘guilt upon accusation’ law. The initial law, passed under the Labour government in 2008, created an offence where a rights holder complained a user had downloaded pirated content. After public outcry, the law was suspended the following year and a substantive review undertaken which involved discussions with various stakeholders. The result was a new Bill passed in 2011 which retracted the guilt upon accusation measure in favour of a notice and appeal regime.133

130 Jane Tompkins West of Everything: The Inner Life of Westerns (Oxford University Press, New York, 1992) at 4

131 Mike Godwin Cyber Rights: Defending Free Speech in the Digital Age (Times Books, New York, 1998) at 298

132 Morrison, above n 89, at 65

133 Martin Kay and Andrea Vance ‘Controversial internet file-sharing law passed’ Stuff.co.nz (New Zealand, 14 April 2011) <http://www.stuff.co.nz/technology/digital-living/4885041/Controversial-internet-file-sharing-law-passed> accessed 1 October 2011; Claire Tompkins and Ian Finch ‘Continuing controversy over proposed changes to copyright law’ 139 NZLawyer (25 June 2010); NZPA ‘Controversial internet law on hold – Key’ NZ Herald (New Zealand, 23 February 2009) <http://www.nzherald.co.nz/technology/news/article.cfm?c_id=5&objectid=10558256> accessed 1 October 2011; Adam Bennett ‘Controversial file-sharing law to pass today’ NZ Herald (New Zealand, 14 April 2011) <http://www.nzherald.co.nz/nz/news/article.cfm?c_id=1&objectid=10719055> accessed 1 October 2011; Copyright (Infringing File Sharing) Amendment Bill 2010 (119-2)

Cyberspace as place

The seductive freedom which the Wild West metaphor entails is only part of the reason it is easy to treat the Internet as outside normal legal bounds. In this section, I will examine Dan Hunter’s work on metaphor and the idea of cyberspace as place. Hunter makes several valuable contributions to the debate. The first is his willingness to go against the grain of received wisdom and claim that people do treat online space as a place. This is in direct contrast to scholars like Wu who reject this type of equivalency in the interests of maintaining a realist stance. Hunter, however, argues that to equate cyberspace with place is a natural and unavoidable way of ordering our thinking. He points to the fact that ‘the language we use to discuss cyberspace is shot through with physical references and implications...Even those who argue against the cyberspace as place metaphor find it impossible to talk about Internet regulation without invoking spatial references’.134 Even a cursory look at this language bears out Hunter’s analysis.

But why does this linguistic convention matter? Why does it matter that everyone, lawyers, judges and lay people alike, uses spatial metaphors to talk about the Internet? The answer lies in Hunter’s contention that these uses of language are not merely rhetorical flourishes but sophisticated ways of structuring abstract thought. Hunter notes that metaphor has long been the orphan of political and legal philosophy, particularly given the strength of logical positivism in the 20th century.135 Consequently, Hunter draws together theories of metaphor to explain the importance of metaphor to legal thinking. He makes three crucial points:

  1. Drawing on I.A. Richards’ Philosophy of Rhetoric, Hunter notes that metaphors do not just have linguistic or rhetorical effects but cognitive ones and therefore are fundamental and ubiquitous.136
  2. He co-opts the interaction theory of metaphor to explain the way we think. Simply put, interaction theory posits a ‘target’ and a ‘source’: certain source words or ideas are used to project meaning onto a target object. This creates a unique meaning which cannot be expressed in any other way. Hunter cites Max Black’s

134 Dan Hunter ‘Cyberspace as Place and the Tragedy of the Digital Anticommons’ (2003) 91 Cal.L.Rev. 439 at 458

135 Ibid, at 463

136 Ibid, at 464-465

implication-complex as describing how metaphor uses source material to conjure a unique understanding of a target thing. This understanding is determined by the underlying system of meaning which the source word conjures. Thus, in Hunter’s example, the phrase “lawyers are pigs” derives meaning only from a complex understanding of our own view of what the source (pig) means.137
  3. Finally, Hunter describes how we use metaphors to structure our thinking by using common perceptions about elements in our lives. He cites Lakoff in support of the contention that we share a multitude of conceptual metaphors even if our linguistic expression of these ideas may differ. A critical conceptual metaphor we share is viewing abstract things in physical terms. A simple but effective example is the ubiquitous notion of life as a journey and the interaction between time and space this entails. A further step, called ‘structural mapping’, then occurs whereby our own experiences generate unique thoughts about how the target works. Hunter applies this aspect to the cyberspace as place metaphor: we cognitively map aspects of our own experience onto the new medium of the Internet. Directly, this leads to our conceptions that the internet is an actual space that can be zoned, trespassed upon and divided.138

The upshot of Hunter’s argument is that the Wild West metaphor is not merely a rhetorical flourish but a trope that would not exist without the cognitive foundation that cyberspace exists as place. In that sense the Wild West is a linguistic expression of a deeper-lying and fundamental way in which we structure our thinking about any entity, whether old or new.

Hunter’s ultimate point is twofold. Firstly, he shows that the way we view cyberspace as place has implications in all areas of law. To this end, Hunter cites the constant use of the spatial metaphor by courts in discussing cyber-crime, tort and constitutional issues. Thus, as regards spam legislation, the expansion of the tort of trespass to chattels and the application of free speech jurisprudence, it is only coherent to link the Internet with these elements of law via an understanding of the physical world. Most strikingly, each involves re-creating the public/private splits of which liberalism is fond. Thus, spam legislation is only susceptible to reasonable analysis

137 Ibid, at 465-469

138 Ibid, at 471

because it involves the protection of the home from unwanted intrusions.139 So too the expansion of the tort of trespass relies on a somewhat strained analogy built on the invasion of private virtual space. Finally, the courts’ protection of the right to freedom of expression online is suffused with spatial characteristics, including defining some spaces as public and some as private according to whether or not strong freedom of speech principles apply in a particular forum.

The key thing to note is not that metaphor is used; this is standard practice in reasoning and is, as Hunter shows, unavoidable. However, use of the metaphor without proper analysis can create negative externalities and lead to illogical outcomes. This has occurred with the extension of the trespass to chattels tort in the United States.140 Another important area where the spatial metaphor has taken hold is the right to freedom of expression. The spatial metaphor is critical to the US conception of free speech, given that the level of protection is determined by the nature of the forum in which the speech occurs. The courts have been prepared to offer the same protections to online spaces as to physical ones.141 The use of spatial analogy has allowed courts in the US to apply public forum doctrine to aspects of the online sphere under the First Amendment. The notion that freedom of expression applies online is a thorny one, as it involves careful construction of the difference between public and private uses of Internet networks.

Hunter’s second point is more normative in character. His thrust is that the cyberspace as place metaphor has negative implications for the public domain. Hunter, like many other scholars, is concerned by the increasing splintering of commons resources into private ownership.142 Hunter and Lemley criticise the use of the metaphor as regards property rights because its effects are detrimental to an efficient property regime. While not a problem in itself, private ownership does have the potential to create what Michael Heller terms the “Tragedy of the Anti-Commons”.143 This is a reversal of Hardin’s famous argument that a “Tragedy of the Commons” develops

139 Andrea Slane ‘Home is Where the Internet Connection Is: Law, Spam and the Protection of Personal Space’ (2005) 2 U.Ottawa.L.&Tech.J. 255

140 Hunter, above n 134, at 484-488

141 Ibid, at 489

142 Boyle coined the term “second enclosure movement” to describe the application of the anti-commons scenario to the digital sphere. See: James Boyle ‘The Second Enclosure Movement and the Construction of the Public Domain’ (2003) 66 Law and Contemporary Problems 33; Carol Rose ‘Romans, Roads and Romantic Creators: Traditions of Public Property in the Information Age’ (2003) 66 Law and Contemporary Problems 89

143 Michael A. Heller ‘The Tragedy of the Anticommons: Property in the Transition from Marx to Markets’ (1998) 111 Harv.L.Rev. 621

when public resources are overused because there are no private property interests limiting their abuse.144 The anti-commons scenario occurs when too many parties have exclusive rights of access to a resource. This creates high transaction costs and potential gridlock, meaning inefficient underuse of the resource. Hunter’s premise is that the cyberspace as place metaphor makes this type of anti-commons possible.145

Spatial metaphors, democracy and law

If the metaphor has potentially negative consequences in the realm of property, it is apt to enquire what implications it has for democracy. This point about the spatial implications of the Internet was made in Diana Saco’s work Cybering Democracy. Saco’s spatial idea is that the Internet must be viewed as social space even though it is non-physical. Drawing on Lefebvre, Soja and Foucault, she demonstrates that spatial characteristics and our understanding of the creation of such spaces are socially constructed.146 Under this model, meaning is derived from use: although cyberspace lacks physical and bodily integrity, we still view it as a place once it is constructed. This is a more useful way of considering the Internet’s spatial characteristics in that it is not reliant on metaphor. Saco then turns to her main concern, which is the disembodiment the Internet engenders and its impact on democratic theory. She notes that the lack of bodiliness has impacts in two specific areas: scaling democracy and achieving meaningful discussion.

Saco’s contention is that the Internet may solve some spatial issues with which democratic theory has wrestled. She cites the fact that democratic theories overwhelmingly rest on notions of physical space, and that some are deliberately designed to “scale” democracy in order to make it more practical. Thus, representative theories are designed to deal with the complication of large, dispersed polities. Conversely, participatory democracy relies on small polities as a practical reality in order to bring its normative project to fruition. Saco delves deeply into competing strands of democratic theory, including the public sphere conceptions of Arendt and Habermas. Her overall conclusion is that these theories too often privilege physical embodiment when

144 Garrett Hardin ‘The Tragedy of the Commons’ (1968) 162 Science 1243-1248

145 Hunter, above n 134, at 511-513

146 Henri Lefebvre The Production of Space (trans Donald Nicholson-Smith, Blackwell, Oxford, 1991); Edward W Soja Postmodern Geographies: The Reassertion of Space in Critical Social Theory (Verso, London, 1989); Michel Foucault The Order of Things: An Archaeology of the Human Sciences (Vintage Books, New York, 1994)

practically they could work just as well without the requirement of corporeality. In that sense, therefore, the Internet could be a means to achieve a realistic deliberative public sphere shorn of the need for face-to-face communication.

Conversely, Saco notes that bodiliness has previously been required to identify the political citizenry and enhance general governability. The anonymity of cyberspace and the ability to take on multiple identities confound this notion. However, Saco does not engage fully with this point and has almost nothing to say about the way in which analogue political actions (e.g. protest) have moved into the online realm. Evidence suggests this trend is increasing with the advent of online activism, including petitioning, smart mobbing and ‘hacktivism’.147 Simply put, she ignores the fact that the physical and virtual aspects of democratic practice can co-exist and that neither rules out the other. She does, however, note that disembodied communication raises issues about accountability, and restates the point that democratic theory often requires confronting one another in the flesh in order to induce accountability.148

Saco’s main thrust is that the Internet creates a tension between the physical and the virtual which has replaced the tension between freedom and control or public and private. It is these latter tensions which hitherto it has been the role of democracy to unravel. She writes that:

...[the Internet’s] radicalness depends in part on [the distinction between physical and virtual] in that by offering up digital versions of conventionally physical phenomena, it skews practically every idea, every labour, every law and every human interaction that has been conventionally understood or premised on the physicality of the things thought about, the commodities produced, the objects legislated, and the bodies engaged.149

147 Martha McCaughey and Michael Ayers (eds.) Cyberactivism: Online Activism in Theory and Practice (Routledge, New York, 2003)

Recently, the issue has been advanced by the increase in use of mobile technology and social networking. The riots in the UK in August 2011 were particularly noteworthy for the role of social media and mobile technology:

Josh Halliday ‘David Cameron considers banning suspected rioters from social media’ The Guardian (United Kingdom, 11 August 2011) <http://www.guardian.co.uk/media/2011/aug/11/david-cameron-rioters-social-media> accessed 4 October 2011; Josh Halliday ‘London riots: how BlackBerry Messenger played a key role’ The Guardian (United Kingdom, 8 August 2011) <http://www.guardian.co.uk/media/2011/aug/08/london-riots-facebook-twitter-blackberry> accessed 4 October 2011.

Hacktivist group ‘Anonymous’ has also risen to prominence with various attacks on high profile commercial and governmental websites, see for example: ‘Anonymous hacktivists say Wikileaks war to continue’ BBC News (United Kingdom, 9 December 2010) <http://www.bbc.co.uk/news/technology-11935539> accessed 4 October 2011

148 Diana Saco Cybering Democracy: Public Space and the Internet (University of Minnesota Press, Minneapolis, 2002) at 71

149 Ibid, at 24

As we have seen above, however, the radical thesis is not substantiated by the realist perspective that has dominated legislative and judicial discourse about the law and the Internet. The debate has not focussed on a new tension between the virtual and the physical and the radical implications this engenders. Rather, the tension has been managed in two ways. Either legal bodies focus ‘on the spirit rather than the letter of the law to argue that the Internet environment is not so unique as to warrant a whole new set of rules’.150 Or the focus is on ‘the degree to which government will maintain control over communications for public ends, which is indeed a central feature of all current efforts to curtail online crime’.151

As regards the former point, we can see this concern with the spirit rather than the letter of the law in the Slater case. There, ‘report or account’ was given a wide, purposive construction in order to fulfil the ‘spirit’ of the law around name suppression. Likewise, in discussing the nature of blogs, Judge Harvey drew a comparison between a blog and posting mail into private letterboxes or pasting information on a billboard. This analogy showed that the blog is a publishing medium in that it is designed to make information available to an audience.152 Again, the analogy made use of the spirit of the law rather than concerning itself with the radicalness of the medium per se.

As regards the latter point, the concern to maintain governmental control over communication in the public interest was displayed in Slater in the discussion of the clash of rights. White J noted that there was a conflict between freedom of expression under s 14 of the New Zealand Bill of Rights Act and the right to a fair hearing and open justice under s 25(a). The right to a fair hearing places limitations on freedom of expression which are demonstrably justified.153 Judge Harvey also noted that the incidents of breaching non-publication orders amounted to a deliberate campaign of “electronic civil disobedience”; in other words, they involved acting ‘beyond legitimate protest and criticism’.154 Both at trial and on appeal, the judges noted that Mr Slater had freedom to comment on changing the particular non-publication rules but was not immune from

150 Andrea Slane ‘Democracy, Social Space and the Internet’ (2007) 57 U.Toronto.L.J. 81 at 103

151 Ibid, at 95

152 Police v Slater [2011] DCR 6 at [15]

153 Slater v Police, above n 3, at [45]; Police v Slater, above n 49, at [123]-[124]

154 Police v Slater, above n 49, at [181]

prosecution for breaking them.155 The careful use of spatial analogy will be an important factor for the Court of Appeal when it considers the Slater appeal.

Nevertheless, Saco’s insights cannot be totally dismissed. Though the radicalness of the tension between physical and virtual is overstated, her thoughts do have value in bringing to mind different ways of viewing democratic practice online. Crucially, Hunter never says that the cyberspace as place idea should be cast away because it has implications for the ordering of property. Indeed, he recognises that, given its cognitive implications, it is pointless to do so.156 Rather, in his discussion of the areas where it impacts law, he wishes simply to draw attention to the hidden nature of the reasoning so that it can be critiqued openly. It is a vital part of the theory of the anti-commons that we cannot know what might have existed had the commons not been enclosed, because enclosure masks better uses as well as precluding them.157

So too Saco’s conception of the Internet as a stage for battles between the physical and the virtual does not need to be expunged simply because it is also the venue for the age-old battles of freedom versus control and private versus public. Saco contends that disembodiment online could lead to new forms of democracy, but she is not certain what these forms might be. This is a flaw in her argument, but an easily forgivable one if we take the point at face value. What these new forms are and how they relate to traditional notions of democracy is a topic for another essay. Nonetheless, there is value in pointing out the possibility in itself. What realist theorists like Wu and Netanel may have missed is not that the rule of law extends into cyberspace—it clearly does—but that the conflation of the virtual and the physical which realism demands may rule out the development of some valuable new democratic tools and potentially stifle meaningful deliberation.

155 This seems an uncontroversial and typical understanding of how civil disobedience works. Indeed, the very essence of civil disobedience is arguably the act of being caught and punished: Henry David Thoreau On the Duty of Civil Disobedience (Arc Manor, Rockville, MD; 2008) at 21

156 Hunter, above n 134, at 515

157 Ibid, at 512-513

The solution, therefore, may be to conceive of space as Slane and Saco advocate: as social space which derives its meaning from its use. On this view, the Wild West metaphor is outmoded and incomplete. It will not do to conceive of the Internet as wholly ungoverned, because the temptation then is to enter into a democratic form of the enclosure movement by interfering in areas perceived to be ungoverned. It may be that so-called ‘wild spheres’ are critically necessary for democracy in the 21st century, just as the salons of the 19th century were necessary for political discussion. This is not to say that these corners of the web, or those historical salons, are exempt from the law. Rather, there must continue to be leeway and thought given to preserving different avenues of discussion. In one sense, this is the import of the Baglow decision discussed above. Sometimes the expectations of behaviour differ between spheres and according to context. The Supreme Court has recently said as much in relation to the offence of behaving in an offensive manner in a public space.158 Such anti-social actions may therefore need protection in the interests of a vibrant democracy.

Changes to regulating the online environment need also to recognise that colonising the whole of the Internet as public or political space is just as dangerous to democratic values as allowing it to become wholly private space. This, of course, is where the nuance of the case law comes into its own. As demonstrated in Nationwide News, what is public and what is private can often be determined on a factual basis. Though the court did not frame it this way, the ultimate concern was to temper the prospect that harm to reputation could be sued for despite the fact that very few people noticed the existence of the defamatory comment. This led to the analysis of how many viewers the material had. The court was thus adjusting its view of defamation to the reality of the online world, in which every post is technically a public publication and yet not all can realistically be classed as such. Essentially, what the court was saying is that some statements can be classed as public in the sense that they have public effect, while others are more or less private due to the lack of reception they garner.

The key is that the physical/virtual split is neither as great as it appears in the libertarian fantasy nor capable of being totally submerged as the realists would argue. Instead, it remains a valuable, standalone sub-category of analysis. When talking about legal solutions to problems online, it is crucial to remember that we are not dealing with an entity which has gone ‘off the reservation’, so to

158 Morse v Police [2011] NZSC 45

speak. Rather, what Hunter’s analysis shows quite clearly is that even if the libertarian/liberal basis of the argument about cyberspace as place is wrong, this has not diminished the power of the spatial metaphor. As Slane demonstrates in relation to Saco’s arguments about the physical and the virtual replacing public/private as the key tension, now is the time to be reminded that the latter will keep replicating itself despite any sense of newness that attaches to the online sphere.

It is only after coming to the realisation that the law seeks to replicate public-private distinctions within the realm of cyberspace that we can begin to make arguments shifting these viewpoints. As Hunter and Lemley show in relation to property law and trespass, the metaphors of the physical world are strong when the legal system confronts elements seemingly outside its previous knowledge. The persistence of metaphors like the Wild West is comforting in that respect. Crucially, though, we always need to be aware of these underlying structures of our thinking and to challenge them where appropriate.

The realist critiques are correct that there is nothing conceptually different between the physical and the virtual apart from scale and speed. The virtual and the physical are not a dichotomy; rather, each feeds into, complicates and complements the other. Online actions have consequences in the physical world which cannot be ignored. Crucially, though, the reverse is also true: physical-world actions and conceptions have consequences in the online world. It is only by recognising both halves of this equation that we will obtain quality regulation of both the physical and virtual worlds.

CONCLUSION

Overall, it is clear that framing the debate about Internet regulation in terms of the Wild West metaphor is both unduly pessimistic and highly contentious. It is pessimistic in the sense that it betrays a sense of hopelessness among policy makers about containing undesirable content online. It is unduly so, however, because, as we have seen, a substantive body of both legislative and judicial consideration shows that traditional legal methods are still adept at handling the problems of jurisdiction, enforcement and definition which the Internet seems to create. Put simply, the Internet is regulable in fact as well as in theory.

The use of metaphor to drive policy is also highly contentious because, as we have seen, realists, regulation sceptics and policy makers each adopt their own view of its meaning to suit their particular agendas. For realists, it is necessary to crush the metaphor lest it betray any sense that the Internet’s newness is a problem for law. For regulation sceptics, the metaphor can be embraced because it reflects the very peak of their vision of autonomy. Finally, for policy makers, the metaphor is a reflection of doubt and dismay about how to regulate new phenomena. As we saw in the final part, however, there may at least be a small space opening up where each of the three groups can come together to realise that it may be better for democracy overall to see the physical and the virtual as reciprocal and complementary rather than as mortal enemies.


URL: http://www.nzlii.org/nz/journals/UOtaLawTD/2011/18.html