Ruske, Aleisha --- "Keeping an AI on the future: An investigation into whether New Zealand's antitrust law is capable of addressing algorithmic collusion" [2020] UOtaLawTD 38


Keeping an AI on the future:

An investigation into whether New Zealand’s antitrust law is capable of addressing algorithmic collusion

Aleisha Ruske

A dissertation submitted in partial fulfilment of the degree of Bachelor of Laws (Honours) at the University of Otago – Te Whare Wānanga o Otāgo.


October 2020

Acknowledgements

To Rex for your knowledge, patience and half-time quizzes. It has been a pleasure to learn from you over the past four years.

To my family for being my best friends, greatest supporters and most constructive critics. I love you all to the moon and back.

To the new family I have made at Otago. The last five years have been my most treasurable. I look forward to a lifetime more of memories with you.

To my 2020 flatmates and quarantine inmates for the revelry and laughter shared this year.

Your morning salutations will certainly not be something I am ever nostalgic for.

To my heroines: Alana, Anna, Emma, Flick, Floss and Nicole. My world would be upside down without you wonderful women in it.

Finally, to Dunedin and the extraordinary 24 years you have gifted me. You will be missed.

Thank you!

Table of Contents

Introduction
Chapter I “Can machines think?”: An Overview of Artificial Intelligence
  A From Conception to Renaissance
  B What is AI and how does it work?
  C Algorithmic Collusion
  D Forms of Algorithmic Collusion
  E The Oligopoly Problem
Chapter II Antitrust Law in New Zealand
  A New Zealand’s Per Se Ban on Cartel Conduct
  B Section 30
  C Enter into, arrive at, give effect to
  D Contract, arrangement or understanding
  E Provision
  F Purpose, effect, or likely effect
  G Fixing, controlling, or maintaining
  H Price
  I Section 90
  J Exceptions, Authorisation & Remedies
  K Investigation and Detection of Cartel Conduct
Chapter III An International Perspective on Algorithmic Tacit Collusion
  A Concerns Over the Commerce Act 1986
  B Algorithms Before International Courts
    1 Posters and Frames Case
    2 Eturas Case
    3 Electronic Goods Manufacture Case
    4 Posters Case
  C Are Regulatory Responses Looking?
Chapter IV Assessment of Regulatory Solutions
  A Should New Zealand Regulate?

“Conversation is a meeting of minds with different memories and habits. When minds meet, they don’t just exchange facts: they transform them, reshape them, draw different implications from them, engage in new trains of thought. Conversation doesn’t just reshuffle the cards: it creates new cards.”

Theodore Zeldin

Introduction

There is no question that Artificial Intelligence (AI) is rapidly changing the world we live in, from self-driving cars and disease detection1 to the introduction of robot judges.2 In the decades to come almost every facet of human existence in the modern world will be shaped and changed by the increasing use of AI technology. Prominent AI researcher Andrew Ng has noted that “AI is the new electricity. Just as one hundred years ago electricity transformed industry after industry, AI will now do the same”.3 While monumental advancements in technology bring new and exciting innovations with the ability to increase efficiency, accuracy and safety across a number of sectors, they also bring with them ethical and regulatory concerns, especially as they allow computers to learn and make decisions for themselves.

In an antitrust context, AI is increasingly being employed around the world to set prices and respond to changing market conditions.4 It is therefore not surprising that there has been increased research and debate over whether or not this may lead to collusion by machines and, if so, whether we are adequately prepared in a regulatory context to deal with it. And, if we are not, how should we deal with it, if at all?

This dissertation will, in summary, aim to analyse how New Zealand’s Commerce Act 1986 is equipped to cope with the changing digital landscape and the new challenges it brings in regulating and prosecuting cartel conduct. Analysis will be based on theoretical and scholarly assumptions on the potential uses of AI and the possibility of the attribution of legal personhood to machines.

Cartel behaviour deprives consumers and other competitors of a market in which fair dealing exists, thus leading to an overall less competitive market.5 Decreased competition jeopardises

1 Sarah Griffith “This AI software can tell if you’re at risk from cancer before symptoms appear” (26 August 2016) Wired https://www.wired.co.uk/article/cancer-risk-ai-mammograms.

2 Eric Niiler “Can AI Be a Fair Judge in Court? Estonia Thinks So” (25 March 2019) Wired https://www.wired.com/story/can-ai-be-fair-judge-court-estonia-thinks-so/.

3 Roger Parloff “Why Deep Learning is Suddenly Changing Your Life” Fortune (28 September 2016).

4 The OECD has observed that “[a] majority of retailers track online prices of competitors. Two thirds of them use automatic software programmes that adjust their own prices based on the observed prices of competitors”. “Algorithms and Collusion: Competition Policy in the Digital Age” (2017) OECD https://www.oecd.org/daf/competition/Algorithms-and-colllusion-competition-policy-in-the-digital-age.pdf at 40.

5 “What is a cartel?” (May 2019) Commerce Commission New Zealand https://comcom.govt.nz/business/avoiding-anti-competitive-behaviour/what-is-a-cartel.

the efficiencies and benefits of a competitive market, namely ensuring prices remain fair and the quality of the goods and services offered in a market remains high. As competition in a market increases, consumers also receive the benefit of choice, thus incentivising competing firms to innovate, invest and operate efficiently.6

While principally concerned with price-fixing, many aspects of this dissertation may also relate closely to other cartel behaviour currently regulated against in New Zealand (i.e. output restriction and market allocation). In simple terms, price-fixing is the agreement of two or more firms on the prices charged for products in a market so as to avoid competing with one another. This is not necessarily limited to the task of setting specific prices for an overall good or service; it also includes the setting of only part of the price or of a mutual method by which prices are ascertained.7

In Chapter I, I will provide a comprehensive overview of algorithms and how they may be used to achieve collusive behaviour. Due to the novel nature and technical complexities of this topic I will break down exactly how different algorithms and AI operate, and whether it is in fact possible, and at all likely, that machines could collude. I will then extend these findings to outline the specific scenarios in which algorithms can generate anti-competitive results and supra-competitive profits in a market. Particular attention will be paid to instances where, without human instruction or intervention, algorithms can autonomously learn to set prices (i.e. tacit collusion).

In Chapter II, I will examine New Zealand’s competition law as it pertains to cartel conduct within the Commerce Act. Particular attention will be paid to s 30, the elements of which will be discussed in depth.

In Chapter III, I will build on the information attained in Chapters I and II and begin to analyse how advancements in AI may expose loopholes in New Zealand’s cartel prohibitions and whether the Commerce Act is sufficiently drafted to ensure New Zealand is able to address

6 “Avoiding anti-competitive behaviour” (2018) Commerce Commission New Zealand https://comcom.govt.nz/business/avoiding-anti-competitive- behaviour#:~:text=Competitive%20markets%20help%20to%20keep,goods%20and%20services%20remains%2 0high.&text=The%20Commerce%20Act%20prohibits%20anti,allocate%20markets%20or%20restrict%20outpu t

7 Above, n 5.

possible cases of algorithmic collusion. The conclusion reached will show that at present there is both an attribution and a ‘meeting of the minds’ issue when it comes to algorithmic tacit collusion. I will then go on to provide an overview of how other jurisdictions, such as the European Union and United States, have begun to deal with, or at least proposed to deal with, the impact AI could have on their existing competition regulations.

Finally, in Chapter IV I will begin to consider ways in which New Zealand can proactively address the legislative insufficiencies outlined in Chapter III. Initially I will consider whether, in theory, there is any need to further regulate against collusive behaviour. I will then identify and appraise potential ways in which this may work within New Zealand’s existing framework, focusing on both potential legislative and market responses.

As it currently stands, the global approach to addressing cartel conduct requires the presence of variations of consensus ad idem, i.e. a ‘meeting of the minds’. In the New Zealand context, this is expressed as a “contract or arrangement or,... understanding”.8 While there is much judicial and scholarly discussion as to the exact meaning of these words in an antitrust context, as we will observe throughout this paper, it appears that the phrasing of New Zealand’s cartel provisions will have serious limitations when, and not if, confronted with attempts at price-fixing by machines.

Due to this, suppose we hypothesise a scenario in which our law is not adequate to deal with modern technological advancements, thus facilitating markets that are not competitive. If we wait complacently for this scenario to come before us, have we defeated the very purpose of the Act, which seeks to regulate against anti-competitive behaviour? Can we truly say that New Zealand “promote[s] competition in markets for the long-term benefit of consumers”?9 Or does the current legality of conscious parallelism void this topic of any intensive further consideration? This paper will investigate.

8 Commerce Act 1986, s 30(a).

9 Commerce Act 1986, s 1A.

“You cannot endow even the best machine with initiative; the jolliest steamroller will not plant flowers”

Walter Lippmann, 1913

Chapter I “Can machines think?”10: An Overview of Artificial Intelligence

A From Conception to Renaissance

It began with steam and transformed into electricity. Then it went digital. Now the fourth wave of the industrial revolution is upon us: Artificial Intelligence.11 Coined ‘Industry 4.0’ by the World Economic Forum, the premise of the fourth wave is one of “cyber-physical systems”12 in which joint human and machine capabilities will embed themselves within society.

AI was first conceptualised by British polymath Alan Turing in 1950. Turing hypothesised that machines could combine available information with reason to solve problems and formulate decisions in much the same way as humans can.13 His discussion of how such machines could be built and tested was put to the test five years later by Allen Newell, Cliff Shaw and Herbert Simon in their computer program called the Logic Theorist. It was this work, presented at the Dartmouth Summer Research Project on Artificial Intelligence hosted by John McCarthy and Marvin Minsky in 1956, that gave birth to the research field now known as AI.14 Simon famously predicted that “machines will be capable, within twenty years, of doing any work that a man can do”.15

Over the next half-century, the field developed and slowly became increasingly powerful. The field endured two “AI Winters” from the early seventies to the early nineties. The computational power required to enable the use of “deep learning” and neural network functions (to be discussed in more depth below) was insufficient, and thus funding for further research was cut and progress stagnated.

10 This question was posed by British polymath Alan Turing when exploring the mathematical possibility of what would come to be known as artificial intelligence. Alan Turing “Computing Machinery and Intelligence” (1950) LIX Mind 433 at 433.

11 David Kelnar “The fourth industrial revolution: a primer on Artificial Intelligence (AI)” Medium https://medium.com/mmc-writes/the-fourth-industrial-revolution-a-primer-on-artificial-intelligence-ai- ff5e7fffcae1

12 Nicholas Davis “What is the fourth industrial revolution” (19 January 2016) World Economic Forum https://www.weforum.org/agenda/2016/01/what-is-the-fourth-industrial-revolution/

13 Turing, above n 10.

14 Rockwell Anyoha “The History of Artificial Intelligence” (28 August 2017) Harvard University http://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/

15 Herbert Simon The Shape of Automation for Men and Management (Harper & Row, New York, 1965) at 96.

Near the turn of the 21st century, Moore’s Law16 proved its staying power and the ‘renaissance’ period of AI began. The computational power of computers suddenly began to rapidly advance, enabling machine learning, and later deep learning, to be implemented. Then in 2011, just over 25 years late, Simon’s prediction was realised: “Watson”, an IBM supercomputer, famously won a live round of the American quiz show Jeopardy! against past champions by searching its 200 million page information database to formulate and present its answers.17 The algorithm had finally surpassed its human creators, and would continue to do so.18

B What is AI and how does it work?

Britannica defines AI as “the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings”.19 This ability to do intelligent things is made possible through the use of algorithms, a concept over a millennium in age.20 Algorithms are, in essence, a “sequence of rules that should be performed in an exact order to carry out a certain task” or “an instance of logic that generates an output from a given input”.21 In short, algorithms provide the instructions on which AI operates.

This is in many ways comparable to baking: the predicted output being a sponge cake, and the manually generated inputs being the ingredients specified in the recipe. Such technology is incredibly useful to humans, and in its simplest and traditional form, adaptive algorithms allow computers to perform a series of repetitive automated tasks, often involving complicated calculations and vast quantities of data, in timeframes that far surpass the efficiency of human beings. Tasks such as monitoring competitors’ actions, automatic communication of commercially sensitive information or automated reactions to changes in competitors’ pricing standards or market conditions can occur almost instantaneously provided they have been

16 Moore’s Law refers to an empirical observation that the number of transistors on a computer microchip will double every two years, thus increasing the speed and capability of the computers.

17 John Markoff “Computer Wins on ‘Jeopardy!’: Trivial, It’s Not” The New York Times (16 February 2011).

18 Later in 2011 an AI system, built by a computer scientist at Google, was able to learn to recognise cat faces and human bodies after looking at 10 million images downloaded from the Internet. Quoc Le and others “Building High-level Features Using Large Scale Unsupervised Learning” (paper presented to the 29th International Conference on Machine Learning, Edinburgh, June 2012). Then in 2016 another Google system, “AlphaGo”, won a game of the complex board game “Go” against the world champion Lee Sedol. Steven Borowiec “AlphaGo Seals 4-1 Victory over Go Grandmaster Lee Sedol” The Guardian (15 March 2016).

19 BJ Copeland “Artificial intelligence” (11 August 2020) Encyclopedia Britannica https://www.britannica.com/technology/artificial-intelligence

20 Marek Kowalkiewicz “How did we get here? The story of algorithms.” (10 October 2019) Towards Data Science https://towardsdatascience.com/how-did-we-get-here-the-story-of-algorithms-9ee186ba2a07

21 OECD, above n 4, at 8.


programmed correctly. Applying some more context to the baking analogy, we can see in Figure 1 below how, in the context of price-fixing, a basic adaptive algorithm operates: it essentially revolves around the performance of two tasks, estimation and optimisation.

Merchant A has programmed their algorithm to generate a certain output (i.e. to match their competitor’s (Merchant B) price or undercut their competitor’s price) without the need for human intervention. An infamous example of this occurred between two merchants on Amazon selling a copy of Peter Lawrence’s The Making of a Fly. Merchant A and Merchant B had adopted pricing algorithms which were functions of the other’s price, such that Merchant A’s algorithm priced the book at 1.270589 times Merchant B’s price, while Merchant B’s algorithm priced the book at 0.9983 times Merchant A’s price. This continuous cycle of pricing resulted in an ‘algorithm price war’, which saw a book about flies soar to the staggering price of over US$23 million.22

Figure 1: Illustration of Adaptive Algorithmic Pricing. Merchant A observes the price obtained by Merchant B and sets pa = pb * 1.270589; Merchant B observes the price obtained by Merchant A and sets pb = pa * 0.9983.

22 Olivia Solon “How A Book About Flies Came To Be Priced $24 Million On Amazon” (27 April 2011) Wired https://www.wired.com/2011/04/amazon-flies-24-million/
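The feedback loop depicted in Figure 1 can be made concrete with a minimal sketch. The multipliers below are those reported in the Amazon example; the starting price and the number of cycles are assumptions added purely for illustration.

def run_price_war(start_price, cycles):
    """Simulate two merchants whose prices are fixed functions of each other's."""
    price_a = price_b = start_price
    history = []
    for _ in range(cycles):
        price_a = price_b * 1.270589   # Merchant A prices above Merchant B
        price_b = price_a * 0.9983     # Merchant B slightly undercuts Merchant A
        history.append((price_a, price_b))
    return history

# Starting at an assumed $30, the combined multiplier (roughly 1.268 per cycle)
# compounds until the price passes US$23 million within about 60 cycles.
for cycle, (pa, pb) in enumerate(run_price_war(30.0, 60), start=1):
    if cycle % 10 == 0:
        print(f"Cycle {cycle}: Merchant A = ${pa:,.2f}, Merchant B = ${pb:,.2f}")

The point of the sketch is that neither rule is ‘intelligent’: each is a fixed function of the other’s observed price, and the runaway outcome emerges purely from their interaction, with no human monitoring the output.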

The renaissance period of AI has done away with the laborious task of manually generating inputs for desired and predicted outcomes through an advancement known as Machine Learning (ML). ML is a subset of AI which allows computers to learn without being explicitly programmed to do so. This has made it possible to program algorithms capable of making predictions and decisions with greater efficiency and accuracy.23 ML can therefore be viewed as an approach to achieving AI through the implementation of algorithms that “iteratively learn from data and experience”24 in order to determine or predict an outcome. This shifts the task of optimisation (i.e. the weighting of variables in the data) away from the programmer to the algorithm.25 It can also allow the task of specifying the input considerations to be shifted in the same way.

ML still operates through the implementation of algorithmic inputs; however, the way in which the logical rule is derived differs from the adaptive algorithm approach explored above. ML derives the logical rule through a process known as ‘feature engineering’, whereby programmers extract the relevant key ‘features’26 from a set of raw data, from which the algorithm then solves the underlying problem.27 In simple terms, a programmer will show the algorithm a series of scenarios where the output is known. The learning algorithm can then take note of key differences between its own predictions and the known correct outputs, and weight the quantitative impact of the features until it is able to optimise the outputs given a combination of raw data inputs. It is then possible for programmers to observe how the algorithm has learned to optimise the data and determine the process by which its predictions or decisions are made.
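A minimal sketch of that weight-adjustment process is set out below. The features, training data and learning rate are invented purely for illustration: the algorithm compares its prediction for each example against the known correct output and nudges the weight on each feature to reduce the error.

# Each training example pairs three feature values with a known correct output.
training_data = [
    ((1.0, 2.0, 0.5), 7.0),
    ((2.0, 0.5, 1.0), 6.5),
    ((0.5, 1.5, 2.0), 8.0),
    ((1.5, 1.0, 1.5), 7.5),
]

weights = [0.0, 0.0, 0.0]   # one weight per feature, learned from experience
learning_rate = 0.05

for _ in range(2000):                      # repeated passes over the examples
    for features, target in training_data:
        prediction = sum(w * x for w, x in zip(weights, features))
        error = prediction - target        # gap between prediction and known output
        for i, x in enumerate(features):   # shift each weight against the error
            weights[i] -= learning_rate * error * x

print("Learned feature weights:", [round(w, 2) for w in weights])

Because the weights are visible at every step, a programmer can inspect exactly how much each feature contributes to the final prediction; as discussed below, that transparency is lost once deep learning extracts its own features.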

It logically follows that the quality of ML predictions improves with experience.28 In contrast, adaptive algorithms are merely capable of improving their predictions by acquiring additional data, but they cannot adapt their processes to acquire additional knowledge. ML algorithms on the other hand are able to experiment with an array of strategies, even those that are sub-optimal per their existing knowledge, in order to learn which strategy will optimise an output in the long run based on current market conditions.29

23 OECD, above n 4, at 9.

24 At 9.

25 Kelnar, above n 11.

26 A ‘feature’ is an input upon which an algorithm can base its predictions about the raw data.

27 OECD, above n 4, at 9.

28 Kelnar, above n 11.

29 Emilio Calvano, Giacomo Calzolari, Vincenzo Denicolò and Sergio Pastorello “Algorithmic Pricing: What Implications for Competition Policy?” (2019) 55 Review of Industrial Organization 155 at 6.

Figure 2: Hypothetical Example of Machine Learning Algorithm. Raw input data is decomposed into Features A, B and C, each given a probability weighting (x%, y%, z%) based on experience, which together produce the optimised output.

In Figure 2 above we can see how a hypothetical ML algorithm may function. Notice that there have been no criteria set by the programmer in regard to estimation techniques or optimised solving strategy – the algorithm is able to compute all this on its own. The programmer has simply chosen which features the output is conditional upon.30

ML can be categorised into three broad learning styles: supervised learning, unsupervised learning and reinforcement learning. Supervised learning is the most simplistic: the algorithm uses a database of manually labelled data to derive the logical rule.31 Unsupervised learning algorithms are programmed to attempt to identify hidden structures and patterns in an unlabelled database. Essentially, the algorithm attempts to categorise data based on its likeness with the rest of the unlabelled data, allowing anomalies to be detected.32 Finally, reinforcement learning uses trial and error in a dynamic environment to find the optimal route to a specified goal.33 34

30 Further details that may be programmed, though not outlined in the simple depiction in Figure 2, include how often an algorithm should experiment with various strategies and the weight given to the most recent experience relative to combined total experience. Calvano and others, above n 29, at 6.

31 For example, pictures of dogs are shown to the algorithm and labelled ‘dogs’. The algorithm will then be able to identify subsequent inputs of dog pictures. This is useful for facial recognition functions and filtering systems such as junk email inboxes.

32 This is useful for activities such as flagging unusual buying behaviour on credit cards and creating suggestions items when online shopping based on past searches or purchases.

33 This is the basis on which technology such as self-driving cars or computer games operates. Each time the reinforcement learning algorithm achieves its goal, the strategy will be used to achieve the same more efficiently in further attempts, and thus with each attempt improving its success and efficiency.

34 Though too speculative to consider in this research, improvements in quantum computing would significantly radicalise how reinforcement machine learning algorithms work. Instead of a time-consuming trial-and-error approach in which each scenario is tested until an optimal solution is found, all possible outcomes can be tested simultaneously, as the basic units of information on which a quantum computer operates, called qubits, can exist in all states at any one time (a phenomenon known as ‘superposition’), allowing all possible outcomes to be manipulated at the same time (otherwise called ‘entanglement’). This would transform ML as it would allow for far more complicated algorithms to be run due to the increased computational power. See: Martin Giles “Explainer: What is a quantum computer?” (29 January 2019) MIT Technology Review https://www.technologyreview.com/2019/01/29/66141/what-is-quantum-computing/ and DF Seun “Quantum Computing: The Real Game Changer” (30 November 2018) Medium https://medium.com/the-andela-way/quantum-computing-the-real-game-changer-e60010d77fe4.
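The trial-and-error loop that reinforcement learning relies on can be illustrated with a minimal sketch. The toy task below is invented for illustration: an agent repeatedly chooses between three actions whose average rewards it does not know, learns an estimate of each from experience, and mostly exploits the best-looking action while still occasionally experimenting.

import random

true_average_rewards = [1.0, 2.5, 1.8]      # hidden from the agent
estimates = [0.0, 0.0, 0.0]                 # the agent's learned value of each action
counts = [0, 0, 0]
epsilon = 0.1                               # chance of experimenting at random

random.seed(0)
for step in range(5000):
    if random.random() < epsilon:
        action = random.randrange(3)                  # explore a random action
    else:
        action = estimates.index(max(estimates))      # exploit the best estimate so far
    reward = random.gauss(true_average_rewards[action], 0.5)
    counts[action] += 1
    # Incremental average: shift the estimate toward the observed reward.
    estimates[action] += (reward - estimates[action]) / counts[action]

print("Learned estimates of each action:", [round(e, 2) for e in estimates])
print("Times each action was chosen:", counts)

Each attempt refines the agent’s estimates, so later choices are better informed than earlier ones; the same logic, at far greater scale, underlies the pricing agents discussed in the following sections.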

While powerful, ML is limited in the way it processes raw input data due to the need to conduct feature engineering.35 Feature engineering takes considerable effort, as identifying and constructing the features is a manual, and thus expensive and time-consuming, process.36

Subsequently, a subset of ML, Deep Learning (DL), was developed in the early 2000s that allows feature extraction to be performed automatically.37 This breakthrough was achieved by a simple notion: model the brain, instead of the world. Computer scientists have managed to create artificial neural networks (made from software-based calculators connected together, with the output of one neuron acting as the input of another) that replicate the activity of human neurons (see Figure 3 below). The neural network is able to autonomously extract features from the raw input data and assign weightings to each feature by processing the raw input data many times through the network’s many layers.38 As the network processes the data and identifies an incorrect output, the connections between the neurons shift, adapting how the network functions to the new knowledge.

Figure 3: Overview of Neural Network in Deep Learning Algorithms


35 Yann LeCun, Yoshua Bengio and Geoffrey Hinton “Deep learning” (2015) 521 Nature 436 at 436.

36 OECD, above n 4, at 9.

37 Above, n 35, at 436.

38 Kelnar, above n 11.



Neural networks require large quantities of data to operate effectively, which means they are best suited to complex problems where vast amounts of data are obtainable. Such quantities of data are, however, becoming increasingly easy to access with the rise of Big Data39 and Data Mining40, and thus the scenarios in which DL can be utilised expand as the commodity that is data grows.41

Figure 4: Machine Learning vs Deep Learning Algorithm Flow42. Machine Learning Flow: Raw Input Data → Feature Extractor → Features → Machine Learning Algorithm → Optimised Output. Deep Learning Flow: Raw Input Data → Deep Learning Algorithm (‘The Black Box’) → Optimised Output.

DL algorithms do not operate as a linear process in the same way their ML counterparts do. Instead, they are composed of non-linear modules on multiple hierarchical levels based on complexity and abstraction.43 While a linear process makes it easy to ascertain how an algorithm reaches a conclusion, the autonomy of a non-linear DL algorithm complicates algorithmic decision-making transparency as there is no observable feature extraction step, hence the term ‘the black box’ (as depicted above in Figure 4) being used to describe the

39 Though there is no consensus of the precise definition of Big Data, one study proposes the definition as being an “information asset characterized by such a High Volume, Velocity and Variety to require specific Technology and Analytical Methods for its transformation into Value”. This typically correlates with the data being used for commercial purposes. Andrea De Mauro, Marco Greco and Michele Grimaldi “A Formal Definition of Big Data Based on its Essential Features” (2015) 65 Library Review 122 at 127.

40 Data Mining is a sub-concept of Big Data and can be defined as “the automated or convenient extraction of patterns representing knowledge implicitly stored or captured in large databases, data warehouses, the Web, other massive information repositories, or data streams”. Jiawei Han, Micheline Kamber and Jian Pei Data Mining: Concepts and Techniques (3rd ed, Morgan Kaufamnn Publishers, Waltham, 2012) at xxiii.

41 “Big Data: What is it and why it matters” SAS https://www.sas.com/en_nz/insights/big-data/what-is-big- data.html.

42 Adil Moujahid “A Practical Introduction to Deep Learning with Caffe and Python” (26 June 2016) Adil Moujahid http://adilmoujahid.com/posts/2016/06/introduction-deep-learning-python-caffe/.

43 LeCun and others, above n 35, at 436.

features or information from which DL generates outputs. It is at this level of understanding of AI that we can clearly see how such a blatant lack of transparency can generate issues in an antitrust context.

C Algorithmic Collusion

While we will be considering the more precise legal definition and its elements in the following chapters, for now it is sufficient to understand that “collusion is a joint profit maximization strategy put in place by competing firms that might harm consumers”.44 Such joint behaviour may either happen explicitly (i.e. requiring express agreement on a proposed action between two or more competitors) or tacitly (i.e. implementation of independent profit-maximising strategies without any co-ordination or express agreement between competitors). While achieving a similar outcome, the two key differences between explicit and tacit collusion are:

(1) explicit collusion requires the notion of ‘agreement’ (to put simply before in-depth analysis, a ‘meeting of the minds’) and (2) explicit collusion is per se illegal.45

With this in mind, we turn our focus to collusion that is tacit, as it sits just outside the scope of most regulatory frameworks, a sort of legal loophole known as conscious parallelism.46 While not a new concept, there are particular market conditions in which this phenomenon is more likely to arise, especially once the use of algorithmic pricing strategies is factored in.

Firstly, concentrated markets supplying a homogenised range of goods and services are particularly susceptible to tacit collusion. This is because it is easier for algorithms to monitor and respond to price changes and other key market conditions, which eases the facilitation of tacit collusion by increasing the predictability of competitors’ reactions and the accuracy with which their decisions can be anticipated.47 The OECD has noted that this factor in particular has increased the volume and accessibility of available data, thus increasing market transparency overall.48

44 OECD, above n 4, at 19.

45 Contrastingly, tacit collusion is typically regarded around the world as being legal. See Benjamin Liu “Algorithmic Tacit Collusion” (2019) 25 NZBLQ 199 at 205-206.

46 Commonly understood as a pricing strategy lacking agreement between competitors in an oligopolistic market.

47 Ariel Ezrachi and Maurice E Stucke “Sustainable and Unchallenged Algorithmic Tacit Collusion” (2020) 17 NW J Tech & Intell Prop 217 at 226.

48 Antonio Capobianco, Pedro Gonzaga and Anita Nyseó “Algorithms and Collusion – Background Note by the Secretariat” (paper presented to the OECD Secretariat to serve as a background note for Item 10 at the 127th Meeting of the Competition Committee, 21-23 June 2017).

Secondly, markets in which deterrents to price deviations operate effectively are also susceptible to tacit collusion.49 This is particularly so when algorithms are used, as they can respond within seconds, thus depriving a deviating firm of the benefits gained from implementing price deviations in the first place.50 This is somewhat related to the transparency factor discussed above, as the more transparent a market is, the faster a machine can observe and respond to price deviations.51 There is therefore less benefit obtained by the initial firm from decreasing its prices, making the chances of tacit collusion occurring in a market more likely.52

Thirdly, tacit collusion is more likely where the reactions of outside parties (i.e. customers and current or future competitors) to price coordination do not jeopardise the expected results.53 This implies that algorithmic tacit collusion will typically occur in concentrated markets in which: (1) there is no exertion of buying power; (2) there is a frequent volume of smaller transactions; and (3) there are high barriers to entry.54

Theoretically, algorithms can autonomously learn to collude through the DL process discussed above. If the programmed objective is to maximise profit, the algorithm will, through trial and error, learn which pricing strategies operate most effectively in the given market, and price accordingly. It is during this unobservable process inside the ‘black box’ that an algorithm may learn that collusive tendencies generate the best output (i.e. the highest prices). Absent evidence of programming for an algorithm to tend towards collusive behaviour, it is entirely impossible to prove a collusive strategy was used in setting prices. This is problematic when one considers that the majority of online retailers in the EU employ algorithmic based pricing software.55 It is unsurprising that in 2017 the OECD noted that the use of algorithms may alter market conditions (by increasing price transparency and high-frequency trading) to the extent that price-setting strategies may operate collusively in all market structures. Due to this, the DL AI utilised by these price-setting algorithms may cause similar outcomes to traditional cartels.56

49 Above, n 48, at 8.

50 Ezrachi above, n 47, at 227.

51 Guidelines on the assessment of horizontal mergers under the Council Regulation on the control of concentrations between undertakings (Official Journal of the European Union, C 31/5, 5 February 2004) at [51].

52 Above, n 47, note 34 at 227.

53 Above, n 51, note 41.

54 Above, n 47, at 228.

55 Report from the Commission to the Council and the European Parliament Final Report on the E-commerce Sector Inquiry (European Commission, SWD(2017) 154 final, 10 May 2017) at [13].

56 OECD above, n 48, at 49-50.

Until very recently there was wide scepticism as to whether algorithmic tacit collusion was anything more than a theory, remaining impossible in practice.57 Coupled with the fact that there have not as yet been any observed cases of such pricing strategies,58 the passive regulatory response to the potential threat remained somewhat understandable. In fact, markets in which algorithmic pricing strategies are widely used have declined over the past few decades, which some argued supported the conclusion that machines were not colluding.59

However, new research has proven that tacit collusion by machines is in fact plausible, suggesting that algorithms can learn to charge supra-competitive prices without explicit communication.60 The research further suggests that algorithms can learn to do so better than their human counterparts. Finally, while increased competition in the marketplace was found to decrease the severity of collusion, considerable collusion was still observed when there were three or four active firms in a market, when the firms were asymmetric, and when they operated in a stochastic environment.61
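For illustration, the sketch below is a heavily simplified toy version of the kind of simulation used in this research: two independent reinforcement learning (Q-learning) agents repeatedly set prices against each other, each treating its rival as part of the environment. The demand curve, cost, price grid and learning parameters are all invented assumptions rather than the cited authors’ model, and the prices the agents settle on will vary with those choices.

import random
from itertools import product

PRICES = [1, 2, 3, 4, 5]                  # candidate prices; marginal cost assumed to be 1
ALPHA, GAMMA, EPISODES = 0.1, 0.9, 200_000

def profits(p1, p2):
    """Homogeneous-good duopoly: the cheaper firm serves the whole (linear) demand."""
    demand = max(0, 10 - min(p1, p2))
    if p1 < p2:
        return (p1 - 1) * demand, 0.0
    if p2 < p1:
        return 0.0, (p2 - 1) * demand
    return (p1 - 1) * demand / 2, (p2 - 1) * demand / 2

# One Q-table per agent: state = both firms' previous prices, action = own next price.
states = list(product(PRICES, PRICES))
q1 = {(s, a): 0.0 for s in states for a in PRICES}
q2 = {(s, a): 0.0 for s in states for a in PRICES}

def choose(q, state, eps):
    if random.random() < eps:
        return random.choice(PRICES)                    # explore (trial and error)
    return max(PRICES, key=lambda a: q[(state, a)])     # exploit best known action

random.seed(1)
state = (random.choice(PRICES), random.choice(PRICES))
for t in range(EPISODES):
    eps = max(0.01, 1.0 - t / EPISODES)                 # experiment less over time
    a1, a2 = choose(q1, state, eps), choose(q2, state, eps)
    r1, r2 = profits(a1, a2)
    next_state = (a1, a2)
    for q, a, r in ((q1, a1, r1), (q2, a2, r2)):        # standard Q-learning update
        best_next = max(q[(next_state, b)] for b in PRICES)
        q[(state, a)] += ALPHA * (r + GAMMA * best_next - q[(state, a)])
    state = next_state

print("Prices the two agents settle on:", state)

Neither agent is told anything about the other or about collusion; each simply learns, from repeated experimentation, which price to charge given the prices observed in the previous round. Whether the pair converges on the competitive price or something higher is an empirical question of parameters and environment, which is precisely why the research cited above had to test it by simulation.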

D Forms of Algorithmic Collusion

Competition law Professors Ariel Ezrachi and Maurice E Stucke have, in their research on AI and collusion,62 outlined several scenarios that describe how algorithmic collusion may arise. Each variation of algorithmic collusion outlined increases in design complexity, and therefore differs in its requirements for monitoring and regulation. This section will briefly summarise that work and the noted difficulties in regulating the various forms of algorithmic collusion.

Firstly, in the “Messenger Scenario”63 we can observe collusive price-setting algorithms in their most simplistic form as they require deliberate human involvement and consensus on price-fixing, while the algorithm acts as the messenger between the colluding parties to assist

57 Alistair Lindsay “Do We Need to Prevent Pricing Algorithms Cooking Up Markets” (2017) 38(12) ECLR 533; Ai Deng “When Machines Learn to Collude: Lessons from a Recent Research Study on Artificial Intelligence” (5 September 2017) Social Science Research Network www.ssrn.com.

58 OECD, above n 48, at 33.

59 Ashwin Ittoo and Nicholas Petit “Algorithmic Pricing Agents and Tacit Collusion: A Technological Perspective” (3 October 2017) Social Science Research Network www.ssrn.com.

60 Emilio Calvano, Giacomo Calzolari, Vincenzo Denicolò and Sergio Pastorello “Artificial Intelligence, Algorithmic Pricing and Collusion” (April 2019) Social Science Research Network www.ssrn.com at 3.

61 At 4.

62 Ariel Ezrachi and Maurice E Stucke “Artificial Intelligence & Collusion: When Computers Inhibit Competition” (2017) Univ. Ill. Law Rev. 1775.

63 At 1782.

in the implementation and monitoring of the collusive agreement. While in this scenario the AI certainly assists in collusive behaviour, it is not conceptually different to traditional forms of collusion, and thus can be enforced in much the same way.

Secondly, the “Hub and Spoke”64 scenario involves collusion between a cluster of various market operators (the ‘spokes’) via a third party (the ‘hub’) in a vertical relationship using an algorithm.65 Antitrust action for this form of collusion is more challenging, as a defence under s 31 is likely if the collaborative activity is not for the dominant purpose of substantially lessening competition.

Thirdly, in the “Predictable Agent”66 scenario computers are used unilaterally to predict certain outcomes and, given this, react to market conditions in a predictable manner. Though there is no human agreement to fix prices, the algorithms are able to mirror market prices and detect any price deviations. The algorithms may learn to detect and punish deviations that take the form of price cutting, thus deterring such pricing, which may lead to supra-competitive prices. Alternatively, the increased transparency within the market may result in parallel conduct increasing across the market, thus achieving a form of tacit collusion through conscious parallelism. The difficulty in this scenario is that there has been no form of agreement or understanding as to these pricing methods, as each competitor’s conduct is unilateral. It is, of course, completely rational that a firm would try to maximise profits through price-setting algorithms.

Fourthly, the “Digital Eye”67 scenario is similar to the predictable agent in that algorithms are designed to achieve a target such as maximising profit. However, through the use of DL the computers are able to use trial-and-error tactics to optimise profits autonomously. With the immense power that DL holds, the digital eye is able to monitor the market continuously and

64 Ezrachi, above n 62, at 1782.

65 A good example of how this form of pricing operates in an actual market is ride-share apps, such as Uber. The sophisticated pricing algorithm for ride-share apps automatically provides a price for a potential customer without negotiation between the customer and driver occurring, with the fare being paid directly to Uber, completely removing the possibility for drivers to compete on price for the same customer. There is some debate as to whether the net effects of this form of pricing are actually negative, as some would argue that the lower prices on apps and the ease of use of the platform outweigh any possible antitrust issues. See: Julian Nowag “When sharing platforms fix sellers’ prices” (2018) 6 J Antitrust Enforc 382.

66 Above, n 62, at 1783.

67 At 1783-1784.

process vast quantities of data, thus achieving a “God-like view of the marketplace”.68 Digital eye algorithms can coordinate tacitly through the computer’s own self-learning, due to experimentation undertaken within the ‘black box’ and not through any human involvement. This raises significant difficulties as to whether these practices can practically fit within the scope of antitrust law, or whether they can escape prosecution completely.

E The Oligopoly Problem

Finally, before undertaking analysis of the legal implications of AI and its facilitation of tacit collusion, it is prudent to end this chapter with some analysis of the ‘oligopoly problem’69 as from this traditional economic theory we can begin to unravel whether it is possible, or indeed desirable, to regulate tacit collusion by machines at all.

An oligopolistic market is one in which few firms compete with one another. This typically means that competitors are able to closely examine the actions of each other. Thus, the oligopoly problem theory proposes that competitors in an oligopolistic market must be interdependent. In other words, market competitors’ pricing strategies are adapted to achieve a stable non-competitive environment absent of incentives to compete with one another. As a result, and without the existence of an agreement between the market competitors, any degree of competitive pricing is replaced with an oligopolistic price which generates supra-competitive profits. The interdependence of this theoretical oligopolistic market is somewhat similar to a market that might be achieved through tacit collusion by AI, and thus analysis of this traditional scenario may provide a good framework for the regulation of its 21st century counterpart.

Of course, the reason why this theory is so intriguing is because it calls into question whether it is sagacious to regulate against its potential occurrence, given that there is, in the traditional sense, no observable form of agreement which competition agencies can restrain as a horizontal arrangement. Unfortunately, competition policy from across the globe has provided limited solutions, as the optimal market conditions under which a true oligopolistic market70 could operate in this way

68 Ariel Ezrachi and Maurice E Stucke Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy (Harvard University Press, Cambridge, 2016), ch 8.

69 Most commonly attributed to American jurist and economist Richard Posner. Richard A Posner “Oligopoly and the Antitrust Laws: A Suggested Approach” (1968) 21 Stanford Law Rev. 1562.

70 E.g. the ability to effectively retaliate to price deviations by competitors, highly concentrated markets with high barriers to entry and with homogenous product ranges. Marc Ivaldi, Bruno Jullien, Patrick Rey, Paul Seabright and Jean Tirole “The Economics of Tacit Collusion” (report for DG Competition, European Commission, March 2003) at 5.

are rarely observed.71 However, academic research has noted that algorithms may achieve collusive outcomes better than humans as they can more accurately detect changes in prices or market conditions, are totally rational and reduce the chances of mistake in any collusive strategy.72

Therefore, there is reason to argue that the increased use of AI in increasingly digital markets could broaden the circumstances in which the oligopoly problem could extend, thus reinstating a need to consider viable solutions.73

71 OECD, above n 4, at 35.

72 Salil Mehra “Antitrust and the Robo-Seller: Competition in the Time of Algorithms” (2015) 100 Minn. L. Rev. 1323 at 1352.

73 OECD, above n 4, at 54. AI may increase both the internal and external stability of markets, allowing incumbents in a market to instantaneously target new entrants and prevent them from penetrating a given market, removing the threat of new entrants.

“[T]acit collusion, sometimes called oligopolistic price coordination or conscious parallelism, describes the process, not in itself unlawful, by which firms in a concentrated market might in effect share monopoly power, setting their prices at a profit-maximising supracompetitive level by recognizing their shared economic interests and their interdependence with respect to price and output decisions and subsequently unilaterally set their prices above the competitive level”74

74 Brooke Group Ltd v Brown & Williamson Tobacco Corp [1993] USSC 105; 509 US 209 (1993).

Chapter II Antitrust Law in New Zealand

A New Zealand’s Per Se Ban on Cartel Conduct

New Zealand’s antitrust law is set out in the Commerce Act 1986 (the Act). The Act’s purpose is to promote competition in markets for the long-term benefit of consumers within New Zealand.75 Introduced to replace the Commerce Act 1975 and to align New Zealand’s legislation with Australia’s then Trade Practices Act 197476 (subsequently renamed the Competition and Consumer Act 2010), the newly formatted Act achieved a more simplified approach to antitrust law. Significant changes in the redrafting of the 1986 Act included per se bans77 on cartel conduct, which had previously not been the case. The Act then saw significant reforms in 2017 following the passing of the Commerce (Cartels and Other Matters) Amendment Act 2017, which extended the per se ban on cartel conduct78 from just price fixing to restrictive output and market allocation provisions too. It also created new exemptions for cartel conduct, in respect of which the Commission could grant clearance to parties upon reviewing their application.

Part 2 of the Commerce Act deals with the regulation of “Restricted trade practices” with the key provision, s 27, dealing with anticompetitive arrangements in general.79 More specifically sections 30 to 34 outline the per se ban on “cartel provisions”. Prior to the 2017 amendment, ss 27 and 30 operated in tandem to enforce bans on price fixing,80 however now this is not the case. Section 30 is now a standalone provision which does not require s 27 to be satisfied in order to enforce penalties. However, it is generally understood that cartel conduct under s 30 will weaken competitive relationships and thus satisfy the substantially lessening competition test under s 27. 81

75 Commerce Act, s 1A.

76 With this legislation being modelled on the United States’ Sherman Antitrust Act of 1890 (‘the Sherman Act’).

77 While the Act does not explicitly state the bans are per se, we can infer cartel conduct is in fact per se banned due to the fact that the Act does not require evidence of harm to competition. See Chris Noonan Competition Law in New Zealand (online looseleaf ed, Thomson Reuters) at [14.C.6.1.2].

78 A provision that does not require proof of harm to competition is considered to constitute a per se ban.

79 This section is deemed to apply to both horizontal and vertical relationships. See Noonan, above n 77.

80 If it was found that parties had entered into or given effect to provisions of contracts, arrangements, or understanding that had the purpose or effect or likely effect of fixing prices (thus satisfying s 30), then it could be said that conduct had the purpose, effect or likely effect of substantially lessening competition (thus satisfying s 27 and allowing for penalties to be applied).

81 See Commerce Commission v Taylor Preston Ltd [1998] 3 NZLR 498 at 509.

B Section 30

Section 30 prohibits a person from entering into a contract or arrangement, or arriving at an understanding, that contains a cartel provision. It also prohibits a person from giving effect to a cartel provision. Section 30A(1) defines a cartel provision as a provision that has the purpose, effect, or likely effect of price fixing, restricting output or market allocation. Price fixing is defined under subs (2):

(2) In this Act, price fixing means, as between the parties to a contract, arrangement, or understanding, fixing, controlling, or maintaining, or providing for the fixing, controlling, or maintaining of,–

(a) the price of goods or services that any 2 or more parties to the contract, arrangement or understanding supply or acquire in competition with each other; or

(b) any discount, allowance, rebate or credit in relation to goods or services that any 2 or more parties to the contract, arrangement, or understanding supply or acquire in competition with each other.

It is worth noting that s 30C of the Act makes cartel provisions unenforceable.82

While some of the key elements find their definition in statute, others do not. Therefore, I will explore each element in more detail to gain a better understanding of how and why difficulties arise with the current drafting in relation to algorithmic collusion.

C Enter into, arrive at, give effect to

The phrasing of “entering into... or arriving at” and “giving effect to” in the New Zealand Act is borrowed from s 45(2) of the Trade Practices Act 1974 (Cth) and is defined similarly. The arriving at provision, defined by s 2(1) of the Commerce Act, includes the notions of reaching and entering into. Because the definition of this provision includes the idea of entering into, it is considered that this term was “intended to expand the means whereby an understanding could be reached”.83 It can therefore be implied that to arrive at something

82 Section 27(4) also voids enforceability for provisions which satisfy subs (1) and (2).

83 Trade Practices Commission v Nicholas Enterprises Pty Ltd (1979) ATPR 40-126 (FA) at 18,346 – 18,347 per Fisher J on the equivalent Australian provision with the same wording.

means that such a thing comes into existence at the end of a process (not through a single discrete act) of forming the contract, arrangement, or understanding.84

“Giving effect to” is defined by s 2 of the Commerce Act to include doing “an act or thing in pursuance of or in accordance with [a cartel] provision” or seeking to “enforce or purport to enforce [a cartel] provision”. In contrast with the arriving at or entering into provisions, giving effect to something can convey either a process or a single discrete act.85 It is also likely that, due to the phrasing “in pursuance of or in accordance with”, this term requires something more than compliance or implementation: rather, an active effort to achieve the terms of the provision.86 It may also require some knowledge of such a provision on behalf of the person assuming liability under s 30.87 This is particularly important because while s 30 requires that “no person” shall give effect to a cartel provision, this does not imply that such a person needs to be a party to the contract, arrangement or understanding.88

D Contract, arrangement or understanding

For the purposes of this paper and the discussion that follows, the most important task is to gain an in-depth comprehension of the “spectrum of dealing”89 that exists in the statutory terms “contract, arrangement or understanding”. While the definition of contract retains its traditional common law meaning of “a legally enforceable agreement”,90 the other two definitions are somewhat trickier as they are less formally identifiable and there has been no definitive distinction between the two terms.91 The term “arrangement” is less formal than a

84 Noonan, above n 77, at [14.C.5.2.1].

85 Bray v F Hoffman-La Roche Ltd [2002] FCA 243, (2002) 118 FCR 1 at [158]- [161].

86 Tradestock Pty Ltd v TNT (Management) Pty Ltd (1997) ATPR 40-056 (FCA) at 17,571. Under this meaning, coordination of bids, communication of cartel policy to an employee of a subsidiary or the sale or supply of goods to a third party in accordance with the terms of a cartel may also be considered collusive activity. See Noonan, above n 77, at [14.C.5.2.1].

87 Australian Competition and Consumer Commission v IPM Operations and Maintenance Loy Yang Pty Ltd

[2006] FCA 1777 at [222].

88 However, s 90 will still create liability for such persons. This will be discussed more in depth below.

89 Australian Competition and Consumer Commission v Leahy Petroleum Pty Ltd [2007] FCA 794, (2007) ATPR 42-162 at [24] per Gray J.

90 Commerce Commission v The Wellington Branch of the NZ Institute of Driving Instructors (1990) 3 NZBLC 101,913 (HC) at 101,917; Commerce Commission v Carter Holt Harvey Building Products Ltd [2000] NZHC 1220; (2000) 9 TCLR 535 (HC) at 555. The definition of contract also includes a lease of, and licence in respect of, land or an interest in land per Commerce Act 1986 s 2(6).

91 Some courts have suggested there may be little substantive difference between an arrangement and understanding. See: Auckland Regional Authority v Mutual Rental Cars (Auckland Airport) Ltd [1987] NZHC 213; [1987] 2 NZLR 647 (HC) at 662; NZ Institute of Driving Instructors, above n 92; Hughes v Western Australian Cricket Assoc (Inc) [1986] FCA 357; [1986] ATPR 40-736 (FCA) at 48,040.

contract in that it is “lacking some of the essential elements that would make a contract”.92 The definition of “understanding” is less formal again.93

In regard to the definition of arrangement, New Zealand initially relied upon the British interpretation outlined in Re Basic British Slag.94 Diplock LJ’s formulation in that case was cited authoritatively in Auckland Regional Authority v Mutual Rental Cars (Auckland Airport) Ltd,95 Diplock LJ having stated that it is:96

...sufficient to constitute an ‘arrangement’ between A and B, if (i) A makes a general representation as to his future conduct with the expectation and intention that such conduct on A’s part will operate as an inducement for B to act in a particular way; (ii) such representation is communicated to B, who had knowledge that A so expected and intended; and (iii) such representation or A’s conduct in fulfilment of it operates as an inducement, whether among other inducements or not, to B to act in that particular way.

There has been some disagreement as to whether this test represents a complete enough definition given that it does not discuss other elements, such as mutuality and obligation, which are generally regarded as being key elements in finding whether there has been an arrangement or understanding formed.97 Nor does this test require any communication from B to A, or require B to feel any compulsion to act in the specified way.

The Privy Council in New Zealand Apple and Pear Marketing Board v Apple Fields Ltd98 then held that an arrangement “involves no more than a meeting of the minds between two or more persons not amounting to a formal contract, but leading to an agreed course of action”.99

New Zealand began to shift from the Re Basic British Slag style test in an effort to address issues that arose in relation to conscious parallelism and how it should be distinguished from

92 Re Wellington Fencing Materials Assoc [1960] NZLR 1121 (HC) at 1130.

93 Leahy Petroleum, n 89, at [26] and [27].

94 Re British Slag [1962] 2 All ER 807 (CA) at 819. Note the wording of the equivalent British provision only includes the terms “agreement” and “arrangement”.

95 Mutual Rental Cars, above n 91.

96 At 662.

97 Above, n 94, at 819 per Wilmer LJ.

98 New Zealand Apple and Pear Marketing Board v Apple Fields Ltd [1991] 1 NZLR 257.

99 At 216.

collusion. In Giltrap City Ltd v Commerce Commission100 Tipping J summarised the term agreement or understanding as requiring:101

...a consensus between those said to have entered into the arrangement. Their minds must have met – they must have agreed – on the subject-matter. The consensus must engender an expectation that at least one person will act or refrain from acting in the manner the consensus envisages. In other words, there must be an expectation that the consensus will be implemented in accordance with its terms.

Tipping J’s definition moved away from notions of legal or moral obligation set out in Re Basic British Slag102:103

While the concept of moral obligation is helpful in that it will often reflect the effect of an arrangement or understanding under s 27, the flexible purpose of the section is such that it is best to focus the ultimate inquiry on the concepts of consensus and expectation. A finding that there was a consensus giving rise to an expectation that the parties would act in a certain way necessarily involves communication among the parties of the assumption of a moral obligation.

Tipping J set out that whether the necessary consensus existed is to be judged from the perspective of what the reasonable person would infer from the conduct of the person (or persons) involved in the consensus.104 While Tipping J did not state how communication may necessarily be achieved, other case law suggests that the courts may find the necessary communication by inference, due to the difficulty of producing direct evidence of such communications.105

However, McGrath J was reluctant to reject the need for mutuality of moral obligation between the parties, stating that this obligation “should remain an important touchstone for determining whether there is an arrangement or understanding”.106 This requisite feature of an arrangement or understanding allowed for a “clear distinction” between conscious parallelism and collusive

100 Giltrap City Ltd v Commerce Commission [2004] 1 NZLR 608 (CA).

101 At [17].

102 Above, n 94, at 746-747.

103 Above, n 100, at [15].

104 At [23].

105 Above, n 91, at 662.

106 Above, n 100, at [66].

conduct.107 Without such an obligation there was “mere hope” or “anticipation” of the desired outcome which cannot constitute collusive conduct.108

The outcome in Giltrap City and the divide over the requirement for moral obligation were considered in the judgement of the now leading New Zealand case, Lodge Real Estate Ltd v Commerce Commission.109 In this case the Supreme Court summarised the definition of arrangement and understanding as requiring an objective assessment110 of whether there is:111

... a consensus or meeting of minds among competitors involving a commitment from one or more of them to act (or refrain from acting) in a certain way, that will constitute an arrangement (or understanding). The commitment does not need to be legally binding but must be such that it gives rise to an expectation on the part of the other parties that those who made the commitment will act or refrain from acting in the manner the consensus envisages.

In setting out this definition the Court abstained from differentiating between arrangement and understanding, as there had been no submission to the Supreme Court or the lower courts on any key differences as they pertained to the case. However, the Court did suggest that an understanding may have a “less restrictive meaning than arrangement” but elected not to discuss the matter further.112

In rejecting the argument that the Giltrap judgement could potentially capture conscious parallelism within the scope of s 27, the Court held that Tipping J’s judgement is to be interpreted to mean “that there will be no arrangement unless the expectation that arises from the consensus is such that it can be inferred that the parties to the consensus have assumed a moral obligation to each other”.113 If the parties were independently making decisions to act in a certain way and independently forming an expectation114 that others would act in such a way, there would be no

107 Giltrap City, above n 101 at [67].

108 At [68].

109 Lodge Real Estate v Commerce Commission [2020] NZSC 25.

110 At [50]. The Supreme Court also noted that subjective intent will be of little assistance to a defendant where, objectively, there are strong arguments in favour of an arrangement or understanding being entered into.

111 At [58].

112 At [30].

113 At [54]. In Giltrap City McGrath J stated “It is no part of the policy of s 27 [and ergo s 30] to catch... conscious parallelism” at [67].

114 Expectation is to be read as arising from the consensus, based on a commitment made between the parties. This was deemed consistent with the British Basic Slag approach observed by Wilmer LJ at 739. At [54].

arrangement formed.115 The Supreme Court did, however, prefer to replace the phrase “assumed a moral obligation” with “made a commitment”, as it did not believe introducing morality was necessary,116 and therefore no moral obligation has to be separately proven.117

Finally, ss 80(1)(b) and (d) create inchoate liability for cartel provisions. Through these sections the courts can, following application by the Commission, impose pecuniary penalties on persons who have attempted to contravene, or have induced or attempted to induce any other person, whether by threats or promises or otherwise, to contravene any provision of pt 2 of the Act. Injunctive relief for similar inchoate actions may also be granted via s 81.

E Provision

A provision, with reference to an understanding or arrangement, is defined by s 2(1) as “any matter forming part of or relating to the understanding or arrangement”. Australian courts have observed that the word is used in a comprehensive rather than a technical contractual sense. Attention should be paid to the content of the contract, arrangement or understanding rather than the exact form of expression of that content.118

More definitively, a provision must contain content which provides for something to occur, or not occur, with enough substance so as to be capable of having the relevant anticompetitive purpose or effect.119 Section 89 is a statutory affirmation of this idea, as it provides that provisions which contravene the Act do not affect the other provisions of a contract, arrangement or understanding.120

115 Re British Slag, above n 94, at [57].

116 Lodge Real Estate, above n 109, at [54]. This was because entangling the assessment of whether there had been an arrangement or understanding with issues of morality could lead to unpredictable outcomes in a commercial context. Though the existence of a moral obligation may be taken into account, the objective focus on consensus and expectation was to be preferred (at [67]).

117 At [55]. This finding was in line with Australian case law: at [63]. Although McGrath J had stated a moral obligation was an “important touchstone” for determining whether there had been an arrangement or understanding, this did not mean he was suggesting that it be a determinative element: at [67].

118 Visy Paper Pty Ltd v Australian Competition and Consumer Commission [2003] HCA 59, (2003) 216 CLR 1 at [7] and [32].

119 Leahy Petroleum, above n 89, at [32].

120 Noonan, above n 77, at [14.C.5.2.2].

F Purpose, effect, or likely effect

Under s 30A(1) there are three tests of legality for cartel provisions: the “purpose”, “effect” or “likely effect” of price fixing.121 While there is considerable judicial and academic debate surrounding the appropriate definition of purpose122 and whether it is to be assessed subjectively or objectively,123 the extent to which this discussion might be relevant in application to AI is limited, and to extend the discussion in that direction would be to go off on a tangent. As discussed in the prior chapter, the use of DL and the way in which the ‘black box’ operates means that it is impossible to fully understand the decision-making process, and thus the purpose, of a machine at the present time. For this reason, under the current legislation the relevant tests of legality for price fixing by machines would be the “effect” or “likely effect” limbs.

Section 2(3) of the Act makes clear that while a provision in a contract, arrangement or understanding may not have the effect or likely effect of price-fixing when it was arrived at, entered into or given effect to, it may at a later date become unlawful. Therefore, the Act is relevant throughout the entire life of the contract, arrangement or understanding. This is particularly important in relation to price-fixing by machines because as we know they are able to learn through trial-and-error in their pricing strategies and thus their methods are always slightly adapting.

In determining whether or not a provision has the “effect” or “likely effect” the courts will undertake a “counterfactual analysis”.124 When considering whether a contract, arrangement or understanding will have the “likely effect” of “fixing, controlling, or maintaining” price, the standard is that of “...above mere possibility but not so high as more likely than not and is best expressed as a real and substantial risk that the stated consequence will happen”.125 This will

121 This is similar to s 27, under which most of the case law discussed arises. However, the definitions are equally applicable to this section as well. See Ophthalmological Society of New Zealand v Bolton [2001] NZCA 296 at [116].

122 See Commerce Act, s 2(5). For more discussion on this debate see: Rex Ahdar The Evolution of Competition Law in New Zealand (Oxford University Press, Oxford, 2020) at 97 - 100; and Paul Scott “The Purpose of Substantially Lessening Competition: The Divergence of New Zealand and Australian Law” (2011) Waikato Law Rev 168.

123 Australian case law has appeared to confirm the adoption of a subjective test for purpose. Contrastingly, New Zealand has appeared to take an objective approach. See Ahdar, above, n 122, at 97.

124 Essentially the weighing up of the pro- and anticompetitive effects of a provision; i.e. the court will conduct a hypothetical test of the relevant market with and without the disputed provision. See ANZCO Foods Limited v AFFCO NZ Limited [2005] NZCA 166; [2006] 3 NZLR 351 at [245]. This is similar to the “rule of reason” analysis used by the courts in the United States in the assessment of agreements under s 1 of the Sherman Act. See Scott, above n 122, at 170.

125 Port Nelson Ltd v Commerce Commission [1996] NZCA 230; [1996] 3 NZLR 554 (CA) at 562.

normally pertain to future conduct; however, past conduct (even if no such price-fixing eventuated) is not beyond consideration. That said, the courts will usually apply the benefit of hindsight, so that a contract, arrangement or understanding which did not have the actual effect of price-fixing will generally not be captured by the “likely effect” limb.126

G Fixing, controlling, or maintaining

The Act does not provide a definition for the phrase “fixing, controlling, or maintaining”. We therefore must turn to case law to ascertain its meaning. The courts have “tended to focus on the constituent words of the phrase”.127

Prices may be fixed by forming an agreement as to the final figure of the price, or a method or formula by which the final price is derived. Such a price does not need to be permanently fixed; however, it must be fixed for a period that is not instantaneous or ephemeral.128 There must also be a degree of “proximity” between the price and the disputed arrangement or understanding. The fact that there is some discretion as to the overall price does not mean that it cannot have the requisite purpose, effect or likely effect.129 Collusion on a starting price, off which negotiations may be based, is enough.130 To hold otherwise would facilitate the avoidance of s 30, which would be contrary to the section’s purpose.131 There need not be any penalty for not adhering to the agreed figure or price-setting method,132 nor must the arrangement be “absolute in its requirement for adherence”.133

Prices are said to be maintained if “a price assumes it has been fixed beforehand”.134 Australian case law suggests that the scope of the definition of “controlling” is characterised by a

126 Extraneous events or factors which intervened with the expected outcome may justify a different result.

127 Tony Dellow and Anna Parker Commercial Law in New Zealand (online ed, LexisNexis) at [35.4.2]. See also Radio 2 UE Sydney Pty Ltd v Stereo FM Pty Ltd (1983) 48 ALR 361, (1983) 68 FLR 70 where it was stated at 72 that “In our view the word “fixing” in s 45A [of the Trade Practices Act; the old equivalent Australian provision to s 30] takes colour from its general context and from the words used with it – “controlling or maintaining”...”.

128 See: Commerce Commission v Caltex New Zealand Ltd [1998] 2 NZLR 78 (HC) at 84-85; Radio 2 UE, above n 127, at 449; Trade Practices Commission v Parkfield Operations Pty Ltd [1985] FCA 27; (1985) 5 FCR 140 at 143; approved on appeal [1985] FCA 403; (1985) 7 FCR 534 (FCAFC) at 540; Re Insurance Council of New Zealand (Inc) (1989) 2 NZBLC (Com) 104,477 at 104,482.

129 Commerce Commission v Lodge Real Estate [2018] NZCA 523 at [83] and [90].

130 At [87]-[88].

131 At [91].

132 Federal Trade Commission v Pacific States Paper Trade Association [1927] USSC 8; 273 US 52 (1927).

133 Above, n 129, at [87].

134 Radio 2 UE, above n 127, at 449. The correctness of this interpretation was questioned in Australian Competition & Consumer Commission v CC (New South Wales) Pty Ltd [1999] FCA 954; (1999) 165 ALR 468 at 498. Lindgren J argued that the contrary could be true in that there may be an agreement or understanding to agree on a “component sufficiently influential on price” to be included in a tender prior to the fixing of a minimum price level.

“restrain[t of] freedom that would otherwise exist as to a price to be charged”.135 A similar approach has been taken in New Zealand, with Salmon J in Commerce Commission v Caltex NZ Ltd136 noting that while the meaning of control relied on the meaning of “fixing” and “maintaining”, it should take on its ordinary meaning of “to exercise restraint or direction upon the free action of”. Further, the degree of control may be relevant to the court’s discretion as to the appropriate penalty, but not to the actual contravention of the provision.137

H Price

Price includes, by the definition set out in s 2(1), direct or indirect valuable consideration in any form.138 In the context of price fixing, s 30 prohibits a contract, arrangement or understanding only if it has the requisite purpose, effect or likely effect of fixing, controlling or maintaining price. It is therefore not necessary that price is the specific subject of such contract, arrangement or understanding.139

The definition of price fixing in s 30A(2) also includes the concepts of discount, allowance, rebate or credit as coming within the ambit of the valuable consideration element of price. There has been a considerable amount of case law discussing the relationship between these concepts as it relates to price, however for the purpose of this dissertation it is not necessary to go further in depth on this point.140

I Section 90

Section 90 of the Commerce Act creates vicarious liability through attribution of a state of mind141 in relation to a contravention of the Act in four instances:

135 CC (NSW) Pty Ltd, above n 134, at 504. Applied in Australian Competition & Consumer Commission v Pauls Ltd [2002] FCA 1586; (2003) ATPR 41-911 and Australian Competition & Consumer Commission v Australian Medical Association Western Australia Branch Inc [2003] FCA 686; (2003) 199 ALR 423, (2003) ATPR 41-945 at 47,26. In Australian Medical Association it was further held that the word “control” was to be read in context with “fixing and maintaining”.

136 [1998] 2 NZLR 78 (HC) at 311.

137 Above, n 136, at 507. This was accepted by the NZCA in Lodge Real Estate, above n 129, at [86].

138 This may include any consideration that in effect relates to the acquisition or supply of goods or services or the acquisition or disposition of any interest in land, although ostensibly relating to any other matter or thing.

139 Dellow and Parker, above n 127, at [35.4.2].

140 For more discussion see: Noonan, above n 79, at [14.C.6.3.4]; Dellow and Parker, above n 127, at [35.4.3]. New Zealand’s leading case, Caltex New Zealand, above n 128, dealt with allegations by the Commission that the withdrawal of existing car wash promotions at retail petrol stations fell within the definition of price fixing under s 30A for the purposes of s 30. The result was the conclusion that the price of a good or service, when dealing with allegations of price fixing, is not simply the nominal price, but a function of credit terms, discounts, allowances and rebates as well.

141 Subsection (5) states that the definition of “the state of mind of a person” will include the knowledge, intention, opinion, belief, or purpose of the person and the person’s reasons for that intention, opinion, belief, or purpose, and the state of mind of a person outside New Zealand.

(1) State of mind of persons other than individuals: attribution of state of mind via a director, employee or agent acting in the scope of their actual or apparent authority.

(2) Conduct of persons other than individuals: attribution of conduct by an individual if at the time of conduct:

(a) the person was a director, employee or agent acting within the scope of their actual or apparent authority; or

(b) the person was acting on the direction or with the consent or agreement (either express or implied) of a person described in paragraph (a) above.

(3) State of mind of individual person: attribution of state of mind in a civil proceeding via an employee or agent of the individual acting in the scope of their actual or apparent authority.

(4) Conduct of an individual person: attribution of conduct in a civil proceeding is deemed to be the conduct of an individual if, at the time of the conduct:

(a) the person was acting on the direction, or with the consent or agreement (either express or implied) of the individual; or

(b) the person was an employee or agent of the individual acting within the scope of their actual or apparent authority.

The purpose of s 90142 was stated by McGrath J in Giltrap City to be:143

...to better secure compliance with the Act’s purpose of promoting a competitive market by confining the scope for a company to obtain the benefit of restrictive practices prohibited by the Act, simply because they were undertaken by low-level employees without the direct knowledge of the board or senior management.

Conduct was held to fall within the scope of the actual or apparent authority of an employee if it related to something which the person was employed to do, “considered subjectively in terms of actual employment arrangements or objectively in terms of the reasonable perceptions of [an] observer”.144

142 This was in regard to s 90 prior to its re-drafting in 2017. However, the new provision is not materially different enough to render this interpretation insufficient for the current drafting.

143 Above, n 100, at [77].

144 Above, n 100, at [80].

J Exceptions, Authorisation & Remedies

Sections 31 to 33 of the Commerce Act provide exceptions to the cartel provisions for collaborative activity, vertical supply contracts, and joint buying and promotion agreements. Further, s 58 allows persons to seek authorisation from the Commission for conduct which would otherwise contravene the restrictive trade practices provisions. The Commission will then proceed with an appropriate assessment of the public benefit test set out in s 61. However, the exception provisions already discussed will cover most claims likely to be successful in gaining authorisation.

If, however, a person is found in contravention of the cartel provisions and neither falls under one of the outlined exemptions nor has been granted authorisation by the Commission, the Commission may proceed with enforcing the Act to seek pecuniary penalties,145 injunctions146 and damages147 (including exemplary damages148).

K Investigation and Detection of Cartel Conduct

The Commission, as part of its role of investigating suspected breaches of the Commerce Act, has adopted policies149 on how to use its discretionary power to open investigations and proceed with enforcement action. These documents generally follow the theory of responsive regulation, which in short suggests that compliance is best achieved by utilising a pyramid of enforcement responses for the regulatory regime.150

The Commission’s preference is to encourage compliance through non-enforcement options.151 However, where enforcement is deemed to be required, the Commission will focus its resources and attention on cases which are considered to generate the greatest “harm”.152 Enforcement responses will have one of several objectives: stopping the unlawful conduct,

145 Commerce Act, s 80.

146 Commerce Act, s 81.

147 Commerce Act, s 82.

148 Commerce Act, s 82A.

149 See: Commerce Commission “Enforcement Response Guidelines” (23 August 2017) <www.comcom.govt.nz>; Commerce Commission “Criminal Prosecution Guidelines” (29 November 2013) <www.comcom.govt.nz>; Commerce Commission “Enforcement Criteria” (3 June 2014) <www.comcom.govt.nz>; Commerce Commission “Model Litigant Policy” (31 May 2016) <www.comcom.govt.nz>.

150 Ian Ayres and John Braithwaite Responsive Regulation: Transcending the Deregulation Debate (Oxford University Press, 1992); Richard Baldwin and Julia Black “Really Responsive Regulation” (2008) 71 MLR 59 as cited in Noonan, above n 77, at [14.C.16.1.3(2)], n 60.

151 “Enforcement Response Guidelines”, above n 149, at [10] and [11].

152 The Commission will consider the extent of the detriment, the seriousness of the conduct and the public interest in their determination of the extent of the “harm”. These criteria are subject to a balancing test which will determine whether enforcement is appropriate. See Noonan, above n 77, at [14.C.16.1.3(2)].

deterring future unlawful conduct by that person or others, remedying any harm caused by the unlawful conduct, punishing the wrongdoer (where appropriate), encouraging businesses to comply or providing informative public precedents.153

153 “Enforcement Response Guidelines”, above n 149, at [12] and [13].

“Pity the poor price fixer. Artificial intelligence... has the potential to replace the smoke-filled meeting rooms of cartel lore with humming, refrigerated server farms working away to collude in ways that competition enforcers might struggle to detect, much less crack down upon”154

Nicholas Hirst

154 Nicholas Hirst “When Margrethe Vestager takes antitrust battle to robots” (28 February 2018) Politico https://www.politico.eu/article/trust-busting-in-the-age-of-ai/

Chapter III An International Perspective on Algorithmic Tacit Collusion

A Concerns Over the Commerce Act 1986

While this part was initially to sit nearer to the end of this paper, it has become apparent that any concerns over the Commerce Act’s ability to handle algorithmic collusion are only the beginning of a larger and far more complex discussion. The big question initially posed, whether New Zealand’s antitrust law is capable of addressing algorithmic collusion, can be answered simply with a resounding “no”. This is for two reasons.

Firstly, s 30 presents an attribution problem. The section requires that a “person” enter into, arrive at or give effect to a cartel provision. Presently, machines are not classified as “persons” before the law, which means there can be no attribution of their conduct or state of mind to an individual or company via s 90. While the actions and decisions of a machine that has been specifically programmed by humans to collude are easily attributable to a legal person, machines employing ML and DL algorithms to make decisions based on programming that has no collusive intention raise many issues. In this case it has been suggested that:155

[I]t would not be appropriate to attribute the mental state of the designer or user to the company because of the disconnection between the AI system’s conduct and mental state of the human actors... nor would it be possible to attribute the mental state of the AI system to the company because an AI does not have a mental state... [because] all of its decisions are based on mathematical calculations.

Leading researcher Salil Mehra has stated that “[i]n dealing with a robo-seller that takes anticompetitive actions there are three choices in attributing responsibility: to the robo-seller itself, to the humans who deploy it, or to no one”.156 Evidently the latter is the status quo in New Zealand. Therefore, if New Zealand decides to make changes to our antitrust regime, we will need to consider which of the first two options is the most suitable and how such changes may be achieved.

Secondly, s 30 presents a “meeting of the minds” problem. The terms “contract, arrangement or understanding” all require some element of consensus ad idem, which is clearly impossible

155 Liu, n 45, at 207-208.

156 Above, n 72, at 1366.

to prove when machines are able to collude without communication with one another. Even if it were possible to deduce that a machine, of its own accord, decided to employ collusive pricing strategies, it may be impossible to prohibit this behaviour under s 30 as there has been no “contract, arrangement or understanding” with another person.

It is worth noting that, for the reasons discussed above, s 27 of the Commerce Act is of no additional help in the quest to handle possible cases of algorithmic collusion. Neither is s 36, as it requires the element of “purpose” and, as we know, the way ML operates inside the “black box” makes it impossible to derive the purpose of a machine employing these tactics.157

At present we can conclude that there are undeniable issues that will be faced by courts in New Zealand if a case of algorithmic collusion is brought before them under the present legislation. While we are yet to see such a case in New Zealand, it is still worth considering the possible solutions to address the legislative inadequacies as they relate to algorithmic collusion. The aforementioned issues with the New Zealand framework will be discussed in the following chapter, after a brief assessment of international responses to similar issues.

B Algorithms Before International Courts

With the above conclusion addressed, it is prudent to look beyond our shores to see how other jurisdictions are facing similar issues.

  1. Posters and Frames Case
In 2016 two companies were held liable under s 2(1) of the UK’s Competition Act 1998, which prohibits agreements having “as their object or effect the prevention, restriction or distortion of competition within the United Kingdom”. An agreement between Trod Limited (Trod) and GB eye Limited (GBE) provided that “where there was no cheaper third-party seller on the online retail platform [Amazon UK], they would not undercut each other on prices”.158 This was implemented using separate automated pricing software. GBE’s software was designed to match Trod’s price where there was no cheaper third party. Trod’s software was programmed to use an ignore function which excluded GBE from the regular pricing rules used to undercut other competitors.

157 It may perhaps be possible to utilise s 36B to infer the purpose of the machine as a relevant circumstance or if AI eventually gains legal personhood.

158 CMA Case 50233 Online sales of posters and frames (12 August 2016).

From a New Zealand legal perspective, this is not all that different from our traditional understanding of price-fixing, as the only differing detail is that the arrangement was implemented using AI instead of manual pricing inputs. However, a critical factor in the Competition and Markets Authority’s (CMA) decision was the evidence of interaction between the parties via email. This begs the question of whether, without such hard evidence, algorithmic collusion in cases like this would be so easy to detect and prosecute if two colluding firms were more careful not to leave a ‘paper trail’ of evidence.
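To make the mechanics described above concrete, the following is a minimal, hypothetical Python sketch of the kind of repricing rules the CMA decision describes. The function names, parameters and one-cent undercutting step are illustrative assumptions only; the actual Trod and GBE software is not publicly available.

def gbe_reprice(trod_price, third_party_prices):
    # GBE's rule as described: match Trod where no cheaper third-party
    # seller is listed; otherwise compete normally against third parties.
    cheapest_rival = min(third_party_prices, default=None)
    if cheapest_rival is None or cheapest_rival >= trod_price:
        return trod_price                   # never undercut the other party
    return round(cheapest_rival - 0.01, 2)  # undercut genuine competitors only

def trod_reprice(own_price, all_offers, ignore_sellers):
    # Trod's rule as described: an 'ignore' list excludes GBE from the
    # ordinary undercutting logic applied to every other seller.
    relevant = [price for seller, price in all_offers if seller not in ignore_sellers]
    if not relevant:
        return own_price
    return round(min(relevant) - 0.01, 2)

# With GBE on Trod's ignore list, neither bot undercuts the other:
print(gbe_reprice(trod_price=9.50, third_party_prices=[11.00]))        # 9.5
print(trod_reprice(9.50, [("GBE", 9.50), ("Other", 11.00)], {"GBE"}))  # 10.99

The sketch illustrates why, absent the email correspondence, nothing on the face of either firm’s code looks more sinister than routine automated repricing.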

2. Eturas Case
The first EU case concerning pricing algorithms, Eturas and others v Lietuvos Respublikos konkurencijos taryba,159 involved an algorithm applied on a Lithuanian travel agency platform, E-Turas, on which various agencies sold their products. Following a request from the agencies using the platform, the algorithm was programmed to cap all discounts at 3%. The platform’s internal messaging system was used to communicate with the various agencies about this pricing decision in the form of an amendment to the platform’s terms and conditions.

At issue was the fact that only those agencies which knew of the programmed discount cap could be in contravention of Article 101 TFEU for concerted practice. To be found liable, the Court of Justice of the European Union (CJEU) was of the view that the travel agencies must have had actual knowledge of the communication160 in order to “tacitly assent to the anti-competitive action”.161 If the travel agencies were found to have “publicly distanced themselves from the practice, reported it to administrative authorities or adduce[d] other evidence to rebut the presumption”,162 then they would avoid liability. The CJEU held that receiving the message was not by itself sufficient to presume actual knowledge; “objective and consistent” indicia were instead required.163 The CJEU left the “assessment of evidence and standard of proof” to Lithuania’s national court to make a ruling according to Lithuanian law.164 The overall result was that only travel agencies which received the message and did not oppose it were held liable.

159 Case C-74/14 Eturas and others v Lietuvos Respublikos konkurencijos taryba [2016] OJ C 98/3, ECLI:EU:C:2016:42.

160 At [41].

161 At [45].

162 At [46]-[49].

163 At [40].

164 Above, n 159, at [50].

In the New Zealand context, a case similar to this would likely be addressed relatively easily. The Commission’s response would likely be that the agreement to use a third-party pricing agency (regardless of the use of algorithms) would amount to cartel conduct, using s 90 to attribute the conduct of the third party to the agent.
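By way of illustration of how simple the platform-side mechanism in Eturas was, the rule at issue reduces to a single cap applied to every agency’s booking. The sketch below is hypothetical; only the 3% figure comes from the judgment.

MAX_DISCOUNT = 0.03  # platform-wide cap described in the judgment

def apply_discount(base_price, requested_discount):
    # Clamp whatever discount an individual agency requests to the cap.
    effective = min(requested_discount, MAX_DISCOUNT)
    return round(base_price * (1 - effective), 2)

# An agency attempting to offer 10% off is silently limited to 3%:
print(apply_discount(500.00, 0.10))  # 485.0 rather than 450.0

The legal difficulty, as the CJEU’s reasoning shows, lay not in this code but in what each agency knew about it.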

3. Electronic Goods Manufacture Case
Another European case, in 2018, involved the European Commission fining various electronic goods manufacturers for restricting online retailers from independent resale pricing in an effort to pressure them to maintain higher prices. Algorithms were used by the manufacturers to monitor resale prices and thereby adjust supply levels and wholesale prices for those retailers who were selling at prices lower than the manufacturers desired.165 Again, physical evidence of communication between the manufacturers played a large part in the overall determination of the case.166

Of note is that the proceedings in this case were initiated by an unannounced inspection167 prompted by the European inquiry into e-commerce (discussed further below). Recent commentary on the case has suggested that the unannounced inspection tactic is beneficial, particularly given the COVID-19 pandemic’s correlation with a 74% increase in e-commerce transactions compared with the same period in the previous year.168

Another noteworthy aspect highlighted by this case is that the various manufacturers’ headquarters were situated in various countries across Europe and Asia, with consumers affected in countries right across the EU. This shows something of the ease with which global cartels can form, and the global reach they can have, in a digitised economy.
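As a rough, hypothetical sketch of the monitoring function described in these decisions, the core task is simply to flag retailers advertising below the price the manufacturer wishes to see maintained, so that supply or wholesale terms can then be adjusted. The threshold, names and data structure are assumptions for illustration only.

def flag_discounters(observed_prices, target_price, tolerance=0.02):
    # Return retailers advertising below the desired resale price
    # (allowing a small tolerance before intervention).
    floor = target_price * (1 - tolerance)
    return [retailer for retailer, price in observed_prices.items() if price < floor]

prices = {"RetailerA": 199.00, "RetailerB": 184.50, "RetailerC": 201.00}
print(flag_discounters(prices, target_price=199.00))  # ['RetailerB']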

4. Posters Case
Finally, in 2015 the United States’ Department of Justice (DOJ) filed criminal proceedings against e-commerce retailers using the platform Amazon.169 The retailers had entered into an

165 European Commission “Antitrust: Commission fines consumer electronics manufacturers for fixing online resale prices” (press release, 24 July 2018).

166 Case At 40464 – ASUS at [30] – [33].

167 At [15] – [16].

168 Claudia P O’Kane and Ioannis Kokkoris “A Few Reflections on the Recent Caselaw on Algorithmic Collusion” (3 August 2020) Social Science Research Network https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3665966 at 4.

169 Department of Justice “Former E-Commerce Executive Charged with Price Fixing in the Antitrust Division’s First Online Marketplace Prosecution” (press release, 6 April 2015).

agreement to fix the prices of posters, facilitated by harmonised algorithmic pricing software. The algorithms were programmed to undercut the lowest priced competitor not party to the agreement so that the products of the cartel members would be the nearest to the top of the search query, thus avoiding competition with each other.

Similar to the Electronic Goods Manufacture case, the outcome of this case was the result of proactive DOJ investigation following market studies. This once again highlights the usefulness of such studies in combating collusive behaviour facilitated by AI.
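A minimal, hypothetical sketch of the coordinated repricing rule described in the DOJ’s case follows: each cartel member prices just below the cheapest seller outside the agreement and never undercuts a fellow member, so that members cluster near the top of the results without competing with each other. The names and one-cent step are illustrative assumptions.

def cartel_reprice(own_price, offers, cartel_members):
    # Undercut only the cheapest non-member; never undercut a member.
    outside = [price for seller, price in offers if seller not in cartel_members]
    if not outside:
        return own_price
    return round(min(outside) - 0.01, 2)

offers = [("MemberA", 12.00), ("MemberB", 12.00), ("Outsider", 11.50)]
print(cartel_reprice(12.00, offers, cartel_members={"MemberA", "MemberB"}))  # 11.49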

C Are Regulatory Responses Looming?

It is of no surprise that, as the ways in which collusion can be facilitated rapidly advance and thereby alter the purview of competition law, regulatory bodies, competition officials and academics are beginning to take note of the issues discussed.

Though the technologies causing these issues are novel, at its core this dilemma is of course a 21st century revival of the oligopoly problem, except on a larger and digitised scale. Research suggests that the rise of ML algorithms used in pricing software will exacerbate this problem, as its occurrence will no longer be limited to rare circumstances under particular market conditions.170 There has therefore been a “renewed debate about whether classic oligopoly behaviour can be prosecuted as an unlawful agreement”.171

Arguably, one of the main reasons that tacit collusion is currently viewed as lawful across the globe is the lack of any remedy within present legal frameworks. As Stephen Breyer J phrased the issue:172

That is not because such pricing is desirable (it is not), but because it is close to impossible to devise a judicially enforceable remedy for “independent” pricing. How does one order a firm to set its prices without regard to the likely reactions of its competitors?

170 See Jorge Lemus and Fernando Luco “Price Leadership and Uncertainty about Future Costs” (6 June 2020) Social Science Research Network https://ssrn.com/abstract=3186144 and Mehra, above n 72, at 1342.

171 George A Hay Anti-Competitive Agreements: The Meaning of ‘Agreement’ (Cornell Law School, No 13-09, 18 February 2013) at 5.

172 Clamp-all Corporation v Cast Iron Soil Pipe Institute [1988] USCA1 298; 851 F 2d 478, 848 (1st Cir 1988) (citation omitted).

These difficulties have been echoed by former Federal Trade Commission (FTC) Commissioner Terrell McSweeny, who noted that, due to the lack of remedy for conscious parallelism, a coordinated effects theory used to predict and thus prevent tacit collusion may be more desirable than judicially enforceable remedies.173

As of now we are yet to see many decisive international responses; however, a number of bodies have issued a series of warnings and conducted discussions about how such issues might be dealt with. In 2016 the European Commission acknowledged in its Preliminary Report of the E-commerce Sector Inquiry that explicit or tacit collusion may be facilitated by price monitoring software which employs algorithms similar to the ones discussed in prior chapters.174 In the same year McSweeny, on behalf of the FTC, expressed similar sentiments, noting that price discrimination as a result of algorithms may be a challenge in the future.175 Australian Competition and Consumer Commission (ACCC) Chairman Rod Sims also stated in 2017 that the excuse “my robot did it” would not excuse liability for algorithmic collusion.176

Most notably, European Commissioner Margrethe Vestager, in her speech at the 2017 Bundeskartellamt 18th Conference on Competition, alluded to the EU adopting an approach which would see companies liable for the decisions of their machine agents and adopting a “compliance by design” approach as preventative action against collusion:177

What businesses can and must do is ensure antitrust compliance by design... That means pricing algorithms need to be built in a way that doesn’t allow them to collude... Businesses need to know that when they decide to use an automated system, they will be held responsible for what it does.

She expanded on this theme at the 2017 Web Summit, indicating that the intention is perhaps to have algorithms programmed to understand antitrust law before they are able to be used as pricing tools. The purpose of this is that “[w]e don’t want the algorithms to

173 Terrell McSweeny and Brian O’Dea “The Implications of Algorithmic Pricing for Coordinated Effects Analysis and Price Discrimination in Antitrust Enforcement” (2017) 32 Antitrust Law J 75 at 76.

174 Above, n 55, at (608).

175 Terrell McSweeny “Competition Law: Keeping Pace in a Digital Age” (keynote remarks presented to the 16th Annual Loyola Antitrust Colloquium, Chicago, 15 April 2016).

176 Rod Sims, ACCC Chair “The ACCC’s approach to colluding robots” (Can Robots Collude Conference, Sydney, 16 November 2017).

177 Margrethe Vestager Algorithms and competition (Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017).

learn the tricks of the old cartelists... We want them to play by the book when they start playing by themselves”.178

While the compliance by design approach is interesting, as it negates the need to address the precarious debate on the oligopoly problem, “there are currently no clear indications how it could be integrated into the already complex competition policy fabric”.179

Other bodies and officials have expressed doubt over the “compliance by design” strict liability approach favoured by the EU, with the Chairman of the CMA, David Currie, questioning “how far can the concept of human agency be stretched to cover these sorts of issues?”.180 Russian authorities have also discussed the challenges of determining “the responsibility of computer engineers for programming machines that are ‘educated’ to coordinate prices on their own”.181

On the other hand, the United States believes that, unless algorithms behave in a way that constitutes concerted action,182 “independent adoption of the same or similar pricing algorithms is unlikely to lead to antitrust liability, even if it makes interdependent pricing more likely”.183 This is based on the goal of United States antitrust law to “safeguard the competitive process”184 by not interfering with independent pricing decisions, given the benefit to consumers of pricing competition and the role independent prices play in efficiently allocating resources. In short, the goal of competition policy in the United States “is to protect competition as the most appropriate means of ensuring the efficient allocation of resources”.185 Their belief is that merger control, alongside prosecuting collusive behaviour, is the best mechanism to deter pricing interdependence for both traditional and algorithmic collusion.186

178 Interview with Margrethe Vestager, European Commissioner (Kara Swisher, Web Summit, 6 November 2017) see video https://youtu.be/90OhCfyYOOk.

179 Simonetta Vezzoso “Competition by Design” (paper prepared for presentation at the 12th ASCOLA Conference, Stockholm University, 15-17 June 2017) at 23-24.

180 David Currie, Chairman of the UK Competition and Markets Authority “The Role of Competition in Stimulating Innovation” (Concurrences Innovation Economics Conference, King’s College London, 3 February 2017).

181 “Algorithms and Collusion” (note by the Russian Federation submitted for Item 10 of the 127th OECD Competition committee, 21-23 June 2017) at [20].

182 Thus, being in breach of s 1 of the Sherman Act.

183 “Algorithms and Collusion” (note by the United States submitted for Item 10 of the 127th OECD Competition committee, 21-23 June 2017) at [18]. In the example given in the paper, an unknowing adoption of the same software using identical algorithms is not considered to give rise to antitrust liability.

184 At [7].

185 Above, n 183, at [8].

186 At [18].

Australia is one of the only nations to have already actively addressed its regulatory framework in preparation for possible cases of algorithmic collusion. Following the 2016 Harper Review of Competition Policy, two important amendments were made to the Competition and Consumer Act. Firstly, a new provision was inserted under s 45, the equivalent of s 27 of New Zealand’s Commerce Act, which states that “[a] corporation must not engage with one or more persons in a concerted practice187 that has the purpose, or has or is likely to have the effect, of substantially lessening competition”. A concerted practice188 provision negates the requirement for a “meeting of the minds” when considering whether collusive behaviour has occurred between competitors. A new misuse of market power provision under s 46 (the equivalent of New Zealand’s s 36) was also included in the 2017 amendment, which adds the ability to prosecute a corporation with substantial market power that engages in conduct which has either the “purpose” or “has or is likely to have the effect, of substantially lessening competition”. New Zealand, though utilising similar wording to Australia, has not adopted either of these changes.

ACCC Chairman Rod Sims stated that he was “confident” that the introduction of these amendments within the existing framework is able to deal with instances of “an intelligent robot engag[ing] in sustained collusion with another robot... if they substantially lessen competition”. He went on to state that if “the law is ultimately found wanting, I also have little doubt it will be amended”.189

It is easy to see the benefits these changes bring to Australia’s existing legislation. For example, they will neatly deal with instances similar to that seen in the Eturas case through the s 45(1)(c) concerted practices insertion. It is also clear that these amendments are somewhat focused on conduct by large companies, particularly in regard to the misuse of market power amendment, given that a corporation must be proved to possess “substantial market power” before

187 The ACCC defines a concerted practice as: “involv[ing] communication or cooperative behaviour between businesses that may not amount to an understanding but goes beyond a business independently responding to market conditions”. ACCC “Anti-competitive conduct” https://www.accc.gov.au/business/anti-competitive-behaviour/anti-competitive-conduct.

188 Europe and the United States already have “concerted practice” laws in place. See: Article 101 Treaty on the Functioning of the European Union (TFEU) and s 1 of the Sherman Act, which while only making reference to “contract, combination... or conspiracy” is generally understood as conceptualizing concerted activity.

189 Above, n 176.

they may be liable. Ultimately, however, Australia has side-stepped the tricky discussion of the regulation of conscious parallelism.190

These opposing views and approaches, as well as the further discussion points that they raise, show the sheer difficulty in addressing algorithmic collusion. This is perhaps why we are yet to see decisive regulatory responses in several parts of the world. Time will of course tell what proactive regulatory steps, if any, are taken. What is clear now is that there is a great deal to consider.

190 The ACCC has distinguished concerted practice from parallel conduct. See Guidelines on Concerted Practices (ACCC, August 2018) at 3.6.

“Finding ways to prevent collusion between self-learning algorithms might be one of the biggest challenges that competition law enforcers have ever faced”191

OECD

191 Ania Thiemann and Pedro Gonzaga “Big Data: Bringing Competition Policy to the Digital Era” (Background note for the 126th Meeting of the OECD Competition Committee, Paris, October 2016) at 84.

Chapter IV Assessment of Regulatory Solutions

A Should New Zealand Regulate?

The prevailing view on antitrust regulation is that tacit collusion is not and should not be regulated against. Though the economic result of explicit and tacit collusion is the same, the legal standpoint on the matter has always been that “one cannot condemn a firm for behaving rationally and interdependently”.192 This is because the legal approach to collusion places more emphasis on the collaborative means by which competitors collude, as opposed to the outcome firms’ decision making has on a market. Therefore, in the case of algorithms, if they “increase market transparency, defendants will often have an independent legitimate business rationale for their conduct”.193

It has been suggested that “[t]acit coordination is feared by antitrust policy even more than express collusion, for tacit coordination, even when observed, cannot easily be controlled by the antitrust laws”.194 This is perhaps why its legality has been at the centre of academic legal debate for many decades without any mutual agreement being reached. Donald Turner, Richard Posner and Louis Kaplow are some of the most commonly cited experts on the matter, as their opposing opinions provide great insight into the difficulties of justifying the regulation of tacit collusion.

Turner is of the view that tacit collusion “is devoid of anything that might reasonably be called an agreement when it involves simply the independent responses of a group of competitors to the same set of economic facts”.195 If firms consider the possible reactions of their competitors when setting prices “without more in the way of agreement than is found in conscious parallelism, [they] should not be held unlawful conspirators under the Sherman Act”.196

Contrastingly, Posner argues that a decision to reduce a price to take advantage of the time lag between the reduction and the oligopolistic market’s response could be seen as economically

192 Ariel Ezrachi and Maurice E Stucke “Tacit collusion on steroids – The tale of online price transparency, advanced monitoring and collusion” (2017) 3 CLPD 24 at 29, n 37.

193 At 29.

194 FTC v H J Heinz Co [2001] USCADC 59; 246 F 3d 708, 725 (DC Cir 2001)

195 Donald Turner “The Definition of Agreement under the Sherman Act: Conscious Parallelism and Refusals to Deal” (1962) 75 Harv Law Rev 655 at 666.

196 At 671.

rational conduct.197 Without such reductions an inference can be made that there is a joint effort to generate supra-competitive prices. Posner’s view takes a more economic approach to collusion, as he considered that “there is neither legal nor practical justification for requiring evidence that will support the further inference that collusion was explicit rather than tacit”.198

Differing again, Kaplow argues that different classifications and re-categorisations of the interaction between firms should be introduced to antitrust law.199 Kaplow distinguishes between independent behaviour,200 interdependent behaviour,201 and express agreement. Interdependent behaviour elicits the notion of agreement, and therefore should be prohibited by antitrust law. Kaplow proposes that to define “agreement” narrowly is to oppose an economics-based approach to competition law, arguing that “successful interdependent coordination that produces supra-competitive pricing leads to essentially the same economic consequences regardless of the particular manner of interactions that generate this outcome”.202 However, Kaplow also posits that antitrust law should endeavour to strike a balance between deterring collusive behaviour and not preventing positive economic results.

In some ways tacit collusion may be “marginalised” as an “oligopoly problem” given its tendency to occur in oligopolistic markets.203 Research discussed in prior chapters indicates that algorithms may not confine this problem to its traditional spheres. As Vestager has pointed out, the issue with our current economic models and theories of the oligopoly problem is that they are “based on human incentives and what we think humans rationally will do. It’s entirely possible that not all of that learning is necessarily applicable in some of these markets”.204 Following this line of logic, we can then deduce that:205

197 Richard Posner Antitrust Law (2nd ed, University of Chicago Press, Chicago, 2001) at 57.

198 At 94.

199 See generally: Louis Kaplow “On the Meaning of Horizontal Agreements in Competition Law” (2011) 99 Cal Law Rev 683.

200 Behaviour between two or more parties that is absent of any relationship. Behaviour with similarities is said to be independent if it is motivated by factors that are not contingent upon behaviour by another party.

201 Behaviour involving coordinating with other parties.

202 Louis Kaplow “Competition policy and price fixing” (2014) 113 J Econ 309 at 311.

203 Samuel Dobrin “Algorithms and Collusion: Competition Law Challenges of Pricing Algorithms” (Masters Thesis, Lund University, 2019) at 41.

204 David J Lynch “Policing the digital cartels” (9 January 2017) Financial Times https://www.ft.com/content/9de9fb80-cd23-11e6-864f-20dcb35cede2.

205 Above, n 154.

Antitrust systems are based on deterrence: Humans' desire to conspire with competitors can be outweighed by a fear of getting caught, fined and thrown behind bars. The same (presumably) is not true of robots.

On this basis it would seem unwise to ignore the discussion of solutions altogether. Instead, it appears best to at least consider the options available before making a definitive decision on whether regulatory intervention is the best course of action. Before we proceed on this basis, however, some caveats regarding the generation of tacit collusion must be put forward. Firstly, pure interdependence is not likely to produce tacit collusion on its own. Factors such as product differentiation, information asymmetries incentivising collusion and possible new entrants must also be considered.206 Secondly, firms in oligopolistic markets compete on more factors than price; service, quality and loyalty schemes are also key factors on which they compete.207

B Theories of Regulation

How to regulate AI is a topical debate across many sectors and areas of law, each with its own niche concerns and contested solutions. It is therefore helpful to put some consideration into regulatory theory and to employ some of this logic when deciding how best to serve New Zealand’s legislative framework. In my research I have gravitated towards the work of two academics which I believe provides holistic, insightful and complementary approaches to the regulation of AI.

Firstly, Professor Lawrence Lessig in his text Code Version 2.0 postulates that the law is not effective in regulating digital markets.208 Instead, he suggests, digital markets should be regulated through “the design of an architectural network” guided by the “four modalities of constraints”: “the law, social norms, the market and architecture”.209

Regulation by law allows for a direct response to an issue raised by AI. By prohibiting market behaviour the conduct of market participants is shaped as desired.210 Regulation by architecture seeks to adapt the structure of digital markets themselves; in a collusion context this would involve regulating the design of machines used in pricing strategies. This form of response is

206 William Page “Communication and Concerted Action” (2007) 38 Loyola University Chicago Law Journal 405.

207 Richard Whish and David Bailey Competition Law (8th ed, Oxford University Press, Oxford, 2015) at 594.

208 Lawrence Lessig Code Version 2.0 (Basic Books, New York, 2006) generally.

209 At 123.

210 Above, n 208, at 124.

an iteration of the “compliance by design” approach heralded by the EU. Lastly, regulation by the market seeks to manipulate the behaviour of market participants. This is achieved through the use of penalties to disincentivise collusive outcomes and to incentivise the destabilisation of structures abetting collusion through whistleblowing and leniency provisions.211

Secondly, Professor Nicolas Petit advances212 the view that there are two approaches to regulation: legalistic and technological.213 Petit then provides guidance on which of the “modalities” referenced above is best suited to regulating AI, based on the externalities it generates.

A legalistic approach asks the question of “whether AI and robotic application is caught by existing rules”.214 This is similar to the line of questioning much of this paper has taken so far. Assessment of whether or not such an application is caught is to be conducted on a disciplinary basis.215 Petit highlights that there are difficulties with this approach. Most importantly, he infers that lawyers are “remote from technological fields”, have “imperfect comprehension of the underlying science” and therefore form opinions on legislative options which contain mistakes. Lawyers, he says, “may elaborate wrong hypotheses, and in turn create new legal problems” because AI generates unpredictable sets of issues. Therefore:216

Reasoning from existing rules is likely to generate blind spots. This is because our rules are abstract commands designed on the basis of specific representations of the state of the world, and its trajectories.

On the other hand, a technological approach attempts to identify the applications of AI and deduce the legal needs based on this analysis. In doing so, it is best to “envision a lawless AI world” because the law cannot be treated as a given. This approach, he suggests, is more hospitable to the regulation of technology than the legalistic approach.217 It is therefore more conducive

211 See Kenji Lee Jia Juinn “Algorithmic Collusion & its Implications for Competition Law and Policy” (LLB (Hons) Dissertation, National University of Singapore, 2018) at 41 – 47 for a more comprehensive overview of these modalities as they apply to cartel conduct.

212 Nicolas Petit “Law and Regulation of Artificial Intelligence and Robots: Conceptual Framework and Normative Implications” (working paper, University of Liege, 2017).

213 At 6 and 8.

214 At 6.

215 At 6. In the case of collusion, this is through the eye of civil and criminal liability, and perhaps legal personhood.

216 At 7.

217 Above, n 212, at 8.

to ex ante legal approaches that revolve around discussion on “whether and how law and regulation risks stifling innovation incentives, research and development investments and inventive activity”.218

Turning to the point of externalities: based on a public interest theory, AI may be observed to generate two types of externalities219: negative220 and positive221. Such externalities may be categorised as “discrete”, “systemic” or “existernalities”.222 Discrete externalities cause harm or benefit personally, randomly, rarely or endurably. Regulation of discrete externalities should be undertaken through legalistic approaches and thus solved ex post, as they do not affect society in a significant way.223 Systemic externalities cause harm or benefit locally, predictably, frequently, or unsustainably. Their regulation should be achieved primarily using an ex ante approach, though both ex ante and ex post impact assessments should be undertaken. Petit provides as an example the question of whether “black-box requirements [ought to] be imposed on manufacturers of robots confronted with moral dilemmas”.224 Systemic externalities, Petit concludes, are best addressed through experimentation with “the various regulatory options in dedicated zones of the real-life environment”.225

Combining the two theories discussed, it would appear that a technological, ex ante regulatory approach would best serve the issue at hand, as the issues raised by tacit algorithmic collusion are systemic. I will proceed with this line of enquiry. As a caveat to the discussion that follows, I will not attempt to champion a sole solution to algorithmic tacit collusion. However, I do wish to highlight an array of solutions put forward by leading experts that may fit seamlessly within New Zealand’s framework. For the sake of completeness, academic interest, and at Petit’s suggestion, legalistic ex post responses will be discussed briefly first, followed by a more detailed analysis of technologically based ex ante responses.

218 At 9.

219 At 25.

220 Negative externalities impose costs on third parties that are not entirely, or in part, internalised by the AI.

221 Positive externalities confers benefits on third parties.

222 At 26. The “existernalities” are not relevant to the forthcoming discussion so any further discourse on them is omitted.

223 At 28.

224 At 29.

225 At 29.

C Ex Post Responses

Below I will briefly outline some legalistic ex post responses that the literature has suggested as being possible solutions to tacit algorithmic collusion.

  1. Attribution
The scope of the concept of human agency calls into question whether there is now a need to give AI legal personality, in a similar way to that in which companies have been granted such standing before the law.

Samir Chopra and Laurence White produced a paper for the 16th European Conference on Artificial Intelligence in 2004 which discussed whether the law should extend legal personhood to artificial agents.226 They conceptualised five possible solutions to the issue of “electronic contracting”, which were as follows: (1) “artificial agents as mere tools”; (2) “deploy[ing] the unilateral offer doctrine of contract law”; (3) “deploy[ing] the objective theory of contractual intention”; (4) “treating artificial agents literally as the legal agents of their operators”; and (5) “treat[ing] artificial agents as legal persons”.227

Implementation of options four and five would allow the Commerce Act to attribute the conduct of a machine to a company or individual via s 90; however, they come with a number of objections, as outlined in the aforementioned paper. A slightly less radical approach to attribution may be to insert a provision employing the first option. However, difficulties with ML and DL algorithms arise here, with New Zealand based expert Benjamin Liu suggesting that there is difficulty in attributing the mental state of ML AI to its user or designer where the behaviour of the machine has not been programmed or envisaged by those persons. He argues that in such cases attribution of a mental state of a machine (if it is even capable of having a mental state) is inappropriate.228

A University of Otago based project, Artificial Intelligence and Law in New Zealand, is currently being undertaken by Associate Professor Colin Gavaghan, Associate Professor James Maclaurin and Associate Professor Alistair Knott. As part of their investigation around AI and

226 Samir Chopra and Laurence White “Artificial Agents – Personhood in Law and Philosophy” (paper presented at the 16th European Conference on Artificial Intelligence, Valencia, 22-27 August 2004).

227 For further detail on these options see 2.1.

228 Liu, above n 45, at 208.

employment, they will consider the issue of legal personhood for machines in some instances. In addition, PhD student Mauricio Kimura is currently completing a dissertation at Waikato University posing the questions of “[w]hether or not it would be advisable to ascribe legal personality to [AI]” and “if ascribing legal personality to [AI] is advisable, what would be the most efficient legal personality”.229 The results of these pieces of research will be an interesting addition to the debate in New Zealand.

However successful or likely such a response may be, it only solves half the problem with the current drafting of s 30. It will still be difficult to find a “contract, arrangement or understanding” between machine agents when they do not communicate with one another and the neural network is so hard to decipher.

2 Addition of ‘concerted practices’
New Zealand could follow the lead of Australia and insert a “concerted practices” provision in either s 27 (as Australia has done with its equivalent provision) or s 30. The United States and European Union also have similar concepts within their respective laws. A “concerted practice” is typically defined as something less than a contract or arrangement, but it still features an element of cooperation between competitors, even though there may be no express communication between them. This is perhaps a solution to the ‘meeting of the minds’ problem, as that level of “agreement”, for lack of a better word, is not required, at least in the Australian context. What is required is something “more than a person independently responding to market conditions”.230

Contemplation of such a provision poses the question of whether New Zealand’s antitrust law should seek to prevent anticompetitive conduct wherever there is a clear link between cause and effect, no matter the cause. Further consideration will also need to be given to whether such a provision should apply only to cases of a substantial lessening of competition rather than to cartel conduct, as is the position in Australia.

229 Mauricio Kimura “Mauricio Kimura LinkedIn Profile” https://nz.linkedin.com/in/mauriciokimura.

230 Above, n 190, at 1.2.

3 Use of other existing laws
In the United States, the ‘unfair practice’ provision in the Federal Trade Commission Act and ‘market manipulation’ provisions have been touted as being able to deal with some instances of algorithmic collusion.231 Similarly, New Zealand may try to use other provisions within the Commerce Act in instances where s 30 is wanting in applicability.

Section 36 may be used in instances where the Commission believes that companies are taking advantage of substantial market power and using algorithms to achieve supra-competitive prices. This will of course require the “substantial market power” and “taking advantage of” elements to be established. The outcome will then turn on whether it is possible to utilise s 36B to infer the “purpose” element. Section 36B allows the purpose element in s 36 to be inferred from “the conduct of any relevant person or from any other relevant circumstances”. Using price-setting software that a firm knows is highly likely to end up achieving supra-competitive prices may be persuasive enough to sway the courts in favour of the Commission. This is similar to the “plus factor” approach used in the United States, where parallel pricing is combined with additional factors such as the algorithm or its inputs.232 Liu suggests that while it may be possible to apply a “plus factor” approach where two firms use a similar algorithm concurrently, most typical scenarios of algorithmic tacit collusion do not sit well with current case law, and this approach will therefore be unsuitable.233 Ultimately, I suspect, this method will be rather difficult to argue successfully.

4 Other suggestions
Other suggestions as to regulatory responses have been made, including a per se prohibition on pricing algorithms that support supra-competitive prices,234 price regulation, and rules on the design of price-setting algorithms.235

The per se prohibition requires two conditions. Firstly, that we can distinguish between machines that support supra-competitive prices and those that do not. Secondly, that the algorithm can be determined. Liu suggests that both conditions are problematic due to complications with

231 Above, n 192, at 30.

232 Michael Gal and Niva Elkin-Koren “Algorithmic Consumers” (2017) 30(2) Harv JL & Tech 309 at 346-347.

233 Above, n 45, at 208.

234 John Harrington “Developing Competition Law for Collusion by Autonomous Price-Setting Agents” (22 August 2017) Social Science Research Network www.ssrn.com at 47-71 as cited by Liu, above n 45, at 208-209. This approach requires the use of the United States’ Federal Trade Commission Act s 5 prohibition on ‘unfair practice’.

235 OECD, above n 4, at 49-50.

explaining and understanding neural networks, and the ease with which the first condition could be circumvented.236

An issue with price regulation is that it disincentivises innovation and the production of quality goods and services. This could do more harm than good to competition in a market and is therefore an unlikely solution. The same may be true if rules on algorithm design are implemented: innovation of algorithms which are actually pro-competitive may be disincentivised, and monitoring compliance could be costly for the government.

Implementing regulations such as these may be the perfect manifestation of the warning issued by Petit: that lawyers may be unable to elucidate or understand how AI works, thus generating wrong hypotheses that create more issues. We now turn to ex ante responses.

D Ex Ante Responses

Leading experts on algorithmic tacit collusion, Ariel Ezrachi and Maurice Stucke, in their work “Sustainable and Unchallenged Algorithmic Tacit Collusion”,237 outline three steps to “better understand and deter algorithmic collusion”.238

Firstly, competition enforcement agencies must “better understand the risks of algorithmic collusion”. To do so, research and consultation projects should be undertaken and specialised teams created within competition enforcement agencies.239

Secondly, the agencies need to “improve their tools in detecting collusion”.240 This can be achieved by developing “Algorithmic Collusion Incubators” to simulate the effects and likelihood of different ways in which tacit collusion can be destabilised.

Thirdly, once a better understanding is gained, and if evidence suggests pricing algorithms are facilitating collusive conduct, then competition agencies should consider making amendments

236 Above, n 45, at 209.

237 Above, n 47.

238 At 257.

239 At 257.

240 At 257.

to legislation and regulation.241 Merger review is considered the primary method by which to deter tacit collusion, and is an example of regulation by architecture.242

It is with this suggested framework that I shall proceed with the discussion of ex ante responses.

1 Market studies and sector inquiries
The OECD also suggests that market studies should precede other regulatory responses.243 It suggests investigating “whether algorithms commonly result in coordinated effects and, if so, to attempt to identify the circumstances and sectors under which algorithmic collusion is more likely to be observed”.244 These studies will therefore help enforcement agencies understand algorithmic collusion, particularly as it relates to the relevant market in which they are enforcing antitrust laws, and may lead to recommendations for other forms of regulatory responses based on market conditions and structures.

Market studies could also lead to advocacy efforts and recommendations to the business community itself with the objective of fostering stronger compliance with competition principles. This could result, for instance, in the adoption of self-regulation in the form of codes of conduct, which companies would agree to comply with when designing and using pricing algorithms.

Antitrust agencies across Europe and the United States have already undertaken such action.245 Jurisdictions such as the United Kingdom, Iceland and Mexico have even gone so far as to instigate “market investigations”. Such investigations surpass market studies in that they allow agencies to go beyond issuing non-binding recommendations so that they may “eventually impose structural or behavioural remedies”.246

2 Destabilisation of market structures facilitating collusion
Ezrachi and Stucke suggest that regulators should attempt to actively destabilise market structures that facilitate collusion by implementing counter measures such as promoting entry

241 Above, n 47, at 258.

242 At 258.

243 Above, n 4, at 40.

244 At 41.

245 Above, n 47, at 257.

246 Above, n 4, at 41.

by mavericks and reducing regulatory barriers, limiting transparency to the buyers’ advantage, and incentivising competitors to deviate.247 These sorts of strategies can be implemented on a case-by-case basis suited to the particular market by using an Algorithmic Collusion Incubator. In essence, this is a computer-based simulation which draws on “market characteristics, demand, and supply, and enables competition officials to test under what conditions tacit collusion occurs, and the effects and likelihood of different counter-measures to destabilize this conscious parallelism”.248

This approach is an example of Petit’s technology-based response. Countries such as the United Kingdom and Australia have already introduced dedicated technology units to utilise Algorithmic Collusion Incubator based research.249
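To make the idea of an Algorithmic Collusion Incubator slightly more concrete, the sketch below (in Python) illustrates, in a purely hypothetical and highly simplified way, the kind of experiment such a tool might run: two Q-learning price-setting agents repeatedly choose prices in a toy market, and an observer checks whether the average price settles above the one-shot competitive level. The price grid, marginal cost, demand rule and learning parameters are all assumptions chosen for illustration; nothing in the sketch is drawn from Ezrachi and Stucke’s proposal or from any regulator’s actual tool.

```python
# Purely illustrative sketch of an "incubator" style experiment: two Q-learning
# price-setting agents compete in a toy market. Every parameter below (price
# grid, cost, demand rule, learning settings) is an assumption for illustration.
import random

PRICES = [1, 2, 3, 4, 5]       # discrete price grid (assumed)
COST = 1                       # common marginal cost (assumed)
EPISODES = 50_000
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.05

def demand(own, rival):
    """Toy demand rule: the cheaper firm serves the whole market; a tie splits it."""
    if own < rival:
        return 10
    if own == rival:
        return 5
    return 0

# Each agent observes last period's pair of prices (a crude stand-in for market
# transparency) and keeps a Q-value for every price it could charge next.
states = [(a, b) for a in PRICES for b in PRICES]
q = [{s: {p: 0.0 for p in PRICES} for s in states} for _ in range(2)]

state = (random.choice(PRICES), random.choice(PRICES))
avg_price = 0.0

for t in range(EPISODES):
    actions = []
    for i in range(2):
        if random.random() < EPSILON:
            actions.append(random.choice(PRICES))                    # explore
        else:
            actions.append(max(q[i][state], key=q[i][state].get))    # exploit
    new_state = (actions[0], actions[1])
    for i in range(2):
        own, rival = actions[i], actions[1 - i]
        profit = (own - COST) * demand(own, rival)
        best_next = max(q[i][new_state].values())
        q[i][state][own] += ALPHA * (profit + GAMMA * best_next - q[i][state][own])
    state = new_state
    avg_price += (sum(actions) / 2 - avg_price) / (t + 1)            # running mean

# In this toy market the one-shot competitive (Nash) prices sit around 2-3, so an
# average that settles near 4-5 would suggest the agents have learned to sustain
# supra-competitive prices without any communication between them.
print(f"average price over the run: {avg_price:.2f}")
```

Varying what each agent can observe (for example, hiding the rival’s most recent price) would be one crude way for such a simulation to test how reduced transparency affects the stability of any tacit coordination that emerges.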

3 Merger review
Finally, the last ex ante response aims to develop a market structure in which tacit collusion is prevented by enforcing strategically designed merger controls. This approach is perhaps, at the present time, the best option New Zealand has for preventing algorithmic collusion, in particular because it is unlikely to cause significant harm to, or transformation of, existing markets. As with any approval, there is a chance that there will be unforeseen anti-competitive flow-on effects, regardless of whether consideration of algorithmic collusion is given.

To implement effective merger review and control, Ezrachi and Stucke suggest “lowering the threshold of intervention and investigat[ing] the risk of coordinated effects” on mergers which reduce the number of competing firms in a market from as many as five to four (rather than the three-to-two approach currently utilised).250 In doing this, agencies may have to alter the ways in which they address human-instigated and machine-instigated tacit collusion.

The OECD reports that in reviewing merger controls, agencies should pay attention to the main factors which facilitate algorithmic collusion, namely “transparency and velocity of interaction”.251 Ezrachi and Stucke suggest refining the approach to signalling by robots as an

247 Above, n 192, at 32.

248 Above, n 47, at 258.

249 Above, n 11, at 45, nn 161 and 162.

250 Above, n 47, at 258.

251 Above, n 4, at 41.

initial starting point for this investigation, and then implementing various restrictions on market manipulations.252

252 At 258.

Conclusion

As far as dissertation topics go, I sincerely thought I had finessed my selection against my key criteria: fascinating enough to keep me thinking about it for the best part of eight months, contentious enough to allow for some interesting observations and findings, but narrowly defined enough to somewhat definitively answer the ‘big question’. It may be unsurprising to the reader that I now admit my certainty in the aforementioned criteria being met was incredibly naïve. The deeper into the research for this paper I went, the larger, more abstract and increasingly overwhelming my big question seemed to become. Between that and confidently grasping the technical and complicated world of AI, I admit I was stumped for some time on how this piece of research would play out. But as I draw to the end of my research and thoughts, and the relief sets in, I realise how grateful I am for my naivety. As with a plethora of other areas of law, I find myself swimming in the grey area with different schools of jurisprudence and economics as the lane ropes, each deviating in opposing directions. But this is exactly what makes this area of the law so engrossing.

If I had to be succinct, my conclusion to my big question would be “no”. New Zealand’s existing antitrust framework is not capable of addressing tacit algorithmic collusion (though explicit collusion is theoretically addressable under the existing law, the difficulties in its detection I shall leave to the computer scientists). However, the questions that follow cannot be answered quite so easily. In considering how to then address tacit algorithmic collusion, we quickly find ourselves revisiting decades-old legal and economic concepts on which experts far better versed in the respective fields than myself continue to disagree. From the legal definition of “arrangement”, to the law’s avoidance of regulating conscious parallelism, and the very expedient issue of whether AI should be granted legal personhood, this topic raises a host of issues that could be, and are, passionately debated by the world’s finest legal minds. And, I can envisage, will be for some time.

What is evident, however, is that technology is evolving at a rapid pace while the law remains somewhat stagnant. Instinctively, and admittedly with less authority than my sources to make such grand conclusions, I do not believe that, simply because the law has historically avoided tackling the regulation of conscious parallelism, this will necessarily be the case going forward. The world of AI is advancing at a rate most people cannot fathom and, as a result, so too is the way in which we conduct commerce in a growing global economy. It is

therefore not unreasonable to re-examine traditional legal positions and evaluate whether our changing society warrants re-routing the law, particularly when it becomes more evident that the traditional issues conscious parallelism brings with it are exacerbated by our new technology. One day it may be worth re-defining traditional concepts such as “agreement”, or even carving out entirely new areas of law, such as giving legal personality to machines.

However, for the time being, academic theory and logical pragmatism suggest that beginning with an ex ante regulatory approach is the best course of action, as the consequences of algorithmic collusion are systemic in nature. Architectural and market-modality-based proposals such as merger review, the destabilisation of market structures facilitating collusion, and market studies and sector inquiries are championed in this paper and, based on the suggestions of international experts, are likely sufficient to prepare New Zealand for the chance of a case of algorithmic collusion.

While there is perhaps no right or wrong answer to the issues I have delved into as the future of technology is not foreseeable, I do hope this paper might serve as a stimulus to New Zealand’s legal community, Commerce Commission (the Commission) and Government. Substantial thought and discussion must be given as to how we can best tackle these impending issues before our judicial system is left in the lurch.

Bibliography

A Cases

1 New Zealand

ANZCO Foods Limited v AFFCO NZ Limited [2006] 3 NZLR.

Auckland Regional Authority v Mutual Rental Cars (Auckland Airport) Ltd [1987] NZHC 213; [1987] 2 NZLR 647 (HC).

Commerce Commission v Caltex New Zealand Ltd [1998] 2 NZLR 78 (HC).

Commerce Commission v Carter Holt Harvey Building Products Ltd [2000] NZHC 1220; (2000) 9 TCLR 535 (HC).

Commerce Commission v Lodge Real Estate [2018] NZCA 523.

Commerce Commission v Taylor Preston Ltd [1998] 3 NZLR 498.

Commerce Commission v The Wellington Branch of the NZ Institute of Driving Instructors (1990) 3 NZBLC 101,913 (HC).

Giltrap City Ltd v Commerce Commission [2004] 1 NZLR 608 (CA).

Lodge Real Estate v Commerce Commission [2020] NZSC 25.

New Zealand Apple and Pear Marketing Board v Apple Fields Ltd [1991] 1 NZLR 257.

Ophthalmological Society of New Zealand v Bolton [2001] NZCA 296.

Port Nelson Ltd v Commerce Commission [1996] NZCA 230; [1996] 3 NZLR 554 (CA).

Re Insurance Council of New Zealand (Inc) (1989) 2 NZBLC (Com) 104,477.

Re Wellington Fencing Materials Assoc [1960] NZLR 1121 (HC).

2 Australia

Australian Competition & Consumer Commission v CC (New South Wales) Pty Ltd [1999] FCA 954; (1999) 165 ALR 468.

Australian Competition & Consumer Commission v Australian Medical Association Western Australia Branch Inc [2003] FCA 686; (2003) 199 ALR 423, (2003) ATPR 41-945.

Australian Competition & Consumer Commission v Pauls Ltd [2002] FCA 1586; (2003) ATPR 41-911.

Australian Competition and Consumer Commission v IPM Operations and Maintenance Loy Yang Pty Ltd [2006] FCA 1777.

Australian Competition and Consumer Commission v Leahy Petroleum Pty Ltd [2007] FCA 794, (2007) ATPR 42-162.

Bray v F Hoffman-La Roche Ltd [2002] FCA 243, (2002) 118 FCR 1.

Hughes v Western Australian Cricket Assoc (Inc) [1986] FCA 357; [1986] ATPR 40-736 (FCA).

Radio 2UE Sydney Pty Ltd v Stereo FM Pty Ltd (1983) 48 ALR 361, (1983) 68 FLR 70.

Trade Practices Commission v Nicholas Enterprises Pty Ltd (1979) ATPR 40-126 (FCA).

Trade Practices Commission v Parkfield Operations Pty Ltd [1985] FCA 403; (1985) 7 FCR 534 (FCAFC).

Trade Practices Commission v Parkfield Operations Pty Ltd [1985] FCA 27; (1985) 5 FCR 140.

Tradestock Pty Ltd v TNT (Management) Pty Ltd (1997) ATPR 40-056 (FCA).

Visa Paper Pty Ltd v Australian Competition and Consumer Commission [2003] HCA 59, (2003) 216 CLR 1.

3 United States

Brooke Group Ltd v Brown & Williamson Tobacco Corp [1993] USSC 105; 509 US 209 (1993).

Clamp-all Corporation v Cast Iron Soil Pipe Institute [1988] USCA1 298; 851 F 2d 478, 848 (1st Cir 1988).

Federal Trade Commission v Pacific States Paper Trade Association [1927] USSC 8; 273 US 52 (1927).

FTC v H J Heinz Co [2001] USCADC 59; 246 F 3d 708, 725 (DC Cir 2001).

4 European Union

Case C-74/14 Eturas and others v Lietuvos Respublikos konkurencijos taryba [2016] OJ C 98/3, ECLI:EU:C:2016:42.

5 United Kingdom

CMA Case 50233 Online sales of posters and frames (12 August 2016).

Re British Slag [1962] 2 All ER 807 (CA).

B Legislation

1 New Zealand
Commerce Act 1986

2 Australia
Competition and Consumer Act 2010

3 United States
Sherman Antitrust Act of 1890

4 European Union
Treaty on the Functioning of the European Union

5 United Kingdom
Competition Act 1998

C Books and chapters in books

Ariel Ezrachi and Maurice E Stucke Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy (Harvard University Press, Cambridge, 2016), ch 8.

Chris Noonan Competition Law in New Zealand (online looseleaf ed, Thomson Reuters).

Herbert Simon The Shape of Automation for Men and Management (Harper & Row, New York, 1965).

Ian Ayres and John Braithwaite Responsive Regulation: Transcending the Deregulation Debate (Oxford University Press, 1992).

Jiawei Han, Micheline Kamber and Jian Pei Data Mining: Concepts and Techniques (3rd ed, Morgan Kaufmann Publishers, Waltham, 2012).

Lawrence Lessig Code Version 2.0 (Basic Books, New York, 2006).

Rex Ahdar The Evolution of Competition Law in New Zealand (Oxford University Press, Oxford, 2020).

Richard Posner Antitrust Law (2nd ed, University of Chicago Press, Chicago, 2001).

Richard Whish and David Bailey Competition Law (8th ed, Oxford University Press, Oxford, 2015).

Tony Dellow and Anna Parker Commercial Law in New Zealand (online ed, LexisNexis).

D Journal articles

Alan Turing “Computing Machinery and Intelligence” (1950) LIX Mind 433.

Andrea De Mauro, Marco Greco and Michele Grimaldi “A Formal Definition of Big Data Based on its Essential Features” (2015) 65 Library Review 122.

Ariel Ezrachi and Maurice E Stucke “Artificial Intelligence & Collusion: When Computers Inhibit Competition” (2017) Univ. Ill. Law Rev. 1775.

Ariel Ezrachi and Maurice E Stucke “Sustainable and Unchallenged Algorithmic Tacit Collusion” (2020) 17 NW J Tech & Intell Prop 217.

Ariel Ezrachi and Maurice E Stucke “Tacit collusion on steroids – The tale of online price transparency, advanced monitoring and collusion” (2017) 3 CLPD 24.

Benjamin Liu “Algorithmic Tacit Collusion” (2019) 25 NZBLQ 199.

William Page “Communication and Concerted Action” (2007) 38 Loyola University Chicago Law Journal 405.

Donald Turner “The Definition of Agreement under the Sherman Act: Conscious Parallelism and Refusals to Deal” (1962) 75 Harv Law Rev 655.

Emilio Calvano, Giacomo Calzolari, Vincenzo Denicolò and Sergio Pastorello “Algorithmic Pricing: What Implications for Competition Policy?” (2019) 55 Review of Industrial Organization 155.

Louis Kaplow “Competition policy and price fixing” (2014) 113 J Econ.

Louis Kaplow “On the Meaning of Horizontal Agreements in Competition Law” (2011) 99 Cal Law Rev 683.

Michael Gal and Niva Elkin-Koren “Algorithmic Consumers” (2017) 30(2) Harv JL & Tech 309.

Paul Scott “The Purpose of Substantially Lessening Competition: The Divergence of New Zealand and Australian Law” (2011) Waikato Law Rev 168.

Richard A Posner “Oligopoly and the Antitrust Laws: A Suggested Approach” (1968) 21 Stanford Law Rev. 1562.

Richard Baldwin and Julia Black “Really Responsive Regulation” (2008) 71 MLR 59.

Salil Mehra “Antitrust and the Robo-Seller: Competition in the Time of Algorithms” (2015) 100 Minn. L. Rev. 1323

Terrell McSweeny and Brian O’Dea “The Implications of Algorithmic Pricing for Coordinated Effects Analysis and Price Discrimination in Antitrust Enforcement” (2017) 32 Antitrust Law J 75

Yann LeCun, Yoshua Bengio and Geoffrey Hinton “Deep learning” (2015) 521 Nature 436.

E Parliament and government materials

1 New Zealand
Commerce Commission “Criminal Prosecution Guidelines” (29 November 2013) www.comcom.govt.nz.

Commerce Commission “Enforcement Criteria” (3 June 2014) www.comcom.govt.nz.

Commerce Commission “Enforcement Response Guidelines” (23 August 2017) www.comcom.govt.nz.

Commerce Commission “Model Litigant Policy” (31 May 2016) www.comcom.govt.nz.

2 Australia

Guidelines on Concerted Practices (ACCC, August 2018).

3 European Union

Guidelines on the assessment of horizontal mergers under the Council Regulation on the control of concentrations between undertakings (Official Journal of the European Union, C 31/5, 5 February 2004).

F Reports

George A Hay Anti-Competitive Agreements: The Meaning of ‘Agreement’ (Cornell Law School, No 13-09, 18 February 2013).

Marc Ivaldi, Bruno Jullien, Patrick Rey, Paul Seabright and Jean Torole “The Economics of Tacit Collusion” (report for DG Competition, European Commission, March 2003).

Report from the Commission to the Council and the European Parliament Final Report on the E-commerce Sector Inquiry (European Commission, SWD(2017) 154 final, 10 May 2017).

G Internet resources

“Algorithms and Collusion: Competition Policy in the Digital Age” (2017) OECD https://www.oecd.org/daf/competition/Algorithms-and-colllusion-competition-policy-in-the-digital-age.pdf.

“Avoiding anti-competitive behaviour” (2018) Commerce Commission New Zealand https://comcom.govt.nz/business/avoiding-anti-competitive-behaviour.

“Big Data: What is it and why it matters” SAS https://www.sas.com/en_nz/insights/big-data/what-is-big-data.html.

“What is a cartel?” (May 2019) Commerce Commission New Zealand https://comcom.govt.nz/business/avoiding-anti-competitive-behaviour/what-is-a-cartel.

ACCC “Anti-competitive conduct” ACCC https://www.accc.gov.au/business/anti-competitive-behaviour/anti-competitive-conduct.

Adil Moujahid “A Practical Introduction to Deep Learning with Caffe and Python” (26 June 2016) Adil Moujahid http://adilmoujahid.com/posts/2016/06/introduction-deep-learning-python-caffe/.

Ai Deng “When Machines Learn to Collude: Lessons from a Recent Research Study on Artificial Intelligence” (5 September 2017) Social Science Research Network www.ssrn.com.

Alistair Lindsay “Do We Need to Prevent Pricing Algorithms Cooking Up Markets” (2017) 38(12) ECLR 533.

Ashwin Ittoo and Nicolas Petit “Algorithmic Pricing Agents and Tacit Collusion: A Technological Perspective” (3 October 2017) Social Science Research Network www.ssrn.com.

BJ Copeland “Artificial intelligence” (11 August 2020) Encyclopedia Britannica https://www.britannica.com/technology/artificial-intelligence.

Claudia P O’Kane and Ioannis Kokkoris “A Few Reflections on the Recent Caselaw on Algorithmic Collusion” (3 August 2020) Social Science Research Network https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3665966.

David J Lynch “Policing the digital cartels” (9 January 2017) Financial Times https://www.ft.com/content/9de9fb80-cd23-11e6-864f-20dcb35cede2.

David Kelnar “The fourth industrial revolution: a primer on Artificial Intelligence (AI)” Medium https://medium.com/mmc-writes/the-fourth-industrial-revolution-a-primer-on-artificial-intelligence-ai-ff5e7fffcae1.

DF Seun “Quantum Computing: The Real Game Changer” (30 November 2018) Medium https://medium.com/the-andela-way/quantum-computing-the-real-game-changer-e60010d77fe4.

Emilio Calvano, Giacomo Calzolari, Vincenzo Denicolò and Sergio Pastorello “Artificial Intelligence, Algorithmic Pricing and Collusion” (April 2019) Social Science Research Network www.ssrn.com.

Eric Niiler “Can AI Be a Fair Judge in Court? Estonia Thinks So” (25 March 2019) Wired https://www.wired.com/story/can-ai-be-fair-judge-court-estonia-thinks-so/.

John Harrington “Developing Competition Law for Collusion by Autonomous Price-Setting Agents” (22 August 2017) Social Science Research Network www.ssrn.com.

Jorge Lemus and Fernando Luco “Price Leadership and Uncertainty about Future Costs” (6 June 2020) Social Science Research Network https://ssrn.com/abstract=3186144.

Marek Kowalkiewicz “How did we get here? The story of algorithms.” (10 October 2019) Towards Data Science https://towardsdatascience.com/how-did-we-get-here-the-story-of-algorithms-9ee186ba2a07.

Martin Giles “Explainer: What is a quantum computer?” (29 January 2019) MIT Technology Review https://www.technologyreview.com/2019/01/29/66141/what-is-quantum-computing/.

Mauricio Kimura “Mauricio Kimura LinkedIn Profile” https://nz.linkedin.com/in/mauriciokimura.

Nicholas Davis “What is the fourth industrial revolution” (19 January 2016) World Economic Forum https://www.weforum.org/agenda/2016/01/what-is-the-fourth-industrial-revolution/.

Nicholas Hirst “When Margrethe Vestager takes antitrust battle to robots” (28 February 2018) Politico https://www.politico.eu/article/trust-busting-in-the-age-of-ai/.

Olivia Solon “How A Book About Flies Came To Be Priced $24 Million On Amazon” (27 April 2011) Wired https://www.wired.com/2011/04/amazon-flies-24-million/.

Rockwell Anyoha “The History of Artificial Intelligence” (28 August 2017) Harvard University http://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/.

Sarah Griffith “This AI software can tell if you’re at risk from cancer before symptoms appear” (26 August 2016) Wired https://www.wired.co.uk/article/cancer-risk-ai-mammograms.

H Other resources

1 Theses and research papers
Kenji Lee Jia Juinn “Algorithmic Collusion & its Implications for Competition Law and Policy” (LLB (Hons) Dissertation, National University of Singapore, 2018).

Nicolas Petit “Law and Regulation of Artificial Intelligence and Robots: Conceptual Framework and Normative Implications” (working paper, University of Liège, 2017).

Samuel Dobrin “Algorithms and Collusion: Competition Law Challenges of Pricing Algorithms” (Masters Thesis, Lund University, 2019).

2 Seminars and papers presented at conferences
“Algorithms and Collusion” (note by the Russian Federation submitted for Item 10 of the 127th OECD Competition committee, 21-23 June 2017).

“Algorithms and Collusion” (note by the United States submitted for Item 10 of the 127th OECD Competition committee, 21-23 June 2017).

Ania Thiemann and Pedro Gonzaga “Big Data: Bringing Competition Policy to the Digital Era” (background note for the 126th Meeting of the OECD Competition Committee, Paris, October 2016).

Antonio Capobianco, Pedro Gonzaga and Anita Nyseó “Algorithms and Collusion – Background Note by the Secretariat” (paper presented to the OECD Secretariat to serve as a background note for Item 10 at the 127th Meeting of the Competition Committee, 21-23 June 2017).

David Currie, Chairman of the UK Competition and Markets Authority “The Role of Competition in Stimulating Innovation” (Concurrences Innovation Economics Conference, King’s College London, 3 February 2017).

Margrethe Vestager “Algorithms and competition” (Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017).

Quoc Le and others “Building High-level Features Using Large Scale Unsupervised Learning” (paper presented to the 29th International Conference on Machine Learning, Edinburgh, June 2012).

Rod Sims, ACCC Chair “The ACCC’s approach to colluding robots” (Can Robots Collude Conference, Sydney, 16 November 2017).

Samir Chopra and Laurence White “Artificial Agents – Personhood in Law and Philosophy” (paper presented at the 16th European Conference on Artificial Intelligence, Valencia, 22-27 August 2004).

Simonetta Vezzoso “Competition by Design” (paper prepared for presentation at the 12th ASCOLA Conference, Stockholm University, 15-17 June 2017).

Terrell McSweeny “Competition Law: Keeping Pace in a Digital Age” (keynote remarks presented to the 16th Annual Loyola Antitrust Colloquium, Chicago, 15 April 2016).

3 Newspaper and magazine articles
John Markoff “Computer Wins on ‘Jeopardy!’: Trivial, It’s Not” The New York Times (16 February 2011).

Roger Parloff “Why Deep Learning is Suddenly Changing Your Life” Fortune (28 September 2016).

Steven Borowiec “AlphaGo Seals 4-1 Victory over Go Grandmaster Lee Sedol” The Guardian (15 March 2016).

4 Interviews
Margrethe Vestager, European Commissioner (interview with Kara Swisher, Web Summit, 6 November 2017), video available at https://youtu.be/90OhCfyYOOk.
5 Press releases
Department of Justice “Former E-Commerce Executive Charged with Price Fixing in the Antitrust Division’s First Online Marketplace Prosecution” (press release, 6 April 2015).

European Commission “Antitrust: Commission fines consumer electronics manufacturers for fixing online resale prices” (press release, 24 July 2018).

Appendices – Commerce Act 1986

1A Purpose

The purpose of this Act is to promote competition in markets for the long-term benefit of consumers within New Zealand.

2 Interpretation

(1) In this Act, unless the context otherwise requires,–

arrive at, in relation to an understanding, includes reach, and enter into

give effect to, in relation to a provision of a contract, arrangement or understanding, includes–

(a) do an act or thing in pursuance of or in accordance with that provision:

(b) enforce or purport to enforce that provision

person, includes a local authority, and any association of persons whether incorporated or not

price, includes valuable consideration in any form, whether direct or indirect; and includes any consideration that in effect relates to the acquisition or supply of goods or services or the acquisition or disposition of any interest in land, although ostensibly relating to any other matter or thing

provision, in relation to an understanding or arrangement, means any matter forming part of or relating to the understanding or arrangement

(2) In this Act, –

(a) a reference to engaging in conduct shall be read as a reference to doing or refusing to do any act, including–
(i) the entering into, or the giving effect to a provision of, a contract or arrangement; or

(ii) the arriving at, or the giving effect to a provision of, an understanding; or

(iii) the requiring of the giving of, or the giving of, a covenant:

(b) a reference to conduct, when that expression is used as a noun otherwise than as mentioned in paragraph (a), shall be read as a reference to the doing of, or the refusing to do, any act, including–

(i) the entering into, or giving effect to a provision of, a contract or arrangement; or

(ii) the arriving at, or the giving effect to a provision of, an understanding; or

(iii) the requiring of the giving, or the giving of, a covenant:

(c) ...

(3) Where any provision of this Act is expressed to render a provision of a contract or a covenant unenforceable if the provision of the contract or the covenant has or is likely to have a particular effect, that provision of this Act applies in relation to the provision of the contract or the covenant at any time when the provision of the contract or the covenant has or is likely to have that effect, notwithstanding that

(a) at an earlier time the provision of the contract or the covenant did not have that effect or was not regarded as likely to have that effect; or

(b) the provision of the contract or the covenant will not or may not have that effect at a later time.

(4) ...

(5) For the purposes of this Act,–

(a) a provision of a contract, arrangement or understanding, or a covenant shall be deemed to have had, or to have, a particular purpose if–
(i) the provision was or is included in the contract, arrangement or understanding, or the covenant was or is required to be given, for that purpose or purposes that included or include that purpose; and

(ii) that purpose was or is a substantial purpose:

(b) a person shall be deemed to have engaged, or to engage, in conduct for a particular purpose or a particular reason if–

(i) that person engaged or engages in that conduct for that purpose or reason or for purposes or reasons that included or include that purpose or reason; and

(ii) that purpose or reason was or is a substantial purpose or reason.

(6) In this Act,–

(a) a reference to a contract shall be construed as including a reference to a lease of, or a licence in respect of, any land or a building or part of a building, and

shall be so construed notwithstanding any express reference in this Act to any such lease or licence:

(b) a reference to making or entering into a contract, in relation to such a lease or licence, shall be read as a reference to granting or taking the lease or licence:

(c) a reference to a party to a contract, in relation to such a lease or licence, shall be read as including a reference to any person bound by, or entitled to the benefit of, any provision contained in the lease or licence.

Part 2 Restrictive trade practices

Practices substantially lessening competition

27 Contracts, arrangements, or understandings substantially lessening competition prohibited

(1) No person shall enter into a contract or arrangement, or arrive at an understanding, containing a provision that has the purpose, or has or is likely to have the effect, of substantially lessening competition in a market.

(2) No person shall give effect to a provision of a contract, arrangement, or understanding that has the purpose, or has or is likely to have the effect, of substantially lessening competition in a market.

(3) Subsection (2) applies in respect of a contract or arrangement entered into, or an understanding arrived at, whether before or after the commencement of this Act.

(4) No provision of a contract, whether made before or after the commencement of this Act, that has the purpose, or has or is likely to have the effect, of substantially lessening competition in a market is enforceable.

Cartel provisions

30 Prohibition on entering into or giving effect to cartel provision

No person may–

(a) Enter into a contract or arrangement, or arrive at an understanding, that contains a cartel provision; or

(b) give effect to a cartel provision.

30A Meaning of cartel provision and related terms

(1) A cartel provision is a provision, contained in a contract, arrangement, or understanding, that has the purpose, effect, or likely effect of 1 or more of the following in relation to the supply or acquisition of goods or services in New Zealand:

(a) price fixing:

(b) restricting output:

(c) market allocating.

(2) In this Act, price fixing means, as between the parties to a contract, arrangement, or understanding, fixing, controlling, or maintaining, or providing for the fixing, controlling, or maintaining of,—

(a) the price for goods or services that any 2 or more parties to the contract, arrangement, or understanding supply or acquire in competition with each other; or

(b) any discount, allowance, rebate, or credit in relation to goods or services that any 2 or more parties to the contract, arrangement, or understanding supply or acquire in competition with each other.

30B Additional interpretation relating to cartel provisions

In this Act, in relation to a cartel provision,–

(a) if a person is a party to a contract, arrangement, or understanding, each of the person’s interconnected bodies corporate is taken to be a party to the contract, arrangement, or understanding; and

(b) if a person (person A) or any of person A’s interconnected bodies corporate supplies or acquires goods or services in competition with another person (person B) or any of person B’s interconnected bodies corporate, person A is taken to supply or acquire those goods or services in competition with person B; and

(c) a reference to persons in competition with each other for the supply or acquisition of goods or services includes a reference to—

(i) persons who are, or are likely to be, in competition with each other in relation to the supply or acquisition of those goods or services; and

(ii) persons who, but for a cartel provision relating to those goods or services, would, or would be likely to, be in competition with each other in relation to the supply or acquisition of those goods or services.

30C Cartel provisions generally unenforceable

(1) No cartel provision is enforceable.

(2) However, nothing in subsection (1) affects the enforceability of a cartel provision in any contract to which section 31, 32, 33, 44A(4) or (5), or 44B applies.

31 Exception for collaborative activity

Exception for entering into cartel provision

(1) Nothing in section 30(a) applies to a person in relation to a cartel provision if, at the time of entering into or arriving at the contract, arrangement, or understanding that contains the provision, –

(a) the person and 1 or more other parties to the contract, arrangement, or understanding are involved in a collaborative activity; and

(b) the cartel provision is reasonably necessary for the purpose of the collaborative activity.

Exception for giving effect to cartel provision

(2) Nothing in section 30(b) applies to a person in relation to a cartel provision if, at the time of giving effect to the cartel provision,–

(a) the person and 1 or more other parties to the contract, arrangement, or understanding that contains the provision are involved in a collaborative activity; and

(b) the cartel provision is reasonably necessary for the purpose of the collaborative activity.

(3) Nothing in section 30(b) applies to a person in relation to a cartel provision that constitutes a restraint of trade if–

(a) the person and 1 or more other parties to the contract, arrangement, or understanding that contains the provision were involved in a collaborative activity that has ended; and

(b) the cartel provision was reasonably necessary for the purpose of the collaborative activity; and

(c) the collaborative activity did not end because the lessening of competition between any 2 or more parties became its dominant purpose.

Meaning of collaborative activity

(4) In this Act, collaborative activity means an enterprise, venture, or other activity, in trade, that –

(a) is carried on in co-operation by 2 or more persons; and

(b) is not carried on for the dominant purpose of lessening competition between any 2 or more of the parties.

(5) The purpose referred to in subsection (4)(b) may be inferred from the conduct of any relevant person or from any other relevant circumstance.

32 Exception for vertical supply contracts

(1) Nothing in section 30 applies to a person in relation to a cartel provision in a contract, if–

(a) the contract is entered into between a supplier or likely supplier of goods or services and a customer or likely customer of that supplier; and

(b) the cartel provision–

(i) relates to the supply or likely supply of the goods or services to the customer or likely customer, including to the maximum price at which the customer or likely customer may resupply the goods or services; and

(ii) does not have the dominant purpose of lessening competition between any 2 or more of the parties to the contract.

(2) The purpose referred to in subsection (1)(b)(ii) may be inferred from the conduct of any relevant person or from any other relevant circumstance.

33 Exception for joint buying and promotion agreements

A provision in a contract, arrangement, or understanding does not have the purpose, effect, or likely effect of price fixing if the provision–

(a) relates to the price for goods or services to be collectively acquired, whether directly or indirectly, by some or all of the parties to the contract, arrangement, or understanding; or

(b) provides for joint advertising of the price for the resupply of goods or services acquired in accordance with paragraph (a); or

(c) provides for a collective negotiation of the price for goods or services followed by individual purchasing at the collectively negotiated price; or

(d) provides for an intermediary to take title to goods and resell or resupply them to another party to the contract, arrangement, or understanding.

Taking advantage of market power

36 Taking advantage of market power

(1) Nothing in this section applies to any practice or conduct to which this Part applies that has been authorised under Part 5.

(2) A person that has a substantial degree of power in a market must not take advantage of that power for the purpose of–

(a) restricting the entry of a person into that or any other market; or

(b) preventing or deterring a person from engaging in competitive conduct in that or any other market; or

(c) eliminating a person from that or any other market.

(3) For the purposes of this section, a person does not take advantage of a substantial degree of power in a market by reason only that the person seeks to enforce a statutory intellectual property right, within the meaning of section 45(2), in New Zealand.

(4) For the purposes of this section, a reference to a person includes 2 or more persons that are interconnected.

36B Purposes may be inferred

The existence of any of the purposes specified in section 36 or section 36A, as the case may be, may be inferred from the conduct of any relevant person or from any other relevant circumstances.

Part 5 Authorisations and clearances

Restrictive trade practices

58 Commission may grant authorisation for restrictive trade practices

(1) A person who wishes to enter into a contract or arrangement, or arrive at an understanding, to which that person considers section 27 would apply, or might apply, may apply to the Commission for an authorisation to do so and the Commission may grant an authorisation for that person to enter into the contract or arrangement, or arrive at the understanding.

(2) A person who wishes to give effect to a provision of a contract or arrangement or understanding to which that person considers section 27 would apply, or might apply, may apply to the Commission for an authorisation to do so, and the Commission may

grant an authorisation for that person to give effect to the provision of the contract or arrangement or understanding.

59A When Commission may grant authorisation

(1) The Commission may grant an authorisation to a person–

(a) to enter into a contract or arrangement, or to arrive at an understanding, even though the contract or arrangement has been entered into, or the understanding has been arrived at, before the Commission makes a determination in respect of the application for that authorisation; or

(b) to give effect to a provision of a contract or arrangement entered into, or an understanding arrived at, even though the applicant has already given, or is already giving, effect to the provision before the Commission makes a determination in respect of the application for that authorisation; or

Part 6 Enforcement, remedies, and appeals

Restrictive trade practices

80 Pecuniary penalties relating to restrictive trade practices

(1) If the court is satisfied on the application of the Commission that a person–

(a) has contravened any of the provisions of Part 2; or

(b) has attempted to contravene such a provision; or

(c) has aided, abetted, counselled, or procured any other person to contravene such a provision; or

(d) has induced, or attempted to induce, any other person, whether by threats or promises or otherwise, to contravene such a provision; or

(e) has been in any way, directly or indirectly, knowingly concerned in, or party to, the contravention by any other person of such a provision; or

(f) has conspired with any other person to contravene such a provision,–

the court may order the person to pay to the Crown such pecuniary penalty as the court determines to be appropriate.

81 Injunctions may be granted by court for contraventions of Part 2

The court may, on the application of the Commission or any other person, grant an injunction restraining a person from engaging in conduct that constitutes or would constitute any of the following:

(a) a contravention of any of the provisions of Part 2:

(b) any attempt to contravene such a provision:

(c) aiding, abetting, counselling, or procuring any other person to contravene such a provision:

(d) inducing, or attempting to induce, any other person, whether by threats, promises or otherwise, to contravene such a provision:

(e) being in any way directly or indirectly, knowingly concerned in, or party to, the contravention by any other person of such a provision:

(f) conspiring with any other person to contravene such a provision.

82 Actions for damages for contravention of Part 2

(1) Every person is liable in damages for any loss or damage caused by that person engaging in conduct that constitutes any of the following:

(a) a contravention of any of the provisions of Part 2:

(b) aiding, abetting, counselling, or procuring the contravention of such a provision:

(c) inducing by threats, promises, or otherwise the contravention of such a provision:

(d) being in any way directly or indirectly, knowingly concerned in, or party to, the contravention of such a provision:

(e) conspiring with any other person in the contravention of such a provision.

(2) An action under subsection (1) may be commenced within 3 years after the matter giving rise to the contravention was discovered or ought reasonably to have been discovered. However, no action under subsection (1) may be commenced 10 years or more after the matter giving rise to the contravention.

Injunctions generally

90 Conduct by employees, agents, and others

(1) In proceedings under this Part in respect of conduct engaged in by a person other than an individual (person A), if it is necessary to establish the state of mind of person A it

is sufficient to show that a director, employee, or agent of person A, acting within the scope of the director’s, employee’s, or agent’s actual or apparent authority, had that state of mind.

(2) Conduct by a person (person B) is deemed for the purposes of this Act also to be the conduct of a person other than an individual (person A) if, at the time of the conduct,—

(a) person B was a director, employee, or agent of person A, acting within the scope of person B’s actual or apparent authority; or

(b) person B was a person who was acting on the direction, or with the consent or agreement (express or implied), of a director, employee, or agent of person A who was acting within the scope of the director’s, employee’s, or agent’s actual or apparent authority.

(3) In civil proceedings under this Part in respect of conduct engaged in by an individual (person C), if it is necessary to establish the state of mind of person C it is sufficient to show that an employee or agent of person C, acting within the scope of the employee’s or agent’s actual or apparent authority, had that state of mind.

(4) In civil proceedings under this Part, conduct by a person (person B) is deemed for the purposes of this Act also to be the conduct of an individual (person C) if, at the time of the conduct,—

(a) person B was acting at the direction, or with the consent or agreement (express or implied), of person C; or

(b) person B was an employee or agent of person C and acting within the scope of person B’s actual or apparent authority; or

(c) person B was a person who was acting on the direction, or with the consent or agreement (express or implied), of an employee or agent of person C who was acting within the scope of the employee’s or agent’s actual or apparent authority.

(5) A reference in this section to the state of mind of a person includes a reference to—

(a) the knowledge, intention, opinion, belief, or purpose of the person and the person’s reasons for that intention, opinion, belief, or purpose; and

(b) the state of mind of a person outside New Zealand.

Word count

The text of this paper (excluding acknowledgements, table of contents, footnotes, bibliography and appendices) comprises approximately 14,577 words.

