
Building a Regulator

Chapter in Robot Rules

Abstract

Before we can start writing rules, we first need to ask: Who should write them? Turner argues that pure industry self-regulation is problematic because companies are responsible mainly to their shareholders, whereas only governments have the power and legitimacy to create a single binding code. Turner suggests that AI regulation should proceed on a cross-industry and international basis. We can learn much from successful examples of international cooperation on matters including the Internet and space law. Turner concludes that we now have the opportunity to build an ideal system of regulation from a blank slate; the longer we delay, the more difficult that will be to achieve.


Notes

  1.

    See the opening paragraph of Chapter 1.

  2.

There are other related legal theories, such as Ronald Dworkin’s “interpretivism” (as to which see Ronald Dworkin, Taking Rights Seriously (Cambridge, MA: Harvard University Press, 1977); Law’s Empire (Cambridge, MA: Harvard University Press, 1986); Justice in Robes (Cambridge, MA: Harvard University Press, 2006); and Justice for Hedgehogs (Cambridge, MA: Harvard University Press, 2011)). However, Dworkin’s writings largely relate to the adjudication of disputes rather than the process of lawmaking and legal validity, which are the concerns here. Various types of legal realism are particularly popular in the USA. For a classic exposition, see Karl Llewellyn, The Bramble Bush: On Our Law and Its Study (New York: Oceana Publications, 1930). However, legal realism is more a rejection of debates as to validity than an attempt to engage with them. Accordingly, it will not be discussed further here.

  3.

John Gardner, “‘Legal Positivism’: 5½ Myths”, The American Journal of Jurisprudence, Vol. 46 (2001), 199–227, 199. See also Joseph Raz, Ethics in the Public Domain (Oxford: Clarendon Press, 1994).

  4.

    See, for example, John Finnis, Natural Law and Natural Rights (Oxford: Clarendon Press, 1981). In the “5½ Myths” paper (op. cit.), John Gardner contends that his version of Positivism would be acceptable to natural lawyers also.

  5.

It is perhaps unsurprising that some of the most prominent advocates of Natural Law, including both the thirteenth-century Saint Thomas Aquinas and the modern legal philosopher John Finnis, are men of religion for whom there is a unitary God-given structure of values.

  6.

    John Gardner, Law as a Leap of Faith: Essays on Law in General (Oxford: Oxford University Press, 2012), Chapter 6.

  7.

    To put matters another way, Positivists concentrate on input legitimacy and Natural Lawyers on output legitimacy.

  8.

    For the view of law as fictions, see Yuval Noah Harari, Sapiens: A Brief History of Humankind (London: Random House, 2015).

  9.

    Verity Harding and Sean Legassick, “Why We Launched DeepMind Ethics & Society”, Website of Deepmind, 3 October 2017, https://deepmind.com/blog/why-we-launched-deepmind-ethics-society/, accessed 1 June 2018.

  10.

“Homepage”, Website of the Partnership on AI, https://www.partnershiponai.org/, accessed 1 June 2018. Microsoft has taken a slightly different approach, eschewing external oversight in favour of a committee which appears to be composed only of Microsoft insiders. Microsoft describes its AI and Ethics in Engineering and Research Committee in a 2018 publication as “a new internal organization that includes senior leaders from across Microsoft’s engineering, research, consulting and legal organizations who focus on proactive formulation of internal policies and on how to respond to specific issues as they arise”. Microsoft, The Future Computed: Artificial Intelligence and Its Role in Society (Redmond, WA: Microsoft Corporation, 2018), 76–77, https://msblob.blob.core.windows.net/ncmedia/2018/01/The-Future_Computed_1.26.18.pdf, accessed 1 June 2018.

  11.

    Natasha Lomas, “DeepMind Now Has an AI Ethics Research Unit: We Have a Few Questions for It…”, TechCrunch, https://techcrunch.com/2017/10/04/deepmind-now-has-an-ai-ethics-research-unit-we-have-a-few-questions-for-it/, accessed 1 June 2018.

  12.

    For an argument along these lines in terms of a slightly different issue: antitrust regulation of Internet platforms, see Maurits Dolmans, Jacob Turner, and Ricardo Zimbron, “Pandora’s Box of Online Ills: We Should Turn to Technology and Market-Driven Solutions Before Imposing Regulation or Using Competition Law”, Concurrences, N°3-2017.

  13.

    This proposition has been widely recognised since at least the time of Aristotle. See, for instance, Pierre Pellegrin, “Aristotle’s Politics”, in The Oxford Handbook of Aristotle, edited by Christopher Shields (Oxford: Oxford University Press, 2012), 558–585.

  14.

    See, for example, Thomas Donaldson and Lee E. Preston, “The Stakeholder Theory of the Corporation: Concepts, Evidence, and Implications”, The Academy of Management Review, Vol. 20, No. 1 (January 1995), 65–91; David Hawkins, Corporate Social Responsibility: Balancing Tomorrow’s Sustainability and Today’s Profitability (Hampshire, UK and New York, NY: Springer, 2006).

  15.

    Christian Leuz, Dhananjay Nanda, and Peter Wysocki, “Earnings Management and Investor Protection: An International Comparison”, Journal of Financial Economics, Vol. 69, No. 3 (2003), 505–527.

  16.

    Dodge v. Ford Motor Co., 170 N.W. 668 (Mich. 1919).

  17.

    The full text is available at: http://archive.tobacco.org/History/540104frank.html, accessed 1 June 2018.

  18.

    Kelly D. Brownell and Kenneth E. Warner, “The Perils of Ignoring History: Big Tobacco Played Dirty and Millions Died: How Similar Is Big Food?”, The Milbank Quarterly, Vol. 87, No. 1 (March 2009), 259–294.

  19.

    “Partners”, Website of the Partnership on AI to Benefit People and Society, https://www.partnershiponai.org/partners/, accessed 1 June 2018.

  20.

    Henry Mance, “Britain Has Had Enough of Experts, Says Gove”, Financial Times, 3 June 2016, https://www.ft.com/content/3be49734-29cb-11e6-83e4-abc22d5d108c, accessed 1 June 2018.

  21.

We return to the problem of seeking international standards of regulation at s. 5 below.

  22.

    Thomas Hobbes, Leviathan: Or, the Matter, Forme, & Power of a Common-Wealth Ecclesiasticall and Civil (London: Andrew Crooke, 1651), 62.

  23.

See, for example, Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Farnham: Ashgate, 2009); Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics”, Harvard National Security Journal Feature (2013); Kenneth Anderson and Matthew C. Waxman, “Law and Ethics for Autonomous Weapon Systems: Why a Ban Won’t Work and How the Laws of War Can”, Stanford University, The Hoover Institution (Jean Perkins Task Force on National Security and Law Essay Series), 2013; Benjamin Wittes and Gabriella Blum, The Future of Violence: Robots and Germs, Hackers and Drones: Confronting a New Age of Threat (New York: Basic Books, 2015); Rebecca Crootof, “The Varied Law of Autonomous Weapon Systems”, in Autonomous Systems: Issues for Defence Policymakers, edited by Andrew P. Williams and Paul D. Scharre (Brussels: NATO Allied Command, 2015). Daniel Wilson writes, in his semi-satirical book, How to Survive a Robot Uprising: “If popular culture has taught us anything, it is that someday mankind must face and destroy the growing robot menace. In print and on the big screen we have been deluged with scenarios of robot malfunction, misuse and outright rebellion”. Daniel Wilson, How to Survive a Robot Uprising: Tips on Defending Yourself Against the Coming Rebellion (London: Bloomsbury, 2005), 10.

  24.

    See, for example, Alex Glassbrook, The Law of Driverless Cars: An Introduction (Minehead, Somerset, UK: Law Brief Publishing, 2017); Autonomous Driving: Technical, Legal and Social Aspects, edited by Markus Maurer, J. Christian Gerdes, Barbara Lenz, and Hermann Winner (New York: SpringerLink, 2017).

  25.

    See Chapter 1 at s. 5.

  26.

    Kevin Kelly, “The Myth of Superhuman AI”, Wired, 24 April 2017, https://www.wired.com/2017/04/the-myth-of-a-superhuman-ai/, accessed 1 June 2018.

  27.

    Ben Goertzel, “Cognitive Synergy: A Universal Principle for Feasible General Intelligence,” 2009 8th IEEE International Conference on Cognitive Informatics (Kowloon, Hong Kong, 2009), 464–468. https://doi.org/10.1109/coginf.2009.5250694.

  28.

    José Hernández-Orallo, The Measure of All Minds: Evaluating Natural and Artificial Intelligence (Cambridge: Cambridge University Press, 2017).

  29.

    Gerald Tesauro, “Temporal Difference Learning and TD-Gammon”, Communications of the ACM, Vol. 38, No. 3 (1995), 58–68.

  30.

    Volodymyr Mnih, Koray Kavukcuoglu, David Silver, Alex Graves, Ioannis Antonoglou, Daan Wierstra, and Martin Riedmiller, “Playing Atari with Deep Reinforcement Learning”, arXiv:1312.5602v1 [cs.LG], 19 December 2013, https://arxiv.org/pdf/1312.5602v1.pdf, accessed 1 June 2018; see also Volodymyr Mnih, Koray Kavukcuoglu, David Silver, Andrei A. Rusu, Joel Veness, Marc G. Bellemare, Alex Graves, Martin Riedmiller, Andreas K. Fidjeland, Georg Ostrovski, Stig Petersen, Charles Beattie, Amir Sadik, Ioannis Antonoglou, Helen King, Dharshan Kumaran, Daan Wierstra, Shane Legg, and Demis Hassabis, “Human-Level Control Through Deep Reinforcement Learning”, Nature, Vol. 518 (26 February 2015), 529–533, https://deepmind.com/research/publications/playing-atari-deep-reinforcement-learning/, accessed 1 June 2018.

  31.

    Ibid., 2.

  32.

Ibid., 6. “The input to the neural network consists of an 84 × 84 × 4 image produced by φ. The first hidden layer convolves 16 8 × 8 filters with stride 4 with the input image and applies a rectifier nonlinearity... The second hidden layer convolves 32 4 × 4 filters with stride 2, again followed by a rectifier nonlinearity. The final hidden layer is fully-connected and consists of 256 rectifier units. The output layer is a fully connected linear layer with a single output for each valid action”.
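The layer sizes in this quoted description can be checked with a short calculation. The sketch below (plain Python; the helper name `conv_out` is ours, not DeepMind’s) reproduces the arithmetic for a valid, unpadded convolution:

```python
def conv_out(size: int, kernel: int, stride: int) -> int:
    """Output width of a valid (no-padding) convolution."""
    return (size - kernel) // stride + 1

# 84 x 84 x 4 input -> 16 filters of 8 x 8 with stride 4
h1 = conv_out(84, kernel=8, stride=4)  # 20 x 20 feature maps
# -> 32 filters of 4 x 4 with stride 2
h2 = conv_out(h1, kernel=4, stride=2)  # 9 x 9 feature maps
# flattened activations feeding the 256-unit fully connected layer
flat = 32 * h2 * h2  # 2592
```

Each 84 × 84 × 4 stack of frames is thus reduced to 2,592 activations before the fully connected layers, with one linear output per valid joystick action.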

  33.

Ibid., 4–5. The DeepMind researchers explain “experience replay” as follows: “In contrast to TD-Gammon and similar online approaches, we utilize a technique known as experience replay... where we store the agent’s experiences at each time-step, et = (st, at, rt, st+1) in a data-set D = e1, …, eN, pooled over many episodes into a replay memory. During the inner loop of the algorithm, we apply Q-learning updates, or minibatch updates, to samples of experience, e ∼ D, drawn at random from the pool of stored samples. After performing experience replay, the agent selects and executes an action according to an ε-greedy policy. Since using histories of arbitrary length as inputs to a neural network can be difficult, our Q-function instead works on fixed length representation of histories produced by a function φ”.
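The mechanism described here can be sketched in a few lines. The class below is an illustrative reconstruction, not DeepMind’s code; the class name, method names and default capacity are our assumptions. It stores transitions (st, at, rt, st+1) in a bounded pool and draws uniform random minibatches from it:

```python
import random
from collections import deque

class ReplayMemory:
    """Bounded pool D of transitions, sampled uniformly at random."""

    def __init__(self, capacity: int = 100_000):
        # a deque with maxlen silently discards the oldest experiences
        self.pool = deque(maxlen=capacity)

    def push(self, state, action, reward, next_state):
        # store e_t = (s_t, a_t, r_t, s_{t+1}) at each time-step
        self.pool.append((state, action, reward, next_state))

    def sample(self, batch_size: int):
        # minibatch of experiences e ~ D for the Q-learning update
        return random.sample(self.pool, batch_size)
```

Because minibatches are drawn at random from many past episodes, consecutive training samples are decorrelated, which is the point of the technique as the quoted passage explains.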

  34.

    Steven Piantadosi and Richard Aslin, “Compositional Reasoning in Early Childhood”, PloS one, Vol. 11, No. 9 (2016), e0147734.

  35.

    As to the various mental shortcuts, or “heuristics” used by humans in this exercise, see Daniel Kahneman, Thinking Fast and Slow (London: Allen Lane, 2011), 55–57.

  36.

James Kirkpatrick, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, John Quan, Tiago Ramalho, Agnieszka Grabska-Barwinska, Demis Hassabis, Claudia Clopath, Dharshan Kumaran, and Raia Hadsell, “Overcoming Catastrophic Forgetting in Neural Networks”, Proceedings of the National Academy of Sciences of the United States of America, Vol. 114, No. 13 (2017), 3521–3526, https://doi.org/10.1073/pnas.1611835114. See also R.M. French and N. Chater, “Using Noise to Compute Error Surfaces in Connectionist Networks: A Novel Means of Reducing Catastrophic Forgetting”, Neural Computation, Vol. 14, No. 7 (2002), 1755–1769; and K. Milan et al., “The Forget-Me-Not Process”, in Advances in Neural Information Processing Systems 29, edited by D.D. Lee, M. Sugiyama, U.V. Luxburg, I. Guyon, and R. Garnett (Red Hook, NY: Curran Assoc., 2016).

  37.

Théophane Weber, Sébastien Racanière, et al., “Imagination-Augmented Agents for Deep Reinforcement Learning”, arXiv:1707.06203v1 [cs.LG], 19 July 2017, https://arxiv.org/pdf/1707.06203.pdf, accessed 1 June 2018.

  38.

    See, for example, Darrell Etherington, “Microsoft Creates an AI Research Lab to Challenge Google and DeepMind”, TechCrunch, 12 July 2017, https://techcrunch.com/2017/07/12/microsoft-creates-an-ai-research-lab-to-challenge-google-and-deepmind/, accessed 1 June 2018; Shelly Fan, “Google Chases General Intelligence with New AI That Has a Memory”, SingularityHub, 29 March 2017, https://singularityhub.com/2017/03/29/google-chases-general-intelligence-with-new-ai-that-has-a-memory/, accessed 1 June 2018.

  39.

    “Think of the steps that a human being has to do to make a cup of coffee and you have covered basically 10, 20 years of your lifetime just to learn it. So for a computer to do it the same way, it has to go through the same learning, walking to a house using some kind of optical with a vision system, stepping around and opening the door properly, going down the wrong way, going back, finding the kitchen, detecting what might be a coffee machine. You can’t program these things, you have to learn it, and you have to watch how other people make coffee. … This is a kind of logic that the human brain does just to make a cup of coffee. We will never ever have artificial intelligence. Your pet, for example, your pet is smarter than any computer”. Steve Wozniak, interviewed by Peter Moon, “Three Minutes with Steve Wozniak”, PC World, 19 July 2007. See also Luke Muehlhauser, “What Is AGI?”, MIRI, https://intelligence.org/2013/08/11/what-is-agi/, accessed 1 June 2018.

  40.

    Interview with Dr. Kazuo Yano, “Enterprises of the Future Will Need Multi-purpose AIs”, Hitachi Website, http://www.hitachi.co.jp/products/it/it-pf/mag/special/2016_02th_e/interview_ky_02.pdf, accessed 1 June 2018.

  41.

    UK Department of Transport, “The Pathway to Driverless Cars: Detailed Review of Regulations for Automated Vehicle Technologies”, UK Government Website, February 2015, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/401565/pathway-driverless-cars-main.pdf, accessed 1 June 2018.

  42.

When in 2017 the UK’s House of Lords Science and Technology Select Committee published a report entitled “Connected and Autonomous Vehicles: The Future?”, it concentrated solely on land-based vehicles. House of Lords, Science and Technology Select Committee, “Connected and Autonomous Vehicles: The Future?”, 2nd Report of Session 2016–17, HL Paper 115 (15 March 2017). The Report expressly says at para. 23: “We have not considered remote control vehicles (RCV) or drones (unmanned aerial vehicles) in this report”. The US Department of Transportation published its Federal Automated Vehicles Policy in September 2016. Unlike the UK’s paper, the Federal Policy mentioned the regulation of drones—but only in a two-page appendix. As the Department of Transportation observed, the US Federal Aviation Administration’s “challenges seem closest to those that [the National Highway Traffic Safety Administration] faces in dealing with [highly automated vehicles]”, 94, https://www.transportation.gov/sites/dot.gov/files/docs/AV%20policy%20guidance%20PDF.pdf, accessed 1 June 2018. Likewise, the UK’s Automated and Electric Vehicles Act 2018, one of the world’s first pieces of legislation to address insurance for autonomous vehicles, was restricted to liability for motor vehicles which “are or might be used on roads or in other public places”. No thought was given to extending its provisions to drones even though the same issues of liability and insurance arise. “Automated and Electric Vehicles Act 2018”, UK Parliament Website, https://services.parliament.uk/bills/2017-19/automatedandelectricvehicles.html, accessed 20 August 2018.

  43.

    Indeed, so important was this case that the UK Government now lists explicit guidance on this issue on its website: “Excepted Items: Confectionery: The Bounds of confectionery, Sweets, Chocolates, Chocolate Biscuits, Cakes and Biscuits: The Borderline Between Cakes and Biscuits”, UK Government Website, https://www.gov.uk/hmrc-internal-manuals/vat-food/vfood6260, accessed 1 June 2018.

  44.

    “Why Jaffa Cakes Are Cakes, Not Biscuits”, Kerseys Solicitors, 22 September 2014, http://www.kerseys.co.uk/blog/jaffa-cakes-cakes-biscuits/, accessed 1 June 2018.

  45.

    For a list categorising countries’ legal systems, see the Central Intelligence Agency World Factbook, Field Listing: Legal Systems, https://www.cia.gov/library/publications/the-world-factbook/fields/2100.html, accessed 1 June 2018.

  46.

    Common law developed in the UK, and variants are found in countries including Australia, Canada, Ireland, India, Singapore and the USA.

  47.

    See further Cross and Harris, Precedent 6; Neil Duxbury, The Nature and Authority of Precedent (Cambridge, UK: Cambridge University Press, 2008), 103; and Jowitt’s Dictionary of English Law, edited by Daniel Greenberg (4th edn. London: Sweet & Maxwell 2015), Entry on Precedent.

  48.

    Oliver Wendell Holmes, The Common Law (Boston, MA: Little, Brown and Company, 1881), 1.

  49.

    Kenneth Graham, “Of Frightened Horses and Autonomous Vehicles: Tort Law and Its Assimilation of Innovations”, Santa Clara Law Review, Vol. 52 (2012), 101–131. See also the views of Mark Deem: “What I think is important… is that we do it through case law…the law has this ability to be able to fill the gaps, and we should embrace that”, in “The Law and Artificial Intelligence”, Unreliable Evidence, BBC Radio 4, first broadcast 10 January 2015, http://www.bbc.co.uk/programmes/b04wwgz9, accessed 1 June 2018.

  50.

    A.P. Herbert, Uncommon Law: Being 66 Misleading Cases Revised and Collected in One Volume (London: Eyre Methuen, 1969), 127.

  51.

UK House of Commons Science and Technology Committee, Robotics and Artificial Intelligence, Fifth Report of Session 2016–17, published on 12 October 2016, HC 145, https://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/145/145.pdf, accessed 1 June 2018.

  52.

    Written Evidence submitted to the UK House of Commons Science and Technology Committee by the Law Society (ROB0037), http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/robotics-and-artificial-intelligence/written/32616.html, accessed 1 June 2018.

  53.

    For similar criticisms of case law as a means of creating rules for AI, see Matthew U. Scherer, “Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies and Strategies”, Harvard Journal of Law & Technology, Vol. 29, No. 2 (Spring 2016), 354–398, 388–392.

  54.

    Jeremy Waldron, “The Core of the Case Against Judicial Review”, The Yale Law Journal (2006), 1346–1406, 1363.

  55.

See, for example, a speech by UK Supreme Court Justice Lord Sumption setting out the limits of the judicial function: “The Limits of the Law”, 27th Sultan Azlan Shah Lecture, Kuala Lumpur, 20 November 2013, https://www.supremecourt.uk/docs/speech-131120.pdf, accessed 1 June 2018. See also the decision of the majority in R (Nicklinson) v. Ministry of Justice [2014] UKSC 38, where the Supreme Court declined to find that a terminally ill person had a right to be administered euthanasia, in the absence of any imprimatur from Parliament to that effect.

  56.

    Jack Stilgoe and Alan Winfield, “Self-Driving Car Companies Should Not Be Allowed to Investigate Their Own Crashes”, The Guardian, 13 April 2018, https://www.theguardian.com/science/political-science/2018/apr/13/self-driving-car-companies-should-not-be-allowed-to-investigate-their-own-crashes, accessed 1 June 2018.

  57.

    “Homepage”, Website of the House of Lords Select Committee on A.I., http://www.parliament.uk/ai-committee, accessed 1 June 2018.

  58.

    “Homepage”, Website of the All-Party Parliamentary Group on A.I., http://www.appg-ai.org/, accessed 1 June 2018.

  59.

    Another area of focus for discussions on AI and law which is outside of the problems addressed by this book is the impact of AI on the legal industry itself, for example as a replacement for lawyers and judges. As to which, see the website of the International Association for Artificial Intelligence and Law, “Homepage”, http://www.iaail.org/, accessed 30 December 2017.

  60.

    “House of Commons Science and Technology Committee, Robotics and Artificial Intelligence”, Fifth Report of Session 2016–17, 13 September 2016, para. 64.

  61.

    Theresa May, “Address to World Economic Forum”, 25 January 2018, https://www.weforum.org/agenda/2018/01/theresa-may-davos-address/, accessed 1 June 2018.

  62.

    Rowan Manthorpe, “May’s Davos Speech Exposed the Emptiness in the UK’s AI Strategy”, Wired, 28 January 2018, http://www.wired.co.uk/article/theresa-may-davos-artificial-intelligence-centre-for-data-ethics-and-innovation, accessed 1 June 2018.

  63.

    Rebecca Hill, “Another Toothless Wonder? Why the UK.gov’s Data Ethics Centre Needs Clout”, The Register, 24 November 2017, https://www.theregister.co.uk/2017/11/24/another_toothless_wonder_why_the_ukgovs_data_ethics_centre_needs_some_clout/, accessed 1 June 2018.

  64.

    Ibid.

  65.

House of Lords Select Committee on Artificial Intelligence, AI in the UK: Ready, Willing and Able? Report of Session 2017–19, HL Paper 100, https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/100.pdf, accessed 1 June 2018.

  66.

    The speech is available at: https://www.pscp.tv/w/1RDGldoaePmGL, accessed 1 June 2018.

  67.

    Nicholas Thompson, “Emmanuel Macron Talks to Wired About France’s AI Strategy”, Wired, 31 March 2018, https://www.wired.com/story/emmanuel-macron-talks-to-wired-about-frances-ai-strategy/, accessed 1 June 2018.

  68.

    Ibid.

  69.

    Cédric Villani, “For a Meaningful Artificial Intelligence: Towards a French and European Strategy”, March 2018, https://www.aiforhumanity.fr/pdfs/MissionVillani_Report_ENG-VF.pdf, accessed 1 June 2018.

  70.

    Anne Bajart, “Artificial Intelligence Activities”, European Commission Directorate-General for Communications Networks, Content and Technology, https://ec.europa.eu/growth/tools-databases/dem/monitor/sites/default/files/6%20Overview%20of%20current%20action%20Connect.pdf, accessed 1 June 2018.

  71.

    See Chapter 8 at Sections 2.4 and 4.2.

  72.

    European Commission, “Call for a High-Level Expert Group on Artificial Intelligence”, Website of the European Commission, https://ec.europa.eu/digital-single-market/en/news/call-high-level-expert-group-artificial-intelligence, accessed 1 June 2018.

  73.

    “EU Member States Sign Up to Cooperate on Artificial Intelligence”, Website of the European Commission, 10 April 2018, https://ec.europa.eu/digital-single-market/en/news/eu-member-states-sign-cooperate-artificial-intelligence, accessed 1 June 2018.

  74.

    “The Administration’s Report on the Future of Artificial Intelligence”, Website of the Obama White House, 12 October 2016, https://obamawhitehouse.archives.gov/blog/2016/10/12/administrations-report-future-artificial-intelligence, accessed 1 June 2018. For the reports themselves, see https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/microsites/ostp/NSTC/preparing_for_the_future_of_ai.pdf; and https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/microsites/ostp/NSTC/national_ai_rd_strategic_plan.pdf, accessed 1 June 2018.

  75.

“Preparing for the Future of Artificial Intelligence”, Executive Office of the President, National Science and Technology Council Committee on Technology, October 2016, 17–18 and 30–32, https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/microsites/ostp/NSTC/preparing_for_the_future_of_ai.pdf, accessed 1 June 2018.

  76.

    “A Roadmap for US Robotics: From Internet to Robotics”, 31 October 2016, http://jacobsschool.ucsd.edu/contextualrobotics/docs/rm3-final-rs.pdf, accessed 1 June 2018.

  77.

    Cade Metz, “As China Marches Forward on A.I., the White House Is Silent”, New York Times, 12 February 2018, https://www.nytimes.com/2018/02/12/technology/china-trump-artificial-intelligence.html, accessed 1 June 2018.

  78.

“The Japanese robot market was worth approximately 630 billion JPY (approximately 4.8 billion EUR) in 2015”. Fumio Shimpo, “The Principal Japanese AI and Robot Strategy and Research Toward Establishing Basic Principles”, Journal of Law and Information Systems, Vol. 3 (May 2018).

  79.

    “Report on Artificial Intelligence and Human Society”, Japan Advisory Board on Artificial Intelligence and Human Society, 24 March 2017, Preface, http://www8.cao.go.jp/cstp/tyousakai/ai/summary/aisociety_en.pdf, accessed 1 June 2018.

  80.

    Ibid.

  81.

    Advisory Board on Artificial Intelligence and Human Society, “Report on Artificial Intelligence and Human Society, Unofficial Translation”, http://www8.cao.go.jp/cstp/tyousakai/ai/summary/aisociety_en.pdf, accessed 1 June 2018.

  82.

    Available in English translation from the New America Institute: “A Next Generation Artificial Intelligence Development Plan”, China State Council, translated by Rogier Creemers, Leiden Asia Centre; Graham Webster, Yale Law School Paul Tsai China Center; Paul Triolo, Eurasia Group; and Elsa Kania (Washington, DC: New America, 2017), https://na-production.s3.amazonaws.com/documents/translation-fulltext-8.1.17.pdf, accessed 1 June 2018.

  83.

    Paul Triolo and Jimmy Goodrich, “From Riding a Wave to Full Steam Ahead As China’s Government Mobilizes for AI Leadership, Some Challenges Will Be Tougher Than Others”, New America, 28 February 2018, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/riding-wave-full-steam-ahead/, accessed 1 June 2018.

  84.

    Jeffrey Ding, “Deciphering China’s AI Dream”, in Governance of AI Program, Future of Humanity Institute (Oxford: Future of Humanity Institute, March 2018), 30, https://www.fhi.ox.ac.uk/wp-content/uploads/Deciphering_Chinas_AI-Dream.pdf, accessed 1 June 2018.

  85.

    Jeffrey Ding, “Deciphering China’s AI Dream”, in Governance of AI Program, Future of Humanity Institute (Oxford: Future of Humanity Institute, March 2018), https://www.fhi.ox.ac.uk/wp-content/uploads/Deciphering_Chinas_AI-Dream.pdf, accessed 1 June 2018.

  86.

    Ibid., 31.

  87.

    Ibid., unofficial translation by Jeffrey Ding.

  88.

    Ibid.

  89.

    Paul Triolo and Jimmy Goodrich, “From Riding a Wave to Full Steam Ahead As China’s Government Mobilizes for AI Leadership, Some Challenges Will Be Tougher Than Others”, New America, 28 February 2018, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/riding-wave-full-steam-ahead/, accessed 1 June 2018.

  90.

    “White Paper on Standardization in AI”, National Standardization Management Committee, Second Ministry of Industry, 18 January 2018, http://www.sgic.gov.cn/upload/f1ca3511-05f2-43a0-8235-eeb0934db8c7/20180122/5371516606048992.pdf, accessed 9 April 2018. Contributors to the white paper included: the China Electronics Standardization Institute, Institute of Automation, Chinese Academy of Sciences, Beijing Institute of Technology, Tsinghua University, Peking University, Renmin University, as well as private companies Huawei, Tencent, Alibaba, Baidu, Intel (China) and Panasonic (formerly Matsushita Electric) (China) Co., Ltd.

  91.

    Ibid., para. 3.3.1.

  92.

    Elsa Kania, “China’s Strategic Ambiguity and Shifting Approach to Lethal Autonomous Weapons Systems”, Lawfare Blog, 17 April 2018, https://www.lawfareblog.com/chinas-strategic-ambiguity-and-shifting-approach-lethal-autonomous-weapons-systems, accessed 1 June 2018. Kania notes that China’s announcement may not be all that it seems, especially given that China appears simultaneously to be developing its own autonomous weapon systems, whilst calling for a potential future ban. The original recording of the Chinese delegation’s statement is available on the UN Digital Recordings Portal website, at: https://conf.unog.ch/digitalrecordings/index.html?guid=public/61.0500/E91311E5-E287-4286-92C6-D47864662A2C_10h14&position=1197, accessed 1 June 2018.

  93.

    “Convergence on Retaining Human Control of Weapons Systems”, Campaign to Stop Killer Robots, 13 April 2018, https://www.stopkillerrobots.org/2018/04/convergence/, accessed 1 June 2018.

  94.

    Paul Triolo and Jimmy Goodrich, “From Riding a Wave to Full Steam Ahead As China’s Government Mobilizes for AI Leadership, Some Challenges Will Be Tougher Than Others”, New America, 28 February 2018, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/riding-wave-full-steam-ahead/, accessed 1 June 2018.

  95.

Hung-jen Wang, “Contextualising China’s Call for Discourse Power in International Politics”, China: An International Journal, Vol. 13, No. 3 (2015), 172–189, Project MUSE, muse.jhu.edu/article/604043, accessed 9 April 2018. See also Jin Cai, “5 Challenges in China’s Campaign for International Influence”, The Diplomat, 26 June 2017, https://thediplomat.com/2017/06/5-challenges-in-chinas-campaign-for-international-influence/, accessed 1 June 2018.

  96.

See, for example, Michel Foucault, The Archaeology of Knowledge, translated by A.M. Sheridan Smith (New York: Pantheon Books, 1972). The definition quoted is from Iara Lessa, “Discursive Struggles Within Social Welfare: Restaging Teen Motherhood”, The British Journal of Social Work, Vol. 36, No. 2 (1 February 2006), 283–298.

  97.

    See Joseph S. Nye, Jr., “Soft Power”, Foreign Policy, No. 80, Twentieth Anniversary (Autumn 1990), 153–171.

  98.

    “Decision of the Central Committee of the Communist Party of China on Deepening Cultural System Reforms to Promote Major Development and Prosperity of Socialist Culture”, Xinhua News Agency, Beijing, 25 October 2011, http://www.gov.cn/jrzg/2011-10/25/content_1978202.htm, accessed 1 June 2018.

  99.

    Jin Cai, “5 Challenges in China’s Campaign for International Influence”, The Diplomat, 26 June 2017, https://thediplomat.com/2017/06/5-challenges-in-chinas-campaign-for-international-influence/, accessed 1 June 2018.

  100.

    Julian E. Barnes and Josh Chin, “The New Arms Race in AI”, The Wall Street Journal, 2 March 2018, https://www.wsj.com/articles/the-new-arms-race-in-ai-1520009261, accessed 1 June 2018. See also John R. Allen and Amir Husain, “The Next Space Race Is Artificial Intelligence: And the United States Is Losing”, Foreign Policy, 3 November 2017, http://foreignpolicy.com/2017/11/03/the-next-space-race-is-artificial-intelligence-and-america-is-losing-to-china/, accessed 1 June 2018.

  101.

    See above at s. 4. See also UK House of Commons Science and Technology Committee, Robotics and Artificial Intelligence, Fifth Report of Session 2016–17, 13 September 2016, para. 64; Mathew Lawrence, Carys Roberts, and Loren King, “Managing Automation: Employment, Inequality and Ethics in the Digital Age”, IPPR Commission on Economic Justice, December 2017, 37–39; Ryan Calo, “The Case for a Federal Robotics Commission”, Brookings Institution, 15 September 2014, 3, https://www.brookings.edu/research/the-case-for-a-federal-robotics-commission/, accessed 1 June 2018; and Matthew U. Scherer, “Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies and Strategies”, Harvard Journal of Law & Technology, Vol. 29, No. 2 (Spring 2016), 354–398, 393–398.

  102.

    “Why Is the Border Between the Koreas Sometimes Called the ‘38th Parallel’?”, The Economist, 5 November 2013.

  103.

    One of the most famous examples is the “Louisiana Purchase”, whereby the USA acquired the territory (later the state) of Louisiana from France in 1803 for 80 million francs (around US$15 million).

  104.

    For an example of the types of costs and difficulties caused by regulatory compliance, see Stacey English and Susannah Hammond, Cost of Compliance 2017 (London: Thomson Reuters, 2017).

  105.

    Vanessa Houlder, “OECD Unveils Global Crackdown on Tax Arbitrage by Multinationals”, Financial Times, 19 July 2013, https://www.ft.com/content/183c2e26-f03c-11e2-b28d-00144feabdc0, accessed 1 June 2018.

  106.

    “Whither Nationalism? Nationalism Is Not Fading Away: But It Is Not Clear Where It Is Heading”, The Economist, 19 December 2017.

  107.

    Ibid.

  108.

    ICANN, “The IANA Functions”, December 2015, https://www.icann.org/en/system/files/files/iana-functions-18dec15-en.pdf, accessed 1 June 2018.

  109.

    ICANN, “History of ICANN”, https://www.icann.org/en/history/icann-usg, accessed 1 June 2018.

  110.

    Ibid. See also Clinton White House, “Framework for Global Electronic Commerce”, 1 July 1997, https://clintonwhitehouse4.archives.gov/WH/New/Commerce/read.html, accessed 1 June 2018.

  111.

    “National Telecommunications Information Administration, Responses to Request for Comments”, https://www.ntia.doc.gov/legacy/ntiahome/domainname/130dftmail/, accessed 1 June 2018.

  112.

    “Articles of Incorporation of Internet Corporation for Assigned Names and Numbers”, ICANN, as revised 21 November 1998, https://www.icann.org/resources/pages/articles-2012-02-25-en, accessed 1 June 2018.

  113.

    ICANN-Accredited Registrars, https://www.icann.org/registrar-reports/accredited-list.html, accessed 18 January 2018.

  114.

    ICANN, “Beginner’s Guide to At-Large Structures”, June 2014, https://www.icann.org/sites/default/files/assets/alses-beginners-guide-02jun14-en.pdf, accessed 1 June 2018, 3.

  115.

    Ibid., 4.

  116.

    Ibid., 7–8.

  117.

    “History of ICANN”, ICANN, https://www.icann.org/en/history/icann-usg, accessed 1 June 2018.

  118.

    “Remarks by President Trump to the 72nd Session of the United Nations General Assembly”, Website of the White House, 19 September 2017, https://www.whitehouse.gov/briefings-statements/remarks-president-trump-72nd-session-united-nations-general-assembly/, accessed 1 June 2018.

  119.

    For discussion, see Arthur A. Stein, Why Nations Cooperate: Circumstance and Choice in International Relations (Ithaca and London: Cornell University Press, 1990).

  120.

    OHCHR, OHRLLS, UNDESA, UNEP, UNFPA, “Global Governance and Governance of the Global Commons in the Global Partnership for Development Beyond 2015: Thematic Think Piece”, January 2013, http://www.un.org/en/development/desa/policy/untaskteam_undf/thinkpieces/24_thinkpiece_global_governance.pdf, accessed 1 June 2018.

  121.

    Arthur A. Stein, Why Nations Cooperate: Circumstance and Choice in International Relations (Ithaca and London: Cornell University Press, 1990), 7–10.

  122.

    Thomas C. Schelling, The Strategy of Conflict (Cambridge, MA: Harvard University Press, 1960). See also Glenn H. Snyder, “‘Prisoner’s Dilemma’ and ‘Chicken’ Models in International Politics”, International Studies Quarterly, Vol. 15 (March 1971), 66–103.

  123.

    Tucker Davey, “Developing Countries Can’t Afford Climate Change”, Future of Life Institute, 5 August 2016, https://futureoflife.org/2016/08/05/developing-countries-cant-afford-climate-change/, accessed 1 June 2018.

  124.

    Helpfully, there was precedent for this: the Antarctic Treaty was signed in 1959 by the 12 countries whose scientists had been active in Antarctica during the International Geophysical Year of 1957–1958, which included several of the major world powers: the USA, France, the UK and the Soviet Union. “The Antarctic Treaty”, Website of the Antarctic Treaty Secretariat, http://www.ats.aq/e/ats.htm, accessed 1 June 2018.

  125.

    US Department of State, “Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, Including the Moon and Other Celestial Bodies: Narrative”, Website of the US Department of State, https://www.state.gov/t/isn/5181.htm, accessed 1 June 2018.

  126.

    General Assembly resolution 1962 (XVIII) of 13 December 1963.

  127.

    Art. IX, Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, Including the Moon and Other Celestial Bodies 1967.

  128.

    These are: NASA (the US), ESA (various European states), CSA (Canada), JAXA (Japan), and Roscosmos (Russia). “International Space Station”, Website of NASA, https://www.nasa.gov/mission_pages/station/main/index.html, accessed 1 June 2018.

  129.

    The 1967 Agreement on the Rescue of Astronauts, the Return of Astronauts and the Return of Objects Launched into Outer Space provides for aiding the crews of spacecraft in the event of accident or emergency landing. It also establishes a procedure for returning to a launching authority a space object found beyond the territorial limits of that authority. The 1971 Convention on International Liability for Damage Caused by Space Objects provides that launching States are liable for damage caused by their space objects on the Earth’s surface or to aircraft in flight and/or to space objects of another State or to persons or property on board those objects. The 1974 Convention on Registration of Objects Launched into Outer Space provides that launching States shall maintain registries of space objects and furnish specified information on each space object launched for inclusion in a central United Nations register. The 1979 Agreement Governing the Activities of States on the Moon and Other Celestial Bodies elaborates, in more specific terms, the principles relating to the Moon and other celestial bodies set out in the 1966 Treaty. See “General Assembly Resolutions and Treaties Pertaining to the Peaceful Uses of Outer Space”, United Nations Website, http://www.un.org/events/unispace3/bginfo/gares.htm, accessed 1 June 2018.

  130.

    “Role and Responsibilities”, Website of UNOOSA, http://www.unoosa.org/oosa/en/aboutus/roles-responsibilities.html, accessed 1 June 2018.

  131.

    Public International Law is often distinguished from Private International Law. The latter stipulates how the national legal systems of countries interact, particularly with regard to rules on when courts of different countries will accept jurisdiction over a dispute, as well as the recognition and enforcement of foreign judgments. See, for the English approach, Collins, ed., Dicey, Morris and Collins on the Conflict of Laws (15th edn. London: Sweet & Maxwell, 2016). Private International Law is not discussed further in this work.

  132.

    Customary International Law is particularly difficult to define with precision, given that it is formed by a combination of state practice and opinio juris: whether states actually regard themselves as bound by law to do or not do the action in question. See, for example, Jörg Kammerhofer, “Uncertainty in the Formal Sources of International Law: Customary International Law and Some of Its Problems”, European Journal of International Law, Vol. 15, No. 3 (2004), 523–553; Frederic L. Kirgis, Jr., “Custom on a Sliding Scale”, The American Journal of International Law, Vol. 81, No. 1 (January 1987), 146–151.

  133.

    Again, there is significant uncertainty as to what principles are generally recognised as such. See The Barcelona Traction Case, ICJ Reports (1970), 3.

  134.

    Unlike common law systems, there is no stare decisis rule in international law. Rebecca M. Wallace, International Law (London: Sweet & Maxwell, 1986), 22.

  135.

    See The Statute of the International Court of Justice, art. 38(1).

  136.

    See art. 25 of the UN Charter. See also sources cited at entry entitled “Are UN Resolutions Binding?”, Website of Dag Hammarskjold Library, http://ask.un.org/faq/15010, accessed 3 June 2017; Philippe Sands, Pierre Klein, and D.W. Bowett, Bowett’s Law of International Institutions (6th edn. London: Sweet & Maxwell, 2009).

  137.

    Math Noortmann, August Reinisch, and Cedric Ryngaert, eds. Non-state Actors in International Law (Oxford: Hart Publishing, 2015).

  138.

    Legal theorist Hans Kelsen identified the proposition “pacta sunt servanda” (agreements are to be honoured) as the most important norm in international law. Hans Kelsen, “Théorie générale du droit international public. Problèmes choisis”, Collected Courses of The Hague Academy of International Law 42 (Boston: Brill Nijhoff, 1932), IV, 13. Discussed in François Rigaux, “Hans Kelsen on International Law”, European Journal of International Law, Vol. 9, No. 2 (1998), 325–343.

  139.

    Ryan Goodman, “Human Rights Treaties, Invalid Reservations, and State Consent”, American Journal of International Law, Vol. 96 (2002), 531–560; Alan Boyle and Christine Chinkin, The Making of International Law (Oxford: Oxford University Press, 2007).

  140.

    This is referred to as the “Westphalian Model”, following the Peace of Westphalia of 1648, which ended the Thirty Years’ War by affirming the individual control of States over their internal affairs.

  141.

    Russell Hittinger, “Social Pluralism and Subsidiarity in Catholic Social Doctrine”, Annales Theologici, Vol. 16 (2002), 385–408, 396.

  142.

    “Subsidiarity”, EUR-Lex (official website for EU law), http://eur-lex.europa.eu/summary/glossary/subsidiarity.html, accessed 9 December 2017. It is now enshrined as a principle within art. 5(3) of the Treaty on European Union.

  143.

    “The Principle of Subsidiarity”, European Parliament, January 2018, http://www.europarl.europa.eu/ftu/pdf/en/FTU_1.2.2.pdf, accessed 1 June 2018.

  144.

    See art. 263 of the Treaty on the Functioning of the European Union.

  145.

    Christine M. Chinkin, “The Challenge of Soft Law: Development and Change in International Law”, International and Comparative Law Quarterly (1989), 850–866.

  146.

    “Regulations, Directives and Other Acts”, Website of the European Union, https://europa.eu/european-union/eu-law/legal-acts_en, accessed 1 June 2018.

  147.

    Regulation (EU) 2015/478 of the European Parliament and of the Council of 11 March 2015 on common rules for imports.

  148.

    Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights.

  149.

    Joint Decision of the European Commission and the High Representative of the Union for Foreign Affairs and Security Policy on the participation of the European Union in various organisations for cooperation to prevent and counter terrorism; JOIN/2015/032. A “decision” by the European Commission might perhaps be seen as executive rather than legislative. Indeed, the website of the European Commission characterises it as the EU’s “politically independent executive arm”. However, this belies the role that the European Commission plays in generating legislation. Moreover, executive acts, particularly when they create or communicate policies, can have the effect of creating or confirming law. As such, the Commission’s decisions can properly be characterised as a form of law-making, even if they do not emanate from the purely legislative arm (i.e. the Parliament). See “European Commission”, Website of the EU, https://europa.eu/european-union/about-eu/institutions-bodies/european-commission_en, accessed 1 June 2018.

  150.

    Council Recommendations—‘Promoting the use of and sharing of best practices on cross-border videoconferencing in the area of justice in the Member States and at EU level’. OJ C 250, 31 July 2015, 1–5.

  151.

    See, for example, various EU guidelines and guidance on sanctions: Maya Lester QC and Michael O’Kane, “Guidelines”, European Sanctions Blog, https://europeansanctions.com/eu-guidelines/, accessed 1 June 2018.

  152.

    Willem Riphagen, “From Soft Law to Jus Cogens and Back”, Victoria University of Wellington Law Review, Vol. 17 (1987), 81.

  153.

    See “UNCITRAL Model Law on International Commercial Arbitration”, http://www.uncitral.org/uncitral/en/uncitral_texts/arbitration/1985Model_arbitration.html, accessed 1 June 2018.

  154.

    Deanna Barmakian and Terri Saint-Amour, “Uniform Laws and Model Acts”, Harvard Law School Library, https://guides.library.harvard.edu/unifmodelacts, accessed 1 June 2018.

  155.

    The Uniform Commercial Code is an example of a widely-adopted uniform law.

  156.

    “About the IMLI”, Website of the IMLI, http://www.imli.org/about-us/imo-international-maritime-law-institute, accessed 1 June 2018.

  157.

    Directive 2014/65/EU of the European Parliament and of the Council of 15 May 2014 on Markets in Financial Instruments, art. 67.

  158.

    Ibid.

  159.

    “Global Financial Innovation Network”, FCA Website, 7 August 2018, updated 9 August 2018, https://www.fca.org.uk/publications/consultation-papers/global-financial-innovation-network, accessed 16 August 2018. There is further discussion of the functioning and nature of regulatory sandboxes in Chapter 7 at 7.3.4.

  160.

    Art. 15(2) of the Paris Climate Agreement 2015 provides: “The [compliance] mechanism referred to in paragraph 1 of this Article shall consist of a committee that shall be expert-based and facilitative in nature and function in a manner that is transparent, non-adversarial and non-punitive”.

  161.

    See, for example, the views of Israel on the International Criminal Court: “Israel and the International Criminal Court”, Office of the Legal Adviser to the Ministry of Foreign Affairs, 30 June 2002, http://www.mfa.gov.il/MFA/MFA-Archive/2002/Pages/Israel%20and%20the%20International%20Criminal%20Court.aspx, accessed 1 June 2018.

  162.

    One successful example of a mechanism to ensure a balance of quality as well as geographic diversity is the EU’s “Article 255 Committee”, which since 2010 has assessed nominees for judicial appointment to the EU’s Courts. For discussion see Tomas Dumbrovsky, Bilyana Petkova, and Marijn Van der Sluis, “Judicial Appointments: The Article 255 TFEU Advisory Panel and Selection Procedures in the Member States”, Common Market Law Review, Vol. 51 (2014), 455–482.

  163.

    European Commission, “Press Release—Rule of Law: European Commission Acts to Defend Judicial Independence in Poland”, Website of the European Commission, 20 December 2017, http://europa.eu/rapid/press-release_IP-17-5367_en.htm, accessed 1 June 2018.

  164.

    Ibid.

  165.

    Art. 7, Treaty on the European Union.

  166.

    The OECD member countries are: Australia, Austria, Belgium, Canada, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Luxembourg, Mexico, the Netherlands, New Zealand, Norway, Poland, Portugal, the Slovak Republic, Spain, Sweden, Switzerland, Turkey, the United Kingdom and the United States. The Commission of the European Communities takes part in the work of the OECD.

  167.

    The 2011 revision of the Guidelines added a chapter on Human Rights aligned with the language of the UN Guiding Principles on Business and Human Rights. The Guidelines also make reference to relevant provisions of the ILO Tripartite Declaration of Principles concerning Multinational Enterprises and Social Policy as well as the Rio Declaration. See OECD Secretariat, Implementing the OECD Guidelines for Multinational Enterprises: The National Contact Points from 2000 to 2015 (2016), 11, http://mneguidelines.oecd.org/OECD-report-15-years-National-Contact-Points.pdf, accessed 1 June 2018.

  168.

    Ibid., 12. NGOs accounted for 80 specific instances, or 48% of all complaints, from 2011 to 2016, followed by trade unions, which accounted for a quarter of all complaints since 2011. Individuals filed 33 complaints from 2011 to 2016, accounting for 19% of all complaints in this time period. Approximately a third of all closed specific instances were not accepted for further consideration at the initial assessment stage. A non-acceptance rate of between 30 and 40% has been relatively stable since 2000.

  169.

    Ibid., Glossary.

  170.

    Ibid., 12.

  171.

    Ibid.

  172.

    Ibid., 13.

  173.

    Vilca v. Xstrata [2016] EWHC 2757 (QB) at [22], [25].

  174.

    Genesis 11:1, King James Bible.

  175.

    Ibid.

Corresponding author

Correspondence to Jacob Turner.

Copyright information

© 2019 The Author(s)

Cite this chapter

Turner, J. (2019). Building a Regulator. In: Robot Rules. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-96235-1_6

  • Print ISBN: 978-3-319-96234-4

  • Online ISBN: 978-3-319-96235-1
