The Turtle Bay Security Roundtable: Managing the Frontiers of Technology

2018/3/23

On March 23, 2018, the Permanent Mission of Japan to the United Nations hosted the eighth meeting of the Turtle Bay Security Roundtable series. The event was organized in cooperation with the Stimson Center, an independent think tank dedicated to global security and development. Under the theme “Managing the Frontiers of Technology,” the meeting convened UN Member States, members of the Group of Experts of the 1540 Committee and other subsidiary organs of the Security Council, and experts from think tanks, industry, and academia to discuss the implications of evolving technologies for international security. The event featured a formal address by Secretary-General Mr. António Guterres and panel discussions with Ms. Izumi Nakamitsu (Under-Secretary-General for Disarmament Affairs), Mr. Clifton Leaf (Fortune Magazine), Mr. Kevin Cuddy (General Electric), Ms. Amy Kruse (The Platypus Institute), and Mr. Jack Clark (OpenAI).

OPENING REMARKS

Ambassador Koro Bessho opened the event by expressing his pleasure at re-launching the Turtle Bay Security Roundtable after a two-year pause. He noted that the theme of the event, “Managing the Frontiers of Technology,” is timely, as the Secretary-General plans to give technology a prominent place in his emerging disarmament agenda. Ambassador Bessho expressed hope that the discussion with top experts from the private sector, civil society, and journalism would lead to a better understanding of the impact of cutting-edge technology on our security environment and of the actions that must be taken.
 
In Japan, by way of example, robots have been stereotyped as superheroes, as depicted in the popular Japanese manga Astro Boy and Doraemon. In real life, robots such as PARO already help care for senior citizens in Japan. However, we cannot always assume that robots will be benign to humans, said Ambassador Bessho. Some robots are designed to kill people, and dual-use technologies can be repurposed to harm. Quoting science fiction writer Isaac Asimov, Ambassador Bessho said that “science gathers knowledge faster than society gathers wisdom.” He called upon session participants to think creatively about how to keep up with the remarkable pace of technological innovation.
 
Lastly, Ambassador Bessho emphasized that we must ask what rules and norms are needed to manage the technologies of our time, such as drones, artificial intelligence (AI), and nanotechnology.
    
 
FIRST SESSION: Technology Innovation and Weapons

Moderator: Rachel Stohl
Panelists: Kevin Cuddy (General Electric), Amy Kruse (The Platypus Institute), Jack Clark (OpenAI)
 
The first panel focused on the impact of technological innovation on the security sphere. Rachel Stohl, the moderator, noted that well-meaning technologies can be misused or used in ways we had not anticipated, and that non-state actors can get their hands on them.
 
Jack Clark warned of the potential for weaponization. Drones, for example, can be easily weaponized. He highlighted the implications of ‘autonomy’ for the development of “smart landmines,” for instance, and called for a wider public conversation about the implications of evolving technology. Mr. Clark noted that when processing power increases by orders of magnitude, its capacity for both good and ill grows in step. Regulators and government agencies struggle to keep up with these evolving technology spaces. On the other hand, increased knowledge, such as that gained through satellites and mass data analysis, will give governments a greater understanding of their security situation, which could actually promote stability.
 
Amy Kruse spoke about the augmentation of human bodies. She noted that certain “augmentations” already exist and are considered normal: glasses to improve vision, for example, or coffee as a stimulant. The majority of neuroscience investment to date has gone to restoration and repair, such as prosthetics or recovery from brain injuries and post-traumatic stress disorder. But the possibility of enhancing healthy humans raises significant questions. She commented that we need to ask ourselves what we are comfortable with, whether it is governments using enhancements to monitor minority populations or governments building super-soldiers. She also noted the possibility of teaming human beings with artificial intelligence to produce potent outcomes. She stressed the need for more conversation about the regulation of augmentation and neuroscience. “There is more regulation on Olympic athletes than there is on this stuff,” she said.
 
Kevin Cuddy focused on the implications of additive manufacturing, which is disruptive because it can increase design sophistication while decreasing costs. The nonproliferation community, he suggested, is unduly alarmist at present, viewing additive manufacturing as “the next bogeyman.” While recognizing potential concerns down the road, Mr. Cuddy said that current nuclear technology is controlled regardless of how it is made. In response to a question on DPRK nuclear sanctions, he said that additive technology as it currently exists cannot do much in the nuclear space. He raised the question of which elements should fall under the control of nonproliferation regimes: the design files, the printer itself, or the components? He noted that a machine’s sophistication comes largely from its number of lasers. He warned that unduly hasty control regimes could stifle innovation, as they have in other industries in the past. National governments and the control regimes must work with industry to determine what the technologies actually do. “Controlling too quickly before you know what you are controlling can hamper the industry,” he said. He insisted that a technical determination is needed of where control is most appropriate.
 
AI considerations were raised during the question-and-answer session. Nuclear weapons, it was pointed out, are not designed to replicate superior versions of themselves, but this is precisely what AI does. The precautionary principle was designed to stop runaway ideas by compelling the innovator to model the risks and find ways to mitigate them. Mr. Clark asked how a culture of shame around certain innovations could be engendered, and noted several instances in which the scientific community has mobilized against irresponsible research. It was also pointed out that systems can always be hacked by nefarious actors.
 
Mr. Cuddy said that most people are trying to do the right thing, but many small- to medium-sized companies don’t understand export controls or why they are important. Education is therefore key.
 
The panelists noted the lack of existing frameworks for governing fast-developing technologies. Ms. Kruse noted that conventions against biological and chemical weapons exist, but said that synthetic biology is advancing rapidly and the regulatory process is confused. Mr. Clark added that there are almost no controls on AI apart from ethical ones, which are good but insufficient.
    

SECOND SESSION: A Conversation on the Evolving Prevention Toolkit

Moderator: Brian Finlay
Panelists: Izumi Nakamitsu (Under-Secretary-General for Disarmament Affairs, UN), Clifton Leaf (Fortune Magazine)
 
Izumi Nakamitsu began the second session by warning that we should not be too alarmist in our reaction to technological advancement. She pointed out that science and technological breakthroughs over the course of human history have largely been forces for good and measurably improved the human condition. The United Nations already has a mandate to deal with the consequences of technological development. By way of example, the Treaty on the Non-Proliferation of Nuclear Weapons was developed to deal with nuclear technology.
 
The UN community is now coming together to look at the relevant issues more holistically and comprehensively with regard to the sufficiency of normative frameworks, enforcement, and gaps, she said. Actors other than Member States need to be involved in that discussion as well, but the UN, being a Member States’ organization, is still exploring the best mechanisms to achieve these goals. The First Committee of the General Assembly has given the Secretary-General a mandate to compile a report on the implications of science and technological developments on international security and disarmament, and the writing of that report is now underway.

Brian Finlay noted that technology’s role in lifting up human society has been unmistakable. However, those who focus on security issues naturally emphasize its dark side, and we need to find the balance.

Ms. Nakamitsu said that there is more to technology than security risks. Innovation could also spur development and benefit humankind in fields such as health and education. The United Nations has a significant role to play. When Mr. Finlay asked whether the UN has the correct mechanisms, Ms. Nakamitsu said that the UN has convening power and can create different types of platforms and discussion mechanisms. The UN is uniquely situated to bring in non-Member State actors so that multiple stakeholders can learn from each other and develop creative, mutually beneficial solutions. Twenty-first-century norm-making cannot be just straightforward treaty negotiations between states, but must include entrepreneurship, industry self-regulation, and other activity in the private sector, she said.

Clifton Leaf called for a change in the conversation. Tech companies are working on cryptography, and biotech companies are tackling gene editing to fight diseases. Dual-use technology has ample opportunity to be used for good or ill, and can be sold through the Dark Web. A network of autonomous vehicles can be hacked, just as previous networks have been hacked. He observed that virtually every technology company is focusing on its potential to disrupt.

Mr. Finlay asked what incentive companies would have to share proprietary information in the name of security. Mr. Leaf said that Fortune 500 companies have the infrastructure in place and a strong incentive to do what is right. However, smaller, non-Fortune 500 companies may not know the compliance regulations and export controls, and may not have thought through the issues. Many of these cutting-edge companies are focused primarily on economic survival rather than wider public policy or security questions. Dual-use technology has not been extensively discussed across many industry sectors, even as knowledge has been democratized. He said that competition sometimes drives people to innovate faster than their wisdom can keep up. Technological change has always been with us, but the rapidity of change is unprecedented.
    

FORMAL ADDRESS

António Guterres (Secretary-General, UN)
 
The Secretary-General began his speech by addressing the benefits of the era of unprecedented technological advance in which we live, what some call the Fourth Industrial Revolution. He explained that new technologies could enhance the maintenance of peace and security, including disarmament and non-proliferation objectives, by providing new tools and augmenting existing ones. As an example, he cited the UN’s use of unarmed, uninhabited aerial vehicles in its peacekeeping operations to strengthen their ability to protect civilians.
 
However, new technologies also carry clear risks and pose challenges to regional and global stability. Civilian technologies such as synthetic biology or facial recognition software can be repurposed for harmful ends. Advances in technology also generate new methods and means of warfare that operate over greater distances, at faster speeds, and with enhanced destructive power. The Secretary-General expressed concern about such technologies falling into the hands of non-state actors, including terrorist groups, and about great powers choosing military solutions over dialogue and diplomacy.
 
The Secretary-General asserted that the key question is: “How do we encourage innovation without losing control of these technologies and inadvertently fueling arms races and conflict?” He introduced a spectrum of responses to this question, including: industry self-regulation, robust implementation of national measures, and formal confidence-building measures.
 
The Secretary-General emphasized that, as a first step, we must improve global awareness and understanding of the implications of these new technologies for international peace and security. While the UN stands ready to assist, he noted that Member States must take the lead on ideas and implementation. He also encouraged that all relevant stakeholders, including industry, civil society, and academia, be included in these deliberations.
    

WRAP-UP SESSION

In the wrap-up session, Mr. Clark said that initiatives to measure the rate of progress of these transformative technologies would be the single most important thing policymakers could undertake. Ms. Kruse added that open discussions involving academics, scientists, and business are not common, but we are at the point where they have to happen; they are not nice-to-have but must-have conversations. Mr. Cuddy argued that we have to break down the confrontational relationship between industry and government, between regulator and regulated. Mr. Leaf noted that businesses must maintain a social license to operate, including on issues like privacy and the use of data, and that the public needs to understand what companies are doing.