EXECUTIVE SUMMARY
CONTEXT AND WHY IT MATTERS
The drone revolution is gaining pace. The technology is developing much faster than the legal, regulatory and ethical framework for its use.
PEOPLE
General Sir Richard Barrons chaired a group that included a historian of strategy; prominent technology leaders; AI and robotics experts; a senior CEO; senior government lawyers; military commanders; researchers on warfare; and drone warfare practitioners.
WHERE WE ARE NOW
Drones, in their different forms, may become as ubiquitous as smartphones. They should be seen as part of the sensor, data, autonomy and robotics revolution rather than in isolation. Their true impact will be as part of systems, with many new business models likely to emerge. It is important to view drones as much more than aerial vehicles: they will be present on land, at and under the sea, in the air and in space. The implications for privacy and for many areas of the economy and the labour market could be very significant. The questions of autonomous weapons and the use of force by drones acting in support of the police revealed significant divisions in the group. These are likely to be the most contentious questions. If the West is too cautious in exploring drone technology, then other nations will forge ahead. We will not be able to avoid the implications of drone technology through such caution.
PRESCRIPTIONS
- More work is needed on the military's approach to autonomous weapons. Do we view their development as inevitable and how will views amongst the international community differ? Is preventing this a realistic policy aim for western governments?
- The United States and the UK and other rule of law nations need to do more to explain in public the ethical and legal basis for the use of targeted killings, including by drone strikes. It should be possible to arrive at a series of principles that would be permissive but also restrictive enough to have meaning.
- We need to press our governments to break the logjam on regulation of beyond line of sight drone operations. We risk falling behind other countries with more permissive regulatory regimes.
- More work is needed urgently on the protection of privacy from data collected by drones and on cyber security. These are hard challenges.
- Governments, companies and society must understand the potential scale of change to the economy and the labour market from combinations of the technologies now accumulating, of which drone technology is an important and connected subset. The UK and other western states should consider declaring a national effort to retrain the nation for the technological revolution now upon us. If we act early, we can increase our capacity to adapt and help our people to find new and satisfying roles as their old work becomes untenable. This effort should address the needs of people at all stages of their working lives, from early education to those nearing retirement. Encouraging science and engineering studies is not nearly enough.
FULL REPORT
CONTEXT AND WHY IT MATTERS
The drone revolution is gaining pace. In warfare, President Obama made drone strikes his weapon of choice in the war on terror and there is no reason to expect President Trump to be more reticent. In commerce, early stage drone companies are proliferating at pace on land, at sea and in the air. China dominates the hobbyist and commercial market and there is a risk that the West will be left behind. Leaps forward in AI, particularly on driverless vehicles, are making it urgent to consider the new world of autonomous machines and their implications. The technology is developing much faster than the legal, regulatory and ethical framework for its use.
PEOPLE
General Sir Richard Barrons, an advocate for the radical transformation of the British Armed Forces to meet the challenges of the fourth industrial revolution, chaired a group that included a renowned historian of strategy; prominent technology leaders; AI and robotics experts; a CEO of a major military and civilian engineering company; senior government lawyers from the UK and the US; serving and retired military commanders; academic researchers on warfare; drone warfare and surveillance practitioners, including a former Reaper pilot; and the founder of a drone AI start-up.
WHERE WE ARE NOW
The consensus, although some people had doubts that all the talk would turn into walk, was that the drone revolution will have a profound and broad impact on our societies in many areas, from warfare to smart cities. Drones may become as ubiquitous a technology as smartphones. We found it difficult to separate the significance of drones from parallel developments in AI, robotics and autonomous cars. Whether drones are “the flying phone”, a “moving platform for sensors” or simply “robots”, their potential applications in the economy are endless. We distinguished drones principally from other technology by their mobility and considered their implications at sea and on land as well as in the air. It is possible that the changes in the economy could be more gradual but this appears the less likely trajectory. We should prepare for major disruption with many forms of work transformed. New jobs will emerge but those of a good quality, offering decent pay and identity, will demand different and higher level skills.
WARFARE
One of the most profound disagreements in the group was on the question of autonomy for lethal drones. Some people felt this was plain bad, unnecessary and likely to lead to disaster. There would be negative implications not only for the mitigation of the evils of war but also for acceptance of the technological revolution. Citizens could be turned away from the opportunities of technology by the risks from its dark side. For sceptics, the drive towards autonomy was coming from the high cost in terms of manning for drone operations, which had proved less efficient than they first appeared. From this perspective, it was imperative to put intent and principles first and to work internationally to limit the development of autonomous weapons, just as we had done to some effect with nuclear, biological and chemical weapons and landmines through international treaties.
Others saw high degrees of autonomy in weaponised drones as inevitable in a world where technological innovation was driven by corporations rather than by more easily controlled government labs. Autonomy was not automatically bad if embedded in a responsible chain of command that considered ethical and legal issues carefully. Autonomous drones could be just another tool. They would be used largely sensibly by rule of law nations, but the scope for their development and employment by states – and indeed by non-state actors – with different laws and ethics must be reckoned with.
Comparisons were made with the decision by responsible scientists to press for the development of nuclear weapons. It was obvious to all that nuclear weapons were a bad idea for humanity. The only thing that would have been worse for humanity would have been to allow Nazi Germany to develop nuclear weapons first. It was likely that the same logic would apply to autonomous weapons. Such weapons would bring decisive advantage on the battlefield, winning a “digital High Noon” by making decisions faster. States would be compelled to adopt them in self-defence. It would be a question largely of adoption rather than invention, because autonomous technology would be widespread across the civilian economy. Autonomous machines in warfare would remove people from risk, likely lower manpower costs, and potentially be more operationally effective and cheaper than manned systems. But there would always be some people in or on the loop.
A distinction was drawn between the use of autonomous drones in wars of discretion, where we have the luxury of choosing which weapons are appropriate, and preparation for wars of necessity and survival. Such terrible wars were best avoided through deterrence, by holding weapons systems at least as powerful as those of potential aggressors. If we ended up fighting such wars then such weapons would be essential, it was argued.
Weaponised drones naturally draw attention but in the future, as now, we expected most drones to act in supporting surveillance roles on the battlefield. We also expected drones to have significant roles in logistics but we were not able to explore this adequately in the time available.
AMBIGUOUS WARFARE
Another area of contention was the use of drone technology to target individuals in areas where war has not been formally declared and particularly where there is a host government of some description that has not given its consent. This was a debate about the legality, proportionality and effectiveness of targeted killing as an instrument of policy, as much as about drone technology.
There were persistent concerns across the group that, whilst effective in the short term in heading off terrorist threats, drone strikes offered a seductive option for politicians. They could feel that they were taking decisive action, whilst avoiding the risks and dangers of more fundamental approaches. This was not necessarily good, even from the perspective of realpolitik. The relatively low political cost of using drones might be allowing politicians too easily to defer proper consideration and pursuit of harder options. These other options might be based on a broader range of intelligence and security work to contain the threat; development efforts to change the situation on the ground; or even the real necessity of full scale military action with boots on the ground.
This said, it was acknowledged that drone strikes resulted in very low collateral casualties in comparison to almost any other form of warfare. The exception was the deployment of Special Forces on the ground, which carried a higher risk of military casualties in some settings. A low number of civilian casualties had been achieved in the US use of drones by increasing expertise and restraint. US policy was not to authorise an attack unless the target had been definitively identified and there was near certainty that civilian casualties would be minimal. It was noted that the Trump administration might seek to loosen these constraints and all present agreed that this would be unwise.
For practitioners and for the people in areas targeted on the ground, drone strikes carry significant psychological charge. Practitioners described feeling an intimate connection to the areas targeted, created by the extended period of surveillance and study and the detailed imagery. There was probably less sense of distance and detachment than in aircraft. For those on the ground below, emotions might range from admiration for the power of the technology, through to fear, a sense of powerlessness and then anger. Simple generalisations were misleading.
SECURITY AND POLICING
Police and security forces would certainly want drones for surveillance and self-protection. We should expect to see at least sections of the police supported by drones over the next few years. There was discussion on how this might affect relations between the police and the communities they serve. It is essential that routine drone surveillance is not imposed on communities that already feel marginalised. As well as just cause – a high rate of crime in an area – the community should be involved in the decision to deploy drone surveillance and largely support it. Drones might increasingly replace fixed security infrastructure such as CCTV cameras and fences.
The deployment of drones carries great risks in terms of casual and collateral intrusions on privacy. Although, at least in the UK, CCTV is widely accepted, the mobility of drones marks a new step in the erosion of privacy. Drones employed on public duties (i.e. not undercover) would need to be clearly identifiable to the public, and the public should be able to find out information on the drones’ capabilities and the data they gather.
Appropriate use of the data collected by such drones would need to be assured. The data should only be used to support investigation of serious crimes (to be defined) and not for pursuit of minor infractions – "we see from our drone coverage that you have consistently parked in a restricted zone". This line would need to be strictly maintained with a strong right of challenge. There was a real risk of “mission creep”, or of algorithmic bias against particular communities.
Undercover use of drone surveillance by police was in a sense more straightforward. Undercover surveillance is by definition intrusive and needs appropriate authorisation and oversight mechanisms in a democratic state. Drone technology will be no different to existing invasive surveillance techniques in this regard.
Different countries and cultures will have different concerns about aspects of police use of surveillance and will have different expectations of privacy.
ARREST BY DRONE
There was a sharp division between those ready to see drones take action to detain or incapacitate suspects and those who felt this crossed an important line. The division was even sharper on lethal action.
All agreed that any autonomous action, either to incapacitate a suspect or even more so to kill her, was wrong in a domestic and civil context. The final decision had to be taken by a human being. Even those prepared to countenance lethal action by a drone in a domestic context expected this only to be appropriate in the most extreme circumstances, for example a prolonged hostage siege where it was judged hostages’ lives were at immediate risk.
Those willing to contemplate non-lethal action by drones in support of the police imagined them delivering nets, tranquillisers or taser strikes to incapacitate suspects. All acknowledged the risk (as has happened with tasers) of these techniques being deployed too often, or without sufficient restraint. The police would need to be challenged. On the other hand, it was argued that such capabilities might save suspects’ lives as well as those of police officers. Particularly in the US, police officers were taking split-second decisions about whether or not to open fire on the basis of self-defence. Shootings of suspects who turned out to be unarmed undermined public trust and devastated individual lives. The ability to stand back and to reduce risks might allow officers to take calmer decisions. Drone surveillance would also cover police officers and this would have a restraining effect on their behaviour.
DRONES IN THE WIDER ECONOMY
It is difficult to divorce technology from the context in which it first appears. Drones have started their life in the popular imagination as either sinister killing machines or toys. This can obscure the potential of drones as a positive transformative force. People may also miss the potentially large changes in the economy and the labour market that will come from fast maturing drone technology.
There is a risk that western liberal democratic regard for the wellbeing of the individual citizen, and the regulatory and legal frameworks that follow from this, could mean the West falls behind the developing world and more authoritarian regimes in the development of these technologies. The lack of regulation, greater availability of space and less established infrastructure in the developing world may mean greater latitude for drone trials and therefore faster development of the technology and business models. Authoritarian regimes may decide that the value to the nation of developing drone technologies outweighs the safety risks to the individual, and again this could mean that these societies develop these technologies at a faster pace. Set against this, there may be an advantage to western societies from greater personal freedom and therefore creativity and innovation.
Although China dominates the manufacturing of smaller-scale drones, this may not be where the real value in the business chain resides. Following the smartphone model, drones are likely to be platforms or constituent parts in a range of services. We cannot yet determine the full range of business models but even relatively simple line of sight drones are likely to deliver significant efficiencies in road, track, cable, pipeline, bridge, dam, canal, pipe network and building maintenance by allowing close-up inspections at much more regular intervals. In agriculture, drones will give farmers early sight of the diseases, parasites and weeds blighting crops and help them to monitor and control livestock. Delivery models may develop in time and then mature into personal transport solutions such as those already being trialled in Dubai. The drone eco-system may begin to influence urban planning.
Autonomy will allow much greater use of road and rail networks with reduced gaps between vehicles. Although navigating crowded and mixed human and machine areas of the transport network will remain a challenge for computers, fully autonomous vehicles may soon be possible on bounded portions of the system like motorways. Although autonomous vehicles and drones can be easily misled by poor data – for example the mischievous teenager in Mountain View holding up a Stop sign to a Google car – existing systems are similarly subject to easy vandalism and manipulation but this rarely happens unless order has broken down in other ways.
Land is one of the toughest environments for AI and drones to master. Progress will be much quicker at sea and in the air. The latter will depend, however, on new approaches to air traffic control. There is no prospect of the current human-led approach scaling sufficiently. To operate at high altitudes, drones would need advanced autonomy to be safe for aviation. An alternative approach might be to create networked systems where drones designed to be compatible are locked into the system and controlled within it as network traffic, rather than allowed to be free agents choosing their own course.
A licensing system for larger and more capable drones seems inevitable and necessary. The trick will be to set this at the right level to enable innovation whilst protecting people and particularly, of course, commercial aviation.
Blockchain is another multipurpose technology that could become crucial in the development of commercial drone services, allowing drones to be registered permanently to individuals and companies, to track their routes, and to control their capabilities.
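Purely as an illustrative sketch of the idea raised here, and not something discussed at the conference in this form, the short Python fragment below shows how an append-only, tamper-evident ledger might hold a drone's registration and subsequent route reports; all field names (drone_id, owner_id, capabilities, waypoints) are hypothetical.

```python
# Illustrative only: a minimal, hash-chained ledger sketch showing how a drone
# registration and its route reports might be recorded immutably.
import hashlib
import json
import time


def _hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous entry's hash, chaining them."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


class DroneLedger:
    """Append-only record of drone registrations and route reports."""

    def __init__(self):
        self.entries = []          # list of (entry, hash) pairs
        self.prev_hash = "0" * 64  # genesis value for the chain

    def append(self, entry: dict) -> str:
        """Timestamp an entry, chain it to the previous hash, and store it."""
        entry = {**entry, "timestamp": time.time()}
        h = _hash(entry, self.prev_hash)
        self.entries.append((entry, h))
        self.prev_hash = h
        return h

    def verify(self) -> bool:
        """Recompute the chain to detect tampering with any earlier entry."""
        prev = "0" * 64
        for entry, h in self.entries:
            if _hash(entry, prev) != h:
                return False
            prev = h
        return True


if __name__ == "__main__":
    ledger = DroneLedger()
    ledger.append({"type": "registration", "drone_id": "UAV-0001",
                   "owner_id": "ACME Surveys Ltd",
                   "capabilities": ["camera", "beyond-line-of-sight"]})
    ledger.append({"type": "route", "drone_id": "UAV-0001",
                   "waypoints": [(51.5, -0.1), (51.6, -0.2)]})
    print("ledger intact:", ledger.verify())
```

A real system would also need distributed consensus among operators and regulators, verified identities and an enforcement regime; the sketch only illustrates the permanent, tamper-evident record-keeping that makes the approach attractive for registration and route tracking.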
Companies will lead in driving innovation, rather than governments. At present innovation is held back by a lack of regulatory clarity. But given the potential rewards for those who establish successful business models, companies may be tempted to follow the Uber model of pushing into markets in advance of regulation and carrying the risks, in the hope of market dominance.
It is possible that autonomous cars and drone technologies will hit the insurance industry hard by reducing the number of accidents generated by human error. On the other hand, major lawsuits are also likely when cars and drones developed by companies cause deaths, injury and damage. Law firms will need to hire more people with technical knowledge.
Cyber security and protecting privacy were judged amongst the hardest challenges. Drones will be part of the Internet of Things and this will take cyber security from a data assurance issue to a health and safety issue.
PRESCRIPTIONS
More work is needed on the military's approach to autonomous weapons. Do we view their development as inevitable and how will other states develop and employ them? Is preventing this a realistic policy aim for western governments? Who is already developing autonomous weapons and can they be stopped?
The United States and the UK and other rule of law nations need to do more to explain in public the ethical and legal basis for the use of targeted killings, including by drone strikes, against individuals who pose a significant threat in areas that are otherwise hard to reach without the risk of casualties, whether of allied soldiers or of civilian bystanders. It should be possible to arrive at a series of principles that would be permissive but also restrictive enough to have meaning.
We need to press our governments to break the logjam on regulation of beyond line of sight drone operations. We risk falling behind other countries with more permissive regulatory regimes. We must strike the right balance between protecting individuals from potential harm and allowing innovation to flourish. The comparison must be with current levels of risk from human operation of machinery as opposed to comparisons with the ideal of absolute safety and zero accidents.
More work is needed urgently on the protection of privacy from data collected by drones and on cyber security. These are hard challenges.
Governments, companies and society must understand the potential scale of change to the economy and the labour market from the combinations of technologies now accumulating around AI, of which drone technology is an important and connected subset. The UK and other states should consider declaring a national effort to retrain the nation for the technological revolution now upon us. If we act early, we can increase our capacity to adapt and help our people to find new and satisfying roles as their old work becomes untenable. This effort should address the needs of people at all stages of their working lives, from early education to those nearing retirement. Encouraging science and engineering studies is not nearly enough.
This Note reflects the Director’s personal impressions of the conference. No participant is in any way committed to its content or expression.
CHAIR: General Sir Richard Barrons KCB CBE
Colonel Commandant and President, Honourable Artillery Company; Senior Visiting Fellow, LSE IDEAS; Senior Visiting Fellow, RUSI. Formerly: British Army (1977-2016): Commander, Joint Forces Command (2013-16).
BARBADOS/UK
Ms Charlie Dakin
University of Birmingham undergraduate (thesis: the interaction of just war theory and drones).
CANADA
Dr Isabelle Desmartis
Director General Policy Planning, Department of National Defence.
Mr Michael J. P. Moen
Founder/CEO, Panther Artificial Intelligence Corporation, New York and London; Defense Studies PhD candidate, King's College London. Formerly: Canadian Foreign Service; Goldman Sachs, New York; Venture Capitalist and Advisor.
CANADA/UK
Mr Rohan Nuttall
Vice-Curator (Vancouver Hub), Global Shapers Community, World Economic Forum. Formerly: Executive Director, Student Voice Initiative Canada, Ontario; Scientific Computing Analyst, TRIUMF (Canada's National Laboratory for Particle and Nuclear Physics), Vancouver.
CANADA/USA
Dr Heather Roff
Senior Research Fellow, Oxford Institute for Ethics, Law and Armed Conflict, University of Oxford; Research Scientist, Global Security Initiative, Arizona State University; National Cybersecurity Fellow, New America Foundation; Research Associate, Eisenhower Center for Space and Defense Studies, United States Air Force Academy; author, 'Lethal Autonomous Weapons and the Future of War' (forthcoming).
GERMANY
Ms Ulrike Franke
Doctoral candidate, University of Oxford; Research Assistant to the Director, European Council on Foreign Relations (2015-). Formerly: research team of UN Special Rapporteur on human rights and counter-terrorism, working on drone use in counterterrorism contexts; Research Assistant, International Institute of Strategic Studies, London.
Dr Frank Sauer
Senior Research Fellow and Lecturer, Bundeswehr University Munich; co-author, 'Multidimensional Definition of Autonomy in Military Robotics' (research funded by the German Federal Foreign Office, 2015) and 'Killer Drones: The Silver Bullet of Democratic Warfare?' (Security Dialogue, 2012); leader, 2017-18 Boell Foundation Task Force on 'Disruptive Technologies in 21st Century Warfare'; member, International Committee for Robot Arms Control; member, International Panel on the Regulation of Autonomous Weapons.
INDIA/USA
Mr Manoj Saxena
Executive Chairman, Cognitive Scale; founding Managing Director, The Entrepreneurs' Fund; Chairman, SparkCognition. Formerly: General Manager, IBM Watson; Founder, Webify (acquired by IBM, 2006); Founder, Exterprise (acquired by Commerce One, 2001).
LITHUANIA
Mr Mantas Gribulis
Founder and CEO, Accelerated Dynamics, London (2015-).
UK
Air Commodore Richard Barrow CBE RAF
Royal Air Force: Air Commodore Air Staff, Ministry of Defence (2016-). Formerly: Station Commander, RAF.
Mr Victor Chavez CBE
Chief Executive Officer, Thales UK (2011-); Board Member: EngineeringUK and techUK; Advisory Board Member, Strategy and Security Institute, University of Exeter. Formerly: Deputy Chief Executive, Thales UK (2008-11); Business Development Director, Thomson-CSF; EDS Defence Ltd.
Mr Martin Clements CMG OBE
Visiting Professor, Jill Dando Institute of Security and Crime Science, University College London.
Mr Paul Clarke
Deputy Director Aviation (Unmanned Air Systems) (2016-) and RPAS Flight Operations Manager (2014-), QinetiQ, Salisbury. Formerly: Reaper MQ-9 Pilot, Royal Air Force, USA (2007-11); C-130 Hercules Pilot, Royal Air Force, Lyneham (1997-07).
Mr Chris Cole
Founder, Drone Wars UK; author, 'Convenient Killing: Armed Drones and the Playstation Mentality' (2010) and 'Drone Wars: Out of Sight, Out of Mind, Out of Control' (2016); Trustee, Trust for Research and Education on Arms Trade (TREAT) and Pax Christi.
Professor Nick Colosimo PgC BSc(Hons) CEng MIET FIKE
BAE Systems plc, London (1990-): Executive Strategy and Planning Manager - Future Capabilities (2016-); Technologist - Disruptive Capabilities (2016-); BAE Systems Global Engineering Fellow (2015-); Visiting Professor (Aerospace, Transport, Manufacturing), Cranfield University (2016-).
The Rt Hon. Professor Sir Lawrence Freedman KCMG CBE FBA FKC
Emeritus Professor of War Studies, King's College London. Formerly: Vice-Principal, King's College London. A Governor of the Ditchley Foundation.
The Rt Hon. Dominic Grieve QC MP
Member of Parliament (Conservative) for Beaconsfield (1997-); Chairman, Intelligence and Security Committee (2015-); member, Joint Committee on the National Security Strategy. Formerly: Attorney General (2010-14); Shadow Secretary of State (Justice) (2009-10); Shadow Secretary of State (Home Office) (2008-09). A Member of the Council of Management and a Governor of The Ditchley Foundation.
Mr Paul Hutton
Chief Executive Officer, Cranfield Aerospace Solutions Limited (2015-). Formerly: Chief Commercial Officer, Vocality (2014-15); Director Public, CSC UK (2011-14); Deputy CEO, Future For Youth Foundation (2012-14); Chief Operating Officer, CSC Computer Sciences (2008-11); Head of Programmes, Thales - Aerospace Division (2003-07); Managing Director, Naval Satcom Business UK, Astrium (2001-03).
Mr Will Jessett CBE
Director, Strategic Planning, Ministry of Defence. Formerly: Minister Defence Materiel, British Embassy, Washington, DC (2010-14).
Mr Peter Jones
HM Diplomatic Service: Director, Defence and International Security, Foreign and Commonwealth Office (2014-). Formerly: High Commissioner in Ghana (2011-14); Director, Migration (2009-11).
Mr Jonathan Ledgard
Founder, Rossums studio; author, 'Giraffe', 'Submergence' (film adaptation by Wim Wenders to be released in 2017), and 'Terra Firma'. Formerly: foreign correspondent and Africa correspondent, The Economist; Director, future Africa, Federal Polytechnic School of Lausanne, Switzerland (worked on pioneering droneports in Africa).
Mr Paul Newman
Director, NEX Group (2016-); Non-Executive Director, J. C. Rathbone Associates Ltd (2008-); Freeman of the City of London. Formerly: Chairman, ICAP Energy (2014-16); Managing Director, ICAP Energy Ltd (1990-2014); Prince's Council, The Prince's Charities (2009-12); Non-Executive Chairman, ICAP Shipping Ltd (2007-11); School Governor, City of Westminster (2001-03); MC Fellow, St Antony's College, University of Oxford (1989-90); Trustee Director, Epilepsy Research UK; Patron, Barts Hospital Appeal. A Governor and Member of the Council of Management, and a Member of the Finance and General Purposes Committee of The Ditchley Foundation.
Mr Mark Phelps OBE MA MSc RAF
Directorate of Legal Services, Royal Air Force; Chief of the Air Staff Fellow, PhD Candidate, Defence Studies, King's College London.
Mrs Elizabeth Quintana
Royal United Services Institute, London: Senior Research Fellow, Futures and Technology. Formerly: Head of Military Sciences team; Senior Research Fellow for Air Power & Technology; Programme Head, Acquisition; Programme Head, Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR); graduate assistant (worked on DARPA project investigating collaborative robotics), Texas A&M University (2001).
Air Marshal G A 'Black' Robertson CBE BA FRAeS FRSA
Managing Director, Blackbourne Wells Ltd; Clerk and Company Secretary to The Honourable Company of Gloucestershire (2013-); Military Adviser to General Atomics Aeronautical Systems, Inc. (2006-). Formerly: Senior Military Adviser, BAE Systems; Royal Air Force (1966-98), latterly as Chief of Staff and Deputy Commander-in-Chief, Strike Command.
Dr Christopher Wyatt
Research Fellow (researching Unmanned Aerial Vehicles in Afghanistan and Yemen and working on aspects of psychology, conflict and conflict resolution), University of Birmingham.
Professor Jeremy Wyatt
Professor of Robotics & Artificial Intelligence, University of Birmingham. Formerly: Co-Director, Centre for Computational Neuroscience & Cognitive Robotics (2010-16); Leverhulme Fellow (2006-08).
UN/FRANCE/USA
Ms Kerstin Vignard
Deputy to the Director/Chief of Operations, United Nations Institute for Disarmament Research; Consultant to UN Groups of Governmental Experts on Developments in the field of Information and Telecommunications in the Context of International Security; Institutional lead on Emerging Security Issues; founder and editor in chief, Disarmament Forum (1999-2012).
USA
The Hon. John Bellinger III
Partner, National Security and Public International Law practices, Arnold & Porter Kaye Scholer LLP, Washington, DC; Adjunct Senior Fellow in International and National Security Law, Council on Foreign Relations; Member, Secretary of State's Advisory Committee on International Law; Member, Department of Defense Legal Policy Board; Member, Permanent Court of Arbitration, The Hague. Formerly: Legal Adviser to the U.S. Department of State, Washington, DC (2005-09); Senior Associate Counsel to the President and Legal Adviser to the National Security Council (2001-05). A Member of the Board of Directors, The American Ditchley Foundation.
Ms Christen Carpenter
Fulbright Scholar, London School of Economics; commissioned Surface Warfare Officer, U.S. Navy.
Dr Alfred Z. Spector
Chief Technology Officer, Two Sigma Investments LP, New York; Council, American Academy of Arts and Sciences. Formerly: Vice President of Research and Special Initiatives, Google, Inc. (2008-15); Vice President of Strategy and Technology, IBM's Software Business; Vice President of Services and Software Research, IBM; Founder and CEO, Transarc Corporation.
Professor Matthew C. Waxman
Liviu Librescu Professor of Law and Faculty Chair, Roger Hertog Program on Law and National Security, Columbia Law School; member and Adjunct Senior Fellow for Law and Foreign Policy, Council on Foreign Relations; co-Chair, Cybersecurity Center, Columbia Data Science Institute. Formerly: U.S. Department of State; U.S. Department of Defense; National Security Council.
ZIMBABWE
Mr Prince Abudu
DPhil Candidate in Machine Learning for Embedded Systems, Rhodes Scholar, Balliol College, University of Oxford; Operations Leader, Emergination Africa (2013-).
COUNTRY NOT SPECIFIED
Dr Peter Eckersley
Chief Computer Scientist, Electronic Frontier Foundation.