In his 1979 address to the United Nations General Assembly, Pope John Paul II emphasised that “research and technology must always be at the service of man”. Forty-five years later, Artificial Intelligence (AI) is rapidly shaping the future of work, society and home life; increasingly, it is also being armed and sent into battle.
AI is used in every aspect of defence and warfare, from cyber defence and predictive maintenance to surveillance and reconnaissance. As Catholics, how should we read the signs of the times and interpret the latest technological developments in weaponry? Should we welcome them? Should we be worried?
Turning our eyes to history, we can see that this is a story as old as time: people have continually researched, developed and deployed new technology to wage war. We have also witnessed the consequences when ethics are relegated to an afterthought: the devastating impacts of chemical weapons in World War One; the horrifying use of nuclear weapons on civilians in World War Two; and the enduring menace of landmines that are still killing civilians in Laos and Vietnam today, decades after they were first laid. In this quickly emerging world of AI, how do we ensure that ethics are not relegated to an afterthought?
Pope Francis, reflecting on the rapid spread of AI, stated that “[Its] workings and potential are beyond the ability of most of us to understand and appreciate, [it] has proven both exciting and disorienting”. Here, he is recognising that humanity, equipped with the tools of AI, has an immense power and potential, which we do not yet fully understand, to do both good and evil.
Today, the International Affairs Department of the Bishops’ Conference released <em>Called to be Peacemakers</em>, a document that collates Catholic social teaching in relation to weaponry and arms control, situating it within today’s context. Drawing from centuries of writings and teachings, it sets out thirteen practical action points in relation to nuclear and conventional weapons. It also includes a call to place emerging technology such as AI at the service of humanity, ensuring that we learn the lessons of history.
Lethal autonomous weapons systems (LAWS) are not easily defined but can be understood as weapon systems that can independently identify and attack targets without direct human intervention. The Church is resolute in its opposition to the development and use of such weapons systems. Despite this, we have learned that LAWS are being used on battlefields across the globe. In fact, in 2021, the United Nations released a deeply worrying report (<a href="https://undocs.org/Home/Mobile?FinalSymbol=S%2F2021%2F229&Language=E&DeviceType=Desktop&LangRequested=False">S/2021/229</a>) which describes the Haftar Affiliated Forces in Libya being "hunted down and remotely engaged by the unmanned combat aerial vehicles or lethal autonomous weapons systems" whilst retreating. The report went on to explain that the weapons systems used were programmed to attack targets without requiring data connectivity between the human operator and the weapon; in effect, a “fire-and-forget” capability.
A weapon system can never be a morally responsible subject, because it cannot truly think, feel, decide or be accountable for its actions. This raises serious questions about the "accountability gap": how can a weapon system be held accountable? Who is to blame for a robot that commits war crimes? Who should be placed on trial? The weapon? The soldier? The commander? The coder? The corporation that made the weapon?
The UK government and the wider international community need to agree and ratify a new legally binding framework that ensures adequate, meaningful and consistent human supervision. This would mean that all weapons systems must be managed by a human operator to ensure compliance with international law and wider moral responsibilities. It would also mean that each weapon system has human input at every stage of research, development and use. A weapon system should never have the capacity to contradict what the human operator has prescribed.
The Church is not alone in this view. The House of Lords Select Committee on AI in Weapons released <em>Proceed with Caution: Artificial Intelligence in Weapon Systems</em>, a report which set out a series of recommendations to the UK government. One of the recommendations urged the government to commit to integrating meaningful human control into all AI-enabled automated weapon systems to ensure human accountability. It also called for an effective international instrument on LAWS which retains human moral agency, urging the UK government to be at the forefront of such efforts.
Until such a treaty is agreed, we need to press pause on the research and development of such weapons, so we don’t sleepwalk into yet another violation of human dignity. Pope Francis wrote in <em>Laudato Si’</em> that “Science and technology [is] not neutral”. It follows that research and emerging technology must be channelled towards serving the common good of humanity, for example by finding uses of AI that help combat the biggest challenges of our times, such as the climate crisis. AI is already being used to analyse datasets to minimise emissions from energy generators, which are often fired up when energy demand outstrips supply.<a href="#_ftn1" id="_ftnref1">[1]</a>
Finally, <em>Called to be Peacemakers</em> reminds each of us that our response to emerging technological advances in weapon systems must be situated within wider disarmament efforts. Each of us has a role to play, and importantly a moral imperative, to ensure that the new technologies of today are placed at the service of humanity. As Pope Francis reminds us: “Where progress, ethics and society meet […] faith, in its perennial relevance, can provide a valuable contribution”. This is especially true in the field of emerging technology and weaponry.
<a id="_ftn1" href="#_ftnref1">[1]</a> LSE, <em>What opportunities and risks does AI present for climate action? </em>4 July 2023 [<a href="http://www.lse.ac.uk/granthaminstitute/explainers/what-opportunities-and-risks-does-ai-present-for-climate-action/">www.lse.ac.uk/granthaminstitute/explainers/what-opportunities-and-risks-does-ai-present-for-climate-action/</a>] <br><em><br>Photo: Image of a drone in flight. (Credit: sommersby; iStock by Getty Images.)</em>