When we think about AI in the context of international conflict, the most immediate associations are probably lethal autonomous weapons systems (LAWS) on the battlefield. Based on representations in popular culture, such as the Terminator movies, the image of the “killer robot” is easy to conjure up. Yet by focusing on this aspect of AI in relation to international conflict, we are looking at only one side of the coin. On the other side of that coin, however, lie rewarding questions about AI’s possible contribution to peace, as opposed to war. Or to put it more directly: AI “will not only wage future wars but also future peace.”
This is not to suggest that one ought to neglect critical discussions around LAWS in favor of “looking at the bright side” of AI. The concerns and dangers are all too real and the ethical questions raised by LAWS need our attention. Nevertheless, by looking at AI’s potential to contribute to the peaceful resolution of conflicts, we open up important discussions on the future of conflict resolution and the development of digital support tools for mediation. In the process, we also get to take a closer look at what makes a fundamentally human process, such as conflict resolution, so unique.
Mediation and the Role of Digital Tools
Mediation, the peaceful resolution of international conflicts through an impartial third party, plays a crucial role in reducing conflict and the suffering it brings. The UN Guidance for Effective Mediation defines it as the “process whereby a third party assists two or more parties … to prevent, manage or resolve a conflict by helping them to develop mutually acceptable agreements.” Mediation is always voluntary and requires the consent of the parties involved. It can involve high-profile diplomats as mediators, such as former US Assistant Secretary of State Chester Crocker in Namibia in the 1980s or former UN Secretary-General Kofi Annan in Syria in 2012. Most mediation efforts, however, take place away from the limelight and are also undertaken by non-governmental organizations such as the Centre for Humanitarian Dialogue or Swisspeace. Regardless of whether states, international organizations, or other entities and individuals act as mediators, mediation is a fundamentally human activity, built on interpersonal relationships and trust.
This, however, is not to say that digital technology cannot play a role in supporting mediation processes. A number of recent papers have explored the role of digital technology in peace mediation. Publications from the UN Department of Political and Peacebuilding Affairs (UN DPPA) and the Centre for Humanitarian Dialogue, from UN DPPA and UN Global Pulse, and from DiploFoundation explore various applications, including social media and social media sentiment analysis, geographic information systems (GIS), data analytics and visualization, machine learning, and virtual reality. In 2018, a number of Swiss-based organizations created the Cybermediation Initiative to further explore the role of digital tools in mediation. The UN Mediation Support Unit offers a toolkit on Digital Tools and Mediation, and the Innovation Cell at UN DPPA is piloting digital technology in the context of conflict resolution and contributing creative thinking on emerging technologies. Overall, the most sustained and practical explorations focus on the role of social media and purpose-built platforms in making mediation processes more inclusive. But there is also a role for AI in supporting mediation.
AI at the Negotiation Table?
To conduct a meaningful discussion of the role of AI in conflict mediation, it is necessary to unpack the term further and look at specific areas of AI and their actual and potential applications. Natural language processing (NLP), the component of AI that lends a computer program the ability to understand human language as it is spoken and written, offers some pertinent examples of AI’s potential in conflict mediation. The following is a brief description of one such example.
Practitioners argue that successful mediation processes are inclusive and take into account the opinions and concerns of all those affected, not just those sitting at the negotiation table. A number of AI applications have been suggested and developed to create this broader understanding and to enhance inclusivity. UN Global Pulse, for example, developed and implemented a pilot project in Uganda that used “AI technology to analyze large amounts of information from public radio broadcasts,” allowing researchers to get a sense of public conversations on various issues, including refugees crossing the border from South Sudan, the impact of local natural disasters, and the quality of health services. AI played an important role because public radio is a key means of communication in Uganda, yet due to the sheer number of stations and the variety of languages, it is hard to include this source of information in mediation efforts. The pilot project used various AI applications to filter content, transform speech to text, and analyze the text. Other examples in this area include the use of sentiment analysis by the Middle East Division (MED) of UN DPPA to better understand public opinion in the region, and a project by the UN DPPA Innovation Cell that conducted a large-scale virtual conversation with citizens in Yemen in 2020. It is important to emphasize that these and other examples in the area of AI and mediation are all pilot or “proof of concept” (POC) projects. As of yet, there is no routine or sustained inclusion of these tools in mediation efforts. Examples such as these, however, point towards the potential of this type of AI application.
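To make the text-analysis stage of such a pipeline concrete, here is a deliberately simplified sketch in Python: filtering transcribed radio snippets by topic keywords and scoring each with a naive lexicon-based sentiment tally. The keyword lists and snippets are invented for illustration; real systems such as the UN Global Pulse pilot use trained speech-to-text and NLP models, not hand-written word lists.

```python
# Toy illustration of the "filter content, then analyze the text" stage.
# All keyword sets and example snippets are hypothetical.

TOPIC_KEYWORDS = {"refugees", "border", "flood", "clinics", "health"}
POSITIVE = {"welcome", "improved", "helped", "safe"}
NEGATIVE = {"shortage", "overcrowded", "destroyed", "unsafe"}

def filter_by_topic(snippets):
    """Keep only snippets that mention at least one topic keyword."""
    return [s for s in snippets
            if TOPIC_KEYWORDS & set(s.lower().split())]

def sentiment_score(snippet):
    """Positive minus negative keyword count; > 0 leans positive."""
    words = set(snippet.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

snippets = [
    "Local clinics report a shortage of medicine",
    "The market reopened today",
    "Volunteers helped refugees at the border crossing",
]

relevant = filter_by_topic(snippets)
for s in relevant:
    print(sentiment_score(s), s)
# → -1 Local clinics report a shortage of medicine
# →  1 Volunteers helped refugees at the border crossing
```

Even this toy version shows why the approach scales: once broadcasts are transcribed, the same filtering and scoring runs over thousands of snippets across many stations, something no human team could do continuously.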
Broadly speaking, areas of application of AI tools in the field of mediation include: knowledge management and background research (e.g., smart searches); generating a sound understanding of the conflict and the parties to it (e.g., big data analysis); making mediation processes more inclusive (e.g., NLP, including sentiment analysis); and monitoring and implementation (e.g., automated analysis of satellite images and other GIS data). In each of these areas, AI systems can serve as useful tools that facilitate – yet never replace – the work of mediators in the field.
Whom Are You Going to Trust?
As always, the application of these tools raises practical and ethical questions. Considerations regarding bias, transparency, and explainability apply to AI tools in the context of mediation just as they apply in other contexts, with potentially far-reaching implications. On the practical side, mediators, their teams, and organizations active in the field of mediation need to think about potential collaborations with the tech sector and/or about recruiting personnel with the relevant skill set.
Beyond these points, one question is specifically relevant to conflict mediation: How does the introduction of these tools affect mediation as a social and political process? As mentioned above, trust is often described as a key ingredient of successful mediation, and the impact of AI tools on trust in this context is an important issue to investigate. Mediators will likely need to take additional steps to establish trust in these tools. On the other hand, it is conceivable that joint involvement in the development and implementation of AI tools used for monitoring and implementation might itself lead to a working trust between the conflict parties.
While there is no “algorithm for peace,” AI tools have a role to play in the future of the mediation and resolution – as opposed to the creation and exacerbation – of conflicts. Mediation will remain a fundamentally human endeavor, but in the future, it might get a helping hand from AI.
This Commentary is published as part of the German-Israeli Tech Policy Dialog Platform, a collaboration between the Israel Public Policy Institute (IPPI) and the Heinrich Böll Foundation.
The opinions expressed in this text are solely that of the author/s and do not necessarily reflect the views of the Israel Public Policy Institute (IPPI) and/or the Heinrich Böll Foundation.