Intelligence or an illusion?
- Artificial intelligence encompasses a broad range of technologies capable of solving problems, making decisions, and "learning" tasks typically requiring human intelligence.
Intelligence or illusion?
- However, critics argue that AI is "neither artificial nor intelligent," pointing to the significant human labor and natural resources needed to create and sustain these systems.
AI and PeaceTech
- Despite debates surrounding its definition, "AI" is the term widely used to describe various tools developed by “PeaceTech Lab” companies aimed at mitigating violent conflict. Dedicated funds for AI initiatives have been established, and the United Nations is actively advocating AI as a driver of innovation in peacebuilding efforts.
AI supporting peace negotiations
- Certain AI tools have been specifically designed to address unique challenges faced by peace negotiators. For example, they help gather and analyze information on public perspectives, providing crucial insights that can inform negotiation strategies and foster more inclusive dialogue.
AI shaping policies and predicting behavior
- Many AI tools are multi-functional and not only recommend policies but also provide predictive insights into human behavior, enabling negotiators and policymakers to anticipate outcomes and make informed decisions in complex situations.
Empowering voices
- In Libya and Yemen, the UN has employed natural language processing (NLP) tools to enable broader participation in political discussions. NLP technologies are designed to classify, translate, transcribe, summarize, and generate text, offering innovative ways to bridge communication gaps in complex environments.
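The NLP systems described above classify, translate, and summarize citizen input. As a purely illustrative sketch (not the UN's actual tooling; the topics, keywords, and submissions are invented for demonstration), routing submissions to discussion topics might look like this:

```python
# Toy illustration of NLP-style classification of citizen submissions.
# A real deployment would use trained models; simple keyword rules
# stand in for a classifier here.

def classify_topic(text: str) -> str:
    """Assign a submission to a coarse discussion topic."""
    topics = {
        "security": ["ceasefire", "weapons", "militia"],
        "economy": ["jobs", "prices", "fuel"],
        "governance": ["elections", "constitution", "corruption"],
    }
    lowered = text.lower()
    for topic, keywords in topics.items():
        if any(word in lowered for word in keywords):
            return topic
    return "other"

submissions = [
    "We need a lasting ceasefire before anything else.",
    "Fuel prices make daily life impossible.",
]
print([classify_topic(s) for s in submissions])  # ['security', 'economy']
```

A production pipeline would replace the keyword rules with a trained classifier and add the translation and summarization stages the caption mentions; the routing structure stays the same.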
Gathering insights from public opinions
- Large language models (LLMs) have been employed to analyze data gathered from individuals sharing their opinions and asking questions online. These tools process vast amounts of information, helping to identify trends and extract valuable insights to inform decision-making.
Using AI for transparency and inclusion
- The goal of employing AI technologies in peacebuilding initiatives is to identify areas of agreement and disagreement across diverse groups. Experts argue that such approaches are instrumental in preventing the outbreak or recurrence of wars.
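To make the idea of mapping agreement and disagreement concrete, here is a minimal sketch (the group names and positions are entirely hypothetical) that treats each group's stated positions as a set and computes the shared versus contested ones:

```python
# Hypothetical sketch: finding common ground between groups' positions
# using plain set operations. All names and positions are invented.

def common_ground(positions: dict[str, set[str]]) -> set[str]:
    """Positions endorsed by every group."""
    groups = list(positions.values())
    result = set(groups[0])
    for g in groups[1:]:
        result &= g
    return result

def contested(positions: dict[str, set[str]]) -> set[str]:
    """Positions endorsed by some groups but not all."""
    all_positions = set().union(*positions.values())
    return all_positions - common_ground(positions)

positions = {
    "group_a": {"ceasefire", "aid corridors", "federal system"},
    "group_b": {"ceasefire", "aid corridors", "unitary state"},
}
print(sorted(common_ground(positions)))  # ['aid corridors', 'ceasefire']
```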
Using AI for Sudan’s peacebuilding efforts
- Akord.ai, an innovative chatbot developed by the NGO Conflict Dynamics International (CDI), is trained on 1,500 documents centered on Sudan, including past peace agreements. This large language model aims to navigate and address the complexities of the nation’s ongoing civil war.
Using AI for Sudan’s peacebuilding efforts
- Azza M. Ahmed, a senior advisor for CDI’s Sudan Program, highlighted Akord.ai’s mission to empower young individuals aspiring to contribute to Sudan’s peacebuilding. The tool addresses critical barriers, offering access to knowledge about past processes and practical guidance on negotiations.
Using AI for Sudan’s peacebuilding efforts
- Tarig Hilal, AI innovation lead at Akord.ai, emphasized the imbalance in negotiations, where knowledge and expertise often remain concentrated among a select few. Describing the tool’s purpose, he said, “Akord.ai is like an advisor, a co-pilot, a friend.”
Political accommodation
- The recommendations provided by Akord.ai are rooted in the worldview of its creator, Conflict Dynamics International (CDI). The organization advocates for “political accommodation,” a conflict resolution approach emphasizing power sharing and compromise.
A divisive approach to peace
- While political accommodation has its advocates, it also faces criticism. Jonas Horner, an expert on Sudanese peace negotiations, argued that “groups fighting their way to the table to be ‘accommodated’” was a factor that contributed to the conflict itself.
Reducing barriers, raising risks
- While Akord.ai is currently focused on reducing barriers to accessing information, the potential use of various chatbots in designing peace agreements raises significant concerns. Poorly curated or biased data could have far-reaching and serious consequences.
The risks of AI
- Timnit Gebru, founder and executive director of the Distributed AI Research Institute, highlighted a critical limitation of large language models (LLMs). She explained, “LLMs spit out patterns of text that we’ve trained them on, but they also make stuff up.”
Automation bias
- Gebru emphasized the issue of “automation bias,” where people place excessive trust in automated tools. “Studies show people trust these systems too much and will make very consequential decisions based on them,” she explained.
Using AI to pinpoint opportunities for peace
- Large language models are increasingly being utilized to guide the timing of peace agreements. One example is Project Didi, an innovative Israeli startup that aims to identify “moments of ripeness”: instances when peace deals may appear more acceptable despite minimal changes to their terms.
Inspired by peace
- Project Didi originated as an LLM trained on linguistic patterns from the period preceding the Good Friday Agreement, which reduced active conflict in Northern Ireland more than 25 years ago. That precedent provides the historical framework for the startup’s efforts to identify favorable moments for peace deals.
Adapting AI for Gaza conflict
- Project Didi’s CEO and founder Shawn Guttman and his team are tailoring the model for application in Israel’s war on Gaza. By gathering data from both Israeli and Palestinian news outlets, Project Didi uses machine learning techniques to analyze changes in public sentiment regarding peace.
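Project Didi's actual methods are not public in detail, but the general approach of tracking shifts in sentiment across news coverage can be sketched with a toy lexicon score and a rolling average (the word lists and headlines below are invented for illustration):

```python
# Illustrative sketch only: a crude lexicon-based sentiment score for
# headlines, smoothed with a rolling mean to show a trend over time.

POSITIVE = {"ceasefire", "talks", "agreement", "hope"}
NEGATIVE = {"strike", "escalation", "casualties"}

def score(headline: str) -> int:
    """Count positive minus negative lexicon hits in a headline."""
    words = set(headline.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def rolling_mean(scores: list[int], window: int = 3) -> list[float]:
    """Smooth the score series with a trailing window average."""
    return [
        sum(scores[max(0, i - window + 1): i + 1])
        / len(scores[max(0, i - window + 1): i + 1])
        for i in range(len(scores))
    ]

headlines = [
    "escalation after strike on city",
    "casualties reported overnight",
    "talks resume amid hope for ceasefire",
]
print(rolling_mean([score(h) for h in headlines]))  # [-2.0, -1.5, 0.0]
```

A rising trend in such a series is, roughly, what a "moment of ripeness" detector would look for; real systems would use trained models over far larger corpora rather than a hand-built lexicon.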
Challenges of ripeness
- Similar to political accommodation, the concept of "moments of ripeness" faces its share of criticism. Experts argue that these moments often coincide with local populations reaching a state of exhaustion, resulting in peace deals that fail to address the underlying causes of violence.
From reactive to proactive
- Guttman envisions Project Didi as a proactive peacebuilding tool, one that strives to create moments of ripeness rather than simply identify them, by equipping peace activists with detailed insight into whether they are effectively influencing “human hearts and minds.”
The limits of AI empathy
- Scientific evidence suggests that LLMs struggle to accurately interpret human emotions, which are often expressed in nuanced and ambiguous ways. By contrast, LLMs excel at decoding clear, structured inputs, such as unambiguous programming languages.
A question of ethics and accuracy
- Gebru is critical of the claims that LLMs and NLP algorithms can interpret human thoughts or emotions. She pointed out that even the accurate detection of emotional markers doesn't equate to understanding someone’s emotional state, and emphasized that even if such capabilities were possible, their use would raise profound ethical concerns.
AI and simplistic solutions
- AI providers often stress that their tools are not infallible and should not be seen as universal solutions. However, these products capitalize on the public's enthusiasm for AI by presenting simple, seemingly scientific solutions to deeply intricate political issues.
A supplement, not a solution
- While political problems often present technical challenges where AI tools can provide assistance, the root causes of war lie in power dynamics and political struggles. For AI to play a meaningful role in peacebuilding, it must be developed transparently and ethically, serving as a complement to human and political efforts.
Enhancing inclusion and strategy
- Experts highlight the potential of AI tools like Akord.ai to improve access to information and foster inclusivity in peace processes. Similarly, media analysis platforms such as Didi offer valuable insights into communication strategies and the optimal timing for peace talks.
The complexities of AI in peacebuilding
- Both computer scientists and conflict negotiators express concerns about the underlying assumptions embedded in LLMs. They caution against the limitations of this technology and the potential dangers of revisiting unsuccessful approaches to peace.
The complexities of AI in peacebuilding
- The dangers posed by predictive technologies and AI systems that recommend solutions are especially acute in the current context. With 56 armed conflicts occurring worldwide, the highest number since WWII, the consequences of flawed assumptions or errors in these tools are magnified.
Sources: (Conflict Dynamics International) (The Bureau of Investigative Journalism) (Medium)
See also: Demystifying AI: a beginner's guide to understanding artificial intelligence
Can AI bring peace to the Middle East?
A new approach to solving conflicts
A quick search for “artificial intelligence” and “peace” reveals starkly opposing perspectives. On one end, skeptics caution that AI could pose existential threats to humanity. On the other, tech optimists tout its potential to solve pervasive issues like loneliness, climate change, and even violent conflict.
While polarized opinions are nothing new, interest in AI as a means of fostering peace has surged since 2022, following the release of ChatGPT by OpenAI. Today, a growing number of companies claim to have developed AI tools designed to help end wars.
What exactly are these tools and what risks accompany their application in the complex and high-stakes context of armed conflict? Click on to find out.