
What can AI do for children affected by armed conflict?

  • Julia Freedson and Yvonne Kemper
  • Feb 28, 2023
  • 3 min read


Have you thought about turning to Artificial Intelligence (AI) to improve your selfie skills or help you write an essay? These days it seems that AI is here to help us with just about anything. This got us thinking about the possible applications of AI to improve the lives of children affected by armed conflict.


Ok, AI, show us what you’ve got

Though the overall impact of AI on children in war zones is not yet well-documented, there are a bunch of ways people are using AI — or could be using AI in the future — to support children affected by armed conflict. Here are a few examples that captured our attention:


· Identifying child soldiers: AI can help child protection workers identify individuals associated with armed forces or groups who may be under 18 by using computer vision and deep learning to automatically estimate age and detect military fatigues, which could speed up the process of releasing and demobilizing children.

· Preventing online recruitment of children: AI and machine learning can be used to detect and counter the spread of disinformation that armed groups may use to facilitate child recruitment (see more on this in our previous blogs).


· Improving monitoring of child rights violations: AI can strengthen monitoring and reporting of violations in various ways. For example, AI can help predict areas where children may face increased risks of violations and identify factors that may indicate a high likelihood of violations. By automating aspects of this work, AI can allow organizations to respond more quickly to children who have endured grave violations of their rights.


· Increasing children’s access to lifesaving information: For example, AI-powered chatbots can help inform children and child protection workers about potential risks and strategies for keeping children safe. Recently, UNICEF and the government of Ukraine used a chatbot to facilitate communication between social workers and potential host families for children without parental care due to the war in Ukraine.


· Strengthening humanitarian response for children and other civilians: AI can help humanitarians gather and analyze data on the impact of war on children, allowing them to design more effective and targeted responses to protect and support children.


· Providing access to education: It is well known that many children affected by armed conflict have their education disrupted, in part due to a shortage of teachers. To address this, AI and machine learning technologies can be used to provide these children with access to quality education through tablets or other devices. While these technologies cannot fully replace human instructors, WarChild Holland’s Can’t Wait to Learn initiative has demonstrated that self-paced, autonomous learning programs can help keep children in adverse situations on track in numeracy and literacy. The initiative is currently being implemented in Sudan, Lebanon, Jordan, Chad, Bangladesh and Uganda.


Is it all good?

There are always two sides to a coin. Besides the many exciting opportunities, the use of AI technologies in armed conflict can also have negative consequences and pose risks of harm to children. Here are just two examples:


1) AI could be used to develop weapon systems that are more precise and efficient at targeting children and the places where they gather, such as schools.


2) AI can be used for digital surveillance and to limit children’s freedoms.


Now what?

So, what do we make of AI technologies? Can we use them to protect and assist children in armed conflict, and if so, how can we do so safely and without doing harm?


Who better to ask these questions about AI than the AI chatbot itself? And this is what AI said:


While [AI] has the potential to provide significant benefits, there are also many risks that need to be carefully considered. It will be important to continue monitoring the development and use of AI in relation to children in conflict zones in order to ensure that it is used in a way that maximizes its potential benefits and minimizes any potential risks.


Thanks for the warning, AI. We could not agree more.

