How to document war crimes in the digital age

Caitlin Thompson


The stories coming out of Bucha, a small city near Kyiv that had been under Russian occupation since early March, are horrific. Witnesses have described scenes of torture, summary executions and mass graves. Journalists who accompanied Ukrainian troops moving into the city have reported seeing civilian corpses along the road, some with their hands tied behind their backs, others with gunshot wounds to the head.

This adds to the growing pile of evidence that Russian troops have perpetrated war crimes in Ukraine. On April 4, President Biden called for Russian President Vladimir Putin to be brought to trial. The European Union created a Joint Investigation Team with Ukraine that will collect and process evidence that can be brought before the International Criminal Court. 

“Perpetrators of war crimes will be held accountable,” tweeted European Commission President Ursula von der Leyen on Monday.

Human rights groups like Amnesty International and Human Rights Watch are collecting evidence. And across the NGO and legal advocacy community, there’s an army of people working to authenticate videos of attacks through open-source investigation, also known as OSINT, which uses publicly available data to establish where and when videos and photos were taken.

But documenting war crimes is more complicated than pressing record on a camera or even verifying a video.

OSINT has been crucial for documenting atrocities in Syria and Yemen, but it has its limitations. In many cases, the person who shot a video either can’t be identified or can’t be found, so they can’t be called to testify in court. When videos are uploaded to social media, the crucial metadata that often accompanies video files — like the time and precise geographical location where a video was shot — is either stripped out or can’t be trusted, making those videos harder to authenticate. Anything that makes OSINT evidence easier to dispute can allow Russia to manipulate narratives in its favor.

Two groups leading in this area are Bellingcat, a research and analysis collective that maintains a heavy focus on Russia and Ukraine, and WITNESS, a U.S.-based NGO that trains people to use technology to defend human rights. I spoke with Dalila Mujagic, an associate legal advisor at WITNESS.

OSINT is “largely untested in courts of law and especially at the International Criminal Court,” said Mujagic. “Filming to get global attention on an issue is much, much different than filming for evidentiary purposes.”

In addition to showing what happened, evidence of a war crime needs to illustrate who ordered the attack and prove that it was intended to target civilians. 

So WITNESS is working with organizations like the Ukrainian Legal Advisory Group to train people on the ground to capture footage of alleged war crimes with a focus on how it can be used as evidence in court.

“Let’s say the basic crime is a killing of civilians or an attack on a hospital. What elevates that base crime to the level of an international crime requires proving the context,” Mujagic told me. 

By that, she means a 360-degree view: identifying landmarks in the background, shadows on the ground that indicate the time of day, footage of barricades proving there is military activity in the area.

Not all footage will be useful in court, said Mujagic. She pointed to the example of a video from Ukraine of a car containing the burnt remains of several people. 

“With the shakiness of the footage, how fast it’s shot — people tend to just pan everywhere because their camera follows their eye — you can’t really give it a great sense, for example, of how many bodies were in the car,” she said. “Was the car even a civilian car in the first place?”

Video evidence won’t necessarily be the smoking gun that leads to prosecution. But it will likely be a piece of the puzzle, alongside witness testimony, military communications and interviews with journalists, said Mujagic.

“If you corroborate those videos with messages from the command structures, you could potentially then prove something that sounds impossible to prove, like intent and knowledge. So did these commanders know that this was a civilian area?”  

People documenting war crimes in Ukraine are learning from ongoing conflicts in countries like Syria and Yemen. The University of California at Berkeley’s Human Rights Center has developed a protocol for open-source evidence collection that meets the standards for admissibility in international courts, and leaders in the space, like Bellingcat, are actively capturing and analyzing evidence in the Ukrainian context.

One big lesson is that social media is a fickle friend. YouTube’s algorithms have removed vital footage of atrocities in Syria, Yemen and dozens of other conflict areas, due to the platform’s policy prohibiting the display of gratuitous violence. The OSINT community has learned to work fast to save vital footage before it’s erased. 

Social media platforms have become “unwitting and perhaps slightly unwilling evidence lockers,” said Mujagic. For years, groups like WITNESS and Mnemonic (home to the Syrian Archive) have been pressuring platforms like YouTube to take greater responsibility for the kinds of materials that get caught in the nets of their content moderation algorithms, but have seen few results. So far, Mujagic is not seeing proof that they will do a better job of preserving vital evidence from Ukraine.


India’s ruling party got a better deal on Facebook ads in the lead-up to elections than opposition parties. On average, Prime Minister Narendra Modi’s Hindu-nationalist Bharatiya Janata Party paid $546 per 1 million views for an ad. In contrast, the main opposition party, the Indian National Congress, paid nearly 30% more — an average of $702. In nine of the ten elections in 2019 and 2020 analyzed by The Reporters’ Collective, the BJP paid less than its opponents, giving it more reach for less money and bolstering the argument that Facebook’s opaque ad-targeting algorithms are influencing Indian elections in the BJP’s favor. 

The use of AI in the U.K.’s criminal justice system is “a new Wild West,” according to a report released last week by the House of Lords Home Affairs and Justice Committee. Police are using facial recognition and predictive policing tools without adequately evaluating their efficacy or outcomes. As it stands, the report says, there is no requirement that officials using the systems be trained, and law enforcement agencies are essentially “making it up as they go along.” I appreciate the Committee’s bluntness. 

The spyware maker FinFisher is broke. The German company is under investigation for the illegal sale of its spyware to Turkey, which then allegedly used the tool to target the phones of political activists. FinFisher has been the go-to tech for authoritarian regimes around the world, including Bahrain, Egypt, Ethiopia and the United Arab Emirates. Its signature spyware enables police and intelligence agencies to access the digital devices of targets like journalists and dissidents, allowing authorities to secretly monitor their every move. It is unclear whether or how the company’s claim of insolvency will affect the outcome of the German investigation.


  • This ProPublica investigation of how tenant screening algorithms are preventing people from getting housing.
  • This VICE article on Lantern, a tool that helps people bypass online censorship. The company plans to build a peer-to-peer network that would allow people to post on a special, censorship-resistant network even if Russia disconnects from the global internet.