When Santander U.K.’s artificial intelligence technology flagged unusual activity on a customer’s account last year, the bank’s staff had no idea it would lead to a human trafficking network being uncovered.
On their own, the transactions didn’t look like much. A 34-year-old man was making regular payments to budget airlines, mobile phone providers, and Vivastreet and Gumtree, websites that can be used to advertise genuine adult services but also to sell cars, computers, and more.
But the bank monitors and analyzes millions of transactions using technology run by ThetaRay, an AI-powered financial crime detection company, to spot patterns—and when something falls outside those patterns.
Without AI, this activity wouldn’t have been picked up, said Stephen Jennings, the head of transaction monitoring at Santander, whose team disclosed the case to the U.K. National Crime Agency.
“By understanding what is normal, AI will then very quickly tell you what’s not normal,” ThetaRay CEO Peter Reynolds said. “And that thing that doesn’t look normal is potentially crime.”
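The idea Reynolds describes is classic anomaly detection: learn a statistical model of "normal" behavior, then flag whatever falls far outside it. The sketch below is a deliberately minimal illustration using invented weekly spending figures and a simple z-score test; production systems such as ThetaRay's use far richer models and features.

```python
import statistics

# Hypothetical weekly spend totals (in pounds) for one account.
# "Normal" here is just the mean and standard deviation of past
# activity -- a stand-in for a much richer learned model.
history = [120.0, 95.0, 110.0, 130.0, 105.0, 115.0, 98.0, 125.0]

def is_anomalous(amount, past, threshold=3.0):
    """Flag an amount that sits many standard deviations from past behavior."""
    mean = statistics.mean(past)
    stdev = statistics.pstdev(past)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold

print(is_anomalous(118.0, history))  # consistent with history
print(is_anomalous(900.0, history))  # sudden spike far outside "normal"
```

Note the asymmetry the article goes on to examine: the model can only say that activity is unusual, not that it is criminal; deciding what an anomaly actually means is left to humans.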
In recent years, advances in AI technologies have helped transform the detection of human trafficking and modern slavery, allowing businesses such as Santander, nonprofit organizations, and law enforcement to flag risks more effectively.
Last year’s Trafficking in Persons Report, an annual status update by the U.S. State Department on global anti-trafficking efforts, actively encouraged technology companies to use “data and algorithm tools” to detect patterns and identify suspicious activity.
Yet beyond the potential, others have raised the alarm. Human rights organizations and survivors warn that collecting data on marginalized populations and automating decisions can pose a threat.
“There is, as with a lot of AI-related technologies, tensions in using it, right?” said Anjali Mazumder, who leads the AI and justice and human rights program at the Alan Turing Institute, the U.K. national institute for data science and AI.
“What’s needed is recognizing that, yes, these tools exist that could be helpful to detect harm, but they can also cause harm.”
Around 2017, Mazumder ran workshops on modern slavery, which were attended by British government officials, United Nations agencies such as the International Labour Organization, and NGOs. When the topic of AI and machine learning (a subfield of AI that uses algorithms that learn from data to make increasingly better decisions) was brought up, there was a lot of interest but minimal use, she says.
Fast-forward nearly a decade, and the landscape looks very different.
Take Global Fishing Watch, for example. The U.S.-based nonprofit uses AI technology to tackle the widespread use of forced labor in fisheries around the world. Its model, which combines satellite data and machine learning, shows that vessels with crews that are subject to forced labor behave differently from other fleets—such as traveling farther from ports.
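The underlying approach can be illustrated with a toy example: derive behavioral features from voyage records and flag vessels that deviate sharply from the rest of the fleet. The vessel names, feature names, values, and threshold below are all invented for illustration; Global Fishing Watch's actual model combines satellite tracking data with machine learning.

```python
import statistics

# Hypothetical per-vessel voyage features.
fleet = {
    "vessel_a": {"max_km_from_port": 180, "days_at_sea": 12},
    "vessel_b": {"max_km_from_port": 210, "days_at_sea": 15},
    "vessel_c": {"max_km_from_port": 1450, "days_at_sea": 190},
}

def flag_outliers(feature, factor=3.0):
    """Return vessels whose feature value far exceeds the fleet median.

    The median is used as the baseline so a single extreme vessel
    cannot drag the notion of "typical" toward itself.
    """
    baseline = statistics.median(v[feature] for v in fleet.values())
    return [name for name, v in fleet.items() if v[feature] > factor * baseline]

print(flag_outliers("max_km_from_port"))  # vessel_c travels far beyond the fleet norm
print(flag_outliers("days_at_sea"))       # and stays at sea far longer
```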
As with any AI tool, good data is needed, but it’s often siloed across a vast modern slavery ecosystem. The Traffik Analysis Hub, a U.K. nonprofit, works to address this, sourcing global data from leading nonprofit organizations, law enforcement, and open-source data such as journalists’ reporting.
Using AI technologies developed by IBM, it’s able to spot patterns, networks, and hot spots based on what it describes as one of the largest collections of available trafficking information. More than 200 organizations and 500 users have access and have used the information to uncover, for example, Vietnamese nationals being trafficked to work in Cambodian scam centers and networks of fake job agencies in Uganda.
Much of the initial technology focused on combating sexual exploitation, particularly online, Mazumder says. Traffic Jam, an AI-powered software tool, trawls hundreds of thousands of online ads selling sex—some of which are profiles advertising trafficked individuals—and analyzes the data. Used by law enforcement agencies across Europe and North and South America, its developers say it has allowed cases to be built in months rather than years.
The problem is that many of those ads are also posts from genuine sex workers—and tools using this kind of software can struggle to reliably distinguish between the two.
Spotlight is another tool that scrapes sex advertisements. Owned by Thorn, a U.S. nonprofit founded by actors Ashton Kutcher and Demi Moore (Kutcher resigned in 2023), its aim is to fight child exploitation. Details such as names, numbers, images, and payments are gathered and put into a giant database. Think of it as a vast Google search, widely used by U.S. law enforcement when looking for suspected trafficking victims.
Yet reporting by Forbes found that Spotlight harvested ads from genuine sex workers, often without their knowledge, alongside online sex ads of exploited children. Police then used the databases to monitor women from afar and didn’t step in when they were endangered, the reporting found.
Many of these technologies, particularly in the anti-trafficking space, explicitly target sex workers, says Olivia Snow, a dominatrix and research fellow at the UCLA Center for Critical Internet Inquiry.
“It’d be one thing if they were using AI to track clients or something. But they never are. It’s just the workers,” Snow said. “People making AI don’t actually care about human rights issues. They care about making money, and they care about getting data.”
For decades, sex workers have also reported being frozen or locked out of their bank accounts, she adds. Again, genuine sex work can involve transactions seen as trafficking indicators, such as depositing cash at certain ATMs and times of the day. Snow suspects the reason why she’s unable to send money from her bank account is connected to her work, though she won’t ever know for sure.
“They’ve definitely figured something out,” she said. “It is so dystopian.”
Santander’s Jennings acknowledges the bank’s AI tool alone can’t determine what is genuine or not—rather, it flags activity it deems unusual. What’s important, he says, is the human element. After alerts are generated, a team of roughly 17 people analyzes the data, taking a holistic view of the customer to understand whether the transactions make sense. “We’re [thinking]: ‘Does this add up?’” Jennings said. “It’s the combination of the two things which is very, very powerful.”
The technology is best thought of as an indicator of something potentially risky, says Mazumder of the Alan Turing Institute. But that risk needs further investigation, likely offline and in person. She also cautions that gaps and bias in modern slavery data need addressing. One model from 2016 trained on sex ad websites, for example, determined that the word “Asian” indicated sex trafficking.
Others point out that tools are often funded by Big Tech and built for law enforcement. If the industry cared about helping survivors, it would involve them in the process, says Sabra Boyd, a Seattle-based writer and trafficking survivor.
The results can be significant—as the Santander example shows. But Mazumder says it is vital that mechanisms are in place to ensure marginalized or vulnerable groups aren’t targeted, further exploited, or harmed in some way.
Boyd summed it up: “Nothing about us, without us.” A powerful motto of the disability rights movement for decades, for her it feels as relevant as ever.
“Who gets to build these products?” she asked. “Who gets to define what safety is for me as a trafficking survivor?”
The post AI Is Fighting Modern Slavery, for Better or Worse appeared first on Foreign Policy.