DNYUZ
Home News
‘100 Video Calls Per Day’: Models Are Applying to Be the Face of AI Scams

March 16, 2026

When applying for jobs, Angel talks up her language skills. “I can speak fluent English, I can speak good Chinese, I also speak Russian and Turkish,” the glamorous, 24-year-old Uzbekistani woman explains in a selfie-style video made for recruiters. Angel had arrived in the Cambodian city of Sihanoukville that day, she said, and was ready to start work immediately.

Those impressive language skills, however, have likely been put to use as part of elaborate “pig-butchering” scams targeting Americans. That’s because, instead of applying for a conventional corporate job, Angel was putting herself forward to work as an “AI face model”—sitting in front of a computer all day and making deepfake video calls to manipulate potential scam victims. Her application, which also required her height and weight, says she has already clocked up “1 year as an AI model.”

Angel is far from alone in this pursuit. A WIRED review of dozens of recruitment videos and job ads posted to Telegram shows people from around the world—including Turkey, Russia, Ukraine, Belarus, and multiple Asian countries—applying to be AI models or “real face” models in Cambodia and Southeast Asia. The region has become home to vast, industrialized scamming operations that hold thousands of human trafficking victims captive and force them to run online cryptocurrency investment and romance scams.

As well as tricking people into working in scam compounds, these high-tech, multibillion-dollar criminal enterprises can also attract people into seeking “work” as part of the operations. “In the past year until today, they are also hiring people doing AI modeling,” says Hieu Minh Ngo, a cybercrime investigator at the Vietnamese scam-fighting nonprofit ChongLuaDao. “They will give you the software so they can swap their face by using AI and they can do romance scams,” he says.

Ngo, a reformed criminal hacker who now tracks scam compound activity and supports victims, identified around two dozen channels on Telegram that have some job postings for AI models in the region. Humanity Research Consultancy, an anti-human-trafficking organization, has also tracked people applying on Telegram for jobs in “known scam hub cities” as “models” and “AI models,” including Angel’s application.

The rise of AI models comes as cybercriminals are broadly adopting AI and using face-swapping as part of their online scamming. Typically, fraudsters will use fake personas to contact potential victims on social media or messaging platforms. They will often use stolen images of celebrities or attractive men or women to entice a person into talking to them.

Once they make contact, they will then bombard them with attention to help build up a relationship, before trying to get them to part with their cash. In some instances, multiple people may control the scammer’s account and message the victim under a single fake persona. But if a potential victim asks for a video call during these interactions—to check if the person they are speaking to is real, for instance—that’s when deepfake video calls and models who have their faces swapped can be used. Some Southeast Asian scam centers have dedicated “AI rooms” from which the calls are made.

Job advertisements for AI models or “real models” reviewed by WIRED demand excessive working hours, offer little free time, and require a relentless schedule. The ads are usually posted by a channel administrator and don’t include contact details or list who someone would specifically be working for. One recruitment post for an alleged six-month contract says the person will need to send photos daily, make video and voice calls, and create audio and video messages. “Approximately 100 video calls per day,” the post says.

Other posts list up to 150 potential calls per day. “Filters may be used, but ensure the image is realistic. Live-action videos are permitted; wigs are prohibited,” another ad reads. For the privilege, the person would allegedly get one full day and four half days off per month. Yet another ad lists working hours as between 10 pm and 10 am in Cambodia and a preference that the person will have a “Western accent.” One model-job ad says: “The company will retain your passport for visa and work permit management.” Taking people’s passports is one of the primary ways scam compound operators hold people captive.

While a few men apply for the AI model roles, the vast majority of applications viewed by WIRED were from young women, mostly in their early twenties. Applicants are asked to send a short video introducing themselves, text about their experience and expectations, and photographs of themselves; some are required to include their marital status and “vaccination” status.

“For over three years, I have worked with Chinese companies for different kinds of projects including stock market, cryptocurrency, and love story,” one person says in a recruitment video. Another says: “Based on my experience, I am good handling customer, I persuade them to invest by using my own techniques and discussing how gold trading benefits them.”

The video applications do not contain full names or contact details, so WIRED was unable to contact those applying for roles.

Modeling applicants have requested salaries of up to $7,000 per month, according to Humanity Research Consultancy. They also make specific requests about their working conditions, many of which may not be afforded to people who have been trafficked into the scam operations. One woman requested her own room and that she “can go outside.” Another requested that they could “go home on day off” and have a “personal washing machine.”

Although some of the models are recruited into the roles and may get more freedom than victims of human trafficking, they may still face harsh treatment from bosses, says Ling Li, the cofounder of the nonprofit EOS collective, which works with victims of the scam industry. “One European victim told us that he saw some Italian models in his compound, but he cannot tell [if] they are [there] willingly or not because they were beaten in front of him,” she says. “And also there is some sexual harassment.”

WIRED sent Telegram a list of two dozen job and recruitment channels that have advertised AI model positions, alongside other roles, in recent months. The company did not appear to remove any of the channels; however, a spokesperson says its policies do not allow scamming-related activity to take place.

“Content that encourages or enables scams is explicitly forbidden by Telegram’s terms of service and is removed whenever discovered,” a spokesperson for Telegram says. “In cases such as this, there are legitimate reasons one might give their likeness, and so such content must be examined on a case-by-case basis.”

The vast majority of the model-job ads and applications on Telegram don’t specifically mention scamming work, but they include a host of red flags indicating scamming, Ngo says. “Why [do you] need AI model? That’s the first question,” Ngo says. Other warning signs include the locations being in known scamming sites in Cambodia, claims of high salaries for the region, and frequent requirements for Chinese language skills, Ngo says.

The ads and video applications also include language closely aligned with scams, according to researchers and WIRED’s review of the posts. This includes frequent mentions of “clients,” a term scam operations use instead of “victims,” plus frequent references to cryptocurrency investments or gold trading. One person, who claimed to have been working as an AI model and “real face” model for 18 months, said their previous work involved convincing people to invest: “I really know how to make good communication to a client, how to make them trust us, how to send a good picture to them, and how to make them laughing.”

However, some posts are more explicit, listing a “job market” someone was applying for as: “love scam.” Another post describes a person’s experience as: “3 year as customer service (killer) of scamming platform crypto.”

After Frank McKenna’s mom started getting scam text messages about making investments last year, he began intercepting them and talking with the senders. McKenna, the chief strategist at the anti-fraud software firm Point Predictive, has closely tracked “AI models.” He says he wanted to understand how they were operating, so he set up a video call between them and his mom.

“The only purpose of that call was to prove that they’re a real person and to gain trust,” McKenna says. During the call, he says, the young woman on camera appeared to be using an AI filter on her face. “It’s kind of glitchy. There’s other people in the room with her, so there’s echoing,” he says. “Then we had another short call with another AI model.”

A month or so later, McKenna says, he saw what appeared to be the same model’s recruitment video posted online, saying she was looking for a new contract as hers had expired. “It was kind of a small world of these AI models who seem to go from place to place, completely voluntarily, making pretty good money,” McKenna says. “They’re probably just in the video room doing calls all day with tons of different victims.”

The post ‘100 Video Calls Per Day’: Models Are Applying to Be the Face of AI Scams appeared first on Wired.

DNYUZ © 2026