Matthew Diemer, a Democrat running for election in Ohio’s Seventh Congressional District, was approached by the artificial intelligence company Civox in January with a pitch: A.I.-backed voice technology that could make tens of thousands of personalized phone calls to voters using Mr. Diemer’s talking points and sense of humor.
His campaign agreed to try out the technology. But it turned out that the only thing voters hated more than a robocall was an A.I.-backed one.
While Civox’s A.I. program made almost 1,000 calls to voters in five minutes, nearly all of the voters hung up within the first few seconds of hearing a voice that described itself as an A.I. volunteer, Mr. Diemer said.
“People just didn’t want to be on the phone, and they especially didn’t want to be on the phone when they heard they were talking to an A.I. program,” said the entrepreneur, who ran unsuccessfully in 2022 for the same seat he is seeking now. “Maybe people weren’t ready yet for this type of technology.”
This was supposed to be the year of the A.I. election. Fueled by a proliferation of A.I. tools like chatbots and image generators, more than 30 tech companies have offered A.I. products to national, state and local U.S. political campaigns in recent months. The companies — mostly smaller firms such as BHuman, VoterVoice and Poll the People — make products that reorganize voter rolls and campaign emails, expand robocalls and create A.I.-generated likenesses of candidates that can meet and greet constituents virtually.
But campaigns are largely not biting — and when they have, the technology has fallen flat. Only a handful of candidates are using A.I., and even fewer are willing to admit it, according to interviews with 23 tech companies and seven political campaigns. Three of the companies said campaigns agreed to buy their tech only if they could ensure that the public would never find out they had used A.I.
Much of the hesitation stems from internal campaign polls that found voters were nervous about A.I. and distrusted the technology, said four officials involved in Democratic and Republican campaigns. When campaigns turned to A.I. to generate photos or videos of candidates, the numbers were even worse, one of them said.
Some uses of A.I. in political campaigns have already flopped. In January, an A.I. robocall that mimicked President Biden’s voice in the New Hampshire primary was denounced by political watchdogs and investigated by local law enforcement. On Monday, former President Donald J. Trump posted A.I.-generated images of Taylor Swift endorsing him to his social media site, Truth Social. The response from her fans was anger and condemnation.
“Political campaigns have trust issues to begin with,” said Phillip Walzak, a political consultant in New York. “No candidate wants to be accused of posting deepfakes in the election or using A.I. in a way that deceives voters.”
The skepticism is part of a new reality for A.I. as enthusiasm over the technology has cooled. This year, tech giants and start-ups that had celebrated A.I. as the wave of the future have begun hedging their promises. Wall Street has become wary of the financial goals set by A.I. companies. And lawmakers have proposed measures that could slow the A.I. industry’s growth.
Just six months ago, it was a different story. Drawn by the promise of millions of dollars in campaign funds that candidates would spend to win, dozens of tech companies shifted their technology toward the U.S. election. They built chatbots in the style of ChatGPT and paired them with A.I. image generators to create walking, talking clones of candidates that could interact with voters virtually.
BHuman, a New York company founded in 2020 that uses A.I. to create videos, has pitched political campaigns on a product that personalizes videos of candidates for voters. Candidates could record themselves speaking on an issue and BHuman’s A.I.-based technology could then clone their face and voice to create new videos. The opening lines could be tweaked to greet a specific voter, or recite a particular talking point.
“Imagine you’re a voter and you get a video in which a candidate says your name and speaks to your issues,” said Don Bosco, BHuman’s founder. “That is creating human connection.”
BHuman also offers a product that creates a digital replica of a candidate, mimicking the candidate’s writing style to answer emails or engage in virtual chats with voters. Mr. Bosco declined to comment on which campaigns had used his company’s products.
Personaliz.ai, an A.I. company founded last year and based in Hyderabad, India, said it worked with more than 30 politicians in India’s national elections this year. The firm made videos in which A.I. versions of candidates interacted with voters on LinkedIn and on campaign websites. It also sent personalized videos to people’s phones through WhatsApp and text messages.
Santosh Thota, chief executive of Personaliz.ai, said that the response from candidates and voters in India was “great” and that his company had seen interest from countries in Southeast Asia and had shown its tech to politicians in several African countries. But he has not seen the same interest from the United States and Europe, he said.
“People in the U.S. are skeptical of the technology,” Mr. Thota said.
Civox, which is based in London and worked with the campaign of Mr. Diemer, the Ohio Democrat running for Congress, said it was still experimenting with the right way to reach voters with its technology. Apart from its A.I. voice technology, the company offers chatbot-like programs that can answer voters’ questions on behalf of a campaign.
Ilya Mouzykantskii, Civox’s chief executive, said that A.I. was not a magic bullet for winning, but that the tools could help campaigns — especially small ones — “run more automated and targeted outreach.”
Some campaigns have been more willing to buy tech from A.I. companies for behind-the-scenes tasks, such as helping to organize email lists and voter databases, three of the companies said.
When Mr. Diemer initially began working with Civox on A.I. voice technology, he asked that his voice be used to train the A.I. robocall, the company said. But Civox urged him to go with a voice that was clearly artificially generated, so voters would know that A.I. was involved and that the campaign was acting transparently.
Mr. Diemer’s campaign eventually settled on an A.I. voice that said, “Hi, I’m Ashley, an artificial intelligence volunteer for Matt Diemer.” The calls were placed in March, just before Super Tuesday. The pickup rate for robocalls, whether they used an A.I. voice or a human one, was in the single digits, Civox said. Most people hung up on the calls from Mr. Diemer’s campaign in the first few seconds.
Civox declined to comment on how much the tech cost. The company worked with about a dozen political campaigns over four months in the spring and made hundreds of thousands of robocalls to test its A.I. technology.
Mr. Diemer said he didn’t regret experimenting with A.I.
“I love A.I. and tech and what it could potentially do to make political campaigns more affordable and accessible for everyone,” he said. “I don’t think everyone got what we were trying to do, or gave it a chance to see that maybe A.I. was a great tool in reaching voters.”