A “deepfake” caller posed as a top Ukrainian official in a recent videoconference with Senator Benjamin L. Cardin, the chairman of the Foreign Relations Committee, renewing fears that lawmakers could become the targets of malign actors seeking to influence U.S. politics or to obtain sensitive information.
According to an emailed warning sent by Senate security officials to lawmakers’ offices and obtained by The New York Times, a senator’s office received an email last Thursday that appeared to be from Dmytro Kuleba, until recently Ukraine’s foreign minister, requesting to connect over Zoom. On the subsequent video call, the person looked and sounded like Mr. Kuleba.
But the senator grew suspicious when the figure posing as Mr. Kuleba started acting out of character, the Senate security officials wrote, asking “politically charged questions in relation to the upcoming election” and demanding an opinion on sensitive foreign policy questions, such as whether the senator supported firing long-range missiles into Russian territory. The senator ended the call and reported it to State Department authorities, who confirmed that the figure who appeared to be Mr. Kuleba was an impersonation.
Though the Senate security office’s email did not specify that the senator was Mr. Cardin, two Senate officials familiar with the matter confirmed that he was the senator in question.
Mr. Cardin, a Maryland Democrat, also partially confirmed the episode in a statement Wednesday night. In it, he acknowledged that “in recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual.” Mr. Cardin did not say the individual was Mr. Kuleba or make any reference to Ukraine.
The operation was reported earlier by Punchbowl News.
Deepfake video technology uses artificial intelligence to create video of fictitious people who look and sound real. The technology has sometimes been used to impersonate public figures, including a video that circulated on social media in 2022 falsely showing President Volodymyr Zelensky of Ukraine announcing a surrender in the war with Russia.
Mr. Cardin is retiring at the end of the year. But the episode has renewed fears that foreign actors could try to target lawmakers, particularly in a bid to influence the outcome of the November election.
It was not immediately clear who had orchestrated the operation that targeted Mr. Cardin. Intelligence officials have warned that foreign actors such as Russia, Iran and China are exploiting artificial intelligence, including deepfakes, to augment their election interference efforts — with Russia generating the most content, the Office of the Director of National Intelligence said this week.
While it is unclear whether Russia was behind the impersonation of Mr. Kuleba, Moscow is currently waging a war in Ukraine. Some of the questions asked of Mr. Cardin would be of particular interest to Russia — especially when, according to the Senate security office’s email, the impersonator asked him: “Do you support long-range missiles into Russian territory? I need to know your answer.”
Senate security officials cautioned lawmakers to remain vigilant.
“While we have seen an increase of social engineering threats in the last several months and years, this attempt stands out due to its technical sophistication and believability,” the email from the Senate’s security office said. “It is likely that other attempts will be made in the coming weeks,” the office added.