DNYUZ

Amy Klobuchar: I Knew Deepfakes Were a Problem. Then I Saw One of Myself.

August 20, 2025

There’s a centuries-old expression that “a lie can travel halfway around the world while the truth is still putting on its shoes.” Today, a realistic deepfake — an A.I.-generated video that shows someone doing or saying something they never did — can circle the globe and land in the phones of millions while the truth is still stuck on a landline. That’s why it is urgent for Congress to immediately pass new laws to protect Americans by preventing their likenesses from being used to do harm. I learned that lesson in a visceral way over the last month when a fake video of me — opining on, of all things, the actress Sydney Sweeney’s jeans — went viral.

On July 30, Senator Marsha Blackburn and I led a Senate Judiciary subcommittee hearing on data privacy. We’ve both been leaders in the tech and privacy space and have the legislative scars to show for it. The hearing featured a wide-reaching discussion with five experts about the need for a strong federal data privacy law. It was cordial and even-keeled, with no partisan flare-ups. So I was surprised later that week when I noticed a clip of me from that hearing circulating widely on X, to the tune of more than a million views. I clicked to see what was getting so much attention.

That’s when I heard my voice — but certainly not me — spewing a vulgar and absurd critique of an ad campaign for jeans featuring Sydney Sweeney. The A.I. deepfake featured me using the phrase “perfect titties” and lamenting that Democrats were “too fat to wear jeans or too ugly to go outside.” Though I could immediately tell that someone used footage from the hearing to make a deepfake, there was no getting around the fact that it looked and sounded very real.

As anyone would, I wanted the video taken down or at least labeled “digitally altered content.” It was using my likeness to stoke controversy where it did not exist. It had me saying vile things. And while I would like to think that most people would be able to recognize it as fake, some clearly thought it was real. Studies have shown that people who see this type of content develop lasting negative views of the person in the video, even when they know it is fake.

X refused to take it down or label it, even though its own policy says users are prohibited from sharing “inauthentic content on X that may deceive people,” including “manipulated, or out-of-context media that may result in widespread confusion on public issues.” As the video spread to other platforms, TikTok took it down and Meta labeled it as A.I. X’s only response was that I should try to get a “Community Note” added saying it was fake, something the company would not help with.

For years I have been going after the growing problem that Americans have extremely limited options to get unauthorized deepfakes taken down. But this experience of sinking hours of time and resources into limiting the spread of a single video made clear just how powerless we are right now. Why should tech companies’ profits rule over our rights to our own images and voices? Why do their shareholders and C.E.O.s get to make more money with the spread of viral content at the expense of our privacy and reputations? And why are there no consequences for the people who actually make the unauthorized deepfakes and spread the lies?



The post Amy Klobuchar: I Knew Deepfakes Were a Problem. Then I Saw One of Myself. appeared first on New York Times.


Copyright © 2025.
