‘Time is Running Out’: New Open Letter Calls for Ban on Superintelligent AI Development

October 22, 2025

An open letter calling for a prohibition on the development of superintelligent AI was published on Wednesday, signed by more than 700 celebrities, AI scientists, faith leaders, and policymakers.

Among the signatories are five Nobel laureates; two so-called “Godfathers of AI”; Steve Wozniak, a co-founder of Apple; Steve Bannon, a close ally of President Trump; Paolo Benanti, an adviser to the Pope; and even Harry and Meghan, the Duke and Duchess of Sussex.

The open letter says, in full:

“We call for a prohibition on the development of superintelligence, not lifted before there is

  1. broad scientific consensus that it will be done safely and controllably, and
  2. strong public buy-in.”

The letter was coordinated and published by the Future of Life Institute, a nonprofit that in 2023 published a different open letter calling for a six-month pause on the development of powerful AI systems. Although widely circulated, that letter did not achieve its goal.

Organizers said they decided to mount a new campaign, with a more specific focus on superintelligence, because they believe the technology—which they define as a system that can surpass human performance on all useful tasks—could arrive in as little as one to two years. “Time is running out,” said Anthony Aguirre, the FLI’s executive director, in an interview with TIME. The only thing likely to stop AI companies from barreling toward superintelligence, he says, “is for there to be widespread realization among society at all its levels that this is not actually what we want.”

Polling released alongside the letter showed that 64% of Americans believe that superintelligence “shouldn’t be developed until it’s provably safe and controllable,” and only 5% believe it should be developed as quickly as possible. “It’s a small number of very wealthy companies that are building these, and a very, very large number of people who would rather take a different path,” says Aguirre.

Actors Joseph Gordon-Levitt and Stephen Fry, rapper will.i.am, and author Yuval Noah Harari also signed their names to the letter. Susan Rice, the national security advisor in Barack Obama’s Administration, signed. So did Leo Gao, a member of technical staff at OpenAI—an organization described by its CEO, Sam Altman, as a “superintelligence research company.” Aguirre expects more people to sign as the campaign unfolds. “The beliefs are already there,” he says. “What we don’t have is people feeling free to state their beliefs out loud.”

“The future of AI should serve humanity, not replace it,” said Prince Harry, Duke of Sussex, in a message accompanying his signature. “I believe the true test of progress will be not how fast we move, but how wisely we steer. There is no second chance.”

Joseph Gordon-Levitt’s signature was accompanied by the message: “Yeah, we want specific AI tools that can help cure diseases, strengthen national security, etc. But does AI also need to imitate humans, groom our kids, turn us all into slop junkies and make zillions of dollars serving ads? Most people don’t want that. But that’s what these big tech companies mean when they talk about building ‘Superintelligence’.”

The statement was kept minimal to attract a broad and diverse set of signatories. But for meaningful change, Aguirre thinks regulation is necessary. “A lot of the harms come from the perverse incentive structures companies are subject to at the moment,” he says, noting that companies in America and China are competing to be first in creating superintelligence.

“Whether it’s soon or it takes a while, after we develop superintelligence, the machines are going to be in charge,” says Aguirre. “Whether or not that goes well for humanity, we really don’t know. But that is not an experiment that we want to just run toward.”

The post ‘Time is Running Out’: New Open Letter Calls for Ban on Superintelligent AI Development appeared first on TIME.
