‘Godfathers of AI’ Warn Superintelligence Could Trigger Human Extinction

October 23, 2025

The “godfathers” of AI have joined an unlikely mix of royals, politicians, business leaders, and TV personalities in signing a statement urging a ban on developing “superintelligence” over fears it could lead to “potential human extinction.”

The Wednesday statement, signed by more than 2,000 people—including AI pioneers Yoshua Bengio and Geoff Hinton, Apple co-founder Steve Wozniak, and former White House Chief Strategist Steve Bannon—calls for limits on creating technology that could eventually outthink humans.

The signatories are calling for a prohibition on superintelligence until there is a “broad scientific consensus that it can be developed safely and controllably,” and “strong public buy-in.”

Steve Wozniak, the co-founder of Apple, was among the signatories. Andreas Rentz/Getty Images

The debate over the risks and benefits of AI has long been ongoing among the key figures funding and developing it. The statement cites past remarks from the heads of some of the largest AI companies, including OpenAI CEO Sam Altman and xAI owner Elon Musk, both of whom have warned about the dangers of Artificial Superintelligence (ASI).

Altman wrote that ASI is “the greatest threat to the continued existence of humanity,” while Musk stated that it is “potentially more dangerous than nukes.”

Current AI systems are known as Artificial Narrow Intelligence (ANI) and rely on human guidance to operate. Tools like ChatGPT and other generative AI are built on large language models (LLMs), which are trained to produce human-like language and are considered an important step toward developing ASI.

In June, Meta opened a research facility called the “Superintelligence Lab” to compete with leading firms like OpenAI and Google in creating AI capable of matching human cognitive abilities—a milestone that Demis Hassabis, CEO of Google’s DeepMind, predicts could arrive within the next five to ten years.

Sam Altman, CEO of OpenAI, has warned about the dangers of superintelligence. Jonathan Raa/NurPhoto via Getty Images

“Frontier AI systems could surpass most individuals across most cognitive tasks within just a few years,” wrote Yoshua Bengio, considered one of the leaders behind the rise of deep learning in AI, in a comment on the Wednesday statement.

British computer scientist Stuart J. Russell wrote: “This is not a ban or even a moratorium in the usual sense. It’s simply a proposal to require adequate safety measures for a technology that, according to its developers, has a significant chance to cause human extinction. Is that too much to ask?”

Other signatories also commented on their decision to sign, among them Prince Harry and Meghan Markle, the Duke and Duchess of Sussex; Sapiens author Yuval Noah Harari; and actor Sir Stephen Fry.

“The future of AI should serve humanity, not replace it,” wrote Prince Harry in his statement of support, adding, “The true test of progress will be not how fast we move, but how wisely we steer.”

The statement also cites data from a survey conducted by the Future of Life Institute, which found that only 5 percent of Americans support the unregulated development of AI, while 64 percent believe superhuman AI shouldn’t be made until it’s proven safe.

This article appeared first on The Daily Beast.
