‘Godfathers of AI’ Warn Superintelligence Could Trigger Human Extinction

October 23, 2025

The “godfathers” of AI have joined an unlikely mix of royals, politicians, business leaders, and TV personalities in signing a statement urging a ban on developing “superintelligence” over fears it could lead to “potential human extinction.”

The Wednesday statement, signed by more than 2,000 people—including AI pioneers Yoshua Bengio and Geoffrey Hinton, Apple co-founder Steve Wozniak, and former White House Chief Strategist Steve Bannon—calls for limits on creating technology that could eventually outthink humans.

The signatories are calling for a prohibition on superintelligence until there is a “broad scientific consensus that it can be developed safely and controllably,” and “strong public buy-in.”

Steve Wozniak, the co-founder of Apple, was among the signatories. (Andreas Rentz/Getty Images)

Debate over the risks and benefits of AI has long divided the key figures funding and developing it. The statement quotes past remarks from the leaders of some of the largest AI companies, including OpenAI CEO Sam Altman and xAI owner Elon Musk, both of whom have warned about the dangers of Artificial Superintelligence (ASI).

Altman wrote that ASI is “the greatest threat to the continued existence of humanity,” while Musk stated that it is “potentially more dangerous than nukes.”

Current AI systems are classified as Artificial Narrow Intelligence (ANI) and rely on human guidance to operate. Tools like ChatGPT and other generative AI are built on large language models (LLMs), which are trained to produce human-like language and are widely seen as an important step toward developing ASI.

In June, Meta opened a research facility called the “Superintelligence Lab” to compete with leading firms like OpenAI and Google in creating AI capable of matching human cognitive abilities—a milestone that Demis Hassabis, CEO of Google’s DeepMind, predicts could arrive within the next five to ten years.

Sam Altman, CEO of OpenAI, has warned about the dangers of superintelligence. (Jonathan Raa/NurPhoto via Getty Images)

“Frontier AI systems could surpass most individuals across most cognitive tasks within just a few years,” wrote Yoshua Bengio, considered one of the leaders behind the rise of deep learning in AI, in a comment on the Wednesday statement.

British computer scientist Stuart J. Russell wrote: “This is not a ban or even a moratorium in the usual sense. It’s simply a proposal to require adequate safety measures for a technology that, according to its developers, has a significant chance to cause human extinction. Is that too much to ask?”

Other signatories—including Prince Harry and Meghan Markle, the Duke and Duchess of Sussex; Sapiens author Yuval Noah Harari; and actor Sir Stephen Fry—also commented on their decision to sign the statement.

“The future of AI should serve humanity, not replace it,” wrote Prince Harry in his statement of support, adding, “The true test of progress will be not how fast we move, but how wisely we steer.”

The statement also cites data from a survey conducted by the Future of Life Institute, which found that only 5 percent of Americans support the unregulated development of AI, while 64 percent believe superhuman AI shouldn’t be made until it’s proven safe.

The post ‘Godfathers of AI’ Warn Superintelligence Could Trigger Human Extinction appeared first on The Daily Beast.
