AI Expert Warns Musk-signed Letter Doesn't Go Far Enough: "Literally, Everyone on Earth Will Die."

  • Source: Fox News
  • 03/30/2023
An artificial intelligence expert with more than two decades of experience studying AI safety said an open letter calling for a six-month moratorium on developing powerful AI systems does not go far enough.

Eliezer Yudkowsky, a decision theorist at the Machine Intelligence Research Institute, wrote in a recent op-ed that the six-month "pause" on developing "AI systems more powerful than GPT-4" called for by Tesla CEO Elon Musk and hundreds of other innovators and experts understates the "seriousness of the situation." He would go further, implementing a moratorium on new large AI learning models that is "indefinite and worldwide." 

The letter, issued by the Future of Life Institute and signed by more than 1,000 people, including Musk and Apple co-founder Steve Wozniak, argued that safety protocols need to be developed by independent overseers to guide the future of AI systems.

