AI Doomsday Scenarios Are Gaining Traction in Silicon Valley
Controversial AI theorist Eliezer Yudkowsky sits on the fringe of the industry's most extreme circle of commentators, for whom the extinction of the human species is the inevitable result of developing advanced artificial intelligence.

“I think we’re not ready, I think we don’t know what we’re doing, and I think we’re all going to die,” Yudkowsky said on this week’s episode of the Bloomberg Originals series AI IRL.

For the past two decades, Yudkowsky has consistently promoted his theory that hostile AI could spark a mass extinction event. While many in the AI industry shrugged or raised eyebrows at this assessment, he founded the Machine Intelligence Research Institute with funding from Peter Thiel, among others, and collaborated on written work with futurists such as Nick Bostrom.

To say that some of his visions for the end of the world are unpopular would be a gross understatement; they’re on par with the prophecy that the world would end in 2012. That prediction was based on a questionable interpretation of an ancient text, as well as a dearth of supportive evidence.



