The Inevitable End? AI Doomers Yudkowsky and Soares Predict Superintelligence Will Wipe Out Humanity
AI researchers Eliezer Yudkowsky and Nate Soares argue in their forthcoming book that superintelligent AI will inevitably exterminate humanity by means we cannot predict or comprehend. Both say they personally expect to die in an AI-caused annihilation. They propose drastic countermeasures, including bombing data centers, while conceding that society is unlikely to act in time.