{"id":2676,"date":"2026-01-12T19:58:11","date_gmt":"2026-01-12T19:58:11","guid":{"rendered":"https:\/\/williamjcobb.com\/blog\/?p=2676"},"modified":"2026-01-12T20:02:36","modified_gmt":"2026-01-12T20:02:36","slug":"ten-years-left-of-humanity-or-how-i-learned-to-stop-worrying-and-love-the-a-i-bomb-on-two-visions-of-the-upcoming-a-i-apocalypse-eliezer-yudkowsky-and-nate-soares-if-anyone-builds-it-everyon","status":"publish","type":"post","link":"https:\/\/williamjcobb.com\/blog\/index.php\/2026\/01\/12\/ten-years-left-of-humanity-or-how-i-learned-to-stop-worrying-and-love-the-a-i-bomb-on-two-visions-of-the-upcoming-a-i-apocalypse-eliezer-yudkowsky-and-nate-soares-if-anyone-builds-it-everyon\/","title":{"rendered":"Ten Years Left of Humanity, Or How I Learned to Stop Worrying and Love the A.I. Bomb: On Two Visions of the Upcoming A.I. Apocalypse: Eliezer Yudkowsky and Nate Soares&#8217; &#8220;If Anyone Builds It, Everyone Dies&#8221; and Mustafa Suleyman&#8217;s &#8220;The Coming Wave: AI, Power, and Our Future&#8221;"},"content":{"rendered":"\n<p>So I recently read the somewhat-infamous bestseller by Eliezer Yudkowsky and Nate Soares, <em>If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All<\/em> (2025). It&#8217;s a white-knuckled warning about the imminent peril of ASI (Artificial Super Intelligence). Note that they emphasize the distinction between run-of-the-mill A.I.\u2014the kind that will write your term paper on <em>The Great Gatsby<\/em> or create a video of a cat playing a banjo\u2014and the humanity-shattering A.S.I. 
that will bioengineer a super-virus to infect us all.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"667\" height=\"1024\" src=\"https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-08-at-2.06.23\u202fPM-1-667x1024.jpeg\" alt=\"\" class=\"wp-image-2687\" srcset=\"https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-08-at-2.06.23\u202fPM-1-667x1024.jpeg 667w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-08-at-2.06.23\u202fPM-1-195x300.jpeg 195w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-08-at-2.06.23\u202fPM-1-768x1179.jpeg 768w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-08-at-2.06.23\u202fPM-1.jpeg 859w\" sizes=\"auto, (max-width: 667px) 100vw, 667px\" \/><\/figure>\n\n\n\n<p>Did I enjoy the read? Hmmm. Technically I&#8217;d say it\u2019s not really \u201cgood\u201d and has many flaws, but it is fascinating nonetheless. Basically it argues that we could never control ASI, so we shouldn\u2019t build it: ASI will find humans dispensable and get rid of them. The two authors may very well be right, but their writing style and organization are rather slipshod. They don\u2019t have much evidence for the argument, so you basically just have to believe them (that we\u2019re all doomed). Worst thing about the book: Each chapter begins with a parable that it seems they (wait: two authors? how does that work, anyway?) made up. And the parables aren&#8217;t very good. After the parable comes a lot of ranting. The rants certainly include scary info nuggets: Some A.I. experts put the chance of an ASI apocalypse anywhere from 10 to 50 percent! (They quote various experts.) And they\u2019re talking about the very foreseeable future. At some point they posit we may have only ten years left.
They make a good argument that ASI research needs to slow down, that we\u2019re rushing to create a super machine intelligence that we won\u2019t understand or be able to control. One implication, a subtext throughout, is that ASI would be sneaky and untrustworthy: that it could pretend to be \u201caligned\u201d with our goals\u2014say, searching for a cure for cancer\u2014but meanwhile it would be developing some way to get rid of the pesky humans who want to find the cure for cancer. And we would have no idea what it was up to.<\/p>\n\n\n\n<p>In the wider context of the A.I. and A.S.I. benefits\/drawbacks debate\u2014on the one hand we can giggle at the banjo-playing cat, before wailing in anguish as we face a horde of killer A.I. drones\u2014certainly Yudkowsky and Soares argue an extreme viewpoint, but one that (according to them) is shared by many A.I. experts, including Nobel Prize winners. Others are equally alarmed, and suggest an even earlier expiration date for humanity, such as the scenarios sketched in <em>AI 2027<\/em>, a polemic published last year by Daniel Kokotajlo, Scott Alexander, Thomas Larsen, Eli Lifland, and Romeo Dean. <\/p>\n\n\n\n<p>A more moderate, cautious approach to the peril is suggested by Mustafa Suleyman&#8217;s <em>The Coming Wave<\/em>.
<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"713\" height=\"1024\" src=\"https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-12-at-12.06.30\u202fPM-713x1024.jpeg\" alt=\"\" class=\"wp-image-2688\" srcset=\"https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-12-at-12.06.30\u202fPM-713x1024.jpeg 713w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-12-at-12.06.30\u202fPM-209x300.jpeg 209w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-12-at-12.06.30\u202fPM-768x1104.jpeg 768w, https:\/\/williamjcobb.com\/blog\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-12-at-12.06.30\u202fPM.jpeg 998w\" sizes=\"auto, (max-width: 713px) 100vw, 713px\" \/><\/figure>\n\n\n\n<p>As the credits note, Suleyman is the co-founder of DeepMind and Inflection AI, so it&#8217;s an insider&#8217;s perspective. <em>The Coming Wave<\/em> has a user-friendly structure and more &#8220;evidence&#8221; to support its grandiose proclamations that A.I. will be the greatest invention since fire, but its Pollyanna view of the future at times seems decidedly foolish. One example: He asserts that A.I. will be such an economic benefit that humans won&#8217;t have to work much anymore, and gives a casual nod to the idea of a &#8220;universal income&#8221;\u2014that seems rather unlikely, considering human nature. Okay, we&#8217;ve taken driving away from humans, and factory work, and medicine, and retail work . . . . So someone (who? the government? Google?) is going to give us all the money to buy all the products that A.I. will produce and market so incredibly efficiently? Good luck with that. Sounds like a recipe for disaster, sugar-coated. But I imagine many readers will read <em>The Coming Wave<\/em> out of the same impulse as I did: curiosity. For that reason, it&#8217;s a good book.
Actually, after reading both books (plus <em>A.I. 2027<\/em>), I think I understand the nature and dangers of A.I. more than ever, and that at least my worries are informed. But how to reconcile the urgent warnings in <em>If Anyone Builds It<\/em> and <em>A.I. 2027<\/em> with the well-documented and much-discussed frantic research into achieving A.S.I.? Greed, basically. Could the combination of greed and entertaining fake videos of banjo-playing cats do in humanity? Maybe. <\/p>\n","protected":false},"excerpt":{"rendered":"<p>So I recently read the somewhat-infamous bestseller by Eliezer Yudkowsky and Nate Soares, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (2025). It&#8217;s a white-knuckled warning about the imminent peril of ASI (Artificial Super Intelligence). &hellip; <a href=\"https:\/\/williamjcobb.com\/blog\/index.php\/2026\/01\/12\/ten-years-left-of-humanity-or-how-i-learned-to-stop-worrying-and-love-the-a-i-bomb-on-two-visions-of-the-upcoming-a-i-apocalypse-eliezer-yudkowsky-and-nate-soares-if-anyone-builds-it-everyon\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[229,234,230,231,2,7,14,232,233,1,235,38],"tags":[239,242,241,240,55,238],"class_list":["post-2676","post","type-post","status-publish","format-standard","hentry","category-a-i","category-a-i-2027","category-a-i-apocalypse","category-a-s-i-artificial-super-intelligence","category-annihilation","category-books","category-economics","category-eliezer-yudkowsky-and-nate-soares-if-anyone-builds-it-everyone-dies","category-mustafa-suleymans-the-coming-wave-a-i-power-and-our-future","category-uncategorized","category-universal-income","category-writing","tag-a-i","tag-a-i-2027","tag-a-i-apocalypse","tag-a-s-i","tag-book-reviewing","tag-the-coming-wave"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/2676","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/comments?post=2676"}],"version-history":[{"count":6,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/2676\/revisions"}],"predecessor-version":[{"id":2694,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/2676\/revisions\/2694"}],"wp:attachment":[{"href":"https:
\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/media?parent=2676"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/categories?post=2676"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/williamjcobb.com\/blog\/index.php\/wp-json\/wp\/v2\/tags?post=2676"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}