"If Anyone Builds It, Everyone Dies" by Eliezer Yudkowsky and Nate Soares
This was, most likely, the most unsettling book of 2025 for me. This little blog of mine started with a review of one of Eliezer's books, and now here we are again, discussing another. This time, though, I felt that the importance of the message outweighed the author's usual graphomania. The book is concise and very scary.
The analogies Eliezer uses are still far from what an average person would grasp, except perhaps the comparison of ASI to nuclear weapons. Yet even this one fails to strike a chord with a layperson: nuclear annihilation is tangible and scary, whereas ASI is so far beyond what an average human can imagine that it doesn't feel existential.
On the other hand, the authors offer substantial signals that ASI is a real concern: for instance, the CCP chairman raising concerns about AI being as powerful as nuclear weapons, or a hypothetical scenario in which an ASI starts attacking AI companies because it sees further AI development as a threat.
Additionally, I felt uneasy when Eliezer discussed AI receiving money. At bwl.gg, we are building a tool that gives money to AI... Crazy how relevant this book is!