Apologies for the silence. The speak-first, elaborate-later style of newsletters has taken some getting used to. From now on, Collapsology will appear weekly. I'm optimizing for variance and feedback, so feel free to post any of the latter in the comments.

Existential Risk research studies the upper sliver of risks facing our ostensibly undifferentiated humanity, risks that would destroy the entire world in an irreversible way: massive solar flares, the newly popular figure of the interstellar asteroid, bio- and nanotechnology risks, and, of course, the risk from superintelligent (or just very powerful but stupid) AI. Because these risks are so extreme, and mostly more or less causally unrelated to one another, such research tends to parcel them out into distinct chapters in books on the topic, dealing with them one at a time.
Do you expect to be writing at all about a) how societies have avoided imminent collapse, or b) how, despite radical problems, they have succeeded in recovering?