The collapse of complex societies

Those who criticize so-called “AI doomers” often overlook that there is a broader, intellectually serious tradition of technological doomerism that goes back decades. To revisit these works is to wonder whether AI really presents new risks, or whether it is simply the manifestation of a risk previously foretold.

Of course, predictions of global apocalypse are as ancient as humanity. Given the historical track record—no global apocalypse yet!—those predicting apocalypse have traditionally had a rough time being taken seriously. Still, there is a big difference between predicting apocalypse ex nihilo and making a reasoned argument that it necessarily emerges from specific human decisions and habits.

The Limits to Growth

This book, published in 1972, was an early effort to quantitatively model the effects of technological change. I read it some years ago. As the title implies, The Limits to Growth (LtG) considered how five key global factors would affect human development:

If the present growth trends in world population, industrialization, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.

In support of their theories, and unusually for the era, the authors of LtG relied extensively on software simulations. This allowed them to model, for instance, the course of human progress under realistic current assumptions. In this model, population decreases rapidly after around 2050:

But using a software model also allowed the authors to run simulations with different starting assumptions, such as this one with natural resources doubled. Perhaps counterintuitively, the model predicts that increasing resources causes the big drop in population to happen sooner, because uncontrolled pollution negates the benefits of the extra resources:

For 1972, these charts are pretty rad. They remind me of output from programs in my beloved childhood copy of BASIC Computer Games.
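For readers who want to play along at home, here’s a minimal toy sketch of the kind of stock-and-flow feedback loop the book simulates. To be clear, this is not the actual World3 model used by the authors; every variable, rate, and starting value below is an illustrative guess.

# A toy stock-and-flow loop, loosely in the spirit of The Limits to Growth.
# Every rate and initial value here is an illustrative guess, not a parameter
# from the book's actual model.

population = 1.0    # arbitrary units
resources = 10.0    # finite, nonrenewable stock
pollution = 0.0

for year in range(1900, 2101):
    # growth is throttled by dwindling resources and accumulated pollution
    growth = 0.02 * population * (resources / 10.0) - 0.01 * pollution * population
    consumption = 0.02 * population
    population = max(population + growth, 0.0)
    resources = max(resources - consumption, 0.0)
    pollution += 0.005 * population
    if year % 25 == 0:
        print(f"{year}: population {population:5.2f}  resources {resources:5.2f}  pollution {pollution:5.2f}")

The point of the toy isn’t the particular numbers. It’s that once a few feedback loops are coupled together, growth followed by overshoot and decline falls out naturally, and nudging one starting value (say, doubling resources) tends to reshape the curve rather than escape it.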

Consistent with the usual historical reaction to doomers, LtG’s methods and conclusions were divisive for decades. The original authors produced multiple sequels: Beyond the Limits in 1992, Limits to Growth: The 30-Year Update in 2004, and Limits and Beyond in 2022. Spoiler: the population still collapses. In the 2023 paper Recalibration of Limits to Growth, a separate group of authors ran the LtG simulations with updated empirical data. Result? You guessed it.

The Collapse of Complex Societies

This 1988 book, by Joseph Tainter, I read earlier this year. I recommend it—as a work of suspenseful persuasive writing, it’s pretty great. (For those who prefer to preserve the suspense: it’s safe to read this section, but skip the last one.)

Tainter is an anthropologist. His definition of collapse is similar to that of LtG: “a rapid, significant loss of an established level of sociopolitical complexity.” It doesn’t mean the end of a society, necessarily. Rather, it marks a point after which quality of life tends to get ever worse for ever more people.

But the primary question posed by the book is never said out loud till near the end: is our own society approaching a state of collapse? Might it already be happening?

In contrast to LtG, Tainter considers collapse retrospectively rather than prospectively. He observes that although complex societies have existed for only a small fraction of human history, they’ve nevertheless displayed a strong propensity toward collapse. Why?

Tainter’s theory is straightforward. First, he notes that sociopolitical complexity incurs costs, and that increasing complexity imposes increasing costs. Second, he observes that economists have shown diminishing returns to be one of the best-evidenced principles in human history—nearly an iron law. Putting these together, Tainter theorizes that societies grow more complex until:

… the increased costs of sociopolitical evolution … reach a point of diminishing marginal returns. … After a certain point, increased investments in complexity fail to yield proportionately increasing returns. Marginal returns decline and marginal costs rise. Complexity as a strategy becomes increasingly costly, and yields decreasing marginal benefits.

Or more colloquially—luxury becomes necessity. Complex societies are thereby forced into a position where an increasing share of their social resources is spent on maintaining the status quo rather than on investing in future improvements.
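To make the shape of this argument concrete, here’s a small numeric sketch. The square-root benefit curve and the flat per-unit cost are arbitrary stand-ins chosen for illustration; Tainter states the principle qualitatively, not as a formula.

# A minimal illustration of diminishing marginal returns on complexity.
# The concave benefit curve and the flat per-unit cost are arbitrary stand-ins.

def benefit(complexity: float) -> float:
    return complexity ** 0.5    # concave: each added unit yields less than the last

COST_PER_UNIT = 0.15

previous = benefit(0)
for units in range(1, 16):
    marginal_benefit = benefit(units) - previous
    previous = benefit(units)
    net = marginal_benefit - COST_PER_UNIT
    print(f"unit {units:2d}: marginal benefit {marginal_benefit:.3f}, net of cost {net:+.3f}")

Early units of complexity pay for themselves handsomely; somewhere past the eleventh unit, the marginal benefit drops below the marginal cost, and every further unit is maintained at a loss. That is roughly the position into which Tainter says mature complex societies drift.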

(Relatedly, Tainter notes that the perception that childrearing becomes more expensive with each generation is not illusory, because investments in complexity that benefit children also tend to increase, imposing a corresponding burden on parents. As an alternative, Tainter mentions a present-day society where “[c]hildren are minimally cared for by their mothers until age three, and then are put out to fend for themselves,” apparently by picking up work “in agricultural fields … scar[ing] off birds and baboons.”)

The diminishing return on investment in social complexity does not itself produce collapse. But it puts the society into a state of increasing vulnerability. As its allocation of resources to complexity increases, it progressively loses the ability to cope with the occasional external shocks that complex societies are ordinarily well adapted to endure (e.g., war, natural disaster, pandemic). Eventually, with sufficient weakening of the society and a sufficiently dire shock, the society will—inevitably—collapse.

So what does it all mean for us?

Tainter deftly leaves this point aside till nearly the end of the book, having given us readers ample time and clues to consider the issue ourselves.

Tainter agrees with the LtG authors that we can’t simply grow our way out of the problem through, say, increased research investment, because such research cannot produce substitutes for all forms of social complexity. For instance, research can produce better software that performs certain tasks more cheaply and thereby reduces organizational costs. But we can’t easily invent entire substitutes for the complex, socially valuable organizations that use the software, such as the military or the judicial system. Furthermore, increasing these research investments would require allocating an escalating proportion of our social wealth, which is already being spent on current goodies we’d rather not relinquish.

But Tainter disagrees with the LtG authors that economic deceleration or “undevelopment” is an option. Tainter notes that although complex societies are a comparatively recent invention in human history, our planet is currently covered with interdependent complex human societies. Thus, today’s pursuit of social complexity is not only an end unto itself, but also geopolitically competitive. In principle, a nation could choose to decelerate its own economic growth to forestall collapse in the future. But that would simply leave it vulnerable to domination by another nation today. Such deceleration would therefore be politically irrational.

Tainter thus arrives at maximum deadpan:

Collapse, if and when it comes again, will this time be global. No longer can any individual nation collapse. World civilization will disintegrate as a whole. Competitors who evolve as peers collapse in like manner. … It is difficult to know whether world industrial society has yet reached the point where the marginal return for its overall pattern of investment has begun to decline. … Even if [that] point … has not yet been reached, that point will inevitably arrive. … The political conflicts that this will cause, coupled with the increasingly easy availability of nuclear weapons, will create a dangerous world situation in the foreseeable future.

So if you were worried that climate change or AI or cryptocurrency would doom the human race—relax, we were doomed no matter what.

The final page of Tainter’s book offers one productive suggestion, though with a strong sci-fi vibe:

A new energy subsidy is necessary if a declining standard of living and a future global collapse are to be averted. A more abundant form of energy might not reverse the declining marginal return on investment in complexity, but it would make it more possible to finance that investment …

Here indeed is a paradox: a disastrous condition that all decry may force us to tolerate a situation of declining marginal returns long enough to achieve a temporary solution to it. This reprieve must be used rationally to seek for and develop the new energy source(s) that will be necessary … This research and development must be an item of the highest priority, even if, as predicted, this requires reallocation of resources from other economic sectors. Adequate funding of this effort should be included in the budget of every industrialized nation (and the results shared by all).

In other words, humanity can postpone—though not avoid—its big global collapse by collectively choosing to make massive economic sacrifices to fund the development of transformational energy technology. Or, I suppose, some software device that can itself invent the energy technology.

Any ideas?

PS—the most realistic artistic depiction of collapse

Children of Men, about which more later.