Have We Entered the Ages of Discord?

Peter Turchin, of Secular Cycles fame, predicted that political violence and discord in the United States would reach a peak in 2020 (or thereabouts). He even put that prediction in writing in a book entitled Ages of Discord:

In 2010 I made the prediction that the United States will experience a period of heightened social and political instability during the 2020s…Structural-demographic theory (SDT) suggests that the violence spike of the 2020s will be worse than the one around 1970, and perhaps as bad as the last big spike during the 1920s. Thus, the expectation is that there will be more than 100 events per 5 years. In terms of the second metric, we should expect more than 5 fatalities per 1 million of population per 5 years, if the theory is correct.

And there you have it. If violence doesn’t exceed these thresholds by 2025, then SDT is wrong.

A Quantitative Prediction for Political Violence in the 2020s (Cliodynamica)
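Turchin's two thresholds are explicit enough to check mechanically. Here is a minimal sketch of what that check would look like; the numbers below are made up for illustration, not real incident data (real figures would come from a dataset like Turchin's US political violence database):

```python
# Turchin's SDT thresholds, as quoted above, per 5-year window.
EVENTS_THRESHOLD = 100        # violent events per 5 years
FATALITY_THRESHOLD = 5.0      # fatalities per 1 million population per 5 years

def exceeds_sdt_thresholds(events, fatalities, population_millions):
    """Return whether a 5-year window crosses either of the two thresholds."""
    fatalities_per_million = fatalities / population_millions
    return events > EVENTS_THRESHOLD or fatalities_per_million > FATALITY_THRESHOLD

# Hypothetical window: 120 events, 1,800 deaths, US population ~330 million.
print(exceeds_sdt_thresholds(120, 1800, 330))  # True: both metrics cross the bar
```

By Turchin's own framing, if no 5-year window in the 2020s returns True, SDT is falsified.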

And the 1970s were pretty bad. From a review of Ages of Discord at Slate Star Codex:

The 1970s underground wasn’t small. It was hundreds of people becoming urban guerrillas. Bombing buildings: the Pentagon, the Capitol, courthouses, restaurants, corporations. Robbing banks. Assassinating police. People really thought that revolution was imminent, and thought violence would bring it about.

Book Review: Ages Of Discord (Slate Star Codex)

See also Coronavirus and Our Age of Discord (Cliodynamica)

There are several general trends during the pre-crisis phase that make the rise and spread of pandemics more likely. At the most basic level, sustained population growth results in greater population density, which increases the basic reproduction number of nearly all diseases. Even more importantly, labor oversupply, resulting from overpopulation, depresses wages and incomes for most. Immiseration, especially its biological aspects, makes people less capable of fighting off pathogens. People in search of jobs move more and increasingly concentrate in the cities, which become breeding grounds for disease. Because of greater movement between regions, it is easy for disease to jump between cities.

Elites, who enjoy growing incomes resulting from low worker wages, spend them on luxuries, including exotic ones. This drives long-distance trade, which more tightly connects distant world regions. My 2008 article is primarily about this process, which we call “pre-modern globalizations.” As a result, a particularly aggressive pathogen arising in, for example, China, can rapidly jump to Europe.

Finally, when the crisis breaks out, it brings about a wave of internal warfare. Marauding armies of soldiers, rebels, and brigands themselves become incubators of disease that they spread widely as they travel through the landscape.

This description is tailored to pre-modern (and early modern) Ages of Discord. Today, in 2020, details are different. But the main drivers — globalization and popular immiseration — are the same…

Right now Turchin is starting to look like Nostradamus. He hasn’t addressed this so far on his blog, but I’m interested to hear his take.

One thing I wonder about, though: the police state is so much more powerful than it was in the 1970s due to digital surveillance technology. Everyone carries around a device that tracks all their movements all the time, and the few who don’t will be noticeable by their absence. Cameras are everywhere. Our online presence is constantly monitored, as Edward Snowden revealed (I don’t think it’s a coincidence that laws prohibiting government monitoring of the citizenry have been repealed just in the last few weeks). Plus, the systems of cybernetic control for managing large populations are far more sophisticated, as Adam Curtis described in All Watched Over By Machines of Loving Grace. I think these cybernetic systems also foment discord, since they allow segments of the population to live in completely separate realities managed by different sets of elites—there is no consensus reality anymore, as responses to the pandemic showed.

But it does seem like an alarming number of people have been disenfranchised and have no constructive outlet for their anger, and no effective recourse for changing the system anymore. Add to that Great Depression levels of popular immiseration while elites are being bailed out with unlimited funds. This is what happens when you make peaceful revolution impossible—violent revolution becomes inevitable.

UPDATE: Turchin’s latest post (June 1)

What is much more certain is that the deep structural drivers for instability continue to operate unabated. Worse, the Covid-19 pandemic exacerbated several of these instability drivers. This means that even after the current wave of indignation, caused by the killing of George Floyd, subsides, there will be other triggers that will continue to spark more fires—as long as the structural forces, undermining the stability of our society, continue to provide abundant fuel for them.

Archaeology/Anthropology Roundup

I want to get back to some of the topics I’ve left hanging, but first I’d like to mention a few other topics that have been sadly neglected during the whole—er, pandemic thing—but that we frequently discuss here on the blog. Specifically archaeology and architecture. This one will be about archaeology.

I want to highlight something that came out about a month ago that you’re probably aware of. If not, here it is: the Amazon rain forest has been found to be one of the cradles of agriculture.

The original cradles of agriculture described in history textbooks were the great river valley of Mesopotamia between the Tigris and Euphrates rivers, along with the Nile valley. As archaeology expanded from its European origins, the Indus river valley in India/Pakistan and the Yellow river valley in China were included as cradles of agriculture. Then came New World sources of maize and potatoes in Central and South America. In recent years, archaeologists have included a few other places, notably Papua New Guinea. Now, it seems we can add the Amazon rain forest to the list:

There’s a small and exclusive list of places where crop cultivation first got started in the ancient world – and it looks as though that list might have another entry, according to new research of curious ‘islands’ in the Amazon basin.

The savannah of the Llanos de Moxos in northern Bolivia is littered with thousands of patches of forest, rising a few feet above the surrounding wetlands. Many of these forest islands, as researchers call them, are thought to be the remnants of human habitation from the early and mid-Holocene.

Now, thanks to new analysis of the sediment found in some of these islands, researchers have unearthed signs that these spots were used to grow cassava (manioc) and squash a little over 10,000 years ago.

That’s impressive, as this timing places them some 8,000 years earlier than scientists had previously found evidence for, indicating that the people who lived in this part of the world – the southwestern corner of the Amazon basin – got a head start on farming practices.

In fact, the findings suggest that southwestern Amazonia can now join China, the Middle East, Mesoamerica, and the Andes as one of the areas where organised plant growing first got going – in the words of the research team, “one of the most important cultural transitions in human history”.

Strange Forest Patches Littering The Amazon Point to Agriculture 10,000 Years Ago (Science Alert)

The researchers were able to identify evidence of manioc (cassava, yuca) that was grown 10,350 years ago. Squash appears 10,250 years ago, and maize more recently – just 6,850 years ago.

“This is quite surprising,” said Dr [Umberto] Lombardo. “This is Amazonia, this is one of these places that a few years ago we thought to be like a virgin forest, an untouched environment. Now we’re finding this evidence that people were living there 10,500 years ago, and they started practising cultivation.”

The people who lived at this time probably also survived on sweet potato and peanuts, as well as fish and large herbivores. The researchers say it’s likely that the humans who lived here may have brought their plants with them. They believe their study is another example of the global impact of the environmental changes being felt as the world warmed up at the end of the last ice age.

“It’s interesting in that it confirms again that domestication begins at the start of the Holocene period, when we have this climate change that we see as we exit from the ice age,” said Dr Lombardo. “We entered this warm period, when all over the world at the same time, people start cultivating.”

Crops were cultivated in regions of the Amazon ‘10,000 years ago’ (BBC)

Note that what was grown appears to be vegetable plants like cassava (yuca) and squash, not cereal grains. Recall James Scott’s point that annual cereal grains were a starting point for civilizations, as they were preservable and ripened at the same rate at the same time, making them confiscatable by central authorities. Cultures that subsisted on perishable garden plants, however, could escape the trap of civilization.

Here’s a major study that ties into the feasting theory: the first beer was brewed as part of funerary rites for the dead:

The first beer was for the dead. That’s according to a 2018 study of stone vessels from Raqefet Cave in Israel, a 13,000-year-old graveyard containing roughly 30 burials of the Natufian culture. On three limestone mortars, archaeologists found wear and tear and plant molecules, interpreted as evidence of alcohol production. Given the cemetery setting, researchers propose grog was made during funerary rituals in the cave, as an offering to the dearly departed and refreshment for the living. Raqefet’s beer would predate farming in the Near East by as much as 2,000 years — and booze production, globally, by some 4,000 years.

The beer hypothesis, published in the Journal of Archaeological Science: Reports, comes from Raqefet excavators, based at Israel’s University of Haifa, and Stanford University scientists, who conducted microscopic analyses. In previous research, they made experimental brews the ancient way, to see how the process altered artifacts. Some telltale signs were then identified on Raqefet stones: A roughly 10-inch diameter mortar, carved directly into the cave floor, had micro-scratches — probably from a wooden pestle — and starch with damage indicative of mashing, heating and fermenting, all steps in alcohol production. Two funnel-shaped stones had traces of cereals, legumes and flax, interpreted as evidence that they were once lined with woven baskets and used to store grains and other beer ingredients. Lead author Li Liu thinks Natufians also made bread, but that these three vessels were for beer — the earliest yet discovered.

Was the First Beer Brewed for the Dead? (Discover)

The counterpoint is that they were baking bread instead, leading back to the old question: what were grains first cultivated for, beer or bread? My suspicion is the former, with the latter being an effective use of “surplus” resources, or a backup strategy in case of food shortages.

The connection between beer-brewing and funerary rites is significant, however. The feasting theory of inequality’s origins doesn’t go into much detail about why such feasts were held. But if such ritual feasts were held as a means of commemorating the dead—most likely tied to ancestor worship—then the existence of such events takes on additional importance.

When I talked about the history of cities and the feasting theory, I noted that these seem to have taken place in ritual areas that were marked off (sacred versus profane) for the purposes of feasting and trade, and where multiple different cultures would coalesce and mingle. At such locations, both feasting and trading were carried out. These locations appear to have played a crucial role in human social development, and they’ve been found all over the world. Archaeologists have been studying one in Florida:

More than a thousand years ago, people from across the Southeast regularly traveled to a small island on Florida’s Gulf Coast to bond over oysters, likely as a means of coping with climate change and social upheaval.

Archaeologists’ analysis of present-day Roberts Island, about 50 miles north of Tampa Bay, showed that ancient people continued their centuries-long tradition of meeting to socialize and feast, even after an unknown crisis around A.D. 650 triggered the abandonment of most other such ceremonial sites in the region. For the next 400 years, out-of-towners made trips to the island, where shell mounds and a stepped pyramid were maintained by a small group of locals. But unlike the lavish spreads of the past, the menu primarily consisted of oysters, possibly a reflection of lower sea levels and cool, dry conditions.

During tough times, ancient ‘tourists’ sought solace in Florida oyster feasts (Phys.org)

So I guess Florida has always been a magnet for tourists.

And although Stonehenge is well-known, much less known is Pömmelte, “Germany’s Stonehenge”.

Starting in April, an about-4,000-year-old settlement will be excavated to provide insights into Early Bronze Age life. Settlements of this size have not yet been found at the related henges in the British Isles.

Pömmelte is a ring-shaped sanctuary with earth walls, ditches and wooden piles that is located in the northeastern part of Germany, south of Magdeburg. The site is very much reminiscent of the world-famous monument Stonehenge, and it is likely that the people there performed very similar rituals to those of their counterparts in what is now Britain 4,300 years ago.

Who lived near Pömmelte, the ‘German Stonehenge’? (DW)

This place reminds me a lot of Woodhenge at the Cahokia complex (Wikipedia), which I was able to visit a few years ago. The presence of such similar structures separated across vast times and places (precluding any chance of cultural contact) is something that we need to think deeply about.

From the article above, I also learned about the Nebra Sky Disc (Wikipedia). Recall that the first cities were trying to replicate a “cosmic order” here on earth.

Related: Hunter-gatherer networks accelerated human evolution (Science Daily)

Humans began developing a complex culture as early as the Stone Age. This development was brought about by social interactions between various groups of hunters and gatherers, a UZH study has now confirmed…

The researchers equipped 53 adult Agta living in woodland in seven interconnected residential camps with tracking devices and recorded every social interaction between members of the different camps over a period of one month. The researchers also did the same for a different group, who lived on the coast… The team of researchers then developed a computer model of this social structure and simulated the complex cultural creation of a plant-based medicinal product.

In this fictitious scenario, the people shared their knowledge of medicinal plants with every encounter and combined this knowledge to develop better remedies. This process gradually leads to the development of a highly effective new medicinal product. According to the researchers’ simulation, an average of 250 (woodland camps) to 500 (coastal camps) rounds of social interactions were required for the medicinal product to emerge.
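The study's actual model isn't reproduced in the article, but the basic mechanism it describes—pairwise encounters pooling knowledge until someone accumulates the full combination—can be sketched as a toy simulation. All parameters and structure here are my own guesses for illustration, not the UZH team's:

```python
import random

# Toy version of the scenario above: agents meet in random pairs and pool
# their knowledge of "medicinal plant" ingredients; the remedy emerges once
# one agent has accumulated everything known collectively.
def rounds_until_discovery(n_agents=53, n_ingredients=10, seed=1):
    rng = random.Random(seed)
    # each agent starts out knowing one random ingredient
    knowledge = [{rng.randrange(n_ingredients)} for _ in range(n_agents)]
    target = set().union(*knowledge)  # everything the population knows collectively
    rounds = 0
    while not any(k == target for k in knowledge):
        a, b = rng.sample(range(n_agents), 2)
        shared = knowledge[a] | knowledge[b]  # an encounter pools both agents' knowledge
        knowledge[a] = knowledge[b] = shared
        rounds += 1
    return rounds

print(rounds_until_discovery())
```

Restricting which pairs can meet (to mimic camp structure) would slow the process down, which is the study's point: the shape of the interaction network governs how fast cumulative culture emerges.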

And see: Social Networks and Cooperation in Hunter-Gatherers (NCBI)

A lesser-known megalithic necropolis: the Ħal Saflieni Hypogeum (Wikipedia), built some 5,000 years ago. Do these look like they were built by people who were filthy and starving?

Related: I only recently heard about this site, but apparently there was a significant industrial complex devoted to the manufacture of flint tools that functioned during the stone age, and well into the Bronze and Iron ages: Grimes Graves (Wikipedia). This gives great insight into the fact that complex specialization of labor and regional comparative advantage have always been with us; they weren’t invented at the time of Smith or Ricardo. We just didn’t fetishize them the way we do now.

And the salt mines of Hallstatt in modern-day Austria have been worked for thousands of years, since before the Bronze Age. Apparently, mining required child labor:

Mining there began at least 7,000 years ago and continues modestly today. That makes the UNESCO World Heritage site “the oldest industrial landscape in the world [that’s] still producing,” says [archaeologist Hans] Reschreiter, who has led excavations at Hallstatt for nearly two decades.

But the mine’s peak was during the Bronze and Iron ages, when salt’s sky-high value made Hallstatt one of Europe’s wealthiest communities. Archaeologists understand a great deal about operations then, thanks to an extraordinary hoard of artifacts including leather sacks, food scraps, human feces and millions of used torches.

Many of the finds are made of perishable materials that are usually quick to decay. They survived in the mine’s tunnels because salt is a preservative — the very reason it was in such high demand during Hallstatt’s heyday.

Among the artifacts, the small shoes and caps showed children were in the mine. But researchers needed more evidence to determine whether the young ones were merely tagging along with working parents or actually mining.

To understand the children’s roles, Austrian Academy of Sciences anthropologist Doris Pany-Kucera turned to their graves. In a study of 99 adults from Hallstatt’s cemetery, she found skeletal markers of muscle strain and injury, suggesting many villagers performed hard labor — some from an early age.

Then, in 2019, she reported her analysis of the remains of 15 children and teenagers, finding signs of repetitive work. Children as young as 6 suffered arthritis of the elbow, knee and spine. Several had fractured skulls or were missing bits of bone, snapped from a joint under severe strain. Vertebrae were worn or compressed on all individuals.

Combining clues from the Hallstatt bones and artifacts, researchers traced the children’s possible contributions to the salt industry. They believe the youngest children — 3- to 4-year-olds — may have held the torches necessary for light. By age 8, kids likely assumed hauling and crawling duties, carrying supplies atop their heads or shimmying through crevices too narrow for grown-ups…

The Ancient Practice of Child Labor Is Coming to Light (Discover)

And this point is important:

It’s no surprise that the young labored at Hallstatt. Children are, and always have been, essential contributors to community and family work. A childhood of play and formal education is a relatively modern concept that even today exists mostly in wealthy societies.

There are those who say that, despite all our technological advancements, we haven’t really reduced the need for human labor. But that’s clearly untrue! We’ve already effectively eliminated the labor of everyone under 18, and from a practical standpoint, nearly everyone under 21. We just forget it because it’s been normalized, but people younger than 18 labored all throughout human history, even into the early twentieth century. Now they are no longer needed or wanted. And with ever more schooling required for jobs, we’re just raising the age of entry into the workforce. Note that “retirement”—to the extent that it continues to exist—is also a modern phenomenon, removing people over 55/60 from the workforce. Labor has most certainly been eliminated, and will continue to be.

Neanderthals and humans co-existed in Europe much longer than we previously thought. (Guardian)

A reminder that many of the earliest human habitats are under the water: Early humans thrived in this drowned South African landscape (Phys.org)

Archaeologists analyzed an ancient cemetery in Hungary, containing the distinctive elongated skulls the Huns were known for:

They found that Mözs-Icsei dűlő was a remarkably diverse community and were able to identify three distinct groups across two or three generations (96 burials total) until the abandonment of Mözs cemetery around 470 AD: a small local founder group, with graves built in a brick-lined Roman style; a foreign group of twelve individuals of similar isotopic and cultural background, who appear to have arrived around a decade after the founders and may have helped establish the traditions of grave goods and skull deformation seen in later burials; and a group of later burials featuring mingled Roman and various foreign traditions.

51 individuals total, including adult males, females, and children, had artificially deformed skulls with depressions shaped by bandage wrappings, making Mözs-Icsei dűlő one of the largest concentrations of this cultural phenomenon in the region. The strontium isotope ratios at Mözs-Icsei dűlő were also significantly more variable than those of animal remains and prehistoric burials uncovered in the same geographic region of the Carpathian Basin, and indicate that most of Mözs’ adult population lived elsewhere during their childhood. Moreover, carbon and nitrogen isotope data attest to remarkable contributions of millet to the human diet.

Deformed skulls in an ancient cemetery reveal a multicultural community in transition (Phys.org)

See also: Strange, elongated skulls reveal medieval Bulgarian brides were traded for politics (Science)

Speaking of burials: researchers found 1,000-year-old burials in Siberia wearing copper masks: Mummified by accident in copper masks almost 1,000 years ago: but who were they? (Siberian Times) I thought this was fascinating, because copper has been shown to kill coronaviruses, and we have been told to wear masks to prevent transmission. Copper-infused masks are becoming popular (a Google search turned up the above article). Coincidence? Probably.

Religion in South America:

An ancient group of people made ritual offerings to supernatural deities near the Island of the Sun in Lake Titicaca, Bolivia, about 500 years earlier than the Incas, according to an international team of researchers. The team’s findings suggest that organized religion emerged much earlier in the region than previously thought.

Rise of religion pre-dates Incas at Lake Titicaca (phys.org)

This is possibly the coolest scientific study ever conducted: a group of scientists have reconstructed Bronze Age fighting techniques by looking at the wear marks on Bronze Age weapons and armor. Wow! Time to redo that famous fight scene from Troy?

While a graduate student at Newcastle University, [University of Göttingen archaeologist Raphael Hermann] recruited members of a local club devoted to recreating and teaching medieval European combat styles, and asked them to duel with the replicas, using motions found in combat manuals written in the Middle Ages. After recording the combat sequences using high-speed cameras, the researchers noted the type and location of dents and notches left after each clash.

The team assigned characteristic wear patterns to specific sword moves and combinations. If the motions left the same distinctive marks found on Bronze Age swords, Hermann says, it was highly likely that Bronze Age warriors had also used those moves. For example, marks on the replica swords made by a technique known to medieval German duelists as versetzen, or “displacement”—locking blades in an effort to control and dominate an opponent’s weapon—were identical to distinct bulges found on swords from Bronze Age Italy and Great Britain.

Next, Hermann and colleagues put 110 Bronze Age swords from Italy and Great Britain under a microscope and cataloged more than 2500 wear marks. Wear patterns were linked to geography and time, suggesting distinct fighting styles developed over centuries… Displacement, for example, didn’t show up until 1300 B.C.E. and appeared in Italy several centuries before it did in Great Britain.

“In order to fight the way the marks show, there has to be a lot of training involved,” Hermann says. Because the marks are so consistent from sword to sword, they suggest different warriors weren’t swinging at random, but were using well-practiced techniques. Christian Horn, an archaeologist at the University of Gothenburg who was not involved in the research, agrees, and says the experiments offer quantitative evidence of things archaeologists had only been able to speculate about.

Sword-wielding scientists show how ancient fighting techniques spread across Bronze Age Europe (Science Magazine)

This is also important from a historical standpoint: it indicates that the Bronze Age likely saw the rise of a class of professional fighters, as opposed to the all-hands-on-deck mêlée fighting of all adult males that probably characterized Stone Age warfare. Because fighting became “professionalized” due to the existence of these bronze weapons—which required extensive training to use effectively—the use of force passed into the hands of a specialist warrior caste who were able to impose their will on lesser-armed populations.

This probably explains at least some of the origins of inequality, as those who specialized in the use of violence (as opposed to farming or trading) could then become a ruling class. Inequality always rises when the means of force become confined to a specific class of people. Note also that money in coined form was first invented to pay specialist mercenaries in the Greek states of Asia Minor. These mercenaries were likely the ones training in the intensive combat techniques described by the study above.

Related: Medieval battles weren’t as chaotic as people think nor as movies portray! (Reddit) Given how humans react to violence psychologically, how would medieval battles really look, as opposed to the battle scenes depicted in movies? (Hint: not like a mosh pit)

Possibly related: Modern men are wimps, according to new book (Phys.org). Controversial, but likely correct; our ancestors had much more physical lives, and the less fit would not have reproduced as well. My unprovable notion is that we became so effective at warfare that the most violent people died off in these types of conflicts, giving more placid people a reproductive advantage. Thus, we become less violent over time.

Definitely related: What Compelled the Roman Way of Warfare? Killing for the Republic (Real Clear Defense)

Any polity can field an army through compulsion or other violent means. What matters more is what makes your average person choose to stay on the battlefield. [Steele] Brand argues the Roman Republic motivated its soldiers by publicly honoring at all times the initiative, strength, discipline, perseverance, courage, and loyalty of individual citizens. Moreover, it was this combination of public and private values, flexible political institutions, and a tailored upbringing that gradually culminated in the superiority of the Roman legion against the arguably technically superior Macedonian phalanx at Pydna. Brand calls the entirety of this system “civic militarism,” defined as “self defense writ large for the state.”

Paging Dr. Julian Jaynes: Majority of authors ‘hear’ their characters speak, finds study (Guardian). See also The Origin of Consciousness Reading Companion Part 1 (Put a Number On It)

Collapse files:

…a new movement called “collapsology”—which warns of the possible collapse of our societies as we know them—is gaining ground.

With climate change exposing how unsustainable the economic and social model based on fossil fuels is, they fear orthodox thinking may be speeding us to our doom.

The theory first emerged from France’s Momentum Institute, and was popularised by a 2015 book, “How Everything Can Collapse”. Some of its supporters, like former French environment minister Yves Cochet, believe the coronavirus crisis is another sign of impending catastrophe.

While the mathematician, who founded France’s Green party, “still hesitates” about saying whether the virus will be the catalyst for a domino effect, he quoted the quip that “it’s too early to say if it’s too late”.

Yet Cochet—whose book “Before the Collapse” predicts a meltdown in the next decade—is convinced that the virus will lead to “a global economic crisis of greater severity than has been imagined”.

The 74-year-old, who retired to France’s rural Brittany region so he could live more sustainably, is also worried about an impending “global disaster with lots of victims, both economic and otherwise”.

“What is happening now is a symptom of a whole series of weaknesses,” warned Professor Yves Citton of Paris VIII University.

“It isn’t the end of the world but a warning about something that has already been set in motion,” he told AFP, “a whole series of collapses that have begun”.

The slide may be slow, said Jean-Marc Jancovici, who heads the Shift Project think-tank which aims to “free economics from carbon”.

But “a little step has been taken (with the virus) that there is no going back”, he argued.

Others have a more chilling take.

“The big lesson of history… and of the Horsemen of the Apocalypse is that pestilence, war and famine tend to follow in each others’ wake,” said Pablo Servigne, an ecologist and agricultural engineer who co-wrote “How Everything Can Collapse”.

“We have a pandemic which could lead to another shock—wars, conflicts and famines,” he added.

“And famines will make us more vulnerable to other pandemics.”

‘Collapsology’: Is this the end of civilisation as we know it? (Phys.org)

The last ice age (or Last Glacial Maximum) peaked around 26,000 years ago. The earth warmed over the coming millennia, driven by an increase in radiation from the sun due to changes in the earth’s orbit (the Milankovic cycles) amplified by CO₂ released from warming water, which further warmed the atmosphere.

But even as the earth warmed it was interrupted by cooler periods known as “stadials”. These were caused by melt water from melting ice sheets which cool large regions of the ocean.

Marked climate variability and extreme weather events during the early Holocene retarded development of sustainable agriculture.

Sparse human settlements existed about 12,000 – 11,000 years ago. The flourishing of human civilisation from about 10,000 years ago, and in particular from 7,000 years ago, critically depended on stabilisation of climate conditions which allowed planting and harvesting of seed and growing of crops, facilitating growth of villages and towns and thereby of civilisation.

Peak warming periods early in the Holocene were associated with prevalence of heavy monsoons and heavy floods, likely reflected by Noah’s ark story.

The climate stabilised about 7,000 – 5,000 years ago. This allowed the flourishing of civilisations along the Nile, Tigris, Euphrates, Indus and the Yellow River.

The ancient river valley civilisations’ cultivation depended on flow and ebb cycles, in turn dependent on seasonal rains and melting snows in the mountain sources of the rivers. These formed the conditions for production of excess food.

When such conditions declined due to droughts or floods, civilisations collapsed. Examples include the decline of the Egyptian, Mesopotamian and Indus civilisations about 4,200 years ago due to severe drought.

Throughout the Holocene relatively warm periods, such as the Medieval Warm Period (900-1200 AD), and cold periods, such as the Little Ice Age (around 1600 – 1700 AD), led to agricultural crises with consequent hunger, epidemics and wars. A classic account of the consequences of these events is presented in the book Collapse by Jared Diamond.

It’s not just Middle Eastern civilisations. Across the globe and throughout history the rise and fall of civilisations such as the Maya in Central America, the Tiwanaku in Bolivia, and the Khmer Empire in Cambodia, have been determined by the ebb and flow of droughts and floods.

Greenhouse gas levels were stable or declined between 8,000 and 6,000 years ago, but began to rise slowly thereafter. According to William Ruddiman at the University of Virginia, this rise was due to deforestation, burning and land clearing by people. It halted the decline in greenhouse gases and ultimately prevented the next ice age. If so, human-caused climate change began much earlier than we usually think.

Rise and fall in solar radiation continued to shift the climate. The Medieval Warm Period was driven by an increase in solar radiation, while the Little Ice Age was caused at least in part by a decrease.

Now we’ve changed the game again by releasing over 600 billion tonnes of carbon into the atmosphere since the Industrial Revolution, raising CO₂ concentrations from around 270 parts per million to about 400 parts per million…

Climate and the rise and fall of civilizations: a lesson from the past (The Conversation)
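The emissions and concentration figures quoted above can be cross-checked with some back-of-envelope arithmetic. This is a rough sketch, assuming the standard conversion of roughly 2.12 billion tonnes of carbon (GtC) per ppm of atmospheric CO₂; the conversion factor and the resulting “airborne fraction” interpretation are my additions, not claims from the article:

```python
# Sanity-check the quoted numbers: ~600 GtC emitted since the
# Industrial Revolution, CO2 rising from ~270 to ~400 ppm.
GTC_PER_PPM = 2.12          # approx. tonnes of carbon per ppm of CO2 (assumption)
emitted_gtc = 600           # cumulative emissions cited in the article
ppm_rise = 400 - 270        # observed concentration rise cited in the article

airborne_gtc = ppm_rise * GTC_PER_PPM         # carbon still in the atmosphere
airborne_fraction = airborne_gtc / emitted_gtc

# Just under half of the emitted carbon remains airborne; the rest
# has been absorbed by oceans and land vegetation.
```

The result, roughly 0.46, is consistent with the commonly cited airborne fraction of about 45%, which suggests the article’s two figures hang together.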

Lewis Mumford on the Medieval City

One of my favorite passages from The Culture of Cities (pp. 49-51):

In the main, then, the medieval town was not merely a vital social environment: it was likewise adequate, at least to a greater degree than one would gather from its decayed remains, on the biological side. There were smoky rooms to endure; but there was also perfume in the garden behind the burgher’s house: the fragrant flowers and the savory herbs were widely cultivated. There was the smell of the barnyard in the street, diminishing in the sixteenth century, except for the growing presence of horses: but there would also be the odor of flowering orchards in the spring, or the scent of the new mown hay, floating across the fields in the early summer.

Though cockneys may wrinkle their noses at this combination of odors, no lover of the country will be put off by the smell of horse-dung or cow-dung, even though mingled occasionally with that of human excrement: is the reek of gasoline exhaust, the sour smell of a subway crowd, the pervasive odor of a garbage dump, or the chlorinated rankness of a public lavatory more gratifying? Even in the matter of smells, sweetness is not entirely on the side of the modern city.

As for the eye and ear, there is no doubt where the balance of advantage goes: the majority of medieval towns were infinitely superior to those erected during the last century. One awoke in the medieval town to the crowing of the cock, the chirping of the birds nesting under the eaves, or to the tolling of the hours in the monastery on the outskirts, perhaps to the chime of bells in the new bell-tower. Song rose easily on the lips, from the plain chant of the monks to the refrains of the ballad singer in the market place, or that of the apprentice and the house-maid at work. As late as the seventeenth century, the ability to hold a part in a domestic choral song was rated by Pepys as an indispensable quality in a new maid.

There were work songs distinct for each craft, often composed to the rhythmic tapping or hammering of the craftsman himself. Fitz-Stephens reported in the twelfth century that the sound of the water mill was a pleasant one among the green fields of London. At night there would be complete silence, but for the stirring of animals and the calling of hours by the town watch. Deep sleep was possible in the medieval town, untainted by either human or mechanical noises.

If the ear was stirred, the eye was even more deeply delighted. The craftsman who had walked through the fields and woods on holiday, came back to his stone-carving or his wood-working with a rich harvest of impressions to be transferred to his work. The buildings, so far from being “quaint,” were as bright and clean as a medieval illumination, often covered with whitewash, so that all the colors of the image makers in paint or glass or polychromed wood would dance on the walls, even as the shadows quivered like sprays of lilac on the facades of the more richly carved buildings. (Patina and picturesqueness were the results of time’s oxidation: not original attributes of the architecture.)

Common men thought and felt in images, far more than in the verbal abstractions used by scholars: esthetic discipline might lack a name, but its fruit were everywhere visible. Did the citizens of Florence vote as to the type of column that was to be used on the Cathedral? Image makers carved statues, painted triptychs, decorated the walls of the cathedral, the guild hall, the town hall, the burgher’s house: color and design were everywhere the normal accomplishment of the practical daily tasks.

There was visual excitement in the array of goods in the open market: velvets and brocades, copper and shining steel, tooled leather and brilliant glass, to say nothing of foods arranged in their panniers under the open sky. Wander around the survivals of these medieval markets today. Whether they be as drab as the Jews’ Market in Whitechapel, or as spacious as that on the Plain Palais at Geneva, they will still have some of the excitement of their medieval prototypes.


The History of Pandemics

With the global disruption of COVID-19, a number of stories in news outlets have documented the history of past pandemics in an effort to make sense of it all. One name that comes up frequently is Walter Scheidel. The Stanford University historian wrote a book some years ago, “The Great Leveler,” which attracted a great deal of attention. In it, he contended that only catastrophes reduce wealth and income inequality; without them, it grows without bound. One recurring leveler was plagues and pandemics (along with war, famine, collapse and political revolution).

I reviewed that book in a series of three posts:

The Great Leveler Review (Part One)

The Great Leveler Review (Part Two)

The Great Leveler Review (Part Three)

I’ve gone back and cleaned up the typos (the ones I found, anyway). I think these posts are actually quite good (and I’m a harsh critic of my own work), so they’re most likely worth a reread (if I do say so myself).

Here’s Scheidel himself writing in The New York Times summarizing the leveling effect he found during pandemics:

…as successive waves of plague shrank the work force, hired hands and tenants “took no notice of the king’s command,” as the Augustinian clergyman Henry Knighton complained. “If anyone wanted to hire them he had to submit to their demands, for either his fruit and standing corn would be lost or he had to pander to the arrogance and greed of the workers.”

As a result of this shift in the balance between labor and capital, we now know…that real incomes of unskilled workers doubled across much of Europe within a few decades. According to tax records that have survived in the archives of many Italian towns, wealth inequality in most of these places plummeted.

In England, workers ate and drank better than they did before the plague and even wore fancy furs that used to be reserved for their betters. At the same time, higher wages and lower rents squeezed landlords, many of whom failed to hold on to their inherited privilege. Before long, there were fewer lords and knights, endowed with smaller fortunes, than there had been when the plague first struck…

In all of these cases, he notes, the elites pushed back. They weren’t content with their “lessers” having a greater share of the pie (which is, after all, why they were elites):

In late medieval Eastern Europe, from Prussia and Poland to Russia, nobles colluded to impose serfdom on their peasantries to lock down a depleted labor force. This altered the long-term economic outcomes for the entire region: Free labor and thriving cities drove modernization in Western Europe, but in the eastern periphery, development fell behind.

Farther south, the Mamluks of Egypt, a regime of foreign conquerors of Turkic origin, maintained a united front to keep their tight control over the land and continue exploiting the peasantry. The Mamluks forced the dwindling subject population to hand over the same rent payments, in cash and kind, as before the plague. This strategy sent the economy into a tailspin as farmers revolted or abandoned their fields.

The elite pushback often failed in the short-term:

…more often than not, repression failed. The first known plague pandemic in Europe and the Middle East, which started in 541, provides the earliest example. Anticipating the English Ordinance of Laborers by 800 years, the Byzantine emperor Justinian railed against scarce workers who “demand double and triple wages and salaries, in violation of ancient customs” and forbade them “to yield to the detestable passion of avarice” — to charge market wages for their labor. The doubling or tripling of real incomes reported on papyrus documents from the Byzantine province of Egypt leaves no doubt that his decree fell on deaf ears…

During the Great Rising of England’s peasants in 1381, workers demanded, among other things, the right to freely negotiate labor contracts. Nobles and their armed levies put down the revolt by force, in an attempt to coerce people to defer to the old order. But the last vestiges of feudal obligations soon faded. Workers could hold out for better wages, and landlords and employers broke ranks with one another to compete for scarce labor.

And yet, in the long-term, people ended up no better off than they had started:

None of these stories had a happy ending for the masses. When population numbers recovered after the plague of Justinian, the Black Death and the American pandemics, wages slid downward and elites were firmly back in control. Colonial Latin America went on to produce some of the most extreme inequalities on record. In most European societies, disparities in income and wealth rose for four centuries all the way up to the eve of World War I. It was only then that a new great wave of catastrophic upheavals undermined the established order, and economic inequality dropped to lows not witnessed since the Black Death, if not the fall of the Roman Empire.

Why the Wealthy Fear Pandemics (NYTimes)

Past pandemics redistributed income between the rich and poor, according to Stanford historian (Stanford News)

Black Death historian: ‘A coronavirus depression could be the great leveller’ (Guardian)

Can a pandemic remake society? A historian explains. (Vox)

Here are some other pages from history, in somewhat chronological order:

White and Mordechai focused their efforts on the city of Constantinople, capital of the Roman Empire, which had a comparatively well-described outbreak in 542 CE. Some primary sources claim plague killed up to 300,000 people in the city, which had a population of some 500,000 people at the time. Other sources suggest the plague killed half the empire’s population. Until recently, many scholars accepted this image of mass death. By comparing bubonic, pneumonic, and combined transmission routes, the authors showed that no single transmission route precisely mimicked the outbreak dynamics described in these primary sources.

New call to examine old narratives: Infectious disease modeling study casts doubt on the Justinianic Plague’s impact (Phys.org)
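The kind of model behind that study can be sketched in a few lines. What follows is a generic discrete-time SIR (susceptible–infected–removed) compartment model, not White and Mordechai’s actual code; the transmission and recovery rates below are illustrative assumptions, not estimates for sixth-century Constantinople. The idea is that a modeler picks parameters for a candidate transmission route, runs the epidemic forward, and compares the simulated death curve against what the primary sources describe:

```python
def sir(pop, beta, gamma, i0, days):
    """Simulate S-I-R compartments with a simple daily Euler step.

    beta  -- transmission rate per day (illustrative assumption)
    gamma -- removal (recovery/death) rate per day (illustrative assumption)
    """
    s, i, r = pop - i0, float(i0), 0.0
    history = []
    for _ in range(days):
        new_inf = beta * s * i / pop   # new infections this day
        new_rem = gamma * i            # removals this day
        s -= new_inf
        i += new_inf - new_rem
        r += new_rem
        history.append((s, i, r))
    return history

# Constantinople's population of ~500,000 is taken from the excerpt above.
hist = sir(pop=500_000, beta=0.4, gamma=0.2, i0=10, days=300)
attack_rate = hist[-1][2] / 500_000   # fraction of the city ever infected
```

With these made-up parameters (basic reproduction number R₀ = beta/gamma = 2), the final attack rate lands near the classic final-size prediction of about 80%, which illustrates the study’s point: different assumed transmission routes imply very different epidemic curves, and each can be checked against the mortality claimed in the sources.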

Heraclitus compared our lot to beasts, winos, deep sleepers and even children – as in, “Our opinions are like toys.” We are incapable of grasping the true logos. History, with rare exceptions, seems to have vindicated him.

There are two key Heraclitus mantras.

1) “All things come to pass according to conflict.” So the basis of everything is turmoil. Everything is in flux. Life is a battleground. (Sun Tzu would approve.)

2) “All things are one.” This means opposites attract. This is what Heraclitus found when he went tripping inside his soul – with no help of lysergic substances. No wonder he faced a Sisyphean task trying to explain this to us, mere children.

And that brings us to the river metaphor. Everything in nature depends on underlying change. Thus, for Heraclitus, “as they step into the same rivers, other and still other waters flow upon them.” So each river is composed of ever-changing waters.

‘It is disease that makes health sweet and good’ (Asia Times)

Despite the lack of healthcare and public health measures as we understand them – and we will never know how many plague victims died of neglect, hunger and thirst, or of secondary infections – the plague in medieval England, and Western Europe as a whole, was mediated by a system of research, intellectual authority and technical countermeasures.

But that system was religious, based on the Christian church’s management of the passage of souls from this earth to the next world. The forerunner of the modern emergency vehicle was the bell of the priest’s attendants, advising the dying that relief was at hand, in the form of an expert trained and qualified to take confession and administer the other sacraments that would ensure safe passage, if not to heaven, at least to purgatory.

The dividing line between rich and poor wasn’t so much access to drugs or the best doctors as to post-mortem religious services: the prayers, candles, masses and chantries that were meant to speed the dead to a better hereafter. The technical emergencies the authorities faced weren’t shortages of hospital beds and doctors but of candle wax and confessors. Priests were not immune to the plague.

‘Emergency’, or its Latin equivalent, was the word used by the bishop of Bath and Wells in January 1349, six months after the plague began in England, when he broadcast an urgent message to his flock via the surviving parish priests in his diocese. ‘We understand,’ he wrote, ‘that many people are dying without the sacrament of penance, because they do not know what they ought to do in such an emergency and believe that even in an emergency confession of their sins is no use or worth unless made to [an ordained] priest.’ What they had to do, he told them, was ‘make confession of their sins, according to the teaching of the apostle, to any lay person, even to a woman if a man is not available.’

In 1348 (London Review of Books)

It’s hard to keep a virulent disease down. The first and biggest burst of plague lasted from the late 1340s until about 1353. Just as the world started thinking things were getting back to normal, another wave hit in 1360. After that there were new waves every 10 years or so. Europe’s population didn’t get back to pre-plague levels for a century and a half.

We’ve come a long way since the Black Death (Asia Times)

Quarantining was invented during the first wave of bubonic plague in the 14th century, but it was deployed more systematically during the Great Plague. Public servants called searchers ferreted out new cases of plague, and quarantined sick people along with everyone who shared their homes. People called warders painted a red cross on the doors of quarantined homes, alongside a paper notice that read “LORD HAVE MERCY UPON US.” (Yes, the all-caps was mandatory).

The government supplied food to the housebound. After 40 days, warders painted over the red crosses with white crosses, ordering residents to sterilize their homes with lime. Doctors believed that the bubonic plague was caused by “smells” in the air, so cleaning was always recommended. They had no idea that it was also a good way to get rid of the ticks and fleas that actually spread the contagion.

Of course, not everyone was compliant. Legal documents at the U.K. National Archives show that in April 1665, Charles II ordered severe punishment for a group of people who took the cross and paper off their door “in a riotious manner,” so they could “goe abroad into the street promiscuously, with others.” It’s reminiscent of all those modern Americans who went to the beaches in Florida over spring break, despite what public health experts told them.

Just as some American politicians blame the Chinese for the coronavirus, there were 17th century Brits who blamed the Dutch for spreading the plague. Others blamed Londoners. Mr. Pepys had relocated his family to a country home in Woolwich, and writes in his diary that the locals “are afeard of London, being doubtfull of anything that comes from thence, or that hath lately been there … I was forced to say that I lived wholly at Woolwich.”

Annalee Newitz: What social distancing looked like in 1666 (Salt Lake Tribune)

In​ the cold autumn of 1629, the plague came to Italy. It arrived with the German mercenaries (and their fleas) who marched through the Piedmont countryside. The epidemic raged through the north, only slowing when it reached the natural barrier of the Apennines. On the other side of the mountains, Florence braced itself. The officials of the Sanità, the city’s health board, wrote anxiously to their colleagues in Milan, Verona, Venice, in the hope that studying the patterns of contagion would help them protect their city. Reports came from Parma that its ‘inhabitants are reduced to such a state that they are jealous of those who are dead’. The Sanità learned that, in Bologna, officials had forbidden people to discuss the peste, as if they feared you could summon death with a word.

Plague was thought to spread through corrupt air, on the breath of the sick or trapped in soft materials like cloth or wood, so in June 1630 the Sanità stopped the flow of commerce and implemented a cordon sanitaire across the mountain passes of the Apennines. But they soon discovered that the boundary was distressingly permeable. Peasants slipped past bored guards as they played cards. In the dog days of the summer, a chicken-seller fell ill and died in Trespiano, a village in the hills above Florence. The city teetered on the brink of calamity.

By August, Florentines were dying. The archbishop ordered the bells of all the churches in the city to be rung while men and women fell to their knees and prayed for divine intercession. In September, six hundred people were buried in pits outside the city walls. As panic mounted, rumours spread: about malicious ‘anointers’, swirling infection through holy water stoups, about a Sicilian doctor who poisoned his patients with rotten chickens. In October, the number of plague burials rose to more than a thousand. The Sanità opened lazaretti, quarantine centres for the sick and dying, commandeering dozens of monasteries and villas across the Florentine hills. In November, 2100 plague dead were buried. A general quarantine seemed the only answer. In January 1631, the Sanità ordered the majority of citizens to be locked in their homes for forty days under threat of fines and imprisonment.

In his Memoirs of the Plague in Florence, Giovanni Baldinucci described how melancholy it was ‘to see the streets and churches without anybody in them’. As the city fell quiet, ordinary forms of intimacy were forbidden. Two teenage sisters, Maria and Cammilla, took advantage of their mother’s absence in the plague hospital to dance with friends who lived in the same building. When they were discovered, their friends’ parents were taken to prison. At their trial, the mother, Margherita, blamed the two girls: ‘Oh traitors, what have you done?’ Another pair of sisters found relief from the boredom of quarantine by tormenting their brother. Arrested after one of the Sanità’s policemen saw them through an open door, one of them explained in court that ‘in order to pass the time we dressed our brother up in a mask, and we were dancing among ourselves, and while he was … dressed up like that, the corporal passed by … and saw what was going on inside the house.’ Dancing and dressing up were treacherous actions, violating the Sanità’s measures to control movement, contact, breath. But loneliness afflicted people too…

The poor were judged not only careless but physically culpable, their bodies frustratingly vulnerable to disease. The early decades of the 17th century in Europe saw widespread famines, sky-high grain prices, declining wages, political breakdown and violent religious conflicts. (This is the ‘general crisis of the 17th century’ that Important Male Historians like to debate.) One Florentine administrator, surveying the surrounding countryside, reported that even before the epidemic struck, villages were ‘full of people, who feed themselves with myrtle berries, acorns and grasses, and whom one sees along the roads seeming like corpses who walk’. The city was not much better. A diarist in Florence in 1630 noted the ‘many poor children who eat the stalks of cabbages that they find on the street, as though, through their hunger, they seem like fruit’. Famine was compounded by the steep decline of the textile industry in the city, as producers in England, Holland and Spain undercut prices; the number of wool workshops halved between 1596 and 1626. These long, lean years of unemployment and hunger had left Florentines acutely susceptible to the coming epidemic.

The Sanità arranged the delivery of food, wine and firewood to the homes of the quarantined (30,452 of them). Each quarantined person received a daily allowance of two loaves of bread and half a boccale (around a pint) of wine. On Sundays, Mondays and Thursdays, they were given meat. On Tuesdays, they got a sausage seasoned with pepper, fennel and rosemary. On Wednesdays, Fridays and Saturdays, rice and cheese were delivered; on Friday, a salad of sweet and bitter herbs. The Sanità spent an enormous amount of money on food because they thought that the diet of the poor made them especially vulnerable to infection, but not everyone thought it was a good idea. Rondinelli recorded that some elite Florentines worried that quarantine ‘would give [the poor] the opportunity to be lazy and lose the desire to work, having for forty days been provided abundantly for all their needs’.

The provision of medicine was also expensive. Every morning, hundreds of people in the lazaretti were prescribed theriac concoctions, liquors mixed with ground pearls or crushed scorpions, and bitter lemon cordials. The Sanità did devolve some tasks to the city’s confraternities. The brothers of San Michele Arcangelo conducted a housing survey to identify possible sources of contagion; the members of the Archconfraternity of the Misericordia transported the sick in perfumed willow biers from their homes to the lazaretti. But mostly, the city government footed the bill. Historians now interpret this extensive spending on public health as evidence of the state’s benevolence: if tracts like Righi’s brim over with intolerance towards the poor, the account books of the Sanità tell an unflashy story of good intentions.

But the Sanità – making use of its own police force, court and prison – also punished those who broke quarantine. Its court heard 566 cases between September 1630 and July 1631, with the majority of offenders – 60 per cent – arrested, imprisoned, and later released without a fine. A further 11 per cent were imprisoned and fined. On the one hand, the majority of offenders were spared the harshest penalties, of corporal punishment or exile. On the other, being imprisoned in the middle of a plague epidemic was potentially lethal; and the fines levied contributed to the operational budget of the public health system. The Sanità’s lavish spending on food and medicine suggests compassion in the face of poverty and suffering. But was it kindness, if those salads and sausages were partly paid for by the same desperate people they were intended to help? The Sanità’s intentions may have been virtuous, but they were nevertheless shaped by an intractable perception of the poor as thoughtless and lazy, opportunists who took advantage of the state of emergency.

Early modern historians used to be interested in the idea of the ‘world turned upside down’: in moments of inversion during carnival when a pauper king was crowned and the pressures of a deeply unequal society released. But what emerges from the tangle of stories in John Henderson’s book is a sense that for many the world stood still during the plague. The disease waned in the early summer of 1631 and, in June, Florentines emerged onto the streets to take part in a Corpus Christi procession, thanking God for their reprieve. When the epidemic finally ended, about 12 per cent of the population of Florence had died. This was a considerably lower mortality rate than other Italian cities: in Venice 33 per cent of the population; in Milan 46 per cent; while the mortality rate in Verona was 61 per cent. Was the disease less virulent in Florence or did the Sanità’s measures work? Percentages tell us something about living and dying. But they don’t tell us much about survival. Florentines understood the dangers, but gambled with their lives anyway: out of boredom, desire, habit, grief…

Florence Under Siege: Surviving Plague in an Early Modern City by John Henderson.

Inclined to Putrefication (London Review of Books)

The majority of the population feared and condemned inoculation. Even many of those who were in favor of it were torn by doubts and religious scruples. Was inoculation a “lawful” practice? Was smallpox not a “judgement of God,” sent to punish and humble the people for their sins? Was being inoculated not like “taking God’s Work out of His Hand”?

Douglass played upon such popular scruples to the apparent discomfiture of his clerical opponents. Turning to the ministers he challenged them to determine, as a “Case of Conscience,” how placing more trust in human measures than in God was consistent with the devotion and subjection owed to the all-wise providence of the Lord. That he had not raised this issue in good faith becomes evident from a passage contained in a private letter suggesting jeeringly that his correspondent might perhaps admire how the clergy reconciled inoculation with their doctrine of predestination…

Ever since she had accompanied her husband on a diplomatic mission to Turkey, where she had become acquainted with inoculation and convinced of its merits, it had been Lady Mary Wortley Montagu’s ambition to bring “this useful invention into fashion in England.” That the country’s best medical minds had not sanctioned the practice did not deter Lady Mary. She bided her time. In the 1721 epidemic she asked Charles Maitland, the physician who four years earlier had inoculated her young son in Constantinople, to perform the operation now on her little daughter. She also enlisted the interest of the Princess of Wales, at whose request the King agreed to pardon a number of prisoners who were under sentence of death if they submitted to inoculation. Six convicts in Newgate Prison were ready to do so, and on August 9, about the time Boylston was injecting his patients, they were inoculated by Maitland. The results at first were good. The ice had been broken and during the next months further persons underwent inoculation at his hands. The culmination of Lady Mary’s crusade was the inoculation of the daughters of the Prince and Princess of Wales…

With improvement in its techniques, inoculation gained increasing favor as a method for the prophylaxis of smallpox until it finally, nearly eighty years later, gave way to Jenner’s magnificent discovery of vaccination.

When Cotton Mather Fought Smallpox (American Heritage)

Asiatic cholera, one of humanity’s greatest scourges in the modern period, came to Europe for the first time in the years after 1817, traveling by ship and caravan route from the banks of the Ganges, where it was endemic, to the Persian Gulf, Mesopotamia and Iran, the Caspian Sea and southern Russia, and then—thanks to troop movements occasioned by Russia’s wars against Persia and Turkey in the late 1820s and its suppression of the revolt in Poland in 1830–1831—to the shores of the Baltic Sea. From there its spread westward was swift and devastating, and before the end of 1833 it had ravaged the German states, France, and the British Isles and passed on to Canada, the western and southern parts of the United States, and Mexico.

Politics of a Plague (NYRB)

Typhoid was a killer but it belonged to another world. The disease thrived in the overcrowded, insanitary conditions of New York’s slums, such as Five Points, Prospect Hill and Hell’s Kitchen. The family of one of the victims hired a researcher called George Soper and the diligent Mr Soper proved to be Mary’s nemesis – even though when he first tracked her down she chased him out of her kitchen with a carving fork. And that’s part of the problem with Mary.

It’s possible to sympathise with her refusal to believe that she could be transmitting a disease from which she never suffered herself. But Mr Soper had correctly identified her as an asymptomatic carrier of Typhoid fever. She would never get the disease herself but would never stop giving it to other people.

Not surprisingly, Mary Mallon found this impossible to understand. But the New York authorities were desperate and in 1907 Mary was exiled to the isolation facility on North Brother Island in the river outside New York.

How Typhoid Mary left a trail of scandal and death (BBC)

At the end of the 19th century, one in seven people around the world had died of tuberculosis, and the disease ranked as the third leading cause of death in the United States. While physicians had begun to accept German physician Robert Koch’s scientific confirmation that TB was caused by bacteria, this understanding was slow to catch on among the general public, and most people gave little attention to the behaviors that contributed to disease transmission. They didn’t understand that things they did could make them sick.

In his book, Pulmonary Tuberculosis: Its Modern Prophylaxis and the Treatment in Special Institutions and at Home, S. Adolphus Knopf, an early TB specialist who practiced medicine in New York, wrote that he had once observed several of his patients sipping from the same glass as other passengers on a train, even as “they coughed and expectorated a good deal.” It was common for family members, or even strangers, to share a drinking cup.

With Knopf’s guidance, in the 1890s the New York City Health Department launched a massive campaign to educate the public and reduce transmission. The “War on Tuberculosis” public health campaign discouraged cup-sharing and prompted states to ban spitting inside public buildings and transit and on sidewalks and other outdoor spaces—instead encouraging the use of special spittoons, to be carefully cleaned on a regular basis. Before long, spitting in public spaces came to be considered uncouth, and swigging from shared bottles was frowned upon as well. These changes in public behavior helped successfully reduce the prevalence of tuberculosis.

How Epidemics of the Past Changed the Way Americans Lived (Smithsonian)

Hassler shared his doubts about a closure order, but suggested that a short closure order would “limit most of all the cases to the home and give the other places a chance to thoroughly clean up and thus we may bring about a condition that will reduce the number of cases.” Several in attendance felt that a general closure order would induce panic in the people, would be costly, and would not stop the spread of the epidemic. Theater owners and dance hall operators supported a closure order, hoping that it would bring a quick end to the epidemic that was already causing a drastic reduction in revenue (one owner estimated that his receipts had fallen off 40% since the start of the epidemic). After some discussion, the Board of Health voted to close all places of public amusement, ban all lodge meetings, close all public and private schools, and to prohibit all dances and other social gatherings effective at 1:00 am on Friday, October 18. The Board did not close churches, but instead recommended that services and socials be either discontinued during the epidemic or held in the open air. City police were given a list of the restrictions and directed to ensure compliance with the order. The Liberty Loan drive, always the concern of citizens as they tried to outdo other cities in fundraising, would be allowed to continue by permit, as would all public meetings.

Despite the closure order and gathering ban, the centerpiece of San Francisco’s crusade against influenza was the face mask. Several other cities also mandated their use, and many more recommended them for private citizens as well as for physicians, nurses, and attendants who cared for the ill. But it was San Francisco that pushed for the early and widespread use of masks as a way to prevent the spread of the dread malady. On October 18, the day that the other health measures went into effect, Hassler ordered that all barbers wear masks while with customers, and recommended clerks who came into contact with the general public also don them. The next day, Hassler added hotel and rooming house employees, bank tellers, druggists, store clerks, and any other person serving the public to the list of those required to wear masks. Citizens were again strongly urged to wear masks while in public. On October 21, the Board of Health met and issued a strong recommendation to all residents to wear a mask while in public.

The wearing of a mask immediately became a symbol of wartime patriotism…

The American Influenza Epidemic of 1918-1919: San Francisco (Influenza Archive)

It’s difficult to say where this pandemic is leading. On the one hand, it has revealed the extent to which the most essential workers of our society are underpaid and undervalued. It has shown how dependent we are on transient and undocumented workers who are routinely brutalized, especially in the food system. It has exposed the dark underbelly of how food ends up on our shelves and how fragile our food system really is. It has led to an upsurge in union activism and strikes. It has demonstrated the fragility of long, just-in-time supply chains and the downside of outsourcing absolutely everything, such that no one country can produce anything anymore.

It has laid bare the cracks in our society. It has shown that the philosophy of “small government” promoted by billionaires and corporations is a disaster in times of crisis. It has shown that the pattern of crippling and hobbling state and local governments in favor of empowering markets and wealthy private actors is counterproductive. It has shown the utter folly of tying the basics of life to formal employment, such as housing and health care. It has shown that depending on “free markets” for absolutely everything doesn’t work when those markets shut down due to inevitable crises. It has shown the fecklessness and incompetence of America’s leaders, as well as their amorality and bottomless greed.

Yet it has also empowered authoritarians and dictators the world over. It has superempowered the ability of states to track and monitor their citizens. It has devastated local economies and small businesses, while shifting wealth, power, and economic activity to transnational corporations who have access to unlimited money from captured governments. It has led to an upsurge in activity among the extremist far-right and well-armed and organized Fascist militias. The stock market reaches a new high every time the unemployment rate goes up, while the financial industry is bailed out. Unemployment is at Great Depression levels, while workers in the U.S. are told by politicians to fend for themselves. “Essential” workers are ordered back to work or threatened with benefit cut-offs. To date, it has increased inequality.

It has also reduced pollution levels and crippled much of air travel, perhaps forever. It has substantially reduced demand for fossil fuels, even as prices reach all-time lows. It has caused cities to close off streets and avenues to cars in favor of bicycles and pedestrians. It has increased the viability of working from home.

In short, it’s complicated. But much of what happens will be up to us. Will we become more extremist, authoritarian and unequal? Will we continue to embrace the Social Darwinism promoted by our betters? Or will we demand that essential workers be paid better, that unions no longer be suppressed, that working hours drop, that commuting go away, that streets be prioritized for bikes, and that the government spend its trillions on helping the average citizen rather than just big corporations and the investor class? It could go either way. Walter Scheidel concludes:

In looking for illumination from the past on our current pandemic, we must be wary of superficial analogies. Even in the worst-case scenario, Covid-19 will kill a far smaller share of the world’s population than any of these earlier disasters did, and it will touch the active work force and the next generation even more lightly. Labor won’t become scarce enough to drive up wages, nor will the value of real estate plummet. And our economies no longer rely on farmland and manual labor.

Yet the most important lesson of history endures. The impact of any pandemic goes well beyond lives lost and commerce curtailed. Today, America faces a fundamental choice between defending the status quo and embracing progressive change. The current crisis could prompt redistributive reforms akin to those triggered by the Great Depression and World War II, unless entrenched interests prove too powerful to overcome.

Why Democrats Suck

So the news is that Larry Summers is Joe Biden’s economic advisor.

I’ll take credit for being early on the “People like Larry Summers are the problem with the Democrats” train. I wrote a whole post on it way back in November. In it, I wrote:

Listening to arrogant Ivy League hyper-elite technocrats like Larry Summers is exactly why the Democratic Party is in the pathetic state it is in, and continually loses elections, even to incompetent morons like Donald Trump. If Larry Summers is a representation of “liberal values” then God help us all.

Don’t Think Like an Economist

Here are some insights into Mr. Summers’ worldview from various people. From Yanis Varoufakis:

‘There are two kinds of politicians,’ [Summers] said: ‘Insiders and outsiders. The outsiders prioritize their freedom to speak their version of the truth. The price of their freedom is that they are ignored by the insiders, who make the important decisions. The insiders, for their part, follow a sacrosanct rule: never turn against other insiders and never talk to outsiders about what insiders say or do. Their reward? Access to inside information and a chance, though no guarantee, of influencing powerful people and outcomes.’ With that Summers arrived at his question. ‘So, Yanis,’ he said, ‘which of the two are you?’

From Elizabeth Warren:

Late in the evening, Larry leaned back in his chair and offered me some advice. By now, I’d lost count of Larry’s diet Cokes, and our table was strewn with bits of food and spilled sauces. Larry’s tone was in the friendly-advice category. He teed it up this way: I had a choice. I could be an insider or I could be an outsider. Outsiders can say whatever they want. But people on the inside don’t listen to them. Insiders, however, get lots of access and a chance to push their ideas. People–powerful people–listen to what they have to say. But insiders also understand one unbreakable rule: They don’t criticize other insiders.

I had been warned.

From Thomas Frank’s book, Listen Liberal (p. 173):

‘One of the challenges in our society is that the truth is kind of a disequalizer,’ Larry Summers told journalist Ron Suskind during the early days of the Obama administration. ‘One of the reasons that inequality has probably gone up in our society is that people are being treated closer to the way that they’re supposed to be treated.’

And let’s not forget:

In the 1990s, during Bill Clinton’s presidency, the derivatives market was taking off and Brooksley Born was chair of the Commodity Futures Trading Commission. She warned that unregulated derivatives trading posed a risk to the nation’s financial stability. She wanted more transparency of this dark market.

But Born was undercut in her efforts by no less than Treasury Secretary Robert Rubin, Federal Reserve Chairman Alan Greenspan, Deputy Secretary of the Treasury Larry Summers and SEC Chair Arthur Levitt. This boys club turned out to be dead wrong. But they had the power. They convinced Congress to strip the CFTC of its power to regulate derivatives.

The Cassandras of Our Time: Brooksley Born and Ann Ravel (Brennan Center)

Summers is also a favorite economist of the Marginal Revolution blog from George Mason University and the Mercatus Center, the epicenter of Kochenomics.

And remember, folks, the Democrats are the “Leftist” party in the United States. After all, where are you going to go?

That doesn’t bode well for the Biden campaign, does it? But it does make sense: Biden is opposed to Medicare for All, student debt forgiveness, subsidized higher education, green job creation programs, wealth taxes, higher minimum wages, and universal basic income. In opposing these, he consistently invokes the old canard: Howyagunnapayforit?

Either that, or it’s “means test” everything. After all, we have to make absolutely sure that no one “undeserving” may >*gasp*< get a benefit they don’t deserve! Perish the thought! Only the truly bereft are worthy of any kind of societal benefit; the rest of us “real citizens” can get our needs met by shopping in the big, glorious Market.

Of course, this means-testing bullshit leaves all sorts of cracks that people often slip through, ensuring that any government program is as unpopular as possible. This is by design. So, if you’re too rich, or too poor, you cannot get health care via the government. Too poor: get Medicaid. Suddenly earn $1 over the cutoff: sorry, no Medicaid for you. Have you tried the Obamacare exchanges? Rich enough to have a “Cadillac Plan”? Oh, we’re going to tax that. All just so we don’t have to cover everyone.

Or take higher education. Make under X amount: here’s a (partial) scholarship. Make over X amount? No college aid for you. All so we don’t have free higher education for all.
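The benefit cliff in both examples is the same arithmetic. Here is a minimal sketch, using a hypothetical $20,000 cutoff and $6,000 benefit (illustrative numbers only, not any real program’s rules):

```python
# Illustrative sketch of a means-tested benefit "cliff".
# The cutoff and benefit amounts are hypothetical, chosen only to
# show the discontinuity -- they are not any real program's figures.

CUTOFF = 20_000   # hypothetical eligibility threshold (annual income, $)
BENEFIT = 6_000   # hypothetical value of the benefit ($/year)

def net_resources(income: int) -> int:
    """Income plus benefit, with a hard means-test cutoff."""
    return income + (BENEFIT if income <= CUTOFF else 0)

# Earning $1 more than the cutoff leaves you $5,999 worse off:
print(net_resources(20_000))  # 26000
print(net_resources(20_001))  # 20001
```

The discontinuity is the point: a smooth phase-out (or a universal benefit) removes the cliff, but a hard cutoff guarantees that someone just above the line loses out.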

Robert Evans put it well on a podcast about the West Virginia coal miners’ war (the Battle of Blair Mountain):

[59:52] “You’ll hear people saying ‘basic Income seems like a great idea, but what if X group…what if rich people get it; that’s not fair.'”

“One of the problems with that is that, when you start saying stuff like, ‘We need a basic income; we need free college; we need universal health care,’ and people start bringing [up], ‘What about this group, what about that group?’ What they’re really saying is ‘I don’t believe that this is an inherent right. I think certain individual groups might deserve it, but I don’t see it as an inherent right.'”

“And I think one of the lessons of the labor movement is [that] this shit only works when you treat it like an inherent right and you reject attempts to divide people, even among groups that might make sense to you at the time. Because, in reality, if you’re agreeing to that division at all, you are against the idea that people have a right to this sort of thing.”

The other thing it does is allow recipients of such “government largesse” to be depicted as “cheats” and “scroungers.” Add that to the bogus idea that “my tax dollars fund the government,” and you play right into the Conservative/Libertarian framing of, “They’re stealing my hard-earned (it’s always ‘hard-earned’) money to give to those layabouts, while I’m not even entitled to anything!” In other words, it’s deliberately sabotaging social programs to give Conservatives the ammunition they need to destroy them.

Again, to repeat, this is by design!

And since the Democrats know that the baton of government will inevitably be passed back and forth between the parties, they can count on Republicans to chip away at, or even dismantle, the programs that they’ve created. They can then depict them as the bad guys, even though that was the plan all along. Good cop, meet bad cop.

Here’s the dirty little secret: They don’t want these programs to succeed!

Thus the two party duopoly functions as one wrestling tag-team implementing the same set of Neoliberal policies to enrich the donor class at our expense.

But if both parties are virtually identical when it comes to economic philosophy, who can you vote for if you don’t agree with that kind of philosophy?

No one. And that’s the goal of the two-party system. There is no alternative. That’s why the Democrats were far more effective in opposing Bernie Sanders than they have been opposing the so-called “mortal threat” Trump. #Resistance.

In my original post, I said:

My core point is this: this kind of autistic “economic thinking” is the very reason why the voting public believes there is no substantial difference between the Republicans and the (Neoliberal) Democrats. And they’re right! It’s also worth noting that Professor Cowen has let the cat out of the bag, tacitly admitting that the very discipline of economics is inherently right-wing (it makes him suspect among the left…). Yet it still masquerades as ideologically neutral!

Don’t Think Like an Economist

This article from Policy Tensor makes a lot of the same points:

The tunnel vision of global leaders and the wider discourse of the articulate class is symptomatic of a deeper malaise. Put simply, we are in the grip of a very powerful ideology. It is an ideology that subordinates all goals, including the survival of our species and the web of life with which it is inextricably intertwined, to the goal of maximizing economic growth.

But it does much more. Economics as ideology distorts our perception of contemporary and historical reality. It misguides us into flawed explanatory schema for the most important historical explananda. It sharply narrows the possibility space of human action. And, most important of all, it closes off all rational courses of action that may thwart the collapse of world civilization that is increasingly getting baked in as we ride up the hockey stick of doom.

Economics as Ideology (1): Introduction (Policy Tensor)

The genius of economics is that it is an ideology that masquerades as non-ideological. Economists always win the debate once you accept their framing of the world: as a cost-benefit cash nexus, full of rational actors where nature has no inherent value. Add that new factory to GDP, don’t subtract all the people who will get cancer from it, and so on.

Once you accept their premises, you are guided along (as if by an invisible hand) to their desired conclusions, which, by some coincidence, always benefit the rich and powerful.

And these axioms have colonized our consciousness to the extent that we don’t even think of them as axioms, we just accept them as natural. They’ve achieved cultural hegemony in Gramsci’s terminology.

When someone says, “you just don’t understand economics,” what they’re really saying is, “You’re not looking at the world through the same blinkered, autistic view as I am, therefore you can’t be taken seriously.”

Ideology consists of widely-shared lenses that are worn unconsciously. It is when we are not aware of the limits of applicability of our reference frame, when we mistake the map for the territory, that we are being ideological. More often than not, we are simply unaware that we are using a specific lens to interrogate reality. Ideology manifests itself in widely-shared and unarticulated premises. It is most evident in things that are simply assumed to be true and require no justification whatsoever — mere assertion suffices. But even though widely accepted, such premises may not hold. A gap thus opens up between discourse and reality. Such gaps are a recipe for disaster. All man-made catastrophes are due, in large part, to such gaps.

Hence Tyler Cowen’s compliment to Summers cited in the original post: He never ceases to think like an economist. Because thinking like an economist will be sure to get you to the libertarian conclusions that Cowen and his patrons favor, even if you are officially classified as “liberal” or are a member of the Democratic Party. Two parties, one ideology.

Recall that the modern discipline of economics as developed under the marginal revolution in the late 1800s (hence the name of the blog) is based on the following core theorems:

There are two fundamental theorems of welfare economics.

-First fundamental theorem of welfare economics (also known as the “Invisible Hand Theorem”):

any competitive equilibrium leads to a Pareto efficient allocation of resources.

The main idea here is that markets lead to a social optimum. Thus, no intervention of the government is required, and it should adopt only “laissez faire” policies. However, those who support government intervention say that the assumptions needed in order for this theorem to work are rarely seen in real life.

It must be noted that a situation where someone holds every good and the rest of the population holds none is a Pareto efficient distribution. However, this situation can hardly be considered perfect under any welfare definition. The second theorem allows a more reliable definition of welfare.

-Second fundamental theorem of welfare economics:

any efficient allocation can be attained by a competitive equilibrium, given the market mechanisms leading to redistribution.

This theorem is important because it allows for a separation of efficiency and distribution matters. Those supporting government intervention will ask for wealth redistribution policies.

Welfare economics I: Fundamental theorems (Policonomics)

In other words, the greatest welfare (optimal good) is achieved by government getting out of the way and letting markets rip. This is not a value statement; this is baked into the very heart of economics as a discipline! Also note:

“…a situation where someone holds every good and the rest of the population holds none, is a Pareto efficient distribution.” Hmmmm…
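That observation follows directly from the standard textbook definition, sketched here in the usual notation ($u_i$ and $x_i$ for agent $i$’s utility and bundle — notation mine, not from the quoted source):

```latex
% Standard definition: an allocation x* is Pareto efficient iff no
% feasible allocation x makes some agent strictly better off without
% making any other agent strictly worse off.
x^* \text{ is Pareto efficient} \iff
\nexists\, x \in X \;:\;
u_i(x_i) \ge u_i(x_i^*)\ \ \forall i
\quad \text{and} \quad
u_j(x_j) > u_j(x_j^*)\ \text{for some } j
```

If one agent holds every good and utilities are strictly increasing, any transfer away from that agent makes them strictly worse off, so the condition holds trivially: maximal inequality passes the efficiency test. Pareto efficiency says nothing about fairness.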

The second theorem implies that the “winners” can, in principle, compensate the “losers,” and this supposedly “cleans up” the problems with the first theorem. But as noted in my earlier post, that turns out to be not so clean-cut:

In 1939, Cambridge economist Nicholas Kaldor asserted that the political problem with cost-benefit analysis—that someone always loses out—wasn’t a problem. This was because the government could theoretically redirect a little money from the winners to the losers, to even things out: For example, if a policy caused corn consumption to drop, the government could redirect the savings to aggrieved farmers. However, he didn’t provide any reason why the government would rebalance the scale, just that it was possible. What is now called the Kaldor-Hicks principle “is a theory,” Appelbaum says, “to gladden the hearts of winners: it is less clear that losers will be comforted by the possession of theoretical benefits.” The principle remains the theoretical core of cost-benefit analysis, Appelbaum says. It’s an approach that sweeps the political problems of any policy—what to do about the losers—under the rug.
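The compensation test in the passage above is simple arithmetic: a policy passes if aggregate gains exceed aggregate losses, regardless of whether compensation is ever actually paid. A minimal sketch with hypothetical figures:

```python
# Kaldor-Hicks compensation test, with hypothetical dollar figures.
# The test asks only whether winners' gains exceed losers' losses,
# i.e. whether compensation is *possible* -- not whether it happens.

def kaldor_hicks_passes(gains: float, losses: float) -> bool:
    """True if aggregate gains exceed aggregate losses."""
    return gains - losses > 0

# Hypothetical policy: winners gain $100M, losers lose $60M.
print(kaldor_hicks_passes(100, 60))  # True: the policy "passes"
# Nothing in the test requires the $60M actually be repaid --
# which is exactly the complaint quoted above.
```

The gap between “could compensate” and “does compensate” is the entire political problem the principle sweeps under the rug.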

Of course that becomes harder when you’ve had forty years of billionaire-funded think tanks promoting the idea that any wealth earned in the market is just, no matter what; that market distribution is “fair”; that taxes are “punishing the winners”; and that any assistance to the less fortunate will “encourage dependence on big government.” In short, that redistribution is immoral.

Funny how those think-tanks don’t show up in any of the theorems of welfare economics. So much for theorem #2.

And I’m sure that all those people newly unemployed are just waiting to take advantage of the Pareto optimal distributions of free markets to see them through the next few months.

But what do I know? I’m an “outsider.”

“Nothing will fundamentally change” (Real World Economic Review)