A Theory About Stocks

A lot of people have been utterly mystified by the fact that the stock market seems to be going up and up and up, even as unemployment soars to Great Depression levels, pandemics shut down economies across the globe, American cities are in full revolt, the military is on the streets, clouds of locusts stalk Africa, hurricane season approaches, and the world just generally seems to be melting down around us.

How the f*ck can stocks still be going up???

First the obvious statement: stocks are not the economy. But I’m sure you already knew that. As Paul Krugman put it:

[W]henever you consider the economic implications of stock prices, you want to remember three rules. First, the stock market is not the economy. Second, the stock market is not the economy. Third, the stock market is not the economy. That is, the relationship between stock performance — largely driven by the oscillation between greed and fear — and real economic growth has always been somewhere between loose and nonexistent…Did I mention that the stock market is not the economy?

The stock market is simply a casino. Yes, yes, in economics textbooks written by very serious academic scholars you’ll read sophisticated academic terms like “capital resource allocation” and “price discovery,” and all that, as well as about how distinguished gentlemen are doing very, very serious research and making totally rational assumptions about the future needs of society based on prospectuses and sober, realistic assessments of future earnings and capital flows and…blabbity-blabbity-blah.

Don’t believe a word of it—it’s a f*cking casino. That’s the best way to describe what’s going on when you strip away all the economic jargon designed to baffle us into thinking it’s something so sophisticated that us mere mortals with our puny brains cannot possibly understand.

News flash: Casinos aren’t rational!

Now, of course, it’s not just pure dumb luck like spinning a roulette wheel. It’s obvious that some companies have better future prospects than others. You can get information to make informed choices, just as you can count cards to do better at games of blackjack. But the future is inherently unknowable, and stocks are as much bets on the future as they are assessments of the present.

So what’s a stock worth? What someone else is willing to pay for it.

And I hope you’re smart enough not to buy into economists’ explanation that this is all totally rational. In fact, as anyone who has had a family member with a chronic gambling addiction knows all too well, it is precisely when one is gambling and seeking a windfall that one is least rational.

Furthermore, stock bubbles have been a persistent phenomenon throughout the entire history of capitalism, from the Tulip Bubble (when a single tulip bulb fetched a year’s wages), to the Mississippi Bubble, to the South Sea Bubble, to railroad mania, to the Great Depression, to 2008, and everything in between.

However, I have a theory as to why today’s stock market seems to be so awesomely divorced from the actual world as to seem like it’s on a totally different planet.

In the age of neoliberalism, with wages having been hollowed out for generations, gambling in the stock market is now the only realistic way to make money anymore. Internet gurus like Mr. Money Mustache gain fame and celebrity by telling us we can all get rich by pouring all our money into stonks and retire at 30! (seriously, this is what the guy says). Everywhere on the internet, everyone seems to have morphed overnight into little mini-J.P. Morgans, managing their oh-so complex portfolios, buying and selling their way into the one percent and ready to share their galaxy brain financial knowledge with the rest of us mortals. On Reddit, anyone not gambling in the market is a chump and deserves to starve!

Our only way to retire, we’re told, is buying stocks. Even pension funds are invested in stocks. They even tried to put all our Social Security money into the stock market, for crying out loud (which totally wouldn’t have unrealistically inflated stock values at all, oh no!).

Where else is the money gonna go???

Basically neoliberalism has built everything around the edifice of the stock market. Everything is wrapped up in it. Absolutely everything is invested in these imaginary numbers, untethered from reality. It might as well be a f*cking video game score. To that end, we can all just extend and pretend forever. Why the hell not?

Stocks, stocks, stocks, stocks. Is it any wonder the market always goes up?

To be alive in America is to be assaulted by endless high-decibel blather about the critical importance of the stock market. There are entire TV channels devoted to it, new highs are always celebrated on network news, it’s on the front page of newspapers, it’s on an app that comes preinstalled on your iPhone, and the president is constantly yelling at you about it.

Yet the stock market has little direct relevance for regular people. By some estimates, the richest 10 percent of U.S. households account for over 80 percent of American stock ownership. The richest 1 percent by themselves own half of that, or 40 percent of stock. Half of Americans own no stock at all.

Once you understand this, the media’s stock market mania is maddeningly hilarious. It’s as though half of the national news was yammering about the weather in Greenwich, Connecticut. (“Our top story on ABC World News Tonight: This afternoon Greenwich was unseasonably warm.”) And no one notices how bizarre this is.

By contrast, think about economic facts with concrete relevance to the lives of normal people: the unemployment rate, whether the middle class is getting raises, if the minimum wage is going up, strikes, health care, workplace safety. There’s no cable TV ticker about that.

Coronavirus Matters, the Stock Market Doesn’t, and Thinking It Does May Literally Kill Us (The Intercept)

Paul Krugman also points to the lack of alternatives for actual productive investment:

Investors are buying stocks in part because they have nowhere else to go. In fact, there’s a sense in which stocks are strong precisely because the economy as a whole is so weak. What, after all, is the main alternative to investing in stocks? Buying bonds. Yet these days bonds offer incredibly low returns. The interest rate on 10-year U.S. government bonds is only 0.6 percent, down from more than 3 percent in late 2018. If you want bonds that are protected against future inflation, their yield is minus half a percent. So buying stock in companies that are still profitable despite the Covid-19 recession looks pretty attractive.

And why are interest rates so low? Because the bond market expects the economy to be depressed for years to come, and believes that the Federal Reserve will continue pursuing easy-money policies for the foreseeable future. As I said, there’s a sense in which stocks are strong precisely because the real economy is weak.

Crashing Economy, Rising Stocks: What’s Going On? (New York Times via Reddit)

And, of course, the Federal Reserve is pumping staggering amounts of money into the stock market, buying up assets all over the place. By some measures, several trillion dollars have been spent buying up assets such as stocks, bonds and other securities. Strangely, neither Joe Biden nor Donald Trump ever once asked howyagunnapayforit—they only do that for policies that benefit anyone outside the investor class.

Now, to the point: nearly all financial trading nowadays is done by bots. That is, computers trading with each other to get the best deal. The technical term for this is fintech (financial tech).

I tried to find out how long the average stock is held today, and I couldn’t pin it down. I’ve heard everything from four months to 22 seconds. The only points of agreement are that holding periods have been getting shorter and shorter over the years, and that trading volume has increased enormously (just how much is also hard to discern). Michael Hudson apparently buys the 22 seconds figure:

Michael Hudson, a former Wall Street economist at Chase Manhattan Bank who also helped establish the world’s first sovereign debt fund recently said: “Take any stock in the United States. The average time in which you hold a stock is – it’s gone up from 20 seconds to 22 seconds in the last year. “Most trades are computerised. Most trades are short-term. The average foreign currency investment lasts – it’s up now to 30 seconds, up from 28 seconds last month. The financial sector is short term, yet they talk as if they’re long term.”

Computerised high-frequency trading, which makes up about 70pc of all trades, is the subject of the book, The Fear Index, published late last year.


However, this Business Insider article disputes some of those figures, citing the original source, but it doesn’t offer any alternative figures. I would imagine BI doesn’t want people to start questioning the stock market, and Hudson is very plugged into the finance and economics worlds, so I think his figures can be trusted. Most financial reporting that us “ordinary people” can find via Google is designed to prop up the legitimacy of the casino by pulling the wool over our eyes.

This is just an anecdote, but I was talking with a friend who works in IT about someone he knew who worked in tech in the financial sector. Supposedly this person moved a computer from one office to another office closer to the fiber optic line to shave 15 femtoseconds off trading speed. Yes, femtoseconds, that’s what he said. To save you the search, a femtosecond is one quadrillionth of a second (10⁻¹⁵ s).
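For scale, here’s a quick sanity check (my own arithmetic, not the anecdote’s): in 15 femtoseconds, a signal in fiber covers only a few micrometers, so the real figure was presumably microseconds or nanoseconds.

```python
# Sanity check on the "15 femtoseconds" figure. In fiber, light travels at
# roughly two-thirds of c, so in 15 fs a signal covers only a few micrometers.
# Moving a whole computer closer to the line plausibly saves microseconds.

C = 299_792_458        # speed of light in vacuum, m/s
FIBER_FACTOR = 2 / 3   # rough signal speed in optical fiber, as a fraction of c

def fiber_distance(seconds: float) -> float:
    """Meters a signal travels through fiber in the given time."""
    return C * FIBER_FACTOR * seconds

print(fiber_distance(15e-15))  # roughly 3e-06 m: a few micrometers
print(fiber_distance(15e-6))   # roughly 3000 m: what 15 microseconds buys you
```

Either way, the point of the anecdote stands: these machines compete on timescales no human can perceive.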

Now, bots are obviously unaware that the world is melting down around them. How could they know? They have only one goal: buy and sell stocks to maximize value—almost like the mythical Paperclip Maximizer of AI paranoia.

What’s the cardinal rule of stock buying? Buy low and sell high.

We humans tend to do the opposite thanks to cognitive biases such as loss aversion and the bandwagon effect. We see stocks going up and we want to buy. We see stocks going down and we get spooked, so we sell. That is, we do the opposite of what we should rationally be doing—we tend to buy high and sell low.

So that’s why the financial industry turned to computers.

Computers do not have the irrational biases that fallible humans do. That’s why they are considered better. And that’s why trading is increasingly done by these computers, often fortified with some sort of AI to game the system using algorithms developed by “quants”. I’ve repeatedly heard that the trading pits, where you see all those angry, overweight white dudes in ties screaming at each other like a troop of rabid baboons until they’re beet-red in the face, are kept open only as a sort of performance theater for the masses; there’s no actual trading going on there anymore. It’s all run by computers.

It’s hard to get good data on this; there seems to be a lot of secrecy surrounding it. Presumably they need to prop up the legitimacy of the “democratic” stock market for us average rubes.

But if you’re a bot and you’re designed to buy low, what happens when stocks start dropping in price? When the price is dropping, that means stocks are cheaper. When this happens to fairly good (esp. blue-chip) stocks, that means that they are undervalued. So what do you do? You buy!

Of course, you don’t know the real world of actual people and things “out there” is melting down, because you’re just a brain in a box.

So the computer brain in the box just sees “undervalued stocks” and thinks “buy”. And then other bots see this and buy too. They follow their instructions. And then they sell the stocks to each other. Wash, rinse, repeat. Almost like a simulation.

And, voilà, the stock market goes up. Trading goes on as normal. Paradoxically, the lower the prices go, the more undervalued the stocks appear to the bots, and so the more they buy expecting to get a bargain. So buying activity actually increases! And then the bots buy and sell the stocks to each other, until they go back up to more-or-less where they were before, which they read as the “correct” price, because they are just brains in boxes, with no knowledge that meatspace is quickly sliding into the depths of hell.
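The feedback loop above can be sketched as a toy simulation. This is entirely my own illustration, not how any real trading system works, and all the numbers are made up: every bot treats a long-run average as the “correct” price, buys below it, and sells above it.

```python
import random

# Toy sketch (my own invention): identical mean-reversion bots that buy
# whenever the price dips below what their number-crunching says the stock
# is "worth". Because they all follow the same rule, they herd, and the
# price gets pushed back toward the old level no matter what happens outside.

random.seed(42)

FAIR_VALUE = 100.0  # the price the bots read as "correct"

def bot_orders(price: float, n_bots: int = 50) -> int:
    """Net buy (+) / sell (-) pressure from n_bots identical bots."""
    if price < FAIR_VALUE:
        return n_bots    # looks undervalued: everyone buys
    if price > FAIR_VALUE:
        return -n_bots   # looks overvalued: everyone sells
    return 0

# Meatspace crashes and the price drops to 70; the bots never hear about it.
price = 70.0
for day in range(100):
    price += 0.01 * bot_orders(price)  # order pressure nudges the price
    price += random.gauss(0, 0.5)      # noise from bots trading with each other

print(round(price, 1))  # back in the neighborhood of 100
```

The design choice doing all the work is that the bots’ notion of value is internal to the model: nothing in the loop ever looks at the outside world, so the “crash” is treated as a bargain rather than as information.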

A video game score is the appropriate analogy here. What “real world” thing is your video game score tied to? Does it care whether you’re sick or unemployed? No, it only cares whether you’re playing, and the number is just a made up number based on the whim of a bunch of computer programmers somewhere.

So, the reason stocks are still so high, in my theory, aside from just the Fed money bazooka, is because trading is done by machines with simple mandates based on number crunching, blissfully unaware of the real world going on outside the box, which is riddled with pandemic disease, increasing violence, insurrection, social breakdown, economic depression, and environmental collapse.

I imagine a time in the far future where, after the oceans have risen and coastal cities drowned, when the Amazon rain forest has been slashed and burned and billions flee parts of the globe too hot for human metabolism, after 70 percent of terrestrial animals have gone extinct and we’re scraping the ocean floor for methane hydrates to run our remaining power stations, the computers will still be happily swapping stocks between each other, trading away in the darkness, unmonitored, sending the Dow to 100,000,000, or whatever imaginary nonsense number it will be in the future.

That’s my theory, anyway. However, I have no specialist knowledge in either trading or fintech, so I might be way off on this. If by some unlikely circumstance, someone with actual inside knowledge of either computerized trading or the stock market happens to read this, please let me know if this theory holds any water or is total bullsh!t. Thanks.


Presented without comment:


Civilization Never Changes

I’m glad I was able to recall where I read this fact:

When humans start treating animals as subordinates, it becomes easier to do the same thing to one another. The first city-states in Mesopotamia were built on this principle of transferring methods of control from creatures to human beings, according to the archaeologist Guillermo Algaze at the University of California in San Diego. Scribes used the same categories to describe captives and temple workers as they used for state-owned cattle.

How domestication changes species, including the human (Aeon)

Because it sets this up perfectly:

Do I even need to comment? Plus ça change, plus c’est la même chose…

Have We Entered the Ages of Discord?

Peter Turchin, of Secular Cycles fame, predicted that political violence and discord in the United States would reach a peak in 2020 (or thereabouts). He even put that prediction in writing in a book entitled Ages of Discord:

In 2010 I made the prediction that the United States will experience a period of heightened social and political instability during the 2020s…Structural-demographic theory (SDT) suggests that the violence spike of the 2020s will be worse than the one around 1970, and perhaps as bad as the last big spike during the 1920s. Thus, the expectation is that there will be more than 100 events per 5 years. In terms of the second metric, we should expect more than 5 fatalities per 1 million of population per 5 years, if the theory is correct.

And there you have it. If violence doesn’t exceed these thresholds by 2025, then SDT is wrong.
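Taken literally, Turchin’s criterion can be written as a trivial checker. The thresholds are his; the sample inputs below are placeholders, not real data.

```python
# Turchin's falsifiability criterion, read literally. Thresholds come from
# the quote above; the example numbers are hypothetical placeholders.

EVENTS_THRESHOLD = 100       # political-violence events per 5-year window
FATALITIES_THRESHOLD = 5.0   # deaths per 1 million population per 5 years

def sdt_spike_observed(events: int, deaths: int, population_millions: float) -> bool:
    """True if a 5-year window exceeds both thresholds Turchin names."""
    return (events > EVENTS_THRESHOLD
            and deaths / population_millions > FATALITIES_THRESHOLD)

# Hypothetical window: 120 events and 2,000 deaths among ~330 million people
print(sdt_spike_observed(120, 2000, 330.0))  # -> True (about 6.1 deaths per million)
```

Requiring both metrics at once is my reading of the quote; Turchin presents them as two separate measures of the same predicted spike.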

A Quantitative Prediction for Political Violence in the 2020s (Cliodynamica)

And the 1970s were pretty bad. From a review of Ages of Discord at Slate Star Codex:

The 1970s underground wasn’t small. It was hundreds of people becoming urban guerrillas. Bombing buildings: the Pentagon, the Capitol, courthouses, restaurants, corporations. Robbing banks. Assassinating police. People really thought that revolution was imminent, and thought violence would bring it about.

Book Review: Ages Of Discord (Slate Star Codex)

See also Coronavirus and Our Age of Discord (Cliodynamica)

There are several general trends during the pre-crisis phase that make the rise and spread of pandemics more likely. At the most basic level, sustained population growth results in greater population density, which increases the basic reproduction number of nearly all diseases. Even more importantly, labor oversupply, resulting from overpopulation, depresses wages and incomes for most. Immiseration, especially its biological aspects, makes people less capable of fighting off pathogens. People in search of jobs move more and increasingly concentrate in the cities, which become breeding grounds for disease. Because of greater movement between regions, it is easy for disease to jump between cities.

Elites, who enjoy growing incomes resulting from low worker wages, spend them on luxuries, including exotic ones. This drives long-distance trade, which more tightly connects distant world regions. My 2008 article is primarily about this process, which we call “pre-modern globalizations.” As a result, a particularly aggressive pathogen arising in, for example, China, can rapidly jump to Europe.

Finally, when the crisis breaks out, it brings about a wave of internal warfare. Marauding armies of soldiers, rebels, and brigands, themselves become incubators of disease that they spread widely as they travel through the landscape.

This description is tailored to pre-modern (and early modern) Ages of Discord. Today, in 2020, details are different. But the main drivers — globalization and popular immiseration — are the same…

Right now Turchin is starting to look like Nostradamus. He hasn’t addressed this so far on his blog, but I’m interested to hear his take.

One thing I wonder about though: the police state is so much more powerful than it was in the 1970s due to digital surveillance technology. I mean, everyone carries around a device that tracks all their movements all the time, and the few who don’t will be noticeable by their absence. Cameras are everywhere. Our online presence is constantly monitored, as Edward Snowden revealed (I don’t think it’s a coincidence that laws prohibiting government monitoring of the citizenry have been repealed just in the last few weeks). Plus, the systems of cybernetic control for managing large populations are so much more sophisticated, as Adam Curtis described in All Watched Over By Machines of Loving Grace. I think these cybernetic systems also foment discord as well, since they allow segments of the population to live in completely separate realities managed by different sets of elites—there is no consensus reality anymore, as responses to the pandemic showed.

But it does seem like an alarming number of people have been disenfranchised and have no constructive outlet for their anger, and no effective recourse for changing the system anymore. Add to that Great Depression levels of popular immiseration while elites are being bailed out with unlimited funds. This is what happens when you make peaceful revolution impossible—violent revolution becomes inevitable.

UPDATE: Turchin’s latest post (June 1)

What is much more certain is that the deep structural drivers for instability continue to operate unabated. Worse, the Covid-19 pandemic exacerbated several of these instability drivers. This means that even after the current wave of indignation, caused by the killing of George Floyd, subsides, there will be other triggers that will continue to spark more fires—as long as the structural forces, undermining the stability of our society, continue to provide abundant fuel for them.

Archaeology/Anthropology Roundup

I want to get back to some of the topics I’ve left hanging, but first I’d like to mention a few other topics that have been sadly neglected during the whole—er, pandemic thing—but that we frequently discuss here on the blog. Specifically archaeology and architecture. This one will be about archaeology.

I want to highlight something that came out about a month ago that you’re probably aware of. If not, here it is: the Amazon rain forest has been found to be one of the cradles of agriculture.

The original cradles of agriculture described in history textbooks were the great river valley of Mesopotamia between the Tigris and Euphrates rivers, along with the Nile valley. As archaeology expanded from its European origins, the Indus river valley in India/Pakistan and the Yellow river valley in China were included as cradles of agriculture. Then came New World sources of maize and potatoes in Central and South America. In recent years, archaeologists have included a few other places, notably Papua New Guinea. Now, it seems we can add the Amazon rain forest to the list:

There’s a small and exclusive list of places where crop cultivation first got started in the ancient world – and it looks as though that list might have another entry, according to new research of curious ‘islands’ in the Amazon basin.

The savannah of the Llanos de Moxos in northern Bolivia is littered with thousands of patches of forest, rising a few feet above the surrounding wetlands. Many of these forest islands, as researchers call them, are thought to be the remnants of human habitation from the early and mid-Holocene.

Now, thanks to new analysis of the sediment found in some of these islands, researchers have unearthed signs that these spots were used to grow cassava (manioc) and squash a little over 10,000 years ago.

That’s impressive, as this timing places them some 8,000 years earlier than scientists had previously found evidence for, indicating that the people who lived in this part of the world – the southwestern corner of the Amazon basin – got a head start on farming practices.

In fact, the findings suggest that southwestern Amazonia can now join China, the Middle East, Mesoamerica, and the Andes as one of the areas where organised plant growing first got going – in the words of the research team, “one of the most important cultural transitions in human history”.

Strange Forest Patches Littering The Amazon Point to Agriculture 10,000 Years Ago (Science Alert)

The researchers were able to identify evidence of manioc (cassava, yuca) grown 10,350 years ago. Squash appears 10,250 years ago, and maize more recently – just 6,850 years ago.

“This is quite surprising,” said Dr [Umberto] Lombardo. “This is Amazonia, this is one of these places that a few years ago we thought to be like a virgin forest, an untouched environment. Now we’re finding this evidence that people were living there 10,500 years ago, and they started practising cultivation.”

The people who lived at this time probably also survived on sweet potato and peanuts, as well as fish and large herbivores. The researchers say it’s likely that the humans who lived here brought their plants with them. They believe their study is another example of the global impact of the environmental changes being felt as the world warmed up at the end of the last ice age.

“It’s interesting in that it confirms again that domestication begins at the start of the Holocene period, when we have this climate change that we see as we exit from the ice age,” said Dr Lombardo. “We entered this warm period, when all over the world at the same time, people start cultivating.”

Crops were cultivated in regions of the Amazon ‘10,000 years ago’ (BBC)

Note that what was grown appears to be vegetable plants like cassava (yuca) and squash, not cereal grains. Recall James Scott’s point that annual cereal grains were a starting point for civilizations: because they were preservable and ripened at the same rate at the same time, they could be confiscated by central authorities. Cultures that subsisted on perishable garden plants, however, could escape the trap of civilization.

Here’s a major study that ties into the feasting theory: the first beer was brewed as part of funerary rites for the dead:

The first beer was for the dead. That’s according to a 2018 study of stone vessels from Raqefet Cave in Israel, a 13,000-year-old graveyard containing roughly 30 burials of the Natufian culture. On three limestone mortars, archaeologists found wear and tear and plant molecules, interpreted as evidence of alcohol production. Given the cemetery setting, researchers propose grog was made during funerary rituals in the cave, as an offering to the dearly departed and refreshment for the living. Raqefet’s beer would predate farming in the Near East by as much as 2,000 years — and booze production, globally, by some 4,000 years.

The beer hypothesis, published in the Journal of Archaeological Science: Reports, comes from Raqefet excavators, based at Israel’s University of Haifa, and Stanford University scientists, who conducted microscopic analyses. In previous research, they made experimental brews the ancient way, to see how the process altered artifacts. Some telltale signs were then identified on Raqefet stones: A roughly 10-inch diameter mortar, carved directly into the cave floor, had micro-scratches — probably from a wooden pestle — and starch with damage indicative of mashing, heating and fermenting, all steps in alcohol production. Two funnel-shaped stones had traces of cereals, legumes and flax, interpreted as evidence that they were once lined with woven baskets and used to store grains and other beer ingredients. Lead author Li Liu thinks Natufians also made bread, but that these three vessels were for beer — the earliest yet discovered.

Was the First Beer Brewed for the Dead? (Discover)

The counterpoint is that they were baking bread instead, which leads back to the old question: what were grains first cultivated for, beer or bread? My suspicion is the former, with the latter being an effective use of “surplus” resources, or a backup strategy in case of food shortages.

The connection between beer-brewing and funerary rites is significant, however. The feasting theory of inequality’s origins doesn’t go into much detail about why such feasts were held. But if such ritual feasts were held as a means of commemorating the dead—most likely tied to ancestor worship—then the existence of such events takes on additional importance.

When I talked about the history of cities and the feasting theory, I noted that these gatherings seem to have taken place in ritual areas that were marked off (sacred versus profane) for the purposes of feasting and trade, where multiple different cultures would coalesce and mingle. These locations appear to have played a crucial role in human social development, and they’ve been found all over the world. Archaeologists have been studying one in Florida:

More than a thousand years ago, people from across the Southeast regularly traveled to a small island on Florida’s Gulf Coast to bond over oysters, likely as a means of coping with climate change and social upheaval.

Archaeologists’ analysis of present-day Roberts Island, about 50 miles north of Tampa Bay, showed that ancient people continued their centuries-long tradition of meeting to socialize and feast, even after an unknown crisis around A.D. 650 triggered the abandonment of most other such ceremonial sites in the region. For the next 400 years, out-of-towners made trips to the island, where shell mounds and a stepped pyramid were maintained by a small group of locals. But unlike the lavish spreads of the past, the menu primarily consisted of oysters, possibly a reflection of lower sea levels and cool, dry conditions.

During tough times, ancient ‘tourists’ sought solace in Florida oyster feasts (Phys.org)

So I guess Florida has always been a magnet for tourists.

And although Stonehenge is well-known, much less known is Pömmelte, “Germany’s Stonehenge”.

Starting in April, an about-4,000-year-old settlement will be excavated to provide insights into Early Bronze Age life. Settlements of this size have not yet been found at the related henges in the British Isles.

Pömmelte is a ring-shaped sanctuary with earth walls, ditches and wooden piles that is located in the northeastern part of Germany, south of Magdeburg. The site is very much reminiscent of the world-famous monument Stonehenge, and it is likely that the people there performed very similar rituals to those of their counterparts in what is now Britain 4,300 years ago.

Who lived near Pömmelte, the ‘German Stonehenge’? (DW)

This place reminds me a lot of Woodhenge at the Cahokia complex (Wikipedia), which I was able to visit a few years ago. The presence of such similar structures separated across vast times and places (precluding any chance of cultural contact) is something that we need to think deeply about.

From the article above, I also learned about the Nebra Sky Disc (Wikipedia). Recall that the first cities were trying to replicate a “cosmic order” here on earth.

Related: Hunter-gatherer networks accelerated human evolution (Science Daily)

Humans began developing a complex culture as early as the Stone Age. This development was brought about by social interactions between various groups of hunters and gatherers, a UZH study has now confirmed…

The researchers equipped 53 adult Agta living in woodland in seven interconnected residential camps with tracking devices and recorded every social interaction between members of the different camps over a period of one month. The researchers also did the same for a different group, who lived on the coast….The team of researchers then developed a computer model of this social structure and simulated the complex cultural creation of a plant-based medicinal product.

In this fictitious scenario, the people shared their knowledge of medicinal plants with every encounter and combined this knowledge to develop better remedies. This process gradually leads to the development of a highly effective new medicinal product. According to the researchers’ simulation, an average of 250 (woodland camps) to 500 (coastal camps) rounds of social interactions were required for the medicinal product to emerge.
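A crude toy version of that kind of simulation can be sketched as follows. This is my own sketch, not the study’s actual model, and the ingredient count is invented: agents meet in random pairs, pool what they know, and we count encounters until someone has the full recipe.

```python
import random

# Toy knowledge-diffusion sketch (my own, not the UZH model): each agent
# starts with one piece of the remedy, pairs of agents pool knowledge on
# contact, and we count encounters until one agent has every piece.

random.seed(1)

N_AGENTS = 53       # matches the 53 tracked Agta adults
N_INGREDIENTS = 6   # hypothetical number of plants in the full remedy

def rounds_until_discovery() -> int:
    # Each agent starts knowing a single ingredient; all pieces exist somewhere.
    knowledge = [{i % N_INGREDIENTS} for i in range(N_AGENTS)]
    rounds = 0
    while not any(len(k) == N_INGREDIENTS for k in knowledge):
        a, b = random.sample(range(N_AGENTS), 2)  # a random encounter
        merged = knowledge[a] | knowledge[b]      # sharing on contact
        knowledge[a], knowledge[b] = merged, set(merged)
        rounds += 1
    return rounds

print(rounds_until_discovery())  # number of encounters; varies with the seed
```

Even this crude version shows the study’s basic logic: the more densely connected the camps, the fewer encounters it takes for scattered pieces of knowledge to combine into something new.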

And see: Social Networks and Cooperation in Hunter-Gatherers (NCBI)

A lesser-known megalithic necropolis: the Ħal Saflieni Hypogeum (Wikipedia), built some 5,000 years ago. Do these look like they were built by people who were filthy and starving?

Related: I only recently heard about this site, but apparently there was a significant industrial complex devoted to the manufacture of flint tools that functioned during the Stone Age, and well into the Bronze and Iron Ages: Grimes Graves (Wikipedia). This gives great insight into the fact that complex specialization of labor and regional comparative advantage have always been with us; they weren’t invented at the time of Smith or Ricardo. We just didn’t fetishize them the way we do now.

And the salt mines of Hallstatt in modern-day Austria have likewise been used for thousands of years, dating back to the Bronze Age. Apparently, mining required child labor:

Mining there began at least 7,000 years ago and continues modestly today. That makes the UNESCO World Heritage site “the oldest industrial landscape in the world [that’s] still producing,” says [archaeologist Hans] Reschreiter, who has led excavations at Hallstatt for nearly two decades.

But the mine’s peak was during the Bronze and Iron ages, when salt’s sky-high value made Hallstatt one of Europe’s wealthiest communities. Archaeologists understand a great deal about operations then, thanks to an extraordinary hoard of artifacts including leather sacks, food scraps, human feces and millions of used torches.

Many of the finds are made of perishable materials that are usually quick to decay. They survived in the mine’s tunnels because salt is a preservative — the very reason it was in such high demand during Hallstatt’s heyday.

Among the artifacts, the small shoes and caps showed children were in the mine. But researchers needed more evidence to determine whether the young ones were merely tagging along with working parents or actually mining.

To understand the children’s roles, Austrian Academy of Sciences anthropologist Doris Pany-Kucera turned to their graves. In a study of 99 adults from Hallstatt’s cemetery, she found skeletal markers of muscle strain and injury, suggesting many villagers performed hard labor — some from an early age.

Then, in 2019, she reported her analysis of the remains of 15 children and teenagers, finding signs of repetitive work. Children as young as 6 suffered arthritis of the elbow, knee and spine. Several had fractured skulls or were missing bits of bone, snapped from a joint under severe strain. Vertebrae were worn or compressed on all individuals.

Combining clues from the Hallstatt bones and artifacts, researchers traced the children’s possible contributions to the salt industry. They believe the youngest children — 3- to 4-year-olds — may have held the torches necessary for light. By age 8, kids likely assumed hauling and crawling duties, carrying supplies atop their heads or shimmying through crevices too narrow for grown-ups…

The Ancient Practice of Child Labor Is Coming to Light (Discover)

And this point is important:

It’s no surprise that the young labored at Hallstatt. Children are, and always have been, essential contributors to community and family work. A childhood of play and formal education is a relatively modern concept that even today exists mostly in wealthy societies.

There are those who say that, despite all our technological advancements, we haven’t really reduced the need for human labor. But that’s clearly untrue! We’ve already effectively eliminated the labor of everyone under 18 and, from a practical standpoint, nearly everyone under 21. We forget this because it has been normalized, but people younger than 18 labored all throughout human history, even into the early twentieth century. Now they are no longer needed or wanted. And with ever more schooling required for jobs, we keep raising the age of entry into the workforce. Note that “retirement” (to the extent that it continues to exist) is also a modern phenomenon, removing people over 55 or 60 from the workforce. Labor has most certainly been eliminated, and will continue to be.

Neanderthals and humans co-existed in Europe much longer than we previously thought. (Guardian)

A reminder that many of the earliest human habitats are under the water: Early humans thrived in this drowned South African landscape (Phys.org)

Archaeologists analyzed an ancient cemetery in Hungary containing the distinctive elongated skulls the Huns were known for:

They found that Mözs-Icsei dűlő was a remarkably diverse community and were able to identify three distinct groups across two or three generations (96 burials total) until the abandonment of Mözs cemetery around 470 AD: a small local founder group, with graves built in a brick-lined Roman style; a foreign group of twelve individuals of similar isotopic and cultural background, who appear to have arrived around a decade after the founders and may have helped establish the traditions of grave goods and skull deformation seen in later burials; and a group of later burials featuring mingled Roman and various foreign traditions.

51 individuals total, including adult males, females, and children, had artificially deformed skulls with depressions shaped by bandage wrappings, making Mözs-Icsei dűlő one of the largest concentrations of this cultural phenomenon in the region. The strontium isotope ratios at Mözs-Icsei dűlő were also significantly more variable than those of animal remains and prehistoric burials uncovered in the same geographic region of the Carpathian Basin, and indicate that most of Mözs’ adult population lived elsewhere during their childhood. Moreover, carbon and nitrogen isotope data attest to remarkable contributions of millet to the human diet.

Deformed skulls in an ancient cemetery reveal a multicultural community in transition (Phys.org)

See also: Strange, elongated skulls reveal medieval Bulgarian brides were traded for politics (Science)

Speaking of burials: researchers found 1,000-year-old burials in Siberia wearing copper masks: Mummified by accident in copper masks almost 1,000 years ago: but who were they? (Siberian Times) I thought this was fascinating, given that copper has been shown to kill coronaviruses, and we have been told to wear masks to prevent transmission. Copper-infused masks are becoming popular (a Google search turned up the above article). Coincidence? Probably.

Religion in South America:

An ancient group of people made ritual offerings to supernatural deities near the Island of the Sun in Lake Titicaca, Bolivia, about 500 years earlier than the Incas, according to an international team of researchers. The team’s findings suggest that organized religion emerged much earlier in the region than previously thought.

Rise of religion pre-dates Incas at Lake Titicaca (phys.org)

This is possibly the coolest scientific study ever conducted: a group of scientists have reconstructed Bronze Age fighting techniques by looking at the wear marks on Bronze Age weapons and armor. Wow! Time to redo that famous fight scene from Troy?

While a graduate student at Newcastle University, [University of Göttingen archaeologist Raphael Hermann] recruited members of a local club devoted to recreating and teaching medieval European combat styles, and asked them to duel with the replicas, using motions found in combat manuals written in the Middle Ages. After recording the combat sequences using high-speed cameras, the researchers noted the type and location of dents and notches left after each clash.

The team assigned characteristic wear patterns to specific sword moves and combinations. If the motions left the same distinctive marks found on Bronze Age swords, Hermann says, it was highly likely that Bronze Age warriors had also used those moves. For example, marks on the replica swords made by a technique known to medieval German duelists as versetzen, or “displacement”—locking blades in an effort to control and dominate an opponent’s weapon—were identical to distinct bulges found on swords from Bronze Age Italy and Great Britain.

Next, Hermann and colleagues put 110 Bronze Age swords from Italy and Great Britain under a microscope and cataloged more than 2500 wear marks. Wear patterns were linked to geography and time, suggesting distinct fighting styles developed over centuries… Displacement, for example, didn’t show up until 1300 B.C.E. and appeared in Italy several centuries before it did in Great Britain.

“In order to fight the way the marks show, there has to be a lot of training involved,” Hermann says. Because the marks are so consistent from sword to sword, they suggest different warriors weren’t swinging at random, but were using well-practiced techniques. Christian Horn, an archaeologist at the University of Gothenburg who was not involved in the research, agrees, and says the experiments offer quantitative evidence of things archaeologists had only been able to speculate about.

Sword-wielding scientists show how ancient fighting techniques spread across Bronze Age Europe (Science Magazine)

This is also important from a historical standpoint: it indicates that the Bronze Age likely saw the rise of a class of professional fighters, as opposed to the all-hands-on-deck mêlée style, involving all adult males, that probably characterized Stone Age warfare. Because fighting became “professionalized” by these bronze weapons, which required extensive training to use effectively, the use of force passed into the hands of a specialist warrior caste who were able to impose their will on less well-armed populations.

This probably explains at least some of the origins of inequality, as those who specialized in the use of violence (as opposed to farming or trading) could then perforce become a ruling class. Inequality always rises when the means of force become confined to a specific class of people. Note also that money in coined form was first invented to pay specialist mercenaries in the Greek states of Asia Minor. These mercenaries were likely the ones who were training in the intensive combat techniques described by the study above.

Related: Medieval battles weren’t as chaotic as people think or as movies portray! (Reddit) Given how humans react to violence psychologically, how would medieval battles really have looked, as opposed to the battle scenes depicted in movies? (Hint: not like a mosh pit)

Possibly related: Modern men are wimps, according to new book (Phys.org). Controversial, but likely correct; our ancestors led much more physical lives, and the less fit would not have reproduced as well. My unprovable notion is that we became so effective at warfare that the most violent people died off in these types of conflicts, giving more placid people a reproductive advantage. Thus, we became less violent over time.

Definitely related: What Compelled the Roman Way of Warfare? Killing for the Republic (Real Clear Defense)

Any polity can field an army through compulsion or other violent means. What matters more is what makes your average person choose to stay on the battlefield. [Steele] Brand argues the Roman Republic motivated its soldiers by publicly honoring at all times the initiative, strength, discipline, perseverance, courage, and loyalty of individual citizens. Moreover, it was this combination of public and private values, flexible political institutions, and a tailored upbringing that gradually culminated in the superiority of the Roman legion against the arguably technically superior Macedonian phalanx at Pydna. Brand calls the entirety of this system “civic militarism,” defined as “self defense writ large for the state.”

Paging Dr. Julian Jaynes: Majority of authors ‘hear’ their characters speak, finds study (Guardian). See also The Origin of Consciousness Reading Companion Part 1 (Put a Number On It)

Collapse files:

…a new movement called “collapsology”—which warns of the possible collapse of our societies as we know them—is gaining ground.

With climate change exposing how unsustainable the economic and social model based on fossil fuels is, they fear orthodox thinking may be speeding us to our doom.

The theory first emerged from France’s Momentum Institute, and was popularised by a 2015 book, “How Everything Can Collapse”. Some of its supporters, like former French environment minister Yves Cochet, believe the coronavirus crisis is another sign of impending catastrophe.

While the mathematician, who founded France’s Green party, “still hesitates” about saying whether the virus will be the catalyst for a domino effect, he quoted the quip that “it’s too early to say if it’s too late”.

Yet Cochet—whose book “Before the Collapse” predicts a meltdown in the next decade—is convinced that the virus will lead to “a global economic crisis of greater severity than has been imagined”.

The 74-year-old, who retired to France’s rural Brittany region so he could live more sustainably, is also worried about an impending “global disaster with lots of victims, both economic and otherwise”.

“What is happening now is a symptom of a whole series of weaknesses,” warned Professor Yves Citton of Paris VIII University.

“It isn’t the end of the world but a warning about something that has already been set in motion,” he told AFP, “a whole series of collapses that have begun”.

The slide may be slow, said Jean-Marc Jancovici, who heads the Shift Project think-tank which aims to “free economics from carbon”.

But “a little step has been taken (with the virus) that there is no going back”, he argued.

Others have a more chilling take.

“The big lesson of history… and of the Horsemen of the Apocalypse is that pestilence, war and famine tend to follow in each others’ wake,” said Pablo Servigne, an ecologist and agricultural engineer who co-wrote “How Everything Can Collapse”.

“We have a pandemic which could lead to another shock—wars, conflicts and famines,” he added.

“And famines will make us more vulnerable to other pandemics.”

‘Collapsology’: Is this the end of civilisation as we know it? (Phys.org)

The last ice age (or Last Glacial Maximum) peaked around 26,000 years ago. The earth warmed over the coming millennia, driven by an increase in radiation from the sun due to changes in the earth’s orbit (the Milankovic cycles) amplified by CO₂ released from warming water, which further warmed the atmosphere.

But even as the earth warmed, the warming was interrupted by cooler periods known as “stadials”. These were caused by meltwater from melting ice sheets, which cooled large regions of the ocean.

Marked climate variability and extreme weather events during the early Holocene retarded development of sustainable agriculture.

Sparse human settlements existed about 12,000 – 11,000 years ago. The flourishing of human civilisation from about 10,000 years ago, and in particular from 7,000 years ago, critically depended on stabilisation of climate conditions which allowed planting and harvesting of seed and growing of crops, facilitating growth of villages and towns and thereby of civilisation.

Peak warming periods early in the Holocene were associated with the prevalence of heavy monsoons and heavy floods, likely reflected in the Noah’s Ark story.

The climate stabilised about 7,000 – 5,000 years ago. This allowed the flourishing of civilisations along the Nile, Tigris, Euphrates, Indus and the Yellow River.

The ancient river valley civilisations’ cultivation depended on flow and ebb cycles, in turn dependent on seasonal rains and melting snows in the mountain sources of the rivers. These formed the conditions for production of excess food.

When such conditions declined due to droughts or floods, civilisations collapsed. Examples include the decline of the Egyptian, Mesopotamian and Indus civilisations about 4,200 years ago due to severe drought.

Throughout the Holocene relatively warm periods, such as the Medieval Warm Period (900-1200 AD), and cold periods, such as the Little Ice Age (around 1600 – 1700 AD), led to agricultural crises with consequent hunger, epidemics and wars. A classic account of the consequences of these events is presented in the book Collapse by Jared Diamond.

It’s not just Middle Eastern civilisations. Across the globe and throughout history the rise and fall of civilisations such as the Maya in Central America, the Tiwanaku in Peru, and the Khmer Empire in Cambodia, have been determined by the ebb and flow of droughts and floods.

Greenhouse gas levels were stable or declined between 8,000-6,000 years ago, but then began to rise slowly after 6,000 years ago. According to William Ruddiman at the University of Virginia, this rise in greenhouse gases was due to deforestation, burning and land clearing by people. This stopped the decline in greenhouse gases and ultimately prevented the next ice age. If so, human-caused climate change began much earlier than we usually think.

Rise and fall in solar radiation continued to shift the climate. The Medieval Warm Period was driven by an increase in solar radiation, while the Little Ice Age was caused at least in part by a decrease.

Now we’ve changed the game again by releasing over 600 billion tonnes of carbon into the atmosphere since the Industrial Revolution, raising CO₂ concentrations from around 270 parts per million to about 400 parts per million…

Climate and the rise and fall of civilizations: a lesson from the past (The Conversation)

Lewis Mumford on the Medieval City

One of my favorite passages from The Culture of Cities (pp. 49-51):

In the main, then, the medieval town was not merely a vital social environment: it was likewise adequate, at least to a greater degree than one would gather from its decayed remains, on the biological side. There were smoky rooms to endure; but there was also perfume in the garden behind the burgher’s house: the fragrant flowers and the savory herbs were widely cultivated. There was the smell of the barnyard in the street, diminishing in the sixteenth century, except for the growing presence of horses: but there would also be the odor of flowering orchards in the spring, or the scent of the new mown hay, floating across the fields in the early summer.

Though cockneys may wrinkle their noses at this combination of odors, no lover of the country will be put off by the smell of horse-dung or cow-dung, even though mingled occasionally with that of human excrement: is the reek of gasoline exhaust, the sour smell of a subway crowd, the pervasive odor of a garbage dump, or the chlorinated rankness of a public lavatory more gratifying? Even in the matter of smells, sweetness is not entirely on the side of the modern city.

As for the eye and ear, there is no doubt where the balance of advantage goes: the majority of medieval towns were infinitely superior to those erected during the last century. One awoke in the medieval town to the crowing of the cock, the chirping of the birds nesting under the eaves, or to the tolling of the hours in the monastery on the outskirts, perhaps to the chime of bells in the new bell-tower. Song rose easily on the lips, from the plain chant of the monks to the refrains of the ballad singer in the market place, or that of the apprentice and the house-maid at work. As late as the seventeenth century, the ability to hold a part in a domestic choral song was rated by Pepys as an indispensable quality in a new maid.

There were work songs distinct for each craft, often composed to the rhythmic tapping or hammering of the craftsman himself. Fitz-Stephens reported in the twelfth century that the sound of the water mill was a pleasant one among the green fields of London. At night there would be complete silence, but for the stirring of animals and the calling of hours by the town watch. Deep sleep was possible in the medieval town, untainted by either human or mechanical noises.

If the ear was stirred, the eye was even more deeply delighted. The craftsman who had walked through the fields and woods on holiday, came back to his stone-carving or his wood-working with a rich harvest of impressions to be transferred to his work. The buildings, so far from being “quaint,” were as bright and clean as a medieval illumination, often covered with whitewash, so that all the colors of the image makers in paint or glass or polychromed wood would dance on the walls, even as the shadows quivered like sprays of lilac on the facades of the more richly carved buildings. (Patina and picturesqueness were the results of time’s oxidation: not original attributes of the architecture.)

Common men thought and felt in images, far more than in the verbal abstractions used by scholars: esthetic discipline might lack a name, but its fruit were everywhere visible. Did the citizens of Florence vote as to the type of column that was to be used on the Cathedral? Image makers carved statues, painted triptychs, decorated the walls of the cathedral, the guild hall, the town hall, the burgher’s house: color and design were everywhere the normal accomplishment of the practical daily tasks.

There was visual excitement in the array of goods in the open market: velvets and brocades, copper and shining steel, tooled leather and brilliant glass, to say nothing of foods arranged in their panniers under the open sky. Wander around the survivals of these medieval markets today. Whether they be as drab as the Jews’ Market in Whitechapel, or as spacious as that on the Plain Palais at Geneva, they will still have some of the excitement of their medieval prototypes.


The History of Pandemics

With the global disruption of COVID-19, there have been a number of stories in news outlets documenting the history of past pandemics in an effort to make sense of it all. One name that has come up frequently is Walter Scheidel. The Stanford University historian wrote a book some years ago called “The Great Leveler,” which attracted a great deal of attention. In it, he contended that only catastrophes reduce wealth and income inequality; without them, it grows without bound. One recurring leveler was plagues and pandemics (along with war, famine, collapse and political revolution).

I reviewed that book in a series of three posts:

The Great Leveler Review (Part One)

The Great Leveler Review (Part Two)

The Great Leveler Review (Part Three)

I’ve gone back and cleaned up the typos (the ones I found, anyway). I think these posts are actually quite good (and I’m usually my own harshest critic), so they’re most likely worth a reread.

Here’s Scheidel himself writing in The New York Times summarizing the leveling effect he found during pandemics:

…as successive waves of plague shrank the work force, hired hands and tenants “took no notice of the king’s command,” as the Augustinian clergyman Henry Knighton complained. “If anyone wanted to hire them he had to submit to their demands, for either his fruit and standing corn would be lost or he had to pander to the arrogance and greed of the workers.”

As a result of this shift in the balance between labor and capital, we now know…that real incomes of unskilled workers doubled across much of Europe within a few decades. According to tax records that have survived in the archives of many Italian towns, wealth inequality in most of these places plummeted.

In England, workers ate and drank better than they did before the plague and even wore fancy furs that used to be reserved for their betters. At the same time, higher wages and lower rents squeezed landlords, many of whom failed to hold on to their inherited privilege. Before long, there were fewer lords and knights, endowed with smaller fortunes, than there had been when the plague first struck…

In all of these cases, he notes, the elites pushed back. They weren’t content with their “lessers” having a greater share of the pie (which is, after all, why they were elites):

In late medieval Eastern Europe, from Prussia and Poland to Russia, nobles colluded to impose serfdom on their peasantries to lock down a depleted labor force. This altered the long-term economic outcomes for the entire region: Free labor and thriving cities drove modernization in Western Europe, but in the eastern periphery, development fell behind.

Farther south, the Mamluks of Egypt, a regime of foreign conquerors of Turkic origin, maintained a united front to keep their tight control over the land and continue exploiting the peasantry. The Mamluks forced the dwindling subject population to hand over the same rent payments, in cash and kind, as before the plague. This strategy sent the economy into a tailspin as farmers revolted or abandoned their fields.

The elite pushback often failed in the short-term:

…more often than not, repression failed. The first known plague pandemic in Europe and the Middle East, which started in 541, provides the earliest example. Anticipating the English Ordinance of Laborers by 800 years, the Byzantine emperor Justinian railed against scarce workers who “demand double and triple wages and salaries, in violation of ancient customs” and forbade them “to yield to the detestable passion of avarice” — to charge market wages for their labor. The doubling or tripling of real incomes reported on papyrus documents from the Byzantine province of Egypt leaves no doubt that his decree fell on deaf ears…

During the Great Rising of England’s peasants in 1381, workers demanded, among other things, the right to freely negotiate labor contracts. Nobles and their armed levies put down the revolt by force, in an attempt to coerce people to defer to the old order. But the last vestiges of feudal obligations soon faded. Workers could hold out for better wages, and landlords and employers broke ranks with one another to compete for scarce labor.

And yet, in the long-term, people ended up no better off than they had started:

None of these stories had a happy ending for the masses. When population numbers recovered after the plague of Justinian, the Black Death and the American pandemics, wages slid downward and elites were firmly back in control. Colonial Latin America went on to produce some of the most extreme inequalities on record. In most European societies, disparities in income and wealth rose for four centuries all the way up to the eve of World War I. It was only then that a new great wave of catastrophic upheavals undermined the established order, and economic inequality dropped to lows not witnessed since the Black Death, if not the fall of the Roman Empire.

Why the Wealthy Fear Pandemics (NYTimes)

Past pandemics redistributed income between the rich and poor, according to Stanford historian (Stanford News)

Black Death historian: ‘A coronavirus depression could be the great leveller’ (Guardian)

Can a pandemic remake society? A historian explains. (Vox)

Here are some other pages from history, in somewhat chronological order:

White and Mordechai focused their efforts on the city of Constantinople, capital of the Roman Empire, which had a comparatively well-described outbreak in 542 CE. Some primary sources claim plague killed up to 300,000 people in the city, which had a population of some 500,000 people at the time. Other sources suggest the plague killed half the empire’s population. Until recently, many scholars accepted this image of mass death. By comparing bubonic, pneumonic, and combined transmission routes, the authors showed that no single transmission route precisely mimicked the outbreak dynamics described in these primary sources.

New call to examine old narratives: Infectious disease modeling study casts doubt on the Justinianic Plague’s impact (Phys.org)

Heraclitus compared our lot to beasts, winos, deep sleepers and even children – as in, “Our opinions are like toys.” We are incapable of grasping the true logos. History, with rare exceptions, seems to have vindicated him.

There are two key Heraclitus mantras.

1) “All things come to pass according to conflict.” So the basis of everything is turmoil. Everything is in flux. Life is a battleground. (Sun Tzu would approve.)

2) “All things are one.” This means opposites attract. This is what Heraclitus found when he went tripping inside his soul – with no help of lysergic substances. No wonder he faced a Sisyphean task trying to explain this to us, mere children.

And that brings us to the river metaphor. Everything in nature depends on underlying change. Thus, for Heraclitus, “as they step into the same rivers, other and still other waters flow upon them.” So each river is composed of ever-changing waters.

‘It is disease that makes health sweet and good’ (Asia Times)

Despite the lack of healthcare and public health measures as we understand them – and we will never know how many plague victims died of neglect, hunger and thirst, or of secondary infections – the plague in medieval England, and Western Europe as a whole, was mediated by a system of research, intellectual authority and technical countermeasures.

But that system was religious, based on the Christian church’s management of the passage of souls from this earth to the next world. The forerunner of the modern emergency vehicle was the bell of the priest’s attendants, advising the dying that relief was at hand, in the form of an expert trained and qualified to take confession and administer the other sacraments that would ensure safe passage, if not to heaven, at least to purgatory.

The dividing line between rich and poor wasn’t so much access to drugs or the best doctors as to post-mortem religious services: the prayers, candles, masses and chantries that were meant to speed the dead to a better hereafter. The technical emergencies the authorities faced weren’t shortages of hospital beds and doctors but of candle wax and confessors. Priests were not immune to the plague.

‘Emergency’, or its Latin equivalent, was the word used by the bishop of Bath and Wells in January 1349, six months after the plague began in England, when he broadcast an urgent message to his flock via the surviving parish priests in his diocese. ‘We understand,’ he wrote, ‘that many people are dying without the sacrament of penance, because they do not know what they ought to do in such an emergency and believe that even in an emergency confession of their sins is no use or worth unless made to [an ordained] priest.’ What they had to do, he told them, was ‘make confession of their sins, according to the teaching of the apostle, to any lay person, even to a woman if a man is not available.’

In 1348 (London Review of Books)

It’s hard to keep a virulent disease down. The first and biggest burst of plague lasted from the late 1340s until about 1353. Just as the world started thinking things were getting back to normal, another wave hit in 1360. After that there were new waves every 10 years or so. Europe’s population didn’t get back to pre-plague levels for a century and a half.

We’ve come a long way since the Black Death (Asia Times)

Quarantining was invented during the first wave of bubonic plague in the 14th century, but it was deployed more systematically during the Great Plague. Public servants called searchers ferreted out new cases of plague, and quarantined sick people along with everyone who shared their homes. People called warders painted a red cross on the doors of quarantined homes, alongside a paper notice that read “LORD HAVE MERCY UPON US.” (Yes, the all-caps was mandatory).

The government supplied food to the housebound. After 40 days, warders painted over the red crosses with white crosses, ordering residents to sterilize their homes with lime. Doctors believed that the bubonic plague was caused by “smells” in the air, so cleaning was always recommended. They had no idea that it was also a good way to get rid of the ticks and fleas that actually spread the contagion.

Of course, not everyone was compliant. Legal documents at the U.K. National Archives show that in April 1665, Charles II ordered severe punishment for a group of people who took the cross and paper off their door “in a riotious manner,” so they could “goe abroad into the street promiscuously, with others.” It’s reminiscent of all those modern Americans who went to the beaches in Florida over spring break, despite what public health experts told them.

Just as some American politicians blame the Chinese for the coronavirus, there were 17th century Brits who blamed the Dutch for spreading the plague. Others blamed Londoners. Mr. Pepys had relocated his family to a country home in Woolwich, and writes in his diary that the locals “are afeard of London, being doubtfull of anything that comes from thence, or that hath lately been there … I was forced to say that I lived wholly at Woolwich.”

Annalee Newitz: What social distancing looked like in 1666 (Salt Lake Tribune)

In the cold autumn of 1629, the plague came to Italy. It arrived with the German mercenaries (and their fleas) who marched through the Piedmont countryside. The epidemic raged through the north, only slowing when it reached the natural barrier of the Apennines. On the other side of the mountains, Florence braced itself. The officials of the Sanità, the city’s health board, wrote anxiously to their colleagues in Milan, Verona, Venice, in the hope that studying the patterns of contagion would help them protect their city. Reports came from Parma that its ‘inhabitants are reduced to such a state that they are jealous of those who are dead’. The Sanità learned that, in Bologna, officials had forbidden people to discuss the peste, as if they feared you could summon death with a word.

Plague was thought to spread through corrupt air, on the breath of the sick or trapped in soft materials like cloth or wood, so in June 1630 the Sanità stopped the flow of commerce and implemented a cordon sanitaire across the mountain passes of the Apennines. But they soon discovered that the boundary was distressingly permeable. Peasants slipped past bored guards as they played cards. In the dog days of the summer, a chicken-seller fell ill and died in Trespiano, a village in the hills above Florence. The city teetered on the brink of calamity.

By August, Florentines were dying. The archbishop ordered the bells of all the churches in the city to be rung while men and women fell to their knees and prayed for divine intercession. In September, six hundred people were buried in pits outside the city walls. As panic mounted, rumours spread: about malicious ‘anointers’, swirling infection through holy water stoups, about a Sicilian doctor who poisoned his patients with rotten chickens. In October, the number of plague burials rose to more than a thousand. The Sanità opened lazaretti, quarantine centres for the sick and dying, commandeering dozens of monasteries and villas across the Florentine hills. In November, 2100 plague dead were buried. A general quarantine seemed the only answer. In January 1631, the Sanità ordered the majority of citizens to be locked in their homes for forty days under threat of fines and imprisonment.

In his Memoirs of the Plague in Florence, Giovanni Baldinucci described how melancholy it was ‘to see the streets and churches without anybody in them’. As the city fell quiet, ordinary forms of intimacy were forbidden. Two teenage sisters, Maria and Cammilla, took advantage of their mother’s absence in the plague hospital to dance with friends who lived in the same building. When they were discovered, their friends’ parents were taken to prison. At their trial, the mother, Margherita, blamed the two girls: ‘Oh traitors, what have you done?’ Another pair of sisters found relief from the boredom of quarantine by tormenting their brother. Arrested after one of the Sanità’s policemen saw them through an open door, one of them explained in court that ‘in order to pass the time we dressed our brother up in a mask, and we were dancing among ourselves, and while he was … dressed up like that, the corporal passed by … and saw what was going on inside the house.’ Dancing and dressing up were treacherous actions, violating the Sanità’s measures to control movement, contact, breath. But loneliness afflicted people too…


The poor were judged not only careless but physically culpable, their bodies frustratingly vulnerable to disease. The early decades of the 17th century in Europe saw widespread famines, sky-high grain prices, declining wages, political breakdown and violent religious conflicts. (This is the ‘general crisis of the 17th century’ that Important Male Historians like to debate.) One Florentine administrator, surveying the surrounding countryside, reported that even before the epidemic struck, villages were ‘full of people, who feed themselves with myrtle berries, acorns and grasses, and whom one sees along the roads seeming like corpses who walk’. The city was not much better. A diarist in Florence in 1630 noted the ‘many poor children who eat the stalks of cabbages that they find on the street, as though, through their hunger, they seem like fruit’. Famine was compounded by the steep decline of the textile industry in the city, as producers in England, Holland and Spain undercut prices; the number of wool workshops halved between 1596 and 1626. These long, lean years of unemployment and hunger had left Florentines acutely susceptible to the coming epidemic.


The Sanità arranged the delivery of food, wine and firewood to the homes of the quarantined (30,452 of them). Each quarantined person received a daily allowance of two loaves of bread and half a boccale (around a pint) of wine. On Sundays, Mondays and Thursdays, they were given meat. On Tuesdays, they got a sausage seasoned with pepper, fennel and rosemary. On Wednesdays, Fridays and Saturdays, rice and cheese were delivered; on Friday, a salad of sweet and bitter herbs. The Sanità spent an enormous amount of money on food because they thought that the diet of the poor made them especially vulnerable to infection, but not everyone thought it was a good idea. Rondinelli recorded that some elite Florentines worried that quarantine ‘would give [the poor] the opportunity to be lazy and lose the desire to work, having for forty days been provided abundantly for all their needs’.

The provision of medicine was also expensive. Every morning, hundreds of people in the lazaretti were prescribed theriac concoctions, liquors mixed with ground pearls or crushed scorpions, and bitter lemon cordials. The Sanità did devolve some tasks to the city’s confraternities. The brothers of San Michele Arcangelo conducted a housing survey to identify possible sources of contagion; the members of the Archconfraternity of the Misericordia transported the sick in perfumed willow biers from their homes to the lazaretti. But mostly, the city government footed the bill. Historians now interpret this extensive spending on public health as evidence of the state’s benevolence: if tracts like Righi’s brim over with intolerance towards the poor, the account books of the Sanità tell an unflashy story of good intentions.

But the Sanità – making use of its own police force, court and prison – also punished those who broke quarantine. Its court heard 566 cases between September 1630 and July 1631, with the majority of offenders – 60 per cent – arrested, imprisoned, and later released without a fine. A further 11 per cent were imprisoned and fined. On the one hand, the majority of offenders were spared the harshest penalties, of corporal punishment or exile. On the other, being imprisoned in the middle of a plague epidemic was potentially lethal; and the fines levied contributed to the operational budget of the public health system. The Sanità’s lavish spending on food and medicine suggests compassion in the face of poverty and suffering. But was it kindness, if those salads and sausages were partly paid for by the same desperate people they were intended to help? The Sanità’s intentions may have been virtuous, but they were nevertheless shaped by an intractable perception of the poor as thoughtless and lazy, opportunists who took advantage of the state of emergency.

Early modern historians used to be interested in the idea of the ‘world turned upside down’: in moments of inversion during carnival when a pauper king was crowned and the pressures of a deeply unequal society released. But what emerges from the tangle of stories in John Henderson’s book is a sense that for many the world stood still during the plague. The disease waned in the early summer of 1631 and, in June, Florentines emerged onto the streets to take part in a Corpus Christi procession, thanking God for their reprieve. When the epidemic finally ended, about 12 per cent of the population of Florence had died. This was a considerably lower mortality rate than other Italian cities: in Venice 33 per cent of the population; in Milan 46 per cent; while the mortality rate in Verona was 61 per cent. Was the disease less virulent in Florence or did the Sanità’s measures work? Percentages tell us something about living and dying. But they don’t tell us much about survival. Florentines understood the dangers, but gambled with their lives anyway: out of boredom, desire, habit, grief…

Florence Under Siege: Surviving Plague in an Early Modern City by John Henderson.

Inclined to Putrefaction (London Review of Books)

The majority of the population feared and condemned inoculation. Even many of those who were in favor of it were torn by doubts and religious scruples. Was inoculation a “lawful” practice? Was smallpox not a “judgement of God,” sent to punish and humble the people for their sins? Was being inoculated not like “taking God’s Work out of His Hand”?

Douglass played upon such popular scruples to the apparent discomfiture of his clerical opponents. Turning to the ministers he challenged them to determine, as a “Case of Conscience,” how placing more trust in human measures than in God was consistent with the devotion and subjection owed to the all-wise providence of the Lord. That he had not raised this issue in good faith becomes evident from a passage contained in a private letter suggesting jeeringly that his correspondent might perhaps admire how the clergy reconciled inoculation with their doctrine of predestination…

Ever since she had accompanied her husband on a diplomatic mission to Turkey, where she had become acquainted with inoculation and convinced of its merits, it had been Lady Mary Wortley Montagu’s ambition to bring “this useful invention into fashion in England.” That the country’s best medical minds had not sanctioned the practice did not deter Lady Mary. She bided her time. In the 1721 epidemic she asked Charles Maitland, the physician who four years earlier had inoculated her young son in Constantinople, to perform the operation now on her little daughter. She also enlisted the interest of the Princess of Wales, at whose request the King agreed to pardon a number of prisoners who were under sentence of death if they submitted to inoculation. Six convicts in Newgate Prison were ready to do so, and on August 9, about the time Boylston was injecting his patients, they were inoculated by Maitland. The results at first were good. The ice had been broken and during the next months further persons underwent inoculation at his hands. The culmination of Lady Mary’s crusade was the inoculation of the daughters of the Prince and Princess of Wales…

With improvement in its techniques, inoculation gained increasing favor as a method for the prophylaxis of smallpox until it finally, nearly eighty years later, gave way to Jenner’s magnificent discovery of vaccination.

When Cotton Mather Fought Smallpox (American Heritage)

Asiatic cholera, one of humanity’s greatest scourges in the modern period, came to Europe for the first time in the years after 1817, traveling by ship and caravan route from the banks of the Ganges, where it was endemic, to the Persian Gulf, Mesopotamia and Iran, the Caspian Sea and southern Russia, and then—thanks to troop movements occasioned by Russia’s wars against Persia and Turkey in the late 1820s and its suppression of the revolt in Poland in 1830–1831—to the shores of the Baltic Sea. From there its spread westward was swift and devastating, and before the end of 1833 it had ravaged the German states, France, and the British Isles and passed on to Canada, the western and southern parts of the United States, and Mexico.

Politics of a Plague (NYRB)

Typhoid was a killer but it belonged to another world. The disease thrived in the overcrowded, insanitary conditions of New York’s slums, such as Five Points, Prospect Hill and Hell’s Kitchen. The family of one of the victims hired a researcher called George Soper and the diligent Mr Soper proved to be Mary’s nemesis – even though when he first tracked her down she chased him out of her kitchen with a carving fork. And that’s part of the problem with Mary.

It’s possible to sympathise with her refusal to believe that she could be transmitting a disease from which she never suffered herself. But Mr Soper had correctly identified her as an asymptomatic carrier of Typhoid fever. She would never get the disease herself but would never stop giving it to other people.

Not surprisingly, Mary Mallon found this impossible to understand. But the New York authorities were desperate and in 1907 Mary was exiled to the isolation facility on North Brother Island in the river outside New York.

How Typhoid Mary left a trail of scandal and death (BBC)

At the end of the 19th century, one in seven people around the world had died of tuberculosis, and the disease ranked as the third leading cause of death in the United States. While physicians had begun to accept German physician Robert Koch’s scientific confirmation that TB was caused by bacteria, this understanding was slow to catch on among the general public, and most people gave little attention to the behaviors that contributed to disease transmission. They didn’t understand that things they did could make them sick.

In his book, Pulmonary Tuberculosis: Its Modern Prophylaxis and the Treatment in Special Institutions and at Home, S. Adolphus Knopf, an early TB specialist who practiced medicine in New York, wrote that he had once observed several of his patients sipping from the same glass as other passengers on a train, even as “they coughed and expectorated a good deal.” It was common for family members, or even strangers, to share a drinking cup.

With Knopf’s guidance, in the 1890s the New York City Health Department launched a massive campaign to educate the public and reduce transmission. The “War on Tuberculosis” public health campaign discouraged cup-sharing and prompted states to ban spitting inside public buildings and transit and on sidewalks and other outdoor spaces—instead encouraging the use of special spittoons, to be carefully cleaned on a regular basis. Before long, spitting in public spaces came to be considered uncouth, and swigging from shared bottles was frowned upon as well. These changes in public behavior helped successfully reduce the prevalence of tuberculosis.

How Epidemics of the Past Changed the Way Americans Lived (Smithsonian)

Hassler shared his doubts about a closure order, but suggested that a short closure order would “limit most of all the cases to the home and give the other places a chance to thoroughly clean up and thus we may bring about a condition that will reduce the number of cases.” Several in attendance felt that a general closure order would induce panic in the people, would be costly, and would not stop the spread of the epidemic. Theater owners and dance hall operators supported a closure order, hoping that it would bring a quick end to the epidemic that was already causing a drastic reduction in revenue (one owner estimated that his receipts had fallen off 40% since the start of the epidemic). After some discussion, the Board of Health voted to close all places of public amusement, ban all lodge meetings, close all public and private schools, and to prohibit all dances and other social gatherings effective at 1:00 am on Friday, October 18. The Board did not close churches, but instead recommended that services and socials be either discontinued during the epidemic or held in the open air. City police were given a list of the restrictions and directed to ensure compliance with the order. The Liberty Loan drive, always the concern of citizens as they tried to outdo other cities in fundraising, would be allowed to continue by permit, as would all public meetings.

Despite the closure order and gathering ban, the centerpiece of San Francisco’s crusade against influenza was the face mask. Several other cities also mandated their use, and many more recommended them for private citizens as well as for physicians, nurses, and attendants who cared for the ill. But it was San Francisco that pushed for the early and widespread use of masks as a way to prevent the spread of the dread malady. On October 18, the day that the other health measures went into effect, Hassler ordered that all barbers wear masks while with customers, and recommended clerks who came into contact with the general public also don them. The next day, Hassler added hotel and rooming house employees, bank tellers, druggists, store clerks, and any other person serving the public to the list of those required to wear masks. Citizens were again strongly urged to wear masks while in public. On October 21, the Board of Health met and issued a strong recommendation to all residents to wear a mask while in public.

The wearing of a mask immediately became a symbol of wartime patriotism…

The American Influenza Epidemic of 1918-1919: San Francisco (Influenza Archive)

It’s difficult to say where this pandemic is leading. On the one hand, it has revealed the extent to which the most essential workers of our society are underpaid and undervalued. It has shown how dependent we are on transient and undocumented workers who are routinely brutalized, especially in the food system. It has exposed the dark underbelly of how food ends up on our shelves and how fragile our food system really is. It has led to an upsurge in union activism and strikes. It has demonstrated the fragility of long, just-in-time supply chains and the downside of outsourcing absolutely everything, such that no one country can produce anything anymore.

It has laid bare the cracks in our society. It has shown that the philosophy of “small government” promoted by billionaires and corporations is a disaster in times of crisis. It has shown that the pattern of crippling and hobbling state and local governments in favor of empowering markets and wealthy private actors is counterproductive. It has shown the utter folly of tying the basics of life to formal employment, such as housing and health care. It has shown that depending on “free markets” for absolutely everything doesn’t work when those markets shut down due to inevitable crises. It has shown the fecklessness and incompetence of America’s leaders, as well as their amorality and bottomless greed.

Yet it has also empowered authoritarians and dictators the world over. It has superempowered the ability of states to track and monitor their citizens. It has devastated local economies and small businesses, while shifting wealth, power, and economic activity to transnational corporations who have access to unlimited money from captured governments. It has led to an upsurge in activity among the extremist far-right and well-armed and organized Fascist militias. The stock market reaches a new high every time the unemployment rate goes up, while the financial industry is bailed out. Unemployment is at Great Depression levels, while workers in the U.S. are told by politicians to fend for themselves. “Essential” workers are ordered back to work or threatened with benefit cut-offs. To date, it has increased inequality.

It has also reduced pollution levels and crippled much of air travel, perhaps forever. It has substantially reduced demand for fossil fuels, even as prices reach all-time lows. It has caused cities to close off streets and avenues to cars in favor of bicycles and pedestrians. It has increased the viability of working from home.

In short, it’s complicated. But much of what happens will be up to us. Will we become more extremist, authoritarian and unequal? Will we continue to embrace the Social Darwinism promoted by our betters? Or will we demand that essential workers be paid better, that unions no longer be suppressed, that working hours drop, that commuting go away, that streets be prioritized for bikes, and that the government spend its trillions on helping the average citizen rather than just big corporations and the investor class? It could go either way. Walter Scheidel concludes:

In looking for illumination from the past on our current pandemic, we must be wary of superficial analogies. Even in the worst-case scenario, Covid-19 will kill a far smaller share of the world’s population than any of these earlier disasters did, and it will touch the active work force and the next generation even more lightly. Labor won’t become scarce enough to drive up wages, nor will the value of real estate plummet. And our economies no longer rely on farmland and manual labor.

Yet the most important lesson of history endures. The impact of any pandemic goes well beyond lives lost and commerce curtailed. Today, America faces a fundamental choice between defending the status quo and embracing progressive change. The current crisis could prompt redistributive reforms akin to those triggered by the Great Depression and World War II, unless entrenched interests prove too powerful to overcome.

Why Democrats Suck

So the news is that Larry Summers is Joe Biden’s economic advisor.

I’ll take credit for being early on the “People like Larry Summers are the problem with the Democrats” train. I wrote a whole post on it way back in November. In it, I wrote:

Listening to arrogant Ivy League hyper-elite technocrats like Larry Summers is exactly why the Democratic Party is in the pathetic state it is in, and continually loses elections, even to incompetent morons like Donald Trump. If Larry Summers is a representation of “liberal values” then God help us all.

Don’t Think Like an Economist

Here are some insights into Mr. Summers’ worldview from various people. From Yanis Varoufakis:

‘There are two kinds of politicians,’ [Summers] said: ‘Insiders and outsiders. The outsiders prioritize their freedom to speak their version of the truth. The price of their freedom is that they are ignored by the insiders, who make the important decisions. The insiders, for their part, follow a sacrosanct rule: never turn against other insiders and never talk to outsiders about what insiders say or do. Their reward? Access to inside information and a chance, though no guarantee, of influencing powerful people and outcomes.’ With that Summers arrived at his question. ‘So, Yanis,’ he said, ‘which of the two are you?’

From Elizabeth Warren:

Late in the evening, Larry leaned back in his chair and offered me some advice. By now, I’d lost count of Larry’s diet Cokes, and our table was strewn with bits of food and spilled sauces. Larry’s tone was in the friendly-advice category. He teed it up this way: I had a choice. I could be an insider or I could be an outsider. Outsiders can say whatever they want. But people on the inside don’t listen to them. Insiders, however, get lots of access and a chance to push their ideas. People–powerful people–listen to what they have to say. But insiders also understand one unbreakable rule: They don’t criticize other insiders.

I had been warned.

From Thomas Frank’s book, Listen Liberal (p. 173):

‘One of the challenges in our society is that the truth is kind of a disequalizer,’ Larry Summers told journalist Ron Suskind during the early days of the Obama administration. ‘One of the reasons that inequality has probably gone up in our society is that people are being treated closer to the way that they’re supposed to be treated.’

And let’s not forget:

In the 1990s, during Bill Clinton’s presidency, the derivatives market was taking off and Brooksley Born was chair of the Commodity Futures Trading Commission. She warned that unregulated derivatives trading posed a risk to the nation’s financial stability. She wanted more transparency of this dark market.

But Born was undercut in her efforts by no less than Treasury Secretary Robert Rubin, Federal Reserve Chairman Alan Greenspan, Deputy Secretary of the Treasury Larry Summers and SEC Chair Arthur Levitt. This boys club turned out to be dead wrong. But they had the power. They convinced Congress to strip the CFTC of its power to regulate derivatives.

The Cassandras of Our Time: Brooksley Born and Ann Ravel (Brennan Center)

Summers is also a favorite economist of the Marginal Revolution blog from George Mason University and the Mercatus Center, the epicenter of Kochenomics.

And remember, folks, the Democrats are the “Leftist” party in the United States. After all, where are you going to go?

That doesn’t bode well for the Biden campaign, does it? But it does make sense: Biden is opposed to Medicare for all, student debt forgiveness, subsidized higher education, green job creation programs, wealth taxes, higher minimum wages and universal basic income. In opposing these, he consistently invokes the old canard: Howyagunnapayforit?

Either that, or it’s “means test” everything. After all, we have to make absolutely sure that no one “undeserving” may >*gasp*< get a benefit they don’t deserve! Perish the thought! Only the truly bereft are worthy of any kind of societal benefit; the rest of us “real citizens” can get our needs met by shopping in the big, glorious Market.

Of course, this means-testing bullshit leaves all sorts of cracks that people often slip through, ensuring that any government program is as unpopular as possible. This is by design. So, depending on your income, you may or may not get health care via the government. Poor enough? Get Medicaid. Suddenly earn $1 over the cutoff? Sorry, no Medicaid for you. Have you tried the Obamacare exchanges? Rich enough to have a “Cadillac Plan?” Oh, we’re going to tax that. All just so we don’t have to cover everyone.

Or take higher education. Make under X amount: here’s a (partial) scholarship. Make over X amount? No college aid for you. All so we don’t have free higher education for all.

Robert Evans put it well on a podcast about the West Virginia coal miners’ war (the Battle of Blair Mountain):

[59:52] “You’ll hear people saying ‘basic income seems like a great idea, but what if X group…what if rich people get it; that’s not fair.'”

“One of the problems with that is that, when you start saying stuff like, ‘We need a basic income; we need free college; we need universal health care,’ and people start bringing [up], ‘What about this group, what about that group?’ What they’re really saying is ‘I don’t believe that this is an inherent right. I think certain individual groups might deserve it, but I don’t see it as an inherent right.'”

“And I think one of the lessons of the labor movement is [that] this shit only works when you treat it like an inherent right and you reject attempts to divide people, even among groups that might make sense to you at the time. Because, in reality, if you’re agreeing to that division at all, you are against the idea that people have a right to this sort of thing.”

The other thing it does is allow recipients of such “government largesse” to be depicted as “cheats” and “scroungers.” Add that to the bogus idea that “my tax dollars fund the government,” and you play right into the Conservative/Libertarian framing of, “They’re stealing my hard-earned (it’s always ‘hard-earned’) money to give to those layabouts, for benefits I’m not even entitled to!” In other words, it’s deliberately sabotaging social programs to give Conservatives the ammunition they need to destroy them.

Again, to repeat, this is by design!

And since the Democrats know that the baton of government will inevitably be passed back and forth between the parties, they can count on Republicans to chip away at, or even dismantle, the programs that they’ve created. They can then depict them as the bad guys, even though that was the plan all along. Good cop, meet bad cop.

Here’s the dirty little secret: They don’t want these programs to succeed!

Thus the two party duopoly functions as one wrestling tag-team implementing the same set of Neoliberal policies to enrich the donor class at our expense.

But if both parties are virtually identical when it comes to economic philosophy, who can you vote for if you don’t agree with that kind of philosophy?

No one. And that’s the goal of the two-party system. There is no alternative. That’s why the Democrats were far more effective in opposing Bernie Sanders than they have been in opposing the so-called “mortal threat” Trump. #Resistance.

In my original post, I said:

My core point is this: this kind of autistic “economic thinking” is the very reason why the voting public believes there is no substantial difference between the Republicans and the (Neoliberal) Democrats. And they’re right! It’s also worth noting that Professor Cowen has let the cat out of the bag, tacitly admitting that the very discipline of economics is inherently right-wing (it makes him suspect among the left…). Yet it still masquerades as ideologically neutral!

Don’t Think Like an Economist

This article from Policy Tensor makes a lot of the same points:

The tunnel vision of global leaders and the wider discourse of the articulate class is symptomatic of a deeper malaise. Put simply, we are in the grip of a very powerful ideology. It is an ideology that subordinates all goals, including the survival of our species and the web of life with which it is inextricably intertwined, to the goal of maximizing economic growth.

But it does much more. Economics as ideology distorts our perception of contemporary and historical reality. It misguides us into flawed explanatory schema for the most important historical explananda. It sharply narrows the possibility space of human action. And, most important of all, it closes off all rational courses of action that may thwart the collapse of world civilization that is increasingly getting baked in as we ride up the hockey stick of doom.

Economics as Ideology (1): Introduction (Policy Tensor)

The genius of economics is that it is an ideology that masquerades as non-ideological. Economists always win the debate once you accept their framing of the world: as a cost-benefit cash nexus, full of rational actors where nature has no inherent value. Add that new factory to GDP, don’t subtract all the people who will get cancer from it, and so on.

Once you accept their premises, you are guided along (as if by an invisible hand) to their desired conclusions, which, by some coincidence, always benefit the rich and powerful.

And these axioms have colonized our consciousness to the extent that we don’t even think of them as axioms; we just accept them as natural. They’ve achieved cultural hegemony, in Gramsci’s terminology.

When someone says, “you just don’t understand economics,” what they’re really saying is, “You’re not looking at the world through the same blinkered, autistic view as I am, therefore you can’t be taken seriously.”

Ideology consists of widely-shared lenses that are worn unconsciously. It is when we are not aware of the limits of applicability of our reference frame, when we mistake the map for the territory, that we are being ideological. More often than not, we are simply unaware that we are using a specific lens to interrogate reality. Ideology manifests itself in widely-shared and unarticulated premises. It is most evident in things that are simply assumed to be true and require no justification whatsoever — mere assertion suffices. But even though widely accepted, such premises may not hold. A gap thus opens up between discourse and reality. Such gaps are a recipe for disaster. All man-made catastrophes are due, in large part, to such gaps.

Hence Tyler Cowen’s compliment to Summers cited in the original post: he never ceases to think like an economist. Because thinking like an economist will be sure to get you to the libertarian conclusions that Cowen and his patrons favor, even if you are officially classified as “liberal” or are a member of the Democratic Party. Two parties, one ideology.

Recall that the modern discipline of economics as developed under the marginal revolution in the late 1800s (hence the name of the blog) is based on the following core theorems:

There are two fundamental theorems of welfare economics.

-First fundamental theorem of welfare economics (also known as the “Invisible Hand Theorem”):

any competitive equilibrium leads to a Pareto efficient allocation of resources.

The main idea here is that markets lead to a social optimum. Thus, no intervention by the government is required, and it should adopt only “laissez faire” policies. However, those who support government intervention say that the assumptions needed in order for this theorem to work are rarely seen in real life.

It must be noted that a situation where someone holds every good and the rest of the population holds none is a Pareto efficient distribution. However, this situation can hardly be considered perfect under any welfare definition. The second theorem allows a more reliable definition of welfare.

-Second fundamental theorem of welfare economics:

any efficient allocation can be attained by a competitive equilibrium, given the market mechanisms leading to redistribution.

This theorem is important because it allows for a separation of efficiency and distribution matters. Those supporting government intervention will ask for wealth redistribution policies.

Welfare economics I: Fundamental theorems (Policonomics)

In other words, the greatest welfare (optimal good) is achieved by government getting out of the way and letting markets rip. This is not a value statement; this is baked into the very heart of economics as a discipline! Also note:

“…a situation where someone holds every good and the rest of the population holds none, is a Pareto efficient distribution.” Hmmmm…

The second theorem states that the “winners” will compensate the “losers”, and this supposedly “cleans up” the problems with the first theorem. But as noted in my earlier post, that turns out to be not so clean-cut:

In 1939, Cambridge economist Nicholas Kaldor asserted that the political problem with cost-benefit analysis—that someone always loses out—wasn’t a problem. This was because the government could theoretically redirect a little money from the winners to the losers, to even things out: For example, if a policy caused corn consumption to drop, the government could redirect the savings to aggrieved farmers. However, it didn’t provide any reason why the government would rebalance the scale, just that it was possible. What is now called the Kaldor-Hicks principle “is a theory,” Appelbaum says, “to gladden the hearts of winners: it is less clear that losers will be comforted by the possession of theoretical benefits.” The principle remains the theoretical core of cost-benefit analysis, Appelbaum says. It’s an approach that sweeps the political problems of any policy—what to do about the losers—under the rug.

Of course that becomes harder when you’ve had forty years of billionaire-funded think tanks promoting the idea that any wealth earned in the market is just, no matter what; that market distribution is “fair”; that taxes are “punishing the winners”; and that any assistance to the less fortunate will “encourage dependence on big government.” In short, that redistribution is immoral.

Funny how those think-tanks don’t show up in any of the theorems of welfare economics. So much for theorem #2.

And I’m sure that all those people newly unemployed are just waiting to take advantage of the Pareto optimal distributions of free markets to see them through the next few months.

But what do I know? I’m an “outsider.”

“Nothing will fundamentally change” (Real World Economic Review)

A Finance Primer 2

Previously we learned what securities were: financial instruments designed to swap debt all over the place (we’ll ignore ownership for now). Basically, they’re just IOUs.

The financial “industry” consists of trading and gambling with these securities. For every financial transaction, there are parties and counterparties. Like the tango, it takes two.

As I mentioned, banks and other financial institutions loan to each other all the time. This turns out to be quite important. I want to outline a little more of the financial system using that concept.


As I said above, financial institutions such as banks lend to, and borrow from, each other. But they don’t do it the way they lend to and borrow from you and me. They buy and sell IOUs.

If you deposit $100 and then return a month later to take out your $100, it’s not the exact same $100 you deposited. When you give the bank your money, they invest it, and the money they give back is money they’ve made. Banks are trying to make money from their deposits just like everyone else, and that’s why they loan them out to other institutions. One place where they do so is called the repo market. This is basically banks loaning to other banks.

Recall that banks have a reserve requirement: the fraction of deposit liabilities (what the bank owes its customers) that must be held in reserve rather than loaned out.

If a bank has more reserves at the central bank than they need, they loan it out to other banks who may be short. This is done through what are called repurchase agreements, which is where the term “repo” comes from.

So if Bank A has an extra $50m at the end of Thursday, and Bank B needs to get $50m on its books to comply with regulations, Bank A will loan out its extra $50m at an agreed interest rate, and Bank B will buy its collateral back on Friday morning. Hence repurchase agreements.

The way this is done is by selling securities. In the above example, Bank B sells Bank A securities in exchange for cash. It agrees to buy back the securities from Bank A at a later date at an agreed-upon (usually higher) price.

Example transaction:

- Borrower sells a government bond worth $100 to the lender for $100 cash.
- Borrower agrees to buy back the bond for $101 in 1 year.
- 1 year later, the borrower buys the bond back for $101, and the lender receives $101.

So what happened here? Changing the verbiage to what makes more intuitive sense, the borrower gave a lender an asset in exchange for $100 cash. 1 year later the borrower paid back the initial amount ($100) plus an extra amount that they had previously agreed upon ($1) and got the asset back.

Repo Rate (for a 1-year repo) = (Final Price - Initial Price) / Initial Price, so in this case the repo rate is ($101 - $100) / $100 = 1%.

ELI5: What is the repo rate? (Reddit)
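The arithmetic is simple enough to sketch in a few lines of Python (using the quote’s toy numbers; the function name is mine, purely illustrative):

```python
def repo_rate(initial_price: float, final_price: float) -> float:
    """Repo rate for a single-period repurchase agreement: the premium
    paid at buyback, as a fraction of the cash originally lent."""
    return (final_price - initial_price) / initial_price

# The quote's example: sell a bond for $100 cash, buy it back for $101 a year later.
rate = repo_rate(100.0, 101.0)
print(f"Repo rate: {rate:.1%}")  # Repo rate: 1.0%
```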

Recall that Treasury securities are basically another name for the government’s debt; i.e. the “national debt.” The same national debt that you hear all the scary stories about. Turns out that the debt is necessary for the financial system to function–something the horror stories never manage to tell you. Having no national debt would actually be a problem. Banks are, in fact, required to hold a certain amount of U.S. debt in the form of Treasury securities.

If the U.S. paid off its debt there would be no more U.S. Treasury bonds in the world…The U.S. borrows money by selling bonds. So the end of debt would mean the end of Treasury bonds.

But the U.S. has been issuing bonds for so long, and the bonds are seen as so safe, that much of the world has come to depend on them. The U.S. Treasury bond is a pillar of the global economy.

Banks buy hundreds of billions of dollars’ worth, because they’re a safe place to park money. Mortgage rates are tied to the interest rate on U.S. Treasury bonds. The Federal Reserve — our central bank — buys and sells Treasury bonds all the time, in an effort to keep the economy on track.

If Treasury bonds disappeared, would the world unravel? Would it adjust somehow?

What If We Paid Off The Debt? The Secret Government Report (NPR)

The agreed interest rate that banks charge each other for these overnight loans is called the Fed Funds Rate. The rate at which banks lend to each other obviously affects the rate at which banks lend to you. That’s why it’s extremely important to the financial system as a whole. It sets many of the other domestic interest rates in a sort of domino effect.

The federal funds rate is the rate at which depository institutions (banks) lend reserve balances to other banks on an overnight basis (or slightly longer). Reserves are excess balances held at the Federal Reserve to maintain reserve requirements. The rate is primarily determined by the balance of supply and demand for the funds.

[The Fed Funds Rate] is just about the most fundamental metric in the entire financial system. Everything from government bonds, to commercial loans, to your mortgage is influenced by how much banks have to pay to square their books at the end of the day.

ELI5: What exactly is the financial repo market? (Reddit)

If there is not enough cash in the system, the Fed Funds rate will increase, affecting the entire system. If the rate is being affected not by market fundamentals, but by some sort of financial crisis for instance, more money is injected via the Federal Reserve to try and bring the rate down.

The Fed Funds rate…is not strictly controlled by the Federal Reserve, but is a market effect that’s a result of Fed actions. Specifically, the Fed buys or sells bonds. If the Fed buys bonds there is more cash in the economy (which means in bank coffers) and fewer interest-paying bonds. This makes rates go down because money supply is higher and banks need to entice someone to borrow it. Inversely, if the Fed sells bonds it pulls cash out of the economy and can make rates higher…Buying or selling bonds a little at a time lets the market naturally adjust the rate to where they want it.

Why does the Fed raise interested [sic] rates (Reddit)

If banks need to borrow money directly from the Fed itself, this is called the discount window, for reasons I’m not entirely sure of, except to make this more complicated and obscure. The Fed is just the national bank, and the interest rate banks pay to borrow from the Fed is called the discount rate. This rate is determined by the Fed. When the Fed wants to inject more money into the system, they lower the discount rate so banks can borrow more money.

Money is a commodity, and its “price” is the interest rate. So the Fed doesn’t “declare” an interest rate. It sets a target interest rate, and then buys or sells bonds in order to achieve that interest rate.

We said last time that bonds (IOUs) are a way of “locking up” money for a while. You can imagine dollars being removed temporarily from society like prisoners sitting behind bars in a jail cell if you like. The length of time the bond is for (term) is the “sentence” for the money locked away in the bond. By the time the bond “gets out of jail,” it has had a baby called interest. How many “babies” it has had is determined by the amount of time it has been locked up for.
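If you want to play with the metaphor, here’s a minimal compound-interest sketch in Python (the 5% rate and the terms are just illustrations, not anyone’s actual bond yields):

```python
def value_at_maturity(principal: float, annual_rate: float, years: int) -> float:
    """Money 'locked up' at a fixed rate, compounding once a year."""
    return principal * (1 + annual_rate) ** years

# $100 'sentenced' to terms of 1, 5, and 10 years at 5%:
for term in (1, 5, 10):
    print(term, "years ->", round(value_at_maturity(100, 0.05, term), 2))
# Longer sentences, more 'babies': 105.0, 127.63, 162.89
```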

Selling treasury bonds takes cash *out* of the system (because people pay cash for the bonds). Redeeming bonds puts cash back *into* the system (because the bonds are redeemed in cash). Sometimes, of course, the bond is not redeemed for cash; it is simply “rolled over”–one IOU is exchanged for another IOU–in which case cash is not injected back into the system.

The removal and injection of cash influences the amount of overall money in the system, which consequently determines the interest rate–the price of money. The rate at which banks lend to each other helps determine the rate at which they lend to you and everyone else.

Peter Conti-Brown describes the process:

The [Federal Open Markets Committee], in its eight annual meetings, establishes the target federal funds rate, or the rate it wishes to see in the markets for these interbank, short-term loans.

To reach this target, the Fed buys or sells securities on the open market, through the trading desk at the Federal Reserve Bank of New York. Here’s the connection to the federal government: the New York Fed’s primary conventional tool to accomplish the FOMC’s objectives is the purchase and sale of short-term government securities.

When the FOMC decides to raise interest rates, the New York Fed pulls cash out of the financial system by selling the short-term government debt securities the Fed keeps on its books; when the FOMC decides to lower interest rates, the New York Fed injects cash into the financial system by buying those securities back from the market participants.

When the Fed buys a Treasury security on the open market, it provides the bank on the other side of the transaction with cash–an electronic modification to the bank’s balance sheet. This purchase removes the security from the bank’s balance sheet, and replaces it with greater reserves in the bank’s account at its local Federal Reserve Bank.

In this way, the Fed has expanded the money supply by removing from the banking system an asset that is harder to sell on the market–a government bond or, more recently, a mortgage backed security–and replacing it with cash, literally Federal Reserve Notes (mostly electronic, of course). These notes are easier assets to use to exchange for other goods and services. Indeed, as the notes themselves report, they are “legal tender for all debts public and private.” In the monetary metaphor, cash is the most “liquid” of assets.

The Fed regulates how much cash banks must keep in their reserves, which are deposited at the bank’s regional Fed. If the bank already has the requisite level of reserves required by the Fed, the cash that is added to its balance sheet through its sale of a Treasury security to the Fed is something extra.

The Power and Independence of the Federal Reserve pp. 131-132

The money injected into the banks in this way is called “high-powered money” because the banking system can multiply it through fractional reserve lending. If the reserve requirement is adjusted down, banks can hold less cash and the money multiplier effect is greater, expanding the money supply. If the reserve requirement goes up, banks can lend out less of their deposits, resulting in less money creation.

Banks are generally in the business of taking the “something extra” and injecting it into the economy in the form of bank loans. Under normal circumstances, the bank will lend the cash it has received from the Fed in exchange for its more illiquid security. Doing so expands the money supply in the economy as the bank borrower spends that money and multiplies the money’s reach through a daisy chain of spending, investing, and saving.

Because most consumers of bank credit–usually businesses, but also individuals–don’t carry around much cash in their wallets or under their mattresses, the money the first bank receives through the Fed’s initial market transaction goes to another bank or business, which gives to another bank or business, and on and on. The effect is a more or less predictable expansion of the money supply throughout the banking system. ibid. pp. 132-133
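The multiplier arithmetic works out to 1/r, where r is the reserve ratio: each dollar of reserves can support, at most, 1/r dollars of deposits. A quick sketch (my own toy function, not anything official):

```python
def money_multiplier(reserve_ratio: float) -> float:
    """Upper bound on deposit expansion per dollar of reserves:
    1 + (1 - r) + (1 - r)**2 + ... = 1 / r."""
    return 1.0 / reserve_ratio

print(money_multiplier(0.10))  # 10.0 -> $1 of reserves can support up to $10 of deposits
print(money_multiplier(0.05))  # 20.0 -> lowering the requirement raises the multiplier
```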

In a financial panic the banks need cash. Based on the above, what happens? Rather than being locked up, money is set free. The Fed buys Treasury securities, i.e. government debt. Or it buys the mortgage-backed securities (which are theoretically backed by people paying their mortgages, except in the financial crisis they weren’t, meaning that they were essentially worthless paper). That’s why the debt increases. It has to. That’s how the system is designed. Remember, treasury securities are the government’s debt.

US Fed balance sheet increases to record $6.62 trillion (Deccan Herald)

So we see that the selling of bonds has nothing to do with the government needing to get the money from the private sector in order to spend. The sovereign never needs to “borrow” the currency over which it has the exclusive right to issue–that is logically ridiculous. It does so by convention (holdovers from the gold standard era and mercantilism), which ends up providing a risk-free asset–Treasury securities–in which the private sector can hold cash. The selling and buying of this asset help to stabilize the price of money by injecting it into, or taking it out of, the banking system.

Of course, as we learned last time, the fractional reserve theory of banking is obsolete. Now that demand deposits are the main way of conducting transactions, and are treated like cash, the loan itself creates the deposit on the bank’s balance sheet.

In the [intermediation of loanable funds] (ILF) model [of banking], bank loans represent the intermediation of real savings, or loanable funds, between non-bank savers and non-bank borrowers. But in the real world, the key function of banks is the provision of financing, or the creation of new monetary purchasing power through loans, for a single agent that is both borrower and depositor.

The bank therefore creates its own funding, deposits, in the act of lending, in a transaction that involved no intermediation whatsoever. Third parties are only involved in that the borrower/depositor needs to be sure that others will accept his new deposit in payment for goods, services, or assets. This is never in question, because bank deposits are any modern economy’s dominant medium of exchange.

Furthermore, if the loan is for physical investment purposes, this new lending and money is what triggers investment and therefore…saving. Saving is therefore a consequence, not a cause, of such lending. Saving does not finance investment, financing does. To argue otherwise confuses the respective macroeconomic roles of resources (saving) and debt-based money (financing).

Bank of England Working Paper No. 529: Banks are not intermediaries of loanable funds–and why this matters. Zoltan Jakab and Michael Kumhof.

So we saw three ways the Fed can influence the amount of money in the banking system, and hence, the economy:

1.) Manipulate the fed funds rate by buying or selling Treasury securities;
2.) Lower the discount rate; that is, the cost to borrow from the Fed;
3.) Lower the reserve requirement–the amount of deposit money banks must keep in their accounts at the Fed.

What if the price of money is already so low that it’s practically free? Turns out that’s where we are now. When that happens, rates can’t really go any lower, and that’s a problem. We’ll discuss it another time.


If a particular bank didn’t have enough reserves, it would borrow from other banks in the system, as we said above. But in a financial panic, when bad loans are roiling the entire system due to a bad harvest or something, every bank is trying to make sure it has enough money to stave off a bank run, and no bank wants to lend out money. They can’t get a temporary loan from the other bank, because the other bank is in the exact same boat! And so on, throughout the entire financial system.

So in that type of environment, where would the banks get the money they needed? Who would loan it out? Often the answer was a bailout organized by wealthy Wall Street financiers getting the money from London or something. But after enough of these ad hoc rescues (the last one famously spearheaded by J.P. Morgan), everyone knew that some sort of permanent bank “above” all the others was needed to loan out money from the federal government during times of banking panic or financial crisis. And these banking panics used to happen a lot. And I mean a lot. As “Adam Smith” writes:

The framers of the law that created [the Federal Reserve] were literally panic-stricken. Schoolchildren once learned the dates of bank panics in history as if they were battles: the Panic of 1837, the Panic of 1873, the Panic of 1893, the Panic of 1907. Boom, bust, the agitated lines of depositors stretching out into the street, people anxious to get their money out while it was still possible; then collapse, and depression.

When Congress established this Reserve system in 1913, it contemplated that the Reserve could stop a run on the country’s banks with money that was not gold, not silver, not anything in the vaults. By a series of mechanisms, the Federal Reserve could add money to the government’s own bank account. Stricken banks would have a senior friend, and the Federal Reserve, the central bank, would regulate the country’s supply of money. This office had great financial power. PAPER MONEY, pp. 15-16

Thus, the Fed is the lender of last resort. Peter Conti-Brown describes:

Maturity transformation is what makes banks so vulnerable to failure. The system relies on an assumption that only a few short-term depositors will withdraw their money on a given day. When more depositors show up at a bank demanding even more in withdrawals than is on deposit in the vaults, the bank will fail unless new cash can be secured.

The first priority of a bank facing a panic, then, is to publicly and ostentatiously demonstrate that it has secured new funds, usually from another bank. As soon as the panic is stabilized, a bank with a well-managed portfolio will pay back both its emergency debts to the other banks that have stepped in to the breach and those the bank owes to its regular depositors. (In the sometimes confusing vernacular of banking, the bank’s “assets” include the loans it is owed by homeowners, business owners, and the like; its “liabilities” include the deposits that savers can demand at any time.)

If the panic spreads, however, the usual sources of short-term credit to the first scared banker–other banks–are themselves tied up trying to fend off bank runs of their own. At that point, a localized panic has become systemic.

In that event, banks scramble in a dash to find liquidity in the system, wherever liquidity can be found. The central bank is the solution to the scramble for liquidity. The central bank is the “lender of last resort,” the bank that exists to restore order to the financial system.

Peter Conti-Brown; The Power and Independence of the Federal Reserve, p. 152

So the problem with all the “End the Fed” dipshits is, how ya gonna stabilize the banking system, then? Or should we just let it crash every ten years?

In the bad old days, if a bank failed, you lost all your money. Now deposits are insured, up to a limit. If your bank fails, you will be reimbursed for the amount you lost, up to the insurance limit (so that $1,000.00 in your account is safe). This was an innovation of the Great Depression, and it pretty much eliminated bank runs.
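The payout rule is just a minimum. A toy sketch (the $250,000 cap is the current U.S. FDIC limit per depositor, per bank; the function itself is mine, purely illustrative):

```python
def insured_payout(balance: float, limit: float = 250_000.0) -> float:
    """Deposit insurance reimburses the lesser of your balance and the cap."""
    return min(balance, limit)

print(insured_payout(1_000.0))    # 1000.0 -> small accounts are fully covered
print(insured_payout(400_000.0))  # 250000.0 -> anything above the cap is at risk
```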

But, as you can see above, a lot of the assets on a bank’s books are loans for real estate, i.e. houses. There are always some people who default on their mortgages, of course. But if enough of those mortgages go bad all at the same time, then assets go poof, and the bank suddenly becomes insolvent. And if it happens all over the entire country, then the above scenario applies–each bank is trying to save its own balance sheet, and so has no money to loan to other potentially insolvent banks in the system. The Federal Reserve has to step in.

And that, my friends, is what 2008 was all about.

Debt is a tradable commodity, and when wound back to the source somebody is paying that debt. And when they stop paying, that debt suddenly becomes a worthless commodity. Multiply that by a few trillion and you’ve got an economic crisis on your hands. So goes the financial system.

Making it worse was that the mortgage debt was “sliced and diced” into mortgage-backed securities and traded all over the place. So some pension system in Oregon or somewhere was dependent on a bunch of mortgages in Las Vegas, and now it’s gone bust too. Plus, many had taken out insurance policies on these securities, and the underwriters assumed they’d never have to pay up. When everything went bad at once, the claims they had to pay overwhelmed the money they had on hand, and the insurance companies went belly-up too.

Why did this happen? Well, the banks made shady loans. And, in a perfect world, banks that made shady loans should go under. But when it’s endemic throughout the whole damn banking system, you can’t just let the entire banking system itself go under, because capitalism requires a banking system in order to function (which hopefully is obvious). So it’s got to be saved, and so some people who made the original shady deals got rescued too by default. Kind of like if terrorists blew a hole in a ship, and then got pulled from the water during the rescue along with all the other drowning passengers in the aftermath. It would have been nice if we had at least prosecuted the shady dealers in the courts, but, oh well, you know…

The problem now is that money is disappearing because businesses can’t do business, and workers can’t earn money because they’re unemployed. Even those who are not unemployed are not spending, because 1.) they’re stashing money away due to uncertainty, or 2.) everywhere they would have spent it is closed. Nonessential businesses are prohibited by governments from doing business at all, and many workers can’t leave their homes except for essential errands and enjoying the outdoors while keeping their distance from everyone. With no revenue coming in, businesses can’t pay back their loans–and loans are the bank’s assets, remember. With businesses not doing business and people not earning salaries, money is disappearing, and loans are going bad left and right.

Because banks have accounts at the Fed, as we saw above, they are first at the trough for money. We ordinary citizens do not have accounts at the Fed. Although an interesting proposal suggests that maybe we should:

Now more than ever, therefore, we need an alternative to entrusting our security to institutions so prone to disaster, which is why Fed accounts for all is a proposal that is not only attractive and practical, but also urgent. Bankers can tell you that the Fed is an enviably indulgent loan-officer, charging minimal interest rates – currently 2.5 percent—on loans which, when passed on to customers in the form of credit card debt, carry hefty (17 percent!) profitable interest rates. So why shouldn’t the rest of us get in on the act?

This is no fringe proposal, having been advanced by a number of responsible authorities and even in a paper published last year by the eminently orthodox Federal Reserve Bank of St. Louis. The authors, two Swiss economists, proposed “central bank electronic money for all” allowing “all households and firms to open accounts at central banks, which then would allow them to make electronic payments with central bank money instead of commercial bank deposits.”

Forget Checks, How About Giving Everyone a Federal Reserve Account? (The American Conservative)

This would also allow the government to replace the lost paychecks from everyone either losing their job or being forced to stay at home:

Finally, this necessary reform would pave the way for an equally useful innovation: universal basic income. The notion of assuring everyone of a guaranteed income with no strings attached has been gaining increasing attention and support around the world in recent years, and in the presidential campaign of Andrew Yang. It has indeed been implemented in a number of locales with striking success.

One notable example, the Alaska Permanent Fund, distributes up to $2,000 to every Alaskan citizen every year. When the fund was inaugurated (by a Republican governor) in 1976 the state ranked highest in poverty rates in the country. Twenty years later, Alaska had the lowest. When the British Labour Party proposed a move toward UBI in its election manifesto prior to last year’s election, the proposal elicited a predictably choleric response from some, with the Financial Times sputtering that “rewarding people for staying at home, is what lies behind social decay”. Given that we are all now encouraged or forced to stay at home, the complaint seems ironic in the extreme.

Thus we see that the Fed is 1.) A lender who lends money to solvent banks when no one else can or will (lender of last resort); and 2.) The thing that manipulates the price of money by buying and selling securities in the market to keep the supply of money in line with what the economy actually needs. This results (hopefully) in not too much inflation or deflation.

We’ll talk about those two things next time.

Where Money Comes From

(Originally from previous post—broken apart because of length)

By now, hopefully you should know that new money is created when people and businesses take out loans.

That is, money is injected into the economy by governments through banks as intermediaries.

This is something that is quite controversial in economic circles. For a long time–and still often today–many economists do not accept this explanation, despite overwhelming evidence for it.

For a long time, in fact, economists did not really think about money at all. This may seem odd, in that most of us think that economics is the study of money! But they portrayed money as simply a means of exchange–the intermediate good that allowed one thing to be exchanged for another thing in the real economy. It was not worthy of note in and of itself, they thought. Sure, how much of it there was floating around might be important, but beyond that you didn’t really need to think about it too much.

To explain how money was created, economists developed three alternative theories of banking. Let’s look at each one in turn.

1.) The loanable funds theory. This is the idea that banks are simply intermediaries between savers and borrowers.

So you save $100 a week out of your paycheck, let’s say. Multiply that by thousands of workers and businesses throughout the entire economy and you’ve got growing piles of cash in banks’ vaults (or, rather, on their balance sheets) over time.

At the same time, you’ve also got people who need money. They want to do some sort of profitable enterprise, but they don’t have the money to do it right now. Or they want to buy something that they can’t pay for on the spot, but can pay back over a period of time, like a house or car for instance. Where do they go? To the bank, of course!

If you’re a bank, you pay savers a particular interest rate to get the money into your vaults, and then you loan out that money at a higher interest rate to those who want to borrow. That’s how you make your money.

So the banks are the intermediaries between savers and borrowers—those who want to save and those who want to borrow. Those are never perfectly in sync in any particular bank, so banks borrow from and loan to each other as a routine matter. But the banking system as a whole functions as an effective intermediary between savers and borrowers. The rest of the money is circulating, presumably.

In this scenario, only when the desire to borrow is greater than the desire to save, is new money injected into the system by governments through various means. Banks borrow new money from the government’s central bank to loan out.

2.) The fractional reserve theory. This idea is similar to the above, but allows for money to be created not by individual banks, but through the banking system as a whole through the process of “multiple deposit expansion.” That is, the banks themselves are still intermediaries just as above, but when the central bank injects money into the banking system, that money is multiplied through the actions of banks making loans–the multiplier effect.

With a reserve requirement of 10 percent, a bank would lend out 90 percent of a deposit, which would increase the deposits at other banks in the system, which would subsequently lend out 90 percent of those deposits, resulting in an expansion of money throughout the banking system due to the process of loaning money. Any individual bank still has to get deposits in order to lend, according to the theory. But the act of lending does create new money elsewhere in the system. George Goodman (a.k.a. “Adam Smith”) explains how it works. He imagines an oil company depositing $100.00 in a U.S. bank:

Now the bank has a deposit, let’s say…of $100. The Federal Reserve says that bank has to keep 10 percent of the deposit as a reserve. You walk in and borrow $90. You put that money in a checking account; now it’s a deposit there, and your cousin Charley can walk in and borrow $81, because that fractional reserve is set each time. Your cousin Charley deposits his loan in his checking account, and the bank lends $72.90 to the next borrower. That’s the way the multiplier works, and it keeps on going. If the Federal Reserve wants more money in the banks, it lowers the fractional reserve, so that you can borrow $95 instead of $90, and your cousin Charley can borrow $85.50 instead of $81. If the Federal Reserve wants there to be less money, it raises that fractional reserve.

PAPER MONEY, p. 245.
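Goodman’s chain of deposits can be reproduced step by step in a few lines (a toy model, assuming every loan is redeposited in full at the next bank):

```python
def deposit_chain(initial_deposit: float, reserve_ratio: float, steps: int) -> list:
    """Follow one deposit through successive banks: each bank keeps
    reserve_ratio on hand and lends out the rest, and the loaned-out
    portion becomes the next bank's deposit."""
    deposits = []
    amount = initial_deposit
    for _ in range(steps):
        deposits.append(round(amount, 2))
        amount *= (1 - reserve_ratio)  # the lent portion is redeposited downstream
    return deposits

print(deposit_chain(100.0, 0.10, 4))  # [100.0, 90.0, 81.0, 72.9] -- Goodman's numbers
# Carried on forever, total deposits approach 100 / 0.10 = $1,000.
```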

This system allegedly originated in the goldsmiths’ discovery that they could make more loans than there was actual gold in their vaults, so long as too many people didn’t show up to claim their gold all at once.

Paper money, it is said, originated with the goldsmiths of Europe who held the private gold hoards deposited by wealthy citizens for safekeeping. The goldsmith issued a receipt for the gold deposit, and over time, it became clear that the receipt itself could be used in commerce since whoever owned that piece of paper could go to the goldsmith and claim the gold.

Modern banking originated in the goldsmith’s discovery that they could safely write more receipts and lend them to people, exceeding the total gold that was on hand, so long as they always kept a reasonable minimum in reserve to honor withdrawals. This was the origin of fractional reserve banking and the bank lending that created money.

This private money system endured for centuries and was inherited by the American Republic: privately owned banks created money by issuing paper bank notes, paper backed by a promise that at any time it could be redeemed in gold.

In nineteenth-century America, the money in use consisted mainly of these privately issued bank notes, backed by gold or silver guarantees. The money’s value was really dependent, therefore, on the soundness and probity of each bank that issued notes. Banking scandals were recurrent, particularly on the frontier, where ambitious bankers, eager to make new loans for enterprises, sometimes printed paper money that had no gold behind it. Governments imposed regulations to keep banks honest, but the bankers still were free to create their own varieties of money. When banks failed, their money failed with them.

SECRETS OF THE TEMPLE; William Greider, pp. 227-228

3.) The credit creation theory. In this view, new money is created when loans are extended. That is, the bank does not have to make sure it has enough deposits to make the loan; it simply creates a deposit for the amount of the loan that the borrower can draw against.

The third theory of banking is at odds with the other two theories by representing banks not [simply] as financial intermediaries — neither in aggregate nor individually. Instead, each bank is said to create credit and money out of nothing whenever it executes bank loan contracts or purchases assets.

So banks do not need to first gather deposits or reserves to lend. Since bank lending is said to create new credit and deposit money, an increase in total balances takes place without a commensurate decrease elsewhere. Therefore according to this theory, over time bank balance sheets and measures of the money supply tend to show a rising trend in time periods when outstanding bank credit grows — unlike with the financial intermediation theory, where only existing purchasing power can be re-allocated and the money supply does not rise.

A lost century in economics: Three theories of banking and the conclusive evidence; Richard A. Werner
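The bookkeeping at the heart of the credit creation theory can be shown with a minimal balance sheet. This is a sketch of my own (the class and its fields are illustrative, not from Werner): when the bank makes a loan, it records the loan as an asset and the borrower’s new deposit as a liability at the same moment, with no pre-existing deposit being drawn down.

```python
# Minimal balance-sheet sketch of the credit creation theory:
# a loan and its matching deposit are created together, "out of nothing."

class Bank:
    def __init__(self):
        self.loans = 0.0      # assets: claims on borrowers
        self.deposits = 0.0   # liabilities: money the bank owes depositors

    def make_loan(self, amount):
        # No reserves or prior deposits are checked or transferred;
        # both sides of the balance sheet simply expand by the loan amount.
        self.loans += amount
        self.deposits += amount

bank = Bank()
bank.make_loan(100_000)
print(bank.loans, bank.deposits)  # 100000.0 100000.0
```

The key contrast with the first two theories is that total deposits in the system rise without any commensurate decrease elsewhere: the balance sheet grows rather than reshuffles.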

William Greider summarized this process in his mammoth book, Secrets of the Temple. First, he describes the transition from bank notes hypothetically backed by gold to demand deposits recorded in bank ledgers:

The money illusion was transferred to a new object with the rise of demand deposits, better known as checking accounts. Instead of currency, the paper money created by banks, people hesitantly came to accept that money also existed simply as an account in the bank’s ledger, redeemable by personal drafts or checks. In the United States, the transition was inadvertently stimulated by government regulation. The National Bank Act, enacted during the Civil War, placed a heavy tax on new bank notes issued by state banks, and in order to avoid the tax, banks encouraged customers to use demand deposits–writing personal checks instead of drawing out their money in cash.

It took generations for the public to overcome its natural distrust of checks, but by 1900 most people were persuaded. Personal checks, written by the buyers themselves, were accepted as just as valuable as dollar bills. Currency remained in use, but demand deposits were by now the bulk of the money supply. The nationalization of currency issuance, completed with the creation of the Federal Reserve in 1913, simply continued this arrangement. A new dimension of trust had been added to the illusion. pp. 227-228

He then goes on to describe just how money is created using these demand deposit accounts via the banking system:

New money was created not only by the Federal Reserve but also by private commercial banks. They did it by new lending, by expanding the outstanding loans on their books. Routinely, a bank borrowed money from some group, the depositors, and lent it to someone else, the borrowers, a straightforward function as intermediary. But, if that was all that occurred, then credit would be frozen in size, unable to expand with new economic growth. On the margins, therefore, bankers expanded their lending on their own and the overall pool of credit grew–and the banks’ credit turned it into money.

A bank officer authorizes a $100,000 loan to a small-business man–a judgement that the businessman’s future earnings will be sufficient to repay the loan, that his enterprise would create real value in the future, which would justify the risk and the creation of the additional money. Ordinarily the banker would not hand over $100,000 in dollar bills. He would simply write a check or, more likely, enter a credit in the businessman’s bank account for $100,000. Either way, money has been created by the simple entry in a ledger.

Implausible as that might seem, it was a reality that everyone would accept, even if they were unaware of its audacity. The businessman would go out and spend the money, writing checks on his new account, and everyone would honor their value. The creation of new money, thus, was really based on bank-created debt.

This concept is what baffled and outraged so many critics of the money system. Money ought to be “real,” they insisted. It should be based on something tangible from the past, accumulated wealth like gold, not on a banker’s hunch about the future.

How could such a system possibly work? Why didn’t it collapse and produce social disaster? The short, simple explanation was: trust. People trusted the banks…They believed, perhaps not even knowing the actual mechanics, that bankers would use this magic prudently. Banks would make sound loans that would be repaid, and they would always keep enough money on hand so that any individual depositor could always withdraw his when he needed it. pp. 59-60

Clearly, the trust in banks described by Greider above has been undermined by the financialization of the economy, not to mention the bailouts. David Graeber sums up the three schools of banking in the New York Review of Books:

Economists, for obvious reasons, can’t be completely oblivious to the role of banks, but they have spent much of the twentieth century arguing about what actually happens when someone applies for a loan.

One school insists that banks transfer existing funds from their reserves, another that they produce new money, but only on the basis of a multiplier effect (so that your car loan can still be seen as ultimately rooted in some retired grandmother’s pension fund). Only a minority—mostly heterodox economists, post-Keynesians, and modern money theorists—uphold what is called the “credit creation theory of banking”: that bankers simply wave a magic wand and make the money appear, secure in the confidence that even if they hand a client a credit for $1 million, ultimately the recipient will put it back in the bank again, so that, across the system as a whole, credits and debts will cancel out. Rather than loans being based in deposits, in this view, deposits themselves were the result of loans.

The one thing it never seemed to occur to anyone to do was to get a job at a bank, and find out what actually happens when someone asks to borrow money. In 2014 a German economist named Richard Werner did exactly that, and discovered that, in fact, loan officers do not check their existing funds, reserves, or anything else. They simply create money out of thin air, or, as he preferred to put it, “fairy dust.”

…Before long, the Bank of England (the British equivalent of the Federal Reserve, whose economists are most free to speak their minds since they are not formally part of the government) rolled out an elaborate official report called “Money Creation in the Modern Economy,” replete with videos and animations, making the same point: existing economics textbooks, and particularly the reigning monetarist orthodoxy, are wrong. The heterodox economists are right. Private banks create money.

Central banks like the Bank of England create money as well, but monetarists are entirely wrong to insist that their proper function is to control the money supply. In fact, central banks do not in any sense control the money supply; their main function is to set the interest rate—to determine how much private banks can charge for the money they create. Almost all public debate on these subjects is therefore based on false premises. For example, if what the Bank of England was saying were true, government borrowing didn’t divert funds from the private sector; it created entirely new money that had not existed before.

Why is this important? It’s important because it shows that if private borrowing creates new money, then it is excessive levels of private borrowing that will expand the money supply. Consequently, debt defaults will contract the money supply. So there’s more to macroeconomic stability than merely “government money printing.”
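The macroeconomic point above can be sketched as a toy accounting identity. This is an assumed model of my own, not a formula from the text: if bank lending creates deposit money and loan write-offs destroy it, then the money supply tracks the net of new private borrowing over defaults.

```python
# Toy sketch (assumed model) of the claim that, under credit creation,
# new private lending expands the money supply and defaults contract it.

def update_money_supply(money_supply, new_lending, loan_writeoffs):
    """Deposits created by new loans add to the money supply;
    defaulted loans written off remove deposit money from it."""
    return money_supply + new_lending - loan_writeoffs

m = 1000.0
m = update_money_supply(m, new_lending=500.0, loan_writeoffs=0.0)
print(m)  # 1500.0 -- a private lending boom expands the money supply
m = update_money_supply(m, new_lending=0.0, loan_writeoffs=300.0)
print(m)  # 1200.0 -- a wave of defaults contracts it
```

On this view, a debt-deflation (mass private defaults) shrinks the money supply regardless of what the government is doing, which is why “government money printing” alone is a poor guide to macroeconomic stability.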

All this is a prelude to two very important points:

1.) Private debt and public debt are two very different things. It is private debt, not public debt, that is a cause for alarm.


2.) A crucial distinction must be made between the “real” economy of providing goods and services, and the banking/financial sector of the economy which makes money from debt and interest.

A lot of the mistakes in understanding the modern economy come from just those two misunderstandings. Without grasping these facts, you cannot understand what is really going on in the economy, and you’ll be making the same mistakes as all those armchair commentators worrying about “excessive government debt,” or hyperinflation “any day now.”