Deconstructing Jordan Peterson

While doing research for my last post, I ran across an interesting juxtaposition. I was looking at postmodern philosophers, and according to Wikipedia, one of the most prominent American postmodernists was a guy called Richard Rorty.

So I thought I should take a look at this Rorty guy, since he’s emblematic of American postmodernism, the same philosophy that Peterson claims is simply Marxism in disguise and holds a “death grip” on North American universities.

Richard Rorty (1931–2007) developed a distinctive and controversial brand of pragmatism that expressed itself along two main axes. One is negative—a critical diagnosis of what Rorty takes to be defining projects of modern philosophy. The other is positive—an attempt to show what intellectual culture might look like, once we free ourselves from the governing metaphors of mind and knowledge in which the traditional problems of epistemology and metaphysics (and indeed, in Rorty’s view, the self-conception of modern philosophy) are rooted.

The centerpiece of Rorty’s critique is the provocative account offered in Philosophy and the Mirror of Nature. In this book, and in the closely related essays collected in Consequences of Pragmatism, Rorty’s principal target is the philosophical idea of knowledge as representation, as a mental mirroring of a mind-external world.

Providing a contrasting image of philosophy, Rorty has sought to integrate and apply the milestone achievements of Dewey, Hegel and Darwin in a pragmatist synthesis of historicism and naturalism. Characterizations and illustrations of a post-epistemological intellectual culture, present in both PMN and CP, are more richly developed in later works, … In these writings, ranging over an unusually wide intellectual territory, Rorty offers a highly integrated, multifaceted view of thought, culture, and politics, a view that has made him one of the most widely discussed philosophers in our time.

Richard Rorty (Stanford Encyclopedia of Philosophy)

Okay, well that’s pretty complicated, and I’m not sure what to make of it. Is this the stuff that’s turning college students into Maoist Red Guards?

But the interesting thing is that I found that some of Rorty’s writings went viral in the aftermath of Trump’s election victory in 2016, particularly this passage:

Members of labor unions, and unorganized unskilled workers, will sooner or later realize that their government is not even trying to prevent wages from sinking or to prevent jobs from being exported. Around the same time, they will realize that suburban white-collar workers — themselves desperately afraid of being downsized — are not going to let themselves be taxed to provide social benefits for anyone else.

At that point, something will crack. The nonsuburban electorate will decide that the system has failed and start looking for a strongman to vote for — someone willing to assure them that, once he is elected, the smug bureaucrats, tricky lawyers, overpaid bond salesmen, and postmodernist professors will no longer be calling the shots.

Hmmm. Sounds pretty damn accurate, doesn’t it? It’s even more impressive that it was written back in 1998, during the Clinton administration, before even George W. Bush, much less Donald Trump.

That quote is from this Vox article: Richard Rorty’s prescient warnings for the American left. I confess, I read this article when it first came out, but at the time I had no idea who either Richard Rorty or Jordan Peterson was.

But the most salient part of the article is Rorty’s discussion of identity politics and the change in emphasis within the Leftist tradition in America. Far from being a proponent of identity politics, this philosopher–who is considered to be one of the exemplars of postmodernist thought in America–issues a stark warning to the American Left about focusing on identity politics to the exclusion of all else. He also eerily predicts the politics of today, including the rise of Dr. Jordan Peterson and the alt-right more generally.

He begins by reviewing how the focus of the left in America changed due to the Vietnam War:

The focus of leftist politics changed in the 1960s. For Rorty, the left ceased to be political and instead became a cultural movement…The Vietnam War, more than anything else, set the left on its new trajectory. The war was seen as an indictment of the whole system, of America as such. Thus the broader anti-communist Cold War became a central fault line for left-wing activists. Led largely by students, the new left regarded anyone opposed to communism — including Democrats, union workers, and technocrats — as hostile…

From [Rorty’s] perspective, the problem was the total rejection of pragmatic reform. The belief that there was nothing in America that could be salvaged, no institutions that could be corrected, no laws worth passing, led to the complete abandonment of conventional politics. Persuasion was replaced by self-expression; policy reform by recrimination.

There was a shift away from economics towards a “politics of difference” or “identity” or “recognition.” If the intellectual locus of pre-’60s leftism was social science departments, it was now literature and philosophy departments. And the focus was no longer on advancing alternatives to a market economy or on the proper balance between political freedom and economic liberalism. Now the focus was on the cultural status of traditionally marginalized groups…

And it did this by “teaching Americans to recognize otherness,” as Rorty put it. Multiculturalism, as it’s now called, was about preserving otherness, preserving our differences; it doesn’t oblige us to cease to notice those differences. There’s nothing morally objectionable about that. As a political strategy, however, [multiculturalism is] problematic. It reinforces sectarian impulses and detracts from coalition-building.

The pivot away from politics toward culture spawned academic fields like women and gender studies, African-American studies, Hispanic-American studies, LGBTQ studies, and so on. These disciplines do serious academic work, but they don’t minister to concrete political ends. Their goal has been to make people aware of the humiliation and hate endured by these groups, and to alienate anyone invested in that hate.

Wow, that sounds pretty dead-on. Indeed, even Wikipedia notes of “Western Marxism”:

The phrase “Western Marxism” wasn’t coined until 1953, by Maurice Merleau-Ponty. While often contrasted with the Marxism of the Soviet Union, Western Marxists were often divided in their opinion of it and other Marxist-Leninist states…Since the 1960s, the concept has been closely associated with the New Left and the focus on identity politics and the cultural domain, rather than economics and class struggle (this became especially prominent in the United States and the Western world).

Rorty explains that this focus on marginalized groups will enable a populist right to emerge in response, as Americans (especially white Americans) come to believe their culture is under attack. This will distract them from economic issues such as the consequences of globalization and financialization. The left’s focus on cultural issues thus created an opening for the populist right, for people like Pat Buchanan, and later Donald Trump, who galvanize support among the white working class by exploiting racial grievance, cultural differences and economic anxiety. As Rorty explains:

While the Left’s back was turned, the bourgeoisification of the white proletariat which began in WWII and continued up through the Vietnam War has been halted, and the process has gone into reverse. America is now proletarianizing its bourgeoisie, and this process is likely to culminate in bottom-up revolt, of the sort [Pat] Buchanan hopes to foment.

Buchanan, you might recall, was touting the “cultural Marxism” meme back in the Nineties, long before anyone had heard of an obscure Canadian psychology professor named Jordan Peterson. This article from a right-wing news site (back in 2010!) gives an overview of Mr. Buchanan’s worldview:

“The United States has undergone a cultural, moral and religious revolution. A militant secularism has arisen in this country. It has always had a hold on the intellectual and academic elites, but in the 1960s it captured the young in the universities and the colleges.

“This is the basis of the great cultural war we’re undergoing….We are two countries now. We are two countries morally, culturally, socially, and theologically. Cultural wars do not lend themselves to peaceful co-existence. One side prevails, or the other prevails.

“The truth is that while conservatives won the Cold War with political and economic Communism, we’ve lost the cultural war with cultural Marxism, which I think has prevailed pretty much in the United States. It is now the dominant culture. Whereas those of us who are traditionalists, we are, if you will, the counterculture.”

So states Patrick J. Buchanan in the opening scenes of James Jaeger’s new film, Cultural Marxism: The Corruption of America. As always, Buchanan is outspoken and splendidly patriotic in his testimony on the present degeneration of our country. Many of us born before the 1960s and its shocking nihilism agree vehemently with him. We were raised in a land far removed philosophically from the America we are cursed with today, and this disturbing fact weighs heavily upon our hearts and minds.

Cultural Marxism and the Corruption of America (The Daily Bell)

I suggest reading the article in its entirety. These paragraphs, especially, sound eerily similar to the rhetoric of Dr. Peterson:

“Critical Theory,” the brain-child of Max Horkheimer, was the first and most important of these strategies. Under its auspices, every tradition of Western life was to be redefined as “prejudice” and “perversion.” And these redefinitions were to be instilled into the social stream via devastating, scholarly criticisms of all values such as the family, marriage, property, individualism, faith in God, etc. These criticisms proved to be quite successful in the aftermath of the world’s collapse into the Great Depression, which brought about widespread disillusionment with the traditional capitalist society that had evolved in the West since the Renaissance and discovery of the New World.

The strategic criticisms were soon expanded by demarcating society’s members as either “victims” or “oppressors.” All who were economically successful were defined as oppressors, and all who were not successful were termed victims. Religious authorities became “witch-doctors.” Advocates of different social roles for men and women became “fascists.” Corporate heads became “exploiters.” Fathers became “patriarchal tyrants.” Families became “primitive clans.” The stream of criticism was relentless and extremely sophisticated in an intellectual sense. Thus it mesmerized the pundit class who then disseminated the criticisms’ fundamental content to the populace at large.

Compare to Peterson’s rhetoric cited in my previous post:

The postmodernists built on the Marxist ideology, Peterson said. “They started to play a sleight of hand, and instead of pitting the proletariat, the working class, against the bourgeois, they started to pit the oppressed against the oppressor. That opened up the avenue to identifying any number of groups as oppressed and oppressor and to continue the same narrative under a different name.”…“And so since the 1970s, under the guise of postmodernism, we’ve seen the rapid expansion of identity politics throughout the universities,” he said. “It’s come to dominate all of the humanities—which are dead as far as I can tell—and a huge proportion of the social sciences.”…“We’ve been publicly funding extremely radical, postmodern leftist thinkers who are hellbent on demolishing the fundamental substructure of Western civilization. And that’s no paranoid delusion. That’s their self-admitted goal,” …

https://www.theepochtimes.com/jordan-peterson-explains-how-communism-came-under-the-guise-of-identity-politics_2259668.html

Similar to Buchanan, Peterson believes that being responsible is the new counterculture: Jordan Peterson – Growing Up and Being Useful is The New Counterculture (YouTube).

All Peterson does is transfer the culpability for undermining Western civilization from the 1930s Frankfurt School to the 1960s French postmodernists. Note that the idea that multiculturalism is an attack on “Western values,” and that all of our major institutions have been taken over by socialist-minded elites imposing their views from above, is a staple of alt-right thinking. It was an intrinsic part of Anders Breivik’s manifesto, published right before his killing spree.

And Peterson wonders why they’re protesting.

Rorty’s prescient warning was that elites would emphasize identity politics on purpose in order to divide the working classes and keep them from coalescing around an economic agenda that would endanger elite power (unions, higher minimum wages, universal healthcare, higher taxes on unearned wealth, financial regulations, job creation, etc.):

By divorcing itself from class and labor issues, the left lost sight of its economic agenda and waged a culture war that empowers the right and has done little to improve the lives of the very people it seeks to defend. Rorty’s advice to the left was to pay attention to who benefits from such a strategy:

The super-rich will have to keep up the pretense that national politics might someday make a difference. Since economic decisions are their prerogative, they will encourage politicians of both the Left and the Right, to specialize in cultural issues. The aim will be to keep the minds of the proles elsewhere – to keep the bottom 75 percent of Americans and the bottom 95 percent of the world’s population busy with ethnic and religious hostilities, and with debates about sexual mores. If the proles can be distracted from their own despair by media-created pseudo-events…the super-rich will have little to fear.

Big business benefits most from the culture wars. If the left and the right are quarreling over religion or race or same-sex marriage, nothing much changes, or nothing that impacts wealth concentration changes. Rorty is particularly hard on Presidents Jimmy Carter and Bill Clinton, both of whom he accuses of retreating “from any mention of redistribution” and of “moving into a sterile vacuum called the center.” The Democratic Party, under this model, has grown terrified of redistributionist economics, believing such talk would drive away the suburbanite vote. The result, he concludes, is that “the choice between the major parties has come down to a choice between cynical lies and terrified silence.”

Rorty’s concern was not that the left cared too much about race relations or discrimination (it should care about these things); rather, he warned that it stopped doing the hard work of liberal democratic politics. He worried that its retreat into academia, into theory and away from the concrete, would prove politically disastrous.

Immediately after the now-famous passage about a future “strongman,” Rorty offered yet another disturbing prophecy:

One thing that is very likely to happen is that the gains made in the past forty years by black and brown Americans, and by homosexuals, will be wiped out. Jocular contempt for women will come back into fashion. The words ‘nigger’ and ‘kike’ will once again be heard in the workplace. All the sadism which the academic Left has tried to make unacceptable to its students will come flooding back. All the resentment which badly educated Americans feel about having their manners dictated to them by college graduates will find an outlet.

If this were to happen, Rorty added, it would be a calamity for the country and the world. People would wonder how it happened, and why the left was unable to stop it. They wouldn’t understand why the left couldn’t “channel the mounting rage of the newly dispossessed” and speak more directly to the “consequences of globalization.” They would conclude that the left had died, or that it existed but was “no longer able to engage in national politics.”

“Jocular contempt for women will come back into fashion…All the resentment which badly educated Americans feel about having their manners dictated to them by college graduates will find an outlet…” Er, holy shit, this is exactly what has happened! I mean, does this not explain the rise of the alt-right movement in a nutshell? And he wrote this back in 1998, before anyone had heard of 4chan, Reddit, Facebook or YouTube!!!

Who benefits from such a strategy? Maybe the same people promoting Dr. Peterson as “the world’s most important public intellectual.”

So, not only does this prominent postmodern philosopher NOT endorse identity politics, but he explicitly warns against it! Of course, this is just one individual. But it certainly argues against the notion that some shadowy, united cabal of radical leftist postmodernists is enthusiastically pushing identity politics and multiculturalism to undermine the West and turn us all into communists. Or that this strategy is succeeding.

Instead of identity politics and media shaming, what would be successful? Rorty suggests:

…Rorty’s vision of an “inspirational liberalism” is worth revisiting…The first of his three lectures is devoted to John Dewey and Walt Whitman, both of whom, on his view, personified American liberalism at its best. These were pragmatists who understood the role of national pride in motivating political change. They understood that politics is a game of competing stories “about a nation’s self-identity, and between differing symbols of its greatness.”

The strength of Dewey and Whitman was that they could look at America’s past with clear eyes…and go beyond the disgust it invoked, beyond the cultural pessimism. They articulated a civic religion that challenged the country to do better, to forge a future that lived up to the promise of America. In Rorty’s words, they recognized that “stories about what a nation has been and should try to be are not attempts at accurate representation, but rather attempts to forge a moral identity.”

Both the Right and the Left have a story to tell, and the difference is enormous:

For the Right never thinks that anything much needs to be changed: it thinks the country is basically in good shape, and may well have been in better shape in the past. It sees the Left’s struggle for social justice as mere troublemaking, as utopian foolishness. The Left, by definition, is the party of hope. It insists that our nation remains unachieved.

“[The Right] sees the Left’s struggle for social justice as mere troublemaking, as utopian foolishness.” Well now, that’s a pretty accurate description of the heart of Jordan Peterson’s worldview as far as I can tell. To reinforce this point, Peterson deploys ideas from Darwinism, such as his now infamous discussion of lobster battles for hierarchical supremacy.

The Perplexing Mr. Nietzsche

Speaking of philosophers, is anyone more confusing and misunderstood than Mr. Nietzsche?

In the right-wing article on multiculturalism quoted above, Nietzsche is cited as an inspiration for the evil cultural Marxist conspiracy:

The cultural Marxists adopted Nietzsche’s “transvaluation of all values,” in which the Mad Hatter’s world is instituted. Everything that previously was an evil now becomes a virtue while all the old virtues become evils. Individualism, self-reliance, property, profit, family, traditional marriage, fidelity to spouse, strength of will, personal honor, rising through merit — all these integral pillars of our civilization become distinctive evils that oppress us as humans. They must be rooted out of our existence.

Yet, at the same time, Nietzsche is also a favorite philosopher of the alt-right:

In her recent book about the rise of the alt-right, Irish academic Angela Nagle discusses their obsession with civilizational decay. “They’re disgusted by what they consider a degenerate culture,” she told me in a recent interview.

Nietzsche made these same arguments more than 100 years ago. The story he tells in The Genealogy of Morality is that Christianity overturned classical Roman values like strength, will, and nobility of spirit. These were replaced with egalitarianism, community, humility, charity, and pity. Nietzsche saw this shift as the beginning of a grand democratic movement in Western civilization, one that championed the weak over the strong, the mass over the individual.

The alt-right — or at least parts of the alt-right — are enamored of this strain of Nietzsche’s thought. The influential alt-right blog Alternative Right refers to Nietzsche as a great “visionary” and published an essay affirming his warnings about cultural decay.

“Future historians will likely look back on the contemporary West as a madhouse,” the essay’s author writes, “where the classic virtues of heroism, high culture, nobility, self-respect, and reason had almost completely disappeared, along with the characteristics of adulthood generally.”

Nietzsche is also frequently cited by many white nationalists:

“You could say I was red-pilled by Nietzsche.”

That’s how white nationalist leader Richard Spencer described his intellectual awakening to the Atlantic’s Graeme Wood last June. “Red-pilled” is a common alt-right term for that “eureka moment” one experiences upon confrontation with some dark and previously buried truth.

For Spencer and other alt-right enthusiasts of the 19th-century German philosopher Friedrich Nietzsche, that dark truth goes something like this: All the modern pieties about race, peace, equality, justice, civility, universal suffrage — that’s all bullshit. These are constructs cooked up by human beings and later enshrined as eternal truths.

Nietzsche says the world is in constant flux, that there is no capital-T truth. He hated moral and social conventions because he thought they stifled the individual. In one of his most famous essays, The Genealogy of Morality, which Spencer credits with inspiring his awakening, Nietzsche tears down the intellectual justifications for Christian morality. He calls it a “slave morality” developed by peasants to subdue the strong. The experience of reading this was “shattering,” Spencer told Wood. It upended his “moral universe.”

The alt-right is drunk on bad readings of Nietzsche. The Nazis were too (Vox)

There is no capital-T truth? All modern pieties are bullshit? Stifling the individual? This seems like exactly the sort of stuff Peterson regularly rails against in his attacks on postmodernism.

Peterson’s embrace of Nietzsche is also troubling. Nietzsche was, of course, associated with the Nazis, mainly through his sister, who was a fan of the movement and intentionally distorted his posthumous writings to reflect it. But pinning Nazism on Nietzsche would be as disingenuous as pinning the crimes of Communism on Marx. Still, Peterson’s framing of order as a “masculine” phenomenon (Logos) and chaos as a “feminine” one strikes me as vaguely authoritarian. Peterson claims he is actually anti-authoritarian, an avowed enemy of “extremism” of both the Left AND the Right, but it’s hard to get that from his metaphysics. An obsession with “order” and “masculine virtues” is a staple of right-wing thought. So is an obsession with “civilizational decline.” According to the right, civilizational decline comes about when feminine “chaos” triumphs over masculine “order”–the same affliction the alt-right claims is weakening society.

Much of Peterson’s philosophy is responding to Nietzsche, and it does so in two ways: He agrees with Nietzsche that life is hard and will inevitably involve enduring misery. To survive, one must be prepared for this. But for Peterson, preparation does not involve defining one’s own truth and reality, as Nietzsche said. Instead of assuming the world will conform to one’s own will, Peterson advocates the importance of taking responsibility for oneself and living in accordance with the objective reality of the world around us.

For Peterson, there is objective truth and reality, and we cannot simply transcend all moral frameworks and create truth for ourselves…To deny these constraints leads to chaos—internally, interpersonally, societally. This is the main point of Peterson’s recently released Twelve Rules for Life: An Antidote to Chaos, wherein he lays out a moral framework that he believes will help people live life to the fullest—however unavoidably tragic life may be. Rule Eight: “Tell the Truth—or, at least, don’t lie,” addresses the Nietzschean, post-modern axiom of the subjectivity of truth head on. Peterson contends that we intuitively know what truth is, and that “lies make you weak and you can feel it . . . you cannot get away with warping the structure of being.” …Similarly, Rule Seven — Pursue what is meaningful, not what is expedient — also defies Nietzschean nihilism and corresponds with Peterson’s understanding of an objective reality. “Meaning is what we do to buttress our self against the tragedy of life … our pursuit of meaning is an instinct. Perhaps our deepest instinct… meaning is the antidote to the malevolence of life.” To deny meaning exists, to pursue happiness instead of meaning, or to seek meaning in the wrong things will lead to chaos.

But Peterson borrows from, in addition to criticizing, Nietzsche. Both men rail against the “last man,” the human type that seeks to shirk risk and responsibility in favor of comfort and safety. Like Nietzsche, Peterson’s view offers an “ideal human type” that lives by a superior code. For Nietzsche it was Übermensch that lived by a code of his own creation— a “master morality” of “might makes right,” also popularized by Thrasymachus in Book I of Plato’s Republic. For Peterson, the ideal is a mode of existence wherein one lives within the preordained structure of the universe and nobly grits the challenges that life throws their way.

Why Jordan Peterson Is the Last Gentleman (Law and Liberty)

Ignoring the real problem

Is the “radical Left” really the biggest problem in the world today? If postmodernism is a philosophy that rejects all truth and universal values and defines reality as whatever one chooses it to be, isn’t that more compatible with right-wing politics in America today? Consider this quote from a Bush administration official:

The phrase [Reality-based community] was attributed by journalist Ron Suskind to an unnamed official in the George W. Bush Administration who used it to denigrate a critic of the administration’s policies as someone who based their judgements on facts. In a 2004 article appearing in the New York Times Magazine, Suskind wrote:

The aide said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality.’ […] ‘That’s not the way the world really works anymore,’ he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do’.

The source of the quotation was later identified as Bush’s senior advisor Karl Rove, although Rove has denied saying it.

Reality-based Community (Wikipedia)

“Create your own reality?” Sounds pretty postmodern to me. And from the very next Republican administration:

“Alternative facts” is a phrase used by U.S. Counselor to the President Kellyanne Conway during a Meet the Press interview on January 22, 2017, in which she defended White House Press Secretary Sean Spicer’s false statement about the attendance numbers of Donald Trump’s inauguration as President of the United States. When pressed during the interview with Chuck Todd to explain why Spicer “utter[ed] a provable falsehood”, Conway stated that Spicer was giving “alternative facts”. Todd responded, “Look, alternative facts are not facts. They’re falsehoods.”
Conway’s use of the phrase “alternative facts” to describe demonstrable falsehoods was widely mocked on social media and sharply criticized by journalists and media organizations…The phrase was extensively described as Orwellian. Within four days of the interview, sales of the book 1984 had increased by 9,500%…

Alternative Facts (Wikipedia)

It doesn’t get more postmodern than that, does it? Create your own reality? Alternative facts? The world has no objective order or reality; it is up to us to define our own truth, purpose and reality for ourselves. Consider this quote from Peterson:

18:06: Among these post-modernist types, man, they don’t give a damn for facts. In fact, facts for them are currently whatever the current power hierarchy uses to justify their acquisition of power.

Sounds like the Trump administration to me. And is it really the Left that is anti-science?

The Washington Post recently reported that officials at the Center for Disease Control were ordered not to use words like “science-based,” apparently now regarded as disablingly left-leaning. But further reporting in the New York Times appears to show that the order came not from White House flunkies but from officials worried that Congress would reject funding proposals marred by the offensive terms. One of our two national political parties — and its supporters — now regards “science” as a fighting word. Where is our Robert Musil, our pitiless satirist and moralist, when we need him (or her)?

The United States of America Is Decadent and Depraved

In fact, this article makes the case that Trump is our first postmodern president:

[Postmodern] writers describe a world where the visual has triumphed over the literary, where fragmented sound bites have replaced linear thinking, where nostalgia (“Make America Great Again”) has replaced historical consciousness or felt experiences of the past, where simulacra is indistinguishable from reality, where an aesthetic of pastiche and kitsch (Trump Tower) replaces modernism’s striving for purity and elitism, and where a shared plebeian culture of vulgarity papers over intensifying class disparities. In virtually every detail, Trump seems like the perfect manifestation of postmodernism.

For Baudrillard, “the perfect crime” was the murder of reality, which has been covered up with decoys (“virtual reality” and “reality shows”) that are mistaken for what has been destroyed. “Our culture of meaning is collapsing beneath our excess of meaning, the culture of reality collapsing beneath the excess of reality, the information culture collapsing beneath the excess of information—the sign and reality sharing a single shroud,” Baudrillard wrote in The Perfect Crime (1995). The Trump era is rich in such unreality. The president is not only a former reality-show star, but one whose fame is based more on performance than reality—on the idea that he’s a successful businessman. Although his real estate and gambling empire suffered massive losses in the early 1990s, and Trump’s “finances went into a tailspin,” he survived thanks to the superficial value of his brand, which he propped up through media manipulation.

In Baudrillard’s terms, Trump is a simulacra businessman, a copy of a reality that has no real existence. All sorts of simulacrum and decoy realities now flourish. Consider the popularity of conspiracy theories, evidence of a culture where it’s easy for fictional and semi-fictional narratives to spread like wildfire through social media. Trump loves spreading conspiracy theories about his enemies, and his enemies love spreading conspiracy theories about him.

America’s First Postmodern President (The New Republic)

To me, the most tragic thing about Jordan Peterson is that not only does he recite right-wing talking points to his audience of impressionable and hurting young men, but he also advises them to get with the program and grin and bear it. Do not challenge or question the social order that is crushing you; just master it. And that narrative certainly benefits a certain group of people.

And we’re living in a time eerily similar to the one that saw the rise of right-wing regimes around the world in the 1930s. Once again, illiberal regimes are rising due to economic circumstances, and extremist parties are gaining ground because the mainstream parties have lost their ability to effect change.

Peterson never tires of telling us about the millions of people who died under Communist repression. His house is apparently decorated wall-to-wall with Soviet propaganda art. He even named his daughter after Mikhail Gorbachev. But consider what is happening in Russia right now:

Now a museum, Perm-36 is the only part of Joseph Stalin’s Gulag that still survives. The network of brutal labour camps was where Soviet Russia sent its political opponents, as well as many criminals and kulaks – wealthier peasants. During Stalin’s Great Terror in the 1930s, millions passed through the system. Hard physical work on meagre rations in extreme weather killed vast numbers…The museum at this site was founded by historian Viktor Shmyrov in the 1990s as post-Soviet Russia opened up to the world.

“The Gulag was a huge phenomenon but there are practically no traces of it left,” he says. “That’s why Perm-36 needed preserving.” The country opened many archives then too, revealing the scale and details of decades of political repression. But the desire to dig deep into that past has been fading.

In 2014, Perm-36 was taken over by the local authorities and the museum’s founder was removed. The new administration then tried to soften the museum’s focus, says Shmyrov. “The dominant idea now is that the Gulag was necessary, both economically and to bring discipline and order.” One member of the new team admits there were changes. “There was a lean towards justifying the repressions, maybe three years when the museum wavered,” historian Sergei Sheverin says, standing by rows of barbed wire. At one point, the Gulag museum’s own website defended Stalin’s imprisonment of scientists – to force them to work for the state.

Sheverin suggests the museum was a stain on the “Great Power” narrative of Russia that’s now led by Putin. That approach has seen Stalin rehabilitated because of his role in the Soviet defeat of Nazi Germany. “The policy from above is that we shouldn’t remember the bad things, only the good,” says Sheverin.

The museum’s founder Viktor Shmyrov suspects there was an additional reason for his removal. Perm-36 used to host an annual forum and music festival that attracted thousands. In a place where free-thinkers were once incarcerated, Shmyrov says the festival had developed into a “freedom space”. “Not one person there could say a good thing about Vladimir Putin of course,” he says. “We used to have a powerful civil society. Now they’re bringing order and control.” The attempts to dilute the historical message at Perm-36 sparked opposition from human rights activists and the independent press…

Does Putin’s Russia reject the West? (BBC)

But not, apparently, from Jordan Peterson, who was busy fighting the real enemies of freedom: Canadian politicians attempting to protect transgender people, and the Ontario Education Association.

Meanwhile, in China, the president has removed term limits so that he can remain president for life:

Last week China stepped from autocracy into dictatorship. That was when Xi Jinping … let it be known that he will change China’s constitution so that he can rule as president for as long as he chooses …. This is not just a big change for China but also strong evidence that the West’s 25-year-long bet on China has failed.

After the collapse of the Soviet Union, the West welcomed [China] into the global economic order. Western leaders believed that giving China a stake in institutions such as the World Trade Organization would bind it into the rules-based system … They hoped that economic integration would encourage China to evolve into a market economy and that, as its people grew wealthier, they would come to yearn for democratic reforms ….

Economists Shocked That China Invalidates Their Pet View That Economic Liberalization Produces Political Liberalization (Naked Capitalism)

As Rorty predicted, the elites are using cultural issues to keep us divided against one another as they consolidate power and engage in a new enclosure movement. Peterson is just the latest arrow in their quiver.

Without prompting, he raged with operatic scattergun anger against postmodernism, Marxists and—his favourite bogeymen—“social justice warriors.” It was the day after the U.S. presidential election, and I was still reeling from Trump’s victory. Peterson was unperturbed. He said Trump was no worse than Reagan and that the Democrats got what they deserved for abandoning the working class and playing identity politics. I was initially surprised—someone who spent a lifetime studying tyranny wasn’t maybe a tad worried about a president with such undisguised autocratic ambitions? But then I remembered that Trump, too, has long blamed political correctness for America’s ills, and reflexively used the phrase to dismiss any criticism he faced—everything from his treatment of women to his proposed immigration ban on Muslims. And, among many Trump supporters, “social justice warrior” is a favourite epithet used to disqualify his critics.

The Pronoun Warrior (Toronto Life)

Jordan Peterson: Useful Idiot

I’ve spent a good deal of time–way too much, actually–trying to get a handle on the Jordan Peterson phenomenon. And it is best to distinguish JP the phenomenon from JP the person, because from what I can tell, they are indeed quite different and distinct.

I’m going to state at the outset what I had originally saved for my conclusion:

1.) The Jordan Peterson phenomenon is mainly caused by our failure to take the pain of men, especially young men, seriously.

Men, especially white men, today are dealing with an impossible series of challenges. There are few satisfying roles for them in society anymore. They are ridiculed. They feel persecuted. They feel unloved. The rise of the Sheconomy has meant that the only jobs on offer for men are ones they don’t particularly enjoy doing or are not particularly suited for. Even in the few fields that are still (temporarily) male-dominated, such as computer programming, we are told that this means we have a “diversity problem” that needs to be corrected, while no one frets about the paltry number of male home health care aides or registered nurses. Men are blamed for creating and sustaining a system that is shortening their own lifespans, and one that they feel is increasingly stacked against them (for example, in child support and visitation rights).

Men quickly find that their natural interests do not overlap with what society wants or needs anymore, and that their inclinations are seen as inherently boorish and cruel. They find that the traits that make them desirable as workers make them undesirable as romantic partners. They find video games and pot more satisfying than working in a dead-end job where they are treated like virtual serfs.

Peterson understands this phenomenon. He understands that men, in general, are less agreeable than women, and that they have different cognitive styles. He knows this from his psychological studies. He also knows that men, especially young men, have been abandoned by a society that has no use for many of them, and that they are feeling hopeless and adrift. This quote from James Howard Kunstler describes the situation pretty well:

“The general run of humanity really does need some sort of a coherent armature for daily life. And that includes role models who offer examples of behavior that will allow them to thrive rather than to be defeated by life. They need a certain amount of discipline in order to fulfill the behavior that those role models show them, and they need some aspiration, some ability to aspire to the products or the results of leading what we might call a good life. And a lot of those things are missing, especially in these unfortunately sort-of disenfranchised, throw-away, forgotten, lower middle classes that we have in America.”

“You can see it very clearly in my region, which was, as I said, a former thriving region of small manufacturing, small factories…around the confluence of the Hudson river and the Battenkill River where I am. Granted, a lot of these companies were paternalistic, but as part of that paternalism they sponsored a lot of institutional activities for people. You know, they had baseball teams, they had outings, [and] they paid these people enough to live decently, and these people produced children who aspired to do better. And they were able to do better. They got a better education by eighth grade in the 1920’s than people are getting now in grad schools. And all of this stuff has dissolved.”

“You actually need quite a bit of built-in structure in everyday life for a society to thrive and individuals to thrive within it. And that’s not there, and we don’t care about it. We just don’t care. We have eliminated most of the public gathering places in small town America. I live in a town that doesn’t have a coffee shop [or] a bar, anyplace that somebody might go outside their home. And there’s the expectation that all of the ‘community’ that you’re going to be a part of is found on your TV set. Well that’s just a lie. It’s based on a very basic and almost universal misunderstanding in America that the virtual is an adequate substitute for the authentic. That having relationships with made-up people on TV is the same as having relationships with people who are really in your life.”

“And so that structure for leading a good life is absent. We’re seeing the results of it in this ‘anything goes and nothing matters’ society that we’ve created for ourselves.”

James Howard Kunstler: Racketeering Is Ruining Us (Peak Prosperity interview, YouTube)

Into this vacuum steps Jordan Peterson with his theories about how “anything goes and nothing matters” is the postmodernist creed, with its ultimate roots in Marxism, and that the universities are spreading this pessimistic message of “cultural Marxism.” To counteract this, he turns to philosophers like Nietzsche and looks to archetypes and mythology to restore a lost order (logos) to life.

“I think at a deep level the West has lost faith in masculinity. That’s no different than the death of God. It’s the same thing. And Nietzsche knew what the consequences of that would be; that’s most of what he wrote about. So you’d say the divine symbol of masculinity has been obliterated. Well, so then what do you expect? What’s going to happen? That means masculinity is going to become weak. Especially if the symbol is also denigrated, which it definitely is.”

“So what that means is that the ideal that men could aspire to is denigrated? Well, then with your ideal in tatters, you’re weak. That’s definitional. So I think the reason that men have been responding positively to my thinking is that I don’t buy any of that. I like the masculine spirit. It’s necessary. And it’s not fundamentally carnage and pillaging. It’s not fundamentally rape culture. It’s not fundamentally world-destroying. And all of those aspersions have been cast upon it. That’s partly the guilt of Western society for technological progress…”

Jordan Peterson – The West Has Lost Faith In Masculinity (YouTube)

2.) To me, the most tragic thing about the JP phenomenon is the fact that, in my not-so-humble opinion, the destruction of white males is caused primarily by our economic system of globalized financial casino capitalism, which seeks no goal other than maximizing profit for a small international investor class, consequences to the health of society be damned. It leads to a “devil take the hindmost” attitude, where society is a zero-sum game divided into winners and losers.

But instead of taking a critical look at that system, Peterson places the blame, and the responsibility for solving it, squarely on the shoulders of the individual. I think this is not only self-defeating, but actually harmful. Numerous studies have shown that in countries where individuals blame wider economic forces for their unemployment, rather than their own personal failings, there is less self-hatred and self-harm.

When the job search becomes a blame game (MIT News)

American Dream breeds shame and blame for job seekers (BBC)

Peterson not only does not wish to look at these forces, but is a staunch defender of libertarian market values. Not only is there no class war, declares Peterson, but even thinking in class terms makes you a Marxist!

3.) One could hardly think of a better way to kneecap a genuine Leftist movement than unleashing the divisive identity politics seen on college campuses. But where are these ideas really coming from? Are they truly ‘Marxist’ as Peterson asserts?

We know that the men suffering the most in America today are those without college degrees. This was the conclusion of the Case/Deaton study: life spans are actually declining for men and women without degrees. This means that, almost by definition, the people suffering the most in our society have no idea what is really going on on college campuses! Yet they are continually warned by the alt-right of a “Red Peril” emanating from those campuses, and vote accordingly. It’s the Red Scare updated for the twenty-first century.

In my opinion, this is an entirely media-manufactured phenomenon. Why? As Adam Curtis opined, “Angry people click more.” Keeping people angry and outraged seems to be the main purpose of the media these days, because outrage is profitable. Keeping people informed is less important than profits.

Are the semi-mythical “Social Justice Warriors” actually closet Maoists dedicated to spreading communism beyond the campus? Consider that it is at the core of the Marxist project for workers to set aside superficial differences such as race, gender and nationality, and recognize their class role as the main reason they are exploited. The social justice warriors clearly do not want that.

Liberals would be satisfied with a world in which exploitation and wealth were evenly distributed across demographic groups. The left doesn’t want that. We want no exploitation of anyone. That necessarily means that white men shouldn’t be exploited either…So, lonely and/or broke white men sometimes feel the left offers them no explanation for their suffering. You know who does? Jordan Peterson. He says to them, I know you feel bad, and let me tell you why. And then he feeds them a bunch of hateful bullshit. More and more people are going for it. He has the number one bestselling book on Amazon…

Slavoj Zizek makes this point as well:

If I were to engage in paranoiac speculations, I would be much more inclined to say that the Politically Correct obsessive regulations (like the obligatory naming of different sexual identities, with legal measures taken if one violates them) are rather a Left-liberal plot to destroy any actual radical Left movement. Suffice it to recall the animosity against Bernie Sanders among some LGBT+ and feminist circles, whose members have no problems with big corporate bosses supporting them. The “cultural” focus of PC and #MeToo is, to put it in a simplified way, a desperate attempt to avoid the confrontation with actual economic and political problems, i.e., to locate women’s oppression and racism in their socio-economic context…Liberals will have to take note that there is a growing radical Left critique of PC, identity politics and #MeToo…

A Reply To My Critics Concerning An Engagement With Jordan Peterson (Philosophical Salon)

This surprisingly intelligent YouTube comment makes a similar point:

For a long time it has been a tactic of US intelligence to support a moderate group, be it progressive or reactionary, as a way of blocking a more extremist group from gaining support. This happened domestically in the 60’s with progressive movements as well. Most famously Gloria Steinem was covertly supported by the CIA as a way of keeping attention away from more dangerous radicals. Culturally, things like universities in effect reproduce this dynamic. By having an Overton window big enough to include a lot of progressive politics, they can exclude actually dangerous stuff. This is the [role] political correctness basically plays. By maintaining vigorous debate within a specific window, and outrage for anything outside of that, it vanguards against real leftist politics of the sort actual Marxists argue for.

As this comment on a Guardian article about Peterson’s book states: “I thought Marxism was about ‘workers of the world unite,’ not ‘let’s fragment into a million separate identities and fight each other.’”

So, who the hell is Jordan Peterson, anyway?

Jordan Peterson is a formerly obscure Canadian psychology professor who became an overnight sensation by posting a series of YouTube videos describing his opposition to Canadian Bill C-16. Opposition to this bill has become something of a cause celebre among a certain group of self-described anti-Leftist activists who like to militate against “identity politics.” He argued that the bill forced him to call people by their “preferred pronoun” or else face sanction. He argued that this amounted to a form of “compelled speech,” and that language was a battleground that he would not cede to the “radical Left.”

In other words, if I were a transgender person and demanded Peterson call me, I don’t know, ‘apple,’ he would have to do so.

Now, I think we can all agree this is a little silly. But to Peterson, this was no less than a threat to freedom and the very foundations of Western civilization.

On September 27, University of Toronto psychology professor Jordan Peterson posted a video titled Professor Against Political Correctness on his YouTube channel. The lecture, the first in a three-part series recorded in Peterson’s home office, was inspired by two recent events that he said made him nervous.

The first was the introduction of Bill C-16, a federal amendment to the Canadian Human Rights Act and Criminal Code that would add gender identity and gender expression to the list of prohibited grounds for discrimination. Peterson’s second concern was that U of T’s human resources department would soon make anti-bias and anti-discrimination training mandatory for its staff—training he believed to be ineffective, coercive and politically motivated. “I know something about the way that totalitarian, authoritarian political states develop,” Peterson said in the first video, “and I can’t help but think I’m seeing a fair bit of that right now.”

Other profs in his position might have written op-eds, circulated petitions or negotiated with university officials. But Peterson is a big believer in the power of YouTube—“a Gutenberg revolution for speech,” he calls it—and, as it turns out, he had a lot to get off his chest. He carpet-bombed Marxists (“no better than Nazis”), the Ontario Human Rights Commission (“perhaps the biggest enemy of freedom currently extant in Canada”), the Black Liberation Collective (“they have no legitimacy among the people they purport to represent”) and HR departments in general (“the most pathological elements in large organizations”).

Peterson also said he would absolutely not comply with the implied diktat of Bill C-16, which could make the refusal to refer to people by the pronouns of their choice an actionable form of harassment. He believes the idea of a non-binary gender spectrum is specious and he dismisses as nonsensical the raft of gender-neutral pronouns that transgender people have adopted—ze, vis, hir, and the singular use of they, them and their. “I don’t recognize another person’s right to determine what pronouns I use to address them,” he said grimly. “I think they’re connected to an underground apparatus of radical left political motivations. I think uttering those words makes me a tool of those motivations. And I’m going to try and be a tool of my own motivations as clearly as I can articulate them and not the mouthpiece of some murderous ideology.”...In his fervent opinion, the issue wasn’t pronouns, per se. It was much bigger than that. It was truth itself. Being told what to say—and by the government no less—was just one more step along the slippery slope to tyranny. The way Peterson tells it, the only thing standing between us and a full-blown fascist insurrection was him.

The Pronoun Warrior (Toronto Life)

Underground apparatus? Murderous Ideology? What the f*ck is he talking about???

According to Peterson, the mandated use of such pronouns is a “slippery slope” leading to totalitarianism, re-education camps and gulags, and identity politics is the “camel’s nose” for FULL COMMUNISM.

Peterson contends that “political correctness” is actually a mutated form of Communist ideology: the same ideology, he claims, that directly led to the murder of millions of innocent individuals in the twentieth century. Furthermore, he claims that entire fields of academia have been corrupted by “radical postmodernism,” including nearly all the humanities, such as anthropology and literature. He further alleges that these “Neo-Marxists” have seized control of universities, government departments and corporate HR departments.

Despite his fear of leftist goon squads patrolling college campuses, no one, not one single person, has been arrested or jailed, or even fined over this law. It is a totally artificial crisis, manufactured in order to smear the radical left on college campuses and foment outrage. It’s pure grandstanding. Here is what legal scholars think in a letter from the Canadian Bar Association:

For human rights legislation, the CHRA prohibits denying or differentiating adversely in the provision of goods, services, facilities or accommodation customarily available to the general public, commercial or residential accommodation, or, employment on the basis of a prohibited ground of discrimination. The Act applies to federal and federally regulated entities.

The amendment to the CHRA will not compel the speech of private citizens. Nor will it hamper the evolution of academic debates about sex and gender, race and ethnicity, nature and culture, and other genuine and continuing inquiries that mark our common quest for understanding of the human condition.

RE: Bill C-16, An Act to amend the Canadian Human Rights Act and the Criminal Code (gender identity or expression) (Canadian Bar Association – PDF)

However, millions of people watched the videos and tens of thousands contributed to Peterson’s Patreon account, to the tune of over $50,000 a month. Being a martyr has its advantages. Chapo Trap House described him as “the Rosa Parks of Pronouns.”

If Peterson were really so concerned about the threats to free speech coming from employers such as his university, then why isn’t he arguing for more union representation, which has the added benefit of reducing inequality (something he claims to want):

I’m seeing a lot of comments from the political right and centre-right worrying about the possibility that workers may be fired for expressing conservative views…It strikes me that this would be a really good time for people…to campaign for an end to employment at will, and the introduction of the kind of unfair dismissal laws that protect workers in most democratic countries, but not, for the most part, in the US. Among other things, these laws prohibit firing employees on the basis of their political opinions. Better still, though, would be a resurgence of unionism. Union contracts generally require dismissal for cause, and unionised workers have some actual backup when it comes to a dispute with employers.

Free speech, unfair dismissal and unions (Crooked Timber)

So, is Peterson far right?

Short answer: no. This video, Jordan Peterson: Am I Far Right?, gives a good, simple description of Peterson’s major influences:

In an emailed rebuttal to a journalist who termed him a figure of the “far right”, he described his own politics as those of a “classic British liberal … temperamentally I am high on openness which tilts me to the left, although I am also conscientious which tilts me to the right. Philosophically I am an individualist, not a collectivist of the right or the left. Metaphysically I am an American pragmatist who has been strongly influenced by the psychoanalytic and clinical thinking of Freud and Jung.”

12 Rules for Life by Jordan B Peterson review – a self-help book from a culture warrior (Guardian)

There seem to be three, mutually interlocking Jordan Petersons:

A. The tenured psychology professor, who has written books and papers, and whose lectures have been described as ‘life changing’ by students who took his courses.

B. The self-help guru, who talks about things like metaphysical truth, Jungian archetypes and seeking meaning, and whose ideas resemble Joseph Campbell’s work in a lot of ways.

C. The rabid anti-Communist crusader who engages in conspiracy theories and red-baiting, who sees secret Communism behind every campus action he doesn’t like.

Peterson’s fans commonly depict him as “misunderstood.” This is because, for almost everything he has said, he has said the opposite at some point, or used weasel words to soften his stance. He has also been accused of Gish-galloping through the topics he covers, which makes pinning down what he really believes like nailing Jell-O to a tree.

Why, then, is he considered to be far right?

Well, one major reason is that Peterson’s primary fan base is the alt-right, whether he likes it or not. It was not Peterson in his A or B incarnations that made him famous and put money in his coffers; it was version C. And he knows it.

A large part of this is because Peterson’s preferred enemies list is exactly the same as that of the alt-right: Social Justice Warriors, feminists, political correctness, activists (such as Black Lives Matter and LGBTQ groups), the undifferentiated “radical left,” HR departments, entire academic disciplines (anything with ‘studies’ in the title), postmodernism, but above all, Marxists and Neo-Marxists.

Peterson throws around the terms “Marxism” and “Neo-Marxism” interchangeably and without precise definitions. For a man whose cardinal rules include “Be precise in your speech,” he is remarkably sloppy with these phrases, making it difficult to know exactly what he is talking about. This video from the Epoch Times is the most comprehensive statement of Peterson’s ideology:

The accompanying article in the Epoch Times, an anti-communist newspaper founded by practitioners of the Falun Gong movement, transcribes the main points of the interview:

Peterson said it’s not possible to understand our current society without considering the role postmodernism plays within it, “because postmodernism, in many ways—especially as it’s played out politically—is the new skin that the old Marxism now inhabits.”

[…]

By the end of the 1960s, he said, even French intellectuals like Jean-Paul Sartre had to admit that the communist experiment—whether under Marxism, Stalinism, Maoism, or any other variant—was “an absolute, catastrophic failure.”

Rather than do away with the ideology, however, they merely gave it a new face and a new name. “They were all Marxists. But they couldn’t be Marxists anymore, because you couldn’t be a Marxist and claim you were a human being by the end of the 1960s,” said Peterson.

The postmodernists built on the Marxist ideology, Peterson said. “They started to play a sleight of hand, and instead of pitting the proletariat, the working class, against the bourgeois, they started to pit the oppressed against the oppressor. That opened up the avenue to identifying any number of groups as oppressed and oppressor and to continue the same narrative under a different name. And so since the 1970s, under the guise of postmodernism, we’ve seen the rapid expansion of identity politics throughout the universities,” he said. “It’s come to dominate all of the humanities—which are dead as far as I can tell—and a huge proportion of the social sciences.”

“We’ve been publicly funding extremely radical, postmodern leftist thinkers who are hellbent on demolishing the fundamental substructure of Western civilization. And that’s no paranoid delusion. That’s their self-admitted goal,” he said, noting that their philosophy is heavily based in the ideas of French philosopher Jacques Derrida, “who, I think, most trenchantly formulated the anti-Western philosophy that is being pursued so assiduously by the radical left.”

“The people who hold this doctrine—this radical, postmodern, communitarian doctrine that makes racial identity or sexual identity or gender identity or some kind of group identity paramount—they’ve got control over most low-to-mid level bureaucratic structures, and many governments as well,” he said. “But even in the United States, where you know a lot of the governmental institutions have swung back to the Republican side, the postmodernist types have infiltrated bureaucratic organizations at the mid-to-upper level.”

“I don’t think its dangers can be overstated,” Peterson said. “And I also don’t think the degree to which it’s already infiltrated our culture can be overstated.”

Jordan Peterson Exposes the Postmodernist Agenda. Communist principles in postmodernism were spread under the guise of identity politics (Epoch Times)

Now, technically, Peterson doesn’t use the term “Cultural Marxism” directly in the video, preferring the term “Neo-Marxism.” As far as I can tell, however, the terms are interchangeable; I could not find anything distinguishing the two, so I will consider them the same unless I learn otherwise. He certainly describes them in the same terms.

Given that he took grave exception to the use of the term “far right” in reference to him, to the point of demanding a retraction, one can only assume he is okay with the phrase “cultural Marxism” in reference to this video; otherwise he would have demanded that the term be removed and replaced with a more accurate one.

That Peterson is also vehemently anti-Marxist would be relatively unremarkable were it not for the fact that, in many of his online disquisitions about what he sees as a left-wing takeover of campus culture, he uses the terms “Marxism” and “postmodernism” almost interchangeably. Not only are these two schools of thought very different from one another, they are also in certain respects mutually antagonistic. You don’t need an MA in critical theory to figure it out: the travails of the Democratic Party during the primaries for 2016’s presidential election highlighted, in a very public and destructive way, the ideological fault lines in US progressive politics. The bitter schism between the Hillary Clinton camp — which mobilized aggressively around identity politics — and the old-school leftists who rallied around Bernie Sanders ultimately helped clear Donald Trump’s path to the presidency. (Historically, the burgeoning of identity politics in US campus culture in the 1980s and ’90s went hand in hand with the ascendancy of postmodernist ideas that explicitly repudiated Marxism.) It’s not just that this sloppy use of language exposes Peterson as an intellectual lightweight; the tendency to causally conflate various disparate phenomena that one happens not to like — in this instance, postmodernism, Marxism, and political correctness — is the calling card of the paranoiac.

A Messiah-cum-Surrogate-Dad for Gormless Dimwits: On Jordan B. Peterson’s “12 Rules for Life” (Los Angeles Review of Books)

What is “Cultural Marxism?”

Cultural Marxism is a ‘snarl word’ and dog-whistle phrase that refers to the Frankfurt School, a loosely organized group of academics and writers, based in Germany during the Weimar Republic, who were influenced by Marx. They were part of what we would today call a “private think tank” based in Frankfurt. For a good overview, I suggest this slightly less biased account from BBC Radio 4’s excellent In Our Time show: BBC Radio 4 – In Our Time, The Frankfurt School

Weimar Germany was a time much like our own: economic dislocation, rampant unemployment, declining faith in liberal democracy; communists, anti-communists, fascists and anti-fascists battling it out in the streets, marches and protests, etc. Despite all the chaos, there was a feeling of ‘hope and change;’ one scholar in the show compares it to an ‘Obama moment.’

Yet, instead of revolution, the nation turned to the right-wing Nazi Party.

Marx himself believed that successful revolution could only take place where the forces of capitalist production were sufficiently advanced. In such a scenario, the inherent contradictions of capitalism would cause it to falter, leading to socialist structures taking over in a more-or-less organic manner.

Instead, all the major communist revolutions were agrarian revolts by peasants against the aristocracy, rather than the proletariat rising up and seizing the means of production from capitalists in industrialized countries. Because capitalist mass production was not yet fully developed in these countries, Marx himself could have predicted their failure, and would not have been surprised at the chaos of their implementation. Many Marxists, in fact, consider the Soviet Union to have been a form of state capitalism.

The Frankfurt School think tank pondered this question: Why didn’t the revolution occur in Germany after the war, where it “should” have occurred? Why didn’t the proletariat rise up and overthrow the capitalist class in the advanced capitalist countries of Western Europe, as many thought was inevitable? To answer this question, the Frankfurt School looked beyond the economic structure to the culture itself. Capitalism wasn’t just an economic system, they argued; it colonized the minds of the people living under it, such that they could see no alternatives. It was embedded in the very DNA of society. To this end, they developed a “critical theory,” which was, as you can imagine, critical of capitalist society, but which addressed itself mainly to sociocultural issues rather than to the economic workings of society, as Marx had done.

They never called themselves “cultural Marxists,” however. The label’s ancestor comes from the National Socialist (Nazi) Party, which attacked the same targets under the name “cultural Bolshevism.”

A History of Nazi Germany describes how the Weimar Republic brought about increased freedom of expression (modernism), then described by critics as decadent and irrational. Traditionalist Germans thought that this was causing German culture to decay and that society was heading towards a moral collapse.

The Nazis labelled this modernism as “Cultural Bolshevism” and, through “Jewish Bolshevism”, claimed that Jews were primarily behind Communism. In particular, they argued that Jews had orchestrated the Russian Revolution and were the main power behind Bolshevists.

This Jewish-led Bolshevist assault was described by Adolf Hitler as a disease that would weaken the Germans and leave them prey to the Jews, with Marxism being perceived as just another part of an “international Jewish conspiracy”. An ideological objective was thus the “purification” to eliminate alien influences and protect Germany’s culture.

Cultural Marxism (RationalWiki)

This notion of Marxists undermining Western civilization, equating “critical” and “pessimistic” attitudes with an attempt to subvert Western values, is a staple of the far right that began in Nazi Germany as a reaction to dislocation and rapid change. It’s a thread that runs through the alt-right today.

As this article points out:

[Peterson’s] obsessive anti-communism sits uncomfortably with [his] supposed anti-fascism. The main opposition to Adolf Hitler’s rise, after all, came, not from high-minded conservatives like Peterson, but from German socialist and communist worker’s parties. And Hitler secured support domestically and internationally in part by promising to crush that leftist opposition.

How anti-Leftism has made Jordan Peterson a mark for Fascist Propaganda (Pacific Standard)

In fact, a lot of “high minded conservatives” and prominent intellectuals threw their support behind Adolf Hitler and the Nazi Party. Many wealthy, conservative Americans did too, especially those strenuously opposed to the “socialist” policies of Franklin D. Roosevelt, policies that are quite similar to those advocated by, for example, Bernie Sanders today.

The “cultural Marxist” conspiracy theory didn’t die with the end of the Third Reich, however. Instead, it was revived and greatly expanded by the rising conservative movement of the 1990’s, as the Republican Party merged with movement conservatism and the John Birch Society. Everything these conservatives believed was destroying American society, they blamed on the Marxists supposedly behind “politically correct” speech and quotas.

This post is a good explanation of why “[c]ultural Marxism is a poorly framed interpretation of Marxist theory and is flawed in its conception”: Cultural Marxism, Cultural Conservatism and the Frankfurt School: Making Sense of Nonsense (How to Paint Your Panda). But then again, maybe that’s part of the conspiracy!

‘Cultural Marxism’ becomes a rallying cry for the modern-day alt-right

The conflation of Marxism with political correctness and activism began long before anyone had ever heard of the good professor. It actually started in the Nineties, with roots going back to the Seventies.

This conspiracy theory hinges on the idea that the Frankfurt School wasn’t just an arcane strain of academic criticism. Instead, the Frankfurt School was behind an ongoing Marxist plot to destroy the capitalist West from within, spreading its tentacles throughout academia and indoctrinating students to hate patriotism & freedom. Thus, rock’n’roll, Sixties counterculture, the civil rights movement, the anti-war movement, homosexuality, modern feminism, and in general all the “decay” in the West since the 1950s are allegedly products of the Frankfurt school…[rationalWiki]

Its origins were surprisingly deliberate, emerging from a paleoconservative Washington think tank called the Free Congress Foundation. The FCF was founded by Paul Weyrich, a founder of the Heritage Foundation and namer of the so-called Moral Majority movement. Weyrich also created a TV network called National Empowerment Television, a short-lived predecessor to Fox News, which aired a documentary in 1999 called “Political Correctness: The Frankfurt School.” Hosted by…William Lind, it presents an account of the origin of what we now call “identity politics.”

Weyrich first presented his notion of Cultural Marxism in a 1998 speech to the Civitas Institute’s Conservative Leadership Conference, later repeating this usage in his widely syndicated “culture war letter”. At Weyrich’s request, William S. Lind wrote a short history of his conception of Cultural Marxism for the Free Congress Foundation; in it Lind identifies the presence of homosexuals on television as proof of Cultural Marxist control over the mass media and claims that Herbert Marcuse considered a coalition of “blacks, students, feminist women, and homosexuals” as a vanguard of cultural revolution…[wikipedia]

These came, Lind tells us, from the Institute for Social Research, or the Frankfurt School. There, Theodor Adorno, Herbert Marcuse, and their cronies created a school of thought called “critical theory,” which the FCF gave the name “cultural Marxism.” This frightening idea fused the impertinence of Marx with the indecency of Freud, producing a new threat to Western values far beyond those posed by Copernicus or Darwin… [https://www.viewpointmag.com/2018/01/23/postmodernism-not-take-place-jordan-petersons-12-rules-life/]

Sounds an awful lot like Peterson’s rhetoric, doesn’t it? In his essay, Lind declared, in terms virtually identical to Peterson’s stump speeches:

“Political Correctness is cultural Marxism. It is Marxism translated from economic into cultural terms. It is an effort that goes back not to the 1960s and the hippies and the peace movement, but back to World War I. If we compare the basic tenets of Political Correctness with classical Marxism the parallels are very obvious.”

Lind wasn’t satisfied with just an online essay. He also produced a series of videos, easily found on YouTube, whose ideas are virtually identical to Dr. Peterson’s political views:

In 1999, Lind led the creation of an hour-long program entitled “Political Correctness: The Frankfurt School”. Some of Lind’s content went on to be reproduced by James Jaeger in his YouTube film “CULTURAL MARXISM: The Corruption of America.” The historian Martin Jay commented on this phenomenon saying that Lind’s original documentary:

‘… spawned a number of condensed textual versions, which were reproduced on a number of radical right-wing sites. These in turn led to a welter of new videos now available on YouTube, which feature an odd cast of pseudo-experts regurgitating exactly the same line. The message is numbingly simplistic: all the ills of modern American culture, from feminism, affirmative action, sexual liberation and gay rights to the decay of traditional education and even environmentalism are ultimately attributable to the insidious influence of the members of the Institute for Social Research who came to America in the 1930’s.

Heidi Beirich likewise holds that the conspiracy theory is used to demonize various conservative “bêtes noires” including “feminists, homosexuals, secular humanists, multiculturalists, sex educators, environmentalists, immigrants, and black nationalists”.

Wait a minute, that’s the exact same enemies list as Jordan Peterson!

Indeed, I’ve spent some time watching these documentaries. Now, when I say the rhetoric is the same, you don’t have to take my word for it. Watch the Jordan Peterson video above. Watch the William Lind documentaries. Make up your own mind.

Although the theory became more widespread in the late 1990s and through the 2000s, the modern iteration of the theory originated in Michael Minnicino’s 1992 essay “New Dark Age: Frankfurt School and ‘Political Correctness'”, published in Fidelio Magazine by the Schiller Institute. The Schiller Institute, a branch of the LaRouche movement, further promoted the idea in 1994. The Minnicino article charges that the Frankfurt School promoted Modernism in the arts as a form of cultural pessimism and shaped the counterculture of the 1960s (such as the British pop band The Beatles) after the Wandervogel of the Ascona commune.

The idea that the counterculture was a fifth column for communism is an old chestnut going back to the 1960’s, as is the idea that colleges were radicalizing middle American children. The Powell memorandum back in the 1970’s sounded a paranoid alarm about how students on college campuses were being indoctrinated by insidious left-wing professors to hate the “free enterprise” system.

According to Chip Berlet, who specializes in the study of far-right movements, the Cultural Marxism conspiracy theory found fertile ground within the Tea Party movement of 2009, with contributions published in the American Thinker and WorldNetDaily highlighted by some Tea Party websites.

More recently, the Norwegian terrorist Anders Behring Breivik included the term in his document “2083: A European Declaration of Independence”, which—along with The Free Congress Foundation’s Political Correctness: A Short History of an Ideology—was e-mailed to 1,003 addresses approximately 90 minutes before the 2011 bomb blast in Oslo for which Breivik was responsible. Segments of William S. Lind’s writings on Cultural Marxism have been found within Breivik’s manifesto.

[https://en.wikipedia.org/wiki/Frankfurt_School#Cultural_Marxism_conspiracy_theory]

Right-wing agitprop outlets such as Breitbart, whose former chief Steve Bannon served in the Trump administration, also commonly use “cultural Marxism” as a snarl word and all-purpose bogeyman for everything they believe is destroying America from within, in terms alarmingly similar to those of the Nazis:

Breitbart views so-called “Cultural Marxism” as the root of all evil. Cultural Marxism destroys the language. Cultural Marxists want equality between the sexes. They threaten Western civilization, they hate God, and they love Muslims and homosexuals too.

Yes, Cultural Marxists are behind Muslim immigration too, they claim. It all started with talk about the rights of minorities in the ’60s, as they write:

Under this “cultural Marxism,” progressives believed they would aid racial and sexual minorities — and now Islamic minorities — by transferring cultural power and status from ordinary Americans, especially from white working-class Americans and away from individualistic-minded Christian churches…

The present day cultural Marxists, including former President Obama

are also encouraging the migration of devout Muslims and their theocratic political leaders into the United States.

And that leads to terrorism.

The resulting spike in jihad attacks…

The Nazi Roots of the Word “Cultural Marxism” (Breitbart Watch)

And this idea has even infiltrated the highest levels of the U.S. military:

In July 2017, Rich Higgins was removed by US National Security Advisor H. R. McMaster from the United States National Security Council following the discovery of a seven-page memorandum he had authored, describing a conspiracy theory concerning a plot to destroy the presidency of Donald Trump by Cultural Marxists, as well as Islamists, globalists, bankers, the media, and members of the Republican and Democratic parties.

As RationalWiki states, “Nobody denies that the Frankfurt School existed (and championed its fair share of nutty ideas). Critics of the pseudohistorical ‘Cultural Marxism’ conspiracy theory merely argue that the school was tediously unsuccessful (and, as such, somewhat unimportant) in the broad scheme of Western progressivism — and, more obvious still, that all liberals aren’t commies as well.”

Now, it’s clear that Peterson’s understanding of “Cultural Marxism” is very different from that of Anders Breivik, the Norwegian mass murderer. But Peterson’s constant use of the term is worrying. After all, this is what our young men are listening to! Peterson claims that things like Bill C-16 lead to the gulag and reeducation camps. Yet ideas virtually identical to the ones he is peddling have already led directly to the deaths of 77 people in Norway, and have gained cachet among people with their fingers on the nuclear button. What’s the real threat here?

According to the Southern Poverty Law Center, the right-wing theory of cultural Marxism holds that the Jewish, Marxist philosophers of the 1930s Frankfurt School hatched a conspiracy to corrupt American values by promoting sexual liberation and anti-racism…Peterson has tweaked this argument a bit. In his lectures, he mostly traces cultural rot to postmodernists like Derrida (whose work Peterson comically garbles) rather than to the Frankfurt School.

In Peterson’s new book, though, he does explicitly link postmodernism to the Frankfurt school, and in other venues he regularly uses and approves the term “cultural Marxism.” One of his videos is titled “Postmodernism and Cultural Marxism.” On Facebook, he shared a Daily Caller article titled “Cultural Marxism Is Destroying America” that begins, with outright racism, “Yet again an American city is being torn apart by black rioters.” The article goes on to blame racial tension in the U.S. on … you guessed it: the Frankfurt School.

Of course, it is possible to criticize the left without falling into fascism. Joseph Stalin was a murderous monster; Communist regimes have done horrible things that led to the deaths of millions of people. But the left in the U.S. and Canada is not promoting armed revolution or mass murder. In his cultural Marxism video, Peterson argues that, whether you’re talking about Leninist insurrection or folks criticizing sexism or racism in cultural products, “the end result is much the same.” That’s dangerous nonsense, which can easily be used to justify any extreme of violence. If your gender studies professor is the equivalent of Lenin … well, we’d better destroy her, right?

How anti-Leftism has made Jordan Peterson a mark for Fascist Propaganda (Pacific Standard)

His constant promotion of these paranoid conspiracy theories to his audience of impressionable, frustrated, and economically precarious young men makes him what I would characterize, somewhat ironically, as a “useful idiot” for the far right. This is why Peterson’s “I’m so misunderstood” schtick is disingenuous, as are the claims that he is “misinterpreted.” I think it’s pretty clear from the evidence above, in his own words, what he believes.

The tragic thing is, there was another guy who wrote in very similar terms about the rootlessness, despair and alienation that young men would inevitably experience under capitalism, and he gained quite a following too. His name? Karl Marx:

Matthew Syed in the Times gives us a wonderful example of Marxist thinking. He asks why marathon running is so popular, and says it’s because it satisfies a desire for self-improvement which we cannot get from paid labour:

We live in a world where the connection between effort and reward is fragmenting. In our past, we hunted, gathered and built…We could observe, track and consume the fruits of our labour. We could see the connection between our sweat and toil, and the value we derived from them. In today’s globally dispersed capitalist machine, this sense is disappearing.

This is pure Marxism. Marx thought that people had a desire for self-actualization through work, but that capitalism thwarted this urge. In capitalism, he wrote:

Labor is external to the worker, i.e., it does not belong to his intrinsic nature; that in his work, therefore, he does not affirm himself but denies himself, does not feel content but unhappy, does not develop freely his physical and mental energy but mortifies his body and ruins his mind. The worker therefore only feels himself outside his work, and in his work feels outside himself.

Jon Elster claims that Marx “condemned capitalism mainly because it frustrated human development and self-actualization.”

Marx was right. The fact that we spend our leisure time doing things that others might call work – gardening, DIY, baking, blogging, playing musical instruments – demonstrates our urge for self-actualization. And yet capitalist work doesn’t fulfill this need. As the Smith Institute said (pdf):

Not only do we have widespread problems with productivity and pay, as well as growing insecurity at work, but also a significant minority of employees suffer from poor management, lack of meaningful voice and injustice at work. For too many workers, their talent, skills and potential go unrealised, leaving them less fulfilled and the economy failing to fire on all cylinders.

This poses the question: why isn’t there more demand at the political level for fulfilling work?

Alienation: The Non-Issue (Stumbling and Mumbling)

Perhaps because Jordan Peterson and his ilk would rather we focus on the threat from radical postmodernist feminist college professors, while the identitarian Neoliberals just want to make sure there are enough minorities represented among the exploiters. Divide and rule has been a standard tactic for maintaining power in America since the aftermath of Bacon’s Rebellion, when the very concept of “race” was invented to keep the working classes from teaming up against the aristocracy who were–dare we say it–oppressing them. It was only when Martin Luther King attempted to bring poor whites into his movement that he was assassinated.

The meaning and self-actualization Peterson is peddling in his book simply isn’t possible under the capitalist system. And that’s the problem. No amount of self-help or story-building is going to change that fact.

Combining white paranoia about becoming a minority with a deteriorating economy and constant fear of cultural Marxism, and peddling the mix to angry young men, has not historically led to good results. Is Peterson too ignorant of history to see this?

Admittedly it’s not always easy to distinguish between a harmless retro eccentric and a peddler of poisonous and potentially murderous ideas. So let’s take stock: Masculinist persecution myth? Check. Repeated appeals to Darwinism to justify social hierarchies? Check. A left-wing conspiracy to take over the culture? Check. Romanticization of suffering? Check. Neurotic angst about “chaos”? Check. Like many of his sort, Peterson sees himself as a defender of the best traditions of Western civilization and the Enlightenment. But there is an old adage: if it looks like a duck and quacks like a duck, chances are it’s a duck.

A Messiah-cum-Surrogate-Dad for Gormless Dimwits: On Jordan B. Peterson’s “12 Rules for Life” (Los Angeles Review of Books)

Finally, here are some more good comments from that YouTube video. I’ve combined several of them; together they make the point that Marxism and postmodernism have nothing to do with the identitarian politics on college campuses.

I’m so fucking tired of people using the term postmodernist as a catchall for leftists – postmodernism has literally nothing to do with Marxism, in fact by its very nature is at odds with the material nature of Marxism… Derrida wasn’t a Marxist, he wasn’t even a political radical unlike many of his colleagues. Derrida didn’t even write about Marx at all until the 90s, after the time in which all of his intellectual cohort had given up on Marx. Derrida’s philosophical heritage is by way of the structuralism of Saussure and Levi-Strauss, and the hermeneutic philosophy of Heidegger and Gadamer…

Something that most people who aren’t in the academic left don’t realize is that Foucault is seen as a clear break with Marxism, and distinctly not as an extension of it. Foucault was the first one to pose a distinctly different understanding of oppression, a sort of anarchist-flavoured one, against the Marxists. There have been attempts at reconciliation, the most significant of which is Empire by Negri and Hardt, where they incorporated Foucault’s biopolitical framework to create a framework for analyzing the world after the Cold War. Postmodernism, insofar as that term refers to anything at all, is the wave of thinkers who broke with Marx after reading Nietzsche, which is the case for both Foucault and Deleuze. Lyotard and Baudrillard also broke with Marx altogether, though for different reasons. Negri, Deleuze, and Althusser all also became anti-Hegelian, all adopting Spinoza as a model for bizarre anti-dialectical forms of “Marxism”.

Basically, this is a lot more complicated than Peterson, or you, understand. The people who are collected up into ‘postmodernism’ were serious intellectuals with real insight, and while I don’t think most of it is correct, it’s important and interesting stuff.

Because Postmodernism doesn’t actually refer to anything, it is an empty label that basically exists only as a term of abuse used by people who don’t want to actually engage with various philosophers and social thinkers. There really is no common factor philosophically that links Derrida, Foucault, Deleuze, Rorty, etc., etc. What connects all these people is really just their attempt to explain the society of the era they lived in. …Peterson just straight up doesn’t understand the topology of the left. Peterson has never lived in a place where ‘the left’ and ‘liberals’ were universally understood to be categorically different orientations in politics. For Peterson, communists are just ‘very liberal’ people, while in European politics, for most of the postwar period, being ‘very liberal’ was the same as saying ‘very anti-communist’. In places like France and Italy the Communist party was often the second or third biggest party, and was distinctly separate from anything called a ‘liberal’ party. This means that Peterson totally conflates the Marxist left with ‘left-liberalism’ or progressive liberalism…

Next time we’ll take a look at how Peterson defends and shores up those systems.

The World According to Blyth, Keen & Curtis

Mark Blyth is a popular speaker. You have probably seen his talks all over YouTube. His specialty is explaining the politics of austerity (and why it’s a bad idea), and the rise of populism around the world, what he calls “Global Trumpism.”

I’ve seen several of his talks, and the ideas behind them are very simple. So I’m going to try and explain them in a straightforward manner below, sprinkled with a few quotes.

I’m also going to incorporate two other thinkers whose views are very similar and who fill in some of the gaps: economist Steve Keen and filmmaker Adam Curtis. Each has their own unique take on our situation, but all of their views gel together into one coherent big-picture summation of what has happened to the post-war world, and how we got into our economic predicament. I’m also going to add a few points of my own along the way where applicable.

The First Macroeconomic Regime 1945-1973

After the Second World War, nearly the whole world lay in ruins. Over 50 million people were dead. The architects of the post-war order vowed that they would do whatever possible to ensure that it would never happen again, no matter what the cost.

They realized that it was the economic dislocations and upheavals of the Great Depression–the joblessness, the inflation, the lack of a safety net, the radicalization of the population–that had ultimately led to the rise of Fascism and war.

At the same time, Communism was ascendant. Stalin had taken over Eastern Europe and was flexing his muscles. Mao and the Communists came to power in China. These two countries alone represented a significant share of the world’s total land area and population. They were joined by numerous smaller states–Cuba, North Korea, Vietnam–and numerous revolutionaries in places like Africa and Central America.

No one would support capitalism if all the gains went solely to the very top and most people were becoming worse off. So the post-war order would focus on one thing above all: full employment and social stability. Blyth uses the term “macroeconomic regime” to describe the set of policies that run the economy. The post-war macroeconomic regime was based around stability, Keynesianism, national economies, unionization, and especially full employment as the goal.

During this period, labor’s share of income went up, while capital’s share went down. This had never happened before. A prosperous, consuming middle class was created in the industrialized world. The results of this were spectacular. Here’s Blyth:

“Back in the day, from the end of World War two, from 1945 to about 1975, this is the golden era. It was the period where something very weird happened that never happened before. The top of the income distribution came down, the bottom went up, and the whole distribution jumped. This is when you got the birth of the American middle classes. This is when British Prime Minister Harold Wilson said to the working people in Britain, ‘You’ve never had it so good,’ and he was right.”

“And there was a unique combination of circumstances that produced that world. Mainly the reaction to the great Depression, fascism, world war two; and the success of the Soviet Union appealing as an alternative economic model after the chaos of the Twenties and Thirties.”

“So at the end of that period, we built a world that looked like this: The Cold War Era. The policy target was full employment, regardless if you were Sweden or the United States. That’s what the government cared about, because we saw the disastrous consequences of unemployment on a decade-long period.”

But there was a problem. This problem was articulated in 1943 by a Polish economist named Michał Kalecki, who was working in the basement of the London School of Economics. His seven-page paper, “Political Aspects of Full Employment,” effectively predicted what was to come in the 1970’s.

The Political-Economy of Kalecki (The Ad-Hoc Globalists)

The Old Macroeconomic Regime Crumbles: 1973-1979

We all know what happened next. The wheels came off in the 1970’s. Everything started falling apart the same year I was born–1973–which I’m sure was just a coincidence. But the question is, why did it fall apart?

Kalecki’s explanation was that if you had a siloed economy of restricted capital and labor flows that targeted full employment, employees would take advantage of the situation to demand higher wages. If anyone can simply go out and get another job, businesses have to keep wages high to be competitive. But that eats into their profits. So they raise their prices to compensate. But rising prices eat into the workers’ pocketbooks. So the workers demand higher wages still. Businesses again raise prices to compensate for higher wages. Workers again demand more money to compensate for higher prices. And so on, and so on, in what economists call a wage-price spiral. More specifically, inflation driven by rising wages is called “wage-push inflation.” Wage-push inflation resulted from the full employment policies of the first macroeconomic regime:

Wage push inflation is a general increase in the cost of goods that is preceded by and results from an increase in wages. To maintain corporate profits after an increase in wages, employers must increase the prices they charge for the goods and services they provide. The overall increased cost of goods and services has a circular effect on the wage increase; eventually, as goods and services in the market overall increase, then higher wages will be needed to compensate for the increased prices of consumer goods. (Investopedia)
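To see why that definition describes a ratchet rather than a one-off price adjustment, here is a toy simulation of the spiral; the numbers are my own illustrative assumptions, not anything from Blyth or Investopedia. As long as workers claw back all of last year’s inflation plus a premium, and firms pass wage growth fully into prices, inflation climbs every single round:

```python
# Toy wage-price spiral under full employment.
# All parameters are invented for illustration.
premium = 0.03        # extra real raise workers can extract in a tight labor market
pass_through = 1.0    # share of wage growth firms recover by raising prices
inflation = 0.02      # starting rate of inflation

for round_ in range(1, 6):
    wage_growth = inflation + premium        # catch up with last year's prices, plus a premium
    inflation = pass_through * wage_growth   # firms defend their margins by raising prices
    print(f"round {round_}: wages +{wage_growth:.0%}, prices +{inflation:.0%}")
```

Run it and inflation climbs 5, 8, 11, 14, 17 percent. Break either assumption (slack labor markets so the premium disappears, or firms unable to pass costs through) and the spiral stalls, which is exactly what the next macroeconomic regime would engineer.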

Blyth explains it this way:

“So what killed that first regime was inflation.”

“But it failed in the 1970’s. And the reason it failed was the following. Imagine you’ve decided that I’m going to target full employment, and that’s going to be my one policy goal. So you’re going to run a very tight, restrictive set of labor markets. And wages are going to get bid up, to the point that when you get to the late sixties when you’re running the Vietnam War off the books, your real unemployment rate is about 2-1/2 to three percent.”

“So the worst guy in your firm can leave work and then walk straight into another job and get a pay rise. The only way that firms can deal with this is by pushing up prices. So they push up prices, then what happens? Labor figures out they haven’t really had a pay rise. So they want more money. So they get a pay rise. So they want more money. And it all gets pushed up into inflation.”

“When inflation goes up and up and up like this, it becomes irrational to be an investor. So the investment rate collapses. Unemployment goes up despite the inflation. We get the great stagflation of the 1970’s.”

What high inflation and rising wages did was make it easy for people to service their debts. It was a “debtor’s paradise.” But creditors were not so happy. The value of their investments was eaten away by inflation. So they stopped investing, going on what was effectively an “investor strike.” The result of the dearth of investment due to high inflation was economic stagnation. Stagnation + inflation = stagflation.

“The Great Inflation of the 1970s destroyed faith in paper assets, because if you held a bond, suddenly the bond was worth much less money than it was before. But it was a brilliant time to be a debtor.”

“How many of you took out a mortgage in the 1970’s? You made out like bandits! Because if you have a 3 percent mortgage and there’s 10 percent inflation, it’s great; the bank was eating it and you were getting the capital gain. And then when you elected Reagan you locked in high real interest rates and your house increased in value. What a deal!”
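The arithmetic behind Blyth’s mortgage story is worth spelling out. Using his numbers (a 3 percent mortgage and 10 percent inflation) and an invented $50,000 loan:

```python
# Blyth's 1970s mortgage in numbers: 3% nominal rate, 10% inflation.
nominal_rate = 0.03
inflation = 0.10

# Fisher relation: (1 + real rate) = (1 + nominal rate) / (1 + inflation)
real_rate = (1 + nominal_rate) / (1 + inflation) - 1
print(f"real interest rate: {real_rate:.1%}")    # about -6.4%: you are paid to borrow

# What a decade of 10% inflation does to the real burden of a $50,000 debt:
debt = 50_000
print(f"real burden after ten years: ${debt / (1 + inflation) ** 10:,.0f}")   # about $19,000
```

A negative real rate means the bank, as Blyth says, was “eating it”: the debt shrank in real terms faster than the interest accrued.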

So in the Blyth/Kalecki view, full employment policies, combined with an investor strike, caused the stagflation of the 1970’s. But there are a couple of alternate explanations I’d like to add.

One was the Vietnam War. The war caused a massive increase in government spending, as all wars do. At the same time, instead of being raised, taxes were being cut. Throughout history, governments have “paid for” wars by raising taxes and/or borrowing.

I put “pay for” in quotes because Modern Monetary Theory (MMT) tells us that taxes do not fund government spending, for war or for anything else. So why the need for increased taxes? Because if the government is printing more and more money to cover war costs, but is not “unprinting” money via taxation or soaking up the excess with war bonds, it is increasing the overall amount of money circulating in the economy. Do this without a corresponding increase in productivity and of course you will get inflation. It doesn’t help that the increased economic activity was mostly going into things that were shipped halfway around the world to be blown up.
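A crude way to see the mechanism is the old quantity-theory identity MV = PQ. This is my own sketch with invented numbers, not how MMT economists would formally model it:

```python
# Crude quantity-theory sketch: money * velocity = price level * output.
# Invented numbers; velocity is held constant for simplicity.
money, velocity, output = 100.0, 1.0, 100.0
print(f"price level before: {money * velocity / output:.2f}")   # 1.00

money += 10.0   # print to pay for the war, with no offsetting taxes or war bonds
# output is unchanged: the new production is shipped overseas and blown up
print(f"price level after:  {money * velocity / output:.2f}")   # 1.10, i.e. inflation
```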

The other explanation is a sudden spike in the cost of oil. The OPEC cartel, formed years earlier, flexed its muscle for the first time with the Arab oil embargo during the 1973 Yom Kippur War, which caused the price of oil to roughly quadruple almost overnight. Gas lines formed. This came just after U.S. domestic oil production peaked in 1970. Later in the decade, the Iranian Revolution would cause speculation in oil markets to drive the price up once again (there was no actual supply shortage). It was one of the largest transfers of national wealth in human history, from the industrialized world to the oil-producing nations of OPEC, particularly Saudi Arabia and the Gulf states. Interestingly, it was after this transfer of wealth that the threat of Islamic terrorism began to rise, funded by that money.

Spikes in the price of oil tend to coincide with changes of macroeconomic regime and with financial dislocations, a pattern many economists ignore.

Economists have a name for this phenomenon too. They call it cost-push inflation:

Cost-push inflation is a phenomenon in which the general price level rises (inflation) due to increases in the cost of wages and raw materials. Cost-push inflation develops because the higher costs of production factors decrease aggregate supply (the amount of total production) in the economy. Because there are fewer goods being produced (supply weakens) and demand for these goods remains consistent, the prices of finished goods increase (inflation). (Investopedia)
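In other words: demand holds steady, supply shrinks, and prices must rise to clear the market. Here is the arithmetic, using an invented constant-spending demand curve:

```python
# Cost-push inflation in miniature.
# Invented demand curve: total spending is constant, so quantity = k / price.
k = 100.0                        # demand parameter: spending stays the same
for supply in (100.0, 80.0):     # an oil shock cuts available supply by 20%
    price = k / supply           # the price at which demand equals supply
    print(f"supply {supply:.0f} -> market-clearing price {price:.2f}")
# A 20% cut in supply forces a 25% rise in price with no change in demand.
```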

In this alternative view, full employment did not cause the problem. Rather it was an ill-advised war that the government refused to “pay for,” coupled with an unpredictable rise in the cost of the substance most dear to the economy–energy–as the result of cartel manipulation and geopolitical tensions.

Who’s right? What was the real cause? Hard to say, but I lean towards oil. If full employment was such a problem, why did Kalecki’s prophecy take thirty years to come true? Maybe because that’s when energy costs spiked. The ugly conclusion of Blyth’s view, as I see it, is that we cannot have full employment, or inflation will inevitably spiral out of control. That is, we “need” a certain portion of the populace to be unemployed. I find this disturbing. It amounts to what is basically human sacrifice.

Inflation was running rampant. Stagflation. The Misery Index. Put on a sweater. Malaise forever. History’s greatest monster. You know the deal.

1970s stagnation (Angry Bear)

The system badly needed a reset. How was this accomplished?

The New Macroeconomic Regime 1980-2007: Neoliberalism

Blyth describes a macroeconomic regime as a sort of “software” written on the “hardware” of capitalism. After 1980, a new software was written. Not only would full employment no longer be a goal, but labor would now be “disciplined”: forced to accept declining wages, longer working hours, fewer benefits, less stability (and its converse, more “flexibility” for employers), more international competition, and so on. It was, essentially, a revolt of the elites rather than the masses; from the top down rather than the bottom up.

Instead of elected governments, policy was handed over to unelected central bankers. They “cured” the inflation by raising real interest rates to extremely high levels. This caused a terrible recession between 1979 and 1982 and sent unemployment spiking to over ten percent, but it did bring down inflation and reset the system. It coincided with the transition from the Democratic Carter administration to the Republican Reagan one.

“And what’s the solution to stagflation? Hand policy to central bankers, because they’re not elected and they can’t be thrown out of office, and they can jack up interest rates to twenty percent when inflation’s sixteen percent, cause a massive hemorrhaging of the economy and a constriction of credit and you get the big recessions that happened in the 1980’s. But it really reset the system.”

“There was a new software written onto that hardware, and that was the ideas of Thatcher and Reagan and the people behind them. That open markets, price stability, going global, that was the way you do it. That flexibility was good, that labor was bad. That the returns to capital had to go up, otherwise what was the point of capitalism? That was the Neoliberal compact.”

In addition, markets would open up to globalization. Regulations would be abolished. Capital would be free to seek its highest return. Markets would be liberalized. Trade unions would be crushed. Banks would be deregulated. Taxes would be lowered. Labor would become flexible and footloose. Economies would remove tariffs and open themselves up to foreign competition. Labor would no longer be protected.

Globalization would neutralize the power of workers, because now if unions demanded more money, production would just move somewhere else. First it moved to the “right to work” states of Dixie, and then abroad to places like China and Mexico. Workers could no longer demand raises from their employers. This ended the push for inflation, because rising wages are what drive inflation. I’d also note that high oil prices led to new reserves being tapped, especially Alaska and the North Sea, breaking the power of the OPEC cartel and bringing oil prices back down.

As banks lent more and more money, and as capital was “freed” to seek its highest return anywhere in the world, the total amount of money increased. This drove interest rates down and down and down. The problem with low interest rates, though, is that they decrease the returns to capital. They also penalize savers and investors, whose accounts no longer earn much of a return. So how could high finance ensure adequate profits in a world of low interest rates and low inflation? The answer: leverage.

“Now here’s the problem, those interest rates go down over time because you make more and more financial transactions, you integrate different markets, you open up globally, so the pool of money gets bigger. As the pool of money gets bigger, the price of money falls. What’s the price of money? The interest rate.”

“So how do you make money on a declining spread? You pump up the leverage. And the banking system of the West became multiples of the underlying size of the economy. And it was all working great so long as everyone was revolving credit, whether it was your credit card, your house, the mortgage, the corporate loan book, whatever it was, so long as it doesn’t go bust.”
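The logic of “pumping up the leverage” is simple arithmetic: the return on a bank’s equity is roughly the spread it earns on its assets times how many assets it holds per dollar of equity. A sketch with invented numbers:

```python
# Why shrinking spreads push banks toward more leverage (invented numbers).
def return_on_equity(spread, leverage):
    # profit = spread earned on assets; assets = leverage * equity
    return spread * leverage

print(f"{return_on_equity(0.03, 10):.0%}")   # 3% spread at 10x leverage: 30% ROE
print(f"{return_on_equity(0.01, 10):.0%}")   # spread falls to 1%: ROE collapses to 10%
print(f"{return_on_equity(0.01, 30):.0%}")   # triple the leverage to win the 30% back

# The catch: at 30x leverage, equity is 1/30 of assets, so a fall in asset
# values of just over 3% wipes the bank out entirely.
print(f"loss that erases the bank: {1 / 30:.1%}")
```

That catch is the whole story of 2008 in one line.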

Instead of full employment, financial policy in the new regime would focus on controlling inflation, a.k.a. price stability, above all else. At the very hint of inflation, central bankers stood ready to raise interest rates. The central bankers became household names as the ability of politicians to influence the newly globalized economy waned. Inflation went away, never to return. This made the world safe for investors, but it made debts harder to service. It was now a “creditor’s paradise.” Even though inflation stayed low for decades, the 1970’s fear of inflation lingered on.

However, as Blyth points out, this fear was irrational; the long-term trend in interest rates was for them to go down. He cites statistics showing that the interest rate for government debt has been declining since 1350! In fact, literally the only period of high inflation we see in the data was the 1970’s, and yet all of our macroeconomic thinking today is based on that one short time period (when oil prices spiked, interestingly enough):

“You all watch Game of Thrones, right? ‘Hi, I’m the king, I’d like to borrow some money.’ I’m the Iron Bank of Braavos. You know what happens–everybody dies.”

“In that world, you have very high real interest rates, because if you get a bond from a government, they might rip you off. There’s no secondary market where you buy and swap different bonds around to offset the risk. So you have very high real interest rates.”

“The Italians and the Dutch come along in the 15-16th century and invent a secondary market for government debt. That starts to grow rapidly. The risk dissipates.”

“And then by the time you get to the 1700’s, real interest rates are below four percent. The Brits can issue a perpetual bond, a ‘forever’ bond to fight the Napoleonic wars at three percent and it’s oversubscribed. By the time you get to 1941, the real rate of interest is 1.88.”

“So the long-run real rate of interest rate for the global economy is two percent. Then there’s the 1970’s. All the inflation is in the seventies because of the unique confluence of events which was the post-war regime and its breakdown. But all of the economics we’ve ever learned is based on that one bit of the trend series. Everything else is forgotten.”
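Why do interest rates matter so much to the people holding the paper? Consider the “forever” bond Blyth mentions. A perpetual bond paying a fixed coupon is worth coupon divided by the prevailing yield, so its price moves inversely with rates; the figures below are illustrative:

```python
# Pricing a perpetual ("forever") bond: present value = coupon / yield.
coupon = 3.0    # £3 a year on £100 of face value, i.e. a 3% consol
for market_yield in (0.10, 0.03, 0.02):
    price = coupon / market_yield
    print(f"prevailing yield {market_yield:.0%}: price £{price:,.2f}")
# In a Game-of-Thrones world of 10% rates the bond fetches only £30; at 3%
# it trades at par (£100); if long rates drift down to 2% it is worth £150.
```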

At the same time, wages in the wealthy countries stagnated. The labor/capital split now shifted: labor’s share of income, formerly going up, now went down. Wages became decoupled from productivity; adjusted for inflation, they remained flat for decades. As the economist Thomas Piketty later asserted, if the rate of return to capital is higher than the overall growth rate of the underlying economy, inequality will increase dramatically and without bound. This is especially true when wages are stagnant.
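Piketty’s claim takes only a few lines to illustrate. Assume, purely for illustration, capital returns of 5 percent against economic growth of 2 percent:

```python
# Piketty's r > g in miniature (illustrative rates, not measurements).
r, g = 0.05, 0.02            # return on capital vs. growth of the economy
capital, economy = 1.0, 1.0
for year in range(50):
    capital *= 1 + r
    economy *= 1 + g
print(f"after 50 years, capital has outgrown the economy by {capital / economy:.1f}x")
```

After fifty years the stock of capital has grown more than four times faster than the economy that has to pay its returns, and with wages flat, all of that gap accrues to the people who already own things.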

“Capitalism is run by investors, investors and firms. Once those firms go global as they did in the eighties and increasingly in the nineties, then the ability of domestic labor to demand their share of the profit split with capital really declines. And that begins the wage stagnation that we see in 1979 which continues all the way through to the present day, such that 60 percent of Americans, when adjusted for inflation, haven’t had a wage rise for thirty years.”

A small coastal elite, who held much of the paper wealth, became fabulously rich. But workers who depended on labor for their income, particularly in deindustrialized places like the American Heartland or the English Midlands, were hard hit, faced with declining wages, dead-end jobs, shrinking government services, budget cuts, jobs moving to other countries, and mass immigration into their communities. At the same time, the costs of “non-tradable” goods like education and healthcare soared into the stratosphere. Although globalization delivered cheap consumer goods, the costs of college, health care, child care, and later housing became an increasingly onerous burden.

With declining wages, how would consumption keep up? How would Americans pay for the rising costs? By using credit to substitute for the lost wages. And this wasn’t just true of individuals, but governments as well. Governments, too, would lower taxes and make up the difference by borrowing from the private sector. Here’s filmmaker Adam Curtis explaining the role finance played in the 1980’s:

14:00: “The interesting thing about the 1980’s is that everyone thinks that Thatcher and Reagan really were successful. But increasingly historians are looking back and going, ‘No they weren’t.’ They came in saying they were going to regenerate industry. But by about 1986-7 most of the industries in Britain and America had collapsed because of the economic experiment. So what Thatcher and Reagan did was they turned to finance. And they said, ‘Can you help us?’ And what finance did was to say, ‘We’ll lend the money.’ Wages weren’t going up. Wages were actually collapsing at that point.”

“So what happens is you had a switch and they gave power to finance. And finance came in and introduced the idea of lending on a much grander scale. The politicians allowed that because they facilitated all sort of new acts of parliament that allowed all that lending to happen.”

“So what you’ve got is a shift away from the idea that you were on a constant travelator of increased wages, increased security in the industries you worked in. Your income stagnated and it was supplemented by lending money. So finance got a great deal of power.”

“Now underlying finance is a deep desire to keep the world stable, to avoid chaotic situations. So we began to move into that world where we’re always trying to avoid risk. What then happens is that idea begins to spread out, not just literally in terms of lending money. The idea of avoiding risk becomes the central thing in our society. And I would argue that we’ve all become terrified of change. Which is conservative.”

And thus, beginning in the 1980’s, finance became the new basis for the economy. Manufacturing, meanwhile, practically disappeared, felled by a combination of offshoring and automation. Service jobs, at lower pay, became the most common job type. At the same time, a college degree became a basic requirement for any job that paid more than minimum wage. And, finally, the main alternative to this system–Communism–collapsed. Now there truly was, as Margaret Thatcher put it, no alternative.

The problem was that financing living standards with debt was unsustainable, especially with declining incomes for the majority of wage earners:

“So how do people survive when wages aren’t growing? They borrow…In 2004 I lived in Baltimore. I went away for two weeks. I couldn’t open the door when I came home cause I had so many credit card offers…That’s why banking’s so big. Because every single one of us is running a deficit.”

“For everyone who’s fifty years old or older, do you remember a time when you didn’t have credit cards? For everybody who’s under fifty, that happened. We used to have this thing called the state that ran deficits for us and paid for stuff. But now you do it yourself. Through student loans. Through revolving lines of credit. Through borrowing from your house as if it’s an ATM. Because that’s what you’re using to fill in the gap.”

The problem with such skewed rewards from globalization is that the people at the top can only buy so much; they save much of their income. Meanwhile, the bottom sixty percent have seen their wages stagnate and have been taking out student loans, mortgages, credit card debt, payday loans, etc., to make up for lost wages and shrinking public services. This leads to a fall in consumption once the debts start going bad, people are tapped out and can no longer borrow against their incomes, and the asset bubbles burst:

“Now this created a big problem. I like Mitt Romney, but there’s only so many fridges he can buy. You do have a basic consumption problem if you’ve been running your economy as we have for the past thirty years on credit. And if people’s wages haven’t been rising and they’re strapped with too much debt–which banks call credit, [because assets and liabilities sum to zero]–then they can’t service their debts. At the same time they’re being told if you don’t go to college you’ll never amount to anything, there’s no jobs for anyone who doesn’t have a college degree these days, you end up with a world where your share of the national income is falling despite the fact that the country has never been richer. And it’s not just this country, it’s every country.”

Here’s Steve Keen explaining how it was done under the new macroeconomic regime:

“The fundamental engine that drove the apparent success of Neoliberalism until the crisis struck was an increasing level of private debt–leverage.”

“The reason that private debt matters is because credit is the source of a large part of demand.”

“When you borrow money, what you’re actually doing from banks is: the banks are creating money, creating a debt for you at the same time, you then spend the money you’ve borrowed, so that additional change in debt becomes a component of demand today. But of course with that change in debt that gets added to the level of outstanding debt, and you can have a process where that level rises over time.”

“In the UK’s case…from 1880 right through to 1980, there was no trend in the level of private debt compared to GDP in the UK; it never exceeded 75 percent of GDP. When Maggie Thatcher came to power it was 55 percent of GDP. From there, the debt level rose from 55 percent of GDP in 1982 to 190 percent in 2008…The reason the crisis occurred was the rate of growth of debt went from positive to negative, and bang, you had a crisis.”

“So the bubble was caused by a rise in leverage, the crisis after it was caused by an absence of the same substantial level of credit simply because both households and businesses are unwilling to borrow at the rate they were willing to borrow when the bubble was going on and the banks aren’t so willing to lend either. So we’ve got a sclerotic effect from the level of accumulated debt. That’s the real story.”

Blyth summarizes the problem as, “Debts are too high, wages are too low to pay the debt off, and inflation is too low to eat the debt.” Leverage works so long as the debts can be paid; once they can’t, the whole thing falls apart like a house of cards. Leverage also tends to raise asset prices, causing bubbles. This is what happened during the global financial crash of 2007–2008. Here’s Steve Keen again describing what happened:

“What Neoliberalism allowed the West to do was to use leverage to dramatically add to total demand, but of course adding to debt at the same time. And then when we reached the situation where so many interventions which were debt financed went bankrupt, where there were assets that were overvalued and then collapsed and then wiped out the banks in the process, and where people realized that rather than house prices rising forever they sometimes fall so you get the hell out of mortgage debt, all those things came along and they became what I called the walking dead of debt…”

“…The levels of debt are the highest they’ve been in human history, and well beyond what we can service reliably, and also investment and consumption are both diminished dramatically because people don’t want to invest beyond their income levels which they do during a boom. So the only way to solve it is to get the debt level down.”
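
To make Keen’s mechanism concrete, here is a minimal toy model–my own illustration with made-up numbers, not Keen’s actual model: demand each year is income plus the change in private debt, so the crisis arrives not when debt is repaid but the moment credit growth merely turns negative.

# Toy model of credit-as-demand (illustrative numbers only, not UK data).
income = 100.0  # the "real" economy's income (GDP), arbitrary units
debt = 55.0     # private debt starts at 55% of GDP, per the quote above

for year, credit_growth in enumerate([10, 12, 14, 16, -5, -3], start=1982):
    debt += credit_growth            # new borrowing adds to the debt stock...
    demand = income + credit_growth  # ...and to this year's total demand
    print(f"{year}: debt/GDP = {debt / income:.0%}, demand = {demand:.0f}")

# Demand climbs from 110 to 116 while leverage rises, then collapses to 95
# the instant the change in debt flips sign: "bang, you had a crisis."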

The Second Macroeconomic Regime Collapses: 2007–Present

The second regime collapsed during the Global Financial Crisis of 2007-2008. Why did that happen?

Blyth doesn’t explain the specific timing of it, but it’s interesting to note that this, too, coincided with a dramatic spike in the price of oil.

The fundamental cause, however, is easier to understand. The financial sector was leveraged to the hilt. Its total assets were multiples of the size of the underlying “real” economy of goods and services.

If enough of a bank’s assets go bad that what remains no longer covers its liabilities, the bank goes under. QE bought up the bad assets before that could happen.

Blyth explains the concept of leverage, assets and liabilities using the example of a mortgage:

“People are confused about what a debt is. It’s not just this bad thing. Debt on the public side or the private side is an asset [to someone]. The people who got bailed out got their assets bailed out.”

“Now, a very simple way of thinking about this: I have a mortgage, you have a mortgage. To me, that’s a liability. The bank doesn’t want your house; it wants the income stream coming from it. [That’s the bank’s asset.] My asset is my house. My liability is paying the mortgage. It all sums to zero.”
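
Blyth’s point is just double-entry bookkeeping, and it is easy to sketch in code (my own illustration, with an invented figure):

# One mortgage, two balance sheets: each entry appears with opposite signs,
# so across the whole system it sums to zero.
mortgage = 250_000

household = {
    "asset": ("house", mortgage),         # the borrower's asset is the house
    "liability": ("mortgage", -mortgage), # the borrower owes the income stream
}
bank = {
    "asset": ("mortgage", mortgage),      # the bank's asset is that income stream
    "liability": ("deposit", -mortgage),  # created as a deposit when the loan was made
}

total = sum(value for sheet in (household, bank) for _, value in sheet.values())
print(total)  # 0 -- "it all sums to zero"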

40 Years Of Income Inequality In America, In Graphs (NPR)

However, unlike the previous regimes, this time there would be no “reset.” There would be no new software written for the hardware of capitalism. Instead, banks and the wealthy were bailed out by taking their assets onto the public balance sheet through “money printing” and buying up junk bonds:

“In the 1970s the system failed. It had a heart attack because of inflation. The Neoliberals came along and reset the system. They wrote new software for the hardware. We didn’t do that in 2008. We let the money doctors come in. What they did was they pumped 13 trillion of Euros, Dollars and Yen into the global banking system to keep the system going. They had a heart attack, and we basically put them into intensive care for ten years.”

The corresponding rise in public debt sparked calls for “austerity” on the part of elites to bring down the government’s debt. But, even as the investor class was bailed out, savage austerity cuts would be aimed squarely at the poorest and most vulnerable members of society who had been borrowing like crazy just to maintain their living standards in the face of decades of stagnant wages and rising costs:

“When you bail out the assets of a bank, you’re bailing out the assets and incomes of the top twenty percent of the income distribution, particularly those at the very top. So when you’ve just done that, they’re not going to turn around and say let’s pay extra taxes because we got bailed out. No, they want to put that on the other part of the income distribution–the ones who are now being told, we can’t have this, you need to pay more, your education can’t be free, et cetera.”

With interest rates at practically zero, there was nothing for the central bankers to do to stimulate the economy this time. Instead of increasing government spending as prescribed by Keynesianism, politicians preached the need for “belt tightening” due to the rising debt. This had the effect of shrinking the economy:

“There’s no inflation in the system. Why? because labor produces inflation. And once you’ve globalized your labor markets, there’s no inflation anymore. Why can’t Janet Yellen bring inflation rates up? Why can’t Draghi bring interest rates up? Because there’s no reason to. There’s no inflation. When you do it, you’d simply slow down the economy. But what does that mean for savers? What does that mean for pension funds? Whoops!”

“Now, in my opinion, add this all together and you get populism. Debts are too high. Wages are too low to pay off the debt. Inflation is too low to eat the debt. You can’t play the trick you did in the 1970’s, when [inflation ate] your mortgage. It’s the other way around–this is a creditor’s paradise, not a debtor’s paradise.”

“The Left response is ‘blame capital, blame globalization.’ And they’re not blameless. The Right response is ‘blame immigration, blame globalization.’ We can disagree on the immigration one, but they’re basically hitting on the same things.”

Enter Populism

The Neoliberal economic regime hollowed out the middle classes of the industrialized world, even as it raised incomes in much of the developing world. The “elephant chart” compiled by economist Branko Milanovic plots everyone in the world, from the poorest person on earth to the richest, against the percentage change in their income under globalization:

The graph shows that incomes for the poorest countries went up–from a pittance to something slightly more than a pittance. The biggest gains represent the emerging middle classes of Asia. The American middle class sits at roughly the 65th–80th percentile of the global distribution; their incomes have taken a beating. And notice that the last four squares represent the wealthiest 10 percent of people on the planet. They’ve captured the majority of the growth under Neoliberalism, such that 42 billionaires now own the same wealth as the bottom half—3.7 billion people—of the world’s population.

“Guess what? The top 20 percent have made off with all the cash. And if you’re actually in the bottom?…it’s hardly budged since 1980. And that’s true for the bottom three quintiles. So sixty percent of the country hasn’t had a pay rise, adjusted for inflation, since 1980. Meanwhile, people like me on the coasts, we’ve been lapping it up. I’ve been having lobster thermidor in the bath!…”

These declining living standards for the majority in wealthy countries give rise to populism. Even as the country as a whole has never been richer, and the stock market and GDP hit new heights, workers have a harder and harder time making ends meet. Their wages are stagnant or declining. They are heavily indebted. Their formerly good and stable manufacturing jobs have been replaced with low-paying “flexible” service jobs with no benefits. Digital technology is forcing people to become “independent contractors.” And now workers hear even those jobs will soon be replaced by robots. Blyth illustrates this phenomenon with a hypothetical Rust Belt worker named Gary:

“There’s a guy called Gary. Gary lives in Gary, Indiana. Gary [has] ten years in the union in 1989. He gets seniority. He’s a line supervisor with seniority, he’s turning thirty, he gets married, everything’s going great. And he’s getting $30.00 an hour, real [wages].”

“Now, who knows why, but they’ve been moving the plant and the equipment down South for a long time. China didn’t take most of the industrial jobs; the South did. Texas did. North Carolina did. So they’ve been losing a lot of the industrial base. But then they signed this thing called NAFTA. And the plant disappeared, the supplier plant disappeared, and the town takes an enormous economic hit.”

“So a lot of people move out. The tax base goes down. The schools get worse. And he bootstraps himself and says, ‘I’ve never relied on anybody; I’ll get another job.’ They were meant to retrain him as a computer programmer; that’s what everybody said. But the governor at the time really just gave a shit about tax cuts, so they just cut the budget for that and handed it out to people [as tax cuts].”

“So then he ended up getting a job in a call center. So he went to $15.00 an hour. And then five years later the call center went from Indiana to India. And now Gary works in his dotage, very hard, long hours, for $11.67 an hour for the largest employer in the United States–WalMart. And every day Gary reads in the papers about how he and all of his mates are about to be replaced by robots. Because you do, every day. Whatever sector you’re in at the low end of the labor market–automation, robotization, it’s going to happen.”

“And the guys on Wall Street who got bailed out with everybody else’s money, they love this. They’re going to make a fortune off this. All these internet entrepreneurs, [the] Uber guys, they’re the ones who will own the patents on the robots. And he’ll be thrown on the scrap heap with his mates. And the only person who articulates anything he actually gives a shit about is this guy Trump.”

“Now he knows [Trump’s] a buffoon. He knows he’s a reality TV star. But [Gary] has had politician after politician after politician showing up and saying ‘vote for me, better jobs; vote for me, more security,’ and life’s gotten crappier and crappier and crappier. So he has no reason whatsoever to believe a word they say. So he has a liar on one side, and a bullshit artist on the other. Which one gives you more possibilities?”

Outside the coasts and major cities, communities around the country have been getting worse for decades. Many can’t even afford to maintain their outdated infrastructure. The main losers in this situation were the center-left and center-right parties, who had unanimously supported Neoliberalism. Under their watch, things got worse and worse for at least half the population, and that half is fed up.

So the fringe parties come to the fore. Despite their differences, their core planks are:

1.) Left Populism: Blame globalization, blame capital.
2.) Right Populism: Blame globalization, blame immigration.

Both sides essentially converged on the same basic program–turn inward, against globalization. They exploit the anger caused by debt and falling living standards, and sometimes they use racism and xenophobia to do it. We’ve seen this before. It’s the world the architects of the post-WW2 order were so desperate to avoid, because they knew where it inevitably led.

“So what you have is a sort of debtor’s revolt against the world we’ve built over the past 30 years which is a creditor’s paradise. So what you see is a left wing expression and a right wing expression, a racist expression and a non-racist expression of fundamental discontent with the way the rewards of the system have been skewed over the past thirty years.”

The key is that there are no real solutions being offered to the above problems by the mainstream political parties, and the people in charge don’t look like they know what they’re doing. Meanwhile, the workers have done everything capital asked of them. They went back to school. They retrained. They took out huge debts for college. They became flexible. But their living standards didn’t budge. Life kept getting harder. Because the mainstream parties offered no alternatives that actually translated into improving the status of anyone besides the top 20 percent, people turned to buffoons and demagogues, and some of them are very ugly indeed. Trump, the alt-right and Brexit are all examples of this trend.

What is the Solution?

Mark Blyth and Steve Keen both suggest possible solutions.

Blyth opposes undoing globalization and turning inward to economic nationalism and tariffs. He notes that the wealthy countries of the West have not had enough children, and that without immigration their economies will shrink.

Instead, he recommends that the state take over the things that have skyrocketed in price: universal health care, universal free education, and free child care. Do those things, he claims, and you nip populism in the bud.

Keen’s prescription is more ambitious. He advocates using the money-issuing power of central banks for a “people’s quantitative easing.” He explains the concept by describing the money-creation powers of currency-issuing governments and how they were used to make the banking sector whole:

“If you owned your own bank, and people accepted checks you wrote on your bank as complete payment of any debt you had, would you feel worried about a large amount of debt? And the answer is fundamentally no, because if you could draw checks on a bank you owned…when as soon as you gave that check to somebody else, they basically wrote off what you owed them, and they then used that themselves to exchange money with other people, you’d be on easy street. The only danger you’d face is creating so much of the stuff that you caused a bubble, that the economy itself fell over because you were importing too much from overseas and the trade balance exploded and so on. That’s the real danger of somebody who owns their own bank spending without limit.”

“But that’s fundamentally the situation that the government is in. Any region where the government produces its own currency, and of course the UK government has the Bank of England which produces Pounds…it can pay its debts with its own bank. Now we’ve put all sorts of legal restrictions on them doing this…”

“When the treasury records that it’s going to, say, spend 50 billion Pounds and it’s going to get 45 billion in tax, it therefore has a 5 billion Pound gap, and it then issues bonds to the equivalent of 5 billion Pounds to pay for that. And let’s say it’s going to pay interest of 10 percent, which is far higher than current levels. So it issues 5.5 billion Pounds of bonds.”

“Currently those bonds have to be sold to the private sector, which means there’s a transfer from the financial sector to the government of money, and then the government spends that money into the economy.”

“And then, of course, they’ve got a debt to the financial sector. But that debt is effectively paid for by the central bank–the Bank of England–crediting the accounts of the institutions that own the bonds they’ve bought off the government. So it’s an accounting operation all the way through.”

“And if the central bank actually bought *all* the bonds outstanding, which is pretty close to what it’s done with QE…We know that in 2010 or 2011, when the Bank of England began Quantitative Easing, the amount of money that it created for bond purchases off private banks and private financial institutions was 200 billion Pounds. Now, did you get your part of the QE tax bill that year? The answer is: no, you didn’t. There was no QE tax. The central bank simply said, ‘we’re going to deposit 200 billion Pounds worth of money in financial institutions’ bank accounts in return for them giving us the ownership of 200 billion Pounds worth of bonds’–whether they’re government bonds or private bonds. So it’s just an accounting operation.”

[…]

“So, what you can do is, as I said, the government pays its own bills with what fundamentally amounts to an accounting operation. QE is fundamentally an accounting operation. And the level of QE, which was running at 200 billion Pounds per year, was on the order of 1/6th or 1/7th the size of the economy per year. That’s the scale at which they can do this.”

“So government money creation could be used for what has been called ‘People’s Quantitative Easing,’ or what I call a Modern Debt Jubilee. Use that money-creation capability to give a per-capita injection to everybody in the country with a bank account. If they are in debt at all, then the money reduces their debt level. If they’re not in debt, they get a cash injection–but that cash injection could also be made conditional on them buying shares from companies that were required to pay their debt levels down. So you could actually use government money creation to rebalance the system from far too much credit-based money and far too little fiat-based money to a more sensible balance of the two.”

“You can do that level of spending politically during something like the Second World War because it’s an existential threat and nobody in their right mind is going to criticize using government money creation to mobilize as many physical resources as possible for a particular objective that virtually everybody in the society supports…in the UK’s case the government’s deficit in the first year of the Second World War was 40 percent of GDP. Nobody stood up in parliament and said ‘we can’t afford this bill because our children will be indebted for the future,’ because somebody on the other side would say, ‘we can’t afford *not* to spend this money, because if we don’t your kids will be speaking German.’”
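
As a rough sketch of the distribution rule Keen is describing–my own illustration, with an invented injection amount, and with his share-purchase condition for the debt-free simplified to a cash payment:

# A "People's QE" / Modern Debt Jubilee rule: everyone gets the same
# per-capita injection; it pays down debt first, and only the remainder
# arrives as cash. All figures are hypothetical.
INJECTION = 10_000

def jubilee(debt: float) -> tuple[float, float]:
    """Return (remaining_debt, cash_received) after the injection."""
    paydown = min(debt, INJECTION)
    return debt - paydown, INJECTION - paydown

for person, debt in [("heavily indebted", 35_000), ("nearly clear", 4_000), ("debt-free", 0)]:
    remaining, cash = jubilee(debt)
    print(f"{person}: debt {debt:,} -> {remaining:,.0f}, cash received {cash:,.0f}")

# heavily indebted: debt 35,000 -> 25,000, cash 0
# nearly clear:     debt  4,000 ->      0, cash 6,000
# debt-free:        debt      0 ->      0, cash 10,000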

Adam Curtis isn’t an economist, but an observer of society. He describes the period we’re in now as “Hypernormalization,” a concept taken from the last days of the Soviet Union: a state in which the politicians are lying, everyone knows they’re lying, the politicians know that everyone knows, and yet everyone goes along with it, because nobody knows how to do anything else besides make-believe. He also points out that finance desires a predictable world, which is fundamentally conservative, and the power of finance means that this is what politicians support. Measures that would shake up the system are therefore less likely to be supported.

The reason there are no mass movements against this situation, Curtis argues, is that the Hippy Movement of the 1970’s bequeathed an attitude that prized individual self-expression–not being told what to do by others–above collective self-sacrifice. This made mass movements impossible, since they require people to subordinate their individuality to the goals of the movement.

Instead, advertising, and later social media platforms, learned how to exploit this desire for self-expression while still finding a way to manage and herd large groups of people for the benefit of elites. They accomplished this through cybernetics:

28:35: “The genius of modern power is that it managed to do what politics failed to do. Politics can’t deal with individualism, because how can you have a political party where everyone wants to be an individual and not be a part of something? What modern managerial systems managed to do was square the circle. Look at modern social media. It manages to allow you to feel that you are totally yourself, expressing yourself online…Yet at the same time you are a component in a series of very complex circuits that is looking at you doing that and saying, ‘Hang on, if he’s doing that, then he’s very much like these people here, which we’ve categorized like that.’ So it can say back to that person in the circuit, ‘Hey, if you’re doing that, would you like this as well?’ and you go, ‘Hmm, all right, because it’s a bit like what you’ve just done.’ And it makes you feel secure within your individuality.”

“So what modern systems of management have increasingly managed to do is accept your individualism and your expressiveness–allow you to feel that you’re being more and more expressive–while at the same time managing you quietly and happily, so that you become part of a very large group that you don’t see, because you’re just a little component in the circuit. But the computers look at you and go, ‘Oh well, there’s about 300 million of those sorts of types; we’ll put them in that group.’ And it’s not a conspiracy–a group of people saying ‘we’ll do this.’ It’s a system that can see, from the information it’s reading from you and lots of other people, the patterns that you are part of, and it says, ‘Well, we’ll fit them all together into that pattern.’”

“And it’s benign in their terms. If you talk to the tech Utopians from Silicon Valley, they will go, ‘This is incredibly efficient.’ And they’re right. It’s an incredibly efficient way of managing the problem that politicians can’t manage, which is our individuality and our desire to be self-expressive. Its problem is that it’s fundamentally conservative, because it’s feeding back to you more of what it knows you like…”

With online systems, the way to get people to participate is to make them outraged enough that they go online and click. This puts us all into little bubbles where we only see ideas we already agree with. And so we get politicians endlessly fanning the flames of the “culture wars” and getting people upset, but nothing substantial ever really changes. The economy just chugs along, making the rich richer and the poor poorer, while people become more and more frustrated because they seem to have so little impact on government. This, he claims, is because they are being micromanaged, in ways they don’t actually see, by tech utopians in the name of efficiency.

Finance, as he points out, wants stability, not change, and in this goal it is assisted by cybernetic control systems such as Google and Facebook. These can keep people forever atomized in their own little groups and prevent fundamental change. Instead, people’s frustrations are channeled into cultural issues, egged on by politicians whose prime goal is not to unite people but to keep them continually divided:

48:28: “I have a very cynical theory about Trump. As politics became…less and less substantial and less and less confidently able to change things, and power shifted away to all sorts of other things that we were participating in…really what people like Trump are is, they’re not politicians, they’re pantomime villains. They’ve turned politics into a vaudeville. And what they do is they come onstage and we go THIS IS OUTRAGEOUS!!! This is absolutely terrible! We type away on social media saying, ‘This is really, really, really bad.’ And…a marketeer for online [advertising] told me once, angry people click more. And clicks are gold dust. And really what those clicks do is feed modern power. And everything stays the same…”

This makes huge profits for media conglomerates and Silicon Valley, but everything stays the same. Until when? How will the system reset itself? When? Can it? Will it take another world war? Or will things just continue to get worse and worse for the majority, forever? Is there any alternative? That, it seems, is a question no one can answer.

Professor Mark Blyth on Post-WW2 Economics and Neoliberalism (YouTube)

Mark Blyth – Why People Vote for Those Who Work Against Their Best Interests (YouTube)

Mark Blyth ─ Global Trumpism (YouTube)

Professor Mark Blyth on Policy Goals, Trump, and China (YouTube)

Adam Curtis – Do We Really Want Change? (Under the Skin)

Burying Neoliberalism Before It Buries Us (Sputnik News)

The Cucumber or the Grape?

…Calvinism is “perhaps the first systematic body of religious teaching which can be said to recognize and applaud the economic virtues.” No longer was “the world of economic motives alien to the life of the spirit.” Here is Zwingli, quoted by Wiskemann, quoted by Richard Tawney: “Labor is a thing so good and godlike…that makes the body hale and strong and cures the sickness produced by idleness…In the things of this life, the laborer is most like to God.”
Adam Smith: Supermoney, pp.137-138

Last time we took a historical survey of how large-scale civilizations were made possible by human slaves, and all of the major forms that it took. We also looked at some of the common misconceptions about how slavery worked in ancient societies.

Today, slavery is essentially illegal everywhere, though underground forms persist (human trafficking, migrant labor), along with legal analogues (incarceration, and forms of indentured servitude such as H-1B visas and student debt).

I’d like to consider this post by Nate Hagens et al., which delves into the psychology of work: Why do we need jobs if we can have slaves working for us? (Cassandra’s Legacy)

Hagens makes a familiar point: much of the “work” performed in modern society is no longer done by flesh-and-blood human and animal slaves but by devices powered by fossil fuels, which he calls “energy slaves.” Some of this work is done by heat engines–internal combustion engines, boilers, turbines spinning electric dynamos–while other tasks are performed through electricity: electric motors, transistors, heat pumps, cybernetic devices and so on. Recall that the ancient world had none of these:

…every American has over 500 invisible energy slaves working 24/7 for them. That is, the labor equivalent of 500 human workers, 24/7, every day of the year, mostly derived from burning fossil carbon and hydrocarbons…

We use the “slave” metaphor because it’s really a very good one, despite its pejorative label. Energy slaves do exactly the sort of things that human slaves and domestic animals previously did: things that fulfilled their masters’ needs and whims. And they do them faster. And cheaper. Indeed, it probably wasn’t a big coincidence that the world (and the USA) got around to freeing most of its human slaves only once industrialization started offering cheaper fossil-slave replacements.

The things we value are created with a combination of human and energy-slave work combined with natural capital (minerals and ores, soils and forests, etc.). There are huge amounts of embedded energy in the creation and operation of something like an iPad and the infrastructure which makes it work…To an ever-increasing degree over the last two centuries, wealth has been created more by fossil slaves than by human labor, significantly more – and it’s at its all-time peak about now…

In fact, we have so much energy that we actually make things expressly designed to be used once and thrown away! Or to fall apart quickly–so-called “planned obsolescence.” People who buy used goods often notice that older products tend to last longer than new ones, and often perform better and more reliably. Recently, Apple admitted that it intentionally slows down older devices–ostensibly to protect aging batteries–which conveniently nudges people toward upgrading.

We increasingly buy disposable everything – used once and tossed away. Most everything is short-life these days; when your authors were young if you bought a fan, you expected it to last 20+ years. Now if it lasts 2-3 before you toss it, that’s about par for the course. Planned obsolescence exists because it’s “good for GDP.” A new dishwasher now lasts 6-8 years when it used to last 12-16, because they now have integrated cheaper electronics that fail.

Our GDP has become tethered to rapid product-replacement cycles keyed to our short attention spans and our enjoyment at buying new things. This creates “jobs” for car salesmen, advertising executives, etc., but has tilted the scales in favor of “useless GDP” rather than real societal utility. We know how to make things with high quality that last, but due to time bias and the financialization of the human experience, such an objective is relatively unimportant in our current culture. Many people get a new phone every 18 months with their cell plan, and perfectly functional ones wind up in the landfills.

After making a good case that our prosperity is actually the result of a massive surplus of energy channeled into heat engines of various types, Hagens and his co-authors consider the concept of “work.” Why, they ask, if so much of the work in our society is performed by energy slaves, do we place such a high value on “work”?

And place a high value on it, we do. In fact, we are well on our way (if not there already) to a society of “total work,” where work encompasses every aspect of our lives and determines our entire value as a human being. Silicon Valley enthusiasts use polyphasic sleeping to reduce their “downtime” (a computer term) to only three hours a night. They scarf down meal replacement shakes and powders to avoid eating so they can spend more time at the office. Family time is seen as “unproductive,” and students labor away at several hours of homework a night. Entry to many professions has less to do with necessary training time than with hazing rituals (e.g. law, medicine). Amazon employees openly weep at their desks and answer emails at three in the morning. We skip vacations for fear of being cast aside, or of being inundated upon our return. We cower under the tyranny of the punch clock and the time sheet. The most admired person in our society is the business executive who sleeps only a few hours a night and arrives at the office by 4 AM, or the Wall Street trader who works past midnight.

We (especially Americans) then castigate anyone not willing or able to embrace this Stakhanovite work ethic as “lazy” and not deserving of any consideration; not even the bare social minimums of survival like healthy food, decent shelter and basic health care.

For upper-middle class men, notes sociologist Michèle Lamont, ambition and a strong work ethic are “doubly sacred…as signals of both moral and socioeconomic purity.” Elite men’s jobs revolve around the work devotion schema, which communicates that high-level professionals should “demonstrate commitment by making work the central focus of their lives” and “manifest singular ‘devotion to work,’ unencumbered with family responsibilities,” to quote sociologist Mary Blair-Loy. This ideal has roots in the 17th century Protestant work ethic, in which work was viewed as a “calling” to serve God and society. The religious connection has vanished…or has it?

Blair-Loy draws parallels between the words bankers used to describe their work — “complete euphoria” or “being totally consumed” — and Emile Durkheim’s classic account of a religious ceremony among Australian natives. “I worshipped my mentor,” said one woman. Work becomes a totalizing experience. “Holidays are a nuisance because you have to stop working,” said one banker interviewed by Blair-Loy. “I remember being really annoyed when it was Thanksgiving. Damn, why did I have to stop working to go eat turkey? I missed my favorite uncle’s funeral, because I had a deposition scheduled that was too important.”

Work devotion marries moral purity with elite status. Way back when I was a visiting professor at Harvard Law School, I used to call it the cult of busy smartness. How do the elite signal to each other how important they are? “I am slammed” is a socially acceptable way of saying “I am important.” Fifty years ago, Americans signaled class by displaying their leisure: think banker’s hours (9 to 3). Today, the elite — journalist Chrystia Freeland calls them “the working rich” — display their extreme schedules.

Why Men Work So Many Hours (Harvard Business Review)

Every moment of our waking lives becomes “work”–from the creation of art, to eating (“still working on that???”), to sex. Everything we do must contribute to the totalitarian productivist ethos of society. Even social maladies such as obesity and mental illness are never condemned as intrinsically bad, only as undesirable to the extent that they “decrease productivity.” We have been reduced to productivist meat-machines, where anyone who does not continually contribute to the maximization of GDP is ruthlessly cast aside as a mere speed-bump on the highway to the Singularity and Martian colonies.

Work-obsessed indeed.

…how, in this world of total work, would people think and sound and act? Everywhere they looked, they would see the pre-employed, employed, post-employed, underemployed and unemployed, and there would be no one uncounted in this census.

Everywhere they would laud and love work, wishing each other the very best for a productive day, opening their eyes to tasks and closing them only to sleep. Everywhere an ethos of hard work would be championed as the means by which success is to be achieved, laziness being deemed the gravest sin…In this world, eating, excreting, resting, having sex, exercising, meditating and commuting – closely monitored and ever-optimised – would all be conducive to good health, which would, in turn, be put in the service of being more and more productive…

Off in corners, rumours would occasionally circulate about death or suicide from overwork, but such faintly sweet susurrus would rightly be regarded as no more than local manifestations of the spirit of total work, for some even as a praiseworthy way of taking work to its logical limit in ultimate sacrifice. In all corners of the world, therefore, people would act in order to complete total work’s deepest longing: to see itself fully manifest.

This world, it turns out, is not a work of science fiction; it is unmistakably close to our own… We are on the verge of total work’s realisation. Each day I speak with people for whom work has come to control their lives, making their world into a task, their thoughts an unspoken burden…

If work dominated your every moment would life be worth living? (Aeon)

Thus, despite all our fossil energy slaves, despite all our labor-saving devices and cybernetic achievements and artificial intelligence and self-driving cars and robots and fully-automated lights-out factories churning out more widgets than can ever be consumed, it seems like we are more “work-obsessed” than ever before in human history! People in past societies worked far less than we do.

And yet it’s difficult to see what much of this extra “work” has really contributed to society:

In 1930, the British economist John Maynard Keynes predicted that, by the end of the century, the average workweek would be about 15 hours. Automation had already begun to replace many jobs by the early 20th century, and Keynes predicted that the trend would accelerate to the point where all that people need for a satisfying life could be produced with a minimum of human labor, whether physical or mental.

Keynes turned out to be right about increased automation…But he was wrong about the decline of work.

As old jobs have been replaced by machines, new jobs have cropped up. Some of these new jobs are direct results of the new technologies and can fairly be said to benefit society in ways beyond just keeping people employed. Information technology jobs are obvious examples, as are jobs catering to newfound realms of amusement, such as computer game design and production.

But we also have an ever-growing number of jobs that seem completely useless or even harmful. As examples, we have administrators and assistant administrators in ever larger numbers shuffling papers that don’t need to be shuffled, corporate lawyers and their staffs helping big companies pay less than their fair share of taxes, countless people in the financial industries doing who knows what mischief, lobbyists using every means possible to further corrupt our politicians, and advertising executives and sales personnel pushing stuff that nobody needs or really wants.

Economists Are Obsessed with “Job Creation.” How About Less Work? (Evonomics)

Anthropologist and activist David Graeber contends that if we consider the economy-wide job profiles of the 1930’s, when Keynes wrote his essay, then we truly have eliminated most of the jobs! That is, thanks to our energy slaves and to dramatic gains in efficiency, we have indeed eliminated most of the human labor from large swaths of the economy.

Conventional economists argue that the economic growth engendered by these changes has created enough new positions to absorb all the labor displaced from the automated and eliminated sectors of the economy, such as manufacturing and agriculture. Furthermore, they argue that the need for labor is essentially unlimited (the belief in a fixed amount of work being the “Lump of Labour” fallacy). Graeber, however, argues that, even in theoretically “efficient” capitalist economies, a good portion of the displaced labor has been absorbed into unnecessary, or even socially harmful, make-work tasks–what he terms “Bullshit Jobs.”

Graeber lists five categories of Bullshit Jobs:

1. Flunkies – People who are there just to make someone else look good.

2. Goons – People whose jobs only exist because their competitors have them as well, such as corporate lawyers, lobbyists, telemarketers, etc. in a sort of zero-sum arms race.

3. Duct Tapers – people paid to continually apply patches to a broken system without fixing the underlying problems which are clearly identifiable. See, for example, the entire American health care system.

4. Box Tickers – people who are there to permit an organization to say they are complying with various rules and regulations that they are not actually complying with.

5. Taskmasters – people who are there to supervise people who don’t need supervision, and to make up new bullshit jobs.

An increasing number of people in capitalist societies are also employed in “guard labor,” that is, in keeping other people in line—police officers, FBI agents, prison guards, security guards, detectives, investigators, and countless other assorted “criminal justice” occupations. Keeping people in line and imprisoning them has been a major source of new jobs. Many more jobs, such as counselors and social workers, have been created to cope with the corrosive effects all this has on the fabric of modern society. We even have people whose full-time job it is to get other people into full-time jobs!

Graeber notes that, had we kept the same job profiles as we had in the 1930’s, we truly could have eliminated most of the jobs! Instead, we have created millions of low-productivity make-work tasks like those he cites above, most of which revolve around useless paper-pushing and professional lunch-eating:

“They could have done that if we’d kept up the same type of job profiles…you look at what jobs existed in the 1930’s. There were a lot of people working in industry, there were a lot of people working in agriculture, there were a lot of domestic servants—all that’s gone. A lot of the domestic servants have been replaced by service jobs.”

“There are a lot less people employed in industry, even if you count places like China where the factories have gone. People think it’s gone to the service sector. But actually, it’s not so much service. What it’s gone to is the administrative/clerical and supervisory sector. If you count service and that together, it’s gone from a quarter of all jobs to seventy-five percent today. So, you have all these people administering other people. And they’re not really doing anything—they’re just pretending.”

“It seems to come from the idea that work is a value in itself…”

BULLSHIT JOBS – David Graeber (YouTube)

Graeber also notes that there seems to be a notion that if you’re getting something meaningful out of what you do for a living (for example, making art or helping others), then you shouldn’t get paid at all–or at least you certainly shouldn’t get paid very much. That is, the knowledge that you’re actually doing something of value has come to be seen as subtracting from the value of the job rather than adding to it! There’s resentment, on an unconscious or sometimes even conscious level, against those who actually do real work, he contends. He cites the resentment against teachers and auto workers receiving high salaries and good benefits, even as bankers, corporate lawyers and middle managers earn much, much more. The reason, he suspects, is that the main tasks of the latter cohort—filling out useless paperwork and attending boring meetings—are so soul-crushingly pointless and awful that we convince ourselves they somehow “deserve” to be paid more money. Notice how business executives, Silicon Valley programmers, and Wall Street bankers constantly tout their endless work hours and personal sacrifices as justification for their outsized paychecks, perks, and golden parachutes, without ever referring to what, if anything, all that excess work actually accomplishes for the benefit of anyone but themselves.

Gray notes that the problem is really not a technological one, but an economic one:

The real problem, of course, is an economic one. We’ve figured out how to reduce the amount of work required to produce everything we need and realistically want, but we haven’t figured out how to distribute those resources except through wages earned from the 40-hour (or more) workweek. In fact, technology has had the effect of concentrating more and more of the wealth in the hands of an ever-smaller percentage of the population, which compounds the distribution problem. Moreover, as a legacy of the industrial revolution, we have a cultural ethos that says people must work for what they get, and so we shun any serious plans for sharing wealth through means other than exchanges for work.

And that last point is the core of the most interesting part of Hagens’ argument. He has already established that in modern industrial societies “work” is primarily performed by energy slaves in one form or another–whether mechanical work or, increasingly, routine intellectual (i.e. non-creative) work. Most of our “jobs” have been purposely routinized and made fungible through “deskilling.” This was done long ago, during the Industrial Revolution, to ensure that labor was easily replaceable, and hence at the mercy of capitalist employers (i.e. the “job creators”). These days, “digital deskilling” is advancing rapidly thanks to complex algorithms.

Hagens et al. contend that work isn’t about accomplishing anything intrinsically useful at all. Rather, it is really about the socially accepted amount of “suffering” we must go through to “earn” our paycheck. There is nothing inherently good about jobs or work per se. They point out that most animals in nature do not seek out extra work, and treat it as something to be avoided if possible:

…if you kick open an anthill or a beehive, the insects will not be grateful for the sudden boost in job creation, and they will effectively utilize the cross-species language of biting and stinging to inform you of this opinion. From this we may infer that insects don’t understand economics…

Many hunter-gatherer societies don’t even have a concept of work:

Some anthropologists have reported that the people they studied didn’t even have a word for work; or, if they had one, it referred to what farmers, or miners, or other non-hunter-gatherers with whom they had contact did. The anthropologist Marshall Sahlins famously referred to hunter-gatherers as comprising the original affluent society—affluent not because they had so much, but because their needs were small and they could satisfy those needs with relatively little effort, so they had lots of time to play.

Hagens argues that the 40-hour-a-week job is simply the rationing mechanism we’ve ended up with for granting people access to the collectively produced wealth of society, including the output of our ubiquitous energy slaves. As the authors put it:

… there are a lot of jobs in the USA, which keep us very busy not making much of anything of long term value.

We do advertising, hairstyling, consulting, writing, and a lot of supervising of the things our fossil slaves do. We don’t care all that much what we’re doing as long as we feel we’re getting paid at least as well for the same task as the other…people around us…

These days in this culture, a “good job” is defined by how much it pays, not by what it accomplishes. Many people would consider it an optimum situation, a great job, to sit in a room for 40 hours per week and make $100,000 per year, just pulling a lever the way a capuchin does for a cucumber slice…

The reference to cucumber slices comes from a famous experiment in which researchers had capuchin monkeys complete a nonsense task in exchange for a food reward. Some monkeys got a cucumber slice, while others got a grape for doing the exact same task:

If you give capuchin monkeys the “job” of doing a nonsense task in exchange for a reward, they will happily do it all day long as long as they keep getting a reward – cucumber slices. But if a capuchin sees the monkey in the next cage get a (better tasting so higher value) grape while it still gets a cucumber slice, it’ll go ape, throwing the cucumber slice in the face of the experimenter in a rage. It gets the same cucumber slice it has been happy to work for before, but it no longer wants it, because it no longer feels fair in comparison to its cage mate’s effort and reward. Instead, it wants the experimenter and the other monkey to be punished for this inequity.

We’ll…refer to the term “capuchin fairness” because a similar mechanism turns out to be behind a great deal of human behavior. We’re outraged at the notion of somebody getting more reward than we do for doing the same thing. Indeed, many large-scale human institutions now stress perceived fairness of process over quality of end results.

A similar mechanism exists among ranked primates like chimpanzees:

…On the flip side, when two unrelated chimps put side by side were presented with a tasty grape and a less tasty carrot, the chimp with the grape sometimes threw it away. “I would say that the most likely cause was either fear of retribution or just general discomfort about being around an individual getting less than you,” says Brosnan. Differences in the social hierarchy also played a role, she says. Dominant chimps were angrier when they were on the receiving end of a lesser reward than those lower in the pecking order.

Why Do You Care About Fairness? Ask A Chimp (NPR)

And in human children too young to have been socialized in the concept of fairness:

A few years ago, a team of psychologists set out to study how kids…would respond to unfairness. They recruited a bunch of preschoolers and grouped them in pairs. The children were offered some blocks to play with and then, after a while, were asked to put them away. As a reward for tidying up, the kids were given stickers.

No matter how much each child had contributed to the cleanup effort, one received four stickers and the other two. According to the Centers for Disease Control and Prevention, children shouldn’t be expected to grasp the idea of counting before the age of four. But even three-year-olds seemed to understand when they’d been screwed. Most of the two-sticker recipients looked enviously at the holdings of their partners. Some said they wanted more. A number of the four-sticker recipients also seemed dismayed by the distribution, or perhaps by their partners’ protests, and handed over some of their winnings…The results, they concluded, show that “the emotional response to unfairness emerges very early.”

The Psychology of Inequality (The New Yorker)

This, Hagens contends, is why we are so all-consumed with “working hard.” It has nothing to do with one’s real social contribution. Instead, he argues, it is rooted in biologically based human social instincts that we share with other large-brained social primates such as chimps, bonobos and monkeys.

In other words, it all has to do with our innate primate sense of fairness. Especially in the United States, we are obsessed with punishing “cheaters” and “scroungers.” We constantly berate the “lazy,” as if all the people living in cars and shelters just collectively decided to stop working one day. Collectively, we behave like crabs in a bucket: everyone must suffer equally.

This is backed up by a recent book by a University of Wisconsin sociology professor, who found that right-wing “blue collar” voters in the Rust Belt are motivated almost entirely by grievance and resentment toward “elites”:

What I heard from my conversations is that, in these three elements of resentment — I’m not getting my fair share of power, stuff or respect — there’s race and economics intertwined in each of those ideas.

When people are talking about those people in the city getting an “unfair share,” there’s certainly a racial component to that. But they’re also talking about people like me [a white, female professor]. They’re asking questions like, how often do I teach, what am I doing driving around the state of Wisconsin when I’m supposed to be working full time in Madison, like, what kind of a job is that, right?

It’s not just resentment toward people of color. It’s resentment toward elites, city people.

And maybe the best way to explain how these things are intertwined is through noticing how much conceptions of hard work and deservingness matter for the way these resentments matter to politics. We know that when people think about their support for policies, a lot of the time what they’re doing is thinking about whether the recipients of these policies are deserving. Those calculations are often intertwined with notions of hard work, because in the American political culture, we tend to equate hard work with deservingness.

Part of it is that the Republican Party over the years has honed its arguments to tap into this resentment. They’re saying: “You’re right, you’re not getting your fair share, and the problem is that it’s all going to the government. So let’s roll government back.” So there’s a little bit of an elite-driven effect here, where people are told: “You are right to be upset. You are right to notice this injustice.”

And a lot of racial stereotypes carry this notion of laziness, so when people are making these judgments about who’s working hard, oftentimes people of color don’t fare well in those judgments. But it’s not just people of color. People are like: Are you sitting behind a desk all day? Well that’s not hard work. Hard work is someone like me — I’m a logger, I get up at 4:30 and break my back. For my entire life that’s what I’m doing. I’m wearing my body out in the process of earning a living.

In my mind, through resentment and these notions of deservingness, that’s where you can see how economic anxiety and racial anxiety are intertwined. Part of where that comes from is just the overarching story that we tell ourselves in the U.S. One of the key stories in our political culture has been the American Dream — the sense that if you work hard, you will get ahead.

Well, holy cow, the people I encountered seem to me to be working extremely hard. I’m with them when they’re getting their coffee before they start their workday at 5:30 a.m. I can see the fatigue in their eyes. And I think the notion that they are not getting what they deserve, it comes from them feeling like they’re struggling. They feel like they’re doing what they were told they needed to do to get ahead. And somehow it’s not enough. Oftentimes in some of these smaller communities, people are in the occupations their parents were in; they’re farmers and loggers. They say, it used to be the case that my dad could do this job and retire at a relatively decent age, and make a decent wage. We had a pretty good quality of life, the community was thriving. Now I’m doing what he did, but my life is really much more difficult. I’m doing what I was told I should do in order to be a good American and get ahead, but I’m not getting what I was told I would get.

A New Theory for Why Trump Voters Are So Angry That Actually Makes Sense (Washington Post)

Trump voters were collectively throwing the cucumber slice back at the researcher.

Hagens contends that the 40-hour week is not some necessary amount of work time for getting the tasks at hand done and keeping society up and running. It is instead a socially acceptable threshold of discomfort that people are expected to endure in order to justify their right to the output of our energy slaves (our grapes and cucumber slices). Its origin is a historical contingency that has nothing to do with productivity or what we actually accomplish; in practice it operates as more of an adult babysitting operation:

And that’s where the perceived equality is: the equality of inconvenience. The 40-hour work week is a social threshold of inconvenience endured, which is now what we keep primary social track of rather than the productive output of a person’s activity…Because socially, everyone who isn’t a criminal is supposed to have a job and endure roughly equivalent inconvenience. Any segment of society which went to a 15-hour work week would be treated as mooching freeloaders, and be pelted by cucumber slices and worse.

In a society in which we’re all basically idle royalty being catered to by fossil slaves, why do we place such a value on “jobs”? Well, partly because it’s how the allocation mechanism evolved, but there also exists considerable resentment against those who don’t work. Think of the vitriol with which people talk about “freeloaders” on society who don’t work a 40-hour week and who take food stamps. The fact is that most of us are freeloaders when it comes down to it, but if we endure 40 hours of inconvenience per week, we meet the social criteria of having earned our banana pellets, even if what we’re doing is stupid and useless, and realized to be stupid and useless. Indeed, a job that’s stupid and useless but pays a lot is highly prized.

So “jobs” per se aren’t intrinsically useful at all… They’re mostly a co-opted, socially-evolved mechanism for wealth distribution and are very little about societal wealth creation. And they function to keep us busy and distract us from huge wealth disparity. We’re too busy making sure our co-workers don’t get grapes to do something as radical as call out and lynch the bankers. Keeping a population distracted may well be necessary to hold a modern nation together.

Why do we need jobs if we can have slaves working for us? (Cassandra’s Legacy)

Finally, in a strange way, it turns out that the old Labor Theory of Value might be correct after all.

The Labor Theory of Value is one of the most controversial ideas in economics. It was an attempt by economists to identify such a thing as “value” and then determine how to quantify it. What makes some things more valuable than others? Many early economists (including both Adam Smith and Karl Marx) thought that the amount of labor that went into producing something determined its value (note: its value, not its price).

While most economists have dismissed this notion, looked at another way it is correct. We can determine the value of something by how long we are willing to work to get it. That is, we will work longer for a grape than for a cucumber slice, and that is how we can determine its value, as Chris Dillow argues:

…I think of major expenses in terms of labour-time because they mean I have to work longer. A trip to the vet is an extra fortnight of work; a good guitar an extra month, a car an extra year, and so on.

When I consider my spending, I ask: what must I give up in order to get that? And the answer is my time and freedom. My labour-time is the measure of value.

This is a reasonable basis for the claim that workers are exploited. To buy a bundle of goods and services, we must work a number of hours a week. But taking all workers together, the hours we work are greater than the hours needed to produce those bundles because we must also work to provide a profit for the capitalist….For Marx, value was socially-necessary labour time…From this perspective, exploitation and alienation are linked. Workers are exploited because they must work longer than necessary to get their consumption bundle. And they are alienated because this work is unsatisfying and a source of unfreedom.

In Defence of the Labour Theory of Value (Stumbling and Mumbling)
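
Dillow’s accounting is easy to make concrete. Here’s a minimal sketch (my own illustration–the wage and prices are invented round numbers, not Dillow’s or Marx’s figures) that restates prices as labour-time and computes Marx’s rate of exploitation, the ratio of surplus labour time to the labour time needed to cover the worker’s own consumption:

```python
# A minimal sketch of Dillow's labour-time accounting.
# All figures below are invented for illustration.

net_hourly_wage = 15.00  # assumed take-home pay per hour

purchases = {
    "vet visit": 1200.00,    # ~80 hours: a fortnight of work
    "good guitar": 2400.00,  # ~160 hours: a month of work
    "used car": 28800.00,    # ~1920 hours: a working year
}

for item, price in purchases.items():
    hours = price / net_hourly_wage
    print(f"{item}: {hours:,.0f} hours of labour-time")

# Marx's rate of exploitation: surplus labour time (s) over
# necessary labour time (v). If 24 hours a week would cover the
# worker's consumption bundle but 40 must be worked, the other
# 16 hours go to someone else's profit.
necessary_hours = 24
worked_hours = 40
surplus_hours = worked_hours - necessary_hours
print(f"rate of exploitation (s/v): {surplus_hours / necessary_hours:.0%}")
```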

Seen from this perspective, the value of something can be determined by the amount of (often socially useless) labor time we must sacrifice in order to get it. And we are very sensitive to others getting value that (we think) they did not deserve. Thus we establish the 40-hour work week as the threshold to ensure fairness of distribution. But is that a good idea? Is it even relevant anymore??? It turns out that there is no connection between the 40-hour work week and productivity. In fact, working longer hours might even make us less productive:

The reason we have eight-hour work days at all was because companies found that cutting employees’ hours had the reverse effect they expected: it upped their productivity. During the Industrial Revolution, 10-to-16-hour days were normal. Ford was the first company to experiment with an eight-hour day – and found its workers were more productive not only per hour, but overall. Within two years, their profit margins doubled.

If eight-hour days are better than 10-hour ones, could even shorter working hours be even better? Perhaps. For people over 40, research found that a 25-hour work week may be optimal for cognition, while when Sweden recently experimented with six-hour work days, it found that employees had better health and productivity.

This seems borne out by how people behave during the working day. One survey of almost 2,000 full-time office workers in the UK found that people were only productive for 2 hours and 53 minutes out of an eight-hour day. The rest of the time was spent checking social media, reading the news, having non-work-related chats with colleagues, eating – and even searching for new jobs.

The compelling case for working a lot less (BBC)

In other words, the exact opposite of the way we’re going.

We know that most employees are disengaged at their jobs, and studies show that most of us only actually “work” for a small portion of the time we are on the clock, with the rest spent socializing, trying to look busy, or goofing off. Yet we must be physically present under some sort of supervision for 40 hours a week minimum to secure our right to our banana pellets. Does any of this make sense? Do any of us really want this? After all, books that promise a four-hour work week are best sellers.

In fact, all the evidence shows that many of us would be more productive if we worked a bit less. In addition, there would be many more jobs to go around:

Even on a global level, there is no clear correlation between a country’s productivity and average working hours. With a 38.6-hour work week, for example, the average US employee works 4.6 hours a week longer than a Norwegian. But by GDP, Norway’s workers contribute the equivalent of $78.70 per hour – compared to the US’s $69.60.

As for Italy, that home of il dolce far niente? With an average 35.5-hour work week, it produces almost 40% more per hour than Turkey, where people work an average of 47.9 hours per week. It even edges the United Kingdom, where people work 36.5 hours.
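
Run the quoted figures through and the point sharpens: despite working 4.6 fewer hours a week, the average Norwegian turns out almost exactly as much output per week as the average American. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Weekly output per worker = hours worked per week x GDP per hour,
# using the BBC figures quoted above.

workers = {
    # country: (hours per week, GDP contributed per hour, in USD)
    "United States": (38.6, 69.60),
    "Norway": (38.6 - 4.6, 78.70),
}

for country, (hours, per_hour) in workers.items():
    print(f"{country}: {hours:.1f} h/week -> ${hours * per_hour:,.0f}/week")

# United States: 38.6 h/week -> $2,687/week
# Norway: 34.0 h/week -> $2,676/week
```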

So why don’t we do that? That’s a story for another time.

The Many Faces of Slavery

“None are so hopelessly enslaved as those who falsely believe they are free.”
–Johann Wolfgang von Goethe

“Wages is a cunning device of the devil, for the benefit of tender consciences, who would retain all of the advantages of the slave system, without the expense, trouble and odium of being slave-holders.”
–Orestes Brownson

The ancient world ran on slave-power. Lacking heat engines and cybernetic devices, the only way to accomplish the many things civilization ran on–agriculture, construction, crafts, child-rearing, military operations, mining, transport, shipping, and so forth–was to use human and animal muscle power. Human labor has five core competencies, according to economist Brad DeLong:

(1) Moving things with large muscles.
(2) Finely manipulating things with small muscles.
(3) Using our hands, mouths, brains, eyes, and ears to ensure that ongoing processes and procedures happen the way that they are supposed to.
(4) Engaging in social reciprocity and negotiation to keep us all pulling in the same direction.
(5) Thinking up new things – activities that produce outcomes that are necessary, convenient, or luxurious – for us to do.

Surveying the ancient world, we see that slaves were the primary method for accomplishing the first two tasks, while the latter three were monopolized by the “educated” elite classes of the ancient world, who were always–and had to be–a minority (including in our world today, which is why “more education” cannot solve inequality). Simply put, no slavery–no civilization, and no state, as James C. Scott writes:

Slavery was not invented by the state…[but]…as with sedentism and the domestication of grain…the early state elaborated and scaled up the institution of slavery as an essential means to maximize its productive population and thus the surplus it could appropriate.

It would be almost impossible to exaggerate the centrality of bondage, in one form or another, in the development of the state until very recently…as late as 1800 roughly three-quarters of the world’s population could be said to be living in bondage…Provided that we keep in mind the various forms bondage can take over time, one is tempted to assert: “No slavery, no state.” Against the Grain (ATG), pp. 155-156

Hence the ancient world had to come up with all sorts of philosophical justifications for slavery. Initially, however, race was not one of them. Anthropologist David Graeber points out that underlying the various justifications for slavery was the idea that the slave would otherwise be dead. Because their lives were spared, they were, in essence, the living dead, kind of like zombies! Because they were socially ‘dead’ as people, they had no rights and could be abused, bought and sold:

Slavery is the ultimate form of being ripped from one’s context, and thus from all the social relationships that make one a human being. Another way to put this is that the slave is, in a very real sense, dead. This was the conclusion of the first scholar to carry out a broad historical survey of the institution, an Egyptian sociologist named Ali ‘Abd al-Wahid Wafi, in Paris in 1931. Everywhere, he observes, from the ancient world to then-present-day South America, one finds the same list of possible ways whereby a free person might be reduced to slavery:

1) By the law of force
a. By surrender or capture in war
b. By being the victim of raiding or kidnapping
2) As legal punishment for crimes (including debt)
3) Through paternal authority (a father’s sale of his children)
4) Through the voluntary sale of one’s self

The book’s most enduring contribution, though, lay simply in asking: What do all these circumstances have in common? Al-Wahid’s answer is striking in its simplicity: one becomes a slave in situations where one would otherwise have died. This is obvious in the case of war: in the ancient world, the victor was assumed to have total power over the vanquished, including their women and children; all of them could be simply massacred. Similarly, he argued, criminals were condemned to slavery only for capital crimes, and those who sold themselves, or their children, normally faced starvation. Debt, the First 5000 Years, (DTF5kY) pp. 168-169

Many of the authors and scholars in Michael Hudson’s ISLET series about the ancient economy argue that slavery played only a subsidiary role in the establishment of early civilizations, and that most of the labor was given voluntarily, as a sort of social obligation, often involving work feasts. James C. Scott disagrees. He sees the existence of compulsory and unfree labor, in whatever form it took, as absolutely essential to the formation of the first states. He writes:

The general consensus has been that while slavery was undoubtedly present, it was a relatively minor component of the overall [Mesopotamian] economy…I would dispute this consensus.

Slavery, while hardly as massively central as in classical Athens, Sparta, or Rome, was crucial for three reasons: it provided the labor for the most important export trade good, textiles; it supplied a disposable proletariat for the most onerous work (for example, canal digging, wall building); and it was both a token of and a reward for elite status…When other forms of unfree labor, such as debt bondage, forced resettlement, and corvee labor, are taken into account, the importance of coerced labor for the maintenance and expansion of the grain-labor module at the core of the state is hard to deny.

Part of the controversy over the centrality of slavery in ancient Sumer is a matter of terminology. Opinions differ in part because there are so many terms that could mean “slave” but could also mean “servant,” “subordinate,” “underling,” or “bondsman.” Nevertheless, scattered instances of purchase and sale of people–chattel slavery–are well attested, though we do not know how common they were. ATG pp. 157-158

Three obvious reasons why Third Millennium Mesopotamia might seem less of a slave-holding society than Athens or Rome are the smaller populations of early polities, the comparatively scarce documentation they left behind, and their relatively small geographical reach. Athens and Rome were formidable naval powers that imported slaves from throughout the known world, drawing virtually all their slave populations far and wide from non-Greek and non-Latin speaking societies. This social and cultural fact provided much of the foundation for the standard association of state peoples with civilization on the one hand and nonstate peoples with barbarism on the other…The greater the cultural and linguistic differences between slaves and their masters, the easier it is to draw and enforce the social and juridical separation that makes for the sharp demarcation typical of slave societies…

Mesopotamian city-states, by contrast, took their captives from much closer to home. For that reason, the captives were more likely to have been culturally aligned with their captors. On this assumption, they might have, if allowed, more quickly assimilated to the culture and mores of their masters and mistresses. In the case of young women and children, often the most prized captives, intermarriage or concubinage may well have served to obscure these social origins within a couple of generations…ATG pp. 174-175

In other words, Greece and Rome captured “barbarians” from outside society and incorporated them as a lower-tier slave strata to do all the stoop labor and scut work. The very word barbarian referred to someone who didn’t speak the Greek language.

In Mesopotamia, by contrast, the warfare was often between rival city-states—people who would have spoken similar languages and shared similar customs and beliefs. Thus, they would have appeared less like a foreign entity in the records and more like just a lower tier of society–their status obscured by cultural similarities and ambiguities in the terminology. Furthermore, this process would have been ongoing, with new layers of war captives being continually added to form the bottom strata of society, eventually “blending in” and “moving up” over time as new immigrants–er, slaves–took their place:

The continuous absorption of slaves at the bottom of the social order can also be seen to play a major role in social stratification–a hallmark of the early state. As earlier captives and their progeny were incorporated into the society, the lower ranks were constantly replenished by new captives, further solidifying the line between “free” subjects and those in bondage, despite its permeability over time. ATG, p. 169

One must surely wonder whether the Mesopotamian city-states met a substantial portion of their insatiable labor needs by absorbing captives or refugees from culturally similar populations. In this case such captives or refugees would probably appear not as slaves but as a special category of “subject” and perhaps would be, in time, wholly assimilated. ATG p. 175

Integrating war captives into society and isolating them from their original ethnic group, rather than making them a permanent class apart, would have forestalled rebellion. Atomized people, without social ties, are much easier to control and cannot mount any sort of collective resistance to the prevailing social order (a point not lost on today’s ruling elites):

Insofar as the captives are seized from scattered locations and backgrounds and are separated from their families, as was usually the case, they are socially demobilized or atomized and therefore easier to control and absorb. If the war captives came from societies that were perceived in most respects as alien to the captors, they were not seen as entitled to the same social consideration. Having, unlike local subjects, few if any local social ties, they were scarcely able to muster any collective opposition…ATG, p. 167…

The principle of socially detached servants–Janissaries, eunuchs, court Jews–has long been seen as a technique for rulers to surround themselves with skilled but politically neutralized staff. At a certain point, however, if the slave population is large, is concentrated, and has ethnic ties, this desired atomization no longer holds. The many slave rebellions in Greece and Rome are symptomatic, although Mesopotamia and Egypt (at least until the New Kingdom) appeared not to have slavery on this scale. ATG pp. 167-168

James Scott considers slavery in ancient Sumeria, Babylonia, Assyria, Egypt and China as a form of manpower recruitment on the part of states–the original “human resources strategy.” Often military incursions were less about seizing territory than about seizing captives–what Max Weber called “booty capitalism.” Frequently, the people seized were those with rare and highly specialized skills that the attacking state did not possess:

Slave taking…represented a kind of raiding and looting of manpower and skills that the slaving state did not have to develop on its own…Women and children were particularly prized as slaves. Women were often taken into households as wives, concubines, or servants, and children were likely to be quickly assimilated, though at an inferior status…Women captives were at least as important for their reproductive services as for their labor. Given the problems of infant and maternal mortality in the early state and the need of both the patriarchal family and the state for agrarian labor, women captives were a demographic dividend. Their reproduction may have played a major role in alleviating the otherwise unhealthy effects of concentration and the domus. ATG, pp. 168-169.

One of the most common forms of slavery was domestic work. A major hallmark of elite status in the ancient world was how many lives you had control over. Large households were typically staffed with huge numbers of domestic servants, who were, in essence, slaves, even if they were not explicitly designated as such in historical records. These domestic servants cooked and cleaned, took care of the children, maintained gardens, bore their masters about in litters, and performed numerous other routine chores–the same chores elites prefer to use trafficked and immigrant labor for even today:

One imagines, as well, that most slaves not put to hard labor were monopolized by the political elites of early states. If the elite households of Greece or Rome are any indication, a large part of their distinction was the impressive array of servants, cooks, artisans, dancers, musicians, and courtesans on display. It would be difficult to imagine the first elaborate social stratification in the earliest states without war-captive slaves at the bottom and elite embellishment, dependent on those slaves, at the top. ATG, p. 169

David Graeber also points out this fact:

…this ability to strip others of their dignity becomes, for the master, the foundation of his honor…there have been places–the Islamic world affords numerous examples–where slaves are not even put to work for profit; instead, rich men make a point of surrounding themselves with battalions of slave retainers simply for reasons of status, as tokens of their magnificence and nothing else. DTF5kY, p. 170

And many forms of slavery were less obvious in the historical record. Scott includes things like forced resettlement, migrant workers, and serfdom as forms of compelled labor which also made civilization possible, but would be less likely to be noticed by archaeologists and economic historians:

In Athens in the fifth century BCE, for example, there was a substantial class, more than 10 percent of the population, of metics, usually translated as “resident aliens.” They were free to live and trade in Athens and had the obligations of citizenship (taxes and conscription, for example) without its privileges. Among them were a substantial number of ex-slaves. ATG p. 175

Finally, there are two forms of communal bondage that were widely practiced in many early states and that bear more than a family resemblance to slavery but are unlikely to appear in the textual record as what we think of as slavery. The first of these might be called mass deportation coupled with communal forced settlement. Our best descriptions of the practice come from the neo-Assyrian Empire (911-609 BCE), where it was…systematically applied to conquered areas. The entire population and livestock of the conquered land were marched from the territory at the periphery of the kingdom to a location closer to the core, where they were forcibly resettled, the people usually as cultivators…In some cases, it seems that the captives were resettled on land abandoned earlier by other subjects, implying that forced mass resettlement may have been part of an effort to compensate for mass exodus or epidemics. ATG pp. 177-178

A final genre of bondage that is historically common and also might not appear in the historical record as slavery is the model of the Spartan helot. The helots were agricultural communities in Laconia and Messenia dominated by Sparta…They remained in situ as whole communities, were annually humiliated in Spartan rituals, and, like the subjects of all archaic agrarian states, were required to deliver grain, oil, and wine to their masters. Aside from the fact that they had not been forcibly resettled as war deportees, they were in all other respects the enserfed agricultural servants of a thoroughly militarized society.

Scott points out that the conquering and subjugation of an existing agricultural society by an incoming martial elite–as seems to have been the case in Sparta–may not technically look like slavery but be similar in most respects. The elites compel the producers to toil on behalf of their overlords. It is, in essence, serfdom. People are tied to a plot of land and obliged to provide food and goods to a militarized aristocracy, which is mass slavery in all but name.

And metics appear to be quite similar to today’s globalized, peripatetic migrant workers, such as the thousands of “second-class citizens” who are the lifeblood of places like Dubai, or who pick the fruits and berries that end up in the supermarkets of Europe and North America. Interestingly, many immigrant workers in France today have embraced the term “metic” (métèque) in reference to themselves.

Slavery was also an early way to punish criminals and enforce justice. The ancient world did not have the resources to feed and shelter large numbers of unproductive people in cages (jails, gaols) as we do today. Dungeons were mainly for holding people who were about to stand trial. To have basic shelter and three square meals a day without having to work would have been quite a luxury in the ancient world–people would have been purposely committing crimes to get it! Fines were not effective in pre-monetized and non-market economies. That’s one reason for the gruesome corporal punishments we see doled out in the ancient world (eye-gouging, flogging, etc.). Making people into slaves took away many of their freedoms, but still compelled them to work on behalf of society–sort of a “work release” program in what was, in essence, an open-air prison. Even in today’s United States, slavery is legal if you are convicted of a crime. We still talk of criminals owing a “debt to society.”

Debt slavery was also often ignored in ancient accounts of slavery. We know that debt bondage became so common and so widespread that leaders had to periodically institute debt annulments in order to keep their societies functioning at all. This could take the form of regular mandated debt jubilees, as in Mesopotamia, or emergency legislative actions like those of Solon the reformer in Athens. As David Graeber says, all ancient populism boils down to one single idea: “cancel the debts and redistribute the land” (i.e., the means of production).

Slavery was also a major barrier to industrialization. In the new novel Kingdom of the Wicked, the author envisages an alternate Rome that has undergone an Industrial Revolution by the time of Christ. Slavery has been abolished, with the Stoics leading the campaign much as the Quakers led abolitionism in Britain. This has allowed a “tinkering culture,” exemplified by Archimedes and Heron, to flower and turn its inventions into true labor-saving devices similar to those of early industrializing England. This is not so far-fetched: we know, for example, that the Romans employed water power on a massive scale for milling grain and manufacturing armaments (for example at Barbegal in modern-day France), and that the earliest factories of the Industrial Revolution (Arkwright’s mills) were water-powered, with fossil fuels coming into use only later, due to wood shortages. The author writes of Roman slavery:

…While Roman-era scientists later developed designs for things like steam engines (Heron of Alexandria) and built fabulous mechanical instruments (the Antikythera machine), they did so in a society that had been flooded with vast numbers of slaves (the late Republic and early Principate), and large-scale chattel slavery and industrialization are simply incompatible.

Chattel slavery undermines incentives to develop labour-saving devices because human labour power never loses its comparative advantage. People can just go out and buy another slave to do that labour-intensive job. Among other things, the industrial revolution in Britain depended on the presence (relative to other countries) of high wages, thus making the development of labour-saving devices worthwhile. The Antikythera mechanism is a thing of wonder, but it is also little more than a clockwork executive toy; no wonder the great lawyer Cicero had one. It’s just the sort of thing I can imagine an eminent [attorney] having on his desk in chambers.

Slavery—and its near relative, serfdom—have been pervasive in even sophisticated human societies, and campaigns for abolition few and far between. We forget that our view that slavery and slavers are obnoxious is of recent vintage. In days gone by, people who made fortunes in the slave trade were held in the highest esteem and sat on our great councils of state. This truism is reflected in history: The Society for Effecting the Abolition of the Slave Trade met for the first time in 1787. It had just twelve members—nine Quakers and three Anglicans…

…we know that the Romans didn’t think some people were ‘naturally servile’, which is at the heart of Aristotle’s argument in favour of slavery. The Roman view (consistent with their militarism) was always ‘you lost, we own you’. Roman law—even in its earliest form—always held that slavery was ‘contrary to nature’. Human beings were naturally free; slavery was a legally mandated status, however cruelly imposed.

It is also important to remember that ancient slavery was never race-based. No Roman argued that slaves were lesser forms of the human. Middle- and upper-class Roman children educated at home by slaves who were manifestly cleverer than them (not to mention mum and dad) knew this, intimately.

Author’s Note-Kingdom of the Wicked (Libertarianism.org)

The Romans explicitly defined the lack of freedom implied by slavery as “contrary to nature” in their legal codes!

In fact, even when labor-saving devices were invented in the ancient world, they were often intentionally ignored or neglected in order to ensure that the large amounts of human labor available to elites would have some way to be utilized:

[W]hen Vespasian was offered a labor-saving machine for transporting heavy columns, he was said to have declined with the words: “I must always ensure that the working classes earn enough money to buy themselves food.”

Emperor Vespasian has a Solution for Unemployment (Resilience.org)

Not only was race or ethnic origin not a factor in Roman slavery; the ancient Romans did not regard slaves as inherently inferior in any way! In fact, they knew that slaves might even be more talented than their masters! There was no racial segregation or racial hierarchy; slavery was simply a social construct not based on any notion of superiority or racism, unlike in North America (as was its flip side, “citizenship”). This colors our view of ancient slavery. It also blinds us to the reality and essential role of slavery and bondage in human history. We are used to regarding slaves as “naturally” inferior due to the racist views used in America to justify the institution. A racial hierarchy was established in the American South after Bacon’s Rebellion to make sure that poor whites and blacks would not unite against their rulers–another example of divide-and-rule atomization.

The problem for any culture that wants to spend time on literature, art, philosophy and science, is [that] somebody’s got to do the laundry. And so what we’ve done is, we have a washing machine. If we didn’t have a washing machine, my guess is, all over California there would be a lot more jobs at the lowest end–of people doing laundry. Just as the Chinese who entered California as basically indentured railway workers, they began to set up what we call Chinese laundries and Chinese restaurants. These are all low-skilled, high work.

Well, the Greeks; some of the cities–the ones that we admire like Athens–they had slaves because that was the way you got things done. They didn’t feel that slaves were inferior people. They just happened to be people often captured in war. We forget that the word slave comes from the word Slav. The slaves come out of Russia into Europe through the Middle Ages. All the Middle Ages were full of slaves.

The American slave experience was peculiar in that it was having people really who were not of their own culture; not of their own civilization. If you think about it, you’re in a Greek family and who’s the nursemaid for the children? Well, she has to be somebody who’s going to speak their language, and is going to be giving them the cultural values.

Anyone who lost in war…they were just people who lost; when you lost you got killed or be made a slave and most people given the choice thought, “well I’d rather try living and see how that works out.”

Tangentially Speaking – Jim Fadiman 57:10 – 59:35

David Graeber makes the same point regarding Roman slavery:

What made Roman slavery so unusual, in historical terms, was a conjuncture of two factors. One was its very arbitrariness. In dramatic contrast with, say plantation slavery in the Americas, there was no sense that certain people were naturally inferior and therefore destined to be slaves. Instead, slavery was seen as a misfortune that could happen to anyone. As a result, there was no reason that a slave might not be in every way superior to his or her master: smarter, with a finer sense of morality, better taste, and a greater understanding of philosophy. The master might even be willing to acknowledge this. There was no reason not to, since it had no effect on the nature of the relationship, which was simply one of power. The second was the absolute nature of this power… DTF5kY, p. 202

Indeed, H.G. Wells felt that the vast importation of slaves after the Second Punic War was the final “nail in the coffin” for the Roman yeoman class. As Roman society was flooded with slaves from military expansion, the price of slaves dropped dramatically. It then became cost-effective to buy large numbers of slaves and work them to death on large plantations, meaning that ordinary family farms could not compete in what was effectively an early “free market” economy. Cheap slaves allowed unprecedented concentration of wealth in fewer and fewer hands.

It doesn’t really feel like they could’ve arrested the process. Fifteen years after some land bill, you’d ask, “Who has the land? The poor?” No, they all just got bought up again. There never was a good political solution to it. The problem of these small citizen farmers was not solved until 100 years later when they simply ceased to exist.

Before the Fall of the Roman Republic, Income Inequality and Xenophobia Threatened Its Foundations (Smithsonian)

So slavery appears not to have been “race based” in most ancient societies, which is what makes the American experience so unique. Apart from places like plantations, mines and quarries, most slaves were probably indistinguishable from people around them. They went off to work every day just like everybody else. Again, slavery was a legal distinction more than anything else.

It’s also essential to keep in mind that our vision of slavery as constant beatings and starvation has drastically colored our view. This, again, is a legacy of North American racially-based plantation slavery. In reality, slaves were an investment, and whipped and starving people hardly made the best workers. The cruelty of plantation slavery was highlighted and emphasized in written accounts, both by ex-slaves and abolitionists, in order to turn people against it; it was probably not as brutal as it is often depicted. To be crystal clear here, this is not a justification for slavery!!! But the focus on cruelty makes us overlook slavery in the ancient world, where it was more of a social/economic status than a racial one. In fact, most slavery looked indistinguishable from the routine of the average wage worker today!

John Moes, a historian of slavery…writes about how the slavery we are most familiar with – that of the antebellum South – is a historical aberration and probably economically inefficient. In most past forms of slavery – especially those of the ancient world – it was common for slaves to be paid wages, treated well, and often given their freedom.

He argues that this was the result of rational economic calculation. You can incentivize slaves through the carrot or the stick, and the stick isn’t very good. You can’t watch slaves all the time, and it’s really hard to tell whether a slave is slacking off or not (or even whether, given a little more whipping, he might be able to work even harder). If you want your slaves to do anything more complicated than pick cotton, you run into some serious monitoring problems – how do you profit from an enslaved philosopher? Whip him really hard until he elucidates a theory of The Good that you can sell books about?

The ancient solution to the problem…was to tell the slave to go do whatever he wanted and found most profitable, then split the profits with him. Sometimes the slave would work a job at your workshop and you would pay him wages based on how well he did. Other times the slave would go off and make his way in the world and send you some of what he earned. Still other times, you would set a price for the slave’s freedom, and the slave would go and work and eventually come up with the money and free himself.

Moes goes even further and says that these systems were so profitable that there were constant smouldering attempts to try this sort of thing in the American South. The reason they stuck with the whips-and-chains method owed less to economic considerations and more to racist government officials cracking down on lucrative but not-exactly-white-supremacy-promoting attempts to free slaves and have them go into business.

So in this case, a race to the bottom where competing plantations become crueler and crueler to their slaves in order to maximize competitiveness is halted by the physical limitation of cruelty not helping after a certain point…

Meditations on Moloch (Slate Star Codex)

Moes argues that the reason slavery declined in ancient Rome was not because slaves were treated so cruelly that they could not reproduce themselves (whips and chains), but as a result of widespread manumission. They were freed. Slaves often cut deals to buy their freedom by entering into business arrangements with their owners. Oftentimes, they would split the profits:

Profitable deals could be made with the slave or with the freedman, who could be and usually was obligated to render services to his former master. A freedman often continued in the same employment or else was set up in business with funds supplied by the master, or, on the land, was given part of the estate to work as a tenant. Hence the slave in fact bought his own freedom, either by being given the opportunity to accumulate savings of his own, the “peculium,” or afterward as a freedman, having received his freedom, so to speak, on credit.

This system was to the advantage of the owner because it gave the slave an incentive to work well and in general to make himself agreeable to his master. Thus, while the owner did not (immediately) appropriate the entire surplus that the slave earned over and above the cost of his maintenance, he still got great economic benefits in the long run… the most highly valued slaves were the most likely to be freed, for the full benefit of their talents could not be obtained under the whiplash but only by giving them the positive incentive of immediate or ultimate freedom.

Seen from this perspective, the difference between the plight of a slave and that of the average modern American worker becomes awfully difficult to define. Of course, if you dare broach this topic, you are immediately confronted with opprobrium–how dare you! This is a legacy of the horrors of race-based plantation slavery, of which we are constantly reminded. But, historically, slavery had nothing to do with racism or (direct) violence!

No, slaves were simply the people who had to labor above and beyond what they wished to in order to produce a surplus for someone else. They also had no control over their work circumstances. They had to do what their master told them to do, for the amount of time he told them to do it, in the place where he told them to do it, and the way he told them to do it. And the slave only kept a portion of what they produced, with the lion’s share going to his or her master.  That doesn’t sound all that different from the situation of the average worker today, now does it? The ancients were aware of this. Cicero wrote:

“…vulgar are the means of livelihood of all hired workmen whom we pay for mere manual labor, not for artistic skill; for in their case the very wage they receive is a pledge of their slavery.”

Thus wage slavery is simply another type of slavery, and not as distinct from its ancient counterpart as we have been led to believe. True, we aren’t regularly starved and beaten. Yes, we can find a different patron–er–employer. But we are just a human resource. We make profits for others. We don’t have control over our workplace. When you understand that, by and large, ancient slavery had nothing to do with racial inferiority–actual or perceived–or outright violence, and was just an economic category of individuals, you can understand why this is the case.

And consider this: how could our modern society function without the massive tier of low-paid workers? In fact, the people who get paid the least are the most essential to society’s everyday functioning, as David Graeber has pointed out. They do the non-automated agricultural work. They pick our fruits and vegetables. They cook and prepare our food. They look after our children and take care of our elderly. They teach our children. They drive our cars and trucks. They maintain our lawns and gardens. They build and maintain our infrastructure. They construct our buildings. They keep our shelves stocked with goods and deliver them to our doorstep. Not all of these are minimum-wage workers, but an increasing number of them are! If they all vanished, society would grind to an immediate halt. Yet just three people “own” as much wealth as the bottom half of all Americans!

The difference is that wage slaves are rented instead of owned. We are continually compelled by the invisible whip and the lash of utter poverty and destitution.

Today’s college system is virtually indistinguishable from indentured servitude. In fact, I would argue that it’s worse! With indentured servitude, it’s true that you could not leave your employer and “shop around” for another one. But if you went into debt, you were guaranteed gainful employment for the duration of the loan–something today’s college students would kill for! Instead, they are expected to go deeply into debt for the mere chance of finding employment in their chosen field, which, more often than not, they don’t. Sometimes they must even labor for free to get certain jobs (unpaid internships). And student debt, unlike other debt, cannot be discharged in bankruptcy. What, then, really is the difference between it and debt bondage??? H1-B visas are a similar scam, where imported workers often work for less than their native-born counterparts and cannot easily leave their employer (i.e., sponsor) to seek out other work.
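
To put the “duration of the loan” comparison in perspective, here’s a toy amortization (all numbers are invented but plausible round figures, not anyone’s actual loan terms) showing how long a typical graduate spends working off the debt:

```python
# A toy student-loan amortization. All figures are invented
# for illustration, not actual loan terms.

principal = 35_000.00     # assumed balance at graduation
annual_rate = 0.06        # assumed interest rate
monthly_payment = 400.00  # assumed fixed monthly payment

monthly_rate = annual_rate / 12
balance = principal
months = 0
while balance > 0:
    # accrue a month of interest, then make one payment
    balance = balance * (1 + monthly_rate) - monthly_payment
    months += 1

print(f"paid off in {months / 12:.1f} years")
# ~9.7 years: right in line with the five-, seven-, and
# ten-year indenture terms Graeber describes below.
```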

And now, we are constantly informed that we must “love our jobs,” to the extent that we will even work for free for the privilege! Employers depict themselves as a paternalistic “family” (albeit one that you can be removed from at any time and for any reason). It’s a sort of Stockholm Syndrome on a societal scale. Today, we are totally defined by our work. It forms the core of our identity (“so, what do you do..?”). We are informed from birth that we must “like what we do.” We no longer even think of our bondage as bondage! We are totally brainwashed to love our captivity and identify with our captors–the ultimate victory of tyranny over freedom. As Henry David Thoreau wrote:

“[i]t is hard to have a Southern overseer; it is worse to have a Northern one; but worst of all when you are the slave-driver of yourself.”

So, when you take all this into consideration, clearly civilization has always run on compelled labor of one form or another. It cannot be any other way. Corvee labor, forced resettlement, military drafts, tributary labor, convict labor, serfdom, migrant and trafficked labor, debt peonage and indentured servitude have all existed alongside chattel slavery since the beginnings of civilization. Freedom is just an illusion:

It is the secret scandal of capitalism that at no point has it been organized primarily around free labor. The conquest of the Americas began with mass enslavement, then gradually settled into various forms of debt peonage, African slavery, and “indentured service”–that is, the use of contract labor, workers who had received cash in advance and were thus bound for five-, seven-, or ten-year terms to pay it back. Needless to say, indentured servants were recruited largely from among people who were already debtors. In the 1600s there were at times almost as many white debtors as African slaves working in southern plantations, and legally they were at first in almost the same situation, since in the beginning, plantation societies were working within a European legal tradition that assumed slavery did not exist, so even Africans in the Carolinas were classified as contract laborers. Of course this later changed when the idea of “race” was introduced.

When African slaves were freed, they were replaced, on plantations from Barbados to Mauritius, with contract laborers again: though now ones recruited mainly in India or China. Chinese contract laborers built the North American railroad system, and Indian “coolies” built the South African mines. The peasants of Russia and Poland, who had been free landholders in the Middle Ages, were only made serfs at the dawn of capitalism, when their lords began to sell grain on the new world market to feed the new industrial cities to the west. Colonial regimes in Africa and Southeast Asia regularly demanded forced labor from their conquered subjects, or, alternately, created tax systems designed to force the population into the labor market through debt. British overlords in India, starting with the East India Company but continuing under Her Majesty’s government, institutionalized debt peonage as their primary means of creating products for sale abroad.

This is a scandal not just because the system occasionally goes haywire… but because it plays havoc with our most cherished assumptions about what capitalism really is–particularly that, in its basic nature, capitalism has something to do with freedom. For the capitalists, this means the freedom of the marketplace. For most workers, it means free labor. DTF5kY, pp. 350-351

Today, living in a high-tech age of fossil fuels and automation, why have our “energy slaves” not liberated us from this burden? We’ll consider that next time.

BONUS: Ellen Brown (Web of Debt) has an interesting piece on student loan debt slavery over at Truthdig:

The advantages of slavery by debt over “chattel” slavery—ownership of humans as a property right—were set out in an infamous document called the Hazard Circular, reportedly circulated by British banking interests among their American banking counterparts during the American Civil War. It read in part:

“Slavery is likely to be abolished by the war power and chattel slavery destroyed. This, I and my European friends are glad of, for slavery is but the owning of labor and carries with it the care of the laborers, while the European plan, led by England, is that capital shall control labor by controlling wages.”

Slaves had to be housed, fed and cared for. “Free” men housed and fed themselves. For the more dangerous jobs, such as mining, Irish immigrants were used rather than black slaves, because the Irish were expendable. Free men could be kept enslaved by debt, by paying wages insufficient to meet their costs of living. The Hazard Circular explained how to control wages:

“This can be done by controlling the money. The great debt that capitalists will see to it is made out of the war, must be used as a means to control the volume of money. … It will not do to allow the greenback, as it is called, to circulate as money any length of time, as we cannot control that.”

The government, too, had to be enslaved by debt…

Student Debt Slavery: Bankrolling Financiers on the Backs of the Young (Truthdig)

The Scars of the Past

There’s been an explosion in scholarship pinning the collapse of societies on both outbreaks of disease and natural variations in climate. James Scott dedicates a good portion of Against the Grain to considering the fragility of early states. As he points out, when it comes to the formation of complex state societies, the question isn’t so much “what took so long” as “how could this even happen at all?”  People don’t inherently want to be controlled or dominated by a sociopathic oligarchy, so why did they “bend the knee,” and remain kneeling ever since?

And, in fact, rather than the direct, steady progression to larger and more complex societies depicted by old narratives of history (the “progress” narrative), what we see is states rising and falling. The idea that “bigger is better” is not in evidence from the standpoint of the average peasant living in these cultures. As Scott points out at length, states are fragile things, prone to undermining their own existence through various factors. We see this trend even today in the active secession movements in Catalonia, Scotland, and the United States, and in the criticisms of the European Union and “free trade.”

Ancient Egypt may have fallen in part because of riots caused by climate change and volcanoes, according to a new paper. The new study paints a picture of the ancient civilisation riven by droughts and disasters. It looked at the impact of the severe events of ancient Egypt, finding that they caused stress on its economy and ability to fight wars.

The Nile was incredibly important for the ancient Egyptians of Ptolemaic Egypt, between 350 and 30 BC. Each year monsoon rainfall brought summer flooding that helped grow crops to support the society. When those crops failed, societal unrest would ensue, according to detailed reports at the time.

Until now, researchers haven’t known what caused those strange but important floods. They now propose they were the result of volcanic activity – which in turn would have altered the climate and brought about disruption to the most central parts of society.

“Ancient Egyptians depended almost exclusively on Nile summer flooding brought by the summer monsoon in east Africa to grow their crops,” said Joseph Manning, lead author on the paper and the William K & Marilyn Milton Simpson professor of history and classics at Yale, in a statement.

“In years influenced by volcanic eruptions, Nile flooding was generally diminished, leading to social stress that could trigger unrest and have other political and economic consequences.”

Ancient Egypt may have been brought down by volcanoes and climate change, researchers say (The Independent)

What we are learning, principally from pathogen genomics, is that the fall of the Roman Empire may have been a biological phenomenon.

The most devastating enemy the Romans ever faced was Yersinia pestis, the bacterium that causes bubonic plague and that has been the agent of three historic pandemics, including the medieval Black Death. The first pandemic interrupted a remarkable renaissance of Roman power under the energetic leadership of the emperor Justinian. In the course of three years, this disease snaked its way across the empire and carried off perhaps 30 million souls. The career of the disease in the capital is vividly described by contemporaries, who believed they were witnessing the apocalyptic “wine-press of God’s wrath,” in the form of the huge military towers filled with piles of purulent corpses. The Roman renaissance was stopped dead in its tracks; state failure and economic stagnation ensued, from which the Romans never recovered.

Recently the actual DNA of Yersinia pestis has been recovered from multiple victims of the Roman pandemic. And the lessons are profound…

Was the fall of Rome a biological phenomenon? (Los Angeles Times)

The winter seasonality of the Plague of Cyprian points to a germ that thrived on close interpersonal contact and direct transmission. The position of the Roman Empire astride some of the major flyways of migratory birds, and the intense cultivation of pigs and domestic fowl such as chickens and ducks, put the Romans at risk. Climate perturbations can subtly redirect the migratory routes of wild waterfowl, and the strong oscillations of the AD 240s could well have provided the environmental nudge for an unfamiliar zoonotic pathogen to find its way into new territory. The flu is a possible agent of the pestilence.

A second and more probable identification of the Plague of Cyprian is a viral hemorrhagic fever. The pestilence manifested itself as an acute-onset disease with burning fever and severe gastrointestinal disorder, and its symptoms included conjunctival bleeding, bloody stool, esophageal lesions, and tissue death in the extremities. These signs fit the course of an infection caused by a virus that induces a fulminant hemorrhagic fever.

Church Records Could Identify an Ancient Roman Plague (The Atlantic)

1. During the reign of Marcus Aurelius, a pandemic “interrupted the economic and demographic expansion” of the empire.

2. In the middle of the third century, a mix of drought, pestilence, and political challenge “led to the sudden disintegration of the empire.” The empire however was willfully rebuilt, with a new emperor, new system of government, and in due time a new religion.

3. The coherence of this new empire was broken in the late fourth and early fifth centuries. “The entire weight of the Eurasian steppe seemed to lean, in new and unsustainable ways, against the edifice of Roman power…and…the western half of the empire buckled.”

4. In the east there was a resurgent Roman Empire, but this was “violently halted by one of the worst environmental catastrophes in recorded history — the double blow of bubonic plague and a little ice age.”

The Fate of Rome (Marginal Revolution)

Explanations for a phenomenon of this magnitude [Rome’s collapse] abound: in 1984, the German classicist Alexander Demandt catalogued more than 200 hypotheses. Most scholars have looked to the internal political dynamics of the imperial system or the shifting geopolitical context of an empire whose neighbours gradually caught up in the sophistication of their military and political technologies. But new evidence has started to unveil the crucial role played by changes in the natural environment. The paradoxes of social development, and the inherent unpredictability of nature, worked in concert to bring about Rome’s demise…

It turns out that climate had a major role in the rise and fall of Roman civilisation. The empire-builders benefitted from impeccable timing: the characteristic warm, wet and stable weather was conducive to economic productivity in an agrarian society. The benefits of economic growth supported the political and social bargains by which the Roman empire controlled its vast territory. The favourable climate, in ways subtle and profound, was baked into the empire’s innermost structure.

The end of this lucky climate regime did not immediately, or in any simple deterministic sense, spell the doom of Rome. Rather, a less favourable climate undermined its power just when the empire was imperilled by more dangerous enemies – Germans, Persians – from without. Climate instability peaked in the sixth century, during the reign of Justinian. Work by dendro-chronologists and ice-core experts points to an enormous spasm of volcanic activity in the 530s and 540s CE, unlike anything else in the past few thousand years. This violent sequence of eruptions triggered what is now called the ‘Late Antique Little Ice Age’, when much colder temperatures endured for at least 150 years. This phase of climate deterioration had decisive effects in Rome’s unravelling. It was also intimately linked to a catastrophe of even greater moment: the outbreak of the first pandemic of bubonic plague.

How climate change and disease helped the fall of Rome (Aeon)

Wealth inequality has been increasing for millennia (The Economist)

Where hunter-gatherers saw themselves simply as part of an inherently productive environment, farmers regarded their environment as something to manipulate, tame and control. But as any farmer will tell you, bending an environment to your will requires a lot of work. The productivity of a patch of land is directly proportional to the amount of energy you put into it.

This principle that hard work is a virtue, and its corollary that individual wealth is a reflection of merit, is perhaps the most obvious of the agricultural revolution’s many social, economic and cultural legacies.

The acceptance of the link between hard work and prosperity played a profound role in reshaping human destiny. In particular, the ability to both generate and control the distribution of surpluses became a path to power and influence. This laid the foundations for all the key elements of our contemporary economies, and cemented our preoccupation with growth, productivity and trade.

Regular surpluses enabled a much greater degree of role differentiation within farming societies, creating space for less immediately productive roles. Initially these would have been agriculture-related (toolmakers, builders and butchers), but over time new roles emerged: priests to pray for good rains; fighters to protect farmers from wild animals and rivals; politicians to transform economic power into social capital.

How neolithic farming sowed the seeds of modern inequality (The Guardian)

Scientists have traced the rise of the super-rich deep into our historical past to uncover the ancient source of social inequality. Their conclusion? Thousands of years ago, it was the use of large farm animals – horses and oxen that could pull ploughs – which created the equivalent of our multi-billionaire entrepreneurs today.

It was only with the domestication of cattle and horses – sometimes thousands of years after land cultivation had begun – that serious divisions between societies’ haves and have-nots began to emerge, eventually creating the ancient equivalent of today’s island-owning, jet-setting billionaires...

Super-rich shown to have grown out of ancient farming (The Guardian)

Not only was prehistory more equal, but people were physically stronger too:

Prehistoric women had stronger arms than today’s elite rowing crews (Phys.org)

Unearthing a masterpiece (University of Cincinnati)

Q: What inspired you to look into this story?

A: When I was doing the History of Rome [podcast], so many people asked me, ‘Is the United States Rome? Are we following a similar trajectory?’ If you start to do some comparisons between the rise and development of the U.S. and rise and development of Rome, you do wind up in this same place. The United States emerging from the Cold War has some analogous parts to where Rome was after they defeated Carthage [in 146 B.C.]. This period was a wide-open field to fill a gap in our knowledge.

Q: One topic you describe at length is economic inequality between citizens of Rome. How did that come about?

A: After Rome conquers Carthage, and after they decide to annex Greece, and after they conquer Spain and acquire all the silver mines, you have wealth on an unprecedented scale coming into Rome. The flood of wealth was making the richest of the rich Romans wealthier than would’ve been imaginable even a couple generations earlier. You’re talking literally 300,000 gold pieces coming back with the Legions. All of this is being concentrated in the hands of the senatorial elite, they’re the consuls and the generals, so they think it’s natural that it all accumulates in their hands.

At the same time, these wars of conquest were making the poor quite a bit poorer. Roman citizens were being hauled off to Spain or Greece, leaving for tours that would go on for three to five years a stretch. While they were gone, their farms in Italy would fall into disrepair. The rich started buying up big plots of land. In the 130s and 140s you have this process of dispossession, where the poorer Romans are being bought out and are no longer small citizen owners. They’re going to be tenant owners or sharecroppers and it has a really corrosive effect on the traditional ways of economic life and political life. As a result, you see this skyrocketing economic inequality…

Q: Do you see parallels between land ownership in Rome and in the modern United States?

A: In the Roman experience, this is the beginning of a 100-year-long process of Italy going from being a patchwork of smaller farms with some large estates to nothing but sprawling, commercially-oriented estates. And yes, the United States is continuing to go through a very similar process. At the founding of our republic, everybody’s a farmer, and now everything is owned by what, Monsanto?

Moving beyond just strictly agricultural companies, large American corporations are now employing more and more people. There seems to be this move away from people owning and operating their own establishments, and they’re instead being consumed by large entities. You’re talking about the Amazons of the world swallowing up so much of the market share, it just doesn’t pay to be a clerk in a bookstore or own a bookstore, you end up being a guy working in a warehouse, and it’s not as good of a job.

Before the Fall of the Roman Republic, Income Inequality and Xenophobia Threatened Its Foundations (Smithsonian)

I’ve mentioned previously that the transformation of land and labor into commodities which could be bought and sold was critical to the establishment of capitalist market economies (along with the extensive monetization of the economy by the state).

Prior to the market economy, most land was distributed through feudal relations and was not simply something to be bought and sold like a waistcoat or a side of beef. Land ownership and tenure were woven into the social fabric. In England (as in much of Western Europe), much of the country’s land was held by the Catholic Church. When Henry VIII broke with the Catholic Church, he seized monastic lands and eventually sold them off. This created a market for land that had not existed before, and which was unique to Britain. This may have been the key event in turning land into a marketable commodity, a crucial step in the development of market capitalism. As Polanyi put it:

Production is interaction of man and nature; if this process is to be organized through a self-regulating mechanism of barter and exchange, then man and nature must be brought into its orbit; they must be subject to supply and demand, that is, be dealt with as commodities, as goods produced for sale.

Such precisely was the arrangement under a market system. Man under the name of labor, nature under the name of land, were made available for sale; the use of labor power could be universally bought and sold at a price called wages, and the use of land could be negotiated for a price called rent. There was a market in labor as well as in land, and supply and demand in either was regulated by the height of wages and rents, respectively; the fiction that labor and land were produced for sale was consistently upheld. Capital invested in the various combinations of labor and land could thus flow from one branch of production to another, as was required for an automatic levelling of earnings in the various branches.

Previous scholarship has argued that the demographic disaster after the Black Death caused a shortage of labor and led to the demise of the feudal system. Flight into the cities would also have contributed to wage labor taking the place of status relations as the main form of contract. Together with the establishment of a market for land, these forces may have driven the transformation of labor and land into saleable commodities, a necessary step toward the market economy. This paper argues that the places where land was heavily commoditized after the dissolution of the monasteries correlate with the places where the Industrial Revolution first took off. To my knowledge, this historical connection was never explored by Polanyi himself, but it provides an interesting addendum to his argument that universal markets are created by top-down state power and authority. Fascinating stuff:

In 1534, Henry VIII decided to break with the Catholic Church. In addition to severing ties with Rome, Henry appropriated all taxes that monasteries, churches and other religious institutions paid to the Pope. When his financing needs – due to wars in France – became too great, he expropriated all monasteries in England, which collectively held about one third of all land in the country (Youings 1967). When the management of these vast properties turned out to outstrip the bureaucratic capacity of his government, Henry sold all monastic assets in England. The main effect of this dumping of land was the creation of local land markets. Where lands were before held in long leases whose rates were set by medieval custom, lands now changed hands frequently and at market rates. In a few years between 1535 and 1542, the majority of monastic land was sold. Since monastic holdings were often ancient and were spread out unevenly throughout England, villages were differentially impacted by this shock. Some villages had no monastic assets in them (monasteries often owned land far away from their physical buildings) whereas in others, a local – or distant – monastery may have held large tracts of land. We hypothesise that the creation of a land market can be linked to local differences in subsequent development and, ultimately, industrialisation.

The origins of the Industrial Revolution (VoxEU)

It’s notable that this event did not take place in France, or anywhere else in Western Europe! Is this why France lagged in the race to industrialize? The lands of the Church in France were, in fact, eventually seized and sold off, as in England. But this took place only in the aftermath of the French Revolution, centuries later.

And where it did take place, it seems to have had a similar effect as in England centuries earlier: higher agricultural productivity and more agricultural investment:

The law passed by the French Constituent Assembly on 2 November 1789 confiscated all Church property and redistributed it by auction. Over the next five years, more than 700,000 ecclesiastical properties – about 6.5% of French territory – were sold…We find that in regions where more church land was auctioned off, land inequality was higher in the 19th century. Further, we show that this wealth imbalance was associated with higher levels of agricultural productivity and agricultural investments by the mid-19th century. Specifically, a district with 10% more Church land redistributed had 25% higher productivity in wheat production, about 1.6 more pipe manufacturers (used for drainage and irrigation projects), and about 3.8% less land left fallow. Our study also shows that the beneficial effects of revolutionary land redistribution on agricultural productivity gradually declined over the course of the 19th century. This result is consistent with other districts gradually overcoming the transaction costs associated with reallocating the property rights that came with the feudal system.

Economic consequences of revolutions: Evidence from the 1789 French Revolution (VoxEU)

And this article wonders whether Rome could have had an industrial revolution:

Could Rome Have Had an Industrial Revolution? (Medium)

And finally, the scars of destroying people’s way of life continue to linger hundreds of years later, down to the present day!

People living in the former industrial heartlands of England and Wales are more disposed to negative emotions such as anxiety and depressive moods, more impulsive and more likely to struggle with planning and self-motivation, according to a new study of almost 400,000 personality tests.

The findings show that, generations after the white heat of Industrial Revolution and decades on from the decline of deep coal mining, the populations of areas where coal-based industries dominated in the 19th century retain a “psychological adversity”.

Researchers suggest this is the inherited product of selective migrations during mass industrialisation compounded by the social effects of severe work and living conditions.

Industrial Revolution left a damaging psychological ‘imprint’ on today’s populations (phys.org)

And entire populations continue to be destroyed under capitalism…

James C. Scott’s Against the Grain

During my long digression on the history of money, the academic James C. Scott published an important book called Against the Grain: A Deep History of the Earliest States.

Regular readers will know that this has been a longstanding area of research (or obsession) of mine. I’ve referred to Scott’s work before, particularly Seeing Like A State, which I think is indispensable in understanding many of the political divisions of today (and why left/right is no longer a useful distinction). We’re in an era where much of the “left” is supporting geoengineering and rockets to Mars, and the “right” (at least the alt-right) is criticizing housing projects and suburban sprawl.

It’s a shame that Scott’s book shares its title with another of my favorite books on the topic, journalist Richard Manning’s Against the Grain: How Agriculture Hijacked Civilization, which came out years earlier. Manning’s book is not only a historical account of how the rise of grain agriculture led to war, hierarchy, slavery and sickness, but a no-holds-barred examination of today’s grain-centric agribusiness model, in which wheat, corn, soy and sugar are grown in mechanized monocultures and processed by the food industry into highly addictive junk food implicated in everything from type 2 diabetes to depression to Alzheimer’s disease (via inflammation):

Dealing with surplus is a difficult task. The problem begins with the fact that, just like the sex drive, the food drive got ramped up in evolution. If you have a deep, yearning need for food, you’re going to get along better than your neighbor, and over the years that gene is going to be passed on. So you get this creature that got fine-tuned to really need food, especially carbohydrates. Which brings us to the more fundamental question: can we ever deal with sugar? By making more concentrated forms of carbohydrates, we’re playing into something that’s quite addictive and powerful. It’s why we’re so blasted obese. We have access to all this sugar, and we simply cannot control our need for it—that’s genetic.

Now, can we gain the ability to overcome that? I’m not sure. You have to add to this the fact that there’s a lot of money to be made by people who know how to concentrate sugar. They have a real interest in seeing that we don’t overcome these kinds of addictions. In fact, that’s how you control societies—you exploit that basic drive for food. That’s how we train dogs—if you want to make a dog behave properly, you deprive him or give him food. Humans aren’t that much different. We just like to think we are. So as an element of political control, food and food imagery are enormously important.

The Scourge of Agriculture (The Atlantic)

Cancers linked to excess weight make up 40% of all US diagnoses, study finds (The Guardian)

Child and teen obesity spreading across the globe (BBC)

In that interview, Manning also makes this point which got so much attention in Yuval Noah Harari’s blockbuster, Sapiens (which came out years later):

…it’s not just human genes at work here. It’s wheat genes and corn genes—and how they have an influence on us. They took advantage of our ability to travel, our inventiveness, our ability to use tools, to live in a broad number of environments, and our huge need for carbohydrates. Because of our brains’ ability, we were able to spread not only our genes, but wheat’s genes as well. That’s why I make the argument that you have to look at this in terms of wheat domesticating us, too. That co-evolutionary process between humans and our primary food crops is what created the agriculture we see today.

As for the title, I guess Against the Grain is just too clever a title to pass up 🙂

I’m still waiting on the book from the library, but I have seen so many reviews by now that I’m not sure I’ll be able to add too much. What’s interesting to me is the degree to which the idea that civilization was a great leap backward from what we had before is starting to go mainstream.

The old, standard “Whig version” story of directional, inevitable progress is still pretty strong, though. Here’s one reviewer describing how it was articulated in the turn-of-the-century Encyclopedia Britannica:

The Encyclopaedia took its readers through a panorama of universal history, from “the lower status of savagery,” when hunter-gatherers first mastered fire; to the “middle status of barbarism,” when hunters learned to domesticate animals and became herders; to the invention of writing, when humanity “graduated out of barbarism” and entered history. Along the way, humans learned to cultivate grains, such as wheat and rice, which showed them “the value of a fixed abode,” since farmers had to stay near their crops to tend and harvest them. Once people settled down, “a natural consequence was the elaboration of political systems,” property, and a sense of national identity. From there it was a short hop—at least in Edwardian hindsight—to the industrial revolution and free trade.

Some unfortunate peoples, even entire continents such as aboriginal North America and Australia, might fall off the Progress train and have to be picked up by kindly colonists; but the train ran along only one track, and no one would willingly decline to board it…

What made prehistoric hunter-gatherers give up freedom for civilization? (The New Republic)

But, it turns out, the reality was quite different. In fact, hunter-gatherers resisted agriculture. Even where farmers and H-G’s lived side by side, the H-G’s (and herders) avoided farming as long as they could. When Europeans equipped “primitive” societies with seeds and hoes and taught them to farm, the natives threw away the implements and ran off into the woods. The dirt farmers of colonial America often ran away to go live with the nomadic Indians, to the extent that strict laws had to be passed to prevent this (as documented in Sebastian Junger’s recent book Tribe).

At the ‘Man the Hunter’ symposium in Chicago in 1966, Marshall Sahlins drew on research from the likes of Richard B. Lee among the !Kung of the Kalahari to argue that hunter-gatherers enjoyed the ‘original affluent society’. Even in the most marginal environments, he said, hunter-gatherers weren’t engaged in a constant struggle for survival, but had a leisurely lifestyle. Sahlins and his sources may have pushed the argument a little too far, neglecting to consider, for instance, the time spent preparing food (lots of mongongo nuts to crack). But their case was strong enough to deal a severe blow to the idea that farming was salvation for hunter-gatherers: however you cut it, farming involves much higher workloads and incurs more physical ailments than relying on the wild. And the more we discover, as Scott points out, the better a hunter-gatherer diet, health and work-life balance look.

Why did we start farming? (London Review of Books)

So why did they do it? That is a question nobody knows the answer to, but it appears they stumbled into it not because it was a better way of life, but due to pressures beyond their control. As Colin Tudge put it, “People did not invent agriculture and shout for joy; they drifted or were forced into it, protesting all the way.” Far from presenting a better, more secure way of life, as the Victorians assumed (out of chauvinism and ignorance), agriculture was actually much more unpleasant and much more work.

The shift to agriculture was in some respects…harmful. Osteological research suggests that domiciled Homo sapiens who depended on grains were smaller, less well-nourished and, in the case of women, more likely to be anaemic, than hunter-gatherers. They also found themselves vulnerable to disease and able to maintain their population only through unprecedentedly high birthrates. Scott also suggests that the move from hunting and foraging to agriculture resulted in ‘deskilling’, analogous to the move in the industrial revolution from the master tradesman’s workshop to the textile mill. State taxation compounded the drudgery of raising crops and livestock. Finally, the reliance on only a few crops and livestock made early states vulnerable to collapse, with the reversion to the ‘dark ages’ possibly resulting in an increase in human welfare.

Book Review: Against the Grain: A Deep History of the Earliest States by James C. Scott (London School of Economics)

Circumstances beyond their control must have played a role. Climate change is the factor most commonly implicated. Overpopulation is another candidate, but it raises a chicken-and-egg problem: overpopulation is a problem created by agrarianism, so how could it have caused it?

One novel idea I explored earlier this year is Brian Hayden’s: that the production of ever-increasing surpluses was part of a strategy by aggrandizing individuals to gain political power.

Periodic feasting events were ways to increase social cohesion and deal with uneven production across climatic biomes–a survival strategy for peoples spread out over a wide geographical area (mountains, plains, wetlands, riparian zones, etc.). If food was scarce in one area, resources could be pooled. Such feasting/resource-pooling regimes were probably the earliest true “civilizations” (albeit before cities). Feasting was also the major way to organize mass labor, and remained so well into the historical period (both Egyptian and Mesopotamian texts testify to celebratory work feasts).

At these events, certain individuals would loan out surplus food and other prestige items in order to lure people into debt to them. Cultural expectations meant that “gifts” would have to be repaid, and then some (i.e., with interest). These people would get their relatives and allies to work their fingers to the bone to produce big surpluses in societies where this was possible, such as horticultural and affluent forager ones. The surplus would be used for feasting. They would then become “Big Men”–tribal leaders lacking “official” status.

Would-be Big Men would then try to outdo one another by throwing larger, richer feasts than their rivals. Competitive feasting gave aggrandizers an arena for a running series of power games and status jockeying. But the net effect of these power games across society was to ramp up food production to unsustainable levels. This, in turn, led to intensification.

At these feasts, highly prized foodstuffs would be used by aggrandizers to lure people into debt and other lopsided obligations, as well as get people to work for them. Manning notes above how food has been traditionally used to control people. And, Hayden speculates, the foods most commonly used were ones with pleasurable or mind-altering effects. One common one was almost certainly alcohol.

He speculates that grains were initially grown not for flavor or for carbohydrates, but for fermentation. It’s fairly certain that alcohol consumption played a major role in feasting events, and it’s notable that the earliest civilizations were all big beer drinkers (Egypt, Mesopotamia, China, Mesoamerica). Most agricultural village societies around the world have some sort of beer drinking/fermentation ritual, as Patrick E. McGovern has documented. The first “recipe” ever written down was for beer brewing. Hayden speculates that early monoliths like Göbekli Tepe and Stonehenge were built as places for such feasting events to take place, wedded to certain religious ideologies (all of them have astronomical orientations), and archaeology tends to confirm this. It’s notable that the earliest sites of domestication/agrarianism we know of are typically in the vicinity of these monoliths.

In other words, the root of this overproduction was human social instincts, and not just purely environmental or climatic factors. Is there some connection between plant/animal domestication and religious ideology? Is it any wonder that religious concepts in these societies transform to become very different from the animist ones of hunter-gatherers? Flannery and Marcus point out that the establishment of a hereditary priesthood that constructs temples and interprets the gods’ wishes (replacing the shaman) is always a marker of the transition from an egalitarian society to a hierarchical one with hereditary leadership. Even in the Bible, king and temple arise more or less simultaneously (e.g. Saul/David/Solomon).

Scott considers whether the Younger Dryas, a period of markedly colder and drier conditions between 12,900 and 11,700 years ago, forced hunter-gatherers into farming. But while the change in climate may have inspired more experimentation with cultivation and herding, the Younger Dryas is too early: communities committed to cereals and livestock didn’t arise until about ten thousand years ago. Scott overlooks another possible factor: religious belief. The discovery of the Neolithic hill-top sanctuary of Göbekli Tepe in southern Turkey in 1994 went against the grain of conventional archaeological understanding of the Neolithic. Here, around 11,000 years ago, hunter-gatherers had constructed a vast complex of massive decorated stone pillars in exactly the same place that domesticated strains of wheat had evolved.

The quantities of food needed to feed the workforce and those who gathered for rituals at Göbekli must have been huge: if the Neolithic gods could persuade people to invest so much effort in construction, and to suffer the physical injuries, ailments and deaths that came along with it, then perhaps expending those extra calories in the fields would have seemed quite trivial. Even then, Göbekli doesn’t help us explain why cereal farming and goat herding took such a hold elsewhere. Personally I find it difficult to resist the theory of unintended self-entrapment into the farming lifestyle, which was then legitimated by Neolithic ideology. We find evidence of burial rituals and skull cults throughout the Fertile Crescent.

Why did we start farming? (London Review of Books)

Scott’s book emphasizes the key role that grain cultivation played in the rise of the early states (it’s right there in the title). Cereals grown in river bottoms were easy to assess and tax, unlike other foodstuffs, which ripened at different times of the year, could be hidden, or were grown in scattered patches. Cereals were storable and divisible. In some ways, grain may have been the earliest form of money:

Most early crops could not provide a source of taxation. Potatoes and tubers are easily hidden underground. Lentils produce annually and can be eaten as they’re picked. Grains, however, have determinate ripening times, making it easy for the tax collector to show up on time. They cannot be eaten raw. And because grains are so small, you can tax them down to the grain. Unlike squash or yams, grains are easy to transport. Spoilage time is nothing like that of vegetables. All these factors played into the first widespread form of currency.

Is the Collapse of Civilizations A Good Thing? (Big Think)

Grain is special, but for a different reason. It is easy to standardize—to plant in rows or paddies, and store and record in units such as bushels. This makes grain an ideal target for taxation. Unlike underground tubers or legumes, grain grows tall and needs harvesting all at once, so officials can easily estimate annual yields. And unlike fugitive wild foods, grain creates a relatively consistent surplus, allowing a ruling class to skim off peasant laborers’ production through a tax regime of manageable complexity. Grain, in Scott’s lexicon, is the kind of thing a state can see. On this account, the first cities were not so much a great leap forward for humanity as a new mode of exploitation that enabled the world’s first leisured ruling class to live on the sweat of the world’s first peasant-serfs.

What made prehistoric hunter-gatherers give up freedom for civilization? (The New Republic)

It’s worth noting that it wasn’t simply agriculture, but cereal production dependent on artificial irrigation, that saw the rise of the first states. The need to coordinate all that labor, partition permanent plots of land, and resolve settlement disputes must have led to the rise of an elite managerial class, as Ian Welsh points out:

Agriculture didn’t lead immediately to inequality, the original agricultural societies appear to have been quite equal, probably even more so than the late hunter-gatherer societies that preceded them. But increasing surpluses and the need for coordination which arose, especially in hydraulic civilizations (civilizations based around irrigation, which is labor intensive and requires specialists) led to the rise of inequality. The pharaohs created great monuments, but their subjects did not live nearly as well as hunter-gatherers.

The Right Stuff: What Prosperity Is and Isn’t (Ian Welsh)

Wealth inequality has been increasing for millennia (The Economist)

And sedentism, as I’ve noted, is not so much a product of agriculture as a cause. Sedentary societies likely needed to be around for some time in order to build up the kind of surpluses aggrandizing elites needed to gain power. These elites probably started as “redistributor chiefs” who justified their role through some combination of martial leadership and religious ideology:

Sedentism does not have its origins in plant and animal domestication. The first stratified states in the Tigris and Euphrates Valley appeared ‘only around 3,100 BCE, more than four millennia after the first crop domestications and sedentism’. Sedentism has its roots in ecologically rich, preagricultural settings, especially wetlands. Agriculture co-existed with mobile lifestyles in which people gathered to harvest crops. Domestication itself is part of a 400,000 year process beginning with the use of fire. Moreover, it is not a process (or simply a process) of humans gaining increasing control over the natural world. People find themselves caring for dogs, creating an ecological niche for mice, ticks, bedbugs and other uninvited guests, and spending their lives ‘strapped to the round of ploughing, planting, weeding, reaping, threshing, grinding, all on behalf of their favorite grains and tending to the daily needs of their livestock’.

This was also noted in the Richard Manning interview, above:

…we always think that agriculture allowed sedentism, which gave people time to create civilization and art. But the evidence that’s emerging from the archeological record suggests that sedentism came first, and then agriculture. This occurred near river mouths, where people depended on seafood, especially salmon. These were probably enormously abundant cultures that had an enormous amount of leisure time—they just had to wait for the salmon runs to occur. There are some good records of those communities, and from the skeleton remains we can see that they got up to 95 percent of their nutrients from salmon and ocean-derived sources. Along the way, they developed highly refined art—something we always associate with agriculture.

Of course, urban societies using irrigation and plow-based agriculture, with their palaces and temples, are very different from horticultural village societies practicing shifting cultivation (which Scott terms “late-Neolithic multispecies resettlement camps”). This is likely why early agricultural societies were roughly as egalitarian as their immediate predecessors, as Ian Welsh pointed out above. But once the plow allowed men to wrest control of food production away from the garden plots of women, the fortunes of females declined rapidly. Political control became exclusively centered in households run by patriarchs, with women becoming little more than chattel. And because there was now property to be passed down, women’s sexual behavior became strictly regulated and monogamy enforced (for commoners, but not for elites). Several thousand years of increasing surpluses and population led to the Neolithic “experiment” metastasizing into the first city-states and empires in various parts of the world. This was not a swift process; it took thousands of years to develop–longer than all of “recorded” history:

…why did it take so long – about four thousand years – for the city-states to appear? The reason is probably the disease, pestilence and economic fragility of those Neolithic villages. How did they survive and grow at all? Well, although farming would have significantly increased mortality rates in both infants and adults, sedentism would have increased fertility. Mobile hunter-gatherers were effectively limited by the demands of travel to having one child every four years. An increase in fertility that just about outpaced the increase in mortality would account for the slow, steady increase in population in the villages. By 3500 BCE the economic and demographic conditions were in place for a power-grab by would-be leaders.

Why did we start farming? (London Review of Books)

How agriculture grew on us (Leaving Babylon)
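The demographic arithmetic in that passage is worth making explicit. Here is a back-of-the-envelope sketch (the 0.04% annual growth rate and the 4,000-year span are hypothetical round numbers chosen for illustration, not figures from the review):

```python
# How a tiny excess of fertility over mortality compounds across millennia.
r = 0.0004      # hypothetical net annual growth: fertility just outpacing mortality
years = 4000    # roughly the gap between the first villages and the first city-states

factor = (1 + r) ** years
print(f"Population multiplies by about {factor:.1f}x")   # ~5x
```

Even a growth rate that barely clears zero, sustained long enough, produces the “slow, steady increase” that eventually made a power grab possible.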

Once such societies were established, they were under an obligation to expand. This was due to rapid population growth and the depletion of their agricultural resource base through overgrazing, salinization, erosion, deforestation, and the numerous other environmental problems caused by agriculture. New farmers require new land, since their birthrates are higher. As such societies expanded, their neighbors had only three options: fight back by adopting similar measures, succumb and be assimilated, or run away. Many did run away, which is why so many of the world’s inhabitants lived outside of state control until the 1600’s, as Scott points out (Scott calls them ‘barbarians’; he uses it as a term of respect rather than of Victorian derision).

Scott also emphasizes the key role played by slavery in agrarian states. In Scott’s view, slavery was absolutely essential to the functioning of the state. Because sedentary, agricultural societies had so much unpleasant “grunt” labor to be done, there was a strong incentive to acquire slaves to do the dirty work required to keep society running. The three major ways labor was compelled in the ancient world were corvée labor, chattel slavery, and (we often forget) debt bondage. This only ended once we got “energy slaves” to do much of the grunt work for us. Yet even today, we use wage slavery compelled by poverty, along with migrant labor, to do the necessary grunt work. Non-mechanized agriculture is still completely dependent on migrant labor in the U.S. and Europe, as are many low-skill, non-automated occupations (driver, nanny, gardener, etc.). Unlike slavery in the Americas (where a racial hierarchy was instituted by Europeans), ancient slavery was less about skin color or point of origin than about legal status, much like that of a temp or migrant worker today (or a worker under the Chinese hukou system):

In the world of states, hunter-gatherers and nomads, one commodity alone dominated all others: people, aka slaves. What agrarian states needed above all else was manpower to cultivate their fields, build their monuments, man their armies and bear and raise their children. With few exceptions, the epidemiological conditions in cities until very recently were so devastating that they could grow only by adding new populations from their hinterlands. They did this in two ways. They took captives in wars: most South-East Asian early state chronicles gauge the success of a war by the number of captives marched back to the capital and resettled there. The Athenians and Spartans might kill the men of a defeated city and burn its crops, but they virtually always brought back the women and children as slaves. And they bought slaves: a slave merchant caravan trailed every Roman war scooping up the slaves it inevitably produced.

The fact is that slaving was at the very centre of state-making. It is impossible to exaggerate the massive effects of this human commodity on stateless societies. Wars between states became a kind of booty capitalism, where the major prize was human traffic. The slave trade then completely transformed the non-state ‘tribal zone’. Some groups specialised in slave-raiding, mounting expeditions against weaker and more isolated groups and then selling them to intermediaries or directly at slave markets. The oldest members of highland groups in Laos, Thailand, Malaysia and Burma can recall their parents’ and grandparents’ memories of slave raids. The fortified, hilltop villages, with thorny, twisting and hidden approaches that early colonists found in parts of South-East Asia and Africa were largely a response to the slave trade.

Crops, Towns, Government (London Review of Books)

In describing the early city-states of Mesopotamia, Scott projects backwards from the historical records of the great slave societies of Greece and Rome. His account of the slaves and the way they were controlled seems strangely familiar. Much like migrant labourers and refugees in Europe today, they came from scattered locations and were separated from their families, demobilised and atomised and hence easier to control. Slaves, like today’s migrants, were used for tasks that were vital to the needs of the elites but were shunned by free men. And slaves, like refugee workers, were gradually integrated into the local population, which reduced the chance of insurrection and was necessary to keep a slave-taking society going. In some early states human domestication took a further step: written records from Uruk use the same age and sex categories to describe labourers and the state-controlled herds of animals. Female slaves were kept for breeding as much as for manual labour.

Why did we start farming? (London Review of Books)

How We Domesticated Ourselves

I’ve often wondered whether, when certain humans learned how to domesticate plants and animals, they used those techniques as much on their fellow man as on their flora and fauna. In this Aeon article, this passage really struck me:

When humans start treating animals as subordinates, it becomes easier to do the same thing to one another. The first city-states in Mesopotamia were built on this principle of transferring methods of control from creatures to human beings, according to the archaeologist Guillermo Algaze at the University of California in San Diego. Scribes used the same categories to describe captives and temple workers as they used for state-owned cattle.

How domestication changes species including the human (Aeon)

Indeed, the idea that humans domesticated themselves is another key concept in Harari’s Sapiens. But perhaps that domestication was much more “literal” than we have been led to believe. Perhaps human sacrifice was a way for early religious leaders to “cull” individuals who had undesirable traits from their standpoint: independence, aggression, a questioning attitude, etc. Indeed, hunter-gatherers still do not like obeying orders from a boss. I wonder to what extent this process is still going on, especially in modern-day America with its schools, prisons, corporate cubicles, police, military, etc.:

Anthropologists and historians have put forward the ‘social control hypothesis’ of human sacrifice. According to this theory, sacrificial rites served as a function for social elites. Human sacrifice is proposed to have been used by social elites to display their divinely sanctioned power, justify their status, and terrorise underclasses into obedience and subordination. Ultimately, human sacrifice could be used as a tool to help build and maintain systems of social inequality.

How human sacrifice helped to enforce social inequality (Aeon)

How humans (maybe) domesticated themselves (Science News)

And this is very relevant to our recent discussion of money: writing and mathematics were first used as methods of social control. As Jane Gleeson-White points out in this essay, accounting was our first writing technology. Money–and taxes–were an outgrowth of this new communications technology:

War, slavery, rule by élites—all were made easier by another new technology of control: writing. “It is virtually impossible to conceive of even the earliest states without a systematic technology of numerical record keeping,” Scott maintains. All the good things we associate with writing—its use for culture and entertainment and communication and collective memory—were some distance in the future. For half a thousand years after its invention, in Mesopotamia, writing was used exclusively for bookkeeping: “the massive effort through a system of notation to make a society, its manpower, and its production legible to its rulers and temple officials, and to extract grain and labor from it.”

Early tablets consist of “lists, lists and lists,” Scott says, and the subjects of that record-keeping are, in order of frequency, “barley (as rations and taxes), war captives, male and female slaves.” Walter Benjamin, the great German Jewish cultural critic, who committed suicide while trying to escape Nazi-controlled Europe, said that “there is no document of civilization which is not at the same time a document of barbarism.” He meant that every complicated and beautiful thing humanity ever made has, if you look at it long enough, a shadow, a history of oppression.

The Case Against Civilization (The New Yorker)

Collecting cereal grains directly as taxes would have been cumbersome for administrators, which no doubt led to the innovations we’ve been discussing recently: a unit of account and debt/credit records. The temples were the first institutions to create and store surpluses, making them arguably the ancestors of later corporations (and of capitalism). They were the first to do economic planning and to charge interest. Later, rulers would strongly desire to monetize the economy by issuing coins, because collecting taxes in coin was far easier than collecting resources in kind. We’ve already seen how money, markets, and the state are intimately intertwined (and not separate, as libertarians claim).

The connection between the earliest writing and domestication/subjugation is powerfully made by this article from the BBC documenting the world’s oldest writing:

In terms of written history, this is the very remote past. But there is also something very direct and almost intimate about it too. You can see fingernail marks in the clay. These neat little symbols and drawings are clearly the work of an intelligent mind.

These were among the first attempts by our human ancestors to try to make a permanent record of their surroundings. What we’re doing now – my writing and your reading – is a direct continuation. But there are glimpses of their lives to suggest that these were tough times. It wasn’t so much a land of milk and honey, but porridge and weak beer.

Even without knowing all the symbols, Dr Dahl says it’s possible to work out the context of many of the messages on these tablets. The numbering system is also understood, making it possible to see that much of this information is about accounts of the ownership and yields from land and people. They are about property and status, not poetry.

This was a simple agricultural society, with a ruling household. Below them was a tier of powerful middle-ranking figures and further below were the majority of workers, who were treated like “cattle with names”. Their rulers have titles or names which reflect this status – the equivalent of being called “Mr One Hundred”, he says – to show the number of people below him.

It’s possible to work out the rations given to these farm labourers. Dr Dahl says they had a diet of barley, which might have been crushed into a form of porridge, and they drank weak beer. The amount of food received by these farm workers hovered barely above the starvation level. However the higher status people might have enjoyed yoghurt, cheese and honey. They also kept goats, sheep and cattle.

For the “upper echelons, life expectancy for some might have been as long as now”, he says. For the poor, he says it might have been as low as in today’s poorest countries.

Breakthrough in world’s oldest undeciphered writing (BBC)

So the earliest writing tends to confirm Scott’s account. And not just Scott’s account, but also that of anthropologist James Suzman, who has simultaneously come out with a book about the disappearing way of life of the !Kung San Bushmen of the Kalahari. It is also reviewed in the New Yorker article above. These hunter-gatherers are going through today exactly what the people of the Near East experienced roughly 6,000 to 8,000 years ago, giving us a window into history:

The encounter with modernity has been disastrous for the Bushmen: Suzman’s portrait of the dispossessed, alienated, suffering Ju/’hoansi in their miserable resettlement camps makes that clear. The two books even confirm each other’s account of that sinister new technology called writing. Suzman’s Bushman mentor, !A/ae, “noted that whenever he started work at any new farm, his name would be entered into an employment ledger, documents that over the decades had assumed great mystical power among Ju/’hoansi on the farms. The secrets held by these ledgers evidently had the power to give or withhold pay, issue rations, and determine an individual’s right to stay on any particular farm.”

Writing turned the majority of people into serfs and enabled a sociopathic elite to live well and raise themselves and their offspring above everyone else.

And here we are at the cusp of a brand new “information revolution” where literally our every thought and move can be monitored and tracked by a tiny centralized elite and permanently stored. And yet we’re convinced that this will make all our lives infinitely better! Go back and reread the above. I’m not so sure. I already feel like “cattle with a name” in our brave new nudged, credit-scored, Neoliberal world.

We’re also experiencing another period of rapid climate change and resource depletion, just like the one at the original coming of the state. We’re now doing exactly what they did: intensification. And once again it’s empowering a small sociopathic elite at the cost of the rest of us. And yet Panglossians confidently tell us we’re headed for a peaceful techno-utopia where new discoveries will be shared with all of us instead of hoarded, and we’ll all live like gods instead of being exterminated like rats once we’re no longer necessary to the powers that be. Doubtless the same con (“We’ll all be better off!!!”) was played on the inhabitants of early states, too. Given the human social instincts noted above, let’s just say I’m not optimistic. Please pass the protein blocks.

Welcome to 2030. I Own Nothing, Have No Privacy, and Life Has Never Been Better (Futurism)

Scott points out that the state is a very novel development, despite what we read in history books. We read about the history of states because states left written history, and we are their descendants. But that doesn’t mean most people lived under them. By Scott’s account, most humans (barbarians) lived outside of nation-states well into the 1500’s:

…Homo sapiens has been around for roughly 200,000 years and left Africa not much earlier than 50,000 years ago. The first fragmentary evidence for domesticated crops occurs roughly 11,000 years ago and the first grain statelets around 5000 years ago, though they were initially insignificant in a global population of perhaps eight million.

More than 97 per cent of human experience, in other words, lies outside the grain-based nation-states in which virtually all of us now live. ‘Until yesterday’, our diet had not been narrowed to the three major grains that today constitute 50 to 60 per cent of the world’s caloric intake: rice, wheat and maize. The circumstances we take for granted are, in fact, of even more recent vintage …Before, say, 1500, most populations had a sporting chance of remaining out of the clutches of states and empires, which were still relatively weak and, given low rates of urbanisation and forest clearance, still had access to foraged foods. On this account, our world of grains and states is a mere blink of the eye (0.25 per cent), in the historical adventure of our species.

Crops, Towns, Government (London Review of Books)

Why a leading political theorist thinks civilization is overrated (Vox)
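Scott’s percentages follow directly from the round numbers in the quote above, as a quick sanity check shows (the figures are the quote’s own: roughly 200,000 years for our species, 5,000 years of grain states, and the 500 years since about 1500):

```python
# Checking Scott's round numbers from the passage above.
species_years = 200_000   # rough age of Homo sapiens
state_years   = 5_000     # first grain statelets
recent_years  = 500       # since ~1500, when escaping states got hard

print(f"{100 * (1 - state_years / species_years):.1f}% of human experience pre-dates states")  # 97.5%
print(f"{100 * recent_years / species_years:.2f}% lies in the era of inescapable states")      # 0.25%
```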

Whither Collapse?

One of the more provocative ideas from Scott’s book is to question whether the withering away of state capacity–that is, a collapse–is really a bad thing at all!

We need to rethink, accordingly, what we mean when we talk about ancient “dark ages.” Scott’s question is trenchant: “ ‘dark’ for whom and in what respects”? The historical record shows that early cities and states were prone to sudden implosion.

“Over the roughly five millennia of sporadic sedentism before states (seven millennia if we include preagriculture sedentism in Japan and the Ukraine),” he writes, “archaeologists have recorded hundreds of locations that were settled, then abandoned, perhaps resettled, and then again abandoned.” These events are usually spoken of as “collapses,” but Scott invites us to scrutinize that term, too.

When states collapse, fancy buildings stop being built, the élites no longer run things, written records stop being kept, and the mass of the population goes to live somewhere else. Is that a collapse, in terms of living standards, for most people? Human beings mainly lived outside the purview of states until—by Scott’s reckoning—about the year 1600 A.D. Until that date, marking the last two-tenths of one per cent of humanity’s political life, “much of the world’s population might never have met that hallmark of the state: a tax collector.”

Book Review: Against the Grain: A Deep History of the Earliest States by James C. Scott (LSE)

Indeed, is collapse even a relevant concept when discussing history? What, really, is collapsing? States can collapse, but cultures transform:

We also need to think about what we apply the term ‘collapse’ to – what exactly was it that collapsed? Very often, it’s suggested that civilisations collapse, but this isn’t quite right. It is more accurate to say that states collapse. States are tangible, identifiable ‘units’ whereas civilisation is a more slippery term referring broadly to sets of traditions. Many historians, including Arnold Toynbee, author of the 12-volume A Study of History (1934-61), have defined and tried to identify ‘civilisations’, but they often come up with different ideas and different numbers. But we have seen that while Mycenaean states collapsed, several strands of Mycenaean material and non-material culture survived – so it would seem wrong to say that their ‘civilisation’ collapsed. Likewise, if we think of Egyptian or Greek or Roman ‘civilisation’, none of these collapsed – they transformed as circumstances and values changed. We might think of each civilisation in a particular way, defined by a particular type of architecture or art or literature – pyramids, temples, amphitheatres, for example – but this reflects our own values and interests.

[…]

States collapsed, civilisations or cultures transformed; people lived through these times and employed their coping strategies – they selectively preserved aspects of their culture and rejected others. Archaeologists, historians and others have a duty to tell the stories of these people, even though the media might find them less satisfactory. And writers who appropriate history for moral purposes need to think carefully about what they are doing and what they are saying – they need to make an effort to get the history as right as possible, rather than dumbing it down to silver-bullet theories.

What the idea of civilisational collapse says about history (Aeon)

Scott treats the fragility of states, and their propensity to revert to simpler forms, as a necessary and inevitable part of the process of history. Rather than a catastrophe, a reduction in complexity often brings an increase in personal freedom, social experimentation, autonomy, and even artistic development and cultural expression. The Middle Ages are often portrayed as a “dark age,” but that depiction was an invention of the Renaissance, and “dark” referred to the lack of written historical sources, not necessarily to wail and woe. Note that the tools of the oppressor – written records, taxation, slavery, usury and money – all fade during this period. This is not to dismiss the very real loss of technology, or the epidemic disease and warfare, that accompany a state collapse, but merely to suggest a more nuanced view. The Middle Ages were centered on the values of the Church, and society was reoriented along those lines.

Scott writes about the normalising effects of state collapse. Often it was the best thing possible for a people now emancipated from disease, taxes and labour. In the subsequent ‘dark ages’ – a propaganda term used by the elite – democracy and culture could flourish. Homer’s Iliad and Odyssey date from the dark age of Greece. This is in marked contrast to the consequences of state collapse today, now that there is no longer an external barbarian world to escape into. When Syria collapsed its refugees had no choice but cross the border to another state, whether Lebanon, Jordan or Turkey.

Why did we start farming? (London Review of Books)

While Scott’s topics are timely—tribalism, taxation, trade, warfare—one is particularly relevant: the collapse of civilizations. Shifting landscapes, battles, and resource depletion are all factors that forced newly sedentary societies to pack it up and move on once again. Scott does not see this as a necessary evil, but rather part of the natural order of things: “We should, I believe, aim to “normalize” collapse and see it rather as often inaugurating a periodic and possibly even salutary reformation of political order.”

Is the Collapse of Civilizations A Good Thing? (Big Think)

Scott argues that the loss of state capacity, rather than being a tragedy, can often be seen as a liberating event. Yes, such periods mean more poverty, but without the yoke of the state they can also, paradoxically, mean more freedom and happiness for the survivors of the collapse. And since relative poverty appears more psychologically harmful than absolute poverty, many societies tend to have greater well-being after they’ve fallen apart. He writes:

When the apex disappears, one is particularly grateful for the increasingly large fraction of archaeologists whose attention was focused not on the apex but on the base and its constituent units. From their findings we are able not only to discern some of the probable causes of “collapse” but, more important, to interrogate just what collapse might mean in any particular case…much that passes as collapse as, rather, a disassembly of larger but more fragile political units into their smaller and often more stable components. While “collapse” represents a reduction in social complexity, it is these smaller nuclei of power—a compact small settlement on the alluvium, for example—that are likely to persist far longer than the brief miracles of statecraft that lash them together into a substantial kingdom or empire.

Over time an increasingly large proportion of nonstate peoples were not “pristine primitives” who stubbornly refused the domus, but ex–state subjects who had chosen, albeit often in desperate circumstances, to keep the state at arm’s length…The process of secondary primitivism, or what might be called “going over to the barbarians,” is far more common than any of the standard civilizational narratives allow for. It is particularly pronounced at times of state breakdown or interregna marked by war, epidemics, and environmental deterioration. In such circumstances, far from being seen as regrettable backsliding and privation, it may well have been experienced as a marked improvement in safety, nutrition, and social order. Becoming a barbarian was often a bid to improve one’s lot.

Thus, the leveling effects of “collapse” may not be as “disastrous” as we are led to believe.

Scott’s book suggests that the collapse of states, rather than being a universally bad thing, might lead to a flourishing of human freedom. In that, there is some hope. I’ll end with this thought from Scott’s review of Diamond:

Anthropology can show us radically different and satisfying forms of human affiliation and co-operation that do not depend on the nuclear family or inherited wealth. History can show that the social and political arrangements we take for granted are the contingent result of a unique historical conjuncture.

The Origin of Money – Key Takeaways

“We begin with the story of the greatest conqueror in history, a conqueror possessed of extreme tolerance and adaptability, thereby turning people into ardent disciples. This conqueror is money. People who do not believe in the same god or obey the same king are more than willing to use the same money. Osama Bin Laden, for all his hatred of American culture, American religion and American politics, was very fond of American dollars. How did money succeed where gods and kings failed?”
~ Yuval Noah Harari, “Sapiens: A Brief History of Humankind,” (2015)

So I’m done for now writing about the history of money, which is doubtless good news to any readers I still have left (if there are any). I went way too far down the rabbit hole on this one 😊.

But the way this all started was actually very simple: with two major questions. One was: where did the notion of a “national debt” come from? I mean, you never hear about the national debt of ancient Greece or Rome, do you? In fact, it’s hard to imagine any ancient empire, from Persia to China, voicing concerns about its national debt. Yet now it seems to drive just about every decision any government makes. We’re constantly told that “we can’t afford” this or that because it would increase the national debt. But how can a nation-state go into debt merely by issuing its own money? And how can every country in the world be simultaneously in debt? If every nation-state is the ultimate source of its own currency, how can it be in debt? To whom?

The other major question I had was: how did we get this weird hybrid system where we have government money, but private banks and financiers seem to control it? After all, money is a public good. We all need it. It should theoretically be under democratic control. But actual control over it is exercised by a secretive cabal of bankers and financiers who are not accountable to any democratic institutions. As Michael Hudson says, “every economy is planned; it’s only a matter of who does the planning.” He argues that in our society it is private financial interests who do the planning rather than government bureaucrats, and that they do so primarily to benefit themselves, even to the detriment of society. As Frederick Soddy said:

“… every monetary system must at long last conform, if it is to fulfil its proper role as the distributive mechanism of society. To allow it to become a source of revenue to private issuers is to create, first, a secret and illicit arm of the government and, last, a rival power strong enough ultimately to overthrow all other forms of government.” (The Role of Money, 1932)

Hopefully we learned some answers to those two questions. A thoroughgoing history of money, rather than just being of historical interest, does give us some crucial insights into current dilemmas and what we need to do going forward.

So, by way of conclusion, here are some of the major takeaways I got while writing this series of posts. If your eyes glazed over during the series or you just quit reading over the summer (and I don’t really blame you), I encourage you to come back and at least read this instead:

What is money and finance at its heart? It’s a way to get large numbers of people to cooperate on the same goal. As Yuval Noah Harari writes in Sapiens, because our “natural” group size is fairly small (only about 150 or so), we need to invent shared fictions to get people to cooperate at larger and larger scales.

For a long time, religion was the major one. Then came the nation-state. Now finance seems to be the major way of controlling people and getting them to cooperate. With enough money you can get people to do just about anything, including have sex with you or kill one another. Money is permission. But usually it’s used for more benign purposes, such as getting thousands of people from all over the world to cooperate in a shared goal such as building electric cars or making and selling fizzy drinks.

Homo sapiens have no natural instincts for cooperating with large numbers of strangers. Humans evolved for millions of years living in small bands. Consequently, there are no instincts for mass social cooperation. To make up for that, humans have to rely on all kinds of imagined realities that regulate cooperation on such a huge scale. The human empires are based on shared common beliefs, social and legal norms that sustain them. The stability of the complex societies is not based on natural instinct or on personal acquaintance, but on shared imagined realities.

Coursera: A Brief History of Humankind by Dr. Yuval Noah Harari

Both the corporation and the nation-state are shared legal fictions invented to bind large numbers of people together with imaginary ties to some sort of common purpose. Since its invention in the 1600’s, the corporate form has gained more and more power relative to the nation-state which created it.

Money is transferable debt (or credit). This is the “credit theory of money.” In “primitive” societies, many items are used to signify debts and obligations between various individuals, groups, and families. But once these obligations can be transferred among unrelated people, they become a type of money, even if the “exchange” is just by oral agreement (as on the island of Yap). “Money” may not even have corporeal form, but if it does, then it is usually standardized in some way (stone disks, shells, beads, coins, paper, etc.).

Money, then, is credit and nothing but credit. A’s money is B’s debt to him, and when B pays his debt, A’s money disappears. This is the whole theory of money (Innes 1913, p.16)

…money is anything that denotes and extinguishes one’s debt/liability to another; it was not a product of market exchanges but rather a byproduct of social relations based on debt…the nature of money is a credit-debt relationship that can only be understood in institutional and social contexts…Therefore, money originated as a byproduct of social relations based on debt and realized its standard form through the need of the central authority, as opposed to private individuals, to establish a standard unit of account to measure debt obligations or production surplus.
Vincent Huang, On the Nature of Money, p. 6
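To make Innes’s point concrete, here is a minimal Python sketch (my own toy illustration, not anything drawn from Innes or Huang) of money as transferable debt: an IOU is created, changes hands, and vanishes when the debtor settles.

```python
# Toy sketch of the credit theory of money. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class IOU:
    debtor: str
    holder: str   # whoever currently holds the claim holds the "money"
    amount: int   # denominated in an abstract unit of account

ious = []  # the outstanding "money supply" of this toy world

def issue(debtor, holder, amount):
    """B goes into debt to A; A now holds money."""
    iou = IOU(debtor, holder, amount)
    ious.append(iou)
    return iou

def transfer(iou, new_holder):
    """The claim can pass between strangers -- that is what makes it money."""
    iou.holder = new_holder

def settle(iou):
    """When B pays his debt, A's money disappears (Innes's point exactly)."""
    ious.remove(iou)

iou = issue(debtor="B", holder="A", amount=10)
transfer(iou, "C")   # A spends the IOU; C now holds the money
settle(iou)          # B pays up; the money is extinguished
print(len(ious))     # 0 -- no outstanding debt, no money
```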

In fact, there are numerous examples from history where the state has stopped issuing money or the banks have closed (such as Ireland during the bank strikes of the 1960s and ’70s), and private credit circulated as money in the form of checks. Rather than barter, the establishment of new credit-clearing systems is the common response when currency systems seize up at any scale larger than a local village. We also see that throughout history the shrinking of the state has led to a curtailment of trade, not an expansion, suggesting that markets, money, and governments are symbiotic, not in opposition as we are led to believe.
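The clearing mechanics are simple enough to sketch in a few lines. This is a made-up toy example (the names and amounts are invented): gross obligations denominated in a common unit of account are netted down so that little or no cash is ever needed.

```python
# Toy multilateral credit clearing: record gross debts, settle only net balances.
from collections import defaultdict

debts = [  # (debtor, creditor, amount in the common unit of account)
    ("Alice", "Bob", 100),
    ("Bob", "Carol", 80),
    ("Carol", "Alice", 90),
]

net = defaultdict(int)
for debtor, creditor, amount in debts:
    net[debtor] -= amount
    net[creditor] += amount

# 270 units of gross obligations collapse into tiny net positions:
print(dict(net))  # {'Alice': -10, 'Bob': 20, 'Carol': -10}
```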

Money emerges when one class is able to impose obligations on the rest of society. This is the contention of John F. Henry. This could be a redistributor chief, a warlord, a royal household or a divine priesthood. Henry contends that hydraulic engineers were the first such class to emerge in ancient Egypt. Carroll Quigley argues that ancient priests used their knowledge of the movements of the heavens to predict floods, and this allowed them to set themselves up as a ruling class. Others posit that the need to wage warfare led to warlords setting themselves up as a ruling class. Religion probably played a key role; the root of the word hierarchy is “hiero-”, meaning sacred.

In every case, this class probably engaged in astronomy; managed collective labor in some way, whether in military or engineering endeavors; and collected goods for redistribution among the populace.

Money has no value in and of itself. It is not the thing that matters, but the ability of one section of the population to impose its standard on the majority, and the institutions through which that majority accepts the will of the minority. Money, then as a unit of account, represents the class relations that developed in Egypt (and elsewhere), and class relations are social relations.

To service the activities of this class, resources needed to be deployed to fund their efforts. To keep track of these resources, a unit of account was established by these authorities (priests and scribes).

As James C. Scott points out, writing was originally invented as a tool for social control of the masses by the ruling class, not as a form of cultural expression (which was oral). Written records first emerge to manage inputs and outputs. There is a fascinating argument that clay bullae envelopes were a form of double-entry bookkeeping, with debits represented by tokens placed inside and credits represented by the markings on the outside.
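If that argument is right, the underlying logic is the double-entry invariant: every obligation is recorded twice, once on each side, and the two sides must always balance. Here is a toy version (the account names are my own invention, not reconstructions of any actual tablet):

```python
# Toy double-entry record-keeping: each obligation is entered twice,
# and the books balance by construction.
entries = []

def record(debit_account, credit_account, amount):
    entries.append((debit_account, +amount))   # like tokens sealed inside the bulla
    entries.append((credit_account, -amount))  # like the marks on the outside

record("grain_owed_to_temple", "temple_claims", 30)
record("wool_owed_to_temple", "temple_claims", 12)

assert sum(amount for _, amount in entries) == 0  # debits equal credits
print("books balance")
```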

Apart from its role in the invention of writing, accounting is significant for human civilization because it affects the way we see the world and shapes our beliefs. To take this early example, the invention of token accounting in Mesopotamia was important not only because it facilitated economic exchanges and generated writing, but, according to Mattessich, “because it encouraged people to see the world around them in terms of quantifiable outcomes. …
Jane Gleeson-White, Accounting: Our First Communications Technology

Along with writing, establishing common standards of measurement appears to have been a chief function of the ruling classes since their emergence. Ancient priests tracked cyclical movements of celestial bodies and divided the year into discrete units to determine the precise timing of the planting and harvest, as well as ritual gatherings and feasts. They encoded these heavenly movements and measurements into their monuments in order to depict a kind of cosmic order on earth–“as above, so below.” They built calendrical monuments such as Göbekli Tepe, Stonehenge and Nabta Playa. They began to measure distance in addition to time (to mark off plots of land) and weights and quantities (to measure offerings to the gods). This process happened independently in both the Old World and the New. Thus, the creation of a unit of measurement to establish equivalencies between disparate goods produced by households was a logical extension of the duties of the ruling class once people began to occupationally specialize.

…the rise of class society and inequality took place alongside the emergence of money, whereby money played a key role in establishing, maintaining and exacerbating inequality and class division in societies. To put it simply, as soon as one witnesses the emergence of money, one observes the rise of class society and economic inequalities. Money, class society, and inequality came into being simultaneously, so it seems, mutually reinforcing the development of one another. Semenova and Wray, The Rise of Money and Class Society: The Contributions of John F. Henry (PDF) p.2

Which leads to the following conclusion:

The “unit of account” role appears to have been the first function of money to emerge (not the “means of exchange” or “store of value” functions). Thousands of years before the first coins were minted, tributes and donations to temples were denominated in a standard unit of account, such as the shekel in Babylonia and the deben in Egypt. Babylonian scribes established money-prices for internal administrative purposes to track the crops, wool, barley, and other raw materials distributed to their dependent workforce, as well as to calculate the rents, debts and interest owed to the temples and palaces. These prices were then fixed to a certain weight of silver, allowing silver to be used as a standard measure of value and means of payment. Initially, grain (the principal product of the Mesopotamian economy) was used, but its value fluctuated too widely from year to year, so silver replaced it.

Thus, the authorities can determine, for example, that 1 horse = 2 cows = 5 pigs = 10 bushels of grain = 1 ounce of silver. This was used to calculate inputs and outputs for the redistribution economies of the Bronze Age. It was also used in assessing fines and punishments by legal and religious authorities. Such compensation payments for transgressions kept societies stable in the face of increasing numbers of strangers living shoulder-to-shoulder. There are clues in our language: the word for “debt” also means “sin” or “transgression” in many languages, and the verb “to pay” also means “to appease” or “to pacify.”
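Here is that arithmetic as a toy Python sketch, using the hypothetical ratios above. Once the authority fixes each good’s money-price in silver, any good can be valued against any other without bartering:

```python
# Toy administered price list: the authority fixes money-prices in a
# standard unit (ounces of silver, here), establishing equivalencies.
PRICE_IN_SILVER_OZ = {   # set by the authority, not by haggling
    "horse": 1.0,
    "cow": 0.5,
    "pig": 0.2,
    "bushel_of_grain": 0.1,
}

def equivalent(amount, good, other_good):
    """How many units of other_good are worth `amount` of `good`?"""
    value_in_silver = amount * PRICE_IN_SILVER_OZ[good]
    return value_in_silver / PRICE_IN_SILVER_OZ[other_good]

print(equivalent(1, "horse", "cow"))              # 2.0
print(equivalent(1, "horse", "bushel_of_grain"))  # 10.0
```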

…money as a unit of account precedes its roles as a medium of exchange and store of value… It thus follows that the physical manifestation of money (the “money things”) is not necessary since money as a debt relation needs not be physically tangible. This has been demonstrated as early as in Mesopotamia (3100BC) where crops and silver were used as standard units of account but not as a general medium of exchange. Exchanges simply took the form of credit and debit entries in clay tablets, similar to our electronic payment system today. Vincent Huang, On the Nature of Money p.6

Money (a standard unit of account, used to denote debts or assess value) predates coins by [millennia], and coins only ever comprised a small fraction of the money in daily use. Most ancient money was in the form of marks on clay tablets or notes on pieces of papyrus, just as it is today (computers replacing clay or papyrus)…Coins were for spot transactions, untrusted persons and ceremonial gifts (donatives). The real cost of making money was and is in establishing and maintaining the trust needed to support it. https://rwer.wordpress.com/2014/02/03/the-real-costs-of-making-money-2-where-did-the-silver-used-to-buy-josef-come-from/

The unit of account was typically based on what was most appropriate for giving to the gods. This is a point David Graeber makes. We are all “in debt” from the moment we are born–to the gods, to our ancestors, to our parents, and to our society. This “primordial debt” is discharged by sacrifices to the gods or gifts to temples (mediated by the religious authorities). That “universal debt” thus becomes the cornerstone of taxation, and hence of the first monetary systems. For example, the Bible demands a ten-percent gift of one’s income to the temple (a tithe):

Every tithe of the land, whether of the seed of the land or of the fruit of the trees, is the Lord’s; it is holy to the Lord. If a man wishes to redeem some of his tithe, he shall add a fifth to it. And every tithe of herds and flocks, every tenth animal of all that pass under the herdsman’s staff, shall be holy to the Lord. One shall not differentiate between good or bad, neither shall he make a substitute for it; and if he does substitute for it, then both it and the substitute shall be holy; it shall not be redeemed. (LEV. 27:30–34)

So, for example, in ancient Mesopotamia the notion that silver was “captured sunlight” gave it a divine quality, which made it highly desirable for gifts to the temple. Thus, the unit of account became equivalent to a certain weight of silver.

Silver was sort of a “goldilocks commodity”: there was enough of it for coinage, but not so much that it would be too easy for anyone to procure. It comes from only one kind of place–mines deep in the earth, most of which were owned by the authorities. Things like apples and hides would not be useful, for example, because they were too widely distributed. You had to use something whose issuance could be controlled by the state. By stamping the ruler’s mark on the coins, they gained value in exchange over and above their precious metal value. That is, they were tokens:

Coinage arose at approximately the end of the seventh century BCE in Lydia (in what is now western Turkey), where there was an abundant supply of electrum, a natural alloy of gold and silver. But coinage was first used in everyday life in the Greek city-states on the coast of Lydia. One plausible theory is that it arose out of the best possible way for the Lydian monarchy to use its abundant electrum to pay Greek mercenaries. Each piece of electrum had a different and undeterminable proportion of gold and silver (and so a different metallic value), but numerous pieces each with exactly the same value could be created by stamping them with a mark meaning ‘this is worth x’. And so from the very beginning of coinage its conventional value was different from (generally greater than) its metallic value. Radical Anthropology, Richard Seaford interview (PDF)

In ancient Greece, cattle were ranked and sacrificed to the gods. Thus, the value of things such as ships and armor was measured against cattle, even though no one ever used cattle to buy or sell anything. In ancient Ireland, slave girls (kumals) were the most valuable commodity, so items were valued against them. Eventually, the kumal became just an abstract unit of account for trading purposes, divorced from its original context.

Religiously significant metals became important as temple offerings and temples began to accumulate large reserves. Followers of the religion would look to acquire the metal, to enable them to make an offering to the gods, and so the metal became the commodity in the most demand. The Ancient Egyptians, who had easy access to gold, used Cypriot copper for their religious offerings, while the Cypriots used Egyptian gold. In Mesopotamia, the metal of choice was silver… Later, we read in Homer that the Greeks priced goods in terms of oxen, the animal that was reserved for sacrifices to the gods… When ‘Currency Cranks’ or ‘Bullionists’ argue that the economy would be improved by reverting to a Gold Standard because gold has an ‘inherent value’ they need to explain where is the value in gold, apart from its inherent symbolic, representative, value. Lady Credit (Magic, Maths and Money)

The Unit of Account and the Means of Exchange need not be the same. In fact, for most of history they weren’t! In the Middle Ages, prices were denominated and taxes assessed in a common unit of account (e.g. livres tournois), but hundreds of different coins churned out by dozens of mints were used to pay them (such as the Piece of Eight or Louis d’Or).

In many places, there was often no coin equivalent to the unit of account. Coins were exclusively minted by authorities. Coins had no fixed face value; their exchange value was set (and periodically altered) by government fiat, while their bullion value fluctuated according to supply and demand in the marketplace. Imbalances between bullion and exchange values led to surfeits and shortages of precious metals, with corresponding price swings (e.g. the “Price Revolution”). This led to efforts by authorities to restrict the movements of precious metals (bullionism).

Similarly, bills of exchange were denominated in an abstract unit of account (écu de marc), which did not correspond to any particular sovereign currency in circulation. The arbitrage between this abstract unit of account and the currencies of the time is how bankers made their money when usury was still illegal.

An interesting example of this was seen in Brazil in the 1990’s. The government created a totally new unit of account, the “Unit of Real Value” (URV), which would hold its value relative to the currency (the cruzeiro), which was subject to hyperinflation. Prices, taxes and wages would be denominated in the URV, which would remain constant, while the number of cruzeiros needed to equal 1 URV would vary. Eventually, once prices stabilized, the country introduced a totally new currency equivalent to the URV (the real). How Fake Money Saved Brazil (NPR)
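The mechanism is easy to sketch. In this toy example (the daily rates are invented, not actual 1994 fixings), the quoted price never changes; only the number of cruzeiros needed to buy one URV does:

```python
# Toy URV indexing: prices are constant in the stable unit of account,
# while the cruzeiro/URV conversion rate is adjusted day by day.
price_in_urv = {"bread": 1.0, "milk": 0.8}   # quoted prices never move

cruzeiros_per_urv = [647.0, 661.0, 683.0]    # hypothetical daily fixings

for day, rate in enumerate(cruzeiros_per_urv, start=1):
    cost = price_in_urv["bread"] * rate
    print(f"day {day}: bread costs {cost:,.0f} cruzeiros (still 1.0 URV)")
```

Once people were used to thinking in URVs, the switch to a new currency equal to 1 URV (the real) was a formality.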

Items that can be accepted in payment of fines or taxes to authorities acquire value in private transactions. As MMT economists point out, the dollar is given value by the ability to pay one’s taxes with it; that is, one can discharge one’s obligations to the state with dollars (and only dollars). Thus, prices are typically denominated in dollars as well, and producers accept dollars in exchange for goods and services.

While private money can be created and issued, the ability to pay one’s taxes with dollars means there is always a demand for them. Also, since money is transferable credit, the government’s credit is typically much more reliable than that of private individuals. This is often referred to as the “state theory of money,” or “chartalism”:

[While] private individuals may have different units of accounts (cattle, watermelon, etc.)…It is unlikely that any individual could have sufficient power to induce others to hold its liabilities as a standard unit of account…By choosing a unit of account as the only means for individuals to extinguish his/her liabilities to themselves, the central authorities “write the dictionary”. Hence, the power of the central authority (state, temple, tribe, etc.) to impose a debt liability (fines, fees, taxes, etc.) on its population gives the former the unique right to choose a particular unit of account as the only means of payment to the central authority.

Money’s value comes from faith in the issuing government’s credit. Historically, loss of faith in a currency had much more to do with the stability of the issuing government than with the amount of precious metal contained in the coin. Numismatists can find no solid correlation between prices and the precious metal content of coins over the millennia. Nor can they find a consistent standard for how much specie a coin “should” contain.

In the case of paper money, the paper itself is not valuable; it is the enforceable claim written on it that’s valuable. Originally this promised to pay the bearer in coin. Then it evolved into banknotes–sort of a “paper coin”–a signifier of government debt which did not pay interest.

The above leads to the following conclusion:

Money led to markets, not vice-versa. Once the concepts of money and prices are firmly established by central authorities, only then can decentralized exchanges take place in markets. That these standards were initially set by authorities makes far more sense (and is historically better supported) than the idea that money emerged spontaneously among private individuals seeking to reduce search costs, without recourse to any centralized authority, through innumerable acts of barter.

Once the state has created the unit of account and named what can be delivered to fulfill obligations to the state, it has generated the necessary pre-conditions for development of markets. The evidence suggests that early authorities set prices for each of the most important products and services. Once prices in money were established, it was a short technical leap to creation of markets. This stands orthodoxy on its head by reversing the order: first money and prices, then markets and money-things (rather than barter-based markets and relative prices, and then numeraire money and nominal prices). The Credit Money and State Money Approaches by L. Randall Wray, p.9

Religion played a key role in the establishment of money and markets from the very beginning. This makes sense, since religion was the primary unifying and coordinating “story” for ancient societies. The word religion itself derives from a Latin root meaning “to bind together.” We saw above the Biblical instructions on tithing.

Temples appear to have been the first banks and the first treasuries. Sumerian temples stored precious metals, made loans, rented land, coordinated labor, established prices for key goods, and determined fees and fines. The obolos, the lowest-denomination Greek coin, derived its name from the iron spits (obelos) through which sacrificial roast meat was evenly distributed to the members of the tribe. The drachma derives its name from obeliskon drachmai, a ‘handful’ of spits. This communal ritual is thought to have influenced Greek ideas of decentralized exchange and universal value, in contrast to the centrally-administered economies of the Near East. The iron spits acquired value in interpersonal exchange. Later, Greek temples distributed standardized lumps of metal, stamped with the city’s emblem, to all adult male members of the polis, which allowed the polis’s distinctive social order to be maintained.

In contrast to most ancient near-eastern societies, the Greek polis had retained sacrificial ritual that embodied the principle of communal egalitarian distribution. The fact that the Greek word for this distribution (moira) came to mean ‘fate’ indicates the importance of the distributional imperative. Citizenship was marked by participation in communal sacrifice, which also provided a model for the egalitarian distribution of metallic wealth in standardised pieces. Some of the vocabulary of early coinage comes from animal sacrifice. For instance, the word ‘obol’, used of a coin, comes from the word for a spit. In the communal egalitarian distribution meat was distributed on iron spits, which were of standard size as well as being portable and durable, i.e. they could be used as money (in a limited way). With the use of more precious metal in exchange, ‘obol’ was transferred to a piece that was of roughly equal value and so of much smaller size (and so even more convenient). Radical Anthropology, Richard Seaford interview (PDF)

Markets also appear to have emerged around religious buildings. Many ancient exchanges were near temples. The great fairs of Europe in places like Champagne and Lyon took place near churches and cathedrals under the watchful eye of the all-seeing God. Since so much of trade relies on trust and belief (credit comes from credere = “to believe”), it is logical that religion would play a central role:

We tend nowadays to think of religion as the non-material activity of mankind. Did not Jesus expel the moneychangers from the Temple? Does not Islam forbid the charging of interest on loans? Did not a similar Christian prohibition of usury hold back mediaeval Europe’s economic development for centuries? Yet when Jesus took his action against the money-changers he must have been reversing the tradition of several millennia. The temples were the source of commercial law and practice. They had developed writing for the keeping of their accounts. They imposed the moral code which made promises inviolable. In Mesopotamia temples employed the poor, the widows and the orphans in factories which produced textiles to be traded abroad for the commodities the region lacked, including silver, copper, tin and lead. They were, it seems, the major business centres. (Innes p. 136)

Some theorists posit that as exchanges became more common and societies became more affluent, they invented “big gods” that could see everything and demanded that we behave a certain way (honest, truthful, etc.). These “Big gods” could transcend the limits of the old tribal gods that were based on shared ancestry and culture. All you had to do was profess belief! Wealth may have driven the rise of today’s religions (Science)

Trade Credit (not gold or silver) was the primordial form of commercial money. Rather than barter or coins, credit lines were probably the usual medium of exchange in the days before currency became commonplace. In “primitive” cultures, reciprocity performs this role: one’s gifts to others circulate back to the giver in time, without a formal accounting of who owes what to whom. This helps maintain social relationships in small, close-knit societies.

As societies scale up, reciprocity is replaced with more formal agreements, often denominated in the standard unit of account. Even in modern times, credit is what is used to purchase inputs upfront, rather than just repeated spot transactions (as any businessperson or farmer can tell you).

The word credit is derived, very appropriately, from the Latin word for ‘to trust’. …the division of labour, from the very first moment it was applied, required the creation of a credit system of some kind. It was absolutely necessary to be able to trust one’s fellow workers’ promises to reward one appropriately at some future moment for one’s own products or services. It would have helped to have an enforcing authority, and that makes it all the more likely that trade was conducted in a regulated way, not by free individual option…it is obvious that a completely free market economy has rarely, if indeed ever, existed. We all rely on the existence of an enforcement system. We rely on the rule of law.

Trade credit (bills of exchange) formed much of the “money” of the Middle Ages and facilitated the commercial expansion in the absence of adequate gold and silver supplies. Huge amounts were transferred using double-entry bookkeeping (the “Venetian Method”) without any cash changing hands. Bankers would settle accounts at the conto, which concluded the fairs. In fact, this may have been the original purpose of the fairs, with retail trade being subsidiary. Eventually, as commerce became more and more important after 1600, these activities relocated to the permanent banks and bourses established by the major port cities to facilitate the activities of merchants and the expansion of long-distance trade.

Trade credit is the essential foundation of the whole economic system, and the essential financial problem of economic development is to monetise trade credit, to turn it into an instrument for transferring value, for measuring value and for storing value. Wray, p. 121

Tally sticks, which keep track of debts and credits, may be the earliest form of money to emerge, even before coins or clay tablets. They were made of organic materials such as wood and bone. Because metal coins are what survive, tally sticks are sadly omitted from standard accounts of money: What tally sticks tell us about how money works (BBC)

Debt servitude appears to be the earliest form of mass slavery. While slaves were often captured prisoners of war in primitive cultures, their numbers were necessarily limited, because having too many hostile foreigners living among your society and doing its essential chores would be dangerous (if not outright suicidal). That’s why in the ancient Near East slaves were mainly women and children employed in domestic labor (cooking, cleaning, weaving, child care, etc.). Rather than slave labor, the massive walls and monuments were built mainly by corvée labor (labor rendered as a communal obligation), which served as a sort of social glue and a proxy form of taxation in the absence of money.

But once debt became commonplace, large numbers of one’s own people could be compelled to labor for others in order to service their debts. This, as David Graeber points out, would be seen as just and fair, and thus the debtors would be less inclined to rebel. In fact, this may have been how the very first classes formed in ancient societies–debtors and creditors–rather than through military conquest or political decisions. Debt and chattel slavery existed side-by-side in most ancient societies. Even in colonial America, there were more indentured servants than African slaves.

Early rulers realized they needed to occasionally release the people from their debt obligations to public institutions (temples and palaces), or else lose popular support. They also needed enough free men to staff the armed forces, as debt-serfs could not afford to train or equip themselves. Debt-serfs could also run away. Some argue that the debt-serfs of ancient Mesopotamia, the Habiru, are the ancestors of the Jews (Hebrews).

And ye shall hallow the fiftieth year, and proclaim liberty throughout the land unto all the inhabitants thereof; it shall be a jubilee unto you; and ye shall return every man unto his possession, and ye shall return every man unto his family.
— Leviticus 25:10

Later, when professional soldiers replaced citizen armies, debt forgiveness was abolished and debts were held sacrosanct. This gave rise to a large class of hereditary debt serfs.

Only later did prisoners of war become the major source of slaves in the Classical world. Some of the first markets to emerge were slave markets where prisoners of war were bought and sold. The Roman war machine brought in tens of thousands of slaves, driving their cost down and displacing free labor in agriculture. This allowed wealth to concentrate to a degree that undermined social cohesion, and it may have been an underlying cause of Rome’s decline and fall.

All the major innovations in money and finance seem to have been created either to manage long-distance trade or to fund wars. The need to raise funds for war has seemingly driven every financial innovation since medieval times. The “national debt” began when Venice needed to fund a war with Byzantium, and so it borrowed from its wealthy merchant classes. This borrowing was eventually put on a permanent basis: all the creditors’ claims were consolidated into one lump sum, revenue streams were dedicated to servicing them, and payments were managed by a state-run bank. Thus the “national debt” was born. Various municipalities in Northern Europe also borrowed, but none of these debts were national. The Dutch seem to have been the first to leverage these techniques effectively on a national scale, in their fight for independence from Spain.

Going even farther back, it appears that coinage was first invented to pay troops. Coins were distributed to soldiers as payment, and a tax was then imposed on the conquered populations. The way to pay the tax was to acquire the coins–signifiers of the state’s debt–by selling goods and services to the occupiers, thus redeeming that debt. There is another clue in the language here: the word soldier comes from the solidus, a gold coin used to pay troops in the late Roman Empire.

There is firm evidence to support money being a state creation. Money appears in Europe at the time the Greek city states became reliant on mercenary armies. Cities paid soldiers in gold to conquer some community, the soldiers then spent the gold in the colonised lands and the state recovered the gold by taxing the colonised merchants and innkeepers using the tokens that the soldiers had paid for food and lodgings. Greek and Roman citizens never paid tax, only the conquered paid for the privilege and were bound to the conqueror by having to exchange their resources for the Imperial currency. The model would survive and drive colonialism in the modern age, in the 1920s the British taxed Kenya at a rate of about 75% of wages, forcing the colonised to grow cash-crops to be consumed by the colonisers. Lady Credit (Magic, Maths and Money)

Financial innovations spread through the need to wage wars. If a system gave one nation a competitive advantage, it had to be adopted by other nations in order to compete. It was a sort of Darwinian arms race: if a financial innovation allows a country to be more effective in trade or warfare, it will dominate countries that are unable to deploy their resources as effectively. The others will either adopt the innovation on their own or be subsumed into the empire (and thus gain the innovation that way). This is probably why financial techniques spread so rapidly in Western Europe as compared with China and the Middle East, which relied upon conscription and command-and-control systems rather than mercenaries and borrowing to fight wars.

In “patrimonial states”, where the state was an extension of the ruling family’s household, loans were essentially personal loans to the monarch and could be repudiated at any time. Only when parliamentary systems came into play did state borrowing become a reliable means for governments to raise money. Thus merchant republics led the way, first in Italy, and then in Holland.

John Law’s financial innovations were an attempt to consolidate and manage the massive debts Louis XIV had run up with his wars and extravagances. Similarly, the debts generated by King William’s Glorious Revolution and subsequent wars led to the creation of the Bank of England, a joint-stock company designed to lend to the government and manage the state’s debt. This is the ancestor of today’s central banks.

Borrowing marks a time when citizens become not only debtors of the state, but creditors as well, profoundly altering the social relations between the state and its citizens. The effects were distributional–from the public sector at large to the wealthy citizens and institutions who held the bonds. Over time, this group became more and more influential. Borrowing allows nations to bring resources forward in time. It also allows borrowing from a wider range of people and institutions than just banks.

Warfare has also been the reason for abandoning precious metal standards. The need to issue adequate money to fight wars has led to the suspension of convertibility of money and the rise of fiat currencies. Every time any sort of fixed standard has been tried, warfare undermines it.

Trading empires are the major source of financial innovations. It’s no coincidence that major financial innovations occur in thalassocracies reliant upon long-distance trade: first the Italian city-states such as Venice and Genoa, then the Spanish and Portuguese empires, then the United Provinces (the Dutch Republic), and finally the English Whig merchants who invented the modern monetary system.

Why were trading empires such a fertile source for innovations such as insurance and limited-liability corporations? Because trading voyages require enormous sums of investment upfront and the outcome is highly uncertain.

Consider the enormous length of these wind-powered trading voyages. It took 12-18 months to reach the Indies, and then you had to procure the cargo, so it could be 3-4 years before a profit was realized. Trading voyages therefore required a very high level of capitalization, and investors did not get an immediate return on their funding. They also required large amounts of infrastructure: ships were expensive, trading forts were expensive, soldiers were expensive, and so large amounts of resources had to be brought together. This was beyond the capacity of any single entity, so resources needed to be pooled. In addition, trade with the Indies required a permanent infrastructure rather than resources for a single voyage. You needed a trade with long-term continuity to realize a profit; single-purpose funding would not do.

It is these requirements that led to financial innovations, from the medieval Italian commenda (a temporary limited partnership) to the limited-liability joint-stock corporation, where ownership is negotiable and wholly separated from direct management.
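Here is a toy sketch of what such pooled, negotiable ownership buys you (all figures and investor names are invented): capital is subscribed up front, shares can change hands mid-voyage, and profits, whenever they finally arrive, are split pro rata.

```python
# Toy joint-stock arrangement: pooled capital, transferable shares,
# pro-rata payout long after the initial subscription.
subscriptions = {"investor_a": 5_000, "investor_b": 3_000, "investor_c": 2_000}
total_capital = sum(subscriptions.values())   # 10,000 pooled up front

# Ownership is negotiable: investor_b sells out before the voyage pays off.
subscriptions["investor_d"] = subscriptions.pop("investor_b")

profit = 4_000   # realized perhaps 3-4 years after the capital was raised
payouts = {name: profit * stake / total_capital
           for name, stake in subscriptions.items()}
print(payouts)   # {'investor_a': 2000.0, 'investor_c': 800.0, 'investor_d': 1200.0}
```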

Money played a huge role in the evolution of Western philosophy and mathematics. European mathematics achieved a high degree of sophistication due to the need to deal with multiple currencies at the same time as the Church’s prohibition on usury. The earliest mathematical treatises were all concerned with practical matters in trade and jurisprudence, not abstract science. Rather than sophisticated mathematics being developed to explain physical phenomena, it was first developed to manage trade risks and calculate transaction costs. Only later were these techniques applied to the natural world. Often, early scientists began their mathematical training in commerce. Newton and Copernicus both wrote treatises on money. After 1600, the commercial and scientific revolutions gained steam together.

European science did not start in the Renaissance, it existed in the High Middle Ages. The ‘renaissance’ of the ‘long twelfth century’ resulted in what the historian Joel Kaye describes as, “the transformation of the conceptual model of the natural world…[which] was strongly influenced by the rapid monetisation of European society taking place [between 1260-1380].” and played a pivotal role in the development of European science. Thirteenth century scholars, “[were] more intent on examining how the system of exchange actually functioned than how it ought to function…” It seems that Fibonacci did not just influence medieval merchants, those scholars keeping an eye on merchant’s dubious dealings, also, became obsessed with mathematics… Who was the first quant? (Magic, Maths and Money)

Going even further back in time, many of the distinctive features of ancient Greek society (and hence Western civilization) such as science, philosophy and democracy, may have their origins in the use of money and trade in the Greek world:

The first ever pervasive monetisation in history (of the Greek polis) made possible for the first time various features of Greek culture that have in a sense persisted throughout monetised society up to the present. I confine myself here to two examples. One is the idea that the abstract is more real than the concrete, which was a basis of much ancient philosophy. Another is the absolute isolation of the individual: this is especially manifest in Greek tragedy, where, for instance, Oedipus is entirely alienated from the gods and from his closest kin. Both these features are familiar to us, but do not occur in pre-monetary society. Radical Anthropology, Richard Seaford interview (PDF)

Loans create deposits, not vice-versa. This is called the “endogenous theory of money.” It claims that the amount of money is constrained only by the number of willing borrowers in the economy, and not by the amount of reserves held by the central bank.

In short, the endogenous money approach reverses two causalities proposed by orthodoxy: 1) reserves create deposits; and 2) deposits create loans. On the contrary, the endogenous money approach holds that loans create deposits, which then create the need for the central bank to accommodate with reserves. In other words, banks first make loans, and then seek reserves to meet central bank regulations…

…Suppose Henry decides to hire Joshua to build a condo. In theory, Henry could issue his own money/IOU to Joshua in exchange for Joshua’s labor time. The problem is, Joshua would probably not accept Henry’s own liability (the Henry dollar) because Henry cannot sufficiently indebt the rest of the population to create a demand for his own IOU. Instead, Joshua agrees to exchange his labor only for the liability of the state (U.S. dollars). Therefore, Henry needs somehow to convert his own IOU to the state IOU in order to get Joshua’s labor. Now, when Henry walks into a bank and asks for a loan, the loan officer does not check the bank’s reserves at the central bank and come back to tell Henry, “sorry, we are out of money!” If the bank thinks Henry’s project is good, it creates a Henry loan simply by crediting Henry’s deposit account. To meet the reserve requirement, the bank then borrows reserves from other banks that have excess reserves or directly from the central bank. What distinguishes the bank’s IOUs from Henry’s IOUs is that the former are directly convertible to the central bank/state IOUs while the latter are not. Vincent Huang, On the Nature of Money
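Huang’s story can be compressed into a few lines of Python. This ToyBank class is my own hypothetical illustration of the “loans first, reserves after” sequence, not a model of any actual bank’s operations:

```python
# Toy balance sheet for "loans create deposits": the deposit is created
# first; the reserve position is sorted out afterwards.
class ToyBank:
    def __init__(self, reserve_ratio=0.10):
        self.loans = 0.0      # assets
        self.deposits = 0.0   # liabilities
        self.reserves = 0.0   # claims on the central bank
        self.reserve_ratio = reserve_ratio

    def make_loan(self, amount):
        # No reserve check here: the deposit is keystroked into
        # existence against the new loan asset.
        self.loans += amount
        self.deposits += amount

    def meet_reserve_requirement(self):
        # Only afterwards does the bank borrow whatever reserves the
        # rules require (from other banks or the central bank itself).
        required = self.deposits * self.reserve_ratio
        if self.reserves < required:
            self.reserves = required   # borrowed, not pre-existing

bank = ToyBank()
bank.make_loan(100_000)          # Henry gets his condo loan
bank.meet_reserve_requirement()
print(bank.loans, bank.deposits, bank.reserves)  # 100000.0 100000.0 10000.0
```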

The government is not revenue constrained. The above leads to the conclusion that in order to collect taxes, the government must first issue the money-thing it wishes to collect. Rather than taxes funding spending, spending funds taxes! And if the government (public sector) collects more in taxes than it spends, it reduces the money supply and pushes the private sector into deficit by the equivalent amount (the “sectoral balances” approach).
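The accounting behind this is almost embarrassingly simple. Ignoring the foreign sector for the moment (a simplifying assumption on my part), the government’s balance and the private sector’s balance are mirror images by identity:

```python
# Toy sectoral balances, closed economy: one sector's surplus is the
# other's deficit, as a matter of accounting rather than theory.
government_spending = 90
taxes = 100

government_balance = taxes - government_spending   # +10: government surplus
private_balance = -government_balance              # -10: private deficit

assert government_balance + private_balance == 0
print(government_balance, private_balance)
```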

A sovereign issuer of currency can never “run out” of money, nor can it go “bankrupt.” It can, however, be short of key resources, productive capacity, willing borrowers, or faith in the governing institutions. In such cases, excess money in the economy could lead to inflation, which is the real constraint on issuing money. Taxation serves as a way of “un-printing” money to bring inflation under control.

Although the finer points of MMT can get quite involved, the most basic takeaway is very simple. For societies with currency-issuing governments:

If something can be done, it is “affordable”.

If we have access to the raw materials, the labor power, the skills, the equipment and the facilities needed to produce something, then we can afford to produce it. The cost of doing so is not financial. The cost is a real cost: the exertion of human effort and know-how, the wear and tear on facilities and equipment, and the depletion of natural resources.

On one level, it is bizarre that this basic takeaway of MMT is not already mainstream. If the idea is heterodox, it is only because we are currently living in a very topsy-turvy world, in which up is presented to us as down, black as white, with everything reversed. In reality, it should be much harder to believe the opposite: that what we are capable of is impossible. If it’s Doable, it’s Affordable (heteconomist)

What constitutes “money” is not so simple. Many things can be used as money. Stocks can be thought of as a kind of money (since they are an IOU). Equity can be used as money. Items with intrinsic value (or perceived intrinsic value) can be used as money. Gift cards are a type of money. So are airline points. No doubt John Law’s theories derived in part from his observations at the gambling tables of Europe. There he observed that all sorts of things could serve as money in a pinch: coins, stocks, bonds, jewelry, certificates of deposit, deeds to land, checks, even hastily scribbled IOU notes. Anything that is accepted in payment, whether gold, stocks, bonds, cash, or IOUs, can be used as money, he concluded.

As MMT theorists say, anyone can create money; the challenge is getting it accepted. As we saw, private monies circulated alongside state money and state borrowing before the two were combined in England, where bills of exchange became enforceable under contract law outside of merchant courts. Because the state’s liabilities and credit are generally the most reliable (except in cases of state failure), its money generally becomes the ‘money thing’ at the top of the pyramid—the ultimate means of settlement for various debts.

Recall that money represents a promise/IOU and that these promises can be created by anyone. The ‘secret’ to turning these promises into money is getting other individuals or institutions to accept them. Therefore, the ‘hierarchy of money’ can be thought of as a multi-tiered pyramid where the tiers represent promises with differing degrees of acceptability. At the apex is the most acceptable or ‘ultimate’ promise…The ‘simplified hierarchy’ can be envisioned as a four-tiered pyramid, with the debts of households, firms, banks and the state each representing a single tier…All money in the hierarchy represents an existing relationship between a creditor and a debtor, but the more generally acceptable debts will be situated higher within the hierarchy…as the decisive money of the system, both the state’s promises and banks’ promises rank high among the monies of the hierarchy. Although bank money is part of the ‘decisive’ money of the system, its acceptance at state pay-offices really requires its conversion into state money (i.e., bank reserves). That is, bank money is converted to bank reserves so that (ultimately) the state actually accepts only its own liabilities in payment to itself. The debt of the state, which is required in payment of taxes and is backed by its power to make and enforce laws, is the most acceptable money in the pyramid and, therefore, occupies the first tier. Stephanie Bell, The role of the state and the hierarchy of money, p. 10-12

The test of ‘moneyness’ depends on the satisfaction of both of two conditions. First, the claim or credit is denominated in an abstract money of account. Monetary space is a sovereign space in which economic transactions (debts and prices) are denominated in a money of account. Second, the degree of moneyness is determined by the position of the claim or credit in the hierarchy of acceptability. Money is that which constitutes the means of final payment throughout the entire space defined by the money of account. Geoffrey Ingham, The Emergence of Capitalist Credit Money p. 214
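The pyramid Bell and Ingham describe can be caricatured as a simple ordered list. This is only a sketch of the idea (the tier labels are my own shorthand): promises lower down are ultimately settled in the money of the tier above.

```python
# Toy hierarchy of money: the higher the tier, the more widely
# acceptable the promise; settlement happens one tier up.
HIERARCHY = [  # tier 0 is the apex
    "state money (currency and bank reserves)",
    "bank money (deposits and banknotes)",
    "firm IOUs (trade credit)",
    "household IOUs (personal debts)",
]

def settles_in(tier):
    """A promise on one tier is ultimately converted into the tier above it."""
    return HIERARCHY[max(tier - 1, 0)]

print(settles_in(1))  # bank money is finally settled in state money
```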

In our society, money has multiple uses: means of exchange, store of value, unit of account, and settlement of debts. That these things are all embodied in a single item we call “money” is not a natural phenomenon but a feature of capitalist credit money which allows this system to function as it does. That invention took a long time and it’s probably not over yet.

The Origin of Money 10 – The Birth of Modern Finance

Money Becomes Metal

It is one of the great ironies of history that at the same time the modern financial system and banking was being invented, Enlightenment thinkers discarded thousands of years of monetary history and declared money to be based on the intrinsic value of precious metals alone.

Events like the Kipper- und Wipperzeit had convinced scholars in Europe that a stable value of coins depended on issuing coins with constant, fixed amounts of precious metal. This, they reasoned, would prevent the rapid hyperinflation and deflation that were wreaking havoc on monetary systems throughout Europe.

…This monetary terrorism had its roots in the economic problems of the late 16th century and lasted long enough to merge into the general crisis of the 1620s caused by the outbreak of the Thirty Years’ War, which killed roughly 20 percent of the population of Germany. While it lasted, the madness infected large swaths of German-speaking Europe, from the Swiss Alps to the Baltic coast, and it resulted in some surreal scenes: Bishops took over nunneries and turned them into makeshift mints, the better to pump out debased coinage; princes indulged in the tit-for-tat unleashing of hordes of crooked money-changers, who crossed into neighboring territories equipped with mobile bureaux de change, bags full of dodgy money, and a roving commission to seek out gullible peasants who would swap their good money for bad. By the time it stuttered to a halt, the kipper- und wipperzeit had undermined economies as far apart as Britain and Muscovy, and—just as in 1923—it was possible to tell how badly things were going from the sight of children playing in the streets with piles of worthless currency.

“Kipper und Wipper”: Rogue Traders, Rogue Princes, Rogue Bishops and the German Financial Meltdown of 1621-23 (Smithsonian Magazine)

In England, this concept was most forcibly argued by John Locke. A pound was a specific amount of silver, he declared, and should be held inviolable.

He did this because he wanted to argue that property rights were natural and absolute phenomena that did not rest on any sort of monarchial authority or social contract. In line with this reasoning, he needed money to also be a “natural thing,” not anchored in social relations and certainly not under the control of a sovereign.

At this time, England’s coinage was in rough shape. Much of it had remained unchanged for a hundred years, and clipped coins circulated alongside newer issues. People tended to save the good coins and spend the clipped ones, causing a loss of faith in the currency.

It was increasingly clear that the Mint had to offer recoinage …But at what rate? Mercantilists, who tended to be inflationist, clamoured for debasement, that is, recoinage at the lighter weight, devaluating silver coin and increasing the supply of money. In the meanwhile, the monetary problem was aggravated by a burst of bank credit inflation created by the new Bank of England, founded in 1694 to inflate the money supply and finance the government’s deficit. As the coinage problem came to a head in that same year, William Lowndes (1652–1724), secretary of the treasury and the government’s main monetary expert, issued a “Report on the Amendment of Silver Coin” in 1695, calling for accepting the extant debasement and for officially debasing the coinage by 25 percent, lightening the currency name by a 25 percent lower weight of silver.

[John] Locke had denounced debasement as deceitful and illusionist: what determined the real value of a coin, he declared, was the amount of silver in the coin, and not the name granted to it by the authorities. Debasement, Locke warned…is illusory and inflationist: if coins, for example, are devalued by one-twentieth, “when men go to market to buy any other commodities with their new, but lighter money, they will find 20s of their new money will buy no more than 19 would before.” Debasement merely dilutes the real value, the purchasing power, of each currency unit.

Threatened by the Lowndes report, Locke’s patron, John Somers, who had been made Lord Keeper of the Great Seal in a new Whig ministry in 1694, asked Locke to rebut Lowndes’s position before the Privy Council. Locke published his rebuttal later in the year 1695…Locke superbly put his finger on the supposed function of the Mint: to maintain the currency as purely a definition, or standard of weight of silver; any debasement, any change of standards, would be as arbitrary, fraudulent, and unjust as the government’s changing the definition of a foot or a yard. Locke put it dramatically: “one may as rationally hope to lengthen a foot by dividing it into fifteen parts instead of twelve, and calling them inches.”

…Locke’s view triumphed, and the recoinage was decided and carried out in 1696 on Lockean lines: the integrity of the weight of the silver denomination of currency was preserved. In the same year, Locke became the dominant commissioner of the newly constituted board of trade. Locke was appointed by his champion Sir John Somers, who had become chief minister from 1697 to 1700. When the Somers regime fell in 1700, Locke was ousted from the board of trade, to retire until his death four years later. The Lockean recoinage was assisted by Locke’s old friend, the great physicist Sir Isaac Newton (1642–1727) who, while still a professor of mathematics at Cambridge from 1669 on, also became warden of the Mint in 1696, and rose to master of the Mint three years later, continuing in that post until his death in 1727. Newton agreed with Locke’s hard-money views of recoinage.

John Locke vs. the Mercantilists and Inflationists (Mises Institute)

Because the price paid by the Royal Mint for gold and silver was fixed and no longer allowed to adjust freely with supply and demand, gold was shipped to England, where the Mint paid a premium for it, while silver left the country for continental Europe, where it was worth more. The resulting shortage of silver coins in England caused economic contraction.

…the Bank of England’s formation also coincided with the reconceptualization of money as simply precious metal in another form—a fable told most prominently by John Locke. In earlier centuries, everyone accepted that kings could reduce the metal content of coins and, indeed, there were good economic reasons to do so. Devaluing coins (raising the nominal price of silver) increased the money supply, a constant concern in the medieval and early modern periods, while revaluing coins (keeping the nominal price of silver but calling in all old coins to be reminted) imposed deflation on the economy. But Locke was the most prominent spokesperson for hard money—maintaining the metal content of coins inviolate. The theory was that money was simply metal by another name, since each could be converted into the other at a constant rate.

The practice, however, was that the vast majority of money—Bank of England notes, bills of exchange issued by London banks, and bank notes issued by country banks—could only function as fiat money. This had to be the case because the very policy of a constant mint price had the effect of driving silver out of coin form, vacuuming up the coin supply. If people actually wanted to convert their paper money into silver or gold, a financial crisis could be prevented only through a debt-financed expansion of the money supply by the Bank of England—or by simply suspending convertibility, as England did in the 1790s.

… at the same time that the English political system invented the modern monetary system, liberal theorists like Locke obscured it behind a simplistic fetishization of gold. The fable that money was simply transmutated gold went hand in hand with the fable that the economy was simply a neutral market populated by households and firms seeking material gain. This primacy of the economic over the political—the idea that government policy should simply set the conditions for the operation of private interests—is, of course, one of the central pillars of the capitalist ethos. Among other things, it justified the practice of allowing private banks to make profits by selling liquidity to individuals (that’s what happens when you deposit money at a low or zero interest rate)—a privilege that once belonged to sovereign governments.

Mysteries of Money (The Baseline Scenario)

The Great Monetary Settlement

By the late 1600’s two major forms of currency circulated: the government money issued in coin form, and the capitalist credit money issued by private bankers. Both were forms of transferable debt, but were used in very different spheres of exchange:

By the late seventeenth century, the two forms of money were available but unevenly spread across Europe – private credit and public metallic coinage. However, they remained structurally distinct and their respective producers – that is, states and capitalist traders – remained in conflict…England was best placed…to effect any integration of the different interests that were tied to the different moneys…[1]

Unlike its cousins on the continent, England’s finances were fairly stable, and its debt manageable. That is, until 1672, when the Stop of the Exchequer was declared by King Charles II. This was essentially a default by England on its debts. The crown repudiated the tallies held by its creditors, causing tally sticks to fall into disrepute and clearing the way for paper instruments to replace them as signifiers of state debt:

Charles II’s debt default in 1672 was critically important in hastening the adoption of public banking as a means of state finance and credit money creation. Since the fourteenth century, English kings had borrowed, on a small scale, against future tax revenues. The tally stick receipts for these loans achieved a limited degree of liquidity ‘which effectively increased the money supply beyond the limits of minting’.

However, compared with state borrowing in the Italian and Dutch republics, English kings, like all monarchs, were disadvantaged by the very despotic power of their sovereignty. Potential creditors were deterred by the monarch’s immunity from legal action for default and their successors’ insistence that they could not be held liable for any debts that a dynasty might have accumulated. [2]

The rising Whig merchant class wanted a monarch who would put the country’s finances on a sounder basis. Since they were overwhelmingly Protestant, putting a Protestant on the throne in place of the Catholic James Stuart provided the perfect pretext. It was, in essence, a coup d’etat by the banking and merchant classes.

With an impending war with the Dutch, an annual Crown income of less than £2 million, and accumulated debts of over £1.3 million, Charles II defaulted on repayment to the tally holders in the Exchequer Stop of 1672. This event…culminated in the invitation to William of Orange to invade and claim the throne…[3]

An alliance of the Whigs and Tories recruited the husband of James’s daughter Mary, the Dutch prince William of Orange, for the job. William and Mary took the throne in the last major invasion of England. It was a mostly bloodless revolution, but not entirely peaceful: the Jacobite risings would roil parts of the British Isles for decades, mainly in the outer regions of the empire (Scotland, Ireland, etc.).

The bloodless coup would have profound effects on the history of the financial system. William brought “Dutch finance” across the channel to England, where it would be used to reorganize the state’s finances.

Because the revolution had been backed and funded by Whig parliamentarians, they called the shots. It was the end of England’s absolute monarchy and the beginning of the “king-in-parliament,” an unusual fusion of monarchial power and public accountability. They made William accept a “Bill of Rights” in 1689, one provision of which was that the power to raise funds would rest strictly with parliament. In other words: no more arbitrary taxes or defaults.

William subsequently dragged England into several wars on the continent:

Roey Sweet: “So the reason why the national debt is rising at this time, and by 1714 it’s about 48 million [pounds], is that Britain’s been involved in two long and expensive wars. Following the Glorious Revolution, William of Orange brings Britain into the Nine Years War against Louis the 14th, and then from 1701 Britain’s been involved in the War of the Spanish Succession which is a battle essentially to try and prevent the Bourbons from gaining ascendancy in Europe by uniting the Spanish and the French empires. So Britain has been fighting this, and it’s seen as a Whig war…and there’s a suspicion that it’s being prolonged purely for Whig interests. And so [Chancellor of the Exchequer Robert] Harley wants to try and end the war and also to get the debt into manageable proportions.” [4]

William needed to borrow to fight his wars, and his credit score was awful. His debt load from conducting the Glorious Revolution was already very high, meaning that no one wanted to loan to him. The interest rates he was looking at were in the neighborhood of modern-day credit cards—18-20 percent.

The prevention of any recurrence of default was a paramount consideration which parliament put to the new Dutch king in the constitutional settlement of 1689. In the first place, William was intentionally provided with insufficient revenues for normal expenditure and, consequently, was forced to accept dependence on parliament for additional funds. Second, with William’s approval, and the expertise of his Dutch financial advisors, the government adopted long-term borrowing in the form of annuities (Tontines). These were funded by setting aside specific tax revenues for the interest payments. [5]

England managed its debt in a variety of ways, many of them similar to the methods used on the continent. But one new technique was coming into play. By this time, in order to exploit the resources of the New World and conduct trading operations where long-term investments were required, Europeans had invented the joint-stock company—a company whose ownership was diversified among a group of unrelated individuals and could be bought and sold at will.

The legal ingredients that comprise a corporation came together in a form we would recognise in England, on New Year’s Eve, in 1600. Back then, creating a corporation didn’t simply involve filling in some routine forms – you needed a royal charter. And you couldn’t incorporate with the general aim of doing business and making profits – a corporation’s charter specifically said what it was allowed to do, and often also stipulated that nobody else was allowed to do it. The legal body created that New Year’s Eve was the Honourable East India Company, charged with handling all of England’s shipping trade east of the Cape of Good Hope. [6]

Joint-stock companies had originally been formed to undertake long-distance trading expeditions and to exploit the resources of the New World. Now they would be pressed into service to reorganize and manage the nation’s debt, and would be chartered for this purpose:

Melvyn Bragg: How was the government handling its debt before the South Sea Company was set up?

Anne Murphy: The government is handling its debt in three ways.

It’s created lottery schemes which are very popular, and they’re attractive to a broad spectrum of individuals. So it can raise money that way.

It sells annuities which again are very popular, but they’re very costly, and they’re quite inflexible.

And the government is also using the moneyed companies to support its debt raising activities. The first one of those is called the Bank of England which is set up in 1694. The Bank of England does two things: it lends to government, and also it’s one of the first companies that does the debt for equity swaps that the South Sea Company is to become so famous for, later.

[…]

Melvyn Bragg: Was that seen at the time as something that was okay; that a private company taking over part of a national debt was fine?

Anne Murphy: It’s actually just a change of lender, really. What’s being switched here is the many, many lenders–the individuals who bought annuities or who bought lottery tickets from the government–for one lender: the Bank of England or the South Sea Company. So it’s not that a private company is in essence taking over the debt. What it’s doing is just consolidating the debt in one set of hands rather than many sets of hands.

And this helps because it makes administration easier and it brings costs down, and that’s what the government wants. So it’s a desirable thing to do. [7]

The modern money system began when governments started using joint-stock corporations to manage their finances in exchange for “special” privileges–specifically the privilege of extending credit denominated in the government’s official currency. The government, in essence, became a debtor to these private corporations, which are the ancestors of our modern banks. The debt was then monetized and circulates to this day as money. The Bank of England, funded by the subscribers from the merchant classes, bought the state’s debt and used it as backing for their banking operations. The merchant bankers, in essence, kidnapped the state’s money for their own uses.

From 1694 to 1697, the directors of the new Bank of England laid the true foundations for the financial revolution by lending the government £1.2 million, at the then attractive rate of 8 per cent, in order to secure their monopoly on joint-stock banking, raising the funds by selling Bank stock. Though redeemable on one year’s notice from 1706, the loan was in fact perpetual. In 1698, the New East India Company made a similar 8 per cent perpetual loan to secure its charter, as did the newly merged United East India Company in 1709.

From 1704 to 1710, the exchequer also issued irredeemable annuities…and…a series of highly popular lottery loans. Meanwhile, in 1711, the newly formed South Sea Company bought up…short-term floating debts and converted them into so-called perpetual stock with a 5 per cent return; and in 1720, it converted another £13.99 million in other loans and annuities into 5 per cent perpetual stock, a venture that led to its collapse in 1721 in the famous ‘Bubble’. Thereafter, while redeeming £6.5 million in South Sea stock and annuities, the Bank of England, on behalf of the government, issued several series of redeemable ‘stock’, many containing the popular lottery provisions, with generally lower rates of interest…

The Bank of England wasn’t the world’s oldest bank, nor even the first state bank. But what made it unique was the idea of the merchant classes lending to the government and, in return, gaining a measure of control over the nation’s finances. The multiple and conflicting systems of money and borrowing would be fused together for the first time in a single institution. Felix Martin calls this “The Great Monetary Settlement:”

The Bank’s primary role would … be to put the sovereign’s credit and finances on a surer footing. Indeed, its design, governance, and management were to be delegated to the mercantile classes precisely in order to ensure confidence in its operations and credit control. But in return the sovereign was to grant important privileges. Above all, the Bank was to enjoy the right to issue banknotes – a licence to put into circulation paper currency representing its own liabilities, which could circulate as money. There was to be, quite literally, a quid pro quo. [8]

… the idea of the hybrid Bank of England found a powerful group of supporters in the circle of ambitious Whig grandees who were soon to dominate the first party-political administration of the country. They realised that [Projector William] Paterson’s Project could deliver a Great Monetary Settlement.

If they and the private money interest they represented would agree to fund the king on terms over which they, as the Directors of the new Bank, would have a statutory say, then the king would in turn allow them a statutory share in his most ancient and jealously guarded prerogative: the creation of money and the management of its standard. To be granted the privilege of note issue by the crown, which would anoint the liabilities of a private bank with the authority of the sovereign – this, they realised, was the Philosopher’s Stone of money. It was the endorsement that could liberate private bank money from its parochial bounds. They would lend their credit to the sovereign – he would lend his authority to their bank. What they would sow by agreeing to lend, they would reap a hundredfold in being allowed to create private money with the sovereign’s endorsement. Henceforth, the seigniorage would be shared. [9]

With the foundation of the Bank of England, the money interest and the sovereign had found an historic accommodation…This compromise is the direct ancestor of the monetary systems that dominate the world today: systems in which the creation and management of money are almost entirely delegated to private banks, but in which sovereign money remains the “final settlement asset,” the only credit balance with which the banks on the penultimate tier of the pyramid can be certain of settling payments to one another or to the state. Likewise, cash remains strictly a token of a credit held against the sovereign, but the overwhelming majority of the money in circulation consists of credit balances on accounts at private banks. The fusion of sovereign and private money born of the political compromise struck in 1694 remains the bedrock of the modern monetary world. [10]

This effectively created modern finance. The state’s debt was monetized by private banks, who gained the ability to lend the state’s “official” money through the extension of private credit. No longer would money creation and manipulation be exclusively a tool of the sovereign. The two different money systems—government coinage and bills of exchange—were fused into one here for the first time. Because it was backed by state debt (and ultimately tax revenue), the bank’s money became by far the most trustworthy legal means of settlement, and soon it became the predominant one—the final “money thing” at the apex of the pyramid.
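
Before moving on, it may help to restate the settlement as a balance sheet. This is a deliberately stylized sketch: the £1.2 million loan at 8 per cent comes from the passage quoted above, but the one-to-one note issue is a simplifying assumption for illustration, not a description of the Bank’s actual books.

```python
# Stylized balance sheet for the 1694 settlement (illustrative figures).
bank_of_england = {
    "assets": {"loan_to_crown": 1_200_000},   # perpetual loan to the government
    "liabilities": {"banknotes": 1_200_000},  # paper claims circulating as money
}

# The crown services the loan out of earmarked tax revenue...
annual_interest = 0.08 * bank_of_england["assets"]["loan_to_crown"]
print(annual_interest)  # 96000.0 pounds per year

# ...while the Bank's notes, backed by that sovereign debt, circulate as cash.
print(bank_of_england["liabilities"]["banknotes"])  # 1200000
```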

In effect, the privately owned Bank of England transformed the sovereign’s personal debt into a public debt and, eventually in turn, into a public currency.

This fusion of the two moneys, which England’s political settlement and rejection of absolutist monetary sovereignty had made possible, resolved two significant problems that had been encountered in the earlier applications of the credit-money social technology.

First, the private money of the bill of exchange was ‘lifted out’ from the private mercantile network and given a wider and more abstract monetary space based on an impersonal trust and legitimacy…

Second, parliament sanctioned the collection of future revenue from taxation and excise duty to service the interest on loans…The new monetary techniques conferred a distinct competitive advantage, which, in turn, eventually ensured the acceptability of England’s high levels of taxation and duties for the service of the interest on the national debt.

The most important, but unintended, longer-term consequence of the establishment of the Bank of England was its monopoly to deal in bills of exchange. Ostensibly, the purchase of bills at a discount before maturity was a source of monopoly profits for the Bank. But it also proved to be the means by which the banking system as a whole became integrated and the supply of credit money (bills and notes), influenced by the Bank’s discount rate.

The two main sources of capitalist credit money that had originated in Italian banking practice – that is, the public debt in the form of state bonds and private debt in the form of bills of exchange – were now combined for the first time in the operation of a single institution. But of critical importance, these forms of money were introduced into an existing sovereign monetary space defined by an integrated money of account and means of payment based on the metallic standard. [11]

The Bank of England would issue banknotes, which were liabilities of the bank that could circulate as money. Banknotes were originally records of deposits of coin (sterling) redeemable at the banks. Eventually, banknotes became simply records of deposits unlinked to any other coins or commodities. They circulated as paper records of debits and credits, similar to the tally sticks which they replaced (the old tallies were burned en masse):

In 1694, the Bank of England stepped in. Originally a private company, it was founded to create money backed by its gold holdings that could be exchanged for Treasury pledges over future taxes. In contrast to the old tally stick system, these pledges, known as ‘gilt-edged’ stock, or gilts, came with redemption dates and paid a fixed rate of interest.

These changed characteristics of a fixed date and rate of return made the pledges resemble debts. However, the difference is that these pledges are ownership claims created by an individual over his own income, whereas a debt claim is created by one individual over another individual’s income. The correct analogy is to think of gilt-edged stock as akin to interest-bearing shares or equity bought by investors in UK Incorporated, with a redemption date.

The position today is quite similar, except that the Bank of England is now State owned and the pound sterling is not backed by gold but by faith alone.

The fiscal myth of tax and spend shared by virtually all schools of economics is that tax is first collected and then spent. This has never been the case: the reality… has always been that government spending has come first and taxation later. The reality is that taxation acts to remove money from circulation and to prevent inflation: it does not fund and never has funded public spending.

The Myth of Debt (UCL)

The second method of financing the debt mentioned above was the debt-for-equity swap. This was tried in both England and France, and in both countries it ended in a bursting stock bubble. But the differences between the two episodes would have profound consequences for world history.

In England, the monetary system would remain fairly intact. In France, by contrast, the collapse would take down the entire financial system and cripple the nation economically for a generation. The end result would be an Industrial Revolution in England, and revolt, revolution and dictatorship in France.

Remarkably, this would all take place in just 4 years–from 1716 to 1720. One result would be the issuance of the first true paper money in Europe. The other would be the first major stock bubble and collapse.

John Law’s System

John Law was the son of an Edinburgh goldsmith. Goldsmiths, like pawnbrokers before them, functioned as low-level proto-bankers. They issued receipts against the gold deposited with them. Occasionally they would issue receipts in excess of the gold stored in their vaults, knowing that not everyone would wish to redeem their gold at the same time. These receipts circulated as proto-money, but they are not the direct ancestor of the banknotes we use today, as some have claimed.

In 1694, when the Bank of England was being founded, John Law killed a man in a duel over a woman. Law was living large in England at the time. Now an outlaw, he first went to hide out in Scotland, and when the Acts of Union were passed in 1707, he fled for the continent. He made his way to Genoa, and then to Amsterdam, where he was able to observe their financial systems and banking practices first hand.

Eventually he made his way to France, where he became a professional gambler, dandy, and bon-vivant. Through an unlikely series of circumstances and networking, he wound up a personal friend of Philippe II, the Duc d’Orléans, a high-ranking aristocrat who ruled as regent of France for Louis XV, the boy king who was still too young to govern.

Just as in England, the finances of the French state were a disaster due to funding wars all over the continent and the profligacy of the king. Versailles didn’t come cheap.

Europe’s most powerful nation had several major financial problems:

1.) There was not enough money in circulation because of a shortage of coins (a ‘liquidity crisis’).

2.) The French government’s debt was effectively unpayable. The interest rates were staggering, and the billets d’etat were what we might today call “junk bonds.”

3.) The privatized and localized tax system was horribly inefficient, preventing the state from collecting taxes effectively. It was riddled with graft and corruption—only a fraction of what was collected made its way into state coffers. The rest ended up in the hands of a corrupt money interest, who resisted any attempts at reform.

When Louis the 14th, the “Sun King,” died, France was in a state of bankruptcy. Continuous warfare had left France short of money and facing a sizeable state debt. Furthermore, the tax revenue collection system had been farmed out to the private sector, leaving the financiers (gens de finance)…the veritable controllers of the financial system. They exerted control by managing the tax farms and lending money to the state. In effect, the state was heavily mortgaged to the financiers. [12]

Law used his friendship with the regent to propose a radical reorganization of the French state’s finances. Having observed the English, Genoese, and Dutch banking and financial systems at first hand, he had the foundation for his own ideas.

[Law’s] theory consisted in two propositions. One was that the world had insufficient supplies of metal money to do business with. The other was that, by means of a bank discount, a nation could create all the money it required, without depending on the inadequate metallic resources of the world…Law did not invent this idea. He found the germs of it in a bank then in existence— the Bank of Amsterdam. This Law got the opportunity to observe when he was a fugitive from England.

The Bank of Amsterdam, established in 1609, was owned by the city. Amsterdam was the great port of the world. In its marts circulated the coins of innumerable states and cities. Every nation, many princes and lords, many trading cities minted their own coins. The merchant who sold a shipment of wool might get in payment a bag full of guilders, drachmas, gulden, marks, ducats, livres, pistoles, ducatoons, piscatoons, and a miscellany of coins he had never heard of.

This is what made the business of the moneychanger so essential. Every moneychanger carried a manual kept up to date listing all these coins. The manual contained the names and valuations of 500 gold coins and 340 silver ones minted all over Europe. No man could know the value of these coins, for they were being devalued continually by princes and clipped by merchants. To remedy this situation the Bank of Amsterdam was established.

Here is how it worked. A merchant could bring his money to the bank. The bank would weigh and assay all the coins and give him a credit on its books for the honest value in guilders. Thereafter that deposit remained steadfast in value. It was in fact a deposit. Checks were not in use. But it was treated as a loan by the bank with the coins as security. The bank loaned the merchant what it called the bank credit. Thereafter if he wished to pay a bill he could transfer to his creditor a part of his bank credit. The creditor preferred this to money. He would rather have a payment in a medium the value of which was fixed and guaranteed than in a hatful of suspicious, fluctuating coins from a score of countries. So much was this true that a man who was willing to sell an article for a hundred guilders would take a hundred in bank credit but demand a hundred and five in cash.

One effect of this was that once coin or bullion went into this bank it tended to remain there. All merchants, even foreigners, kept their cash there. When one merchant paid another, the transaction was effected by transfer on the books of the bank and the metal remained in its vaults. Why should a merchant withdraw cash when the cash would buy for him only 95 per cent of what he could purchase with the bank credit? And so in time most of the metal in Europe tended to flow into this bank.

It was…a one hundred percent bank–[f]or every guilder of bank credit or deposits there was a guilder of metal money in the vaults. In 1672 when the armies of Louis XIV approached Amsterdam and the terrified merchants ran to the bank for their funds, the bank was able to honor every demand. This established its reputation upon a high plane. The bank was not supposed to make loans. It was supported by the fees it charged for receiving deposits, warehousing the cash, and making the transfers.

There was in Amsterdam another corporation—the East India Company. A great trading corporation, it was considered of vital importance to the city’s business. The city owned half its stock. The time came when the East India Company needed money to build ships. In the bank lay that great pool of cash. The trading company’s managers itched to get hold of some of it. The mayor, who named the bank commissioners, put pressure on them to make loans to the company—loans without any deposit of money or bullion. It was done in absolute secrecy. It was against the law of the bank. But the bank was powerless to resist.

The bank and the company did this surreptitiously. They did not realize the nature of the powerful instrument they had forged. They did not realize they were laying foundations of modern finance capitalism. It was Law who saw this…Here is what Law saw. It is an operation that takes place in our own banks daily. The First National Bank of Middletown has on deposit a million dollars. Mr. Smith walks into the bank and asks for a loan of $10,000. The bank makes the loan. But it does not give him ten thousand in cash. Instead the cashier writes in his deposit book a record of a deposit of $10,000. Mr. Smith has not deposited ten thousand. The bank has loaned him a deposit. The cashier also writes upon the bank’s books the record of this deposit of Mr. Smith. When Mr. Smith walks out of the bank he has a deposit of ten thousand that he did not have when he entered. The bank has deposits of a million dollars when Mr. Smith enters. When he leaves it has deposits of a million and ten thousand dollars. Its deposits have been increased ten thousand dollars by the mere act of making the loan to Mr. Smith. Mr. Smith uses this deposit as money. It is bank money.

That is why we have today in the United States about a billion dollars in actual currency in the banks but fifty billion in deposits or bank money. This bank money has been created not by depositing cash but by loans to the bank depositors. This is what the Bank of Amsterdam did by its secret loans to the East India Company, which it hoped would never be found out. This is what Law saw, but more important, he saw the social uses of it. It became the foundation of his system…[13]
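
What Flynn describes can be restated as a toy balance sheet. The sketch below is illustrative only (the bank and the figures mirror the hypothetical Mr. Smith example in the passage): making a loan creates a brand-new deposit, so “bank money” grows without any new cash entering the bank.

```python
# Toy model of deposit creation: a loan is booked as an asset, and the
# borrower is credited with a brand-new deposit (a liability).
class ToyBank:
    def __init__(self, cash, deposits):
        self.cash = cash          # asset: currency in the vault
        self.loans = 0            # asset: claims on borrowers
        self.deposits = deposits  # liability: what the bank owes depositors

    def make_loan(self, amount):
        """Lend by crediting the borrower's account; no cash leaves the bank."""
        self.loans += amount
        self.deposits += amount   # "bank money" is created at this line

bank = ToyBank(cash=1_000_000, deposits=1_000_000)
bank.make_loan(10_000)            # Mr. Smith borrows $10,000
print(bank.deposits)              # 1010000: deposits grew by the loan amount
print(bank.cash)                  # 1000000: vault cash is unchanged
```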

He argued that money was not any particular object; in his words, it was not the value for which goods are exchanged but the value by which goods are exchanged. For that reason, it could be anything. Law reasoned that the demand for money was greater than the supply, and like any other commodity, the supply needed to be increased. An increase in the money supply would cause economic expansion and drive down interest rates. To get around the supply and demand problems encountered with gold and silver, he would retire them and replace them with paper instead.

Phase one of Law’s plan would increase the amount of money circulating by introducing paper money in place of metal. Law’s bank would take in coins and issue paper money against the deposits. The paper money would then retain its value. This was the beginning of true paper money as we know it today:

…his new proposal laid out plans for a private bank, funded by himself and other willing investors, which would issue notes backed by deposits of gold and silver coins and redeemable at all times in coins equivalent to the value of the coin at the time of the notes’ issue, “which could not be subject to any variation.” Thus, Law pledged, his notes would be more secure than metal money, a hedge against currency vacillations, and therefore a help to commerce. Moreover, paper notes would increase the amount of circulating money and trade would be boosted. In short, he vowed, his bank would offer hope and the promise of a better future. [14]

The regent helped by making his well-publicized deposits and ensured that everyone knew he was using the bank for foreign transactions. Foreigners followed his lead, and at last found somewhere in Paris to discount their bills of exchange with ease and at reasonable prices. The influx of foreign currency alleviated the shortage of coins, and, with the slow trickle of banknotes Law printed and issued to depositors, boosted the money supply sufficiently for commerce to begin to pick up. Traders liked the banknotes because the guarantee of being paid in coin of fixed value meant that they knew exactly what something would cost or what price they would receive. The notes began to command a premium, like those issued by the Bank of Amsterdam. [15]

The bank was a success, expanding the money supply and goosing the French economy as planned. In 1717 the regent ordered that all public funds be deposited in the Banque Generale. The notes of the bank became authorized for payment of taxes. The center of French finance moved from Lyon to Paris. Law controlled the issuance of banknotes so that at least 25 percent of the circulating value of the notes could be redeemed in gold or silver. To further remove the link with metal, it was forbidden for most people to own specie or use it for transactions. The Bank was eventually bought out by the government and renamed the royal bank (Banque Royale).

In December 1718, the Banque Generale became the Banque Royale, the equivalent of a nationalized industry today. Law continued to direct it, and under his leadership over the next months, the finances of France leaned more heavily on it. New branches opened in Lyon, La Rochelle, Tours, Orleans, and Amiens. To ensure that everyone made use of paper money, any transactions of more than 600 livres were ordered to be made in paper notes or gold. Since gold was in short supply, this obliged nearly everyone to use paper for all major transactions. Meanwhile, for the leap of confidence they had shown in purchasing shares in the bank in its early uncertain days, and perhaps to buy his way into their world, Law rewarded investors lavishly. Shares that they had partly bought with devalued government bonds were paid out in coin. Both he and the regent had been major shareholders and were among those who profited greatly from the bank’s takeover.

Few recognized the dangers signaled by the bank’s new royal status. Hitherto Law had kept careful control of the numbers of notes issued. There had always been coin reserves of around 25 percent against circulating paper notes. Now, with royal ownership and no shareholders to ask awkward questions, the bank became less controllable. The issuing and quantity of printed notes and the size of reserves would be decided by the regent and his advisers. The temptation to print too much paper money too quickly would thus be virtually unchecked.

Within five months of its royal takeover…eight printers, each of whom earned only 500 livres a year, were employed around the clock printing 100-, 50- and 10-livre notes. A further ominous change followed: notes were no longer redeemable by value at date of issue but according to the face value, which would change along with coins if the currency was devalued: the principle that underpinned public confidence in paper had been discarded and one of Law’s most basic tenets breached. But as the eminent eighteenth-century economist Sir James Steuart later incredulously remarked, “nobody seemed dissatisfied: the nation was rather pleased; so familiar were the variations of the coin in those days, that nobody ever considered anything with regard to coin or money, but its denomination… this appears wonderful; and yet it is a fact.” [16]

Once the bank was established, phase two was to create a joint-stock company to acquire, manage, and ultimately retire the state’s debt. It would also take over tax collection and the management of state monopolies.

The growing success of the General Bank enabled Law to address the second crisis, management of the national debt. A new radical plan was necessary to restructure France’s financial situation. Law decided that the best way to accomplish this was to convert the government debt into the equity of a huge conglomerate trading company. To do so he needed to establish a trading company along the lines of British trading companies such as the East India Company and the South Sea Company. [17]

While the monarchy was cash poor, it did possess one major asset whose value was almost limitless—a huge chunk of the North American continent. Law’s solution to the debt problem was to use that asset to create a monopoly company with exclusive rights over the settlement and development of North America, and have investors trade their debt for equity in that company; what today is called a “debt-for-equity swap.” Instead of holding unpayable debt, investors could trade it in for shares in a company that offered seemingly unlimited potential. Once again Law and the royal court would be early investors, prompting everyone else to jump on the bandwagon. To make the shares more attractive, Law initiated a “buy now, pay later” scheme—only 10% down would get you a share. Even the general public could buy in, and an informal stock market in Mississippi Company shares sprang up on the Rue Quincampoix in Paris, outside Law’s apartment.

The company went on a mergers-and-acquisitions spree, buying up all the rival trading companies. It also bought the rights to collect the taxes from the rival financiers. It gained permission to run several state monopolies.

Law was granted a charter to create the Compagnie de la Louisiane ou d’Occident (Company of Louisiana and the West). This company was given a twenty-five year exclusive lease to develop the vast French territories along the Mississippi in North America. This meant exploitation of the Mississippi region, which in the French point of view, represented all of North America watered by the Mississippi River and its tributaries. As part of the deal, Law was required to settle 6,000 French citizens and 3,000 slaves in the territory. To sweeten the transaction the company was awarded a monopoly for the growing and selling of tobacco. [18]

In May 1719 [the Company of the West] took over the East India and China companies, to form the Company of the Indies (Compagnie des Indes), better known as the Mississippi Company. In July Law secured the profits of the royal mint for a nine-year term. In August he wrested the lease of the indirect tax farms from a rival financier, who had been granted it a year before. In September the Company agreed to lend 1.2 billion livres to the crown to pay off the entire royal debt. A month later Law took control of the collection (‘farm’) of direct taxes. [19]

Finally, the Banque Royale and the Mississippi Company merged. Essentially all of the French state’s finances were managed by this one huge conglomerate, owned by the government, with Law at the helm. It issued money, collected the taxes, managed the state’s debt, and owned much of North America.

In 1719, the French government allowed Law to issue 50,000 new shares in the Mississippi Company at 500 livres with just 75 livres down and the rest due in nineteen additional monthly payments of 25 livres each. The share price rose to 1,000 livres before the second installment was even due, and ordinary citizens flocked to Paris to participate. Based on this success, Law offered to pay off the national debt of 1.5 billion livres by issuing an additional 300,000 shares at 500 livres paid in ten monthly installments.

Law also purchased the right to collect taxes for 52 million livres and sought to replace various taxes with a single tax. The tax scheme was a boon to efficiency, and the price of some products fell by a third. The stock price increases and the tax efficiency gains spurred foreigners to Paris to buy stock in the Mississippi Company.

By mid-1719, the Mississippi Company had issued more than 600,000 shares and the par value of the company stood at 300 million livres. That summer, the share price skyrocketed from 1,000 to 5,000 livres and it continued to rise through year-end, ultimately reaching dizzying heights of 15,000 livres per share. [20]
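
The quoted figures show why the installment plan was so intoxicating. A quick sketch of the arithmetic (a simplification: it ignores financing costs and assumes a holder could sell while the later installments were still owed) turns the doubling of the share price into roughly a sevenfold return on the cash actually paid in.

```python
# Worked example using the figures quoted above (Mississippi Company, 1719):
# shares issued at 500 livres, 75 livres down, 19 monthly payments of 25.
issue_price  = 500
down_payment = 75
still_owed   = 19 * 25       # 475 livres of future installments
market_price = 1_000         # price before the second installment was due

unlevered_gain = market_price / issue_price   # 2.0x for an outright cash buyer
net_equity     = market_price - still_owed    # 525 livres after settling the balance
levered_gain   = net_equity / down_payment    # 7.0x on cash actually paid in

print(unlevered_gain, levered_gain)  # 2.0 7.0
```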

Law’s System reached its apex, and the price of the Company’s share peaked, at the beginning of 1720. Two main elements crowned the system. The first was a virtual takeover of the French government, by which the Company substituted the liabilities (shares) for the national debt. The second was the substitution of the Company’s other liabilities (notes) for metallic currency. At the end of the operation, the Company, owned by the former creditors of the State, collected all the taxes, owned or managed most overseas colonies, monopolized all overseas trade, and freely issued fiat money which was sole legal tender. Its CEO also became minister of finance on January 5, 1720. [21]

The government debt was retired, the money supply was expanded, and interest rates fell. But there was a problem.

Pop Go The Bubbles

John Law’s newly nationalized state bank was extending credit in order to buy Mississippi Company shares far in excess of the amount of gold and silver it had stashed in its vaults, and his enemies knew it. The excess money from all the shares floating around began to leak into the wider financial system, causing inflation. The value of the banknotes was no longer fixed but allowed to float. Confidence in the system was always thin.

The old guard sensed their opportunity. They demanded their gold and silver back in return for their paper money, causing a run on the bank. When faith in the Bank disintegrated, so too did faith in Mississippi Company stock (since both were now one and the same institution).

Some early investors, realizing that their hopes of getting rich in Mississippi were greatly exaggerated, began to sell their shares and exchange their paper currency for gold, silver and land. As share prices soared throughout the summer of 1719 some of the more level-headed realized that the bull market was based on little more than “smoke and mirrors” and the ever increasing production of paper notes. Feeling that a crash would sooner or later be inevitable, they cashed in.

… When in early 1720 two royal princes decided to cash in their shares of the Mississippi Company, others followed their example. The downward spiral had begun. Law had to print 1,500,000 livres in paper money to try to stem the tide. By late 1720 a sudden decline in confidence occurred which sent share prices down as rapidly as they had risen. When panic set in, investors sought to redeem their bank and promissory notes en masse and convert them into specie. The “bubble” burst when the Banque Royale could no longer redeem their notes for lack of gold and silver coin. Bankruptcy followed. Political intrigue and the actions of rival bankers contributed to the downfall of the scheme. Those not quick enough to redeem their shares were ruined.

In an effort to slow the run on the Bank Royale, officials resorted to various nefarious schemes. These included counting the money out slowly and only in small denomination coins, inserting clerks in the line who would return the money they withdrew, and by shortening banking hours. At one point the bank refused to accept anything but 10 livre notes. None of these expedients were able to build confidence or to slow the panic-stricken investors for long. In a last-ditch effort to restore confidence in the bank, Law ordered the public burning of bank notes as they came in for redemption. This was meant to convince the public, that because of their growing scarcity, they would be worth more. A huge enclosure was set up outside the bank for this purpose. Several times a day, with great ceremony, the notes were consigned to the flames. This went on during the months of July and August 1720 while paper money continued to lose its value throughout the Summer.

The general public turned on Law and would have lynched him if they could. He was burned in effigy and the mere mentioning of his name could arouse a fury. In October, a coachman was slapped by a passenger during an argument over a disputed fare. The cabbie had the wit to denounce his fare as John Law, whereupon the crowd pounced upon the passenger. The poor man barely saved himself by hiding from his pursuers in a church. [22]

The fall of the company managing the government’s finances caused massive damage to the French economy. Money went back to being metal.

By June 1720 the note issue of the Banque Royale had reached a staggering 2,696,000,000 livres. This sum was approximately twice the money in circulation before Law’s bank opened its doors. The increase in the money in circulation created an inflationary spiral which could not be reversed once the population became leery of Law’s Mississippi Scheme. The entire complex development of the bank’s other schemes for colonial companies, monopolies and tax collection came into question. Law’s plan for his bank and the issue of paper money was sound in and of itself; however, the issue was carried to tremendous sums that Law had never anticipated.

At the end in 1721 the notes had ceased to circulate and specie gradually took their place. The country painfully returned to a specie footing as in years past. This severe lesson in paper money inflation had permanent and long lasting effects upon France. The popular distrust of paper money and big banks kept France financially backward for many years thereafter. France was not to see circulating paper money again until the French Revolution of 1789-1795 necessitated it. [24]

Because France’s finances were not on a firm foundation, it could no longer borrow to expand the money supply. The only remaining option was to raise taxes. But in order to do this, the crown needed to call a meeting of France’s “parliament”–a body which did not meet on a regular basis. While the English nobility remained primarily on their own estates in the countryside, most of the French nobility lived at court, totally segregated from the commoners. They had no idea what they were unleashing:

“The immediate precipitating cause of the French Revolution is a lot of political grandstanding around the monarchy’s debt and deficit. This is very much like the debt ceiling crisis that we saw in 2011. The issue was less about whether the monarchy’s finances were actually viable, and more about people using the subject of money to push their political point.”

“So at the point at which the king basically has no money left in the coffers, and can’t persuade the establishment to verify and approve new taxes, he called the first meeting of the Estates General, a body that hasn’t met in 175 years, so that they can produce some new taxes. And that’s really generally considered to be the beginning of the French Revolution. So the French Revolution starts in a crisis about budgets and taxes.” [25]

Much of the money fleeing Paris found its way to England, where it inflated the South Sea Bubble. Like the Mississippi scheme, it was the use of a joint-stock company to consolidate, and ultimately retire, the state’s debt. Instead of taxes, it was backed by exclusive contracts from the government to conduct trade in the South Seas. It met a similar fate:

…after the re-coinage, silver continued to flow out of Britain to Amsterdam, where bankers and merchants exchanged the silver coin in the commodity markets, issuing promissory notes in return. The promissory notes in effect served as a form of paper currency and paved the way for banknotes to circulate widely in Britain. So when panicked depositors flocked to exchange banknotes for gold coin from the Sword Blade Bank (the South Sea Company’s bank), the bank was unable to meet demand and closed its doors on September 24. The panic turned to contagion and spread to other banks, many of which also failed. [26]

Its shares also cratered in value, yet the bubble had seen “only” a tenfold rise in share prices, instead of the twentyfold rise in France. Because the South Sea Company remained separate from the Bank of England and the Treasury (unlike in France, where they were all one and the same), the damage to the British economy was limited:

When stock prices finally came back to earth in London, there was no lasting systemic damage to the financial system, aside from the constraint on future joint-stock company formation represented by the Bubble Act. The South Sea Company itself continued to exist; the government debt conversion was not reversed; foreign investors did not turn away from English securities. Whereas all France was affected by the inflationary crisis Law had unleashed, provincial England seems to have been little affected by the South Sea crash. [23]

The Bank of England acquired the South Sea Company’s stock. Unlike France, Britain’s currency held its value, and it was able to pay back its debts. Money flowed into England from home and abroad, and British debt was widely marketed and held both domestically and internationally:

Finally, between 1749 and 1752, the chancellor of the exchequer…began to convert all outstanding debt and annuity issues – those not held by the Bank of England, the East India Company, and the reconstituted South Sea Company – into the Consolidated Stock of the Nation, popularly known as Consols…

Consols were fully transferable and negotiable, marketed on both the London Stock Exchange and the Amsterdam bourse; along with Bank of England and East India Company stock, they were the major securities traded on the London Stock Exchange in the late eighteenth and early nineteenth centuries. Though Consols were perpetual but redeemable annuities, thus identical to Dutch losrenten, their instant and long-enduring popular success was attributable to the firmly held belief, abroad as well as at home, that the government would not exercise its option to redeem them…Unchanged to this day, they continue to trade on the London Stock Exchange…

The result of the financial revolution was a remarkably stable and continuously effective form of public finance, which achieved an unprecedented reduction in the costs of government borrowing: from 14 per cent in 1693 to 3 per cent in 1757. [25]
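
Consols are the textbook perpetuity, and the pricing behind that fall in borrowing costs can be stated exactly: a claim paying a fixed coupon C forever is worth C/r at yield r. The sketch below uses a hypothetical £3 annual coupon purely to make the quoted rates concrete.

```python
# Present value of a perpetuity: PV = coupon / yield.
def consol_price(coupon, annual_yield):
    return coupon / annual_yield

coupon = 3.0                        # pounds per year, forever (hypothetical)
print(consol_price(coupon, 0.14))   # ~21.4: value of the stream at 14% (1693)
print(consol_price(coupon, 0.03))   # 100.0: value of the same stream at 3% (1757)
```

Put another way: at 3 per cent the government raises £100 for every £3 a year it promises, where at 14 per cent the same promise raised barely £21.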

The Aftermath

Two things ensured Britain’s predominance in financial affairs. The first was stability: the last major pitched battle fought on British soil was the Battle of Culloden in 1746, and Britain was peaceful, unified, and politically stable far longer than just about anywhere else on earth at the time. The second was the invention of the heat engine and the exploitation of England’s vast coal reserves. The triangular trade and vast amounts of cotton allowed Britain to set up factories and industrialize. In fact, government debt may have funded the Industrial Revolution:

Only twice in the period from 1717 to 1931, did the British suspend the convertibility of their currency. Each time they needed more money to fight a war than the tight hand of convertibility would permit. They suspended the convertibility to fight Napoleon and to fight the Kaiser. Each war produced paper money and inflation, as wars tend to do.

After Waterloo, sterling met every test of a key currency. The government was stable, the institutions honored and intact. The Royal Navy sailed the world: trade followed the flag. Britain was first into the industrial revolution, so its manufactured goods spread over the world. The battles were always at the fringes of the empire.

Every time there was a small crisis about the pound, the monetary authorities would raise the interest rates sharply. That might depress the domestic economy, but the high interest rates would draw in foreign exchange, and the pound would retain its value. Britain bought the raw materials, the commodities, and sent back the manufactured goods; and since the price of raw materials gradually declined, the pound increased in value.

The British government issued “consols,” perpetual bonds. Fathers gave them to their sons, and those sons gave them to their sons, and the bonds actually increased in value as time went on. “Never sell consols,” said Soames Forsyte, Galsworthy’s man of property.

The world brought its money to London and changed it into sterling. London banked it and insured it. Cartographers colored Britain pink on world maps, and the world was half pink, from the Cape to Cairo, from Suez to Australia. In 1897, at Victoria’s Diamond Jubilee, the fleet formed five lines, each five miles long, and it took four hours for it to pass in review at Spithead.

British capital went everywhere. It financed American railroads and great ranches in the western United States. British investors held not only American ranches, but Argentine ones, too. Their companies mined gold in South Africa and tin in Malaya, grew hemp in Tanganyika and apples in Tasmania, drilled for oil in Mexico, and ran the trolley lines in Shanghai and the Moscow Power and Light Company. [26]

Carroll Quigley saw the Napoleonic Wars as a battle of the old mercantile, bullion-based monetary system, based around agriculture and handicrafts, versus the British system of commercial bank credit money and industrial manufacturing. With the final defeat of Napoleon, the British money system became the basis for all the money in the world today:

This new technique of monetary manipulation became one of the basic factors in the Age of Expansion in the nineteenth century and made the fluctuations of economic activity less responsive to the rate of bullion production from mines, by making it more responsive to new factors reflecting the demand for money (such as the interest rate). This new technique spread relatively slowly in the century between the founding of the Bank of England and Napoleon’s creation of the Bank of France in 1803. The Napoleonic Wars, because of the backward, specie-based, financial ideas of Napoleon were, on their fiscal side, a struggle between the older, bullionist, obsolete system favored by Napoleon and the new fractional-reserve banknote system of England. [27]

In order to facilitate international trade, Britain instituted the gold standard. The gold standard was an agreement among nations to convert their currencies to gold at fixed rates, thus ensuring money earned overseas would hold its value relative to domestic currencies. The idea was that shipping gold from trade deficit countries to trade surplus countries would allow domestic money supplies to “self-adjust.” Since the amount of money in your economy was based on how much gold you had, countries shipping gold out would have less money circulating, leading to deflation, which would make their exports more attractive. Countries gaining gold reserves, on the other hand, would issue more money, causing inflation and making their exports less attractive relative to the deficit countries on the world market. Over time, everything would just sort of balance out. That was the theory, anyway:

Britain adopted the gold standard in 1844 and it became the common system regulating domestic economies and trade between them up until World War I. In this period, the leading economies of the world ran a pure gold standard and expressed their exchange rates accordingly. As an example, say the Australian Pound was worth 30 grains of gold and the USD was worth 15 grains; then 2 USD would be required for every AUD in trading exchanges.

The monetary authority agreed to maintain the “mint price” of gold fixed by standing ready to buy or sell gold to meet any supply or demand imbalance. Further, the central bank (or equivalent in those days) had to maintain stores of gold sufficient to back the circulating currency (at the agreed convertibility rate).

Gold was also considered to be the principal method of making international payments. Accordingly, as trade unfolded, imbalances in trade (imports and exports) arose and this necessitated that gold be transferred between nations (in boats) to fund these imbalances. Trade deficit countries had to ship gold to trade surplus countries. For example, assume Australia was exporting more than it was importing from New Zealand. In net terms, the demand for AUD (to buy our exports) would thus be higher relative to supply (to buy NZD to purchase imports from NZ) and this would necessitate New Zealand shipping gold to us to fund the trade imbalance (their deficit with Australia).

This inflow of gold would allow the Australian government to expand the money supply (issue more notes) because they had more gold to back the currency. This expansion was in strict proportion to the set value of the AUD in terms of grains of gold. The rising money supply would push against the inflation barrier (given no increase in the real capacity of the economy) which would ultimately render exports less attractive to foreigners and the external deficit would decline.

From the New Zealand perspective, the loss of gold reserves to Australia forced their Government to withdraw paper currency which was deflationary – rising unemployment and falling output and prices. The latter improved the competitiveness of their economy which also helped resolve the trade imbalance. But it remains that the deficit nations were forced to bear rising unemployment and vice versa as the trade imbalances resolved.

The proponents of the gold standard focus on the way it prevents the government from issuing paper currency as a means of stimulating their economies. Under the gold standard, the government could not expand base money if the economy was in trade deficit. It was considered that the gold standard acted as a means to control the money supply and generate price levels in different trading countries which were consistent with trade balance. The domestic economy however was forced to make the adjustments to the trade imbalances.

Gold standard and fixed exchange rates – myths that still prevail (billy blog)
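
The mint-parity arithmetic in the quoted example is mechanical enough to spell out. Here is a sketch using the quote’s illustrative gold contents; the adjustment function is a deliberately crude one-step version of the specie-flow story described above.

```python
# Under a pure gold standard, exchange rates follow from gold content,
# and each country's note issue is tied in strict proportion to its gold.
GRAINS_PER_UNIT = {"AUD": 30.0, "USD": 15.0}  # illustrative, from the quote

def exchange_rate(base, quote):
    """Units of `quote` currency per one unit of `base` currency."""
    return GRAINS_PER_UNIT[base] / GRAINS_PER_UNIT[quote]

print(exchange_rate("AUD", "USD"))  # 2.0 -- two USD per AUD, as in the example

def settle_deficit(gold_reserves, money_supply, deficit_in_gold):
    """One-step specie flow: gold ships out, notes contract proportionally."""
    new_reserves = gold_reserves - deficit_in_gold
    new_money = money_supply * (new_reserves / gold_reserves)
    return new_reserves, new_money

print(settle_deficit(1000.0, 2000.0, 100.0))  # (900.0, 1800.0): deflation at home
```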

The gold standard, the self-regulating market, and haute finance were the foundations of the Hundred Years’ Peace lasting up until the First World War. The Hundred Years’ Peace ushered in the final transition from civil society to a fully-fledged market society. Rather than being a sideshow, all of society’s relations now became coordinated by the market. The moral economy was crushed (by force if necessary), and the market- and money-based capitalist one replaced it. Millions died in this transition, whitewashed from history as “moral failures” even as the suffering under Communism is constantly referred to. Money and banking were at the center of the nexus. Controlling the money supply became absolutely necessary to the smooth functioning of this system. Unfortunately, it was not managed well.

The failure to provide adequate currency caused panics and depressions throughout the nineteenth century. However, the system did engender a hundred years of relative peace between the great powers. Nations broke into trading spheres, and the violence was the violence of empire, as well as the institutional violence imposed by the market system itself (hunger, homelessness, starvation, poverty, prisons, jails, alienation, conscription, etc.). In wartime and depressions, however, the gold standard tended to be abandoned. It always rested on peaceful international relations and government agreements; in no way was gold ever “natural” money.

Nineteenth-century civilization rested on four institutions. The first was the balance-of-power system which for a century prevented the occurrence of any long and devastating war between the Great Powers. The second was the international gold standard which symbolized a unique organization of world economy. The third was the self-regulating market which produced an unheard-of material welfare. The fourth was the liberal state. Classified in one way, two of these institutions were economic, two political. Classified in another way, two of them were national, two international. Between them they determined the characteristic outlines of the history of our civilization.

Of these institutions the gold standard proved crucial; its fall was the proximate cause of the catastrophe. By the time it failed, most of the other institutions had been sacrificed in a vain effort to save it. [28]

Mismanagement of the “new” market society was the proximate cause of two World Wars and the Cold War, killing millions. Once again, it threatens to tear the world apart. But that’s a story for another time.

Next: Concluding notes.

[1] Wray et al.; Credit and State Theory of Money, p. 209

[2] ibid.

[3] ibid.

[4] In Our Time – The South Sea Bubble (BBC)

[5] Wray et al.; Credit and State Theory of Money, p. 210

[6] How a creative legal leap helped create vast wealth (BBC)

[7] In Our Time – The South Sea Bubble (BBC)

[8] Felix Martin; Money, the Unauthorized Biography, pp. 116-117

[9] Felix Martin; Money, the Unauthorized Biography, pp. 117-118

[10] Felix Martin; Money, the Unauthorized Biography, p. 120

[11] Wray et al.; Credit and State Theory of Money, p. 211

[12] William N. Goetzmann and K. Geert Rouwenhorst, eds. The Origins of Value: The Financial Innovations that Created Modern Capital Markets, pp. 230-231

[13] John Flynn’s Biography of John Law http://www.devvy.com/pdf/biography_john_law.pdf, pp. 5-7

[14] Janet Gleeson; Millionaire: The Philanderer, Gambler, and Duelist Who Invented Modern Finance, p. 113

[15] Janet Gleeson; Millionaire: The Philanderer, Gambler, and Duelist Who Invented Modern Finance, pp. 116-117

[16] Janet Gleeson; Millionaire: The Philanderer, Gambler, and Duelist Who Invented Modern Finance, pp. 132-133

[17] William N. Goetzmann and K. Geert Rouwenhorst, eds. The Origins of Value: The Financial Innovations that Created Modern Capital Markets, p. 231

[18] John Flynn’s Biography of John Law http://www.devvy.com/pdf/biography_john_law.pdf, pp. 5-6

[19] Niall Ferguson; The Ascent of Money, p. 141

[20] Crisis Chronicles: The Mississippi Bubble of 1720 and the European Debt Crisis (Liberty Street)

[21] Francois R. Velde; Government Equity and Money: John Law’s System in 1720 France, p. 21

[22] John E. Sandrock; John Law’s Banque Royale and the Mississippi Bubble, pp. 8-9

[23] Niall Ferguson; The Ascent of Money, p. 157

[24] John E. Sandrock; John Law’s Banque Royale and the Mississippi Bubble, p. 13

[25] Rebecca Spang: Stuff and Money in the Time of the French Revolution – MR Live – 2/21/17 (YouTube)

[26] Adam Smith; Paper Money, pp. 116-117

[27] Carroll Quigley; The Evolution of Civilizations, p. 377

[28] Karl Polanyi; The Great Transformation, chapter one.

The Origin of Money 9 – Bonds and the Invention of the ‘National Debt’

The Venetian government is the first we know of which became a debtor to its own citizens, or conversely, where citizens became creditors of the government. As with most innovations in finance, it was the need to fund wars that drove the search for ways to raise revenue quickly.

Other city-states had to compete with Venice, and the system spread, first to Genoa, and then to other republics in Northern Italy like Florence, Milan and Siena. These city-states were all expanding militarily, and they needed money to do it. Since they were republics, they had advantages that the absolute monarchies of Northern Europe did not have, including accountability to their citizens. The merchant classes essentially borrowed from themselves to fund the wars.

These methods of short- and long-term debt financing spread to Northern Europe but were employed at the municipal, not the state, level, since states were still largely absolute monarchies that could, and did, repudiate their debts on a regular basis.

In Northern Europe tax collection was highly decentralized during the Middle Ages, and national governments relied on municipal and provincial tax receipts for revenue. Many localities in Western Europe turned to securities (annuities, lotteries, tontines, etc.) for short-term and long-term borrowing, instruments which were allowable under the Church’s ban on usury. Both France and Spain eventually incorporated these into the nation’s overall financial structure; however, these were still primarily local, not state, liabilities. Both governments used debt instruments for borrowing, but these were intermediated by banks, and, unlike in the Italian republics, borrowing costs were high because the monarchs were less reliable. The kings of France and Spain, unrestrained by effective parliaments, were serial defaulters.

The Seven United Provinces (roughly today’s Netherlands), which, like the Italian city-states, were a trading empire run by a wealthy merchant oligarchy, used these new methods of financing and banking to fund their rebellion against Spain as well as to expand their burgeoning overseas trading empire. These securities eventually became negotiable, and markets emerged for buying, selling, and trading these debts. The United Provinces is likely the first place where such debts became national liabilities. The center of financial innovation shifted from Northern Italy to Holland.

From there, “Dutch finance” spread across the Channel to England with the Glorious Revolution of 1688. To manage his mounting war debts, William of Orange took out a loan from the merchant bankers of England in exchange for certain prerogatives from the crown. England became the first major country to consolidate its debt, nationalize it, and monetize it, thereby setting the stage for the public/private hybrid system of money creation and banking that we use today.

Italy Invents the State Bank

It all started with the Crusades. Seaports like Venice and Genoa were the embarkation points for the armies headed to conquer the Holy Land. The vast amounts of money flowing into these cities during this period allowed them to remove themselves from the feudal order and become self-governing communes. The expertise gained by ferrying soldiers back and forth to the Middle East allowed the Venetians and Genoese to become Europe’s primary merchants and traders, importing exotic goods from the Islamic world into Western Europe and growing fabulously wealthy in the process.

It was through the Islamic trade centered on the Silk Road and the Indian Ocean (the first modern “global economy”) that the Italians learned all sorts of innovations we saw last time: paper, base-10 positional notation, algebra, checks, and bills of exchange. These ideas would help usher in the “commercial revolution” of the late Middle Ages. They would also make Northern Italy the crucible of European banking and finance.

To fund their expansion, these thalassocracies needed money. Trading empires, as Paul Colinvaux would remind us, require superior military technique. At this time, such republics relied mainly not on conscripts (most of their people were merchants and artisans) but on professional soldiers, i.e. mercenaries. As Carroll Quigley put it, “the existence of mercenary armies made money equivalent to soldiers and thus to power.” (p. 373)

For much of the fourteenth and fifteenth centuries, the medieval city-states of Tuscany – Florence, Pisa and Siena – were at war with each other or with other Italian towns. This was war waged as much by money as by men. Rather than require their own citizens to do the dirty work of fighting, each city hired military contractors (condottieri) who raised armies to annex land and loot treasure from its rivals. [2]

The main ways states raised money during this period, as we saw last time, were taxes and seigniorage. Taxes were levied almost exclusively on commercial activity for most of history (since most other activity took place outside the commercial/money economy), but this was a risky lever in an entrepôt whose lifeblood was shipping and trade. Feudal rents and dues were levied by kings, but were less available to city-states outside the feudal system. Seigniorage was a major source of revenue, as we saw previously, but for a merchant society, devaluing the currency was unlikely to be helpful or popular.

The solution arrived at was to borrow money from the city’s wealthy merchant and banking classes.

During the thirteenth and fourteenth centuries major cities such as Florence, Genoa, Milan, and Venice were able to extend their territorial control; those of Venice and Genoa attained the importance of maritime empires.

The formation of a territorial state came at enormous costs. How did urban governments raise the money needed to cover such expenses? Since increasing or raising new taxes required time and, above all, public acceptance, the easiest way was to borrow from the wealthiest citizens.[3]

Despite the ban on usury, no medieval European government – municipal, territorial, or national – was able to function without borrowing, given that its powers to tax and exact rents were limited, while it was often engaged in costly wars. But such loans were usually for short terms, often at punitive rates of interest.

During the twelfth century, the Italian progenitors of the ongoing Commercial Revolution developed what became a system of municipally funded debts, debts that subsequently became permanent. Genoa took the lead, in 1149, when it agreed to give a consortium of the city’s lenders control over a compera, a consolidated fund of tax revenues to be used in paying the city’s creditors.

Venice followed suit in 1164, by securing a loan of 1,150 silver marci against the tax revenues from the Rialto market for twelve years. In 1187, in return for a loan of 16,000 Venetian lire, to finance the doge’s siege of Zara, creditors were given control over the salt tax and certain house rents for thirteen years; thereafter, the Salt Office was made responsible for all such loan payments…by 1207, the Venetians had adopted what had already become the hallmark of public finance in the Italian republics: a system of forced loans, known locally as prestiti, whose interest charges were financed by additional taxes on salt, the Rialto market, and the weigh-house.

Between 1262 and 1264, the Venetian Senate consolidated all of the state’s outstanding debts into one fund later called the Monte Vecchio – mountain of debt – and decreed that debt-holders should receive annual interest at 5 per cent, which the Ufficiale degli Prestiti was required to pay twice yearly from eight specified excise taxes. These prestiti debt claims (with interest payments) were assignable through the offices of the procurator of San Marco and, by 1320 at the latest, a secondary market for them had developed. [4]

A loophole in the medieval prohibition on usury allowed this to take place. Although we regard usury and interest as one and the same, medieval law in fact made a distinction between the two:

Usury is sometimes equated with the charging of interest, but by the thirteenth century it was recognised that the two ideas were different.

Usury derives from the Latin usura, meaning ‘use’, and referred to the charging of a fee for the use of money. Interest comes from the Latin intereo, meaning ‘to be lost’, and originated in the Roman legal codes as the compensation someone was paid if they suffered a loss as a result of a contract being broken. So a lender could charge interest to compensate for a loss, but they could not make a gain by lending.

It is easier to understand this with a simple example. A farmer lends a cow to their cousin for a year. In the normal course of events, the cow would give birth to a calf and the cousin would gain the benefit of the cow’s milk. At the end of the loan, the farmer could expect the cow and the calf to be returned. The interest rate is 100%, but it is interest, since the farmer, had they not lent the cow to their cousin, would have expected to end the year with a cow and a calf. Similarly, if the farmer lent out grain, they could expect to get the loan back plus a premium, on the basis that had their cousin planted the grain, he would have reaped a harvest far greater than the sum lent. [5]

These concepts gave birth to the idea of the medieval census:

A census originated in the feudal societies as an “obligation to pay an annual return from fruitful property”. What this means is that the buyer of the census would pay a landowner, for example, for the future production from the land, such as wheat or wine, over a period of time.

As economic life in western Europe became based on money transactions rather than barter transactions, censii lost the link to specific produce, cartloads of wheat or barrels of wine. The buyer of the census would accept regular cash payment instead of the actual produce, and this was legitimate in the eyes of the canon lawyers as long as the lump-sum paid buy [sic] the buyer ‘equated’ with the value of the ‘fruitful property’ being produced by the seller.

Anyone who could became involved in censii. A labourer might sell a census based on the future revenue from their labour, states sold them based on the future revenue from taxes and monopolies, and the Church invested bequests by buying censii. Censii issued by governments, usually linked to specific tax revenues, became known as rentes. Censii could be ‘temporary’, lasting a few years, or ‘permanent’, until one of the parties died.

In today’s terms, temporary censii resemble modern mortgages, permanent censii resemble the ‘annuities’ pensioners live off today. They could be ‘redeemable’, by one or both parties, meaning that the contract could be cancelled. [6]
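
Where the canon lawyers asked whether the lump sum ‘equated’ with the value of the fruitful property, a modern analyst would call this a present-value calculation. Here is a minimal sketch of the two census types just described; the 8% discount rate and the cash flows are illustrative assumptions, not historical figures:

```python
# Pricing a census: a lump sum paid today in exchange for a stream of
# annual payments, as with the temporary and permanent censii above.

def pv_temporary_census(annual_payment: float, years: int, rate: float) -> float:
    """Price of a 'temporary' census: fixed payments for a fixed term."""
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

def pv_perpetual_census(annual_payment: float, rate: float) -> float:
    """Price of a 'permanent' census: payments that continue indefinitely."""
    return annual_payment / rate

# Illustrative figures only: 10 units of produce-equivalent cash per year.
print(round(pv_temporary_census(10.0, years=20, rate=0.08), 2))  # ~98.18
print(round(pv_perpetual_census(10.0, rate=0.08), 2))            # 125.0
```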

To fund the war effort, the Venetian government required a “forced loan” from its wealthiest citizens in proportion to their income (i.e. it was progressive). Since the loans were compulsory, the interest could be framed as compensation for money lost, which was allowable under the Church’s anti-usury doctrine. The government paid “interest” of 5 percent per year in twice-yearly installments of 2.5 percent, and allocated dedicated revenue streams from commercial taxes to pay it.

Prestiti were a development from the rentes created by states. Around the twelfth century the Italian city-states of Venice, Genoa and Florence began to forcefully sell temporary rentes to their rich citizens. By the mid-thirteenth century the different issues of rentes were consolidated into a mons (mountain) and everyone who had been made to buy a rente was given a share, proportionate to their contribution, in the mons. [7]

The loans were essentially irredeemable—there was no pledge by the government to repay the principal within a fixed time. Nor were they bearer bonds; the names of the creditors were recorded in government ledgers at the loan office (Camera degli imprestiti). They were assignable, in that the revenue stream could be transferred to a third party with the owner’s consent, but they were not negotiable, at least at first: you could not simply sell your bonds on the open market without the knowledge of the original debtor (the government). Nor were they legal tender that could be used in lieu of cash.

Venice created its mons, the monte vecchio, in 1262, and the shares, known as prestiti, entitled the holder to be paid 5% a year of the sum they lent, which was written on the prestiti and known as the ‘face value’. While there was no obligation for the state to pay the coupon, the annual payment, there was an expectation that it would if it could be afforded, and the mountain itself was paid back as and when funds allowed. [8]
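
To make the mechanics concrete, here is a minimal sketch of the payment stream just described: a fixed 5 percent of face value per year, paid in two installments, regardless of what the claim actually cost its current holder. The figures are illustrative:

```python
# Coupon stream on a prestito: 5% of face value per year, paid twice
# yearly in 2.5% installments by the loan office.
FACE_VALUE = 100.0   # ducats written on the claim (illustrative)
COUPON_RATE = 0.05   # fixed by decree

semiannual = FACE_VALUE * COUPON_RATE / 2
for year in range(1, 4):
    print(f"year {year}: two payments of {semiannual} = {FACE_VALUE * COUPON_RATE} ducats")

# The payment depends only on face value, never on the market price of
# the claim, which is what made buying below face so attractive.
```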

Eventually, as debt service grew to consume more and more of state revenue, dedicated agencies were established to manage the consolidated debt these states owed to their citizens and others:

During the last quarter of the thirteenth century the demand for loans on Venetian citizens grew: they had to deposit a part of their assessed wealth into state coffers, the sums were registered on public books, and tax revenues were devoted to paying interest. By 1274 Genoa adopted a similar measure, and some loans were consolidated and managed by a single state agency.

The republics of Venice and Genoa were thus the first to transform their floating debt into a consolidated debt; later, some Tuscan communities would follow suit.
The main features of such a system were extraordinary financing through irredeemable forced loans; moderate interest rates; credits that were heritable, negotiable and usable as payment; an amount consolidated and managed by a specific authority; and specific tax revenues designated for paying interest. [9]

Around 1407, the Genoese set up a dedicated private bank, the Casa di San Giorgio, to manage the public debt. Today it is recognized by financial historians as the first modern state bank, and in time it became more powerful than the state itself! Many European monarchs regularly used it for borrowing, and it even funded some of the first expeditions to the New World (Christopher Columbus’ childhood home was nearby):

On March 2 1408, eight men gathered in the great hall of the Casa di San Giorgio, a trading house on what was then the main street in Genoa, a few metres from where the waters of the Ligurian Sea lap the Italian shore. They were merchants, rich and powerful representatives of the city’s most influential families, and they were meeting to discuss a matter of the utmost gravity. The once-glorious republic of Genoa had fallen on hard times. After years of war with Venice and a crushing defeat at the battle of Chioggia in 1381, the state was effectively bankrupt. The task was to rescue it.

A few months earlier, towards the end of 1407, Genoa’s Council of Ancients had authorised the Casa di San Giorgio to carry out this job. It would be accomplished by creating a bank that would facilitate the repayment of Genoa’s debts in return for interest at 7 per cent and the right to collect taxes and customs owed to the city. The purpose of the meeting that spring day was to declare the Banco di San Giorgio open for business.

…The Banco di San Giorgio would, in time, become as powerful as the republic that created it – more powerful, according to Niccolò Machiavelli. It would survive for nearly 400 years. It would become the world’s first modern, public bank, not just a forerunner of the Bank of England but its prototype…in a short space of time, it became so entwined with the republic of Genoa that the bank and the state were indistinguishable.

Machiavelli described the relationship as “a state within a state”. The Banco di San Giorgio grew so influential that it replaced the Fuggers, the German banking dynasty, as the source of financing for Europe’s cash-starved, perpetually warring monarchs. A century and a half after it was created it had restored Genoese power and influence as a maritime and commercial state to such an extent that the period from 1557 to 1627 was termed the Age of Genoa by Fernand Braudel, the great French historian…Christopher Columbus, Genoa’s most illustrious son, would be a customer…[10]

The management of state finances became increasingly concentrated in the hands of a professional bureaucracy separate from direct control by the state. The republics took great care to see that the money was paid back on schedule. This made lending to them much safer than lending to monarchs, and so they were able to raise more revenue for their operations:

One reason that this system worked so well was that they and a few other wealthy families also controlled the city’s government and hence its finances. This oligarchical power structure gave the bond market a firm political foundation. Unlike an unaccountable hereditary monarch, who might arbitrarily renege on his promises to pay his creditors, the people who issued the bonds in Florence were in large measure the same people who bought them. Not surprisingly, they therefore had a strong interest in seeing that their interest was paid. [11]

Because of their dependability, these government-backed IOUs soon became highly desirable places for rich merchants and nobles to store their wealth, much as they are today, secured by the government’s promise to pay. The guaranteed returns provided a reliable income stream for those able to purchase the bonds. The merchant classes and various institutions bought them up, used them as collateral, endowments for charities, even gifts and dowries, and passed them down to their assigns and heirs.

Over time, as issuing bonds became more common, more and more people came to depend on bonds for their income. Much as today, when bondholders are often institutions such as retirement funds and insurance companies, many of the holders were not individuals but institutions and endowments that relied on the bonds as a source of income:

Throughout the sixteenth and seventeenth centuries it seems that most of the bonds were in the hands of guilds and ecclesiastical and charitable institutions that looked to state debt to assure a sound, even if relatively low, return. The economic importance of the redistribution of money through the government debt cannot be neglected…Both in Florence and Genoa, government creditors drew a significant share (about one-fifth) of their income from bonds. Accordingly, a flow of money spread through the city and revived the local economy. [12]

Initially, only citizens of the republic could buy bonds, but over time bonds were issued to outsiders as well. Even so, it appears that the debt of the Italian city-states was held mainly by their own citizens, not by foreign creditors. Buying bonds was seen as a sort of civic duty for the city’s wealthy:

To loan to the commune was regarded as a duty, part of belonging to the urban community. Loans were connected, to a certain extent, with the concept of charity and gifts to the res publica.

Some governments, such as Florence, at first forbade foreigners to hold state bonds, while it seems that in Venice since the thirteenth century foreigners were allowed to buy government credits. Some devices, nevertheless, were adopted in order to bypass such prohibitions; the easiest solution was to grant citizenship to those who were willing to buy government bonds…At any rate, the foreign presence among bondholders seems to have been a limited phenomenon: by the early fifteenth century about one tenth of the Florentine debt was held by foreigners; in 1629, 92 percent of the principal of S. Giorgio belonged to Genoese citizens and institutions…Unlike some Italian princely states, such as Milan and the papal state, and German cities, the urban governments of Venice, Florence and Genoa succeeded in raising enormous amounts of money from their citizens and very seldom borrowed from foreigners…[13]

Today, governments sell bonds directly to the public in what is called the primary market; from there, they are traded among investors in secondary markets. At this time there was no primary market for bonds (only a select few insiders could lend to governments), but a thriving secondary market soon emerged where such debts were bought and sold. The prices of bonds varied with the reliability of the debtor (the state). Because interest was paid on the face value of the bond, if you could buy a bond on the cheap, you were assured a handsome payout. This was effectively an end-run around the Church’s ban on usury:

Quickly a market for prestiti emerged, where holders who needed ready cash would trade them with people who had a surplus of cash and wanted to save. During times of peace and prosperity they had a high price, but during war and uncertainty they traded at a low price.

For example, Venetian prestiti traded at their face value around 1340, when the Republic paid off much of the mons, but in 1465, during a disastrous war with the Ottoman Turks, they fell to 22% of face. The Florentine prestiti actually had a built-in facility whereby a holder could sell them back to the state at 28% of their face value; however, their market price was never so low as to make this profitable.

The legitimacy of the prestiti was debated by the canon lawyers. On the one hand, the coupons, the regular cash payments, can be seen as compensation for the forced nature of the original loan: the lender had no choice and so does suffer a loss. However, if a prestiti with a face of 100 ducats was sold for 22 ducats, the buyer would be receiving interest at a rate of 5/22 ≈ 23%; in what way had this buyer of the prestiti been forced to enter into the contract? An interest payment of 23% in these circumstances seemed to be “asking for more than what was given”.

Prestiti are important in that they are one of the earliest representations of an actively traded financial instrument. A prestito does not represent bushels of wheat or barrels of oil; it is a contract whereby a state promises to pay a specified amount of money. Whether or not the state does pay out on the contract is unknown and uncertain, hence the value of the contract is also unknown and uncertain. [14]
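
The canon lawyers’ arithmetic above is what we would now call a current-yield calculation: since the coupon is fixed against face value, the effective return depends entirely on the price paid. A quick sketch of the two cases quoted:

```python
# Current yield on a prestito bought in the secondary market:
# the fixed coupon (5% of face) divided by the price actually paid.
def current_yield(face: float, coupon_rate: float, price: float) -> float:
    return face * coupon_rate / price

# Bought at face value (c. 1340): the licit-looking 5%.
print(f"{current_yield(100, 0.05, 100):.1%}")  # 5.0%
# Bought at 22% of face (the Ottoman war of 1465): roughly 23%.
print(f"{current_yield(100, 0.05, 22):.1%}")   # 22.7%
```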

In the end, the ability to have people voluntarily lend to the government provided advantages that were simply too great to ignore. Such governments were able to raise large amounts of cash quickly; they were able to raise money from a much wider circle than just the immediate tax base; and they were able to overcome limitations in the amount of specie circulating. This made state borrowing very effective and the places that engaged in it very powerful. In addition, bonds provided reliable places for wealthy citizens to store wealth outside of banks, and the interest payments helped local economies flourish. Money was becoming an important source of military power, too. Luciano Pezzolo summarizes the advantages of bond issuance by Italian city-states:

First, the enormous concentration of capital in some Italian cities allowed governments to transform, through public credit, private wealth into military power, to build a territorial state, and to control a wider economic area…Italian governments collected money from taxpayers at 5 to 7 percent, whereas the major European monarchies of the Renaissance were compelled to borrow at a much higher price.

Second, the debts took on a political function. To be a creditor of the government meant sharing the destiny of the regime, and consequently supporting it. In Florence, the Medicean regime tied itself to an oligarchy that profited from the management of government debt. Thus, debt helped create stability.

Third, the social structure was supported by state debt: the considerable bond income drawn by charitable and social institutions and redistributed to the poor maintained a paternalistic policy that was a pillar of the urban political and social system.

Fourth, both government bonds and interest provided an effective surrogate of cash money in the later Middle Ages during a period of bullion shortage. The trade of bonds and interest claims opened up sophisticated forms of speculation and implemented financial techniques that are quite familiar to modern brokers.

Finally, the means devised by governments to finance the deficit offered new forms of social security and investment (dowries, life annuities, lotteries) that are at the roots of [the] later financial system. [15]

In this, we can discern something like David Graeber’s military-coinage-slavery complex emerging around the bond markets:

1.) Governments would raise money for military operations by dedicating future expected revenue streams to loan repayments, effectively becoming debtors to their citizens. That is, they could borrow against future revenues.

2.) The proceeds from the territorial/commercial expansion would be used to pay interest on the loans.

3.) The interest money would then flow back into the domestic economy, causing economic expansion at home, as more people became dependent on the government debt as a store of value and a source of income.

4.) Economic expansion abroad and at home would allow governments to deliver better services to their citizens, ensuring broad popular support.

5.) Lenders’ dependence on regular payouts would encourage them to support the political stability of the regime.

6.) City-states which avoided default were able to gain a fundraising advantage over their rivals. Hence, there was a strong incentive to make reliable payments and not to default.

Thus, the concept of the “national debt” was born. This gave rise to a brand new “money interest” whose wealth was held in government debt rather than coin.

Debt Financing Spreads to Northern Europe

Now contrast this with Northern Europe. Most states were still organized under the feudal system, and it would have made no sense for a ruler to borrow from himself, since he theoretically “owned” everything in the kingdom. Instead of borrowing from their subjects, these kingdoms continued to rely upon other sources of income.

Under the feudal system, tax collection was highly decentralized and done mainly at the local level. Wealthy kingdoms such as France used tax-farming (publicani) methods very similar to those of ancient Rome:

Fiscal revenues consisted of a mixture of direct (income or wealth) taxes, indirect (consumption) taxes, and feudal dues arising from the royal demesne. The assessment and collection of these revenues was decentralized. For direct taxes, a global amount was set by the government, and then broken down into assessments for each province, where local authorities would proceed with the next level of assessment, and so on to the local level.

For indirect taxes, collection was carried out by tax farmers on behalf of the government. The procedure was much like the one in place since Medieval times for running the royal mints. The right to collect a given tax was auctioned to the highest bidder. The bidder offered a fixed annual payment to the king for the duration of the lease. Meanwhile, he took upon himself to collect the tax, hiring all the necessary employees. Any shortfall in revenues from the promised sum was made up by the entrepreneur; conversely, any revenue collected above and beyond the price of the lease was retained as profit by the entrepreneur…

Spending is decentralized as well to various treasurers. Each tax had an associated bureaucracy of collectors and treasurers, either government employees or officers (direct taxes) or employees of the tax farmer. The treasurers spent some of the monies they collected, upon presentation of payment orders emanating from the government, and turned over the remainder, if any, to the royal treasury in Paris. [16]
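
The tax farmer’s position described above reduces to a simple payoff function: he owes the crown a fixed lease no matter what, keeps any surplus, and absorbs any shortfall. A minimal sketch with made-up figures:

```python
# A tax farmer's payoff: he bids a fixed annual lease for the right to
# collect a tax, keeps any surplus, and eats any shortfall.
def farmer_profit(collected: float, lease_price: float, costs: float) -> float:
    """Net result for the farmer; the king receives lease_price regardless."""
    return collected - lease_price - costs

# Illustrative figures only (livres):
print(farmer_profit(collected=120_000, lease_price=100_000, costs=8_000))  # 12000
print(farmer_profit(collected=95_000, lease_price=100_000, costs=8_000))   # -13000
```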

Although it’s anathema under modern economic dogma, government monopolies on various business activities were considered a legitimate way to raise revenue.

Government monopolies, such as salt and recently introduced tobacco, were also farmed out in the same fashion. Indeed, the ability to create monopolies was one of the king’s resources; one of the more outlandish examples being the exclusive right to sell snow and ice in the district of Paris, sold for 10,000L per year in 1701. [17]

Another method was through the sale of political offices. Governments would create offices and sell them at a profit, and the salary paid was essentially interest on the lump sum payment for the original position:

An officer was someone who held a government position not on commission or at the king’s leave, but as of right, and enjoyed various privileges attached to the position (in particular the collection of fees related to his activities). Offices were sold, and the king paid interest on the original sale price, which was called the wages of the office (gages). A wage increase was really a forced loan, requiring the officer to put up the additional capital. Officers could not be removed except for misconduct; however, the office itself could be abolished, as long as the king repaid the original sum. Thus, offices as a form of debt also carried the same repayment option as annuities. [18]
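
Read as a debt instrument, an office is principal (the sale price) paying a perpetual coupon (the gages), and a ‘wage increase’ is a call for more principal at the same implied rate. A minimal sketch under assumed numbers:

```python
# An office as a perpetual bond: the king sells the position for a lump
# sum and pays 'gages' (wages) as interest on that sale price.
def annual_gages(sale_price: float, rate: float) -> float:
    return sale_price * rate

def forced_loan_for_raise(wage_increase: float, rate: float) -> float:
    """Extra capital an officer must put up when his wages are 'raised'."""
    return wage_increase / rate

price, rate = 50_000.0, 0.05             # illustrative livres and rate
print(annual_gages(price, rate))         # 2500.0 per year in wages
print(forced_loan_for_raise(500, rate))  # a 500/yr raise costs 10000.0 up front
```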

And, as in Italy, the census evolved into annuities sold by municipalities as a means of long-term borrowing.

Offices and annuities (which I will generically call bonds, and whose owners I will call bondholders) could be transferred or sold, but with fairly high transaction costs. Both were considered forms of real estate, and could be mortgaged. In the late 17th century the French government, like others in Europe, had begun experimenting with life annuities, tontines, and lottery loans, but on a limited basis, and had not yet issued bearer bonds. Even the short-term debt described above was registered in the sense that the payee’s name was on the instrument, and could be transferred only by endorsement.

A final form of borrowing combined tax creation and lending. The procedure consisted in creating a new tax for some limited time and immediately farming its collection in exchange for a single, lump-sum payment representing the tax’s net present value. [20]
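
That ‘single, lump-sum payment representing the tax’s net present value’ is the same discounting arithmetic as the census sketch earlier: the farmer pays today roughly what the temporary tax’s receipts are worth over its lifetime. Again, the rate and receipts here are assumptions for illustration:

```python
# Lump sum for a newly created, temporary tax: the net present value
# of its expected annual receipts over the life of the tax.
def npv(annual_receipts: float, years: int, rate: float) -> float:
    return sum(annual_receipts / (1 + rate) ** t for t in range(1, years + 1))

# A 6-year tax expected to yield 10,000/yr, discounted at 8% (all assumed):
print(round(npv(10_000, 6, 0.08)))  # 46229
```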

Moreover, absolute monarchs could always repudiate their debts, and creditors had little recourse, since monarchs had their own armies and made the laws. The kings who did take out loans for military campaigns ended up paying very high interest rates for precisely this reason.

By the early sixteenth century, the Habsburg Emperor, French kings, and princes in the Low Countries had all affirmed their powers to regulate municipal public finances, especially rentes, and the municipal taxes that were used to pay annual rent charges. But this method of financing governments still remained municipal, because only municipalities sold rentes, so that the national institutions required for a funded, permanent public debt had yet to be created…the first national monarchy to establish a permanent, funded national debt based on rentes, by the early sixteenth century, was … the newly unified Habsburg kingdom of Spain.

Both the French and Spanish crowns sought to raise money … but they had to use towns as intermediaries. In the French case, funds were raised on behalf of the monarch by the Paris hôtel de ville; in the Spanish case, royal juros had to be marketed through Genoa’s Casa di San Giorgio (a private syndicate that purchased the right to collect the city’s taxes) and Antwerp’s beurs, a forerunner of the modern stock market. Yet investors in royal debt had to be wary. Whereas towns, with their oligarchical forms of rule and locally held debts, had incentives not to default, the same was not true of absolute rulers. [21]

Despite this ability to borrow, by the sixteenth and seventeenth centuries France and Spain had become serial defaulters:

…the Spanish crown became a serial defaulter in the late sixteenth and seventeenth centuries, wholly or partially suspending payments to creditors in 1557, 1560, 1575, 1596, 1607, 1627, 1647, 1652 and 1662. [22]

The Netherlands, by contrast, used these financial techniques to fund their war of independence from Spain and in the process became the financial center of northern Europe.

Part of the reason for Spain’s financial difficulties was the extreme costliness of trying and failing to bring to heel the rebellious provinces of the northern Netherlands, whose revolt against Spanish rule was a watershed in financial as well as political history. With their republican institutions, the United Provinces combined the advantages of the city-state with the scale of a nation-state. They were able to finance their wars by developing Amsterdam as the market for a whole range of new securities: not only life and perpetual annuities, but also lottery loans (whereby investors bought a small probability of a large return). By 1650 there were more than 65,000 Dutch rentiers, men who had invested their capital in one or other of these debt instruments and thereby helped finance the long Dutch struggle to preserve their independence. [23]

The center of European trade moved from the Mediterranean to the North Atlantic starting in the mid-1400s, with the advent of ocean-going sailing ships and the discovery of new routes to Asia by circumnavigating Africa. Portugal and Spain took the lead here. Spain’s “discovery” of the American continent ensured that trade would now be centered on the Atlantic coast, while the Islamic trade of the Mediterranean withered into insignificance, especially after the fall of Constantinople to the Turks in 1453. Eventually, European maritime trade became centered in Antwerp, the main port for Northern Europe. When the Spanish conquered the southern Netherlands, what we now call Belgium, they took Antwerp in 1585, and many of its most highly skilled merchants fled to Amsterdam, which would then become ground zero for the financial revolution.

The primacy of the Dutch Republic in trading and finance may simply boil down to geography. Much of the Netherlands lies below sea level, which is why the region is called the Low Countries; the land had been forcibly reclaimed from the sea with dikes over the centuries. The water table was too high for farming and there was little arable land, yet the population density was quite high, so the Dutch depended on fishing, shipping and trading far more than almost anyone else. Their entire economy had to rest almost exclusively on shipping and trade because there were no other options, unlike in France, Spain, Portugal and England.

The Dutch utilized much of the same methods of borrowing as the rest of Europe, but much more effectively:

The Netherlands successfully liberated itself from Spain between 1568 and 1648. The Dutch established the Dutch East India Company in 1602 and the Dutch West India Company in 1621. The Netherlands didn’t have to pay for an expensive court, fought their wars at home rather than abroad, profited from international trade, and saved money. The Amsterdam Exchange dealt not only in shares of the Dutch East India Company and Dutch West India Company, but in government bonds as well.

Most securities were in the form of annuities issued by the individual provinces, the United Provinces and the towns. This is the essential way in which Dutch lending differed from Italian lending. The Italian credit system relied upon a system of private international banking. The Medicis and other commercial bankers would lend their funds to states, knowing the risks involved. The Italians also had officially chartered banks that intermediated deposits and loans.

Outside of the Italian city-states, loans to heads of state were basically personal loans that clearly ran the risk of default. Spanish, French and English kings borrowed when they had to, defaulted when they couldn’t pay, but had no system of drawing upon the savings of the public. The Dutch, on the other hand, developed state finance based upon the government’s ability to pledge its revenues against the annuities they had issued. Having no royal court, and relying upon local governments, the Dutch paid off loans on time with little risk of default. As risk declined, interest rates fell to 4%, the lowest they had ever been in history, and a rate consistent with the low level of default risk that governments enjoy today. [24]

The Dutch also set up a bourse where national debts could be traded as negotiable securities, and a state exchange bank to manage payments. They also developed the modern corporation, whose shares were freely tradable, thereby establishing what is often called the first stock market (the Amsterdam Exchange).

The Dutch Republic became the main place where international debts could be bought and sold in secondary markets. While Amsterdam had neither the first bank nor the first exchange, what made it unique was the consolidation of all of this in one location, with government backing, and the sheer scale of operations. Securities from all over became speculative commodities. This was the beginning of the trade in debts and money that engendered speculative bubbles like tulip mania. In fact, you could even gamble with assets you didn’t actually own, setting the stage for modern casino capitalism.

The novelty at the beginning of the seventeenth century was the introduction of a stock market in Amsterdam. Government stocks and the prestigious shares in the Dutch East India Company had become the objects of speculation in a totally modern fashion. It is not quite accurate to call this the first stock market, as people often do. State loan stocks had been negotiable at a very early date in Venice, in Florence before 1328, and in Genoa, where there was an active market in the luoghi and paghe of the Casa di San Giorgio, not to mention the Kuxen shares in the German mines which were quoted as early as the fifteenth century at the Leipzig fairs, the Spanish juros, the French rentes sur l’Hôtel de Ville (municipal stocks) (1522) or the stock market in the Hanseatic towns from the fifteenth century. The statutes of Verona in 1318 confirm the existence of the settlement or forward market (mercato a termine). In 1428, the jurist Bartolomeo de Bosco protested against the sale of forward loca in Genoa. All this evidence points to the Mediterranean as the cradle of the stock market.

But what was new in Amsterdam was the volume, the fluidity of the market and the publicity it received, and the speculative freedom of transactions. Frenetic gambling went on here – gaming for gaming’s sake: we should not forget that in about 1634, the tulip mania sweeping through Holland meant that a bulb ‘of no intrinsic value’ might be exchanged for ‘a new carriage, two grey horses and a complete harness’! Betting on shares however, in expert hands, could bring in a comfortable income… [speculators were flourishing on the] Exchanges and growing rich while the merchants said they were becoming poorer. In every centre, Marseilles or London, Paris or Lisbon, Nantes or Amsterdam, brokers, who were little hampered by the regulations, took many liberties with them.

But it is also true that speculation on the Amsterdam Stock Exchange had reached a degree of sophistication and abstraction which made it for many years a very special trading-centre of Europe, a place where people were not content simply to buy and sell shares, speculating on their possible rise or fall, but where one could by means of various ingenious combinations speculate without having any money or shares at all. This was where the brokers came into their own… All the same, such practices had not yet attained the scale they were to reach during the following century, from the time of the Seven Years War, with the increased speculation in shares in the British East India Company, the Bank of England and the South Sea, above all in English government loans…Share prices were not officially published until 1747 however, whereas the Amsterdam Exchange had been [publishing] commodity prices since 1585. [24a]

Several other changes took place as well. To resolve the problem of the multiple currencies in circulation, governments established state banks, and monetary exchange came to center ever more on bank credits rather than government-issued coin. You would deposit your coins in the bank and be given a credit for them, which would hold its value, protected from the arbitrary currency fluctuations decreed by sovereigns. Lending those deposits back out gave rise to fractional reserve banking and credit creation. The joint-stock form was applied to banking, and such banks even made loans to governments.

The seventeenth century saw the foundation of three distinctly novel institutions that, in their different ways, were intended to serve a public as well as a private financial function.

The Amsterdam Exchange Bank (Wisselbank) was set up in 1609 to resolve the practical problems created for merchants by the circulation of multiple currencies in the United Provinces, where there were no fewer than fourteen different mints and copious quantities of foreign coins. By allowing merchants to set up accounts denominated in a standardized currency, the Exchange Bank pioneered the system of cheques and direct debits or transfers that we take for granted today. This allowed more and more commercial transactions to take place without the need for the sums involved to materialize in actual coins. One merchant could make a payment to another simply by arranging for his account at the bank to be debited and the counterparty’s account to be credited.

The limitation on this system was simply that the Exchange Bank maintained something close to a 100 per cent ratio between its deposits and its reserves of precious metal and coin…A run on the bank was therefore a virtual impossibility, since it had enough cash on hand to satisfy nearly all of its depositors if, for some reason, they all wanted to liquidate their deposits at once. This made the bank secure, no doubt, but it prevented it performing what would now be seen as the defining characteristic of a bank, credit creation.

It was in Stockholm nearly half a century later, with the foundation of the Swedish Riksbank in 1656, that this barrier was broken through. Although it performed the same functions as the Dutch Wisselbank, the Riksbank was also designed to be a Lanebank, meaning that it engaged in lending as well as facilitating commercial payments. By lending amounts in excess of its metallic reserve, it may be said to have pioneered the practice of what would later be known as fractional reserve banking, exploiting the fact that money left on deposit could profitably be lent out to borrowers…

The third great innovation of the seventeenth century occurred in London with the creation of the Bank of England in 1694. Designed primarily to assist the government with war finance (by converting a portion of the government’s debt into shares in the bank), the Bank was endowed with distinctive privileges. From 1709 it was the only bank allowed to operate on a joint-stock basis; and from 1742 it established a partial monopoly on the issue of banknotes, a distinctive form of promissory note that did not bear interest, designed to facilitate payments without the need for both parties in a transaction to have current accounts. [25]

This last innovation, the use of private corporations such as banks to consolidate and manage the government’s debt, is at the heart of the modern financial system. The money we use is the government’s liability, backed by its ability to collect taxes. Yet private banks would now be permitted to create credit by extending loans denominated in the same unit of account that the government required for the payment of taxes, the ultimate form of financial settlement.
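
The credit-creation mechanics the Riksbank pioneered and the Bank of England institutionalized can be sketched as a toy model: each round, a bank keeps a fraction of deposits in reserve and lends out the rest, and the loans come back as new deposits. The 20% reserve ratio here is an illustrative assumption, not any historical bank’s policy:

```python
# Toy model of fractional reserve credit creation: an initial coin deposit
# is lent out repeatedly, with a fraction held back as reserves each round.
def total_deposits(initial_deposit: float, reserve_ratio: float, rounds: int = 50) -> float:
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # lent out, then re-deposited
    return total

# With a 20% reserve ratio, 1,000 in coin supports about 5,000 in deposits
# (the limit is initial / reserve_ratio). A 100% ratio, like the early
# Wisselbank's, creates no credit: deposits never exceed the coin brought in.
print(round(total_deposits(1_000.0, 0.20)))  # ~5000
print(round(total_deposits(1_000.0, 1.00)))  # 1000
```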

We’ll take a look at how that happened next time.

SOURCES:

[1] Not used.
[2] Niall Ferguson; The Ascent of Money, p. 69
[3] William N. Goetzmann and K. Geert Rouwenhorst, eds.; The Origins of Value: The Financial Innovations that Created Modern Capital Markets, p. 147
[4] John H. Munro; The medieval origins of the ’Financial Revolution’: usury, rentes, and negotiability. http://mpra.ub.uni-muenchen.de/10925/ p. 514
[5] http://magic-maths-money.blogspot.com/2011/07/structured-finance-in-twelfth-century.html
[6] ibid.
[7] ibid.
[8] ibid.
[9] ibid.
[10] https://www.ft.com/content/6851f286-288d-11de-8dbf-00144feabdc0
[11] Niall Ferguson; The Ascent of Money, p. 72
[12] William N. Goetzmann and K. Geert Rouwenhorst, eds.; The Origins of Value: The Financial Innovations that Created Modern Capital Markets, p. 147
[13] ibid., p. 158
[14] http://magic-maths-money.blogspot.com/2011/07/structured-finance-in-twelfth-century.html
[15] William N. Goetzmann and K. Geert Rouwenhorst, eds.; The Origins of Value: The Financial Innovations that Created Modern Capital Markets, p. 163
[16] Francois R. Velde; Government Equity and Money: John Law’s System in 1720 France, pp. 5-6
[17] Francois R. Velde; Government Equity and Money: John Law’s System in 1720 France, pp. 5-6
[18] Francois R. Velde; Government Equity and Money: John Law’s System in 1720 France, p. 8
[19] Niall Ferguson; The Ascent of Money, pp. 73-74
[20] Francois R. Velde; Government Equity and Money: John Law’s System in 1720 France, p. 8
[21] John H. Munro; The medieval origins of the ’Financial Revolution’: usury, rentes, and negotiability. http://mpra.ub.uni-muenchen.de/10925/ pp. 73-74
[22] Niall Ferguson; The Ascent of Money, p. 74
[23] Niall Ferguson; The Ascent of Money, pp. 74-75
[24] http://www.businessinsider.com/700-years-of-government-bond-yields-2013-12
[24a] Fernand Braudel; Civilization and Capitalism, Volume 2: The Wheels of Commerce, pp. 100-102
[25] Niall Ferguson; The Ascent of Money, pp. 48-49