(breaking this into multiple posts because of trouble posting) Part 1
How did private property begin?
Matt Bruenig quotes the libertarian economist Bryan Caplan’s attempt to explain how private property first came to be:
“There are many clear-cut cases of righteous acquisition; once we understand them, we can use them to analyze fuzzier cases. What are some clear-cut cases? An individual living alone on an island grows some food, builds a house, carves a sculpture, or quarries some rock. If someone else shows up on the island, the new arrival seems morally obligated to respect that property [presumptively]. This isn’t just ‘seems to me’ or ‘seems to libertarians’; it’s ‘seems to almost everyone other than self-conscious socialist philosophers.’ Other clear-cut cases: If two people mutually agree to pool their resources and effort, then split the rewards according to an explicit formula – whether 50/50, 90/10, or whatever. Or: I pay you ten pounds of food to build me a new hut.”
“If you flatly insist that a person who builds a hut on a desert island isn’t morally entitled to exclude a new arrival from sharing it, there’s little left for me to say. Otherwise, we can build on these straightforward cases to credibly justify everything from real estate development to malls to multinational corporations. Doesn’t any big economic project in the modern world ultimately contain at least a small dose of theft? (i.e., doesn’t every skyscraper have at least one stolen brick in it?) Very likely, but in the real world, this rarely turns out to be a serious moral problem.”
Reading it, I am immediately struck by the following: Caplan’s argument is ahistorical; it is simply a thought experiment. It makes no recourse to any documented historical facts or evidence. Given that property rights are at the very heart of libertarian thought, how can they be understood, or justified, without recourse to actual history? Perhaps this is why “mainstream” economic thought has mostly abandoned historical inquiry and focused instead on justifying currently existing conditions with sophisticated mathematical models and abstract formulas. Property rights, inequality and asymmetrical power relations are simply taken for granted as “natural” or “customary” and explained away or never seriously questioned (“…in the real world, this rarely turns out to be a serious moral problem.”).
There is a good analogy here with the libertarian explanation for the creation of money. That explanation is also a deductive thought experiment without recourse to any historical or anthropological data.
Libertarians posit an imaginary, ahistorical society—again composed of isolated individuals pursuing specialized occupations—engaging in repeated barter transactions for everything they need which they cannot produce themselves. To reduce transaction costs, they somehow—and without recourse to any centralized governing authority—come up with an item that has value not because it is intrinsically valuable, but because it can be used for exchanges, and they use this for trading in place of constantly bartering for things. Everyone agrees to accept this item in exchange. For reasons such as scarcity, divisibility and durability, certain metals come to be chosen as the standard, especially gold and silver. These metals become “real” money used by traders in markets. Later, as paper becomes more common, credit appears, representing gold and silver stored in a bank vault somewhere. Government then comes along and skims off (i.e. steals) a portion of this wealth from the producers peacefully engaging in mutually beneficial transactions in “free and open” markets which formed spontaneously without government aid or sanction. Or so the story goes.
The problem with this story is that it never actually happened! It, too, is ahistorical. This article in the Independent Australia summarizes the problems with this story, and then describes the evidence for how money was actually created based on the historical and anthropological evidence:
According to the myth, money evolved naturally, without the intervention of power institutions of any kind. In this tale, which I won’t repeat here, governments enter the story of money rather late on, eventually messing up a naturally stable system of commodity money. The abandonment of the gold standard by the USA in August 1971 supposedly marked the final betrayal by modern governments of a naturally stable system which had no need for governmental institutions at all. Governments confiscate taxpayers’ money. Governments create inflation and instability. Governments are generally bad news. So goes the story.
The thing is that there is apparently no evidence that this is true. None at all.
The prominent anthropologist, Caroline Humphrey, is very clear:
‘No example of a barter economy, pure and simple, has ever been described, let alone the emergence from it of money; all available ethnography suggests that there has never been such a thing.’
The monetary historian and one-time adviser to the Secretary of State for Wales, Glyn Davies, is equally clear:
‘On one thing the experts on primitive money all agree, and this vital agreement transcends their minor differences. Their common belief backed up by the overwhelming tangible evidence of actual types of primitive moneys from all over the ancient world and from the archaeological, literary and linguistic evidence of the ancient world, is that barter was not the main factor in the origins and earliest development of money.’
In the case of ancient Mesopotamia, in particular, there are good reasons for believing that what became the social institution of money developed in a different way entirely.
The development of farming allowed for the emergence of religious and governmental institutions. Soldiers, administrators and priests needed to be provisioned, and accounts had to be kept of how this was being done. Tributes and taxes had to be raised in order to make this possible. The earliest sense in which money existed was as a way of recording the value of such tributes and taxes. The earliest unit of money as a scoring system may have been weights (or shekels) of barley, in places like Uruk, at least 5,000 years ago and probably well before that.
Not that taxes had to be paid in barley. They were just valued in these units. You could pay your taxes or tributes in a wide variety of items that the temple community could find a use for, as anthropologist David Graeber explains in his Debt: The First 5,000 Years.
The invention of government, accounting, taxation and money appear to have created early markets, specialisation and trade, and even credit and finance. Money as a thing was whatever you needed to obtain to pay your taxes. Governments were then able to spend by issuing token money which could be used in the future to pay taxes back to the government. Perhaps clay tokens were the earliest form of token money and, in some ways, not all that much unlike a modern $50 note.
I’m seeing a consistent pattern here. Libertarian thought tends to be based not on any serious inquiry into historical contingencies, or on empirical data, but on hypothetical thought experiments. And these thought experiments tend to be highly sympathetic to the status quo. Thought experiments are not necessarily bad – after all, scientists use them all the time to help formulate hypotheses. But when they contradict everything we know about history, anthropology, archaeology, sociology and psychology, well then, maybe we should stop taking them seriously. For example, there’s this from Caplan:
“An individual living alone on an island grows some food, builds a house, carves a sculpture, or quarries some rock. If someone else shows up on the island, the new arrival seems morally obligated to respect that property.”
The thing is, the above scenario never happened either! The lone individual has never existed! A hypothetical society of one has never existed outside of novels and films about imaginary castaways (and libertarian thought experiments). How can you rely on an imaginary state of nature which never existed in history to justify your position? It’s similar to the absurd Hobbesian concept of solitary individuals “choosing” to unite and form a government ex nihilo based on shared interests. As Francis Fukuyama writes:
We might label this the Hobbesian fallacy: the idea that human beings were primordially individualistic and that they entered into society at a later stage in their development only as a result of a rational calculation that social cooperation was the best way for them to achieve their individual ends…But in fact it is individualism and not sociability that developed over the course of human history. That individualism seems today like a solid core of our economic and political behavior is only because we have developed institutions that override our more naturally communal instincts…
Everything that modern biology and anthropology tell us about the state of nature suggests the opposite: there was never a period in human evolution when human beings existed as isolated individuals; the primate precursors of the human species had already developed extensive social, and indeed political, skills; and the human brain is hardwired with faculties that facilitate many forms of social cooperation…Human beings do not enter into society and political life as a result of conscious, rational decision. Communal organization comes to them naturally, though the specific ways they cooperate are shaped by environment, ideas, and culture.
[Fukuyama; The Origins of Political Order, pp. 29-30]
Just like governments, property relations must have arisen in a social milieu, and speculating about hypothetical Robinson Crusoes is not credible scholarship. Why not at least make a serious attempt to understand what actually happened? Perhaps it’s because it would not reflect so kindly upon the thesis you’re trying to justify. It seems like backward reasoning to me: starting with the conclusion and coming up with ways to justify it.
Matt Bruenig makes a similar point:
The problem with the case is that, by clearing out all other people from the island, it eliminates the liberty destruction that makes property acquisition so obviously problematic. What if instead of one individual washing up on an island, ten of them do? Then one of them asserts that certain resources and land areas are his and that those who do not respect that claim will be violently attacked? This is more analogous to a real-life case of property acquisition where there exists more than one human being. It also clearly presents the problem of property acquisition rather than trying to get around it by creating a hypothetical society of one.
Similarly, Caplan’s hypothetical agreement between more-or-less equal individuals to divvy up resources according to some predetermined, mutually-agreed-upon formula (“…whether 50/50, 90/10, or whatever…”) is inconsistent with what we observe. It is more characteristic of strangers bartering in modern-day market economies than of any ancient or traditional society we know of:
“Other clear-cut cases: If two people mutually agree to pool their resources and effort, then split the rewards according to an explicit formula – whether 50/50, 90/10, or whatever. Or: I pay you ten pounds of food to build me a new hut.”
This scenario assumes a hypothetical closed transaction among relative strangers with no ongoing social interactions, where each party expects to walk away roughly even. We know that this, like the hypothetical Robinsonade society of one, is not what actually happened in the past. Instead, exchanges were embedded in a wider social fabric without explicit reckoning of who owed what to whom. The unspoken laws of reciprocity meant that every individual could be secure in the knowledge that their labor and goods would be reciprocated at some point in the future by others, with everything balancing out in the long run.
What actually happened in practice is that when individuals knew each other, exchange was based on reciprocity: a gift would be given in the anticipation of its being reciprocated in the future. (When they didn’t know each other there was barter, but in such situations money cannot emerge, because cowrie shells might be important in one society and gold in another.)
One of the most famous stories illustrating the role of reciprocal exchange concerns an anthropologist who, after spending some time with bushmen, gave one of them his knife. When visiting the group some years later, the anthropologist discovered that the knife had been owned, at some point in time, by every member of the community. The knife had not been communally owned; its ownership had passed from one person to the next, and its passage was evidence of a social network in the community, just as the motion of planets is evidence of an otherwise invisible gravitational field.
Karl Polanyi makes the same point in The Great Transformation:
Ceremonial display serves to spur emulation to the utmost and the custom of communal labor tends to screw up both quantitative and qualitative standards to the highest pitch. The performance of acts of exchange by way of free gifts that are expected to be reciprocated though not necessarily by the same individuals – a procedure minutely articulated and perfectly safeguarded by elaborate methods of publicity, by magic rites, and by the establishment of “dualities” in which groups are linked in mutual obligations – should in itself explain the absence of the notion of gain or even of wealth other than that consisting of objects traditionally enhancing social prestige.
…For it is on this one negative point that modern ethnographers agree: the absence of the motive of gain; the absence of the principle of laboring for remuneration; the absence of the principle of least effort; and, especially, the absence of any separate and distinct institution based on economic motives…[pp. 48-49]
Here’s anthropologist David Graeber describing the role that social relationships play in reciprocity:
[Sociologist Marcel] Mauss didn’t really think of everything in terms of exchange; this becomes clear if you read his other writings besides ‘The Gift’. Mauss insisted there were lots of different principles at play besides reciprocity in any society – including our own. For example, take hierarchy. Gifts given to inferiors or superiors don’t have to be repaid at all. If another professor takes our economist out to dinner, sure, he’ll feel that he should reciprocate; but if an eager grad student does, he’ll probably figure just accepting the invitation is favor enough; and if George Soros buys him dinner, then great, he did get something for nothing after all. In explicitly unequal relations, if you give somebody something, far from doing you a favor back, they’re more likely to expect you to do it again.
Or take communistic relations – and I define this, following Mauss actually, as any ones where people interact on the basis of ‘from each according to their abilities to each according to their needs’. In these relations people do not rely on reciprocity, for example, when trying to solve a problem, even inside a capitalist firm. (As I always say, if somebody working for Exxon says, “hand me the screwdriver,” the other guy doesn’t say, “yeah and what do I get for it?”) Communism is in a way the basis of all social relations – in that if the need is great enough (I’m drowning) or the cost small enough (can I have a light?) everyone will be expected to act that way.
I’m sensing a pattern here. What does it mean that libertarian arguments depend on hypothetical scenarios that fall apart under empirical scrutiny? Does it not mean that they should be discarded? Is it possible that these views continue to be the economic and political orthodoxy only because there are a lot of special interests promoting them? Maybe this is behind the disturbing trend of claiming that all of the humanities have been taken over by “Marxists.” It’s a good way to shut down the debate when the facts are not on your side.
Le secret des grandes fortunes sans cause apparente est un crime oublié, parce qu’il a été proprement fait.
“The secret of great fortunes without apparent cause is a crime forgotten, for it was properly done.”
—Honoré de Balzac
One of the simplest economic questions of all turns out to be one of the most complicated: Where did private property come from?
After all, private property is a shared fiction. There is no “property” apart from the legal rights enforcing it. Property is not, nor can it be, “natural.” We know that, once upon a time, there was no such thing as private property. Indeed, this was the default condition during the hundreds of thousands of years when humans subsisted as nomadic foragers, following herds of wild animals across the savanna and exploiting the abundant natural resources available to all.
There is no concept of “property” in the animal kingdom apart from what an animal can defend or conceal for itself. As the introduction to Primitive Property ponders:
No mere psychological explanation of the origin of property is, I venture…admissible, though writers…have attempted to discover its germs by that process in the lower animals. A dog, it has been said, shews an elementary proprietary sentiment when he hides a bone, or keeps watch over his master’s goods. But property has not its root in the love of possession. All living beings like and desire certain things, and if nature has armed them with any weapons [they] are prone to use them in order to get and keep what they want. What requires explanation is not the want or desire of certain things on the part of individuals, but the fact that other individuals, with similar wants and desires, should leave them in undisturbed possession, or allot to them a share, of such things. It is the conduct of the community, not the inclination of individuals, that needs investigation.
The mere desire for particular articles, so far from accounting for settled and peaceful ownership, tends in the opposite direction, namely, to conflict and the right of the strongest. No small amount of error in several departments of social philosophy, and especially in political economy, has arisen from reasoning from the desires of the individual, instead of from the history of the community. [p. xi]
So that raises the question: how did we go from a system where resources were owned in common by all, to one where resources were owned by specific individuals who could deprive others of their use at will? And how and why did everyone else come to accept this situation?
After all, that’s what private property is. Private property is property whose owner – an individual or group of individuals – can de-prive other persons of its use; hence private (privation comes from the same root). Control and management over the item in question “belongs” only to certain people, and they and they alone are free to determine its use and claim the fruits of whatever it produces. They can also pass it down to their heirs in (theoretical) perpetuity.
Which is fine as far as it goes, but how then can one claim that such a system is based entirely on freedom and liberty? After all, it’s all about taking liberty away from other people, particularly those who don’t own property. And those who do own property have considerable power over those who don’t – the very antithesis of freedom. Those without property have very little recourse.
Matt Bruenig over at Jacobin points out that this is a serious problem with libertarian thought:
Perhaps the most interesting thing about libertarian thought is that it has no way of coherently justifying the initial acquisition of property. How does something that was once unowned become owned without nonconsensually destroying others’ liberty? It is impossible. This means that libertarian systems of thought literally cannot get off the ground. They are stuck at time zero of hypothetical history with no way forward.
If we accept that the only role of government is to guarantee private property rights, as libertarians claim, then shouldn’t we care about how those property rights came to be in the first place?
A problem with the concept of private property is that, once private property becomes established in any society, it sets in motion a Monopoly-style, winner-take-all tournament in which wealth accrues to fewer and fewer people over time. Things like the Law of Cumulative Advantage (a.k.a. the Matthew Effect) and pure dumb luck ensure that more and more wealth accumulates to those who already have a lot to begin with, while those with very little lose what little they have. This is so universal as to be considered almost a natural law.
This creates a polarization which has undermined every society that we know of since the dawn of agrarian civilizations thousands of years ago. It also appears to have very few good solutions.
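The condensation dynamic is easy to demonstrate. Below is a minimal sketch (my own illustration, not from any source quoted here) of the so-called “yard-sale” exchange model, a standard toy model of cumulative advantage: every single trade is fair in expectation, yet wealth still piles up in ever fewer hands. The function names and parameters are assumptions chosen for illustration.

```python
import random

def yard_sale(n_agents=100, rounds=50_000, frac=0.2, seed=42):
    """Yard-sale exchange model: each round, two randomly chosen agents
    bet a fraction of the *poorer* one's wealth on a fair coin flip."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents  # everyone starts out exactly equal
    for _ in range(rounds):
        i, j = rng.sample(range(n_agents), 2)
        stake = frac * min(wealth[i], wealth[j])
        if rng.random() < 0.5:
            wealth[i] += stake; wealth[j] -= stake
        else:
            wealth[i] -= stake; wealth[j] += stake
    return wealth

def top_share(wealth, pct=0.10):
    """Fraction of total wealth held by the richest `pct` of agents."""
    ranked = sorted(wealth, reverse=True)
    k = int(len(ranked) * pct)
    return sum(ranked[:k]) / sum(ranked)

w = yard_sale()
# The top 10% start with exactly 10% of the wealth; after many "fair"
# trades they hold far more, even though no trade favored anyone.
print(f"top 10% now hold {top_share(w):.0%} of all wealth")
```

No cheating, no differences in skill, no theft: pure dumb luck plus compounding is enough to polarize an initially equal society.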
An alarming projection produced by the House of Commons library suggests that if trends seen since the 2008 financial crash were to continue, then the top 1% will hold 64% of the world’s wealth by 2030. Even taking the financial crash into account, and measuring their assets over a longer period, they would still hold more than half of all wealth.
Since 2008, the wealth of the richest 1% has been growing at an average of 6% a year – much faster than the 3% growth in wealth of the remaining 99% of the world’s population. Should that continue, the top 1% would hold wealth equating to $305tn (£216.5tn) – up from $140tn today.
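As a quick sanity check (my own arithmetic, not from the article): simply compounding the quoted 6% growth rate does reproduce the projection’s order of magnitude, assuming a baseline of roughly 2017 and steady growth through 2030.

```python
top_now = 140.0    # top-1% wealth today, in $tn (figure quoted above)
growth_top = 0.06  # 6% a year for the top 1% (figure quoted above)
years = 13         # roughly 2017 to 2030 -- my assumption

projected = top_now * (1 + growth_top) ** years
# Lands near $299tn, in the same ballpark as the article's $305tn figure.
print(f"projected top-1% wealth in 2030: ${projected:.0f}tn")
```

The small gap from the article’s $305tn presumably comes from the House of Commons model using a different baseline year or finer-grained data; the point is only that the headline number is consistent with the stated growth rate.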
It also puts paid to the idea that libertarianism and market relations are – or can be – systems entirely free of violence or coercion, unlike “oppressive” central governments. After all, isn’t depriving other people of the resources they need to survive, and making them pay for them, a kind of violence? How could it not be? Here’s Bruenig quoting Matt Zwolinski making this same point:
If I put a fence around a piece of land that had previously been open to all to use, claim it as my own, and announce to all that I will use violence against any who walk upon it without my consent, it would certainly appear as though I am the one initiating force (or at least the threat of force) against others. I am restricting their liberty to move about as they were once free to do. I am doing so by threatening them with physical violence unless they comply with my demands. And I am doing so not in response to any provocation on their part but simply so that I might be better able to utilize the resource without their interference.
Again, what’s so funny about this insight is not just that it is a persuasive counterpoint to libertarianism, but rather that it seems to suggest that libertarian principles themselves forbid property ownership.
Similarly, any market-based system relies upon violence, direct or indirect, just as much as does any central government:
Can you imagine how capitalism could possibly function without coercion and the threat of violence? There would be nothing to stop theft and pillage of businesses, no claim to ownership over goods, no implementation of contracts, no enforcement of patents or protection against fraud. There would be no incentive to innovate, to invest, to trade or to do any form of business. Without some degree of coercion we would truly be in a Hobbesian world where nothing was secure. There simply would be no such thing as capitalism without the security supplied by coercion.
But, an ancap would argue, we voluntarily enter into contracts, unlike any agreement we have with the state. It’s not coercion if we have agreed to it. But this is patently untrue of private property. No one (apart from the state and the previous owners) agreed to my parents’ ownership of their property. No contract was signed with their neighbours and no consent was given by anyone else in society. In fact no one ever agreed that property should be privately owned. Who ever said that land could belong to any one person? To many ancient civilisations (such as the Celts and Native Americans) this notion was as strange as any one person claiming they owned the air or the sky.
Indeed, how can we have “pure and natural liberty” based on something as unnatural as private property? How can a system based on depriving others of their rightful share in the commons claim that it is really all about “freedom?” And how can a system which deprives others of the basic things they need to survive be associated with “liberty?” It doesn’t make sense. Furthermore, how can we assert that such a system is entirely free of violence and coercion? Isn’t it just a matter of who does the coercion and why?
I’m going to sum up my final objections to Jordan Peterson and move on, because I’ve got other things to talk about.
Real ideological diversity
I’m going to begin with this quote from economist Richard Wolff, referring to his teaching of Marxist economics in university:
34:15: “I think, if the universities and colleges had a commitment to diversity of perspective the way they now at least say they have with diversity of gender and race and all of that, then we would have had people like me teaching – lots more of them than I am; many like me – teaching. And then we would have at least confronted a generation of students with the alternatives that they could have then thought about and made up their own minds. But this country has never, in my lifetime, had the confidence in its own people to give them real freedom of choice in learning. They’ve given them a very restricted diet and we live with the consequences.”
This is an odd complaint considering Peterson’s contention that entire disciplines, including sociology, anthropology, biology, history, not to mention law and education, are intent on indoctrinating unsuspecting students with “postmodern Neo-Marxism.” Apparently the only place on a college campus where you won’t hear about Marx is the economics department!
I read this as saying that universities use diversity of gender/ethnic groups as a screen to cover their lack of diversity on actual intellectual ideas which are threatening to the ruling class. I agree. As I’ve said before, identity politics is a great way of neutering the Left.
Yet people are convinced “Marxists” have taken over some of the largest corporations in America, even while entire areas of the country (including the national government) are effectively under one-party rule by Republicans – a party farther to the right than any other major party in the developed world.
The Real Cultural Marxists
This article about Christopher Wylie, the programmer whose ideas led to the founding of Cambridge Analytica and who contributed to Trump’s victory, has been widely read in the wake of the scandals surrounding that company. I found this part most interesting, given the right-wing demonization of cultural Marxism and handwringing over things like gay marriage:
A few months later, in autumn 2013, Wylie met Steve Bannon. At the time, Bannon was editor-in-chief of Breitbart, which he had brought to Britain to support his friend Nigel Farage in his mission to take Britain out of the European Union.
What was he like?
“Smart,” says Wylie. “Interesting. Really interested in ideas. He’s the only straight man I’ve ever talked to about intersectional feminist theory. He saw its relevance straightaway to the oppressions that conservative, young white men feel.”
Wylie meeting Bannon was the moment petrol was poured on a flickering flame. Wylie lives for ideas. He speaks 19 to the dozen for hours at a time. He had a theory to prove. And at the time, this was a purely intellectual problem. Politics was like fashion, he told Bannon.
“[Bannon] got it immediately. He believes in the whole Andrew Breitbart doctrine that politics is downstream from culture, so to change politics you need to change culture. And fashion trends are a useful proxy for that. Trump is like a pair of Uggs, or Crocs, basically. So how do you get from people thinking ‘Ugh. Totally ugly’ to the moment when everyone is wearing them? That was the inflection point he was looking for.”
It was Bannon who took this idea to the Mercers: Robert Mercer – the co-CEO of the hedge fund Renaissance Technologies, who used his billions to pursue a rightwing agenda, donating to Republican causes and supporting Republican candidates – and his daughter Rebekah.
Nix and Wylie flew to New York to meet the Mercers in Rebekah’s Manhattan apartment.
“She loved me. She was like, ‘Oh we need more of your type on our side!’”
“The gays. She loved the gays. So did Steve [Bannon]. He saw us as early adopters. He figured, if you can get the gays on board, everyone else will follow. It’s why he was so into the whole Milo [Yiannopoulos] thing.”
It seems like a sort of projection: create a conspiracy theory about evil Marxists manipulating the culture for their political agenda, and you obscure the fact that you are actually doing what you accuse your opponents of doing! Note that Breitbart is one of the major outlets pushing the cultural Marxist conspiracy theory while at the same time believing that “politics is downstream from culture” – supposedly the central idea of cultural Marxism. However, although the vast cultural Marxist conspiracy on campuses remains in the realm of speculation, Bannon’s and Mercer’s actions are actually documented.
So who’s really manipulating culture to their own ends here, college professors, or the people who, you know, actually wield political power in the real world? Personally, I’m more afraid of Cambridge Analytica and Breitbart than postmodernist professors or transgender student activists on campus.
Is the PC Threat Exaggerated?
I suspect that the campus free-speech threat is greatly exaggerated for political purposes. Consider this quote from Daniele Bolelli, a teacher and writer based in Los Angeles, speaking on the Joe Rogan podcast:
“I think a lot of this stuff is also a little bit media created in the sense that, ‘Let’s find the most batshit crazy person on that side; let’s put the spotlight on them,’ which makes everybody go, ‘What the fuck, who are those crazy people?’ It’s kind of like if you were to pick the Westboro Baptist Church and make it be representative of Christianity. It’s not, but if you keep putting the spotlight there, you’ll create this perception [which will] create a backlash and it becomes this thing where…”
“Like, that’s one of the funny things that I was noticing, because…I really don’t like political correctness. I really don’t like academia. There are ten thousand of these things where I’m completely on board with not liking some of these things.”
“But then there’s another side where…I have been teaching at university since 2001. I don’t think I’ve seen once a case of the kind of political correctness that I see in articles in the media. Not once. I was doing the math. I had probably, maybe 11,000 students in my classes over the course of those years. I haven’t had one person ever defend hard-core Communism, or make an argument…even among my colleagues, whom I have issues with for other reasons, that’s never been one of the things.”
“I keep hearing about it, I keep reading about it in papers, but why is it when that’s how I make my living–I’m on college campuses all the time–I hardly ever see it?”
“I’m not saying that it’s not true; of course these stories are true. There’s no argument. But what I’m wondering is how much they get blown out of proportion because you get clicks, because it makes for an interesting narrative which then some people also live off of. How much of it is putting a spotlight on a rare exception and making it seem the norm, versus how much is a real thing?”
“I mean, I teach in Southern California. Santa Monica is one of the most liberal places around. If this thing is as dominant as advertised, I should be running into it all the time, right? And I don’t like that stuff so I would be sensitive…I would be paying attention. And I don’t see it. So I’m like, ‘Hmmm, what’s going on here?'”
“…I am not arguing that they [Jordan Peterson, Bret Weinstein] are wrong, they’re completely right. My issue is from there to arguing that this is this super prevalent thing. It’s like, from one story to say instead there’s a communist conspiracy to brainwash us all, we are starting from a completely understandable premise and taking it twenty-five steps too far.”
“I agree. But I think what’s happening is more of these unusual situations are occurring and so people are terrified of this spreading like wildfire across the country. Because kids are very easily influenced. And they’re also idealistic. They want to change the world.”
What’s the truth? Bolelli’s experience is backed up by data:
As Acadia University political science professor Jeffrey Sachs points out, according to a General Social Survey (GSS) dataset, “young people aged 18-34 are the most tolerant of potentially offensive speech and trending upward,” meaning not only that young people are already the most tolerant of offensive speech, but that they’re getting more tolerant…
A Heterodox Academy analysis of the FIRE disinvitation data shows that the most successful attempts to shut down speakers have come from right-leaning groups shutting down speech with which they don’t agree, but this hasn’t stopped pundits and politicians from seeing the student left as the gravest threat to free speech.
While “scalp hunting” is not anything I endorse, these incidents have more the character of “witch hunts” than of any kind of Leftist dogma. Witch hunts are a sad part of human nature, and seem to be especially prevalent in the United States for some reason. Remember that Communists and anarchists were the primary victims of witch hunts during the twentieth century. While unfounded accusations hurled at innocent people are always a bad thing, are they really more prevalent now than at any time in the past? Or is this more about playing to white male insecurity and fear of quotas in a time of disappearing job opportunities? After all, in the 1960s, Leftist radicals set off bombs on campuses! Professors threatened by the “extreme left” end up as millionaires. Those menaced by the extreme right end up in the morgue (e.g. Heather Heyer; Anders Breivik’s and Dylann Roof’s victims, etc.).
Why is the Left so *violent*???
And yet, as Rogan opined earlier in the episode:
What’s fascinating to me about human beings of today is I’ve never seen a time where people are more interested in other people doing what they want them to do. Like, other people thinking the way they want them to think; other people behaving the way they want them to…People, it seems to me are more concerned with controlling people’s expression and thinking today than ever before. And even more so on the left.
I’m seeing this interesting trend today where people…it’s almost like we don’t like where things are headed. We don’t like what’s happening, we don’t like who the president is, so people are being real adamant about enforcing certain types of behavior. And that in turn, just like we were talking about people suppressing certain types of alcohol, that in turn …makes people rebel.
I feel like there’s more people leaning Right today than ever before. And I attribute it entirely to the people on the Left.
Yet, the data shows that this is factually untrue:
For nearly 50 years, the General Social Survey (GSS) has asked Americans about their tolerance for offensive speech. Some questions include: Should an anti-American Muslim cleric be permitted to teach in a public school? Should the local library stock books hostile to religion? On almost every question, young people aged 18 to 34 are the most likely to support free speech...Not only are young people the most likely to express tolerance for offensive speech, but with almost every question posed by the GSS, each generation of young people has been more tolerant than the last…
And it’s definitely not “spreading like wildfire,” despite what Rogan promotes on his show:
[T]hese incidents are rare. Take the phenomenon of blocking invited speakers from speaking on campus, also known as no-platforming. The Foundation for Individual Rights in Education (FIRE) reported 35 no-platforming attempts in 2017; out of those, 19 succeeded. In a country with over 4,700 schools, that hardly constitutes a crisis.
Finally, despite claims that college administrators are increasingly coddling students with speech codes, FIRE shows that the opposite is the case. The number of universities with restrictive speech codes has been dropping each year for the past decade and is currently at an all-time low. Most universities are not the ideological safe spaces their critics imagine.
In fact, our speech is often much more restricted off campus than on. Consider the workplace, where most non-students spend the bulk of their time when not at home. Once you’re on the job, most First Amendment rights disappear. The things you say, the clothing you wear, even the bumper stickers on the car you parked in the company lot — all can be restricted by private-sector employers. Perhaps the reason campus free speech controversies can sound so strange is because few of us are aware of how much we are already shielded from hateful or offensive speech.
In other words, the right-wing propaganda, pitched mainly at a demographic that has never set foot on a college campus or in a corporate boardroom, is working as intended.
The propaganda tells us: Don’t worry about your job going away. Don’t worry about not being able to access health care. Don’t worry about all the people dying in your town from Fentanyl. Worry about the real threat: liberals who want to control your speech. Propaganda works.
I’ve also pointed out numerous instances of right-wing political correctness that stifles speech, yet the partisan desire — especially on the right — to manufacture fear of a particularly “illiberal left” is an important part of the conservative playbook in the Trump era. This despite the fact that President Donald Trump has openly attempted to use the power of the presidency and the resources of government to silence athletes and journalists he doesn’t like. Data is unlikely to change this attitude…being “anti-PC” is now effectively a form of tribalist identity politics. When I draw attention to right-wing threats to freedom of speech, these counterexamples — whether data-based or anecdotal — tend to threaten anti-PC identity and cause membership to close ranks.
If we consider the rise not only of anti-college views in popular media, but in organizations that seem to exist primarily to spread anti-college, anti-student and anti-faculty propaganda — like Turning Point USA or Campus Reform — it becomes clear that characterizing the campus left as “against free speech” appeals to large numbers of people who otherwise care little about quotidian campus affairs. Anti-PC and anti-college identity politics align with the faux-populism driving broader right-wing politics today.
Because of such propaganda, conservatives who see themselves, in some ways rightly, as victims of “the elite” are able to position themselves as fighting a scary, authoritarian, left-wing caricature. Indeed, the only way it’s possible to see left-wing college students as a group whose power rivals that of the presidency or the billionaire donor class is by embracing the cartoon image of lefty students as little authoritarians, and promoting it despite counterevidence. The political investment in the myth of the authoritarian college student is simply more powerful than even the most comprehensive data analyses on the subject.
Peterson is virulently anti-Communist and anti-Marxist, which to him are essentially the same thing. He insists that “Marxist” philosophy is based primarily on envy of the successful and inevitably leads to the gulags and reeducation camps.
This article casts doubt on that rigid black-and-white thinking and is worth a read:
Since nuance in the story of 20th-century communism might ‘reduce the ease of our thoughts and the clarity of our feelings’, anti-communists will attack, dismiss or discredit any archival findings, interviews or survey results recalling Eastern Bloc achievements in science, culture, education, health care or women’s rights. They were bad people, and everything they did must be bad; we invert the ‘halo’ terminology and call this the ‘pitchfork effect’. Those offering a more nuanced narrative than one of unending totalitarian terror are dismissed as apologists or useful idiots. Contemporary intellectual opposition to the idea that ‘bad people are all bad’ elicits outrage and an immediate accusation that you are no better than those out to rob us of our ‘God-given rights’.
In 1984, the anthropologist Clifford Geertz wrote that you could be ‘anti anti-communism’ without being in favour of communism…In other words, you could stand up against bullies such as Joseph McCarthy without defending Joseph Stalin. If we carefully analyse the arguments of those attempting to control the historical narrative of 20th-century communism, this does not mean that we are apologising for, or excusing the atrocities or the lost lives of millions of men and women who suffered for their political beliefs.
What is the real reason for such Red-baiting and scare mongering, and why has it increased so markedly?
Conservative and nationalist political leaders in the US and across Europe already incite fear with tales of the twin monsters of Islamic fundamentalism and illegal immigration. But not everyone believes that immigration is a terrible threat, and most Right-wing conservatives don’t think that Western countries are at risk of becoming theocratic states under Sharia law. Communism, on the other hand, provides the perfect new (old) enemy. If your main policy agenda is shoring up free-market capitalism, protecting the wealth of the superrich and dismantling what little is left of social safety nets, then it is useful to paint those who envision more redistributive politics as wild-eyed Marxists bent on the destruction of Western civilisation.
What better time to resurrect the spectre of communism? As youth across the world become increasingly disenchanted with the savage inequalities of capitalism, defenders of the status quo will stop at nothing to convince younger voters about the evils of collectivist ideas. They will rewrite history textbooks, build memorials, and declare days of commemoration for the victims of communism – all to ensure that calls for social justice or redistribution are forever equated with forced labour camps and famine.
Peterson’s anti-communist zealotry and conflating political correctness with Marxism is a very useful message for those afraid that people may start questioning the increasing distribution of income upward. Peterson’s message is: don’t complain, don’t participate, just focus on climbing the existing hierarchy. This may be why so many deep pockets are willing to contribute to his Patreon account.
Are identity politics necessarily bad? Isn’t that what all politics is? After all, almost every policy will produce winners and losers. Shouldn’t we care which group our representatives are in? That our neighbors are in? That we are in? Almost every politician will try to claim affiliation with their constituents. How could they not?
…all politics is identity politics. It is not just the Left that appeals to people based on their background and identity, all ideologies do so. All politicians campaign by highlighting their similarities with the voters, their common identity and by claiming to best represent the people. The Right is just as reliant on identity politics.
Think about a typical political campaign, regardless of political party. How does a candidate present themselves? They usually begin by emphasising their connection to the constituency and how long they’ve lived there. It’s certainly a benefit if they were born there and voters react negatively to “outsiders”. Then they’ll show their bond with the community, their participation in local events, traditions etc. Always the emphasis is on how similar they are to the constituents, how much they have in common with the voters. ‘Vote for me because I’m just like you, I can best represent you because I have gone through the same experiences you have’. They’ll show their bond with local industries and interest groups, their shared religion and patriotism (especially in America).
All of this is identity politics. All politicians aim to get voters to identify with them; the only difference is method. It’s only a question of whether they highlight their common race, religion, class, geography, occupation etc. The goal of every campaign is to make voters identify with the candidate and believe that they are part of the same group.
Does anyone remember Sarah Palin and her “real Americans” who go hunting, attend church regularly, have kids and drive pickup trucks? Does anyone seriously think Palin was chosen because she was the most qualified candidate for vice president available to the McCain campaign? Or was it for reasons of “identity politics?” What about Mike Pence and his affiliation with Christian Evangelicals, a core part of the right-wing Republican coalition?
Peterson seems to believe that any sort of group affiliation is bad. But, without becoming a part of a larger group, how can one possibly effect change? We are a part of multiple, overlapping groups whether we like it or not: countries, families, workplaces, ethnicities, languages, occupations, etc., all tie us to other people and groups in various ways. It’s impossible for that not to be the case.
Now, I agree that neglecting people’s individualism is a bad thing. And certainly some groups aren’t allowed to speak for you just because they happen to be the same race, gender, and so forth. For example, if some white supremacist group claimed to speak for me because we’re the same ethnic “group” I would raise serious objections. This is not in dispute. People are more alike than they are different, as Peterson points out.
But insisting that there are no classes, and that they are never in conflict, goes a bit too far. The view that there are no classes, I would argue, is as much against the grain of mainstream sociology as insisting that there are no genders.
Do Critics of Capitalism Hate Western Culture?
Peterson seems to imply that any criticism of capitalism is tantamount to Marxism. Again, maybe this isn’t accurate; it’s hard to tell. He also seems to imply that critics of capitalism (such as his alleged postmodernists) have a grudge against Western society and want to undermine it out of some notion of collective guilt.
Is Western society (whatever that is) entirely defined by capitalism? After all, it was around thousands of years before capitalism came along. In fact, many of the core institutions of the West are in opposition to capitalism! The Catholic Church, touted by many alt-Right types as the foundation of Western civilization, has been critical of capitalist materialism, its atomization of people, its lack of values and its callousness towards the poor and downtrodden. Many traditional social arrangements were destroyed, from the aristocracy to craft guilds to land tenure systems, in order to make way for capitalism and liberalized markets. The West existed through the Classical world, the Dark Ages, and medieval feudalism, long before capitalism arrived.
Criticism of capitalism != Communism
Capitalism != Western civilization
There’s plenty to disagree with in Marxism if you’re so inclined, just as there is with any economic philosophy. But Peterson never engages with the actual philosophy itself. This is a good brief summary of what Marxism actually argues:
Marx started with the presumption that all markets operate much in the way the classical political economists then (and neoclassical economists today) presume. He then showed that even when all commodities exchange at their values and workers receive the value of their labor power (that is, no cheating), capitalists are able to appropriate a surplus-value (that is, there is exploitation). No special modifications of the presumption of perfect markets need to be made. As long as capitalists are able, after the exchange of money for the commodity labor power has taken place, to extract labor from labor power during the course of commodity production, there will be an extra value, a surplus-value, that capitalists are able to appropriate for doing nothing.
The point is, the Marxian theory of the distribution of income identifies an unequal distribution of income that is endemic to capitalism—and thus a fundamental violation of the idea of “just deserts”—even if all markets operate according to the unrealistic assumptions of mainstream economists. And that intrinsically unequal distribution of income within capitalism becomes even more unequal once we consider all the ways the mainstream assumptions about markets are violated on a daily basis within the kinds of capitalism we witness today.
Peterson frequently employs “snarl words” when discussing his opponents and critics (“Postmodernist,” “cultural (or Neo-) Marxist,” “feminist,” “social justice warrior”) or broad one-dimensional characterizations: (“PC culture,” the “radical Left,” and so on).
This is not what I would expect of a serious intellectual. His constant use of these phrases and terms should cause him to be a laughing stock, not taken seriously as a public intellectual. If he used such sloppy reasoning in his psychology career, he would not have gotten very far.
For example, Neo-Marxism, to the extent that it exists, is a complex intellectual phenomenon. Connecting it to HR departments and blank slatism is intellectually lazy.
What is Neo-Marxism?
Neo-Marxism is a huge area…both the Frankfurt School and Dependency Theory are important types of Neo-Marxism. Here are some others.
(1) The Hungarian Marxist, Georg Lukacs, and the “Budapest School” that came out of his work.
(2) The Italian Marxist, Antonio Gramsci, and the endless discourse on “hegemony” that has followed in his wake.
(3) Louis Althusser, Nicos Poulantzas, and the other structuralists.
(4) The analytical Marxist (or, as they sometimes call themselves, the “no-bullshit Marxist”) school: Jon Elster, John Roemer, Adam Przeworski, Erik Olin Wright, Robert Brenner, and others.
(5) Marxist feminism: Johanna Brenner, Nancy Hartsock, and others.
(6) Marxist state theory, most notably, of late, the French regulation school (see Bob Jessop’s work for a good summary of this work).
(7) Two schools of thought coming out of the U Mass – Amherst economics department: the social structures of accumulations school (Bowles and Gintis) and the “Rethinking Marxism” crowd (Resnick and Wolff).
(8) Marxist literary criticism – a huge enterprise, of which Terry Eagleton and Fredric Jameson are probably the leading lights.
(9) The “political Marxism” perspective growing out of Robert Brenner’s work, including Ellen Meiksins Wood, Benno Teschke, and others.
(10) Critical geography – the best-known thinker here would be David Harvey.
And on and on… So, you see, Neo-Marxism isn’t just a compact school of thought. It’s an entire range of ways of seeing the humanities and the social sciences. If you really want an introduction to the whole range, I’d suggest that you check out the online version of Erik Olin Wright’s graduate class: Sociology 621: Class, State, and Ideology, found at https://www.ssc.wisc.edu/~wright/sociology621-2011.htm
Similar things could be said about feminist or postmodernist thought.
So which is Peterson talking about? That’s the problem–he never engages with any of these ideas, instead just associating them with everything he doesn’t like (e.g. gender quotas and speech policing).
This article does a good job of explaining what’s wrong with Peterson’s constant invoking of the phrase “cultural (or Neo-) Marxism”:
Scholars…do not…suggest that the Frankfurt School or other “cultural Marxists” ever had a plan to destroy the moral fibre of Western civilization, or to use their critique of culture as a springboard to a totalitarian regime. That would be difficult to argue in all seriousness because Western “cultural Marxists” going back to the 1920s have typically been hostile to state power, social oppression of the individual, and Soviet Marxism itself. Moreover, they have shown considerable variation among themselves in their attitudes to specific social, moral, and cultural issues. There is no cultural Marxist master plan.
More generally, serious intellectual history cannot ignore the complex cross-currents of thought within the Left in Western liberal democracies. The Left has always been riven with factionalism, not least in recent decades, and it now houses diverse attitudes to almost any imaginable aspect of culture (as well as to traditional economic issues). Many components of the Western cultural Left can only be understood when seen as (in part) reactions to other such components, while being deeply influenced by Western Marxism’s widespread criticism and rejection of Soviet communism.
In the upshot, all the talk of cultural Marxism from figures on the (far) Right of politics is of little aid to understanding our current cultural and political situation. At best, this conception of cultural Marxism is too blunt an intellectual instrument to be useful for analysing current trends. At its worst, it mixes wild conspiracy theorizing with self-righteous moralism.
None of this is to deny the moderate thesis that much contemporary cultural criticism has roots that trace back to the 1960s New Left, the Frankfurt and Birmingham Schools, and various Marxist theories of culture. In that sense, contemporary cultural criticism extends a cultural Marxist tradition, but this tradition largely defined itself against Soviet Marxism. Theoretically, at least, it displays an antipathy to authoritarianism, and it aspires to liberate the autonomy of individuals.
Furthermore, contemporary cultural criticism (and much left-wing political thought and activism) has morphed into a form of Western post-Marxism. It has not only turned away from Marxism-Leninism, but evolved to a point where it has lost much contact with Marxism itself.
Current left-wing activism can, indeed, display hyperbolic, philistine, and authoritarian tendencies, but these have little to do with any influence from Marx, Soviet totalitarianism, or the work of the Frankfurt School. They have more, I suspect, to do with tendencies toward moral and political purity in almost any movement that seeks social change…
Neither does Peterson ever seriously engage with the ideas of Postmodernism:
“Postmodernism” …is often used to imply some divorcing of a political debate from objective truth or reality and isn’t actually rooted in an understanding of postmodern philosophy. Instead, it’s used to downplay evidence someone doesn’t like as being subjective while upholding evidence someone does like as objective.
I’ve never seen him engage with any of these specific ideas, just pull them out of context to pillory them. This is not what I expect of someone who is held up as a serious scholar and an important public intellectual whose ideas are worth paying attention to. To claim that mantle, he must take others’ ideas seriously as well.
This video does a very good job of debunking Peterson’s (and the alt-right’s more generally) favorite pet theory:
The last time we mixed together the anger and economic pain of large numbers of white males, radical anticommunism and esoteric mysticism, we didn’t end up with a very good result, especially when the society was full of disillusioned military veterans.
Mishra put his finger on something that bothered me a great deal but that I couldn’t quite articulate.
Now, I happen to know a bit about this stuff. Raiders of the Lost Ark is my favorite film, and I spent quite a long time coming up with my own “Americans versus Nazis and the Occult” idea for a novel (which I’ll keep to myself). But as part of that, I did extensive research into the role that occult ideas played in the rise of the Nazi Party, and the role such secret societies played in the social organization of radical extremist parties in Europe. See, for example, the Thule Society:
The Thule Society was a German occultist and völkisch group founded in Munich right after World War I, named after a mythical northern country in Greek legend. The society is notable chiefly as the organization that sponsored the Deutsche Arbeiterpartei (DAP; German Workers’ Party), which was later reorganized by Adolf Hitler into the National Socialist German Workers’ Party (NSDAP or Nazi Party). According to Hitler biographer Ian Kershaw, the organization’s “membership list … reads like a Who’s Who of early Nazi sympathizers and leading figures in Munich”, including Rudolf Hess, Alfred Rosenberg, Hans Frank, Julius Lehmann, Gottfried Feder, Dietrich Eckart, and Karl Harrer.
[Alfred] Rosenberg was inspired by the theories of Arthur de Gobineau, in his 1853–1855 book An Essay on the Inequality of the Human Races, and by Houston Stewart Chamberlain. Rosenberg’s The Myth of the Twentieth Century was conceived as a sequel to Chamberlain’s 1899 book The Foundations of the Nineteenth Century. Rosenberg believed that God created mankind as separate, differentiated races in a cascading hierarchy of nobility of virtue, not as separate individuals or as entities with “blank slate” natures. Rosenberg harshly rejected the idea of a “globular” mankind of homogeneity of nature as counter-factual, and asserted each biological race possesses a discrete, unique soul, claiming the Caucasoid Aryan race, with Germanic Nordics supposedly composing its vanguard elite, as qualitatively superior, in a vaguely “ontological” way, in comparison to all other ethnic and racial groupings: the Germanic Nordic Aryan as Platonic ideal of humankind. Other influences included the anti-modernist, “revolutionary” ideas of Friedrich Nietzsche, Richard Wagner’s Holy Grail romanticism inspired by the neo-Buddhist thesis of Arthur Schopenhauer, Haeckelian mystical vitalism, the medieval German philosopher Meister Eckhart and the heirs of his mysticism and Nordicist Aryanism in general.
Or a later example from after the War:
It isn’t hard these days to find discussions of Savitri Devi’s books on neo-Nazi web forums, especially The Lightning and the Sun, which expounds the theory that Hitler was an avatar – an incarnation – of the Hindu god Vishnu, and Gold in the Furnace, which urges true believers to trust that National Socialism will rise again. The American extreme-right website Counter-Currents hosts an extensive online archive of her life and work.
Her views are reaching a wider public, too, thanks to American alt-right leaders such as Richard Spencer and Steve Bannon, former Trump chief strategist and chair of Breitbart News, who have taken up the account of history as a cyclical battle between good and evil — a theory she shared with other 20th Century mystical fascists.
This isn’t the place to go into great detail about this. But I do know that similarly fruity and half-baked ideas were very popular with the small cabal of radicals who took over Germany’s government when Weimar fell. Ideas of a “Volkish” spirit outside of the real plane of existence were commonly held by many Nazis. So were “blood and soil” ideas and racist concepts that you see today in many “race realist” and HBD circles.
Nowhere in his published writings does Peterson reckon with the moral fiascos of his gurus and their political ramifications; he seems unbothered by the fact that thinking of human relations in such terms as dominance and hierarchy connects too easily with such nascent viciousness as misogyny, anti-Semitism and Islamophobia. He might argue that his maps of meaning aim at helping lost individuals rather than racists, ultra-nationalists, or imperialists. But he can’t plausibly claim, given his oft-expressed hostility to the “murderous equity doctrine” of feminists, and other progressive ideas, that he is above the fray of our ideological and culture wars.
Indeed, the modern fascination with myth has never been free from an illiberal and anti-democratic agenda. Richard Wagner, along with many German nationalists, became notorious for using myth to regenerate the volk and stoke hatred of the aliens—largely Jews—who he thought polluted the pure community rooted in blood and soil. By the early twentieth century, ethnic-racial chauvinists everywhere—Hindu supremacists in India as well as Catholic ultra-nationalists in France—were offering visions to uprooted peoples of a rooted organic society in which hierarchies and values had been stable. As Karla Poewe points out in New Religions and the Nazis (2005), political cultists would typically mix “pieces of Yogic and Abrahamic traditions” with “popular notions of science—or rather pseudo-science—such as concepts of ‘race,’ ‘eugenics,’ or ‘evolution.’” It was this opportunistic amalgam of ideas that helped nourish “new mythologies of would-be totalitarian regimes.”
Whither Blank Slatism?
Peterson often accuses his opponents of “blank slatism,” that is, of believing that differences in gender and abilities are simply “cultural constructs” and the product of an unjust social order. He has never, as far as I can tell, positively identified this position in the actual writings of his opponents. He is fond of endlessly quoting Orwell’s jibe about socialists “not loving the poor but hating the rich.” But he takes it out of context from a book in which Orwell advocated FOR socialism, as this article points out:
Orwell flat-out says that anybody who evaluates the merits of socialist policies by the personal qualities of socialists themselves is an idiot. Peterson concludes that Orwell thought socialist policies were flawed because socialists themselves were bad people. I don’t think there is a way of reading Peterson other than as extremely stupid or extremely dishonest, but one can be charitable and assume he simply didn’t read the book that supposedly gave him his grand revelation about socialism.
For example, I’ve never heard Peterson utter even one actual quote from Marx! I mean, it’s not like the man never wrote anything. If his ideas inevitably lead to mass murder and the gulag, then why not provide direct quotes which back that up? Even Postmodernists are never cited directly, only books about them, such as Explaining Postmodernism.
Neither Derrida nor Foucault is cited in 12 Rules for Life. Apparently, not only has Peterson never bothered to actually read them, he seems not to have even read their Wikipedia entries. The only relevant citation is of a book called Explaining Postmodernism: Skepticism and Socialism from Rousseau to Foucault, which he customarily recommends at speaking engagements. The author, Stephen Hicks, is Executive Director of the Center for Ethics and Entrepreneurship at Rockford University, and an acolyte of Ayn Rand. Armed with this dubious secondary source, Peterson is left making statements that are not only mired in factual error, but espouse a comically reductive conception of how social life and history work. He takes a common misunderstanding at face value, proceeding to build a whole outlook on it.
Thus he can continue to misrepresent some shadowy “other” without naming names. This means no one individual can stand up and say, as Peterson so often does, that he’s “misrepresenting my ideas.” Instead, Peterson claims to be in opposition to a broad, undifferentiated “radical Left”–a shadowy group with no real face and ideas defined primarily by him. He can then beat the stuffing out of this straw man endlessly.
Is his characterization of his enemies’ ideas accurate? Well, to take just one example, I decided to listen to the BBC’s In Our Time episode on Feminism. I liked their episode on the Frankfurt School, and I thought it might give me some historical perspective on feminism. Instead, it was more of a dialogue/conversation between the host and two scholars of feminism and authors of several books. Here is how the host begins the program:
Melvyn Bragg (host): Helena Cronin, you have written that men are by nature more ambitious, status-conscious, dedicated, single-minded, and persevering than women. You say that this is a two-million-year-old fact, and we should accept it. Can you develop that, please?
Helena Cronin: Yes, of course they are. There’s quite a large psychological difference between men and women. Natural selection didn’t just shape our bodies differently but it shaped our minds differently as well. Think of it this way: give a man 50 wives and he can have children galore. Give a woman 50 husbands, no use whatsoever. Over evolutionary time, natural selection has favored those men who have competed like mad to get mates. Over evolutionary time, natural selection has favored the women who have been judicious about which men they’ve taken. We are all descendants of the competitive men and of the judicious women.
MB: If you take those adjectives one by one, though, you could say that…take competitive. Well, very few men have been as competitive as Margaret Thatcher; single-minded, hundreds of women I could think of, tens of women I could think of even personally are very single-minded; persevering, think of doctors and teachers and so on. Do these things apply now in the way that you think they have applied for two million years?
HC: They certainly apply now in exactly the way they did, in that genes are still building our minds and bodies, and the psychological difference between men and women, just as they have for two million years. What’s changed now, of course, is that women have fought and struggled for more opportunities. And those women who, on average, would have performed more like men are now able to. But that’s a statistical difference. One can say statistically that men are taller than women. And it’s certainly true that there are some tall women around, but all the tallest people are men.
Similarly, although women are now being given opportunities, and we can find the Margaret Thatchers and so on that couldn’t have existed years ago, statistically, nevertheless, women are on average far less competitive than men.
The other guest, feminist author Germaine Greer, responds:
I actually think I probably agree that masculinity is very different from femininity. I certainly believe that. But I also believe that men work very hard at creating masculinism and they put themselves through extraordinary disciplines. There’s a lot of aspects to the way they behave which are highly cultural and extremely protean: could change pretty quickly…the point is, culture does different things with biology…
Feminists and the “radical Left” refuse to acknowledge gender differences? Really? That’s not what it sounds like to me. They don’t really disagree on the basics, just on the emphasis. If you go on listening, you find that they do have their disagreements, but it’s much more complex than Peterson’s cardboard caricatures of feminists. Later on, there is this exchange:
Helena Cronin: “Men and women are different. You’re assuming that this is in some way inimical to feminism.”
Melvyn Bragg: “To a certain extent it is.”
Helena Cronin: “No, that’s where I strongly disagree with you…”
Of course, I can always find some fringe scholar who believes anything if I look hard enough. Ironically, Peterson’s own Reddit site recommends “steel manning,” or arguing against the strongest version of your opponent’s case. But I think the above proves that Peterson’s popularity (and Patreon donations) is predicated on him doing the exact opposite.
Alternatives to the 12 Rules
One of Peterson’s basic points I read as this: Any political system which goes against basic human nature is doomed to fail. On this point, we agree. We just have different views on what is compatible with human nature.
He also argues that when we see differential outcomes, such as more men graduating with engineering degrees, or more male CEOs, we shouldn’t automatically assume some sort of bias or discrimination is present. This is an important point, and I agree with it. There are other factors we should consider.
He also argues that we shouldn’t subsume our individuality in the service of a group identity, and opposes notions of “collective guilt.” These are also well-founded. However, his dismissal of any and all forms of oppression throughout history strikes me as an extreme position.
If you like Peterson’s political philosophies, then you may be less an anti-Marxist than a Burkean Conservative. This column from John Michael Greer is still the best articulation of Edmund Burke’s philosophy that I’ve read anywhere:
The foundation of Burkean conservatism is the recognition that human beings aren’t half as smart as they like to think they are. One implication of this recognition is that when human beings insist that the tangled realities of politics and history can be reduced to some set of abstract principles simple enough for the human mind to understand, they’re wrong. Another is that when human beings try to set up a system of government based on abstract principles, rather than allowing it to take shape organically out of historical experience, the results will pretty reliably be disastrous.
What these imply, in turn, is that social change is not necessarily a good thing. It’s always possible that a given change, however well-intentioned, will result in consequences that are worse than the problems that the change is supposed to fix. In fact, if social change is pursued in a sufficiently clueless fashion, the consequences can cascade out of control, plunging a nation into failed-state conditions, handing it over to a tyrant, or having some other equally unwanted result. What’s more, the more firmly the eyes of would-be reformers are fixed on appealing abstractions, and the less attention they pay to the lessons of history, the more catastrophic the outcome will generally be.
That, in Burke’s view, was what went wrong in the French Revolution. His thinking differed sharply from continental European conservatives, in that he saw no reason to object to the right of the French people to change a system of government that was as incompetent as it was despotic. It was the way they went about it — tearing down the existing system of government root and branch, and replacing it with a shiny new system based on fashionable abstractions — that was problematic. What made that problematic, in turn, was that it simply didn’t work. Instead of establishing an ideal republic of liberty, equality, and fraternity, the wholesale reforms pushed through by the National Assembly plunged France into chaos, handed the nation over to a pack of homicidal fanatics, and then dropped it into the waiting hands of an egomaniacal warlord named Napoleon Bonaparte.
Two specific bad ideas founded in abstractions helped feed the collapse of revolutionary France into chaos, massacre, tyranny, and pan-European war. The first was the conviction, all but universal among the philosophes whose ideas guided the revolution, that human nature is entirely a product of the social order. According to this belief, the only reason people don’t act like angels is that they live in an unjust society, and once that is replaced by a just society, why, everybody would behave the way the moral notions of the philosophes insisted they should. Because they held this belief, in turn, the National Assembly did nothing to protect their shiny up-to-date system against such old-fashioned vices as lust for power and partisan hatred, with results that made the streets of Paris run with blood.
The second bad idea had the same effect as the first. This was the conviction, also all but universal among the philosophes, that history moved inevitably in the direction they wanted: from superstition to reason, from tyranny to liberty, from privilege to equality, and so on. According to this belief, all the revolution had to do to bring liberty, equality, and fraternity was to get rid of the old order, and voila — liberty, equality, and fraternity would pop up on cue. Once again, things didn’t work that way. Where the philosophes insisted that history moves ever upward toward a golden age in the future, and the European conservatives who opposed them argued that history slides ever downward from a golden age in the past, Burke’s thesis — and the evidence of history — implies that history has no direction at all.
The existing laws and institutions of a society, Burke proposed, grow organically out of that society’s history and experience, and embody a great deal of practical wisdom. They also have one feature that the abstraction-laden fantasies of world-reformers don’t have, which is that they have been proven to work. Any proposed change in laws and institutions thus needs to start by showing, first, that there’s a need for change; second, that the proposed change will solve the problem it claims to solve; and third, that the benefits of the change will outweigh its costs. Far more often than not, when these questions are asked, the best way to redress any problem with the existing order of things turns out to be the option that causes as little disruption as possible, so that what works can keep on working.
That is to say, Burkean conservatism can be summed up simply as the application of the precautionary principle to the political sphere.
I would assume that Peterson would agree with the obvious falseness of this sentiment: “…the only reason people don’t act like angels is that they live in an unjust society, and once that is replaced by a just society, why, everybody would behave the way the moral notions of the philosophes insisted they should.” This is what he claims “social justice warriors” believe. And if that’s true, then I agree with Peterson. It’s true that certain Utopian factions of the Left have made this mistake and gone too far down this road. To that extent, those ideas deserve criticism.
For what it’s worth, Peterson doesn’t see himself as a conservative, so much as a “terrified traditionalist” who generally believes in exercising caution over endorsing sweeping or radical cultural changes.
But it’s worth noting that Burke wasn’t criticizing Marxism; he was criticizing the French Revolution, a revolution which took place before Marx was even born! One wonders how exactly Marxism could have been responsible for that spasm of bloodshed over extreme inequality. Or perhaps it’s just that revolutions are inherently bloody business, regardless of what philosophy the revolutionaries ostensibly use to justify them. It just so happens that most of the big ones in the twentieth century claimed to be channeling the spirit of Marx. In fact, Marx specifically warned against the tendency toward authoritarianism:
Neither of us cares a straw for popularity. A proof of this is for example, that, because of aversion to any personality cult, I have never permitted the numerous expressions of appreciation from various countries with which I was pestered during the existence of the International to reach the realm of publicity, and have never answered them, except occasionally by a rebuke. When Engels and I first joined the secret Communist Society we made it a condition that everything tending to encourage superstitious belief in authority was to be removed from the statutes.
Indeed, the idea that an unjust social order is responsible for society’s ills is an Enlightenment one, and not one specific to Marxism per se, as Peterson insists. For example, it was the “classical liberal” Marquis de Condorcet, not Karl Marx, who penned the following:
The real advantages that should result from this progress, of which we can entertain a hope that is almost a certainty, can have no other term than that of the absolute perfection of the human race; since, as the various kinds of equality come to work in its favor by producing ampler sources of supply, more extensive education, more complete liberty, so equality will be more real and will embrace everything which is really of importance for the happiness of human beings …
John Gray has also pointed this out:
The repression of liberty that took place in the countries in which Communist regimes were established cannot be adequately explained as a product of backwardness, or of errors in the application of Marxian theory. It was the result of a resolute attempt to realize an Enlightenment utopia – a condition of society in which no serious form of conflict any longer exists.
The idea of evil as it appears in modern secular thought is an inheritance from Christianity. To be sure, rationalists have repudiated the idea; but it is not long before they find they cannot do without it. What has been understood as evil in the past, they insist, is error – a product of ignorance that human beings can overcome. Here they are repeating a Zoroastrian theme, which was absorbed into later versions of monotheism: the belief that ‘as the “lord of creation” man is at the forefront of the contest between the powers of Truth and Untruth.’ But how to account for the fact that humankind is deaf to the voice of reason? At this point rationalists invoke sinister interests – wicked priests, profiteers from superstition, malignant enemies of enlightenment, secular incarnations of the forces of evil. As so often is the case, secular thinking follows a pattern dictated by religion while suppressing religion’s most valuable insights. Modern rationalists reject the idea of evil while being obsessed by it. Seeing themselves as embattled warriors in a struggle against darkness, it has not occurred to them to ask why humankind is so fond of the dark. They are left with the same problem of evil that faces religion. The difference is that religious believers know they face an insoluble difficulty, while secular believers do not. Aware of the evil in themselves, traditional believers know it cannot be expelled from the world by human action. Lacking this saving insight, secular believers dream of creating a higher species. They have not noticed the fatal flaw in their schemes: any such species will be created by actually existing human beings.
Peterson’s views, by contrast, are more in line with Herbert Spencer’s Social Darwinism, as explained by Marx’s son-in-law Paul Lafargue:
“No political alchemy will get golden conduct out of leaden instincts; … no well-working institution will be framed by an ill-working humanity — hence mankind must abandon all hope of bettering our present system of society and of doing away with the wrongs and miseries of it.”
Another strain of thought similar to Peterson’s is Stoicism.
Like Peterson, the Stoics were interested in suffering and how to overcome it. They did not deny the harsh nature of existence. Like Peterson, Stoicism differentiates between the things that are under our control and the things that aren’t, and it advocates mastering those aspects of your life you can control while accepting those you cannot. Indeed, the very word stoic in English has come to mean “accepting one’s burdens without complaint.”
Stoicism has undergone something of a revival in these tumultuous times. There are many resources out there. I would recommend reading them.
As for the rest of Peterson’s rhetoric, you can get it from other wisdom sources who wrote long before Peterson without all the political baggage. For example, I ran across these quotes from the French writer Antoine de Saint Exupéry:
Each man must look to himself to teach him the meaning of life. It is not something discovered: it is something molded. These prison walls that this age of trade has built up round us, we can break down. We can still run free, call to our comrades, and marvel to hear once more, in response to our call, the impassioned chant of the human voice.
To be a man is, precisely, to be responsible. It is to feel shame at the sight of what seems to be unmerited misery. It is to take pride in a victory won by one’s comrades. It is to feel, when setting one’s stone, that one is contributing to the building of the world.
If it is true that wars are won by believers, it is also true that peace treaties are sometimes signed by businessmen.
This passage from Joseph Campbell’s The Hero with a Thousand Faces strikes a similar note:
If we could dredge up something forgotten not only by ourselves but by our whole generation or our entire civilization, we should become indeed the boonbringer, the culture hero of the day—a personage of not only local but world historical moment. In a word: the first work of the hero is to retreat from the world scene of secondary effects to those causal zones of the psyche where the difficulties really reside, and there to clarify the difficulties, eradicate them in his own case (i.e., give battle to the nursery demons of his local culture) and break through to the undistorted, direct experience and assimilation of what C. G. Jung called “the archetypal images.” This is the process known to Hindu and Buddhist philosophy as viveka, “discrimination.”
And these two maxims come from the Stoic emperor Marcus Aurelius’s Meditations:
Never esteem anything as of advantage to you that will make you break your word or lose your self-respect.
Remember this— that there is a proper dignity and proportion to be observed in the performance of every act of life.
Finally, I would agree with the sentiment expressed in one of the articles cited above: “This much should be obvious from even a cursory reading of him: If Jordan Peterson is the most influential intellectual in the Western world, the Western world has lost its damn mind.”
While doing research for my last post, I ran across an interesting juxtaposition. I was looking at postmodern philosophers, and according to Wikipedia, one of the most prominent American postmodernists was a guy called Richard Rorty.
So I thought that I should take a look at this Rorty guy if he’s emblematic of American postmodernism, the same philosophy that Peterson claims is simply Marxism in disguise and has a “death grip” on North American universities.
Richard Rorty (1931–2007) developed a distinctive and controversial brand of pragmatism that expressed itself along two main axes. One is negative—a critical diagnosis of what Rorty takes to be defining projects of modern philosophy. The other is positive—an attempt to show what intellectual culture might look like, once we free ourselves from the governing metaphors of mind and knowledge in which the traditional problems of epistemology and metaphysics (and indeed, in Rorty’s view, the self-conception of modern philosophy) are rooted.
The centerpiece of Rorty’s critique is the provocative account offered in Philosophy and the Mirror of Nature. In this book, and in the closely related essays collected in Consequences of Pragmatism, Rorty’s principal target is the philosophical idea of knowledge as representation, as a mental mirroring of a mind-external world.
Providing a contrasting image of philosophy, Rorty has sought to integrate and apply the milestone achievements of Dewey, Hegel and Darwin in a pragmatist synthesis of historicism and naturalism. Characterizations and illustrations of a post-epistemological intellectual culture, present in both PMN and CP, are more richly developed in later works, … In these writings, ranging over an unusually wide intellectual territory, Rorty offers a highly integrated, multifaceted view of thought, culture, and politics, a view that has made him one of the most widely discussed philosophers in our time.
Okay, well that’s pretty complicated, and I’m not sure what to make of it. Is this the stuff that’s turning college students into Maoist Red Guards?
But the interesting thing is that I found that some of Rorty’s writings went viral in the aftermath of Trump’s election victory in 2016, particularly this passage:
Members of labor unions, and unorganized unskilled workers, will sooner or later realize that their government is not even trying to prevent wages from sinking or to prevent jobs from being exported. Around the same time, they will realize that suburban white-collar workers — themselves desperately afraid of being downsized — are not going to let themselves be taxed to provide social benefits for anyone else.
At that point, something will crack. The nonsuburban electorate will decide that the system has failed and start looking for a strongman to vote for — someone willing to assure them that, once he is elected, the smug bureaucrats, tricky lawyers, overpaid bond salesmen, and postmodernist professors will no longer be calling the shots.
Hmmm. Sounds pretty damn accurate, doesn’t it? It’s even more impressive that it was written back in 1998, during the Clinton administration, before even George W. Bush, much less Donald Trump.
But the most salient part of the article is Rorty’s discussion of identity politics and the change in emphasis within the Leftist tradition in America. Far from being a proponent of identity politics, this philosopher–who is considered to be one of the exemplars of postmodernist thought in America–issues a stark warning to the American Left about focusing on identity politics to the exclusion of all else. He also eerily predicts the politics of today, including the rise of Dr. Jordan Peterson and the alt-right more generally.
He begins by reviewing how the focus of the left in America changed due to the Vietnam War:
The focus of leftist politics changed in the 1960s. For Rorty, the left ceased to be political and instead became a cultural movement…The Vietnam War, more than anything else, set the left on its new trajectory. The war was seen as an indictment of the whole system, of America as such. Thus the broader anti-communist Cold War became a central fault line for left-wing activists. Led largely by students, the new left regarded anyone opposed to communism — including Democrats, union workers, and technocrats — as hostile…
From [Rorty’s] perspective, the problem was the total rejection of pragmatic reform. The belief that there was nothing in America that could be salvaged, no institutions that could be corrected, no laws worth passing, led to the complete abandonment of conventional politics. Persuasion was replaced by self-expression; policy reform by recrimination.
There was a shift away from economics towards a “politics of difference” or “identity” or “recognition.” If the intellectual locus of pre-’60s leftism was social science departments, it was now literature and philosophy departments. And the focus was no longer on advancing alternatives to a market economy or on the proper balance between political freedom and economic liberalism. Now the focus was on the cultural status of traditionally marginalized groups…
And it did this by “teaching Americans to recognize otherness,” as Rorty put it. Multiculturalism, as it’s now called, was about preserving otherness, preserving our differences; it doesn’t oblige us to cease to notice those differences. There’s nothing morally objectionable about that. As a political strategy, however, [multiculturalism is] problematic. It reinforces sectarian impulses and detracts from coalition-building.
The pivot away from politics toward culture spawned academic fields like women and gender studies, African-American studies, Hispanic-American studies, LGBTQ studies, and so on. These disciplines do serious academic work, but they don’t minister to concrete political ends. Their goal has been to make people aware of the humiliation and hate endured by these groups, and to alienate anyone invested in that hate.
Wow, that sounds pretty dead-on. Indeed, even Wikipedia notes of “Western Marxism”:
The phrase “Western Marxism” wasn’t coined until 1953, by Maurice Merleau-Ponty. While often contrasted with the Marxism of the Soviet Union, Western Marxists were often divided in their opinion of it and other Marxist-Leninist states…Since the 1960s, the concept has been closely associated with the New Left and the focus on identity politics and the cultural domain, rather than economics and class struggle (this became especially prominent in the United States and the Western world).
Rorty explains that this focus on marginalized groups will enable a populist right to emerge in response to Americans (especially white Americans) believing their culture is under attack. This will distract them from economic issues such as the consequences of globalism and financialization. The left’s focus on cultural issues thus created an opening for the populist right, for people like Pat Buchanan, and later Donald Trump, who galvanize support among the white working class by exploiting racial grievance, cultural differences and economic anxiety. As Rorty explains:
While the Left’s back was turned, the bourgeoisification of the white proletariat which began in WWII and continued up through the Vietnam War has been halted, and the process has gone into reverse. America is now proletarianizing its bourgeoisie, and this process is likely to culminate in bottom-up revolt, of the sort [Pat] Buchanan hopes to foment.
Buchanan, you might recall, was touting the “cultural Marxism” meme back in the Nineties, long before anyone had heard of an obscure Canadian psychology professor named Jordan Peterson. This article from a right-wing news site (back in 2010!) gives an overview of Mr. Buchanan’s worldview:
“The United States has undergone a cultural, moral and religious revolution. A militant secularism has arisen in this country. It has always had a hold on the intellectual and academic elites, but in the 1960s it captured the young in the universities and the colleges. “This is the basis of the great cultural war we’re undergoing….We are two countries now. We are two countries morally, culturally, socially, and theologically. Cultural wars do not lend themselves to peaceful co-existence. One side prevails, or the other prevails.
“The truth is that while conservatives won the Cold War with political and economic Communism, we’ve lost the cultural war with cultural Marxism, which I think has prevailed pretty much in the United States. It is now the dominant culture. Whereas those of us who are traditionalists, we are, if you will, the counterculture.”
So states Patrick J. Buchanan in the opening scenes of James Jaeger’s new film, Cultural Marxism: The Corruption of America. As always, Buchanan is outspoken and splendidly patriotic in his testimony on the present degeneration of our country. Many of us born before the 1960s and its shocking nihilism agree vehemently with him. We were raised in a land far removed philosophically from the America we are cursed with today, and this disturbing fact weighs heavily upon our hearts and minds.
I suggest reading the article in its entirety. These paragraphs, especially, sound eerily similar to the rhetoric of Dr. Peterson:
“Critical Theory,” the brain-child of Max Horkheimer, was the first and most important of these strategies. Under its auspices, every tradition of Western life was to be redefined as “prejudice” and “perversion.” And these redefinitions were to be instilled into the social stream via devastating, scholarly criticisms of all values such as the family, marriage, property, individualism, faith in God, etc. These criticisms proved to be quite successful in the aftermath of the world’s collapse into the Great Depression, which brought about widespread disillusionment with the traditional capitalist society that had evolved in the West since the Renaissance and discovery of the New World.
The strategic criticisms were soon expanded by demarcating society’s members as either “victims” or “oppressors.” All who were economically successful were defined as oppressors, and all who were not successful were termed victims. Religious authorities became “witch-doctors.” Advocates of different social roles for men and women became “fascists.” Corporate heads became “exploiters.” Fathers became “patriarchal tyrants.” Families became “primitive clans.” The stream of criticism was relentless and extremely sophisticated in an intellectual sense. Thus it mesmerized the pundit class who then disseminated the criticisms’ fundamental content to the populace at large.
Compare to Peterson’s rhetoric cited in my previous post:
The postmodernists built on the Marxist ideology, Peterson said. “They started to play a sleight of hand, and instead of pitting the proletariat, the working class, against the bourgeois, they started to pit the oppressed against the oppressor. That opened up the avenue to identifying any number of groups as oppressed and oppressor and to continue the same narrative under a different name.”…“And so since the 1970s, under the guise of postmodernism, we’ve seen the rapid expansion of identity politics throughout the universities,” he said. “It’s come to dominate all of the humanities—which are dead as far as I can tell—and a huge proportion of the social sciences.”…“We’ve been publicly funding extremely radical, postmodern leftist thinkers who are hellbent on demolishing the fundamental substructure of Western civilization. And that’s no paranoid delusion. That’s their self-admitted goal,” …
All Peterson does is transfer the culpability for undermining Western civilization from the 1930s Frankfurt School to the 1960s French Postmodernists. Note that the idea that multiculturalism is an attack on “Western values” and that all of our major institutions have been taken over by socialist-minded elites imposing their views from above is a staple of alt-right thinking. It was an intrinsic part of Anders Breivik’s manifesto, published right before his killing spree.
And Peterson wonders why they’re protesting.
Rorty’s prescient warning was that elites would emphasize identity politics on purpose in order to divide the working classes and keep them from coalescing around an economic agenda that would endanger elite power (unions, higher minimum wages, universal healthcare, higher taxes on unearned wealth, financial regulations, job creation, etc.):
By divorcing itself from class and labor issues, the left lost sight of its economic agenda and waged a culture war that empowers the right and has done little to improve the lives of the very people it seeks to defend. Rorty’s advice to the left was to pay attention to who benefits from such a strategy:
The super-rich will have to keep up the pretense that national politics might someday make a difference. Since economic decisions are their prerogative, they will encourage politicians of both the Left and the Right, to specialize in cultural issues. The aim will be to keep the minds of the proles elsewhere – to keep the bottom 75 percent of Americans and the bottom 95 percent of the world’s population busy with ethnic and religious hostilities, and with debates about sexual mores. If the proles can be distracted from their own despair by media-created pseudo-events…the super-rich will have little to fear.
Big business benefits most from the culture wars. If the left and the right are quarreling over religion or race or same-sex marriage, nothing much changes, or nothing that impacts wealth concentration changes. Rorty is particularly hard on Presidents Jimmy Carter and Bill Clinton, both of whom he accuses of retreating “from any mention of redistribution” and of “moving into a sterile vacuum called the center.” The Democratic Party, under this model, has grown terrified of redistributionist economics, believing such talk would drive away the suburbanite vote. The result, he concludes, is that “the choice between the major parties has come down to a choice between cynical lies and terrified silence.”
Rorty’s concern was not that the left cared too much about race relations or discrimination (it should care about these things); rather, he warned that it stopped doing the hard work of liberal democratic politics. He worried that its retreat into academia, into theory and away from the concrete, would prove politically disastrous.
Immediately after the now-famous passage about a future “strongman,” Rorty offered yet another disturbing prophecy:
One thing that is very likely to happen is that the gains made in the past forty years by black and brown Americans, and by homosexuals, will be wiped out. Jocular contempt for women will come back into fashion. The words ‘nigger’ and ‘kike’ will once again be heard in the workplace. All the sadism which the academic Left has tried to make unacceptable to its students will come flooding back. All the resentment which badly educated Americans feel about having their manners dictated to them by college graduates will find an outlet.
If this were to happen, Rorty added, it would be a calamity for the country and the world. People would wonder how it happened, and why the left was unable to stop it. They wouldn’t understand why the left couldn’t “channel the mounting rage of the newly dispossessed” and speak more directly to the “consequences of globalization.” They would conclude that the left had died, or that it existed but was “no longer able to engage in national politics.”
“Jocular contempt for women will come back into fashion…All the resentment which badly educated Americans feel about having their manners dictated to them by college graduates will find an outlet…” Er, holy shit, this is exactly what has happened! I mean, does this not explain the rise of the alt-right movement in a nutshell? And he wrote this back in 1998, before anyone had heard of 4chan, Reddit, Facebook or YouTube!!!
Who benefits from such a strategy? Maybe the same people promoting Dr. Peterson as “the world’s most important public intellectual.”
So, not only does this prominent postmodern philosopher NOT endorse identity politics, but he explicitly warns against it! Of course, this is just one individual. But it certainly argues against the idea that some shadowy, united cabal of radical leftist postmodernists is enthusiastically pushing identity politics and multiculturalism to undermine the West and turn us all into communists. Or that this strategy is succeeding.
Instead of identity politics and media shaming, what would be successful? Rorty suggests:
…Rorty’s vision of an “inspirational liberalism” is worth revisiting…The first of his three lectures is devoted to John Dewey and Walt Whitman, both of whom, on his view, personified American liberalism at its best. These were pragmatists who understood the role of national pride in motivating political change. They understood that politics is a game of competing stories “about a nation’s self-identity, and between differing symbols of its greatness.”
The strength of Dewey and Whitman was that they could look at America’s past with clear eyes…and go beyond the disgust it invoked, beyond the cultural pessimism. They articulated a civic religion that challenged the country to do better, to forge a future that lived up to the promise of America. In Rorty’s words, they recognized that “stories about what a nation has been and should try to be are not attempts at accurate representation, but rather attempts to forge a moral identity.”
Both the Right and the Left have a story to tell, and the difference is enormous:
For the Right never thinks that anything much needs to be changed: it thinks the country is basically in good shape, and may well have been in better shape in the past. It sees the Left’s struggle for social justice as mere troublemaking, as utopian foolishness. The Left, by definition, is the party of hope. It insists that our nation remains unachieved.
“[The Right] sees the Left’s struggle for social justice as mere troublemaking, as utopian foolishness.” Well now, that’s a pretty accurate description of the heart of Jordan Peterson’s worldview as far as I can tell. To reinforce this point, Peterson deploys ideas from Darwinism, such as his now infamous discussion of lobster battles for hierarchical supremacy.
The Perplexing Mr. Nietzsche
Speaking of philosophers, is anyone more confused and misunderstood than Mr. Nietzsche?
In the right-wing article on multiculturalism cited above, Nietzsche is cited as an inspiration for the evil cultural Marxist conspiracy:
The cultural Marxists adopted Nietzsche’s “transvaluation of all values,” in which the Mad Hatter’s world is instituted. Everything that previously was an evil now becomes a virtue while all the old virtues become evils. Individualism, self-reliance, property, profit, family, traditional marriage, fidelity to spouse, strength of will, personal honor, rising through merit — all these integral pillars of our civilization become distinctive evils that oppress us as humans. They must be rooted out of our existence.
Yet, at the same time, Nietzsche is also a favorite philosopher of the alt-right:
In her recent book about the rise of the alt-right, Irish academic Angela Nagle discusses their obsession with civilizational decay. “They’re disgusted by what they consider a degenerate culture,” she told me in a recent interview.
Nietzsche made these same arguments more than 100 years ago. The story he tells in The Genealogy of Morality is that Christianity overturned classical Roman values like strength, will, and nobility of spirit. These were replaced with egalitarianism, community, humility, charity, and pity. Nietzsche saw this shift as the beginning of a grand democratic movement in Western civilization, one that championed the weak over the strong, the mass over the individual.
The alt-right — or at least parts of the alt-right — are enamored of this strain of Nietzsche’s thought. The influential alt-right blog Alternative Right refers to Nietzsche as a great “visionary” and published an essay affirming his warnings about cultural decay.
“Future historians will likely look back on the contemporary West as a madhouse,” the essay’s author writes, “where the classic virtues of heroism, high culture, nobility, self-respect, and reason had almost completely disappeared, along with the characteristics of adulthood generally.”
Nietzsche is also frequently cited by many white nationalists:
“You could say I was red-pilled by Nietzsche.”
That’s how white nationalist leader Richard Spencer described his intellectual awakening to the Atlantic’s Graeme Wood last June. “Red-pilled” is a common alt-right term for that “eureka moment” one experiences upon confrontation with some dark and previously buried truth.
For Spencer and other alt-right enthusiasts of the 19th-century German philosopher Friedrich Nietzsche, that dark truth goes something like this: All the modern pieties about race, peace, equality, justice, civility, universal suffrage — that’s all bullshit. These are constructs cooked up by human beings and later enshrined as eternal truths.
Nietzsche says the world is in constant flux, that there is no capital-T truth. He hated moral and social conventions because he thought they stifled the individual. In one of his most famous essays, The Genealogy of Morality, which Spencer credits with inspiring his awakening, Nietzsche tears down the intellectual justifications for Christian morality. He calls it a “slave morality” developed by peasants to subdue the strong. The experience of reading this was “shattering,” Spencer told Wood. It upended his “moral universe.”
There is no capital-T truth? All modern pieties are bullshit? Stifling the individual? This seems like exactly the sort of stuff Peterson regularly rails against in his attacks on postmodernism.
Peterson’s embrace of Nietzsche is also troubling. Nietzsche was, of course, associated with the Nazis, mainly through his sister, who was a fan of the movement and intentionally distorted his posthumous writings to fit its ideology. But pinning Nazism on Nietzsche would be as disingenuous as pinning the crimes of Communism on Marx. Still, Peterson’s framing of order as a “masculine” phenomenon (Logos) and chaos as a “feminine” one strikes me as vaguely authoritarian. Peterson claims he is actually anti-authoritarian, an avowed enemy of “extremism” of both the Left AND the Right, but it’s hard to get that from his metaphysics. An obsession with “order” and “masculine virtues” is a staple of right-wing thought. So is an obsession with “civilizational decline.” According to the Right, civilizational decline comes about when feminine “chaos” triumphs over masculine “order”: the same affliction the alt-right claims is weakening society.
Much of Peterson’s philosophy is responding to Nietzsche, and it does so in two ways: He agrees with Nietzsche that life is hard and will inevitably involve enduring misery. To survive, one must be prepared for this. But for Peterson, preparation does not involve defining one’s own truth and reality, as Nietzsche said. Instead of assuming the world will conform to one’s own will, Peterson advocates the importance of taking responsibility for oneself and living in accordance with the objective reality of the world around us.
For Peterson, there is objective truth and reality, and we cannot simply transcend all moral frameworks and create truth for ourselves…To deny these constraints leads to chaos—internally, interpersonally, societally. This is the main point of Peterson’s recently released Twelve Rules for Life: An Antidote to Chaos, wherein he lays out a moral framework that he believes will help people live life to the fullest—however unavoidably tragic life may be. Rule Eight: “Tell the Truth—or, at least, don’t lie,” addresses the Nietzschean, post-modern axiom of the subjectivity of truth head on. Peterson contends that we intuitively know what truth is, and that “lies make you weak and you can feel it . . . you cannot get away with warping the structure of being.” …Similarly, Rule Seven — Pursue what is meaningful, not what is expedient — also defies Nietzschean nihilism and corresponds with Peterson’s understanding of an objective reality. “Meaning is what we do to buttress our self against the tragedy of life … our pursuit of meaning is an instinct. Perhaps our deepest instinct… meaning is the antidote to the malevolence of life.” To deny meaning exists, to pursue happiness instead of meaning, or to seek meaning in the wrong things will lead to chaos.
But Peterson borrows from, in addition to criticizing, Nietzsche. Both men rail against the “last man,” the human type that seeks to shirk risk and responsibility in favor of comfort and safety. Like Nietzsche, Peterson’s view offers an “ideal human type” that lives by a superior code. For Nietzsche it was Übermensch that lived by a code of his own creation— a “master morality” of “might makes right,” also popularized by Thrasymachus in Book I of Plato’s Republic. For Peterson, the ideal is a mode of existence wherein one lives within the preordained structure of the universe and nobly grits the challenges that life throws their way.
Is the “radical Left” really the biggest problem in the world today? If Postmodernism is a philosophy that rejects all truth and universal values and defines reality as whatever one chooses it to be, isn’t that more compatible with right-wing politics in America today? Consider the quote of a Bush administration official:
The phrase [Reality-based community] was attributed by journalist Ron Suskind to an unnamed official in the George W. Bush Administration who used it to denigrate a critic of the administration’s policies as someone who based their judgements on facts. In a 2004 article appearing in the New York Times Magazine, Suskind wrote:
The aide said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality.’ […] ‘That’s not the way the world really works anymore,’ he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do’.
The source of the quotation was later identified as Bush’s senior advisor Karl Rove, although Rove has denied saying it.
“Create your own reality?” Sounds pretty postmodern to me. And from the very next Republican administration:
“Alternative facts” is a phrase used by U.S. Counselor to the President Kellyanne Conway during a Meet the Press interview on January 22, 2017, in which she defended White House Press Secretary Sean Spicer’s false statement about the attendance numbers of Donald Trump’s inauguration as President of the United States. When pressed during the interview with Chuck Todd to explain why Spicer “utter[ed] a provable falsehood”, Conway stated that Spicer was giving “alternative facts”. Todd responded, “Look, alternative facts are not facts. They’re falsehoods.”
Conway’s use of the phrase “alternative facts” to describe demonstrable falsehoods was widely mocked on social media and sharply criticized by journalists and media organizations…The phrase was extensively described as Orwellian. Within four days of the interview, sales of the book 1984 had increased by 9,500%…
It doesn’t get more postmodern than that, does it? Create your own reality? Alternative facts? The world has no objective order or reality. It is up to us to define our own truth, purpose and reality for ourselves. Consider this quote from Peterson:
18:06: Among these post-modernist types, man, they don’t give a damn for facts. In fact, facts for them are currently whatever the current power hierarchy uses to justify their acquisition of power.
Sounds like the Trump administration to me. And is it really the Left that is anti-science?
The Washington Post recently reported that officials at the Center for Disease Control were ordered not to use words like “science-based,” apparently now regarded as disablingly left-leaning. But further reporting in the New York Times appears to show that the order came not from White House flunkies but from officials worried that Congress would reject funding proposals marred by the offensive terms. One of our two national political parties — and its supporters — now regards “science” as a fighting word. Where is our Robert Musil, our pitiless satirist and moralist, when we need him (or her)?
In fact, this article makes the case that Trump is our first postmodern president:
[Postmodern] writers describe a world where the visual has triumphed over the literary, where fragmented sound bites have replaced linear thinking, where nostalgia (“Make America Great Again”) has replaced historical consciousness or felt experiences of the past, where simulacra is indistinguishable from reality, where an aesthetic of pastiche and kitsch (Trump Tower) replaces modernism’s striving for purity and elitism, and where a shared plebeian culture of vulgarity papers over intensifying class disparities. In virtually every detail, Trump seems like the perfect manifestation of postmodernism.
For Baudrillard, “the perfect crime” was the murder of reality, which has been covered up with decoys (“virtual reality” and “reality shows”) that are mistaken for what has been destroyed. “Our culture of meaning is collapsing beneath our excess of meaning, the culture of reality collapsing beneath the excess of reality, the information culture collapsing beneath the excess of information—the sign and reality sharing a single shroud,” Baudrillard wrote in The Perfect Crime (1995). The Trump era is rich in such unreality. The president is not only a former reality-show star, but one whose fame is based more on performance than reality—on the idea that he’s a successful businessman. Although his real estate and gambling empire suffered massive losses in the early 1990s, and Trump’s “finances went into a tailspin,” he survived thanks to the superficial value of his brand, which he propped up though media manipulation.
In Baudrillard’s terms, Trump is a simulacra businessman, a copy of a reality that has no real existence. All sorts of simulacrum and decoy realities now flourish. Consider the popularity of conspiracy theories, evidence of a culture where it’s easy for fictional and semi-fictional narratives to spread like wildfire through social media. Trump loves spreading conspiracy theories about his enemies, and his enemies love spreading conspiracy theories about him.
To me, the most tragic thing about Jordan Peterson is that not only does he recite right-wing talking points to his audience of impressionable and hurting young men, he also advises them to get with the program and grin and bear it. Do not challenge or question a social order that is crushing you, just master it. And that narrative certainly benefits a certain group of people.
And we’re living in a time eerily similar to the 1930s, which saw the rise of right-wing regimes around the world. Once again, illiberal regimes are rising due to economic circumstances, and extremist parties are gaining ground because the mainstream parties have lost their ability to effect change.
Peterson never tires of telling us about the millions of people who died under Communist repression. His house is apparently decorated wall-to-wall with Soviet propaganda art. He even named his daughter after Mikhail Gorbachev. But consider what is happening in Russia right now:
Now a museum, Perm-36 is the only part of Joseph Stalin’s Gulag that still survives. The network of brutal labour camps was where Soviet Russia sent its political opponents, as well as many criminals and kulaks – wealthier peasants. During Stalin’s Great Terror in the 1930s, millions passed through the system. Hard physical work on meagre rations in extreme weather killed vast numbers…The museum at this site was founded by historian Viktor Shmyrov in the 1990s as post-Soviet Russia opened up to the world.
“The Gulag was a huge phenomenon but there are practically no traces of it left,” he says. “That’s why Perm-36 needed preserving.” The country opened many archives then too, revealing the scale and details of decades of political repression. But the desire to dig deep into that past has been fading.
In 2014, Perm-36 was taken over by the local authorities and the museum’s founder was removed. The new administration then tried to soften the museum’s focus, says Shmyrov. “The dominant idea now is that the Gulag was necessary, both economically and to bring discipline and order.” One member of the new team admits there were changes. “There was a lean towards justifying the repressions, maybe three years when the museum wavered,” historian Sergei Sheverin says, standing by rows of barbed wire. At one point, the Gulag museum’s own website defended Stalin’s imprisonment of scientists – to force them to work for the state.
Sheverin suggests the museum was a stain on the “Great Power” narrative of Russia that’s now led by Putin. That approach has seen Stalin rehabilitated because of his role in the Soviet defeat of Nazi Germany. “The policy from above is that we shouldn’t remember the bad things, only the good,” says Sheverin.
The museum’s founder Viktor Shmyrov suspects there was an additional reason for his removal. Perm-36 used to host an annual forum and music festival that attracted thousands. In a place where free-thinkers were once incarcerated, Shmyrov says the festival had developed into a “freedom space”. “Not one person there could say a good thing about Vladimir Putin of course,” he says. “We used to have a powerful civil society. Now they’re bringing order and control.” The attempts to dilute the historical message at Perm-36 sparked opposition from human rights activists and the independent press…
But not, apparently, from Jordan Peterson, who was busy fighting the real enemies of freedom: Canadian politicians attempting to protect transgender people and the Ontario Education Association.
Meanwhile, in China, Xi Jinping has abolished presidential term limits, clearing the way for him to rule for life:
Last week China stepped from autocracy into dictatorship. That was when Xi Jinping … let it be known that he will change China’s constitution so that he can rule as president for as long as he chooses …. This is not just a big change for China but also strong evidence that the West’s 25 year long bet on China has failed.
After the collapse of the Soviet Union, the West welcomed [China] into the global economic order. Western leaders believed that giving China a stake in institutions such as the World Trade Organization would bind it into the rules-based system … They hoped that economic integration would encourage China to evolve into a market economy and that, as its people grew wealthier, they would come to yearn for democratic reforms ….
As Rorty predicted, the elites are using cultural issues to keep us divided against one another as they consolidate power and engage in a new enclosure movement. Peterson is just the latest arrow in their quiver.
Without prompting, he raged, with operatic scattergun anger against postmodernism, Marxists and—his favourite bogeymen—“social justice warriors.” It was the day after the U.S. presidential election, and I was still reeling from Trump’s victory. Peterson was unperturbed. He said Trump was no worse than Reagan and that the Democrats got what they deserved for abandoning the working class and playing identity politics. I was initially surprised—someone who spent a lifetime studying tyranny wasn’t maybe a tad worried about a president with such undisguised autocratic ambitions? But then I remembered that Trump, too, has long blamed political correctness for America’s ills, and reflexively used the phrase to dismiss any criticism he faced—everything from his treatment of women to his proposed immigration ban on Muslims. And, among many Trump supporters, “social justice warrior” is a favourite epithet used to disqualify his critics.
I’ve spent a good deal of time–way too much, actually–trying to get a handle on the Jordan Peterson phenomenon. And it is best to distinguish JP the phenomenon from JP the person, because from what I can tell, they are indeed quite different and distinct.
I’m going to state at the outset what I had originally put in my conclusion:
1.) The Jordan Peterson phenomenon is mainly caused by our failure to take the pain of men, especially young men, seriously.
Men, especially white men, today are dealing with an impossible series of challenges. There are few satisfying roles for them in society anymore. They are ridiculed. They feel persecuted. They feel unloved. The rise of the Sheconomy has made the only jobs on offer for men ones that they don’t particularly enjoy doing or are not particularly suited for. Even in the few fields that are still (temporarily) male-dominated, such as computer programming, we are told that this means we have a “diversity problem” that needs to be corrected, while no one frets about the paltry number of male home health care aides or registered nurses. Men are blamed for creating and sustaining a system that is shortening their own lifespans, and one that men feel is increasingly stacked against them (for example, child support and visitation rights).
Men quickly find that their natural interests do not overlap with what society wants or needs anymore, and their inclinations are seen as inherently boorish and cruel. They find that the traits that make them desirable as workers make them undesirable as romantic partners. They find video games and pot more satisfying than working in a dead-end job where you are treated like a virtual serf.
Peterson understands this phenomenon. He understands that men, in general, are less agreeable than women, and that they have different cognitive styles. He knows this from his psychological studies. He also knows that men, especially young men, have been abandoned by society that has no use for many of them and are feeling hopeless and adrift. This quote from James Howard Kunstler describes the situation pretty well:
“The general run of humanity really does need some sort of a coherent armature for daily life. And that includes role models who offer examples of behavior that will allow them to thrive rather than to be defeated by life. They need a certain amount of discipline in order to fulfill the behavior that those role models show them, and they need some aspiration, some ability to aspire to the products or the results of leading what we might call a good life. And a lot of those things are missing, especially in these unfortunately sort-of disenfranchised, throw-away, forgotten, lower middle classes that we have in America.”
“You can see it very clearly in my region, which was, as I said, a former thriving region of small manufacturing, small factories…around the confluence of the Hudson river and the Battenkill River where I am. Granted, a lot of these companies were paternalistic, but as part of that paternalism they sponsored a lot of institutional activities for people. You know, they had baseball teams, they had outings, [and] they paid these people enough to live decently, and these people produced children who aspired to do better. And they were able to do better. They got a better education by eighth grade in the 1920’s than people are getting now in grad schools. And all of this stuff has dissolved.”
“You actually need quite a bit of built-in structure in everyday life for a society to thrive and individuals to thrive within it. And that’s not there, and we don’t care about it. We just don’t care. We have eliminated most of the public gathering places in small town America. I live in a town that doesn’t have a coffee shop [or] a bar, anyplace that somebody might go outside their home. And there’s the expectation that all of the ‘community’ that you’re going to be a part of is found on your TV set. Well that’s just a lie. It’s based on a very basic and almost universal misunderstanding in America that the virtual is an adequate substitute for the authentic. That having relationships with made-up people on TV is the same as having relationships with people who are really in your life.”
“And so that structure for leading a good life is absent. We’re seeing the results of it in this ‘anything goes and nothing matters’ society that we’ve created for ourselves.”
Into this vacuum steps Jordan Peterson with his theories about how “anything goes and nothing matters” is the postmodernist creed, with its ultimate roots in Marxism, and that the universities are spreading this pessimistic message of “cultural Marxism.” To counteract this, he turns to philosophers like Nietzsche and looks to archetypes and mythology to restore a lost order (logos) to life.
“I think at a deep level the West has lost faith in masculinity. That’s no different than the death of God. It’s the same thing. And Nietzsche knew what the consequences of that would be, that’s most of what he wrote about. So you’d say the divine symbol of masculinity has been obliterated. Well, so then what do you expect? What’s going to happen? That means masculinity is going to become weak. Especially if the symbol is also denigrated, which it definitely is.”
“So what that means is that the ideal that men could aspire to is denigrated? Well, then with your ideal in tatters, you’re weak. That’s definitional. So I think the reason that men have been responding positively to my thinking is that I don’t buy any of that. I like the masculine spirit. It’s necessary. And it’s not fundamentally carnage and pillaging. It’s not fundamentally rape culture. It’s not fundamentally world-destroying. And all of those aspersions have been cast upon it. That’s partly the guilt of Western society for technological progress…”
3.) To me, the most tragic thing about the JP phenomenon is the fact that, in my not-so-humble opinion, the destruction of white males is caused primarily by our economic system of globalized financial casino capitalism, which seeks no other goal than to maximize profit for a small international investor class, consequences to the health of society be damned. It leads to a “devil take the hindmost” attitude, where society is a zero-sum game divided into winners and losers.
But instead of taking a critical look at that system, Peterson places the blame, and the responsibility for solving it, squarely on the shoulders of the individual. I think this is not only self-defeating, but it is actually harmful. Numerous studies have shown that in countries where individuals blame wider economic forces for their unemployment, rather than their own personal fortitude, there is less self-hatred and self-harm.
Peterson not only does not wish to look at these forces, but is a staunch defender of libertarian market values. Not only is there no class war, declares Peterson, but even thinking in class terms makes you a Marxist!
3.) One could hardly think of a better way to kneecap a genuine Leftist movement than unleashing the divisive identity politics seen on college campuses. But where are these ideas really coming from? Are they truly ‘Marxist’ as Peterson asserts?
We know that, by definition, the men suffering the most in America today are those without college degrees. This was the conclusion of the Case/Deaton study. Life spans are actually declining for men and women without degrees. This means that, by definition, the people suffering the most in our society have no idea what is really going on on college campuses! Yet they are continually warned of a “Red Peril” emanating from college campuses by the alt-right and vote accordingly. It’s the Red scare updated for the twenty-first century.
In my opinion, this is entirely a media-manufactured phenomenon. Why? As Adam Curtis opined, ‘Angry people click more.’ Keeping people angry and outraged seems to be the main purpose of media these days because it is profitable. Keeping people informed is less important than profits.
Are the semi-mythical “Social Justice Warriors” actually closet Maoists dedicated to spreading communism beyond the campus? Consider that it is at the core of the Marxist project for workers to set aside superficial differences such as race, gender and nationality, and recognize their class role as the main reason they are exploited. The social justice warriors clearly do not want that.
Liberals would be satisfied with a world in which exploitation and wealth were evenly distributed across demographic groups. The left doesn’t want that. We want no exploitation of anyone. That necessarily means that white men shouldn’t be exploited either…So, lonely and/or broke white men sometimes feel the left offers them no explanation for their suffering. You know who does? Jordan Peterson. He says to them, I know you feel bad, and let me tell you why. And then he feeds them a bunch of hateful bullshit. More and more people are going for it. He has the number one bestselling book on Amazon…
Slavoj Zizek makes this point as well:
If I were to engage in paranoiac speculations, I would be much more inclined to say that the Politically Correct obsessive regulations (like the obligatory naming of different sexual identities, with legal measures taken if one violates them) are rather a Left-liberal plot to destroy any actual radical Left movement. Suffice it to recall the animosity against Bernie Sanders among some LGBT+ and feminist circles, whose members have no problems with big corporate bosses supporting them. The “cultural” focus of PC and #MeToo is, to put it in a simplified way, a desperate attempt to avoid the confrontation with actual economic and political problems, i.e., to locate women’s oppression and racism in their socio-economic context…Liberals will have to take note that there is a growing radical Left critique of PC, identity politics and #MeToo…
This surprisingly intelligent YouTube comment makes a similar point:
For a long time it has been a tactic of US intelligence to support a moderate group, be it progressive or reactionary, as a way of blocking a more extremist group from gaining support. This happened domestically in the 60’s with progressive movements as well. Most famously Gloria Steinem was covertly supported by the CIA as a way of keeping attention away from more dangerous radicals. Culturally, things like universities in effect reproduce this dynamic. By having an Overton window big enough to include a lot of progressive politics, they can exclude actually dangerous stuff. This is the [role] political correctness basically plays. By maintaining vigorous debate within a specific window, and outrage for anything outside of that, it vanguards against real leftist politics of the sort actual Marxists argue for.
As this comment from an article in the Guardian about Peterson’s book states: “I thought Marxism was about ‘workers of the world unite,’ not ‘let’s fragment into a million separate identities and fight each other.’”
So, who the hell is Jordan Peterson, anyway?
Jordan Peterson is a formerly obscure Canadian psychology professor who became an overnight sensation by posting a series of YouTube videos describing his opposition to Canadian Bill C-16. Opposition to this bill has become something of a cause célèbre among a certain group of self-described anti-Leftist activists who like to militate against “identity politics.” Peterson argued that the bill forced him to call people by their “preferred pronoun,” or else face sanction. This, he claimed, amounted to a form of “compelled speech,” and language was a battleground that he would not cede to the “radical Left.”
In other words, if I were a transgender person and demanded Peterson call me, I don’t know, ‘apple,’ he would have to do so.
Now, I think we can all agree this is a little silly. But to Peterson, this was nothing less than a threat to freedom and the very foundations of Western civilization.
On September 27, University of Toronto psychology professor Jordan Peterson posted a video titled Professor Against Political Correctness on his YouTube channel. The lecture, the first in a three-part series recorded in Peterson’s home office, was inspired by two recent events that he said made him nervous.
The first was the introduction of Bill C-16, a federal amendment to the Canadian Human Rights Act and Criminal Code that would add gender identity and gender expression to the list of prohibited grounds for discrimination. Peterson’s second concern was that U of T’s human resources department would soon make anti-bias and anti-discrimination training mandatory for its staff—training he believed to be ineffective, coercive and politically motivated. “I know something about the way that totalitarian, authoritarian political states develop,” Peterson said in the first video, “and I can’t help but think I’m seeing a fair bit of that right now.”
Other profs in his position might have written op-eds, circulated petitions or negotiated with university officials. But Peterson is a big believer in the power of YouTube—“a Gutenberg revolution for speech,” he calls it—and, as it turns out, he had a lot to get off his chest. He carpet-bombed Marxists (“no better than Nazis”), the Ontario Human Rights Commission (“perhaps the biggest enemy of freedom currently extant in Canada”), the Black Liberation Collective (“they have no legitimacy among the people they purport to represent”) and HR departments in general (“the most pathological elements in large organizations”).
Peterson also said he would absolutely not comply with the implied diktat of Bill C-16, which could make the refusal to refer to people by the pronouns of their choice an actionable form of harassment. He believes the idea of a non-binary gender spectrum is specious and he dismisses as nonsensical the raft of gender-neutral pronouns that transgender people have adopted—ze, vis, hir, and the singular use of they, them and their. “I don’t recognize another person’s right to determine what pronouns I use to address them,” he said grimly. “I think they’re connected to an underground apparatus of radical left political motivations. I think uttering those words makes me a tool of those motivations. And I’m going to try and be a tool of my own motivations as clearly as I can articulate them and not the mouthpiece of some murderous ideology.”...In his fervent opinion, the issue wasn’t pronouns, per se. It was much bigger than that. It was truth itself. Being told what to say—and by the government no less—was just one more step along the slippery slope to tyranny. The way Peterson tells it, the only thing standing between us and a full-blown fascist insurrection was him.
Underground apparatus? Murderous Ideology? What the f*ck is he talking about???
According to Peterson, the mandated use of such pronouns is a “slippery slope” down the road to totalitarianism, re-education camps and gulags, and identity politics is the “camel’s nose” for FULL COMMUNISM.
Peterson contends that “political correctness” is actually a mutated form of Communist ideology, the same ideology, he claims, that directly led to the murder of millions of innocent individuals in the twentieth century. Furthermore, he claims that entire fields of academia have been corrupted by “radical postmodernism” including nearly all the humanities such as anthropology and literature. He further alleges that these “Neo-Marxists” have seized control of universities, government departments and corporate HR departments.
Despite his fear of leftist goon squads patrolling college campuses, no one, not one single person, has been arrested, jailed, or even fined under this law. It is a totally artificial crisis, manufactured to smear the radical left on college campuses and foment outrage. It’s pure grandstanding. Here is what legal scholars think, from a letter by the Canadian Bar Association:
For human rights legislation, the CHRA prohibits denying or differentiating adversely in the provision of goods, services, facilities or accommodation customarily available to the general public, commercial or residential accommodation, or, employment on the basis of a prohibited ground of discrimination. The Act applies to federal and federally regulated entities.
The amendment to the CHRA will not compel the speech of private citizens. Nor will it hamper the evolution of academic debates about sex and gender, race and ethnicity, nature and culture, and other genuine and continuing inquiries that mark our common quest for understanding of the human condition.
However, millions of people watched the videos and tens of thousands contributed to Peterson’s Patreon account, to the tune of over $50,000 a month. Being a martyr has its advantages. Chapo Trap House described him as “the Rosa Parks of Pronouns.”
If Peterson were really so concerned about threats to free speech coming from employers such as his university, then why isn’t he arguing for more union representation, which has the added benefit of reducing inequality (which he claims to want to do):
I’m seeing a lot of comments from the political right and centre-right worrying about the possibility that workers may be fired for expressing conservative views…It strikes me that this would be a really good time for people…to campaign for an end to employment at will, and the introduction of the kind of unfair dismissal laws that protect workers in most democratic countries, but not, for the most part, in the US. Among other things, these laws prohibit firing employees on the basis of their political opinions. Better still, though, would be a resurgence of unionism. Union contracts generally require dismissal for cause, and unionised workers have some actual backup when it comes to a dispute with employers.
In an emailed rebuttal to a journalist who termed him a figure of the “far right”, he described his own politics as those of a “classic British liberal … temperamentally I am high on openness which tilts me to the left, although I am also conscientious which tilts me to the right. Philosophically I am an individualist, not a collectivist of the right or the left. Metaphysically I am an American pragmatist who has been strongly influenced by the psychoanalytic and clinical thinking of Freud and Jung.”
There seem to be three, mutually interlocking Jordan Petersons:
A. The tenured psychology professor, who has written books and papers, and whose lectures have been described as ‘life changing’ by students who took his courses.
B. The self-help guru, who talks about things like metaphysical truth, Jungian archetypes and the search for meaning, and whose ideas resemble Joseph Campbell’s work in many ways.
C. The rabid anti-Communist crusader who engages in conspiracy theories and red-baiting, who sees secret Communism behind every campus action he doesn’t like.
Peterson’s fans commonly depict him as “misunderstood.” This is because, for almost everything he has said, he has said the opposite at some point, or used weasel words to soften his stance. He has also been accused of doing a Gish gallop through the topics he covers, which makes pinning down what he really believes like nailing Jell-O to a tree.
Why, then, is he considered to be far right?
Well, one major reason is that Peterson’s primary fan base is the alt-right, whether he likes it or not. It was not Peterson in his A or B incarnations that made him famous and put money in his coffers; it was version C. And he knows it.
A large part of this is because Peterson’s preferred enemies list is exactly the same as that of the alt-right: Social Justice Warriors, feminists, political correctness, activists (such as Black Lives Matter and LGBTQ groups), the undifferentiated “radical left,” HR departments, entire academic disciplines (anything with ‘studies’ in the title), postmodernism, but above all, Marxists and Neo-Marxists.
Peterson throws around the terms “Marxism” and “Neo-Marxism” sloppily and interchangeably, without precise definitions. For a man whose cardinal rules include “Be precise in your speech,” he is remarkably imprecise with these terms, making it difficult to know exactly what he is talking about. This video from the Epoch Times is the most comprehensive statement of Peterson’s ideology:
The accompanying article in the Epoch Times, an anti-communist newspaper founded by practitioners of the Falun Gong movement, transcribes the main points of the interview:
Peterson said it’s not possible to understand our current society without considering the role postmodernism plays within it, “because postmodernism, in many ways—especially as it’s played out politically—is the new skin that the old Marxism now inhabits.”
By the end of the 1960s, he said, even French intellectuals like Jean-Paul Sartre had to admit that the communist experiment—whether under Marxism, Stalinism, Maoism, or any other variant—was “an absolute, catastrophic failure.”
Rather than do away with the ideology, however, they merely gave it a new face and a new name. “They were all Marxists. But they couldn’t be Marxists anymore, because you couldn’t be a Marxist and claim you were a human being by the end of the 1960s,” said Peterson.
The postmodernists built on the Marxist ideology, Peterson said. “They started to play a sleight of hand, and instead of pitting the proletariat, the working class, against the bourgeois, they started to pit the oppressed against the oppressor. That opened up the avenue to identifying any number of groups as oppressed and oppressor and to continue the same narrative under a different name. And so since the 1970s, under the guise of postmodernism, we’ve seen the rapid expansion of identity politics throughout the universities,” he said. “It’s come to dominate all of the humanities—which are dead as far as I can tell—and a huge proportion of the social sciences.”
“We’ve been publicly funding extremely radical, postmodern leftist thinkers who are hellbent on demolishing the fundamental substructure of Western civilization. And that’s no paranoid delusion. That’s their self-admitted goal,” he said, noting that their philosophy is heavily based in the ideas of French philosopher Jacques Derrida, “who, I think, most trenchantly formulated the anti-Western philosophy that is being pursued so assiduously by the radical left.”
“The people who hold this doctrine—this radical, postmodern, communitarian doctrine that makes racial identity or sexual identity or gender identity or some kind of group identity paramount—they’ve got control over most low-to-mid level bureaucratic structures, and many governments as well,” he said. “But even in the United States, where you know a lot of the governmental institutions have swung back to the Republican side, the postmodernist types have infiltrated bureaucratic organizations at the mid-to-upper level.”
“I don’t think its dangers can be overstated,” Peterson said. “And I also don’t think the degree to which it’s already infiltrated our culture can be overstated.”
Now, technically, Peterson doesn’t use the term “Cultural Marxism” directly in the video, preferring to use the term “Neo-Marxism.” As far as I can tell, however, the terms are interchangeable; I could not find any information distinguishing between the two, so I will consider them the same unless I find out some new information. He certainly describes them in the same terms.
Given that he took grave exception to the use of the term “far right” in reference to him, to the point of demanding a retraction, one can only assume he is okay with the phrase “cultural Marxism” in reference to this video; otherwise he would have demanded that the term be removed and replaced with a more accurate one.
That Peterson is also vehemently anti-Marxist would be relatively unremarkable were it not for the fact that, in many of his online disquisitions about what he sees as a left-wing takeover of campus culture, he uses the terms “Marxism” and “postmodernism” almost interchangeably. Not only are these two schools of thought very different from one another, they are also in certain respects mutually antagonistic. You don’t need an MA in critical theory to figure it out: the travails of the Democratic Party during the primaries for 2016’s presidential election highlighted, in a very public and destructive way, the ideological fault lines in US progressive politics. The bitter schism between the Hillary Clinton camp — which mobilized aggressively around identity politics — and the old-school leftists who rallied around Bernie Sanders ultimately helped clear Donald Trump’s path to the presidency. (Historically, the burgeoning of identity politics in US campus culture in the 1980s and ’90s went hand in hand with the ascendancy of postmodernist ideas that explicitly repudiated Marxism.) It’s not just that this sloppy use of language exposes Peterson as an intellectual lightweight; the tendency to causally conflate various disparate phenomena that one happens not to like — in this instance, postmodernism, Marxism, and political correctness — is the calling card of the paranoiac.
Cultural Marxism is a ‘snarl word’ and dog-whistle phrase that refers to the Frankfurt School, a loosely organized group of academics and writers based in Germany during the Weimar Republic who were influenced by Marx. They were part of what we would today call a “private think tank,” based in Frankfurt. For a good overview, I suggest this slightly less biased treatment from BBC Radio 4’s excellent In Our Time show: BBC Radio 4 – In Our Time, The Frankfurt School
Weimar Germany was a time much like our own: economic dislocation, rampant unemployment, declining faith in liberal democracy; communists, anti-communists, fascists and anti-fascists battling it out in the streets, marches and protests, etc. Despite all the chaos, there was a feeling of ‘hope and change;’ one scholar on the show compares it to an ‘Obama moment.’
Yet, instead of revolution, the nation turned to the right-wing Nazi Party.
Marx himself believed that successful revolution could only take place where the forces of capitalist production were sufficiently advanced. In such a scenario, the inherent contradictions of capitalism would cause it to falter, leading to socialist structures taking over in a more-or-less organic manner.
Instead, all the major communist revolutions were agrarian revolts by peasants against the aristocracy, rather than the proletariat rising up and seizing the means of production from capitalists in industrialized countries. Because capitalist mass production was not yet fully developed in these countries, Marx himself could have predicted their failure, and he would not have been surprised at the chaos of their implementation. Many Marxists, in fact, regard the Soviet Union as a form of state capitalism.
The Frankfurt School think tank pondered this question: why didn’t the revolution occur in Germany after the war, where it “should” have occurred? Why didn’t the proletariat rise up and overthrow the capitalist class in the advanced capitalist countries of Western Europe, as many thought was inevitable? To answer this question, the Frankfurt School looked beyond the economic structure to the culture itself. Capitalism, they argued, wasn’t just an economic system. It colonized the minds of the individuals living under it, such that they could see no alternatives; it was embedded in the very DNA of society. To this end, they developed a “critical theory,” which was, as you can imagine, critical of capitalist society, but which addressed itself mainly to sociocultural issues rather than to the economic workings of society, as Marx had done.
They never called themselves “cultural Marxists,” however. The label’s ancestor came from the National Socialist (Nazi) Party, which didn’t use the phrase “cultural Marxism” but preferred the closely related term “cultural Bolshevism.”
A History of Nazi Germany describes how the Weimar Republic brought about increased freedom of expression (modernism), then described by critics as decadent and irrational. Traditionalist Germans thought that this was causing German culture to decay and that society was heading towards a moral collapse.
The Nazis labelled this modernism as “Cultural Bolshevism” and, through “Jewish Bolshevism”, claimed that Jews were primarily behind Communism. In particular, they argued that Jews had orchestrated the Russian Revolution and were the main power behind Bolshevists.
This Jewish-led Bolshevist assault was described by Adolf Hitler as a disease that would weaken the Germans and leave them prey to the Jews, with Marxism being perceived as just another part of an “international Jewish conspiracy”. An ideological objective was thus the “purification” to eliminate alien influences and protect Germany’s culture.
This concept of Marxists undermining Western civilization, and this equation of being “critical” and “pessimistic” with an attempt to subvert Western values, is a staple of the far right that began in Nazi Germany as a reaction to dislocation and rapid change. It’s a thread that runs through the alt-right today.
As this article points out:
[Peterson’s] obsessive anti-communism sits uncomfortably with [his] supposed anti-fascism. The main opposition to Adolf Hitler’s rise, after all, came, not from high-minded conservatives like Peterson, but from German socialist and communist worker’s parties. And Hitler secured support domestically and internationally in part by promising to crush that leftist opposition.
In fact, a lot of “high minded conservatives” and prominent intellectuals threw their support behind Adolf Hitler and the Nazi Party. Many wealthy, conservative Americans did too, especially those strenuously opposed to the “socialist” policies of Franklin D. Roosevelt, policies that are quite similar to those advocated by, for example, Bernie Sanders today.
The “cultural Marxist” conspiracy theory didn’t die with the end of the Third Reich, however. It was revived and greatly expanded by the rising conservative movement of the 1990s, as the Republican Party merged with movement conservatism and the John Birch Society. Everything they claimed was destroying American society got blamed on the Marxists supposedly behind “politically correct” speech and quotas.
‘Cultural Marxism’ becomes a rallying cry for the modern-day alt-right
The conflation of Marxism with political correctness and activism began long before anyone had ever heard of the good professor. It actually started in the Nineties, with roots going back to the Seventies.
This conspiracy theory hinges on the idea that the Frankfurt School wasn’t just an arcane strain of academic criticism. Instead, the Frankfurt School was behind an ongoing Marxist plot to destroy the capitalist West from within, spreading its tentacles throughout academia and indoctrinating students to hate patriotism & freedom. Thus, rock’n’roll, Sixties counterculture, the civil rights movement, the anti-war movement, homosexuality, modern feminism, and in general all the “decay” in the West since the 1950s are allegedly products of the Frankfurt school…[rationalWiki]
Its origins were surprisingly deliberate, emerging from a paleoconservative Washington think tank called the Free Congress Foundation. The FCF was founded by Paul Weyrich, a co-founder of the Heritage Foundation and the man who named the so-called Moral Majority movement. Weyrich also created a TV network called National Empowerment Television, a short-lived predecessor to Fox News, which aired a documentary in 1999 called “Political Correctness: The Frankfurt School.” Hosted by…William Lind, it presents an account of the origin of what we now call “identity politics.”
Weyrich first presented his notion of Cultural Marxism in a 1998 speech to the Civitas Institute’s Conservative Leadership Conference, later repeating this usage in his widely syndicated “culture war letter”. At Weyrich’s request, William S. Lind wrote a short history of his conception of Cultural Marxism for the Free Congress Foundation; in it Lind identifies the presence of homosexuals on television as proof of Cultural Marxist control over the mass media and claims that Herbert Marcuse considered a coalition of “blacks, students, feminist women, and homosexuals” as a vanguard of cultural revolution…[wikipedia]
These came, Lind tells us, from the Institute for Social Research, or the Frankfurt School. There, Theodor Adorno, Herbert Marcuse, and their cronies created a school of thought called “critical theory,” which the FCF gave the name “cultural Marxism.” This frightening idea fused the impertinence of Marx with the indecency of Freud, producing a new threat to Western values far beyond those posed by Copernicus or Darwin… [https://www.viewpointmag.com/2018/01/23/postmodernism-not-take-place-jordan-petersons-12-rules-life/]
Sounds an awful lot like Peterson’s rhetoric, doesn’t it? In his essay, Lind declared, in rhetoric virtually identical to that of the stump speeches of Jordan Peterson:
“Political Correctness is cultural Marxism. It is Marxism translated from economic into cultural terms. It is an effort that goes back not to the 1960s and the hippies and the peace movement, but back to World War I. If we compare the basic tenets of Political Correctness with classical Marxism the parallels are very obvious.”
Lind wasn’t satisfied with just an online essay. He also produced a series of videos which can easily be accessed on YouTube, whose ideas are virtually identical to the political views of Dr. Peterson:
In 1999, Lind led the creation of an hour-long program entitled “Political Correctness: The Frankfurt School”. Some of Lind’s content went on to be reproduced by James Jaeger in his YouTube film “CULTURAL MARXISM: The Corruption of America.” The historian Martin Jay commented on this phenomenon saying that Lind’s original documentary:
‘… spawned a number of condensed textual versions, which were reproduced on a number of radical right-wing sites. These in turn led to a welter of new videos now available on YouTube, which feature an odd cast of pseudo-experts regurgitating exactly the same line. The message is numbingly simplistic: all the ills of modern American culture, from feminism, affirmative action, sexual liberation and gay rights to the decay of traditional education and even environmentalism are ultimately attributable to the insidious influence of the members of the Institute for Social Research who came to America in the 1930’s.‘
Heidi Beirich likewise holds that the conspiracy theory is used to demonize various conservative “bêtes noires” including “feminists, homosexuals, secular humanists, multiculturalists, sex educators, environmentalists, immigrants, and black nationalists”.
Wait a minute, that’s the exact same enemies list as Jordan Peterson!
Indeed, I’ve spent some time watching these documentaries. Now, when I say the rhetoric is the same, you don’t have to take my word for it. Watch the Jordan Peterson video above. Watch the William Lind documentaries. Make up your own mind.
Although the theory became more widespread in the late 1990s and through the 2000s, the modern iteration of the theory originated in Michael Minnicino’s 1992 essay “New Dark Age: Frankfurt School and ‘Political Correctness'”, published in Fidelio Magazine by the Schiller Institute. The Schiller Institute, a branch of the LaRouche movement, further promoted the idea in 1994. The Minnicino article charges that the Frankfurt School promoted Modernism in the arts as a form of cultural pessimism and shaped the counterculture of the 1960s (such as the British pop band The Beatles) after the Wandervogel of the Ascona commune.
The idea that the counterculture was a fifth column for communism is an old chestnut going back to the 1960’s, as is the idea that colleges were radicalizing middle American children. The Powell memorandum back in the 1970’s sounded a paranoid alarm about how students on college campuses were being indoctrinated by insidious left-wing professors to hate the “free enterprise” system.
According to Chip Berlet, who specializes in the study of far-right movements, the Cultural Marxism conspiracy theory found fertile ground within the Tea Party movement of 2009, with contributions published in the American Thinker and WorldNetDaily highlighted by some Tea Party websites.
More recently, the Norwegian terrorist Anders Behring Breivik included the term in his document “2083: A European Declaration of Independence”, which—along with The Free Congress Foundation’s Political Correctness: A Short History of an Ideology—was e-mailed to 1,003 addresses approximately 90 minutes before the 2011 bomb blast in Oslo for which Breivik was responsible. Segments of William S. Lind’s writings on Cultural Marxism have been found within Breivik’s manifesto.
Right-wing agitprop outlets such as Breitbart, whose head Steve Bannon served in the Trump administration, also commonly use cultural Marxism as a snarl word and all-purpose bogeyman for everything they believe is destroying America from within, in terms alarmingly similar to those of the Nazis:
Breitbart views so-called “Cultural Marxism” as the root of all evil: Cultural Marxism destroys the language; Cultural Marxists want equality between the sexes; they threaten Western civilization; they hate God; and they love Muslims and homosexuals, too.
Yes, they claim Cultural Marxists are behind Muslim immigration too. It all started with talk about the rights of minorities in the ’60s, as they write:
Under this “cultural Marxism,” progressives believed they would aid racial and sexual minorities — and now Islamic minorities — by transferring cultural power and status from ordinary Americans, especially from white working-class Americans and away from individualistic-minded Christian churches…
The present-day cultural Marxists, including former President Obama,
are also encouraging the migration of devout Muslims and their theocratic political leaders into the United States.
And this idea has even infiltrated the highest levels of the U.S. military:
In July 2017, Rich Higgins was removed by US National Security Advisor H. R. McMaster from the United States National Security Council following the discovery of a seven-page memorandum he had authored, describing a conspiracy theory concerning a plot to destroy the presidency of Donald Trump by Cultural Marxists, as well as Islamists, globalists, bankers, the media, and members of the Republican and Democratic parties.
As RationalWiki states, “Nobody denies that the Frankfurt School existed (and championed its fair share of nutty ideas). Critics of the pseudohistorical ‘Cultural Marxism’ conspiracy theory merely argue that the school was tediously unsuccessful (and, as such, somewhat unimportant) in the broad scheme of Western progressivism — and, more obvious still, that all liberals aren’t commies as well.”
Now, it’s obviously clear that Peterson’s understanding of “Cultural Marxism” is very different from that of Anders Breivik, the Norwegian mass murderer. But Peterson’s constant use of the term is worrying. After all, this is what our young men are listening to! Peterson claims that things like Bill C-16 lead to the gulag and reeducation camps. Yet ideas virtually identical to the ones he is peddling have already directly led to the deaths of 77 people in Norway. The theory has even gained cachet among people with their fingers on the nuclear button. What’s the real threat here???
According to the Southern Poverty Law Center, the right-wing theory of cultural Marxism holds that the Jewish, Marxist philosophers of the 1930s Frankfurt School hatched a conspiracy to corrupt American values by promoting sexual liberation and anti-racism…Peterson has tweaked this argument a bit. In his lectures, he mostly traces cultural rot to postmodernists like Derrida (whose work Peterson comically garbles) rather than to the Frankfurt School.
In Peterson’s new book, though, he does explicitly link postmodernism to the Frankfurt school, and in other venues he regularly uses and approves the term “cultural Marxism.” One of his videos is titled “Postmodernism and Cultural Marxism.” On Facebook, he shared a Daily Caller article titled “Cultural Marxism Is Destroying America” that begins, with outright racism, “Yet again an American city is being torn apart by black rioters.” The article goes on to blame racial tension in the U.S. on … you guessed it: the Frankfurt School.
Of course, it is possible to criticize the left without falling into fascism. Joseph Stalin was a murderous monster; Communist regimes have done horrible things that led to the deaths of millions of people. But the left in the U.S. and Canada is not promoting armed revolution or mass murder. In his cultural Marxism video, Peterson argues that, whether you’re talking about Leninist insurrection or folks criticizing sexism or racism in cultural products, “the end result is much the same.” That’s dangerous nonsense, which can easily be used to justify any extreme of violence. If your gender studies professor is the equivalent of Lenin … well, we’d better destroy her, right?
His constant promotion of these paranoid conspiracy theories to his audience of impressionable, frustrated, and economically precarious young men makes him what I would characterize, somewhat ironically, as a “useful idiot” for the far right. This is why Peterson’s “I’m so misunderstood” schtick is disingenuous, as are the claims that he is “misinterpreted.” I think it’s pretty clear from the evidence above, in his own words, what he believes.
The tragic thing is, there was a guy who wrote in very similar terms about the rootlessness, despair and alienation that young men would inevitably experience under capitalism. He also gained a following as well. His name? Karl Marx:
Matthew Syed in the Times gives us a wonderful example of Marxist thinking. He asks why marathon running is so popular, and says it’s because it satisfies a desire for self-improvement which we cannot get from paid labour:
We live in a world where the connection between effort and reward is fragmenting. In our past, we hunted, gathered and built…We could observe, track and consume the fruits of our labour. We could see the connection between our sweat and toil, and the value we derived from them. In today’s globally dispersed capitalist machine, this sense is disappearing.
This is pure Marxism. Marx thought that people had a desire for self-actualization through work, but that capitalism thwarted this urge. In capitalism, he wrote:
Labor is external to the worker, i.e., it does not belong to his intrinsic nature; that in his work, therefore, he does not affirm himself but denies himself, does not feel content but unhappy, does not develop freely his physical and mental energy but mortifies his body and ruins his mind. The worker therefore only feels himself outside his work, and in his work feels outside himself.
Jon Elster claims that Marx “condemned capitalism mainly because it frustrated human development and self-actualization.”
Marx was right. The fact that we spend our leisure time doing things that others might call work – gardening, DIY, baking, blogging, playing musical instruments – demonstrates our urge for self-actualization. And yet capitalist work doesn’t fulfill this need. As the Smith Institute said (pdf):
Not only do we have widespread problems with productivity and pay, as well as growing insecurity at work, but also a significant minority of employees suffer from poor management, lack of meaningful voice and injustice at work. For too many workers, their talent, skills and potential go unrealised, leaving them less fulfilled and the economy failing to fire on all cylinders.
This poses the question: why isn’t there more demand at the political level for fulfilling work?
Perhaps because people like Jordan Peterson and his ilk would rather we focus on the threat from radical postmodernist feminist college professors, while the identitarian Neoliberals just want to make sure that there are enough minorities represented among the exploiters. Divide and rule has been a standard tactic for maintaining power in America since Bacon’s Rebellion, when the very concept of “race” was invented to keep the working classes from teaming up against the aristocracy who were–dare we say it–oppressing them. It was only when Martin Luther King attempted to bring poor whites into his movement that he was assassinated.
The meaning and self-actualization Peterson is peddling in his book simply isn’t possible under the capitalist system. And that’s the problem. No amount of self-help or story-building is going to change that fact.
Combining white paranoia about becoming a minority with a deteriorating economy and constant fears of cultural Marxism, and peddling those ideas to angry young men, has not historically led to good results. Is Peterson too ignorant of history to see this?
Admittedly it’s not always easy to distinguish between a harmless retro eccentric and a peddler of poisonous and potentially murderous ideas. So let’s take stock: Masculinist persecution myth? Check. Repeated appeals to Darwinism to justify social hierarchies? Check. A left-wing conspiracy to take over the culture? Check. Romanticization of suffering? Check. Neurotic angst about “chaos”? Check. Like many of his sort, Peterson sees himself as a defender of the best traditions of Western civilization and the Enlightenment. But there is an old adage: if it looks like a duck and quacks like a duck, chances are it’s a duck.
Finally, here are some more good comments from that YouTube video. I’ve combined several of them together which make the point that Marxism and Postmodernism have nothing to do with the identitarian politics on college campuses.
I’m so fucking tired of people using the term postmodernist as a catchall for leftists – postmodernism has literally nothing to do with Marxism, in fact by its very nature it is at odds with the materialist nature of Marxism… Derrida wasn’t a Marxist, he wasn’t even a political radical unlike many of his colleagues. Derrida didn’t even write about Marx at all until the 90s, after the time in which all of his intellectual cohort had given up on Marx. Derrida’s philosophical heritage is by way of the structuralism of Saussure and Levi-Strauss, and the hermeneutic philosophy of Heidegger and Gadamer…
Something that most people who aren’t in the academic left don’t realize is that Foucault is seen as a clear break with Marxism, and distinctly not as an extension of it. Foucault was the first one to pose a distinctly different understanding of oppression, a sort of anarchist-flavoured one, against the Marxists. There have been attempts at reconciliation, the most significant of which is Empire by Negri and Hardt, where they incorporated Foucault’s biopolitical framework to create a framework for analyzing the world after the Cold War. Postmodernism, insofar as that term refers to anything at all, is the wave of thinkers who broke with Marx after reading Nietzsche, which is the case for both Foucault and Deleuze. Lyotard and Baudrillard also broke with Marx altogether, though for different reasons. Negri, Deleuze, and Althusser all became anti-Hegelian as well, adopting Spinoza as a model for bizarre anti-dialectical forms of “Marxism”.
Basically, this is a lot more complicated than Peterson, or you, understand. The people who are collected up into ‘postmodernism’ were serious intellectuals with real insight, and while I don’t think most of it is correct, it’s important and interesting stuff.
Because Postmodernism doesn’t actually refer to anything, it is an empty label that basically exists only as a term of abuse used by people who don’t want to actually engage with various philosophers and social thinkers. There really is no common philosophical factor that links Derrida, Foucault, Deleuze, Rorty, etc… What connects all these people is really just their attempt to explain society in the era they lived in. …Peterson just straight up doesn’t understand the topology of the left. Peterson has never lived in a place where ‘the left’ and ‘liberals’ were universally understood to be categorically different orientations in politics. For Peterson communists are just ‘very liberal’ people, while in European politics, for most of the postwar period, being ‘very liberal’ was the same as saying ‘very anti-communist’. In places like France and Italy the Communist party was often the second or third biggest party, and was distinctly separate from anything called a ‘liberal’ party. This means that Peterson totally conflates the Marxist left with ‘left-liberalism’ or progressive liberalism…
Next time we’ll take a look at how Peterson defends and shores up those systems.
Mark Blyth is a popular speaker. You have probably seen his talks all over YouTube. His specialty is explaining the politics of austerity (and why it’s a bad idea), and the rise of populism around the world, what he calls “Global Trumpism.”
I’ve seen several of his talks, and the ideas behind them are very simple. So I’m going to try and explain them in a straightforward manner below, sprinkled with a few quotes.
I’m also going to incorporate two other thinkers whose views are very similar and who fill in some of the gaps: economist Steve Keen and filmmaker Adam Curtis. Each has their own unique take on our situation, but all of their views gel together into one coherent big-picture summation of what has happened to the post-war world, and how we got into our economic predicament. I’m also going to add a few points of my own along the way where applicable.
The First Macroeconomic Regime 1945-1973
After the Second World War, nearly the whole world lay in ruins. Over 50 million people were dead. The architects of the post-war order vowed that they would do whatever possible to ensure that it would never happen again, no matter what the cost.
They realized that it was the economic dislocations and upheavals of the Great Depression–the joblessness, the inflation, the lack of a safety net, the radicalization of the population–that had ultimately led to the rise of Fascism and war.
At the same time, Communism was ascendant. Stalin had taken over Eastern Europe and was flexing his muscles. Mao and the Communists came to power in China. These two countries alone represented a significant share of the world’s total land area and population. They were joined by numerous smaller states–Cuba, North Korea, Vietnam–and numerous revolutionaries in places like Africa and Central America.
No one would support capitalism if all the gains went solely to the very top and most people were becoming worse off. So the post-war order would focus on one thing above all: full employment and social stability. Blyth uses the term “macroeconomic regime” to describe the set of policies that run the economy. The post-war macroeconomic regime was based around stability, Keynesianism, national economies, unionization, and especially full employment as the goal.
During this period, labor’s share of income went up, while capital’s share went down. This had never happened before. A prosperous, consuming middle class was created in the industrialized world. The results of this were spectacular. Here’s Blyth:
“Back in the day, from the end of World War two, from 1945 to about 1975, this is the golden era. It was the period where something very weird happened that never happened before. The top of the income distribution came down, the bottom went up, and the whole distribution jumped. This is when you got the birth of the American middle classes. This is when British Prime Minister Harold Macmillan said to the working people in Britain, ‘You’ve never had it so good,’ and he was right.”
“And there was a unique combination of circumstances that produced that world. Mainly the reaction to the great Depression, fascism, world war two; and the success of the Soviet Union appealing as an alternative economic model after the chaos of the Twenties and Thirties.”
“So at the end of that period, we built a world that looked like this: The Cold War Era. The policy target was full employment, regardless of whether you were Sweden or the United States. That’s what the government cared about, because we saw the disastrous consequences of unemployment over a decade-long period.”
But there was a problem. This problem was articulated in 1943 by a Polish economist named Michał Kalecki, who was working in the basement of the London School of Economics. His seven-page paper was called Political Aspects of Full Employment, and it effectively predicted what was to come in the 1970’s.
We all know what happened next. The wheels came off in the 1970’s. Everything started falling apart the same year I was born–1973–which I’m sure was just a coincidence. But the question is, why did it fall apart?
Kalecki’s explanation was that if you had a siloed economy of restricted capital and labor flows that targeted full employment, employees would take advantage of the situation to demand higher wages. If anyone can simply go out and get another job, businesses have to keep wages high to be competitive. But that eats into their profits. So they raise their prices to compensate. But rising prices eat into the workers’ pocketbooks. So the workers demand still higher wages. Businesses again raise prices to compensate for higher wages. Workers again demand more money to compensate for higher prices. And so on, and so on, in what economists call a wage-price spiral. More specifically, they call inflation driven by rising wages “wage push inflation.” Wage push inflation resulted from the full employment policies of the first macroeconomic regime:
Wage push inflation is a general increase in the cost of goods that is preceded by and results from an increase in wages. To maintain corporate profits after an increase in wages, employers must increase the prices they charge for the goods and services they provide. The overall increased cost of goods and services has a circular effect on the wage increase; eventually, as goods and services in the market overall increase, then higher wages will be needed to compensate for the increased prices of consumer goods. (Investopedia)
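The circular logic of the spiral can be sketched in a few lines of code. This is a toy model of my own, not anything from Blyth or Kalecki: I assume wages fully index to last round’s inflation plus a real premium won in a tight labor market, and that firms pass wage growth through to prices one-for-one.

```python
def wage_price_spiral(rounds, premium=0.02):
    """Return the inflation rate after each round of the spiral.

    Toy assumptions: workers demand last round's inflation plus a
    real `premium` (they can always walk into another job), and
    firms pass the whole wage increase through to prices.
    """
    inflation = 0.0
    path = []
    for _ in range(rounds):
        wage_growth = inflation + premium  # catch up, plus a real raise
        inflation = wage_growth            # full pass-through to prices
        path.append(inflation)
    return path

# Inflation ratchets up by the premium every round: ~2%, 4%, 6%, 8%, 10%
print([f"{x:.0%}" for x in wage_price_spiral(5)])
```

With any positive premium and full pass-through, inflation never settles down–it rises every round without limit, which is the heart of Kalecki’s warning.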
Blyth explains it this way:
“So what killed that first regime was inflation.”
“But it failed in the 1970’s. And the reason it failed was the following. Imagine you’ve decided that I’m going to target full employment, and that’s going to be my one policy goal. So you’re going to run a very tight, restrictive set of labor markets. And wages are going to get bid up, to the point that when you get to the late sixties when you’re running the Vietnam War off the books, your real unemployment rate is about 2-1/2 to three percent.”
“So the worst guy in your firm can leave work and then walk straight into another job and get a pay rise. The only way that firms can deal with this is by pushing up prices. So they push up prices, then what happens? Labor figures out they haven’t really had a pay rise. So they want more money. So they get a pay rise. So they want more money. And it all gets pushed up into inflation.”
“When inflation goes up and up and up like this, it becomes irrational to be an investor. So the investment rate collapses. Unemployment goes up despite the inflation. We get the great stagflation of the 1970’s.”
What high inflation and rising wages did was make it easy for people to service their debts. It was a “debtor’s paradise.” But creditors were not so happy. The value of their investments was eaten away by inflation. So they stopped investing, going on what was effectively an “investor strike.” The result of the dearth of investment due to high inflation was economic stagnation. Stagnation + inflation = stagflation.
“The Great Inflation of the 1970s destroyed faith in paper assets, because if you held a bond, suddenly the bond was worth much less money than it was before. But it was a brilliant time to be a debtor.”
“How many of you took out a mortgage in the 1970’s? You made out like bandits! Because if you have a 3 percent mortgage and there’s 10 percent inflation, it’s great; the bank was eating it and you were getting the capital gain. And then when you elected Reagan you locked in high real interest rates and your house increased in value. What a deal!”
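Blyth’s mortgage arithmetic checks out. Using the exact Fisher relation (a standard textbook formula, not something from his talk), a 3 percent mortgage under 10 percent inflation carries a deeply negative real rate:

```python
def real_rate(nominal, inflation):
    """Exact Fisher relation: real = (1 + nominal) / (1 + inflation) - 1."""
    return (1 + nominal) / (1 + inflation) - 1

# 3% mortgage, 10% inflation: the bank earns about -6.4% in real terms.
print(f"{real_rate(0.03, 0.10):.2%}")  # -6.36%
```

The debtor’s balance shrinks in purchasing-power terms every year, which is exactly why high inflation made the 1970’s a “debtor’s paradise.”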
So in the Blyth/Kalecki view, full employment policies, combined with an investor strike, caused the stagflation of the 1970’s. But there are a couple of alternate explanations I’d like to add.
One was the Vietnam War. That war caused a massive increase in government spending, as all wars do. At the same time, the administration was cutting taxes rather than raising them. Throughout history, governments have “paid for” wars by raising taxes and/or borrowing.
I put “pay for” in quotes because MMT tells us that taxes do not fund government spending for war, or for anything else for that matter. So why the need for increased taxes? Because if the government is printing more and more money to pay for war costs, but it’s not “unprinting” money via taxation or soaking up the excess with war bonds, then you’re increasing the overall amount of money circulating in the economy. If you do this without a corresponding increase in productivity, then of course you will get inflation. It doesn’t help that the increased economic activity was mostly going to things that were being shipped halfway around the world to be blown up.
The other explanation is a sudden spike in the cost of oil. The OPEC cartel’s assertion of its market power caused the price of oil to triple almost overnight in the early 1970’s. Gas lines formed. The Arab Oil Embargo following the 1973 Yom Kippur War was another blow. This came shortly after the U.S. hit domestic peak oil in 1970. Later in the decade, the Iranian Revolution would cause speculation in oil markets to raise the price once again (there was no actual supply shortage). It was the single largest transfer of national wealth in human history from the industrialized world to the oil-producing nations of OPEC, particularly Saudi Arabia and the Gulf states. Interestingly, it was after this transfer of wealth that the threat of Islamic terrorism began to rise, funded by this money.
The price of oil tends to coincide with the change of macroeconomic regimes and financial dislocations. This is ignored by many economists.
Economists have a name for this phenomenon too. They call it cost-push inflation:
Cost-push inflation is a phenomenon in which general price levels rise (inflation) due to increases in the cost of wages and raw materials. Cost-push inflation develops because higher costs of production factors decrease aggregate supply (the amount of total production) in the economy. Because fewer goods are being produced (supply weakens) while demand for these goods remains consistent, the prices of finished goods increase (inflation). (Investopedia)
In this alternative view, full employment did not cause the problem. Rather it was an ill-advised war that the government refused to “pay for,” coupled with an unpredictable rise in the cost of the substance most dear to the economy–energy–as the result of cartel manipulation and geopolitical tensions.
Who’s right? What was the real cause? Hard to say, but I lean towards oil. If full employment was such a problem, then why did it take Kalecki’s prophecy thirty years to come true? Maybe because that’s when energy costs spiked. The ugly conclusion from Blyth’s view, as I see it, is that we cannot have full employment, otherwise inflation will inevitably spiral out of control. That is, we “need” a certain portion of the populace to be unemployed. I find this disturbing. It amounts to what is basically human sacrifice.
Inflation was running rampant. Stagflation. The Misery Index. Put on a sweater. Malaise forever. History’s greatest monster. You know the deal.
The system badly needed a reset. How was this accomplished?
The New Macroeconomic Regime 1980-2007: Neoliberalism
Blyth describes a macroeconomic regime as a sort of “software” written on the “hardware” of capitalism. After 1980, a new software was written. Now, not only would full employment NOT be a goal anymore, but labor would be “disciplined”–forced to accept declining wages, longer working hours, fewer benefits, less stability (and its converse, more “flexibility” for employers), more international competition, and so on. It was, essentially, a revolt of the elites rather than the masses; from the top down rather than the bottom up.
Instead of elected governments, policy was handed over to central bankers, who were unelected. They “cured” the inflation by raising real interest rates to extremely high levels. This caused a terrible recession between 1979 and 1982, with unemployment spiking to over ten percent. But it did bring down inflation and reset the system. It also coincided with the transition from the Democratic Carter administration to the Republican Reagan one.
“And what’s the solution to stagflation? Hand policy to central bankers, because they’re not elected and they can’t be thrown out of office, and they can jack up interest rates to twenty percent when inflation’s sixteen percent, cause a massive hemorrhaging of the economy and a constriction of credit and you get the big recessions that happened in the 1980’s. But it really reset the system.”
“There was a new software written onto that hardware, and that was the ideas of Thatcher and Reagan and the people behind them. That open markets, price stability, going global, that was the way you do it. That flexibility was good, that labor was bad. That the returns to capital had to go up otherwise what was the point of capitalism? That was the Neoliberal compact.”
In addition, markets would open up to globalization. Regulations would be abolished. Capital would be free to seek its highest return. Markets would be liberalized. Trade unions would be crushed. Banks would be deregulated. Taxes would be lowered. Labor would become flexible and footloose. Economies would remove tariffs and open themselves up to foreign competition. Labor would no longer be protected.
Globalization would neutralize the power of workers, because now if unions demanded more money, production would just move somewhere else. First it moved to the “right to work” states of Dixie, and then abroad to places like China and Mexico. Workers could no longer demand raises from their employers. This ended the push for inflation, because rising wages are what drive inflation. I’d also note that high oil prices led to new reserves being tapped, especially Alaska and the North Sea, breaking the power of the OPEC cartel and bringing oil prices back down.
As banks lent more and more money, and as capital was “freed” to seek its highest return anywhere in the world, the total amount of money increased. This drove interest rates down and down and down. The problem with low interest rates, though, is that they decrease the returns to capital. They also penalize savers and investors, since their accounts are not earning much of a return. So how could high finance ensure adequate profits in a world of low interest rates and low inflation? The answer: leverage.
“Now here’s the problem, those interest rates go down over time because you make more and more financial transactions, you integrate different markets, you open up globally, so the pool of money gets bigger. As the pool of money gets bigger, the price of money falls. What’s the price of money? The interest rate.”
“So how do you make money on a declining spread? You pump up the leverage. And the banking system of the West became multiples of the underlying size of the economy. And it was all working great so long as everyone was revolving credit, whether it was your credit card, your house, the mortgage, the corporate loan book, whatever it was, so long as it doesn’t go bust.”
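The arithmetic behind “pump up the leverage” is simple (the numbers below are illustrative ones of my own, not Blyth’s): a bank’s return on equity is roughly the spread it earns on its assets times its leverage ratio, so a shrinking spread can always be offset by a bigger balance sheet.

```python
def return_on_equity(spread, leverage):
    """ROE ~ spread earned on assets x (assets / equity), ignoring costs."""
    return spread * leverage

# A 3% spread at 10x leverage and a 0.5% spread at 60x leverage both
# deliver the same 30% return on equity...
print(f"{return_on_equity(0.03, 10):.0%}")   # 30%
print(f"{return_on_equity(0.005, 60):.0%}")  # 30%

# ...but at 60x, the equity cushion is only 1/60 of assets, so a fall
# of under 2% in asset values wipes out the bank entirely.
print(f"{1 / 60:.1%}")  # 1.7%
```

This is why a banking system that is “multiples of the underlying size of the economy” works beautifully right up until asset prices dip.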
Instead of full employment, financial policy in the new regime would focus on something else–controlling inflation, aka price stability, above all else. At the very hint of inflation, central bankers stood ready to raise interest rates. The central bankers became household names as the ability of politicians to influence the newly globalized economy waned. Inflation went away, never to return. This now made the world safe for investors, but it made debts harder to service. It was now a “creditor’s paradise.” Even though inflation stayed low for decades, the 1970’s fear of inflation lingered on.
However, as Blyth points out, this fear was irrational; the long-term trend in interest rates was for them to go down. He cites statistics showing that the interest rate for government debt has been declining since 1350! In fact, literally the only period of high inflation we see in the data was the 1970’s, and yet all of our macroeconomic thinking today is based on that one short time period (when oil prices spiked, interestingly enough):
“You all watch Game of Thrones, right? Right. Game of Thrones: ‘Hi, I’m the king, I’d like to borrow some money.’ I’m the Iron Bank of Braavos. You know what happens–everybody dies.”
“In that world, you have very high real interest rates, because if you get a bond from a government, they might rip you off. There’s no secondary market where you buy and swap different bonds around to offset the risk. So you have very high real interest rates.”
“The Italians and the Dutch come along in the 15-16th century and invent a secondary market for government debt. That starts to grow rapidly. The risk dissipates.”
“And then by the time you get to the 1700’s, real interest rates are below four percent. The Brits can issue a perpetual bond, a ‘forever’ bond to fight the Napoleonic wars at three percent and it’s oversubscribed. By the time you get to 1941, the real rate of interest is 1.88.”
“So the long-run real rate of interest for the global economy is two percent. Then there’s the 1970’s. All the inflation is in the seventies because of the unique confluence of events which was the post-war regime and its breakdown. But all of the economics we’ve ever learned is based on that one bit of the trend series. Everything else is forgotten.”
At the same time, wages in the wealthy countries stagnated. The labor/capital split now shifted. Labor’s share of income, formerly going up, now went down. Wages became decoupled from productivity. Real wages, when adjusted for inflation, remained flat for decades. As the economist Thomas Piketty later asserted, if the rates of return to capital are higher than the overall growth rate of the underlying economy, inequality will dramatically increase without bound. This is especially true when wages are stagnant.
“Capitalism is run by investors, investors and firms. Once those firms go global as they did in the eighties and increasingly in the nineties, then the ability of domestic labor to demand their share of the profit split with capital really declines. And that begins the wage stagnation that we see in 1979 which continues all the way through to the present day, such that 60 percent of Americans, when adjusted for inflation, haven’t had a wage rise for thirty years.”
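Piketty’s r > g claim is easy to check numerically. The following is a toy model of my own (not Piketty’s actual methodology): wealth compounds at r with all returns reinvested, while national income grows at g; starting wealth is assumed to be three times national income, an arbitrary illustrative figure.

```python
def capital_income_share(r, g, years, wealth=3.0, income=1.0):
    """Capital's share of national income, year by year.

    Toy assumptions: capital income each year is r * wealth; wealth
    compounds at r (all returns reinvested); income grows at g.
    """
    shares = []
    for _ in range(years):
        shares.append(r * wealth / income)
        wealth *= 1 + r
        income *= 1 + g
    return shares

path = capital_income_share(r=0.05, g=0.02, years=50)
# Capital's share roughly quadruples, from 15% to over 60%.
print(f"{path[0]:.0%} -> {path[-1]:.0%}")
```

Since the share grows by a factor of (1 + r)/(1 + g) each year, any persistent gap between r and g compounds without bound, just as the text says.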
A small, coastal elite, who held much of the paper wealth, became fabulously rich. But workers who depended on labor for their income, particularly in places that had deindustrialized like the American Heartland or the English Midlands, were hard hit, faced with declining wages, dead-end jobs, shrinking government services, budget cuts, jobs moving to other countries, and mass immigration into their communities. At the same time, the costs for “non-tradable” goods like education and healthcare soared into the stratosphere. Although globalization resulted in cheap consumer goods, the costs of things like college, health care, child care, and later, housing, became an increasingly onerous burden.
With declining wages, how would consumption keep up? How would Americans pay for the rising costs? By using credit to substitute for the lost wages. And this wasn’t just true of individuals, but of governments as well. Governments, too, would lower taxes and make up the difference by borrowing from the private sector. Here’s filmmaker Adam Curtis explaining the role finance played in the 1980’s:
14:00: “The interesting thing about the 1980’s is that everyone thinks that Thatcher and Reagan really were successful. But increasingly historians are looking back and going, ‘No they weren’t.’ They came in saying they were going to regenerate industry. But by about 1986-7 most of the industries in Britain and America had collapsed because of the economic experiment. So what Thatcher and Reagan did was they turned to finance. And they said, ‘Can you help us?’ And what finance did was to say, ‘We’ll lend the money.’ Wages weren’t going up. Wages were actually collapsing at that point.”
“So what happens is you had a switch and they gave power to finance. And finance came in and introduced the idea of lending on a much grander scale. The politicians allowed that because they facilitated all sorts of new acts of parliament that allowed all that lending to happen.”
“So what you’ve got is a shift away from the idea that you were on a constant travelator of increased wages, increased security in the industries you worked in. Your income stagnated and it was supplemented by lending money. So finance got a great deal of power.”
“Now underlying finance is a deep desire to keep the world stable, to avoid chaotic situations. So we began to move into that world where we were always trying to avoid risk. What then happens is that idea begins to spread out, not just literally in terms of lending money. The idea of avoiding risk becomes the central thing in our society. And I would argue that we’ve all become terrified of change. Which is conservative.”
And thus, beginning in the 1980’s, finance became the new basis for the economy. Manufacturing, meanwhile, practically disappeared–felled by a combination of offshoring and automation. Service jobs, at lower pay, became the most common job type. At the same time, a college degree became a basic requirement for any job that paid more than minimum wage. And, finally, the main alternative to this system–Communism–collapsed. Now there truly was, as Margaret Thatcher put it, no alternative.
The problem was that financing living standards with debt was unsustainable, especially with declining incomes for the majority of wage earners:
“So how do people survive when wages aren’t growing? They borrow…In 2004 I lived in Baltimore. I went away for two weeks. I couldn’t open the door when I came home cause I had so many credit card offers…That’s why banking’s so big. Because every single one of us is running a deficit.”
“For everyone who’s fifty years old or older, do you remember a time when you didn’t have credit cards? For everybody who’s under fifty, that happened. We used to have this thing called the state that ran deficits for us and paid for stuff. But now you do it yourself. Through student loans. Through revolving lines of credit. Through borrowing from your house as if it’s an ATM. Because that’s what you’re using to fill in the gap.”
The problem with such skewed rewards from globalization was that the people at the top can only buy so much. They save much of their income. Meanwhile the bottom sixty percent have seen their wages stagnate and have been taking out student loans, mortgages, credit card debt, payday loans, etc. to make up for lost wages and shrinking public services. This leads to a fall in consumption once the debts start going bad, people are tapped out and can no longer borrow against their incomes, and the asset bubbles burst:
“Now this created a big problem. I like Mitt Romney, but there’s only so many fridges he can buy. You do have a basic consumption problem if you’ve been running your economy as we have for the past thirty years on credit. And if people’s wages haven’t been rising and they’re strapped with too much debt–which banks call credit, [because assets and liabilities sum to zero]–then they can’t service their debts. At the same time they’re being told if you don’t go to college you’ll never amount to anything, there’s no jobs for anyone who doesn’t have a college degree these days, you end up with a world where your share of the national income is falling despite the fact that the country has never been richer. And it’s not just this country, it’s every country.”
Here’s Steve Keen explaining how it was done under the new macroeconomic regime:
“The fundamental engine that drove the apparent success of Neoliberalism until the crisis struck was an increasing level of private debt–leverage.”
“The reason that private debt matters is because credit is the source of a large part of demand.”
“When you borrow money from banks, what’s actually happening is this: the banks are creating money, and creating a debt for you at the same time; you then spend the money you’ve borrowed, so that change in debt becomes a component of demand today. But of course that change in debt gets added to the level of outstanding debt, and you can have a process where that level rises over time.”
“In the UK’s case…from 1880 right through to 1980, there was no trend in the level of private debt compared to GDP in the UK; it never exceeded 75 percent of GDP. When Maggie Thatcher came to power it was 55 percent of GDP. The debt level rose from 55 percent of GDP in 1982, to 190 percent in 2008…The reason the crisis occurred was that the rate of growth of debt went from positive to negative, and bang, you had a crisis.”
“So the bubble was caused by a rise in leverage, the crisis after it was caused by an absence of the same substantial level of credit simply because both households and businesses are unwilling to borrow at the rate they were willing to borrow when the bubble was going on and the banks aren’t so willing to lend either. So we’ve got a sclerotic effect from the level of accumulated debt. That’s the real story.”
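Keen’s accounting point can be shown with made-up numbers (mine, not the actual UK series): since spending in any period is income plus newly created credit, demand collapses the moment credit growth slows or turns negative, even if the debt stock itself barely falls.

```python
def demand_path(income, debt_changes):
    """Total demand each period = income + the change in debt (new credit)."""
    return [income + d for d in debt_changes]

# Credit expands during the bubble, then net borrowing flips negative.
credit = [10, 12, 14, 16, -5, -8]  # change in private debt per period
print(demand_path(100.0, credit))
# [110.0, 112.0, 114.0, 116.0, 95.0, 92.0]
# Demand falls ~18% the moment borrowing swings from +16 to -5:
# "the rate of growth of debt went from positive to negative, and
# bang, you had a crisis."
```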
Blyth summarizes the problem as, “Debts are too high, wages are too low to pay the debt off, and inflation is too low to eat the debt.” Leverage works so long as the debts can be paid. But once they can’t, the whole thing falls apart like a house of cards. Leverage also tends to raise asset prices, causing bubbles. This is what happened during the global financial crash 2007-2008. Here’s Steve Keen again describing what happened:
“What Neoliberalism allowed the West to do was to use leverage to dramatically add to total demand, but of course adding to debt at the same time. And then when we reached the situation where so many interventions which were debt financed went bankrupt, where there were assets that were overvalued and then collapsed and then wiped out the banks in the process, and where people realized that rather than house prices rising forever they sometimes fall so you get the hell out of mortgage debt, all those things came along and they became what I called the walking dead of debt…”
“…The levels of debt are the highest they’ve been in human history, and well beyond what we can service reliably, and also investment and consumption are both diminished dramatically because people don’t want to invest beyond their income levels which they do during a boom. So the only way to solve it is to get the debt level down.”
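Keen’s mechanism, that spending in any period equals income plus the change in debt, can be sketched as a toy simulation. All the numbers below (growth rates, starting levels) are invented for illustration; only the qualitative pattern, a debt-to-GDP ratio that ratchets upward while new credit props up demand, reflects his argument:

```python
# Toy sketch of Keen's credit-driven demand: demand = income + change in debt.
# All numbers are illustrative, not data.
gdp = 100.0    # annual output
debt = 55.0    # outstanding private debt (echoing the 55%-of-GDP starting point)

years = []
for year in range(1982, 1992):
    credit = debt * 0.10      # new borrowing: debt grows 10% per year (assumed)
    demand = gdp + credit     # spending = income plus newly created credit
    debt += credit
    gdp *= 1.02               # underlying output grows much more slowly (assumed)
    years.append((year, round(debt / gdp * 100, 1), round(demand, 1)))

# (year, debt-to-GDP %, demand): the ratio climbs steadily even as output grows.
print(years[0], years[-1])
```

If the `credit` term in the loop turns negative, demand drops below income, which is exactly Keen’s description of how the crisis arrived.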
The Second Macroeconomic Regime Collapses: 2007–Present
The second regime collapsed during the Global Financial Crisis of 2007-2008. Why did that happen?
Blyth doesn’t explain the specific timing of it, but it’s interesting to note that this, too, coincided with a dramatic spike in the price of oil.
The fundamental cause, however, is easier to understand. The financial sector was leveraged to the hilt. Its total assets were multiples of the size of the underlying “real” economy of goods and services.
If enough of a bank’s assets go bad, the losses exceed a typical bank’s equity and it goes under. QE bought up the bad assets.
Blyth explains the concept of leverage, assets and liabilities using the example of a mortgage:
“People confuse debt and what a debt is. It’s not just this bad thing. Debt on the public side or the private side is an asset. The people who got bailed out got their assets bailed out.”
“Now, a very simple way of thinking about this: I have a mortgage, you have a mortgage. The bank doesn’t want your house. It wants the income stream coming from it. [That’s their asset]. My asset is my house. My liability is paying the mortgage. It all sums to zero.”
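Blyth’s point that the two sides of a mortgage sum to zero is just double-entry accounting, and can be shown in a few lines (the loan amount is hypothetical):

```python
# The same mortgage appears on both balance sheets with opposite signs.
mortgage = 200_000  # hypothetical loan amount

bank = {"mortgage loan (income stream)": +mortgage}   # the bank's asset
household = {
    "house": +mortgage,            # the household's asset, bought with the loan
    "mortgage owed": -mortgage,    # the household's liability
}

# Across both parties, the financial claims created by the loan net to zero:
net_claims = bank["mortgage loan (income stream)"] + household["mortgage owed"]
print(net_claims)  # 0
```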
However, unlike the crisis of the 1970s, this time there would be no “reset.” There would be no new software written for the hardware of capitalism. Instead, banks and the wealthy were bailed out by taking their assets onto the public balance sheet through “money printing” and buying up junk bonds:
“In the 1970s the system failed. It had a heart attack because of inflation. The Neoliberals came along and reset the system. They wrote new software for the hardware. We didn’t do that in 2008. We let the money doctors come in. What they did was they pumped 13 trillion of Euros, Dollars and Yen into the global banking system to keep the system going. They had a heart attack, and we basically put them into intensive care for ten years.”
The corresponding rise in public debt sparked calls for “austerity” on the part of elites to bring down the government’s debt. But, even as the investor class was bailed out, savage austerity cuts would be aimed squarely at the poorest and most vulnerable members of society who had been borrowing like crazy just to maintain their living standards in the face of decades of stagnant wages and rising costs:
“When you bail out the assets of a bank, you’re bailing out the assets and incomes of the top twenty percent of the income distribution, particularly those at the very top. So when you’ve just done that, they’re not going to turn around and say let’s pay extra taxes because we got bailed out. No, they want to put that on the other part of the income distribution–the ones who are now being told, we can’t have this, you need to pay more, your education can’t be free, et cetera.”
With interest rates at practically zero, there was little the central bankers could do to stimulate the economy this time. Instead of increasing government spending as prescribed by Keynesianism, politicians preached the need for “belt tightening” due to the rising debt. This had the effect of shrinking the economy:
“There’s no inflation in the system. Why? because labor produces inflation. And once you’ve globalized your labor markets, there’s no inflation anymore. Why can’t Janet Yellen bring inflation rates up? Why can’t Draghi bring interest rates up? Because there’s no reason to. There’s no inflation. When you do it, you’d simply slow down the economy. But what does that mean for savers? What does that mean for pension funds? Whoops!”
“Now, in my opinion, add this all together and you get populism. Debts are too high. Wages are too low to pay off the debt. Inflation is too low to eat the debt. You can’t play the trick you did in the 1970s when you got a mortgage. It’s the other way around–this is a creditor’s paradise, not a debtor’s paradise.”
“The Left response is ‘blame capital, blame globalization.’ And they’re not blameless. The Right response is ‘blame immigration, blame globalization.’ We can disagree on the immigration one, but they’re basically hitting on the same things.”
The Neoliberal economic regime hollowed out the middle classes of the industrialized world, even as it raised incomes in much of the developing world. The “elephant chart” compiled by economist Branko Milanovic shows the incomes of everyone in the world, from the poorest person on earth to the richest, along with the percent change due to globalization:
The graph shows that incomes for the poorest countries went up, though only from a pittance to something slightly more than a pittance. The biggest gains represent the emerging middle classes of Asia. The American middle class was represented by the 65-80 percentile range in the global distribution. Their incomes have taken a beating. And notice that the last four squares represent the wealthiest 10 percent of people on the planet. They’ve captured the majority of the growth under Neoliberalism, such that 42 billionaires now own the same wealth as the bottom half—3.7 billion people—of the world’s population.
“Guess what? The top 20 percent have made off with all the cash. And if you’re actually in the bottom?…it’s hardly budged since 1980. And that’s true for the bottom three quintiles. So sixty percent of the country hasn’t had a pay rise when you adjust for inflation since 1980. Meanwhile, people like me on the coasts, we’ve been lapping it up. I’ve been having lobster thermidor in the bath!..”
These declining living standards for the majority in wealthy countries give rise to populism. Even as the country as a whole has never been richer, and the stock market and GDP are hitting new heights, workers are having a harder and harder time making ends meet. Their wages are declining. They are heavily indebted. Their formerly good and stable manufacturing jobs are being replaced with low-paying “flexible” service jobs with no benefits. Digital technology is forcing people to become “independent contractors.” And now workers hear even those jobs will soon be replaced by robots. Blyth illustrates this phenomenon with a hypothetical Rust Belt worker named Gary:
“There’s a guy called Gary. Gary lives in Gary, Indiana. Gary [has] ten years in the union in 1989. He gets seniority. He’s a line supervisor with seniority, he’s turning thirty, he gets married, everything’s going great. And he’s getting $30.00 an hour, real [wages].”
“Now, who knows why, but they’ve been moving the plant and the equipment down South for a long time. China didn’t take most of the industrialization; the South did. Texas did. North Carolina did. So they’ve been losing a lot of the industrial base. But then they signed this thing called NAFTA. And the plant disappeared, the supplier plant disappeared, and the town takes an enormous economic hit.”
“So a lot of people move out. The tax base goes down. The schools get worse. And he bootstraps himself and says, ‘I’ve never relied on anybody; I’ll get another job.’ They were meant to retrain him as a computer programmer; that’s what everybody said, but the governor at the time really just gave a shit about tax cuts, so they just cut the budget for that and handed it out to people.”
“So then he ended up getting a job in a call center. So he went to $15.00 an hour. And then five years later the call center went from Indiana to India. And now Gary works in his dotage, very hard, long hours, for $11.67 an hour for the largest employer in the United States–WalMart. And every day Gary reads in the papers how him and all of his mates are about to be replaced by robots. Because you do, every day. Whatever sector you’re in in the low end of the labor market–automation, robotization, it’s going to happen.”
“And the guys on Wall Street who got bailed out with everybody else’s money, they love this. They’re going to make a fortune off this. All these internet entrepreneurs, [the] Uber guys, they’re the ones who will own the patents on the robots. And he’ll be thrown on the scrap heap with his mates. And the only person who articulates anything he actually gives a shit about is this guy Trump.”
“Now he knows [Trump’s] a buffoon. He knows he’s a reality TV star. But [Gary] has had politician after politician after politician showing up and saying ‘vote for me, better jobs; vote for me, more security,’ and life’s gotten crappier and crappier and crappier. So he has no reason whatsoever to believe a word they say. So he has a liar on one side, and a bullshit artist on the other. Which one gives you more possibilities?”
Communities around the country have gotten worse and worse outside of the coasts and major cities for decades. Many can’t even afford to maintain their outdated infrastructure. The main losers from the situation were the center-left and center-right parties who unanimously supported Neoliberalism. Under their watch, things have gotten worse and worse for at least half the population, and they are fed up.
So the fringe parties come to the fore. Despite their differences, their core planks are:
1.) Left Populism: Blame globalization, blame capital.
2.) Right Populism: Blame globalization, blame immigration.
Both sides essentially converged on the same basic program–turn inward, against globalization. They exploit the anger caused by debt and falling living standards. And sometimes they use racism and xenophobia to do it. We’ve seen this before. It’s the world the architects of the post-WW2 order were so desperate to avoid, because they knew where it inevitably led.
“So what you have is a sort of debtor’s revolt against the world we’ve built over the past 30 years which is a creditor’s paradise. So what you see is a left wing expression and a right wing expression, a racist expression and a non-racist expression of fundamental discontent with the way the rewards of the system have been skewed over the past thirty years.”
The key is that there are no real solutions being offered to the above problems by the mainstream political parties, and the people in charge don’t look like they know what they’re doing. Meanwhile, the workers have done everything capital asked of them. They went back to school. They retrained. They took out huge debts for college. They became flexible. But their living standards didn’t budge. Life kept getting harder. Because the mainstream parties offered no alternatives that actually translated into improving the status of anyone besides the top 20 percent, people turned to buffoons and demagogues, and some of them are very ugly indeed. Trump, the alt-right and Brexit are all examples of this trend.
What is the Solution?
Mark Blyth and Steve Keen both suggest possible solutions.
Blyth opposes undoing globalization and turning inward to economic nationalism and tariffs. He notes that the wealthy countries of the West have not had enough children, and without immigration their economies will shrink.
Instead, he recommends the state take over the things that have skyrocketed in price. He recommends universal health care, universal free education and free child care. Do those things, he claims, and you will nip populism in the bud.
Keen’s prescription is more ambitious. He advocates using the money-issuing power of central banks for a “people’s qualitative easing.” He explains the concept by describing the money-creation powers of central governments and how it was used to make the banking sector whole:
“If you owned your own bank, and people accepted checks you wrote on your bank as complete payment of any debt you had, would you feel worried about a large amount of debt? And the answer is fundamentally no, because if you could draw checks on a bank you owned…as soon as you gave that check to somebody else, they basically wrote off what you owed them, and they then used that themselves to exchange money with other people, you’d be on easy street. The only danger you’d face is creating so much of the stuff that you caused a bubble, that the economy itself fell over because you were importing too much from overseas and the trade balance exploded and so on. That’s the real danger of somebody who owns their own bank spending without limit.”
“But that’s fundamentally the situation that the government is in. Any region where the government produces its own currency, and of course the UK government has the Bank of England which produces Pounds…it can pay its debts with its own bank. Now we’ve put all sorts of legal restrictions on them doing this…”
“When the treasury records that it’s going to, say, spend 50 billion pounds and it’s going to get 45 billion in tax, therefore it has a 5 billion pound gap, it then issues bonds to the equivalent of 5 billion pounds to pay for that. And let’s say it’s going to charge an interest of 10 percent, which is far higher than current levels. So 5.5 billion pounds of bonds it issues.”
“Currently those bonds have to be sold to the private sector, which means there’s a transfer from the financial sector to the government of money, and then the government spends that money into the economy.”
“And then, of course, they’ve got a debt to the financial sector. But that debt is effectively paid for by the central bank–the Bank of England–crediting the accounts of the institutions that own the bonds they’ve bought off the government. So it’s an accounting operation all the way through.”
“And if the central bank actually bought *all* the bonds outstanding, which is pretty close to what it’s done with QE…We know that in 2010 or 2011 when the Bank of England began Quantitative Easing, the amount of money that it created for bond purchases off private banks and off private financial institutions was 200 billion Pounds. Now did you get your part of the QE tax bill that year? The answer is, no you didn’t. There was no QE tax. The central bank simply said, ‘we’re going to deposit 200 billion Pounds worth of money in financial institutions’ bank accounts in return for them giving us the ownership of 200 billion pounds worth of bonds, whether they’re government bonds or private bonds.’ So it’s just an accounting operation.”
“So, what you can do is, as I said, the government pays its own bills with what fundamentally amounts to an accounting operation, just as QE was fundamentally an accounting operation. And the level of QE, which was running at 200 billion pounds per year, was on the order of 1/6th or 1/7th the size of the economy per year. That’s the scale at which they can do it.”
“So government money creation could be used for what has been called ‘People’s Quantitative Easing,’ or what I call a Modern Debt Jubilee. Use that money creation capability to give a per capita injection to everybody in the country with a bank account. If they are in debt at all then the money reduces their debt level. If they’re not in debt they get a cash injection, but that cash injection could also be made conditional on them buying shares from companies that were required to pay their debt levels down. So you could actually use it as a way of using government money creation capability to effectively rebalance this from a far too much credit, far too little fiat-based money to a more sensible balance of the two.”
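The rule Keen describes for the per-capita injection (pay down debt first, cash otherwise) can be sketched as a small function. The balances and injection size below are hypothetical, and his further condition that debt-free recipients use the cash to buy shares is omitted for simplicity:

```python
def jubilee(debts, injection):
    """Apply a flat per-capita injection: it pays down debt first;
    whatever remains (or all of it, for the debt-free) arrives as cash."""
    results = {}
    for person, debt in debts.items():
        paid = min(debt, injection)                       # debt reduction first
        results[person] = {"debt": debt - paid, "cash": injection - paid}
    return results

# Two hypothetical account holders receiving a 5,000 injection each:
outcome = jubilee({"indebted": 10_000, "debt_free": 0}, 5_000)
print(outcome)
# indebted: debt falls from 10,000 to 5,000, no cash;
# debt_free: keeps zero debt and receives 5,000 in cash
```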
“You can do that level of spending politically during something like the Second World War because it’s an existential threat and nobody in their right mind is going to criticize using government money creation to mobilize as many physical resources as possible for a particular objective that virtually everybody in the society supports…in the UK’s case the government’s deficit in the first year of the Second World War was 40 percent of GDP. Nobody stood up in parliament and said ‘we can’t afford this bill because our children will be indebted for the future,’ because somebody on the other side would say, ‘we can’t afford *not* to spend this money because if we don’t your kids will be speaking German.’”
Adam Curtis isn’t an economist, but an observer of society. He describes the period we’re in now as “Hypernormalization,” a concept taken from the last days of the Soviet Union. It describes a state where everyone knows the politicians are lying, the politicians know they’re lying, and the politicians know that we know they’re lying, but everyone just goes along with it because nobody knows how to do anything else besides make-believe. He also points out that finance desires a predictable world, which is fundamentally conservative, and the power of finance means that this is what politicians support. Measures that would shake up the system are therefore less likely to be supported.
The reason there are no mass movements against this situation, Curtis argues, is that coming out of the Hippy Movement of the 1970s was an attitude that prized individual self-expression, and not being told what to do by others, above collective self-sacrifice. This made mass movements impossible, since they require people to sublimate their individuality to the goals of the movement.
Instead, advertising, and later, social media platforms, learned how to exploit this desire for self-expression while still finding a way to manage and herd large groups of people for the benefit of elites. They accomplished this through cybernetics:
28:35: “The genius of modern power is that it managed to do what politics failed to do. Politics can’t deal with individualism, because how can you have a political party where everyone wants to be an individual and not be a part of something? What modern managerial systems managed to do was square the circle. Look at modern social media. It manages to allow you to feel that you are totally yourself, expressing yourself online…Yet at the same time you are a component in a series of very complex circuits that is looking at you doing that and saying, ‘Hang on, if he’s doing that, then he’s very much like these people here which we’ve categorized like that.’ So we can say back to that person in the circuit, ‘Hey, if you’re doing that, would you like this as well?’ and you go, ‘Hmm, all right, because it’s a bit like what you’ve just done.’ And it makes you sort of feel secure within your individuality.”
“So what they’ve managed to do, increasingly, the modern systems of management, is accept your individualism and your expressiveness; allow you to feel that you’re being more and more expressive, while at the same time managing you quietly and happily so that you become part of a very large group that you don’t see, because you’re just a little component in the circuit, but the computers look at you and go, ‘Oh, well, there’s about 300 million of those sorts of types, we’ll put them in that group.’ And it’s not a conspiracy, a group of people going ‘We’ll do this.’ It’s a system that can see, from the information that it’s reading from you and lots of other people, the patterns that you are part of, and saying, ‘Well, we’ll fit them all together into that pattern.’”
“And it’s benign in their terms. If you talk to the tech Utopians from Silicon Valley, they will go, ‘This is incredibly efficient.’ And they’re right. It’s an incredibly efficient way of managing the problem that politicians can’t manage, which is our individuality and our desire to be self-expressive. Its problem is that it’s fundamentally conservative, because it’s feeding back to you more of what it knows you like…”
With online systems, the way to get people to participate is to make them outraged enough that they go online and click. This puts us all into little bubbles where we only see ideas we already agree with. And so we get politicians endlessly fanning the flames of the “culture wars” and getting people upset, but nothing substantial ever really changes. The economy just chugs along, making the rich richer and the poor poorer, with people becoming more and more frustrated because they seem to have little impact on government. This, he claims, is because they are being micromanaged, in ways they don’t actually see, by tech utopians in the name of efficiency.
Finance, as he points out, wants stability, not change, and in this goal they are assisted by the cybernetic control systems such as Google and Facebook. This can keep people forever atomized in their own groups and prevent fundamental change. Instead, people’s frustrations are channeled to cultural issues, egged on by politicians whose prime goal is not to unite people, but to keep them continually divided:
48:28: “I have a very cynical theory about Trump. As politics became…less and less substantial and less and less confidently able to change things, and power shifted away to all sorts of other things that we were participating in…Really what people like Trump are is, they’re not politicians, they’re pantomime villains. They’ve turned politics into a Vaudeville. And what they do is they come onstage and we go THIS IS OUTRAGEOUS!!! This is absolutely terrible! We type away on social media saying, ‘This is really, really, really bad.’ And…an online marketeer once told me, angry people click more. And clicks are gold dust. And really what those clicks do is feed modern power. And everything stays the same…”
This makes huge profits for media conglomerates and Silicon Valley, but everything stays the same. Until when? How will the system reset itself? When will it? Can it? Will it take another world war? Or will things just continue to get worse and worse forever for the majority? Is there any alternative? That, it seems, is a question no one can answer.
…Calvinism is “perhaps the first systematic body of religious teaching which can be said to recognize and applaud the economic virtues.” No longer was “the world of economic motives alien to the life of the spirit.” Here is Zwingli, quoted by Wiskemann, quoted by Richard Tawney: “Labor is a thing so good and godlike…that makes the body hale and strong and cures the sickness produced by idleness…In the things of this life, the laborer is most like to God.”
Adam Smith: Supermoney, pp.137-138
Last time we took a historical survey of how large-scale civilizations were made possible by human slaves, and all of the major forms that it took. We also looked at some of the common misconceptions about how slavery worked in ancient societies.
Hagens makes a familiar point: much of the “work” performed by modern society today is no longer performed by flesh-and-blood human and animal slaves but by devices powered by fossil fuels, which he calls “energy slaves.” Some of this work is performed via heat engines like internal combustion engines, electric dynamos, boilers, and so forth, while other tasks are performed through electricity: electric motors, transistors, heat pumps, cybernetic devices and so on. Recall that the ancient world had none of these:
…every American has over 500 invisible energy slaves working 24/7 for them. That is, the labor equivalent of 500 human workers, 24/7, every day of the year, mostly derived from burning fossil carbon and hydrocarbons…
We use the “slave” metaphor because it’s really a very good one, despite its pejorative label. Energy slaves do exactly the sort of things that human slaves and domestic animals previously did: things that fulfilled their masters’ needs and whims. And they do them faster. And cheaper. Indeed, it probably wasn’t a big coincidence that the world (and the USA) got around to freeing most of its human slaves only once industrialization started offering cheaper fossil-slave replacements.
The things we value are created with a combination of human and energy-slave work combined with natural capital (minerals and ores, soils and forests, etc.). There are huge amounts of embedded energy in the creation and operation of something like an iPad and the infrastructure which makes it work…To an ever-increasing degree over the last two centuries, wealth has been created more by fossil slaves than by human labor, significantly more – and it’s at its all-time peak about now…
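The “energy slave” count is back-of-envelope arithmetic and easy to reproduce. Every input below is an assumed round number (not Hagens’s own figures), and the result swings widely with the assumed human work output, so treat it as order-of-magnitude only:

```python
# Rough energy-slave arithmetic; all inputs are assumed round numbers.
primary_energy_kwh_per_year = 80_000  # US per-capita primary energy, approx.
human_watts = 60                      # sustained manual labor output, roughly
hours_per_day = 8                     # one worker's working day

# Useful work one laborer delivers in a year, in kWh:
human_kwh_per_year = human_watts / 1000 * hours_per_day * 365

slaves = primary_energy_kwh_per_year / human_kwh_per_year
print(round(slaves))  # a few hundred, the same order as Hagens's ~500
```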
In fact, we have so much energy, we actually make things expressly designed to be used once and thrown away! Or to fall apart quickly–so-called “planned obsolescence.” People who buy used goods often notice that older products tend to last longer than new ones, and often perform better and more reliably. Recently Apple admitted that they intentionally slow down older devices in order to get people to upgrade.
We increasingly buy disposable everything – used once and tossed away. Most everything is short-life these days; when your authors were young if you bought a fan, you expected it to last 20+ years. Now if it lasts 2-3 before you toss it, that’s about par for the course. Planned obsolescence exists because it’s “good for GDP.” A new dishwasher now lasts 6-8 years when it used to last 12-16, because they now have integrated cheaper electronics that fail.
Our GDP has become tethered to rapid product-replacement cycles keyed to our short attention spans and our enjoyment at buying new things. This creates “jobs” for car salesmen, advertising executives, etc., but has tilted the scales in favor of “useless GDP” rather than real societal utility. We know how to make things with high quality that last, but due to time bias and the financialization of the human experience, such an objective is relatively unimportant in our current culture. Many people get a new phone every 18 months with their cell plan, and perfectly functional ones wind up in the landfills.
After making a good case that our prosperity is actually the result of a massive surplus of energy channeled into heat engines of various types, Hagens and his co-authors consider the concept of “work.” Why, they ask, if so much of the work in our society is performed by energy slaves, do we place such a high value on “work”?
And place a high value on it, we do. In fact, we are well on our way (if not there already) to a society of “total work” where work encompasses every aspect of our lives and determines our entire value as a human being. Silicon Valley enthusiasts use polyphasic sleeping to reduce their “downtime” (a computer term) to only three hours a night. They scarf down meal replacement shakes and powders to avoid eating so they can spend more time at the office. Family time is seen as “unproductive,” and students labor away at several hours of homework a night. Entry to many professions has less to do with necessary training time than with hazing rituals (e.g. law, medicine). Amazon employees openly weep at their desks and answer emails at three in the morning. We skip vacations for fear of being cast aside, or inundated upon our return. We cower at the tyranny of the punch clock and time sheet. The most admired person in our society is the business executive who sleeps only a few hours a night and arrives at the office by 4 AM, or the Wall Street trader who works past midnight.
For upper-middle class men, notes sociologist Michèle Lamont, ambition and a strong work ethic are “doubly sacred…as signals of both moral and socioeconomic purity.” Elite men’s jobs revolve around the work devotion schema, which communicates that high-level professionals should “demonstrate commitment by making work the central focus of their lives” and “manifest singular ‘devotion to work,’ unencumbered with family responsibilities,” to quote sociologist Mary Blair-Loy. This ideal has roots in the 17th century Protestant work ethic, in which work was viewed as a “calling” to serve God and society. The religious connection has vanished…or has it?
Blair-Loy draws parallels between the words bankers used to describe their work — “complete euphoria” or “being totally consumed” — and Emile Durkheim’s classic account of a religious ceremony among Australian natives. “I worshipped my mentor,” said one woman. Work becomes a totalizing experience. “Holidays are a nuisance because you have to stop working,” said one banker interviewed by Blair-Loy. “I remember being really annoyed when it was Thanksgiving. Damn, why did I have to stop working to go eat turkey? I missed my favorite uncle’s funeral, because I had a deposition scheduled that was too important.”
Work devotion marries moral purity with elite status. Way back when I was a visiting professor at Harvard Law School, I used to call it the cult of busy smartness. How do the elite signal to each other how important they are? “I am slammed” is a socially acceptable way of saying “I am important.” Fifty years ago, Americans signaled class by displaying their leisure: think banker’s hours (9 to 3). Today, the elite — journalist Chrystia Freeland calls them “the working rich” — display their extreme schedules.
Every moment of our waking lives becomes “work”, from the creation of art, to eating (“still working on that???”) to sex. Everything we do must contribute to the totalitarian productivist ethos of society. Even social maladies such as obesity and mental illness are never dismissed as intrinsically bad, but rather only undesirable to the extent that they “decrease productivity.” We have been reduced to productivist meat-machines, where anyone who does not continually contribute to the maximization of GDP must be ruthlessly cast aside as a mere speed-bump on the highway to the Singularity and Martian colonies.
…how, in this world of total work, would people think and sound and act? Everywhere they looked, they would see the pre-employed, employed, post-employed, underemployed and unemployed, and there would be no one uncounted in this census.
Everywhere they would laud and love work, wishing each other the very best for a productive day, opening their eyes to tasks and closing them only to sleep. Everywhere an ethos of hard work would be championed as the means by which success is to be achieved, laziness being deemed the gravest sin…In this world, eating, excreting, resting, having sex, exercising, meditating and commuting – closely monitored and ever-optimised – would all be conducive to good health, which would, in turn, be put in the service of being more and more productive…
Off in corners, rumours would occasionally circulate about death or suicide from overwork, but such faintly sweet susurrus would rightly be regarded as no more than local manifestations of the spirit of total work, for some even as a praiseworthy way of taking work to its logical limit in ultimate sacrifice. In all corners of the world, therefore, people would act in order to complete total work’s deepest longing: to see itself fully manifest.
This world, it turns out, is not a work of science fiction; it is unmistakably close to our own… We are on the verge of total work’s realisation. Each day I speak with people for whom work has come to control their lives, making their world into a task, their thoughts an unspoken burden…
Thus, despite all our fossil energy slaves, despite all our labor-saving devices and cybernetic achievements and artificial intelligence and self-driving cars and robots and fully-automated lights-out factories churning out more widgets than can ever be consumed, it seems like we are more “work-obsessed” than ever before in human history! People in past societies worked far less than we do.
And yet it’s difficult to see what much of the added “work” has really contributed to society:
In 1930, the British economist John Maynard Keynes predicted that, by the end of the century, the average workweek would be about 15 hours. Automation had already begun to replace many jobs by the early 20th century, and Keynes predicted that the trend would accelerate to the point where all that people need for a satisfying life could be produced with a minimum of human labor, whether physical or mental.
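The arithmetic behind Keynes’s guess is simple compounding: if output per hour grows a couple of percent a year for seventy years, the same material standard takes roughly a quarter of the hours. The growth rate and 1930 workweek below are stylized assumptions, not Keynes’s own figures:

```python
# Stylized version of Keynes's compounding argument (assumed numbers).
growth = 0.02                       # ~2% annual productivity growth, assumed
years = 2000 - 1930                 # from his essay to "the end of the century"
multiplier = (1 + growth) ** years  # output per hour multiplies roughly 4x

hours_1930 = 48                     # a typical full-time week then, roughly
hours_needed = hours_1930 / multiplier  # hours for the same output per person
print(round(multiplier, 1), round(hours_needed, 1))
```

On these assumptions the workweek shrinks to about 12 hours, in the neighborhood of Keynes’s 15.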
Keynes turned out to be right about increased automation…But he was wrong about the decline of work.
As old jobs have been replaced by machines, new jobs have cropped up. Some of these new jobs are direct results of the new technologies and can fairly be said to benefit society in ways beyond just keeping people employed. Information technology jobs are obvious examples, as are jobs catering to newfound realms of amusement, such as computer game design and production.
But we also have an ever-growing number of jobs that seem completely useless or even harmful. As examples, we have administrators and assistant administrators in ever larger numbers shuffling papers that don’t need to be shuffled, corporate lawyers and their staffs helping big companies pay less than their fair share of taxes, countless people in the financial industries doing who knows what mischief, lobbyists using every means possible to further corrupt our politicians, and advertising executives and sales personnel pushing stuff that nobody needs or really wants.
Anthropologist and activist David Graeber contends that if we consider the economy-wide job profiles we had in the 1930’s when Keynes wrote his treatise, then we truly have eliminated most of the jobs! That is, we have indeed eliminated most of the human labor from large swaths of the economy thanks to our energy slaves, along with dramatic gains in efficiency.
Conventional economists argue that the economic growth engendered by these changes has created enough new positions to absorb all the labor displaced from the automated and eliminated sectors of the economy, such as manufacturing and agriculture. Furthermore, they argue that the need for labor is essentially unlimited (the belief that there is only a fixed amount of work to go around being the “Lump of Labour” fallacy). Graeber, however, argues that, even in theoretically “efficient” capitalist economies, a good portion of the displaced labor has been absorbed into unnecessary, or even socially harmful, make-work tasks, what he terms “Bullshit Jobs.”
Graeber lists five categories of Bullshit Jobs:
1. Flunkies – People who are there just to make someone else look good.
2. Goons – People whose jobs only exist because their competitors have them as well, such as corporate lawyers, lobbyists, telemarketers, etc. in a sort of zero-sum arms race.
3. Duct Tapers – People paid to continually apply patches to a broken system without fixing the underlying problems, which are clearly identifiable. See, for example, the entire American health care system.
4. Box Tickers – People who are there to permit an organization to say it is complying with various rules and regulations that it is not actually complying with.
5. Taskmasters – People who are there to supervise people who don’t need supervision, and to make up new bullshit jobs.
An increasing number of people in capitalist societies are also employed in “guard labor,” that is, working to keep other people in line—police officers, FBI agents, prison guards, security guards, detectives, investigators, and countless other assorted “criminal justice” occupations. Keeping people in line and imprisoning them has been a major source of new jobs. And many new jobs, such as counselors and social workers, have been created to cope with the corrosive effects of all this on the fabric of modern society. We even have people whose full-time job it is to get other people into full-time jobs!
Graeber notes that, had we kept the same job profiles as we had in the 1930’s, we truly could have eliminated most of the jobs! Instead of doing that, though, we have instead created millions of low-productivity make-work tasks like those he cites above, most of which revolve around useless paper-pushing and professional lunch-eating:
“They could have done that if we’d kept up the same type of job profiles…you look at what jobs existed in the 1930’s. There were a lot of people working in industry, there were a lot of people working in agriculture, there were a lot of domestic servants—all that’s gone. A lot of the domestic servants have been replaced by service jobs.”
“There are a lot less people employed in industry, even if you count places like China where the factories have gone. People think it’s gone to the service sector. But actually, it’s not so much service. What it’s gone to is the administrative/clerical and supervisory sector. If you count service and that together, it’s gone from a quarter of all jobs to seventy-five percent today. So, you have all these people administering other people. And they’re not really doing anything—they’re just pretending.”
“It seems to come from the idea that work is a value in itself…”
Graeber also notes that there seems to be a notion that if you’re getting something meaningful out of what you do for a living (for example, making art or helping others) then you shouldn’t get paid at all, or at least you certainly shouldn’t get paid very much. That is, the knowledge that you’re actually doing something of value has come to be seen as subtracting from the value of the job rather than adding to it! There’s resentment on an unconscious, or sometimes even conscious, level against those who actually do real work, he contends. He cites the resentment against teachers and auto workers receiving high salaries and good benefits, despite bankers, corporate lawyers and middle-managers earning much, much more. The reason, he suspects, is because the main tasks of the latter cohort—filling out useless paperwork and attending boring meetings—are so soul-crushingly pointless and awful that we convince ourselves that they somehow “deserve” to be paid more money. Notice how business executives, Silicon Valley programmers, and Wall Street bankers constantly tout their endless work hours and personal sacrifices as the justification for their outsized paychecks, perks, and golden parachutes, without referring to what, if anything, all their excess work actually accomplishes for the benefit of anyone but themselves.
Gray notes that the problem is really not a technological one, but an economic one:
The real problem, of course, is an economic one. We’ve figured out how to reduce the amount of work required to produce everything we need and realistically want, but we haven’t figured out how to distribute those resources except through wages earned from the 40-hour (or more) workweek. In fact, technology has had the effect of concentrating more and more of the wealth in the hands of an ever-smaller percentage of the population, which compounds the distribution problem. Moreover, as a legacy of the industrial revolution, we have a cultural ethos that says people must work for what they get, and so we shun any serious plans for sharing wealth through means other than exchanges for work.
And that last point is the core of the most interesting part of Hagens’ argument. He has already established that “work” is primarily performed by energy slaves in one form or another in modern Industrial societies, whether mechanical work, or, increasingly, routine intellectual (i.e. non-creative) work. Most of our “jobs” have been purposely routinized and made fungible by design through “deskilling.” This was done long ago during the Industrial Revolution to ensure that labor was easily replaceable, and hence would be at the mercy of capitalist employers (i.e., the “job creators”). These days “digital deskilling” is advancing rapidly thanks to complex algorithms.
Hagens et al. contend that work isn’t really about accomplishing anything intrinsically useful at all. Rather, it is all about the socially accepted amount of “suffering” that we must go through to “earn” our paycheck. There is nothing inherently good about jobs or work per se. They point out that most animals in nature do not seek out extra work, and see it as something to be avoided if possible:
…if you kick open an anthill or a beehive, the insects will not be grateful for the sudden boost in job creation, and they will effectively utilize the cross-species language of biting and stinging to inform you of this opinion. From this we may infer that insects don’t understand economics…
Many hunter-gatherer societies don’t even have a concept of work:
Some anthropologists have reported that the people they studied didn’t even have a word for work; or, if they had one, it referred to what farmers, or miners, or other non-hunter-gatherers with whom they had contact did. The anthropologist Marshall Sahlins famously referred to hunter-gatherers as comprising the original affluent society—affluent not because they had so much, but because their needs were small and they could satisfy those needs with relatively little effort, so they had lots of time to play.
Hagens argues that the 40-hour work-week job is simply the rationing mechanism we’ve ended up with which allows people to get access to the collectively-produced wealth of society, including the output of our ubiquitous energy slaves. As they put it:
… there are a lot of jobs in the USA, which keep us very busy not making much of anything of long term value.
We do advertising, hairstyling, consulting, writing, and a lot of supervising of the things our fossil slaves do. We don’t care all that much what we’re doing as long as we feel we’re getting paid at least as well for the same task as the other…people around us…
These days in this culture, a “good job” is defined by how much it pays, not by what it accomplishes. Many people would consider it an optimum situation, a great job, to sit in a room for 40 hours per week and make $100,000 per year, just pulling a lever the way a capuchin does for a cucumber slice…
The reference to cucumber slices comes from a famous experiment where researchers had Capuchin monkeys complete a nonsense task in exchange for a food reward. Some monkeys got a cucumber slice, while others got a grape for doing the exact same task:
If you give capuchin monkeys the “job” of doing a nonsense task in exchange for a reward, they will happily do it all day long as long as they keep getting a reward – cucumber slices. But if a capuchin sees the monkey in the next cage get a (better tasting so higher value) grape while it still gets a cucumber slice, it’ll go ape, throwing the cucumber slice in the face of the experimenter in a rage. It gets the same cucumber slice it has been happy to work for before, but it no longer wants it, because it no longer feels fair in comparison to its cage mate’s effort and reward. Instead, it wants the experimenter and the other monkey to be punished for this inequity.
We’ll…refer to the term “capuchin fairness” because a similar mechanism turns out to be behind a great deal of human behavior. We’re outraged at the notion of somebody getting more reward than we do for doing the same thing. Indeed, many large-scale human institutions now stress perceived fairness of process over quality of end results.
A similar mechanism exists among ranked primates like chimpanzees:
…On the flip side, when two unrelated chimps put side by side were presented with a tasty grape and a less tasty carrot, the chimp with the grape sometimes threw it away. “I would say that the most likely cause was either fear of retribution or just general discomfort about being around an individual getting less than you,” says Brosnan. Differences in the social hierarchy also played a role, she says. Dominant chimps were angrier when they were on the receiving end of a lesser reward than those lower in the pecking order.
And in human children too young to have been socialized in the concept of fairness:
A few years ago, a team of psychologists set out to study how kids…would respond to unfairness. They recruited a bunch of preschoolers and grouped them in pairs. The children were offered some blocks to play with and then, after a while, were asked to put them away. As a reward for tidying up, the kids were given stickers.
No matter how much each child had contributed to the cleanup effort, one received four stickers and the other two. According to the Centers for Disease Control and Prevention, children shouldn’t be expected to grasp the idea of counting before the age of four. But even three-year-olds seemed to understand when they’d been screwed. Most of the two-sticker recipients looked enviously at the holdings of their partners. Some said they wanted more. A number of the four-sticker recipients also seemed dismayed by the distribution, or perhaps by their partners’ protests, and handed over some of their winnings…The results, they concluded, show that “the emotional response to unfairness emerges very early.”
This, Hagens contends, is the reason we are so all-consumed with “working hard.” It’s got nothing to do with your real social contribution. Instead, he argues that this is rooted in human social instincts, which are biologically-rooted and which we share with other large-brained social primates such as chimps, bonobos and monkeys.
In other words, it all has to do with our innate primate sense of fairness. Especially in the United States, we are obsessed with punishing “cheaters” and “scroungers.” We constantly berate the “lazy,” as if all the people living in cars and shelters just collectively decided to stop working one day. Collectively, we act like crabs in a bucket: everyone must suffer equally.
This is backed up by a recent book by a University of Wisconsin sociology professor who found that right-wing “blue collar” voters in the Rust Belt are motivated almost entirely by grievance and resentment toward “elites”:
What I heard from my conversations is that, in these three elements of resentment — I’m not getting my fair share of power, stuff or respect — there’s race and economics intertwined in each of those ideas.
When people are talking about those people in the city getting an “unfair share,” there’s certainly a racial component to that. But they’re also talking about people like me [a white, female professor]. They’re asking questions like, how often do I teach, what am I doing driving around the state of Wisconsin when I’m supposed to be working full time in Madison, like, what kind of a job is that, right?
It’s not just resentment toward people of color. It’s resentment toward elites, city people.
And maybe the best way to explain how these things are intertwined is through noticing how much conceptions of hard work and deservingness matter for the way these resentments matter to politics. We know that when people think about their support for policies, a lot of the time what they’re doing is thinking about whether the recipients of these policies are deserving. Those calculations are often intertwined with notions of hard work, because in the American political culture, we tend to equate hard work with deservingness.
Part of it is that the Republican Party over the years has honed its arguments to tap into this resentment. They’re saying: “You’re right, you’re not getting your fair share, and the problem is that it’s all going to the government. So let’s roll government back.” So there’s a little bit of an elite-driven effect here, where people are told: “You are right to be upset. You are right to notice this injustice.”
And a lot of racial stereotypes carry this notion of laziness, so when people are making these judgments about who’s working hard, oftentimes people of color don’t fare well in those judgments. But it’s not just people of color. People are like: Are you sitting behind a desk all day? Well that’s not hard work. Hard work is someone like me — I’m a logger, I get up at 4:30 and break my back. For my entire life that’s what I’m doing. I’m wearing my body out in the process of earning a living.
In my mind, through resentment and these notions of deservingness, that’s where you can see how economic anxiety and racial anxiety are intertwined. Part of where that comes from is just the overarching story that we tell ourselves in the U.S. One of the key stories in our political culture has been the American Dream — the sense that if you work hard, you will get ahead.
Well, holy cow, the people I encountered seem to me to be working extremely hard. I’m with them when they’re getting their coffee before they start their workday at 5:30 a.m. I can see the fatigue in their eyes. And I think the notion that they are not getting what they deserve, it comes from them feeling like they’re struggling. They feel like they’re doing what they were told they needed to do to get ahead. And somehow it’s not enough. Oftentimes in some of these smaller communities, people are in the occupations their parents were in, they’re farmers and loggers. They say, it used to be the case that my dad could do this job and retire at a relatively decent age, and make a decent wage. We had a pretty good quality of life, the community was thriving. Now I’m doing what he did, but my life is really much more difficult. I’m doing what I was told I should do in order to be a good American and get ahead, but I’m not getting what I was told I would get.
Trump voters were collectively throwing the cucumber slice back at the researcher.
Hagens contends that the 40-hour week is not some necessary amount of work time for getting the tasks-at-hand done and keeping society up and running. Instead, it is a socially-acceptable threshold of discomfort that people are expected to endure in order to justify their right to the output of our energy slaves (our grapes and cucumber slices). Its source is a historical contingency that has nothing to do with productivity or what we actually accomplish; it really operates as more of an adult babysitting operation:
And that’s where the perceived equality is: the equality of inconvenience. The 40-hour work week is a social threshold of inconvenience endured, which is now what we keep primary social track of rather than the productive output of a person’s activity…Because socially, everyone who isn’t a criminal is supposed to have a job and endure roughly equivalent inconvenience. Any segment of society which went to a 15-hour work week would be treated as mooching freeloaders, and be pelted by cucumber slices and worse.
In a society in which we’re all basically idle royalty being catered to by fossil slaves, why do we place such a value on “jobs”? Well, partly because it’s how the allocation mechanism evolved, but there also exists considerable resentment against those who don’t work. Think of the vitriol with which people talk about “freeloaders” on society who don’t work a 40-hour week and who take food stamps. The fact is, that most of us are freeloaders when it comes down to it, but if we endure 40 hours of inconvenience per week, we meet the social criteria of having earned our banana pellets even if what we’re doing is stupid and useless, and realized to be stupid and useless. Indeed, a job that’s stupid and useless but pays a lot is highly prized.
So “jobs” per se aren’t intrinsically useful at all… They’re mostly a co-opted, socially-evolved mechanism for wealth distribution and are very little about societal wealth creation. And they function to keep us busy and distract us from huge wealth disparity. We’re too busy making sure our co-workers don’t get grapes to do something as radical as call out and lynch the bankers. Keeping a population distracted may well be necessary to hold a modern nation together.
Finally, in a strange way, it turns out that the old Labor Theory of Value might be correct after all.
The Labor Theory of Value is one of the most controversial ideas in economics. It was an attempt by economists to identify such a thing as “value” and then determine how to quantify it. What makes some things more valuable than others? Many early economists (including both Adam Smith and Karl Marx) thought that the amount of labor that went into producing something determined its value (and note, not its price).
While most economists have dismissed this notion, looked at another way it is correct: we can determine the value of something by how long we are willing to work to get it. That is, we will work longer for a grape than for a cucumber slice, and that willingness is the measure of its value, as Chris Dillow argues:
…I think of major expenses in terms of labour-time because they mean I have to work longer. A trip to the vet is an extra fortnight of work; a good guitar an extra month, a car an extra year, and so on.
When I consider my spending, I ask: what must I give up in order to get that? And the answer is my time and freedom. My labour-time is the measure of value.
This is a reasonable basis for the claim that workers are exploited. To buy a bundle of goods and services, we must work a number of hours a week. But taking all workers together, the hours we work are greater than the hours needed to produce those bundles because we must also work to provide a profit for the capitalist….For Marx, value was socially-necessary labour time…From this perspective, exploitation and alienation are linked. Workers are exploited because they must work longer than necessary to get their consumption bundle. And they are alienated because this work is unsatisfying and a source of unfreedom.
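Dillow’s labour-time measure reduces to a one-line conversion: divide a price by your hourly wage. A minimal sketch (the wage and prices here are hypothetical illustrations, chosen so the examples line up with his “fortnight” and “month” figures, not numbers from the source):

```python
# Dillow's labour-time measure of value: a price divided by your hourly wage
# gives the hours of your life a purchase actually costs.
# NOTE: the wage and prices below are hypothetical illustrations.

def labour_hours(price: float, hourly_wage: float) -> float:
    """Hours of work needed to earn `price` at `hourly_wage`."""
    return price / hourly_wage

wage = 20.0  # assumed after-tax wage, $/hour
for item, price in [("vet visit", 1600.0), ("good guitar", 3200.0)]:
    hours = labour_hours(price, wage)
    print(f"{item}: ${price:.0f} -> {hours:.0f} hours ({hours / 40:.1f} work-weeks)")
```

At the assumed $20/hour wage, the vet visit works out to two 40-hour weeks (Dillow’s “fortnight of work”) and the guitar to four (his “month”).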
Seen from this perspective, the value of something can be determined by the amount of often socially useless labor time we must sacrifice in order to get it. And we are very sensitive to others getting value that (we think) they did not deserve. Thus we establish the 40-hour work week as the threshold to ensure fairness of distribution. But is that a good idea? Is it even relevant anymore? It turns out that there is no connection between the 40-hour work week and productivity. In fact, it might even make us less productive:
The reason we have eight-hour work days at all was because companies found that cutting employees’ hours had the reverse effect they expected: it upped their productivity. During the Industrial Revolution, 10-to-16-hour days were normal. Ford was the first company to experiment with an eight-hour day – and found its workers were more productive not only per hour, but overall. Within two years, their profit margins doubled.
If eight-hour days are better than 10-hour ones, could even shorter working hours be even better? Perhaps. For people over 40, research found that a 25-hour work week may be optimal for cognition, while when Sweden recently experimented with six-hour work days, it found that employees had better health and productivity.
This seems borne out by how people behave during the working day. One survey of almost 2,000 full-time office workers in the UK found that people were only productive for 2 hours and 53 minutes out of an eight-hour day. The rest of the time was spent checking social media, reading the news, having non-work-related chats with colleagues, eating – and even searching for new jobs.
In other words, the exact opposite of the way we’re going.
We know that most employees are disengaged at their jobs, and studies show that most of us only actually “work” for a small portion of the time we are on the clock, with the rest spent socializing, trying to look busy, or goofing off. Yet we must be physically present under some sort of supervision for 40 hours a week minimum to secure our right to our banana pellets. Does any of this make sense? Do any of us really want this? After all, books that promise a four-hour work week are best sellers.
In fact, all the evidence shows that many of us would be more productive if we worked a bit less. In addition, there would be many more jobs to go around:
Even on a global level, there is no clear correlation between a country’s productivity and average working hours. With a 38.6-hour work week, for example, the average US employee works 4.6 hours a week longer than a Norwegian. But by GDP, Norway’s workers contribute the equivalent of $78.70 per hour – compared to the US’s $69.60.
As for Italy, that home of il dolce far niente? With an average 35.5-hour work week, it produces almost 40% more per hour than Turkey, where people work an average of 47.9 hours per week. It even edges the United Kingdom, where people work 36.5 hours.
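A quick back-of-the-envelope check of the US–Norway comparison above (the hours and per-hour GDP figures are from the quoted article; the weekly-output multiplication is mine):

```python
# Weekly output per worker = hours worked per week * GDP contributed per hour.
# Hours and per-hour figures are taken from the quoted article.
countries = {
    "US": (38.6, 69.60),           # hours/week, $ of GDP per hour
    "Norway": (38.6 - 4.6, 78.70), # 4.6 fewer hours than the US
}

for name, (hours, per_hour) in countries.items():
    print(f"{name}: {hours:.1f} h/week x ${per_hour:.2f}/h = ${hours * per_hour:,.2f}/week")
```

The totals come out within about 0.4% of each other ($2,686.56 vs. $2,675.80): the Norwegian worker produces nearly the same weekly output in 4.6 fewer hours.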
So why don’t we do that? That’s a story for another time.
“None are so hopelessly enslaved as those who falsely believe they are free.”
–Johann Wolfgang von Goethe
“Wages is a cunning device of the devil, for the benefit of tender consciences, who would retain all of the advantages of the slave system, without the expense, trouble and odium of being slave-holders.”
The ancient world ran on slave-power. Lacking heat engines and cybernetic devices, the only way to accomplish the many things civilization depended on–agriculture, construction, crafts, child-rearing, military operations, mining, transport, shipping, and so forth–was to use human and animal muscle power. Human labor has five core competencies, according to economist Brad DeLong:
(1) Moving things with large muscles.
(2) Finely manipulating things with small muscles.
(3) Using our hands, mouths, brains, eyes, and ears to ensure that ongoing processes and procedures happen the way that they are supposed to.
(4) Engaging in social reciprocity and negotiation to keep us all pulling in the same direction.
(5) Thinking up new things – activities that produce outcomes that are necessary, convenient, or luxurious – for us to do.
Surveying the ancient world, we see that slaves were the primary method for accomplishing the first two tasks, while the latter three were monopolized by the “educated” elite classes of the ancient world, who were always–and had to be–a minority (including in our world today, which is why “more education” cannot solve inequality). Simply put, no slavery–no civilization, and no state, as James C. Scott writes:
Slavery was not invented by the state…[but]…as with sedentism and the domestication of grain…the early state elaborated and scaled up the institution of slavery as an essential means to maximize its productive population and thus the surplus it could appropriate.
It would be almost impossible to exaggerate the centrality of bondage, in one form or another, in the development of the state until very recently…as late as 1800 roughly three-quarters of the world’s population could be said to be living in bondage…Provided that we keep in mind the various forms bondage can take over time, one is tempted to assert: “No slavery, no state.” Against the Grain (ATG), pp. 155-156
Hence the ancient world had to come up with all sorts of philosophical justifications for slavery. Initially, however, race was not one of them. Anthropologist David Graeber points out that underlying the various justifications for slavery was the idea that the slave would otherwise be dead. Because their lives were spared, they were, in essence, the living dead, kind of like zombies! Because they were socially ‘dead’ as people, they had no rights and could be abused, bought and sold:
Slavery is the ultimate form of being ripped from one’s context, and thus from all the social relationships that make one a human being. Another way to put this is that the slave is, in a very real sense, dead. This was the conclusion of the first scholar to carry out a broad historical survey of the institution, an Egyptian sociologist named Ali ‘Abd al-Wahid Wafi, in Paris in 1931. Everywhere, he observes, from the ancient world to then-present-day South America, one finds the same list of possible ways whereby a free person might be reduced to slavery:
1) By the law of force
a. By surrender or capture in war
b. By being the victim of raiding or kidnapping
2) As legal punishment for crimes (including debt)
3) Through paternal authority (a father’s sale of his children)
4) Through the voluntary sale of one’s self
The book’s most enduring contribution, though, lay simply in asking: What do all these circumstances have in common? Al-Wahid’s answer is striking in its simplicity: one becomes a slave in situations where one would otherwise have died. This is obvious in the case of war: in the ancient world, the victor was assumed to have total power over the vanquished, including their women and children; all of them could be simply massacred. Similarly, he argued, criminals were condemned to slavery only for capital crimes, and those who sold themselves, or their children, normally faced starvation. Debt: The First 5,000 Years (DTF5kY), pp. 168-169
Many of the authors and scholars in Michael Hudson’s ISLET series about the ancient economy argue that slavery played only a subsidiary role in the establishment of early civilizations, and that most of the labor was given voluntarily, as a sort of social obligation, often involving work feasts. Author James C. Scott disagrees. He sees the existence of compulsory and unfree labor, in whatever form it took, as absolutely essential to the formation of the first states. He writes:
The general consensus has been that while slavery was undoubtedly present, it was a relatively minor component of the overall [Mesopotamian] economy…I would dispute this consensus.
Slavery, while hardly as massively central as in classical Athens, Sparta, or Rome, was crucial for three reasons: it provided the labor for the most important export trade good, textiles; it supplied a disposable proletariat for the most onerous work (for example, canal digging, wall building); and it was both a token of and a reward for elite status…When other forms of unfree labor, such as debt bondage, forced resettlement, and corvee labor, are taken into account, the importance of coerced labor for the maintenance and expansion of the grain-labor module at the core of the state is hard to deny.
Part of the controversy over the centrality of slavery in ancient Sumer is a matter of terminology. Opinions differ in part because there are so many terms that could mean “slave” but could also mean “servant,” “subordinate,” “underling,” or “bondsman.” Nevertheless, scattered instances of purchase and sale of people–chattel slavery–are well attested, though we do not know how common they were. ATG pp. 157-158
Three obvious reasons why Third Millennium Mesopotamia might seem less of a slave-holding society than Athens or Rome are the smaller populations of early polities, the comparatively scarce documentation they left behind, and their relatively small geographical reach. Athens and Rome were formidable naval powers that imported slaves from throughout the known world, drawing virtually all their slave populations far and wide from non-Greek and non-Latin speaking societies. This social and cultural fact provided much of the foundation for the standard association of state peoples with civilization on the one hand and nonstate peoples with barbarism on the other…The greater the cultural and linguistic differences between slaves and their masters, the easier it is to draw and enforce the social and juridical separation that makes for the sharp demarcation typical of slave societies…
Mesopotamian city-states, by contrast, took their captives from much closer to home. For that reason, the captives were more likely to have been more culturally aligned with their captors. On this assumption, they might have, if allowed, more quickly assimilated to the culture and mores of their masters and mistresses. In the case of young women and children, often the most prized captives, intermarriage or concubinage may well have served to obscure these social origins within a couple of generations…ATG pp. 174-175
In other words, Greece and Rome captured “barbarians” from outside society and incorporated them as a lower-tier slave strata to do all the stoop labor and scut work. The very word barbarian referred to someone who didn’t speak the Greek language.
In Mesopotamia, by contrast, the warfare was often between rival city-states—people who would have spoken similar languages and shared similar customs and beliefs. Thus, they would have appeared less like a foreign entity in the records and more like just a lower tier of society–their status obscured by cultural similarities and ambiguities in the terminology. Furthermore, this process would have been ongoing, with new layers of war captives being continually added to form the bottom strata of society, eventually “blending in” and “moving up” over time as new immigrants–er, slaves–took their place:
The continuous absorption of slaves at the bottom of the social order can also be seen to play a major role in social stratification–a hallmark of the early state. As earlier captives and their progeny were incorporated into the society, the lower ranks were constantly replenished by new captives, further solidifying the line between “free” subjects and those in bondage, despite its permeability over time. p. 169
One must surely wonder whether the Mesopotamian city-states met a substantial portion of their insatiable labor needs by absorbing captives or refugees from culturally similar populations. In this case such captives or refugees would probably appear not as slaves but as a special category of “subject” and perhaps would be, in time, wholly assimilated. ATG p. 175
Integrating war captives into society and isolating them from their original ethnic group, rather than making them a permanent class apart, would have forestalled rebellion. Atomized people, without social ties, are much easier to control and cannot mount any sort of collective resistance to the prevailing social order (a point not lost on today’s ruling elites):
Insofar as the captives are seized from scattered locations and backgrounds and are separated from their families, as was usually the case, they are socially demobilized or atomized and therefore easier to control and absorb. If the war captives came from societies that were perceived in most respects as alien to the captors, they were not seen as entitled to the same social consideration. Having, unlike local subjects, few if any local social ties, they were scarcely able to muster any collective opposition…ATG, p. 167…
The principle of socially detached servants–Janissaries, eunuchs, court Jews–has long been seen as a technique for rulers to surround themselves with skilled but politically neutralized staff. At a certain point, however, if the slave population is large, is concentrated, and has ethnic ties, this desired atomization no longer holds. The many slave rebellions in Greece and Rome are symptomatic, although Mesopotamia and Egypt (at least until the New Kingdom) appeared not to have slavery on this scale. ATG pp. 167-168
James Scott considers slavery in ancient Sumeria, Babylonia, Assyria, Egypt and China as a form of manpower recruitment on the part of states–the original “human resources strategy.” Military incursions were often less about seizing territory than about seizing captives–what Max Weber called “booty capitalism.” Often, the people seized were those with rare and highly specialized skills that the attacking state did not possess:
Slave taking…represented a kind of raiding and looting of manpower and skills that the slaving state did not have to develop on its own…Women and children were particularly prized as slaves. Women were often taken into households as wives, concubines, or servants, and children were likely to be quickly assimilated, though at an inferior status…Women captives were at least as important for their reproductive services as for their labor. Given the problems of infant and maternal mortality in the early state and the need of both the patriarchal family and the state for agrarian labor, women captives were a demographic dividend. Their reproduction may have played a major role in alleviating the otherwise unhealthy effects of concentration and the domus. ATG, pp. 168-169.
One of the most common forms of slavery was domestic work. A major hallmark of elite status in the ancient world was how many lives you had control over. Large households were typically staffed with huge numbers of domestic servants, who were, in essence, slaves, even if they were not explicitly designated as such in historical records. These domestic servants cooked and cleaned, took care of the children, maintained gardens, bore their masters about in litters, and performed numerous other routine chores–the same chores that elites prefer to assign to trafficked and immigrant labor even today:
One imagines as well, that most of the slaves not put to hard labor were monopolized by the political elites of early states. If the elite households of Greece or Rome are any indication, a large part of their distinction was the impressive array of servants, cooks, artisans, dancers, musicians, and courtesans on display. It would be difficult to imagine the first elaborate social stratification in the earliest states without war-captive slaves at the bottom and elite embellishment, dependent on those slaves, at the top. ATG, p. 169
David Graeber also points out this fact:
…this ability to strip others of their dignity becomes, for the master, the foundation of his honor…there have been places–the Islamic world affords numerous examples–where slaves are not even put to work for profit; instead, rich men make a point of surrounding themselves with battalions of slave retainers simply for reasons of status, as tokens of their magnificence and nothing else. DTF5kY, p. 170
And many forms of slavery were less obvious in the historical record. Scott includes things like forced resettlement, migrant workers, and serfdom as forms of compelled labor which also made civilization possible, but would be less likely to be noticed by archaeologists and economic historians:
In Athens in the fifth century BCE, for example, there was a substantial class, more than 10 percent of the population, of metics, usually translated as “resident aliens.” They were free to live and trade in Athens and had the obligations of citizenship (taxes and conscription, for example) without its privileges. Among them were a substantial number of ex-slaves. ATG p. 175
Finally, there are two forms of communal bondage that were widely practiced in many early states and that bear more than a family resemblance to slavery but are unlikely to appear in the textual record as what we think of as slavery. The first of these might be called mass deportation coupled with communal forced settlement. Our best descriptions of the practice come from the neo-Assyrian Empire (911–609 BCE), where it was…systematically applied to conquered areas. The entire population and livestock of the conquered land were marched from the territory at the periphery of the kingdom to a location closer to the core, where they were forcibly resettled, the people usually as cultivators…In some cases, it seems that the captives were resettled on land abandoned earlier by other subjects, implying that forced mass resettlement may have been part of an effort to compensate for mass exodus or epidemics. ATG pp. 177-178
A final genre of bondage that is historically common and also might not appear in the historical record as slavery is the model of the Spartan helot. The helots were agricultural communities in Laconia and Messenia dominated by Sparta…They remained in situ as whole communities, were annually humiliated in Spartan rituals, and like the subjects of all archaic agrarian states were required to deliver grain, oil, and wine to their masters. Aside from the fact that they had not been forcibly resettled as war deportees, they were in all other respects the enserfed agricultural servants of a thoroughly militarized society.
Scott points out that the conquering and subjugation of an existing agricultural society by an incoming martial elite–as seems to have been the case in Sparta–may not technically look like slavery but be similar in most respects. The elites compel the producers to toil on behalf of their overlords. It is, in essence, serfdom. People are tied to a plot of land and obliged to provide food and goods to a militarized aristocracy, which is mass slavery in all but name.
And metics appear to be quite similar to today’s globalized, peripatetic migrant workers, such as the thousands of “second-class citizens” who are the lifeblood of places like Dubai, or who pick the fruits and berries that end up in the supermarkets of Europe and North America. Interestingly, many immigrant workers in France today have embraced the term “metic” (métèque) in reference to themselves.
Slavery was also an early way to punish criminals and enforce justice. The ancient world did not have the resources to feed and shelter large numbers of unproductive people in cages (jails, gaols) as we do today. Dungeons were mainly for holding people who were about to stand trial. To have basic shelter and three square meals a day without having to work would have been quite a luxury in the ancient world–people would have committed crimes on purpose to get it! Fines were not effective in pre-monetized and non-market economies. That’s one reason for the gruesome corporal punishments we see doled out in the ancient world (eye-gouging, flogging, etc.). Making people into slaves took away many of their freedoms but still compelled them to work on behalf of society–a sort of “work release” program in what was, in essence, an open-air prison. Even in today’s United States, slavery is legal if you are convicted of a crime. We still talk of criminals owing a “debt to society.”
Debt slavery was also often ignored in ancient accounts of slavery. We know that debt bondage became so common and so widespread that leaders had to periodically institute debt annulments in order to keep their societies functioning at all. This could take the form of regular mandated debt jubilees, as in Mesopotamia, or emergency legislative actions, like those of Solon the reformer in Athens. As David Graeber says, all ancient populism boils down to one single idea: “cancel the debts and redistribute the land” (i.e., the means of production).
Slavery was also a major barrier to industrialization. In the new novel Kingdom of the Wicked, the author envisages an alternate Rome that has undergone an Industrial Revolution by the time of Christ. Slavery has been abolished, in a campaign led by the Stoics, whom she likens to the Quaker abolitionists of nineteenth-century Britain. This has allowed a “tinkering culture” exemplified by Archimedes and Heron to flower, developing their inventions into true labor-saving devices similar to those of early industrializing England. This is not so far-fetched: we know, for example, that the Romans employed water power on a massive scale for milling bread and manufacturing armaments–for example at Barbegal in modern-day France–and that the earliest factories of the Industrial Revolution (Arkwright’s mills) were water-powered, with fossil fuels coming only later due to wood shortages. The author writes of Roman slavery:
…While Roman-era scientists later developed designs for things like steam engines (Heron of Alexandria) and built fabulous mechanical instruments (the Antikythera machine), they did so in a society that had been flooded with vast numbers of slaves (the late Republic and early Principate), and large-scale chattel slavery and industrialization are simply incompatible.
Chattel slavery undermines incentives to develop labour-saving devices because human labour power never loses its comparative advantage. People can just go out and buy another slave to do that labour-intensive job. Among other things, the industrial revolution in Britain depended on the presence (relative to other countries) of high wages, thus making the development of labour-saving devices worthwhile. The Antikythera mechanism is a thing of wonder, but it is also little more than a clockwork executive toy; no wonder the great lawyer Cicero had one. It’s just the sort of thing I can imagine an eminent [attorney] having on his desk in chambers.
Slavery—and its near relative, serfdom—have been pervasive in even sophisticated human societies, and campaigns for abolition few and far between. We forget that our view that slavery and slavers are obnoxious is of recent vintage. In days gone by, people who made fortunes in the slave trade were held in the highest esteem and sat on our great councils of state. This truism is reflected in history: The Society for Effecting the Abolition of the Slave Trade met for the first time in 1787. It had just twelve members—nine Quakers and three Anglicans…
…we know that the Romans didn’t think some people were ‘naturally servile’, which is at the heart of Aristotle’s argument in favour of slavery. The Roman view (consistent with their militarism) was always ‘you lost, we own you’. Roman law—even in its earliest form—always held that slavery was ‘contrary to nature’. Human beings were naturally free; slavery was a legally mandated status, however cruelly imposed.
It is also important to remember that ancient slavery was never race-based. No Roman argued that slaves were lesser forms of the human. Middle- and upper-class Roman children educated at home by slaves who were manifestly cleverer than them (not to mention mum and dad) knew this, intimately.
The Roman explicitly defined the lack of freedom implied by slavery as “contrary to nature” in their legal codes!
In fact, even when labor-saving devices were invented in the ancient world, they were often intentionally ignored or neglected in order to ensure that the large amounts of human labor available to elites would still be put to use:
[W]hen Vespasian was offered a labor-saving machine for transporting heavy columns, he was said to have declined with the words: “I must always ensure that the working classes earn enough money to buy themselves food.”
Not only were race and ethnic origin not factors in Roman slavery, the ancient Romans did not regard slaves as inherently inferior in any way! In fact, they knew that slaves might even be more talented than their masters. There was no racial segregation or racial hierarchy; slavery was simply a social construct not based on any notion of superiority or racism, unlike in North America (as was its flip side, “citizenship”). This colors our view of ancient slavery. It also blinds us to the reality and essential role of slavery and bondage in human history. We are used to regarding slaves as “naturally” inferior because of the racist views used in America to justify the institution. A racial hierarchy was established in the American South after Bacon’s Rebellion to make sure that poor whites and blacks would not unite against their rulers–another example of divide-and-rule atomization.
The problem for any culture that wants to spend time on literature, art, philosophy and science, is [that] somebody’s got to do the laundry. And so what we’ve done is, we have a washing machine. If we didn’t have a washing machine, my guess is, all over California there would be a lot more jobs at the lowest end–of people doing laundry. Just as the Chinese who entered California as basically indentured railway workers began to set up what we call Chinese laundries and Chinese restaurants. These are all low-skilled, hard work.
Well, the Greeks; some of the cities–the ones that we admire, like Athens–they had slaves because that was the way you got things done. They didn’t feel that slaves were inferior people. They just happened to be people often captured in war. We forget that the word slave comes from the word Slav. The Slavs came out of Russia into Europe through the Middle Ages. All the Middle Ages were full of slaves.
The American slave experience was peculiar in that it involved people who were really not of their own culture, not of their own civilization. If you think about it, you’re in a Greek family and who’s the nursemaid for the children? Well, she has to be somebody who’s going to speak their language and is going to be giving them the cultural values.
Anyone who lost in war…they were just people who lost; when you lost, you got killed or were made a slave, and most people, given the choice, thought, “Well, I’d rather try living and see how that works out.”
David Graeber makes the same point regarding Roman slavery:
What made Roman slavery so unusual, in historical terms, was a conjuncture of two factors. One was its very arbitrariness. In dramatic contrast with, say, plantation slavery in the Americas, there was no sense that certain people were naturally inferior and therefore destined to be slaves. Instead, slavery was seen as a misfortune that could happen to anyone. As a result, there was no reason that a slave might not be in every way superior to his or her master: smarter, with a finer sense of morality, better taste, and a greater understanding of philosophy. The master might even be willing to acknowledge this. There was no reason not to, since it had no effect on the nature of the relationship, which was simply one of power. The second was the absolute nature of this power… DTF5kY, p. 202
Indeed, H.G. Wells felt that the vast importation of slaves after the Second Punic War was the final “nail in the coffin” for the Roman yeoman class. As Roman society was flooded with slaves from military expansion, the price of slaves dropped dramatically. It then became cost-effective to buy large numbers of slaves and work them to death on large plantations, meaning that ordinary family farms could not compete in what was effectively an early “free market” economy. Cheap slaves allowed unprecedented concentration of wealth in fewer and fewer hands.
In the Roman experience, this is the beginning of a 100-year-long process of Italy going from being a patchwork of smaller farms with some large estates to nothing but sprawling, commercially-oriented estates. And yes, the United States is continuing to go through a very similar process. At the founding of our republic, everybody’s a farmer, and now everything is owned by what, Monsanto?
Moving beyond just strictly agricultural companies, large American corporations are now employing more and more people. There seems to be this move away from people owning and operating their own establishments, and they’re instead being consumed by large entities. You’re talking about the Amazons of the world swallowing up so much of the market share, it just doesn’t pay to be a clerk in a bookstore or own a bookstore, you end up being a guy working in a warehouse, and it’s not as good of a job.
It doesn’t really feel like they could’ve arrested the process. Fifteen years after some land bill, you’d ask, “Who has the land? The poor?” No, they all just got bought up again. There never was a good political solution to it. The problem of these small citizen farmers was not solved until 100 years later when they simply ceased to exist.
So slavery appears not to have been “race based” in most ancient societies, which is what makes the American experience so unique. Apart from places like plantations, mines and quarries, most slaves were probably indistinguishable from people around them. They went off to work every day just like everybody else. Again, slavery was a legal distinction more than anything else.
It’s also essential to keep in mind that our vision of slavery as constant beatings and starvation has drastically colored our view. This, again, is a legacy of North American racially-based plantation slavery. In reality, slaves were an investment, and whipped and starving people hardly made the best workers. The cruelty of plantation slavery was highlighted and emphasized in written accounts, both by ex-slaves and by abolitionists, to turn people against it. In reality, it was probably not as brutal as it is often depicted. To be crystal clear, this is not a justification for slavery! But it also makes us overlook slavery in the ancient world, where it was more of a social and economic status than a racial one. In fact, most slavery looked indistinguishable from the routine of the average wage worker today!
John Moes, a historian of slavery…writes about how the slavery we are most familiar with – that of the antebellum South – is a historical aberration and probably economically inefficient. In most past forms of slavery – especially those of the ancient world – it was common for slaves to be paid wages, treated well, and often given their freedom.
He argues that this was the result of rational economic calculation. You can incentivize slaves through the carrot or the stick, and the stick isn’t very good. You can’t watch slaves all the time, and it’s really hard to tell whether a slave is slacking off or not (or even whether, given a little more whipping, he might be able to work even harder). If you want your slaves to do anything more complicated than pick cotton, you run into some serious monitoring problems – how do you profit from an enslaved philosopher? Whip him really hard until he elucidates a theory of The Good that you can sell books about?
The ancient solution to the problem…was to tell the slave to go do whatever he wanted and found most profitable, then split the profits with him. Sometimes the slave would work a job at your workshop and you would pay him wages based on how well he did. Other times the slave would go off and make his way in the world and send you some of what he earned. Still other times, you would set a price for the slave’s freedom, and the slave would go and work and eventually come up with the money and free himself.
Moes goes even further and says that these systems were so profitable that there were constant smouldering attempts to try this sort of thing in the American South. The reason they stuck with the whips-and-chains method owed less to economic considerations and more to racist government officials cracking down on lucrative but not-exactly-white-supremacy-promoting attempts to free slaves and have them go into business.
So in this case, a race to the bottom where competing plantations become crueler and crueler to their slaves in order to maximize competitiveness is halted by the physical limitation of cruelty not helping after a certain point…
Moes argues that the reason slavery declined in ancient Rome was not that slaves were treated so cruelly that they could not reproduce themselves (whips and chains), but widespread manumission. They were freed. Slaves often cut deals in which they bought their freedom by entering into business arrangements with their owners. Oftentimes, they would split the profits:
Profitable deals could be made with the slave or with the freedman, who could be and usually was obligated to render services to his former master. A freedman often continued in the same employment or else was set up in business with funds supplied by the master, or, on the land, was given part of the estate to work as a tenant. Hence the slave in fact bought his own freedom, either by being given the opportunity to accumulate savings of his own, the “peculium,” or afterward as a freedman, having received his freedom, so to speak, on credit.
This system was to the advantage of the owner because it gave the slave an incentive to work well and in general to make himself agreeable to his master. Thus, while the owner did not (immediately) appropriate the entire surplus that the slave earned over and above the cost of his maintenance, he still got great economic benefits in the long run… the most highly valued slaves were the most likely to be freed, for the full benefit of their talents could not be obtained under the whiplash but only by giving them the positive incentive of immediate or ultimate freedom.
Seen from this perspective, the difference between the plight of a slave and that of the average modern American worker becomes awfully difficult to define. Of course, if you dare broach this topic, you are immediately confronted with opprobrium–how dare you! This is a legacy of the horrors of race-based plantation slavery, of which we are constantly reminded. But, historically, slavery had nothing to do with racism or (direct) violence!
No, slaves were simply the people who had to labor above and beyond what they wished to in order to produce a surplus for someone else. They also had no control over their work circumstances. They had to do what their master told them to do, for the amount of time he told them to do it, in the place where he told them to do it, and in the way he told them to do it. And slaves kept only a portion of what they produced, with the lion’s share going to their masters. That doesn’t sound all that different from the situation of the average worker today, now does it? The ancients were aware of this. Cicero wrote:
“…vulgar are the means of livelihood of all hired workmen whom we pay for mere manual labor, not for artistic skill; for in their case the very wage they receive is a pledge of their slavery.”
Thus wage slavery is simply another type of slavery, and not as distinct from its ancient counterpart as we have been led to believe. True, we aren’t regularly starved and beaten. Yes, we can find a different patron–er–employer. But we are just a human resource. We make profits for others. We don’t have control over our workplace. When you understand that, by and large, ancient slavery had nothing to do with racial inferiority–actual or perceived–or with outright violence, and was just an economic category of individuals, you can understand why this is the case.
And consider this: how could our modern society function without the massive tier of low-paid workers? In fact, the people who get paid the least are the most essential to society’s everyday functioning, as David Graeber has pointed out. They do the non-automated agricultural work. They pick our fruits and vegetables. They cook and prepare our food. They look after our children and take care of our elderly. They teach our children. They drive our cars and trucks. They maintain our lawns and gardens. They build and maintain our infrastructure. They construct our buildings. They keep our shelves stocked with goods and deliver them to our doorstep. Not all of these are minimum wage workers, but an increasing number of them are! If they all vanished, society would grind to an immediate halt. Yet just three people “own” as much wealth as the bottom half of the American workforce!
The difference is that wage slaves are rented instead of owned. We are continually compelled by the invisible whip and the lash of utter poverty and destitution.
Today’s college system is virtually indistinguishable from indentured servitude. In fact, I would argue that it’s worse! With indentured servitude, it’s true that you could not leave your employer and “shop around” for another one. But, if you went into debt, you were guaranteed gainful employment for the duration of the loan–something today’s college students would kill for! Instead, they are expected to go deeply into debt for the mere chance of finding employment in their chosen field, which, more often than not, they don’t. Sometimes, they must even labor for free to get certain jobs (unpaid internships). And student debt, unlike other debt, cannot be discharged in bankruptcy. What, then, really is the difference between it and debt bondage? H1-B visas are a similar scam: imported workers often work for less than their native-born counterparts and cannot easily leave their employer (i.e., sponsor) to seek out other work.
And now, we are constantly informed that we must “love our jobs”–to the extent that we will even work for free for the privilege! Employers depict themselves as a paternalistic “family” (albeit one that you can be removed from at any time, for any reason). It’s a sort of Stockholm Syndrome on a societal scale. Today, we are totally defined by our work. It forms the core of our identity (“So, what do you do…?”). We are informed from birth that we must “love our jobs” and “like what we do.” We no longer even think of our bondage as bondage! We are totally brainwashed to love our captivity and identify with our captors–the ultimate victory of tyranny over freedom. As Henry David Thoreau wrote:
“[i]t is hard to have a Southern overseer; it is worse to have a Northern one; but worst of all when you are the slave-driver of yourself.”
So, when you take all this into consideration, clearly civilization has always run on compelled labor of one form or another. It cannot be any other way. Corvee labor, forced resettlement, military drafts, tributary labor, convict labor, serfdom, migrant and trafficked labor, debt peonage and indentured servitude have all existed alongside chattel slavery since the beginnings of civilization. Freedom is just an illusion:
It is the secret scandal of capitalism that at no point has it been organized primarily around free labor. The conquest of the Americas began with mass enslavement, then gradually settled into various forms of debt peonage, African slavery, and “indentured service”–that is, the use of contract labor, workers who had received cash in advance and were thus bound for five-, seven-, or ten-year terms to pay it back. Needless to say, indentured servants were recruited largely from among people who were already debtors. In the 1600s there were at times almost as many white debtors as African slaves working on southern plantations, and legally they were at first in almost the same situation, since in the beginning, plantation societies were working within a European legal tradition that assumed slavery did not exist, so even Africans in the Carolinas were classified as contract laborers. Of course this later changed when the idea of “race” was introduced.
When African slaves were freed, they were replaced, on plantations from Barbados to Mauritius, with contract laborers again: though now ones recruited mainly in India or China. Chinese contract laborers built the North American railroad system, and Indian “coolies” built the South African mines. The peasants of Russia and Poland, who had been free landholders in the Middle Ages, were only made serfs at the dawn of capitalism, when their lords began to sell grain on the new world market to feed the new industrial cities to the west. Colonial regimes in Africa and Southeast Asia regularly demanded forced labor from their conquered subjects, or, alternately, created tax systems designed to force the population into the labor market through debt. British overlords in India, starting with the East India Company but continuing under Her Majesty’s government, institutionalized debt peonage as their primary means of creating products for sale abroad.
This is a scandal not just because the system occasionally goes haywire…but because it plays havoc with our most cherished assumptions about what capitalism really is–particularly that, in its basic nature, capitalism has something to do with freedom. For the capitalists, this means the freedom of the marketplace. For most workers, it means free labor. DTF5kY, pp. 350-351
Today, living in a high-tech age of fossil fuels and automation, why have our “energy slaves” not liberated us from this burden? We’ll consider that next time.
BONUS: Ellen Brown (Web of Debt) has an interesting piece on student loan debt slavery over at Truthdig:
The advantages of slavery by debt over “chattel” slavery—ownership of humans as a property right—were set out in an infamous document called the Hazard Circular, reportedly circulated by British banking interests among their American banking counterparts during the American Civil War. It read in part:
“Slavery is likely to be abolished by the war power and chattel slavery destroyed. This, I and my European friends are glad of, for slavery is but the owning of labor and carries with it the care of the laborers, while the European plan, led by England, is that capital shall control labor by controlling wages.”
Slaves had to be housed, fed and cared for. “Free” men housed and fed themselves. For the more dangerous jobs, such as mining, Irish immigrants were used rather than black slaves, because the Irish were expendable. Free men could be kept enslaved by debt, by paying wages insufficient to meet their costs of living. The Hazard Circular explained how to control wages:
“This can be done by controlling the money. The great debt that capitalists will see to it is made out of the war, must be used as a means to control the volume of money. … It will not do to allow the greenback, as it is called, to circulate as money any length of time, as we cannot control that.”
There’s been an explosion in scholarship pinning the collapse of societies on both outbreaks of disease and natural variations in climate. James Scott dedicates a good portion of Against the Grain to considering the fragility of early states. As he points out, when it comes to the formation of complex state societies, the question isn’t so much “what took so long?” as “how could this even happen at all?” People don’t inherently want to be controlled or dominated by a sociopathic oligarchy, so why did they “bend the knee,” and why have they remained kneeling ever since?
And, in fact, rather than the direct, steady progression to ever larger and more complex societies depicted by old narratives of history (the “progress” narrative), what we see is states rising and falling. The idea that “bigger is better” is not in evidence from the standpoint of the average peasant living in these cultures. As Scott points out at length, states are fragile things, prone to undermining their own existence through various factors. We see this trend even today in the active secession movements in Catalonia, Scotland, and the United States, and in the criticisms of the European Union and “free trade.”
Ancient Egypt may have fallen in part because of riots caused by climate change and volcanoes, according to a new paper. The new study paints a picture of the ancient civilisation riven by droughts and disasters. It examined the impact of these severe events on ancient Egypt, finding that they stressed its economy and its ability to fight wars.
The Nile was incredibly important for the ancient Egyptians of Ptolemaic Egypt, between 350 and 30BC. Each year monsoon rainfall brought summer flooding that helped grow crops to support the society. When those crops failed, societal unrest would ensue, according to detailed reports at the time.
Until now, researchers haven’t known what caused those strange but important floods. They now propose they were the result of volcanic activity – which in turn would have altered the climate and brought about disruption to the most central parts of society.
“Ancient Egyptians depended almost exclusively on Nile summer flooding brought by the summer monsoon in east Africa to grow their crops,” said Joseph Manning, lead author on the paper and the William K & Marilyn Milton Simpson professor of history and classics at Yale, in a statement.
“In years influenced by volcanic eruptions, Nile flooding was generally diminished, leading to social stress that could trigger unrest and have other political and economic consequences.”
What we are learning, principally from pathogen genomics, is that the fall of the Roman Empire may have been a biological phenomenon.
The most devastating enemy the Romans ever faced was Yersinia pestis, the bacterium that causes bubonic plague and that has been the agent of three historic pandemics, including the medieval Black Death. The first pandemic interrupted a remarkable renaissance of Roman power under the energetic leadership of the emperor Justinian. In the course of three years, this disease snaked its way across the empire and carried off perhaps 30 million souls. The career of the disease in the capital is vividly described by contemporaries, who believed they were witnessing the apocalyptic “wine-press of God’s wrath,” in the form of the huge military towers filled with piles of purulent corpses. The Roman renaissance was stopped dead in its tracks; state failure and economic stagnation ensued, from which the Romans never recovered.
Recently the actual DNA of Yersinia pestis has been recovered from multiple victims of the Roman pandemic. And the lessons are profound…
The winter seasonality of the Plague of Cyprian points to a germ that thrived on close interpersonal contact and direct transmission. The position of the Roman Empire astride some of the major flyways of migratory birds, and the intense cultivation of pigs and domestic fowl such as chickens and ducks, put the Romans at risk. Climate perturbations can subtly redirect the migratory routes of wild waterfowl, and the strong oscillations of the AD 240s could well have provided the environmental nudge for an unfamiliar zoonotic pathogen to find its way into new territory. The flu is a possible agent of the pestilence.
A second and more probable identification of the Plague of Cyprian is a viral hemorrhagic fever. The pestilence manifested itself as an acute-onset disease with burning fever and severe gastrointestinal disorder, and its symptoms included conjunctival bleeding, bloody stool, esophageal lesions, and tissue death in the extremities. These signs fit the course of an infection caused by a virus that induces a fulminant hemorrhagic fever.
1. During the reign of Marcus Aurelius, a pandemic “interrupted the economic and demographic expansion” of the empire.
2. In the middle of the third century, a mix of drought, pestilence, and political challenge “led to the sudden disintegration of the empire.” The empire, however, was willfully rebuilt, with a new emperor, a new system of government, and in due time a new religion.
3. The coherence of this new empire was broken in the late fourth and early fifth centuries. “The entire weight of the Eurasian steppe seemed to lean, in new and unsustainable ways, against the edifice of Roman power…and…the western half of the empire buckled.”
4. In the east there was a resurgent Roman Empire, but this was “violently halted by one of the worst environmental catastrophes in recorded history — the double blow of bubonic plague and a little ice age.”
Explanations for a phenomenon of this magnitude [Rome’s collapse] abound: in 1984, the German classicist Alexander Demandt catalogued more than 200 hypotheses. Most scholars have looked to the internal political dynamics of the imperial system or the shifting geopolitical context of an empire whose neighbours gradually caught up in the sophistication of their military and political technologies. But new evidence has started to unveil the crucial role played by changes in the natural environment. The paradoxes of social development, and the inherent unpredictability of nature, worked in concert to bring about Rome’s demise…
It turns out that climate had a major role in the rise and fall of Roman civilisation. The empire-builders benefitted from impeccable timing: the characteristic warm, wet and stable weather was conducive to economic productivity in an agrarian society. The benefits of economic growth supported the political and social bargains by which the Roman empire controlled its vast territory. The favourable climate, in ways subtle and profound, was baked into the empire’s innermost structure.
The end of this lucky climate regime did not immediately, or in any simple deterministic sense, spell the doom of Rome. Rather, a less favourable climate undermined its power just when the empire was imperilled by more dangerous enemies – Germans, Persians – from without. Climate instability peaked in the sixth century, during the reign of Justinian. Work by dendro-chronologists and ice-core experts points to an enormous spasm of volcanic activity in the 530s and 540s CE, unlike anything else in the past few thousand years. This violent sequence of eruptions triggered what is now called the ‘Late Antique Little Ice Age’, when much colder temperatures endured for at least 150 years. This phase of climate deterioration had decisive effects in Rome’s unravelling. It was also intimately linked to a catastrophe of even greater moment: the outbreak of the first pandemic of bubonic plague.
Where hunter-gatherers saw themselves simply as part of an inherently productive environment, farmers regarded their environment as something to manipulate, tame and control. But as any farmer will tell you, bending an environment to your will requires a lot of work. The productivity of a patch of land is directly proportional to the amount of energy you put into it.
This principle that hard work is a virtue, and its corollary that individual wealth is a reflection of merit, is perhaps the most obvious of the agricultural revolution’s many social, economic and cultural legacies.
The acceptance of the link between hard work and prosperity played a profound role in reshaping human destiny. In particular, the ability to both generate and control the distribution of surpluses became a path to power and influence. This laid the foundations for all the key elements of our contemporary economies, and cemented our preoccupation with growth, productivity and trade.
Regular surpluses enabled a much greater degree of role differentiation within farming societies, creating space for less immediately productive roles. Initially these would have been agriculture-related (toolmakers, builders and butchers), but over time new roles emerged: priests to pray for good rains; fighters to protect farmers from wild animals and rivals; politicians to transform economic power into social capital.
Scientists have traced the rise of the super-rich deep into our historical past to uncover the ancient source of social inequality. Their conclusion? Thousands of years ago, it was the use of large farm animals – horses and oxen that could pull ploughs – which created the equivalent of our multi-billionaire entrepreneurs today.
It was only with the domestication of cattle and horses – sometimes thousands of years after land cultivation had begun – that serious divisions between societies’ haves and have-nots began to emerge, eventually creating the ancient equivalent of today’s island-owning, jet-setting billionaires...
A: When I was doing the History of Rome [podcast], so many people asked me, ‘Is the United States Rome? Are we following a similar trajectory?’ If you start to do some comparisons between the rise and development of the U.S. and rise and development of Rome, you do wind up in this same place. The United States emerging from the Cold War has some analogous parts to where Rome was after they defeated Carthage [in 146 B.C.]. This period was a wide-open field to fill a gap in our knowledge.
Q: One topic you describe at length is economic inequality between citizens of Rome. How did that come about?
A: After Rome conquers Carthage, and after they decide to annex Greece, and after they conquer Spain and acquire all the silver mines, you have wealth on an unprecedented scale coming into Rome. The flood of wealth was making the richest of the rich Romans wealthier than would’ve been imaginable even a couple generations earlier. You’re talking literally 300,000 gold pieces coming back with the Legions. All of this is being concentrated in the hands of the senatorial elite, they’re the consuls and the generals, so they think it’s natural that it all accumulates in their hands.
At the same time, these wars of conquest were making the poor quite a bit poorer. Roman citizens were being hauled off to Spain or Greece, leaving for tours that would go on for three to five years a stretch. While they were gone, their farms in Italy would fall into disrepair. The rich started buying up big plots of land. In the 130s and 140s you have this process of dispossession, where the poorer Romans are being bought out and are no longer small citizen owners. They’re going to be tenant owners or sharecroppers and it has a really corrosive effect on the traditional ways of economic life and political life. As a result, you see this skyrocketing economic inequality…
Q: Do you see parallels between land ownership in Rome and in the modern United States?
A: In the Roman experience, this is the beginning of a 100-year-long process of Italy going from being a patchwork of smaller farms with some large estates to nothing but sprawling, commercially-oriented estates. And yes, the United States is continuing to go through a very similar process. At the founding of our republic, everybody’s a farmer, and now everything is owned by what, Monsanto?
Moving beyond just strictly agricultural companies, large American corporations are now employing more and more people. There seems to be this move away from people owning and operating their own establishments, and they’re instead being consumed by large entities. You’re talking about the Amazons of the world swallowing up so much of the market share, it just doesn’t pay to be a clerk in a bookstore or own a bookstore, you end up being a guy working in a warehouse, and it’s not as good of a job.
I’ve mentioned previously that the transformation of land and labor into commodities which could be bought and sold was critical to the establishment of capitalist market economies (along with the extensive monetization of the economy by the state).
Prior to the market economy, most land was distributed through feudal relations; it was not simply something to be bought and sold like a waistcoat or a side of beef. Land ownership and tenure were critical to the social fabric. In England (as in much of Western Europe), much of the country’s land was held by the Catholic Church. When Henry VIII broke with the Catholic Church, he seized monastic lands, and eventually sold them off. This created a market for land that had not existed before, and which was unique to Britain. This may have been the key event in turning land into a marketable commodity, which in turn was crucial to the development of market capitalism. As Polanyi put it:
Production is interaction of man and nature; if this process is to be organized through a self-regulating mechanism of barter and exchange, then man and nature must be brought into its orbit; they must be subject to supply and demand, that is, be dealt with as commodities, as goods produced for sale.
Such precisely was the arrangement under a market system. Man under the name of labor, nature under the name of land, were made available for sale; the use of labor power could be universally bought and sold at a price called wages, and the use of land could be negotiated for a price called rent. There was a market in labor as well as in land, and supply and demand in either was regulated by the height of wages and rents, respectively; the fiction that labor and land were produced for sale was consistently upheld. Capital invested in the various combinations of labor and land could thus flow from one branch of production to another, as was required for an automatic levelling of earnings in the various branches.
Previous scholarship has argued that the demographic disaster after the Black Death caused a shortage of labor and led to the demise of the feudal system. Flight into cities would also have contributed to wage labor taking the place of status relations as the main form of contract. This, combined with the establishment of a market for land, may have driven the transformation of labor and land into saleable commodities, which was a necessary step toward the market economy. This paper argues that places where land was heavily commoditized after the dissolution of the monasteries correlate with places where the Industrial Revolution first took off. To my knowledge, this historical connection was never explored by Polanyi himself, but it does provide an interesting addendum to his argument that universal markets are created by top-down state power and authority. Fascinating stuff:
In 1534, Henry VIII decided to break with the Catholic Church. In addition to severing ties with Rome, Henry appropriated all taxes that monasteries, churches and other religious institutions paid to the Pope. When his financing needs – due to wars in France – became too great, he expropriated all monasteries in England, which collectively held about one third of all land in the country (Youings 1967). When the management of these vast properties turned out to outstrip the bureaucratic capacity of his government, Henry sold all monastic assets in England. The main effect of this dumping of land was the creation of local land markets. Where lands were before held in long leases whose rates were set by medieval custom, lands now changed hands frequently and at market rates. In a few years between 1535 and 1542, the majority of monastic land was sold. Since monastic holdings were often ancient and were spread out unevenly throughout England, villages were differentially impacted by this shock. Some villages had no monastic assets in them (monasteries often owned land far away from their physical buildings) whereas in others, a local – or distant – monastery may have held large tracts of land. We hypothesise that the creation of a land market can be linked to local differences in subsequent development and, ultimately, industrialisation.
It’s notable that this event did not take place in France, or anywhere else in Western Europe! Is this why France lagged in the race to industry? The lands of the Church were, in fact, eventually seized and sold off, as in England. But this took place only in the aftermath of the French Revolution centuries later.
And where it did take place, it seems to have had a similar effect as in England centuries earlier: higher agricultural productivity and more industrial output:
The law passed by the French Constituent Assembly on 2 November 1789 confiscated all Church property and redistributed it by auction. Over the next five years, more than 700,000 ecclesiastical properties – about 6.5% of French territory – were sold…We find that in regions where more church land was auctioned off, land inequality was higher in the 19th century. Further, we show that this wealth imbalance was associated with higher levels of agricultural productivity and agricultural investments by the mid-19th century. Specifically, a district with 10% more Church land redistributed had 25% higher productivity in wheat production, about 1.6 more pipe manufacturers (used for drainage and irrigation projects), and about 3.8% less land left fallow. Our study also shows that the beneficial effects of revolutionary land redistribution on agricultural productivity gradually declined over the course of the 19th century. This result is consistent with other districts gradually overcoming the transaction costs associated with reallocating the property rights that came with the feudal system.
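The effect sizes in the passage above are all reported per 10% of Church land redistributed. As a back-of-the-envelope illustration (assuming, purely for illustration, that the reported effects scale linearly with the share of land redistributed, which the source does not claim), a short Python sketch:

```python
# Reported effects per +10% of Church land auctioned off in a district,
# taken from the quoted study summary. Treating them as linear is an
# assumption made here for illustration only.
EFFECTS_PER_10PCT = {
    "wheat_productivity_pct": 25.0,   # +25% wheat productivity
    "pipe_manufacturers": 1.6,        # +1.6 pipe manufacturers
    "fallow_land_pct": -3.8,          # -3.8% land left fallow
}

def implied_effects(extra_church_land_pct: float) -> dict:
    """Scale the reported per-10% effects linearly (illustrative only)."""
    scale = extra_church_land_pct / 10.0
    return {k: round(v * scale, 2) for k, v in EFFECTS_PER_10PCT.items()}

# A hypothetical district with 5% more Church land redistributed:
print(implied_effects(5.0))
```

Note that the study itself reports these effects fading over the 19th century, so a linear reading overstates the long-run differences; the sketch only makes the quoted mid-century magnitudes concrete.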
And finally, the scars of destroying people’s way of life continue to linger hundreds of years later, down to the present day!
People living in the former industrial heartlands of England and Wales are more disposed to negative emotions such as anxiety and depressive moods, more impulsive and more likely to struggle with planning and self-motivation, according to a new study of almost 400,000 personality tests.
The findings show that, generations after the white heat of the Industrial Revolution and decades on from the decline of deep coal mining, the populations of areas where coal-based industries dominated in the 19th century retain a “psychological adversity”.
Researchers suggest this is the inherited product of selective migrations during mass industrialisation compounded by the social effects of severe work and living conditions.
During my long discursion on the history of money, the academic James C. Scott published an important book called Against the Grain: A Deep History of the First States.
Regular readers will know that this has been a longstanding area of research (or obsession) of mine. I’ve referred to Scott’s work before, particularly Seeing Like A State, which I think is indispensable in understanding many of the political divisions of today (and why left/right is no longer a useful distinction). We’re in an era where much of the “left” is supporting geoengineering and rockets to Mars, and the “right” (at least the alt-right) is criticizing housing projects and suburban sprawl.
It’s a shame that Scott’s book shared the same title as another one of my favorite books on that topic by journalist Richard Manning that came out a while ago: Against the Grain: How Agriculture Hijacked Civilization. Manning’s book is not only a historical account about how the rise of grain agriculture led to war, hierarchy, slavery and sickness, but a no-holds-barred examination of today’s grain-centric agribusiness model, where wheat, corn, soy and sugar are grown in mechanized monocultures and processed by the food industry into highly-addictive junk food implicated in everything from type two diabetes, to depression to Alzheimer’s disease (via inflammation):
Dealing with surplus is a difficult task. The problem begins with the fact that, just like the sex drive, the food drive got ramped up in evolution. If you have a deep, yearning need for food, you’re going to get along better than your neighbor, and over the years that gene is going to be passed on. So you get this creature that got fine-tuned to really need food, especially carbohydrates. Which brings us to the more fundamental question: can we ever deal with sugar? By making more concentrated forms of carbohydrates, we’re playing into something that’s quite addictive and powerful. It’s why we’re so blasted obese. We have access to all this sugar, and we simply cannot control our need for it—that’s genetic.
Now, can we gain the ability to overcome that? I’m not sure. You have to add to this the fact that there’s a lot of money to be made by people who know how to concentrate sugar. They have a real interest in seeing that we don’t overcome these kinds of addictions. In fact, that’s how you control societies—you exploit that basic drive for food. That’s how we train dogs—if you want to make a dog behave properly, you deprive him or give him food. Humans aren’t that much different. We just like to think we are. So as an element of political control, food and food imagery are enormously important.
In that interview, Manning also makes this point which got so much attention in Yuval Noah Harari’s blockbuster, Sapiens (which came out years later):
…it’s not just human genes at work here. It’s wheat genes and corn genes—and how they have an influence on us. They took advantage of our ability to travel, our inventiveness, our ability to use tools, to live in a broad number of environments, and our huge need for carbohydrates. Because of our brains’ ability, we were able to spread not only our genes, but wheat’s genes as well. That’s why I make the argument that you have to look at this in terms of wheat domesticating us, too. That co-evolutionary process between humans and our primary food crops is what created the agriculture we see today.
As for the title, I guess Against the Grain is just too clever a title to pass up 🙂
I’m still waiting on the book from the library, but I have seen so many reviews by now that I’m not sure I’ll be able to add too much. What’s interesting to me is the degree to which the idea that civilization was a great leap backward from what we had before is starting to go mainstream.
The old, standard “Whig version” story of directional, inevitable progress is still pretty strong, though. Here’s one reviewer describing how it was articulated in the turn-of-the-century Encyclopedia Britannica:
The Encyclopaedia took its readers through a panorama of universal history, from “the lower status of savagery,” when hunter-gatherers first mastered fire; to the “middle status of barbarism,” when hunters learned to domesticate animals and became herders; to the invention of writing, when humanity “graduated out of barbarism” and entered history. Along the way, humans learned to cultivate grains, such as wheat and rice, which showed them “the value of a fixed abode,” since farmers had to stay near their crops to tend and harvest them. Once people settled down, “a natural consequence was the elaboration of political systems,” property, and a sense of national identity. From there it was a short hop—at least in Edwardian hindsight—to the industrial revolution and free trade.
Some unfortunate peoples, even entire continents such as aboriginal North America and Australia, might fall off the Progress train and have to be picked up by kindly colonists; but the train ran along only one track, and no one would willingly decline to board it…
But, it turns out that the reality was quite different. In fact, hunter-gatherers resisted agriculture. Even where farmers and hunter-gatherers lived side by side, the hunter-gatherers (and herders) avoided farming as long as they could. When Europeans equipped “primitive” societies with seeds and hoes and taught them to farm, the natives threw away the implements and ran off into the woods. The dirt farmers of colonial America often ran away to go and live with the nomadic Indians, to the extent that strict laws had to be passed to prevent this (as documented in Sebastian Junger’s recent book Tribe).
At the ‘Man the Hunter’ symposium in Chicago in 1966, Marshall Sahlins drew on research from the likes of Richard B. Lee among the !Kung of the Kalahari to argue that hunter-gatherers enjoyed the ‘original affluent society’. Even in the most marginal environments, he said, hunter-gatherers weren’t engaged in a constant struggle for survival, but had a leisurely lifestyle. Sahlins and his sources may have pushed the argument a little too far, neglecting to consider, for instance, the time spent preparing food (lots of mongongo nuts to crack). But their case was strong enough to deal a severe blow to the idea that farming was salvation for hunter-gatherers: however you cut it, farming involves much higher workloads and incurs more physical ailments than relying on the wild. And the more we discover, as Scott points out, the better a hunter-gatherer diet, health and work-life balance look.
So why did they do it? That is a question nobody knows the answer to, but it appears they stumbled into agriculture not because it was a better way of life, but due to some sort of pressures beyond their control. As Colin Tudge put it, “People did not invent agriculture and shout for joy; they drifted or were forced into it, protesting all the way.” Rather than taking up agriculture because it presented a better, more secure way of life, as the Victorians thought (due to chauvinism and ignorance), it was actually much more unpleasant and much more work.
The shift to agriculture was in some respects…harmful. Osteological research suggests that domiciled Homo sapiens who depended on grains were smaller, less well-nourished and, in the case of women, more likely to be anaemic, than hunter-gatherers. They also found themselves vulnerable to disease and able to maintain their population only through unprecedentedly high birthrates. Scott also suggests that the move from hunting and foraging to agriculture resulted in ‘deskilling’, analogous to the move in the industrial revolution from the master tradesman’s workshop to the textile mill. State taxation compounded the drudgery of raising crops and livestock. Finally, the reliance on only a few crops and livestock made early states vulnerable to collapse, with the reversion to the ‘dark ages’ possibly resulting in an increase in human welfare.
Circumstances beyond their control must have played a role. Climate change is most commonly implicated. Overpopulation must have played a role, but this raises a chicken-and-egg problem: overpopulation is a problem created by agrarianism, so how could it have caused it?
One novel idea I explored earlier this year was Brian Hayden’s idea that the production of ever-increasing surpluses was part of a strategy by aggrandizing individuals in order to gain political power.
Periodic feasting events were ways to increase social cohesion and deal with uneven production in various climatic biomes–it was a survival strategy for peoples spread-out among a wide geographical area (mountains, plains, wetlands, riparian, etc.). If food was scarce in one area, resources could be pooled. Such feasting/resource pooling regimes were probably the earliest true “civilizations” (albeit before cities). It was also the major way to organize mass labor, which lasted well into the historical period (both Egyptian and Mesopotamian texts testify to celebratory work feasts).
At these events, certain individuals would loan out surplus food and other prestige items in order to lure people into debt to them. Cultural expectations meant that “gifts” would have to be repaid and then some (i.e., with interest). These people would get their relatives and allies to work their fingers to the bone in order to produce big surpluses in societies where this was possible, such as horticultural and affluent forager ones. This would be used for feasting. They would then become “Big Men”: tribal leaders lacking “official” status.
Would-be Big Men would then try to outdo one another by throwing larger, richer feasts than their rivals. Competitive feasting gave aggrandizers an arena for a series of power games and status jockeying. But the net effect of such power games across the society was to ramp up food production to unsustainable levels. This, in turn, led to intensification.
At these feasts, highly prized foodstuffs would be used by aggrandizers to lure people into debt and other lopsided obligations, as well as get people to work for them. Manning notes above how food has been traditionally used to control people. And, Hayden speculates, the foods most commonly used were ones with pleasurable or mind-altering effects. One common one was almost certainly alcohol.
He speculates that grains were initially grown not for flavor or for carbohydrates, but for fermentation. It’s fairly certain that alcohol consumption played a major role in feasting events, and it’s notable that the earliest civilizations were all big beer drinkers (Egypt, Mesopotamia, China, Mesoamerica). Most agricultural village societies around the world have some sort of beer drinking/fermentation ritual, as Patrick E. McGovern has documented. The first “recipe” ever written down was for beer brewing. Hayden speculates that early monoliths like Göbekli Tepe and Stonehenge were built as places for such feasting events to take place, wedded to certain religious ideologies (all of them have astronomical orientations), and archaeology tends to confirm this. It’s notable that the earliest sites of domestication/agrarianism we know of are typically in the vicinity of these monoliths.
In other words, the root of this overproduction was human social instincts, and not just purely environmental or climatic factors. Is there some connection between plant/animal domestication and religious ideology? Is it any wonder that religious concepts in these societies transform to become very different from the animist ones of hunter-gatherers? Flannery and Marcus point out that the establishment of a hereditary priesthood that constructs temples and interprets the gods’ wishes (replacing the shaman) is always a marker of the transition from an egalitarian society to a hierarchical one with hereditary leadership. Even in the Bible, king and temple arise more or less simultaneously (e.g. Saul/David/Solomon).
Scott considers whether the Younger Dryas, a period of markedly colder and drier conditions between 12,900 and 11,700 years ago, forced hunter-gatherers into farming. But while the change in climate may have inspired more experimentation with cultivation and herding, the Younger Dryas is too early: communities committed to cereals and livestock didn’t arise until about ten thousand years ago. Scott overlooks another possible factor: religious belief. The discovery of the Neolithic hill-top sanctuary of Göbekli Tepe in southern Turkey in 1994 went against the grain of conventional archaeological understanding of the Neolithic. Here, around 11,000 years ago, hunter-gatherers had constructed a vast complex of massive decorated stone pillars in exactly the same place that domesticated strains of wheat had evolved.
The quantities of food needed to feed the workforce and those who gathered for rituals at Göbekli must have been huge: if the Neolithic gods could persuade people to invest so much effort in construction, and to suffer the physical injuries, ailments and deaths that came along with it, then perhaps expending those extra calories in the fields would have seemed quite trivial. Even then, Göbekli doesn’t help us explain why cereal farming and goat herding took such a hold elsewhere. Personally I find it difficult to resist the theory of unintended self-entrapment into the farming lifestyle, which was then legitimated by Neolithic ideology. We find evidence of burial rituals and skull cults throughout the Fertile Crescent.
Scott’s book emphasizes the key role that grain cultivation played in the rise of the early states (even in the title). Cereals grown in river bottoms were easy to assess and tax, unlike other foodstuffs, which ripened at different times of the year, could be hidden, or were grown in scattered patches. They were storable and divisible. In some ways, grain may have been the earliest form of money:
Most early crops could not provide a source of taxation. Potatoes and tubers are easily hidden underground. Lentils produce annually and can be eaten as they’re picked. Grains, however, have determinate ripening times, making it easy for the tax collector to show up on time. They cannot be eaten raw. And because grains are so small, you can tax them down to the grain. Unlike squash or yams, grains are easy to transport. Spoilage time is nothing like that of vegetables. All these factors played into the first widespread form of currency.
Grain is special, but for a different reason. It is easy to standardize—to plant in rows or paddies, and store and record in units such as bushels. This makes grain an ideal target for taxation. Unlike underground tubers or legumes, grain grows tall and needs harvesting all at once, so officials can easily estimate annual yields. And unlike fugitive wild foods, grain creates a relatively consistent surplus, allowing a ruling class to skim off peasant laborers’ production through a tax regime of manageable complexity. Grain, in Scott’s lexicon, is the kind of thing a state can see. On this account, the first cities were not so much a great leap forward for humanity as a new mode of exploitation that enabled the world’s first leisured ruling class to live on the sweat of the world’s first peasant-serfs.
It’s worth noting that it wasn’t simply agriculture, but cereal production that relied on artificial irrigation that saw the rise of the first states. The need to coordinate all that labor, partition permanent plots of land, and resolve settlement disputes, must have led to the rise of an elite managerial class, as Ian Welsh points out:
Agriculture didn’t lead immediately to inequality; the original agricultural societies appear to have been quite equal, probably even more so than the late hunter-gatherer societies that preceded them. But increasing surpluses and the need for coordination which arose, especially in hydraulic civilizations (civilizations based around irrigation, which is labor intensive and requires specialists), led to the rise of inequality. The pharaohs created great monuments, but their subjects did not live nearly as well as hunter-gatherers.
And sedentism, as I’ve noted, is not so much a product of agriculture as a cause. Likely sedentary societies needed to be around for some time in order to build up the kind of surpluses aggrandizing elites needed to gain power. These probably started as “redistributor chiefs” who justified their role through some combination of martial leadership and religious ideology:
Sedentism does not have its origins in plant and animal domestication. The first stratified states in the Tigris and Euphrates Valley appeared ‘only around 3,100 BCE, more than four millennia after the first crop domestications and sedentism’. Sedentism has its roots in ecologically rich, preagricultural settings, especially wetlands. Agriculture co-existed with mobile lifestyles in which people gathered to harvest crops. Domestication itself is part of a 400,000 year process beginning with the use of fire. Moreover, it is not a process (or simply a process) of humans gaining increasing control over the natural world. People find themselves caring for dogs, creating an ecological niche for mice, ticks, bedbugs and other uninvited guests, and spending their lives ‘strapped to the round of ploughing, planting, weeding, reaping, threshing, grinding, all on behalf of their favorite grains and tending to the daily needs of their livestock’.
This was also noted in the Richard Manning interview, above:
…we always think that agriculture allowed sedentism, which gave people time to create civilization and art. But the evidence that’s emerging from the archeological record suggests that sedentism came first, and then agriculture. This occurred near river mouths, where people depended on seafood, especially salmon. These were probably enormously abundant cultures that had an enormous amount of leisure time—they just had to wait for the salmon runs to occur. There are some good records of those communities, and from the skeletal remains we can see that they got up to 95 percent of their nutrients from salmon and ocean-derived sources. Along the way, they developed highly refined art—something we always associate with agriculture.
Of course, urban societies using irrigation and plow-based agriculture, with their palaces and temples, are very different from horticultural village societies practicing shifting cultivation (which Scott terms “late-Neolithic multispecies resettlement camps”). This is likely why early agricultural societies were about as egalitarian as their immediate predecessors, as Ian Welsh pointed out above. But once the plow allowed men to wrest control of food production away from the garden plots of women, the fortunes of females declined rapidly. Political control became exclusively centered in households run by patriarchs, with women becoming little more than chattel. And because there was now property to be passed down, women’s sexual behavior became strictly regulated and monogamy enforced (for commoners but not for elites). Several thousand years of increasing surpluses and population led to the Neolithic “experiment” metastasizing into the first city-states and empires in various parts of the world. This was not a swift process, but instead took thousands of years to develop–longer than all of “recorded” history:
…why did it take so long – about four thousand years – for the city-states to appear? The reason is probably the disease, pestilence and economic fragility of those Neolithic villages. How did they survive and grow at all? Well, although farming would have significantly increased mortality rates in both infants and adults, sedentism would have increased fertility. Mobile hunter-gatherers were effectively limited by the demands of travel to having one child every four years. An increase in fertility that just about outpaced the increase in mortality would account for the slow, steady increase in population in the villages. By 3500 BCE the economic and demographic conditions were in place for a power-grab by would-be leaders.
Once such societies were established, they were under an obligation to expand. This was due to the depletion of their agricultural resource base thanks to overgrazing, salinization, erosion, deforestation, and numerous other environmental problems caused by agriculture, along with rapid population growth. New farmers require new land, since their birthrates are higher. As such societies expanded, their neighbors had only three options: fight back by adopting similar measures, succumb and be assimilated, or run away. Many did run away, which is why so much of the world’s population lived outside of state control until the 1600’s, as Scott points out (Scott calls them ‘Barbarians’; he uses it as a term of respect rather than Victorian derision).
Scott also emphasizes the key role played by slavery in agrarian states. In Scott’s view, slavery was absolutely essential to the functioning of the state. Because sedentary, agricultural societies tended to have so much unpleasant “grunt” labor to be done, there was a strong incentive to acquire slaves to do the dirty work required to keep the society running. Three major ways labor was compelled in the ancient world were corvée labor, chattel slavery, and (we often forget) debt bondage. This only ended once we got “energy slaves” to do much of this grunt work for us. Yet even today, we use wage slavery compelled by poverty along with migrant labor to do the grunt work necessary for us. Non-mechanized agricultural labor is still completely dependent on migrant labor in the U.S. and Europe, as are many low-skill, non-automated professions (driver, nanny, gardener, etc.). Ancient slavery was less about skin color or point of origin than it later became in the Americas (where a racial hierarchy was instituted by Europeans). Instead, it was simply a legal status, much like that of a temp or migrant worker in many countries today (or under the Chinese Hukou system):
In the world of states, hunter-gatherers and nomads, one commodity alone dominated all others: people, aka slaves. What agrarian states needed above all else was manpower to cultivate their fields, build their monuments, man their armies and bear and raise their children. With few exceptions, the epidemiological conditions in cities until very recently were so devastating that they could grow only by adding new populations from their hinterlands. They did this in two ways. They took captives in wars: most South-East Asian early state chronicles gauge the success of a war by the number of captives marched back to the capital and resettled there. The Athenians and Spartans might kill the men of a defeated city and burn its crops, but they virtually always brought back the women and children as slaves. And they bought slaves: a slave merchant caravan trailed every Roman war scooping up the slaves it inevitably produced.
The fact is that slaving was at the very centre of state-making. It is impossible to exaggerate the massive effects of this human commodity on stateless societies. Wars between states became a kind of booty capitalism, where the major prize was human traffic. The slave trade then completely transformed the non-state ‘tribal zone’. Some groups specialised in slave-raiding, mounting expeditions against weaker and more isolated groups and then selling them to intermediaries or directly at slave markets. The oldest members of highland groups in Laos, Thailand, Malaysia and Burma can recall their parents’ and grandparents’ memories of slave raids. The fortified, hilltop villages, with thorny, twisting and hidden approaches that early colonists found in parts of South-East Asia and Africa were largely a response to the slave trade.
In describing the early city-states of Mesopotamia, Scott projects backwards from the historical records of the great slave societies of Greece and Rome. His account of the slaves and the way they were controlled seems strangely familiar. Much like migrant labourers and refugees in Europe today, they came from scattered locations and were separated from their families, demobilised and atomised and hence easier to control. Slaves, like today’s migrants, were used for tasks that were vital to the needs of the elites but were shunned by free men. And slaves, like refugee workers, were gradually integrated into the local population, which reduced the chance of insurrection and was necessary to keep a slave-taking society going. In some early states human domestication took a further step: written records from Uruk use the same age and sex categories to describe labourers and the state-controlled herds of animals. Female slaves were kept for breeding as much as for manual labour.
I’ve often wondered if, when certain humans learned how to domesticate plants and animals, they used it as much on their fellow man as they did their flora and fauna. In this Aeon article, this passage really struck me:
When humans start treating animals as subordinates, it becomes easier to do the same thing to one another. The first city-states in Mesopotamia were built on this principle of transferring methods of control from creatures to human beings, according to the archaeologist Guillermo Algaze at the University of California in San Diego. Scribes used the same categories to describe captives and temple workers as they used for state-owned cattle.
Indeed, the idea that humans domesticated themselves is another key concept in Harari’s Sapiens. But perhaps that domestication was much more “literal” than we have been led to believe. Perhaps human sacrifice was a way for early religious leaders to “cull” individuals who had undesirable traits from their standpoint: independence, aggression, a questioning attitude, etc. Indeed, hunter-gatherers still do not like obeying orders from a boss. I wonder to what extent this process is still going on, especially in modern-day America with its schools, prisons, corporate cubicles, police, military, etc.:
Anthropologists and historians have put forward the ‘social control hypothesis’ of human sacrifice. According to this theory, sacrificial rites served as a function for social elites. Human sacrifice is proposed to have been used by social elites to display their divinely sanctioned power, justify their status, and terrorise underclasses into obedience and subordination. Ultimately, human sacrifice could be used as a tool to help build and maintain systems of social inequality.
And this is very relevant to our recent discussion of money: writing and mathematics were first used as methods of social control. As Janet Gleeson-White points out in this essay, accounting was our first writing technology. Money–and taxes–were an outgrowth of this new communications technology:
War, slavery, rule by élites—all were made easier by another new technology of control: writing. “It is virtually impossible to conceive of even the earliest states without a systematic technology of numerical record keeping,” Scott maintains. All the good things we associate with writing—its use for culture and entertainment and communication and collective memory—were some distance in the future. For half a thousand years after its invention, in Mesopotamia, writing was used exclusively for bookkeeping: “the massive effort through a system of notation to make a society, its manpower, and its production legible to its rulers and temple officials, and to extract grain and labor from it.”
Early tablets consist of “lists, lists and lists,” Scott says, and the subjects of that record-keeping are, in order of frequency, “barley (as rations and taxes), war captives, male and female slaves.” Walter Benjamin, the great German Jewish cultural critic, who committed suicide while trying to escape Nazi-controlled Europe, said that “there is no document of civilization which is not at the same time a document of barbarism.” He meant that every complicated and beautiful thing humanity ever made has, if you look at it long enough, a shadow, a history of oppression.
Collecting cereal grains directly as taxes would have been cumbersome for administrators, which no doubt led to the innovations we’ve been discussing recently: a unit of account and debt/credit records. The temples were the first institutions to create and store surpluses, making them arguably the ancestor to later corporations (and capitalism). They were the first to do economic planning and charge interest. Later, rulers would strongly desire to monetize the economy by issuing coins, because it was far easier to collect coins and record taxes using this method than collecting resources in kind. We’ve already seen how money, markets, and the state are intimately intertwined (and not separate as libertarians claim).
The connection between the earliest writing and domestication/subjugation is powerfully made by this article from the BBC documenting the world’s oldest writing:
In terms of written history, this is the very remote past. But there is also something very direct and almost intimate about it too. You can see fingernail marks in the clay. These neat little symbols and drawings are clearly the work of an intelligent mind.
These were among the first attempts by our human ancestors to try to make a permanent record of their surroundings. What we’re doing now – my writing and your reading – is a direct continuation. But there are glimpses of their lives to suggest that these were tough times. It wasn’t so much a land of milk and honey, but porridge and weak beer.
Even without knowing all the symbols, Dr Dahl says it’s possible to work out the context of many of the messages on these tablets. The numbering system is also understood, making it possible to see that much of this information is about accounts of the ownership and yields from land and people. They are about property and status, not poetry.
This was a simple agricultural society, with a ruling household. Below them was a tier of powerful middle-ranking figures and further below were the majority of workers, who were treated like “cattle with names”. Their rulers have titles or names which reflect this status – the equivalent of being called “Mr One Hundred”, he says – to show the number of people below him.
It’s possible to work out the rations given to these farm labourers. Dr Dahl says they had a diet of barley, which might have been crushed into a form of porridge, and they drank weak beer. The amount of food received by these farm workers hovered barely above the starvation level. However the higher status people might have enjoyed yoghurt, cheese and honey. They also kept goats, sheep and cattle.
For the “upper echelons, life expectancy for some might have been as long as now”, he says. For the poor, he says it might have been as low as in today’s poorest countries.
So the earliest writing tends to confirm Scott’s account. And not just Scott’s account, but that of anthropologist James Suzman, who has simultaneously come out with a book about the disappearing way of life of the !Kung San Bushmen of the Kalahari. This is also reviewed in the New Yorker article, above. These hunter-gatherers are going through today exactly what those people in the Near East experienced roughly 6,000-8,000 years ago, giving us a window into history:
The encounter with modernity has been disastrous for the Bushmen: Suzman’s portrait of the dispossessed, alienated, suffering Ju/’hoansi in their miserable resettlement camps makes that clear. The two books even confirm each other’s account of that sinister new technology called writing. Suzman’s Bushman mentor, !A/ae, “noted that whenever he started work at any new farm, his name would be entered into an employment ledger, documents that over the decades had assumed great mystical power among Ju/’hoansi on the farms. The secrets held by these ledgers evidently had the power to give or withhold pay, issue rations, and determine an individual’s right to stay on any particular farm.”
Writing turned the majority of people into serfs and enabled a sociopathic elite to live well and raise themselves and their offspring above everyone else.
And here we are at the cusp of a brand new “information revolution” where literally our every thought and move can be monitored and tracked by a tiny centralized elite and permanently stored. And yet we’re convinced that this will make all our lives infinitely better! Go back and reread the above. I’m not so sure. I already feel like “cattle with a name” in our brave new nudged, credit-scored, Neoliberal world.
We’re also experiencing another period of rapid climate change and resource depletion, much like the one that accompanied the original rise of the state. We’re now doing exactly what they did: intensification, and once again it’s empowering a small sociopathic elite at the cost of the rest of us. And yet Panglossians confidently tell us we’re headed for a peaceful techno-utopia where all new discoveries will be shared with all of us instead of hoarded, and we’ll all live like gods instead of being exterminated like rats because we’re no longer necessary to the powers that be. Doubtless the same con (“We’ll all be better off!!!”) was played on the inhabitants of early states, too. Given the human social instincts noted above, let’s just say I’m not optimistic. Please pass the protein blocks.
Scott points out that the state is a very novel development, despite what we read in history books. We read about the history of states because states left written history, and we are their descendants. But that doesn’t mean most people lived under them. By Scott’s account, most humans (barbarians) lived outside of nation-states well into the 1500’s:
…Homo sapiens has been around for roughly 200,000 years and left Africa not much earlier than 50,000 years ago. The first fragmentary evidence for domesticated crops occurs roughly 11,000 years ago and the first grain statelets around 5000 years ago, though they were initially insignificant in a global population of perhaps eight million.
More than 97 per cent of human experience, in other words, lies outside the grain-based nation-states in which virtually all of us now live. ‘Until yesterday’, our diet had not been narrowed to the three major grains that today constitute 50 to 60 per cent of the world’s caloric intake: rice, wheat and maize. The circumstances we take for granted are, in fact, of even more recent vintage …Before, say, 1500, most populations had a sporting chance of remaining out of the clutches of states and empires, which were still relatively weak and, given low rates of urbanisation and forest clearance, still had access to foraged foods. On this account, our world of grains and states is a mere blink of the eye (0.25 per cent), in the historical adventure of our species.
One of the more provocative ideas from Scott’s book is to question whether the withering away of state capacity–that is, a collapse–is really a bad thing at all!
We need to rethink, accordingly, what we mean when we talk about ancient “dark ages.” Scott’s question is trenchant: “ ‘dark’ for whom and in what respects”? The historical record shows that early cities and states were prone to sudden implosion.
“Over the roughly five millennia of sporadic sedentism before states (seven millennia if we include preagriculture sedentism in Japan and the Ukraine),” he writes, “archaeologists have recorded hundreds of locations that were settled, then abandoned, perhaps resettled, and then again abandoned.” These events are usually spoken of as “collapses,” but Scott invites us to scrutinize that term, too.
When states collapse, fancy buildings stop being built, the élites no longer run things, written records stop being kept, and the mass of the population goes to live somewhere else. Is that a collapse, in terms of living standards, for most people? Human beings mainly lived outside the purview of states until—by Scott’s reckoning—about the year 1600 A.D. Until that date, marking the last two-tenths of one per cent of humanity’s political life, “much of the world’s population might never have met that hallmark of the state: a tax collector.”
Indeed, is collapse even a relevant concept when discussing history? What, really, is collapsing? States can collapse, but cultures transform:
We also need to think about what we apply the term ‘collapse’ to – what exactly was it that collapsed? Very often, it’s suggested that civilisations collapse, but this isn’t quite right. It is more accurate to say that states collapse. States are tangible, identifiable ‘units’ whereas civilisation is a more slippery term referring broadly to sets of traditions. Many historians, including Arnold Toynbee, author of the 12-volume A Study of History (1934-61), have defined and tried to identify ‘civilisations’, but they often come up with different ideas and different numbers. But we have seen that while Mycenaean states collapsed, several strands of Mycenaean material and non-material culture survived – so it would seem wrong to say that their ‘civilisation’ collapsed. Likewise, if we think of Egyptian or Greek or Roman ‘civilisation’, none of these collapsed – they transformed as circumstances and values changed. We might think of each civilisation in a particular way, defined by a particular type of architecture or art or literature – pyramids, temples, amphitheatres, for example – but this reflects our own values and interests.
States collapsed, civilisations or cultures transformed; people lived through these times and employed their coping strategies – they selectively preserved aspects of their culture and rejected others. Archaeologists, historians and others have a duty to tell the stories of these people, even though the media might find them less satisfactory. And writers who appropriate history for moral purposes need to think carefully about what they are doing and what they are saying – they need to make an effort to get the history as right as possible, rather than dumbing it down to silver-bullet theories.
Scott looks at the fragility of states–and their propensity to revert to more simplified forms, as simply a necessary and inevitable part of the process of history. Rather than a catastrophe, a reduction in complexity often leads to an increase in personal freedom, social experimentation, autonomy, and even artistic development and cultural expression. The Middle Ages is often portrayed as a “dark age,” but that depiction was an invention of the Renaissance, and “dark” referred to the lack of written historical sources, not necessarily wail and woe. Note that the tools of the oppressor – written records, taxation, slavery, usury and money – all fade during this time period. This is not to dismiss the very real disappearance of technology, epidemic disease and warfare that accompanies a state collapse, but merely to suggest a more nuanced view. The Middle Ages was centered around the values of the Church, and society was reoriented along these lines.
Scott writes about the normalising effects of state collapse. Often it was the best thing possible for a people now emancipated from disease, taxes and labour. In the subsequent ‘dark ages’ – a propaganda term used by the elite – democracy and culture could flourish. Homer’s Iliad and Odyssey date from the dark age of Greece. This is in marked contrast to the consequences of state collapse today, now that there is no longer an external barbarian world to escape into. When Syria collapsed, its refugees had no choice but to cross the border to another state, whether Lebanon, Jordan or Turkey.
While Scott’s topics are timely—tribalism, taxation, trade, warfare—one is particularly relevant: the collapse of civilizations. Shifting landscapes, battles, and resource depletion are all factors that forced newly sedentary societies to pack it up and move on once again. Scott does not see this as a necessary evil, but rather part of the natural order of things: “We should, I believe, aim to ‘normalize’ collapse and see it rather as often inaugurating a periodic and possibly even salutary reformation of political order.”
Scott argues that the loss of state capacity, rather than a tragedy, can often be seen as a liberating event. Yes, such periods mean more poverty, but without the yoke of the state, it can also paradoxically mean more freedom and happiness for the survivors of the collapse. And since relative poverty appears more harmful psychologically than absolute poverty, many societies tend to have greater well being after they’ve fallen apart. He writes:
When the apex disappears, one is particularly grateful for the increasingly large fraction of archaeologists whose attention was focused not on the apex but on the base and its constituent units. From their findings we are able not only to discern some of the probable causes of “collapse” but, more important, to interrogate just what collapse might mean in any particular case…much that passes as collapse is, rather, a disassembly of larger but more fragile political units into their smaller and often more stable components. While “collapse” represents a reduction in social complexity, it is these smaller nuclei of power—a compact small settlement on the alluvium, for example—that are likely to persist far longer than the brief miracles of statecraft that lash them together into a substantial kingdom or empire.
Over time an increasingly large proportion of nonstate peoples were not “pristine primitives” who stubbornly refused the domus, but ex–state subjects who had chosen, albeit often in desperate circumstances, to keep the state at arm’s length…The process of secondary primitivism, or what might be called “going over to the barbarians,” is far more common than any of the standard civilizational narratives allow for. It is particularly pronounced at times of state breakdown or interregna marked by war, epidemics, and environmental deterioration. In such circumstances, far from being seen as regrettable backsliding and privation, it may well have been experienced as a marked improvement in safety, nutrition, and social order. Becoming a barbarian was often a bid to improve one’s lot.
Thus, the leveling effects of “collapse” may be not as “disastrous” as we are led to believe.
Scott’s book gives us hope that the collapse of states, rather than being a universally bad thing, might lead to a flourishing of human freedom. In that, there is some hope. I’ll end with this thought from Scott’s review of Diamond:
Anthropology can show us radically different and satisfying forms of human affiliation and co-operation that do not depend on the nuclear family or inherited wealth. History can show that the social and political arrangements we take for granted are the contingent result of a unique historical conjuncture.