The Reason Americans Don’t Trust Experts – Economists

Who are you going to believe, me, or your own lying eyes?
—GROUCHO MARX

A lot of digital ink has been spilled recently on the rise and spread of agnotology in America. Why don’t Americans listen to experts anymore? Why don’t they trust scientists? Why do they instinctively assume their leaders are lying to them about everything? Why don’t they trust mainstream news outlets anymore? Why are they instead listening to “outsiders” who are obviously shills and charlatans? Why are they listening to “alternative” medical practitioners and quack doctors? Why are they giving credence to seemingly outrageous conspiracy theories shared online? Why do they reject basic facts?

A lot has been written about that already, so I’m not going to review it here. I’m just going to interject one reason that I haven’t read about anywhere else that I know of.

That reason is economists.

Specifically, the fact that, since at least the 1980s, economists have told middle America that free trade would be good for everyone, and that anyone who said otherwise was an ignorant rube who didn’t understand basic economic “science.”

The economists who incessantly proffered this view were “experts” from the most prestigious schools in America—Harvard, Yale, Stanford, Princeton, Georgetown, the University of Chicago, and the like. They claimed it was a settled argument, and that economics had “proven” it beyond the shadow of a doubt through equations, as surely as we had proven the movements of the stars and planets. Even the way they framed the argument backed this up. They invoked the “Law” of comparative advantage, suggesting that this was a law of the universe on par with those of physics or chemistry. Anyone who disputed it might just as well believe that water runs uphill or that the earth is flat, they claimed (although they weren’t above invoking a little magic on occasion):

[David] Ricardo attempted to prove theoretically that international trade is always beneficial. Paul Samuelson called the numbers used in Ricardo’s example dealing with trade between England and Portugal the “four magic numbers”. “In spite of the fact that the Portuguese could produce both cloth and wine with less amount of labour, Ricardo suggested that both countries would benefit from trade with each other”…Ricardo’s theory of international trade was reformulated by John Stuart Mill. The term “comparative advantage” was started by J. S. Mill and his contemporaries… (Wikipedia)
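Ricardo’s “four magic numbers” are simple enough to check directly. The sketch below (Python, purely as an illustration; the numbers are the ones from Ricardo’s own England–Portugal example) shows the arithmetic the economists were invoking: Portugal is absolutely more efficient at producing both goods, yet each country still has the lower opportunity cost in one of them.

```python
# Ricardo's "four magic numbers": labor-hours needed to produce one unit
# of each good, from his England-Portugal example.
labor = {
    "England":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

def wine_cost_in_cloth(country):
    """Opportunity cost of one unit of wine, in units of cloth forgone."""
    return labor[country]["wine"] / labor[country]["cloth"]

# Portugal needs less labor for BOTH goods, yet wine "costs" Portugal
# only 80/90 ≈ 0.89 cloth, versus 120/100 = 1.2 cloth for England.
# The theory concludes both countries gain if Portugal specializes in
# wine, England in cloth, and the two trade.
for country in labor:
    print(country, round(wine_cost_in_cloth(country), 2))
```

What the model does not say, and what the rest of this essay is about, is who within each country captures those gains, or what happens to the workers the specialization displaces.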

This became a nearly universal creed among economists and journalists. If there was one article of absolute faith, this was it. Surveys of economists indicated that nearly 100 percent of them agreed that free trade is always beneficial, and that it benefits everybody. These economists claimed that free trade was an unstoppable force of nature, as inevitable as the tides or the seasons, and that it would make all of us much better off in the long run. The most notable proponents of this creed wrote for the influential New York Times: Princeton economist Paul Krugman, and Thomas Friedman, whose boundless enthusiasm and turgid prose in defense of untrammeled trade and cosmopolitanism seemed at times to border on the absurd. Friedman, a multi-millionaire, published several books on the topic over the ensuing decades, celebrating the wonders of globalization and free trade.

So these were the so-called “experts” Americans were listening to throughout the eighties, nineties, and 2000s, right up until 2008.

Free trade—and its universal benefits—became the standard orthodoxy for both major American political parties beginning in the 1990s. The effect of this cannot be overstated. There truly was no alternative. And voters who pushed back against this orthodoxy were belittled and marginalized by both political parties in the ensuing decades.

Now picture the reality on the ground for ordinary middle Americans, particularly in areas that are considered to be Trump bastions in the heartland of the U.S. today.

Businesses that had been the cornerstones of communities for many generations began to disappear left and right. They either lost out in the newly globalized struggle for profits and went under; moved most of their operations overseas to take advantage of cheaper labor; or were bought out in the accompanying wave of financialization and were “restructured.” In each and every instance, these businesses—formerly the sources of prosperity for so many Americans—were gone, never to return. This happened throughout the eighties and nineties.

Just like an ecosystem, a local economy is a sort of trophic pyramid, and once the primary producers have died off, every level of the pyramid above them is affected. The money circulating in the community began to dry up for the other businesses in the “food chain” of the local economy—small shops, bars, and restaurants. The people who would have been their customers no longer had jobs, and hence the money to pay for local goods and services. Small local businesses with tight margins now had fewer customers, and subsequently went under. This led to the phenomenon of “boarded-up main streets” seen in small towns all across America. All as the result of free trade agreements.

At the same time, a flood of cheap consumer goods inundated the American market. These ultra-cheap goods were shoddy, Chinese-made garbage, practically made to be thrown away, but Americans had no choice but to buy them, thanks to their shrinking incomes and the lack of alternative sellers, who had long since gone under due to cutthroat price competition. Gigantic mega-businesses that could most effectively take advantage of far-flung global supply chains drove local businesses under, even while wresting generous subsidies and tax breaks from local governments. As local businesses fell one by one, a domino effect spread through local economies: the businesses that were once cornerstones of the community went under, their market niches invaded by transnational big-box chain stores. The small corner store getting replaced by Wal-Mart became a cliché repeated thousands of times over across the United States in the past several decades. Everybody knew it was happening, but no one could stop it.

In almost every small town in America, commerce today is dominated by a few behemoths like Wal-Mart, Home Depot, Lowe’s, Target, and Costco, and chains like Applebee’s, Chuck E. Cheese and Taco Bell. All the money that would have circulated in the local economy was instead pulled out and sent to stockholders in New York, San Francisco, London, and other distant financial centers. What small businesses remained were dealt another death blow by the rise of online shopping and the dominance of Amazon’s monopoly over e-commerce. This “retail apocalypse” was ignored by politicians of both parties. Additionally, local newspapers shut down because they could no longer support themselves through ad revenue—everything was now increasingly online, and all the ad money went to Google and Facebook. This led to information black holes in small towns all over America. People increasingly went online to get their information, and this online world became a perfect vector for the spread of disinformation by bad-faith actors like bots and trolls.

And what happened to the jobs? We were told that people who worked in factories were expendable, and that “making things” was only for dumb losers. Farm jobs had long since been eliminated thanks to Big Agriculture. The party that ostensibly defended the working class just told everyone to go out and acquire “more education,” and that this would somehow solve the problem. Yet education did not become more accessible during this time; rather, it became prohibitively expensive and harder to access. Four-year college degrees were practically unobtainable without extensive parental support. The staggering amounts of debt one had to take on without this support practically ensured a lifetime of indentured servitude. This debt became impossible to discharge even in bankruptcy—a change supported by both political parties. Meanwhile, colleges and universities became virtual empires overnight, building pharaonic architecture to attract rich students with deep pockets (often foreign-born) and raising tuition into the stratosphere to compensate.

Those who had the financial wherewithal and academic inclination were able to escape to the few remote college towns and distant big cities where such colleges and universities were located. Everyone else was left to drown. The small towns fell into ruin. The Republican columnist Kevin Williamson sneered that they “deserved to die,” although it wasn’t entirely clear whether he was referring to the towns themselves or the people living in them. And the other party was no different, abandoning Middle Americans and making its pitch exclusively to those areas that were, in Hillary Clinton’s words, “diverse, dynamic and moving forward.” High-level Democrats openly enthused that they would have a solid electoral majority once the people in these small towns finally kicked the bucket.

It was a sorting operation on a grand scale—winners from losers, sheep from goats. The “losers” remained behind in the small towns that were drained of their most entrepreneurial inhabitants; the “winners” moved away to a handful of high-tech hubs and exurbs that were growing exponentially, especially in the Sun Belt. Because all of the economic activity was now concentrated in a very small number of cities, the cost of real estate in these cities exploded, making even educated, affluent “winners” economically precarious due to sky-high housing costs. Yet no one in the political or professional economic classes offered any real solutions, or even acknowledged that it was happening.

Meanwhile, back in the small towns, the only jobs left were those in the service industry that paid minimum wage or close to it—a wage whose real value had peaked in the 1960s and declined ever since. Here is a graphic of the largest employer in every state. Notice that the “red” states are dominated by Wal-Mart.

The only other alternative was the “Eds and Meds” economy of colleges and hospitals. Both of these metastasizing economic sectors were predatory and extractive, bleeding their customers dry even as they provided the only source of employment in rural areas that paid above minimum wage and offered decent benefits (aside from the prisons that were increasingly located in rural areas and filled with the economic “losers”). The graphic above makes this dynamic painfully obvious.

America became staggeringly unequal. An entire infrastructure of poverty developed consisting of payday loan stores, car title loan stores, cash-4-gold stores, blood banks, urgent care clinics, Dollar Stores, pawn shops, and other predatory businesses. Cash-strapped small towns instituted aggressive policing tactics to compensate for lost tax revenue, including issuing very expensive tickets for every minor infraction (which often disproportionately targeted minorities). Tent cities sprang up from coast to coast like dandelions in the springtime. At the corner of seemingly every major intersection and at every freeway off-ramp were people holding up cardboard signs begging for spare change. People started GoFundMe sites to pay for ruinously expensive health care costs, since their low-wage, part-time jobs didn’t offer health insurance coverage. Then, to add insult to injury, beginning in the late nineteen-nineties they now also had to compete for low-wage jobs with immigrants from across the border who were arriving by the truckload in small towns across America in a race to the bottom, while politicians of both parties looked the other way. Any concern over this situation was castigated as “racist.”

All this occurred even while billionaire monopolists became incomprehensibly richer. People in these towns who couldn’t make ends meet no matter how hard they worked were treated to the spectacle of America’s billionaires going to bed at night and waking up billions of dollars richer the next morning, day after day, while their own lives fell apart due to things like unemployment, divorce, drug abuse, arrests, and just plain old bad luck.

What was the “expert” response to all of this? How did the economists from Yale, Harvard, Princeton, Stanford, and other elite institutions react to this economic earthquake?

Free trade is good. Full stop. Anyone who says otherwise is a dimwitted dolt who doesn’t understand the fundamental laws of economics. Besides, nothing can be done about it. They pointed to the affordability and ubiquity of ultra-cheap goods made by sweated labor in the global South as proof positive that free trade had benefited absolutely everyone. “Just look at your iPhone!” they exclaimed.

Americans in small towns and suburbs were also told by these same elite experts that their suffering and that of their close friends and neighbors was justified because Chinese workers were being “lifted out of poverty” even as Americans were increasingly falling into it. Any concern over the increasingly dire poverty and deaths in Middle America was derided as backward parochialism by professional economists and the neoliberal politicians who listened to them.

These experts, of course, were the same people in big cities who owned almost all the stocks and had benefited handsomely from globalized free trade. These members of what eventually became known as the “Professional Managerial Class” had managed to insulate themselves from foreign competition through legal means such as hard-to-obtain licensing requirements and hyperexpensive education, even while valorizing “competition” for everyone else. Both political parties were one hundred percent in the tank for globalized free trade: the Democrats toothlessly pushed for more education and means-tested social programs for the poorest of the poor, while the Republicans preached an old-fashioned grit-and-bootstraps ethos that castigated people who fell behind for their own lack of gumption and blamed poverty on poor character and moral failings (e.g. having children out of wedlock, excessive drinking). Republicans claimed the real threat to Americans was “dependence on big government” rather than unemployment or economic disintegration.

This message was broadcast incessantly, day after day, week after week, month after month, and year after year, by professional economists burnished with impressive credentials from America’s finest institutions. They all sang from the same hymnal in absolute harmony. I live in the Rust Belt, and it’s impossible to overstate just how aggressively this message was pushed throughout the eighties, nineties and early 2000s. There was no dissent in the mainstream corporate media; none whatsoever. The “losers” in this system were told that they deserved what they got, and that each and every one of us was now a competitor in the high-stakes, winner-take-all struggle of globalism, whether we wanted to be or not. There was simply no alternative.

Yet older people remembered a time when it wasn’t that way. They remembered when people could easily find a local job if they wanted one, even without a staggeringly expensive degree and massive debt. When you didn’t have to move far away from your family if you didn’t want to. When you could afford to raise a family on a single breadwinner’s salary. When you could buy a house in your 20s. A time when there weren’t quite so many boarded-up storefronts, panhandlers, food banks, or people living in their cars. When small local businesses thrived instead of just Wal-Mart and Amazon. They told these stories to their children as if they were describing some long-vanished and forgotten culture, even though it had existed within their own lifetimes. As the satirical Onion headline put it, “Remains Of Ancient Race Of Job Creators Found In Rust Belt.” But the unfortunate circumstance of institutionalized racism during this time period allows any sort of nostalgia for this lost era to be dismissed as “racist” by members of the PMC.

What did the highly credentialed experts in economic “science” have to say to these folks? Sorry pops, that world is gone forever, and it’s not coming back. Suck it up, buttercup. Or else they refused to even acknowledge that anything had changed. Educated academics like Harvard’s Steven Pinker told Americans that “You literally never had it so good,” as did columnists in the New York Times like Nicholas Kristof. Anyone who said otherwise was derided as a backward parochialist who couldn’t understand cold, hard facts. Concern over America’s domestic disintegration—i.e., ordinary Americans who had been harmed by globalization—was derided as hopelessly ignorant and racist by members of the PMC who disproportionately staffed the corporate media and academic apparatus.

So, given the experience of the average American on the ground that I described above, is it really any wonder that experts began to lose their credibility? The average American looked around them and saw with their own two eyes what was happening right in front of them. They saw the increasing joblessness, homelessness, and poverty. They saw how their neighbors were struggling to make ends meet. They saw the boarded up storefronts, the tent cities, the crumbling infrastructure, the payday loan stores, the aggressive police, the people living in their cars, the people working for peanuts at Amazon and Wal-Mart, the foreclosures, the opioid overdoses, the suicides, and on and on and on.

And what did the professional economists continue to tell us? That none of it was happening! There was nothing to worry about, they insisted. After all, the statistics informed us that everything was fine. Throughout it all, economists assured us that free trade was good for everyone, full stop, and both political parties agreed with that assessment. This was the unassailable word of the so-called experts—the very smart economic “scientists” with high IQs and fancy degrees.

During this same time period, economists also told the public that there was little to no inflation. Now there really has been very little inflation, based on what inflation actually measures—a sustained increase in the average prices of goods you normally buy over time. As stated above, the prices of goods actually fell during this time, due to things like global wage arbitrage, automation, price competition by emerging oligopolies, and efficiency gains. Whether it’s towels, furniture or silverware, previous generations often paid much more for their manufactured goods than we do. The price of computers and electronic goods has fallen sharply, to the point where even poor households can afford large flat-screen televisions and smartphones.

The problem is, the average American doesn’t understand what “inflation” is as economists define it. All they know is that their paycheck doesn’t go as far as it used to. They saw the costs of housing skyrocket. They saw education and health care costs climb relentlessly, year after year. Headline inflation measures capture little of this, and there is a reason for that: those costs are driven less by the overall supply of money than by status competition and monopoly. Real estate is a local market, and the reason for its precipitous rise in growing urban areas is the one we already touched on above.
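To see how the disconnect arises mechanically, here is a toy CPI-style calculation (Python; every weight and price change below is hypothetical, chosen only to illustrate the arithmetic, not taken from any real index). A headline inflation number is a weighted average of price changes across a basket, so falling goods prices can mathematically offset soaring costs for housing, health care, and tuition:

```python
# A toy CPI-style index: the headline number is a weighted average of
# price changes, so the basket weights matter as much as the prices.
# All weights and price changes below are hypothetical, for illustration.
basket = {
    # category: (basket weight, year-over-year price change)
    "manufactured goods": (0.40, -0.02),  # cheap imports: prices fell
    "food":               (0.15,  0.01),
    "housing":            (0.30,  0.06),
    "health care":        (0.10,  0.08),
    "tuition":            (0.05,  0.09),
}

def headline_inflation(basket):
    """Weighted average of price changes across the basket."""
    return sum(weight * change for weight, change in basket.values())

# The weighted average comes out to a modest +2.4%...
print(f"headline: {headline_inflation(basket):+.1%}")

# ...even though the categories a cash-strapped family cannot avoid
# (housing, health care, tuition) all rose 6-9% in this toy example.
```

The point of the sketch is not that any real index used these numbers, but that "low inflation" and "my necessities are bankrupting me" can both be true at once, depending on what is in the basket and how heavily it is weighted.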

Nevertheless, such sophisticated arguments fell on deaf ears. Economists persistently told Americans that inflation was low, while the fixed costs of necessities like housing and health care—the very things the headline measures seemed to discount—were killing them. Economists did a poor job of explaining this logic to the public, in large part due to elitism. Because the very things Americans were going broke paying for seemed to be left out of the inflation calculus, people began to assume that economists were somehow “cheating” or “covering up” these costs on purpose. The fact that these were “official government statistics” made people lose faith in the veracity of what the government was telling them more broadly: “How can ‘inflation’ be low when a hernia operation costs $100,000 and my school just doubled my tuition?”

The other thing economists told middle America was that the unemployment rate was low. This, of course, was the “official” unemployment rate. Because this rate was low, politicians were able to wave away their constituents’ concerns about rising costs and inequality. After all, if the official unemployment rate was low, they thought, what were these people complaining about?

But these official unemployment statistics covered up a very different reality experienced by ordinary people on the ground. Sure, unemployment was officially low, but most of the jobs were awful! Competition for higher-paying jobs became ever more fierce over the years, and good-paying jobs with benefits ever more out of reach for most people, especially if you didn’t happen to live in an urban area. Big corporate employers in the service sector routinely pared back working hours to avoid paying benefits, and even if you worked just one hour a week, you were counted as officially employed. Underemployment was also not counted, meaning that people who had gone out and gotten expensive degrees but could only find low-paying wage work were invisible in the statistics. People who dropped out of the workforce were not counted either, and neither were prisoners—both significant numbers of Americans. Finding a job in the era of automation and outsourcing became something like a game of musical chairs.
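The headcount logic can be made concrete with a toy example (Python; the population numbers are invented for illustration, and the “distress” measure below is my own ad-hoc construction, not an official statistic). Counting anyone with even one hour of weekly work as employed, and dropping those who stopped looking from the labor force entirely, keeps the official rate low:

```python
# A toy labor survey illustrating why a low "official" unemployment rate
# can coexist with widespread economic distress.
# All population counts are hypothetical, for illustration only.
people = (
    [{"status": "full_time"}] * 500
    + [{"status": "part_time_wants_full"}] * 150        # underemployed
    + [{"status": "one_hour_gig"}] * 50                 # counted as employed
    + [{"status": "unemployed_searching"}] * 40
    + [{"status": "discouraged_stopped_looking"}] * 60  # out of labor force
)

EMPLOYED = {"full_time", "part_time_wants_full", "one_hour_gig"}

def official_rate(people):
    # Official-style definition: unemployed AND actively searching,
    # divided by the labor force (employed + actively searching).
    labor_force = [p for p in people
                   if p["status"] in EMPLOYED
                   or p["status"] == "unemployed_searching"]
    unemployed = [p for p in labor_force
                  if p["status"] == "unemployed_searching"]
    return len(unemployed) / len(labor_force)

def distress_rate(people):
    # Ad-hoc broader measure: everyone not in a full-time job, including
    # the underemployed and those who gave up looking.
    distressed = [p for p in people if p["status"] != "full_time"]
    return len(distressed) / len(people)

print(f"official rate: {official_rate(people):.1%}")  # 40/740, about 5.4%
print(f"distress rate: {distress_rate(people):.1%}")  # 300/800, 37.5%
```

The same survey yields a politically comfortable single-digit headline and a broader measure several times larger; which number gets reported is a choice of definition.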

So this divide between lived experience and “official” government statistics further deepened the rift and sowed mistrust in political institutions and credentialed experts.

The average American also didn’t understand complex financial institutions like the Federal Reserve, which increasingly seemed to control everything from behind the scenes. All the average American saw was that Wall Street and the wealthy investor class were repeatedly bailed out and made whole at every turn, while the average citizen was left to drown during the financial crisis. This led to the rise of all sorts of kooky conspiracy theories, such as those outlined in the notorious best-seller “The Creature from Jekyll Island”, which has been aggressively pushed by libertarian conspiracy theorists like Ron Paul, who insist that “fiat money” is the real reason behind the nation’s economic pain. Such theories obscured the actual reasons for this pain: a generation of stagnant wages, financial engineering, the demise of unions, global competition, corporate consolidation, and both political parties being run by and for a small group of wealthy oligarchs.

This was the economists’ gospel in a nutshell: free trade is good; unemployment and inflation are low. That was the mantra from the eighties onward through today. And, even though some of the confusion is based on misunderstanding, this “reality” described by economists was 180 degrees opposite from what most Americans have experienced in their own lives from the 1980s onward.

So, given all of the above, is it any wonder Americans stopped trusting the experts?

Think about that. Let me just say that again: the experts told them that what they saw happening all around them was not actually happening. So that’s what I mean when I say that economists are a major reason why people have lost trust in both credentialed experts and the mainstream corporate media.

And yet somehow they trust Donald Trump. Why? Because back in 2016 Trump acknowledged that what they saw all around them was actually happening! In fact, he was virtually the only politician to do so. It’s true that a few others like Bernie Sanders did as well, yet the Democratic party was successful in stifling his message and keeping him off the ballot. The Republican Party, ironically, had much less control over its rank-and-file members. These members of the party finally had a candidate who said out loud what they all knew to be true, and had been true for a long time. He phrased it crudely, and with an undercurrent of xenophobia and racism, but at least he acknowledged what the experts had arrogantly and confidently told them wasn’t happening.

So is it really a surprise they now trust Donald Trump more than these so-called experts? Given what I outlined above, is it any wonder that the people who live in the small towns and rural villages across the country transferred their faith and loyalty from the credentialed experts to Trump? After all, the credentialed experts had been saying that free trade was good for everyone for nearly forty years. Trump said otherwise, and was the only one who did so (outside of the dissident parts of the Left that had been expelled from the mainstream Democratic party and had no political home, that is). Given the number of times I referred to columnists at The New York Times above, is it any wonder that people in small towns believe the Times is “fake news”?

Of course, as I’ve said so many times before on this site, economics is a pseudoscience, and economists are really pushing political agendas rather than doing any sort of objective “science.” But I believe that the sneering dismissal of the ignorant rabble that emanated from the ivory towers of academia over the past forty years of neoliberal globalization set the stage for the rejection of any and all expertise that we are now experiencing on the part of the common people. The blowback means that real scientists—actual, legitimate physical scientists and medical doctors—are not being listened to either, thanks to the specious scientific pretensions that economists claimed while free trade was gutting the middle class. To the average American, these are yet more experts pissing on their leg and telling them it’s raining, just like the economists did for all those years. Why should we believe them?

For example, the conspiracy theories invoked to explain why inflation was low in the statistics but seemingly high in real life took on a life of their own. After all, if you can believe in a secret cabal of bankers and politicians running the Federal Reserve, and government statisticians manipulating the unemployment rate, is it that big a leap to believe in a secret cabal of businessmen deliberately engineering a recession, or a secret cabal of virologists secretly engineering a global pandemic? We’ve practically been primed to believe it, thanks to economists dressing up political opinions as economic “science” over the past several decades.

The real reason for the economic pain of so many Americans was obscured because it had to be. If people really knew the truth, it would inevitably lead to a push for Leftist politics of the type promoted by Bernie Sanders, and this is the greatest fear among the oligarchs who run America. To avoid that (from their standpoint, terrifying) outcome, the oligarchs had no choice but to peddle paranoid conspiracy theories as the alternative. But now, like the sorcerer’s apprentice from the fairy tale, they have lost control of their own creation. The politics of conspiracy and paranoia have been let loose from Pandora’s Box and are beyond anyone’s ability to control and manipulate at this point. The duplicity of economists, the corporate media and politicians pushing globalism as good for everyone has destroyed the credibility of all experts, not just economists. It has killed faith and trust in media and the experts, no matter how reasonable or accurate those experts may be. This will not end well. We are truly lost, and cannot even find our way to the truth anymore, nor recognize it if we could.

But it all started with economists. Remember that.

ADDENDUM: The economics profession was also instrumental in getting us to ignore environmental limits and denying the consequences of climate change. In doing so, they attacked the credibility of actual scientists, and that has also contributed to the lack of faith in experts we are seeing today.

Hamstrung by Ideology

Americans generally believe that business and government are somehow in opposition; that government can only “interfere” in the workings of business and markets, and that “the economy” is something totally separate and distinct from the rest of society, including from political decisions and social cohesion.

The reason they think this is because of the pervasive libertarian ideology promoted by conventional neoclassical economics. And by libertarian, I’m referring to the systemic bias that pervades all conventional capitalist economics, not just the radical extremist ideology that goes under that name. Neoclassical and classical economics fosters the belief that “economics” must be kept wholly separate from every other aspect of society.

The Chinese, coming from a Marxist—and Confucianist (although in this case, Legalism is probably the better fit)—perspective, believe no such thing. They know that business and government are really the same thing, and always have been, and they make no bones about it. They are free from the Western delusion that there is some sort of “pure” capitalism, free from the taint of government intervention, or the delusion that such a thing is even possible. They do not have the ideological commitment to the “invisible hand,” or the blind faith that anarchic markets will automatically lead to beneficial social outcomes.

I had that thought reading the following paragraph by Adam Tooze:

As Trump’s trade warriors point out, the range of instruments that China deploys in industrial competition makes a nonsense of trade policy as defined by the WTO. Complexity and opacity are key to the success of China Inc. As Blustein shows in an illuminating cameo about tractor tyres, the network of state support for Chinese industry extends from central and local government grants and tax exemptions to subsidised land deals, cheap electric power and a raft of subsidised low interest loans, from the government as well as public and private banks. When rubber prices surged in the early 2000s Beijing devised a scheme to supply it at a reduced price and gave a set of inducements to rubber producers. The arrangements are all-encompassing yet almost entirely deniable, as the American lawyers retained by Chinese firms demonstrate when they face unpleasant questions from the US Department of Commerce.

Whose Century? (London Review of Books)

“Trump’s trade warriors,” as Adam Tooze calls them, represent the standard American perspective that government should “butt out”, i.e. “not pick winners and losers.” That markets should be free to run themselves and that government should not “interfere.” This comes from a blind commitment to libertarian ideology.

The Chinese know that this is nonsense. They know that production and governance are inseparable. True, it’s no longer centrally planned as in the old days. But the myopic faith in an anarchic market to achieve ideal outcomes is a flaw that the Chinese do not possess. It’s an advantage of coming from a non-Western perspective free from the blinders imposed by neoclassical economic thinking as developed in the West. Of course the government manages the commanding heights of business and trade. What else would you expect?

The Chinese view is the more historically accurate one. In the West, the fairy tale is told of plucky businessmen succeeding despite being frustrated at every turn by petty government bureaucrats. This tale was further enhanced by fabulists like Ayn Rand, who peddled this nonsense for ideological reasons while having no knowledge of economic history, or even any experience in the actual business world.

Marxists, by contrast, have always been fully aware of how the state creates and sustains the capitalist economy, and always has. From the passing of laws, to issuing and regulating the supply of currency, to the establishment of limited liability corporations, to the building of infrastructure, to the selling off of formerly public lands to private interests, to the implicit assumption of risk, to the issuing of bonds, to intellectual property laws, to publicly-funded research, to numerous subsidies, to a basic social safety net, to K-12 mass schooling, to the provisioning of police and military to enforce contracts and property rights—the list of how government and business interests are intertwined—not opposed—goes on endlessly. There is no “great wall” dividing a self-contained intellectual abstraction called “the economy” from all the other aspects of human life in this world.

It seems the ideological blinders conferred on us by libertarian classical and neoclassical economists are—ironically—causing the West, and especially the U.S., to fall behind at the game it supposedly invented.

And, speaking of ideology, it was also ideology that made globalism such a problem in the U.S., specifically the frontier ideology of self-reliant “rugged individualism,” in which honest, hard-working people never require outside help or “handouts.” This ideology insists that, rather than letting “the government” take care of you, you should bootstrap your way out of your circumstances through grit and pluck.

This, of course, is absolute nonsense, but it’s the dominant ideology of the Republican Party and of conservative philosophy more generally. In the U.S., it manifests itself in the idea that “welfare” is inherently a bad thing, and that anything the government does to help its citizens is “communism.” This is the reason the “China Shock” was so uniquely bad in the U.S. compared with other countries that were just as exposed to neoliberal globalism. The reason you didn’t see the same backlash to “free trade” elsewhere is that those countries decided to take care of their citizens instead of throwing them under a bus:

Every advanced economy in the world – Japan, South Korea, European countries (Italy in particular) – felt the ‘China shock’. But only in the US has it led to the kind of political crisis we have witnessed since 2016. It is this that requires explanation. …Given the resources of American government, a shock on this scale could have been cushioned through spending on welfare, education, reinvestment and relocation. But that would have required creative politics, which is precisely what has been obstructed by the Republicans. Instead the problem wasn’t addressed, unleashing a pervasive status anxiety among lower-middle-class and working-class white Americans, especially men. It was in the counties where the highest number of jobs were lost because of the China shock that Trump scored best in the 2016 election.

Since the Clinton era, the Democratic establishment has held up its side of the bargain, deflecting opposition to globalisation from trade unions. What it did not reckon with was the ruthless cynicism of the Republican Party in opening its doors to xenophobic, know-nothing white nationalism, inciting talk of a nation betrayed and swinging over to protectionism. The Democrats also didn’t take into account the dogged refusal of the Republicans to co-operate in their efforts to patch together America’s welfare state, even, or especially, when it came to fundamentals such as unemployment insurance and health coverage…

In other words, if we hadn’t been so wedded to “government bad” and “society owes you nothing” attitudes, and if the elites had been even a little less rapacious, we would not have seen entire swaths of the country reduced to sub-third-world status, and hence the rise of authoritarian right-wing populism. In the U.S., for example, even health care is tied to having a job, and instead of dealing with that problem, the politicians of both parties chose a politics of distraction and misinformation that has led us to where we are now.

Due to an ideological distaste for “big government solutions” and “government handouts,” inherited from libertarianism, the only avenue left for aspiring populist politicians was to promise to somehow “bring the jobs back,” so that workers could head back into the factories and “earn” the basics of life like health care and the money to pay for food and shelter. But, of course, this will not work. U.S. manufacturing continues to expand output, even while shedding workers. It was China’s low wages that made it predominant—low wages that would not work given the high fixed costs of food, education and housing in the U.S. High-wage manufacturing jobs were replaced with the “service economy,” and the ideological conception that what we earn is entirely down to our own personal “marginal productivity” (again promoted by neoclassical economists) led to opposition to any effort to raise the bar for wages.

In both of these cases, we can see how hidebound ideological blindness prevented the U.S. from taking the steps that other countries have taken effectively, steps that have produced much more successful 21st-century societies outside the U.S.—whether it’s Europe’s social democracy or China’s state-managed capitalist/communist hybrid. Since both of these options are effectively off the table due to our ideological commitments, all Americans can do is cry to the heavens that the imaginary libertarian world that “should” exist is nowhere to be found, as we continue to circle the drain of history.

“China under the control of the CCP is, indeed, involved in a gigantic and novel social and political experiment enrolling one-sixth of humanity, a historic project that dwarfs that of democratic capitalism in the North Atlantic.”

It’s a long piece, worth reading in full: Whose century? (London Review of Books)

BONUS: To prove the point made above:

Tesla Motors Inc., SolarCity Corp. and Space Exploration Technologies Corp., known as SpaceX, together have benefited from an estimated $4.9 billion in government support, according to data compiled by The Times. The figure underscores a common theme running through his emerging empire: a public-private financing model underpinning long-shot start-ups.

The figure compiled by The Times comprises a variety of government incentives, including grants, tax breaks, factory construction, discounted loans and environmental credits that Tesla can sell. It also includes tax credits and rebates to buyers of solar panels and electric cars.

A looming question is whether the companies are moving toward self-sufficiency — as Dolev believes — and whether they can slash development costs before the public largesse ends.

Elon Musk’s growing empire is fueled by $4.9 billion in government subsidies (L.A. Times)

Things Covid-19 Has Proven Are True

I found this on one of the files on my jump drive. I don’t think these really need any more elucidation, because they should be self-evident, so I’m just going to put them out there. This list will probably expand over time.

The stock market is not the economy.

Health care should not be tied to employment.

Taxes do not fund government spending.

There is no shortage of money.

Globalized, just-in-time supply chains are fragile.

A lot of the work we do is pointless and nonessential. Or, put another way, jobs are more about earning the money to live rather than doing socially useful work.

The most important workers in society are often the least paid.

American politicians are corrupt and incompetent.

The notion of the “service economy” is bogus, and always has been.

“Small government” is not an inherent virtue.

A functional social safety net is actually good for business.

Feel free to add on to the list.

BONUS: This is a good perspective from a commenter on Naked Capitalism:

“I’ve been referring to Coronavirus for a while as the world’s most effective stress test of institutions, maybe the biggest such experiment in history. It has unerringly found the weak link in every country and society its hit – whether that weak link being weak institutions, stupid politicians, sclerotic bureaucracies, religious nutcases, institutional groupthink, authoritarian tendencies or whatever. In the US its found not just one, but a whole series of weak links it can exploit. The results are not pretty.”

BTW: I’m still not happy with the blog layout. Does anyone have any suggestions?

Random Observations from Late Capitalism

What’s up with the “trying to build my empire” line on Instagram profiles lately? Really, everyone is trying to build an empire? How about just trying to live your fucking life? Oh, right, anyone who does that is “lazy.” Everybody is trying to be Jeff Bezos, since being an ordinary person apparently doesn’t cut it anymore. Since when is the goal of the average person “building an empire”? (Of course, I know the answer: because Neoliberalism.) I don’t think there are enough resources for 200+ million people in the U.S. to each have their own “empire”—by definition an empire consists of the ruler and the ruled: an emperor and subjects.

How is it that people say these things uncritically? It’s like Neoliberalism in America has become so internalized, so ingrained, so much the water in which we swim, that it’s penetrated into our very soul.

And what’s up with all the MLM stuff everywhere? It seems like every woman on Instagram is engaged in some sort of multilevel marketing sales scheme. There are the old standbys like makeup (Mary Kay, Avon), with all sorts of new skin care/beauty products joining the mix. The latest schemes are things like essential oils and products made out of CBD. Then there are the always-popular health supplements. Everyone’s social media now is hawking product (when not posting conspiracy theories). It’s like everyone is a seller, but there are no buyers.

And are we ever going to reach Peak Supplements? It seems like every celebrity in the world is hawking some sort of magic pills. Tom Brady is the latest to jump on the bandwagon in a field pioneered by luminaries like Alex Jones, Joe Rogan, Gwyneth Paltrow, Tim Ferriss, and countless televangelists. It’s like their core business model is deliberately attracting an audience of gullible paranoiacs so they can continually sell them useless shit. It was only a matter of time before a politician like Trump used this exact formula to win and maintain political power as well. I have a hunch he’s just the beginning.

This is what we call “business” today?

And related to that, whenever someone signs up to shill for one of these companies, they often say something like “proud to be a member of the [insert scam product] family.” Family? Really??? What’s with this idea that all of these money-making operations are some sort of family? Um, you’re not a member of a family; you’re an employee. How sad is it that we have been gaslit into seeing employers as our families—employers that owe us nothing and will terminate us without remorse based on numbers on a spreadsheet.

To say the language of late capitalism is Orwellian doesn’t do it justice.

And has anyone noticed the posters with exhortatory messages that have sprung up all over capitalist workplaces? It’s like something out of the late stage Soviet Union. Dig that coal, bale that hay, tote that barge!

And no wonder. Just like in the late Soviet Union, morale has long since eroded away, replaced by dark cynicism and gallows humor over things like unpayable debt and health care bills. Any connection between “hard work” and reward has been severed for the vast majority of people. Whatever class you’re born into, that is where you’ll stay. So we need to constantly encourage the proletariat to keep their noses to the grindstone, because previous motivators like prosperity, stability and social advancement are long gone.

And the similarities to the late Soviet Union don’t stop there. We have the spectacle of a fossilized gerontocracy exemplified by Trump, Joe Biden, Nancy Pelosi, et al., unable to respond to the dire challenges facing the country, just like the uninspiring grey bureaucrats of the late-stage Soviet Union. We have a cynical youth alienated from the political process and facing declining living standards with no realistic way to change course. We have a media that’s basically party propaganda. We have mass spying on the citizenry beyond anything the Soviets could muster. And lately, we even have secret police “disappearing” people off the streets of American cities to unnamed dark sites.

It’s capitalism run amok. We’re not just workers anymore—we’re all perennial hustlers; we’re all an “empire of one.” We’re popping magic pills for “total human optimization” while waiting for “our ship to come in.” We’re substituting instrumental money relationships for genuine ones. Every parent is working like a madman to give their kids any edge in the unremitting status tournament of American life. As Chris Rock observed, “anytime you’re talking to an American, you’re really talking to their agent.” And it’s only gotten worse since he first made that observation.

This culture is so toxic. It’s irredeemable. It’s one reason why I feel so alienated and alone in America. How can you possibly relate to anyone else in a culture like this? How can you have any kind of genuine relationship when everyone around you is a hustler; when everyone is spending every waking moment climbing the status ladder and “building their empire”? It’s just so hopeless.

New Old Architecture

In Europe, you are surrounded everywhere you turn by majestic stone architecture. The Gothic cathedrals and castles that occupy the cities and countryside of Europe provide copious examples of the wonder and beauty of stone architecture.

Stone is one of the oldest materials used in construction. It is also the most durable. The oldest surviving buildings in the world are carved from stone, or made from assembled stones. Göbekli Tepe is made from stone. The Pyramids are made from stone. It’s also a very local material—when you are building from local stone, it gives a place a distinctive feel. That’s why Paris looks the way that it does: the cream-colored Lutetian limestone quarried from the banks of the Seine.

The reason, I think, that we find stone such a compelling material is that, before we built our own structures, we occupied the “natural rooms” that the earth made for us—caves. The stone walls of caves, lit by tallow lamps and torches, and illuminated with spectacular artwork, were our earliest permanent homes, and our earliest cathedrals.

The stone walls in a Gothic cathedral or medieval castle are bearing walls, being both the source of shelter from the elements and supporting the overall structure. The walls of modern buildings, by contrast, do not carry any load besides their own weight. The structure is separate, usually a skeleton frame of steel or concrete. These are typically either curtain walls (which are clipped onto or tied back to the supporting structure), or infill walls (sitting on the structure and filling the gaps within it.)

Bearing Walls: Monolithic Masonry Construction (Columbia University)

Walls in commercial construction today are usually cavity walls, consisting of a facing material held to the structure by some sort of clip system, creating a cavity between the supporting structure of the wall (typically studs or masonry) and the veneer. This cavity is designed to resist the penetration of water, since liquid moisture cannot leap across a cavity. The cavity also gives us a place to put the insulation.

Facing materials are usually panelized systems of fiber-cement, metal, porcelain, treated wood, phenolic resin, or some other weather-resistant material. Even in walls that appear to be solid brick or stone, the brick or stone is merely a facing material held by clips to a wall usually comprised of wood or metal studs.

Which is what made this article so fascinating to me: The miracle new sustainable product that’s revolutionising architecture – stone! (The Guardian)

The article talks about stone used as an actual self-supporting wall rather than a thin veneer, and even as a potential bearing material.

The article is based on a London exhibition of architecture which uses stone as a true building material rather than just a facade veneer. It’s entitled The New Stone Age. Here is the BBC’s coverage:

Design’s new stone age is here (BBC)

Featured prominently is 15 Clerkenwell Close, a six-story building by architect Amin Taha, which uses cut stones as the facing material of the building. The stone is deliberately left in the condition it is quarried in rather than being dressed, leading to a variegated facade that resembles an urban ruin. This approach has pleased some, and left others so distressed that they launched a campaign to tear the building down!

…The result looks like what might have happened if Mies van der Rohe had been weaned on The Flintstones. It features a load-bearing exoskeleton made of massive chunks of limestone brought straight from the quarry.

The blocks have been left with their raw quarrying marks exposed and stacked on top of each other to form columns and beams. Some of the slabs’ faces show the lines where they were drilled from the rock face, others are sawn smooth as if cut by a cheese wire, while some bear the rugged texture of the sedimentary seam, freshly pried from the Earth’s crust.

The building’s geological power was too much for one Islington councillor, who complained that the “awful” building was out of keeping with the historic neighbourhood, and ensured that a demolition notice was issued, based on a supposed breach of planning permission. Taha finally won the case last year, on the proviso that the smooth stone be roughened up to look like the rest (a process which, after testing, has thankfully proven structurally too risky to carry out).

The problem with a solid stone wall is that there is no insulation or waterproofing layer as there is in a typical cavity wall. From the details on the architect’s web site, it looks like there is a secondary wall behind the stone facade that accomplishes these functions. The stone is stabilized by metal anchors which tie it back to the main structure.

Another building very similar to the one discussed in the Guardian article is 30 Finsbury Square, also in London, by Eric Parry Architects. Unlike Clerkenwell Close, the stone here is dressed and smooth, and the facade is designed in a rationalist manner reminiscent of Italian rationalists like Aldo Rossi or Giuseppe Terragni.

There are a few instances where stone is used as both shelter and bearing material. For example, the article prominently features a photo of this winery in France:

Delas Frères Winery in France (ArchDaily)

And another example: a radio station in the Himalayas that appears to be built out of solid stonework. It’s difficult to imagine this being built anywhere else, though; I don’t think you could build something like this in downtown London or an American suburb.

The BBC article also mentions Can Lis, the house in Mallorca that Jørn Utzon (architect of the Sydney Opera House) built for himself.

The article also prominently features a French firm, Perraudin Architecture, which builds using stone as a structural material, as opposed to just a veneer or facade material. This gives their projects an amazing texture and heft that you just don’t see often in modern architecture. I would imagine France has a tradition of stonemasonry that goes very far back, indeed.

House made of solid stone in Lyon by Perraudin Architecture (Dezeen)

The building is entirely built up in load-bearing limestone walls of 40 cm. Precise coursing elevations define each stone, to be extracted, dimensioned and numbered in the quarry and then transported to the site. There, they are assembled like toy blocks using nothing but a thin bed of lime mortar.


And another example with timeless beauty—social housing in France built out of solid exposed stone walls:

No paint or plaster was added to the walls, so the stone surfaces are left bare to display traces of the quarrying process. Projecting courses of stone on the exterior mark the boundaries between floors and help to direct rainwater away from the windows.

Social housing with solid stone walls by Perraudin Architecture (Dezeen)

But what the article emphasizes is quarried stone as a more environmentally friendly alternative to concrete. The concrete production process produces an enormous amount of carbon dioxide, whereas stone can be used as quarried, directly from the ground. From the Guardian article:

When you step inside the Building Centre, you are immediately confronted with a large model of a speculative proposal for a 30-storey office tower – designed to be made entirely from stone. It looks like a series of Clerkenwell Closes stacked on top of each other, the chunky stone columns getting progressively thinner as they rise towards the clouds.

“We wanted to prove that a solid stone tower is eminently possible,” says Taha, handing me a substantial technical report that makes a hard-nosed case for such a building on grounds of both cost and carbon footprint. Using stone for the core, structure and floors, they argue, would be 75% cheaper than a steel and concrete structure, and have 95% less embodied carbon. The primary reason for the saving is that, while concrete and steel have to be fireproofed, weathered, insulated, then clad, a stone exoskeleton can be left exposed…

“Stone,” says architect Amin Taha, “is the great forgotten material of our time. In 99% of cases, it’s cheaper and greener to use stone in a structural way, as opposed to concrete or steel, but we mostly just think of using it for cladding.”…

The tactile qualities of stone are clear, but, for Taha, the environmental argument is what makes it such an important material to champion. “As a profession, we’re not thinking clearly about the embodied energy of building materials,” he says. “The perverse thing about concrete is that you take limestone, crush it, then burn it, by which time it loses 60% of its structural strength – so you then have to put steel reinforcement inside it. It’s total madness.”

By embracing stone as combined superstructure and external architectural finish, he says, we can save 60-90% of CO2 emissions for these key building elements. “And we’re standing on a gigantic ball of molten rock, so we’re not going to run out of stone any time soon.”

The miracle new sustainable product that’s revolutionising architecture – stone! (The Guardian)
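Taha’s claimed 60–90% CO2 saving is easy to sanity-check with a back-of-envelope calculation. The per-cubic-metre embodied-carbon figures below are rough illustrative assumptions of mine, not numbers from Taha’s report, but they land in the same range:

```python
# Back-of-envelope embodied-carbon comparison for a structural frame.
# The per-m3 figures below are rough illustrative assumptions, not
# measured data: reinforced concrete is commonly cited at roughly
# 400-500 kg CO2e/m3, quarried sawn stone at well under 100 kg CO2e/m3.
EMBODIED_CO2E = {              # kg CO2e per cubic metre (assumed)
    "reinforced concrete": 450,
    "quarried limestone": 60,
}

def frame_emissions(material: str, volume_m3: float) -> float:
    """Total embodied CO2e (kg) for a frame of the given volume."""
    return EMBODIED_CO2E[material] * volume_m3

volume = 1000  # m3 of structure, roughly a mid-rise frame (assumed)
concrete = frame_emissions("reinforced concrete", volume)
stone = frame_emissions("quarried limestone", volume)
saving = 1 - stone / concrete
print(f"concrete: {concrete / 1000:.0f} t CO2e")
print(f"stone:    {stone / 1000:.0f} t CO2e")
print(f"saving:   {saving:.0%}")
```

With these assumed figures the saving comes out around 87%, comfortably inside the 60–90% range quoted above; the gap is driven almost entirely by the clinker-burning step that stone simply skips.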

Compare this to concrete:

After water, concrete is the most widely used substance on Earth. If the cement industry were a country, it would be the third largest carbon dioxide emitter in the world with up to 2.8bn tonnes, surpassed only by China and the US….Taking in all stages of production, concrete is said to be responsible for 4-8% of the world’s CO2. Among materials, only coal, oil and gas are a greater source of greenhouse gases. Half of concrete’s CO2 emissions are created during the manufacture of clinker, the most-energy intensive part of the cement-making process.

But other environmental impacts are far less well understood. Concrete is a thirsty behemoth, sucking up almost a 10th of the world’s industrial water use. This often strains supplies for drinking and irrigation, because 75% of this consumption is in drought and water-stressed regions. In cities, concrete also adds to the heat-island effect by absorbing the warmth of the sun and trapping gases from car exhausts and air-conditioner units – though it is, at least, better than darker asphalt.

It also worsens the problem of silicosis and other respiratory diseases…

Concrete: the most destructive material on Earth (The Guardian)

Plus, there’s just something about the “feel” of natural stone that can’t be captured by other materials. That’s why it has faced our buildings from ancient Egypt and ancient Rome to medieval Europe. Due to its “natural” qualities and heft, a solid stone wall simply “feels” better than a modern veneer wall, in my opinion. In addition, stone and brick acquire a warm, pleasing patina over time, and are amenable to all sorts of creative expression not possible with panelized systems, and certainly not with aluminum curtain walls. Brick Expressionism (Wikipedia), for example, was common in the early twentieth century, and the beauty and variety of carved stone facades speak for themselves.

Alongside its sustainable qualities, it’s the material’s one-off nature that really appeals to the design world. “People increasingly want the authentic beauty and inconsistencies of natural stone,” says Solid Nature’s David Mahyari, “imitation ceramic tiles include realistic veins but have a repeat pattern like wallpapers, so you can tell quickly that they’re fake.”

Its age is also a factor. “Stone is a material that is millions of years old. Can you imagine this? I am completely convinced that this dimension also changes the way we relate to a stone object, establishing a different kind of connection with it and making it, somehow, more precious.”

London-based stone carver Simon Smith backs this up: “If the stone ‘takes a polish’, it’s like opening the door of the stone and seeing deep into it, and millions of years back in time.”

I’m not alone. I recently ran across this paragraph describing a project that used stone for vertical shading devices (The Jackman Law Building at the University of Toronto). It explained why the designers fought for natural stone instead of precast concrete for the shading devices:

The choice of stone for the shade fins stems from an aspiration to counter a look of mindless mediocrity that [Architect Siamak] Hariri sees being inflicted on cities by the widespread use of ersatz materials. Imitations lack the dignity, patina, and subtle variety of natural materials, he says, and he advocates for beauty as a value in its own right, as well as for its contribution to durability: “A really good building is one that people will not let be taken down.”

Continuing Education: Vertical Shading Devices (Architectural Record)

Traditionally, stone and brick cannot span spaces except by using an arch, a vault (basically an extruded arch), or a dome (a revolved arch). Thus, the spanning material in stone buildings was often timber, or steel beams and trusses in newer buildings.

There is a way, however, to have stone span spaces: the flat stone vault, patented in 1699 by French engineer Joseph Abeille. Abeille’s vault has recently been used on an innovative project in Jerusalem: a gift shop added on to an old Crusader church:

The columns of the new shop are made out of massive stone, and the ceiling is a flat stone vault composed of 169 interlocking voussoirs. The system is inspired by the invention of French engineer Joseph Abeille (1673–1756), who patented in 1699 a special system that allowed the building of flat vaults.

The Flat Vault / AAU ANASTAS (Arch Daily)

The flat stone vault is completed! (AAU ANASTAS)

Years ago Low-Tech Magazine did a story on the Timbrel and Catalan vaults:

Tiles as a substitute for steel: the art of the timbrel vault (Low Tech Magazine)

The Nubian Vault is constructed of mud brick without requiring any temporary formwork:

They’ve even used such vaults to construct multi-story buildings without utilizing any concrete or steel:

The Sustainable Urban Dwelling Unit (SUDU) (No Tech Magazine)

These structures can be subsumed under the rubric of reciprocal supporting structures, in which each structural member supports every other member in turn, with a few members transferring the total load to the ground or supports (an interesting metaphor for society, no?). Reciprocal supporting structures are becoming increasingly popular.

Some examples of basic brick monolithic walls are Louis Kahn’s Indian Institute of Management, and this house in Vietnam, which uses a perforated brick skin to wrap the house:

Incidentally, I’ve noticed a distinct trend in modern architecture to have not just a single skin, but to divide the exterior from the interior using layers—for example, a layer of sun screening, or a layer for privacy as in the house above, or balconies wrapped around the building to create a “semi-private” space, as in this building:

Colonnades line the terraces of Antonini Darmon’s Arches Boulogne apartments (Dezeen)

I wonder if cementitious foam insulation (Airkrete), sprayed inside a cavity, could give stone the necessary R-values to be used as an exterior wall without a second layer. Waterproofing could be accomplished by a hydrophobic coating. Such a wall would have decent thermal and waterproofing performance, not to mention being practically permanent (and beautiful, too!).
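For what it’s worth, the arithmetic behind that speculation is simple, since the R-values of layers in series just add. The per-inch values below are generic textbook-style assumptions (dense limestone is a poor insulator at roughly R-0.08 per inch, while cementitious foams are quoted around R-3.9 per inch), not manufacturer data:

```python
# Rough R-value sketch for a solid stone wall with a foam-filled cavity.
# Per-inch R-values are generic assumptions, not product data.
LAYERS = [
    ("limestone, 8 in", 8 * 0.08),         # stone adds almost nothing
    ("cementitious foam, 4 in", 4 * 3.9),  # the cavity does the work
    ("interior finish + air films", 1.0),
]

total_r = sum(r for _name, r in LAYERS)
for name, r in LAYERS:
    print(f"{name:30s} R-{r:5.2f}")
print(f"{'total':30s} R-{total_r:5.2f}")
```

Under these assumptions the wall lands around R-17, in the neighborhood of a code-minimum framed wall in much of the U.S., and nearly all of it comes from the foam; the stone itself is thermally almost worthless, which is exactly why the cavity is needed.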

And there are some other promising new materials that have both a structural use and a beautiful texture. One that’s getting a lot of attention is cross-laminated timber (CLT). CLT consists of wood planks glued together to create a structurally stable panel (NLT, or nail-laminated timber, uses nails to hold the planks together). The layers of planks are set at right angles to each other to provide structural stability, similar to how plywood is made, just at a larger scale. It’s part of a growing suite of mass timber technologies:

Mass timber is a generic term that encompasses products of various sizes and functions, like glue-laminated (glulam) beams, laminated veneer lumber (LVL), nail-laminated timber (NLT), and dowel-laminated timber (DLT). But the most common and most familiar form of mass timber, the one that has opened up the most new architectural possibilities, is cross-laminated timber (CLT).

To create CLT, lumber boards that have been trimmed and kiln-dried are glued atop one another in layers, crosswise, with the grain of each layer facing against the grain of the layer adjacent.

Stacking boards together this way can create large slabs, up to a foot thick and as large as 18-feet-long by 98-feet-wide, though the average is something more like 10 by 40. (At this point, the size of slabs is restricted less by manufacturing limitations than by transportation limitations.)

Slabs of wood this large can match or exceed the performance of concrete and steel. CLT can be used to make floors, walls, ceilings — entire buildings.

The hottest new thing in sustainable building is, uh, wood (Vox)

What makes CLT so compelling comes down to two things. First, the wood facing of the material provides a beautiful surface which can be left exposed on the inside (on the outside you will still require waterproofing, insulation and cladding). But perhaps the most attractive feature is that, since the panels are made from trees, they remove carbon from the air instead of adding to it.

Unlike stone, wood is commonly used to span, and has been the most common material to do so since ancient times. The modern use of CLT leads to a wide range of structural expressions, with nearly endless variation:


Mass Timber Primer (Canadian Architect)

CLT is the hot material of the moment, and many designers are clamoring to build innovative large-scale structures with it. There are all sorts of proposals out there, from medium-sized buildings to skyscrapers (because we always have to build skyscrapers out of the hot new material for some reason). For smaller-scale and residential structures, though, I wonder why structural insulated panels (SIPs) are not more popular. Those have been around for a long time (an innovative use of SIPs is the Ashen Cabin by HANNAH Architecture and Design).

Once upon a time, wood was a primary building material across much of the world. But with industrialization, that changed in the West.

German architect Arnim Seidel explains that steel and concrete became the dominant building materials to meet 20th-century demands: wide bridges, tall buildings, heavy loads.

“Wood came to be seen as backwards,” Seidel told DW.

Now, its environmental advantages are being recognized.

Materials like steel and concrete require massive amounts of energy to produce, and are usually transported over long distances. This emits CO2 that contributes to climate change.

By some estimates, producing a ton of concrete, or about a cubic meter, generates 410 kilograms of CO2 equivalent — the same amount of energy could power an average house for more than 10 days.

Locally harvested wood from sustainably managed forests not only has a much smaller carbon footprint in its production.

Using wood in buildings also sequesters carbon dioxide. When plants perform photosynthesis, this removes CO2 from the atmosphere and stores it in the wood.

“When we build with wood, we can conserve this stored CO2 for a longer period of time, and not emit it into the atmosphere,” Seidel told DW.

Wood: renewable construction material of the future? (DW)
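The arithmetic behind embodied-carbon claims like the one quoted above is easy to sketch. Here is a rough, back-of-the-envelope illustration: the 410 kg of CO2-equivalent per cubic meter of concrete comes from the quote, while the 50 cubic meters of concrete for a modest house is purely my own assumed figure for illustration.

```python
# Back-of-the-envelope embodied-carbon estimate for concrete.
# The 410 kg CO2e per cubic meter is the figure quoted above (DW);
# the 50 m3 for a modest house's foundation, slab, and walls is an
# assumption for illustration only.
CO2E_PER_M3_KG = 410      # kg CO2-equivalent per cubic meter of concrete
HOUSE_CONCRETE_M3 = 50    # assumed concrete volume for a small house

embodied_kg = CO2E_PER_M3_KG * HOUSE_CONCRETE_M3
embodied_tonnes = embodied_kg / 1000

print(f"~{embodied_tonnes:.1f} tonnes CO2e")  # ~20.5 tonnes for the concrete alone
```

Even with these rough numbers, the concrete in a single small house accounts for on the order of 20 tonnes of CO2e before anyone moves in, which is why substituting carbon-sequestering materials like wood matters.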

Another material that is making a comeback is rammed earth:

Rammed earth is the descendant of ancient construction techniques like adobe or cob building. It can be used to build walls for many kinds of buildings, from houses to museums and even cemeteries.

The name says it all: it’s made of damp soil or earth that is placed in formwork, and then compressed or rammed into a solid, dense wall. As a construction technique, rammed earth almost disappeared with the development of reinforced concrete, but there has been a revival in interest because of its aesthetics and its perceived environmental benefits.

The carefully chosen mix of silt, sand, and gravel with a low clay content is moistened and then placed in about 4 inch deep layers between plywood forms; that’s why one sees the different colors and stripes, as often each layer is modified for aesthetic reasons. It used to be rammed by hand, but now powered rams are often used to reduce time and labor. Engineered structural reinforcing is often required.

Electric wiring and switch boxes can be built right into the wall as it goes up, so that a clean, interior earth finish can be maintained.

The structural potential of this material is more limited than the above materials. Cement-stabilized rammed earth has greater structural potential, but usually some sort of additional structure is used. Rammed earth walls tend to be mass walls, and this, along with other characteristics, limits them to fairly mild, drier climates such as the American Southwest, the Mediterranean, and Australia.

Like locally-quarried stone, using the earth from the site as a building material also anchors the building to the unique place, and allows us to surround ourselves with materials that look millions of years back in time.

The Dirt on Rammed Earth (Treehugger)

The world’s most beautiful homes are also down to earth (Curbed)

In summary, there are a lot of innovative materials, and new ways to use old materials, that add up to a wealth of design possibilities for buildings going forward. Let’s hope we can rise to the challenge and create a more inspiring built environment than has often been the case in the recent past.

* Of course it can, really, such as wind driven rain, but I’m trying to keep this simple!

Fun Facts

It’s time for a summer edition of fun facts!

The highest paid athlete of all time was a Roman charioteer; if he had lived today he would have been worth $15 billion.
https://www.thevintagenews.com/2017/01/18/the-highest-paid-athlete-of-all-time-was-a-roman-charioteer-if-he-had-lived-today-he-would-have-been-worth-15-billion/

Air pollution is responsible for shortening people’s lives worldwide on a scale far greater than wars and other forms of violence, parasitic and vector-borne diseases such as malaria, HIV/AIDS, and smoking.
https://www.escardio.org/The-ESC/Press-Office/Press-releases/The-world-faces-an-air-pollution-pandemic

California loses up to $1 billion in crops each year because of air pollution.
https://www.theverge.com/2020/3/16/21181725/air-pollution-california-crops-agriculture-1-billion

By 2010, 43,600 jobs had been lost or displaced in Michigan – and about 700,000 in the United States – due to the rise in the trade deficit with Mexico alone since NAFTA was enacted in 1994.
https://www.citizen.org/article/michigan-job-loss-during-the-nafta-wto-period/

Distribution of Household Wealth in the U.S. since 1989 (Federal Reserve)

As of December 2016, more than 129 million Americans have only one option for broadband internet service in their area – equating to about 40 percent of the country.
https://sites.psu.edu/netneutrality/2018/02/28/the-internet-monopoly/

The average consumer throws away 60 percent of clothing within a year of purchase.
https://www.treehugger.com/sustainable-fashion/we-throw-away-far-too-much-clothing.html

There are more payday loan stores in the US than there are Starbucks or McDonald’s locations.
https://research.stlouisfed.org/publications/page1-econ/2019/04/10/fast-cash-and-payday-loans

‘Idiot’ once specifically referred to somebody with the mental age of a 2 year old. ‘Imbecile’ referred to somebody with the mental age of a 3 to 7 year old, and ‘Moron’ referred to somebody with the mental age of a 7 to 10 year old.
https://eugenicsarchive.ca/discover/tree/53480acd132156674b0002c3

If cows were a country, they would be the third-largest greenhouse gas emitter in the world.

Three out of four new or emerging infectious diseases are zoonotic.

There are fewer American farmers today than there were during the Civil War, despite America’s population being nearly 11 times greater.
https://www.nytimes.com/2020/05/21/opinion/coronavirus-meat-vegetarianism.html

Five companies own 80% of all stock in S&P 500 listed companies.
https://www.reddit.com/r/LateStageImperialism/comments/cbftd4/monopoly_the_deathknell_of_capitalism/

France’s longest border is with Brazil.
https://www.indexmundi.com/france/land_boundaries.html

Rudolph Hass, the man who grew and patented the original Hass avocado tree, didn’t make very much money despite its success as most people bought one single tree and then grew vast orchards from cuttings. He only made $5000 from his patent, and remained a postman his entire life.
https://en.wikipedia.org/wiki/Hass_avocado#History

The largest ancient pyramid in the world is buried inside a mountain in modern-day Mexico underneath a church.
https://www.bbc.com/future/article/20160812-the-giant-pyramid-hidden-inside-a-mountain

There is an inverse correlation between the amount of money spent on a wedding, and how long the marriage lasts. The more people spend on the ceremony, the more likely the couple will get divorced.
https://www.insider.com/study-couples-who-spend-more-on-weddings-more-likely-to-get-divorced-2018-7

As of 2018, there are 6 PR people for every journalist. Much of the change is attributed to the 45% loss of newspaper employees from 2008 to 2017. Additionally, the current median income of PR professionals is $61,150, compared to journalists’ $46,270.
https://muckrack.com/blog/2018/09/06/there-are-now-more-than-6-pr-pros-for-every-journalist

Brazil has nearly 60,000 murders a year, more than the US, Canada, Australia, all of Europe, China, and many Asian countries combined.
metrocosm.com/homicides-brazil-vs-world/

SSC, Doxxing, and Julian Jaynes

I was planning to comment on the writeup that Slate Star Codex did on The Origin of Consciousness in the Breakdown of the Bicameral Mind by Julian Jaynes, which I was surprised had not been covered before. I suppose I’ll do it sooner rather than later, since Slate Star Codex has since been taken down. I guess that means I’ll be commenting on both topics.

As most of you are probably aware by now, the New York Times was planning on running an article about the blog that would have revealed the author’s real full name. The author, who blogs under the pen name Scott Alexander, claimed that the Times was going to “doxx” him, and that he needed to remain anonymous for professional reasons. As he describes it, removing the blog was the only way to stop the story from going out.

Now, I think the reasons he wished to remain anonymous were 100% legitimate: as a professional, there are certain ethical standards that you have to uphold, and if you have patients, having them able to read your opinions probably would color the doctor/patient relationship, which is particularly important with something like psychiatric counseling. And he also thought that being named in the New York Times would make him easier to locate, and that this would endanger the housemates he lives with, because he has received a number of death threats in the past which he apparently believes are credible (as an aside: can anyone express an opinion today without receiving death threats? What does that say about our society?)

I don’t know about using the term “doxxing” though; that seems intentionally hyperbolic. From my understanding, “doxxing” implies malicious intent. It’s deliberately publishing details about a person’s offline identity in order to threaten, harass, intimidate, or bully that person. The Times was doing no such thing—for better or worse, their policy was to use people’s real names unless there was a compelling reason to maintain a person’s anonymity (such as for informants, whistleblowers, etc.). You can certainly argue whether or not that’s a good policy (and I’m sure a lot of people think that it isn’t), but I’m sure the Times had their reasons, and there was no deliberate intent to harm Alexander or anyone else as far as I can tell from the story. For what it’s worth, I suspect this will eventually prompt the Times to change their policy, and the blog will be up again at some point in the future, so if you’re a fan of it, I wouldn’t worry.

Now, I’m hardly unbiased in this case. I too blog under a pseudonym, but for different reasons. I don’t have professional reasons to not use my real name, as I don’t have patients or clients. I do often have knowledge of confidential projects in my area, but I stringently make sure never to discuss my job or any of my professional work on this blog. And I’ve never received death threats, but even if I did, well, I live alone so if someone did decide to take me out, all that would happen is that I’d end up as dead on the outside as I am on the inside. It might even be doing me a favor.

Rather, I do it because I need to earn money to survive, and I don’t want potential employers to Google my name and find this blog or any of my opinions, even though I think they’re hardly radical or extreme. It’s sad that I have to worry about this, but that’s the world we live in. It also calls into question just how much “freedom” we really have in modern capitalist societies, but that’s a larger topic for another time. I’m scared shitless what would come up if I actually did google my real name, so I’ve never done it. When Jim put up my recent interview on The Attack Ads! Podcast, he initially published my real name, but he was kind enough to remove it and replace it with my pen name (kinder, it seems, than the New York Times!)

I have been doxxed in real life, however, and it was not a pleasant experience. I might as well go ahead and tell the story.

The last job I had before the one I have now was for a local architecture firm, which allowed me to practice again. I put the name of my employer on my Facebook profile (I know, I know, but we’ve all done stupid things in life that make us go ‘what were you thinking?’ in retrospect).

I had an acrimonious exchange on Facebook with some random asshole, but what I didn’t know was that this random asshole happened to know one of my co-workers at this firm (who was also an asshole). Thus, armed for revenge, he sent the exchange to this scumbag, who subsequently printed it out and literally took it from desk to desk around the entire firm, and directly to the firm’s managers/owners before I even knew what was happening. Clearly this person was an absolute sociopath, who—like so many Americans—enjoys destroying people for sport and twisting the knife simply because he can. I was sternly reprimanded by the firm’s leaders, and I’m sure it was a major factor in my eventual dismissal, effectively ending my professional career. Oh, and this incident exactly coincided with my mother’s final months dying of cancer.

So doxxing isn’t a good thing.

And it’s not like this was an isolated incident, either. I’ve had many, many experiences like this over my professional career and in my life experience—enough that it’s routine by now. Perhaps I just attract bullies. Incidents like this have convinced me that people are inherently cruel and evil, and will absolutely hurt you the minute they get the chance. It has led to my developing misanthropy and paranoia. I still have many PTSD symptoms including nightmares about that job.

Of course, I immediately deleted my Facebook profile. I do currently have one under a false name, but only because I still needed to sell some of my mother’s hoarded stuff online. I don’t post anything there or have any personal info, of course. In order to have access to the Marketplace, you need to have what Facebook considers to be a valid profile (presumably to deter scammers), so I signed up for a couple of groups to make the algorithm think I’m a real person and let me have access. One was about Cardinals. The other was a Julian Jaynes discussion group.

Which finally brings us around full circle to the real subject matter at hand. I’m writing this now because I have to go from memory, as the original post is obviously no longer online.

Alexander begins by “rewriting” the book along similar lines, keeping the parts of the premise he thinks are valuable, and omitting the parts that he thinks are incorrect or speculative. This allows him to summarize the book that he thinks Jaynes “should have written.”

I actually enjoyed this approach. Unlike most Julian Jaynes fans, I’m not a Jaynes absolutist. I’ve noticed that most Jaynes enthusiasts accept 100% of his thesis and tend to treat the book as holy writ. I like to pick and choose what I think is correct.

Alexander claims that what Jaynes was actually describing was the beginning of Theory of Mind, rather than consciousness in the Jaynesian sense.

Now, I do think that Jaynes’s choice of the term consciousness is problematic. Jaynes’s supporters will always point out that he goes to great lengths to define what he means by consciousness, and they’re right—he does! But the thing is, if you have to go to such lengths to define what you mean by a term, then the term is poorly chosen. For the average person, consciousness is just the state of being awake, and when they hear that Jaynes is claiming that ancient people lacked consciousness, even though he explains what he means by that (their awareness was different than ours), most people will still reject the thesis outright. In other words, merely by choosing this term, you start out in a hole, and you have to spend a lot of time digging out of it before you can even do the heavy lifting. And when you’ve got a thesis as “out there” as Jaynes does, that’s even more of a problem.

I wrote about Theory of Mind in my series of posts about the Origin of Religion. From my understanding, Theory of Mind is the ability to understand that others have thoughts, feelings and ideas different than your own. From this perspective, then, Jaynes would be arguing that an ancient Greek person would be unable to perceive that his fellow Greeks had different thoughts or possessed different knowledge than he did. Put another way, an ancient Greek person at the time of Homer would fail the Sally-Anne test.
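Since the Sally-Anne test comes up here, a toy sketch may make the logic concrete. This is my own illustrative model, not anything from Jaynes or Alexander: the test checks whether a subject can track another agent’s (false) belief separately from the actual state of the world.

```python
# Toy model of the Sally-Anne false-belief test (illustrative only).
# Sally puts her marble in the basket and leaves; Anne moves it to the
# box while Sally is away. The test asks: where will Sally look?
def sally_anne():
    sally_belief = "basket"    # Sally saw herself put the marble here
    marble_location = "basket"

    # Anne moves the marble while Sally is out of the room.
    marble_location = "box"
    # Crucially, Sally's belief does NOT update: she didn't see the move.

    # A subject WITH Theory of Mind predicts Sally searches where she
    # *believes* the marble is; a subject WITHOUT it predicts she
    # searches where the marble actually is.
    answer_with_tom = sally_belief       # "basket" — passes the test
    answer_without_tom = marble_location # "box"    — fails the test
    return answer_with_tom, answer_without_tom
```

The point of the sketch is that passing requires maintaining two separate states — the world and someone else’s model of the world — which is exactly the capacity Theory of Mind names.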

But as far as I can tell, that’s not what Jaynes was saying at all! I find it hard to believe that the author got this concept wrong, considering he’s allegedly a psychiatrist. Maybe there’s some confusion of terminology here. Voice hearing has nothing to do with this ability. As far as I know, voice hearers and schizophrenics are still aware that other people have minds of their own.

Instead, the term I would use for what Jaynes is describing is meta-consciousness, or meta-awareness. This would mean that the book’s title would be The Origin of Meta-consciousness in the Breakdown of the Bicameral Mind, which I think is clearer. That concept is different from Theory of Mind. I would define meta-consciousness as being conscious of one’s own mental states. In this paradigm, consciousness is a thing that can be thought about and contemplated separately from one’s direct experience; whereas before, thoughts are just thoughts–there is no conceptual entity that these thoughts are assigned to that allows one to stand back from one’s own thoughts and reflect on them. It would be like trying to see your own eyeball without a reflection.

When people did have thoughts expressed as language inside their own heads (as opposed to verbalizations), they assigned these thoughts to a conceptual entity that has come down to us as “gods.” With the slipperiness of language, it’s possible that the word “god” simply referred to this inner voice, rather than a “real” person as often depicted. To aid this conception, this inner voice was assigned a persona–the persona of the god. Statues were made of these imaginary entities who were the source of such voices. They became cultural touchstones. Both temples and statues were expressly designed to “call forth” this inner voice and hear the god’s command (i.e. induce hallucinations).

What they did NOT have was a conception of “inner self” or “soul” that these inner vocalizations could be assigned to. At least, not yet. Over time, they developed this conceptual framework through the expansion of metaphor, and this entity became the source of these nonverbalized thoughts rather than a “god.” They heard this voice, then, not as a hallucination commanding them to do things (or, rather, what we would term a hallucination), but more as a voice that was under their conscious control as surely as the ones that gave rise to verbal communication between their fellow men. “Consciousness is (a mental process creating) an introspectable mind-space.” That “introspectable mind space” is different than theory of mind, which has to do with how we perceive others.

Previously, I suggested that this was somehow related to the mind’s ability to grasp recursion, based on Douglas Hofstadter’s ideas about the recursive nature of consciousness. Once the mind could grasp the principle of recursion, it could develop meta-awareness, which is turning thoughts back on oneself as if in a hall of mirrors. This allowed for the development of a new kind of consciousness which allowed people to perceive the voices in their heads as originating from the ‘self’ rather than a ‘god.’ I noted that the few populations who do not seem to have recursive structures in their language do indeed seem to have very fluid and undefined senses of self by our standards, and are prone to what from our vantage point would be hallucinations. This is speculation, however.

Alexander claims that Jaynes pins the breakdown of bicameral consciousness on increased trading during the Bronze Age, and the requirement to deal with other people in order to trade. To negotiate deals, you need to be able to put yourself in the mind of another person. Since he is operating on the assumption that Jaynes was talking about theory of mind, this makes sense. But Jaynes wasn’t really talking about this at all.

Although Jaynes does mention the increased trading during the Bronze Age, it is more the need for novel behaviors in general that he pinpoints, rather than just the need to trade per se. Jaynes argues that bicameralism was useful in a world where routine behaviors were the norm, and that people would hear the voices of their leaders in their heads commanding them what to do. In contrast, when such top-down command structures did not work—such as dealing with outsiders—it called forth new types of behavior, and this is what caused the breakdown of bicameral consciousness, not simply trade.

What’s also odd is that an even bigger culprit in Jaynes’s view is the advent of the written word, which Alexander omits completely. Oral cultures would favor bicameralism, because orders are passed down vocally from the leaders, who then become gods in their heads commanding them. But with the written word, one takes command of one’s own inner voice. You use your brain in a completely different manner in the act of reading than you do in a world where 100% of interpersonal communication is via speech. This seems like a much more likely explanation of the shift in brain function than just trade alone. Why not mention it? He also omits many of Jaynes’s ideas about the value of metaphor in language. Language is what allows us to construct the metaphorical self and the “Analog I.”

Alexander briefly mentions that Jaynes’s conception of the split brain was based on Michael Gazzaniga’s research (and through him Roger Sperry), and that a lot of this research has been debunked or superseded. He offers no sources to back up this claim, however. I was surprised by this, because one would have thought that if anyone, a psychiatrist–who is a doctor that specializes in the brain, after all–would have more qualifications here than anywhere else. From my readings, it appears that a good portion of Jaynes’s claims about how the mind processes language across the hemispheres has comported with newer research, even if the entire concept of bicameralism has not.

There is also no mention of the reassessment of Jaynes’s thesis by a cross-disciplinary team in 2007 that expressed qualified support for it: The bicameral mind 30 years on: a critical reappraisal of Julian Jaynes’ hypothesis. From what I recall from Charles Fernyhough’s The Voices Within, there has been some empirical support for Jaynes’s model of how the brain hears voices in recent research.

Neuroscience Confirms Julian Jaynes’s Neurological Model (The Julian Jaynes Society)

Split-Brain Researchers Are Split (Psychology Today)

There is also no mention of Jaynes’s ideas on hypnotism, which is strange. Most people associate Jaynes’s ideas with schizophrenia, which is the hearing of voices, after all. But Jaynes also claimed that his ideas explained hypnotism—hypnotism was a throwback to bicameral consciousness where verbal commands would trigger a trance mode. Both schizophrenia and hypnotism are “throwbacks” to bicameral consciousness, he argued. He even claims that there is no other valid explanation for this hypnotic state in the psychological literature; rather, it’s just handwaved away. As he writes:

…hypnosis is the black sheep of the family of problems which constitute psychology. It wanders in and out of laboratories and carnivals and clinics and village halls like an unwanted anomaly. It never seems to straighten up and resolve itself into the firmer properties of scientific theory. Indeed, its very possibility seems like a denial of our immediate ideas about conscious self-control on the one hand, and our scientific idea about personality on the other. Yet it should be conspicuous that any theory of consciousness and its origin, if it is to be responsible, must face the difficulty of this deviant type of behavioral control.

I think my answer to the opening question is obvious: hypnosis can cause this extra enabling because it engages the general bicameral paradigm which allows a more absolute control over behavior than is possible with consciousness. (original emphasis)

Whether he’s right or not, conventional psychology really does offer no good explanation for hypnotism, reinforcing his point. Hypnotherapy is a legitimate method of therapy nowadays, yet we have no real idea how or why it works!

Finally, Alexander does raise an objection I’ve always had, namely that if Jaynes’s thesis is correct, then anthropologists should have discovered a true bicameral culture somewhere in the world by now, especially in very remote cultures that have been cut off from the wider world. He notes that there are a lot of strange things going on with consciousness detailed in the anthropological literature, but nothing that rises to Jaynes’s description. He also notes that anthropological descriptions that comport somewhat with Jaynes’s description may have been published in various later books.

I believe he’s referring to Gods, Voices, and the Bicameral Mind: The Theories of Julian Jaynes, which is published by the Julian Jaynes society. I’ve been wanting to get a hold of that book, but have been reluctant due to recent events. But I’ve heard Jaynes’s partisans claim that bicameral consciousness has in fact been documented in the anthropological literature, and that the book contains some papers documenting this. So maybe I’m off base here.

Yes, it does seem that something exceptional is going on with the consciousness of pre-contact peoples, but nonetheless, it’s still a bit different than the scenario Jaynes describes in the book. People will mention the Pirahã for example. And while it’s true that there are any number of anomalous events recorded in descriptions of them, they are still different than the bicameral civilization as Jaynes outlines it.

This is often explained by claiming that bicameral consciousness was not a trait of small tribal peoples, but only began with the shift to larger societies during the Mesolithic period. They will point to the construction of large structures like the recently discovered prehistoric circle of shafts near Stonehenge as a sign of the onset of bicameralism. In chapter 1 of book two, he writes:

With but few exceptions, the plan of human group habitation from the end of the Mesolithic up to the relatively recent eras is of a god-house surrounded by man-houses.

Adding on to the idea of god houses, he also pinpoints this as the reason for the elaborate burials of deceased god-kings with grave goods:

The burial of the important dead as if they still lived is common to almost all these ancient cultures whose architecture we have just looked at. This practice has no clear explanation except that their voices were still being heard by the living, and were perhaps demanding such accommodation…these dead kings, propped up on stones, whose voices were hallucinated by the living, were the first gods. (p. 379)

Just about all ancient cultures, from the Near East, to Mesoamerica, to China, look after the departed with goods, food and offerings, and Jaynes claims this is because bicameral man still hallucinated the voices of the dead god-kings in their heads. These elaborate burials and town layouts do not occur with scattered bands of hunter-gatherers such as the Pirahã, or Australian aborigines, or any of the isolated cultures we are likely to find, goes the argument. In Jaynes’s conception, “early cultures develop into bicameral kingdoms.” And so it’s no surprise that we wouldn’t find any such civilization that we can document anthropologically, say Jaynes’s defenders.

But I still insist we would have found something similar to this by now. There’s a lot of anthropological literature across a wide range of cultures across the entire world. In this conception, bicameralism is a transient phenomenon which arrives with the onset of larger cultures, and then disappears when those cultures come into contact with outsiders, or become literate. This would mean that bicameralism is a phenomenon lasting only a few thousand years at most. I don’t know if I’m willing to accept that.

Overall, aside from my quibbles above, I think the review did a good job of describing Jaynes’s ideas and taking them seriously on their own terms. I particularly liked the author’s point that dismissing the numerous depictions of gods and men speaking directly to each other as simply metaphorical is basically “kind of cheating”–in a way it is. If we take these phenomena seriously just as they were described, and don’t use the cheats and dodges of “it’s all just metaphorical,” then we come to very different conclusions.

For what it’s worth, I have an alternative concept of Jaynes that I’ve been meaning to write up for a while now. This obviously isn’t the time or the place. But my argument is essentially that, to borrow from Ran Prieur, “ancient people weren’t schizophrenic, they were tripping.” I think Ran’s basically correct. They weren’t literally tripping, of course–it’s just that their brains were working in a way more similar to a modern person’s on psychedelics than to a modern person’s everyday consciousness. Of course, tripping people often hear voices and “see” entities as a matter of course. Any state of consciousness that the brain can achieve with a drug it can achieve without that drug.

The descriptor of this comes from Robin Carhart-Harris’s work on psychedelics in the treatment of psychological disorders. He uses the term “entropy” to describe the differences in how the brain works on a psychedelic versus “normal” consciousness. Entropic brains have a much less defined sense of self, and process the world around them in a fundamentally different way than less entropic ones. I think the way ancient people processed the world was something closer to the entropic brain on a psychedelic, or to the way children perceive things (incidentally, meditation has been shown to increase brain entropy). Why this was the case I’m not sure, but it may have to do with the fact that our own brains probably produce DMT, and that the level may have dropped over time. This could be because instrumental rationality became more adaptive to environments where our major challenge was dealing with other people rather than with nature directly as societies grew larger and more complex. This changed our style of thinking from “primary consciousness” to “secondary consciousness”:

This article proposes that states such as the psychedelic state, REM sleep, the onset-phase of psychosis and the dreamy-state of temporal lobe epilepsy are examples of a regressive style of cognition that is qualitatively different to the normal waking consciousness of healthy adult humans. We will refer to this mode of cognition as “primary consciousness” and the states themselves as “primary states.” To enter a primary state from normal waking consciousness, it is proposed that the brain must undergo a “phase transition”, just as there must have been a phase-transition in the evolution of human consciousness with the relatively rapid development of the ego and its capacity for metacognition. This implies that the relationship between normal waking consciousness and “primary consciousness” is not perfectly continuous.

The entropic brain: a theory of conscious states informed by neuroimaging research with psychedelic drugs (Frontiers in Neuroscience)

The Free Market Is A Failure

Sorry for the deliberately click-bait-y headline, but I think this message is important to get out there.

In my discussions a few months back on What is Neoliberalism, I noted that a core element of neoliberal philosophy is that markets are the only efficient, effective and rational way to distribute goods and services.

Neoliberals profess the idea that only competitive markets can allocate “scarce” resources efficiently, and that it is only such “free” markets that can lift people out of poverty and deliver broad prosperity. They pound it into our heads constantly.

Yet the Covid-19 crisis has illustrated spectacular and pervasive failures of such “free” markets all over the globe, and especially in the U.S. Instead of fairness or efficiency, we see systemic failure in every market we look at: the food industry, the medical industry, the retail industry, the employment market. Resources are being destroyed and misallocated on a massive scale.

Let’s start with the food industry, because food is the most important thing (nine meals from anarchy, and all that). Thousands and thousands of pigs are being slaughtered, their meat left to rot, eaten by no one, regardless of the forces of supply and demand:

The United States faces a major meat shortage due to virus infections at processing plants. It means millions of pigs could be put down without ever making it to table…

Boerboom, a third-generation hog farmer, is just one of the tens of thousands of US pork producers who are facing a stark reality: although demand for their products is high in the nation’s grocery stores, they may have to euthanise and dispose of millions of pigs due to a breakdown in the American food supply chain.

Meat shortage leaves US farmers with ‘mind-blowing’ choice (BBC)

Potatoes are sitting in Belgian warehouses and left to rot, only two short years after a drought threatened to produce a severe shortage:

Belgium: Lighthearted campaign to ‘eat more fries’ aims to lift heavy load (DW)

Meanwhile, dairy farmers in the U.S. heartland are dumping milk into the ground, to be drunk by no one.

Cows don’t shut off: Why this farmer had to dump 30,000 gallons of milk (USA Today)

In fact, the whole food situation is rather ugly, as this piece from The Guardian summarizes:

This March and April, even as an astounding 30 million Americans plunged into unemployment and food bank needs soared, farmers across the US destroyed heartbreaking amounts of food to stem mounting financial losses.

In scenes reminiscent of the Great Depression, dairy farmers dumped lakes of fresh cow’s milk (3.7m gallons a day in early April, now about 1.5 million per day), hog and chicken farmers aborted piglets and euthanized hens by the thousands, and crop growers plowed acres of vegetables into the ground as the nation’s brittle and anarchic food supply chain began to snap and crumble.

After delays and reports of concealing worker complaints, meatpacking plants that slaughter and process hundreds of thousands of animals a day ground to a halt as coronavirus cases spread like wildfire among workers packed tightly together on dizzyingly fast assembly lines.

Meanwhile, immigrant farmworkers toiled in the eye of the coronavirus storm, working and living in crowded dangerous conditions at poverty wages; at one Washington state orchard, half the workers tested positive for Covid-19. Yet many of these hardest working of Americans were deprived of economic relief, as they are undocumented. Advocates report more farmworkers showing up at food banks – and some unable to access food aid because they can’t afford the gas to get there.

None of this is acceptable or necessary and it’s not just about Covid-19, it’s also illustrative of a deeply deregulated corporate capitalism. America’s food system meltdown amid the pandemic has been long-developing, and a primary cause is decades of corporate centralization and a chaotic array of policies designed to prop up agribusiness profits at any cost.

Farmers are destroying mountains of food. Here’s what to do about it (Guardian)

That doesn’t sound very “efficient” to me. How about you? Free market fundamentalists, care to weigh in?

Meanwhile, hospitals in the United States, the very institutions you would most want open during a pandemic, are closing across the country. Why is this happening? Because health care in the U.S. is a profit-driven enterprise that “competes” in the free market, and elective procedures, hospitals’ cash cow, have been suspended or postponed. U.S. hospitals depend on those elective procedures to shore up their profits, and markets rely on profits.

As the deadly virus has spread beyond urban hotspots, many more small hospitals across the country are on the verge of financial ruin as they’ve been forced to cancel elective procedures, one of the few dependable sources of revenue. Williamson Memorial and similar facilities have been struggling since long before the pandemic — at least 170 rural hospitals have shut down since 2005, according to University of North Carolina research on rural hospital closures.

But even as hospitals in cities like New York City and Detroit have been deluged with coronavirus patients, many rural facilities now have the opposite problem: their beds are near-empty, their operating rooms are silent, and they’re bleeding cash.

More than 100 hospitals and hospital systems around the country have already furloughed tens of thousands of employees, according to a tally by industry news outlet Becker’s Hospital Review. They’ve sent home nurses and support staffers who would be deemed essential under state stay-home orders.

Rural hospitals are facing financial ruin and furloughing staff during the coronavirus pandemic (CNN)

And how about allocating labor via impersonal markets? How’s that going? Well, not so well. The workers with the skills most desperately needed on the front lines during the crisis are taking pay cuts and getting laid off left and right. Instead of contributing, they are sitting at home, unable to work even if they wanted to:

At a time when medical professionals are putting their lives at risk, tens of thousands of doctors in the United States are taking large pay cuts. And even as some parts of the US are talking of desperate shortages in nursing staff, elsewhere in the country many nurses are being told to stay at home without pay.

That is because American healthcare companies are looking to cut costs as they struggle to generate revenue during the coronavirus crisis.

“Nurses are being called heroes,” Mariya Buxton says, clearly upset. “But I just really don’t feel like a hero right now because I’m not doing my part.”

Ms Buxton is a paediatric nurse in St Paul, Minnesota, but has been asked to stay at home.

At the unit at which Ms Buxton worked, and at hospitals across most of the country, medical procedures that are not deemed to be urgent have been stopped. That has meant a massive loss of income.

Coronavirus: Why so many US nurses are out of work (BBC)

It’s an ironic twist as the coronavirus pandemic sweeps the nation: The very workers tasked with treating those afflicted with the virus are losing work in droves.

Emergency room visits are down. Non-urgent surgical procedures have largely been put on hold. Health care spending fell 18% in the first three months of the year. And 1.4 million health care workers lost their jobs in April, a sharp increase from the 42,000 reported in March, according to the Labor Department. Nearly 135,000 of the April losses were in hospitals.

As Hospitals Lose Revenue, More Than A Million Health Care Workers Lose Jobs (NPR)

So it doesn’t seem like “free and open” markets are doing so well with either health care or labor.

Meanwhile, U.S. states are competing against each other for desperately needed PPE, bidding up the price and preventing scarce resources from going where they are most badly needed, which would naturally be where Covid-19 has struck hardest:

As coronavirus testing expands and more cases of infection are being identified, doctors, nurses and other healthcare workers are scrambling to find enough medical supplies to replenish their dwindling supply.

But state and local governments across the United States are vying to purchase the same equipment, creating a competitive market for those materials that drives up prices for everyone.

“A system that’s based on state and local governments looking out for themselves and competing with other state and local governments across the nation isn’t sustainable,” said John Cohen, an ABC News contributor and former acting Undersecretary of the Department of Homeland Security, “and if left to continue, we’ll certainly exacerbate the public health crisis we’re facing.”

“There’s a very real possibility,” he added, “that those state and local governments that have the most critical need won’t get the equipment they need.”

Competition among state, local governments creates bidding war for medical equipment (ABC News)
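The dynamic the article describes can be sketched with a toy auction model (my illustration, not from the article, with made-up valuations): when many uncoordinated buyers compete for the same fixed supply, an ascending auction bids the price up toward the highest willingness-to-pay, so fifty states bidding against each other pay far more per unit than two would.

```python
import random

def clearing_price(valuations):
    """In an ascending (English) auction, bidding continues until only the
    highest-valuation buyer remains, so the winner pays roughly the
    second-highest valuation."""
    top_two = sorted(valuations, reverse=True)[:2]
    return top_two[1]

def average_price(n_bidders, trials=10_000, rng=None):
    """Average clearing price when n_bidders draw independent,
    hypothetical willingness-to-pay values between $1 and $10 per unit."""
    rng = rng or random.Random(42)
    total = 0.0
    for _ in range(trials):
        vals = [rng.uniform(1.0, 10.0) for _ in range(n_bidders)]
        total += clearing_price(vals)
    return total / trials

few = average_price(2)    # two buyers competing for the same supply
many = average_price(50)  # all fifty states in the same market
print(f"avg price, 2 bidders:  ${few:.2f}")
print(f"avg price, 50 bidders: ${many:.2f}")
```

With these assumed numbers, two bidders clear at around $4 on average, while fifty bidders push the average price above $9, most of the way to the maximum anyone is willing to pay. The sketch ignores pooled purchasing entirely, which is exactly the point the quoted DHS official makes: a single coordinated buyer would not bid against itself.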

Yet neoliberals always tell us how important “competition” is in every arena of life.

Failure, failure, failure! Everywhere we look, we see failure. Pervasive, systemic failure. Resources going unused. Surpluses of food being dumped even while people go hungry and line up at food banks. Workers with desperately needed skills sitting at home, twiddling their thumbs. Other workers unable to earn a living to support themselves and their families, no matter how badly they want to work. Masks and protective equipment NOT going where they are most needed, their prices inflating, benefiting no one except profiteers even as people die.

Tell me again about how the market is “efficient” at distributing resources. Tell me again about how central planning inevitably results in wasted resources, surfeits and shortages.

And here is the big, bold, underscored point:

The free-marketeers want to trumpet the market’s successes, but they don’t want to own its failures.

Free-market boosters always want to talk about the wonderful benefits of markets. How they allow multiple people to coordinate their activities across wide variations of space and time. How they allow knowledge to be distributed among many different actors. How they favor tacit knowledge that a single entity could not possess. Libraries of encomiums have been written celebrating the virtues of the “free” market. You know their names: The Provisioning of Paris, Economics in One Lesson, Free to Choose, I, Pencil, and all of that. Much of what passes for economic “science” is simply cheerleading for markets: the bigger, freer, and less regulated the better.

Okay, fair enough.

But how about market failures? Why don’t they ever talk about that? Because if you read the economics books I cited above, you would come away with the idea that there are no market failures! That, in fact, there is no such thing. That markets, in effect, cannot fail!

If you want to own the successes, you need to own the failures.

Oh, they love, love, love to talk about central planning’s “failures”. They can’t get enough of that. They love to talk about empty shelves in the Soviet Union, long lines at supermarkets, the lack of toilet paper in Venezuela (amusingly, now a problem throughout the capitalist world), and the allegedly long waiting times in “socialized medicine” countries. We are constantly subjected to that drumbeat day after day after day. It’s part of every economics 101 course. Central planning doesn’t work. Central planning is inefficient. Central planning is “tyranny.”

But what about all that stuff I cited above?

Where are all the free-market fundamentalists now?

What is their excuse?

They’ll use special pleading. They’ll argue that it’s exceptional circumstances. That no one could have foreseen a “black swan” event like the global Covid-19 pandemic (despite numerous experts warning about it for years). They’ll tell us that markets work just fine under “normal” circumstances. They’ll say we cannot pass any kind of judgement on the failings of markets during such an unusual event.

Here’s why that argument is bullshit:

Pandemics are a real and recurring phenomenon in human history. We’ve been incredibly fortunate to have lived through a rare and atypical hundred-year period, from the 1918–1919 flu pandemic to today, without a global pandemic or novel disease we couldn’t quickly contain and/or eradicate.

But pandemics are, and always have been, a societal threat, even if we’ve forgotten that fact. And the experts tell us that there will be a lot more of them in our future, with population overshoot, environmental destruction, encroachment on formerly unoccupied lands, and climate change proceeding apace. What that means is this:

If your economic system can’t function properly during a pandemic, then your economic system is shit.

If your economic system only works when conditions are ideal, indeed depends upon conditions being ideal, then your economic system doesn’t really work at all. If something like a pandemic causes it to seize up and fail, then your economic system is poorly designed and doesn’t work very well. Not only do the free markets graphed on economists’ chalkboards not exist anywhere in the real world, they apparently rely on a blissful, Eden-like Arcadia to function as intended, a situation any casual glance at human history tells us is highly unusual. Any disruption and they fall like dominoes. They are about as resilient as tissue paper.

And the stresses are only going to get worse in the years ahead, with climate change making some areas uninhabitably hot while others are submerged under rising sea levels. And that’s before we get to typical natural disasters like volcanic eruptions, tsunamis, and earthquakes. There will be novel plant diseases as well, unfolding against the increasing resistance of germs to antibiotics.

Will the free market fundamentalists and libertarian market cheerleaders acknowledge this???

Don’t hold your breath.

No, they will continue to lionize “private initiative” at every opportunity, while completely ignoring the stuff I opened this post with. They’ll sweep it under the rug or, more likely, simply handwave it away. They’ll continue to say that we need to scale back government regulation and interference and let the invisible hand sort it all out.

Because the discipline of modern economics as practiced today is not a science. It may not even rise to the level of a pseudoscience. It’s PR for laissez-faire capitalism.

Of course, we’ve had market failures before. They occurred all throughout the nineteenth century and during the Great Depression, for example. These are well documented. But many of the safeguards that came out of those bygone market failures to prevent or mitigate them have been systematically and deliberately dismantled over the past generation due to the rise of neoliberalism.

And now we’re paying the price.

Karl Polanyi made an important distinction between markets and Market Society. Markets are where people come together to buy, sell, and exchange surplus goods. These have existed throughout history. They are peripheral to society, embedded in something larger than themselves. Such markets can be shut down without causing an existential threat to civilization.

But Market Society depends upon the impersonal forces of supply and demand, and upon functioning markets, for absolutely everything in the society, from jobs to food to health care. Everything is oriented around maximizing private profits, not human needs. When markets fail to function adequately, the result is unemployment, sickness, starvation, and death. Shutting them down is an existential threat to civilization.

As Dmitry Orlov wrote in his best-known work, the Russians survived the collapse of the Soviet Union precisely because they didn’t rely on the Market.

Naturalizing markets in this way is an abdication of both causal and moral responsibility for famines, a way to avoid reality and the ethical consequences for people in a position to change things. Markets are not given; they are predicated on a host of laws and social conventions that can, if the need arises, be changed. It makes no sense for American farmers to destroy produce they can’t sell while food banks are struggling to keep up with demand. This kind of thinking is a way for powerful people to outsource ethical choices to the market, but the market has no conscience.

Famine Is a Choice (Slate)

Now, to be clear I’m not necessarily making an argument for or against central planning as opposed to markets. That’s a different discussion.

But my core point is simply this: you cannot discuss market successes without discussing market failures. To do so is intellectually dishonest and disingenuous, not to mention incredibly dangerous and irresponsible. If economics were a real science, instead of just PR for capitalism, it would take a look at the things I described above and figure out ways they could have been avoided, regardless of any preconceived ideology or assumptions about the “right” way to arrange a society, or about how things “should” work. It would seek out ways for society to become, in Nassim Taleb’s terminology, “antifragile.”

But don’t hold your breath for that, either.

Attack Ads! Podcast

Jim and I chew the fat about the Nuisance Economy over at the Attack Ads! podcast. It was fun to be a podcast guest once again; I’m glad he had me on.

https://attackadspodcast.blogspot.com/2020/06/episode-152-nuisance-economy.html

Here’s a bit of our correspondence you might find interesting. I mentioned that Franklin Roosevelt did not have things like Fox News to contend with. He mentioned that there was a lot of co-opted media at the time that was very opposed to Roosevelt’s New Deal (mostly owned by rich newspaper barons). But my point was that television news did not exist, and television news is a completely different animal, because it renders people more suggestible than written media, where you actually have to parse words. He replied:

Roosevelt dealt with privately-owned newspapers and (especially) radio, which has a power of its own. There is something about a well-modulated human voice to convey not just information but opinion.

You’re right about the light. There is something about flickering, low-light experiences which imprints on us easily. I’ve heard theories that tales told around the nightly campfire were the main method of imparting helpful wisdom, so our brains glommed on to those conditions for paying attention. Hence, the Latin word “focus,” which literally meant “domestic hearth.” Combine such a mental preference for optics with a human voice, both backed by vast fortunes and the need for their continuance, and… Oh, yeah, here we are!

We also talked a bit about the economics of Henry George via email. I’m somewhat familiar with George, but haven’t dived in too deep. Jim mentioned an economist working in the Georgist tradition named Mason Gaffney: https://masongaffney.org/

Gaffney is yet another economist banished from the “respectable” discipline for heresy (but not inaccuracy). As I’ve said so often, economics is really a type of theology.

He also said quite a few interesting things about rents and rent-seeking. He turned me on to this author: Gerrit De Geest. Chapter one of his book is available as a paper online: Rents: How Marketing Causes Inequality (Chapter 1)

De Geest’s argument is that wide wealth and income differentials are not primarily the result of differences in individual ability, intelligence, inventiveness, or “hard work.” Instead, he argues, they are the result of capturing outsize economic rents. This is done by distorting markets, and the primary means of distorting markets is (ironically) called marketing. Marketing today is the science of distorting markets for the benefit of businesses in order to extract profits far in excess of the costs of production and distribution. The techniques range from exploiting cognitive biases to vendor lock-in to extending copyright protection, among many others.

Furthermore, he claims, these techniques have reached such a level of sophistication and ubiquity that nearly all markets everywhere are heavily distorted in some way toward rent-seeking, and consumers are often powerless to resist. He sees this as an under-appreciated reason for the rise of the extreme inequality we see everywhere today. And this is all perfectly legal. As he puts it, “business schools have outsmarted law schools.”

We’ll take a closer look at that another time.