Americans generally believe that business and government are somehow in opposition; that government can only “interfere” in the workings of business and markets, and that “the economy” is something totally separate and distinct from the rest of society, including from political decisions and social cohesion.
Americans think this because of the pervasive libertarian ideology promoted by conventional neoclassical economics. And by libertarian, I’m referring to the systemic bias that pervades all conventional capitalist economics, not just the radical extremist ideology that goes under that name. Classical and neoclassical economics foster the belief that “economics” must be kept wholly separate from every other aspect of society.
The Chinese, coming from a Marxist—and Confucianist (although in this case, Legalism is probably the better fit)—perspective, believe no such thing. They know that business and government are really the same thing, and always have been, and they make no bones about it. They are free from the Western delusion that there is some sort of “pure” capitalism, free from the taint of government intervention, or the delusion that such a thing is even possible. They do not have the ideological commitment to the “invisible hand,” or the blind faith that anarchic markets will automatically lead to beneficial social outcomes.
I had that thought reading the following paragraph by Adam Tooze:
As Trump’s trade warriors point out, the range of instruments that China deploys in industrial competition makes a nonsense of trade policy as defined by the WTO. Complexity and opacity are key to the success of China Inc. As Blustein shows in an illuminating cameo about tractor tyres, the network of state support for Chinese industry extends from central and local government grants and tax exemptions to subsidised land deals, cheap electric power and a raft of subsidised low interest loans, from the government as well as public and private banks. When rubber prices surged in the early 2000s Beijing devised a scheme to supply it at a reduced price and gave a set of inducements to rubber producers. The arrangements are all-encompassing yet almost entirely deniable, as the American lawyers retained by Chinese firms demonstrate when they face unpleasant questions from the US Department of Commerce.
Whose Century? (London Review of Books)
“Trump’s trade warriors,” as Adam Tooze calls them, represent the standard American perspective that government should “butt out,” i.e. not “pick winners and losers”; that markets should be free to run themselves and that government should not “interfere.” This comes from a blind commitment to libertarian ideology.
The Chinese know that this is nonsense. They know that production and governance are inseparable. True, it’s no longer centrally planned as in the old days. But the myopic faith in an anarchic market to achieve ideal outcomes is a flaw that the Chinese do not possess. It’s an advantage of coming from a non-Western perspective free from the blinders imposed by neoclassical economic thinking as developed in the West. Of course the government manages the commanding heights of business and trade. What else would you expect?
The Chinese view is the more historically accurate one. In the West, the fairy tale is told of plucky businessmen succeeding despite being frustrated at every turn by petty government bureaucrats. This tale was further enhanced by fabulists like Ayn Rand, who peddled this nonsense for ideological reasons while having no knowledge of economic history, or even any experience in the actual business world.
Marxists, by contrast, have always been fully aware of how the state creates and sustains the capitalist economy, and has always done so. From the passing of laws, to issuing and regulating the supply of currency, to the establishment of limited liability corporations, to the building of infrastructure, to the selling off of formerly public lands to private interests, to the implicit assumption of risk, to the issuing of bonds, to intellectual property laws, to publicly-funded research, to numerous subsidies, to a basic social safety net, to K-12 mass schooling, to the provisioning of police and military to enforce contracts and property rights—the list of how government and business interests are intertwined—not opposed—goes on endlessly. There is no “great wall” dividing a self-contained intellectual abstraction called “the economy” from all the other aspects of human life in this world.
It seems the ideological blinders conferred on us by libertarian classical and neoclassical economists are—ironically—causing the West, and especially the U.S., to fall behind at the game it supposedly invented.
And, speaking of ideology, it was also ideology that has made globalism such a problem in the U.S., specifically the frontier ideology of self-reliant “rugged individualism,” where honest, hard-working people never require outside help or “handouts.” This ideology insists that, rather than letting “the government” take care of you, you should just bootstrap your way out of your circumstances through grit and pluck.
This, of course, is absolute nonsense, but it’s the dominant ideology of the Republican Party and conservative philosophy more generally. In the U.S., it manifests itself in the idea that “welfare” is inherently a bad thing, and that anything the government does to help its citizens is “communism.” This is the reason why the “China Shock” was so uniquely bad in the U.S. compared with other countries that were just as exposed to neoliberal globalism. The reason you didn’t see the same backlash to “free trade” in other countries as in the U.S. is because those countries decided to take care of their citizens instead of just throwing them under a bus:
Every advanced economy in the world – Japan, South Korea, European countries (Italy in particular) – felt the ‘China shock’. But only in the US has it led to the kind of political crisis we have witnessed since 2016. It is this that requires explanation. …Given the resources of American government, a shock on this scale could have been cushioned through spending on welfare, education, reinvestment and relocation. But that would have required creative politics, which is precisely what has been obstructed by the Republicans. Instead the problem wasn’t addressed, unleashing a pervasive status anxiety among lower-middle-class and working-class white Americans, especially men. It was in the counties where the highest number of jobs were lost because of the China shock that Trump scored best in the 2016 election.
Since the Clinton era, the Democratic establishment has held up its side of the bargain, deflecting opposition to globalisation from trade unions. What it did not reckon with was the ruthless cynicism of the Republican Party in opening its doors to xenophobic, know-nothing white nationalism, inciting talk of a nation betrayed and swinging over to protectionism. The Democrats also didn’t take into account the dogged refusal of the Republicans to co-operate in their efforts to patch together America’s welfare state, even, or especially, when it came to fundamentals such as unemployment insurance and health coverage…
In other words, if we hadn’t been so wedded to the “government bad” and “society owes you nothing” attitudes, and if the elites had been even a little less rapacious, we would not have seen entire swaths of the country reduced to sub-third-world status, and hence the rise of authoritarian right-wing populism. In the U.S., for example, even health care is tied to having a job, and instead of dealing with that problem, the politicians of both parties chose a politics of distraction and misinformation that has led us to where we are now.
Due to an ideological distaste for “big government solutions” and “government handouts,” inherited from libertarianism, the only other avenue for aspiring populist politicians was to promise to somehow “bring the jobs back,” so that workers could head back into the factories and “earn” the basics of life like health care and the money to pay for food and shelter. But, of course, this will not work. U.S. manufacturing continues to expand output, even while shedding workers. It was China’s low wages that made it predominant—low wages that cannot coexist with the high fixed costs of food, education and housing in the U.S. High-wage manufacturing jobs were replaced with the “service economy,” and the ideological conception that what we earn is entirely down to our own personal “marginal productivity” (again promoted by neoclassical economists) led to opposition to any effort to raise the bar for wages.
In both of these cases, we can see how hidebound ideological blindness prevented the U.S. from taking the steps that other countries have effectively taken, steps that have produced much more successful 21st-century societies outside the U.S.—whether it’s Europe’s social democracy or China’s state-managed capitalist/communist hybrid. Since both of these options are effectively off the table due to our ideological commitments, all Americans can do is cry to the heavens that the imaginary libertarian world that “should” exist is nowhere to be found as we continue to circle the drain of history.
“China under the control of the CCP is, indeed, involved in a gigantic and novel social and political experiment enrolling one-sixth of humanity, a historic project that dwarfs that of democratic capitalism in the North Atlantic.”
It’s a long piece, worth reading in full: Whose century? (London Review of Books)
BONUS: To prove the point made above:
Tesla Motors Inc., SolarCity Corp. and Space Exploration Technologies Corp., known as SpaceX, together have benefited from an estimated $4.9 billion in government support, according to data compiled by The Times. The figure underscores a common theme running through his emerging empire: a public-private financing model underpinning long-shot start-ups.
The figure compiled by The Times comprises a variety of government incentives, including grants, tax breaks, factory construction, discounted loans and environmental credits that Tesla can sell. It also includes tax credits and rebates to buyers of solar panels and electric cars.
A looming question is whether the companies are moving toward self-sufficiency — as Dolev believes — and whether they can slash development costs before the public largesse ends.
I found this in one of the files on my jump drive. I don’t think these really need any more elucidation, because they should be self-evident, so I’m just going to put them out there. This list will probably expand over time.
The stock market is not the economy.
Health care should not be tied to employment.
Taxes do not fund government spending.
There is no shortage of money.
Globalized, just-in-time supply chains are fragile.
A lot of the work we do is pointless and nonessential. Or, put another way, jobs are more about earning the money to live rather than doing socially useful work.
The most important workers in society are often the least paid.
American politicians are corrupt and incompetent.
The notion of the “service economy” is bogus, and always has been.
“Small government” is not an inherent virtue.
A functional social safety net is actually good for business.
Feel free to add on to the list.
BONUS: This is a good perspective from a commenter on Naked Capitalism:
“I’ve been referring to Coronavirus for a while as the world’s most effective stress test of institutions, maybe the biggest such experiment in history. It has unerringly found the weak link in every country and society its hit – whether that weak link being weak institutions, stupid politicians, sclerotic bureaucracies, religious nutcases, institutional groupthink, authoritarian tendencies or whatever. In the US its found not just one, but a whole series of weak links it can exploit. The results are not pretty.”
BTW: I’m still not happy with the blog layout. Does anyone have any suggestions?
What’s up with the “trying to build my empire” line on Instagram profiles lately? Really, everyone is trying to build an empire? How about just trying to live your fucking life? Oh, right, anyone who does that is “lazy.” Everybody is trying to be Jeff Bezos, since being an ordinary person apparently doesn’t cut it anymore. Since when is the goal of the average person “building an empire?” (Of course, I know the answer: because Neoliberalism.) I don’t think there are enough resources for 200+ million people in the U.S. to each have their own “empire”—by definition an empire consists of the ruler and the ruled: an emperor and subjects.
How is it that people say these things uncritically? It’s like Neoliberalism in America has become so internalized, so ingrained, so much the water in which we swim, that it’s penetrated into our very soul.
And what’s up with all the MLM stuff everywhere? It seems like every woman on Instagram is engaged in some sort of multilevel marketing sales scheme. There are the old standbys like makeup (Mary Kay, Avon), with all sorts of new skin care/beauty products joining the mix. The latest schemes are things like essential oils and products made out of CBD. Then there are the always-popular health supplements. Everyone’s social media now is hawking product (when not posting conspiracy theories). It’s like everyone is a seller, but there are no buyers.
And are we ever going to reach Peak Supplements? It seems like every celebrity in the world is hawking some sort of magic pills. Tom Brady is the latest to jump on the bus in a field pioneered by luminaries like Alex Jones, Joe Rogan, Gwyneth Paltrow, Tim Ferriss, and countless televangelists. It’s like their core business model is deliberately attracting an audience of gullible paranoiacs so they can continually sell them useless shit. It was only a matter of time before a politician like Trump used this exact formula to win and maintain political power as well. I have a hunch he’s just the beginning.
This is what we call “business” today?
And related to that, whenever someone signs up to shill for one of these companies, they often say something like “proud to be a member of the [insert scam product] family.” Family? Really??? What’s with this idea that all of these money-making operations are some sort of family? Um, you’re not a member of a family; you’re an employee. How sad is it that we have been gaslit into seeing employers as family—employers that owe us nothing and will terminate us without remorse based on numbers on a spreadsheet.
To say the language of late capitalism is Orwellian doesn’t do it justice.
And has anyone noticed the posters with exhortatory messages that have sprung up all over capitalist workplaces? It’s like something out of the late stage Soviet Union. Dig that coal, bale that hay, tote that barge!
And no wonder. Just like in the late Soviet Union, morale has long since eroded away, replaced by dark cynicism and gallows humor over things like unpayable debt and health care bills. Any connection between “hard work” and reward has been severed for the vast majority of people. Whatever class you’re born into, that is where you’ll stay. So we need to constantly encourage the proletariat to keep their noses to the grindstone, because previous motivators like prosperity, stability and social advancement are long gone.
And the similarities to the late Soviet Union don’t stop there. We have the spectacle of a fossilized gerontocracy exemplified by Trump, Joe Biden, Nancy Pelosi, et al., unable to respond to the dire challenges facing the country, just like the uninspiring grey bureaucrats of the late-stage Soviet Union. We have a cynical youth alienated from the political process and facing declining living standards with no realistic way to change course. We have a media that’s basically party propaganda. We have mass spying on the citizenry beyond anything the Soviets could muster. And lately, we even have secret police “disappearing” people off the streets of American cities to unnamed dark sites.
It’s capitalism run amok. We’re not just workers anymore—we’re all perennial hustlers; we’re all an “empire of one.” We’re popping magic pills for “total human optimization” while waiting for “our ship to come in.” We’re substituting instrumental money relationships for genuine ones. Every parent is working like a madman to give their kids any edge in the unremitting status tournament of American life. As Chris Rock observed, “anytime you’re talking to an American, you’re really talking to their agent.” And it’s only gotten worse since he first made that observation.
This culture is so toxic. It’s irredeemable. It’s one reason why I feel so alienated and alone in America. How can you possibly relate to anyone else in a culture like this? How can you have any kind of genuine relationships when everyone around you is a hustler; when everyone is spending every waking moment climbing the status ladder and “building their empire?” It’s just so hopeless.
Have I mentioned how much I hate marketing as an idea?
I wish humanity would think about this: We literally live to market ourselves to others in order to gain income, one way or another.
And it ruins EVERYTHING.
It may be time for me to get a nice, out of sight dishwashing job.
— Peter Joseph (@ZeitgeistFilm) July 19, 2020
In Europe, you are surrounded everywhere you turn by majestic stone architecture. The Gothic cathedrals and castles that occupy the cities and countryside of Europe provide copious examples of the wonder and beauty of stone architecture.
Stone is one of the oldest materials used in construction. It is also the most durable. The oldest surviving buildings in the world are carved from stone, or made from assembled stones. Göbekli Tepe is made from stone. The Pyramids are made from stone. It’s also a very local material—when you are building from local stone, it gives a place a distinctive feel. That’s why Paris looks the way that it does: the cream-colored Lutetian limestone quarried from the banks of the Seine.
The reason, I think, that we find stone such a compelling material is that, before we built our own structures, we occupied “natural rooms” that the earth made for us—caves. The stone walls of caves, lit by tallow lamps and torches, and illuminated with spectacular artwork, were our earliest permanent homes, and our earliest cathedrals.
The stone walls in a Gothic cathedral or medieval castle are bearing walls, providing both shelter from the elements and support for the overall structure. The walls of modern buildings, by contrast, do not carry any load besides their own weight. The structure is separate, usually a skeleton frame of steel or concrete. Modern walls are typically either curtain walls (clipped onto or tied back to the supporting structure) or infill walls (sitting on the structure and filling the gaps within it).
Bearing Walls: Monolithic Masonry Construction (Columbia University)
Walls in commercial construction today are usually cavity walls, consisting of a facing material held to the structure by some sort of clip system, creating a cavity between the supporting structure of the wall (typically studs or masonry) and the veneer. This cavity is designed to resist the penetration of water (since liquid moisture cannot leap a cavity). The cavity also gives us a place to put the insulation.
Facing materials are usually panelized systems of fiber-cement, metal, porcelain, treated wood, phenolic resin, or some other weather-resistant material. Even in walls that appear to be solid brick or stone, the brick or stone is merely a facing material held by clips to a wall usually comprised of wood or metal studs.
Which is what made this article so fascinating to me: The miracle new sustainable product that’s revolutionising architecture – stone! (The Guardian)
The article talks about making stone an actual self-supporting wall rather than a thin veneer, and even a potential bearing material.
The article is based on a London exhibition of architecture which uses stone as a true building material rather than just a facade veneer. It’s entitled The New Stone Age. Here is the BBC’s coverage:
Featured prominently is 15 Clerkenwell Close, a six-story building by architect Amin Taha, which uses cut stones as the facing material of the building. The stone is deliberately left in the condition it is quarried in rather than being dressed, leading to a variegated facade that resembles an urban ruin. This approach has pleased some, and left others so distressed that they launched a campaign to tear the building down!
…The result looks like what might have happened if Mies van der Rohe had been weaned on The Flintstones. It features a load-bearing exoskeleton made of massive chunks of limestone brought straight from the quarry.
The blocks have been left with their raw quarrying marks exposed and stacked on top of each other to form columns and beams. Some of the slabs’ faces show the lines where they were drilled from the rock face, others are sawn smooth as if cut by a cheese wire, while some bear the rugged texture of the sedimentary seam, freshly pried from the Earth’s crust.
The building’s geological power was too much for one Islington councillor, who complained that the “awful” building was out of keeping with the historic neighbourhood, and ensured that a demolition notice was issued, based on a supposed breach of planning permission. Taha finally won the case last year, on the proviso that the smooth stone be roughened up to look like the rest (a process which, after testing, has thankfully proven structurally too risky to carry out).
The problem with a solid stone wall is that there is no insulation or waterproofing layer as there is in a typical cavity wall. From the details on the architect’s web site, it looks like there is a secondary wall behind the stone facade that accomplishes these functions. The stone is stabilized by metal anchors which tie it back to the main structure.
Another building very similar to the one discussed in the Guardian article is 30 Finsbury Square, also in London, by Eric Parry Architects. Unlike Clerkenwell Close, the stone here is dressed and smooth, and the facade is designed in a manner reminiscent of Italian rationalists like Aldo Rossi or Giuseppe Terragni.
There are a few instances where stone is used as both shelter and bearing material. For example, the article prominently features a photo of this winery in France:
Delas Frères Winery in France (ArchDaily)
And another example: a radio station in the Himalayas that appears to be built out of solid stonework. It’s difficult to imagine this being built anywhere else, though; I don’t think you could build something like this in downtown London or an American suburb.
The BBC article mentions Jørn Utzon’s (architect of the Sydney Opera House) own Can Lis house in Mallorca.
The article also prominently features a French firm, Perraudin Architecture, which builds using stone as a structural material, as opposed to just a veneer or facade material. This gives their projects an amazing texture and heft that you just don’t see often in modern architecture. I would imagine France has a tradition of stonemasonry that goes very far back, indeed.
The building is entirely built up in load-bearing limestone walls of 40 cm. Precise coursing elevations define each stone, to be extracted, dimensioned and numbered in the quarry and then transported to the site. There, they are assembled like toy blocks using nothing but a thin bed of lime mortar.
No paint or plaster was added to the walls, so the stone surfaces are left bare to display traces of the quarrying process. Projecting courses of stone on the exterior mark the boundaries between floors and help to direct rainwater away from the windows.
But what the article emphasizes is quarried stone as a more environmentally friendly alternative to concrete. The concrete production process produces an enormous amount of carbon dioxide, whereas stone can be used as quarried, directly from the ground. From the Guardian article:
When you step inside the Building Centre, you are immediately confronted with a large model of a speculative proposal for a 30-storey office tower – designed to be made entirely from stone. It looks like a series of Clerkenwell Closes stacked on top of each other, the chunky stone columns getting progressively thinner as they rise towards the clouds.
“We wanted to prove that a solid stone tower is eminently possible,” says Taha, handing me a substantial technical report that makes a hard-nosed case for such a building on grounds of both cost and carbon footprint. Using stone for the core, structure and floors, they argue, would be 75% cheaper than a steel and concrete structure, and have 95% less embodied carbon. The primary reason for the saving is that, while concrete and steel have to be fireproofed, weathered, insulated, then clad, a stone exoskeleton can be left exposed…
“Stone,” says architect Amin Taha, “is the great forgotten material of our time. In 99% of cases, it’s cheaper and greener to use stone in a structural way, as opposed to concrete or steel, but we mostly just think of using it for cladding.”…
The tactile qualities of stone are clear, but, for Taha, the environmental argument is what makes it such an important material to champion. “As a profession, we’re not thinking clearly about the embodied energy of building materials,” he says. “The perverse thing about concrete is that you take limestone, crush it, then burn it, by which time it loses 60% of its structural strength – so you then have to put steel reinforcement inside it. It’s total madness.”
By embracing stone as combined superstructure and external architectural finish, he says, we can save 60-90% of CO2 emissions for these key building elements. “And we’re standing on a gigantic ball of molten rock, so we’re not going to run out of stone any time soon.”
Can we build office towers from stone? Read the structural, economic and environmental case here: https://t.co/az2kAJFzjP@WebbYates @Groupwork_arch @stonemasonryco @Polycor @jacksoncoles @EightAssociates
— Building Centre (@BuildingCentre) March 25, 2020
Compare this to concrete:
After water, concrete is the most widely used substance on Earth. If the cement industry were a country, it would be the third largest carbon dioxide emitter in the world with up to 2.8bn tonnes, surpassed only by China and the US….Taking in all stages of production, concrete is said to be responsible for 4-8% of the world’s CO2. Among materials, only coal, oil and gas are a greater source of greenhouse gases. Half of concrete’s CO2 emissions are created during the manufacture of clinker, the most-energy intensive part of the cement-making process.
But other environmental impacts are far less well understood. Concrete is a thirsty behemoth, sucking up almost a 10th of the world’s industrial water use. This often strains supplies for drinking and irrigation, because 75% of this consumption is in drought and water-stressed regions. In cities, concrete also adds to the heat-island effect by absorbing the warmth of the sun and trapping gases from car exhausts and air-conditioner units – though it is, at least, better than darker asphalt.
It also worsens the problem of silicosis and other respiratory diseases…
Concrete: the most destructive material on Earth (The Guardian)
Plus, there’s just something about the “feel” of natural stone that can’t be captured by other materials. That’s why it has faced our buildings from ancient Egypt to ancient Rome to medieval Europe. Due to its “natural” qualities and heft, a solid stone wall simply “feels” better than modern veneer walls, in my opinion. In addition, stone and brick acquire a warm, pleasing patina over time, and are amenable to all sorts of creative expression not possible with other panelized systems, and certainly not with aluminum curtain walls. For example, Brick Expressionism (Wikipedia) was common in the early twentieth century, and the beauty, variety, and expressiveness of carved stone veneers are evident.
Alongside its sustainable qualities, it’s the material’s one-off nature that really appeals to the design world. “People increasingly want the authentic beauty and inconsistencies of natural stone,” says Solid Nature’s David Mahyari, “imitation ceramic tiles include realistic veins but have a repeat pattern like wallpapers, so you can tell quickly that they’re fake.”
Its age is also a factor. “Stone is a material that is millions of years old. Can you imagine this? I am completely convinced that this dimension also changes the way we relate to a stone object, establishing a different kind of connection with it and making it, somehow, more precious.”
London-based stone carver Simon Smith backs this up: “If the stone ‘takes a polish’, it’s like opening the door of the stone and seeing deep into it, and millions of years back in time.”
I’m not alone. I recently ran across this paragraph describing a project that used stone for vertical shading devices (The Jackman Law Building at the University of Toronto). It explained why the designers fought for natural stone instead of precast concrete for the shading devices:
The choice of stone for the shade fins stems from an aspiration to counter a look of mindless mediocrity that [Architect Siamak] Hariri sees being inflicted on cities by the widespread use of ersatz materials. Imitations lack the dignity, patina, and subtle variety of natural materials, he says, and he advocates for beauty as a value in its own right, as well as for its contribution to durability: “A really good building is one that people will not let be taken down.”
Continuing Education: Vertical Shading Devices (Architectural Record)
Traditionally, stone and brick cannot span spaces except by using an arch, a vault (basically an extruded arch), or a dome (a revolved arch). Thus, the spanning structure in stone buildings was often timber, or steel beams and trusses in newer buildings.
There is a way, however, to have stone span spaces: the flat stone vault, patented in 1699 by French engineer Joseph Abeille. Abeille’s vault has recently been used on an innovative project in Jerusalem: a gift shop added on to an old Crusader church!
The columns of the new shop are made out of massive stone, and the ceiling is a flat stone vault composed of 169 interlocking voussoirs. The system is inspired by the invention of French engineer Joseph Abeille (1673-1756), who patented in 1699 a special system that allowed the building of flat vaults.
The Flat Vault / AAU ANASTAS (Arch Daily)
The flat stone vault is completed! (AAU ANASTAS)
Tiles as a substitute for steel: the art of the timbrel vault (Low Tech Magazine)
The Nubian Vault is constructed of mud brick without requiring a temporary support:
They’ve even used such vaults to construct multi-story buildings without utilizing any concrete or steel:
The Sustainable Urban Dwelling Unit (SUDU) (No Tech Magazine)
These structures can be subsumed under the rubric of reciprocal supporting structures, in which each structural member supports every other member in turn, with a few members transferring the total load to the ground or supports (an interesting metaphor for society, no?). Reciprocal supporting structures are becoming increasingly popular.
Incidentally, I’ve noticed a distinct trend in modern architecture to have not just a single skin, but to divide the exterior from the interior using layers—for example a layer of sun screening, or a layer for privacy as in the house above, or balconies wrapped around the building to create a “semi-private” space, as in this building:
I wonder if cementitious foam insulation (Airkrete), sprayed inside a cavity, could give stone the necessary R-values to be used as an exterior wall without a second layer. Waterproofing could be accomplished by a hydrophobic coating. Such a wall would have decent thermal and waterproofing performance, not to mention be practically permanent (and beautiful, too!)
And there are some other promising new materials that have both a structural use and a beautiful texture. One that’s getting a lot of attention is cross-laminated timber (CLT). CLT consists of wood planks glued together to create a structurally stable panel (NLT, or nail-laminated timber, uses nails to hold the planks together instead). The planks are set at right angles to each other to provide structural stability, similar to how plywood is made, just at a larger scale. It’s part of a growing suite of mass timber technologies:
Mass timber is a generic term that encompasses products of various sizes and functions, like glue-laminated (glulam) beams, laminated veneer lumber (LVL), nail-laminated timber (NLT), and dowel-laminated timber (DLT). But the most common and most familiar form of mass timber, the one that has opened up the most new architectural possibilities, is cross-laminated timber (CLT).
To create CLT, lumber boards that have been trimmed and kiln-dried are glued atop one another in layers, crosswise, with the grain of each layer facing against the grain of the layer adjacent.
Stacking boards together this way can create large slabs, up to a foot thick and as large as 18 feet long by 98 feet wide, though the average is something more like 10 by 40. (At this point, the size of slabs is restricted less by manufacturing limitations than by transportation limitations.)
Slabs of wood this large can match or exceed the performance of concrete and steel. CLT can be used to make floors, walls, ceilings — entire buildings.
Two things make CLT so compelling: the wood facing of the material provides a beautiful surface which can be left exposed on the inside (on the outside you will still require waterproofing, insulation, and cladding). But perhaps the most attractive feature is that, since the panels are made from trees, they remove carbon from the air instead of adding it.
Unlike stone, wood is commonly used to span, and has been the most common material to do so since ancient times. The modern use of CLT leads to a wide range of structural expressions, with nearly endless variation:
Mass Timber Primer (Canadian Architect)
CLT is the hot material of the moment, and many designers are clamoring to build innovative large-scale structures with it. There are all sorts of proposals out there, from medium-sized buildings to skyscrapers (because we always have to build skyscrapers out of the hot new material for some reason). Although, for smaller-scale and residential structures, I wonder why structural insulated panels (SIPs) are not more popular. Those have been around for a long time (an innovative use of SIPs is the Ashen Cabin by HANNAH Architecture and Design).
Once upon a time, wood was a primary building material across much of the world. But with industrialization, that changed in the West.
German architect Arnim Seidel explains that steel and concrete became the dominant building materials to meet 20th-century demands: wide bridges, tall buildings, heavy loads.
“Wood came to be seen as backwards,” Seidel told DW.
Now, its environmental advantages are being recognized.
Materials like steel and concrete require massive amounts of energy to produce, and are usually transported over long distances. This emits CO2 that contributes to climate change.
By some estimates, producing a ton of concrete, or about a cubic meter, generates 410 kilograms of CO2 equivalent — the same amount of energy could power an average house for more than 10 days.
Locally harvested wood from sustainably managed forests not only has a much smaller carbon footprint in its production.
Using wood in buildings also sequesters carbon dioxide. When plants perform photosynthesis, this removes CO2 from the atmosphere and stores it in the wood.
“When we build with wood, we can conserve this stored CO2 for a longer period of time, and not emit it into the atmosphere,” Seidel told DW.
Another material that is making a comeback is rammed earth:
The name says it all: it’s made of damp soil or earth that is placed in formwork, and then compressed or rammed into a solid, dense wall. As a construction technique, rammed earth almost disappeared with the development of reinforced concrete, but there has been a revival in interest because of its aesthetics and its perceived environmental benefits.
The carefully chosen mix of silt, sand, and gravel with a low clay content is moistened and then placed in roughly 4-inch-deep layers between plywood forms; that’s why one sees the different colors and stripes, as each layer is often modified for aesthetic reasons. It used to be rammed by hand, but now powered rams are often used to reduce time and labor. Engineered structural reinforcing is often required.
Electric wiring and switch boxes can be built right into the wall as it goes up, so that a clean, interior earth finish can be maintained.
The structural potential of this material is more limited than that of the materials above. Cement-stabilized rammed earth has greater structural potential, but usually some sort of additional structure is used. Rammed earth walls tend to be mass walls, and this, along with other characteristics, limits them to fairly mild, drier climates such as the American Southwest, the Mediterranean, and Australia.
Like locally-quarried stone, using the earth from the site as a building material also anchors the building to the unique place, and allows us to surround ourselves with materials that look millions of years back in time.
The Dirt on Rammed Earth (Treehugger)
In summary, there are a lot of innovative materials, and new ways to use old materials, that add up to a wealth of design possibilities for buildings going forward. Let’s hope we can rise to the challenge and create a more inspiring built environment than has often been the case in the recent past.
* Of course it can, really, such as wind-driven rain, but I’m trying to keep this simple!
It’s time for a summer edition of fun facts!
The highest paid athlete of all time was a Roman charioteer; if he had lived today he would have been worth $15 billion.
Air pollution is responsible for shortening people’s lives worldwide on a scale far greater than wars and other forms of violence, parasitic and vector-borne diseases such as malaria, HIV/AIDS, and smoking.
California loses up to $1 billion in crops each year because of air pollution.
By 2010, 43,600 jobs had been lost or displaced in Michigan – and about 700,000 in the United States – due to the rise in the trade deficit with Mexico alone since NAFTA was enacted in 1994.
Distribution of Household Wealth in the U.S. since 1989 (Federal Reserve)
As of December 2016, more than 129 million Americans have only one option for broadband internet service in their area – equating to about 40 percent of the country.
The average consumer throws away 60 percent of clothing within a year of purchase.
TIL there are more payday loan stores in the US than there are Starbucks or McDonald’s.
‘Idiot’ once specifically referred to somebody with the mental age of a 2-year-old. ‘Imbecile’ referred to somebody with the mental age of a 3- to 7-year-old, and ‘Moron’ referred to somebody with the mental age of a 7- to 10-year-old.
If cows were a country, they would be the third-largest greenhouse gas emitter in the world.
Three out of four new or emerging infectious diseases are zoonotic.
There are fewer American farmers today than there were during the Civil War, despite America’s population being nearly 11 times greater.
Five companies own 80% of all stock in S&P 500 listed companies.
France’s longest border is with Brazil.
Rudolph Hass, the man who grew and patented the original Hass avocado tree, didn’t make very much money despite its success, as most people bought one single tree and then grew vast orchards from cuttings. He made only $5,000 from his patent, and remained a postman his entire life.
The largest ancient pyramid in the world is buried inside a mountain in modern-day Mexico underneath a church.
There is an inverse correlation between the amount of money spent on a wedding and how long the marriage lasts: the more people spend on the ceremony, the more likely the couple is to divorce.
As of 2018, there are 6 PR people for every journalist. Much of the change is attributed to the 45% loss of newspaper employees from 2008 to 2017. Additionally, the current median income of PR professionals is $61,150, compared to journalists’ $46,270.
Brazil has nearly 60,000 murders a year, more than the US, Canada, Australia, all of Europe, China, and many Asian countries combined.
I was planning to comment on the writeup that Slate Star Codex did on The Origin of Consciousness in the Breakdown of the Bicameral Mind by Julian Jaynes, which I was surprised had not been covered before. I suppose I’ll do it sooner rather than later, since Slate Star Codex has since been taken down. I guess that means I’ll be commenting on both topics.
As most of you are probably aware by now, the New York Times was planning on running an article about the blog that would have revealed the author’s real full name. The author, who blogs under the pen name Scott Alexander, claimed that the Times was going to “doxx” him, and that he needed to remain anonymous for professional reasons. As he describes it, removing the blog was the only way to stop the story from going out.
Now, I think the reasons he wished to remain anonymous were 100% legitimate: as a professional, there are certain ethical standards that you have to uphold, and if you have patients, having them able to read your opinions would probably color the doctor/patient relationship, which is particularly important with something like psychiatric counseling. And he also thought that being named in the New York Times would make him easier to locate, and that this would endanger the housemates he lives with, because he has received a number of death threats in the past which he apparently believes are credible (as an aside: can anyone express an opinion today without receiving death threats? What does that say about our society?)
I don’t know about using the term “doxxing” though; that seems intentionally hyperbolic. From my understanding, “doxxing” implies malicious intent. It’s deliberately publishing details about a person’s offline identity in order to threaten, harass, intimidate, or bully that person. The Times was doing no such thing—for better or worse, their policy was to use people’s real names unless there was a compelling reason to maintain a person’s anonymity (such as informants, whistleblowers, etc). You can certainly argue whether or not that’s a good policy (and I’m sure a lot of people think that it isn’t), but I’m sure the Times had their reasons, and there was no deliberate intent to harm Alexander or anyone else as far as I can tell from the story. For what it’s worth, I suspect this will eventually prompt the Times to change their policy, and the blog will be up again at some point in the future, so if you’re a fan of it, I wouldn’t worry.
Now, I’m hardly unbiased in this case. I too blog under a pseudonym, but for different reasons. I don’t have professional reasons to not use my real name, as I don’t have patients or clients. I do often have knowledge of confidential projects in my area, but I stringently make sure never to discuss my job or any of my professional work on this blog. And I’ve never received death threats, but even if I did, well, I live alone so if someone did decide to take me out, all that would happen is that I’d end up as dead on the outside as I am on the inside. It might even be doing me a favor.
Rather, I do it because I need to earn money to survive, and I don’t want potential employers to Google my name and find this blog or any of my opinions, even though I think they’re hardly radical or extreme. It’s sad that I have to worry about this, but that’s the world we live in. It also calls into question just how much “freedom” we really have in modern capitalist societies, but that’s a larger topic for another time. I’m scared shitless what would come up if I actually did Google my real name, so I’ve never done it. When Jim put up my recent interview on The Attack Ads! Podcast, he initially published my real name, but he was kind enough to remove it and replace it with my pen name (kinder, it seems, than the New York Times!)
I have been doxxed in real life, however, and it was not a pleasant experience. I might as well go ahead and tell the story.
The last job I had before the one I have now was for a local architecture firm, which allowed me to practice again. I put the name of my employer on my Facebook profile (I know, I know, but we’ve all done stupid things in life that make us go ‘what were you thinking?’ in retrospect).
I had an acrimonious exchange on Facebook with some random asshole, but what I didn’t know was that this random asshole happened to know one of my co-workers at this firm (who was also an asshole). Thus, armed for revenge, he sent the exchange to this scumbag, who subsequently printed it out and literally took it from desk to desk around the entire firm, and directly to the firm’s managers/owners, before I even knew what was happening. Clearly this person was an absolute sociopath, who—like so many Americans—enjoys destroying people for sport and twisting the knife simply because he can. I was sternly reprimanded by the firm’s leaders, and I’m sure it was a major factor in my eventual dismissal, effectively ending my professional career. Oh, and this incident exactly coincided with my mother’s final months dying of cancer.
So doxxing isn’t a good thing.
And it’s not like this was an isolated incident, either. I’ve had many, many experiences like this over my professional career and in my life—enough that it’s routine by now. Perhaps I just attract bullies. Incidents like this have convinced me that people are inherently cruel and evil, and will absolutely hurt you the minute they get the chance. It has led to my developing misanthropy and paranoia. I still have many PTSD symptoms, including nightmares about that job.
Of course, I immediately deleted my Facebook profile. I do currently have one under a false name, but only because I still needed to sell some of my mother’s hoarded stuff online. I don’t post anything there or have any personal info, of course. In order to have access to the Marketplace, you need to have what Facebook considers to be a valid profile (presumably to deter scammers), so I signed up for a couple of groups to make the algorithm think I’m a real person and let me have access. One was about Cardinals. The other was a Julian Jaynes discussion group.
Which finally brings us around full circle to the real subject matter at hand. I’m writing this now because I have to go from memory, as the original post is obviously no longer online.
Alexander begins by “rewriting” the book along similar lines, keeping the parts of the premise he thinks are valuable, and omitting the parts that he thinks are incorrect or speculative. This allows him to summarize the book that he thinks Jaynes “should have written.”
I actually enjoyed this approach. Unlike most Julian Jaynes fans, I’m not a Jaynes absolutist. I’ve noticed that most Jaynes enthusiasts accept 100% of his thesis and tend to treat the book as holy writ. I like to pick and choose what I think is correct.
Alexander claims that what Jaynes was actually describing was the beginning of Theory of Mind, rather than consciousness in the Jaynesian sense.
Now, I do think that Jaynes’s choice of the term consciousness is problematic. Jaynes’s supporters will always point out that he goes to great lengths to define what he means by consciousness, and they’re right—he does! But the thing is, if you have to go to such lengths to define what you mean by a term, then the term is poorly chosen. For the average person, consciousness is just the state of being awake, and when they hear that Jaynes is claiming that ancient people lacked consciousness, even though he explains what he means by that (their awareness was different from ours), most people will still reject the thesis outright. In other words, merely by choosing this term, you start out in a hole, and you have to spend a lot of time digging out of it before you can even do the heavy lifting. And when you’ve got a thesis as “out there” as Jaynes’s, that’s even more of a problem.
I wrote about Theory of Mind in my series of posts about the Origin of Religion. From my understanding, Theory of Mind is the ability to understand that others have thoughts, feelings and ideas different than your own. From this perspective, then, Jaynes would be arguing that an ancient Greek person would be unable to perceive that his fellow Greeks had different thoughts or possessed different knowledge than he did. Put another way, an ancient Greek person at the time of Homer would fail the Sally-Anne test.
But as far as I can tell, that’s not what Jaynes was saying at all! I find it hard to believe that the author got this concept wrong, considering he’s allegedly a psychiatrist. Maybe there’s some confusion of terminology here. Voice hearing has nothing to do with this ability. As far as I know, voice hearers and schizophrenics are still aware that other people have minds of their own.
Instead, the term I would use for what Jaynes is describing is meta-consciousness, or meta-awareness. The book’s title would then be The Origin of Meta-consciousness in the Breakdown of the Bicameral Mind, which I think is clearer. That concept is different from Theory of Mind. I would define meta-consciousness as being conscious of one’s own mental states. In this paradigm, consciousness is a thing that can be thought about and contemplated separately from one’s direct experience; whereas before, thoughts were just thoughts—there was no conceptual entity these thoughts were assigned to that would allow you to stand back from your own thoughts and reflect on them. It would be like trying to see your own eyeball without a reflection.
When people did have thoughts expressed as language inside their own heads (as opposed to verbalizations), they assigned these thoughts to a conceptual entity that has come down to us as “gods.” With the slipperiness of language, it’s possible that the word “god” simply referred to this inner voice, rather than to a “real” person as often depicted. To aid this conception, this inner voice was assigned a persona—the persona of the god. Statues were made of these imaginary entities who were the source of such voices. They became cultural touchstones. Both temples and statues were expressly designed to “call forth” this inner voice and hear the god’s command (i.e. induce hallucinations).
What they did NOT have was a conception of “inner self” or “soul” that these inner vocalizations could be assigned to. At least, not yet. Over time, they developed this conceptual framework though the expansion of metaphor, and this entity became the source of these nonverbalized thoughts rather than a “god.” They heard this voice, then, not as a hallucination commanding them to do things (or, rather, what we would term a hallucination), but more of a voice that was under their conscious control as surely as the ones that gave rise to verbal communication between their fellow men. “Consciousness is (a mental process creating) an introspectable mind-space.” That “introspectable mind space” is different than theory of mind, which has to do with how we perceive others.
Previously, I suggested that this was somehow related to the mind’s ability to grasp recursion, based on Douglas Hofstadter’s ideas about the recursive nature of consciousness. Once the mind could grasp the principle of recursion, it could develop meta-awareness, which is turning thoughts back on oneself as if in a hall of mirrors. This allowed for the development of a new kind of consciousness which allowed people to perceive the voices in one’s head as originating from the ‘self’ rather than a ‘god.’ I noted that the few populations who do not seem to have recursive structures in their language do indeed seem to have very fluid and undefined senses of self by our standards, and are prone to what from our vantage point would be hallucinations. This is speculation, however.
Alexander claims that Jaynes pins the breakdown of bicameral consciousness on increased trading during the Bronze Age, and the requirement to deal with other people in order to trade. To negotiate deals, you need to be able to put yourself in the mind of another person. Since he is operating on the assumption that Jaynes was talking about theory of mind, this makes sense. But Jaynes wasn’t really talking about this at all.
Although Jaynes does mention the increased trading during the Bronze Age, it is more the need for novel behaviors in general that he pinpoints, rather than just the need to trade per se. Jaynes argues that bicameralism was useful in a world where routine behaviors were the norm, and that people would hear the voices of their leaders in their heads commanding them what to do. In contrast, when such top-down command structures did not work—such as when dealing with outsiders—it called forth new types of behavior, and this is what caused the breakdown of bicameral consciousness, not simply trade.
What’s also odd is that an even bigger culprit in Jaynes’s view is the advent of the written word, which Alexander omits completely. Oral cultures would favor bicameralism, because orders are passed down vocally from the leaders, who then become gods in their heads commanding them. But with the written word, one takes command of one’s own inner voice. You use your brain in a completely different manner in the act of reading than you do in a world where 100% of interpersonal communication is via speech. This seems like a much more likely explanation of the shift in brain function than just trade alone. Why not mention it? He also omits many of Jaynes’s ideas about the value of metaphor in language. Language is what allows us to construct the metaphorical self and the “Analog I.”
Alexander briefly mentions that Jaynes’s conception of the split brain was based on Michael Gazzaniga’s research (and through him Roger Sperry), and that a lot of this research has been debunked or superseded. He offers no sources to back up this claim, however. I was surprised by this, because one would have thought that if anywhere, a psychiatrist—who is a doctor that specializes in the brain, after all—would have more qualifications here than anywhere else. From my readings, it appears that a good portion of Jaynes’s claims about how the mind processes language across the hemispheres has comported with newer research, even if the entire concept of bicameralism has not.
There is also no mention of the reassessment of Jaynes’s thesis by a cross-disciplinary team in 2007 that expressed qualified support for it: The bicameral mind 30 years on: a critical reappraisal of Julian Jaynes’ hypothesis. From what I recall from Charles Fernyhough’s The Voices Within, there has been some empirical support for Jaynes’s model of how the brain hears voices in recent research.
Neuroscience Confirms Julian Jaynes’s Neurological Model (The Julian Jaynes Society)
Split-Brain Researchers Are Split (Psychology Today)
There is also no mention of Jaynes’s ideas on hypnotism, which is strange. Most people associate Jaynes’s ideas with schizophrenia, which is the hearing of voices, after all. But Jaynes also claimed that his ideas explained hypnotism—hypnotism was a throwback to bicameral consciousness where verbal commands would trigger a trance mode. Both schizophrenia and hypnotism are “throwbacks” to bicameral consciousness, he argued. He even claims that there is no other valid explanation for this hypnotic state in the psychological literature; rather, it’s just handwaved away. As he writes:
…hypnosis is the black sheep of the family of problems which constitute psychology. It wanders in and out of laboratories and carnivals and clinics and village halls like an unwanted anomaly. It never seems to straighten up and resolve itself into the firmer properties of scientific theory. Indeed, its very possibility seems like a denial of our immediate ideas about conscious self-control on the one hand, and our scientific idea about personality on the other. Yet it should be conspicuous that any theory of consciousness and its origin, if it is to be responsible, must face the difficulty of this deviant type of behavioral control.
I think my answer to the opening question is obvious: hypnosis can cause this extra enabling because it engages the general bicameral paradigm which allows a more absolute control over behavior than is possible with consciousness. (original emphasis)
Whether he’s right or not, conventional psychology really does offer no good explanation for hypnotism, reinforcing his point. Hypnotherapy is a legitimate method of therapy nowadays, yet we have no real idea how or why it works!
Finally, Alexander does raise an objection I’ve always had, namely that if Jaynes’s thesis is correct, then anthropologists should have discovered a true bicameral culture somewhere in the world by now, especially in very remote cultures that have been cut off from the wider world. He notes that there are a lot of strange things going on with consciousness detailed in the anthropological literature, but nothing that rises to Jaynes’s description. He also notes that anthropological descriptions that comport somewhat with Jaynes’s description may have been published in various later books.
I believe he’s referring to Gods, Voices, and the Bicameral Mind: The Theories of Julian Jaynes, which is published by the Julian Jaynes society. I’ve been wanting to get a hold of that book, but have been reluctant due to recent events. But I’ve heard Jaynes’s partisans claim that bicameral consciousness has in fact been documented in the anthropological literature, and that the book contains some papers documenting this. So maybe I’m off base here.
Yes, it does seem that something exceptional is going on with the consciousness of pre-contact peoples, but nonetheless, it’s still a bit different than the scenario Jaynes describes in the book. People will mention the Pirahã for example. And while it’s true that there are any number of anomalous events recorded in descriptions of them, they are still different than the bicameral civilization as Jaynes outlines it.
This is often explained by claiming that bicameral consciousness was not a trait of small tribal peoples, but only began with the shift to larger societies during the Mesolithic period. They will point to the construction of large structures like the recently discovered prehistoric circle of shafts near Stonehenge as a sign of the onset of bicameralism. In chapter 1 of book two, Jaynes writes:
With but few exceptions, the plan of human group habitation from the end of the Mesolithic up to the relatively recent eras is of a god-house surrounded by man-houses.
Adding on to the idea of god houses, he also pinpoints this as the reason for the elaborate burials of deceased god-kings with grave goods:
The burial of the important dead as if they still lived is common to almost all these ancient cultures whose architecture we have just looked at. This practice has no clear explanation except that their voices were still being heard by the living, and were perhaps demanding such accommodation…these dead kings, propped up on stones, whose voices were hallucinated by the living, were the first gods. (p. 379)
Just about all ancient cultures, from the Near East, to Mesoamerica, to China, look after the departed with goods, food and offerings, and Jaynes claims this is because bicameral man still hallucinated the voices of the dead god-kings in their heads. These elaborate burials and town layouts do not occur with scattered bands of hunter-gatherers such as the Pirahã, or Australian aborigines, or any of the isolated cultures we are likely to find, goes the argument. In Jaynes’s conception, “early cultures develop into bicameral kingdoms.” And so it’s no surprise that we wouldn’t find any such civilization that we can document anthropologically, say Jaynes’s defenders.
But I still insist we would have found something similar to this by now. There’s a lot of anthropological literature covering a wide range of cultures across the entire world. In this conception, bicameralism is a transient phenomenon which arrives with the onset of larger cultures, and then disappears when those cultures come into contact with outsiders, or become literate. This would mean that bicameralism is a phenomenon lasting only a few thousand years at most. I don’t know if I’m willing to accept that.
Overall, aside from my quibbles above, I think the review did a good job of describing Jaynes’s ideas and taking them seriously on their own terms. I particularly liked the author’s observation that writing off the numerous depictions of gods and men speaking directly to each other as simply metaphorical is basically “kind of cheating”—in a way it is. If we take these phenomena seriously just as they were described, and don’t resort to the cheats and dodges of “it’s all just metaphorical,” then we come to very different conclusions.
For what it’s worth, I have an alternative conception of Jaynes that I’ve been meaning to write up for a while now. This obviously isn’t the time or the place. But my argument is essentially that, to borrow from Ran Prieur, “ancient people weren’t schizophrenic, they were tripping.” I think Ran’s basically correct. They weren’t literally tripping, of course—it’s just that their brains were working in a way more similar to a modern person on psychedelics than to a modern person’s everyday consciousness. Tripping people often hear voices and “see” entities as a matter of course. Any state of consciousness that the brain can achieve with a drug it can achieve without that drug.
This descriptor comes from Robin Carhart-Harris’s work on psychedelics in the treatment of psychological disorders. He uses the term “entropy” to describe the difference in how the brain works on a psychedelic versus in “normal” consciousness. Entropic brains have a much less defined sense of self, and process the world around them in a fundamentally different way than less entropic ones. I think the way ancient people processed the world was something closer to the entropic brain on a psychedelic, or to the way children perceive things (incidentally, meditation has been shown to increase brain entropy). Why this was the case I’m not sure, but it may have to do with the fact that our own brains probably produce DMT, and that the level may have dropped over time. This could be because instrumental rationality became more adaptive as societies grew larger and more complex, and our major challenge became dealing with other people rather than with nature directly. This changed our style of thinking from “primary consciousness” to “secondary consciousness”:
This article proposes that states such as the psychedelic state, REM sleep, the onset-phase of psychosis and the dreamy-state of temporal lobe epilepsy are examples of a regressive style of cognition that is qualitatively different to the normal waking consciousness of healthy adult humans. We will refer to this mode of cognition as “primary consciousness” and the states themselves as “primary states.” To enter a primary state from normal waking consciousness, it is proposed that the brain must undergo a “phase transition”, just as there must have been a phase-transition in the evolution of human consciousness with the relatively rapid development of the ego and its capacity for metacognition. This implies that the relationship between normal waking consciousness and “primary consciousness” is not perfectly continuous.
The entropic brain: a theory of conscious states informed by neuroimaging research with psychedelic drugs (Frontiers in Neuroscience)