Fun Facts March 2019

Sodium citrate is the secret ingredient to making nacho cheese sauce. Coincidentally, sodium citrate’s chemical formula is Na3C6H5O7 (NaCHO).
Cook’s Illustrated Explains: Sodium Citrate (Cook’s Illustrated)

According to the FBI, there are 300 times more impostor Navy SEALs than actual SEALs.
Don Shipley (Navy SEAL) (Wikipedia)

You were more likely to get a job if you had smallpox scars in the 18th century. The scars proved that you already had smallpox and could not pass it on to your employers.

1,500 private jets flew into Davos in 2019.
1,500 private jets coming to Davos (BoingBoing)

According to US Customs and Border Protection, border crossings of Mexican and Central American refugees ranged from 20,000 to roughly 60,000 people per month in 2018. In Los Algodones [Mexico] alone, nearly five times as many American dental refugees are going the opposite way. To get an idea of the absurdity, one could argue there are more people currently fleeing the US’s health care system than refugees seeking asylum from extreme violence and state terror in Central America.
Millions of Americans Flood Into Mexico for Health Care — the Human Caravan You Haven’t Heard About (Truthout). Similarly:

The U.S. government estimates that close to 1 million people in California alone cross to Mexico annually for health care, including to buy prescription drugs. And between 150,000 and 320,000 Americans list health care as a reason for traveling abroad each year. Cost savings is the most commonly cited reason.
American Travelers Seek Cheaper Prescription Drugs In Mexico And Beyond (NPR). Who’s the Third World country now???

Virginia students learn in trailers while state offers Amazon huge tax breaks (The Guardian)

The term “litterbug” was popularized by Keep America Beautiful, which was created by “beer, beer cans, bottles, soft drinks, candy, cigarettes” manufacturers to shift public debate away from radical legislation to control the amount of waste these companies were (and still are) putting out.
A Beautiful (If Evil) Strategy (Plastic Pollution Coalition)

Americans Got 26.3 Billion Robocalls Last Year, Up 46 Percent From 2017.

Over the past 20 years, more than $7 billion in public money has gone toward financing the construction and renovation of NFL football stadiums.
Why do taxpayers pay billions for football stadiums? (Vox)

San Francisco has more drug addicts than it has students enrolled in its public high schools.

By 2025, deaths from illicit opioid abuse are expected to skyrocket by 147%, up from 2015. Between 2015 and 2025, around 700,000 people are projected to die from an opioid overdose, and 80% of these will be caused by illicit opioids such as heroin and fentanyl. (in other words, everything is going according to plan)

35% of the decline in fertility between 2007 and 2016 can be explained by declines in births that were likely unintended, driven by drops in births to young women.

In 1853, not many Americans worked in an office. Even as late as the 1880s, fewer than 5 percent of Americans were involved in clerical work.
The Open Office and the Spirit of Capitalism (American Affairs)

About 40% of young adults in the UK cannot afford to buy one of the cheapest homes in their area, with the average deposit now standing at about £26,000.
Young people living in vans, tiny homes and containers (BBC)

Terror attacks by Muslims receive an average of 357 percent more media coverage than those by other groups. (Newsweek). Maybe the New Zealand mosque shooting will change that.

One-third of the billions of dollars [GoFundMe] has raised since its inception went toward somebody’s medical expenses.
US Healthcare Disgrace: GoFundMe-Care Symptomatic of Extreme Inequality (Who. What. Why)

40% of police officer families experience domestic violence, in contrast to 10% of families in the general population.

After water, concrete is the most widely used substance on Earth. If the cement industry were a country, it would be the third largest carbon dioxide emitter in the world with up to 2.8bn tonnes, surpassed only by China and the US.
Concrete: the most destructive material on Earth (The Guardian)

Rural areas have not even recovered the jobs they lost in the recession… Suicide rates are on the rise across the nation but nowhere more so than in rural counties.
Two-Thirds of Rural Counties Have Fewer Jobs Today Than in 2007 (Daily Yonder)

Mapping the rising tide of suicide across the United States (Washington Post). According to plan…

On any given day, 37 percent of American adults eat fast food. For those between 20 and 39 years old, the number goes up to 45 percent – meaning that on any given day, almost half of younger adults are eating fast food.
4 troubling ways fast food has changed in 30 years (Treehugger)

Global investors dumped $4.2 billion into companies working on self-driving cars (or autonomous vehicles, AVs) in the first 3 quarters of 2018.
In Praise of Dumb Transportation (Treehugger)

In the early Middle Ages, nearly one out of every thousand people in the world lived in Angkor, the sprawling capital of the Khmer Empire in present-day Cambodia.
The city of Angkor died a slow death (Ars Technica)

Neanderthals are depicted as degenerate and slouching because the first Neanderthal skeleton found happened to be arthritic.
20 Things You didn’t Know About Neanderthals (Discover)

There were more than twice as many suicides (44,193) in the US as there were homicides (17,793).
College Dreams Dashed (Psychology Today)

Adolescents are more likely to feel depressed and self-harm, and are less likely to get a full night’s sleep, than 10 years ago.
Adolescent health: Teens ‘more depressed and sleeping less’ (BBC)

When his eight years as President of the United States ended on January 20, 1953, private citizen Harry Truman took the train home to Independence, Missouri, mingling with other passengers along the way. He had no secret service protection. His only income was an Army pension. (Reddit)

Khoisan people of South Africa were once the most populous humans on Earth. (Ancient Origins)

[T]he contribution of top firms to US productivity growth has dropped by over 40 percent since 2000. [If] in the 1960s you were to double the productivity of GM, that would clearly have a huge impact on the economy. If you were to double the productivity of Facebook overnight, it wouldn’t even move the needle – you would get slightly better targeted ads, but zero impact on the economy.
The “Biggest Puzzle in Economics”: Why the “Superstar Economy” Lacks Any Actual Superstars (ProMarket)

Almost half of new cancer patients lose their entire life savings. (Insider)

The son of a US Governor is 6,000 times more likely to become a Governor than the average American and the son of a US Senator is 8,500 times more likely to become a senator than the average American. (Reddit)

From 1987 until 2011-12—the most recent academic year for which comparable figures are available—universities and colleges collectively added 517,636 administrators and professional employees…

Part-time faculty and teaching assistants now account for half of instructional staffs at colleges and universities, up from one-third in 1987. During the same period, the number of administrators and professional staff has more than doubled. That’s a rate of increase more than twice as fast as the growth in the number of students.
New Analysis Shows Problematic Boom In Higher Ed Administrators (Huffington Post)

From 2009 to 2017, major depression among 20- to 21-year-olds more than doubled, rising from 7 percent to 15 percent. Depression surged 69 percent among 16- to 17-year-olds. Serious psychological distress, which includes feelings of anxiety and hopelessness, jumped 71 percent among 18- to 25-year-olds from 2008 to 2017. Twice as many 22- to 23-year-olds attempted suicide in 2017 compared with 2008, and 55 percent more had suicidal thoughts. The increases were more pronounced among girls and young women. By 2017, one out of five 12- to 17-year-old girls had experienced major depression in the previous year.
The mental health crisis among America’s youth is real – and staggering (The Conversation)

Infectious diseases that ravaged populations in the Middle Ages are resurging in California and around the country, especially in homeless encampments.
“Medieval” Diseases Flare as Unsanitary Living Conditions Proliferate (Truthout) Who’s the Third World Country? Repeat after me, “according to plan…”

Benjamin Franklin chose never to patent any of his inventions or register any copyright.

I think it’s time to get the hell out of here:

Rhapsody on Blue

A few years ago, a photograph went “viral” on the internet. It was just a simple picture of a dress. What was so compelling about it?

Well, what was so incredible about this particular photo was that nobody could agree about what color it was. Some people said it was white with gold stripes. Others insisted, just as firmly, that it was blue with black stripes (which is what I saw). As the BBC reported, even Kim and Kanye couldn’t agree, but decided to stay together for the sake of the money and the fame.

Why everyone is asking: ‘What colour is this dress?’ (BBC)

White & Gold or Blue & Black? Science of the Mystery Dress (Live Science)

Relevant xkcd:

This brings to mind an old adage I heard a long time ago: “You don’t see with your eyes. You see with your brain with the help of your eyes.”

And that simple, yet profound, distinction makes all the difference. Once you grasp that, a lot of these ideas begin falling into place.

For another example somewhat more pertinent to our discussion of auditory hallucinations, a sound clip went viral in much the same way. When the clip was played, some people heard the name “Laurel”. Others insisted that what the clip really said was “Yanny”. As one researcher said of these illusions, “All of this goes to highlight just how much the brain is an active interpreter of sensory input, and thus that the external world is less objective than we like to believe.”

‘Yanny’ or ‘Laurel’? Why Your Brain Hears One or the Other in This Maddening Illusion (Live Science)

Of course, the ultimate reason for the illusion was exactly the same: You don’t hear with your ears. You hear with your brain with the help of your ears.

Now, you need to keep this in mind with the discussion we’re about to have.

We’ve talked previously about how metaphor, analogy, language, and culture shape our perceptions of the world around us. It turns out that numerous studies have confirmed that the classification schemes, metaphors, models, and language that we use color our perception of the so-called “objective” world. And ‘color’ turns out to be an apt word.

For example, many cultures around the world do not make a distinction between the colors blue and green. That is, they don’t actually have a word for ‘blue’; rather, blue and green are classified as different shades of the same color. In fact, 68 languages use green-or-blue (“grue”) words, compared to only 30 languages that use distinct words for green and blue. This does not mean that people in these cultures literally cannot ‘see’ the color blue, as if they perceived it as another color, or as somehow invisible (color perception is created by light wavelengths striking cone cells on the retina). Rather, they simply feel that no special distinction needs to be made between these colors in the language.

It turns out that this actually affects how such cultures perceive the world around them. The Himba (whom we mentioned previously) also do not make the blue/green distinction. When given the task of identifying which of several shades of blue and green was different, they were slower than people from cultures that do make such a distinction. By contrast, they do differentiate multiple shades of green, and were able to pick out a different shade of green faster than people from cultures that don’t make such fine distinctions (such as ours).

…there’s actually evidence that, until modern times, humans didn’t actually see the colour blue…the evidence dates all the way back to the 1800s. That’s when scholar William Gladstone – who later went on to be the Prime Minister of Great Britain – noticed that, in the Odyssey, Homer describes the ocean as “wine-dark” and other strange hues, but he never uses the word ‘blue’.

A few years later, a philologist (someone who studies language and words) called Lazarus Geiger decided to follow up on this observation, and analysed ancient Icelandic, Hindu, Chinese, Arabic, and Hebrew texts to see if they used the colour. He found no mention of the word blue.

When you think about it, it’s not that crazy. Other than the sky, there isn’t really much in nature that is inherently a vibrant blue.

In fact, the first society to have a word for the colour blue was the Egyptians, the only culture that could produce blue dyes. From then, it seems that awareness of the colour spread throughout the modern world…Another study by MIT scientists in 2007 showed that native Russian speakers, who don’t have one single word for blue, but instead have a word for light blue (goluboy) [голубой] and dark blue (siniy) [синий], can discriminate between light and dark shades of blue much faster than English speakers.

This all suggests that, until they had a word for it, it’s likely that our ancestors didn’t actually see blue. Or, more accurately, they probably saw it as we do now, but they never really noticed it…

There’s Evidence Humans Didn’t Actually See Blue Until Modern Times (Science Alert – note the title is misleading)

In fact, the way color is described throughout the Iliad is distinctly odd, a fact that scholars have long noted:

Homer’s descriptions of color in The Iliad and The Odyssey, taken literally, paint an almost psychedelic landscape: in addition to the sea, sheep were also the color of wine; honey was green, as were the fear-filled faces of men; and the sky is often described as bronze.

It gets stranger. Not only was Homer’s palette limited to only five colors (metallics, black, white, yellow-green, and red), but a prominent philosopher even centuries later, Empedocles, believed that all color was limited to four categories: white/light, dark/black, red, and yellow. Xenophanes, another philosopher, described the rainbow as having but three bands of color: porphyra (dark purple), khloros, and erythros (red).

The Wine-Dark Sea: Color and Perception in the Ancient World (Clarkesworld Magazine)

Perhaps the blind poet was, indeed, tripping. But the ancient Greeks were hardly alone in their unusual description of colors:

The conspicuous absence of blue is not limited to the Greeks. The color “blue” appears not once in the New Testament, and its appearance in the Torah is questioned (there are two words argued to be types of blue, sappir and tekeleth, but the latter appears to be arguably purple, and neither color is used, for instance, to describe the sky). Ancient Japanese used the same word for blue and green (青 Ao), and even modern Japanese describes, for instance, thriving trees as being “very blue,” retaining this artifact (青々とした: meaning “lush” or “abundant”).

It turns out that the appearance of color in ancient texts, while also reasonably paralleling the frequency of colors that can be found in nature (blue and purple are very rare, red is quite frequent, and greens and browns are everywhere), tends to happen in the same sequence regardless of civilization: red : ochre : green : violet : yellow—and eventually, at least with the Egyptians and Byzantines, blue.

The Wine-Dark Sea: Color and Perception in the Ancient World (Clarkesworld Magazine)

Of course, biology has a role to play here too. If someone is red/green color blind, as about 1 in 10 men are, they will have great difficulty differentiating red and green. Nor will they be able to adequately describe what they are seeing to those of us who are not color-blind.

I always remember a discussion I had many years ago with a friend of mine who was color-blind (the one who drowned, incidentally). I asked him if he saw red and green as both red or both green. Here’s what he told me: “They’re the same.”

Me: ‘The same’ as in they’re both red, or ‘the same’ as in they’re both green?

Him: Neither. They’re just the same.

Me: So…they’re both gray then? No color at all.

Him: No, it’s not gray. It’s a color.

Me: Okay, which color? Red or green?

Him: Neither.

Me: How can it be neither? It has to be a color. Which color is it, red or green? Or some other color?

Him: I don’t know. They’re just…the same.

And on and on we went…

The Radiolab podcast did a whole episode on the topic which is worth a listen: Why the sky isn’t blue (Radiolab)

And a video explanation: The Invention Of Blue (YouTube)

The World Atlas of Language Structures Online has an entire entry devoted to terms for Green and Blue that is worth reading.

This post: Blue on Blue goes into this topic in exhaustive detail.

Perception is as much cognition as sensation. Colors don’t exist in the world; they are our brain’s way of processing the light waves detected by the eyes. Someone unable to see from birth will never be able to see normal colors, even if they gain sight as an adult. The brain has to learn how to see the world, and that is a process that primarily happens in infancy and childhood.

Radical questions follow from this insight. Do we experience blue, forgiveness, individuality, etc. before our culture has the language for it? And, conversely, does the language we use and how we use it indicate our actual experience? Or does it filter and shape it? Did the ancients lack not only perceived blueness but also individuated/interiorized consciousness and artistic perspective because they had no way of communicating and expressing it? If they possessed such things as their human birthright, why did they not communicate them in their texts and show them in their art?

This isn’t just about color. There is something extremely bizarre going on, according to what we moderns assume to be the case about the human mind and perception.

Blue on Blue (Benjamin David Steele – a lot of material on Jaynes’s ideas here)

Another example is the fact that some cultures don’t have words for the relative directions that we have (left, right, etc.). Instead, they only have the cardinal directions—north, south, east, and west. This “exocentric orientation” gives them an almost superhuman sense of direction and orientation compared to people in industrialized cultures:

In order to speak a language like Guugu Yimithirr, you need to know where the cardinal directions are at each and every moment of your waking life. You need to have a compass in your mind that operates all the time, day and night, without lunch breaks or weekends off, since otherwise you would not be able to impart the most basic information or understand what people around you are saying.

Indeed, speakers of geographic languages seem to have an almost-superhuman sense of orientation. Regardless of visibility conditions, regardless of whether they are in thick forest or on an open plain, whether outside or indoors or even in caves, whether stationary or moving, they have a spot-on sense of direction. They don’t look at the sun and pause for a moment of calculation before they say, “There’s an ant just north of your foot.” They simply feel where north, south, west and east are, just as people with perfect pitch feel what each note is without having to calculate intervals.

There is a wealth of stories about what to us may seem like incredible feats of orientation but for speakers of geographic languages are just a matter of course. One report relates how a speaker of Tzeltal from southern Mexico was blindfolded and spun around more than 20 times in a darkened house. Still blindfolded and dizzy, he pointed without hesitation at the geographic directions.

Does Your Language Shape How You Think? (New York Times)

The reference to perfect pitch is interesting, since it’s more likely for speakers of tonal languages (say, Mandarin Chinese or Vietnamese) to have perfect pitch than people who do not speak a tonal language (such as English). Another common feature of many languages is that statements, by their very syntactic structure, establish whether the speaker knows something for sure, or is making an extrapolation. For example:

…some languages, like Matsés in Peru, oblige their speakers, like the finickiest of lawyers, to specify exactly how they came to know about the facts they are reporting. You cannot simply say, as in English, “An animal passed here.” You have to specify, using a different verbal form, whether this was directly experienced (you saw the animal passing), inferred (you saw footprints), conjectured (animals generally pass there that time of day), hearsay or such. If a statement is reported with the incorrect “evidentiality,” it is considered a lie.

So if, for instance, you ask a Matsés man how many wives he has, unless he can actually see his wives at that very moment, he would have to answer in the past tense and would say something like “There were two last time I checked.” After all, given that the wives are not present, he cannot be absolutely certain that one of them hasn’t died or run off with another man since he last saw them, even if this was only five minutes ago. So he cannot report it as a certain fact in the present tense. Does the need to think constantly about epistemology in such a careful and sophisticated manner inform the speakers’ outlook on life or their sense of truth and causation?

Does Your Language Shape How You Think? (New York Times)

The Pirahã of the Brazilian Amazon have a number of these linguistic anomalies, as reported by Daniel Everett. Most famously, they do not use recursion in their language. They have essentially no numbering system—their only number words are ‘one’, ‘two’, and ‘many’. Nouns have no plural form. They have no simple categorical words for colors; rather, they describe color in terms of various things in their environment, somewhat reminiscent of Homer’s graphic descriptions above:

I next noticed…that the Pirahãs had no simple color words, that is, no terms for color that were not composed of other words. I had originally simply accepted Steve Sheldon’s analysis that there were color terms in Pirahã. Sheldon’s list of colors consisted of the terms for black, white, red (also referring to yellow), and green (also referring to blue).

However, these were not simple words, as it turned out. They were phrases. More accurate translations of the Pirahã words showed them to mean: “blood is dirty” for black; “it sees” or “it is transparent” for white; “it is blood” for red; and “it is temporarily being immature” for green.

I believe that color terms share at least one property with numbers. Numbers are generalizations that group entities into sets that share general arithmetical properties, rather than object-particular, immediate properties. Likewise, as numerous studies by psychologists, linguists, and philosophers have demonstrated, color terms are unlike other adjectives or other words because they involve special generalizations that put artificial boundaries in the spectrum of visible light.

This doesn’t mean that the Pirahãs cannot perceive colors or refer to them. They perceive the colors around them like any of us. But they don’t codify their color experiences with single words that are inflexibly used to generalize color experiences. They use phrases.

“Don’t Sleep There Are Snakes” by Daniel Everett, p. 119

They also do not have any relative directions like ‘left’ and ‘right’; only absolute ones, much like the Australian groups mentioned earlier. In their culture, everything is oriented relative to the river beside which they live:

During the rest of our hunt, I noticed that directions were given either in terms of the river (upriver, downriver, to the river) or the jungle (into the jungle). The Pirahãs knew where the river was (I couldn’t tell; I was thoroughly disoriented). They all seemed to orient themselves to their geography rather than to their bodies, as we do when we use left hand and right hand for directions.

I didn’t understand this. I had never found the words for left hand and right hand. The discovery of the Pirahãs’ use of the river in giving directions did explain, however, why when the Pirahãs visited towns with me, one of their first questions was “Where is the river?” They needed to know how to orient themselves in the world!

Only years later did I read the fascinating research coming from the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, under the direction of Dr. Stephen C. Levinson. In studies from different cultures and languages, Levinson’s team discovered two broad divisions in the ways cultures and languages give local directions. Many cultures are like American and European cultures and orient themselves in relative terms, dependent on body orientation, such as left and right. This is called by some endocentric orientation. Others, like the Pirahãs, orient themselves to objects external to their body, what some refer to as exocentric orientation.

“Don’t Sleep There Are Snakes” by Daniel Everett p. 216

Despite what some might characterize as simplicity, the verbs in the language display a remarkable complexity and nuance:

Although Pirahã nouns are simple, Pirahã verbs are much more complicated. Each verb can have as many as sixteen suffixes; that is, up to sixteen suffixes in a row. Not all suffixes are always required, however. Since a suffix can be present or absent, this gives us two possibilities for each of the sixteen suffixes: 2^16, or 65,536, possible forms for any Pirahã verb. The number is not this large in reality, because some of the meanings of different suffixes are incompatible and could not both appear simultaneously. But the number is still many times larger than in any European language. English only has in the neighborhood of five forms for any verb: sing, sang, sung, sings, singing. Spanish, Portuguese, and some other Romance languages have forty or fifty forms for each verb.

Perhaps the most interesting suffixes, however (though these are not unique to Pirahã), are what linguists call evidentials, elements that represent the speaker’s evaluation of his or her knowledge of what he or she is saying. There are three of these in Pirahã: hearsay, observation, and deduction… The placement of all the various suffixes on the basic verb is a feature of grammar. There are sixteen of these suffixes. Meaning plays at least a partial role in how they are placed. So, for example, the evidentials are at the very end because they represent a judgment about the entire event being described. (“Don’t Sleep There Are Snakes,” pp. 196–197)
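Everett’s arithmetic is easy to check with a quick sketch. Sixteen optional suffix slots, each present or absent, give an upper bound of 2^16 verb forms; incompatible suffixes then shrink that count. The pair of mutually exclusive slots below is purely hypothetical (the real incompatibilities are described in Everett’s grammar, not listed here), but it shows how even one such constraint prunes the space:

```python
NUM_SLOTS = 16

# Upper bound: each of the 16 optional suffix slots is either present
# or absent, giving 2**16 possible verb forms.
upper_bound = 2 ** NUM_SLOTS
print(upper_bound)  # 65536

# Hypothetical constraint: suppose slots 0 and 1 cannot co-occur
# (e.g. two evidentials that exclude each other). Count the
# remaining combinations by brute force over all bit masks.
incompatible_pairs = [(0, 1)]

def is_valid(mask: int) -> bool:
    """True if no incompatible pair of slots is set in this mask."""
    return not any(mask >> a & 1 and mask >> b & 1
                   for a, b in incompatible_pairs)

valid_forms = sum(1 for mask in range(2 ** NUM_SLOTS) if is_valid(mask))
print(valid_forms)  # 49152 = 65536 - 2**14 forms that use both slots
```

Each additional exclusion rule knocks out another slice of the space, which is why the real number of Pirahã verb forms is well below 65,536 while still dwarfing the handful of forms an English verb has.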

This brings to mind a fascinating point that is not widely known: as material cultures become more complex, their grammars actually become simpler!

Comparing languages across differing cultures suggests an inverse relation between the complexity of grammar and the complexity of culture; the simpler the culture in material terms, the more complex the grammar. Mark Turin notes that colonial-era anthropologists set out to show that indigenous peoples were at a lower stage of evolutionary development than the imperial Western peoples, but linguistic evidence showed the languages of supposedly primitive peoples to have surprisingly complex grammar.

He writes: “Linguists were returning from the field with accounts of extremely complex verbal agreement systems, huge numbers of numeral classifiers, scores of different pronouns and nouns, and incredible lexical variation for terms that were simple in English. Such languages appeared to be untranslatable…” (p. 17) …Thus the languages of simpler cultures tend to pack grammatical information into single words, whereas those of industrial society tend to use separate words in combination to create grammatical distinctions… (p. 52) In some languages, entire sentences are packed into a single word. Nicholas Evans and Stephen Levinson give the examples of Ęskakhǭna’tàyęthwahs from the Cayuga of North America, which means “I will plant potatoes for them again,” and abanyawoihwarrgahmarneganjginjeng from the Northern Australian language Bininj Gun-wok, which means “I cooked the wrong meat for them again.” (pp. 16-17)

“The Truth About Language” by Michael C. Corballis

Last time we referred to the substantial differences in behavior that were discovered by Joseph Henrich, et alia, between Western “WEIRD” cultures and, well, just about everyone else.

As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic.

The distinct ways Americans and Machiguengans played the ultimatum game, for instance, weren’t because they had differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies.

When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.

The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do—the rituals, eating preferences, codes of behavior, and the like—but in the way they mold our most fundamental conscious and unconscious thinking and perception.

We Aren’t the World (Pacific Standard)

It brings to mind another old adage: “What we call human nature is really human habit.” That may not be true for everything, but it looks like it may be true for at least some things.

Jaynes makes a great deal of the fact that the Greek language lacked any reference to an inner decision-making process (mind), or to any kind of “soul” apart from the body. When it isn’t locating the source of actors’ motivations in the gods speaking directly to them, it locates it in various parts of the body or internal organs. The terms used in place of any kind of reference to mind or spirit are often body parts—heart, chest, lungs, liver, spleen, guts, and so on. These body parts later come to refer to a mind or soul (e.g. nous or psyche), but only much later. Psyche, for example, initially referred to ‘breath’, and nous (noos) referred to vision. Only much later do these words become associated with concepts of spirit, soul, or self. Brian McVeigh puts it another, somewhat more precise, way: “[L]inguo-conceptual changes [reflect] psychohistorical developments; because supernatural entities functioned in place of our inner selves, vocabularies for psychological terms were strikingly limited in ancient languages.” Jaynes writes:

There is in general no consciousness in the Iliad. I am saying ‘in general’ because I shall mention some exceptions later. And in general, therefore, no words for consciousness or mental acts. The words in the Iliad that in a later age come to mean mental things have different meanings, all of them more concrete. The word psyche, which later means soul or conscious mind, is in most instances life-substances, such as blood or breath: a dying warrior breathes out his psyche onto the ground or breathes it out in his last gasp.

The thumos, which later comes to mean something like emotional soul, is simply motion or agitation. When a man stops moving, the thumos leaves his limbs. But it is also somehow like an organ itself, for when Glaucus prays to Apollo to alleviate his pain and to give strength to help his friend Sarpedon, Apollo hears his prayer and “casts strength in his thumos”. The thumos can tell a man to eat, drink, or fight. Diomedes says in one place that Achilles will fight “when the thumos in his chest tells him to and a god rouses him.” But it is not really an organ and not always localized; a raging ocean has thumos.

A word of somewhat similar use is phren, which is always localized anatomically as the midriff, or sensations in the midriff, and is usually used in the plural. It is the phrenes of Hector that recognize that his brother is not near him; this means what we mean by “catching one’s breath in surprise”. It is only centuries later that it comes to mean mind or ‘heart’ in its figurative sense.

Perhaps most important is the word noos which, spelled as nous in later Greek, comes to mean conscious mind. It comes from the word noeein, to see. Its proper translation in the Iliad would be something like perception or recognition or field of vision. Zeus “holds Odysseus in his noos.” He keeps watch over him.

Another important word, which perhaps comes from the doubling of the word meros (part), is mermera, meaning in two parts. This was made into a verb by adding the ending -izo, the common suffix which can turn a noun into a verb, the resulting word being mermerizein, to be put into two parts about something. Modern translators, for the sake of supposed literary quality in their work, often use modern terms and subjective categories which are not true to the original. Mermerizein is thus wrongly translated as to ponder, to think, to be of divided mind, to be troubled about, to try to decide. But essentially it means to be in conflict about two actions, not two thoughts. It is always behavioristic. It is said several times of Zeus, as well as others. The conflict is often said to go on in the thumos, or sometimes in the phrenes, but never in the noos. The eye cannot doubt or be in conflict, as the soon-to-be-invented conscious mind will be able to.

These words are in general, and with certain exceptions, the closest that anyone, authors or characters or gods, usually get to having conscious minds or thoughts.

There is also no concept of will or word for it, the concept developing curiously late in Greek thought. Thus, Iliadic men have no will of their own and certainly no notion of free will. Indeed, the whole problem of volition, so troubling, I think, to modern psychological theory, may have had its difficulties because the words for such phenomena were invented so late.

A similar absence from Iliadic language is a word for body in our sense. The word soma, which in the fifth century B.C. comes to mean body, is always in the plural in Homer and means dead limbs or a corpse. It is the opposite of psyche. There are several words which are used for various parts of the body, and, in Homer, it is always these parts that are referred to, and never the body as a whole.

Now this is all very peculiar. If there is no subjective consciousness, no mind, soul, or will, in Iliadic men, what then initiates behavior? OoCitBotBM; pp. 69-71

Essentially, Jaynes is using language to understand the consciousness of these ancient people, similar to what we saw anthropologists and linguists doing for the various remote and isolated cultures currently in existence. Their language may not dictate reality, but the words they use to describe their world offer a clue, perhaps the only clue, as to how they perceive themselves, their world, and their place in it, and how that might differ from our ego-driven point of view. After all, we can’t just hop in a time machine and head back to administer psychological tests.

P.S. As an aside to the idea of aural hallucinations, a fascinating study found that non-clinical voice hearers could distinguish “hidden speech” far more effectively than others. This is especially interesting since most studies featuring voice-hearers use the clinical (schizophrenic, epileptic, Parkinson’s, etc.) population, rather than ordinary people. The reasons for this ability are not known:

The study involved people who regularly hear voices, also known as auditory verbal hallucinations, but do not have a mental health problem. Participants listened to a set of disguised speech sounds known as sine-wave speech while they were having an MRI brain scan. Usually these sounds can only be understood once people are either told to listen out for speech, or have been trained to decode the disguised sounds.

Sine-wave speech is often described as sounding a bit like birdsong or alien-like noises. However, after training people can understand the simple sentences hidden underneath (such as “The boy ran down the path” or “The clown had a funny face”).

In the experiment, many of the voice-hearers recognised the hidden speech before being told it was there, and on average they tended to notice it earlier than other participants who had no history of hearing voices. The brains of the voice-hearers automatically responded to sounds that contained hidden speech compared to sounds that were meaningless, in the regions of the brain linked to attention and monitoring skills.

People who ‘hear voices’ can detect hidden speech in unusual sounds (Science Daily)

P.P.S. xkcd did a public survey on color perception and naming a while back.

The Archaic Mentality

The inspiration for this series of posts was an article in Psychology Today entitled: Did Our Ancestors Think Like Us? I’m pretty confident that they didn’t, but in what sense did they differ? Were they as different as Jaynes described, or was it something less extreme?

Imagine that you are a time-traveler, able to travel back roughly 40,000 years to the age of the first anatomically modern homo sapiens. Imagine stepping out of your time machine and standing face to face with one of your ancestors: Another human with a brain just as big as yours, and genes virtually identical to your genes. Would you be able to speak to this ancient human? Befriend them? Fall in love with them? Or would your ancestor be unrecognizable, as distinct from you as a wolf is distinct from a pet dog?

…Some think that, since we have the same genes as ancient humans, we should show the same mannerisms. Others suspect that human psychology may have changed dramatically over time. Nobody definitely knows (I certainly don’t), but my hunch is that the human mind today works very differently than did our ancestor’s minds.

Did Our Ancestors Think Like Us? (Psychology Today)

Brian McVeigh sums up Jaynes’s ideas this way:

In The Origin of Consciousness in the Breakdown of the Bicameral Mind [Jaynes] argued that conscious subjective interiority was not a bioevolutionary phenomenon. Rather, interiority—and by this term he did not mean perceiving, thinking or reasoning but the ability to introspect and engage in self-reflectivity—emerged historically as a cultural construction only about three millennia ago.
The Psychohistory of Metaphors, Brian McVeigh p. 133

I would argue that there is recent psychological research that tentatively backs up some of Jaynes’s claims. New research has shown that a lot of what we thought was just “basic human cognition” turns out to be socioculturally constructed. Much of the world today does not think or reason in the same way as members of Western industrial societies do. The blogger writes:

Many animals learn how to solve problems by watching other animals try and fail, but humans appear to take social learning to another level: we learn how to think from one another.

Consider that when people move to a new culture, they actually begin taking on the emotions of that culture, reporting more everyday sadness in cultures that feel more sadness and surprise in cultures where people feel more surprise. Consider that people’s ability to read others’ thoughts and feelings from their behavior depends on the number of words in their native language indicating mental states. Consider that people’s level of prejudice towards other groups (i.e. the extent of their “us versus them” mentality) and moral convictions (i.e. their belief that some acts are fundamentally right or wrong) strongly depends on whether or not they follow an Abrahamic religion. And consider that people’s ability to think “creatively,” to generate new solutions that diverge from old ones, depends on how strictly their culture regulates social norms. This is just a small sampling from hundreds of studies that show how flexible the human mind is.

For a graphic example, it was recently determined that the “primitive” Himba of Namibia are actually more mentally agile than supposedly “high IQ” Westerners at solving novel problems:

“We suggest that through formal education, Westerners are trained to depend on learned strategies. The Himba participate in formal education much less often and this is one possible reason why they exhibited enhanced cognitive flexibility,”

Cognitive neuroscientists observe enhanced mental flexibility in the seminomadic Himba tribe (PsyPost). He continues:

The second reality that makes me think our minds work differently today than they did thousands of years ago is that human culture is staggeringly diverse. We speak over 6,000 languages, follow 4,000 religions, and live our lives according to a sprawling set of social and moral customs. Some other animals have diverse culture: Chimpanzees, for example, forage for food in a number of different ways that are probably socially learned. But human cultural diversity goes beyond one or two kinds of differences; our cultures are different in almost every way imaginable. The development of this cultural diversity may have had a profound impact on our psychologies.

When you put these realities together, you have (a) an amazingly diverse species with (b) an amazing capacity to learn from diversity. Add thousands of years of development and cultural change to the mix and you likely get modern human thinking that scarcely resembles ancient human psychology. This doesn’t mean that today’s humans are “better” than yesterday’s; it just means that humans are fascinating animals, more cognitively malleable than any other.

The writer doesn’t go into more detail than that, and there aren’t any further explanations so far. But the idea was backed up by a landmark paper published a few years ago by Joseph Henrich, along with Steven J. Heine and Ara Norenzayan. They write:

There are now enough sources of experimental evidence, using widely differing methods from diverse disciplines, to indicate that there is substantial psychological and behavioral variation among human populations.

The reasons that account for this variation may be manifold, including behavioral plasticity in response to different environments, divergent trajectories of cultural evolution, and, perhaps less commonly, differential distribution of genes across groups in response to different selection pressures… At the same time, we have also identified many domains in which there are striking similarities across populations. These similarities could indicate reliably developing pan-human adaptations, byproducts of innate adaptations (such as religion), or independent cultural inventions or cultural diffusions of learned responses that have universal utility (such as counting systems, or calendars)…

Not only are Americans not typical of how the rest of the world thinks; they are shockingly different (surprising, huh?). As one writer put it, “Social scientists could not possibly have picked a worse population from which to draw broad generalizations. Researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.”

As you might imagine, one of the major differences has to do with radical individualism. Americans see themselves as “rugged individualists,” whereas everyone else sees themselves as part of a larger social fabric:

[S]ome cultures regard the self as independent from others; others see the self as interdependent. The interdependent self — which is more the norm in East Asian countries, including Japan and China — connects itself with others in a social group and favors social harmony over self-expression. The independent self — which is most prominent in America — focuses on individual attributes and preferences and thinks of the self as existing apart from the group.

…Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. Shown another way, in a different test analytic Americans will do better on…the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.

Are Americans the Weirdest People on Earth? (Big Think)

As for why Americans, and WEIRD (Western, Educated, Industrialized, Rich, Democratic) countries more generally, are so different from the rest of the world, the authors of the original paper speculate:

To many anthropologically-savvy researchers it is not surprising that Americans, and people from modern industrialized societies more generally, appear unusual vis-à-vis the rest of the species.

For the vast majority of its evolutionary history, humans have lived in small-scale societies without formal schools, government, hospitals, police, complex divisions of labor, markets, militaries, formal laws, or mechanized transportation. Every household provisioned much or all of their own food, made its own clothes, tools, and shelter, and–aside from various kinds of sexual divisions of labor–almost everyone had to master the same skills and domains of knowledge.

Children grew up in mixed age play groups, received little active instruction, and learned largely by observation and imitation. By age 10, children in some foraging societies obtain sufficient calories to feed themselves, and adolescent females take on most of the responsibilities of women.

WEIRD people, from this perspective, grow up in, and adapt to, a highly unusual environment. It should not be surprising that their psychological world is unusual as well. p. 38 (emphasis mine)

I wrote about this study back in 2013: Americans are WEIRD.

The differences between American thinking and that of the rest of the world seem to mirror the left brain/right brain split described by Iain McGilchrist:

The left hemisphere is dependent on denotative language, abstraction, yields clarity and power to manipulate things that are known and fixed. The right hemisphere yields a world of individual, changing, evolving, interconnected, living beings within the context of the lived world. But the nature of things is never fully graspable or perfectly known. This world exists in a certain relationship. They both cover two versions of the world and we combine them in different ways all the time. We need to rely on certain things to manipulate the world, but for the broad understanding of it, we need to use knowledge that comes from the right hemisphere.

A Psychiatrist Explains the Difference Between Left Brain and Right Brain (Hack Spirit)

Given that thousands of years ago there were NO industrial countries with a majority of the population educated, wealthy, or literate, it’s pretty obvious that thinking must have been quite different. Of course, that does not prove Jaynes’s ideas. However, if even modern psychology researchers report substantial differences among existing populations, why is it hard to believe that people separated from us by thousands of years are more different from us than alike?

It’s also worth pointing out that the fundamental structure of our brain changes in response to activities we undertake to navigate our environment. It’s been hypothesized that the use of the internet and ubiquitous computer screens is “rewiring” our brains in some, possibly nefarious, way. An article on this topic in the BBC points out that this is not new–everything we do rewires our brains in some way. In other words, we do not come into the world completely “done” – much of how our brains function is culturally determined. This, in turn, changes the brain’s structure. So we need not posit that somehow the brain architecture of bicameral people was radically different, only that they were using their brains in a different way as determined by the cultural context.

We regularly do things that have a profound effect on our brains – such as reading or competitive sports – with little thought for our brain fitness. When scientists look at people who have spent thousands of hours on an activity they often see changes in the brain. Taxi drivers, famously, have a larger hippocampus, a part of the brain recruited for navigation. Musicians’ brains devote more neural territory to brain regions needed for playing their instruments. So much so, in fact, that if you look at the motor cortex of string players you see bulges on one side (because the fine motor control for playing a violin, for example, is only on one hand), whereas the motor cortex of keyboard players bulges on both sides (because piano playing requires fine control of both hands).

Does the internet rewire our brains? (BBC Future)

In a book I cited earlier, Alone in the World? the author lists the items that archaeologists look for to indicate behavioral modernity (since culture is ephemeral and does not fossilize):

1. A spoken language;

2. The cognitive capacity to generate mental symbols, as expressed in art and religion;

3. Explicit symbolic behavior, i.e., the ability to represent objects, people, and abstract concepts with arbitrary symbols, vocal or visual, and to reify such symbols in cultural practices like painting, engraving, and sculpture;

4. The capacity for abstract thinking, the ability to act with reference to abstract concepts not limited to time and space;

5. Planning depth, or the ability to formulate strategies based on past experience and to act on them in a group context;

6. Behavioral, economic, and technological innovation; and

7. A bizarre inability to sustain prolonged bouts of boredom.

Often people cite the spectacular cave art of Ice Age Europe as evidence that the people living in such caves must have been behaviorally modern. But consider that some of the most sought-after art in the twentieth century was made by patients suffering from schizophrenia (voice hearing)!

The Julian Jaynes Society has compiled a list of questions about the behavior of ancient peoples that are difficult to explain without recourse to some kind of bicameral theory. I’ve copied and abridged their list below:

1. The Saliency and “Normalcy” of Visions in Ancient Times. Why have hallucinations of gods in the ancient world been noted with such frequency?

2. The Frequency of “Hearing Voices” Today. Why do auditory hallucinations occur more frequently in the general population than was previously known? If hallucinations are simply a symptom of a dysfunctional brain, they should be relatively rare. Instead, they have been found in normal (non-clinical) populations worldwide.

3. Imaginary Companions in Children. Why do between one-quarter and one-third of modern children “hear voices,” called imaginary companions?

4. Command Hallucinations. Why do patients labeled schizophrenic, as well as other voice-hearers, frequently experience “command hallucinations” that direct behavior — as would be predicted by Jaynes’s theory? If hallucinations are simply a symptom of a dysfunctional brain, one would expect they would consist of random voices, not commentary on behavior and behavioral commands.

5. Voices and Visions in Pre-literate Societies. Why are auditory and visual hallucinations, as well as divination practices and visitation dreams, found in pre-literate societies worldwide?

6. The Function of Language Areas in the Non-Dominant Hemisphere. Why is the brain organized in such a way that the language areas of the non-dominant hemisphere are the source of auditory hallucinations — unless this provided some previous functional purpose?

7. The “Religious” Function of the Right Temporal Lobe. Why is the right temporal lobe implicated in auditory hallucinations, intense religious sentiments, and the feeling of a sensed presence?

8. Visitation Dreams. Why do ancient and modern dreams differ so dramatically? Studies of dreams in classical antiquity show that the earliest recorded dreams were all “visitation dreams,” consisting of a visitation by a god or spirit that issues a command — essentially the bicameral waking experience of hearing verbal commands only during sleep. This has also been noted in tribal societies.

9. The Inadequacy of Current Thinking to Account for the Origin of Religion. Why are the worship of gods and dead ancestors found in all cultures worldwide?

10. Accounting for the Ubiquity of Divination. Similarly, why were divination practices also universal?

Jaynes’s theory of a previous bicameral mentality accounts for all of these phenomena, and, in the complete absence of persuasive alternative explanations, appears to be the best explanation for each of them. As one professor once said to me, “There is either Jaynes’s theory, or just ‘weird stuff happens.'”

Questions critics fail to answer (Julian Jaynes Society)

Weird stuff, indeed!!! But there is another, perhaps even more important question not listed above. That is, why did religious concepts change so profoundly during the Axial Age? As Joseph Henrich, the anthropologist whose paper we cited above, put it:

“The typical evolutionary approaches to religion don’t take into account that the kinds of gods we see in religions in the world today are not seen in small-scale societies. I mentioned the ancestor gods; other kinds of spirits can be tricked, duped, bought off, paid; you sacrifice in order to get them to do something; they’re not concerned about moral behavior…Whatever your story is, it’s got to explain how you got these bigger gods.”

Joseph Henrich on Cultural Evolution, WEIRD Societies (Conversations with Tyler)

In researching this series of posts, I’m struck by just how big a gulf there is between (to use Evans-Pritchard’s terms) Primitive Religion and Revelatory Religion.

Primitive religion, for all its dramatic variance, appears to be centered around direct revelation from gods, ancestor worship, and communal rituals. It is almost always rooted in some kind of animist belief system, and is always polytheistic.

Revelatory religions, by contrast, tend to emphasize conscious control over one’s own personal behavior (e.g. the ‘Golden Rule’). They emphasize looking for revelation by introspection—going inward—something conspicuously missing from primitive religions. Instead of direct revelation, God’s words are now written down in holy books which are consulted to determine God’s will, permanent and unchanging. Monotheism takes over from polytheism. And a significant portion of the population, unlike in primitive societies, accepts no god at all [atheism = a (without) theos (gods)]. As Brian McVeigh writes, quoting St. Augustine, “By shifting the locus of ‘spiritual activity from external rites and laws into the individual, Christianity brought God’s infinite value into each person.’ In other words, a newly spiritualized space, first staked out by Greek philosophers, was meta-framed and expanded into an inner kingdom where individual and Godhead could encounter each other.” (Psychohistory of Metaphors, pp. 52-53)

For their part, Henrich and other researchers hypothesize that the difference stems from the fact that Universal Religions of Revelation (so-called “Big Gods”) allowed larger and more diverse groups of people to cooperate, thus outcompeting parochial deities that couldn’t “scale up.” Because the “Big Gods” were all-seeing, all-knowing, omnipresent, moralizing deities with the power to reward and punish in the afterlife, the argument goes, they kept people on the straight-and-narrow, allowing for higher-level cooperation between unrelated strangers even without a shared cultural context. Basically, it was a meme that evolved via group selection. As they put it (PDF): “[C]ognitive representations of gods as increasingly knowledgeable and punitive, and who sanction violators of interpersonal social norms, foster and sustain the expansion of cooperation, trust and fairness towards co-religionist strangers.”

I call this “The Nannycam theory of Religion”. As God remarked to Peter Griffin on Family Guy, “I’m kind of like a nannycam. The idea that I *may* exist is enough for some people to behave better.”

By contrast, the breakdown of the bicameral mind provides an explanation. God now becomes one’s own conscience—the inner voice in one’s head. We now become responsible for our own behavior through the choices we make. The revelatory religions serve as a guide, and a replacement for the voices that no longer issue their commands. As Brian McVeigh explains:

…interiority is unnecessary for most of human behavior. If this is true, why did we as a species develop it about three thousand years ago (at least according to Julian Jaynes)? What was its purpose?

From the perspective of a sociopolitical organization [sic], interiority alleviates the need for strict hierarchical lines of command and control, which are inherently fragile. By placing a personal tool kit of command and control “inside a person’s head,” interiority becomes society’s inner voice by proxy.

Authorization based on strict hierarchical lines of command and control may be efficient for relatively small, well-circumscribed communities, but if history is any teacher, clear lines of control become less cost-effective in terms of socioeconomic capital the larger and more complex organizations become.

Once authorization for immediate control of self becomes interiorized and individual-centered, an organization actually becomes stronger as its orders, directives, doctrines, admonitions, and warnings become the subjective truths of personal commitment.

Interiority, then, is a sociopolitically pragmatic tool used for control in the same way assigning names to individuals or categorizing people into specialized groups for economic production is. From the individual’s perspective, interiority makes the social environment easier to navigate. Before actually executing a behavior, we can “see” ourselves “in our heads” carrying out an action, thereby allowing us to shortcut actual behavioral sequences that may be time-consuming, difficult, or dangerous.
Brian J. McVeigh; A Psychohistory of Metaphors, pp. 33-34

There are many more “conventional” explanations of the universality of religious beliefs. One popular theory is put forward by anthropologist Pascal Boyer in “Religion Explained.” Basically, he argues that religion is an unintended side effect of what software programmers would refer to as “bugs” in the human cognitive process:

Basing his argument on this evolutionary reasoning, Boyer asserts that religion is in effect a cognitive “false positive,” i.e., a faulty application of our innate mental machinery that unfortunately leads many humans to believe in the existence of supernatural agents like gods that do not really exist.

This also leads Boyer to describe religious concepts as parasitic on ordinary cognitive processes; they are parasitic in the sense that religion uses those mental processes for purposes other than what they were designed by evolution to achieve, and because of this their successful transmission is greatly enhanced by mental capacities that are there anyway, gods or no gods.

Boyer judges the puzzling persistence of religion to be a consequence of natural selection designing brains that allowed our prehistoric ancestors to adapt to a world of predators. A brain molded by evolution to be on the constant lookout for hidden predators is likely to develop the habit of looking for all kinds of hidden agencies. And it is just this kind of brain that will eventually start manufacturing images of the concealed actors we normally refer to as “gods.”

In this sense, then, there is a natural, evolutionary explanation for religion, and we continue to entertain religious ideas simply because of the kinds of brains we have. On this view, the mind it takes to have religion is the mind we have…Religious concepts are natural both in the phenomenological sense that they emerge spontaneously and develop effortlessly, and in the natural sense that also religious imagination belongs to the world of nature and is naturally constrained by genes, central nervous systems, and brains.
J. Wentzel van Huyssteen; Alone In The World? pp. 261-263

Of course, as Jaynes would point out, the gods as depicted in ancient literature are hardly “hidden actors.” They often speak directly to individuals and issue commands which are subsequently obeyed! Massive amounts of time and effort are spent building temples to them. That seems like an awful lot of work to satisfy a simple “false positive” in human cognition.

Other theories focus on what’s called the Theory of Mind. For example: What Religion is Really All About (Psychology Today). As a Reddit commenter put it succinctly:

The basic thesis is that we believe in gods (or supernatural minds in general) because of cognitive adaptations that evolved for social interaction. It was evolutionarily advantageous for monkeys to construct mental models of what other monkeys were feeling/perceiving/thinking, and it’s a natural step from there to believing in disembodied minds, minds that can exist without the monkey. Related YouTube lecture: Why We Believe In Gods.

Testimony to the Sumerian worship of the Cookie Monster

Perhaps. But there are an awful lot of signs in the archaeological record that our ancestors thought very differently than we do, to wit:

1. Eye idols (see above)

2. “Goddess” figurines and idols Jaynes: “Figurines in huge numbers have been unearthed in most of the Mesopotamian cultures, at Lagash, Uruk, Nippur, and Susa. At Ur, clay figures painted in black and red were found in boxes of burned brick placed under the floor against the walls but with one end opened, facing into the center of the room. The function of all these figurines, however, is as mysterious as anything in all archaeology. The most popular view goes back to the uncritical mania with which ethnology, following Frazer, wished to find fertility cults at the drop of a carved pebble. But if such figurines indicate something about Frazerian fertility, we should not find them where fertility was no problem. But we do.” Origins, p. 166. As the old joke in archaeology goes, if you can’t explain something, just claim it was for ‘fertility.’

3. Human Sacrifice

4. Trepanation

5. God kings:
Jaynes: “I am suggesting that the dead king, thus propped up on his pillow of stones, was in the hallucinations of his people still giving forth his commands…and that, for a time at least, the very place, even the smoke from its holy fire, rising into visibility from furlongs around, was, like the gray mists of the Aegean for Achilles, a source of hallucinations and of the commands that controlled the Mesolithic world of Eynan.

This was a paradigm of what was to happen in the next eight millennia. The king dead is a living god. The king’s tomb is the god’s house…[which]…continues through the millennia as a feature of many civilizations, particularly in Egypt. But, more often, the king’s-tomb part of the designation withers away. This occurs as soon as a successor to a king continues to hear the hallucinated voice of his predecessor during his reign, and designates himself as the dead king’s priest or servant, a pattern that is followed throughout ancient Mesopotamia. In place of the tomb is similarly a temple. And in place of the corpse is a statue, enjoying even more service and reverence, since it does not decompose.” Origins, pp. 142-43

6. Grave goods

7. Cannibalism

8. Veneration of ancestors

9. Mummification of animals

Not to mention things like this:

A common practice among these city dwellers [of Çatalhöyük] was burying their dead under their floors, usually under raised platforms that served as beds. Often they would dig up the skulls of the dead later, plaster their faces (perhaps to recreate the faces of loved ones), and give them to other houses. Archaeologists frequently find skeletons from several people intermingled in these graves, with skulls from other people added. Wear and tear on some plastered skulls suggest they were traded back and forth, sometimes for generations, before being reburied. According to Hodder, such special skulls are just as often female as they are male.

Incredible discovery of intact female figurine from neolithic era in Turkey (Ars Technica)

The Voices in Your Head

What If God Was One Of Us?

What We Talk About When We Talk About Consciousness

The Cathedral of the Mind

One of Oliver Sacks’ last popular books, published in 2012, was about hallucinations, titled, appropriately, Hallucinations. In it, he takes a look at numerous types of hallucinatory phenomena—hallucinations among the blind (Charles Bonnet Syndrome); sensory deprivation; delirium; grieving; Post-traumatic Stress Disorder; epilepsy; migraines; hypnagogia; Parkinson’s Disease; psychedelic usage; religious ecstasy; and so on.

The book presents a number of interesting facts about auditory hallucinations. One is that although hearing voices is indeed indicative of schizophrenia, in most cases auditory hallucinations are experienced by perfectly normal people with no other signs of mental illness.

Sacks begins his chapter on auditory hallucinations by describing an experiment in 1973 in which eight “fake” patients went to mental hospitals complaining of hearing voices, but displaying no other signs of mental illness or distress. In each case, they were diagnosed as schizophrenic (one was considered manic-depressive), committed to a facility for two months, and given anti-psychotic medication (which they obviously did not take). While committed, they even openly took notes on their experiences, yet none of the doctors or staff ever wised up to the ruse. The other patients, however, were much more perceptive. They could clearly see that the fake patients were not at all mentally ill, and even asked them, “What are you doing here?” Sacks concludes:

This experiment, designed by David Rosenhan, a Stanford psychologist (and himself a pseudopatient), emphasized, among other things, that the single symptom of “hearing voices” could suffice for an immediate, categorical diagnosis of schizophrenia even in the absence of any other symptoms or abnormalities of behavior. Psychiatry, and society in general, had been subverted by the almost axiomatic belief that “hearing voices” spelled madness and never occurred except in the context of severe mental disturbance. p. 54

While people often mischaracterize Jaynes’ theory as “everyone in the past was schizophrenic,” it turns out that even today most voices are heard by perfectly normal, otherwise rational, sane, high-functioning people. This has been recognized for over a century in medical literature:

“Hallucinations in the sane” were well recognized in the nineteenth century, and with the rise of neurology, people sought to understand more clearly what caused them. In England in the 1880s, the Society for Psychical Research was founded to collect and investigate reports of apparitions or hallucinations, especially those of the bereaved, and many eminent scientists—physicians as well as physiologists and psychologists—joined the society (William James was active in the American branch)…These early researchers found that hallucinations were not uncommon among the general population…Their 1894 “International Census of Waking Hallucinations in the Sane” examined the occurrence and nature of hallucinations experienced by normal people in normal circumstances (they took care to exclude anyone with obvious medical or psychiatric problems). Seventeen thousand people were sent a single question:

“Have you ever, when believing yourself to be completely awake, had a vivid impression of seeing or being touched by a living being or inanimate object, or of hearing a voice, which impression, as far as you could discover, was not due to an external physical cause?”

More than 10 percent responded in the affirmative, and of those, more than a third heard voices. As John Watkins noted in his book Hearing Voices, hallucinated voices “having some kind of religious or supernatural content represented a small but significant minority of these reports.” Most of the hallucinations, however, were of a more quotidian character. pp. 56-57

While the voices heard by schizophrenics are often threatening and controlling, the voices heard by most people do not appear to have any effect on normal functioning at all.

The voices that are sometimes heard by people with schizophrenia tend to be accusing, threatening, jeering, or persecuting. By contrast, the voices hallucinated by the “normal” are often quite unremarkable, as Daniel Smith brings out in his book Muses, Madmen, and Prophets: Hearing Voices and the Borders of Sanity. Smith’s own father and grandfather heard such voices, and each reacted to them very differently. His father started hearing voices at the age of thirteen. Smith writes:

“These voices weren’t elaborate, and they weren’t disturbing in content. They issued simple commands. They instructed him, for instance, to move a glass from one side of the table to another or to use a particular subway turnstile. Yet in listening to them and obeying them his interior life became, by all accounts, unendurable.”

Smith’s grandfather, by contrast, was nonchalant, even playful, in regard to his hallucinatory voices. He described how he tried to use them in betting at the racetrack. (“It didn’t work, my mind was clouded with voices telling me that this horse could win or maybe this one is ready to win.”) It was much more successful when he played cards with his friends. Neither the grandfather nor the father had strong supernatural inclinations; nor did they have any significant mental illness. They just heard unremarkable voices concerned with everyday things–as do millions of others. pp. 58-59

To me, this sounds an awful lot like Jaynes’s descriptions of the reality of bicameral man, doesn’t it? The voices command, and the people obey the commands. Yet they remain outwardly normal, functioning individuals. You might never know that someone was obeying voices in their head unless they explicitly told you:

This is what Jaynes calls “bicameral mind”: one part of the brain (the “god” part) evaluates the situation and issues commands to the other part (the “man” part) in the form of auditory and, occasionally, visual hallucinations (Jaynes hypothesizes that the god part must have been located in the right hemisphere, and the man part, in the left hemisphere of the brain). The specific shapes and “identities” of these hallucinations depend on the culture, on what Jaynes calls “collective cognitive imperative”: we see what we are taught to see, what our learned worldview tells us must be there.

Julian Jaynes and William Shakespeare on the origin of consciousness in the breakdown of bicameral mind (Sonnets in Colour)

In most of the cases described in Hallucinations, people didn’t attribute their auditory or visual hallucinations to any kind of supernatural entity or numinous experience. A few did refer to them as “guardian angels”. But what if they had grown up in a culture where this sort of thing was considered normal, if not commonplace, as was the case for most of ancient history?

Hearing voices occurs in every culture and has often been accorded great importance–the gods of Greek myth often spoke to mortals, and the gods of the great monotheistic traditions, too. Voices have been significant in this regard, perhaps more so than visions, for voices, language, can convey an explicit message or command as images alone cannot.

Until the eighteenth century, voices—like visions—were ascribed to supernatural agencies: gods or demons, angels or djinns. No doubt there was sometimes an overlap between such voices and those of psychosis or hysteria, but for the most part, voices were not regarded as pathological; if they stayed inconspicuous and private, they were simply accepted as part of human nature, part of the way it was with some people. p. 60

In the book The Master and His Emissary, Iain McGilchrist dismisses Jaynes’s theory by claiming that schizophrenia is a disease of recent vintage, one that only emerged sometime around the nineteenth century. Yet, as the Jaynes foundation website points out (2.7), that is merely when the diagnosis of schizophrenia was established. Before that time, hearing voices would not have been considered pathological, or a disease at all. We looked at how Akhnaten’s radical monotheism was possibly inspired by God “speaking” directly to him, issuing commands to build temples, and so forth. Certainly, he thought it was, at any rate. And he’s hardly alone. We’ve already looked at notable historical personages like Socrates, Muhammad, Joan of Arc, and Margery Kempe, and there are countless other examples. Schizophrenia is no more “new” than is PTSD, which was barely recognized until after World War One, when it was called “shell shock.”

“My Eyes in the Time of Apparition” by August Natterer. 1913

Another thing Sacks points out is that command hallucinations tend to occur in stressful situations or times of extreme duress, or when one has some sort of momentous or climactic decision to make, just as Jaynes posited. In times of stress, perfectly ordinary, sane people often hear an “outside” voice coming from somewhere guiding their actions. This is, in fact, quite common. In “normal” conditions we use instinct or reflex to guide our actions. But in emergencies, we hear a voice that seems to come from somewhere outside our own consciousness:

If, as Jaynes proposes, we take the earliest texts of our civilisation as psychologically valid evidence, we begin to see a completely different mentality. In novel and stressful situations, when the power of habit doesn’t determine our actions, we rely on conscious thinking to decide what to do, but, for example, the heroes of the Iliad used to receive their instructions from gods — which would appear in the times of uncertainty and stress.

Julian Jaynes and William Shakespeare on the origin of consciousness in the breakdown of bicameral mind (Sonnets in Colour)

For example, Dr. Sacks is perfectly aware that his “inner monologue” is internally generated. Yet in a stressful situation, the voice became externalized—something that seemed to speak to him from some outside source:

Talking to oneself is basic to human beings, for we are a linguistic species; the great Russian psychologist Lev Vygotsky thought that inner speech was a prerequisite of all voluntary activity. I talk to myself, as many of us do, for much of the day–admonishing myself (“You fool! Where did you leave your glasses?”), encouraging myself (“You can do it!”), complaining (“Why is that car in my lane?”) and, more rarely, congratulating myself (“It’s done!”). Those voices are not externalized; I would never mistake them for the voice of God, or anyone else.

But when I was in danger once, trying to descend a mountain with a badly injured leg, I heard an inner voice that was wholly unlike my normal babble of inner speech. I had a great struggle crossing a stream with a buckled and dislocating knee. The effort left me stunned, motionless for a couple of minutes, and then a delirious languor came over me, and I thought to myself, Why not rest here? A nap maybe? This was immediately countered by a strong, clear, commanding voice, which said, “You can’t rest here—you can’t rest anywhere. You’ve got to go on. Find a pace you can keep up and go on steadily.” This good voice, the Life voice, braced and resolved me. I stopped trembling and did not falter again. pp. 60-61

Sacks gives some other anecdotal examples of people under extreme duress:

Joe Simpson, climbing in the Andes, also had a catastrophic accident, falling off an ice ledge and ending up in a deep crevasse with a broken leg. He struggled to survive, as he recounted in Touching the Void–and a voice was crucial in encouraging and directing him:

“There was silence, and snow, and a clear sky empty of life, and me, sitting there, taking it all in, accepting what I must try to achieve. There were no dark forces acting against me. A voice in my head told me that this was true, cutting through the jumble in my mind with its coldly rational sound.”

“It was as if there were two minds within me arguing the toss. The *voice* was clean and sharp and commanding. It was always right, and I listened to it when it spoke and acted on its decisions. The other mind rambled out a disconnected series of images, and memories and hopes, which I attended to in a daydream state as I set about obeying the orders of the *voice*. I had to get to the glacier….The *voice* told me exactly how to go about it, and I obeyed while my other mind jumped abstractly from one idea to another…The *voice*, and the watch, urged me into motion whenever the heat from the glacier halted me in a drowsy exhausted daze. It was three o’clock—only three and a half hours of daylight left. I kept moving but soon realized that I was making ponderously slow headway. It didn’t seem to concern me that I was moving like a snail. So long as I obeyed the *voice*, then I would be all right.”

Such voices may occur with anyone in situations of extreme threat or danger. Freud heard voices on two such occasions, as he mentioned in his book On Aphasia:

“I remember having twice been in danger of my life, and each time the awareness of the danger occurred to me quite suddenly. On both occasions I felt “this was the end,” and while otherwise my inner language proceeded with only indistinct sound images and slight lip movements, in these situations of danger I heard the words as if somebody was shouting them into my ear, and at the same time I saw them as if they were printed on a piece of paper floating in the air.”

The fact that the gods tend to come to mortals in the Iliad during times of stress has been noted by Judith Weissman, author of Of Two Minds: Poets Who Hear Voices:

Judith Weissman, a professor of English at Syracuse University, notes that in the Iliad the gods speak directly to the characters over 30 times, often when the characters are under stress. Many of the communications are short, brief exhortations. The most common godly command, issued when the men are fearful in battle, is to “fight as your father did.” At one point in the Iliad, the god Apollo picks up Hektor, who has fallen in battle, and says, “So come now, and urge on your cavalry in their numbers / to drive on their horses against the hollow ships” (15.258-59)…

Personality Before the Axial Age (Psychology Today)

Hallucinations are also quite common in soldiers suffering from PTSD. If modern soldiers experience PTSD, how much more traumatic must ancient battles have been, like those described so vividly in the Iliad? I can’t even imagine standing face-to-face with a foe, close enough to feel his hot breath, and having to shove a long, sharp metal object directly into his flesh without hesitation, blood gushing everywhere and viscera sliding out of his belly onto the dirt. And yet this was the reality of warfare in the Bronze and Iron Ages. Not to mention the various plagues, dislocations, natural disasters, invasions, and other assorted traumatic events.

People with PTSD are also prone to recurrent dreams or nightmares, often incorporating literal or somewhat disguised repetitions of the traumatic experiences. Paul Chodoff, a psychiatrist writing in 1963 about the effects of trauma in concentration camp survivors, saw such dreams as a hallmark of the syndrome and noted that in a surprising number of cases they were still occurring a decade and a half after the war. The same is true of flashbacks. p. 239

Veterans with PTSD may hallucinate the voices of dying comrades, enemy soldiers, or civilians. Holmes and Tinnin, in one study, found that the hearing of intrusive voices, explicitly or implicitly accusing, affected more than 65 percent of veterans with combat PTSD. p. 237 note 4

The other very common occurrence where otherwise “sane” people will often hallucinate sounds or images is during grief and bereavement. Sometimes this is just hearing the voice of the departed person speaking to them or calling them. Sometimes they may actually see the person. And sometimes they may even carry on extended conversations with their deceased family members!

Bereavement hallucinations, deeply tied to emotional needs and feelings, tend to be unforgettable, as Elinor S., a sculptor and printmaker, wrote to me:

“When I was fourteen years old, my parents, brother and I were spending the summer at my grandparents’ house as we had done for many previous years. My grandfather had died the winter before.”

“We were in the kitchen, my grandmother was at the sink, my mother was helping and I was still finishing dinner at the kitchen table, facing the back porch door. My grandfather walked in and I was so happy to see him that I got up to meet him. I said ‘Grampa,’ and as I moved towards him, he suddenly wasn’t there. My grandmother was visibly upset, and I thought she might have been angry with me because of her expression. I said to my mother that I had really seen him clearly, and she said that I had seen him because I wanted to. I hadn’t been consciously thinking of him and still do not understand how I could have seen him so clearly. I am now seventy-six years of age and still remember the incident and have never experienced anything similar.”

Elizabeth J. wrote to me about a grief hallucination experienced by her young son:

“My husband died thirty years ago after a long illness. My son was nine years old at the time; he and his dad ran together on a regular basis. A few months after my husband’s death, my son came to me and said that he sometimes saw his father running past our home in his yellow running shorts (his usual running attire). At the time, we were in family grief counselling, and when I described my son’s experience, the counsellor did attribute the hallucinations to a neurologic response to the grief. This was comforting to us, and I still have the yellow running shorts.” pp. 233-234

It turns out that this kind of thing is extremely common:

A general practitioner in Wales, W.D. Rees, interviewed nearly three hundred recently bereft people and found that almost half of them had illusions or full-fledged hallucinations of a dead spouse. These could be visual, auditory, or both—some of the people interviewed enjoyed conversations with their hallucinated spouses. The likelihood of such hallucinations increased with the length of the marriage, and they might persist for months or even years. Rees considered these hallucinations to be normal and even helpful in the mourning process. p. 234

A group of Italian psychological researchers published a paper in 2014 entitled “Post-bereavement hallucinatory experiences: A critical overview of population and clinical studies.” After an extensive review of peer-reviewed literature, they found that anywhere from 30 to 60 percent of grieving people experienced what they called “post-bereavement hallucinatory experiences” (PBHEs). Is it any wonder, then, that veneration of the dead was so common across cultures, from the Old World to Africa to Asia to the Americas to Polynesia? It was almost universally assumed across ancient cultures that the dead still existed in some way. Some scholars, such as Herbert Spencer, posited that ancestor worship was the origin of all religious rites and practices.

What is the fundamental cause of all these aural hallucinations? As neurologist Sacks freely admits, the source of these phenomena is at present unknown and understudied. Sacks references Jaynes’s “Origin of Consciousness…” in his speculation on possible explanations:

Auditory hallucinations may be associated with abnormal activation of the primary auditory cortex; this is a subject which needs much more investigation not only in those with psychosis but in the population at large–the vast majority of studies so far have examined only auditory hallucinations in psychiatric patients.

Some researchers have proposed that auditory hallucinations result from a failure to recognize internally generated speech as one’s own (or perhaps it stems from a cross-activation with the auditory areas so that what most of us experience as our own thoughts becomes “voiced”).

Perhaps there is some sort of psychological barrier or inhibition that normally prevents most of us from “hearing” such inner voices as external. Perhaps that barrier is somehow breached or underdeveloped in those who do hear constant voices. Perhaps, however, one should invert the question–and ask why most of us do not hear voices.

In his influential 1976 book, The Origin of Consciousness in the Breakdown of the Bicameral Mind, Julian Jaynes speculated that, not so long ago, all humans heard voices–generated internally from the right hemisphere of the brain, but perceived (by the left hemisphere) as if external, and taken as direct communications from the gods. Sometime around 1000 B.C., Jaynes proposed, with the rise of modern consciousness, the voices became internalized and recognized as our own…Jaynes thought that there might be a reversion to “bicamerality” in schizophrenia and some other conditions. Some psychiatrists (such as Nasrallah, 1985) favor this idea or, at the least, the idea that the hallucinatory voices in schizophrenia emanate from the right side of the brain but are not recognized as one’s own, and are thus perceived as alien…It is clear that “hearing voices” and “auditory hallucinations” are terms that cover a variety of different phenomena. pp. 63-64

Recently, neuroscientists have hypothesized the existence of something called an “efference copy,” which the brain makes of certain types of signals. The presence of the efference copy informs the brain that certain actions have originated from itself, and that subsequent inputs are self-generated. For example, the efference copy of your hand movements is what prevents you from tickling yourself. The lack of this efference copy has been postulated as the reason why schizophrenics can’t recognize the voices in their heads as their own. A temporary suppression of the efference copy may be behind why so many otherwise “sane” people sometimes hear voices as something coming from outside their own minds.

Efference copy is a neurological phenomenon first proposed in the early 19th century in which efferent signals from the motor cortex are copied as they exit the brain and are rerouted to other areas in the sensory cortices. While originally proposed to explain the perception of stability in visual information despite constant eye movement, efference copy is now seen as essential in explaining a variety of experiences, from differentiating between exafferent and reafferent stimuli (stimulation from the environment or resulting from one’s own movements, respectively) to attenuating or filtering sensation resulting from willed movement to cognitive deficits in schizophrenic patients to one’s inability to tickle one’s self.

Efference Copy – Did I Do That? Cody Buntain, University of Maryland (PDF)
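The efference copy idea is often described as a “comparator”: the brain predicts the sensory consequences of its own commands and subtracts that prediction from what actually arrives. Here is a toy sketch of that logic in Python—purely illustrative, not a neural model; the function name, the numbers, and the single-number “intensity” are all my own simplifying assumptions:

```python
def perceived_intensity(actual_input, efference_copy=None):
    """Toy comparator model: sensation that reaches awareness.

    If an efference copy of the motor command is present, the brain
    predicts the resulting sensation and attenuates it; only the
    'surprise' (actual minus predicted) is felt.
    """
    predicted = efference_copy if efference_copy is not None else 0.0
    return max(0.0, actual_input - predicted)

# Self-produced touch: the motor command predicts the sensation,
# so it is cancelled out -- you can't tickle yourself.
self_tickle = perceived_intensity(actual_input=1.0, efference_copy=1.0)

# The same touch from someone else arrives with no efference copy,
# so it is felt at full strength.
external_tickle = perceived_intensity(actual_input=1.0)

# On this toy model, suppressing the efference copy of inner speech
# would make self-generated words feel external -- the hypothesized
# route to "hearing voices".
voice_tagged_as_mine = perceived_intensity(1.0, efference_copy=1.0)
voice_felt_as_alien = perceived_intensity(1.0, efference_copy=0.0)
```

The point of the sketch is only the asymmetry: identical input, but the presence or absence of the self-generated “tag” determines whether it registers as mine or as coming from outside.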

I talk to myself all the time. The words I’m typing in this blog post are coming from some kind of “inner self.” But I feel like that inner voice and “me” are exactly the same, as I’m guessing you do too, and so do most of us “normal” people. But is that something inherent in the brain’s bioarchitecture, or is that something we are taught through decades of schooling and absorbing our cultural context? Might writing and education play an important role in the “breakdown” of bicameralism? We’ll take a look at that next time.

The version of bicameralism that seems most plausible to me is the one where the change is a memetic rather than evolutionary-genetic event. If it were genetic, there would still be too many populations that don’t have it, but whose members when plucked out of the wilderness and sent to university seem to think and feel and perceive the world the way the rest of us do.

But integrated, introspective consciousness could be somewhere between language and arithmetic on the list of things that the h. sapien brain has always been capable of but won’t actually do if you just have someone raised by wolves or whatnot. Language, people figure out as soon as they start living in tribes. Arithmetic comes rather later than that. If Jaynes is right, unicameral consciousness is something people figure out when they have to navigate an environment as complex as a bronze-age city, and once they have the knack they teach their kids without even thinking about it. Or other peoples’ kids, if they are e.g. missionaries.

At which point brains whose wiring is better suited to the new paradigm will have an evolutionary advantage, and there will be a genetic shift, but as a slow lagging indicator rather than a cause.

John Schilling – Comment (Slate Star Codex)

[I’m just going to drop this here—much of the cause of depression stems from negative self-talk (“It’s hopeless!” “Things will never get better.” “I’m worthless.” etc.). In such cases, this “inner voice,” rather than being encouraging, mercilessly hectors the depressed individual. As psychologists often point out to their patients, we would never talk to anyone else as callously as we talk to ourselves. Why is that? And it seems interesting that there are, as far as I know, no references to depression in bicameral civilizations. Ancient literature is remarkably free of “despair suicides” (as opposed to suicides for other reasons, such as defeat in battle or humiliation).]

The Cathedral of the Mind

What if God was one of us?

What we talk about when we talk about consciousness

Nothing here but spilled chocolate milk.

There are what I call “hard” and “soft” interpretations of Jaynes’s thesis. The “hard” interpretation is exactly what is posited in the book: humans did not have reflexive self-awareness in the way we describe it today until roughly the Bronze Age.

The “soft” interpretation is that a shift in consciousness occurred, quite possibly in the way that Jaynes described it, but that it occurred around 40-70,000 years ago during the Ice Age, long before writing or complex civilizations, when our ancestors were still hunter-gatherers. Another “soft” interpretation is that our ancestors definitely thought differently than we do, but they were still conscious agents nonetheless, and that the gods and spirits they referred to so often, and who seemed to control their lives, were merely figments of their imagination.

The Great Leap Forward

The idea that humans experienced some sort of significant cognitive transformation sometime after becoming anatomically modern is no longer controversial. It is the standard view in archaeology. Scientists call this the transition from anatomically modern humans to behaviorally modern humans. This article has a good summary:

… During the Upper Paleolithic (45,000-12,000 years ago), Homo sapiens fossils first appear in Europe together with complex stone tool technology, carved bone tools, complex projectile weapons, advanced techniques for using fire, cave art, beads and other personal adornments. Similar behaviors are either universal or very nearly so among recent humans, and thus, archaeologists cite evidence for these behaviors as proof of human behavioral modernity.

Yet, the oldest Homo sapiens fossils occur between 100,000-200,000 years ago in Africa and southern Asia and in contexts lacking clear and consistent evidence for such behavioral modernity. For decades anthropologists contrasted these earlier “archaic” African and Asian humans with their “behaviorally-modern” Upper Paleolithic counterparts, explaining the differences between them in terms of a single “Human Revolution” that fundamentally changed human biology and behavior.

Archaeologists disagree about the causes, timing, pace, and characteristics of this revolution, but there is a consensus that the behavior of the earliest Homo sapiens was significantly different than that of more-recent “modern” humans.

Earliest humans not so different from us, research suggests (Science Daily)

What no one knows, however, is what caused it, how it took place, or exactly when and where it took place. But the idea that there could be some kind of drastic cognitive shift without significant physical changes is no longer fringe. As Jared Diamond wrote:

Obviously, some momentous change took place in our ancestors’ capabilities between about 100,000 and 50,000 years ago. That Great Leap Forward poses two major unresolved questions, regarding its triggering cause and its geographic location. As for its cause, I argued in my book The Third Chimpanzee for the perfection of the voice box and hence the anatomical basis for modern language, on which the exercise of human creativity is so dependent. Others have suggested instead that a change in brain organization around that time, without a change in brain size, made modern language possible. Jared Diamond, Guns, Germs, and Steel, p. 40

Archaeologists tend to look at all the things in the archaeological record that indicate that Paleolithic humans were like us (e.g. complex tools, art, body ornamentation, trade, burial of the dead, food storage and preservation), but for some reason they downplay or dismiss all the things that show that, in many ways, they were quite different from us. That is, in some respects, they were not nearly as “behaviorally modern” as we tend to assume.

For example, here are some other things they did during this time period: carved ivory and wooden idols; made sacrifices to their gods (including mass child sacrifice); practiced cannibalism; used sleep temples; built strange statues with eyes and no mouths (eye idols); practiced astrology; and regularly poked holes in their skulls for reasons we are still unsure of. In other words, for all the evidence that they thought like us, there is other evidence suggesting that their thinking was substantially different from ours in many ways! But we tend to emphasize the former and ignore the latter. This leads to Jaynes’s idea that there may have been more than just one Great Leap Forward, and that human consciousness has changed significantly since the establishment of architectural civilizations.

Let’s take a quick detour into how scientists think the human brain may have developed to gain some insight into whether there may be evidence for bicameralism.

A short digression into brain architecture

The idea that the brain is composed of previous adaptations which have been extended is fairly well accepted. The Triune Brain hypothesis holds that we have a “lizard brain,” which controls base functions like breathing and is highly aggressive and territorial. Then we have a rodent (paleomammalian) brain that allows us to perform more complex social functions, such as solving basic problems. Then we have the primate (neomammalian) brain, including the neocortex, which allows for larger groups and advanced reasoning. This is basically seen as correct in broad strokes, although it is a vast oversimplification of the complexities of how the primate brain developed.

From Primitive Parts, A Highly Evolved Human Brain (NPR)

The brain of an organism cannot just “go down” for maintenance while it upgrades. It has to keep the organism alive and reproducing. So new modules have to be added on the fly, ad hoc, to what’s already there. This leads to a brain of aggregations in which newer features have to mix with older ones, much the way legacy computer code remains embedded within newer software. This, as you can imagine, can lead to “buggy code.”

Archaeologist Steven Mithen wrote a book about the prehistory of the mind—what we might call “cognitive archaeology.” He notes that certain processes seem to come automatically to the brain, like learning language, while others, like multiplying two large numbers together in one’s head, do not. This means that the brain is not like, say, a “general purpose” I/O microcomputer, as it’s often described. He writes: “The mind doesn’t just accumulate information and regurgitate it. And nor is it indiscriminate in the knowledge it soaks up. My children—like all children—have soaked up thousands of words effortlessly, but their suction seems to lose its power when it comes to multiplication tables.”

This indicates that the human mind has some inherent, or built-in, propensities alongside the general intelligence all animals have, which means they may be of evolutionary origin. Spoken language appears to be one of these. While we send our kids to school for years to try to pound algebra, trigonometry and the correct spelling of words into them, children soak up language from their environment shortly after birth with hardly any effort at all.

Noam Chomsky invoked something he called the “poverty of the stimulus” to make this point. He meant that given how fast and accurately children learn language by osmosis, there is no way it comes from “just” environmental inputs, like a computer. Children must be, in some sense, pre-programmed to learn language, and thus language’s fundamental construction must be related to how the brain functions—something he called a “universal grammar.” Over time, more of these apparently “inherent” behaviors have been identified in humans:

It became increasingly unpopular to assume that a basic understanding of the world can be built entirely from experience. This was in part instigated by theorist Noam Chomsky, who argued that something as complex as the rules of grammar cannot be picked up from exposure to speech, but is supplied by an innate “language faculty.”

Others followed suit and defined further “core areas” in which knowledge allegedly cannot be pieced together from experience but must be innate. One such area is our knowledge of others’ minds. Some even argue that a basic knowledge of others’ minds is not only possessed by human infants, but must be evolutionarily old and hence shared by our nearest living relatives, the great apes.

Children understand far more about other minds than long believed (The Conversation)

This means that, rather than being like a computer or a sponge, argues Mithen, the mind is more akin to a “Swiss army knife,” with different modules for different uses, but all fundamentally a part of the same basic “object.” One study, for example, has found that the ability to recognize faces is innate. This explains the human penchant for pareidolia.

You (probably) see a face in this chair, but do you ever see a chair in someone’s face?

Using the Swiss Army knife metaphor, Mithen argues that these various specialized cognitive modules overlap with what he calls “general intelligence.” This overlap between specialized intelligences and the general intelligence leads to a lot of unique features of human cognition such as creativity, socialization, and, perhaps, constructing things like ‘gods’ and the ‘self.’ Here’s a good summary:

Mithen…[argues]…that the mind should … be seen as a series of specialized “cognitive domains” or “intelligences,” each of which is dedicated to some specific type of behavior, such as specialized modules for acquiring language, or tool-using abilities, or engaging in social interaction…his argument will be that the modern human mind has an architecture built up by millions of years of evolution, which finally yielded a mind that creates, thinks, and imagines.

Mithen…highlights recent efforts in psychology to move beyond thinking of the mind as running a general-purpose program, or as a sponge indiscriminately soaking up whatever information is around. A new analogy for the human mind has taken its place: the Swiss army knife, a tool with specialized devices, designed for coping with very special types of problems.

This is found especially in Howard Gardner’s important book Frames of Mind: The Theory of Multiple Intelligences. In this well-known work we are presented with a Swiss-army knife architectural model for the mind, with each “blade,” or cognitive domain, described as a specialized intelligence. Gardner initially identified seven intelligences: linguistic, musical, logical-mathematical, spatial, bodily-kinesthetic, and two forms of personal intelligence (one for looking at one’s own mind, one for looking outward toward others).

Alone in the World? by Wentzel Van Huyssteen, pp. 194-195

From this, Mithen proposes a new metaphor – that of a cathedral, with a central nave standing in for generalized intelligence, and numerous walled-off enclaves (side chapels) for the various specialized cognitive functions. In a nutshell, Mithen argues that the “walls” between these areas began to break down over time, and the services in the side chapels increasingly blended together with the “main service” taking place in the nave. The mixture gives rise to the various symbolic and metaphorical aspects of human consciousness—what he terms “cognitive fluidity.”

Mithen fills out the three stages in the historical development of the human mind as follows:

In Phase One human minds were dominated by a central “nave” of generalized intelligence.

Phase Two adds multiple “chapels” of specialized intelligences, including the cognitive domains of language, social intelligence, technical intelligence, and natural history intelligence.

Phase Three brings us to the modern mind in which the “chapels” or cognitive domains have been connected, resulting in what Mithen calls cognitive fluidity. This creative combination of the various cognitive domains of the mind would ultimately have profound consequences for the nature of the human mind. With this cognitive fluidity, the mind acquired not only the ability for, but also a positive passion for, metaphor and analogy. And with thoughts originating in different domains engaging one another, the result is an almost limitless capacity for imagination.

It is exactly this amazing ability that would make our species so different from early humans who shared the same basic mind – a Swiss army knife of multiple intelligences, but with very little interaction between them.

Mithen’s useful model here, again, is a cathedral with several isolated chapels, within which unique services of thought were undertaken, each barely audible elsewhere in the cathedral. In Mithen’s words: “Early humans seem to have been so much like us in some respects, because they had these specialized cognitive domains; but they seem so different because they lacked the vital ingredient of the modern mind: cognitive fluidity.”

[Behavioral modernity] is when “doors and windows were inserted between chapel walls,” when thoughts and information began flowing freely among the diverse cognitive domains or intelligences. Specialized intelligences no longer had to work in isolation, but a “mapping across knowledge systems” now became possible, and from this “transformation of conceptual spaces” creativity could now arise as never before.

Mithen thus appropriates some of the work of cognitive psychologists, to make the related point that in both development and evolution the human mind undergoes (or has undergone) a transformation from being constituted by a series of relatively independent cognitive domains to a situation in which ideas, ways of thinking, and knowledge now flow freely between such domains. This forms the basis for the highly plausible hypothesis that during this amazing emergent period of transition, the human brain was finally hardwired for cognitive fluidity, yielding imagination and creativity.

Alone in the World? by Wentzel Van Huyssteen pp. 195-197

And modern scientific investigation tends to back these ideas up:

The ability to switch between networks is a vital aspect of creativity. For instance, focusing on a creative puzzle with all of your attention might recruit the skills of the executive attention network. On the other hand, if the creative task involves producing a sonically pleasing guitar solo, focus might be switched from intense concentration to areas more involved in emotional content and auditory processing.

The neuroscience of creativity (Medical News Today)

It is this mixing of intelligences—this cognitive fluidity—that gives rise to language and symbolic thinking. Incremental at first, the blending of these intelligences increased over time, leading to the “Great Leap Forward” seen in the archaeological record:

Of critical importance here is also a marked change in the nature of consciousness. Mithen has argued that reflexive consciousness evolved as a critical feature of social intelligence, as it enabled our ancestors to predict the behavior of other individuals. He then makes the point that there is now reason to expect early humans to have had an awareness about their own knowledge and thought processes concerning the nonsocial world. Via the mechanism of language, however, social intelligence began to be invaded by nonsocial information, and the nonsocial world becomes available for reflexive consciousness to explore…Consciousness then adopted the role of a comprehensive, integrating mechanism for knowledge that had previously been “trapped” in specialized intelligences.

The first step toward cognitive fluidity appears to have been integration between social and natural history intelligence in early modern humans around 100,000 years ago. The final step to full cognitive fluidity, the potential to entertain ideas that bring together elements from normally incongruous domains, occurred at different times in different populations between 60,000 and 30,000 years ago. This involved an integration of technical intelligence, and led to the cultural explosion we are now calling the appearance of the human mind.

…As soon as language started acting as a vehicle for delivering information into the mind, carrying with it snippets of nonsocial information…[it] now switched from a social to a general-purpose function, consciousness from a means to predict other individuals’ behavior to managing a mental database of information relating to all domains of behavior…Mithen’s most interesting point here is that some metaphors and analogies can be developed by drawing on knowledge within a single domain, but the most powerful ones are those that cross domain boundaries. By definition these kinds of metaphors can arise only within a cognitively fluid mind… Alone in the World, pp.197-199

Yes, but were they conscious? There’s the rub. Is artwork proof of reflective self-consciousness? Are burials proof of such? Clearly tool use alone is not, as we’ve seen. And some of the most vibrant artwork has been done by schizophrenics.

Like Mithen, Jaynes also calls attention to the vital role of language and metaphor in cognitive fluidity and reflective self-consciousness. Even ‘the self’ itself is a metaphor!

…The most fascinating property of language is its capacity to make metaphors … metaphor is not a mere extra trick of language…it is the very constitutive ground of language. I am using metaphor here in its most general sense: the use of a term for one thing to describe another because of some kind of similarity between them or between their relations to other things.

There are thus always two terms in a metaphor, the thing to be described, which I shall call the metaphrand, and the thing or relation used to elucidate it, which I shall call the metaphier. A metaphor is always a known metaphier operating on a less known metaphrand.

It is by metaphor that language grows. The common reply to the question “what is it?” is, when the reply is difficult, or the experience unique, “well, it is like –.” In laboratory studies, both children and adults describing nonsense objects (or metaphrands) to others who cannot see them use extended metaphiers that with repetition become contracted into labels. This is the major way in which the vocabulary of language is formed. The grand and vigorous function of metaphor is the generation of new language as it is needed, as human culture becomes more and more complex.

It is not always obvious that metaphor has played this all-important function. But this is because the concrete metaphiers become hidden in phonemic change, leaving the words to exist on their own. Even such an unmetaphorical-sounding word as the verb ‘to be’ was generated from a metaphor. It comes from the Sanskrit bhu, “to grow, or to make grow,” while the English forms ‘am’ and ‘is’ have evolved from the same root as the Sanskrit asmi, “to breathe.”

It is something of a lovely surprise that the irregular conjugation of our most nondescript verb is thus a record of a time when man had no independent word for ‘existence’ and could only say that something ‘grows’ or that it ‘breathes.’ Of course we are not conscious that the concept of being is thus generated from a metaphor about growing and breathing. Abstract words are ancient coins whose concrete images in the busy give-and-take of talk have worn away with use. pp. 48-51

The ancient Greeks at the time of Homer lacked a word for blue; they referred to the Mediterranean Sea, for example, as “wine-colored” (οἶνοψ). The brilliant hues of the Mediterranean sunrise are famously described as “rosy-fingered” (ῥοδοδάκτυλος), and so forth. Wikipedia even has a list of them. A similar concept in Old Norse is called kenning (e.g. blood = “battle sweat”).

In reading ancient texts, we have one of the rare opportunities to look upon a worldview entirely alien to us. The ancients described physical appearances in some ways that seem bizarre to the modern sensibility. Homer says the sea appears something like wine and so do sheep. Or else the sea is violet, just as are oxen and iron. Even more strangely, green is the color of honey and the color human faces turn under emotional distress. Yet nowhere in the ancient world is anything blue, for no word for it existed. Things that seem blue to us are either green, black or simply dark in ancient texts.

Also, things like subjective perspective and experience are lacking. Even body parts are regularly described as having their own minds. And voices are supposedly heard in the external world, command voices telling people what to do, while voices aren’t described as being heard within the head. There is no ancient word or description of a fully internalized sense of self.

It’s hard to know what to make of all this. There are various theories that attempt to explain it. But the main takeaway is that our common sense assumptions are false. There is something more to human nature and human society than we at present experience and understand. As a species, we are barely getting to know ourselves.

Benjamin David Steele (Facebook post)

Note that in our discussion above, even our descriptions of the mind rely upon metaphors (“Swiss army knife,” “cathedral”) and spatialization (“leaping forward”).

Finally, there was a German theorist of religion named Max Müller who saw the origin of what we call ‘gods’ in the way that humans naturally tend to conceive of things they do not understand metaphorically. His theories have been all but forgotten, but I think they fit nicely with the idea that in order to comprehend certain natural phenomena, ancient peoples resorted to assigning them the category ‘god,’ even when they knew, for instance, that the sun was not literally the chariot of Apollo, or that lightning bolts were not literally thrown by Zeus. Keep in mind, what we think of when we hear the word ‘god’ in our rationalist, materialistic, monotheistic-influenced culture is probably so different from what the ancient people using it at the time meant, that we moderns cannot even conceive of what they had in mind. Here’s E. E. Evans-Pritchard describing Müller’s theories:

In [Müller’s] view, as I understand it, men have always had an intuition of the divine, the idea of the infinite–his word for God–deriving from sensory experience…Now, things which are intangible, like the sun and the sky, gave men the idea of the infinite and also furnished the material for deities…Müller did not wish to be understood as suggesting that religion began by men deifying natural objects, but rather that these gave him a feeling of the infinite and also served as symbols for it.

Müller was chiefly interested in the gods of India and of the classical world…His thesis was that the infinite, once the idea had arisen, could only be thought of in terms of metaphor and symbol, which could only be taken from what seemed majestic in the known world, such as the heavenly bodies, or rather their attributes. But these attributes then lost their original metaphorical sense and achieved autonomy by becoming personified as deities in their own right. The nomina became numina.

So religions, of this sort at any rate, might be described as a ‘disease of language’, a pithy but unfortunate expression which later Müller tried to explain away but never quite lived down. It follows, he held, that the only way we can discover the meaning of the religion of early man is by philological and etymological research, which restores to the names of the gods and the stories told about them their original sense.

Thus, Apollo loved Daphne; Daphne fled before him and was changed into a laurel tree. This legend makes no sense till we know that originally Apollo was a solar deity, and Daphne, the Greek name for the laurel, or rather the bay tree, was the name for the dawn. This tells us the original meaning of the myth: the sun chasing away the dawn.

E.E. Evans-Pritchard – Theories Of Primitive Religion, pp. 21-22

What We Talk About When We Talk About Consciousness

Previously: What If God Was One Of Us?

Last time we discussed the radical idea that “consciousness” arose relatively late in human history, roughly around the time of the Late Bronze Age Collapse in the Mediterranean.

Now, it’s important to understand that when Jaynes uses the term “consciousness,” he is talking about something very specific. It’s not simply being responsive to one’s exterior surroundings (sense perception), but being aware of them and filtering them through some kind of “inner life.” Jaynes contends that this sort of meta-awareness arrived relatively late in human history, and that we can pinpoint this change in comprehension through a careful reading of ancient literature, especially sacred literature and epic poetry.

Think of it this way: you see an apple; the color waves hit your eyes, which send signals to your brain via the optic nerve. You “choose” to reach out and grasp it. A nerve signal goes out from the brain to your arm and hand. The apple is touched. Nerve fibers in the hand send signals back to the brain, describing the temperature, texture, firmness, and so forth. All of these signals are processed in various areas of the brain, which we can see by the neurons firing in those areas in an fMRI scan.

Jaynes isn’t talking about any of that stuff. That’s the process of sense perception. He’s talking about something else entirely. As Marcel Kuijsten of the Julian Jaynes society describes:

[2:30-3:57] “In a nutshell, what Jaynes argues is that, as humans evolved language, the brain was using language to then convey experience between the two hemispheres, which were operating in, let’s say, a less integrated fashion than they are today.”

“This idea is a little shocking to people initially, because behavior was then directed by what we today call an auditory hallucination. But there’s a lot of evidence that he presents for this. The ancient literature is filled with all of these examples of people’s behavior being directed by what they interpreted as the gods, idols that they used to elicit these commands, and just quite a bit of evidence that he gets into explaining all this.”

“From that he realized that consciousness was not what people generally assume to be a biologically innate, evolved process, but it was something that was learned, and it was based on language. So after language got to a level of complexity, then we developed this ability to introspect. So he places the date for the development of consciousness much more recently than traditional ideas.”

“[10:18] Most of the critiques of the theory are based on misconceptions…[11:04] The most common mistake is that they are criticizing what Jaynes is saying based on their own view of consciousness rather than how Jaynes defines it. And consciousness is defined so differently by so many people that when you go to conferences on consciousness you see all these people giving lectures and they’re all really defining it in very, very different ways.”

Julian Jaynes and the Bicameral Mind Theory (This View of Life Magazine)

Jaynes himself acknowledges the inherent difficulty of using our own conscious mind to come to an intellectual reckoning of, well, itself!

Consciousness is a much smaller part of our mental life than we are conscious of, because we cannot be conscious of what we are not conscious of. How simple is that to say; how difficult to appreciate! It is like asking a flashlight in a dark room to search around for something that does not have any light shining upon it. The flashlight, since there is light in whatever direction it turns, would have to conclude that there is light everywhere. And so consciousness can seem to pervade a mentality when actually it does not. p. 23

Again, consciousness is not simply the sense perception of the world around you. It’s not required to do basic things like eat, sleep or have sex. It’s not even necessary for talking. Chimpanzees (and gorillas) have been taught to “talk” using sign language. Unless we attribute reflective self-consciousness to great apes, then clearly language—in terms of expressing simple desires and notions about the world using nouns and verbs—is not, strictly speaking, an act that only conscious beings can do; at least not as Jaynes describes it. All animals communicate in some fashion, whether they are self-conscious or not.

Also, it’s thought that language actually evolved in humans primarily for gossip, and that gossip evolved as a method of social bonding and coalition building, and not, please note, for ruminative thought or reflective self-awareness:

Human language didn’t evolve to name things but for gossip — our equivalent of primates grooming — which developed to maintain the bonds of trust in the ever growing social groups our ancestors formed to protect themselves against predators as they moved ‘out of Africa’ to survive…We continue to gossip today — approximately 65% of modern talking time is taken up by it, irrespective of age, gender or culture. The topics tend to be extreme events (both good and bad) that we struggle to make sense of alone. By engaging our peers we are better able to understand and act in the world around us.

The Problematic Storytelling Ape (Medium)

Nor is consciousness strictly necessary for a large-scale social organization to develop. For example, there are many examples of eusocial and prosocial species among earth’s various types of animals. Ants, bees, and wasps are among the most successful animal species on the planet, engaging in agriculture, building large nests, raising each other’s young, engaging in organized war, and living in vast “cities.” Are the Hymenoptera conscious in the same way humans are? It’s highly doubtful. And yet they live in complex societies and many of their behaviors are similar.

“I’ll take the example of the leaf cutter ant,” [economics professor Lisi] Krall explained … “They cut and harvest leaves, and then they feed the leaves to their fungal gardens, and they themselves then feed on the fungal gardens,” she said. The ants “develop into vast, vast colonies that have highly developed, profound divisions of labor.” Sound familiar?…”We engaged a kind of social evolution, that started with agriculture, that put us on a path of expansion and interconnectedness and ultimately, in humans, hierarchy, and all that kind of stuff,” she said.

Humans are more like ants than lone wolves (Treehugger)

Even writing existed for thousands of years as simply a mnemonic device for recording straightforward things like genealogies and inventories—”lists, lists and more lists,” as James C. Scott put it. There’s no indication that writing, strictly speaking, requires self-consciousness.

Agriculture, villages, towns, even cities and empires arose without the benefit of writing. The earliest forms of cuneiform writing consisted of clay tablets recording market transactions and tax records with [no] moral, political or legal lessons for future generations… These were mnemonic devices, no better and no worse than a string tied around the finger or rather more sophisticated sets of knots created by the Incans [sic]. The tablets circulated as bills of exchange, carrying a symbolic value as money rather than a historical value as something-to-be-preserved. Their symbolic function served, the tablets were simply thrown away in the trash. Daniel Lord Smail, On Deep History and the Brain p. 57

Animals have also constructed dwellings like hives, mounds, and nests, and made artwork: “Animal-made works of art have been created by apes, elephants, cetacea, reptiles, and bowerbirds, among other species.” (Wikipedia)

Chimpanzee wins $10,000 prize for abstract painting (The Guardian)

It used to be thought that reflexive self-consciousness was necessary for any sort of complex culture to exist, and that cumulative cultural evolution was something unique to humans. However, in 2014 researchers managed to induce cumulative cultural evolution in baboons. In 2017, it was found that homing pigeons can also gather, pass on and improve knowledge over generations. Then, whales and dolphins (cetaceans) were added to the mix. Then came migrating ungulates (hoofed mammals). Last year, researchers even detected evidence of it among fruit flies!

Primatologists have taken to regularly attributing the differences in chimpanzee behavior in various troops across Africa to “culture” rather than biological instinct. And tool use has been documented in a wide number of animals:

The suggestion that humanity is distinct by virtue of possessing a culture subject to Lamarckian evolution is more problematic than it may appear. The glitch lies in the fact that humans are no longer considered to be the only species to possess culture.

The idea that other animals have culture has been circulating for nearly three decades and has reached a point of media saturation that partially obscures the challenge created by the fact of animal culture. Although early studies focused on the apes and monkeys who make tools and wash sweet potatoes, culture does not end with primates.

Birds’ songs and migration routes are learned and transmitted culturally rather than genetically. Some groups of dolphins manipulate sponges to protect their noses while foraging and teach the practice to their offspring. The crows of New Caledonia clip twigs to create hooked tools that are used to retrieve insects from crevices. As with chimpanzees, the types of tools used by crows vary from one group to the next, suggesting that the very use of tools is transmitted through culture. Daniel Lord Smail: On Deep History and the Brain; p. 87

So, more and more, we are finding that self-reflective consciousness is not strictly necessary for many of the behaviors we used to think were uniquely human. Cumulative cultural evolution was there all along just waiting for us to find it! To a drastically lesser degree than us, of course, but it was there nevertheless. We were just too arrogant and self-absorbed to look properly.

So, then, what exactly do we mean when we talk about consciousness? Unless we consider baboons, chimps, orangutans, dolphins, whales, pigeons, crows, bighorn sheep, ants, termites and fruit flies as all conscious the way we are, we must look elsewhere, or else redefine what it is that we are truly searching for in the first place.

What it does not mean is what’s usually called “operant conditioning.” All animals are capable of that. Jaynes himself dismisses ideas of operant conditioning as indicators of the type of consciousness that characterizes human beings. After describing standard experiments in which he “taught” everything from plants to microbes to reptiles to complete various tasks, he realized this had nothing whatsoever to do with the type of conscious behavior he was looking for:

It was, I fear, several years before I realized that this assumption makes no sense at all. When we introspect, it is not upon any bundle of learning processes, and particularly not the types of learning denoted by conditioning and T-mazes. Why then did so many worthies in the lists of science equate consciousness and learning? And why had I been so lame of mind as to follow them?…

It is this confusion that lingered unseen behind my first struggles with the problem, as well as the huge emphasis on animal learning in the first half of the twentieth century. But it is now absolutely clear that in evolution the origin of learning and the origin of consciousness are two utterly separate problems…

Is consciousness…this enormous influence of ideas, principles, beliefs over our lives and actions, really derivable from animal behavior? Alone of species, all alone! We try to understand ourselves and the world. We become rebels or patriots or martyrs on the basis of ideas. We build Chartres and computers, write poems and tensor equations, play chess and quartets, sail ships to other planets and listen to other galaxies – what have these to do with rats in mazes or the threat displays of baboons? The continuity hypothesis of Darwin for the evolution of the mind is a very suspicious totem of evolutionary mythology… pp. 7-8

The chasm is awesome. The emotional lives of men and other mammals are indeed marvelously similar, but to focus upon the similarity unduly is to forget that such a chasm exists at all. The intellectual life of man, his culture and history and religion and science, is different from anything else we know of in the universe. That is fact. It is as if all life evolved to a certain point, and then in ourselves turned at a right angle and simply exploded in a different direction. p.9

Jaynes controversially rejects the idea that consciousness is necessarily a part of human thinking and reasoning, as we commonly assume it must be. He cites the work of the Würzburg School of psychology in Germany and their discovery of so-called “imageless thoughts.”

The essential point here is that there are several stages of creative thought: first, a stage of preparation in which the problem is consciously worked over; then a period of incubation without any conscious concentration upon the problem; and then the illumination which is later justified by logic. The parallel between these important and complex problems and the simple problems of judging weights or the circle-triangle series is obvious. The period of preparation is essentially the setting up of a complex situation together with conscious attention to the materials on which the struction is to work. But then the actual process of reasoning, the dark leap into huge discovery, just as in the simple trivial judgment of weights, has no representation in consciousness. Indeed, it is sometimes almost as if the problem had to be forgotten to be solved. p. 44

Jaynes points out that not only is consciousness not necessary for the performance of routine daily tasks, it can actually be counterproductive! Self-conscious reflection sets us to “watching ourselves” from an observer’s point of view, and our performance often degrades as a result. That is, we involve our “ego self” in whatever we happen to be doing at the moment. You can see this all the time with athletes: once they start consciously trying to win, they trip up and stop winning. The best sports actions are performed with a certain lack of self-reflection (dare we say, a lack of conscious introspection), leading to a sense of spontaneity. We might almost call it a trance, as in the Taoist tale of the dexterous butcher. There is a word for this “non-conscious” state in Chinese philosophy: wu-wei, or non-action. Edward Slingerland, an expert in ancient Chinese philosophy, has written a whole book about this concept called Trying Not to Try.

It’s clearly a different sort of consciousness that Jaynes is after. It is something uniquely human, but we don’t seem to be able to find it anywhere we look, except as differences of degree from various other animals. Even art, culture, building, reasoning and communication are not immune!

Nor does the “self” or “consciousness” have any sort of fixed anatomical location, inside your noggin or anywhere else for that matter, as we seem to assume. Many ancient peoples located their conscious selves in the heart, not in the head. The ancient Greeks did so, seeing the brain as merely a cooling organ for blood, like a car radiator. Out-of-body experiences also testify that consciousness can locate itself anywhere, not even within the physical body itself!

Where does consciousness take place? Everyone, or almost everyone, immediately replies, in my head. This is because when we introspect, we seem to look inward on an inner space somewhere behind our eyes. But what on earth do we mean by ‘look’? We even close our eyes sometimes to introspect even more clearly. Upon what? Its spatial character seems unquestionable…

We not only locate this space of consciousness inside our own heads. We also assume it is there in others’. In talking with a friend, maintaining periodic eye-to-eye contact (that remnant of our primate past where eye-to-eye contact was concerned in establishing tribal hierarchies), we are always assuming a space between our companion’s eyes into which we are talking, similar to the space we imagine inside our own heads where we are talking from.

And this is the very heartbeat of the matter, for we all know perfectly well that there is no such space in anyone’s head at all! There is nothing inside my head or yours except physiological tissue of one sort or another. And the fact that it is predominantly neurological tissue is irrelevant. pp. 44-45

Let us not make a mistake. When I am conscious, I am always and definitely using certain parts of my brain inside my head. But so am I when riding a bicycle, and the bicycle riding does not go on inside my head. The cases are different of course, since bicycle riding has a definite geographical location, while consciousness does not. In reality, consciousness has no location whatever except as we imagine it has. p. 46

In the end Jaynes concludes with regard to consciousness:

We have been brought to the conclusion that consciousness is not what we generally think it is. It is not to be confused with reactivity. It is not involved in hosts of perceptual phenomena. It is not involved in the performance of skills and often hinders their execution. It need not be involved in speaking, writing, listening, or reading. It does not copy down experience, as most people think. Consciousness is not at all involved in signal learning, and need not be involved in the learning of skills or solutions, which can go on without any consciousness whatever. It is not necessary for making judgements or in simple thinking. It is not the seat of reason, and indeed some of the most difficult instances of creative reasoning go on without any attending consciousness. And it has no location except an imaginary one! The immediate question therefore is, does consciousness exist at all? pp. 46-47

Jaynes concludes that it does, but to understand what he means, we have to start thinking about it in a totally different way. And for that reason, we can’t find it simply by studying physical processes in the brain. We need to engage in a bit of existentialist philosophy:

The trick to understanding his model is first understanding what he means by “consciousness”. I don’t think he means what most of us mean when we talk about, say, the “hard problem” of consciousness. In modern considerations of consciousness, I think we largely refer to subjective experience – the “what it is like” to be aware of the world. Jaynes however dismisses this as mere sensory perception. He is more interested in what it is to have an internal “mindspace”, an “analog I” that experiences the world. Jaynes argues for the emergence of this sense of self and an inner mindspace from language. He sees the infinite capacity for metaphor inherent in human language as a means by which we can build similarly infinite concepts and ideas about our relationship with the external world.

That is, when we introspect upon our experience as selves in the world, we construct an inner self, an “I” that exists within our mind’s eye which is what it is that has these experiences, these relationships. This inner self is an analog for what our senses perceive and how we react and is what gives us a sense of the first person in how we view the world. I guess Jaynes is thinking here of some kind of conscious interiority, a feeling of being “in here” rather than “out there” (or perhaps nowhere at all).

Jaynes observes (as have many others) that this kind of awareness rests upon language. Human language has two distinctive features – the capacities for metaphorical representation and infinite recursion. With these basic tools, human beings can build infinitely complex models of self and experience. We can also use language to communicate – share – these models. In fact, over time it is this sharing that helps to construct commonly held models of experience that shape the course of cultural progress.

Julian Jaynes and the Analog “I” (Science Philosophy Chat Forums)
The key to this is how the brain uses language to construct the self:

It is through language that we construct models of the self and through translation of our intuitions into words and ideas that we learn the limits of this language and the limits of our own particular perspective.

Through language we learn to differentiate between ourselves and others from a young age even if consciousness is not a concept that we ever learn explicitly or ever truly “know” our self.

It is in natural language — the spoken word, novels, poetry, vague metaphorical speech, descriptions of made-up things like love and self and consciousness — that we have our greatest tool to share our subjective experiences. A powerful tool to build a common roadmap to create better selves.

The self may be a fiction but in that case it is all the more vital that we embrace fiction, and by extension natural language, to communicate with each other at an ever deeper level.

The Problematic Storytelling Ape (Medium)

Thus, language is crucial in constructing the “self,” i.e. the concept of the individual “I” that we all carry around all day inside our heads—the homunculus with no material existence that we nonetheless feel is “in there” somewhere. But—it’s important to note—the mere presence of language and writing by itself does not necessarily indicate that such introspective thinking exists. Rather, the self—the analog I—is a “concept” that utilizes our innate capacity for language, but is independent of it:

The analogue-I and analogue-me refer to mental self-relevant images that take a first-person vs. third-person perspective, respectively. Mental self-analogues are essential for goal setting, planning, and rehearsal of behavioral strategies, but they often fuel emotional and interpersonal problems when people react to their analogue selves as if they were real.

The Analogue-I and the Analogue-Me: The Avatars of the Self (Self and Identity)

Behavioral scientists have studied how this self interacts with the world. Behavioral science has found that there is not one, unitary “self” consistent over time, but multiple selves! These selves are often present at the same moment, although oriented toward different points in time. This mindblowing idea alone should cause us to reject the idea that the self is just a biological process inside our heads rather than a mental construct. In a recent study on willpower, the authors propose a conflict between multiple overlapping selves: “Simply put, you in the present is different than you in the future.” (Treehugger)

The second class of models posits multiple coexisting selves. This view holds that decision makers behave as if they were a composite of competing selves with different valuation systems and different priorities.

One “self” craves instant gratification (e.g., “I want to eat a cheeseburger! Yum!”), whereas another “self” is focused on maximizing long-term outcomes (e.g., “I want to eat a salad and be healthy!”). Self-control conflicts are the consequence of a present-oriented valuation system disagreeing with a future-oriented valuation system

…Evidence for multiple system models comes from functional MRI (fMRI) studies showing that self-controlled choices were associated with lateral prefrontal areas of the brain, whereas more impulsive choices were associated with the ventral striatum and ventromedial prefrontal cortex.

Beyond Willpower: Strategies for Reducing Failures of Self-Control (Sage Journals)

Given all this, Jaynes finally lists what he believes are the core characteristics of the kind of human introspective conscious awareness he’s talking about:

1. Spatialization – We tend to describe reality in terms of spatial visualization. “If I ask you to think of the last hundred years, you may have a tendency to excerpt the matter in such a way that the succession of years is spread out, probably from left to right. But of course there is no left or right in time. There is only before and after, and these do not have any spatial properties whatever – except by analog. You cannot, absolutely cannot think of time except by spatializing it. Consciousness is always a spatialization in which the diachronic is turned into the synchronic, in which what has happened in time is excerpted and seen in side-by-sideness.” p. 60

2. Excerption (screening, or filtering) – Our perception of reality is necessarily limited. “In consciousness, we are never ‘seeing’ anything in its entirety…we excerpt from the collection of possible attentions to a thing which comprises our knowledge of it. And this is all that it is possible to do since consciousness is a metaphor of our actual behavior.”

3. The Analog ‘I’ – “…the metaphor we have of ourselves which can move about vicarially in our imagination doing things we are not actually doing…In the example of…spatialization, it was not your physical behavioral self that was trying to ‘see’ where my theory ‘fits’ into the array of alternative theories. It was your analog ‘I.’” pp. 62-63

4. The Metaphor ‘Me’ – “We can both look out from within the imagined self at the imagined vistas, or we can step back a bit and see ourselves perhaps kneeling down for a drink of water at a particular brook.”

5. Narratization – We construct narratives to understand the world: “In our consciousness we are always seeing our vicarial selves as the main figures in the stories of our lives. In the above illustration, the narratization is obvious, namely, walking along a wooded path. But it is not so obvious that we are constantly doing this whenever we are being conscious, and this I call narratization.”

6. Conciliation – We comprehend new things by fitting them within established patterns. “…a slightly ambiguous perceived object is made to conform to some previously learned schema, an automatic process sometimes called assimilation. We assimilate a new stimulus into our conception, or schema about it, even though it is slightly different…assimilation consciousized is conciliation. In conciliation we are making excerpts or narratizations compatible with each other, just as in external perception the new stimulus and the internal conception are made to agree…”

To this I would also add that the human mind seems to have an inherent instinct for meaning or purpose. It tends to be quite good at self-deception. And, we’ll later explore the human mind’s ability for recursion.

To get some clues about how this all developed, we’ll take a look at some theories of how the modern human brain evolved from earlier hominins next time.

BONUS: Robert Sapolsky: Are Humans Just Another Primate?

What If God Was One of Us?

Did ancient peoples have a fundamentally different consciousness than modern people?

Horus is my co-pilot

It’s a question I think deserves serious attention. Of course, this leads to a discussion of what the heck “consciousness” even means—does it mean self-awareness, or self-conscious introspection, or our perception of consensus reality? What constitutes “reality?” Are dreams and hallucinations “real,” for instance? And what does “self-awareness” really mean, anyway? Solipsism? Or something else?

Even in a secular age, consciousness retains a mystical sheen. It is alternatively described as the last frontier of science, and as a kind of immaterial magic beyond science’s reckoning. David Chalmers, one of the world’s most respected philosophers on the subject, once told me that consciousness could be a fundamental feature of the universe, like space-time or energy. He said it might be tied to the diaphanous, indeterminate workings of the quantum world, or something nonphysical.

These metaphysical accounts are in play because scientists have yet to furnish a satisfactory explanation of consciousness. We know the body’s sensory systems beam information about the external world into our brain, where it’s processed, sequentially, by increasingly sophisticated neural layers. But we don’t know how those signals are integrated into a smooth, continuous world picture, a flow of moments experienced by a roving locus of attention—a “witness,” as Hindu philosophers call it.

Do Animals Have Feelings? (The Atlantic)

As one commenter on Reddit points out regarding the Atlantic article above:

“Consciousness” is an archaic sort of catch-all phrase without much empirical definition and usefulness. Sort of like how physicists used to use “ether” to describe things. Of course we’ve upgraded our concepts (and respective language) for a more enriched understanding, not needing the idea of “ether” anymore.

As the Atlantic article referenced above describes, “If one of the wasp’s aquatic ancestors experienced Earth’s first embryonic consciousness, it would have been nothing like our own consciousness.” But the question we’re pondering today is whether even our own remote ancestors had a consciousness very different than our own.

To deal with this question, let’s take a look at the 1976 book, The Origin of Consciousness in the Breakdown of the Bicameral Mind by psychologist Julian Jaynes.

The idea is that in these ancient Mediterranean civilizations, the typical human had one or more ‘gods’ — spirits, agents, separate intelligences — living alongside the conventional ‘self’ in the brain. In other words, the dominant pattern was to maintain two separate, verbally-intelligent control centers in the same brain — one for the ‘gods’ and one for the ‘humans’/’mortals’/’selves’.

Jaynes refers to this arrangement as bicameral, which means two-chambered. That’s because he postulates that the gods and conventional selves were headquartered in the two chambers of the brain — the right and left hemispheres (respectively). I think this is plausible enough, but Jaynes admits that it’s speculative, and it’s not strictly necessary for the rest of his theory. What matters is only that the human brain is (empirically!) capable of something like this arrangement.

In other words, the gods took on some of the functions we think of as the “will” or volition. (But not the conscience; that would only later become a function of a very different kind of god.) Here’s Jaynes:

“The gods were in no sense ‘figments of the imagination’ of anyone. They were man’s volition. They occupied his nervous system… and from stores of admonitory and preceptive experience, transmuted this experience into articulated speech which then ‘told’ the man what to do.”

Think of it this way. Today we have a lot of mental phenomena we can’t really account for, like “intuitions” or “gut feelings.” … Now imagine that “bad feeling” in the form of a voice telling you, “Be careful! Don’t agree to anything!”

Mr. Jaynes’ Wild Ride (Melting Asphalt)

Of the theory, Ran Prieur says, “I’m sure that ancient people had different consciousness than modern people, but Jaynes thought it was *really* different: that they were basically all schizophrenic, hearing voices and seeing visions, which they interpreted as gods.” That is, “Julian Jaynes believed that ancient people experienced their gods as auditory hallucinations.”

The experience of multiple personalities or hearing disembodied voices is extremely common even today, and not only in people suffering from acute schizophrenia:

As much as 10% of the population hear voices at some point in their lives, much higher than the clinical incidence of schizophrenia (1%)…And around 65% of children say they have had ‘imaginary friends’ or toys that play a sort of guardian-angel role in their lives.

Jaynes thought children evolve from bicameral to conscious, much as Piaget thought young children are by nature animist (ie they attribute consciousness to things, and may attribute special consciousness to favourite toy-companions…

Gods, Voice Hearing and the Bicameral Mind (Philosophy for Life)

This Aeon article is a fascinating overview of how psychologists have tried to explain how our “inner voice” integrates our personality over the course of our development. It describes the research of Charles Fernyhough, a leading researcher of inner speech and auditory hallucination at Durham University in the United Kingdom:

‘It’s possible to inner “hear” your own voice rather than speak your own voice,’ … Here, people listen to their own voice in their heads, perceiving the same sonic characteristics as expanded speech, but without the agency. Such experiences have been recalled by participants as their voice ‘just happening’, as ‘coming out of its own accord’, as ‘taking place’ rather than ‘being uttered’.

Some people passively experience inner speech in voices not their own – essentially as auditory hallucinations that they cannot control. Founding member of the Beach Boys Brian Wilson described the experience to Larry King in an interview on CNN in 2004: ‘I’m going to kill you. I’m going to hurt you’, an inner voice had continually repeated to him since his initial experiences with LSD in the 1960s. The value of understanding such hallucinations is self-evident: they are a hallmark of schizophrenia, a condition that affects almost 24 million people worldwide.

Of great fascination, Fernyhough has concluded that a small but significant part of the general population also experience auditory hallucinations – a phenomenon the researchers call ‘voice hearing’ to distinguish it from schizophrenia. Such voices have been reported by noted individuals throughout history, says Fernyhough. The Greek philosopher Socrates described what he called a ‘daemonic sign’, an inner voice warning him that he was about to make a mistake. Joan of Arc described hearing divine voices since childhood – the same ones that influenced her motivation to help in the siege of Orleans. The 15th-century mystic and autobiographer Margery Kempe wrote about inner conversations with God. Sigmund Freud was not immune: ‘During the days when I was living alone in a foreign city … I quite often heard my name suddenly called by an unmistakable and beloved voice.’

All this leads to another, confounding question: are verbal thoughts reaching awareness just the tip of a mental iceberg, offering only a glimpse of the unconscious mind?

The inner voice (Aeon). Or are verbal thoughts themselves consciousness?

It’s not as crazy as it sounds on first blush. In fact, we commonly experience all sorts of “altered” mental states throughout our lives—hypnotic trances, hallucinations and visions, flow (a.k.a. “being in the zone”), fever delirium, getting stoned or drunk, orgasm, dizziness, out-of-body experiences, and most obviously, dreams and nightmares. Then of course, there are our moods (anger, excitement), and feelings (ennui, jealousy).

Here are a few examples to get started: tunnel vision, runner’s high, ‘flow’, déjà-vu, daydreaming, and orgasm. Then there are spiritual or religious experiences, which are characterized by a suppressed ego and a heightened sense of unity…Then there are the states attending to physical illness — stupor, delirium, lightheadedness, or (in extreme cases) out-of-body experiences. Moods and emotions also correspond to states of consciousness: sadness, fear, surprise, laughter, joy, lust, anxiety, guilt, anger, shame, pride, boredom, and nostalgia.

Drugs put us into all kinds of interesting states…let’s not forget all the weird things that happen around sleep. Drowsiness, hypnagogia, hypnopompia, the Tetris effect, and of course dreaming itself. Every night we spend an hour or so cavorting around in a rich hallucinated fantasyland — and we think nothing of it. But this should give us pause. A brain that’s capable of dreaming should be capable of almost anything.

And all of this is only the tip of the iceberg — the states that most people have experienced at some point in their lives. In fact the brain is capable of many more and stranger things, especially if we admit into our catalogue all the states attending to brain damage, mental illness, torture, and sleep- or sensory-deprivation. Alien hand syndrome and face-blindness are but two examples.

Accepting Deviant Minds (Melting Asphalt)

The author of the above piece speculates that in our age of constant digital distractions and stimulation, we may one day lose our ability to let our minds wander—that is, to daydream (interestingly, the first sign of self-consciousness in the androids of Westworld is the “Reverie,” which is a synonym for daydreaming). If something like that were to happen, future humans would have a hard time understanding what the heck daydreaming once was, even though it’s well attested in literature. People who engaged in such behaviors in the future would be considered “deviant” or “mentally ill” and in need of treatment. Descriptions of this behavior in the past would be regarded as some sort of archaic collective psychosis, if not downright fantastical.

It’s not hard to imagine a world — 500 years from now, say — in which adults have lost the ability to daydream. Children, even infants, will grow up immersed in computer-mediated reality and be bombarded every waking moment with ‘optimal’ stimulation. In such a saturated world, a normal human brain may well become incapable of “day-dreaming” — of pulling up anchor from reality and drifting off into aimless daytime fantasies.

I’m not putting this forward as an actual prediction of what’s likely to happen, but merely as a hypothetical “what-if” scenario.

So what would this future society think of the few remaining people who are prone to “day-dreams”? Theirs will be the brains that, by definition, don’t respond in the normal way to environmental conditioning. It will be easy and tempting, then, to classify such people as mentally ill — to diagnose them with Aimless Imagination Disorder, perhaps. And surely there will be drugs to help keep them attending to reality, i.e., to what’s happening on their screens.

Accepting Deviant Minds (Melting Asphalt)

We would treat these daydream believers in much the same way as we treat the people who “still” hear voices in their heads today. For example:

In the 1980s, a Dutch psychiatrist called Marius Romme was treating a 30-year-old voice-hearer called Patsy Hague. She was on tranquilizers, which failed to stop the voices and made it difficult for her to think. She became suicidal.

Then Romme happened to lend her a copy of Jaynes’ book. It made her think perhaps she was not ill so much as ‘living in the wrong century’, and also gave her confidence that her voices were ‘real’, or as real as the invisible God that Romme and others believed in. Hague told Romme: ‘You believe in a God we never see or hear, so why shouldn’t you believe in the voices I really do hear?’ Why not listen to what the voices had to say, rather than dismissing them as meaningless pathological symptoms?

Romme set up a meeting between Hague and other voice-hearers, who enthusiastically swapped stories and shared their sense of helplessness, vulnerability and alienation from their society. A sort of peer-led support network emerged, and has continued to blossom since then…

Gods, voice hearing and the bicameral mind (Philosophy for Life)

So who is to say what is ultimately “real” and “not real” when it comes to mental states? Our “sense of self” is just as imaginary a construct as all those ghosts and demons and other assorted imaginary friends, as this writer points out:

The brain…is capable of some pretty weird stuff. It’s not just a blank slate holding symbolic impressions of what’s happening out in the world…

I’ve spent a lot of effort…preparing us not to reject the idea of hallucinated gods out of hand. But now I ask that you keep just one thing in mind as you continue to read about Jaynes — namely, this objective fact about our species:

The human brain is capable of hallucinating voices.

Yes, hallucinated voices are weird — but they really happen. And sometimes we can even be quite cavalier about them. Every night, for example, we spend an hour or so immersed in a rich hallucinated fantasyland — only to dismiss it, when we wake up, as “just a dream.”
Wait a minute. “Just” a dream? If a dream wasn’t perfectly normal, it would be the weirdest thing that ever happened to you.

When we accuse a hallucinated voice, or the spirit that takes over during a possession, of being unreal, on what do we base the accusation? Both voices and spirits are, as we’ve seen, neurologically real — they correspond to a real pattern of neurons capable of exhibiting real intelligence. Both can be treated as agents, i.e., the kind of thing toward which it’s productive to take the intentional stance.

If anything, our objection lies in the fact that voices and spirits don’t have any reality in the world outside our minds. But there’s something else that has all these properties: the self. I, ego, myself, my conscious will. A neurologically real agent with no physical reality outside of the mind.

Hallucinated Gods (Melting Asphalt)

In fact, some people go so far as to actively cultivate their inner voice. This is part of both Eastern esoteric traditions (e.g. Tibetan Buddhism) and Western ones (e.g. Magick). Many otherwise “sane” people with addictions describe their addiction as a consciousness separate from their own which “makes” them drink, or do drugs, or whatever, with their “real” selves going along for the ride. These are “drives,” or “sub-personal agents,” which our minds possess. Even today we refer to our personal “demons”—a telling expression, I think. Sometimes people even go so far as to give their addictions or inner voices a name. Maybe in the past, they called the voices things like “Utu” or “Osiris” or “Aten” or “Apollo”:

…there’s no objective sense in which one of your voices could be the “same” as one of my voices. The process of naming/identifying one’s voices is strictly a symbolic, interpretive act — and as such it would have been fraught with social and political implications. There were personal gods, household gods, state and local gods, each a meaningful token of a different kind of loyalty.

No doubt identification was influenced by all sorts of factors in the child’s life: his parents, priests, and peer group; norms about whether it’s OK to ‘invent’ new gods; where he spent his time; where he heard his voices. If a child hallucinated one of his voices with particular strength at the temple of Osiris, while bathing in the imagery, mythology, and personality of Osiris — well, it only makes sense for that voice to ‘be’ Osiris.

Hallucinated Gods (Melting Asphalt)

Nor is this just ancient history. Yesterday I was reading an article in The Guardian about a British lady named Amanda Feilding who is leading a one-woman crusade to legalize psychedelic drugs around the world for use in the treatment of serious mental disorders. Of her childhood, there’s this fascinating tidbit:

Before the light outside goes, Feilding insists that we have a wander around the grounds, where the seeds of her curiosity were sown. Out among the ancient hedges and ponds she points out the mound and tree stump that she believed housed a private god figure; her game, aged five or six, was to find ways to make that god laugh, “that kind of orgasm experience that I think a lot of young children have and then forget”.

Feilding did not forget. She wanted afterwards, she says, to recreate that childlike intensity of experience…As Feilding explains this former life, in digressive fits and starts, fretting a little that she is saying too much, she leads me through the twilit garden, over well-trodden stepping stones, pointing out a pond she dug “based on sacred geometries”, with a half-submerged colonnade as if from a forgotten civilisation…

Amanda Feilding: ‘LSD can get deep down and reset the brain – like shaking up a snow globe’ (The Guardian)

Incidentally, the idea of spirits inhabiting a particular inanimate object or place is called a tutelary deity in theology, and is quite common across cultures. It appears to be an outgrowth of animism:

A tutelary (also tutelar) is a deity or spirit who is a guardian, patron, or protector of a particular place, geographic feature, person, lineage, nation, culture, or occupation. The etymology of “tutelary” expresses the concept of safety, and thus of guardianship. (Wikipedia)

It’s interesting to contemplate the fact that in ancient literature–religious or not–humans are almost always depicted as communicating directly with the deities! For example, in every ancient legal code I’m aware of, the laws were received directly from the gods by the lawgiver, like dictating to a stenographer. Moses is one case, but hardly the only one. What if this was more than just simply colorful metaphor?

Aeon has a fascinating piece up on the origins of monotheism, which seems to have arisen more or less simultaneously in both Egypt and in Hebrew culture. While many (such as Freud) have speculated that one must have influenced the other, there is no record of any direct contact. The change in religion happened rapidly, over just a few decades, rather than by gradual evolution, the author contends. What’s especially interesting is the author’s speculation about how direct communication with the deity brought about the monotheistic revolution:

My theory is that Akhenaten himself very early in his reign (or even just before) experienced a theophany – a dream or some sort of divine manifestation – in which he believed that Aten spoke to him. This encounter launched his movement which took seven to nine years to fully crystallise as exclusive monotheism.

Great idea, but based on what evidence? Mention has already been made of the two major Aten Temples called Gemet Pa-Aten constructed at Karnak and Akhet-Aten. A third temple by the same name was built in Nubia. Three temples with the same name is unprecedented, and suggests that its meaning, ‘The Aten is Found’, was vitally important to the young king’s religious programme. Could the name of the three sanctuaries memorialise the dramatic theophany that set off the revolution?

Akhenaten also uses the same language of discovery to explain how he found the land where he would establish the new city, Akhet-Aten. The aforementioned boundary inscription records Akhenaten’s words when travelling through the area that would become his new capital:

“Look, Aten! The Aten wishes to have [something] made for him as a monument … (namely) Akhet-Aten … It is Aten, my father, [who advised me] concerning it so it could be made for him as Akhet-Aten.”

Later in the same inscription, the king again repeats the line: ‘It is my father Aten who advised me concerning it.’ These texts point to an initial phenomenological event in which the king discovered the new form of the sun-god and then, through a later revelation, Aten disclosed where his Holy See should be built.

The first God (Aeon)

Interestingly, Islamic monotheism began in a similar fashion centuries later, when the Arab merchant and trader Muhammad heard a voice commanding him to “Recite!” That voice was later attributed to the archangel Gabriel, depicted as the messenger of God (Allah) in Islam.

This is naught but a revelation revealed,
taught him by one mighty in power,
very strong; he stood poised
being on the higher horizon,
then drew near and suspended hung,
two bows’-length away, or nearer,
then revealed to His servant that he revealed.

What struck me in the passage above is how it does seem as though Akhenaten is being compelled to do various things by some sort of commanding entity, just as Jaynes hypothesized. Akhenaten even implies that the god Aten is his “father” (monotheism is suffused with patriarchal ideas). Of course, Moses is also depicted as speaking with God directly in the Scriptures. Again, we “moderns” interpret this stuff as simply poetic license. But if Jaynes’ suppositions are to be taken seriously, it could have been much more than that!

Hammurabi receiving the laws from the sun-god Shamash

Put another way, the “self” may not be something intrinsic to the brain’s function, but something that is wired up in the environment (or not), depending on the circumstances. That is, it’s environmentally constructed. After all, the human brain is uniquely plastic, and, unlike most animals, does much of its “hardwiring” in the first twenty or so years of life outside the womb:

If we accept that the brain is teeming with agency, and thus uniquely hospitable to it, then we can model the self as something that emerges naturally in the course of the brain’s interactions with the world.

In other words, the self may be less of a feature of our brains (planned or designed by our genes), and more of a growth. Every normal human brain placed in the right environment — with sufficient autonomy and potential for social interaction — will grow a self-agent. But if the brain or environment is abnormal or wrong (somehow) or simply different, the self may not turn out as expected.

Imagine a girl raised from infancy in the complete absence of socializing/civilizing contact with other people. The resulting adult will almost certainly have a self concept, e.g., will be able to recognize herself in the mirror. But without language, norms, shame, and social punishment, the agent(s) at the top of her brain hierarchy will certainly not serve a social/PR role. She’ll have no ‘face’, no persona. She’ll be an intelligent creature, yes, but not a person.

Neurons Gone Wild (Melting Asphalt)

A real-world example is that of Helen Keller:

Another way to think of this is to imagine what would be in our heads without language. What would be left of you, had you no language with which to express your experience to yourself? I suggest no “you” at all, beyond the immediacy of existence. In this respect, it is instructive to recall Helen Keller’s words in her essay Before the Soul Dawn:

“Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness.”

“I did not know that I knew aught, or that I lived or acted or desired. I had neither will nor intellect. I was carried along to objects and acts by a certain blind natural impetus. I had a mind which caused me to feel anger, satisfaction, desire. These two facts led those about me to suppose that I willed and thought. I can remember all this, not because I knew that it was so, but because I have tactual memory. It enables me to remember that I never contracted my forehead in the act of thinking. I never viewed anything beforehand or chose it. I also recall tactually the fact that never in a start of the body or a heart-beat did I feel that I loved or cared for anything. My inner life, then, was a blank without past, present, or future, without hope or anticipation, without wonder or joy or faith.”

And her awakening upon beginning to know language, when she first appreciated the relationship between a finger-movement against her palm and the idea of ‘water’:

“That word startled my soul, and it awoke, full of the spirit of the morning, full of joyous, exultant song. Until that day my mind had been like a darkened chamber, waiting for words to enter and light the lamp, which is thought.”

(As an aside, notice here the striking contrast between the non-world of conscious unconsciousness first described and the bounding, fulsome world of metaphor that springs forth in that final paragraph).

Julian Jaynes and the Analog “I” (Science Philosophy Chat Forums)

In this way, the “self” takes on a structure that depends on (and reflects) the environment it was raised in. Perhaps auditory hallucinations and split personalities are something like vestigial behaviors such as goosebumps, or the palmar grasp reflex, that were part of our brain’s deep evolution. Their manifestation (or lack thereof) depends on the particular environment, genetics, and certain complex personality dispositions.

This presents tantalizing connections with the work of a long-forgotten Soviet psychologist named Lev Vygotsky. His work was eventually suppressed, and remained forgotten in the West until the 1980s, according to the Aeon article quoted below. Therefore, Jaynes may not have heard of it. But one wonders whether he could have incorporated Vygotsky’s ideas on the inner voice being a product of the environment into his research:

Lev Vygotsky…said the human mind was shaped by social activity and culture, beginning in childhood. The self, he hypothesised, was forged in what he called the ‘zone of proximal development’, the cognitive territory just beyond reach and impossible to tackle without some help. Children build learning partnerships with adults to master a skill in the zone, said Vygotsky, then go off on their own, speaking aloud to replace the voice of the adult, now gone from the scene. As mastery increases, this ‘self-talk’ becomes internalised and then increasingly muted until it is mostly silent – still part of the ongoing dialogue with oneself, but more intimate and no longer pronounced to the world. This voice – at first uttered aloud but finally only internal – was, from Vygotsky’s perspective, the engine of development and consciousness itself.

The Inner Voice (Aeon)

So is the “integrated self,” with its inner voice, simply a bunch of neurons firing in the brain, or is it a product of particular environmental circumstances? And did it emerge as the dominant mental paradigm fairly recently in recorded history, perhaps as recently as the Bronze Age? And, prior to that, was our inner voice considered a numinous experience by ancient peoples, one that they related to the only way they could (because of theory of mind)–as another sort of living being (daemons, manes, spirits, angels, jinn, elves, and so forth)?

We’ll be considering that next time. But, before that, we need to consider what we talk about when we talk about consciousness.

Snow Day

I came across an older episode of Tangentially Speaking on my computer, so I’ll lazily recycle Chris Ryan’s words:

[8:16] “Thank you to all of you who wrote to me expressing your opinions and your encouragement and your hesitations and everything else after the intro to the last episode where I talked about the conundrum and the conflicts that I’m feeling about this project. It is so cool, really. Really, it’s so cool for people to be writing to me from, you know, dropping out of the sky and expressing your support and your concern. I feel like I’ve got so many friends that I haven’t met, and that’s a wonderful, beautiful feeling to have. So I really–thank you for that.”

And I echo his sentiments. Thank you for all your letters of support and encouragement.

We’ve just been hit with another snowstorm; this one had already dumped well over a foot by the time I walked out my door. I’m at Hi-Fi Cafe right now, but I have a long afternoon of shoveling ahead of me when I get home, and I somehow have to figure out how to shovel out my mom’s house as well. On a lighter note, I’m apparently officially their most dedicated customer:

At the moment I’m just enjoying the time off and not having to commute across town in blizzard conditions. I’ve really got to get out of here. Some of you who wrote to me are Milwaukeeans/ex-Milwaukeeans/Wisconsinites, etc. Wherever you are, take care. Honestly, I don’t know how you (we) do it anymore.

My internet is now restored, so I’ll hopefully have time to respond to all of you over the rest of the week. Before I do anything, though, I’ve got to close out my mother’s estate, which is still a ton of work, even over a year later. It also means selling the house. I have no idea how to do that, but I want to do it as economically as possible, since I’ll probably have to live on the proceeds. Right now I’m looking at For Sale by Owner. But with the epic weather we’ve been experiencing (which seems to happen every year now), it will most likely be spring before I can realistically even think of hitting the road to anywhere. (When I say ‘spring,’ I mean on the calendar; Wisconsin has no true spring. It goes from cloudy and 30-40 degrees to 70s and sunny sometime around late May/early June. Those of you who live here know what I’m talking about.)

I’ve got some thoughts loosely based on a post I wrote on Ran Prieur’s subreddit, which I see he has addressed on his site. I should get that up soon. Incidentally, would anybody be interested in an r/hipcriminals subreddit for discussion of posts?

I’ve also seen a couple of articles on the 200th birthday of John Ruskin, whose thoughts, as I learn more about them, echo my own in many ways. So here’s a quote that’s apt for clearing out an estate. Turns out, getting rid of everything you own is surprisingly *hard*:

“Every increased possession loads us with new weariness.”

Happy 200th birthday, John Ruskin (Lloyd Alter, Treehugger)

Was Ruskin the most important man of the last 200 years? (BBC)

Apparently, the new Marie Kondo television program has got people saying goodbye to their possessions and giving them away to thrift stores en masse. This is leading to a massive glut, which I’ve definitely noticed myself. Consignment stores are throwing in the towel left and right, often going out of business entirely. Antique stores are flooded and have signs outside stating that they won’t buy anything from anyone, ever. Even thrift stores are becoming reluctant to take more stuff. The antiques my mom collected are now practically worthless and tough to get rid of. Not great for me right now, but great for shifting social mores away from overconsumption.

Why are antiques now so cheap? (Marginal Revolution)

Relatedly, this comment thread on Naked Capitalism:

This article reminded me of another trend but that is more long term. We had to move my mother out of her unit not that long ago as she was too old to stay there and had already broken her hip and had to wait until somebody checked on her to find her.
We had to get rid of most of her stuff as she could not take it to the nursing home she was moving to. A lot of the smaller goods and trinkets we took to charity shops and I saw how the shelves were almost overflowing with such good quality things. And I mean good quality stuff.
It then occurred to me that nearly all her generation was either passing away or downgrading or moving to a retirement home to live. As these people had to downsize in any case, or had family having to get rid of their things, a lot of this stuff was going to such charity shops which explained possibly why there was so much stuff there. As the baby-boomers age even more, I would expect the tempo to increase here.
The very same mechanism is driving the collectibles market in a downwards spiral of too much inventory coming on the market as baby boomers and their parents downsize, while millennials could care less about the debris field they leave behind, they aren’t into it.
More and more of that stuff will end up in landfills since fewer and fewer in the generations after the Baby Boomers have the 3-level homes to house all that shit. 2 bedroom apartments if they’re lucky, or 1 small room in a motel. Or homeless. As George Carlin said, “Houses are containers for holding crap,” and millennials won’t be buying those 3-level, crap-holders.
One of my siblings is a major hoarder (real problem), who also sells antiques and used items in a number of different stores in their area (western PA). The three stores where they work are LOADED to the gills, and I regularly hear tales of lots of “looking” but not much buying.
Indeed, it’s true that as the older generation – now mainly the Korean War gang – moves into retirement/nursing homes, they are purging their stuff, and the boomers aren’t far behind.
What is this ongoing trend going to do to Target and other retailers of cheap Chinese crap as well as clothing sales? It’s got to hurt them. Not only are younger people buying less, but now they have all this almost free higher quality stuff available.

Links 2/7/19 (Naked Capitalism)

The things you find in the time capsule:

Yep, once upon a time women were terrified of being too skinny and looked for ways to help them put on weight. Then we discovered putting sugar in everything. Now our magic beans do other things.

Postscript: a lot of you who wrote to me are dealing with various health conditions. I was in a very serious relationship with a woman who had severe fibromyalgia, to the point where she was basically disabled. So I have some idea what it’s like. I’m glad I could provide you with some distraction and food for thought. Take care.

P.P.S.: A few of you mentioned Morris Berman’s writings. Funny enough, not only have I followed Morris Berman’s blog and read a number of his books, he was actually one of the first commenters on this very blog! All the way back in 2011, shortly after I first started working on the hipCrime Vocab, I wrote a 9-part series called Is Japan the Future? and posted a link to it on his blog. Berman was actually working on a book about Japan at the time (which has since been published), and wrote to me encouraging me to publish what I wrote in book form. It was very kind of him to do so, especially given that I’m essentially nobody. I know he’s since moved to Mexico and seems to like it there.

Also, some people suggested teaching English abroad. I’ve actually contemplated that idea before. A few years ago I visited WESLI in Madison. Anyone have any experience/thoughts on their program? Thanks.

My Life as a Statistic

“The life of a man is a struggle for existence with the certainty of defeat.” -Arthur Schopenhauer.

This will be a very, very hard and very personal post to write, so you may want to skip this one. It’s also going to be pretty long. Okay, you’ve been warned.

Why is it so hard?

I’ve alluded a few times to how awful the past few years have been for me, so I thought I might as well briefly share what has happened to me. Kind of a last will and testament, if you will.

Grandparents’ wedding day photo

Around 2015, as I have mentioned before, after spending more than ten years at the same architectural firm, I was given a “poison pill” job. After that I started getting called into HR repeatedly and given vague warnings about my performance and “attitude” (although very little in the way of specifics). I was placed on a humiliating “performance plan,” where I would have to report weekly and grovel before slimy sociopaths who made my skin crawl. I just couldn’t take it anymore, so I quit.

Well, you know, when you play the game of thrones, you win or you die.

That was almost exactly the same time my mom was first diagnosed with stage 4 cancer. Now, I have no other family. My absentee deadbeat dad passed away a long time ago. I have no brothers or sisters. My mother’s brother lives in New Jersey and could not care less whether his godson were alive or dead (we’ve communicated only briefly my whole life). He and his wife (a university math professor) are retired and have no children. They spend most of their time sailing on their 40-foot yacht. He’s the stereotypical Baby Boomer to a T: recipient of a nearly free university education that could easily be paid for with a summer job (1960s), graduated into a red-hot job market (early 1970s), and invested in East Coast real estate (late ’70s-early ’80s) before coastal housing prices went stratospheric. He has openly stated that the only thing that determines his vote is whether a candidate will lower his personal taxes; nothing else matters.

Mom and uncle as children

I also have one sole cousin, who lives in Ankeny, Iowa, but she will come into our story a bit later. Growing up, I had almost no contact with my father’s side of the family, except for occasional awkward and tense visits to La Crosse with my dad.

I had nowhere to go at that time. With only a four-year degree and no way to just spend 2-3 years with no income whatsoever, graduate school was out of the question. Besides, it takes years just to get accepted. Plus, I wasn’t particularly interested in running up at least $70,000 more of nondischargeable debt in my 40’s—I’d be paying it back for practically the rest of my life. As I’ve mentioned before, it is de facto impossible to become an architect in this country without significant parental wealth. The same is becoming true for most of the “skilled” professions—you know, the ones we are all supposed to acquire right away with our own resources so as to not get replaced by machines and AI.

I could tell stories about the staggering wealth of some of my coworkers and their families over the years at various firms. (Okay, just one: one of our school interns’ dads paid for a week-long vacation to Hawaii because, hey, why not?) Scholarships aren’t available for people like me, and part-time and night school aren’t options. I need to support myself constantly, or I’m homeless or dead. Because, freedom, or something?

Grandpa T.’s mother and father – Otto and Emilie. Grandpa is dressed in customary girl’s clothes of the time in the lower left.

Another issue is that I could see just how utterly miserable the people above me were. You know the stereotype—working crushing hours, constantly putting out fires, sending emails at 10:00 at night, unable to take a vacation for fear of being buried upon returning to work, “laptop on the beach” syndrome, never being off the clock, etc. Architecture project management is truly one of the most miserable and thankless jobs on the planet (especially considering the pay). This paragraph from an article struck me as familiar (just replace ‘writing staff’ with ‘architectural staff’, and “content to harvest” with “work to do”):

At the contemporary office (or “co-working space”), you are your own taskmaster. You and your colleagues are not members of a collective, but a competitive market. I found this out slowly, after mistakenly assuming I was not under scrutiny. Yet no matter how hard I tried, how much earlier I came in, at least half the writing staff would be there before me. There was too much content to harvest for anyone to get away with sleeping in. Even more alarmingly, no matter how late I forced myself to stay, I was never the last one to leave.

How to Suck at Business Without Really Trying (Popula)

Mom at 6 weeks old with grandpa and grandma. My grandpa died of a stroke long before I was born; I never met him.

And so I asked myself, do I really want that? For what? It’s not like I had kids to put through college, after all. I did not want to insert the final brick into the wall of my cubicle prison. Unfortunately, like most places today, it had become an “up-or-out” type of firm. You were either moving up, or you were moving out. And I clearly didn’t have what it took to move up, even had I wanted to. But I needed to do something.

Now, I was always fairly good with computers and took a large number of comp-sci courses at university, which I enjoyed, so I signed up for a newly opened coding bootcamp in my city in the hopes of transitioning to a job with more demand and geographical mobility. I’m sure you’ve seen these springing up all over the place, especially if you live in a big city in the U.S. I was able to take the same bus I had taken to work to get there.

Great-grandma Priebe. This tough old German bird pumped out ten kids, including my grandmother!

I’m not sure what to make of it. You could call it a ‘scam’, and maybe you’re right. But I had literally no other options, and nowhere else to go. Given my mother’s recent diagnosis, the timing was far from optimal, to put it mildly. I paid $15,000 out of pocket. I could sure use that money back now.

Mom age 7, I think.

People came from very different backgrounds. A good number were fairly recently out of college and had majored in something “impractical” (history, marketing, etc.). These were usually the smartest cookies. Some had little formal schooling–perhaps some community college or trade school–and were the stereotypical “living in mom’s basement playing computer games” folks. Some were older and stuck in dead-end professions they didn’t like, such as sales, insurance, or marketing. Some were bartenders or sandwich artists. Suffice it to say, I was one of the few who had had an entire other “career” in an unrelated profession, and I was the only licensed architect.

Grandma and grandpa posing with classic cars. Old school cool.

That ran from October 2016 to February 2017, five days a week. After finishing, I hit the job trail with my slim GitHub account and a few web programs written in C# and JavaScript.

Grandma with a dog apparently called “Rags”

At the time, my mom was doing fairly well. She was being treated with drugs instead of chemotherapy, and was responding well to treatment. Her wide circle of friends was able to take her to her doctor’s appointments. My mom, being the stereotypical mom, did not want to trouble me with her problems. She was still physically active and betrayed no obvious outward signs of being ill. She did need to have stents in her kidneys replaced every three months to keep them functioning. She also had routine cataract surgery.

Grandpa’s brother Herbert fought in Europe during the War. I was told he was at the Battle of the Bulge.

Now, there were, of course, some people from my graduating “class” who ended up with jobs. So I don’t want to say it didn’t work out for everyone. But I was not so lucky.

It seemed the demand for entry-level programmers was highly overstated, at least in southeastern Wisconsin. I endured a few execrable job interviews, which were few and far between. A few times, they would even force you to take quizzes. The businesses were usually located far, far out in the distant cornfields surrounding the city, in nondescript office parks. Many of these companies seemed like awful places to work, based on the vibe. Office Space hardly seemed a parody. I won’t bore you with the horror stories.

Mom in the center, looking ready to star in Mad Men. This is where I get my height from.

But, of course, this is America, and anyone who doesn’t have a job just didn’t try hard enough and has only themselves to blame. Right?

…Or they’re lazy or hooked on drugs, right? I’ve been working nonstop since I was 16 years old, with hardly a vacation in all that time. And Ibuprofen and the occasional glass of wine is about as hard as my drug use gets (save that single Ayahuasca trip in Topanga).

Anyway, I was eventually contacted by some technical staffing firms. This was 2017, and the job freeze was starting to thaw a little. I endured another round of excruciating and humiliating job interviews at a number of firms. The begging for jobs and the aggressive interrogation one receives in any interview make me a little reluctant to embrace the “plenty of jobs” narrative. I constantly had to think of responses to the inevitable grilling about why someone with an architect’s license wanted to be a programmer. If I interviewed for architecture jobs, however, they wanted to know why I quit my last firm, and what the hell I had been doing for the past six months. In any interview, the “default” is: we don’t want you.

I just couldn’t win, it seemed.

My first portrait. Pretty much the exact moment when life started going downhill for me. You can see how thrilled I already am at the supposed “gift” of life.

After a long period of no work (remember, I had quit, and so received no unemployment benefits during this period, and my money was running out), I finally found a job in the Third Ward, just down the street from my old firm (literally). I can’t say the name, of course. But my role was essentially that of a Revit/BIM specialist. This particular company had a manufacturing plant where they fabricated interior fixtures for restaurants. I won’t mention their biggest client, but here’s a hint—think golden arches. They had a large library of 3D Revit objects, which they used to do the layouts. My job was to maintain and extend this library and the associated templates.

Now, I was a temp, which means I could be let go at any time. I had no holiday pay, no vacation time, and no health insurance, despite putting in the same amount of hours and working as hard as anyone else. Fun fact: my supervisor, a grizzled, taciturn fellow in his fifties who had been in the same job since forever, guzzled a dozen Mountain Dews every single morning before lunch (I’m being absolutely dead serious here—I’m not exaggerating!). He had a mountain of tin cans beside his desk.

My dad’s parents inspecting their grandson. My grandmother died when I was 2; I never knew her. My grandfather later remarried. I never had much contact with them.

Nevertheless, I did have a job and a steady paycheck at least. I didn’t really like the job, but it paid the bills. I tried to rebuild my shattered life (and my depleted savings account). I even found myself in a pretty happy romantic relationship starting in August of that year.

My mother’s condition worsened significantly throughout 2017. She had been doing well. Her cancer was in remission. But cancer has a way of coming back. She started experiencing headaches. Then came the double vision. Then trouble walking. This severely limited her activity level.

Now, my mom had always been very physically active, and had hardly seen a doctor her whole life before this. Her inactivity was really torturing her psychologically. She complained constantly that all she could do was sit in a chair all day (which some old people actually seem to like). She actually enjoyed gardening and doing yardwork, and was very unhappy about not being physically able to do those things. She was not into reading or any intellectual pursuits, so a lot of passive activities were out. For her, mowing the lawn and pulling the weeds was a privilege and not a chore (as it is for me.)

The doctors could not determine the cause of the double vision. Initially, they thought that the cancer had spread to the brain and that was the cause. But subsequent MRI scans showed no signs of cancer in the brain. Then they thought maybe it was pressure from the bones of the skull on the optic nerve. When it comes right down to it, despite our impressive technology, it’s striking how little doctors know sometimes.

Every weekend I was splitting my time between taking care of her and the house and trying to spend time with my then girlfriend.

“Nemo enim est tam senex qui se annum non putet posse vivere”

Then came December of 2017.

The first of the many terrible, awful, miserable, horrible Christmases to come.

After a trip back to her home in Savannah, Georgia over Thanksgiving, my girlfriend ended our relationship. At least she gave me a reason (unlike most others). She wanted a family and a typical American lifestyle—a mortgage in the suburbs, a child, a minivan, Sunday school and church, and all the rest. She, too, was an only child, and her biological clock was not just ticking, but ringing. I never made it a secret in our relationship that I had no desire for any of those things—for me, life is about simply surviving; it must be. So it made sense, but it still hurt. I guess we probably never really belonged together, anyway. Still, it had been nice to have someone to talk to.

A few days later, I got cut loose from my job. The person who had held the position before me decided to come back to his old job, and they liked him a hell of a lot more than they did me. And so I was unemployed once again in the dead of winter, just in time for the Holidays.

By this time, my mom had become much more severely ill. The drug treatments were no longer working, and her doctor decided to discontinue them. Mom was very weak and could barely get out of bed. She was in severe pain. She couldn’t cook, so I had to bring her food. We were looking into in-home hospice care.

Mom and grandma. Being a single mother, my grandma did much of the work of raising me. They both loved flowers.

Now, you hear many stories about how fucked up the American health care system is, but only when you are deep in the weeds do you realize how bad it really is. I’ll try to summarize this succinctly.

Thanks to the “opioid crisis” legislation signed by Mr. Trump, doctors can no longer prescribe opioids without the patient physically coming in to see them. Yet my mother was physically unable to leave the house! There was no way she could visit her doctor in person. But she needed her painkillers, without which she would be in constant, severe pain.

The only “solution” was to sign up for in-home hospice care, which was covered by Medicaid. Nurses would come into her home and write the necessary prescriptions. But there was a snag. My mom had to have the stents in her kidneys replaced surgically every three months, or her kidneys would stop functioning. If her kidneys failed, of course, she would die. Yet if we agreed to in-home hospice care, Medicaid would no longer pay for the requisite kidney surgery!

I remember our last Christmas together. I had brought over a film starring her favorite actor, George Clooney: Hail, Caesar! She told me it was the first time she had laughed in as long as she could remember.

The day after Christmas, December 26th, mom couldn’t get out of bed. Fortunately, I had stayed overnight. She told me to call an ambulance. We went to the hospital–West Allis Memorial Hospital, coincidentally the place where I was born. After several hours, she was released and moved to a hospice facility in Wauwatosa.

After a few days, she was upgraded from “acute care,” meaning that she was technically not imminently dying, by the medical definition at least. This meant she had to leave the hospice (which was only for the terminally ill), despite being unable to get out of bed. It also meant that Medicare was no longer paying for her stay, and so the hospice tab started running: approximately $9,000 a month, or roughly $300 per day.

Probably the last time I genuinely smiled, captured on Kodak film. Note the Young and the Restless on the Teevee.

At times like these, survival mode kicks in. Without any job or income, I was now my mother’s full-time representative/guardian. My ‘job’ was to be in the hospice and run around taking care of all the legal, financial and medical issues (with no help whatsoever). One particular memory that sticks with me was sitting in the dead of winter in an empty hospital cafeteria watching a large hawk outside in the frigid parking lot snatch a small bird, and take the still living and struggling animal to the top of a lamppost to watch it slowly suffer and die. I couldn’t help but take it as an omen.

Obit anus, abit onus

Now here’s the catch with elder care. As a social worker confided to me, when it comes to long-term health care in this country, you have to be either extremely rich or extremely poor, and nothing in between.

This is because MEDICARE DOES NOT PAY FOR LONG-TERM CARE. Read that fact again, and let it sink in. Yes, despite it being a health care program designed for the elderly, and with people living with acute and chronic diseases where they can’t take care of themselves for years at a time, Medicare does not pay a dime towards extended round-the-clock nursing care. Keep in mind, even the average long-term care facility in my relatively cheap part of the country costs in the neighborhood of $100,000 per year. As you can imagine, an extended illness where an elderly person cannot take care of himself or herself for whatever reason (dementia, immobility, incontinence, etc.) could easily run into the millions of dollars. Very few people have that kind of money saved up and available. Most people have their “wealth” stored in their house, as my mom did.

Now, here’s another thing—a long term care facility will not take a patient unless they know exactly where the money is coming from upfront. That is, my mother could not be moved from the hospice to anywhere else unless there was a dedicated income stream upfront. And so she languished.

Bizarrely, Medicaid, the U.S. government’s health care program for the poor and destitute regardless of age, DOES cover long-term round-the-clock care. However, you must be utterly destitute. And when I say that, I mean DESTITUTE–you must have less than $1,000 to your name, or you do not qualify. Even my “poor” mom, who probably never made more than $35,000 her entire life, was considered too “rich” to qualify.

So the hospice financial advisors told me that I needed to do a “spenddown” to make my mom “poor enough” to qualify for Medicaid. From my understanding, this is an extremely common and typical occurrence, as most people cannot pay such staggering costs upfront. When one has too much money in any savings or checking account (i.e. is not utterly destitute), that money needs to be gotten rid of, and quickly. Retirement assets are not counted, nor is housing equity, but there is a catch, which you will soon see.

Modeling 1970s fashions.

I used the money to pay off all the credit cards, and had the bank put the remaining money towards paying off what was left of the mortgage.

Yes, despite inheriting the house from my grandmother, my uncle insisted he get his share of it, despite already owning multiple properties on the East Coast. My mother was just a secretary and single mom with a kid to pay for. Plus, she took care of my grandmother when she was sick, since my uncle lived on the East Coast and only flew in on occasion. None of it mattered. He insisted he get his half of the house, forcing my mom to take out a mortgage to pay him off. That mortgage is still on the books.

On the East Coast with my uncle and his ex-wife looking thrilled to babysit their idiot nephew

The other thing about Medicaid is that it’s really more of a loan than any true kind of socialized insurance. What that means is that Medicaid has a number of aggressive “clawbacks” it uses to reclaim every red cent paid by the State (which administers it) for the recipient’s care. So, for example, even though one’s house isn’t included as a liquid asset for the purpose of the “spenddown,” if and when the house is ever sold, the State (Wisconsin) will come after every penny it is owed in Medicaid expenses. The same is true for any of the estate’s assets—Social Security, retirement funds, etc. If any part of the estate becomes liquid, it will be immediately seized by the courts. As I mentioned, just a single year’s stay in an assisted care facility can easily cost in excess of $100,000. So a two-year stay would cost far, far in excess of what most houses in the country are worth, including my mom’s. In other words, the estate would be utterly bankrupt (insolvent).

So if you think that the wealth amassed by the Baby Boomer generation will be passed down to the next generation (a claim I hear occasionally), and that this will cure our economic ills, I can assure you from personal experience that you are probably mistaken. Given how long the elderly are living with chronic conditions nowadays, and how common that is, I suspect that in actuality a huge portion of the Baby Boomers’ wealth will end up in the coffers of the State/medical insurance complex, preventing the lion’s share of that wealth from ever being passed down to future generations—generations that are already far poorer than their parents and grandparents and dealing with massive levels of debt. The macroeconomic implications of this are not being discussed anywhere, as far as I can tell. (Have you heard any of this talked about in any media outlet??)

Lite-Brite, Lite-Brite, the magical things you can do with light! Note the awful interior decoration.

Anyway, back to our story.

While I was running ragged trying to take care of all this stuff in the dead of winter, Mom was now heavily drugged most of the time to ease the agonizing pain of the cancer that had now spread to her bones. Panicked and scared, she became paranoid and accused the nurses of trying to kill her with drugs. She accused them of making her weaker, but of course it was the disease. I received calls from exasperated nurses at all hours of the day and night because she refused to take her medications. I had to try and reassure her as best I could, even though she was not capable of thinking rationally at this point. Often mom wanted me to stay overnight at the hospice, but I really didn’t want to do that, as I would just be in the caregivers’ way. And besides, I needed the rest.

(Parenthetically, in order to kill time in the hospice, I made use of their library. There was one book in particular that offered some comfort, and I can highly recommend it, whether or not you are facing the imminent death of a loved one. It’s called In the Face of Death by Peter Noll.)

I did spend most days in the hospice, though, to the point where it became like a second home to me. After all, I had no job and nowhere else to go! With no other friends or family to rely on, or even to tell my problems to, I had to take care of everything myself. I won’t bore you with all the pettifogging details of legal and financial struggles during this time, though there were plenty of them on a literal daily basis. Honestly, the entire time is now a blur. I was exhausted, gallivanting all over gray and frigid southeastern Wisconsin from dawn till dusk. I informed several of my mom’s friends who had helped take care of her, and they came out to visit occasionally. How much longer did she have? Doctors couldn’t tell me. It could be years or weeks. Or anything in between.

By the time I had finally completed the exhausting task of this “spenddown,” and seen to all the other related legal and financial matters, it was nearing the end of January. But by then it was too late. I could see that she was getting weaker by the day. The doctors agreed, and upgraded her condition to “officially” terminal, meaning that a hospice was now the appropriate place for her. There would be no move. Medicare kicked back in (it does cover hospice care). At the Tuesday morning meeting with the doctor and nurses, they informed me that it was now a matter of days, perhaps a week at the most.

If you don’t know what happens when people are on the very brink of death, here it is—they stop eating, and they often lose the ability to speak. Indeed, there are no poetic “last words” in modern dying: there are often no words at all. The nurses assured me that she could still hear me. Who knows? I said what reassurance I could. But what can you say at a time like that?

On January 31, 2018, my mother’s life finally came to an end, just over one month after she went into the hospice. They ask if you want to go into the room and spend some time with the now lifeless body. At this point, my mom had shrunk to just skin and bones. One thing I’ll never forget is the gray pallor of the skin. The unearthly ashen color still haunts me to this day. I didn’t have any desire to look at this empty, withered shell; this grotesque caricature of a formerly living being. Finally, at long last, the suffering was over. Relief, and then guilt at one’s own relief, seems to be the universal response to such events.

Now you know why I was so excited about this year’s World Series run.

Since she had been initially diagnosed several years earlier, my mother had made the necessary arrangements for her final interment—cremation, and a slot in the mausoleum at Woodlawn Cemetery here on the south side of Milwaukee. My grandparents are interred elsewhere in this park. I’m sure you’re aware of just how expensive it is to die in modern America—even an urn costs a fortune! I paid for it out of pocket (despite being unemployed). My mom was adamant that she wanted the simplest, plainest, most unassuming memorial possible, and absolutely no formal funeral, either in a church or a funeral home. If you’ve been fortunate enough to have never dealt with the funeral industry, I can inform you that they will do their absolute best to upsell you everything they possibly can (with the requisite sympathy, of course). After all, you’re extremely emotionally vulnerable at this point in your life, and they know it. So you are an easy mark. Even death is not safe from hypercapitalism in America. Luckily, my mom was a “no frills” type of person. Besides, our entire family was pretty much gone by that point.

The day before my mom died I had a job interview with another tech placement firm. I interviewed at a few places. One of them offered me a job. They wanted me to start right away. It would be on Valentine’s Day, just after my mom’s memorial service.

Around this time, I decided to visit a psychiatrist. While my mom was alive, I knew I couldn’t kill myself because she depended on me for so much. It would devastate her. But now, hell, why not? Is it normal to spend your days constantly thinking about how you’d “rather not be here?” Probably not. The doctor put me on antidepressants. While some of the acute urge to die faded, I still can’t say that I’m particularly happy. Every day, I wish I didn’t have to get up and face another day. I still rather wish I weren’t here.

High school yearbook photo–Bay View High School, ’61 I think? Richie, Potsie and Ralph were a few classes ahead.

On a chilly February day in 2018, about twenty of mom’s friends and coworkers from the Milwaukee County Transit System—where she had worked in a tiny cubicle for nearly 50 years—gathered outside in the cemetery to pay their final respects. My now ex-girlfriend was kind enough to come out for support. People said some kind words. I gave a short speech, which I can’t remember now but I hope was sufficient for the occasion. And then it was over. All except for the cleanup, of course.

It was at this time I was contacted by my dad’s cousin, who lives in the Minneapolis area. My cousin Joan had been diagnosed with stage 4 cancer. It was lung cancer, and apparently, it is very common for lung cancer to spread to the brain, which hers had. The prognosis was not great. She was 60.

Requiem aeternam dona eis, Domine

“In early youth, as we contemplate our coming life, we are like children in a theatre before the curtain is raised, sitting there in high spirits and eagerly waiting for the play to begin. It is a blessing that we do not know what is really going to happen. Could we foresee it, there are times when children might seem like innocent prisoners, condemned, not to death, but to life, and as yet all unconscious of what their sentence means.”
Arthur Schopenhauer, “On the Sufferings of the World”

My parents divorced, I think in 1980.

One of the very few times my dad was even around. Christmas, year unknown.

I remember the day I finally severed all contact with my father. I was at a friend’s house playing an RPG (yes I was/am a nerd). I received a panicked call from my mom. My dad was in jail again—he had crashed a motorcycle and claimed that I was the driver (which I was not, of course). That was the kind of shit that happened on a pretty regular basis. Dad had spent most of his time in jails and hospitals, and the spare bedrooms of his sympathetic friends.

What went wrong? Most people simply assume it was alcoholism, because this is Wisconsin, after all. But I never saw my dad drink. No, whatever was wrong, it was something else—mental illness, chronic lead poisoning, schizophrenia, who the hell knows? It certainly wasn’t raw intelligence—he was a member of MENSA, a mathematics whiz, fluent in two languages, and had several engineering degrees from MSOE (which is why he had moved here in the first place). No, IQ does not always equal success. Perhaps it was the old genius/insanity dichotomy.

Probably the saddest and most heartbreaking thing I found while cleaning out the house was my mom’s diary from 1979, detailing her disintegrating marriage. I found it among the piles and piles of stuff long hidden away. It brought back a lot of repressed memories for me. Here are some selected entries. Read it and weep, as they say:

Friday, May 18th:
Don late for work @ WOKY and angry because wrong part for motorcycle. I took car to get new belts – needs alternator. Mom’s ring not in at Gimbels. Shopped for groceries.

Monday, May 21st:
Worked Claim. Chad home from school. Don angry again that he had to babysit Chad all day and furiously violent. He supposedly had lots to do to prepare for his trip on Wed. – in other words, Chad was a burden to him.

Tuesday, May 22nd:
Worked Claim. Sent Chad to school sick as Don wouldn’t babysit. Was so worried all day. Came home – Don argued some more about how Chad won’t leave him alone. Don wanted to watch TV & yelled and yelled at Chad.

Wednesday, May 23rd:
I had a good day. Shopped in A.M. and got coat for wedding. Don played with motorcycle till 2PM – then left for LAX – or wherever? He gave me card and awful geranium for our anniversary. “Happy anniversary, Irene.”

Thursday, May 24th:
Had a good day till noon. Got 20% off on coat I bought Wed. and bought another coat–London Fog–saved $60.00 on both coats. When picking up Chad – teacher angry with me and Chad because Chad cries; won’t conform. I was so upset I cried, and he came out of school in tears. Mrs. W. is unreasonable. Called Carol, mom & K. said not to trust her.

Friday May 25th:
Had a good day! Washed. Chad did as his teacher told him but had problems with a magic trick which made him cry. Mrs. W. hates crying. I told Chad to forget school and we’d enjoy 3 days. Only 3 more days of school – thank goodness! If next year follows this year in school, I’ll leave the church and school.

Thursday, May 31st:
Chad’s last day of school. Chad’s graduation at night from Kindergarten – I was nervous. He was only one not to kiss teacher. Don came to church 10 minutes late and put motorcycle on sidewalk. We left early.

Friday, June 8th:
I asked Don if he’s planning another trip – he’s been so nice. He got angry and said I was being sarcastic but didn’t deny anything except he won’t change oil again in car. Everything has a $ sign on it – it’s so disgusting. Why can’t he just be nice to be nice – instead of difficult?

Saturday, June 9th:
Wrote thank-you’s and L & I. Joyce called. Ken agreed to divorce settlement. Final 6-14. I’m so happy everything went well for her and she’s happy too. I wish I had the courage she has!

Some of the last entries recorded in the diary:

Saturday June 23:
I left for 2 hours with K. Don beat Chad with belt. I am so sad I left Chad knowing Don was in a bad mood. If he ever beats Chad again I’ll leave. When I come home at 10:30 he left to go to his boss’s wedding, which he never told me about.

Sunday June 24:
It’s pretty bad when I can’t leave Chad with his father for fear his father will beat him. How much longer can I go on?

So it goes. That ought to give you a general flavor of my childhood. But I’m sure it’s all my fault somehow…

My dad was semi-homeless and working as a motorcycle mechanic here in Milwaukee when he collapsed one day, and was rushed to a hospital. He had no health insurance, so by this time his entire body was riddled with incurable lung cancer. He went into a coma almost immediately as his body shut down. He lasted not even a week before succumbing. Thankfully, because he was a veteran of the U.S. Army, the Veterans Administration took care of all the medical expenses and burial. He is buried in the Southeastern Wisconsin military cemetery in Union Grove, Wisconsin. I have never visited.

Here’s the wonderful inheritance I received:

A bit less than Donald Trump’s “small loan” of a million dollars, n’est-ce pas? It also makes it kind of hard to follow Mitt Romney’s advice. I guess I’ll just have to bootstrap harder.

I remember another family member a few years back who, upon being diagnosed with cancer, decided not to seek any treatment. Like an unusually large number of my family members, she had no spouse or children. She worked a variety of jobs, including as a car salesman (I once bought a car from her) and a real estate agent (her final job). After I bought my house she cut off all contact. Mom thought it was because I didn’t buy it from her (circumstances didn’t allow it). The reason she gave for making no effort to fight the cancer was that “she didn’t want to outlive her money.” She was in her late fifties, I think. My family in a nutshell. So it goes.

The following spring, a cardinal repeatedly rapped on my window every morning as I was getting ready for work. I later found out that cardinals are supposedly signs of departed loved ones visiting you, according to Native American folklore, or something. Of course, I’m sure he just saw his reflection in my window and thought it was a rival male. I even managed to capture his agitation on video:


The next week I started my job at the new architecture firm—a “real” job with a salary and benefits. Ironically, it was not far from where the hospice was located. It seemed like a (sort of) happy ending, but as you’ll soon see it was anything but. For me, there are no happy endings.

You know the drill when you get hired—in addition to the W-2 forms and proof of citizenship, they require you to designate an emergency contact and a beneficiary for the CYA insurance policy they take out on you. I had to assure them that there was literally no one to identify in either of those roles. I had to really fight to convince them that, yes, I have no other family or friends capable of that role. This, of course, was unheard of to them. I guess I really am special.

Shortly thereafter, my dad’s sister Jane (my cousin’s mother) went into the hospital, and died within a few days. My cousin and her family had to arrange the funeral and deal with the accompanying mess in the midst of beginning her aggressive brain cancer treatments.

My only cousin Joan, old school cool. Courtesy Facebook. Hopefully she won’t mind.

My cousin was not even raised by her mother, in fact, but rather her (our) grandparents. My dad’s sister had spent much of her wasted youth getting drunk in bars and shacking up with various assorted violent and alcoholic criminals (my cousin’s father is, in fact, a convicted murderer. She never met him). Abortion was illegal back then, and besides, that side of the family is very Catholic. And so, Joan only returned to Des Moines from LaCrosse when she was 18. She jokes that she was “raised by wolves,” or else that she raised herself, and that’s essentially true. By this time, her mother was repeatedly strung out on various drugs. For most of her life, she has taken care of her mother rather than the other way around.

My cousin’s cancer continued to worsen and spread over 2018. Thankfully, unlike me, she has a huge support network. She has two daughters in their thirties from two previous relationships (they have different fathers). Both of them are happily married to husbands with good careers and have several kids of their own. Her husband, Gary, has a large and colorful family, including six kids of his own from his previous marriage. Those kids are a mixed bag–some are rather troubled, others not. His twin daughters and their families are the most stable and consistent visitors. Between the two of them, they have something like twenty grandchildren. Add to that all the various friends and neighbors.

So I guess some small branch of the family will continue. Our surname, however, dies with me.

Despite my dad’s many problems, my father’s side of the family has been far more welcoming and friendly than my mom’s ever was, despite me having hardly any contact with them until I was in my late thirties (maybe that’s because they’re the Catholic side, LoL). Two things I always liked about visiting Joan: She was one of the only people who understood how messed up our family was. There was no need to obfuscate or hide anything, so I could speak openly to someone who could relate. And we shared the same dry, cynical sense of humor.

My ex-girlfriend’s company folded in a very public bankruptcy in 2018 that the employees all knew was coming for some time. Thus, I have a little bit of insider knowledge of the so-called “retail apocalypse” (it was a major retail department store chain headquartered in Milwaukee). After a short job search, a former coworker recommended her for a position at a major Southern retailer in beautiful, sunny Bradenton, Florida. The company paid for her move, and set her up in a spacious corporate condominium with a swimming pool right on the beach, where I’m told the tropical sunsets over the Gulf of Mexico are quite stunning (despite the smelly red tide this year). She really likes it, and it sounds like paradise. So there IS a happy ending after all—just not for me. I’m sure she’ll find someone who has been successful in life with whom she will be able to have the white picket-fence lifestyle she has envisioned for herself in such detail.

Meanwhile, I hoped to visit my cousin, since I didn’t know how much time she had left. But given the fact that her treatments essentially wiped her out for weeks at a time, I had to make sure I wouldn’t get in the way. I was finally able to arrange a visit over Labor Day, 2018. I had arranged it through her husband, who was her caregiver, so it was a surprise. Joan seemed fairly lucid, despite clearly showing outward physical signs of the disease. She told me she was throwing a “half-birthday” party the next month. What else do you do when you don’t know whether or not you will live to see your next birthday? Suffice it to say, my cousin loves life far more than I do.

So I drove back to Iowa the following month, and it was a pretty impressive turnout. This time, however, I could see that she was much more “out of it.” She was now undergoing aggressive radiation treatment for brain cancer. She had poor balance, and had fallen down the stairs a few weeks earlier. The treatment was taking a heavy toll on her, and her entire family. I stayed with my second cousin Kate, and we talked about what she and the family were going through. I shared my own perspective, having just gone through something similar. Fun fact: Kate actually makes a living by blogging! Not so fun fact: her husband, also named Chad, was recently diagnosed with the early stages of MS.

The next day, about 20 of us went out for breakfast at the place where the party had been held the previous afternoon – Fletcher’s restaurant in Ankeny, if you’re ever in the neighborhood. The breakfast buffet is quite impressive.


In May 1846 at the height of the ‘taming of the Wild West’ and gold fever, the intrepid colonists of the Donner Party set out from Little Sandy River in Wyoming on the last stage of a long trek to California and a new life, a journey that had begun in Springfield, Illinois, more than a month before. Several untoward events – disorganisation at the start, some ill-advised routing, and attacks by Indians along the way – conspired to delay the party, which at its height numbered eighty-seven men, women and children. As a result, they reached the Sierra Nevada mountains, the jagged line of snow-covered peaks that barred their way west, much later than they had intended, just as winter began to close in.

Though they struggled on, they ended up trapped in the mountains by snowstorms at an entirely anonymous spot now known as Donner Pass. Here, they tried to sit out the winter. But since they had expected to be through the mountains well before winter set in, they had come unprepared. Their food gave out, and some even gave in to cannibalism. By the time a series of rescue parties arrived from California in February and March the following year, forty-one of the eighty-seven pioneers had died.

What makes these bald statistics interesting is who died and who survived. Disproportionately more people who travelled alone died, while the chances of surviving were much higher among those who had travelled as families. Frail grannies travelling with their families made it, but not the strapping young men travelling alone. It paid to be travelling with kith and kin.

A second example is provided by another of the iconic events in American folklore. When the Mayflower colonists set foot on the American mainland in 1620, they were ill prepared to face the harsh New England winter. They suffered from severe malnutrition, disease and lack of resources, and no fewer than fifty-three of the 103 colonists died in that first winter. But for the intervention and generosity of the local Indians, the colony would have died out completely. Again, mortality was highest among those who came alone, and lowest among those who came as families.

The issue is not so much that families rush around and help each other, though that is certainly true, but rather that there seems to be something enhancing about being with kin. Being surrounded by family somehow makes you more resilient than when you are simply with friends – however much you argue with them. This much is clear from two studies of childhood sickness and mortality, one on the city of Newcastle-upon-Tyne during the 1950s and the other on the Caribbean island of Dominica during the 1980s. In both cases, the amount of childhood illness and mortality experienced by a family was directly correlated with the size of its kinship network. Very young children in big families got sick less often, and were less likely to die. Again, this is not just because there are more people to rush around and do things in large families. Rather, it has something to do with just being in the centre of a web of interconnected relationships. Somehow, it makes you feel more secure and content, and better able to face the vagaries the world conspires to throw at you.

from “How Many Friends Does One Person Need?” by Robin Dunbar. pp. 39-41

Everything That Remains

“she no longer threw out anything, because everything might eventually come in handy: not even the cheese rinds or the foil on chocolates, with which she made silver balls to be sent to missions to ‘free a little black boy.'”
― Primo Levi,
The Periodic Table

Now, here’s another thing about my mother—she was a hoarder. Every nook and cranny was crammed with stuff. My grandmother, being a child of the Great Depression and growing up the child of impoverished immigrants, saved literally everything no matter how trivial, because it might be useful someday, and she passed that trait on to my mother, who never, ever, threw anything out. You would not believe the stuff I found! My mom also had the habit of never throwing any piece of paper away, no matter how trivial or inconsequential. Boxes and boxes were piled high with credit card statements, credit card offers (!), energy and water bills, bank statements, greeting cards, thank-you notes, letters, postcards, correspondence, old newspaper clippings, retail advertisements, coupons, and other scattered miscellany dating back to the 1960’s.

I think this was outlawed by the Geneva Protocol.

Thus, I was burdened with getting rid of several decades of accumulated crud. Words fail in conveying the sheer magnitude of the task that lay before me. In essence, I had two full-time jobs in 2018—one paid, the other not—and every fleeting snatch of free time that year was spent engaged in one job or the other. I feel like this entire year of my life was a write-off. There was no “me” time (save for occasionally writing this blog). Fun and recreation were a distant memory.

I printed up hundreds of flyers and went door-to-door in the neighborhood. The turnout for the few weekends of estate sales in June was pretty good. This was an “old fashioned” Milwaukee neighborhood where people spent their entire lives in the same spot, and it wasn’t at all unusual for neighbors to know each other for over 20 years, help each other out, etc. Many neighbors came over to buy stuff and offer condolences. Some of them had lived there since my grandparents were still alive. Many knew my mother. An empty-nest older couple from across the street took pity on me and helped me organize and run the sales, even donating some tables to help me out.

As people strolled though the house to shop, the reaction was usually the same—pity. They felt sorry at the magnitude of what I had to accomplish by myself and told me horror stories of their own experiences clearing out an elderly relative’s estate. To me, this indicated that I wasn’t just feeling sorry for myself—this really was an unusually gargantuan task. “I sure don’t envy you” was something I heard often in sympathetic tones.

Despite selling a ton of stuff over several exhausting summer weekends, it was still a drop in the proverbial bucket. Every week I filled up the garbage bins to the hilt. I hit up eBay to sell off my old toys that my mom insisted on saving because she thought that they would be worth a lot of money someday. They weren’t worth a lot of money, but some of it was worth something. That’s fortunate, as I really need that money right now.

Just a bit of the swag I sold on eBay. Sorry, it’s already gone!

Let me tell you that once you’ve gone through this, you will realize just how empty buying and consumerism really is. Minimalism and decluttering will become a religious experience. “You can’t take it with you,” indeed.

Interestingly, if you’ve ever read the book by The Minimalists, Everything That Remains, you’ll know its author had a similar experience. While cleaning out his mother’s possessions after her death, he realized just how utterly futile it is to spend a lifetime accumulating stuff that all gets thrown out anyway. Like me, he was the child of a single mother growing up poor in the Rust Belt. Unlike me, he actually once had a wife and a successful and lucrative career. I still think Fight Club put it best: “The things you own end up owning you.”

I had to engage the services of a local attorney to arrange the probate proceedings. Even though my mother had a will and I was the sole heir, probate proceedings were still a requirement. The bank filed a claim for the home equity loan, but that and the mortgage were the only claims.

[Estate planning tip: if your state allows a “transfer on death deed” (it’s quite recent and varies from state to state) and you are in line to inherit real estate (especially if you are the sole heir), get one drawn up by an attorney right away. You’ll save a ton of hassle. Thank me later.]

Now, almost exactly a year later, nearly everything has been binned, sold, auctioned, put out on the curb, demolished, donated to any number of thrift stores, food banks and charities, burned, shredded, recycled, or otherwise disposed of. There are still many assorted items to deal with—an antique vanity, an old blender, a mahogany display case from Germany, etc. (anyone want these?). I am also getting rid of everything I own. No matter what happens, I’m not going to be needing it anymore.

Genuine mahogany hardwood. On wheels. Lockable with key. From Germany (allegedly). Only $500.00 cheap!

And finally, I must sell the house my grandparents built in 1940 for $3700—where they raised two children; where I spent all of my youthful summers; and where my mom spent the last 25 years of her life.

My mom asked me to do an architectural rendering of the house. In my defense, I didn’t have much to work with.

It’s an odd thing liquidating the legacy of two families. Of being the last. Of imminent extinction. At least I have plenty of fellow travelers in the natural world to contemplate. The photos you see here are all going to be destroyed eventually, along with everything else. Nothing but confetti and ashes. When I am done, there will be no record of my family ever having been here. What would my immigrant great-grandparents make of such an end? What would they think knowing that this country would chew up their descendants and spit them out without mercy? That I would be the last ever holder of our surname? I’m reminded of the Chinese term, guanggun, or “bare branches”. I sometimes wonder—would they do it all over again if they knew that it was all going to end this way?

Great-nana’s immigration papers. Anyone out there read German? I can see she sailed from Bremen but can’t make out her home province.

This might explain to readers just why I hold many of the beliefs that I do: fatalism, cynicism, philosophical pessimism, antinatalism, and a profound sense of the tragic. It also explains how I can write so dispassionately about a future which is looking more and more grim by the day. I have no stake in it, whatsoever. I simply observe, and have no reason to be unduly optimistic (or pessimistic, for that matter). Things are neither good nor bad, they just are. I never wanted anything more than a tiny modicum of happiness before the lights go out forever. I only wish I had found it.

So when I write about things like collapse and extinction, I have a uniquely visceral, intimate perspective on what I’m talking about.

Fun fact: Both John F. Kennedy’s sister and Joseph Stalin’s daughter died in Wisconsin in the 2000s.

Gallery of Endangered Animals (Tim Flach, photographer)

Wanderer’s Nightsong

Ah, I am weary of this restless striving!
What is all this pain and pleasure for?
Sweet peace,
Come, ah come into my breast!

Last week I heard that a close friend from high school had died. He was exactly two days older than me (his birthday August 17th, mine the 19th). He lived in Arizona, but I knew from Facebook (which I am no longer on) that he had been in and out of the hospital for several years. He was rather obese and suffered from poor health his entire life. I don’t know the ultimate cause of death, but my guess is something like heart failure.

(One of my other best friends from high school died many, many years ago. He was asthmatic and foolishly went swimming on a hot summer day with very poor air quality. He had an asthma attack and drowned. He was 28 as I recall).

Now, here we are, exactly one year on, and the architecture firm I worked for fired me last Friday.

I keep wondering what I could have done differently. Perhaps I really am just too stupid to do this job. Maybe I truly am incompetent, despite doing this for so many years. Maybe I’m not a good enough politician. Perhaps I lack the killer instinct. In any case, my career has been the epitome of abject failure. It’s brought me nothing but pain. At what point does the sunk cost fallacy apply? But, as I’ve learned, once you are specialized, past a certain age, the job market gives you no leeway. F. Scott Fitzgerald got it right as usual: “there are no second acts in American lives.”

Assuming I’m not too stupid to do this job, the only other conclusion I can arrive at is that even after doing it for so long, I have not been adequately trained. And if I’ve spent almost 25 years in this profession and still don’t have the requisite skills, then I’m never going to have them. A rather damning indictment of the architecture profession, don’t you think? I’ve read many a lament over the fact that generational skills are being lost, and that there are too few people coming up to replace those leaving. But whose fault is that? In my case, it seems I have little choice. The architecture profession truly is eating its young, as indeed are many other professions, it seems. But what does that portend for the future?

Personally, I don’t really care anymore. Not my problem.

The fate of the architect is the strangest of all. How often he expends his whole soul, his whole heart and passion, to produce buildings into which he himself may never enter.–GOETHE

As I write these words, my cousin is still fighting on. But for how much longer? Not to be morbid, but the grim reality is, I can’t see how she will finish out the year. When she is gone, the glue that held that part of the family together will also be gone; scattered to the winds. And my last blood relative will be gone from the earth.

And so here I am. No family. No friends. No significant other. No job. No viable profession or career to speak of. No mentor. No helpers. No income. No retirement savings. On the bright side—no kids and no debts. As I write this, we are experiencing an epic cold blast of biblical proportions. The wind chills are expected to be around 40 below zero this evening (conveniently, minus 40 is the same temperature in both Celsius and Fahrenheit, for any international readers). Yesterday, I spent an hour shoveling the foot or so of snow we were blanketed with. It’s a hard, hard land, indeed. At least the sun is out today—a rare occurrence. Tomorrow, the city will essentially be shut down due to the cold—and this is Wisconsin!

The snowed-in entrance to The Hipcrime Vocab international headquarters. Send in the Saint Bernards!

Coldest Blast in Years Heading for Midwest, Great Lakes (Weather Underground)

Football season is over…

Don’t worry, I don’t have a gun on the table as I write this (I’ve never even fired one!). And yet I wonder: is there ever a rational case for suicide? And if so, when is it time to throw in the towel?

Typically when I read about rational suicide, it’s in the context of a painful, incurable disease, which I don’t happen to have (as far as I know, but the way things are going…). But, honestly, I struggle with finding a reason to go on. Why suffer? It seems that life itself is a painful, incurable disease, and one that offers very little in the way of recompense for all of its burdens and the suffering it inflicts.

I contemplate the late Anthony Bourdain, who lived the type of life that I would consider ideal. And yet, even he ultimately found no reason to go on. Although I was aware of Mr. Bourdain’s work, I never saw a single show he was on (they were on cable, after all), nor did I read any of his books. Yet I can’t help but feel a certain philosophical kinship with the man.

As I observed in a disturbingly popular post from a few years back, they don’t have to kill you if they can get you to kill yourself. In that post I was pondering whether the epidemic of deaths across America’s Heartland was a symptom of spreading mental illness, or rather merely a rational response to intolerable circumstances. Fentanyl doesn’t seem like that bad of a way to go, all things considered.

I’ve since made peace with the prospect of my own death. Not that I welcome it, mind you. Since it is inevitable, after all, I never saw much of a point in immanentizing it. I still very much consider it as a last resort. If I had another option, I would not hesitate to take it.

I feel like this world has no place for me. Perhaps I should have been born a hundred years earlier. Really, I don’t know how many more messages the universe can send me saying, in effect: “it’s time for you to go.” You were never meant to have been here in the first place. I mean, really: my mom mentioned offhandedly how surprised she was when she found out she was pregnant, since my parents were apparently hardly ever intimate throughout their miserable clusterfuck of a marriage (Aaaargh—too much information, mom, too much information!!!!). That’s why I’m an only child, an accident of evolution. I often told my mom that I wished she had opted for abortion (legal as of 1973). Instead, here I am, lucky recipient of the “gift” of life without a valid return receipt.

Anyway, I felt I owed readers a reason if ever this blog should ever end, and now seemed like a good time. It won’t be right away, though—I’ve got several posts already written in various states of completion. Plus, I have no job and it’s too cold to be outside. Nothing else to do but read and write!

And so I come to the end of my tale. What I could really use right now is some advice. I would especially prefer practical advice (someplace warm to move to, something else to do to earn money, etc.). Anything at all will be accepted, no matter how crazy or ridiculous it sounds. After all, at this point I literally have nothing left to lose. Leave comments below, or email me; I’ve now set this up to send messages directly to my Gmail account.

Thanks for reading.



Some Must-Reads

I recently ran across a couple of good articles that relate to a lot of themes The HipCrime Vocab project has been discussing lately.

This article in The Guardian: Who’s correct about human nature, the left or the right? makes the point that I’ve made repeatedly over the past year. As the byline says, “Most conservatives see it as ‘common sense’ that humans are selfishly competitive – but things looked different pre-capitalism.”

Indeed they did. Pre-capitalism, society was tasked with “habitation” rather than “improvement,” and economic relations were “embedded” in social institutions and one’s group ties. That meant constraints as well, but the constraints served an important purpose. The idea of everyone being in a zero-sum competition for a small number of jobs would have struck most people as an absurd way to organize society. I’ve been reading a book called The Market as God, by theologian Harvey Cox, which makes a point I’ve also made repeatedly here on this blog: economics is far more of a religion than it is a science. The author writes of the Market’s mythical Genesis:

The relationship between religion and The Market is a long and convoluted saga. When did it start? One day a Cro-Magnon man traded a chiseled-stone spearhead with a hunter for a slice of newly slain saber-toothed tiger. He was so pleased with the exchange that the next morning he laid out some other tools he had made on a large rock and watched for passersby to stop and deal. The first market was born, and that was about forty-three thousand years ago.

This, of course, is a myth, and like any other myth it takes place on some other plane of time and space. It has no basis in fact; its purpose is to explain or justify some feature of our own times.

But there are good myths and bad ones…Its lack of any basis in real history is not what makes it a bad myth. Many good myths share that quality. Still, since those who use it often assert it is historical, it is important to remember that anthropological and historical research has shown that the earliest people did not have markets. Rather, theirs were gift cultures, at least within social groups. One was expected, of course, eventually to reciprocate for gifts accepted. But the reciprocation was not expected to happen right away; otherwise it would amount to tit-for-tat bargaining. What little barter did happen took place only with outsiders. Thus trust, reciprocity, and the importance of community are more primal and more natural, if that word is relevant in this case. They were present before markets or even bartering appeared.

Also, when two people met each other in even the most primitive of exchanges, they were already embedded in social and symbolic worlds which overlapped in both conflict and mutuality. There had probably been previous encounters and there would be more to come.

As intertribal connections increased, the role of traders, once peripheral, grew as well. But even when simple forms of currency appeared (in the form of shells or beads, for example) both the buyer and seller knew they were part of larger interlaced worlds that relied on some common assumptions. The spearhead-for-a-slice or any of its variants is ahistorical. It may be a useful fiction, for some people, because it serves as what theologians call a “myth of origin” for the religion of the Market God. It suggests that market values are primal, even ingrained in the human psyche. We are, as the T-shirt has it, “Born to Shop.”

But the truth is that market economies are not timeless. They appeared in human history under certain ascertainable conditions. The fact that they have existed for a long time does not make them eternal and it does not guarantee they will always be with us.

The Guardian article makes many of the same points. The primacy of the Market and the individual was an intellectual project from the start, and pressed into service to justify land and resource grabs that Europeans were undertaking across the globe, including within their own countries by the upper classes. History was subsequently ‘retconned‘ to make it seem like something natural and inevitable; dissenting views and ideas were quashed.

Liberalism, which first emerged in the 17th century, has at its core a distinctive conception of human nature. The most important point about humans for liberals is the fact that they are individuals. It involves “seeing the individual as primary, as more ‘real’ or fundamental than human society and its institutions and structures” and “involves attaching a higher moral value to the individual than to society” (Arblaster).

Furthermore, this conception of human nature “tends … to impute a high degree of completeness and self-sufficiency to the single human being, with the implication that separateness … is the fundamental, metaphysical human condition”.

As a fundamentally “complete” individual, the liberal human has pre-given and fixed, rather than socially constructed needs and preferences. More often than not, the liberal individual is also a radical egoist who enters into interaction with other individuals simply in order to satisfy pre-formed preferences.

The relationship between this conception of human nature and capitalism is obvious. The atomised liberal individual reflects the atomised conditions of bourgeois society in which social ties of kinship and fealty have been dissolved. It is worth stressing that this was a new understanding of human nature. In pre-capitalist philosophy wholeness or completeness usually belonged to the community rather than to the individual.

Rather than self-sufficient individuals, humans were seen to be embedded in communal relations that almost wholly defined them. The view of human nature that underpins the politics of the modern-day right, then, arose at a particular historical juncture. It is not some ideologically “neutral” description.

Definitely read the whole thing at the link above. The other eloquent post is entitled A liberal elite still luring us towards the abyss by journalist Jonathan Cook.

Cook points out that liberalism itself has become something of a religion. For the true believers, anyone not on board is a heretic. Yet the zealots of liberalism refuse to see the problems inherent in their own worldview. Instead they tell themselves reassuring stories about how they are on “the right side of history”, and cast anyone with the slightest doubts about their project as “irrational,” if not outright barbaric. As John Gray points out:

…there has been a shift in the mood of liberals. Less than a decade ago, they were confident that progress was ongoing. No doubt there would be periods of regression; we might be in one of those periods at the present time. Yet over the long haul of history, there could be no doubt that the forces of reason would continue to advance.

Today, liberals have lost that always rather incredible faith. Faced with the political reversals of the past few years and the onward march of authoritarianism, they find their view of the world crumbling away. What they need at the present time, more than anything else, is some kind of intellectual anodyne that can soothe their nerves, still their doubts and stave off panic.

Thus, we currently stand between two options that are truly terrible to contemplate. On one side is the neoliberal status quo that pits us against each other, sets up a hereditary aristocracy of wealth and PhD degrees, seizes the public’s common heritage in a new Enclosure Movement, and pushes us ever closer towards what I’ve termed Neofeudalism. Its result is conflict and heartrending psychological despair across the globe, combined with the ongoing destruction of the natural world.

Davos Elites Love to Advocate for Equality – So Long As Nothing Gets Done (Branko Milanovic, Promarket)

On the other side is reactionary nationalism, inflaming racial and ethnic divisions as a pathway to gain and retain power. It uses the classic tactics of “us-versus-them” thinking, scapegoating, images of a mythic past, surrendering to a “strong-man” patriarchal leader, open hostility to intellectualism and the arts, and an utter disdain for the very concept of equality before the law.

Both of these options are quite grim to contemplate, as Cook points out. We need to find another way, he argues, one that preserves the gains of liberalism but adopts a view of human nature more in line with who we really are as human beings:

…the abyss has not opened up…because liberalism is being rejected. Rather, the abyss is the inevitable outcome of this shrinking elite’s continuing promotion – against all rational evidence – of liberalism as a solution to our current predicament. It is the continuing transformation of a deeply flawed ideology into a religion. It is idol worship of a value system hellbent on destroying us.

Liberalism, like most ideologies, has an upside. Its respect for the individual and his freedoms, its interest in nurturing human creativity, and its promotion of universal values and human rights over tribal attachment have had some positive consequences. But liberal ideology has been very effective at hiding its dark side – or more accurately, at persuading us that this dark side is the consequence of liberalism’s abandonment rather than inherent to the liberal’s political project.

The loss of traditional social bonds – tribal, sectarian, geographic – has left people today more lonely, more isolated than was true of any previous human society. We may pay lip service to universal values, but in our atomised communities, we feel adrift, abandoned and angry.

The liberal’s professed concern for others’ welfare and their rights has, in reality, provided cynical cover for a series of ever-more transparent resource grabs. The parading of liberalism’s humanitarian credentials has entitled our elites to leave a trail of carnage and wreckage in their wake in Afghanistan, Iraq, Libya, Syria and soon, it seems, in Venezuela. We have killed with our kindness and then stolen our victims’ inheritance.

…the absolute prioritising of the individual has sanctioned a pathological self-absorption, a selfishness that has provided fertile ground not only for capitalism, materialism and consumerism but for the fusing of all of them into a turbo-charged neoliberalism. That has entitled a tiny elite to amass and squirrel away most of the planet’s wealth out of reach of the rest of humanity.

Worst of all, our rampant creativity, our self-regard and our competitiveness have blinded us to all things bigger and smaller than ourselves. We lack an emotional and spiritual connection to our planet, to other animals, to future generations, to the chaotic harmony of our universe. What we cannot understand or control, we ignore or mock…

Go check it out.