The Archaic Mentality

The inspiration for this series of posts was an article in Psychology Today entitled “Did Our Ancestors Think Like Us?” I’m pretty confident that they didn’t, but in what sense did they differ? Were they as different as Jaynes described, or was it something less extreme?

Imagine that you are a time-traveler, able to travel back roughly 40,000 years to the age of the first anatomically modern Homo sapiens. Imagine stepping out of your time machine and standing face to face with one of your ancestors: another human with a brain just as big as yours, and genes virtually identical to your own. Would you be able to speak to this ancient human? Befriend them? Fall in love with them? Or would your ancestor be unrecognizable, as distinct from you as a wolf is from a pet dog?

…Some think that, since we have the same genes as ancient humans, we should show the same mannerisms. Others suspect that human psychology may have changed dramatically over time. Nobody knows for certain (I certainly don’t), but my hunch is that the human mind today works very differently than did our ancestors’ minds.

Did Our Ancestors Think Like Us? (Psychology Today)

Brian McVeigh sums up Jaynes’s ideas this way:

In The Origin of Consciousness in the Breakdown of the Bicameral Mind [Jaynes] argued that conscious subjective interiority was not a bioevolutionary phenomenon. Rather, interiority—and by this term he did not mean perceiving, thinking or reasoning but the ability to introspect and engage in self-reflectivity—emerged historically as a cultural construction only about three millennia ago.
The Psychohistory of Metaphors, Brian McVeigh p. 133

I would argue that recent psychological research tentatively backs up some of Jaynes’s claims. New research has shown that a lot of what we thought was just “basic human cognition” turns out to be socioculturally constructed. Much of the world today does not think or reason in the same way as members of Western industrial societies do. The blogger writes:

Many animals learn how to solve problems by watching other animals try and fail, but humans appear to take social learning to another level: we learn how to think from one another.

Consider that when people move to a new culture, they actually begin taking on the emotions of that culture, reporting more everyday sadness in cultures that feel more sadness and surprise in cultures where people feel more surprise. Consider that people’s ability to read others’ thoughts and feelings from their behavior depends on the number of words in their native language indicating mental states. Consider that people’s level of prejudice towards other groups (i.e. the extent of their “us versus them” mentality) and moral convictions (i.e. their belief that some acts are fundamentally right or wrong) strongly depends on whether or not they follow an Abrahamic religion. And consider that people’s ability to think “creatively,” to generate new solutions that diverge from old ones, depends on how strictly their culture regulates social norms. This is just a small sampling from hundreds of studies that show how flexible the human mind is.

For a graphic example, researchers recently determined that the “primitive” Himba of Namibia are actually more mentally agile than supposedly “high IQ” Westerners at solving novel problems:

“We suggest that through formal education, Westerners are trained to depend on learned strategies. The Himba participate in formal education much less often and this is one possible reason why they exhibited enhanced cognitive flexibility,”

Cognitive neuroscientists observe enhanced mental flexibility in the seminomadic Himba tribe (PsyPost). The blogger continues:

The second reality that makes me think our minds work differently today than they did thousands of years ago is that human culture is staggeringly diverse. We speak over 6,000 languages, follow 4,000 religions, and live our lives according to a sprawling set of social and moral customs. Some other animals have diverse culture: Chimpanzees, for example, forage for food in a number of different ways that are probably socially learned. But human cultural diversity goes beyond one or two kinds of differences; our cultures are different in almost every way imaginable. The development of this cultural diversity may have had a profound impact on our psychologies.

When you put these realities together, you have (a) an amazingly diverse species with (b) an amazing capacity to learn from diversity. Add thousands of years of development and cultural change to the mix and you likely get modern human thinking that scarcely resembles ancient human psychology. This doesn’t mean that today’s humans are “better” than yesterday’s; it just means that humans are fascinating animals, more cognitively malleable than any other.

The writer doesn’t get into more detail than that, and there aren’t any further explanations so far. But the idea was backed up by a landmark paper published a few years ago by Joseph Henrich, along with Steven J. Heine and Ara Norenzayan. They write:

There are now enough sources of experimental evidence, using widely differing methods from diverse disciplines, to indicate that there is substantial psychological and behavioral variation among human populations.

The reasons that account for this variation may be manifold, including behavioral plasticity in response to different environments, divergent trajectories of cultural evolution, and, perhaps less commonly, differential distribution of genes across groups in response to different selection pressures… At the same time, we have also identified many domains in which there are striking similarities across populations. These similarities could indicate reliably developing pan-human adaptations, byproducts of innate adaptations (such as religion), or independent cultural inventions or cultural diffusions of learned responses that have universal utility (such as counting systems, or calendars)…

Not only are Americans not typical of how the rest of the world thinks; they are shockingly different (surprising, huh?). As one writer put it, “Social scientists could not possibly have picked a worse population from which to draw broad generalizations. Researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.”

As you might imagine, one of the major differences has to do with radical individualism. Americans see themselves as “rugged individualists,” whereas everyone else sees themselves as part of a larger social fabric:

[S]ome cultures regard the self as independent from others; others see the self as interdependent. The interdependent self — which is more the norm in East Asian countries, including Japan and China — connects itself with others in a social group and favors social harmony over self-expression. The independent self — which is most prominent in America — focuses on individual attributes and preferences and thinks of the self as existing apart from the group.

…Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. Shown another way, in a different test analytic Americans will do better on…the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.

Are Americans the Weirdest People on Earth? (Big Think)

As for why Americans, and WEIRD (Western, Educated, Industrialized, Rich, Democratic) countries more generally, are so different from the rest of the world, the authors of the original paper speculate:

To many anthropologically-savvy researchers it is not surprising that Americans, and people from modern industrialized societies more generally, appear unusual vis-à-vis the rest of the species.

For the vast majority of their evolutionary history, humans have lived in small-scale societies without formal schools, government, hospitals, police, complex divisions of labor, markets, militaries, formal laws, or mechanized transportation. Every household provisioned much or all of its own food, made its own clothes, tools, and shelter, and–aside from various kinds of sexual divisions of labor–almost everyone had to master the same skills and domains of knowledge.

Children grew up in mixed age play groups, received little active instruction, and learned largely by observation and imitation. By age 10, children in some foraging societies obtain sufficient calories to feed themselves, and adolescent females take on most of the responsibilities of women.

WEIRD people, from this perspective, grow up in, and adapt to, a highly unusual environment. It should not be surprising that their psychological world is unusual as well. p. 38 (emphasis mine)

I wrote about this study back in 2013: Americans are WEIRD.

The differences between American thinking and that of the rest of the world seem to mirror the left brain/right brain split described by Iain McGilchrist:

The left hemisphere is dependent on denotative language, abstraction, yields clarity and power to manipulate things that are known and fixed. The right hemisphere yields a world of individual, changing, evolving, interconnected, living beings within the context of the lived world. But the nature of things is never fully graspable or perfectly known. This world exists in a certain relationship. They both cover two versions of the world and we combine them in different ways all the time. We need to rely on certain things to manipulate the world, but for the broad understanding of it, we need to use knowledge that comes from the right hemisphere.

A Psychiatrist Explains the Difference Between Left Brain and Right Brain (Hack Spirit)

Given that thousands of years ago there were NO industrial countries with a majority of the population educated, wealthy, or literate, it’s pretty obvious that thinking must have been quite different. Of course, that does not prove Jaynes’s ideas. But if even modern psychology researchers report substantial differences among existing populations, why is it hard to believe that people separated from us by thousands of years are more different from us than alike?

It’s also worth pointing out that the fundamental structure of our brain changes in response to the activities we undertake to navigate our environment. It’s been hypothesized that the use of the internet and ubiquitous computer screens is “rewiring” our brains in some, possibly nefarious, way. A BBC article on the topic points out that this is nothing new–everything we do rewires our brains in some way. In other words, we do not come into the world completely “done”–much of how our brains function is culturally determined, and this, in turn, changes the brain’s structure. So we need not posit that the brain architecture of bicameral people was radically different, only that they were using their brains in a different way, as determined by their cultural context.

We regularly do things that have a profound effect on our brains – such as reading or competitive sports – with little thought for our brain fitness. When scientists look at people who have spent thousands of hours on an activity they often see changes in the brain. Taxi drivers, famously, have a larger hippocampus, a part of the brain recruited for navigation. Musicians’ brains devote more neural territory to brain regions needed for playing their instruments. So much so, in fact, that if you look at the motor cortex of string players you see bulges on one side (because the fine motor control for playing a violin, for example, is only on one hand), whereas the motor cortex of keyboard players bulges on both sides (because piano playing requires fine control of both hands).

Does the internet rewire our brains? (BBC Future)

In a book I cited earlier, Alone in the World? the author lists the items that archaeologists look for to indicate behavioral modernity (since culture is ephemeral and does not fossilize):

1. A spoken language;

2. The cognitive capacity to generate mental symbols, as expressed in art and religion;

3. Explicit symbolic behavior, i.e., the ability to represent objects, people, and abstract concepts with arbitrary symbols, vocal or visual, and to reify such symbols in cultural practices like painting, engraving, and sculpture;

4. The capacity for abstract thinking, the ability to act with reference to abstract concepts not limited to time and space;

5. Planning depth, or the ability to formulate strategies based on past experience and to act on them in a group context;

6. Behavioral, economic, and technological innovation; and

7. A bizarre inability to sustain prolonged bouts of boredom.

Often people cite the spectacular cave art of Ice Age Europe as evidence that the people living in such caves must have been behaviorally modern. But consider that some of the most sought-after art in the twentieth century was made by patients suffering from schizophrenia (voice hearing)!

The Julian Jaynes Society has compiled a list of questions about the behavior of ancient peoples that are difficult to explain without recourse to some kind of bicameral theory. I’ve copied and abridged their list below:

1. The Saliency and “Normalcy” of Visions in Ancient Times. Why have hallucinations of gods in the ancient world been noted with such frequency?

2. The Frequency of “Hearing Voices” Today. Why do auditory hallucinations occur more frequently in the general population than was previously known? If hallucinations are simply a symptom of a dysfunctional brain, they should be relatively rare. Instead, they have been found in normal (non-clinical) populations worldwide.

3. Imaginary Companions in Children. Why do between one-quarter and one-third of modern children “hear voices,” called imaginary companions?

4. Command Hallucinations. Why do patients labeled schizophrenic, as well as other voice-hearers, frequently experience “command hallucinations” that direct behavior — as would be predicted by Jaynes’s theory? If hallucinations are simply a symptom of a dysfunctional brain, one would expect they would consist of random voices, not commentary on behavior and behavioral commands.

5. Voices and Visions in Pre-literate Societies. Why are auditory and visual hallucinations, as well as divination practices and visitation dreams, found in pre-literate societies worldwide?

6. The Function of Language Areas in the Non-Dominant Hemisphere. Why is the brain organized in such a way that the language areas of the non-dominant hemisphere are the source of auditory hallucinations — unless this provided some previous functional purpose?

7. The “Religious” Function of the Right Temporal Lobe. Why is the right temporal lobe implicated in auditory hallucinations, intense religious sentiments, and the feeling of a sensed presence?

8. Visitation Dreams. Why do ancient and modern dreams differ so dramatically? Studies of dreams in classical antiquity show that the earliest recorded dreams were all “visitation dreams,” consisting of a visitation by a god or spirit that issues a command — essentially the bicameral waking experience of hearing verbal commands only during sleep. This has also been noted in tribal societies.

9. The Inadequacy of Current Thinking to Account for the Origin of Religion. Why is the worship of gods and dead ancestors found in all cultures worldwide?

10. Accounting for the Ubiquity of Divination. Similarly, why were divination practices also universal?

Jaynes’s theory of a previous bicameral mentality accounts for all of these phenomena, and, in the complete absence of persuasive alternative explanations, appears to be the best explanation for each of them. As one professor once said to me, “There is either Jaynes’s theory, or just ‘weird stuff happens.'”

Questions critics fail to answer (Julian Jaynes Society)

Weird stuff, indeed! But there is another, perhaps even more important question not listed above: why did religious concepts change so profoundly during the Axial Age? As Joseph Henrich, the anthropologist whose paper we cited above, put it:

“The typical evolutionary approaches to religion don’t take into account that the kinds of gods we see in religions in the world today are not seen in small-scale societies. I mentioned the ancestor gods; other kinds of spirits can be tricked, duped, bought off, paid; you sacrifice in order to get them to do something; they’re not concerned about moral behavior…Whatever your story is, it’s got to explain how you got these bigger gods.”

Joseph Henrich on Cultural Evolution, WEIRD Societies (Conversations with Tyler)

In researching this series of posts, I’m struck by just how big a gulf there is between (to use Evans-Pritchard’s terms) Primitive Religion and Revelatory Religion.

Primitive religion, for all its dramatic variance, appears to be centered around direct revelation from gods, ancestor worship, and communal rituals. It is almost always rooted in some kind of animist belief system, and is always polytheistic.

Revelatory religions, by contrast, tend to emphasize conscious control over one’s own personal behavior (e.g. the ‘Golden Rule’). They look for revelation through introspection—going inward—something conspicuously missing from primitive religions. Instead of direct revelation, God’s words are now written down in holy books, permanent and unchanging, which are consulted to determine God’s will. Monotheism takes over from polytheism. And a significant portion of the population, unlike in primitive societies, accepts no god at all [atheism = a (without) theos (gods)]. As Brian McVeigh writes, quoting St. Augustine, “By shifting the locus of ‘spiritual activity from external rites and laws into the individual, Christianity brought God’s infinite value into each person.’ In other words, a newly spiritualized space, first staked out by Greek philosophers, was meta-framed and expanded into an inner kingdom where individual and Godhead could encounter each other.” (Psychohistory of Metaphors, pp. 52-53)

Henrich and other researchers, for their part, hypothesize that the difference comes from the fact that Universal Religions of Revelation (so-called “Big Gods”) allowed larger and more diverse groups of people to cooperate, thus outcompeting parochial deities who couldn’t “scale up.” Because the “Big Gods” were all-seeing, all-knowing, omnipresent, moralizing deities with the power to reward and punish in the afterlife, they argue, belief kept people on the straight-and-narrow, allowing for higher-level cooperation between unrelated strangers even without shared cultural context. Basically, it was a meme that evolved via group selection. As they put it (PDF): “[C]ognitive representations of gods as increasingly knowledgeable and punitive, and who sanction violators of interpersonal social norms, foster and sustain the expansion of cooperation, trust and fairness towards co-religionist strangers.”

I call this “The Nannycam theory of Religion”. As God remarked to Peter Griffin on Family Guy, “I’m kind of like a nannycam. The idea that I *may* exist is enough for some people to behave better.”

By contrast, the breakdown of the bicameral mind provides an explanation. God now becomes one’s own conscience—the inner voice in one’s head. We now become responsible for our own behavior through the choices we make. The revelatory religions serve as a guide, and a replacement for the voices that no longer issue their commands. As Brian McVeigh explains:

…interiority is unnecessary for most of human behavior. If this is true, why did we as a species develop it about three thousand years ago (at least according to Julian Jaynes)? What was its purpose?

From the perspective of a sociopolitical organization [sic], interiority alleviates the need for strict hierarchical lines of command and control, which are inherently fragile. By placing a personal tool kit of command and control “inside a person’s head,” interiority becomes society’s inner voice by proxy.

Authorization based on strict hierarchical lines of command and control may be efficient for relatively small, well-circumscribed communities, but if history is any teacher, clear lines of control become less cost-effective in terms of socioeconomic capital the larger and more complex organizations become.

Once authorization for immediate control of self becomes interiorized and individual-centered, an organization actually becomes stronger as its orders, directives, doctrines, admonitions, and warnings become the subjective truths of personal commitment.

Interiority, then, is a sociopolitically pragmatic tool used for control in the same way assigning names to individuals or categorizing people into specialized groups for economic production is. From the individual’s perspective, interiority makes the social environment easier to navigate. Before actually executing a behavior, we can “see” ourselves “in our heads” carrying out an action, thereby allowing us to shortcut actual behavioral sequences that may be time-consuming, difficult, or dangerous.
Brian J. McVeigh; A Psychohistory of Metaphors, pp. 33-34

There are many more “conventional” explanations of the universality of religious beliefs. One popular theory is put forward by anthropologist Pascal Boyer in “Religion Explained.” Basically, he argues that religion is an unintended side effect of what software programmers would refer to as “bugs” in the human cognitive process:

Basing his argument on this evolutionary reasoning, Boyer asserts that religion is in effect a cognitive “false positive,” i.e., a faulty application of our innate mental machinery that unfortunately leads many humans to believe in the existence of supernatural agents like gods that do not really exist.

This also leads Boyer to describe religious concepts as parasitic on ordinary cognitive processes; they are parasitic in the sense that religion uses those mental processes for purposes other than what they were designed by evolution to achieve, and because of this their successful transmission is greatly enhanced by mental capacities that are there anyway, gods or no gods.

Boyer judges the puzzling persistence of religion to be a consequence of natural selection designing brains that allowed our prehistoric ancestors to adapt to a world of predators. A brain molded by evolution to be on the constant lookout for hidden predators is likely to develop the habit of looking for all kinds of hidden agencies. And it is just this kind of brain that will eventually start manufacturing images of the concealed actors we normally refer to as “gods.”

In this sense, then, there is a natural, evolutionary explanation for religion, and we continue to entertain religious ideas simply because of the kinds of brains we have. On this view, the mind it takes to have religion is the mind we have…Religious concepts are natural both in the phenomenological sense that they emerge spontaneously and develop effortlessly, and in the natural sense that also religious imagination belongs to the world of nature and is naturally constrained by genes, central nervous systems, and brains.
J. Wentzel van Huyssteen; Alone In The World? pp. 261-263

Of course, as Jaynes would point out, the gods as depicted in ancient literature are hardly “hidden actors.” They often speak directly to individuals and issue commands which are subsequently obeyed! Massive amounts of time and effort are spent building temples to them. That seems like an awful lot of work to satisfy a simple “false positive” in human cognition.

Other theories focus on what’s called the Theory of Mind. For example: What Religion is Really All About (Psychology Today). As a Reddit commenter put it succinctly:

The basic thesis is that we believe in gods (or supernatural minds in general) because of cognitive adaptations that evolved for social interaction. It was evolutionarily advantageous for monkeys to construct mental models of what other monkeys were feeling/perceiving/thinking, and it’s a natural step from there to believing in disembodied minds, minds that can exist without the monkey. Related YouTube lecture: Why We Believe In Gods.

Testimony to the Sumerian worship of the Cookie Monster

Perhaps. But there are an awful lot of signs in the archaeological record that our ancestors thought very differently than we do, to wit:

1. Eye idols (see above)

2. “Goddess” figurines and idols. Jaynes: “Figurines in huge numbers have been unearthed in most of the Mesopotamian cultures, at Lagash, Uruk, Nippur, and Susa. At Ur, clay figures painted in black and red were found in boxes of burned brick placed under the floor against the walls but with one end opened, facing into the center of the room. The function of all these figurines, however, is as mysterious as anything in all archaeology. The most popular view goes back to the uncritical mania with which ethnology, following Frazer, wished to find fertility cults at the drop of a carved pebble. But if such figurines indicate something about Frazerian fertility, we should not find them where fertility was no problem. But we do.” Origins, p. 166. As the old joke in archaeology goes, if you can’t explain something, just claim it was for ‘fertility.’

3. Human Sacrifice

4. Trepanation

5. God kings:
Jaynes: “I am suggesting that the dead king, thus propped up on his pillow of stones, was in the hallucinations of his people still giving forth his commands…and that, for a time at least, the very place, even the smoke from its holy fire, rising into visibility from furlongs around, was, like the gray mists of the Aegean for Achilles, a source of hallucinations and of the commands that controlled the Mesolithic world of Eynan.

This was a paradigm of what was to happen in the next eight millennia. The king dead is a living god. The king’s tomb is the god’s house…[which]…continues through the millennia as a feature of many civilizations, particularly in Egypt. But, more often, the king’s-tomb part of the designation withers away. This occurs as soon as successor to a king continues to hear the hallucinated voice of his predecessor during his reign, and designates himself as the dead king’s priest or servant, a pattern that is followed throughout ancient Mesopotamia. In place of the tomb is similarly a temple. And in place of the corpse is a statue, enjoying even more service and reverence, since it does not decompose.” Origins, pp. 142-43

6. Grave goods

7. Cannibalism

8. Veneration of ancestors

9. Mummification of animals

Not to mention things like this:

A common practice among these city dwellers [of Çatalhöyük] was burying their dead under their floors, usually under raised platforms that served as beds. Often they would dig up the skulls of the dead later, plaster their faces (perhaps to recreate the faces of loved ones), and give them to other houses. Archaeologists frequently find skeletons from several people intermingled in these graves, with skulls from other people added. Wear and tear on some plastered skulls suggest they were traded back and forth, sometimes for generations, before being reburied. According to Hodder, such special skulls are just as often female as they are male.

Incredible discovery of intact female figurine from neolithic era in Turkey (Ars Technica)

4 thoughts on “The Archaic Mentality”

  1. Actually this brings to mind Lewis Mumford’s thesis that cultural change precedes technological change. I’ll have to go back and re-read.

    I think I’ve suspected something like this for a long time – I can remember feeling uncomfortable the first time I saw an operating system using icons (=hieroglyphs) instead of words – it felt like a three-thousand-year retrograde step. And though our machines are getting faster they are stagnant (at best) in what we achieve with them. Computing is no longer about solving problems but about seeing what other people are up to, for the most part.

    • Funny you should mention Mumford, I was going to refer to his idea that the concepts of individualism and self-reflection were encouraged by inventions like glass, and more specifically, mirrors. Interesting how you never see Stone-Age self-portraits. In Technics and Civilization, he writes:

      For perhaps the first time, except for reflections in the water and in the dull surfaces of metal mirrors, it was possible to find an image that corresponded accurately to what others saw. Not merely in the privacy of the boudoir: in another’s home, in a public gathering, the image of the ego in new and unexpected attitudes accompanied one. The most powerful prince of the seventeenth century created a vast hall of mirrors, and the mirror spread from one room to another in the bourgeois household. Self-consciousness, introspection, mirror-conversation developed with the new object itself: this preoccupation with one’s image comes at the threshold of the mature personality when young Narcissus gazes long and deep into the face of the pool–and the sense of the separate personality, a perception of the objective attributes of one’s identity, grows out of this communion. (p. 129)

      I think this increasing self-awareness and self-control is probably behind the declines in violence that Pinker has demonstrated. That is, it’s less of a switch (or breakdown) as Jaynes described, and more of a spectrum. That’s why it’s so hard to find “pure” bicameralism.

    • Yes. As often happens, I had to split my really long entry into several shorter entries and reorganize them. It’s surprisingly hard!
