So far, in our review of The Recursive Mind, we’ve discovered that recursive thinking lies behind such uniquely human traits as grammatical spoken language, mental time travel, Theory of Mind, higher-order religion, and complex kinship groups.
Whether or not recursion holds the key to the human mind, the question remains how we came to be the way we are–at once so dominant over the other apes in terms of behavior and yet so similar in genetic terms…In modern-day science, it is difficult to avoid the conclusion that the human mind evolved through natural selection, although as we have seen, some recent authors, including Chomsky, still appeal to events that smack of the miraculous—a mutation, perhaps, that suddenly created the capacity for grammatical language…But of course we do have to deal with the seemingly vast psychological distance between ourselves and our closest relatives, the chimpanzees and bonobos. p. 167
What follows is a quick tour through the major milestones in becoming human:
The first and most important change was our standing upright and unique walking gait. While bipedalism may seem in retrospect to offer many obvious advantages, it turns out that it may not have been all that advantageous at all:
Bipedalism became obligate rather than facilitative from around two million years ago, and as we shall see, it was from this point that the march to humanity probably began. That is, our forebears finally gained the capacity to walk freely, and perhaps run, in open terrain, losing much of the adaptation to climb trees and move about in the forest canopy.
Even so, it remains unclear just why bipedalism was retained. As a means of locomotion on open terrain, it offers no obvious advantages. Even the knuckle-walking chimpanzee can reach speeds of up to 48 km per hour, whereas a top athlete can run at about 30 km per hour. Other quadrupedal animals, such as horses, dogs, hyenas, or lions, can easily outstrip us, if not leave us for dead. One might even wonder, perhaps, why we didn’t hop rather than stride, emulating the bipedal kangaroo, which can also outstrip us humans…The impression is that the two-legged model, like the Ford Edsel, was launched before the market was ready for it. Or before it was ready for the market, perhaps. pp. 185-186
Of course, bipedalism left the hands free for tool use, but tool use came much later in the human repertoire, so we couldn’t have evolved walking specifically for that. Persistence hunting also seems to have been a later adaptation, so it was not likely a primary cause either. Another possibility is carrying things, perhaps food or infants. Yet another is language, which, as Corballis argued earlier, may have originated with hand gestures long before verbal communication. If that’s true, then the capacity for language—as opposed to speech—goes back very far in the human repertoire.
One interesting thing I didn’t know is that even though chimpanzees are by far our closest genetic relatives among the great apes, anatomically we are closer to orangutans. This means that our transition to bipedalism may have developed not from knuckle-walking, as commonly presumed, but from hand-assisted bipedalism, where we walked upright along horizontal branches in the forest canopy, supported by our arms. The knuckle-walking gait of our chimp/bonobo cousins may not derive from our common ancestor, but may have developed after the split from the human lineage as an alternative method of crossing open territory.
The most arboreal of the great apes is the orangutan, which is more distantly related to us than either the chimpanzee or gorilla. Nevertheless its body morphology is closer to that of the human than is that of chimpanzee or gorilla.
In the forest canopy of Indonesia and Malaysia, orangutans typically adopt a posture known as hand-assisted bipedalism, supporting themselves upright on horizontal branches by holding on to other branches, usually above their heads. They stand and clamber along the branches with the legs extended, whereas chimpanzees and gorillas stand and move with flexed legs. Chimpanzees and gorillas may have adapted to climbing more vertically angled branches, involving flexed knees and a more crouched posture, leading eventually to knuckle-walking as the forested environment gave way to more open terrain. If this scenario is correct, our bipedal stance may derive from hand-assisted bipedalism, going back some 20 million years. Knuckle-walking, not bipedalism, was the true innovation. p. 184
In an earlier post we mentioned an updated version of the Aquatic Ape Hypothesis (AAH), today called by some the “Littoral Hypothesis.” The following is taken from another book by Corballis entitled “The Truth About Language”, where he goes into more detail:
Another view is that our forebears inhabited coastal areas rather than the savanna, foraging in water for shellfish and waterborne plants, and that this had a profound influence on our bodily characteristics and even our brain size. This aquatic phase may perhaps have preceded a later transition to the savanna. p. 48…Superficially, at least, it seems to explain a number of the characteristics that distinguish humans from other living apes. These include hairlessness, subcutaneous fat, bipedalism, our large brains, and even language…p. 95…The ability to breathe voluntarily…was an adaptation to diving, where you need to hyperventilate before plunging and then holding your breath during the dive. The fine-motor control over lips, tongue, velum, and throat necessary for producing consonants evolved for the swallowing of soft, slippery foods such as mollusks without biting or chewing. Think of oysters and white wine. p. 162
Philip Tobias once suggested that the term aquatic ape should be dropped, as it had acquired some notoriety over some of its more extravagant claims. [Mark] Verhaegen suggests that the aquatic theory should really be renamed “the littoral theory,” because early Homo was not so much immersed in water as foraging on the water’s edge, diving or searching in shallow water, and probably also roaming inland. p. 162…Tobias died in 2011, but in a chapter published in 2011 he wrote: “In contrast with the heavy, earth-bound view of hominin evolution, an appeal is made here for students of hominin evolution to body up, lighten and leaven their strategy by adopting a far great [sic] emphasis on the role of water and waterways in hominin phylogeny, diversification, and dispersal from one watergirt [sic] milieu to others…p. 95… Even in its modern form the aquatic ape hypothesis (AAH) remains controversial. Verhaegen quotes Wikipedia as asserting that “there is no fossil evidence for the AAH”; he disagrees, citing evidence that “virtually all archaic Homo sites are associated with abundant edible shellfish…p. 162
To me, one of the more convincing arguments for a water-related evolutionary path for humans is the idea of giving birth in water. This may have been the way we accommodated a larger skull size alongside a pelvis designed for upright walking. I got this idea when I read an article several years ago about a woman who gave birth in water. Her husband recorded the birth and put it on the internet. The clip went viral since the birth took place almost effortlessly, with none of the agonizing pain which normally accompanies hospital births. This would also explain why human babies alone among the primates know instinctively to hold their breath in water. Perhaps human females are ideally meant to give birth in water—a practice that is once again achieving popularity in some alternative circles. Here’s the story:
At this time, human ancestors adopted a “dual-mode” existence, using facilitative bipedalism to migrate across the rapidly expanding grasslands, but retaining the ability to take refuge in the tree canopy if needed. This is evident from the anatomy of the various species of Australopithecus, who were able to walk upright, but retained hands, feet and arms that allowed them to climb trees and hang from branches. These chimp-sized animals may have adopted a scavenging mode of existence to get their daily protein: cracking bones with stones to get at the marrow inside, and stealing eggs from nests, while retreating back into the forest canopy when faced with fiercer competition. The most famous Australopithecus, Lucy, appears to have died by falling out of a tree. They may have scavenged during the heat of the day, since many big predators are crepuscular or nocturnal, leading to the gradual loss of hair and addition of sweat glands. They were already fairly social animals, and when threatened, they may have responded by banding together and hurling stones at their predators and rivals, as explored further below.
Humans are able to throw projectiles with much greater force and accuracy than any other primate, or really any other animal. Of course, other monkeys and chimps do hurl things (such as poo) when they are upset. But in humans, this ability evolved to its furthest extent.
We probably first began by throwing rocks and stones, a technique that is far more effective when done in groups, as William Von Hippel noted in The Social Leap. From there, we graduated to spears, javelins, and boomerangs, and then invented devices to further enhance our throwing capacity such as spear-throwers (woomeras) and slings. Slings continued to be devastating weapons on the battlefield well into historical times–during Roman times, the most famous and effective slingers came from the Balearic Islands in the western Mediterranean, and were widely employed as mercenaries.
Paul Bingham has argued that one of the characteristics that have reinforced social cohesion in humans is the ability to kill at a distance. Human societies can therefore be rid of dissenters in their midst, or threats from the outside, with relatively little threat of harm to the killer! Nevertheless the dissenters, or the rival band, may themselves resort to similar tactics, and so began an arms race that has continued to this day. It started, perhaps, with the throwing of rocks, followed in succession by axes, spears, boomerangs, bows and arrows, guns, rockets, bombs, and nuclear missiles, not to mention insults. Such are the marks of human progress…
Whether or not it was throwing that sustained bipedalism in an increasingly terrestrial existence, it does at least illustrate that bipedalism frees the hands for intentional potentially skilled action. It allows us to use our hands for much more than specifically chucking stuff about. Moreover, our primate heritage means that our arms are largely under intentional control, creating a new potential for operating on the world, instead of passively adapting to it. Once freed from locomotory duty, our hands and arms are also free to move in four-dimensional space, which makes them ideal signaling systems for creating and sending messages… p. 190
The idea that we evolved to throw also helps explain the mystery of the seeming perfection of the human hand, described by Jacob Bronowski as “the cutting edge of the mind.” Adding to the power and precision of throwing are the sensitivity of the fingers, the long opposable thumb, and the large area of the cortex involved in control of the hand. The shape of the hand evolved in ways consistent with holding and hurling rocks of about the size of modern baseballs or cricket balls, missile substitutes in modern pretend war. In real war, hand grenades are about the same size.
Our hands have also evolved to provide two kinds of grip, a precision grip and a power grip, and Richard W. Young suggests that these evolved for throwing and clubbing, respectively. Not only do we see young men throwing things about in sporting arenas, but we also see them wielding clubs, as in sports such as baseball, cricket, hockey, or curling. In various forms of racquet sports, the skills of clubbing and throwing seem to be combined. p. 188
3. Extended Social Groups
The last common ancestor of humans and chimps may have lived in the Pliocene Epoch, which began some 5.333 million years ago. As it ended, it transitioned into the Pleistocene, which was characterized by a rapidly changing climate featuring a recurring series of crippling ice ages. The Pleistocene lasted from about 2.588 million years ago to roughly 12,000 years ago, succeeded by the more stable climate of the Holocene. It was during the Pleistocene that woodlands shrank and were replaced by open grasslands. During this time, the genus Homo emerged, and human ancestors permanently transitioned from an arboreal existence to becoming savanna-based hunter-gatherers, possibly beginning as scavengers.
Adaptability was key. Any species which relied solely on the slow pace of genetic evolution would have been at a severe disadvantage in the rapidly changing world of the Pleistocene. The newly-evolved Homo genus, with its omnivorous diet, free hands for tool use, upright gait, large brains, and gregarious social nature, was ideally suited for this epoch. During this time, walking went from facilitative to obligatory, and we left the tree canopy behind for good. Homo habilis was already using stone tools near the beginning of this era (although earlier Australopithecines may have used some tools as well). Then came a plethora of upright-walking, tool-using apes: Homo rudolfensis; Homo ergaster; and Homo erectus, always popular with schoolchildren.
Coming from the forest and moving out onto the savanna, these apes could not compete on speed, strength or aggressiveness against the big predators. What did they do? The solution was to form larger and more tightly-knit social groups. It is these social groups that are thought to have been a primary driver behind increasing intelligence and brain expansion (about which, more below).
An especially dangerous feature of the savanna was the presence of large carnivorous animals, whose numbers peaked in the early Pleistocene. They included at least 12 species of saber-tooth cats and nine species of hyena. Our puny forebears had previously been able to seek cover from these dangerous predators in more forested areas, and perhaps by retreating into water, but such means of escape were relatively sparse on the savanna. Not only did the hominins have to avoid being hunted down by these professional killers, with sharp teeth and claws, and immense speed and strength, but they also had to compete with them for food resources. p. 192
It was human intelligence and sociability that allowed our ancestors to survive in this threatening environment—a combination of man’s “intellectual powers,” and “social qualities,” as Charles Darwin put it.
The hominins therefore built on their primate inheritance of intelligence and social structure rather than on physical attributes of strength or speed. This is what might be termed the third way, which was to evolve what has been termed the “cognitive niche,” a mode of living built on social cohesion, cooperation, and efficient planning. It was a question of survival of the smartest. p. 194
It was not only our social nature but our unique social strategy that set us apart from all other primates. Simply put, we developed extended families. We also developed cooperative child-rearing and pair-bonding, which allowed us to evolve larger social groups than other primates, who remain largely in fixed-size groups throughout their lives and do not typically develop deep relationships outside them.
Sarah Blaffer Hrdy has argued that social bonding evolved first in the context of child rearing. She points out that great apes are loath to allow others to touch their infants during the first few months, whereas human mothers are very trusting in allowing others to carry and nurture their babies. This is evident not only in daycare centers, but in extended families units [sic] that characterize many peoples of the world. Among New Zealand Maori, for instance, initial teaching and socialization is based on a larger unit known as whanau, which is the extended family, including children, parents, grandparents, cousins, uncles, aunts, and often beyond. The understanding of whanau is recursive, looping back many generations. p. 194
It takes a village indeed!
This puts paid to all seventeenth-century English Liberal notions of government that rely on “voluntary associations” or purposeful submission to a despot in exchange for protection and order. Governments did not form in order to secure “private property” as John Locke argued, nor were early societies a “war of all against all” as Hobbes thought—we would have gone extinct long ago if that were the case. It is private property, not social organization, which is novel in the human experience. Since extended families and kinship groups predate the genus Homo, the fact is that we humans have never had to make any sort of conscious, rational decision to form complex social groups—we are literally born into them! The question is, rather, how such groups evolved from the small tribal societies of the past into today’s large, impersonal, market-based nation-states.
4. Controlled Use of Fire
At some point, humans harnessed fire, the only species (that we know of) to do so. To get a bit more technical, we harnessed a source of extrasomatic energy, later supplemented by fossil fuels. Exactly when this occurred, however, is a matter of debate. Fire does not fossilize, and while the results of ancient combustion can be detected, it is often difficult to determine whether these were natural or artificially-controlled fires. Rather, arguments for a very archaic use of fire come primarily from human anatomy—humans are adapted to cooked food and cannot survive on strictly raw food diets, unlike chimps and gorillas. This leads to the conclusion that humans have been using fire long enough to evolve a dependency on it—certainly for hundreds of thousands of years, at least. Our small jaws, duller teeth, shorter intestines, and bulbous skulls all derive from anatomical changes due to cooking. Some recent evidence has suggested fire use over one million years ago. It indicates that sitting around a campfire and telling stories has been part of social bonding since time immemorial.
Richard Wrangham has suggested that the secret of hominin evolution originated in the controlled use of fire, which supplied warmth and protection from hostile predators. From around two million years ago, he thinks, Homo erectus also began to cook tubers, greatly increasing their digestibility and nutritional value. Cooked potatoes, I’m sure you will agree, are more palatable than raw ones. Other species may have been handicapped because they lacked the tools to dig for tubers, or the means to cook them.
Cooked food is softer, leading to the small mouths, weak jaws, and short digestive system that distinguish Homo from earlier hominins and other apes. Cooking also led to division of labor between the sexes, with women gathering tubers and cooking them while the men hunted game. At the same time, these complementary roles encouraged pair bonding, so that the man can be assured of something to eat if his hunting expedition fails to produce meat to go with the vegetables. p. 195
5. Rapid Brain Expansion
During the Pleistocene, the human brain underwent a remarkable and unprecedented expansion for reasons that are still debated. From the Australopithecines to archaic humans, the brain roughly tripled in size. Brain size correlates roughly with body size across species; the ratio of an organism’s actual brain size to the size expected for its body size is known as the encephalization quotient. Given humans’ relatively small body size, our brains are much larger than they “should” be. The brain is also an energy hog, taking up some 20 percent of our metabolism to keep running.
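The idea can be made concrete with a small sketch. The allometric baseline below uses Jerison’s classic fit for mammals (expected brain mass ≈ 0.12 × body mass^(2/3), masses in grams); the species figures are rough textbook averages, not values from Corballis:

```python
# Encephalization quotient (EQ): actual brain mass divided by the brain
# mass "expected" for a mammal of the same body mass.
# Baseline: Jerison's allometric fit, expected = 0.12 * body_g ** (2/3),
# with both masses in grams. Species figures are rough averages.

def encephalization_quotient(brain_g: float, body_g: float) -> float:
    expected = 0.12 * body_g ** (2 / 3)
    return brain_g / expected

species = {
    "human":      (1350, 65_000),
    "chimpanzee": (400,  45_000),
    "gorilla":    (465,  140_000),
}

for name, (brain, body) in species.items():
    print(f"{name:11s} EQ = {encephalization_quotient(brain, body):.2f}")
```

On these numbers the human EQ comes out near 7, roughly triple the chimpanzee’s, which is the quantitative sense in which our brains are larger than they “should” be.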
Fossil evidence shows that brain size remained fairly static in the hominins for some four million years after the split from the apes. For example, Australopithecus afarensis…had a brain size of about 433 cc, slightly over the chimpanzee size of about 393 cc, but less than that of the much larger gorilla at 465 cc. It was the emergence of the genus Homo that signaled the change. Homo habilis and Homo rudolfensis were still clumsily bipedal but their brains ranged in size from around 500 cc to about 750 cc, a small increase over that of earlier hominins. Homo ergaster emerged a little over 1.8 million years ago, and by some 1.2 million years ago boasted a brain size of some 1,250 cc. Thus in a space of about 750,000 years, brain size more than doubled—that’s pretty quick on an evolutionary time scale.
Brain size continued to increase at a slower rate. It appears to have reached a peak, not with Homo sapiens, dating from about 170,000 years ago, but with Neanderthals…In some individual Neanderthals, brain capacity seems to have been as high as 1,800 cc, with an average of about 1,450 cc. Brain size in our own species, Homo sapiens, is a little lower, with a present-day average of about 1,350 cc, but still about three times the size expected of an ape with the same body size…this final increase in brain size—the dash for the summit as it were—seems to have coincided with an advance in technological innovation over that which had prevailed for the previous 1.5 million years. pp. 198-199
It’s not just the expansion of the brain that is remarkable, but the expansion of the neocortex, or “outer shell” of the brain where many of the “higher” cognitive functions reside. Here, too, we find that the human neocortex is much larger than expected given the size of the brain and body. The size of the neocortex is roughly correlated with intelligence and the size of the social group in mammals, giving us some indication of the intelligence and group size of early human ancestors. “Humans have the largest neocortical ratio, at 4.1, closely followed by the chimpanzee at 3.2. Gorillas lumber in at 2.65, orangutans at 2.99, and gibbons at 2.08. According to the equation relating group size to neocortical ratio, humans should belong to groups of 148, give or take about 50. This is reasonably consistent with the estimated sizes of early Neolithic villages.” (p. 198)
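The “equation relating group size to neocortical ratio” in the quote is Dunbar’s 1992 regression across primates: log₁₀(N) = 0.093 + 3.389 × log₁₀(CR), where CR is neocortex volume divided by the volume of the rest of the brain. A quick sketch, plugging in the ratios quoted above (the regression coefficients are from Dunbar’s published fit, not from Corballis’s text):

```python
# Dunbar's regression relating mean social group size N to neocortex
# ratio CR (neocortex volume / rest-of-brain volume) across primates:
#     log10(N) = 0.093 + 3.389 * log10(CR)
import math

def predicted_group_size(neocortex_ratio: float) -> float:
    return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

for name, cr in [("human", 4.1), ("chimpanzee", 3.2),
                 ("gorilla", 2.65), ("gibbon", 2.08)]:
    print(f"{name:11s} CR = {cr:4.2f} -> group size ~{predicted_group_size(cr):.0f}")
```

For a human neocortical ratio of 4.1 the equation yields roughly 148 — the famous “Dunbar’s number” the quote compares to early Neolithic village sizes.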
Robin Dunbar has suggested that even though Neanderthal brains were larger overall than Homo sapiens, more of their neocortex was devoted to visual processing—their skulls indicate eyes that were 20 percent larger than our own. This was an adaptation to the darkness of the northern climes. The running of this enlarged visual system, he argues, precluded parts of the brain from being harnessed for other uses—social uses in particular. Thus, Neanderthals were not able to develop larger groupings, or things such as higher-order religions and recursion, he argues. Homo sapiens, evolving in the more tropical regions of Africa, did not have this same handicap.
Perhaps the most extraordinary revelation from this chapter is that there appear to be significant genetic changes to the brain within recorded history!
We are beginning to learn something of the genetic changes that gave us our swollen heads. One gene known to be a specific regulator of brain size is the abnormal spindle-like microcephaly associated (ASPM) gene, and the evidence suggests strong positive selection of this gene in the lineage leading to Homo sapiens. Indeed, a selective sweep appears to have occurred as recently as 5,800 years ago, suggesting that the human brain is still undergoing rapid evolution. Another gene known as microcephalin (MCPH6) has also been shown to regulate brain size, and one variant in modern humans arose an estimated 37,000 years ago. Other genes involved in the control of brain size that have undergone accelerated rates of protein evolution at points in human lineage have also been identified. p. 199
What’s most extraordinary about this information, given our discussion of Julian Jaynes’s theories, is that the evidence above indicates a selective sweep of genes that affect brain development in exactly the time-frame specified by Jaynes—something that Jaynes’s critics have always claimed was patently impossible! Of course, this does not mean that these genes are what lay behind his hypothesized development of “consciousness”—only that it is possible that there were indeed changes to how the brain functions within recorded history.
Often it’s claimed that the breakdown of the bicameral mind was due to a massive change in the brain’s architecture. Critics mistakenly assert that Jaynes implied that the corpus callosum—the massive bundle of nerves that connects the two hemispheres—evolved during historical times. But Jaynes claims nothing of the sort! While he discusses split-brained patients (with a severed corpus callosum) in order to understand the separate functions of each hemisphere, nowhere does he imply any recent anatomical changes to the brain’s basic structure. And besides, the fact that hearing voices is common in humans today indicates that such a massive change is not needed in any case. Rather, only a slight change in perception was required. Jaynes suggested that the bicameral voices arose in the right-hemisphere counterpart of Wernicke’s area and reached the left hemisphere’s language areas via the anterior commissure; an inhibition of this communication might have contributed to the breakdown of the bicameral mind. There is also evidence that the amount of “white matter” in the brain (as contrasted with gray matter) changes brain function, and abnormalities in white matter have been associated with schizophrenia and other mental illnesses. We have no idea whether the genes specified above had anything to do with this, of course. But preliminary data show that this gene does not affect IQ, so it was not raw intelligence which caused the selective sweep of the ASPM gene. Could this gene have altered some of the functioning of the brain much in the manner Jaynes described, and did this give rise to the recursive “self” developing and expanding sometime during the late Bronze Age? Here we can only speculate.
Here’s anthropologist John Hawks explaining the significance of this discovery:
Haplogroup D for Microcephalin apparently came under selection around 37,000 years ago (confidence limit from 14,000 to 60,000 years ago). This is very, very recent compared to the overall coalescence age of all the haplotypes at the locus (1.7 million years). Some populations have this allele at 100 percent, while many others are above 70 or 80 percent. Selection on the allele must therefore have been pretty strong to cause this rapid increase in frequency. If the effect of the allele is additive or dominant, this selective advantage would be on the order of 2 or 3 percent — an advantage in reproduction.
The story for ASPM is similar, but even more extreme. Here, the selected allele came under selection only 5800 years ago (!) (confidence between 500 and 14,100 years). Its proliferation has almost entirely occurred within the bounds of recorded history. And to come to its present high proportion in some populations of near 50 percent in such a short time, its selective advantage must have been very strong indeed — on the order of 5 to 8 percent. In other words, for every twenty children of people without the selected D haplogroup, people with a copy of the allele averaged twenty-one, or slightly more.
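Hawks’s “twenty children versus twenty-one” arithmetic can be checked with a toy model. Under simple haploid selection with advantage s, the odds p/(1−p) of carrying the favored allele grow by a factor (1 + s) each generation, giving the recursion p′ = p(1 + s)/(1 + ps). This deliberately ignores drift, dominance, and population structure; the starting frequency and 25-year generation time below are illustrative assumptions, not figures from Hawks:

```python
# Toy deterministic model of a selective sweep. With haploid selection
# coefficient s, the odds p/(1-p) of the favored allele grow by (1 + s)
# per generation, i.e. p' = p * (1 + s) / (1 + p * s).
# Ignores drift, dominance, and population structure.

def generations_to_frequency(p0: float, s: float, target: float) -> int:
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (1 + p * s)
        gens += 1
    return gens

# An allele starting at 0.1% frequency with a 5% advantage, rising to 50%:
gens = generations_to_frequency(p0=0.001, s=0.05, target=0.5)
print(f"{gens} generations (~{gens * 25} years at 25 years/generation)")
```

Even in this crude model, a 5 percent advantage carries a rare allele to 50 percent frequency in well under 200 generations, i.e. a few thousand years — which is why Hawks can say the ASPM sweep fits “within the bounds of recorded history.”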
Recent human brain evolution and population differences (john hawks weblog)
In a bizarre Planet of the Apes scenario, Chinese scientists have recently inserted human genes related to brain growth and cognition into monkeys in order to determine what role genes play in the evolution of intelligence:
Human intelligence is one of evolution’s most consequential inventions. It is the result of a sprint that started millions of years ago, leading to ever bigger brains and new abilities. Eventually, humans stood upright, took up the plow, and created civilization, while our primate cousins stayed in the trees.
Now scientists in southern China report that they’ve tried to narrow the evolutionary gap, creating several transgenic macaque monkeys with extra copies of a human gene suspected of playing a role in shaping human intelligence.
“This was the first attempt to understand the evolution of human cognition using a transgenic monkey model,” says Bing Su, the geneticist at the Kunming Institute of Zoology who led the effort…
…What we know is that our humanlike ancestors’ brains rapidly grew in size and power. To find the genes that caused the change, scientists have sought out differences between humans and chimpanzees, whose genes are about 98% similar to ours. The objective, says Sikela, was to locate “the jewels of our genome”—that is, the DNA that makes us uniquely human.
For instance, one popular candidate gene called FOXP2—the “language gene” in press reports—became famous for its potential link to human speech. (A British family whose members inherited an abnormal version had trouble speaking.) Scientists from Tokyo to Berlin were soon mutating the gene in mice and listening with ultrasonic microphones to see if their squeaks changed.
Su was fascinated by a different gene: MCPH1, or microcephalin. Not only did the gene’s sequence differ between humans and apes, but babies with damage to microcephalin are born with tiny heads, providing a link to brain size. With his students, Su once used calipers and head spanners to measure the heads of 867 Chinese men and women to see if the results could be explained by differences in the gene.
By 2010, though, Su saw a chance to carry out a potentially more definitive experiment—adding the human microcephalin gene to a monkey…
Chinese scientists have put human brain genes in monkeys—and yes, they may be smarter (MIT Technology Review)
One of the more remarkable theories behind brain growth argues that a virus, or perhaps even a symbiotic bacterium, helped drive human brain growth, and hence intelligence. Robin Dunbar raises the intriguing possibility that brain expansion was fueled by a symbiotic alliance with the tuberculosis bacterium!
The problem of supporting a large brain is so demanding that it may have resulted in the rather intriguing possibility that we used external help to do so in the form of the tuberculosis bacterium. Although TB is often seen as a terrible disease, in fact only 5 per cent of those who carry the bacterium are symptomatic, and only a proportion of those die (usually when the symptoms are exacerbated by poor living conditions). In fact, the TB bacterium behaves much more like a symbiont than a pathogen – even though, like many of our other symbionts, it can become pathogenic under extreme conditions. The important issue is that the bacterium excretes nicotinamide (vitamin B3), a vitamin that turns out to be crucial for normal brain development. Chronic shortage of B3 rapidly triggers degenerative brain conditions like pellagra. The crucial point here is that vitamin B3 is primarily available only from meat, and so a supplementary source of B3 might have become desirable once meat came to play a central role in our diet. Hunting, unlike gathering, is always a bit chancy, and meat supplies are invariably rather unpredictable. This may have become even more crucial during the Neolithic: cereals, in particular, are poor in vitamin B3 and a regular alternative supply might have become essential after the switch to settled agriculture.
Although it was once thought that humans caught TB from their cattle after domestication around 8,000 years ago, the genetic evidence now suggests that human and bovine TB are completely separate strains, and that the human form dates back at least 70,000 years. If so, its appearance is suspiciously close to the sudden upsurge in brain size in anatomically modern humans that started around 100,000 years ago. Human Evolution: Our Brains and Behavior by Robin Dunbar; pp. 248-249
It’s not just brain size. The human brain undergoes an unusually large amount of development after birth, unlike that of most other species, even the other great apes, which lack extended childhoods and adolescence. This leads to the helplessness and utter dependency of our infants in the near term, but it has a big payoff in social adaptability in the long term. It means that humans’ intellectual capabilities can be shaped, to a large extent, by the environment they are born into, rather than just by genes: the brain is “wired up” according to the needs of that environment, affecting things like language and sociability. This is key to what we saw above: adaptability and behavioral flexibility were the keys to our species’ success.
Another critical difference between humans and other primates lies in the way in which the human brain develops from birth to adulthood. We humans appear to be unique among our fellow primates, and perhaps even among the hominins, in passing through four developmental stages–infancy, childhood, juvenility, and adolescence…During infancy, lasting from birth to age two and a half, infants proceed from babbling to the point that they know that words or gestures have meaning, and can string them together in two-word sentences. This is about the level that the bonobo Kanzi has reached…it is the next stage, childhood, that seems to be especially critical to the emergence of grammatical language and theory of mind…Childhood seems to be the language link that is missing in great apes and the early hominins, which may account for the fact that, so far at least, great apes have not acquired recursive grammar. But it is also during childhood that theory of mind, episodic memory, and understanding of the future emerge. Childhood may be the crucible of the recursive mind.
During the juvenile phase, from age 7 to around 10, children begin to appreciate the more pragmatic use of language, and how to use language to achieve social ends. The final stage is adolescence, which…is…unique to our own species, and sees the full flowering of pragmatic and social function, in such activities as storytelling, gossip, and sexual maneuvering. Adolescence also has a distinctive effect on male speech, since the surge of testosterone increases the length and mass of the vocal folds, and lowers the vibration frequency…
Locke and Bogin focus on language, but the staged manner in which the brain develops may account more generally for the recursive structure of the human mind. Recursive embedding implies hierarchical structure, involving metacontrol over what is embedded in what, and how many layers of embedding are constructed. Early development may establish basic routines that are later organized in recursive fashion.
I’ve always been struck by how children who are more intellectually precocious tend to take longer to mature; they are “late bloomers.” In contrast, there are those who mature very quickly and then hit a plateau. Of course, we lump them all together in prison-like schools according to chronological age, despite highly variable developmental speeds and gender differences. This leads to all sorts of bullying and abuse, as the faster-developing “jocks” torment the slower-developing “nerds”—a feature unique to modern industrial civilization. The emotional scarring from this scenario causes incalculable suffering and misery, but I digress…
Human children are the most voracious learners planet Earth has ever seen, and they are that way because their brains are still rapidly developing after birth. Neoteny, and the childhood it spawned, not only extended the time during which we grow up but ensured that we spent it developing not inside the safety of the womb but outside in the wide, convoluted, and unpredictable world.
The same neuronal networks that in other animals are largely set before or shortly after birth remain open and flexible in us. Other primates also exhibit “sensitive periods” for learning as their brains develop, but they pass quickly, and their brain circuitry is mostly established by their first birthday, leaving them far less touched by the experiences of their youth.
Based on the current fossil evidence, this was true to a lesser extent of the 26 other savanna apes and humans. Homo habilis, H. ergaster, H. erectus, even H. heidelbergensis (which is likely the common ancestor of Neanderthals, Denisovans, and us), all had prolonged childhoods compared with chimpanzees and gorillas, but none as long as ours. In fact, Harvard paleoanthropologist Tanya Smith and her colleagues have found that Neanderthals reversed the trend. By the time they met their end around 30,000 years ago, they were reaching childbearing age at about the age of 11 or 12, which is three to five years earlier than their Homo sapiens cousins…
We are different. During those six critical years, our brains furiously wire and rewire themselves, capturing experience, encoding and applying it to the needs of our particular life. Our extended childhood essentially enables our brains to better match our experience and environment. It is the foundation of the thing we call our personalities, the attributes that make you you and me me. Without it, you would be far more similar to everyone else, far less quirky and creative and less, well … you. Our childhood also helps explain how chimpanzees, remarkable as they are, can have 99 percent of our DNA but nothing like the same level of diversity, complexity, or inventiveness.
7. Tool Use
Humans used tools largely in the context of hunting and butchering large prey, but probably also to secure other resources, such as digging up the tubers mentioned earlier. Gourds and eggshells are used by foragers to carry water, and slings for hurling rocks may date back a very long time. In his book The Artificial Ape, archaeologist Timothy Taylor argues that humans must have used baby slings—probably made from animal pelts—to carry their infants as far back as a million years ago. His reasoning: infants cannot walk effectively for the first few years of life, and since early humans were constantly on the move, mothers must have had some way of efficiently carrying their offspring that left their hands free (other ape infants cling to their mother’s hair—not an option for us). He argues that the sling was critical in allowing our infants to be born as helpless as they are, and thus facilitated the extended infancy described above. Fire may have also been a useful tool: many cultures around the world have used it to reshape the natural landscape and drive game.
When looking at the long arc of history, what stands out is not so much the rapidity of cultural change but rather just how slowly tools developed over millions of years. While today we are used to rapid, constant technological change, during the Pleistocene toolkits often remained unchanged for hundreds of thousands of years. So much for innovation!
Nevertheless, advances in toolmaking were slow. There is little to suggest that the early hominins were any more adept at making or using tools than are present-day chimpanzees, despite being bipedal, and it was not really until the appearance of the genus Homo that toolmaking became more sophisticated.
The earliest such tools date from about 2.5 million years ago, and are tentatively associated with H. rudolfensis. These tools, relatively crude cutters and scrapers, make up what is known as the Oldowan industry. A somewhat more sophisticated tool industry, known as the Acheulian industry, dates from around 1.6 million years ago in Africa, with bifacial tools and hand-axes…The Acheulian industry remained fairly static for about 1.5 million years, and seems to have persisted in at least one human site dating from only 125,000 years ago. Nevertheless, there was an acceleration of technological invention from around 300,000 to 400,000 years ago, when the Acheulian industry gave way to the more versatile Levallois technology. Tools comprising combinations of elements began to appear, including axes, knives and scrapers mounted with hafts or handles, and stone-tipped spears. John F. Hoffecker sees the origins of recursion in these combinatorial tools, which were associated with our own forebears, as well as with the Neanderthals, who evolved separately from around 700,000 years ago. pp. 205-206
Corballis speculates that the rapid tool advancement seen in more recent Homo sapiens owes its origins more to our evolved social capabilities than to developments resulting from the primitive crude stone tools of our earlier ancestors: “My guess is that recursive thought probably evolved in social interaction and communication before it was evident in the material creations of our forebears. The recursiveness and generativity of technology, and of such modern artifacts as mathematics, computers, machines, cities, art, and music, probably owe their origins to the complexities of social interaction and storytelling, rather than to the crafting of tools…” (p. 206)
The full flowering of stone tool technology came during a period called the Upper Paleolithic, or Late Stone Age, also associated with such behaviorally modern artifacts as sculptural “Venus” figurines, cave paintings, and deliberate burials (indicating some rudimentary religious belief). The adoption of such “modern” behavioral traits, and the adoption of vastly more sophisticated tools is related, argues Corballis:
This second wave of innovation was most pronounced in Europe and western Asia, beginning roughly when Homo sapiens arrived there. The Upper Paleolithic marked nearly 30,000 years of almost constant change, culminating in a level of modernity equivalent to that of many present-day indigenous peoples. Technological advances included clothing, watercraft, heated shelters, refrigerated storage pits, and bows and arrows. Elegant flutes made from bone and ivory have been unearthed in southwest Germany, dated at some 40,000 years ago, suggesting early musical ensembles…Flax fibers dating from 30,000 years ago have been found in a cave in Georgia, and were probably used in hafting axes and spears, and perhaps to make textiles; and the presence of hair suggests also that they were used to sew clothes out of animal skins. The people of this period mixed chemical compounds, made kilns to fire ceramics, and domesticated other species.
Stone tools date from over two million years ago, but remained fairly static until the Upper Paleolithic, when they developed to include more sophisticated blade tools, as well as burins and tools for grinding. Tools were also fashioned from other materials, such as bone and ivory, and included needles, awls, drills, and fishhooks…p. 214
8. Out of Africa
The general consensus today is that all modern humans are descended from groups that left Africa after 70,000 years ago, perhaps driven by climate change. These migrants eventually displaced all earlier species of archaic Homo. We also know that some interbreeding between our ancestors and these other species took place. Humans carry DNA signatures from Neanderthals, Denisovans, and an as-yet-undiscovered human ancestor.
Evolutionary biologists have classified six major haplogroups of humans: L0, L1, L2, L3, M, and N. A haplogroup is a large grouping of haplotypes, which are groups of alleles (variant forms of a gene) inherited together from a single parent. In this case, geneticists used mitochondrial DNA, which is inherited exclusively from our mothers, to define the haplogroups. Mitochondria—the “batteries” of the cell—began their existence as symbiotic bacteria, and thus carry a distinct genetic signature. Of the four “L” haplogroups, only L3 migrated out of Africa. The M and N haplogroups are descendants of the L3 haplogroup. Haplogroup M has a more recent common ancestor than haplogroup N, and is found both inside and outside Africa. All indigenous lineages outside of Africa derive exclusively from the M and N haplogroups.
Why haplogroup L3 alone migrated out of Africa is a big question. Another related big question for evolutionary biologists is how much of modern human behavior existed in Africa prior to this outmigration, and how much arose after it. For example, did complex spoken language evolve before or after we left Africa? What about symbolic thought, art, religion, and sophisticated tool use? Did we use fire? Given the fact that sapiens displaced all the earlier hominins who had evolved outside the continent (most likely descendants of Homo heidelbergensis, and perhaps a few remote branches of erectus), we must have had some kind of innate advantage over the native inhabitants, the thinking goes. What exactly it was has proved harder to determine, but recursion might well be the answer.
The population of the earliest lineage, L0, is estimated to have expanded through the period 200,000 to 100,000 years ago…The L0 and L1 lineages exist at higher frequencies than the other lineages among present-day hunter-gatherers, who may therefore offer a window into the early history of Homo sapiens…The L3 lineage is of special interest, because it expanded rapidly in size from about 60,000 to 80,000 years ago, and seems to have been the launching pad for the migrations out of Africa that eventually populated the globe. Of the two non-African lineages that are the immediate descendants of L3, lineage M is estimated to have migrated out of Africa between 53,000 and 69,000 years ago, and lineage N between 50,000 and 64,000 years ago.
Why did L3 expand so rapidly, and migrate from Africa? One suggestion is that L3 gained some cultural advantage over the other lineages, perhaps through the invention of superior technologies, and that this gave them the means to migrate successfully. Paul Mellars suggests that the African exodus was predated by advances in toolmaking, including new stone-blade technologies, the working of animal skins, hafted implements, and ornaments. Some of the improvements in tool technology can be attributed to the use of fire to improve the flaking properties of stone, which dates from around 72,000 years ago on the south coast of Africa…
It need not follow that the L3 people were biologically more advanced than their African cousins, and it may well be that the exodus was driven by climate change rather than any technical superiority of L3 over the other haplogroups that remained in Africa. During the last ice age, there was a series of rapid climate swings known as Heinrich events. One of these events, known as H9, seems to have occurred at the time of the exodus from Africa, and was characterized by cooling and loss of vegetation, making large parts of North, West, and East Africa inhospitable for human occupation. It may also have been accompanied by a drop in sea levels, creating a land bridge into the Levant. So out of Africa they went, looking no doubt for greener pastures.
The exodus seems to have proceeded along the coast of the Red Sea, across the land bridge, and then round the southern coasts of Asia and southeast Asia, to reach New Guinea and Australia by at least 45,000 years ago. Mellars notes similarities in artifacts along that route as far as India, but remarks that technology seems to have declined east of India, especially in Australia and New Guinea. This may be attributable, he suggests, to the lack of suitable materials, adaptation to a more coastal environment requiring different technologies, and random fluctuations (cultural drift). A remarkable point of similarity, though, is the presence of red ochre in both Africa and in the earliest known human remains in Australia. Ochre was probably used in ritualistic body-painting, and perhaps in painting other surfaces. pp. 209-211
9. The Rise of Agriculture
Of course, the wild climate swings of the Pleistocene era eventually came to an end, giving way to the more climatically stable (to date) Holocene epoch. As the Last Glacial Maximum (LGM) came to a close, the earth underwent a massive de-glaciation, sending enormous amounts of cold, fresh water into the world’s oceans. Sea levels rose, and many land areas became submerged, such as Beringia (isolating the Americas), Doggerland (isolating Britain), and the Sahul Shelf (isolating Australasia). The melting glaciers caused the climate to undergo a rapid shift once again, killing off large numbers of the megafauna that earlier humans had relied on as their primary food source—animals such as the woolly mammoth and the ground sloth. The vast herds of reindeer that had provided sustenance for Paleolithic Europeans retreated northwards with the receding taiga, and southern Europe became heavily forested with larch and birch trees. In reaction, many human ancestors found themselves living in forests and grasslands once again, relying more and more on smaller, more solitary prey animals, and plant foods such as fruits, seeds, and nuts—a change sometimes referred to as the Broad Spectrum Revolution.
We know that the domestication of cereals dates from about 10,000-12,000 years ago in the Fertile Crescent—present-day Iraq, Syria, Lebanon, Israel, Kuwait, Jordan, southeastern Turkey, and southwestern Iran. What’s less clear, however, is just how long these plants were cultivated before we decided to grow them intensively enough to alter their DNA to the point where they became dependent upon us (and we upon them). Recent evidence keeps pushing this horticultural activity—“proto-farming”—further and further back into the past, suggesting that agriculture is less of an anomaly or innovation than formerly thought. It apparently coexisted for a long time alongside seasonal hunting and foraging in the Near East. In addition, it appears that other desirable plant foods like figs and legumes were cultivated long before cereal grains. In some cases, the Neolithic Revolution appears to have been actively resisted for as long as possible by many cultures.
After the colder, drier Younger Dryas period ended about 12,000 years ago, humans began to settle down in various grassy foothills and river valleys around the world and more intensively cultivate plant foods—especially cereal crops—which began the long march toward civilization, for better or worse.
10. Final Conclusions
Corballis strongly argues here (as he has in several books) that language originated with gestures, possibly before human ancestors migrated from Africa. Verbal speech, by contrast, came about much later, and may have originated sometime after the exodus from Africa—perhaps as recently as 50,000-60,000 years ago, based on anatomical evidence.
He argues that second- or perhaps third-order recursion was sophisticated enough to account for many of the types of behaviors we see in archaic humans (such as cooperative hunting and rudimentary religious beliefs), but that higher levels of recursive thought were inaccessible to them. These, he says, are unique to Homo sapiens, and may have begun as recently as 30,000 years ago during the Upper Paleolithic era, but we don’t know for sure.
He argues that these recursive abilities were mainly the result of human social needs, which then exploded into other diverse areas such as art, music, religion, and—perhaps most significantly—grammatical language, which can combine recursively to form an infinite number of ideas and concepts. Much later, things like advanced technology, science and mathematics flowed from these same recursive abilities as human societies grew ever larger and more complex. Humans’ ability to plan for and anticipate alternative futures is far more sophisticated than in any other species.
These recursive abilities also gave us the ability to know what others are thinking, leading directly to cumulative memetic evolution—passing down ideas and concepts, and adding to and extending them over time. No other species can do this as we can. Recursive thought also gave birth to mental time travel, allowing human thought to roam both the past and the future, and imagine alternative futures, or even fictional ones—i.e. stories, which bind human societies together. Stories gave rise to more complicated social groups which are recursively nested in expanding circles of kinship and affiliation.
By looking at simpler examples from around the animal kingdom, Corballis argues that the development of these abilities was not a sudden, random and inexplicable event as some have argued. Rather, he says, it was the natural outcome of the same evolutionary processes that led to all the other mental and physical abilities that make us unique in the animal kingdom:
In this book, I have tried to argue that recursion holds the key to that difference in mind, underlying such uniquely human characteristics as language, theory of mind, and mental time travel. It was not so much a new faculty, though, as an extension of existing faculties…there is no reason to suppose that the recursive mind evolved in some single, miraculous step, or even that it was confined to our species. Instead, it was shaped by natural selection, probably largely during the last two million years. p. 226
Although recursion was critical to the evolution of the human mind…it is not a “module,” the name given to specific, innate functional units, many of which are said to have evolved during the Pleistocene. Nor did it depend on some specific mutation, or some special kind of neuron, or the sudden appearance of a new brain structure. Rather, recursion probably evolved through progressive increases in short-term memory and capacity for hierarchical organization. These in turn were probably dependent on brain size, which increased incrementally, albeit rapidly, during the Pleistocene. But incremental changes can lead to sudden more substantial jumps, as when water boils or a balloon pops. In mathematics, such sudden shifts are known as catastrophes, so we may perhaps conclude that emergence of the human mind was catastrophic. p. 222
I have argued…that the extension of recursive principles to manufacture and technology was made possible largely through changes in the way we communicate. Language evolved initially for the sharing of social and episodic information, and depended at first on mime, using bodily movements to convey meaning. Through conventionalization, communication became less mimetic and more abstract. In the course of time it retreated into the face and eventually into the mouth, as late Homo gained voluntary control over voicing and the vocal tract, and the recursive ability to create infinite meaning through combinations of articulate sounds. This was an exercise in miniaturization, releasing the rest of the body, as well as recursive principles, for manipulation of the physical environment.
The complexities of the modern world are not of course the product of individual minds. Rather, they are the cumulative products of culture. Most of us have no idea how a jet engine, or a computer, or even a lightbulb, actually works. We all stand on the shoulders of giants…pp. 223-224
This concludes my review of The Recursive Mind by Michael C. Corballis. I hope you’ve enjoyed it and learned something new along the way.