A MAN AMONG APES
Make it thy business to know thyself, which is the most difficult lesson in the world.
Miguel de Cervantes, based on the ancient Greek aphorism, "Know thyself"
I have a remarkable friend named Bob Brain. In his early eighties, Bob now finds himself among the last of the true natural historians. Not much worth knowing about the complex ecological interplay of organism and environment escapes his deliberation. Lately, Bob's mind is on the origins of animal life. The particulars of his work on that topic are beyond the bounds of this book, but an overview is relevant. Although Bob's research is rooted in theory, it's driven by the recovery of fossils; in any historical science, data generated in the course of well-conceived fieldwork are the definitive sources of testable hypotheses. And, Bob's observations are defying conventional wisdom about the first appearance of the animals. He and his colleagues are documenting spongelike organisms fossilized in Namibian limestones. These fossils are in excess of 750 million years old, significantly more ancient than customary estimates of when the first animals emerged from their simpler eukaryotic ancestors. These results do not, however, inspire in Bob the same kind of rapture that has motivated other heretic intellectuals in their own battles with establishment "big boys." Instead, Bob's probing of the beginnings of multicellularity only confirms his long-held convictions that humankind in this new century stands at the brink of its own destruction, and that we tilt toward that abyss as the ultimate consequence of more than half a billion years of heterotrophy. Prior to the animals, all organisms were autotrophic, using inorganic carbon dioxide to fulfill their energetic requirements. The evolution of heterotrophy changed all that, and those organisms that entered this intensely competitive system began to ingest other life forms in order to satisfy their carbon needs. The food web was spun, and consumerism, in the basest sense of that term, was born.
One evening, overlooking a hard southern African sunset, Bob contemplated this natural state of affairs and deadpanned to me that "once these bastards evolved there was no turning back. It's quite disgusting, really."
It is from this basic biological perspective, and with an appreciation of the great depth of the Earth's geological history, that an uncharitable reader could dismiss the contents of this book. Is predation not now, more than 700 million years after its emergence, rendered just as mundane as it is profound? Animals kill, animals eat. Humans are animals, so they kill and eat. So what? But, consider just how unique an activity human hunting is. It's true that some nonhuman animals use tools to hunt, others target prey larger than an individual hunter, and a few even share food, but none, other than the human animal, possesses these traits as a behavioral complex used in combination both to satisfy their caloric and nutritional requirements and to build social cohesion. Many researchers argue that this type of multifaceted, socially complex hunting, still employed today by the last of the traditional human foragers, is the very socioecological basis of our humanness. If so, then human predation is undoubtedly a topic worthy of serious scholarship. Indeed, it has already generated an enormous body of research during the 150-plus years since the study of human evolution was first codified as paleoanthropology, a true, empirically based scientific endeavor. This book draws on that research, situating notions of human predation within a generalized paleoanthropological framework.
In addition to well-reasoned hypotheses of human evolution, part of that framework is the sociohistorical milieu in which those hypotheses were produced. Paleoanthropology never operated in a vacuum. Global events, concerns of science at large, and the proclivities of its practitioners conspired time and again to connect ideas about human predation to studies of aggression. And, invariably, new data from the human fossil record or from research on great apes, our closest living relatives, eventually called for a reassessment of that purported linkage between hunting and hostility.
Currently, we are firmly in the latter part of this repeating cycle. In the past fifteen or so years, it is primatologists who have generated the most provocative ideas about the presumed connections between human aggression and hunting and about how our earliest ancestors interacted with their biological symbionts and competitors. In particular, the keen observations of extant primates by Craig Stanford, Richard Wrangham, and Donna Hart and Robert Sussman have led to prominent but divergent conclusions about human origins. Drawing on his fieldwork in the forests of Tanzania, Stanford argued that primordial men were, like chimpanzees, eager and efficient killers, flesh-hungry Hunting Apes. Viewing the world through a darker lens, Wrangham upped the ante. His broad survey of ape behavior suggested to him that our forebears were not just hunters but that our agnatic ancestors were Demonic Males, hyperantagonistic louts, fully capable of dragging women around by their hair, and worse. Just as the collective work of Stanford and Wrangham brought the blood-letting to a fever pitch by the early 2000s, Hart and Sussman entered the scene to offset any such notions of Stone Age machismo and brutishness. They reminded us that most wild primates live a precarious existence, surrounded by and subject to the whims of hungry, prowling predators. Arguing by analogy, they continued that humans of the past must surely have faced the same looming conditions as do the primates of today. This opposing construct gave us Man the Hunted, the idea that our ancestors lived under continual menace from sabertooth cats, giant hyenas, and even large birds, eking out a spare existence in the shadows of a predator's world. So it is that we are left with the latest incarnations of the two great, contrasting narratives of human evolution: early man as misanthrope and mighty hunter versus early man as milksop.
In my opinion, there is merit in both of these views of early human life. However, I also argue that in perhaps underestimating, and surely underreporting, the importance of a rich archaeological record, produced by our Stone Age ancestors and stretching back at least two and a half million years into the past, the otherwise excellent accounts of my monkey- and ape-studying colleagues lack the essential component of the story. Simply put, the archaeological record stands as testament to the actual behavior of our prehistoric forerunners. Admittedly, this stone-and-bone witness of past action is accessible to us only in glimpses: the record is woefully incomplete, subject to the vagaries of ancient preservation and modern discovery. But those glimpses are still imbued with a genuineness of testimony that data on modern primate behavior can only approximate in its stead. Paleoanthropologist Tim White has opined in a similar context, "The rich detail of the modern world compared to the paucity of the prehistoric world can serve to obscure the recognition and analysis of evolutionary novelty. The present illuminates the past in myriad ways. However, the unwary paleobiologist can easily misinterpret past organisms by using inappropriate interpretive constructs based solely on modern form and function."
My already mild critique of the primate-centric approach to reconstructing our evolutionary history is softened even more by the undeniable fact that the work of Stanford, Wrangham, Hart, and Sussman (as well as many other primate specialists) is as fundamental as it is exceptional. In fact, without it science lacks the proper comparative framework to discern those precious few aspects of human uniqueness that might also be the basis of humanness. Those two things, human uniqueness and humanness, are, of course, different. Chimpanzees don't build skyscrapers, and baboons can't conduct orchestral symphonies (although it would be fun to watch them give it a go). More than that, though: neither a chimpanzee nor a baboon could even create and manage a simple campfire without a human's prompting and training. But these uniquely human capabilities, grand and humble, are, all the same, just overlays on a now very, very deeply buried human essence, which, if it is to be revealed, will require the efforts of not just geneticists, psychologists, and primate behaviorists but also those of archaeologists and paleoanthropologists. This little book concerns itself primarily with presenting current data from these latter two disciplines to supplement the fine primatological work already extant. Combining these datasets is intended to uncover the basis of humanness from natural history records that are always tantalizing but that are also usually imperfect and, oftentimes, frustratingly resistant. Reduced, the story is a simple one. Human hunting underlies humanness. Successful human hunting is necessarily decoupled from human aggression. Tools, in enhancing the distance between a human hunter and its nonhuman prey, facilitated this decoupling of action and emotion.
The first proposition isn't original. Many paleoanthropologists have long argued that hunting underpins humanness. Language is the only other proposed prime mover of "becoming human" that equals the influence of the "hunting hypothesis." Determining the prehistoric emergence of either human-styled hunting or language remains elusive, but that does not discourage continuing efforts. Most of those intellectual labors pivot around Homo erectus, a well-known species of extinct human ancestor that existed between 1.8 million and 500,000 years ago.
Does the Boy Make the (Hu)man?
The discovery, relegation, and eventual scientific acceptance of Homo erectus, and the psychological impact of its transition through those stages on its discoverer, is a story often told. In the end, Eugène Dubois emerges as a flawed hero, one whose emotional oscillation between impracticable tenacity and crushing resignation is finally vindicated posthumously, and only in historical retrospection. The story began valiantly enough in 1887, when Dubois bucked the shared academic wisdom that humans first evolved in Europe. Undeterred by archaic-looking (and thus possibly quite ancient) Neandertal fossils already known from various locales in Europe, Dubois acted instead on Darwin's prescient notion about the biogeography of human origins. Like Lord Monboddo (James Burnett), the famous Scottish jurist of nearly one hundred years before him, Darwin recognized that because our closest living relatives, the apes, are all tropical species, the tropics must be where our most recent common ancestor with the apes resided, as well as where the earliest members of each descending lineage must have evolved. From that elegantly reasoned starting point, a three-year stretch of Dubois's search for man's earliest ancestor was, in contrast, epitomized by a paleontological naïveté surpassed only by its human cost. Supporting himself as a medical officer in the Royal Dutch East Indies Army, Dubois prospected for fossils on Sumatra, and the endeavor was an abject failure: no truly ancient fossils were recovered, one of his two engineers died, and several members of his conscripted prisoner work crew deserted. Others who toughed it out were wretchedly ill for much of the expedition. It was only in 1890 that Dubois finally moved to more fertile ground in Java, where a human skull discovered a few years earlier piqued his interest.
By November 1890, Dubois's crew had unearthed a gnarly piece of lower jawbone that was undoubtedly that of a hominin (that is, a member of the zoological group that includes modern humans and all extinct species that are more closely related to us than they are to chimpanzees, with whom the hominins shared a most recent common ancestor about 6 million years ago). Motivated, Dubois continued his searches in Java, where, in October 1891, he found a hominin skullcap on the banks of the Solo River, and a hominin thigh bone ten months later. This was a truly impressive haul, especially considering that Dubois pressed on in spite of the orthodoxy of the time that largely dismissed human origins research in the tropics as a wild goose chase.
Unfortunately, even when presented with Dubois's impressive proof to the contrary, that prevailing attitude did not abate. The best reception Dubois found for his fossils was lukewarm. Based on its morphology, competent anatomists could not deny that the thigh bone would have supported a two-legged, upright-walking (bipedal) hominin. But, most also refused to accept an association between the thigh bone and the beetle-browed skullcap, which they conjectured belonged to a giant, gibbonlike ape rather than to a transitional human species, as Dubois claimed. Initially, Dubois railed against this poor treatment and defended the hominin status of Pithecanthropus erectus ("upright ape-man," the original scientific moniker of Homo erectus) on the European lecture circuit, but around the turn of the twentieth century he abruptly fell silent on the subject and hid the fossils.
Dubois's interpretation of Homo erectus as a genuine human ancestor was ultimately vindicated by subsequent discovery of the species's remains at other sites throughout Asia, Europe, and Africa. Study of those finds gradually built up a compelling picture of an animal that was quite distinct from Australopithecus, the genus of true ape-man species from the Pliocene and lower Pleistocene (geological epochs that spanned, collectively, 5.3 million to 780,000 years ago), and from the very earliest putative hominins like Ardipithecus, which first appeared about 7 million years ago, late in the Miocene Epoch. Conspicuously, Homo erectus skulls, with brainpan volumes ranging from 600 to 1,067 cubic centimeters and an average somewhere around 880 cubic centimeters, are much larger than those of modern apes, ancient ape-men, and putative Miocene root hominins. (Compare the relatively impressive cranial capacity of Homo erectus to an apish 350- to 600-cubic-centimeter range for Australopithecus, 300 to 350 for Ardipithecus, and a modern human average of about 1,400 cubic centimeters.) Corresponding to the expanded braincase of Homo erectus is its reduced, somewhat less projecting face as compared to apes and earlier-occurring hominins. Inferred from isolated scraps of the skeleton, sketchy estimates put Homo erectus adults at around five to five and a half feet in height and about 120 pounds. Each of these findings was an exciting incremental advance beyond Dubois's rudimentary understanding of Homo erectus. But, it was a discovery made nearly thirty years ago in the fossil-rich badlands of northern Kenya that truly pulled back the veil to reveal Homo erectus in its emergent humanness. The first minuscule fragments of what would eventually be pieced together into a nearly complete skeleton of Homo erectus (figure 1) were found by legendary fossil hunter Kamoya Kimeu at an unassuming site called Nariokotome.
The 1.5 million-year-old skeleton of Nariokotome Boy, from northern Kenya. The child, when he died, was well on his way to adulthood; likewise, his skeleton represents an early hominin species, Homo erectus, that was well on its way to becoming human. (Photograph courtesy of Alan Walker)
Analysis of the Nariokotome Homo erectus skeleton disclosed a seemingly young boy with a body of nearly adult stature: a body that was long and linear, the kind of physique that is best adapted to the tropics, able to maximize heat dissipation across its relatively expansive surface area compared to its relatively small volume. Moreover, Nariokotome Boy's arms and legs are proportioned in the same way as those of modern people. This implies to many anatomists that he possessed a more efficient, modern humanlike upright gait than did the putative root hominins and ape-men, some of which had extremely wide hips, relatively short legs, and even opposable big toes, morphology that would have, in comparison to Homo erectus, cost these types of hominins some efficiency in two-legged, bipedal striding. The narrow but deep, barrel-shaped ribcage of Nariokotome Boy is also like ours and contrasts with the inverted funnel-shaped ribcages of apes (more on this seemingly innocuous difference later in this chapter).
Initially, it seemed that the boy had died well before he was done growing. In order to estimate the boy's age at death, Walker and his team (as well as other researchers) used their knowledge of the ages at which different parts of modern human skeletons stop growing. First, all of Nariokotome Boy's permanent teeth, except his upper canines and upper and lower wisdom teeth, are erupted: in terms of the pace at which modern human adult teeth develop and emerge into the mouth, this condition places him between ten and ten and a half years old when he died. In contrast, the elbow end of the boy's upper arm bone, the humerus, had just begun to fuse (it is not until a long bone is done growing that the joint surfaces at both ends fuse permanently to the shaft in the middle) when he died, giving an estimated age of around thirteen years old at death. Probably because of his large body, most researchers favored the older estimate, derived from the data on his long bone fusion, as the most likely age at which the boy died. And this was where the situation stalled for several years.
But more recently, Christopher Dean and Holly Smith, specialists on early hominin teeth, surprised many paleoanthropologists by concluding that Nariokotome Boy may have instead been as young as eight years old when he died. In order to ascertain the time elapsed in tooth development at his death, Dean and Smith studied the boy's perikymata, growth bands on tooth enamel that form incrementally at particular rates. Complicating Dean and Smith's task was the fact that the Nariokotome Boy died when he was an older juvenile, meaning that most of his baby teeth were already shed and several of his erupted permanent teeth had already ceased developing. Thus, no single tooth preserved in his jaws records time for the whole of his short life (as can be the case for a mammal that died as a younger juvenile). To overcome this difficulty, Dean and Smith started counting perikymata in the earliest-formed tooth available in Nariokotome Boy's jaws. They then continued by picking up the uninterrupted count in another tooth that overlapped with the first tooth but that had also continued growing beyond the completed formation of that first tooth. In addition, because a tooth's roots continue growing for some time after its crown is completely formed, Dean and Smith needed to add estimated elapsed time of subsequent root growth to the Nariokotome Boy's teeth with completed crowns.
Another hitch in most perikymata studies is that the number of days between the formation of any tooth's adjacent perikymata, or periodicity, varies among individuals, although for modern people average periodicity is eight days. Nariokotome Boy, however, has only a few perikymata on his front teeth; research has shown that teeth with few, widely spaced perikymata usually have higher than usual periodicities (in other words, greater than eight-day periodicities). In addition, the state of Nariokotome Boy's tooth root formation argues against a typical eight-day periodicity for him. For these reasons, Dean and Smith suggest that a ten-day periodicity is more likely to characterize Nariokotome Boy, which, combined with the estimates of tooth root formation, places him between almost eight and a half and nearly nine years old when he died. (Using instead the typical modern human eight-day periodicity makes Nariokotome Boy even younger at death, between seven and a half and eight years old.)
Momentum in this mini-renaissance of our understanding of Homo erectus biology accelerated in 2008, when added to Dean and Smith's startling findings was the description of a newly discovered female Homo erectus pelvis from a fossil site in Ethiopia called Busidima. Analysis of the Busidima Homo erectus pelvis shed further new light on just what kind of animal Nariokotome Boy and the rest of his species were. The bony birth canal of the Busidima pelvis would have accommodated delivery of a baby with a brain volume of around 315 cubic centimeters, 30 to 50 percent of the adult brain size reconstructed for Homo erectus. In this respect, Homo erectus was like a modern person, with impressive prenatal brain development. But, because the adult Homo erectus brain reached (at best) only two-thirds of the mass of a modern human's, its remaining postnatal brain growth was completed relatively more quickly after birth than it is for humans. The upshot is that Homo erectus children, even those as young as the Nariokotome Boy, were probably highly independent and physically capable much earlier in life than are modern human kids. Even allowing for his long, lean build at the time of his death, the initial extrapolations of a six-foot-one-inch to six-foot-five-inch adult into whom a supposed thirteen-year-old Nariokotome Boy would grow were always perplexing. Dean and Smith's case for an eight-year-old Nariokotome Boy, already nearing his adult height and weight when he died, is less mystifying but holds just as profound, if different, implications.
Taken together, the new growth data for Nariokotome Boy and the Busidima Homo erectus pelvis paint a fascinating picture of a truly transitional animal: one that was like modern apes (and extinct root hominins and ape-men) in having a brief childhood, but also like modern humans in having significant brain growth in utero. In other words, Nariokotome Boy (and presumably all Homo erectus individuals, if he is indeed representative of the species) at eight years old was as physically mature as an ape of the same age but far exceeded the corporeal development of a modern eight-year-old human child, instead approaching more closely that of a fifteen-year-old boy. So, what does this mean for Homo erectus if we pose the crass (and probably unanswerable) question: Did the species, for all its admitted impressiveness, manage to traverse that indistinct threshold across which an ancestral hominin became a human?
In superlative irony, it is the Nariokotome Boy's principal analyst who has been most ardent in answering no to that question. It was Alan Walker, an eclectic intellectual with a background in geology and primate anatomy, who assembled and led the team of scientists who first studied Nariokotome Boy's skeletal biology. Walker's colleague, Ann MacLarnon, analyzed the skeleton's thoracic vertebrae and concluded, because of the constricted central canals of those vertebrae, that the boy possessed a spinal cord that would have been smaller in his thoracic region than that of a modern person. Specifically, based on comparisons with the differential distribution of spinal cord tissues in modern primates, Nariokotome Boy would have had a smaller amount of spinal cord gray matter in his thoracic spinal column than do modern people. This, in turn, indicated to MacLarnon that people living today have greater innervation of their thoraxes than did Homo erectus. By extension, modern people also have greater control over their breathing than did Homo erectus. Because precise control of breathing is a requisite for human speech, Walker and MacLarnon contend further that Homo erectus was unable to talk. And, for Walker, "At some deep level, being fully human is predicated upon being linguate. That meant that the boy my colleagues and I spent so many years discovering and analyzing was profoundly in-human."
Thus, for those who equate language and humanness, the case on the humanness of Homo erectus was assumed closed: Nariokotome Boy lacked language, therefore he was not human. More recently, however, paleoanthropologists Bruce Latimer and Jim Ohman argued that the narrow thoracic vertebral canal of Nariokotome Boy is aberrant, its constriction the result not of the boy's normal development but instead a manifestation of pathological axial dysplasia. For Latimer and Ohman this means that we should not assume that the structure of Nariokotome Boy's vertebral canal is typical of Homo erectus as a species, and, as follows, that conclusions about language deficiencies in the species based on the boy's skeletal morphology are, at best, premature. MacLarnon and her collaborator Gwen Hewitt countered that a pathologically narrowed vertebral canal in Nariokotome Boy (narrowed by up to 40 percent, assuming that other, "normal" Homo erectus individuals had canals of the same sizes as do modern humans) would have disrupted neurological communication between the child's brain and his legs. They go on to note that Nariokotome Boy's leg bones are thick and robust, evidence of their normal, habitual use, which implies further that they were, necessarily, also normally innervated.
So, if the jury is still out on the "language-makes-human hypothesis" as it articulates with what we know about Homo erectus, then what about "man the hunter," or "woman the gatherer," or "the home base/male-female food-sharing model," or any of those other once much vaunted hypotheses of humanness that I discuss in the forthcoming pages? For now, it is sufficient to note that each ultimately proved to be a caricature of human cultural evolution, but also that within the core of that scrap heap is a nugget worthy of excavation. The starting premise of each of those discarded antiquations, that social organization is a practical marker of humanness, remains a pragmatic one. That is because the social organization of our ancient ancestors is potentially detectable in the archaeological and fossil records.
The Gut of the Matter
Modern hunter-gatherers are not untouched by the machinations of bordering agropastoralists and ever-encroaching industrialized societies, but they still maintain a cultural status among living people that most closely approximates the "natural" social condition of Homo sapiens. What are the circumstances that allowed this fundamental state of organization to develop from an apelike existence, and when did it occur: at only 200,000 years ago, when modern humans first appeared; before that, with the emergence of Homo erectus at 1.8 million years ago; deeper in time, with the beginning of the genus Homo at roughly 2.3 million years ago; or even before, with the australopithecines or Miocene root hominins? Time and again, that question has been answered by invoking a single "magic catalyst." And each such answer is ultimately (if not immediately) annihilated.
An increased regularity of meat eating (beyond that observed in modern apes) does seem to be an essential component of the basic human subsistence strategy, but it is only part of a complex feedback system. Katharine Milton, a specialist on primate diets and nutrition, elaborated on this nuanced awareness: "It is the behavioral trajectory taken by humans to secure high-quality foods-rather than simply the foods themselves-that has made humans human. ... Evolving humans appear to have relied increasingly on brain power as the key element in their dietary strategy, using technological and social innovations to secure and process foods before ingestion." Human foraging societies developed out of this evolutionary strategy. And so, unsurprisingly, humanization was a process, rather than, as "magic catalyst" thinking evokes, a singular event.
Milton further discriminates her model from a "meat-as-the-magic-catalyst hypothesis" by viewing animal protein not as an exclusive nutritional focus of ancestral hominins, but instead as a dietary facilitator that allowed those hominins to intensify their exploitation of carbohydrate-rich plant foods. According to Milton, high-energy plants, rather than meat, were the most important fuels supporting increasingly larger brains (and bodies) as the genus Homo evolved after its first appearance about 2.3 million years ago. Because meat is so nutritionally dense, consuming it regularly can ease the high-cost search for and recovery of glucose-laden plants, resources that are patchily distributed and seasonally oscillating in their availability. Even a relatively small amount of meat (or of other carcass resources, like skin, marrow, and brains) efficiently satisfies daily requirements for essential fatty acids and basal energy.
This fact, that with meat a little gets you a lot, eventually became the basis of an influential explanation for a shift in hominin body form tracked through the course of human evolution. The human brain is a voracious consumer; at just around 2 percent of the body's total mass, it demands nearly 20 percent of the body's daily intake of energy and oxygen. Paleoanthropologists Leslie Aiello and Peter Wheeler referred to the brain as the most important expensive body tissue in their "expensive tissue hypothesis" of the evolution of the genus Homo. But, like the brain, the heart, kidneys, liver, and gastrointestinal tract are also very demanding metabolically. Aiello and Wheeler argued that it was impossible for all these other organs to be maintained at such high cost if the hominin brain was to enlarge over time; there is a limit on the amount of energy that an organism is capable of capturing every day. The problem is that an early hominin would not survive very long with compromised heart, kidney, or liver function. Properly functioning guts are also essential to life, but there is more biological latitude for variation in gut size and long-term survivorship. According to Aiello and Wheeler, this all means that reducing gut size was the only evolutionarily workable option for early Homo as it evolved larger brains while continuing to maintain critical life-supporting bodily functions.
Aiello and Wheeler also suggested that paleontological evidence confirmed their hypothesis. Typically, hominin soft tissues, like brains and guts, do not fossilize. However, ancillary bony evidence that is readily preserved can inform us indirectly about the respective sizes of our ancestors' brains and guts. The empty space that the brain occupied in life can be measured on fossilized skulls, providing a size estimate of that once-living, now-decayed organ. Comparisons of skulls from different hominin species show that brain volume increased between hominin species through time. Gut size is trickier, but probably not impossible, to pin down using the osseous fossil record. Aiello and Wheeler wrote, "The large gut of the living [apes] gives their bodies a somewhat pot-bellied appearance, lacking a discernable waist. This is because the rounded profile of the abdomen is continuous with that of the lower portion of the rib cage, which is shaped like an inverted funnel. ..." Based on the fragments of ape-man ribs and vertebrae recovered up to the early 2000s, most paleoanthropologists inferred that the earliest members of the genus Australopithecus had thoraxes with the same apish morphology as the trunks of modern gorillas and chimpanzees. In addition, the configuration of Australopithecus hips, which flare more widely at their tops than is expected from the estimated heights of individual ape-men, was also used to reconstruct their guts as protuberant.
However, in 2005, a 3.6 million-year-old ape-man skeleton, belonging to the species Australopithecus afarensis, was discovered at the Ethiopian site of Woranso-Mille. The skeleton preserves several partial ribs and a partial pelvis, allowing reconstruction of the shape of its thorax. To the surprise of many, the skeleton's reconstructed thorax does not have the inverted funnel shape of an ape, thought previously (based on more fragmentary rib fossils) to also characterize the ape-men. Instead, the Woranso-Mille ribcage is uniquely bell shaped: wide at the top, like that of early Homo and modern humans, and wide at the bottom. (An even more recent study of Australopithecus afarensis thoracic vertebrae, recovered from the Ethiopian site of Hadar, agrees that ape-men had ribcages that were in many ways more humanlike than was previously appreciated.) Implications of this more accurate understanding of Australopithecus thorax shape for the "expensive tissue hypothesis" remain unexplored, but I note that inferences of an expansive ape-man gut are unchanged by the Woranso-Mille skeleton; it is only the shape of the top of the specimen's ribcage, unrelated to gut size, that is unexpected.
More direct assault on the "expensive tissue hypothesis" comes from a recent study on brain size and organ mass in one hundred modern mammal species, including twenty-three types of primates. Anthropologists Ana Navarrete, Carl van Schaik, and Karin Isler demonstrated that when "controlling for fat-free body mass, brain size is not negatively correlated with the mass of the digestive tract or any other expensive tissue, thus refuting the expensive tissue hypothesis." The researchers concluded that brain enlargement in hominins must have, therefore, been enabled instead by a combination of biological phenomena, which not only increased energy capture (as is possible with a higher-quality diet) but that also reduced overall energy demands of the individual. Among other adaptations, Navarrete and her colleagues point to the increasing efficiency of two-legged bipedal walking over the long course of human evolution (as evinced by more recent, larger-brained Homo erectus having longer legs and narrower hips than did earlier, smaller-brained Australopithecus, with its shorter legs and wider hips) as a reduction in energy expenditure relative to the climbing and four-legged quadrupedal locomotion of nonhuman apes.
Regardless, the large brain size and barrel-shaped ribcage of Homo erectus, as exemplified by the Nariokotome Boy, mean that, no matter how and why those morphologies evolved, evolution ultimately set up a situation in which Homo erectus was now locked into an inescapable feedback system, a system in which reliable access to animal products was no longer a luxury but a necessity. The small, short guts of Homo erectus did not provide transit times long enough to extract from bulk vegetation the complexly bound energy and nutrients that these hominins needed to feed their large, hungry brains. In addition, by almost 1.8 million years ago, there is incontrovertible evidence from the site of Dmanisi, in the (former Soviet) Republic of Georgia, of Homo erectus living outside of Africa, along the southern slopes of the Caucasus Mountains. Subsisting in that markedly seasonal environment, with its long winters during which plant food was unavailable, required that Homo erectus be a proficient meat forager: like modern humans, using weapons to kill animals larger than an individual hunter and, because of the large size of prey, probably sharing meat within a group. Indeed, the preponderance of archaeological evidence from Homo erectus sites worldwide (not just from temperate regions, like the Caucasus Mountains) supports the idea that the species as a whole had developed these humanlike abilities by as early as 1.8 million years ago.
But, can we push deeper in time? Were the ape-men also perhaps more men than apes in their predatory prowess? A puckish, early twentieth-century champion of Australopithecus certainly thought so. More than that, he thought he had proved it in the most dramatic way.