Facultative Carnivore Reasons

Can a carnivore diet provide all essential nutrients?

A carnivore diet can provide all essential nutrients

Can a carnivore diet provide all essential nutrients?

Abstract

Purpose of review: The aim of this study was to summarize current contributions affecting knowledge and predictions about the nutritional adequacy of plant-free diets, contextualized by historical accounts.

Recent findings: As demonstrated in recent experiments, nutrient interactions and metabolic effects of ketogenic diets can impact nutritional needs, sometimes resulting in nutrient-sparing effects. Other studies highlight conflicting hypotheses about the expected effect on metabolic acidosis, and therefore mineral status, of adding alkaline mineral-rich vegetables.

Summary: A carnivore diet is a newly popular, but as yet sparsely studied form of ketogenic diet in which plant foods are eliminated such that all, or almost all, nutrition derives from animal sourced foods. Ketogenic diets are already nutritionally controversial due to their near-complete absence of carbohydrate and high dietary fat content, but most ketogenic diet advocates emphasize the inclusion of plant foods. In this review, we discuss the implications of relying solely on animal sourced foods in terms of essential nutrient status.

Animal Fat and Cholesterol May Have Helped Primitive Man Evolve a Large Brain

Animal fat and cholesterol in the diet may have helped large brains evolve.

Animal Fat and Cholesterol May Have Helped Primitive Man Evolve a Large Brain


ANIMAL FAT AND CHOLESTEROL MAY HAVE HELPED PRIMITIVE MAN EVOLVE A LARGE BRAIN

FRANK D. MANN*

In 1962, James V. Neel published a paper entitled "Diabetes Mellitus: A 'Thrifty' Genotype Rendered Detrimental by 'Progress'?" [1]; 20 years later, Neel revisited his theory, suggesting appropriate revisions because of significant additions to our knowledge of diabetes. During the intervening period, just as predicted by Neel's theory, a high frequency of diabetes was observed in populations emerging from the sparse diet of a primitive economy into the "progress" of a developed economy [2]. A prominent example of such a population, the Pima Indian tribe, has been and continues to be thoroughly studied [3]. Neel himself carried out a difficult field investigation of two South American tribes, showing that, in accordance with his theory, they were not diabetic while remaining in a primitive state [2]. An interest of many years' duration in Neel's original concept of a "thrifty" genotype led me to consider the possibility of an analogous case of a heritage now detrimental but originally advantageous. Overproduction of cholesterol, associated with consumption of animal fat but probably also with evolutionary changes in hepatic physiology, may have been helpful in meeting the increased need for cholesterol in the evolution of the large brain of modern humans. I shall present an analysis of evidence for this hypothesis.

Leakey's Reasoning: A Rich Meat Diet Was Required for the Large Human Brain

Leakey has been led by his reading of the anthropological language of fossil teeth and skulls to an unequivocal judgment of the importance of a meat diet [4]. Leakey does not attempt to evaluate the individual nutritional components of the meat diet, but considers a nutritionally richer diet to be necessary for the maintenance of the metabolically expensive large brain. The modern human brain surely is metabolically expensive, requiring 20 percent of the total resting oxygen consumption of the body, although it is only 2 percent of the body weight [5]. Chamberlain also considers the meat diet to be important and focuses on the essential fatty acids. While noting that these substances are present in vegetable sources, he suggests that the ample supply of the essential as well as other unsaturated fatty acids in meat could have conferred a survival advantage during the three-fold increase in the size of the human brain that occurred within 3 million years of evolution [6]. To obtain meat as the principal constituent of diet, primitive man would have had to compete with animals such as cats, which had already evolved formidable built-in equipment for killing: fangs and claws. Rapid evolution of a large brain was needed in this deadly competition. A diet rich in animal fat would have markedly increased the intake and hepatic synthesis of cholesterol. Would this have helped the development of the large brain? Evidence regarding the functions of cholesterol and the mechanisms of its supply and distribution is pertinent to this hypothesis.

Cholesterol and Membranes

Cholesterol has long been known to be an essential constituent of cell membranes and has been reported to comprise, on a molecular basis, half of the lipid content of the external cell membrane [7]. Myelinated nerve fibers make up a large part of the total mass of the brain. Their highly specialized covering membranes contain a large quantity of various lipids. Thus, early biochemists found brain to be a convenient source material from which to prepare cholesterol.
Fielding has described the complex mechanism of transporters and receptors which have evolved to assure that all membranes have the appropriate individual content of free cholesterol [8]. Tight regulation of the very different amounts of free cholesterol in different membranes is necessary for various vital functions performed by structures located in the membranes. These include the ion pumps necessary for the life of every cell, and adenylate cyclase, which is required for the cell to respond to transmitter substances such as norepinephrine. Schroeder and colleagues have recently reviewed the evidence that highly asymmetric distributions of free cholesterol, within the membranes themselves...

Metabolic correlates of hominid brain evolution

Human brains use much more energy than other primates' brains, supported by energy-rich diets high in animal fat.

Metabolic correlates of hominid brain evolution

While the brain of an adult primate consumes <10% of the total resting metabolic rate, this amounts to 20-25% in the case of anatomically modern humans [Leonard et al. 2003].


Metabolic correlates of hominid brain evolution

Abstract

Large brain sizes in humans have important metabolic consequences as humans expend a relatively larger proportion of their resting energy budget on brain metabolism than other primates or non-primate mammals. The high costs of large human brains are supported, in part, by diets that are relatively rich in energy and other nutrients. Among living primates, the relative proportion of metabolic energy allocated to the brain is positively correlated with dietary quality. Humans fall at the positive end of this relationship, having both a very high quality diet and a large brain size. Greater encephalization also appears to have consequences for aspects of body composition. Comparative primate data indicate that humans are ‘under-muscled’, having relatively lower levels of skeletal muscle than other primate species of similar size. Conversely, levels of body fatness are relatively high in humans, particularly in infancy. These greater levels of body fatness and reduced levels of muscle mass allow human infants to accommodate the growth of their large brains in two important ways: (1) by having a ready supply of stored energy to ‘feed the brain’, when intake is limited and (2) by reducing the total energy costs of the rest of the body. Paleontological evidence indicates that the rapid brain evolution observed with the emergence of Homo erectus at approximately 1.8 million years ago was likely associated with important changes in diet and body composition.

Phylogenetic rate shifts in feeding time during the evolution of Homo

Smaller molars and much less time spent feeding than body size would predict suggest that meat eating and food processing were causal in our evolution.

Phylogenetic rate shifts in feeding time during the evolution of Homo

Starting with Homo erectus, humans developed smaller molars and also began to spend a lot less time on feeding than would be predicted from body mass and phylogeny with other apes (only 5% instead of a predicted 48% of daily activity in Homo sapiens) [Organ et al. 2011].


Phylogenetic rate shifts in feeding time during the evolution of Homo

Chris Organ, Charles L. Nunn, Zarin Machanda, and Richard W. Wrangham


Abstract

Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

Decreases in tooth size, jawbone size, chewing muscles, and maximum bite force indicate a shift to animal source foods.

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

Furthermore, the shift from fibrous plants to a diet including ASFs, together with the use of tools, paralleled a decrease in tooth and jawbone size, a reduction in chewing muscles, and weaker maximum bite force capabilities [Teaford & Ungar 2000; Zink & Lieberman 2016].

Diet and the evolution of the earliest human ancestors

Mark F. Teaford and Peter S. Ungar

Abstract

Over the past decade, discussions of the evolution of the earliest human ancestors have focused on the locomotion of the australopithecines. Recent discoveries in a broad range of disciplines have raised important questions about the influence of ecological factors in early human evolution. Here we trace the cranial and dental traits of the early australopithecines through time, to show that between 4.4 million and 2.3 million years ago, the dietary capabilities of the earliest hominids changed dramatically, leaving them well suited for life in a variety of habitats and able to cope with significant changes in resource availability associated with long-term and short-term climatic fluctuations.

Since the discovery of Australopithecus afarensis, many researchers have emphasized the importance of bipedality in scenarios of human origins (1, 2). Surprisingly, less attention has been focused on the role played by diet in the ecology and evolution of the early hominids. Recent work in a broad range of disciplines, such as paleoenvironmental studies (3, 4), behavioral ecology (5), primatology (6), and isotope analyses (7), has rekindled interest in early hominid diets. Moreover, important new fossils from the early Pliocene raise major questions about the role of dietary changes in the origins and early evolution of the Hominidae (8–10). In short, we need to focus not just on how the earliest hominids moved between food patches, but also on what they ate when they got there.

This paper presents a review of the fossil evidence for the diets of the Pliocene hominids Ardipithecus ramidus, Australopithecus anamensis, Australopithecus afarensis, and Australopithecus africanus. These hominids offer evidence for the first half of human evolution, from our split with prehistoric apes to the earliest members of our own genus, Homo. The taxa considered are viewed as a roughly linear sequence from Ardipithecus to A. africanus, spanning the time from 4.4 million to 2.5 million years ago. As such, they give us a unique opportunity to examine changes in dietary adaptations of our ancestors over nearly 2 million years. We also trace what has been inferred concerning the diets of the Miocene hominoids to put changes in Pliocene hominid diets into a broader temporal perspective. From such a perspective, it becomes clear that the dietary capabilities of the early hominids changed dramatically in the time period between 4.4 million and 2.3 million years ago. Most of the evidence has come from five sources: analyses of tooth size, tooth shape, enamel structure, dental microwear, and jaw biomechanics. Taken together, they suggest a dietary shift in the early australopithecines, to increased dietary flexibility in the face of climatic variability. Moreover, changes in diet-related adaptations from A. anamensis to A. afarensis to A. africanus suggest that hard, abrasive foods became increasingly important through the Pliocene, perhaps as critical items in the diet.

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

Katherine D. Zink & Daniel E. Lieberman

Abstract
The origins of the genus Homo are murky, but by H. erectus, bigger brains and bodies had evolved that, along with larger foraging ranges, would have increased the daily energetic requirements of hominins [1,2]. Yet H. erectus differs from earlier hominins in having relatively smaller teeth, reduced chewing muscles, weaker maximum bite force capabilities, and a relatively smaller gut [3,4,5]. This paradoxical combination of increased energy demands along with decreased masticatory and digestive capacities is hypothesized to have been made possible by adding meat to the diet [6,7,8], by mechanically processing food using stone tools [7,9,10], or by cooking [11,12]. Cooking, however, was apparently uncommon until 500,000 years ago [13,14], and the effects of carnivory and Palaeolithic processing techniques on mastication are unknown. Here we report experiments that tested how Lower Palaeolithic processing technologies affect chewing force production and efficacy in humans consuming meat and underground storage organs (USOs). We find that if meat comprised one-third of the diet, the number of chewing cycles per year would have declined by nearly 2 million (a 13% reduction) and total masticatory force required would have declined by 15%. Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.

Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics

Meat and Nicotinamide

Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics

Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics (2017), Adrian C. Williams and Lisa J. Hill

Hunting for meat was a critical step in all animal and human evolution. A key brain-trophic element in meat is vitamin B3 / nicotinamide. The supply of meat and nicotinamide steadily increased from the Cambrian origin of animal predators ratcheting ever larger brains. This culminated in the 3-million-year evolution of Homo sapiens and our overall demographic success. We view human evolution, recent history, and agricultural and demographic transitions in the light of meat and nicotinamide intake. A biochemical and immunological switch is highlighted that affects fertility in the ‘de novo’ tryptophan-to-kynurenine-nicotinamide ‘immune tolerance’ pathway. Longevity relates to nicotinamide adenine dinucleotide consumer pathways. High meat intake correlates with moderate fertility, high intelligence, good health, and longevity with consequent population stability, whereas low meat/high cereal intake (short of starvation) correlates with high fertility, disease, and population booms and busts. Too high a meat intake and fertility falls below replacement levels. Reducing variances in meat consumption might help stabilise population growth and improve human capital.

Meat, NAD, and Human Evolution

Archaeological and palaeontological evidence indicates that hominins increased meat consumption and developed the necessary fabricated stone tools while their brains and their bodies evolved for a novel foraging niche and hunting range, at least 3 million years ago. This 'cradle of mankind' was centred around the Rift Valley in East Africa, where the variable climate and savannah conditions, with reductions in forests and arboreal living for apes, may have required clever and novel foraging in an area where overall prey availability but also predator dangers were high [44–50] (Figure 2). Tools helped hunting and butchery and reduced time and effort spent chewing, as did cooking later [51]. Another crucial step may have been the evolution of a cooperative social unit with divisions of labour, big enough to ensure against the risks involved in hunting large game and the right size to succeed as an ambush hunter – with the requisite prosocial and altruistic skills to also share the spoils across sexes and ages [52]. The ambitious transition from prey to predator, hunting the then extensive radiation of megaherbivores so big that they are normally considered immune to carnivores, needed advanced individual and social cognition, as humans do not have the usual physical attributes of a top predator [53–59]. Adult human requirements to run such big brains are impressive enough, but during development they are extraordinarily high, with 80% to 90% of basal metabolic rate necessary in neonates – this is probably not possible after weaning without the use of animal-derived foods [51,60,61].

Endurance running and the evolution of Homo

Long distance endurance running may have played a role in persistence hunting

Endurance running and the evolution of Homo

Abstract

Striding bipedalism is a key derived behaviour of hominids that possibly originated soon after the divergence of the chimpanzee and human lineages. Although bipedal gaits include walking and running, running is generally considered to have played no major role in human evolution because humans, like apes, are poor sprinters compared to most quadrupeds. Here we assess how well humans perform at sustained long-distance running, and review the physiological and anatomical bases of endurance running capabilities in humans and other mammals. Judged by several criteria, humans perform remarkably well at endurance running, thanks to a diverse array of features, many of which leave traces in the skeleton. The fossil evidence of these features suggests that endurance running is a derived capability of the genus Homo, originating about 2 million years ago, and may have been instrumental in the evolution of the human body form.


Rethinking the evolution of the human foot: insights from experimental research

Nicholas B. Holowka, Daniel E. Lieberman

Journal of Experimental Biology 2018, 221: jeb174425. doi: 10.1242/jeb.174425. Published 6 September 2018.

Abstract

Adaptive explanations for modern human foot anatomy have long fascinated evolutionary biologists because of the dramatic differences between our feet and those of our closest living relatives, the great apes. Morphological features, including hallucal opposability, toe length and the longitudinal arch, have traditionally been used to dichotomize human and great ape feet as being adapted for bipedal walking and arboreal locomotion, respectively. However, recent biomechanical models of human foot function and experimental investigations of great ape locomotion have undermined this simple dichotomy. Here, we review this research, focusing on the biomechanics of foot strike, push-off and elastic energy storage in the foot, and show that humans and great apes share some underappreciated, surprising similarities in foot function, such as use of plantigrady and ability to stiffen the midfoot. We also show that several unique features of the human foot, including a spring-like longitudinal arch and short toes, are likely adaptations to long distance running. We use this framework to interpret the fossil record and argue that the human foot passed through three evolutionary stages: first, a great ape-like foot adapted for arboreal locomotion but with some adaptations for bipedal walking; second, a foot adapted for effective bipedal walking but retaining some arboreal grasping adaptations; and third, a human-like foot adapted for enhanced economy during long-distance walking and running that had lost its prehensility. Based on this scenario, we suggest that selection for bipedal running played a major role in the loss of arboreal adaptations.

Unique morphology of the human eye and its adaptive meaning: comparative studies on external morphology of the primate eye

Eyes - Humans can communicate with their gaze, which would have been useful for coordination when hunting.

Unique morphology of the human eye and its adaptive meaning: comparative studies on external morphology of the primate eye

Abstract

In order to clarify the morphological uniqueness of the human eye and to obtain cues to understanding its adaptive significance, we compared the external morphology of the primate eye by measuring nearly half of all extant primate species. The results clearly showed exceptional features of the human eye: (1) the exposed white sclera is void of any pigmentation, (2) humans possess the largest ratio of exposed sclera in the eye outline, and (3) the eye outline is extraordinarily elongated in the horizontal direction. The close correlation of the parameters reflecting (2) and (3) with habitat type or body size of the species examined suggested that these two features are adaptations for extending the visual field by eyeball movement, especially in the horizontal direction. Comparison of eye coloration and facial coloration around the eye suggested that the dark coloration of exposed sclera of nonhuman primates is an adaptation to camouflage the gaze direction against other individuals and/or predators, and that the white sclera of the human eye is an adaptation to enhance the gaze signal. The uniqueness of human eye morphology among primates illustrates the remarkable difference between human and other primates in the ability to communicate using gaze signals.

Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

Man the Fat Hunter

Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

It is likely that humans preferred large herbivores given the abundance of their biomass, the relative ease of hunting them, their net caloric returns, and their higher fat content, which accommodates physiological limits on protein consumption.


Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

Abstract

The worldwide association of H. erectus with elephants is well documented and so is the preference of humans for fat as a source of energy. We show that rather than a matter of preference, H. erectus in the Levant was dependent on both elephants and fat for his survival. The disappearance of elephants from the Levant some 400 kyr ago coincides with the appearance of a new and innovative local cultural complex – the Levantine Acheulo-Yabrudian and, as is evident from teeth recently found in the Acheulo-Yabrudian 400-200 kyr site of Qesem Cave, the replacement of H. erectus by a new hominin. We employ a bio-energetic model to present a hypothesis that the disappearance of the elephants, which created a need to hunt an increased number of smaller and faster animals while maintaining an adequate fat content in the diet, was the evolutionary drive behind the emergence of the lighter, more agile, and cognitively capable hominins. Qesem Cave thus provides a rare opportunity to study the mechanisms that underlie the emergence of our post-erectus ancestors, the fat hunters.

The impact of large terrestrial carnivores on Pleistocene ecosystems

Hunting large prey, as humans did, is exclusively associated with hypercarnivory.

The impact of large terrestrial carnivores on Pleistocene ecosystems

Significance

At very high densities, populations of the largest herbivores, such as elephants, have devastating effects on the environment. What prevented widespread habitat destruction in the Pleistocene, when the ecosystem sustained many species of huge herbivores? We use data on predator–prey body mass relationships to predict the prey size ranges of large extinct mammalian carnivores, which were more diverse and much larger than living species. We then compare these prey size ranges with estimates of young mammoth sizes and show that juvenile mammoths and mastodons were within predicted prey size ranges of many of the Pleistocene carnivores. From this and other fossil evidence we argue that, by limiting population sizes of megaherbivores, large carnivores had a major impact on Pleistocene ecosystems.

Caries Through Time: An Anthropological Overview

Caries related to hypocarnivory

Caries Through Time: An Anthropological Overview

Although tooth plaque isn't suitable for determining the HTL (human trophic level) within the vast zooarchaeological landscape [53, 54], it may be marginally accurate for identifying shifts. For instance, Neanderthals were known to rely heavily on animal-sourced foods and showed only six caries (signs of decay) among the 1,250 of their teeth that were examined [55]. Caries started appearing in substantial numbers between 13,700 and 15,000 years ago in Morocco, alongside evidence of increased starch consumption [56]. The low occurrence of caries during most of the Pleistocene corresponds to a low-carbohydrate, high-HTL pattern.


Hard plant tissues do not contribute meaningfully to dental microwear: evolutionary implications

Abstract
Reconstructing diet is critical to understanding hominin adaptations. Isotopic and functional morphological analyses of early hominins are compatible with consumption of hard foods, such as mechanically-protected seeds, but dental microwear analyses are not. The protective shells surrounding seeds are thought to induce complex enamel surface textures characterized by heavy pitting, but these are absent on the teeth of most early hominins. Here we report nanowear experiments showing that the hardest woody shells – the hardest tissues made by dicotyledonous plants – cause very minor damage to enamel but are themselves heavily abraded (worn) in the process. Thus, hard plant tissues do not regularly create pits on enamel surfaces despite high forces clearly being associated with their oral processing. We conclude that hard plant tissues barely influence microwear textures and the exploitation of seeds from graminoid plants such as grasses and sedges could have formed a critical element in the dietary ecology of hominins.


Non-occlusal dental microwear variability in a sample of Middle and Late Pleistocene human populations from Europe and the Near East

Abstract

Non-occlusal, buccal tooth microwear variability has been studied in 68 fossil humans from Europe and the Near East. The microwear patterns observed suggest that a major shift in human dietary habits and food processing techniques might have taken place in the transition from the Middle to the Late Pleistocene populations. Differences in microwear density, average length, and orientation of striations indicate that Middle Pleistocene humans had more abrasive dietary habits than Late Pleistocene populations. Both dietary and cultural factors might be responsible for the differences observed. In addition, the Middle Paleolithic Neanderthal specimens studied show a highly heterogeneous pattern of microwear when compared to the other samples considered, which is inconsistent with a hypothesis of all Neanderthals having a strictly carnivorous diet. The high density of striations observed in the buccal surfaces of several Neanderthal teeth might be indicative of the inclusion of plant foods in their diet. The buccal microwear variability observed in the Neanderthals is compatible with an overall exploitation of both plant and meat foods on the basis of food availability. A preliminary analysis of the relationship between buccal microwear density and climatic conditions prevailing in Europe during the Late Pleistocene has been attempted. Cold climatic conditions, as indicated by oxygen isotope stage data, seem to be responsible for higher densities of microwear features, whereas warmer periods could correspond to a reduced pattern of scratch density. Such a relationship would be indicative of less abrasive dietary habits, perhaps more meat dependent, during warmer periods.


Caries Through Time: An Anthropological Overview


Bioanthropological research carried out in the last few decades has placed special emphasis on the study of the relation between disease and social and environmental phenomena, reinforcing the already strong connection between lifestyle and health conditions throughout the history of humankind (Cohen & Armelagos, 1984; Katzenberg & Saunders, 2008; Larsen, 1997). Because infectious diseases result from the interaction between host and agent, modulated by ecological and cultural environments, the comparative study of the historic prevalence of diseases in past populations worldwide can provide important data about their related factors and etiology. The study of dental diseases (such as caries) has been given special attention in paleopathology. The tooth, owing to its physical features, tends to resist destruction and taphonomic conditions better than any other body tissue and is therefore a valuable element for the study of an individual's diet, and of the social and cultural factors related to it, from a population perspective. Caries is one of the infectious diseases most easily observable in human remains retrieved from archaeological excavations. Because of their long development time and non-lethal nature, the lesions present at the time of death remain recognizable indefinitely, allowing us to infer, along with other archaeological and ecological data, the types of food that a specific population consumed, the cooking technology they used, the relative frequency of consumption, and the way the food was shared among the group (Hillson, 2001, 2008; Larsen, 1997; Rodríguez, 2003).


Earliest evidence for caries and exploitation of starchy plant foods in Pleistocene hunter-gatherers from Morocco

Significance

We present early evidence linking a high prevalence of caries to a reliance on highly cariogenic wild plant foods in Pleistocene hunter-gatherers from North Africa. This evidence predates other high caries populations and the first signs of food production by several thousand years. We infer that increased reliance on wild plants rich in fermentable carbohydrates caused an early shift toward a disease-associated oral microbiota. Systematic harvesting and processing of wild food resources supported a more sedentary lifestyle during the Iberomaurusian than previously recognized. This research challenges commonly held assumptions that high rates of caries are indicative of agricultural societies.

Evidence for dietary change but not landscape use in South African early hominins

Early Homo is indistinguishable from carnivores when using a modified trace-element method on tooth samples.

Evidence for dietary change but not landscape use in South African early hominins

Using a modified trace elements method on tooth samples, early Homo was found to be indistinguishable from carnivores [52].


Evidence for dietary change but not landscape use in South African early hominins


Abstract

The dichotomy between early Homo and Paranthropus is justified partly on morphology. In terms of diet, it has been suggested that early Homo was a generalist but that Paranthropus was a specialist. However, this model is challenged and the issue of the resources used by Australopithecus, the presumed common ancestor, is still unclear. Laser ablation profiles of strontium/calcium, barium/calcium and strontium isotope ratios in tooth enamel are a means to decipher intra-individual diet and habitat changes. Here we show that the home range area was of similar size for species of the three hominin genera but that the dietary breadth was much higher in Australopithecus africanus than in Paranthropus robustus and early Homo. We also confirm that P. robustus relied more on plant-based foodstuffs than early Homo. A South African scenario is emerging in which the broad ecological niche of Australopithecus became split, and was then occupied by Paranthropus and early Homo, both consuming a lower diversity of foods than Australopithecus.