Without Grandmothers We Might Not Be Here At All

As adults, we’re often nostalgic for our childhood. A time when life seemed so much simpler. When we were free from the hassles of money, pressures of work and responsibilities of family and care.

When we were free to play and imagine people and worlds far from reality, and almost everything we did was new, exciting, and to be explored and understood; to be conquered, torn apart, feared, cuddled or tasted.

It might come as a surprise to learn that even having a childhood is something unique to humans.

We’re the only primate to have one, and the only one also to suffer the pangs of adolescence; but that’s another story.

Childhood is one stage in the human life cycle, or what biologists call our ‘life history’.

Life history covers, for example, the time it takes for a fetus to grow, the length of the various stages of life, like childhood or adulthood, important events like a mother’s age at first birth or the number of offspring she has at each birth, age at death, and so on.

While every species has a unique life history, ours is downright weird compared to other primates, and indeed, most mammals.

Even among hunter-gatherers, our species normally lives around twice as long as our chimpanzee cousins do; we have the longest lifespan of all primates.

Infant mortality was similarly high among human foragers and chimpanzees, but if you survived to 15 years of age, your life expectancy soared to about 54 years for a human forager but only about 30 for a chimp.

Most mammals, including chimps, have three stages in their life cycle: infancy, a juvenile stage and adulthood.

Infancy, the period from birth until weaning, when kids move onto solid food, is a lot shorter in humans than in other apes, though.

Infants in traditional societies were often weaned after about 3 years of age, but in chimpanzees it normally occurs around age 7.

Now, all primates except humans move from infancy straight into a juvenile (or ‘tween’) stage on the way to adulthood.

We instead pass through two extra stages in our life cycle – childhood and adolescence – giving us five stages of growth and development instead of three.

At each of these stages the body grows at different rates, different organs mature at varying times, and in traditional human societies, there were changes in the kinds of foods eaten and the roles kids played in society.

Childhood normally lasts around 4 years, from ages 3 until roughly 7 years of age. It’s the time after weaning when we would have been learning how to eat solid foods, prepared for us by adults, when our brains reached their full size, and when our first permanent molar teeth appeared.

Why are we the only primate to have a childhood? Well, it probably evolved as a mechanism to allow women to have more offspring.

Human females reproduce for around twice as long as chimps, owing to childhood and early weaning.

Breastfeeding can be an effective form of birth control by delaying the return to ovulation.

So, by weaning kids much sooner, mothers are free to reproduce again much, much sooner than other apes.

So our species can have many more children than any other ape by extending our overall period of reproduction and reducing the interval between births, which helps in part to explain why there are seven billion of us today.

Intertwined with the evolution of childhood are the origins of grandmothering.

We’re also the only primate to experience menopause or, more correctly, to have grandmothers: women who live well beyond the reproductive stage of their lives.

We’re apparently not completely alone in this among mammals, with some species like killer whales also experiencing menopause.

Episode 12 of my UNSWTV series, ‘How did we get here’, looks at the importance of grandmothers in human evolution.

But human grandmothers probably evolved as a result of the early weaning of infants: weaned children rely heavily on foods collected and prepared by adults.

Hunter-gatherer children would also have been highly vulnerable to predators and especially vulnerable to disease. So they would have demanded – and still do demand – considerable care and attention.

Because grandmothers have finished reproducing themselves, they are uniquely placed to invest time into helping feed and care for their grandchildren.

This would have greatly improved the survival of children and allowed the grandmothers’ daughters to have more of them, with grandmothers passing on more of their own genes through better survival rates among their grandkids.

And, not wanting to neglect the granddads entirely: the wisdom of a lifetime of experience must have made both grandparents a great bonus for the entire community, handing down traditional knowledge, culture and understanding of the environment.

Hunter-gatherer fathers and grandfathers are also known to play a much larger role in childcare than those in other kinds of societies, like pastoralists or farmers, and not just in providing food.

But studies of recent populations suggest there is probably no real reproductive benefit to men surviving to be grandfathers.

Perhaps men live to a ripe old age because evolution has favoured a long lifespan for the entire species, owing to the benefits of grandmothering? If so, grandfathers might simply be incidental rather than a necessity.

Still, one pattern that seems to be consistent across many foraging societies is that an absence of grandmothers leads to higher childhood mortality than an absence of fathers.

When did childhood and grandmothering evolve? It’s difficult to be certain because the different stages in the human life cycle often don’t leave clear evidence for us in the fossil remains of our ancestors.

Certainly, there seems to have been a shift in longevity by around 40,000 years ago in Europe when we see many more individuals surviving into old age.

But this is more than three-quarters of the way through the evolution of our species, which evolved more than 200,000 years ago in Africa.

The five stages in the human life cycle are universal and must therefore be under the strong influence of our genes. So it’s very likely that our unusual life cycle was present from the birth of our species as well.

Before 40,000 years ago old people were probably very rare in all communities, but their existence, especially that of grandmothers, could have made a huge difference to child survival and may be the main reason we’re here at all.

FactCheck Q&A: Do We Only Have Space for About 150 People in Our Lives?

The Conversation is fact-checking claims made on Q&A, broadcast Mondays on the ABC at 9:35pm. Thank you to everyone who sent us quotes for checking via Twitter using hashtags #FactCheck and #QandA, on Facebook or by email.

We, on average, for our entire history have associated with about 150 other people, and now after millions of years of doing that, we are a very social animal. – Professor of population studies at Stanford University, author and ecologist Paul Ehrlich, speaking on Q&A, November 2, 2015.

Professor Ehrlich’s assertion refers to a widely discussed figure known as “Dunbar’s number”.

When asked to elaborate on his Q&A comment, Professor Ehrlich told The Conversation by email that:

The Dunbar (Robin Dunbar) number is ~150, size of hunter gatherer groups, still length of Christmas lists, and so on. My point was we’re a small-group social animal now suddenly (in cultural evolution time) trying to find ways to live in gigantic groups.

But does 150 really represent the ideal number of people we have all evolved to interact with socially?

Neocortex size

The theory emerged from a series of studies beginning in 1992 by Robin Dunbar, a primatologist based at University College London. The studies aimed to understand the evolution of the large brain, especially the neocortex, of primates including humans.

The neocortex is the balloon-like, highly folded, outer part of the mammalian brain, which in humans is associated with higher cognitive functions like planning and executive control.

Dunbar proposed that his number, 150, “predicts a ‘natural’ cognitive community size for humans”.

But let’s be clear up front: this number does not derive from an ecological principle or evolutionary law governing the way complex species like primates naturally organise themselves.

Instead, it is an estimate – a prediction – derived from an equation Dunbar used to describe the statistical association between neocortex size and the number of individuals typically living in the social groups of various primate species.
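To make that logic concrete, here is a minimal sketch of the kind of calculation involved – a least-squares fit of log group size against log neocortex ratio across primate species, extrapolated to the human value. The species data points are invented for illustration, and the human neocortex ratio of about 4.1 is only the commonly quoted approximation; Dunbar’s published dataset and fitted coefficients differ.

```python
# Minimal sketch of the regression logic behind Dunbar's number.
# The species data below are made up for illustration; Dunbar's
# actual dataset and fitted coefficients differ.
import numpy as np

# (neocortex ratio, mean social group size) for hypothetical primates
neocortex_ratio = np.array([1.2, 1.8, 2.2, 2.6, 3.0, 3.4])
group_size = np.array([5, 12, 20, 35, 55, 80])

# Least-squares fit in log-log space: log10(N) = slope * log10(CR) + intercept
slope, intercept = np.polyfit(np.log10(neocortex_ratio), np.log10(group_size), 1)

# Extrapolate to the (approximate) human neocortex ratio of ~4.1
predicted = 10 ** (slope * np.log10(4.1) + intercept)
print(f"Predicted 'natural' human group size: {predicted:.0f}")
# Roughly 120 with this toy data; Dunbar's own fit famously predicts ~150.
```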

While his research has been widely cited and influential, especially in the social sciences and humanities, it has been very controversial. Indeed, it has been the subject of strong criticism in primatology and cognitive and experimental psychology.

So, what’s controversial about Dunbar’s number?

A bewildering array of correlations

First of all, it’s now well understood that larger-bodied mammals possess a proportionally larger neocortex: it comprises about 87% of a sperm whale’s brain, 80% of the human brain, 71% of a camel’s brain but only 15% of a shrew’s.

While we could speculate about the previously unappreciated intelligence of some of these species, there’s probably nothing particularly special about a large-bodied species possessing a large neocortex as such. A big neocortex may not necessarily tell us anything about that animal’s social life.

Second, other ecological factors have been found to produce similarly strong correlations with brain or neocortex size in primates.

Various studies have shown that other factors can explain neocortex size equally as well as social group size. They include primate territory size, diet (especially fruit-eating and other kinds of extractive feeding behaviour), and other variables like nighttime versus daytime activity patterns.

In fact, so many strong statistical correlations have been found by researchers looking into this question that one study bleakly noted the “bewildering array of correlations between brain size and behavioural traits”.

All of this points to the fact that Dunbar’s theory is regarded by many experts as an incomplete explanation for the complexity of primate brains, cognition and behaviour.

One common factor among many of these aspects of primate ecology is that they all rely heavily on visual cues and the processing of visual information by the brain.

The larger neocortex of primates results to a considerable extent from a larger visual cortex (visual brain system), which clearly has many demands on it. Social behaviour is just one of them.

Primates seem to be unique among mammals in showing a strong evolutionary link between an enlarging neocortex and a larger cerebellum, the brain region beneath the neocortex that processes and coordinates sensory and motor control and is involved in the learning of motor skills.

Focusing solely on the neocortex misses a big part of the picture of primate brain evolution.

Our human-centric view of the world

Another problem pointed out by other primatologists is that no matter how dispassionately we might study primate cognition, we will inevitably impose our own, species-centric view of the world on it.

This is the problem of anthropocentrism: our belief that humans are the most important species on the planet. We simply can’t escape making inferences through our own “socio-cognitive spectacles”, as pointed out by the philosopher Wittgenstein.

This problem becomes particularly acute with primates, which are our evolutionary cousins. It is easy for us to impose complexity on behaviours that may not be complex at all within primate (versus human) social settings.

Furthermore, when Dunbar’s number has been tested against the actual social organisation of historical and living hunter-gatherer groups, it has been found to be wanting.

The anthropologist Frank Marlowe, for example, has suggested that hunter-gatherers spend most of their time living in “local bands”, and that these typically comprise only around 30 people, regardless of where in the world they are living.

Dunbar, in responding to Marlowe, has pointed out that local bands are often unstable, and change in size regularly, making other (larger) units of social organisation more appropriate for investigation.

There is simply no agreement among researchers about which unit of human social organisation is the most appropriate one for studying evolution.

Many other criticisms have been levelled at Dunbar’s theory, showing that it, and the predictions about human social organisation emerging from it, are widely regarded as overly simplistic.


While Professor Ehrlich correctly quoted the number 150 as Dunbar’s number, he didn’t quite present the whole picture. He could have been more accurate by linking the figure to its source and he overlooked the abundance of contradictory and highly critical published studies of Dunbar’s theory. – Darren Curnoe


The author has provided a good, critical assessment of Dunbar’s number and a useful discussion on the weaknesses of correlative comparative studies.

Dunbar’s number is an arresting idea with a pithy name, easy to digest and just counter-intuitive enough to have broad appeal, which may explain why the idea it encapsulates has caught on so readily outside of primatology and anthropology.

Despite the limitations and problems with Dunbar’s number and the idea that neocortex size seems adapted to living in social groups of fewer than 150 individuals, I believe Dunbar’s thinking remains useful.

In particular, I see value in Dunbar’s argument that at each level of closeness, we are limited in how many relationships we can have: an average of five intimate supportive relationships, 15 close friends and so on.

Whatever the neurobiological mechanisms, Dunbar has made useful predictions about the limited nature of human social capacity, and they remain to be thoroughly tested against competing ideas.

Paul Ehrlich quoted a piece of science that has become pop folklore, but that is also controversial.

However, the point he was making – that humans have limited social capacity and that our evolved social capacities don’t suit us well to living in societies of millions – has not been refuted.

I suggest that readers interested in this topic listen to Dunbar’s TED talk, especially the bit from about 7:20 in which he discusses the layered capacities for different types of relationship. – Rob Brooks

This article was originally published on The Conversation. Read the original article.

Why Are Humans Unique? It’s the Small Things That Count

Can there be any more important a question than, ‘How did we get here?’

Of course, I don’t mean those books we all gawked at as tweens desperate to understand our transforming pubescent bodies.

I mean, ‘How did we get here, as a species?’ ‘How did we come to be so different to all other life?’

In the way that we look: with our large, balloon-like brains and skulls, hairless bodies, tiny teeth, protruding chins, puny muscles, and bobbling about on two feet.

Also in the ways that we behave: with our remarkably complex and conscious brains, articulate speech and language, symbolic, creative minds, and extraordinary imagination.

And how did we come to occupy virtually every nook and cranny the planet has to offer, even travelling to places beyond Earth?

The fossil, genetic and archaeological records provide the only hard evidence we have about our evolutionary past.

Yet, even if we cast our attention back to the Palaeolithic (or Stone Age) we really get no sense at all that we as a species would be destined to be the apes that would eventually shape the planet itself, on a global scale.

But each year, with the rapid pace of scientific discovery about our evolutionary past, our ‘biological patch’ is getting smaller and smaller; and 2015 has been a truly remarkable year in this sense.

It seems like a good time to pause and take stock: How different are we? And, what can the records of our evolutionary history tell us about the journey to human uniqueness?

Our evolutionary branch on the tree of life began a mere 8 million years ago: a time when we shared a common ancestor with living chimpanzees.

Homo sapiens, also called ‘modern humans’ by anthropologists – a concept I’ll return to later – evolved according to the fossil record more than 200,000 years ago.

That’s a long time ago in terms of human generations of course: roughly 10,000 generations back.

But it’s a mere blink of an eye in the history of planet Earth and life.

In broad terms, we can divide the human evolutionary story into two major phases, and in doing so, can trace the gradual assembling of different parts of the ‘package’ of human modernity.

In the first phase, between roughly 7.5 million and 2 million years ago, we see a group of very ape-like creatures living only in Africa.

A famous example is ‘Lucy’ from Ethiopia who belongs to the species Australopithecus afarensis and lived between around 3 and 4 million years ago.

These prehuman apes were very ‘unhuman’-like, except in one or two key respects.

Most importantly, they walked upright, on two feet, when on the ground, as we do; but also spent a lot of their time living in trees.

They also had brains and bodies similar in size to living chimpanzees.

From among these two-footed tree swingers the human genus, Homo, branched off, ushering in the beginnings of apes that would live permanently on the ground.

Homo appears in the fossil record close to 3 million years ago – as we learned just this year with a new fossil jaw from Ethiopia, which added half a million years to the history of our genus.

With Homo we see brains getting much larger very quickly, bodies reaching human size, and our muscles, especially those used for climbing, becoming pretty weak.

Very likely also at this time, body hair became short, fine and patchy as prehumans became obligate, ground-dwelling, bipeds.

We’ve also learned this year that we had previously underestimated the hand capabilities of these prehuman apes, which may have been pretty similar to our own.

Remarkably also, the earliest stone tools now date back almost 3.5 million years: they were invented by Lucy’s kind, with their small brains.

Some archaeologists also think that some of the earliest members of Homo – notably Homo erectus – with its human body size, but brain three quarters the size of ours, may have been able to make and control fire.

The importance of fire is that it would have allowed our Palaeolithic ancestors to cook their food, unlocking new and sometimes safer sources of nutrition to feed an energy hungry and evolving brain.

But the oldest examples of fire are only around 300,000-400,000 years old, in the form of burnt bone and deep ash and charcoal layers in caves.

They are associated with the species Homo heidelbergensis or perhaps the earliest Neanderthals (Homo neanderthalensis) living in Europe and West Asia.

Still, fire use certainly predates Homo sapiens, showing that fire is far from unique to our species – contrary to what Charles Darwin once opined.

This evolutionary time also marked the very first excursions by a two-footed ape out of Africa, with Homo erectus settling Europe and eventually Asia as far east as present-day China and Indonesia, beginning from at least 1.8 million years ago.

Around a million years later the species Homo heidelbergensis appears in the fossil record, and also has a rather wide distribution across Africa, Europe and Asia.

Homo heidelbergensis is likely to have been the species that gave rise to both our Neanderthal cousins and we modern humans, and like us, it occupied a very wide range of environments, with a few important exceptions.

Now, one of the most exciting human fossil sites ever found is Sima de Los Huesos – ‘the pit of bones’ – in Atapuerca, northern Spain.

Here, anthropologists have so far found more than six and a half thousand fossils of an early human species, dated to more than 500,000 years ago.

The bones are piled up one atop another in a way that strongly suggests they were deliberately disposed of in the cave, as complete bodies: a kind of human rubbish pit.

But, some of the scientists working at the ‘pit of bones’ think the piles of fossils represent not just intentional disposal of the dead but indicate a sense of the afterlife, representing a kind of burial practice.

Again, hundreds of thousands of years before Homo sapiens appears.

We also now know from DNA extracted from the fossils from Sima de Los Huesos that the bones sample an early part of the Neanderthal evolutionary branch.

This means that Neanderthals were disposing of their dead, but not necessarily burying them like we do, at least half a million years ago.

In tracing the origins of this (admittedly incomplete) list of features historically claimed to be unique to Homo sapiens we get the distinct impression that the ‘biological patch’ we humans have recognised as our own is narrowing rather quickly.

If many of the hallmarks of humankind can no longer be claimed as exclusive, what does this leave for our species to claim as unique, and to explain the differences between us and other life?

Not much, actually.

Anthropologists often use the term ‘modern humans’, more specifically, ‘anatomically modern humans’, more or less interchangeably with the species name Homo sapiens.

What’s meant by this term is essentially any fossil that would blend within the range of physical variation we see around the planet today, or in the recent past.

A related concept is that of ‘behaviourally modern humans’, which is used by archaeologists to distinguish humans whose behaviour we would recognise as being like our own.

Now, you might think this latter term would be unnecessary: surely, you might ask, anatomically and behaviourally modern humans are the same thing, right?

If only it were that simple!

Actually, the fossil record shows that the earliest bones that resemble living humans are from Africa, specifically, Tanzania, Ethiopia and South Africa, and are dated between about 220,000 and 170,000 years ago.

Why are they regarded to be anatomically modern human? Mostly on account of their bubble shaped skulls, large brain volumes, small teeth, and finely built jaws with protruding chins.

Anatomically modern humans got into West Asia, specifically present day Israel, more than 100,000 years ago.

But, until very recently, it was thought they didn’t get anywhere east or north of the Levant until much later, perhaps only 50,000 years ago, at most.

Skeletal remains dating to around 40,000 years old have been found at Lake Mungo in Australia, Niah Cave in Malaysian Borneo, Tam Pa Ling in Laos, and Tianyuan Cave near Beijing in China.

Just three weeks ago we learned that anatomically modern humans have been in East Asia, specifically southern China, for at least 80,000 years, and perhaps even 120,000 years.

Forty-seven human teeth from the site of Daoxian Cave, which are remarkably modern looking, provide a strong case for the precociously early occupation of the region by our kind.

When do we see the earliest evidence for behaviourally modern humans?

Stone tools don’t give us any real insights into this question for the first 100,000 years or so of our evolution as a species.

That’s right, there is a gap of more than 100,000 years between the appearance of anatomically modern and behaviourally modern humans. Odd, right?

The ‘smoking gun’ that archaeologists look for when trying to pinpoint the emergence of the modern human mind is signs of symbolic behaviour.

When we think about symbols, we humans are, as far as we know, the only living species capable of inventing them.

Chimpanzees have been taught to use sign language or simple pictographic languages and they do so to great effect, but they don’t invent the symbols themselves.

A good example of a simple yet powerful symbol is the cross, as explored in an episode of my UNSWTV series ‘How did we get here?’, which looks at the human use of symbols and the role they play in our lives.

How might we get at this kind of thinking, of a symbolic human mind, from the archaeological record?

Archaeologists point to examples like the:
• Making of jewellery, with shell beads at least 100,000 years old in Africa.
• Grinding up of ochre to make paint for decorating living bodies or preparing the deceased during a burial ceremony.
• Cremation of the dead, with the earliest evidence being from Australia in the form of the Mungo Lady, who was cremated more than 40,000 years ago.
• Rock paintings on cave walls, the oldest, as of last year, being found in Indonesia and dating to about 40,000 years old – older than anything in Europe or Africa.

We modern humans also live in places other human species simply haven’t been found.

There’s clear evidence, especially from the archaeological record, that only modern humans have occupied deserts, rainforests, the Arctic Circle and even the Steppe Grassland environments seen in Siberia and Eastern Europe.

While we’re remarkably flexible and able to alter our diet, behaviour and technology to suit our circumstances, all of this occurred well after 100,000 years ago.

Why then did it seemingly take more than 100,000 years after our appearance as a species for the first signs of the modern human mind to make a show?

One possibility is that some kind of revolution occurred around this time – perhaps the arrival of complex human language being associated with a gene mutation.

One candidate is the FOXP2 gene, which is vital for the development of normal speech and language.

This gene is shared with Neanderthals and chimpanzees as well, but we humans have a particular mutation affecting the regulation of the gene that is not found in the genome of our cousins.

Ironically, as we gather more scientific evidence, and our technologies get more powerful, the big questions about our past, evolution and place in nature get harder to answer with any satisfaction.

With only around 100 genes of any consequence distinguishing us from our Neanderthal cousins, and most of them being related to our immune system, skin or sense of smell, we are being forced to focus now on the small biological changes in our evolution to explain what feels like a massive gulf.

Changes of seemingly minor genetic importance had profound consequences for us as a species and, as it turns out, for the wellbeing and future of the planet as well.

How a One Night Stand in the Ice Age Affects Us All Today

Over the past half decade, ancient DNA research has revealed some surprising aspects to our evolutionary history during the past 50,000 years.

Perhaps the most startling of these has been the extent to which the ancestors of living people across the planet interbred with other closely related species of human.

But where in the world did these cross-species matings occur? Which archaic species were involved?

Just how much of the human genome comprises DNA from these archaic relatives?
And what impact did interbreeding have on our evolution and general biology as a species?

These questions are at the core of current research into interbreeding, as revealed by DNA sequences obtained from fossils in Europe and Asia, as well as from comparisons with the genomes of living people.

In Africa, interbreeding with an archaic species has left genetic signatures in the genomes of some living sub-Saharan populations.

Roughly two percent of the DNA of these people derives from an archaic species as a result of mating that occurred around 35,000 years ago.

The very well known Neanderthals – for whom we have hundreds of fossils including near complete skeletons – interbred with the founders of living European and East Asian populations.

Estimates published in 2014 indicate that 1.5-2% of the genome of living non-Africans was inherited from Neanderthals.

Yet East Asians have significantly more Neanderthal genes than Europeans do, indicating that their ancestors interbred with this archaic species perhaps more than once, or in an event separate to that involving the ancestors of western Eurasians.
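How is such interbreeding detected in the first place? Much of this work rests on site-pattern statistics like the ABBA-BABA test, summarised by the so-called D-statistic. Here is a minimal sketch of the counting idea, with made-up allele patterns; real analyses work from millions of genotyped sites and add formal significance testing.

```python
# Minimal sketch of the ABBA-BABA (D-statistic) test for archaic
# admixture, with made-up data. At each site we record the alleles in
# four genomes: (African, non-African, Neanderthal, chimp outgroup),
# writing 'A' for the ancestral allele (matching the chimp) and 'B'
# for the derived allele. Without gene flow, incomplete lineage
# sorting makes ABBA and BABA equally likely, so D is near zero; an
# excess of ABBA (D > 0) signals Neanderthal gene flow into the
# non-African lineage.

def d_statistic(site_patterns):
    """D = (nABBA - nBABA) / (nABBA + nBABA)."""
    n_abba = site_patterns.count("ABBA")
    n_baba = site_patterns.count("BABA")
    return (n_abba - n_baba) / (n_abba + n_baba)

# Toy data: 60 ABBA vs 40 BABA sites; other patterns are uninformative.
patterns = ["ABBA"] * 60 + ["BABA"] * 40 + ["BBBA"] * 900
print(f"D = {d_statistic(patterns):+.2f}")  # D = +0.20, consistent with admixture
```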

Another species, the mysterious ‘Denisovans,’ is known from the fossil record only by a single tooth, finger bone and toe bone.

Yet their fully sequenced genome shows that they shared their genes with the ancestors of some Southeast Asians, New Guineans and Aboriginal Australians.

These living people also show the genetic signs of interbreeding with Neanderthals, so have inherited DNA from both of these species.

Not only do we all carry the evidence for these interspecies dalliances, but in some cases these genes seem to have provided real benefits for us today.

Take for example the finding last year by Emilia Huerta-Sánchez and her team that the ability of populations living today in Tibet to thrive at high altitude is the result of a gene inherited from the mysterious ‘Denisovans.’

The gene in question – EPAS1 – is associated with differences in haemoglobin levels at high altitude underpinning the capacity of the individuals carrying it to pump more oxygen around in their blood.

The Denisovans also seem to have contributed genes that bolstered the immune systems of people in New Guinea and Australia.

In Europe, interbreeding with the Neanderthals may also have provided gene variants associated with lipid catabolism, or the conversion of fat to energy in the body’s cells.

Other examples include genes associated with: sugar metabolism; muscle and nervous system function; skin formation and structure; skin, hair and eye colour; and the female reproductive system, especially the formation of ova.

But of course we would expect natural selection to work in both directions given that these mating events were between different species: Homo sapiens x Homo neanderthalensis, Homo sapiens x Denisovans and Homo sapiens x mystery African species.

One particularly interesting example compared the genome of a female Neanderthal with 1,000 contemporary human ones from across the world and found clear evidence for negative selection.

Mapping the DNA of Neanderthals against this large number of human genomes also showed that there were vast ‘deserts’ of Neanderthal ancestry.

Comparing windows of one million base pairs across the autosomes (i.e. chromosomes other than X or Y) revealed four windows in Europeans and 14 in East Asians where only around 0.1% of the DNA was Neanderthal.
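To illustrate how such ‘deserts’ are spotted, here is a minimal sliding-window sketch. It assumes we already have a per-site flag marking which alleles along a human chromosome are inferred to be Neanderthal-derived – which is itself the hard part in real studies – and the data are simulated.

```python
# Minimal sliding-window scan for 'deserts' of archaic ancestry, using
# simulated data. One simulated desert (sites 300k-400k) mimics a
# region where Neanderthal ancestry was selected against.
import numpy as np

rng = np.random.default_rng(0)
n_sites = 1_000_000
archaic = rng.random(n_sites) < 0.02             # ~2% archaic-derived overall
archaic[300_000:400_000] = rng.random(100_000) < 0.001

window = 100_000                                 # sites per window
for start in range(0, n_sites, window):
    frac = archaic[start:start + window].mean()
    flag = "  <-- possible desert" if frac < 0.005 else ""
    print(f"sites {start:>9,}-{start + window:>9,}: {frac:.3%}{flag}")
```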

The human Y chromosome is also known to be lacking Neanderthal DNA suggesting strong natural selection against hybrid males, who were likely to have been infertile.

Other genes inherited from the Neanderthals seem to have conferred greater risk for a range of diseases such as lupus, biliary cirrhosis, Crohn’s disease, altered optic-disc size, smoking behaviour, IL-18 levels (producing inflammation) and type 2 diabetes.

One of the especially odd things about the evidence for interbreeding with the Denisovans is that the only fossils we have for them were recovered from Denisova Cave in southern Siberia, some 6,000 km northwest of New Guinea.

How can this be given the very high frequency of Denisovan genes in New Guineans and Australians and apparently low level or even absence of Denisovan DNA in the genomes of mainland East Asians?

One study by Skoglund and Jakobsson suggested Denisovan DNA may also be found in mainland East Asians, but this has been controversial and difficult to pin down owing to its apparently very low levels.

But if correct, perhaps the mating with the Denisovans happened on mainland East Asia, not so far from Denisova Cave, the genes being carried later to New Guinea and Australia?

A new study of the occurrence of Denisovan DNA in living humans published in the journal Molecular Biology and Evolution has finally confirmed the widespread signal of a low level of Denisovan ancestry across Eastern Eurasian and Native American populations.

Pengfei Qin and Mark Stoneking of the Max Planck Institute for Evolutionary Anthropology examined a set of 600,000 genetic markers in 2,493 individuals from 221 worldwide populations.

They found that for living New Guineans, and for a single genome sampled from northern Australia, around 3.5% of their DNA derives from the Denisovans.

In contrast, in East Asians and Native Americans the amount of Denisovan DNA plummets to a minimal 0.13-0.17% of their genome.

Qin and Stoneking concluded that Denisovan ancestry is therefore strongly associated with New Guinean ancestry.

So, the presence of Denisovan DNA outside of New Guinea – its place of highest occurrence – is probably the result of recent population migrations from New Guinea into Australia, Southeast Asia and mainland East Asia.

In other words, at some time in the past some New Guineans migrated into northern Australia and back to mainland East Asia carrying their Denisovan DNA with them and spreading it around the region.

So far, no archaeological or genetic evidence has been found to support the idea that New Guineans migrated back to Asia well after New Guinea and Australia had been settled.

But, with so many new findings coming from ancient human DNA, and many archaeological models confirmed in the process, we simply can’t afford to dismiss this one.

Once again genetic research is turning long held notions about our evolution on its head: bring it on I say!

Did ‘Rising Star’ Shine Too Bright?

Last week was rather exceptional for human evolution science, even for those of us who are used to the extravagances of media attention that surround the field.

We were spoilt with the announcement of no fewer than two major discoveries in as many days.

The first of them – the new South African species Homo naledi – attracted a great deal of attention from a media only too keen to indulge in truckloads of hyperbole and speculation.

The other announcement – the sequencing of ancient DNA from 300,000 to 400,000-year-old fossils from Atapuerca in Spain – barely rated a mention in the press, overshadowed by the naledi hype.

This was probably in part because it was announced at an international conference, with the coverage it received in Science suggesting it will shortly be published in detail in that prestigious journal.

This seems to be a regular practice by Science these days, as shown with other similar discoveries.

But perhaps also the announcement of Homo naledi the day before the Atapuerca DNA study broke meant that the media had been largely saturated; so it was also a bit of bad luck in the timing.

So, what was all the fuss surrounding Homo naledi about?

The bones of this new species were discovered accidentally by cavers exploring the Dinaledi Chamber of the ‘Rising Star’ Cave in the Cradle of Humankind region near Johannesburg, and subsequently brought to the attention of scientists.

A modest excavation resulted in 1,550 fossils from an extinct human relative, representing the partial skeletons of at least 15 individuals.

The teeth are described as primitive but small; its hand, wrist, lower limb and foot bones are human-like; while other bones of the trunk, shoulder, pelvis and thigh are also quite primitive, being a lot like species of Australopithecus.

Reading the scientific article describing Homo naledi, you realise that the work is detailed, rigorous and careful.

It involved a large number of specialists covering a very wide set of physical features on the bones and teeth.

The case for the new species is, in my opinion, detailed, compelling and praiseworthy.

So far, so good: another new species, the human tree gets all the more interesting, and complicated.

The human drama surrounding the discovery of the bones and their recovery by a group of petite, commando-style female cave explorers is also fun and adds a lot of colour to the tale of the discovery of Homo naledi.

One rather odd thing about it though is that the scientists involved still haven’t determined its geological age.

This is unprecedented in my experience and raises lots of questions in my mind like: Did the scientists rush the announcement for some reason? Why didn’t they wait until they had an age estimate at hand before going to a journal? Are the geologists unable to date the fossils?

My ‘nonsense-filter’ also tells me that all the talk in the media about this new species burying its dead and having human-like morality, or that it dismantles one of the key pillars of human uniqueness, needs to be called out for what it truly is: absurd.

Completely unnecessary hype to sell the significance of the find to the media.

It’s just the sort of thing that infuriates many scientists and detracts from an otherwise significant discovery; pity really.

The fossils recovered from the site are so far apparently exclusively from naledi and may represent near-complete (or complete) bodies that ended up in one part of the cave.

The geologists involved believe the cave was always dark and therefore the bodies may have been deliberately placed there.

Could be, but there might be other explanations as well that need to be given much more serious scientific exploration.

Why leap to the most complicated, least likely explanation? I’ll leave you to work out why.

Even so, other very rich fossil sites like Sima de Los Huesos (the ‘pit of bones’) in Atapuerca, northern Spain, coincidentally the focus of the new DNA research, have also produced a very large number of hominin remains, and they may also have been put deliberately into the cave.

This site is between 300,000 and 400,000 years old.

Yet, as with Rising Star Cave, there is no evidence that they were burying their dead, or had a concept of the afterlife or morality or engaged in ritual or religious ceremony.

Archaeology, biology and neuroscience all tell us that such behaviours fall exclusively within the human domain, and I see nothing about this new find that changes this.

The oldest convincing evidence for funerary practice is associated with our species and could be up to 160,000 years old.

Again, it would have been helpful to know how old naledi really is; and speculating it could be as old as 3 million years, without any apparent evidence, as the team is reported to have done in the media, is like adding nitroglycerine to the fire of media speculation.

It’s one thing to get the message out to the public about the exciting discoveries we’re making and to educate the very people who kindly allow us the privilege of doing science using their hard-earned tax dollars.

I’m thrilled when my colleagues announce their work to the media, even if I don’t always agree with their conclusions.

It can be fun to have a bit of a public stoush over interpretations, and the wider public benefits from a sense that scientific findings can be interpreted in varying ways.

Doing so helps enrich understanding of the human enterprise we call science, and helps maintain or even grow public interest in it, in a world driven by an overriding economic imperative and prone to disregarding the huge cultural and intellectual contributions science makes to society.

But if we go too far, we run the real risk of trivialising the huge investment of time, money, energy, care and intellectual effort that goes into many scientific discoveries.

It can also do damage to science itself and, dare I suggest, even contribute to the mistrust that increasing numbers of people in the Anglophone West seem to feel about it.

You can also end up with egg on your face, and some people never seem to learn this lesson.

In contrast, the Atapuerca DNA research has direct bearing on understanding the evolution of the living human species, which is quite rightly where the central focus of human evolution research should be.

Researchers have argued about three scenarios for the Atapuerca hominins: they might be the earliest known Neanderthals; or could sample the population that gave rise to Neanderthals; or perhaps are the common ancestor of both humans and Neanderthals.

The research, as reported by Ann Gibbons, confirms that they are in fact the earliest Neanderthals: a kind of ‘archaic’ Neanderthal, if you like, which subsequently evolved into the ‘classic’ Neanderthals we see in Europe and West Asia by about 150,000 years ago.

What are the broader implications of the research for understanding the evolution of living humans?

First, the finding pushes the age of the shared human-Neanderthal ancestor well beyond 400,000 years ago, suggesting our species, H. sapiens, might also be at least this old.

Also, with the Atapuerca group living in Europe, it’s even possible that our species evolved in this or an adjacent region of Eurasia, and later migrated back into Africa.

And being close to the common ancestor, the Atapuerca fossils give us real insights into what that ancestor must have looked like, and into the ancestral body form of our own species.

The fossils from Europe, Asia and Africa from around this time are physically very diverse, with some researchers thinking they represent multiple species, only one of which could be the ancestor of living humans.

Question is, which one?

This new research suggests the European branch is closest among them all and deserves much more attention in this regard.

In contrast, we don’t know, and will doubtless never know, whether Homo naledi had anything to do with the evolution of living humans, least of all whether its brain, mind or behaviour were anything like our own.

The Long Reach of the Past: Did Prehistoric Humans Shape Today’s Ecosystems?

We all know that humans are having a massive impact on the planet.

Our effects include altering the Earth’s rotation by damming large amounts of surface water; changing the composition of the atmosphere by punching a hole in the ozone layer and adding vast amounts of CO2, methane and other pollutants; transforming the composition and temperature of the oceans; and clearing large tracts of land and removing or dramatically altering vast numbers of terrestrial and aquatic ecosystems in the process.

Plenty of these changes are plain for all to see; others are more obscure, but no less significant.

And, with the COP21 UN Paris Climate Change Conference just around the corner, politicians, policy makers and NGOs are again turning their attentions to reaching an agreement that aims to keep global temperature change to below 2° Celsius.

A major issue for scientists studying the Earth’s physical and biological systems is just how great the influence of humans has been, and for how long it’s been happening.

Our global destructive impacts mean that potentially any organism or ecosystem, and many of the earth’s physical systems like erosion, soil formation and water cycles, carbon and nitrogen cycles, and climate, have been affected in some way by human activity.

But can we disentangle the effects of human activity on these systems and organisms from natural signals and cycles?

I’d argue we probably can’t – that human impacts are just too wide ranging and too ancient, and that our disruptive and destructive effects have reached every part of the planet.

This means that probably every scientific study of any contemporary system or living organism catalogues the effects of our species and its economic activity in some way.

If we go back far enough to a time when humans didn’t exist, we have the potential to understand how the world looked and how natural systems behaved before we were around.

Problem is that the information we get is from the very incomplete and often biased geological record, in the form of fossils and various archives of climate and environmental change like isotopes recorded in ice or cave stalagmites.

And, of course, many organisms alive in the past are now extinct: the planet looked very different even just 20,000 years ago at the peak of the last major cold stage of the Ice Age or Pleistocene epoch.

Scientists like me who study extinct organisms and long lost ecosystems wonder whether large scale human impacts like those we see today are truly confined to the period following industrialisation.

Did the ‘Anthropocene’ really begin 215 years ago?

Or does the environmental legacy of our Palaeolithic ancestors reach into today’s world?

This issue is at the centre of one of the most hotly contested questions in palaeontology and archaeology, namely, the extinction of the Pleistocene ‘megafauna’.

But it’s a lot broader than this issue, of course; it cuts to the core of who we are as a species, the way we have evolved, and the lengths we’ll go to in order to ensure our own survival; some would argue even our future survival.

Some scientists have also suggested that the megafauna extinctions set the stage for the planet’s sixth major extinction event, which is unfolding before our eyes.

During the last phase of the Ice Age, between roughly 50,000 and 10,000 years ago, almost 200 species of mammals went extinct across the globe.

That’s half of the world’s mammals weighing more than 44 kg perishing in what was an instant in the long history of life on the planet.

A growing body of highly contested research suggests that humans may indeed have dramatically shaped the diversity of living mammals in the deep past, just like today, leaving us an impoverished natural legacy.

And, let’s not forget that humans with our average body mass of close to 70 kg are megafauna as well.

While our species is obviously still here, we remain the chief suspect in the extinction of our close cousins the Neanderthals, Denisovans, and probably other relatives around this time.

The megafauna debate has been highly polarised for decades: humans being blamed on one hand, and natural climate change on the other.

Environmentally altering activities like burning the landscape by ancient hunter-gatherers in places like Australia, for example, have proven very difficult to establish, and their possible impacts hard to separate from natural climate cycles.

Yet other studies suggest that ancient Aboriginal Australians were one of the major agents involved in dispersing baobab trees in northern Australia; so our environmental impacts can be quite surprising.

And the chronology of human settlement and timing of megafauna disappearance in Australia remain uncertain: significant barriers to resolving the extinction question with any certainty.

So again, a major difficulty is the poor quality of the information we have from the fossil, archaeological and ancient environmental records.

The spectre of the confounding effects of natural and human-induced environmental signals remains all too real, even for the Ice Age.

Another way to approach the question of past human environmental change is to construct mathematical models that track changes over time and compare the influence of natural cycles with that of human-facilitated ones.

A battery of such studies is beginning to point firmly to the prehistoric human colonisation of new parts of the planet as a major driver of extinction and environmental change; possibly the leading cause of the megafauna extinctions.

New research published over the last couple of months by Soren Faurby of the Museo Nacional de Ciencias Naturales Madrid and Jens-Christian Svenning of Aarhus University has also pointed the finger squarely at humans.

In an interesting twist, they modelled what worldwide diversity patterns of mammals might look like in the absence of past and present human impacts, based on estimates of the natural distribution of each species (5,747 of them) according to its ecology, biogeography and the current environmental template.

They found that prehistoric human-driven extinctions in addition to recent ones were probably an important influence on present global mammal diversity patterns.

They even suggested that areas normally thought by ecologists to be biodiversity hot spots, like mountains, may in fact reflect their role as refuges for species otherwise affected by hunting and habitat destruction, rather than reflecting a natural pattern.

I’m satisfied that a strong case exists that humans did play an extinction role and that there truly is a link between what our Palaeolithic ancestors were doing to the environment and what we’re doing today.

The difference now, of course, is that with almost 10 billion people expected by 2050, and the remarkably destructive technology we possess, we’re doing damage on an unprecedented scale and face a future on a planet with an irreparably damaged biosphere and dramatically altered atmosphere.

Spare Your Health, Budget, and the Planet: Ditch the Palaeodiet

Interest in the diet of our evolutionary ancestors would ordinarily be a topic of curiosity in only the most obscure of scientific circles.

But the popularity of the so-called palaeodiet has brought unprecedented attention to the foods consumed by Stone Age or Palaeolithic people.

And, it might ultimately be doing us all more harm than good.

The palaeodiet is claimed to be a recipe for natural health and able to cure a vast range of diseases.

We await clinical trials to pass verdict on its claimed disease-curing benefits, but at the moment most such claims seem like little more than snake oil peddling or faith healing.

Even a cursory look at the palaeodiet highlights huge contradictions and a wilful ignorance of the science behind human evolution and diet.

Proponents of the palaeodiet eschew all processed food, but are happy to suck on beverages like wine.

Don’t be fooled though: wine, like many other components of the human diet, is a processed food.

Wine making involves turning a fruit into an alcoholic beverage through the mechanical breakdown or heating up of grapes, and the addition of sugar, acid, nutrients, yeast and other chemicals to promote fermentation, add flavour, remove sediment and preserve the wine.

And humans have been processing their food for tens of thousands, perhaps millions, of years, so it’s absurd to think you can exclude processed food altogether.

The palaeodiet eliminates all grains, legumes and potatoes, yet there is plenty of evidence that humans have evolved to eat carbohydrates, especially starches.

Take the amylase genes, which evolved to aid the digestion of starch: either in our saliva, or via the pancreas, which secretes the enzyme into the small intestine.

Humans are unique among primates in possessing large numbers of salivary amylase genes and there is a clear association between gene number and the concentration of the amylase enzyme in the saliva.

Plant foods containing high quantities of starch may even have been essential for the evolution of the large human brain over the last 2 million years, according to new research by Karen Hardy from Universitat Autónoma de Barcelona and colleagues published recently in The Quarterly Review of Biology.

Our brains are three times the size of those of our chimpanzee cousins and are undoubtedly the seat of many of the differences between us, in biology and behaviour alike.

Previous models such as the ‘expensive tissue’ hypothesis of Aiello and Wheeler proposed that the use of stone tools facilitated a shift from a mostly plant-based to a largely meat-based diet in our ancestors, in order to feed our large brains.

This shift, they suggested, facilitated the evolution of our enlarged brain as well as smaller teeth and reduced gut adapted for eating meat.

Yet there have been lingering doubts, sometimes claimed refutations, of the links between human evolution and meat eating.

There is no clear association across mammals including primates between an enlarged brain and reduced gut size.

Instead, large brains seem to be found in mammals that are capable of storing large amounts of body fat to stave off starvation and also have efficient forms of locomotion like our human bipedalism.

The new model from Hardy and co-authors suggests that cooked starch greatly increased energy availability to energy expensive tissues like the brain, red blood cells, and also the developing fetus.

They also suggest that the number of copies of the salivary amylase gene may have enhanced the importance of starch in human evolution following the controlled use of fire and the development of cooking.

But there are of course many sources of carbohydrates in the diet, and research suggests that early humans may have eaten underground food items like roots, tubers and rhizomes, as well as seeds, certain fruits and bark, all of which are widely available and rich in starch.

Grains were also an important and highly effective source of carbohydrates in the Palaeolithic, despite what the palaeodiet states.

Grinding seeds to make flour and probably bread is known from at least 25,000 years ago in Europe, arguably much longer, and humans have been cooking for at least 400,000 years, but perhaps even 2 million years.

The truth is we have no idea how much meat was eaten in the Palaeolithic, because so few plant food remains have been preserved for us to study, making it hard to garner an accurate picture of the complete diet of our ancestors.

Mammal bones with signs of butchering or cooking are plentiful in the archaeological record, but bones always preserve as fossils much longer than plant remains, and so we have a highly skewed view of past diets.

We would also do well to keep in mind that the role and safe amounts of animal food in the contemporary human diet remain controversial in nutritional and medical science regardless of what we think our ancestors may have eaten.

Red meat in particular has been linked to a range of diseases like metabolic syndrome, a variety of cancers, atherosclerosis and Type 2 diabetes, so a degree of caution about safe levels of consumption seems wise.

If your aim is to lose weight, then the palaeodiet is by no means your only option.

Much clinical research has shown that the key to weight loss is reducing the total amount of calories consumed, regardless of whether it’s carbohydrates, protein or fat.

Watching what you eat, reducing your calories and lifting your activity level is a tried and true formula that works for most people.
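As a rough illustration of the ‘calories in versus calories out’ arithmetic, here is a minimal sketch. It assumes the widely used Mifflin-St Jeor equation for resting energy needs; the activity multipliers and the example person are illustrative only, not clinical advice.

```python
# Minimal sketch of daily energy-balance arithmetic, assuming the
# widely used Mifflin-St Jeor equation; illustrative, not clinical advice.

def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, male=True):
    """Estimate basal metabolic rate in kcal/day."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + (5 if male else -161)

ACTIVITY = {"sedentary": 1.2, "moderate": 1.55, "very_active": 1.9}

def daily_needs(weight_kg, height_cm, age_yr, male, activity):
    """Total daily energy expenditure: BMR scaled by an activity factor."""
    return bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, male) * ACTIVITY[activity]

# Hypothetical example: a 70 kg, 175 cm, 35-year-old moderately active man.
tdee = daily_needs(70, 175, 35, True, "moderate")
# A ~500 kcal/day deficit gives very roughly 0.5 kg/week of loss
# (by the old ~7,700 kcal-per-kilogram rule of thumb).
print(f"Maintenance: {tdee:.0f} kcal/day; weight-loss target: {tdee - 500:.0f} kcal/day")
```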

Studies of hunter-gatherers during the last couple of hundred years have also shown they walked an awful lot: on average 360 km a year, but up to 3,600 km per annum.

So, you might consider a palaeo-exercise regime combined with a scientifically based and balanced diet as a healthy starting point for weight loss and general good health, rather than the potentially dangerous palaeodiet.

Nutritionists also advise greatly reducing the amount of factory-made foods we consume because much of it lacks nutritional balance, and often has excessive calories and high sugar, salt or fat.

I guess this is one thing palaeodieters and nutritionists are close to agreement on, probably because it seems an awful lot like common sense.

While palaeodiet inventor Loren Cordain argues we should only be eating animals that have themselves eaten a ‘wild’ diet, Australian celebrity chef Pete Evans has extended it to consuming only organic food.

Adopting such an approach to food selection is impossible for most of the planet’s 7 billion inhabitants, who couldn’t afford expensive organically grown food.

Evans wants the palaeodiet to be the new ‘normal’ for everyone, but to me, this smacks of Western middle class elitism and is simply out of touch with the realities faced by most people on the planet.

Anyway, most of the sources of animal food consumed by palaeodieters are from domesticated animals, which have been bred for flavour and meat quantity, and haven’t eaten a truly wild diet for thousands of years.

Eating a diet based on wild caught food would also be devastating for the planet.

The environment is being degraded and its natural resources depleted at a remarkable scale and pace, and a good deal of this is associated with agriculture and activities like fishing.

It’s estimated that each year tens, perhaps hundreds, of millions of sharks alone are harvested from the oceans and in many places fisheries are far from sustainable.

Similarly, if your concern is with animal welfare, then organic farming may not always be the best choice.

We need to get the balance right in our food choices between the broader effects of production on the environment, welfare of livestock and impacts on humankind more broadly.

The United Nations predicts there will be almost 10 billion people in the world by 2050.

This will lead to a dramatic need to increase food production to feed the extra people.

The scale of the challenge ahead was pithily described by Charles Godfray and co-authors in an article about the challenges of population growth and food security in Science magazine in 2010:

This challenge requires changes in the way food is produced, stored, processed, distributed, and accessed that are as radical as those that occurred during the 18th- and 19th-century Industrial and Agricultural Revolutions and the 20th-century Green Revolution. Increases in production will have an important part to play, but they will be constrained as never before by the finite resources provided by Earth’s lands, oceans, and atmosphere.

All of this within the context of the growing impact global climate change will have on food and water availability as well.

If we’re truly concerned about the fate of the planet and humankind, especially those of us in the West, we all need to be prepared to compromise on our lifestyles, including our diet, and ditch luxuries like the palaeodiet.

Eating large amounts of meat, especially animals which have eaten a wild diet, is simply unrealistic, unsustainable and unreasonable if we want to do our bit for nature and the rest of humankind.