Asia is the Gift that Keeps on Giving in Prehistory

Political and economic pundits constantly remind us that this is the ‘Asian Century’, and it’s shaping up to be that way also for human origins science.

I’ve only recently waxed lyrical about the enormous impact East Asia is having on our thinking in prehistory circles; reshaping the intellectual landscape as it were.

At the risk of sounding like a broken record, new research from Asia again looks set to rewrite another chapter in the human story; this time in South Asia.

Asian prehistory, it would seem, has been gifted an archaeological lottery ticket whose numbers just keep coming up.

Some of the hottest issues internationally in prehistory include questions like: when did the first humans exit Africa and settle Europe and Asia, and which species was involved?

Over the past decade these questions have been investigated with renewed vigour as the scientific spotlight awakens Asia from a long archaeological slumber; and it’s beginning to pay dividends.

For the first 4.5 million years of our evolutionary history, two-footed, human-like apes seem only to have inhabited Africa.

From Sahelanthropus tchadensis 7.5 million years ago up till Australopithecus afarensis and Kenyanthropus platyops around 3.5 million years ago, the first couple of chapters in the human story were confined to the Mother continent.

With the arrival of Homo, close to 3 million years ago, this changed, and we saw the first signs of the kinds of early humans that would come to settle the distant reaches of the planet.

Why then and not earlier? It seems that only once early humans became obligate ground dwellers, leaving the trees for good, could such a shift begin.

A true ground dwelling lifestyle, the evidence would suggest, began with Homo, particularly species like Homo rudolfensis and Homo habilis.

Outside of Africa, the archaeological site of Dmanisi in Georgia provides the earliest widely accepted evidence for the first major human dispersal, at around 1.8 million years ago.

The human species involved here is either Homo erectus, Homo ergaster or perhaps something new to science and not seen in the fossil records of Africa or East Asia.

Dmanisi has been a constant source of controversy in anthropology, with views about its fossils ranging from a single species to multiple species being sampled at the site.

The oldest Homo erectus/ergaster fossils in East Africa have been found at Koobi Fora on the shores of Lake Turkana in Kenya, and these have dominated thinking about its evolution for 40 years.

Because of them, we’ve been locked into an ‘Out of Africa’ view about the species since the 1970s.

The most recent research on their geology and dating, however, suggests that all of the Koobi Fora remains except a single fossil are actually younger than those from Dmanisi.

Further east, the human fossils from Java, which also belong to Homo erectus, are dated between 1.6 million years and surprisingly as young as about 40,000 years old. I’ll leave the younger ones for another day.

More controversially, in China, stone tools from Longgupo Cave and Renzidong could be even older at between 2.0 and 2.5 million years old.

Less troublesome are the two teeth from Yuanmou in southwest China, which are believed also to belong to Homo erectus or a related species but date to around 1.7 million years old.

The big issue this all raises is that we simply can no longer assume that Homo erectus evolved in Africa and dispersed into Asia. In fact, the evidence is building that it might actually be the other way around.

Enter India. Specifically, the Siwalik Hills north of the city of Chandigarh.

The ‘Siwaliks’ have been well known in palaeontological circles for over a hundred years, providing an abundance of fossils, including some of the first evidence for extinct apes, going back more than 9.2 million years.

New research by a joint Indian-French team has found new evidence about the first humans to settle Asia, and it’s bound to hit anthropology like a tsunami if the work stands up to scrutiny.

Near the village of Masol, archaeologists have found scatters of stone tools and animal fossils, some sporting cut marks made by early humans while butchering carcasses, and all believed to date to about 2.6 million years old.

Until last year, this would have made them among the oldest stone tools anywhere in the world; though that honour now belongs to the Lomekwian tool industry from Kenya.

The discovery, published as a set of articles in the journal Comptes Rendus Palevol, was led by scientists from French institutions, including the Histoire naturelle de l’Homme préhistorique, working with the Society for Archaeological and Anthropological Research in Chandigarh.

Such an early age would push us back well beyond Homo erectus/ergaster and into the mysteries of the earliest members of Homo, about whom we still know so little.

So, Asia might have been settled by humans soon after the evolution of Homo; and such a primitive species could even have given rise to Homo erectus, perhaps in Asia itself.

The reaction to the discovery has been mixed, as you might expect, and some reasonable questions about the context of the finds have been asked.

All of the tools were found on the ground surface and none were recovered during excavations undertaken at a number of locations.

Still, the work makes it clear that they must have come from nearby, and very old, sediments.

And, we’d do well to remember that most of the fossil humans found at famous sites like Koobi Fora during the 1960s-1980s were also found on the surface.

When fossils and stone tools still have sediment attached to them, as the ones from Masol do, it should be a relatively straightforward process to sort out where they eroded from.

The new discovery shows once again that we have for far too long focused too intensely on the archaeological record of Africa.

Asia still has many surprises awaiting us; and I’m sure a few more will be revealed in the year ahead.

East Asia Makes a Comeback in the Human Evolution Stakes

Archaeological discoveries in East Asia over the last decade or so have dramatically rewritten our understanding of human evolution.

But the implications don’t sit easily with many scholars internationally who continue to see Europe and Africa as the heartland of human origins.

For more than 150 years our understanding of human evolution has been largely shaped by the discoveries made in Europe and parts of Africa, like the caves near Johannesburg and the Great Rift Valley on the east of the continent.

This is partly because disciplines like geology, evolutionary biology and archaeology, as we know them today, began in Europe during the 19th Century.

But it’s also due to maverick archaeologists like Raymond Dart, Robert Broom, Louis Leakey and Mary Leakey working in former British colonies like South Africa, Tanzania and Kenya.

They set out to make a name for themselves, put their ‘new’ countries on the map, scientifically speaking, and turn the spotlight away from Europe, the centre of intellectual power.

East Asia featured prominently in the history and theory of human evolution during the late 19th Century and early 20th Century, but by the mid-1900s it was viewed internationally as an evolutionary backwater.

The first discovery to put Asia on the map was Pithecanthropus found by Eugene Dubois and his team in Java in 1891-92.

Then, the finding of Sinanthropus at Chou Kou Tien (now Zhoukoudian) near Beijing by Otto Zdansky from 1921, and soon after by Davidson Black and Li Jie, seemed to confirm its importance.

As an aside, among these early East Asian prehistorians, Dubois’ story stands out as the most fascinating, and his discovery of Pithecanthropus erectus (now Homo erectus) is one of the most important in the history of archaeology.

Dubois was a young medically trained Dutch anatomist fascinated by 19th Century ideas about evolution, especially those developed by the highly influential German biologist Ernst Haeckel; Darwin’s continental intellectual rival.

Haeckel discovered and described hundreds of species, but was also the first biologist to draw an evolutionary tree or pedigree that included humans, placing us correctly among the Great Apes of Africa and Asia.

This was a profound moment in European science and it had a big impact on the archaeologists of the time.

To account for the evolutionary divergence of humans from the apes, Haeckel needed an intermediate step or ‘missing link’ for his tree, so he invented a speechless human-like ape, which he dubbed ‘Pithecanthropus’.

In 1887, Dubois left a plum job as an anatomist at the University of Utrecht, where he was on track to rapidly become a full professor, for a position as a military surgeon in the colonial army so he could find Pithecanthropus.

Remember, this was almost forty years before Dart discovered the Taung Child (Australopithecus) in South Africa, and 45 years before Louis Leakey made his first discovery in the East African Rift.

And find it he did, in just four short years. An incredible achievement.

Through the 20th Century, work continued, off and on, in archipelago Southeast Asia and on the mainland of East Asia, and was increasingly undertaken by local archaeologists.

But political events, especially the rise of Communism, meant that many East Asian countries and their scientists became isolated from the international community.

And as colonial powers pulled out of East Asia after World War II many nations had other priorities, with archaeology receiving less attention than it had under European rule.

From its first published description in 1913 until its eventual exposure as a fraud in 1953, the Piltdown skull played a major role in shaping ideas about human evolution, and was one of the main reasons why East Asia continued to be overlooked.

Also, as more and more discoveries were being made of early hominins like Zinjanthropus boisei, Australopithecus afarensis and Homo habilis, Africa quickly came to be seen as the evolutionary cradle of humanity.

This was seemingly confirmed with the discovery in the 1970s and 1980s of Homo erectus remains that predated those from Asia, further marginalising the region.

The wide acceptance of the so-called Out-of-Africa theory of modern human origins from the 1980s onwards, and the concurrent gradual decline in acceptance of the alternatives (like the Multiregional origins model), marked the final nail in the coffin for East Asia.

But then, the unexpected happened. The unearthing of a 15,000 year old pre-modern human on the island of Flores in eastern Indonesia.

The dramatic discovery of the Hobbit, or Homo floresiensis, a one metre tall human with a grapefruit sized brain, and in most respects resembling three million year old pre-humans like ‘Lucy’ from Africa, changed everything.

Soon after, Robin Dennell and Will Roebroeks published an article in Nature arguing that maybe Homo erectus had evolved in East Asia after all.

Then along came the Denisovans, and while their fossil remains were found in southern Siberia, their DNA is found today in people living in parts of Southeast Asia, New Guinea and Australia.

Then a string of discoveries in the last few years like: the 130,000 year old Zhirendong jaw with its modern looking chin; my own work on the pre-modern Red Deer Cave people; a new archaic species in northern China living perhaps only 60,000 years ago; modern humans arriving in East Asia precociously early, at least 80,000 years ago; the list goes on!

A new discovery published recently in Nature marks yet another find in the ever lengthening list of exciting and history altering discoveries.

Stone tools and animal fossils dated to between 100,000 and more than 200,000 years old have been found in the Walanae Basin of Sulawesi: the oldest archaeology on the island, showing it was inhabited by an unknown archaic species long before modern humans were on the scene.

Sadly, no human bones were found, so we have no idea who made the tools.

I suspect they were made by a species that we don’t see anywhere else. Not Homo erectus, nor Homo floresiensis, but a novel one. Why?

Sulawesi sits on the eastern edge of the famous biogeographic zone ‘Wallacea’ – marking the transition from an Asian ecology to an Australasian one – and has a truly remarkable fauna and flora.

It’s also located just to the north of Flores and so was probably on the North-South migration path for many animals including early humans.

The mammals that inhabit Sulawesi today are remarkable for their diversity: of the 127 endemic mammals that inhabit Indonesia, 62 percent are unique to Sulawesi.

Among the primates, there are seven endemic species of the monkey genus Macaca and at least seven tarsier species, probably more.

In my view, the same kinds of evolutionary pressures that led to this remarkable diversity of non-human primates would have acted also on the early humans inhabiting the island.

A couple of hundred thousand years is plenty of time for new species to form.

Besides, we have the precedent of Homo floresiensis immediately to the south, and very likely multiple species represented in the Homo erectus sample from nearby Java as well.

It’s well and truly time to reassess the role East Asia played in human evolution, and to recognise that far from being a backwater, it was in fact a hotbed of human evolution right up till the end of the Ice Age.

Bone Suggests ‘Red Deer Cave People’ a Mysterious Species of Human

It’s been an exciting year for human evolution with several discoveries dramatically rewriting major episodes of our ancient past.

Some of this progress stems from major advances in fields like ancient genomics, while much has resulted from new fossil and archaeological discoveries made in Africa and China.

What’s interested me the most has been the discovery of archaic humans living in northern China until perhaps 70,000 years ago and the oldest anatomically modern humans in the region appearing at least 80,000 years ago.

This is because they fall squarely within my own area of research: human evolution over the past few hundred thousand years in East Asia and Australasia.

Unlike anything else

In 2012, we announced the discovery of the ‘Red Deer Cave people’, a mysterious human group we identified from cranial and jaw bones and teeth from two cave sites in Southwest China.

Today, a team I co-lead with Professor Ji Xueping of the Yunnan Institute of Cultural Relics and Archaeology, and involving colleagues from a range of institutions in China and Australia, announced the discovery of yet another highly unusual bone from the Red Deer Cave people. And it seems to confirm they were a mysterious group of pre-modern humans.

The Red Deer Cave (Maludong) during research in 2008.
Ji Xueping & Darren Curnoe, Author provided

Our previous work showed that the features of their bones and teeth possess a remarkable number of similarities to archaic humans. This is despite them having lived only between about 14,000 and 11,000 years ago, based on radiocarbon dating of charcoal.
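As an aside for readers curious how a radiocarbon date like this is arrived at, the sketch below shows the standard conversion from a measured fraction of surviving carbon-14 to a conventional (uncalibrated) radiocarbon age, using the Libby mean life of 8,033 years. The 0.175 fraction is purely illustrative and not a measurement from Maludong; real dates are also calibrated against records such as tree rings.

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; the conventional value used for radiocarbon ages


def radiocarbon_age(fraction_modern: float) -> float:
    """Convert a measured fraction of modern carbon-14 into a
    conventional (uncalibrated) radiocarbon age in years BP."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)


# Illustrative only: a charcoal sample retaining ~17.5% of its original
# carbon-14 works out at an uncalibrated age of roughly 14,000 years BP.
print(round(radiocarbon_age(0.175)))  # ~14,000
```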

Their anatomy was unlike anything we’d seen before in modern humans, whether they lived 200,000 or 200 years ago: they were truly unique and a real mystery to us and many of our colleagues.

We suggested they could represent either a very early modern human population, perhaps one that settled the region more than 100,000 years ago and became isolated, or a late-surviving archaic species, akin to a population of Neanderthals surviving in isolation in Southwest China until the end of the Ice Age.

Some of our colleagues also proposed at the time that they might be hybrids between modern humans and an unknown archaic species as an explanation for their peculiar traits.

We had focused our work on the skulls and teeth, representing four or five individuals, thinking they would offer the best insights into just who these mysterious people might be.

But, alas, we were left with considerable uncertainty. There was no clear answer about which species they might belong to or whether they could be hybrids. So back to work we went.

Archaic or hybrid?

A couple of months ago we published a new study of the Longlin or Laomaocao Cave specimen, which we had also placed in the Red Deer Cave people in 2012.

We’re now treating it as part of a separate group, distinct from the bones from Red Deer Cave, or Maludong, and one that we now think is indeed very likely to be a hybrid. And direct dating on human bone now confirms that the specimen is only 10,500 years old.

The Longlin Cave cranium.
Darren Curnoe, Author provided

If we’re correct, then either there were archaic humans still around at that time in Southwest China who interbred with modern humans, or their hybrid features persisted long after interbreeding occurred because of isolation, perhaps combined with the action of natural selection or genetic drift.

Our study published this week outlines detailed work on a thigh bone, or femur, from Maludong, located only 6 km southwest of the city of Mengzi, near the border with northern Vietnam.

Like the skull bones from the site, it is also dated to about 14,000 years old. But unlike them, it provides a much clearer indication of what at least some of the Red Deer Cave people bones might be.

Our work shows that the thigh bone strongly resembles very ancient species like early Homo erectus or Homo habilis, which lived around 1.5 million years ago or more in Africa.

Red Deer Cave people thigh bone compared with a modern human (not to scale).
Darren Curnoe, Ji Xueping & Getty Images, Author provided

Like these pre-modern humans, the Maludong femur is very small. The shaft is narrow, with the outer layer of the shaft (or cortex) very thin, the walls of the shaft are reinforced (or buttressed) in areas of high strain, the femur neck is long, and the place of muscle attachment for the primary flexor muscle of the hip (the lesser trochanter) is very large and faces strongly backwards.

Surprisingly, we reconstructed its body mass to be about 50 kilograms, making the individual very small by pre-modern and Ice Age human hunter-gatherer standards.
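For readers wondering how a body mass can be ‘reconstructed’ from a single thigh bone, the sketch below illustrates the general approach: plug a femoral measurement into a regression equation fitted to skeletons of known body mass. The slope, intercept and the 40 mm femoral head diameter are hypothetical placeholders for illustration only; they are not the measurements or equations used in our study.

```python
# Illustrative sketch of regression-based body mass estimation from a femur.
# The coefficients and the input value are hypothetical placeholders, standing
# in for published regressions fitted to skeletons of known body mass; they
# are NOT the values or equations used in the Maludong study.

SLOPE_KG_PER_MM = 2.2   # hypothetical: kg of body mass per mm of femoral head diameter
INTERCEPT_KG = -39.0    # hypothetical intercept


def estimate_body_mass(femoral_head_diameter_mm: float) -> float:
    """Estimate body mass (kg) from femoral head diameter (mm) using a
    simple linear regression of the kind common in palaeoanthropology."""
    return SLOPE_KG_PER_MM * femoral_head_diameter_mm + INTERCEPT_KG


# A hypothetical 40 mm femoral head gives roughly 49 kg, in the same
# ballpark as the ~50 kg figure reported for the Maludong individual.
print(round(estimate_body_mass(40.0), 1))  # 49.0
```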

We need to be a bit careful though, as it is only one bone. Still, when seen in the context of the archaic looking skull bones and teeth from Maludong, our results are very compelling.

Controversies

How is it that such an ancient looking species could have survived until so recently in Southwest China? Well, the environment and climate of Southwest China are unique owing to the tectonic uplift of the Qinghai-Tibetan Plateau.

Yunnan Province today has the greatest biodiversity of plants and animals in the whole of China. It is one of 20 centres of floristic endemism, a result of its complex landscape of high mountains, deep valleys, rift lakes and large rivers.

The region around Maludong is also biogeographically on the northern edge of tropical Southeast Asia and many species found there today are very ancient indeed. The area is a biological refugium owing to its variable topography and tropical location.

The Maludong femur might therefore represent a relic, tropically adapted, archaic population that survived relatively late in this biogeographically complex, highly diverse and largely isolated region.

Now, we can’t deny that our work is controversial, with some of our colleagues simply unable to accept the possibility that archaic looking bones could be so young, especially in East Asia.

Yet, when Homo floresiensis was found a decade ago the same kinds of comments were made. This species looks a lot like Australopithecus skeletons, such as Lucy, that lived in Africa 3 or 4 million years ago. While not everyone has accepted the so-called “Hobbit” from Flores as a valid new species, most anthropologists and archaeologists have.

At a conference in Shanghai this week, which I attended, scientists from the Russian Academy of Sciences in Siberia presented evidence about the cave of Denisova in southern Siberia. Coincidentally, a new article by the same team on Denisovan DNA also came out this week in the Proceedings of the National Academy of Sciences of the USA.

It was a big surprise to me to learn that they have found rather similar kinds of things at Denisova Cave, except that the bones are 30,000-40,000 years older than at Maludong.

They’ve recovered evidence for multiple archaic species, like the Neanderthals and Denisovans, in the same cave layers as modern humans dating to about 50,000 years ago. And in a slightly older unit in the cave they have found Neanderthal, Denisovan and possible Homo erectus bones, again together in a single layer.

Within this context, and the Hobbit from Indonesia, our finds don’t look so out of place after all.

The author and colleague Ji Xueping at a Palaeolithic cave in southern China.
Ji Xueping & Darren Curnoe, Author provided

Riddles

We need to also keep in mind that most of what we know about human evolution is based on the fossil records of Europe and some parts of Africa, like the East African Rift Valley, and caves in South Africa.

We’re quickly learning that Europe and Africa may not provide the best model for us to use to interpret the fossil record of East Asia. For example, Denisova Cave is as far east as we’ve found the Neanderthals, and they don’t seem to have occupied Siberia permanently. This is unlike Europe, where they lived until about 40,000 years ago. And so far, no Neanderthals have been found in China or anywhere south of Denisova Cave.

The fact is that we’ve really only scratched the surface in East Asia. We still have an enormous amount to learn about which species were living there when the first modern humans arrived, and about how they interacted with the Palaeolithic ancestors of living East Asians.

Despite the progress we’re making about these and other ancient humans in Southwest China, we’re left with many riddles still about the Red Deer Cave people. Just who exactly were these mysterious Stone Age people? Why did they survive so late? Why are they found only in tropical Southwest China?

What did modern humans make of them? And how did they interact with them when they encountered them? Did they interbreed with them?

We hope to be able to answer more of these questions soon.

In the Driver’s Seat of Evolution

Humans have had a profound influence over evolution; ours and the evolution of many other species.

So much so today that we are without doubt in the driver’s seat of evolution for many species, including our own.

We’ve sent hundreds, probably thousands, of species to extinction, and potentially millions more in the near future with the current pace of destructive environmental change.

The present era has been dubbed the ‘Sixth Mass Extinction’ in Earth’s history with the average rate of loss of species around 100 times greater than the ‘background’ rate seen over long timescales in the fossil record.

Extinction is, after all, the end of evolution, and the loss of one species can have unexpected spin-offs.

The disappearance of a single species changes the composition and functioning of an ecosystem, with its delicate balance between its constituent species and the physical environment, honed by thousands or millions of years of evolution.

But if it’s a critical – ‘keystone’ or ‘ecosystem engineer’ – species that disappears, the entire ecosystem can be profoundly changed or even collapse.

Elephants are a good example of ecosystem engineers, as were mammoths and other extinct elephant relatives in the past.

They destroy trees, sometimes turning woodlands into grasslands; they dig up huge amounts of soil while foraging and drinking; and their dung can cover the ground at densities of up to 2 kg per square metre.

But, compared to humans, their impacts are significant only on local and regional scales, and globally unimportant.

Yet humans are clearly the greatest ecosystem engineer that has existed.

So widespread has the damage caused by us been, especially over the last couple of hundred years, that the planet itself may even have entered a ‘state-shift’ in which the biosphere may be close to a ‘tipping-point’.

Whole ecosystems are being transformed, many have collapsed, and all as a result of species loss and too little time and space for evolution to take its course, preventing plants and animals from adapting to change.

Take plants: modelling suggests that for roughly 30 percent of the planet, the pace at which species will have to migrate just to keep up with projected climate change is much greater than during the major shifts of the Ice Age.

In many cases they’ll simply have nowhere to go as a result of human fragmentation of landscapes or because they occupy narrow zones like mountainous areas.

What might a state-shift in the planet’s biosphere look like?

Well, we can look to the past to get some idea of what might happen, only this time the changes are likely to be more dramatic.

The last great cold phase of the Ice Age – dubbed the Last Glacial Maximum – occurred between roughly 30,000 and 15,000 years ago.

After this time the climate became punctuated by a number of short and rapid changes before entering the modern climate phase.

A short warming occurred between roughly 14,500 and 12,500 years ago, followed by another cold phase lasting about 1,000 years, until the present warm period called the Holocene began around 11,500 years ago.

With these swings from cold to warm to cold and back to warm there were dramatic changes in weather patterns at local and regional scales, and profound ecological shifts across the planet.

Rainfall and wind patterns, weather cycles, and humidity levels changed. Natural fire regimes altered.

There were major shifts in the distributions of plants and animals.

Many species were lost and ecosystems altered in composition, resulting in major shifts in biodiversity.

In short, the biosphere went through a turbulent time with around half of the planet’s species of large-bodied mammals going extinct, as well as a number of species of large birds and reptiles, and a few species of small animals.

We’ve no idea how many plant or insect species were lost.

Whole ecosystems were transformed, especially in higher latitudes, which were most vulnerable to climate change.

But even the tropics went through major and rapid climate shifts altering weather patterns like the East Asian Monsoon and the ecosystems dependent upon it.

At the same time as all this was happening, humans passed through a profound shift in population size and behaviour.

The population growth curve began its rapid exponential rise between 15,500 and 11,500 years ago, beginning with around 10 million people, reaching more than 7 billion today, and projected to peak at a little over 9 billion by 2075.
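As a back-of-the-envelope illustration of how modest an annual rate can produce that rise, the snippet below takes the round figures quoted here (about 10 million people roughly 11,500 years ago, more than 7 billion today) and works out the average growth rate implied by a simple exponential model; the model and the rounding are mine, for illustration only.

```python
import math

# Round figures from the text: ~10 million people at the start of the Holocene
# (~11,500 years ago) and ~7 billion people today.
start_population = 10_000_000
current_population = 7_000_000_000
years_elapsed = 11_500

# Simple exponential model: N(t) = N0 * exp(r * t), so r = ln(N / N0) / t
rate = math.log(current_population / start_population) / years_elapsed

print(f"Implied average growth rate: {rate * 100:.3f}% per year")  # ~0.057% per year
```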

This time also ushered in the development of agriculture or the Neolithic period.

This construction of a new human ecological niche through cultural and technological change was one of the most profound events in the evolution of modern humans; more profound even than the changes that ensued when our kind first left Africa 60,000 years ago.

Yet, it led to changes in the human genome associated with changed diet and disease exposure, within a few thousand years of agriculture commencing.

It’s even been suggested that the lack of pigmentation in the skin of living Europeans only became widespread well after farming began in the region around 7,000 years ago; so it even changed the way we look.

Agriculture saw large tracts of land being cleared, animals and plants domesticated, humans dramatically shifting their diets, big changes in the diseases suffered and their epidemiological patterns, many major migrations and population replacements, and people living a more sedentary life resulting in the beginnings of cities, states, extensive trade and warfare.

Many of these changes are familiar to us today, and are becoming major issues again with global warming.

Ironically, the responses of the people living during the climatically tumultuous period between 15,000 and 11,500 years ago set the backdrop for the challenges we face today.

We can’t blame them, they were just trying to survive, and of course had no knowledge of the path they were putting us all on.

Things are different now: we’re only too aware of the havoc we’re wreaking on the planet and ourselves and understand the steps we need to take to change the future.

And meetings like COP21 are aiming to change things for the better.

Question is, are we really willing to respond quickly enough and make the changes needed to derail the express train to evolutionary oblivion?

Without Grandmothers We Might Not Be Here At All

As adults, we’re often nostalgic for our childhood. A time when life seemed so much simpler. When we were free from the hassles of money, pressures of work and responsibilities of family and care.

When we were free to play and imagine people and worlds far from reality, and almost everything we did was new, exciting, and to be explored and understood; to be conquered, torn apart, feared, cuddled or tasted.

It might come as a surprise to learn that even having a childhood is something unique to humans.

We’re the only primate to have one, and the only one also to suffer the pangs of adolescence; but that’s another story.

Childhood is one stage in the human life cycle, or what biologists call our ‘life history’.

Life history is, for example, the time it takes for a fetus to grow, or the length of the various stages of life, like childhood or adulthood, important events like the age at first birth for a mother or the number of offspring she has at each birth, age at death, and so on.

While every species has a unique life history, ours is downright weird compared to other primates, and indeed, most mammals.

Even among hunter-gatherers, our species normally lives around twice as long as our chimpanzee cousins do; we have the longest lifespan of all primates.

Infant mortality was similarly high among human foragers and chimpanzees, but if you survived until 15 years of age, your life expectancy would have soared to about 54 years (human forager) and 30 years (chimp) of age.

Most mammals including chimps have three stages in their life cycle: infancy, a juvenile stage and adulthood.

Infancy, the period from birth until weaning, when kids move onto solid food, is a lot shorter in humans than in other apes.

Infants in traditional societies were often weaned after about 3 years of age, but in chimpanzees it normally occurs around age 7.

Now, all primates except humans make the transition from infancy to adulthood via a juvenile (or ‘tween’) stage.

Instead, we pass through two extra stages in our life cycle – childhood and adolescence – giving us five stages of growth and development instead of three.

At each of these stages the body grows at different rates, different organs mature at varying times, and in traditional human societies, there were changes in the kinds of foods eaten and the roles kids played in society.

Childhood normally lasts around 4 years, from age 3 until roughly 7 years of age. It’s the time after weaning when we would have been learning how to eat solid foods, prepared for us by adults, when our brains reached their full size, and our first permanent molar teeth appeared.

Why are we the only primate to have a childhood? Well, it probably evolved as a mechanism to allow women to have more offspring.

Human females reproduce for around twice as long as chimps, owing to childhood and early weaning.

Breastfeeding can be an effective form of birth control by delaying the return to ovulation.

So, by weaning kids much sooner, mothers are free to reproduce again, and much, much, sooner than in other apes.

So our species can have many more children than any other apes through extending our overall period of reproduction and reducing the interval between births; which helps in part to explain why there’s seven billion of us today.

Intertwined with the evolution of childhood is the origin of grandmothering.

We’re also the only primate to experience menopause, or more correctly, to have grandmothers; women who live well beyond the reproductive stage of their lives.

We’re apparently not completely alone in this among mammals, with some species like killer whales also experiencing menopause.

Episode 12 of my UNSWTV series, ‘How did we get here’, looks at the importance of grandmothers in human evolution.

But human grandmothers probably evolved as a result of the early weaning of infants: weaned children rely heavily on foods collected and prepared by adults.

Hunter-gatherer children would also have been highly vulnerable to being killed by predators, and especially vulnerable to disease. So they would have demanded, and still do demand, considerable care and attention.

Because grandmothers have finished reproducing themselves, they are uniquely placed to invest time into helping feed and care for their grandchildren.

This would have greatly improved the survival of children, and allowed their daughters to have more of them, passing on more of their own genes through better survival rates among their grandkids.

And, not wanting to neglect the granddads entirely, the wisdom of a lifetime of experience for both grandparents must have been a great bonus for the entire community in handing down traditional knowledge, culture and understanding of the environment.

Hunter-gatherer fathers and grandfathers are also known to play a much larger role in childcare than in other kinds of societies, like pastoralists or farmers, and not just in providing food.

But studies of recent populations suggest there is probably no real reproductive benefit to men surviving to be grandfathers.

Perhaps men live to a ripe old age because evolution has favoured long lifespan for the entire species owing to the benefits of grandmothering? In this case, grandfathers might simply be incidental rather than a necessity.

Still, one pattern that seems to be consistent across many foraging societies is that an absence of grandmothers leads to higher childhood mortality than an absence of fathers.

When did childhood and grandmothering evolve? It’s difficult to be certain because the different stages in the human life cycle often don’t leave clear evidence in the fossil remains of our ancestors.

Certainly, there seems to have been a shift in longevity by around 40,000 years ago in Europe when we see many more individuals surviving into old age.

But this is more than three-quarters of the way through the evolution of our species, which evolved more than 200,000 years ago in Africa.

The five stages in the human life cycle are universal and must therefore be under the strong influence of our genes. So it’s very likely that our unusual life cycle was present from the birth of our species as well.

Before 40,000 years ago old people were probably very rare in all communities, but their existence, especially of grandmothers, could have made a huge difference to child survival and mortality and may be the main reason we’re here at all.

FactCheck Q&A: Do We Only Have Space for About 150 People in Our Lives?

The Conversation is fact-checking claims made on Q&A, broadcast Mondays on the ABC at 9:35pm. Thank you to everyone who sent us quotes for checking via Twitter using hashtags #FactCheck and #QandA, on Facebook or by email.


We, on average, for our entire history have associated with about 150 other people, and now after millions of years of doing that, we are a very social animal. – Professor of population studies at Stanford University, author and ecologist Paul Ehrlich, speaking on Q&A, November 2, 2015.

Professor Ehrlich’s assertion refers to a widely discussed figure known as “Dunbar’s number”.

When asked to elaborate on his Q&A comment, Professor Ehrlich told The Conversation by email that:

The Dunbar (Robin Dunbar) number is ~150, size of hunter gatherer groups, still length of Christmas lists, and so on. My point was we’re a small-group social animal now suddenly (in cultural evolution time) trying to find ways to live in gigantic groups.

But does 150 really represent the ideal number of people we have all evolved to interact with socially?

Neocortex size

The theory emerged from a series of studies beginning in 1992 by Robin Dunbar, a primatologist based at University College London. The studies aimed to understand the evolution of the large brain, especially the neocortex, of primates including humans.

The neocortex is the balloon-like, highly folded, outer part of the mammalian brain, which in humans is associated with higher cognitive functions like planning and executive control.

Dunbar proposed that his number, 150, “predicts a ‘natural’ cognitive community size for humans”.

But let’s be clear up front: this number does not derive from an ecological principle or evolutionary law governing the way complex species like primates naturally organise themselves.

Instead, it is an estimate – a prediction – derived from an equation Dunbar used to describe the statistical association between neocortex size and the number of individuals typically living in the social groups of various primate species.
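To make the logic of that prediction concrete, here is a minimal sketch of the kind of calculation involved: fit a regression of log group size against log neocortex ratio across primate species, then extrapolate to a human-like neocortex ratio. The data points and the ratio of 4.1 are invented for illustration, not Dunbar's actual dataset or coefficients; they are simply chosen so the extrapolation lands near 150.

```python
import numpy as np

# Hypothetical primate data (neocortex ratio, mean social group size),
# invented for illustration -- not Dunbar's actual dataset.
neocortex_ratio = np.array([1.2, 1.6, 2.0, 2.4, 2.8, 3.2])
group_size = np.array([4, 9, 18, 30, 48, 72])

# Fit a straight line in log-log space: log(group) = slope * log(ratio) + intercept
slope, intercept = np.polyfit(np.log(neocortex_ratio), np.log(group_size), 1)

# Extrapolate to a human-like neocortex ratio (~4.1, also illustrative).
predicted_group = np.exp(slope * np.log(4.1) + intercept)
print(round(predicted_group))  # ~148, in the vicinity of Dunbar's 150
```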

While his research has been widely cited and influential, especially in the social sciences and humanities, it has been very controversial. Indeed, it has been the subject of strong criticism in primatology and cognitive and experimental psychology.

So, what’s controversial about Dunbar’s number?

A bewildering array of correlations

First of all, it’s now well understood that larger sized mammals possess a larger neocortex: it comprises about 87% of a sperm whale’s brain, 80% of the human brain, 71% of a camel’s brain but only 15% of a shrew’s.

While we could speculate about the previously unappreciated intelligence of some of these species, there’s probably nothing particularly special about a large-bodied species possessing a large neocortex as such. A big neocortex may not necessarily tell us anything about that animal’s social life.

Second, other ecological factors have been found to produce similarly strong correlations with brain or neocortex size in primates.

Various studies have shown that other factors can explain neocortex size equally as well as social group size. They include primate territory size, diet (especially fruit-eating and other kinds of extractive feeding behaviour), and other variables like nighttime versus daytime activity patterns.

In fact, so many strong statistical correlations have been found by researchers looking into this question that one study bleakly noted the “bewildering array of correlations between brain size and behavioural traits”.

All of this points to the fact that Dunbar’s theory is regarded by many experts as an incomplete explanation for the complexity of primate brains, cognition and behaviour.

One common factor among many of these aspects of primate ecology is that they all rely heavily on visual cues and the processing of visual information by the brain.

The larger neocortex of primates results to a considerable extent from a larger visual cortex (visual brain system), which clearly has many demands on it. Social behaviour is just one of them.

Primates seem to be unique among mammals in showing a strong evolutionary link between an enlarging neocortex and a larger cerebellum, the brain region beneath the neocortex that processes and coordinates sensory and motor control and is involved in the learning of motor skills.

Focusing solely on the neocortex misses a big part of the picture of primate brain evolution.

Our human-centric view of the world

Another problem pointed out by other primatologists is that no matter how dispassionately we might study primate cognition, we will inevitably impose our own, species-centric view of the world on it.

This is the problem of anthropocentrism: our belief that humans are the most important species on the planet. We simply can’t escape making inferences through our own “socio-cognitive spectacles”, as pointed out by the philosopher Wittgenstein.

This problem becomes particularly acute with primates, which are our evolutionary cousins. It is easy for us to impose complexity on behaviours that may not be complex at all within primate (versus human) social settings.

Furthermore, when Dunbar’s number has been tested against the actual social organisation of historical and living hunter-gatherer groups, it has been found to be wanting.

The anthropologist Frank Marlowe, for example, has suggested that hunter-gatherers spend most of their time living in “local bands”, and that these typically comprise only around 30 people, regardless of where in the world they are living.

Dunbar, in responding to Marlowe, has pointed out that local bands are often unstable, and change in size regularly, making other (larger) units of social organisation more appropriate for investigation.

There is simply no agreement among researchers about which unit of human social organisation is the most appropriate one for studying evolution.

Many other criticisms have been levelled at Dunbar’s theory and show that it, and the predictions emerging from it about human social organisation, are widely regarded as overly simplistic.

Verdict

While Professor Ehrlich correctly quoted the number 150 as Dunbar’s number, he didn’t quite present the whole picture. He could have been more accurate by linking the figure to its source and he overlooked the abundance of contradictory and highly critical published studies of Dunbar’s theory. – Darren Curnoe


Review

The author has provided a good, critical assessment of Dunbar’s number and a useful discussion on the weaknesses of correlative comparative studies.

Dunbar’s number is an arresting idea with a pithy name, easy to digest and just counter-intuitive enough to have broad appeal, which may explain why the idea it encapsulates has caught on so readily outside of primatology and anthropology.

Despite the limitations and problems with Dunbar’s number and the idea that neocortex size seems adapted to living in social groups of fewer than 150 individuals, I believe Dunbar’s thinking remains useful.

In particular, I see value in Dunbar’s argument that at each level of closeness, we are limited in how many relationships we can have: an average of five intimate supportive relationships, 15 close friends and so on.

Whatever the neurobiological mechanisms, Dunbar has made useful predictions about the limited nature of human social capacity, and they remain to be thoroughly tested against competing ideas.

Paul Ehrlich quoted a piece of science that has become pop folklore, but that is also controversial.

However, the point he was making – that humans have limited social capacity and that our evolved social capacities don’t suit us well to living in societies of millions – has not been refuted.

I suggest that readers interested in this topic listen to Dunbar’s TED talk, especially the bit from about 7:20 in which he discusses the layered capacities for different types of relationship. – Rob Brooks


This article was originally published on The Conversation.

Why Are Humans Unique? It’s the Small Things That Count

Can there be any more important a question than, ‘How did we get here?’

Of course, I don’t mean those books we all gawked at as tweens desperate to understand our transforming pubescent bodies.

I mean, ‘How did we get here, as a species?’ ‘How did we come to be so different to all other life?’

In the way that we look: with our large, balloon like brains and skulls, hairless bodies, tiny teeth, protruding chins, puny muscles, and bobbling about on two feet.

Also in the ways that we behave: with our remarkably complex and conscious brains, articulate speech and language, symbolic, creative minds, and extraordinary imagination.

And how did we come to occupy virtually every nook and cranny the planet has to offer, even travelling to places beyond Earth?

The fossil, genetic and archaeological records provide the only hard evidence we have about our evolutionary past.

Yet, even if we cast our attention back to the Palaeolithic (or Stone Age) we really get no sense at all that we as a species would be destined to be the apes that would eventually shape the planet itself, on a global scale.

But each year, with the rapid pace of scientific discovery about our evolutionary past, our ‘biological patch’ is getting smaller and smaller; and, 2015 has been a truly remarkable year in this sense.

It seems like a good time to pause and take stock: How different are we? And, what can the records of our evolutionary history tell us about the journey to human uniqueness?

Our evolutionary branch on the tree of life began a mere 8 million years ago: a time when we shared a common ancestor with living chimpanzees.

Homo sapiens, also called ‘modern humans’ by anthropologists – a concept I’ll return to later – evolved according to the fossil record more than 200,000 years ago.

That’s a long time ago in terms of human generations of course: roughly 10,000 generations back.

But it’s a mere blink of an eye in the history of planet Earth and life.

In broad terms, we can divide the human evolutionary story into two major phases, and in doing so, can trace the gradual assembling of different parts of the ‘package’ of human modernity.

In the first phase, between roughly 7.5 million and 2 million years ago, we see a group of very ape-like creatures living only in Africa.

A famous example is ‘Lucy’ from Ethiopia who belongs to the species Australopithecus afarensis and lived between around 3 and 4 million years ago.

These prehuman apes were very ‘unhuman’-like, except in one or two key respects.

Most importantly, they walked upright, on two feet, when on the ground, as we do; but also spent a lot of their time living in trees.

They also had brains and bodies similar in size to living chimpanzees.

From among these two-footed tree swingers, the human genus, Homo, branched off, ushering in the beginnings of apes that would live permanently on the ground.

Homo appears in the fossil record close to 3 million years ago – as we learned just this year with a new fossil jaw from Ethiopia which added half a million years to the history of our genus.

With Homo we see brains getting much larger very quickly, bodies reaching human size, and our muscles, especially those used for climbing, becoming pretty weak.

Very likely also at this time, body hair became short, fine and patchy as prehumans became obligate, ground-dwelling, bipeds.

We’ve also learned this year that we had previously underestimated the hand capabilities of these prehuman apes, which may have been pretty similar to our own.

Remarkably also, the earliest stone tools now date back to almost 3.5 million years ago: being invented by Lucy’s kind with their small brains.

Some archaeologists also think that some of the earliest members of Homo – notably Homo erectus – with its human body size, but brain three quarters the size of ours, may have been able to make and control fire.

The importance of fire is that it would have allowed our Palaeolithic ancestors to cook their food, unlocking new and sometimes safer sources of nutrition to feed an energy hungry and evolving brain.

But the oldest examples of fire are only around 300,000-400,000 years old, in the form of burnt bone and deep ash and charcoal layers in caves.

They are associated with the species Homo heidelbergensis or perhaps the earliest Neanderthals (Homo neanderthalensis) living in Europe and West Asia.

Still, it certainly predates Homo sapiens, showing that fire use is far from being unique to us, despite what Charles Darwin once opined.

This evolutionary time also marked the very first excursions by a two footed ape out of Africa, with Homo erectus settling Europe and eventually Asia as far east as present day China and Indonesia beginning from at least 1.8 million years ago.

Around a million years later the species Homo heidelbergensis appears in the fossil record, and also has a rather wide distribution across Africa, Europe and Asia.

Homo heidelbergensis is likely to have been the species that gave rise to both our Neanderthal cousins and we modern humans, and like us, it occupied a very wide range of environments, with a few important exceptions.

Now, one of the most exciting human fossil sites ever found is Sima de los Huesos – ‘the pit of bones’ – in Atapuerca, northern Spain.

Here, anthropologists have so far found more than six and a half thousand fossils of an early human species, dated to more than 500,000 years ago.

The bones are piled up one atop another in a way that strongly suggests they were deliberately disposed of in the cave, as complete bodies: in a kind of human rubbish pit.

But, some of the scientists working at the ‘pit of bones’ think the piles of fossils represent not just intentional disposal of the dead but indicate a sense of the afterlife, representing a kind of burial practice.

Again, hundreds of thousands of years before Homo sapiens appears.

We also now know from DNA extracted from the fossils from Sima de los Huesos that the bones sample an early part of the Neanderthal evolutionary branch.

This means that Neanderthals were disposing of their dead, but not necessarily burying them like we do, at least half a million years ago.

In tracing the origins of this (admittedly incomplete) list of features historically claimed to be unique to Homo sapiens we get the distinct impression that the ‘biological patch’ we humans have recognised as our own is narrowing rather quickly.

If many of the hallmarks of humankind can no longer be claimed as exclusive, what does this leave for our species to claim as unique, and to explain the differences between us and other life?

Not much, actually.

Anthropologists often use the term ‘modern humans’, more specifically, ‘anatomically modern humans’, more or less interchangeably with the species name Homo sapiens.

What’s meant by this term is essentially any fossil that would blend within the range of physical variation we see around the planet today, or in the recent past.

A related concept is that of ‘behaviourally modern humans’, which is used by archaeologists to distinguish humans whose behaviour we would recognise as being like our own.

Now, you might think this latter term would be unnecessary: surely, you might ask, anatomically and behaviourally modern humans are the same thing, right?

If only it were that simple!

Actually, the fossil record shows that the earliest bones that resemble living humans are from Africa, specifically, Tanzania, Ethiopia and South Africa, and are dated between about 220,000 and 170,000 years ago.

Why are they regarded to be anatomically modern human? Mostly on account of their bubble shaped skulls, large brain volumes, small teeth, and finely built jaws with protruding chins.

Anatomically modern humans got into West Asia, specifically present day Israel, more than 100,000 years ago.

But, until very recently, it was thought they didn’t get anywhere east or north of the Levant until much later, perhaps only 50,000 years ago, at most.

Skeletal remains dating to around 40,000 years old have been found at Lake Mungo in Australia, Niah Cave in Malaysian Borneo, Tam Pa Ling in Laos, and Tianyuan Cave near Beijing in China.

Just three weeks ago we learned that anatomically modern humans have been in East Asia, specifically southern China, for at least 80,000 years, and perhaps even 120,000 years.

Forty-seven human teeth from the site of Daoxian Cave, which are remarkably modern looking, provide a strong case for the precociously early occupation of the region by our kind.

When do we see the earliest evidence for behaviourally modern humans?

Stone tools don’t give us any real insights into this question for the first 100,000 years or so of our evolution as a species.

That’s right, there is a gap of more than 100,000 years between the appearance of anatomically modern and behaviourally modern humans. Odd right?

The ‘smoking gun’ that archaeologists look for when trying to pinpoint the emergence of the modern human mind is the signs of symbolic behavior.

When we think about symbols we know that among living species we humans are the only ones, as far as we know, that are capable of inventing them.

Chimpanzees have been taught to use sign language or simple pictographic languages and they do so to great effect, but they don’t invent the symbols themselves.

A good example of a simple yet powerful symbol is the cross, as explored in an episode of my UNSWTV series, ‘How did we get here?’

One episode of ‘How did we get here?’ explores the human use of symbols and the role they play in our lives.

How might we get at this kind of thinking, of a symbolic human mind, from the archaeological record?

Archaeologists point to examples like the:
• Making of jewellery, with shell beads at least 100,000 years old in Africa.
• Grinding up of ochre to make paint, used to decorate living bodies or those of the deceased as they were prepared during a burial ceremony.
• Cremation of the dead, with the earliest evidence being from Australia in the form of the Mungo Lady, who was cremated more than 40,000 years ago.
• Rock paintings on cave walls, the oldest, as of last year, being found in Indonesia and dating to about 40,000 years old, older than anything in Europe or Africa.

We modern humans also live in places other human species simply haven’t been found.

There’s clear evidence, especially from the archaeological record, that only modern humans have occupied deserts, rainforests, the Arctic Circle and even the Steppe Grassland environments seen in Siberia and Eastern Europe.

While we’re remarkably flexible and able to alter our diet, behavior and technology to suit our circumstances, this all occurred well after 100,000 years ago.

Why then did it seemingly take more than 100,000 years after our appearance as a species for the first signs of the modern human mind to make a show?

One possibility is that some kind of revolution occurred around this time – perhaps the arrival of complex human language being associated with a gene mutation.

One candidate is the FOXP2 gene, which is vital for the development of normal speech and language.

This gene is shared with Neanderthals and chimpanzees as well, but we humans have a particular mutation affecting the regulation of the gene that is not found in the genome of our cousins.

Ironically, as we gather more scientific evidence, and our technologies get more powerful, the big questions about our past, evolution and place in nature get harder to answer with any satisfaction.

With only around 100 genes of any consequence distinguishing us from our Neanderthal cousins, and most of them being related to our immune system, skin or sense of smell, we are being forced to focus now on the small biological changes in our evolution to explain what feels like a massive gulf.

Seemingly minor genetic changes had profound consequences for us as a species and, as it turns out, for the well-being and future of the planet as well.

How a One Night Stand in the Ice Age Affects Us All Today

Over the past half decade, ancient DNA research has revealed some surprising aspects to our evolutionary history during the past 50,000 years.

Perhaps the most startling of these has been the extent to which the ancestors of living people across the planet interbred with other closely related species of human.

But where in the world did these cross-species matings occur? Which archaic species were involved?

Just how much of the human genome comprises DNA from these archaic relatives?

And what impact did interbreeding have on our evolution and general biology as a species?

These questions are at the core of current research into interbreeding, as revealed by DNA sequences obtained from fossils in Europe and Asia, as well as from comparisons with the genomes of living people.

In Africa, interbreeding with an archaic species has left genetic signatures in the genomes of some living sub-Saharan populations.

Roughly two percent of the DNA of these people derives from an archaic species as a result of mating that occurred around 35,000 years ago.

The very well known Neanderthals – for whom we have hundreds of fossils including near complete skeletons – interbred with the founders of living European and East Asian populations.

Estimates published in 2014 indicate that 1.5-2% of the genome of living non-Africans was inherited from Neanderthals.

Yet, East Asians have significantly more Neanderthal genes than Europeans do indicating that their ancestors interbred with this archaic species perhaps more than once, or in an event separate to that involving the ancestors of western Eurasians.

Another species, the mysterious ‘Denisovans,’ is known from the fossil record only by a single tooth, finger bone and toe bone.

Yet their fully sequenced genome shows that they shared their genes with the ancestors of some Southeast Asians, New Guineans and Aboriginal Australians.

These living people also show the genetic signs of interbreeding with Neanderthals, so have inherited DNA from both of these species.

Not only do we all carry the evidence for these interspecies dalliances, but in some cases these genes seem to have provided real benefits for us today.

Take for example the finding last year by Emilia Huerta-Sánchez and her team that the ability of populations living today in Tibet to thrive at high altitude is the result of a gene inherited from the mysterious ‘Denisovans.’

The gene in question – EPAS1 – is associated with differences in haemoglobin levels at high altitude underpinning the capacity of the individuals carrying it to pump more oxygen around in their blood.

The Denisovans also seem to have contributed genes that bolstered the immune systems of people in New Guinea and Australia.

In Europe, interbreeding with the Neanderthals may also have provided gene variants associated with lipid catabolism, or the conversion of fat to energy in the body’s cells.

Other examples include genes associated with: sugar metabolism; muscle and nervous system function; skin formation and structure; skin, hair and eye colour; and the female reproductive system, especially the formation of ova.

But of course we would expect natural selection to work in both directions given that these mating events were between different species: Homo sapiens x Homo neanderthalensis, Homo sapiens x Denisovans and Homo sapiens x mystery African species.

One particularly interesting example compared the genome of a female Neanderthal with 1,000 contemporary human ones from across the world and found clear evidence for negative selection.

Mapping the DNA of Neanderthals against this large number of human genomes also showed that there were vast ‘deserts’ of Neanderthal ancestry.

Comparing one-million-base-pair windows across the autosomes (i.e. chromosomes other than X or Y) revealed four windows in Europeans and 14 in East Asians where only around 0.1% of the DNA was Neanderthal.

The human Y chromosome is also known to be lacking Neanderthal DNA suggesting strong natural selection against hybrid males, who were likely to have been infertile.

Other genes inherited from the Neanderthals seem to have conferred greater risk for a range of diseases such as lupus, biliary cirrhosis, Crohn’s disease, altered optic-disc size, smoking behaviour, IL-18 levels (producing inflammation) and type 2 diabetes.

One of the especially odd things about the evidence for interbreeding with the Denisovans is that the only fossils we have for them were recovered from Denisova Cave in southern Siberia, some 6,000 km northwest of New Guinea.

How can this be given the very high frequency of Denisovan genes in New Guineans and Australians and apparently low level or even absence of Denisovan DNA in the genomes of mainland East Asians?

One study by Skoglund and Jakobsson suggested Denisovan DNA may also be found in mainland East Asians, but this has been controversial and difficult to pin down owing to its apparently very low levels.

But if correct, perhaps the mating with the Denisovans happened on mainland East Asia, not so far from Denisova Cave, the genes being carried later to New Guinea and Australia?

A new study of the occurrence of Denisovan DNA in living humans published in the journal Molecular Biology and Evolution has finally confirmed the widespread signal of a low level of Denisovan ancestry across Eastern Eurasian and Native American populations.

Pengfei Qin and Mark Stoneking of the Max Planck Institute for Evolutionary Anthropology examined a set of 600,000 genetic markers in 2,493 individuals from 221 worldwide populations.

They found that for living New Guineans and a single genome sample from the North of Australia around 3.5% of their DNA derives from the Denisovans.

In contrast, in East Asians and Native Americans the amount of Denisovan DNA plummets to a minimal 0.13-0.17% of their genome.

Qin and Stoneking concluded that Denisovan ancestry is therefore strongly associated with New Guinean ancestry.

So, the presence of Denisovan DNA outside of New Guinea – its place of highest occurrence – is probably the result of recent population migrations from New Guinea into Australia, Southeast Asia and mainland East Asia.

In other words, at some time in the past some New Guineans migrated into northern Australia and back to mainland East Asia carrying their Denisovan DNA with them and spreading it around the region.

So far, no archaeological or genetic evidence has been found to support the idea that New Guineans migrated back to Asia well after New Guinea and Australia had been settled.

But, with so many new findings coming from ancient human DNA, and many archaeological models confirmed in the process, we simply can’t afford to dismiss this one.

Once again genetic research is turning long held notions about our evolution on its head: bring it on I say!

Did ‘Rising Star’ Shine Too Bright?

Last week was rather exceptional for human evolution science, even for those of us who are used to the extravagances of media attention that surround the field.

We were spoilt with the announcement of no fewer than two major discoveries in as many days.

The first of them – the new South African species Homo naledi – attracted a great deal of attention from a media only too keen to indulge in truckloads of hyperbole and speculation.

The other announcement – the sequencing of ancient DNA from 300,000 to 400,000 year old fossils from Atapuerca in Spain – barely rated a mention in the press, overshadowed by the naledi hype.

This was probably in part because it was announced at an international conference, the coverage it received in Science suggesting it will shortly be published in detail in this prestigious journal.

This seems to be a regular practice by Science these days, as shown with other similar discoveries.

But perhaps also the announcement of Homo naledi the day before the Atapuerca DNA study broke meant that the media had been largely saturated; so it was also a bit of bad luck in the timing.

So, what was all the fuss surrounding Homo naledi about?

The bones of this new species were discovered accidentally by cavers exploring the Dinaledi Chamber of the ‘Rising Star’ Cave in the Cradle of Humankind region near Johannesburg, and subsequently brought to the attention of scientists.

A modest excavation resulted in 1,550 fossils from an extinct human relative, representing the partial skeletons of at least 15 individuals.

The teeth are described as primitive but small; its hand, wrist, lower limb and foot bones are human-like; while other bones of the trunk, shoulder, pelvis and thigh are also quite primitive, being a lot like species of Australopithecus.

Reading the scientific article describing Homo naledi, you realise that the work is detailed, rigorous and careful.

It involved a large number of specialists covering a very wide set of physical features on the bones and teeth.

The case for the new species is, in my opinion, detailed, compelling and praiseworthy.

So far, so good: another new species, the human tree gets all the more interesting, and complicated.

The human drama surrounding the discovery of the bones and their recovery by a group of petite, commando-style female cave explorers is also fun and adds a lot of colour to the tale of the discovery of Homo naledi.

One rather odd thing about it though is that the scientists involved still haven’t determined its geological age.

This is unprecedented in my experience and raises lots of questions in my mind like: Did the scientists rush the announcement for some reason? Why didn’t they wait until they had an age estimate at hand before going to a journal? Are the geologists unable to date the fossils?

My ‘nonsense-filter’ also tells me that all the talk in the media about this new species burying its dead and having human-like morality, or that it dismantles one of the key pillars of human uniqueness, needs to be called out for what it truly is: absurd.

Completely unnecessary hype to sell the significance of the find to the media.

It’s just the sort of thing that infuriates many scientists and detracts from an otherwise significant discovery; pity really.

The fossils recovered from the site are so far apparently exclusively from naledi and may represent near-complete (or complete) bodies that ended up in one part of the cave.

The geologists involved believe the cave was always dark and therefore the bodies may have been deliberately placed there.

Could be, but there might be other explanations as well that need to be given much more serious scientific exploration.

Why leap to the most complicated, least likely explanation? I’ll leave you to work out why.

Even so, other very rich fossil sites like Sima de Los Huesos (the ‘pit of bones’) in Atapuerca, northern Spain, coincidentally the focus of the new DNA research, have also produced a very large number of hominin remains, and they may also have been put deliberately into the cave.

This site is between 300,000 and 400,000 years old.

Yet, as with Rising Star Cave, there is no evidence that they were burying their dead, or had a concept of the afterlife or morality or engaged in ritual or religious ceremony.

Archaeology, biology and neuroscience all tell us that such behaviours fall exclusively within the human domain, and I see nothing about this new find that changes this.

The oldest convincing evidence for funerary practice is associated with our species and could be up to 160,000 years old.

Again, it would have been helpful to know how old naledi really is; and speculating it could be as old as 3 million years, without any apparent evidence, as the team is reported to have done in the media, is like adding nitroglycerine to the fire of media speculation.

It’s one thing to get the message out to the public about the exciting discoveries we’re making and to educate the very people who kindly allow us the privilege of doing science using their hard earned tax dollars.

I’m thrilled when my colleagues announce their work to the media, even if I don’t always agree with their conclusions.

It can be fun to have a bit of a public stoush over interpretations, and the wider public benefits from a sense that scientific findings can be interpreted in varying ways.

Doing so helps enrich understanding of the human enterprise we call science, and helps maintain or even grow public interest in it, in a world driven by an overriding economic imperative and prone to disregarding the huge cultural and intellectual contributions science makes to society.

But if we go too far, we run the real risk of trivialising the huge investment of time, money, energy, care and intellectual effort that goes into many scientific discoveries.

It can also do damage to science itself and, dare I suggest, even contribute to the mistrust that increasing numbers of people in the Anglophone West seem to feel about it.

You can also end up with egg on your face, and some people never seem to learn this lesson.

In contrast, the Atapuerca DNA research has direct bearing on understanding the evolution of the living human species, which is quite rightly where the central focus of human evolution research should be.

Researchers have argued about three scenarios for the Atapuerca hominins: they might be the earliest known Neanderthals; or could sample the population that gave rise to Neanderthals; or perhaps are the common ancestor of both humans and Neanderthals.

The research, as reported by Ann Gibbons, confirms that they are in fact the earliest Neanderthals: a kind of ‘archaic’ Neanderthal if you like, and subsequently evolved into the ‘classic’ Neanderthals we see in Europe and West Asia by about 150,000 years ago.

What are the broader implications of the research for understanding the evolution of living humans?

First, the finding pushes the age of the shared human-Neanderthal ancestor well beyond 400,000 years ago, suggesting our species, H. sapiens, might also be at least this old.

Also, with the Atapuerca group living in Europe, it’s even possible that our species evolved in this or an adjacent region of Eurasia, and later migrated back into Africa.

And being close to the common ancestor, the Atapuerca fossils give us real insights into what it must have looked like and into the ancestral body form of our own species.

The fossils from Europe, Asia and Africa from around this time are physically very diverse, with some researchers thinking they represent multiple species, only one of which could be the ancestor of living humans.

Question is, which one?

This new research suggests the European branch is closest among them all and deserves much more attention in this regard.

In contrast, we don’t know, and will doubtless never know, whether Homo naledi had anything to do with the evolution of living humans, least of all whether its brain, mind or behaviour were anything like our own.

The Long Reach of the Past: Did Prehistoric Humans Shape Today’s Ecosystems?

We all know that humans are having a massive impact on the planet.

Our effects include altering the Earth’s rotation by damming large amounts of surface water; changing the composition of the atmosphere by punching a hole in the ozone layer and adding vast amounts of CO2, methane and other pollutants; transforming the composition and temperature of the oceans; and clearing large tracts of land and removing or dramatically altering vast numbers of terrestrial and aquatic ecosystems in the process.

Plenty of these changes are plain for all to see; others are more obscure, but no less significant.

And, with the COP21 UN Paris Climate Change Conference just around the corner, politicians, policy makers and NGOs are again turning their attentions to reaching an agreement that aims to keep global temperature change to below 2° Celsius.

A major issue for scientists studying the Earth’s physical and biological systems is just how great the influence of humans has been and for how long it’s been happening.

Our global destructive impacts mean that potentially any organism or ecosystem, and many of the earth’s physical systems like erosion, soil formation and water cycles, carbon and nitrogen cycles, and climate, have been affected in some way by human activity.

But can we disentangle the effects of human activity on these systems and organisms from natural signals and cycles?

I’d argue we probably can’t – that human impacts are just too wide ranging and too ancient, and that our disruptive and destructive effects have reached every part of the planet.

This means that probably every scientific study of any contemporary system or living organism catalogues the effects of our species and its economic activity in some way.

If we go back far enough to a time when humans didn’t exist, we have the potential to understand how the world looked and how natural systems behaved before we were around.

Problem is that the information we get is from the very incomplete and often biased geological record, in the form of fossils and various archives of climate and environmental change like isotopes recorded in ice or cave stalagmites.

And, of course, many organisms alive in the past are now extinct: the planet looked very different even just 20,000 years ago at the peak of the last major cold stage of the Ice Age or Pleistocene epoch.

Scientists like me who study extinct organisms and long lost ecosystems wonder whether large scale human impacts like those we see today are truly confined to the period following industrialisation.

Did the ‘Anthropocene’ really begin 215 years ago?

Or does the environmental legacy of our Palaeolithic ancestors reach into today’s world?

This issue is at the centre of one of the most hotly contested questions in palaeontology and archaeology, namely, the extinction of the Pleistocene ‘megafauna’.

But it’s a lot broader than this issue of course, and it cuts to the core of who we are as a species, the way we have evolved, and the lengths we’ll go to to ensure our own survival; some would argue even our future survival.

Some scientists have also suggested that the megafauna extinctions set the stage for the planet’s sixth major extinction event, which is unfolding before our eyes.

During the last phase of the Ice Age, between roughly 50,000 and 10,000 years ago, almost 200 species of mammals went extinct across the globe.

That’s half of the world’s mammals weighing more than 44 kg perishing in what was an instant in the long history of life on the planet.

A growing body of highly contested research suggests that humans may indeed have dramatically shaped the diversity of living mammals in the deep past, just like today, leaving us an impoverished natural legacy.

And, let’s not forget that humans with our average body mass of close to 70 kg are megafauna as well.

While our species is obviously still here, we remain the chief suspect in the extinction of our close cousins the Neanderthals, Denisovans, and probably other relatives around this time.

The megafauna debate has been highly polarised for decades: humans being blamed on one hand, and natural climate change on the other.

Environmentally altering activities like burning the landscape by ancient hunter-gatherers in places like Australia, for example, have proven very difficult to establish, and their possible impacts hard to separate from natural climate cycles.

Yet other studies suggest that ancient Aboriginal Australians were one of the major agents involved in dispersing baobab trees in northern Australia; so our environmental impacts can be quite surprising.

And the chronology of human settlement and timing of megafauna disappearance in Australia remain uncertain: significant barriers to resolving the extinction question with any certainty.

So again, a major difficulty is the poor quality of the information we have from the fossil, archaeological and ancient environmental records.

The spectre of the confounding effects of natural and human-induced environmental signals remains all too real even for the Ice Age.

Another way to approach the question of human environmental change in the past is to construct mathematical models to look at changes over time and the influences of natural cycles and changes compared to human facilitated ones.

A battery of such studies is beginning to point firmly to the prehistoric human colonisation of new parts of the planet as a major driver of extinction and environmental change; possibly the leading cause of the megafauna extinctions.

New research published over the last couple of months by Soren Faurby of the Museo Nacional de Ciencias Naturales Madrid and Jens-Christian Svenning of Aarhus University has also pointed the finger squarely at humans.

In an interesting twist, they modelled what worldwide diversity patterns of mammals might look like in the absence of past and present human impacts, based on estimates of the natural distribution of each species (5,747 of them) according to its ecology, biogeography and the current environmental template.

They found that prehistoric human-driven extinctions in addition to recent ones were probably an important influence on present global mammal diversity patterns.

They even suggested that areas normally thought by ecologists to be biodiversity hot spots, like mountains, may in fact reflect their role as refuges for species otherwise affected by hunting and habitat destruction, rather than reflecting a natural pattern.

I’m satisfied that a strong case exists that humans did play an extinction role and that there truly is a link between what our Palaeolithic ancestors were doing to the environment and what we’re doing today.

The difference now, of course, is that with almost 10 billion people expected by 2050 and the remarkably destructive technology we possess, we’re doing damage on an unprecedented scale and face a future on a planet with an irreparably damaged biosphere and dramatically altered atmosphere.

Spare Your Health, Budget, and the Planet: Ditch the Palaeodiet

Interest in the diet of our evolutionary ancestors would ordinarily be a topic of curiosity in only the most obscure of scientific circles.

But the popularity of the so-called palaeodiet has brought unprecedented attention to the foods consumed by Stone Age or Palaeolithic people.

And, it might ultimately be doing us all more harm than good.

The palaeodiet is claimed to be a recipe for natural health and able to cure a vast range of diseases.

We await the clinical trials to pass verdict on its claimed disease curing benefits, but at the moment most such claims seem like little more than snake oil peddling or faith healing.

Even a cursory look at the palaeodiet highlights huge contradictions and a wilful ignorance of the science behind human evolution and diet.

Proponents of the palaeodiet eschew all processed food, but are happy to suck on beverages like wine.

Don’t be fooled though, wine, like many other components of the human diet, is a processed food.

Wine making involves turning a fruit into an alcoholic beverage through the mechanical breakdown or heating of grapes, and the addition of sugar, acid, nutrients, yeast and other chemicals to promote fermentation, add flavour, remove sediment and preserve the wine.

And humans have been processing their food for tens of thousands, perhaps millions, of years, so it’s absurd to think you can exclude processed food altogether.

The palaeodiet eliminates all grains, legumes and potatoes, yet there is plenty of evidence that humans have evolved to eat carbohydrates, especially starches.

Take the amylase genes, which evolved to aid the digestion of starch, either in our saliva or in the small intestine via secretion from the pancreas.

Humans are unique among primates in possessing large numbers of salivary amylase genes and there is a clear association between gene number and the concentration of the amylase enzyme in the saliva.

Plant foods containing high quantities of starch may even have been essential for the evolution of the large human brain over the last 2 million years, according to new research by Karen Hardy from Universitat Autónoma de Barcelona and colleagues published recently in The Quarterly Review of Biology.

Our brains are three times the size of those of our chimpanzee cousins and are undoubtedly the seat of many of the differences between us in terms of our biology, including behaviour.

Previous models, such as the ‘expensive tissue’ hypothesis of Aiello and Wheeler, proposed that the use of stone tools facilitated a shift from a mostly plant-based to a largely meat-based diet in our ancestors in order to feed our large brains.

This shift, they suggested, facilitated the evolution of our enlarged brain as well as smaller teeth and reduced gut adapted for eating meat.

Yet there have been lingering doubts about, and sometimes claimed refutations of, the links between human evolution and meat eating.

There is no clear association across mammals including primates between an enlarged brain and reduced gut size.

Instead, large brains seem to be found in mammals that are capable of storing large amounts of body fat to stave off starvation and also have efficient forms of locomotion like our human bipedalism.

The new model from Hardy and co-authors suggests that cooked starch greatly increased energy availability to energy expensive tissues like the brain, red blood cells, and also the developing fetus.

They also suggest that the number of copies of the salivary amylase gene may have enhanced the importance of starch in human evolution following the controlled use of fire and the development of cooking.

But there are of course many sources of carbohydrates in the diet and research suggests that early humans may have eaten underground food items like roots, tubers and rhizomes, as well as seeds, certain fruits and bark which are all widely available and rich in starch.

Grains were also an important and highly effective source of carbohydrates in the Palaeolithic, despite what the palaeodiet states.

Grinding seeds to make flour and probably bread is known from at least 25,000 years ago in Europe, arguably much longer, and humans have been cooking for at least 400,000 years, but perhaps even 2 million years.

The truth is we have no idea how much meat was eaten in the Palaeolithic, because so few plant food remains have been preserved for us to study and so garner an accurate picture of the complete diet of our ancestors.

Mammal bones with signs of butchering or cooking are plentiful in the archaeological record, but bones preserve as fossils far more readily than plant remains, and so we have a highly skewed view of past diets.

We would also do well to keep in mind that the role and safe amounts of animal food in the contemporary human diet remain controversial in nutritional and medical science regardless of what we think our ancestors may have eaten.

Red meat in particular has been linked to a range of diseases like metabolic syndrome, a variety of cancers, atherosclerosis and Type 2 diabetes, so a degree of caution about safe levels of consumption seems wise.

If your aim is to lose weight, then the palaeodiet is by no means your only option.

Much clinical research has shown that the key to weight loss is reducing the total amount of calories consumed, regardless of whether they come from carbohydrates, protein or fat.

Watch what you eat, reduce your calories and lift your activity level, is a tried and true formula that works for most people.

Studies of hunter-gatherers during the last couple of hundred years have also shown they walked an awful lot: on average 360 km a year, but up to 3,600 km per annum.

So, you might consider a palaeo-exercise regime combined with a scientifically based and balanced diet as a healthy starting point for weight loss and general good health, rather than the potentially dangerous palaeodiet.

Nutritionists also advise greatly reducing the amount of factory-made food we consume, because much of it lacks nutritional balance and is often excessively high in calories, sugar, salt or fat.

I guess this is one thing palaeodieters and nutritionists are close to agreement on, probably because it seems an awful lot like common sense.

While palaeodiet inventor Loren Cordain argues we should only be eating animals that have themselves eaten a ‘wild’ diet, Australian celebrity chef Pete Evans has extended it to consuming only organic food.

Adopting such an approach to food selection is impossible for most of the planet’s 7 billion inhabitants who couldn’t afford expensive organically grown food.

Evans wants the palaeodiet to be the new ‘normal’ for everyone, but to me, this smacks of Western middle class elitism and is simply out of touch with the realities faced by most people on the planet.

Anyway, most of the sources of animal food consumed by palaeodieters are from domesticated animals, which have been bred for flavour and meat quantity, and haven’t eaten a truly wild diet for thousands of years.

Eating a diet based on wild caught food would also be devastating for the planet.

The environment is becoming degraded and its natural resources depleted on a remarkable scale and pace, and a good deal of this is associated with agriculture and activities like fishing.

It’s estimated that each year tens, perhaps hundreds, of millions of sharks alone are harvested from the oceans and in many places fisheries are far from sustainable.

Similarly, if your concern is animal welfare, then organic farming may not always be the best choice.

We need to get the balance right in our food choices between the broader effects of production on the environment, welfare of livestock and impacts on humankind more broadly.

The United Nations predicts there will be almost 10 billion people in the world by 2050.

This will lead to a dramatic need to increase food production to feed the extra people.

The scale of the challenge ahead was pithily described by Charles Godfray and co-authors in an article about the challenges of population growth and food security in Science magazine in 2010:

This challenge requires changes in the way food is produced, stored, processed, distributed, and accessed that are as radical as those that occurred during the 18th- and 19th-century Industrial and Agricultural Revolutions and the 20th-century Green Revolution. Increases in production will have an important part to play, but they will be constrained as never before by the finite resources provided by Earth’s lands, oceans, and atmosphere.

All of this within the context of the growing impact global climate change will have on food and water availability as well.

If we’re truly concerned about the fate of the planet and humankind, especially those of us in the West, we all need to be prepared to compromise on our lifestyles, including our diet, and ditch luxuries like the palaeodiet.

Eating large amounts of meat, especially animals which have eaten a wild diet, is simply unrealistic, unsustainable and unreasonable if we want to do our bit for nature and the rest of humankind.

How Many Forms Can An Ape Take?

The study of form has been central to biology ever since people have contemplated how life came to exist and how individual species or groups of them are related to one another.

When biologists speak of ‘form’ they mean the shape, appearance or structure an organism takes — be it whole organism or only a constituent part such as a bodily system, organ, microscopic structure or even a molecule.

A famous example from the 20th century is the form of the DNA molecule, which we have known to be a double helix since Watson and Crick published their model in 1953.

Palaeontologists like me are especially interested in form because it gives us clues about the diversity of past life and deep insights into the history and mechanisms of evolution.

Most organisms seem to be well designed for their ecological circumstances, an observation that is as old as biology itself.

Why this is the case, and how good the fit between organism and environment actually is, remain fundamental questions in evolutionary science today.

Developing views about form

Interest in form goes back to the Ancient Greeks who were the first people to formally observe the great variety of life and explain how it came into being.

In his work Historia Animalium (The History of Animals), Aristotle (384-322 BC) produced one of the first scholarly works devoted to the subject of comparative anatomy, or the comparison of form among animals.

A little later, the Roman naturalist and philosopher Pliny the Elder (about 23-79 AD) also pondered the diversity and relationships of life and was also the first person to describe the strong resemblance of humans to primates in his book Naturalis Historia (Natural History).

Galen of Pergamon (c. 130-199 or 217 AD), a prominent Greek physician, surgeon and philosopher in the Roman Empire, whose ideas dominated Western medicine for over 1,000 years, also recognised similarities in form between humans and primates.

He is even said to have urged his students to study primate anatomy in order to develop a better understanding of humans.

Seventeen centuries later, Charles Darwin provided in 1859 a mechanism by which favourable forms could become widespread within species and persist over long periods, namely, through natural selection.

His understanding of how form arose within individuals, and was inherited and modified, remained rather rudimentary, and his speculations were ultimately shown to be incorrect.

These problems would have to await a proper understanding of growth and development, including embryology, and the principles of inheritance, all of which were developed during or soon after Darwin’s time.

It was really with D’Arcy Thompson’s 1917 book On Growth and Form that the examination of form took on a more scientific bent.

While others before him like Goethe had recognised the importance of form — inventing terms like ‘morphology’ to describe it — theirs was a largely descriptive approach, not an explicitly geometric or mathematical one.

In his book Order and Life, Joseph Needham wrote, “the central problem of biology is the form problem,” and it remains so today.

In the 1990s the scientific discipline of ‘evo-devo’, or evolutionary developmental biology, blending embryology, genetics and evolution, began to provide deep insights into how form was constructed and how it changed.

Evo-devo marked the beginning of a profound shift in our understanding, but there remain many unanswered questions.

A limit to form?

One question that has plagued the study of form since before Thompson is whether nature sets limits on how many different forms life can take.

Or put another way, is evolution constrained in the solutions it can invent to solve the ecological problems species face?

Influential palaeontologists like Simon Conway-Morris of Cambridge University, who has devoted his career to studying the Cambrian explosion of animal life, argue that there is indeed a limit to the number of possible forms life can take.

The evidence for this, he argues, comes from the prevalence of repeated forms in nature across distinct evolutionary lines, or what biologists call evolutionary convergence.

If this were correct, what might be the cause of such limitations to form? Is there, for example, only a limited number of combinations that genes can take?

The genes that control the body plan in the developing embryo such as the Homeobox cluster are after all highly conserved and very ancient.

There are striking similarities in these genes even between humans and fruit flies.

But Conway-Morris sees repeating forms in nature as evidence for God or design.

For me, Creation is too complex a solution to be satisfactory, one that raises far more questions than it answers, and one not really amenable to testing within the scientific framework.

Besides, evo-devo has opened up many more possibilities now including tinkering by natural selection in the timing of key events during embryogenesis through genetic mutation or even epigenetic influences.

Deep insights from apes

If we take a close look at the animals I’m most familiar with — humans and our ape cousins — we find good reasons to be sceptical that the kind of constraints on form that Conway-Morris supposes actually exist.

Research published recently by Sergio Almécija of George Washington University and co-authors shows that scientists have dramatically underestimated the ways in which the body form of apes can and does vary.

For primates, whose lives are spent mostly or entirely within trees, grasping hands are one of the main ways in which they interact with their environment.

Hands hang onto branches when primates move about or rest, they grasp food for eating, and apes groom each other using their well-developed handgrips.

Almécija and colleagues examined the proportions of the bones of the hand and hand digits in humans and other apes and found that they varied an awful lot, much more so than we had all been led to believe before now.

They found that among the apes, gibbons possess a highly distinctive form of hand, while chimpanzees and orangutans have similar hands, which evolved independently of one another.

Gorillas and humans had very conservative or ‘primitive’ hands that had changed very little during our evolutionary histories.

Even today, human and gorilla hands look an awful lot like monkey hands rather than those of our chimpanzee cousins.

So our monkey-like human hands also turn out to be an awful lot like those of the earliest members of our evolutionary group, the bipeds who first strode the African savannah seven million years ago.

One big implication of this work is that it challenges a long-held assumption that living chimpanzees are a lot like our earliest human ancestors: a kind of evolutionary snapshot of our own earliest bipedal ancestors, if you like.

For decades now we have been studying chimpanzees to glean insights into our immediate evolution, but this approach looks increasingly problematic.

Coming back to where we started, there also seem to be far fewer limits to form than many scientists have believed, and important features like hands can change or stay the same in ways that are not immediately obvious or predictable.

Evolution Took Many Paths to Building ‘Pygmy’ Bodies

For more than two centuries physical anthropologists have been preoccupied with cataloguing and explaining the way humans vary physically across the planet.

We mostly differ in familiar ways: body mass and stature, limb proportions, head size and shape, nasal prominence, proportions of the face, tooth size and shape, hair form and colour, skin pigmentation, iris colour, among others.

From the late 18th through to the second half of the 20th century, physical variation was assumed to reflect the deep division of humanity into ancient ‘races’.

With major developments in human genetics from the 1960s onwards the notion of races began to be dismantled and eventually their use fell into disrepute.

Present day racialist stalwarts are wilfully ignorant of genetics or live in a vacuum divorced from the history of race theory and realities of human biological variation.

That’s not to deny that human geographic variation exists, a fact demonstrated powerfully by genetics time and time again.

But the way we vary along geographic lines simply doesn’t fit the old racial categories; but then, they never were about science, as readily acknowledged by early physical anthropologists like Johann Blumenbach.

One group that has received more than its fair share of scientific and racialist scrutiny is the so-called ‘pygmy’ peoples.

These are short statured populations – average height around or below 150 cm – found in many parts of the world including in Africa, South Asia, Southeast Asia, Australia, New Guinea and South America.

For example, the Efe hunter-gatherers of the Ituri rainforest (Democratic Republic of Congo) have mean adult female and male statures of 136 cm and 143 cm.

Such populations have played a major role in evolutionary models, underpinned by racialist theory, such as the Negrito settlement of Southeast Asia and Australia.

The term ‘pygmy’ is still widely used in science and the popular reporting of science, but we should be a little more circumspect about its use.

It’s a term tied to 19th and early 20th century Social Darwinism, with pygmies seen as a lower stage in human evolution, and therefore a lesser race than the imperialist Europeans who studied them.

Their small body size, and, it was argued, diminutive brain size, were seen as a kind of infantile stage in the evolution of humankind.

In fact, their brain sizes lie comfortably within the range of other, taller, populations.

Unsurprisingly, the ‘all grown up’ and strapping northern European lads doing the racial categorising saw themselves as occupying the apex of human evolution.

Pygmy is also a term these people don’t apply to themselves, and so has often come to be seen as derogatory.

Most of these small-bodied populations inhabit tropical rainforests, such as intensely studied groups like the Aka, Mbuti, Baka and Efe in tropical Africa, Andaman Islanders in South Asia, Aeta, Agta and Batak in tropical Southeast Asia, and the Mountain Ok and Mafulu peoples of New Guinea.

But there are some important exceptions, such as the San in southern Africa who live in the Kalahari Desert, and some of the Khoikhoi people who live around the southern African cape.

While it’s been argued they migrated to these regions from the tropics, genomic research indicates the Kalahari may be the evolutionary homeland for the human species, with the San being one of the oldest genetic lineages of humankind.

These peoples are, or were until recently, hunter-gatherers and so have also been the subject of intense research about all sorts of questions surrounding human evolution, including today by evolutionary psychologists.

We would be wise to be sceptical about this though: living hunter-gatherers don’t represent a snapshot of a lost world, or stage of humanity’s evolution, as such studies often imply.

Dwarfism is of broad interest in evolutionary biology because it’s known to affect many mammals especially on island settings.

Take the Ice Age pygmy elephants of the Mediterranean islands, or the last of the woolly mammoths, which became pygmies in the Arctic region.

Another celebrated case is the so-called Hobbit from Flores – Homo floresiensis – also thought to be an example of island dwarfing.

The underlying cause of the small body size of ‘pygmy’ people has been one of the major themes of human biology for decades.

Short stature has been shown to be associated with perturbations in the GH1-IGF1 pathway, one of three endocrine systems that regulate growth.

A number of studies have shown that many, but not all, of these hunter-gatherer groups have low plasma levels of IGF1, hinting at differences in their underlying genetics.

Chronic malnutrition can also lead to low levels of IGF1 and other growth hormones, so environmental effects have also been implicated.

Genetic studies over the last half decade have suggested a complex role for DNA mutations and natural selection in at least some cases of human population dwarfism.

Selection could plausibly have acted to reduce body size in rainforest dwelling people, as they occupy a kind of ‘ecological island’ with scarce and difficult to acquire food resources.

Small bodies require less energy so people with them could survive on lower caloric intakes.

Other explanations have included thermoregulation in a hot tropical environment, the need to reduce energy expended during locomotion, and a life history explanation keyed into a lower reproductive age.

The scientific jury’s still out on which one provides the best explanation.

A new study published in Nature Communications has examined the growth patterns of the Baka people from Southeast Cameroon and offers some fascinating new insights into the mechanisms underpinning the human ‘pygmy’ phenotype.

Fernando Rozzi from the Centre National de la Recherche Scientifique in Paris and his team studied mission records gathered from the 1970s onwards by medically trained nuns, documenting the growth of almost three hundred children.

In their paper they found that body size at birth was within the normal limits set by larger bodied populations, but that the growth rate of the Baka infants slowed significantly during the first two years of life.

After this, their growth more or less followed the standard pattern seen in people across the world, including the adolescent growth spurt accompanying puberty, which is a universal and unique characteristic of our species.

The Baka growth pattern also contrasts with that documented for other short-statured populations in Africa.

So evolution seems to have acted to produce the same outcome in different populations using different mechanisms.

We know that across life, evolutionary tinkering with growth and development is one of the major causes of differences among closely related species.

Body size is also one of the most important ecological variables among mammals and so understanding the mechanisms that alter it provides profound insights into evolution and fundamental ecological strategies.

Finally, the fact that such vast differences in growth have been found between short-statured populations on the same continent, evolving independently, shows once again that the old race categories like ‘pygmy’ or ‘Negrito’ are simply incapable of doing our evolutionary history justice.


Making Sense of Our Evolution

The science of our special senses – vision, smell, hearing and taste – offers fascinating and unique perspectives on our evolution.

Yet it remains patchy; we know surprisingly little for example about how our sense of hearing has evolved since we shared an ancestor with chimpanzees some 8 million years ago.

In contrast, understanding of the evolution of human vision and smell, including new developments in ancient DNA research, offers great promise in answering some long standing questions about our uniqueness as a species.

A very visual mammal

Humans live in a world dominated by images and colour. Our sense of vision largely dictates how we perceive the environment around us.

We’re also prone to summing up others by the way they look. Faces and expressions, skin, eye and hair colour; as if we can read someone’s heritage or personality like a book.

All of this hints at the crucial role vision plays in our social lives as well.

For our kind of mammal, the primates, vision is king.

Our ancient ancestors evolved for a life in the trees and today most of our primate cousins still lead an arboreal existence.

The need to safely judge distances when leaping or climbing about a canopy, tens of metres above a forest floor, certain death only a single wrongly placed hand grip away, must have led to intense natural selection.

We have, as a result, highly refined vision, with monkeys and apes, including humans, possessing stereoscopy: we see in three dimensions.

Our skulls, eyes and brains have evolved to facilitate 3D vision: eye sockets that face forwards, fields of vision from each eye that overlap, and brains that process visual information from each eye equally across the left and right hemispheres.

Trichromatic vision allows humans and many other primates to perceive perhaps 10 million colours; its evolution probably keyed into the eating of fruit by our distant primate ancestors, allowing it to be distinguished against a forested backdrop of leaves.

The primate eye is the largest relative to body size of any mammal; a legacy perhaps of the nocturnal lifestyle of the earliest primates.

Yet, the human eye is unusual among all primates in having an exposed sclera, the outer layer of hard tissue that encloses and protects it.

The sclera is also white, but in other primates it’s pigmented, being brown in colour, and probably acting as camouflage.

The depigmented white of the human sclera plays a role in enhancing communication especially when we make eye contact and may have a function in sexual attraction as well.

When we compare the human skull with our Neanderthal cousins we find their visual system was probably better developed than ours, as estimated from the volume of their eye sockets (orbits) and the space that would have been filled by the occipital lobe of their brain.

Just what Neanderthals were doing with their eyes that was so different to us remains unknown. Did it help them in low light or snow covered landscapes, or with hunting?

So far, though, we’ve learned remarkably little about the evolution of sight from ancient DNA; a couple of genes were identified in the initial sequence of the Neanderthal genome in 2010, but little seems to have emerged since.

The neglected sense

Can you imagine what our lives would be like if our dominant sense was smell, or olfaction?

Scents and odours filling our world like colours on the visual spectrum, only the shades and tones would be odours.

Across all life, from bacteria to mammals, the ability to detect chemicals in the environment is fundamental to survival.

For most mammals the sense of smell dominates their world much as our sense of sight does.

The mouse has around 1,000 different cell types for detecting odours, or so-called olfactory receptors. Humans only have about 350 of them.

It’s also been a major catalyst of biological evolution. Among vertebrates alone at least four different kinds of olfactory systems have evolved.

The evolution of olfaction has also left a very large imprint on the mammal genetic code; olfactory genes represent the single largest gene family in the mammal genome.

The human genome contains an estimated 900 genes and pseudogenes associated with the perception of smells while the mouse genome has roughly 1,400 of them.

In comparison, the catfish has only around 100 olfactory receptor genes.

It’s the pseudogenes though that have attracted much of the research attention: pseudogenes have either lost their ability to produce proteins or fail to produce them within a particular kind of cell.

Around 60 percent of human olfactory receptor genes are in fact pseudogenes, compared with only about 30 percent in other apes and 20 percent in mice and dogs.

Charles Darwin thought that the human sense of smell was a vestigial (or ‘useless’) trait, and he may even have taken our large number of pseudogenes as confirmation of his ideas, had he known about them.

While Darwin’s view was clearly an overstatement, the dramatic loss of functional genes strongly hints at major differences between our sense of smell and that of most other mammals, including our ape cousins.

Still, we know that in living humans our sense of smell is anything but useless; it plays a role in our immune system, in social communication, reproduction including choosing mates and during courtship and sex, detecting emotional stress in others (‘emotional contagion’) and of course during eating.

Neanderthal smells

It’s long been suggested from fossil comparisons that humans have a better developed sense of smell than our Neanderthal cousins did; the opposite situation to our sense of vision.

The olfactory bulb – an organ that sits inside our braincase, overlies the nose cavity and transmits smell perception to our brains – was probably larger in humans for a start.

Ancient DNA has also opened up the possibility of studying differences in the olfactory genes directly across humans, Neanderthals and the mysterious Denisovans.

Last year Graham Hughes of University College Dublin and co-workers reported differences in olfaction genes between humans and our extinct cousins.

They found that 10 functional olfaction genes in humans were inactive (pseudogenes) in Neanderthals, and 8 in the Denisovans.

This points to subtle but probably ecologically important differences in smell between our species.

Another recent study led by Kara Hoover of the University of Alaska Fairbanks compared the ability of humans from a range of populations across the globe to detect an odour, focusing on a receptor gene called OR7D4.

This gene allows us to detect a smell called androstenone, which is produced by pigs and wild boar, and so it may have played a role in the diet of our ancestors.

The presence of the gene is known to be a good predictor of androstenone smelling ability.

Hoover and her team also studied the OR7D4 gene in the sequences of the Neanderthals and Denisovans and found the Neanderthal version was like our own, but the Denisovan one differed from it in terms of its DNA code, but functioned in a similar way.

Around 50 percent of living adults cannot smell androstenone while about 35 percent can detect 200 parts per trillion in air and are offended by it.

Androstenone is also found naturally in human sweat and urine, with boar androstenone even being marketed and sold as a human aphrodisiac.

So the ability of Neanderthals and Denisovans to detect it might eventually also have something to tell us about their sex lives as well.

Promise of DNA

We can only go so far with fossil studies when it comes to the special senses; so little information is preserved for us to reconstruct their anatomy, after all, and even less in the way of function.

But with vast numbers of olfactory genes available for study in the genome of living humans, our extinct cousins like the Neanderthals and Denisovans, and many living primate relatives, we’ve still a lot to learn about our remarkable sense of smell.

Aboriginal History Rewritten Again by Ignorant Political Class

Last week Liberal Democrats Senator David Leyonhjelm was widely reported as suggesting that people other than Aboriginal Australians may have occupied the Australian continent in the past.

At a doorstop at Parliament House he apparently couldn’t name his sources when pressed by journalists and seemed rather vague on the details.

His doubt was apparently based on disagreement among anthropologists over the identity of the painters of the so-called ‘Bradshaw’ or ‘Gwion Gwion’ rock paintings in the Kimberley region of Western Australia.

Now there is a very strong sense of déjà vu here, because this very issue was at the centre of a widely reported and politically fuelled stoush from the late 1990s to mid-2000s, but back then within the context of Native Title.

Actually, the debate over these paintings has existed ever since Joseph Bradshaw brought attention to them in 1892 because they were thought at the time to be ‘too advanced’ to have been made by Aborigines.

This fitted a 19th Century linear worldview in which societies progressed from primitive to advanced, the Bradshaw/Gwion Gwion paintings being touted as an anomaly made by an exotic people.

The Bradshaw/Gwion Gwion art style was however widely accepted by academic researchers from the late 1960s onwards as belonging within the broader rock art traditions of Northern Australia.

But following the publication of a book about the art in 1994 by amateur archaeologist Grahame Walsh the 19th Century view made a comeback.

Walsh argued that the Bradshaw/Gwion Gwion tradition was painted by a pre-Aboriginal group 20,000 years ago, Aboriginal Australians only arriving in the area 10,000 years ago.

In a second book published in 2000, he even went to great lengths to disconnect Aboriginal Australians culturally from the Bradshaw/Gwion Gwion paintings and instead connected them to a population possibly originating in Africa.

A great deal of space has been devoted in academic journals to deconstructing Walsh’s unfounded ideas and analyzing the political fallout from them.

Ian McNiven, an archaeologist at Monash University, wrote an article in 2011 in the journal Australian Archaeology about the 1990s/2000s public debate over them.

As he noted, there is very good evidence for cultural continuity between these paintings and recent art as documented for example by amateur archaeologist David Welch in 1996.

Paul Taçon who holds a chair in rock art research at Griffith University also pointed out in an article in Nature Australia (1998-1999) that Welch:

“has documented a recent use of every type of artifact depicted in Bradshaw art, strongly suggesting the paintings reflect Indigenous Australian way of life”.

More broadly, the science of human origins has moved a long way in the last two decades, not least because of big developments in genetic research.

DNA shows clearly that Aboriginal and Torres Strait Islander people are directly descended from the earliest humans to have settled Australia, New Guinea and surrounding islands.

Genetic clocks show they split from populations alive in East Asia today between 45,000 and 75,000 years ago.

Human skeletons from the Willandra Lakes region of southwest New South Wales also make abundantly clear that living Aboriginal Australians are the very same people as those who arrived here more than 40,000 years ago.

McNiven has also pointed out the very long history of the political use of archaeology to justify colonial ends by disassociating Indigenous people from their land and heritage.

He pithily concluded in 2011:

Thus, I suspect, we have not heard the last of colonialist interpretations of Gwion Gwion paintings. As long as Australian society struggles to comprehend and acknowledge Aboriginal Native Title rights, archaeology will continue to be manipulated by those seeking to undermine Aboriginal authenticity and legitimacy of connections to land and heritage.

And so it is now with Constitutional recognition of Australia’s First Peoples: once again Aboriginal and Torres Strait Islander people find their history and culture being rewritten by ignorant politicians for ideological reasons.

Senator Leyonhjelm’s comments are clearly an attempt to reopen the Bradshaw/Gwion Gwion debate and, in so doing, cast doubt over the legitimacy of Aboriginal and Torres Strait Islander people as the first inhabitants of Australia.

Sadly, he might just succeed within the context of a 24 hour news cycle and the seeming absence of a long term memory in the media and society more broadly.


Biology’s Holy Grail: the Species and its Controversial Recent History

How many kinds of plants and animals are there in the world? Where do humans fit within the vast fabric of life? Indeed, how did life, including humans, evolve?

At the centre of questions like these is the scientific practice of identifying and naming species, or taxonomy.

And, the basic unit of taxonomy – ‘the species’ – remains an elusive and controversial concept despite its fundamental importance to science.

Yet, few people outside of biology and philosophy realise that ‘the species’ has been at the centre of a major controversy in science for much of the last 50 years.

A cardinal science

Taxonomy is a fundamental or ‘enabling’ science that underpins all of biology and its many related fields including medical research.

How could we, for example, develop a vaccine or pharmaceuticals to fight deadly diseases like Ebola without knowing whether the culprit is a virus or a bacterium?

It’s also central to major global projects like the ‘Open Tree of Life’ which ambitiously seeks to reconstruct no less than the evolutionary relationships of the Earth’s 1.8 million named living species.

Yet, so far, biologists have recognised and named less than 20 percent of the planet’s estimated 11 million living organisms, some of them going extinct quicker than they can be discovered.

Taxonomy is also at the core of fields like my own, palaeontology, concerned with the study of ancient worlds and extinct organisms.

By examining the diversity of life in the past we glean insights into alternative worlds, helping us realise that the Earth hasn’t always been as it is today, and informing us about where we could be headed with a planet beset by anthropogenic warming.

Moreover, with about 99 percent of all species that have ever existed now extinct, some 20 billion species may have existed during the roughly 4 billion year history of life, leaving plenty of work for future generations of palaeontologists!

Taxonomy is also central to how we understand, enjoy and utilise nature in a sustainable way so that future generations might also share the Earth’s astonishing bounty and beauty.

A great debate about species

The ‘species’ is the most fundamental level in taxonomy and is also the unit of evolution.

The species is the only ‘real’ category in the taxonomic system – and by real I mean it has an objective existence in nature, at least according to most taxonomists and philosophers of biology.

Ironically though, it has proven to be the most troublesome of all the taxonomic categories to work with, and one of the most difficult concepts to define in science.

At present, there are at least 26 species concepts in use in biology, which adds enormous confusion to an already confusing area of science.

In fairness, though, most of them don’t enjoy widespread support or use, with only a handful – a half dozen or so – being routinely applied by biologists.

For much of the second half of the 20th century, and spilling over into the present century, philosophers and biologists engaged in an intellectual war over the ontological status (or reality in nature and meaning), basic properties and practicalities of recognising species.

Just what is this thing that we call a species? What are its properties? Is it like the names we give to cities or to our children? Does it have its own unique qualities by virtue of being biological?

Interbreeding, surely?

But wasn’t all this resolved a long time ago, I hear you ask?

I learned at school, or in first year university biology, that species are groups of organisms that interbreed with each other, I hear you say.

You old sentimentalist! Harking back to the mid-20th Century before the big species debate erupted. If only it were so simple.

The interbreeding idea was widely discussed by biologists even before Charles Darwin’s time, but it was only formalised as a species ‘concept’ during the 20th Century, taking centre stage in the ideas of Theodosius Dobzhansky and Ernst Mayr.

By ‘concept’ here I mean the species as a rung on the Linnaean hierarchy, a description or definition of what the species, in a generic sense, actually is.

Interbreeding was formalised by Dobzhansky and Mayr as the so-called ‘biological species concept’, although this is a misnomer because all species concepts are biological by definition.

Dissatisfaction with this concept was there right from its inception though.

One of the other chief architects of the ‘modern synthesis’ of evolutionary biology, George Gaylord Simpson, proposed his own concept known as the ‘evolutionary species’.

But, from the early 1960s onwards dissatisfaction grew so strong that it became the catalyst for a big debate that would consume much of biology for the next few decades.

Hard questions

From this time, philosophers and biologists began to ask some rather difficult questions, like:

• How does the species category compare with other scientific groups or types of things like say the chemical elements?

• Does it play the same kind of role in science – conveying the same sorts of information and allowing us to make predictions about nature?

• What’s the best, most objective, way to recognise a species?

Also, as intuitively appealing as the ability of organisms to interbreed is as a test of species membership, it’s been terribly difficult to apply in practice.

In fact, according to Lélia Lagache of the University of Bordeaux and co-workers, by 2013 it had only ever directly been applied once in a wild population!

So, it turns out we’ve all been cheated by the textbooks we read in high school or university.

Short-changed by our science teachers and biology lecturers.

A more honest reading of history shows that in fact most species – especially animals, the organisms I’m most familiar with – have been discovered and named on the basis of their physical appearance, or ‘phenotype’.

And the winner is..?

One of the biggest insights about the ‘species problem’ came during the late 1990s from the Smithsonian-based biologist Kevin de Queiroz who recognised that most of the concepts in use were simply a catalogue of the features that species might possess.

An example will help to explain its importance: cars.

They have a real existence, separate from us: an objective status if you will.

They possess an engine, four wheels, doors, a radio, need fuel, carry people and parcels and groceries, move, and come in a range of shapes and colours.

But does any one of these properties define a car adequately? Are cars described well by their engines or seats? Or by the fact that they have wheels or require fuel?

Some of these characteristics may be essential for them to be cars, but they don’t define what a car actually is.

Cars are, by definition, human controlled machines that move (propel in a controlled way) and carry people and other items from one place to another.

And so it is with species. Are species simply organisms that reproduce with each other? Or recognise each other’s mating call? Or share an ecological role or niche?

Like cars, there is something much more fundamental about species that defines them regardless of the particulars of any one or other species.

Species are groups of organisms that may do all, or some, of these things, but these qualities don’t define them.

It is much more productive to think of species as groups of organisms that share an evolutionary history.

They belong to their own branch on the tree of life; a branch with a beginning, a history, and eventually an end as well; an evolving lineage.

It’s all about the diagnosis

Focusing on the other fundamental issue here: if we don’t use interbreeding as our criterion, what should we use to distinguish species?

In a word, their ‘diagnosability’.

That is, the evolutionary branches we call species, in sharing a common history, will share a set of physical features which aren’t shared with other organisms.

They possess a set of unique features that makes them diagnosable or distinguishable from all other branches or species.

A modern human (left) and Neanderthal (right) cranium. The unique physical features of each group are used to define us as different species.
Darren Curnoe.

Think of living humans or Homo sapiens. We can recognise our kind as having a bubble-shaped brain case, faces tucked beneath the front part of our brains producing a steep forehead and jaws that sport a chin.

We are the only primate to have this set of features in our skeleton, and they define us as an evolutionary branch or species.
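For readers who like their concepts concrete, here is a minimal sketch, in Python, of what diagnosability amounts to in practice. The taxa and character lists are hypothetical labels of my own, not real taxonomic data; the point is simply that a group is diagnosable when it shows a combination of features found in none of the other groups being compared.

```python
# A minimal, purely illustrative sketch of 'diagnosability': a group is
# diagnosable if it possesses features found in no other group being compared.
# The taxa and characters below are hypothetical examples, not real data.

def diagnostic_features(taxon, observations):
    """Return the features recorded for `taxon` that no other taxon shows."""
    seen_elsewhere = set()
    for name, features in observations.items():
        if name != taxon:
            seen_elsewhere |= features
    return observations[taxon] - seen_elsewhere

observations = {
    "Taxon A": {"globular braincase", "chin", "steep forehead"},
    "Taxon B": {"occipital bun", "projecting mid-face", "large brow ridge"},
    "Taxon C": {"large brow ridge"},
}

for taxon in observations:
    unique = diagnostic_features(taxon, observations)
    print(taxon, "->", unique if unique else "not diagnosable on these characters")
```

Real diagnoses are built from careful comparative anatomy, and increasingly genetics, rather than toy character lists, but the underlying logic of ‘unique to this branch and no other’ is the same.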

Still, not every taxonomist agrees that diagnosability offers the best way forward in recognising species.

But, today, most do, and for me, and many others interested in classifying life, especially extinct organisms, it makes a great deal of sense, because we normally have little else to go on but features of fossilised teeth and bones.

Written in the genes?

What I have neglected here of course is the role that DNA evidence is increasingly playing in taxonomy.

Genetics has come to be seen as central to the process of identifying living species, and increasingly also extinct species, following remarkable developments in the investigation of ancient DNA.

But, an ongoing issue continues to be whether DNA markers can be used to describe species in nature.

Many insect species are recognised by the anatomy of their genitalia. Shown here is a pair of red mason bees.
Wikimedia Commons

While each species must, by definition, be genetically unique, among animals at least, species descriptions are still fundamentally based on observable physical features, be they soft tissues, fur pattern or coloration, or features of the teeth and skeleton.

DNA complements information about the phenotype and of course informs us about how physical features develop and evolve.

Yet, there are species which can’t be distinguished easily with physical traits, but have been shown to be genetically highly distinct.

These are ‘cryptic species’: and I and others suspect they are much more common in nature than we realise.

Back to practicalities

While the species debate continues, much of the focus of current discussion is on how we should go about identifying them in nature.

Not everyone is satisfied with the criterion of diagnosability. In particular, one issue that causes unease is that different concepts can sometimes result in vastly different estimates of the number of species.

And, such issues will ensure that the species debate will continue for years to come.

But, we are closer than ever to resolving the question, reaching a consensus over what has been one of the most hotly contested questions in the history of science, despite its remarkably low public prominence.


This article was originally published on The Conversation.
Read the original article.

Our Ancient Obsession with Food: Humans as Evolutionary Master Chefs

Amateur cook-offs like the hugely popular MasterChef series, now in its seventh season in Australia, have been part of our TV diet for almost two decades.

These shows celebrate the remarkable lengths we humans will go to in order to whet the appetite, stimulate the senses, fire our neural reward systems and sustain the body.

Yet, few of us pause to reflect on the hugely important role diet plays in the ecology and evolutionary history of all species, including our own.

In the latest episode of my UNSWTV video series ‘How did we get here?’, I take a light hearted look at the role diet has played in our evolution.


Episode 6 of my UNSWTV Series ‘How did we get here?’ explores the importance of cooking in our evolution.

Constructing a human niche

So much of what we read about human evolution portrays the protagonists as unwitting players in a game of chance: natural selection acting through external environmental factors beyond their control and sealing their evolutionary fate.

Yet, all species influence their environment through the normal ecological interactions that occur in every ecosystem, such as between predators and their prey.

Such interactions shape ecosystems over long time scales and are profoundly important in terms of evolution.

When a species alters its environment and influences its own evolution, becomes a ‘co-director’ if you will, the process is dubbed ‘niche construction’.

The controlled use of fire and cooking of food must have been evolutionary game changers, being spectacular examples of niche construction.

Anthropologists like Richard Wrangham have argued that cooking probably even began more than 2 million years ago and may have played a key role in major changes such as a life permanently on the ground and our large bodies and brains.

The foraging strategy of human hunter-gatherers seems to have involved a focus on difficult to obtain but high reward foods, requiring sophisticated cognitive, cultural and social skills.

A gutsy move

Our guts also reflect the food preferences of our ancestors.

Our small intestine represents almost 60% of our total gut volume, whereas in other great apes it’s around 15-30%.

Similarly, the large intestine (colon) of other apes is about 45% of total gut volume but only about 20% in humans.

These directly reflect differences in our diet: humans rely heavily on nutritionally dense and easy to digest foods like grains and animal foods rather than a diet based entirely on raw foods dominated by plant matter.

Another interesting facet of the human diet is the relationship we have with parasites like tapeworms.

Each year millions of people around the world are infested by one or more of the 20 species of human tapeworms through eating undercooked or raw meat from cattle, sheep, pigs and other sources.

Yet, surprisingly, we humans are the only species of primate that is a definitive host for several of these tapeworms; meaning they can’t reproduce without being eaten and hosted in the human body.

The only other mammals to be definitive hosts for tapeworms are carnivores.

Molecular clocks even suggest that human specific tapeworms evolved at least 100,000 years ago and perhaps as early as 1.7 million years ago when meat eating and cooking may have begun.

Traces of fire in the dirt

The archaeological evidence for the controlled use of fire and cooking is surprisingly thin on the ground and highly contested.

The earliest claimed evidence goes back to at least 1.4 million years ago in Kenya, at an archaeological site called Chesowanja, in the form of burnt clay.

But this is controversial and a range of natural causes such as lightning strikes on trees could have caused similar signs.

More recent evidence from Wonderwerk Cave in South Africa provides a much stronger case for hominin fire use by about 1 million years ago.

In Israel, at the site of Gesher Benot Ya’aqov, burned seeds, wood and flint are all argued to be evidence for the control of fire nearly 790,000 years ago.

Excavations at Zhoukoudian during 2014, conducted by members of the Chinese Academy of Sciences.
Darren Curnoe

At the famous site of Zhoukoudian near Beijing, Davidson Black claimed way back in the 1930s that Homo erectus had cooked its food hundreds of thousands of years ago.

The debate has continued ever since and remains unresolved: many local archaeologists claim evidence for burning at the site while geologists from abroad have suggested the deposits are natural not anthropogenic.

Controlled fire use seems surprisingly late in places like Europe, where good evidence for burning dates back to only around 300,000 or 400,000 years ago.

Yet, even this has been questioned, the evidence for regular and controlled use of fire perhaps being much later and associated with modern humans rather than with Neanderthals, who may even have lacked the capacity to control fire.

Can chimpanzees cook?

Could our chimpanzee cousins offer any insights into the evolution of cooking, given that they don’t control fire?

A new study published in the journal Proceedings of the Royal Society B by Felix Warneken of Harvard University and Alexandra Rosati of Yale University set out to test this question.

They conducted nine experiments on captive chimpanzees at a sanctuary in the Republic of Congo and concluded that they indeed share several essential psychological capacities needed to cook.

These include preferring cooked over raw food, delaying eating in order to get cooked food, showing practical understanding of cooking and even saving food in anticipation of future cooking opportunities.

We humans are undoubtedly ‘evolutionary Master Chefs’, being the only species that cooks and, as of about 10,000 years ago, grows its own food.

But this new research shows that the shared ancestor of humans and chimpanzees, an African ape that lived 8 million years ago, had the necessary smarts to adopt cooking when the opportunity eventually presented itself.

This article was originally published on The Conversation.

Read the original article.

All Mixed Up: Interspecies Love-ins and the Offbeat History of Our Species

Revolutionary developments in the study of the DNA of our fossil ancestors are forcing a major rewrite of the human evolutionary story.

They hold major implications for fundamental questions that cut across biology and shift the spotlight back onto humans as a central model in the study of evolution.

And, they again highlight the weird sex lives of our Palaeolithic ancestors.

Captivated by our past

Most of us find surprises in our family histories when we start digging. There are always distant relatives who didn’t quite fit in, did something heroic or shameful, or were perhaps from the wrong side of the railway tracks or even the “wrong” part of the planet.

In my own family one of our ancestral lines – my maternal grandmother’s father’s lineage – was never talked about much. “Oh, his ancestors were convicts,” my grandmother would say.

After my grandmother died six years ago, my mother began to explore this side of the family more thoroughly with the aid of one of those on-line genealogy sites.

To our surprise, we found that far from an embarrassing convict past, my great grandfather’s family seems to be traceable to the small group of English settlers who in 1788 established the fledgling colony that would later become known as Sydney.

From another perspective, he was part of the first wave of European invaders who would come to take over the Australian continent, dislocating its original owners.

It’s the same when we dig into the evolutionary genealogy of the human species as contained in our genome, it holds big surprises for us about our past.

Genomic genie out of the bottle

Studies of the human, Neanderthal and Denisovan genomes are turning the science of human evolution on its head.

It turns out we aren’t the species we once thought we were: we are in fact mongrels, ‘bitsas’: bits of this species and bits of that species.

Our genome is a mosaic of DNAs: largely Homo sapiens, but with bits of Neanderthal, fragments of Denisovan, and pieces of other, mystery, relatives that we haven’t yet identified from the fossil record.

Interbreeding seems to have had important consequences also for modern humans as we dispersed across the Eurasian land mass tens of thousands of years ago, bolstering our immune systems and perhaps even allowing us to survive at high altitude.

It’s a sobering time for scientists like me, who in the post-genomic era are being forced to re-evaluate the theories we’ve promulgated and assumptions we’ve held dear over many decades.

How did this happen, you might ask? Well, perhaps sapiens was as sapiens is?

One of the great surprises of the Internet age is the incredible range of sexual proclivities that we humans indulge in; all there to be viewed in glorious Technicolor for those who might dare to click.

And it seems the broad menu of sexual tastes our species enjoys may have extended all the way back into the murky Palaeolithic, to include other hominin species.

Changing views of species

Many of us were taught in high school or university undergraduate biology that species are somehow pure lines; groups of organisms that can’t interbreed with each other.

They are ‘reproductively isolated’ from other species, in the language of the mid-20th Century architects of the ‘modern synthesis’, Theodosius Dobzhansky and Ernst Mayr.

One of the great revelations from population genetics over recent decades is the surprisingly common occurrence of interspecies mating: hybridisation.

At least 10% of primate species interbreed naturally, in the wild, and hybridisation is now widely regarded to be a source of evolutionary novelty and to even play a role in the formation of new species.

History rewritten

In 2010, with the first draft sequence of a Neanderthal genome by Richard Green and co-workers, we began to learn that the ancestors of all living non-Africans had in fact mated with our Neanderthal cousins.

Modern human (left) and a Neanderthal (right). Despite being a different species to us, around 1-6% of the genome of living non-Africans comprises Neanderthal DNA.
Darren Curnoe

The result was that 1-4% of our genome is Neanderthal in origin, although slightly earlier estimates from studies of the human genome by Jeffrey Wall and his team suggested the contribution of archaic human DNA could be at least 6%, and perhaps up to 14%.

The amount varies also between human populations with some East Asians having 40% more Neanderthal DNA than Europeans, according to other research published by Wall and co-workers.

And yet further work by Jeffrey Wall has shown evidence in the genome of some living Africans for interbreeding with another species there, around 35,000 years ago. One we haven’t yet identified from the fossils.

Modelling by Armando Neves and Maurizio Serva has also suggested that Palaeolithic humans need only have interbred successfully once in every 77 generations, or roughly 1,500 years, to explain the levels of Neanderthal DNA seen in living people.
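As a quick back-of-the-envelope check of my own (not part of the authors’ model), the conversion from generations to years simply assumes a generation time of roughly 20 years:

```python
# Rough arithmetic only: converting the quoted interbreeding rate into years,
# assuming a generation time of about 20 years (an assumption for illustration).
generations_between_events = 77
years_per_generation = 20
print(generations_between_events * years_per_generation)  # 1540, i.e. roughly 1,500 years
```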

So it would seem to have been a very rare event, occurring perhaps only a handful of times in our evolutionary past, and subsequently captured by natural selection.

And yet further research by Sankararaman and colleagues has suggested that the patchy distribution of Neanderthal DNA in living humans indicates strong selection against hybrids, especially male hybrids who probably suffered reduced fertility or could even have been sterile.

This would strongly support the notion that Neanderthals were a different species to us, as most fossil specialists suspect, and that mating was in fact a cross-species affair.

A truckload more to come

One of the emerging surprises from the fossil record from the period roughly 50,000 to 10,000 years ago is the remarkably large number of ‘enigmatic’ remains that have been, and continue to be, discovered.

By enigmatic I mean that ‘overall’ they resemble modern humans (H. sapiens) but also possess a surprisingly large number of features that we would normally associate with archaic groups like the Neanderthals.

The 11,000 year old skull from Longlin Cave in Southwest China might be the youngest example of a hybrid between modern and archaic humans.
Darren Curnoe

I’ve seen this in my own work with the ‘Red Deer Cave people’ in Southwest China.

And it’s a compelling explanation for the mixed anatomy of the Iwo Eleru remains from West Africa, the Nazlet Khater 2 skull from Egypt, the Lukenya Hill fossil from Kenya, and in Europe, the Mezzena jaw from Italy and the Pestera cu Oase remains from Romania, among others.

A recent news report in Nature announced that DNA had been successfully sequenced from a 35,000 year old jaw from Pestera cu Oase, as described at the Biology of Genomes meeting in Cold Spring Harbor in New York.

Qiaomei Fu, a palaeogenomicist at Harvard Medical School, and her team apparently found that between 5% and 11% of the DNA of this individual (a man) was Neanderthal, including large chunks of several chromosomes.

They even estimated that the Oase man’s Neanderthal ancestor had lived only four to six generations, or roughly 80 to 120 years, earlier.

Unfortunately, the chance of successfully extracting and sequencing DNA from any fossil remains very low, so most of the unusual looking remains will probably never yield genetic clues to their ancestry.

But, those cases where DNA has been extracted from fossils, like the Pestera cu Oase jaw, give us confidence that their enigmatic looking bony features offer reliable insights into their genealogy.

Broad implications

Human evolutionary science was largely sidelined for most of the 20th Century as a quaint and “old fashioned” kind of discipline.

This was partly because anthropologists refused to accept Darwinian evolution as their central theory until the 1950s.

But then, with the “molecular revolution” from the 1960s onwards, fossil studies lost a lot of their shine.

The unfolding ancient DNA revolution over the past decade has put the spotlight firmly back on our species and its fossil record as one of the key models and sources of information for understanding evolution.

The findings are even beginning to revolutionise how we conceive of some of the most fundamental concepts like “species” and how they might arise in nature.


Darren Curnoe is a human evolution specialist and ARC Future Fellow at UNSW Australia.

This article was originally published on The Conversation.
Read the original article.

The ‘Other’ Red Meat on the ‘Real’ Palaeodiet

The so-called palaeodiet, and now even the palaeo-epigenetic diet, has come under a lot of scrutiny of late for making wild and unsubstantiated claims and for being downright dangerous to our health.

I think it’s fair to ask: if we’re serious about the palaeolifestyle, then just how far are we prepared to take this obsession with our Stone Age heritage and its claimed benefits?

If it really offers a panacea for good health, shouldn’t we all become cave dwellers again and consume the full variety of foods our ancestors actually ate?

Are we really willing to eat the ‘real’ palaeodiet, even if it means munching on grandma when she passes away?

Being preyed upon by palaeodieters

Palaeodiet advocates prey on the deep-seated anxieties we all share about health and longevity, as well as our cultural fixation with body image, and the idea of naturalness and a sentimental connection to our past.

Its highly selective and unscientific interpretations of human biology should give us massive pause for thought.

But, of course, this is a naïve view of the real world: the fad diet industry makes its money from sales fuelled by ill-informed celebrity endorsements and publishing companies that fail to give health claims proper scrutiny.

And, importantly, making cash from diets that by their very nature cannot work is what the fad diet industry is ultimately all about.

If these diets did work then there’d be no industry because we’d all be following the successful one, instead of shifting diets every year or so to follow the latest fad, forking out large sums of cash as we do.

And let’s not be under any illusions here, the palaeodiet segment of the fad diet industry is worth a lot of money: according to the Sydney Morning Herald, the on-line book shop Amazon has holdings of around 5,000 titles on the palaeodiet alone.

Where’s the science?

I’ve written previously about some of the problems with the palaeodiet and palaeolifestyle fad from a human evolutionary and anthropological viewpoint.

It’s also the subject of one of the episodes of my UNSWTV YouTube series, How did we get here?


How did we get here? The Palaeodiet fad.

Last year the Dietitians Association of Australia criticised the palaeodiet for being unhealthy and going against the best nutritional science, opening itself up to an angry tirade on social media.

Earlier this year, the Association of UK Dietitians didn’t pull any punches in its assessment of the palaeodiet either, describing it thus: “An unbalanced, time consuming, socially isolating diet, which this could easily be, is a sure-fire way to develop nutrient deficiencies, which can compromise health and your relationship with food.”

A common criticism I have received in the comments left after my article and film is that I’m missing the point about the palaeodiet: that it’s really about eating ‘real food’ and rejecting government guidelines like the Australian Guide to Healthy Eating, which apparently haven’t worked.

Putting aside the fact that most people don’t or have never properly followed government guidelines anyway, the palaeodiet is not unique in suggesting that people ‘just eat food’.

The journalist and author Michael Pollan, among others, has been saying this for almost a decade, nothing palaeo about his ideas though; and neither did his nor my grandmother live in a cave.

The ‘real’ palaeodiet?

The simple fact is that, for most people alive today, we don’t have a very good idea what their hunter-gatherer ancestors really ate.

That’s because most of them gave it up 5,000 or 10,000 years ago, and there’s little by way of a detailed archaeological record for us to reconstruct their diet from.

What we do know about hunter-gatherers around the world, ones studied by European ethnographers during historical times and those whose lifestyle we have managed to reconstruct in a very patchy way from the archaeological record, is that flexibility and diversity were the keys for our species.

No single diet fitted any single group, and everything eaten depended on where in the world people lived, keyed into local climate and environmental diversity, and seasonal availability.

One universal seems to be that people everywhere ate meat, from all kinds of animals; including even humans.

Of course, they ate a very wide variety of plant based foods as well, but meat seems to have been highly prized.

Cannibalism was practised by many different cultures across the world, and we’re today probably all carrying the evidence for it in our genomes.

It seems distasteful, or even macabre, by modern cultural standards, and is rightly outlawed, but we can’t escape the fact that our evolutionary ancestors practised it at one time or another.

Two kinds of cannibalism were practised: so-called ‘nutritional’ cannibalism, where human flesh was a part of the diet, and ‘ritualistic’ cannibalism, where humans were eaten as war trophies, as part of religious sacrifice or perhaps after a revered elder died.

An article published recently by Silvia Bello from the Natural History Museum in London and her team in the Journal of Human Evolution described evidence for cannibalism in the UK dating from around 14,700 years ago.

These were some of the last hunter-gatherers in the UK before farming arrived, and their diet included human flesh.

Bello and her co-workers documented extensive evidence for defleshing, the disarticulation of skeletons, chewing, including human tooth marks, crushing of spongy bone, and the cracking of bones to extract marrow.

Reconstruction of a Mesolithic (late hunter-gatherer) tomb from France. It shows two women in their twenties or early thirties, both with traumatic injuries to the skull. One is believed to have been buried while still alive.
Wikimedia Commons

Evidence for Stone Age cannibalism exists from other parts of Europe, Asia and the Americas, as well as human flesh consumption in many places in much more recent times.

A well-documented example is the association between the prion disease Kuru and consumption of the human brain in cannibalistic feasts by women living in a region of the Papua New Guinean highlands.

Human flesh eaten far back in time

Cannibalism has an evolutionary history going back at least 1.2 million years, as seen at the hominin fossil site of Gran Dolina, in the Atapuerca region of northern Spain.

The species involved was Homo antecessor and evidence shows that in one fossil deposit alone dozens of fossils sport cut marks, percussion pits and scars left by stone tools during butchering.

Excavations at the 0.8-1.2 million year old Gran Dolina site in Atapuerca, Spain. The earliest evidence for cannibalism has been found at this fossil site.
Wikimedia Commons

Even earlier, the type specimen for the species Homo gautengensis, skull Stw 53 from roughly 1.5 million year old deposits at Sterkfontein Cave in South Africa, which I described in 2010, bears the signs of defleshing.

The signatures of cannibalism have even been suggested to be present in the genome of our species indicating a deep and global evolutionary ancestry.

Prion diseases like Kuru are under strong genetic control and in humans their susceptibility and resistance is associated with variation in the PRNP gene.

The pattern of genetic diversity of this gene has been studied by Simon Mead from University College London and his team, who proposed that its high levels of global variability are associated with widespread cannibalism in our evolution.

What about the bread for that human steak sandwich?

Palaeodieters claim that humans didn’t evolve to eat grains and that their consumption only began with the agricultural revolution around 10,000 years ago.

In other words, grains are not part of the ‘natural’ human diet and should be off the palaeodiet menu.

Nutritionists have been especially concerned about this because most of us already don’t get enough fibre in our diets, and for most people, wholegrains provide a vital source of dietary fibre.

When we look at the human genome we find that it contains a gene called the salivary amylase gene, which produces an enzyme in our saliva to help break down starch before we ingest it.

We humans possess an average of seven copies of the salivary amylase gene across the globe, compared to chimpanzees, which possess only two.

Interestingly, among living people, the number of salivary amylase genes varies from around 4-14 copies, and is associated with varying levels of salivary amylase protein and starch consumption.

Moreover, with the sequencing of the genomes of our Neanderthal and ‘Denisovan’ cousins we now know that they also had only two copies, just like chimpanzees.

George Perry from Pennsylvania State University and his team reported salivary amylase gene numbers for our extinct cousins earlier this year in the Journal of Human Evolution.

They also suggested that the extra human copies arose in our evolution during the last 550,000-590,000 years – after we split from our common evolutionary ancestor with the Neanderthals – meaning they probably pre-date our own species by a couple of hundred thousand years.

Good intentions, gone wrong

It’s laudable that people want to improve their health, or hope to stave off lifestyle diseases like cancer, diabetes and coronary heart disease, by adopting a diet based on healthy choices.

I also think it’s admirable to aspire to eat in ways that might be a better ‘fit’ for our biology and evolutionary history.

As noble as this idea might be, we simply won’t get there through some fad diet, like the palaeodiet.

It’s not based on good science, as many of its celebrity supporters will openly admit, and now has more to do with making money than providing good health.

Surely we owe it to ourselves to make the connection between what science tells us causes lifestyle diseases and a science-based approach to preventing them?


This article was originally published on The Conversation.

Read the original article.

Did Modern Humans Wipe Out the Neanderthals in Europe?

Our closest evolutionary cousins the Neanderthals continue to fascinate scientists and prehistorians.

Fossils and DNA strongly suggest we shared a common ancestor with them, genetic clocks placing the split between us in the range of 550,000 to 765,000 years ago.
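For the curious, here is a highly simplified sketch of how a ‘genetic clock’ turns DNA differences into a date. The divergence and mutation-rate values below are hypothetical round numbers chosen only to illustrate the basic formula; the actual Neanderthal studies use far more sophisticated models and different figures.

```python
# A highly simplified molecular-clock sketch: split time T = d / (2 * mu),
# where d is the per-site sequence divergence between two lineages and mu is
# the mutation rate per site per year. Both lineages accumulate mutations
# independently after the split, hence the factor of 2.
# The numbers below are illustrative assumptions, not published estimates.
d = 0.0006       # hypothetical per-site divergence
mu = 0.5e-9      # hypothetical mutation rate per site per year
T = d / (2 * mu)
print(f"Estimated split time: {T:,.0f} years ago")  # ~600,000 years with these numbers
```

Much of the width of the 550,000 to 765,000 year range quoted above comes down to uncertainty in exactly these sorts of input values, particularly the mutation rate.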

Our fascination stems from the fact that they are our closest evolutionary cousins; we have hundreds of fossils from them, so have a pretty good idea what they looked like; and they were the first extinct human species we knew about, with Neanderthal bones discovered in the first half of the 19th century.

Neanderthals have historically also represented the archetypical brutish caveman in popular culture.

Each year dozens of research articles are published examining almost every aspect of their biology and behavior, as gleaned from the fossil and archaeological records they have left behind.

When and where did they live?

The Neanderthals occupied Europe for at least 200,000 years, but our knowledge of them further east is much sketchier.

We also know they lived in West Asia, with their skeletons found in several caves in Israel and Iraq dating between around 140,000 and 50,000 years ago.

They inhabited Southern Siberia as well, particularly the Altai Mountains, about 50,000 years ago, occupying the same cave as the mysterious “Denisovans,” a closely related but probably distinct species from them.

But, whether they were there at exactly the same time as the Denisovans is anyone’s guess.

Given the extreme cold associated with the glacial (cold) phases of the Ice Age, their occupation of Siberia probably wasn’t permanent either.

In Europe, for example, they are known to have retreated south during these extreme cold phases, so probably had limited tolerance of extreme conditions, despite their sophisticated culture.

Just when did they disappear?

A long and protracted debate has been held among archaeologists for decades about just when the Neanderthals went extinct.

This has mostly been because some archaeological sites have been disturbed since they formed, so the fossil and stone tools they contain are mixed up, representing materials of different ages.

The other major reason has been that we simply didn’t have the technology to determine the age of their bones reliably, especially when they were thought to be around 30,000 or 40,000 years old.

Laboratory methods developed over the last decade have improved our ability to clean up bones and other materials for radiocarbon dating, and to obtain much more accurate ages.

This has been truly revolutionary in places like Europe where archaeologists have been debating for decades when Neanderthals disappeared, when modern humans like us entered the subcontinent, and whether the two events coincided.

In short, a coincidence in the timing of these events would be the sort of ‘smoking gun’ most archaeologists are looking for to explain Neanderthal extinction.

The youngest Neanderthal fossil is from Mezmaiskaya in the northern Caucasus, and with an age of around 39,000 years, it marks the most widely accepted disappearance date for the species.

But, the species probably went extinct over the course of several thousand years, and in different parts of Europe at different times.

Written in stone

Archaeologists who specialize in the study of stone tools attribute particular types of tools, grouped into ‘industries’ or ‘cultures’, to particular species of hominin.

The Neanderthals in Europe made an industry known as the ‘Mousterian’, named after a rock shelter called Le Moustier in the Dordogne region of France, which contained Neanderthal fossils and cultural remains.

In contrast, the earliest modern humans in Europe made a tool industry called the ‘Aurignacian’, named after another archaeological site in France called Aurignac.

So, stone tools are used as markers of prehistoric migration and economic behavior, and also of the mental capacities and intelligence of different species.

Compared to the Mousterian, the Aurignacian is often seen as a cultural flowering associated with the hallmarks of the modern human mind in all its richness and imagination.

It includes elements like complex and carefully shaped bone including antler and ivory tools; skillfully shaped stone and ivory beads and other kinds of personal ornaments; highly varied and sophisticated forms of abstract and figurative art that portray fine details of human anatomy such as male and female sex organs; and elaborate cave paintings.

Now, this is not to say that the Neanderthals didn’t possess a complex culture, for there is controversial evidence they may have made art or even worn jewelry, and possibly even buried their dead in graves in caves and rock shelters.

But, most archaeologists have no difficulty distinguishing the Aurignacian from the Mousterian.

Yet, having said this, there’s one catch: the proto-Aurignacian, a tool industry that appeared in Europe about 42,000 years ago and was subsequently replaced by the ‘true’ Aurignacian.

This industry has been controversial for decades owing to the lack of a clear association between these tools and the toolmakers, or their skeletal remains.

Who made the proto-Aurignacian?

A new study reported last week in the journal Science by Stefano Benazzi from the University of Bologna and his team has shown that modern humans were in Western Europe by 41,000 years ago, at two sites called Riparo Bombrini and Grotta di Fumane in Northern Italy.

This makes them examples of the earliest modern humans in Europe.

Yet, the mystery deepens, because Benazzi and his team have also managed to solve the long-standing issue of just who made the proto-Aurignacian.

This is the million-dollar question! Was it modern humans or Neanderthals, perhaps copying their new modern human neighbours?

Benazzi and his team worked out the identity of two 41,000-year-old teeth found alongside proto-Aurignacian tools at these two sites by studying anatomical features and DNA from one of them.

They turned out to be modern humans after all, confirming there were multiple routes into Europe – one in the south and one in the north – with these different groups carrying the proto-Aurignacian and Aurignacian, respectively.

Neanderthal disappearance: whodunit?

With new radiocarbon dates on bones from both late Neanderthals and the earliest European modern humans, and the identity of the makers of the proto-Aurignacian now known, it looks increasingly likely that our cousins went extinct within a very short time of our arrival: in the narrow window of 39,000-41,000 years ago.

While we don’t, and probably never will, have direct evidence of exactly what happened, this strongly implies a role for modern humans in their disappearance.

From Sticks to Stones: Getting a Grip on the Human Genus

2015 has already been an amazing year for human evolution science.

We’ve witnessed an uncanny convergence of discoveries on the beginnings of the human genus, Homo, and all that it implies for understanding the evolution of a range of human characteristics, including culture and tool making.

Yet, the implications of this research run much deeper than most people would immediately think. Let me explain.

50 years of dogma

This year’s major discoveries touch on a number of major historical themes in human origins research, spanning decades of investigation. The implications are broad, involving one of the key theoretical pillars of human origins that explains, among other things:

  • The arrival of fully committed terrestrial locomotion (or obligate bipedalism),
  • Enlargement and reorganization of the brain,
  • Speech and language,
  • Beginnings of human like culture and complex tool manufacturing,
  • A hunting and gathering lifestyle,
  • Endurance running linked perhaps to the “persistence hunt”,
  • Regular consumption of meat, and
  • Use of fire and cooking.

Many elements of this ‘package’ of anatomical and behavioural traits probably began with the emergence of Homo around 2.3 million years ago. Actually, the discovery of a new Homo jaw from Ethiopia dated 2.8 million years old, which I wrote about in March, pushes this back a further half million years.

If we’re honest, I guess, we should have expected the deconstruction of this central pillar of human evolution theory any time now, as there have been hints it was coming for quite a while.

2015 just happens to be the year when momentum has reached a point, through serendipitous discoveries, where it’s time to reassess the old model and forge ahead with a new one.

Olduvai Hominid 7 is the type specimen for the species Homo habilis.

Homo the “handy-man”?

Ever since 1964, when Louis Leakey, Phillip Tobias and John Napier announced Homo habilis, and in so doing, redefined Homo, our genus has been linked to culture and tool making.

The nomen habilis was suggested by Raymond Dart, as it means “able, handy, mentally skillful or vigorous” in Latin.

Stone tool making, indicating the beginnings of culture, a fine ‘precision grip’ of the hand, and the start of a long-term trend in brain enlargement, were all linked to Homo and its then newest earliest member, Homo habilis.

Before this time, membership of the human genus was based mostly on the idea of a “cerebral Rubicon” in which a species had to have an estimated brain size of 700 cubic centimetres or more to belong.

Some features have been added to the package since 1964, such as the use of fire and cooking, and endurance running.

Deconstructing the Homo package

Ever since tool making was linked exclusively to Homo there have been dissenters.

At Olduvai Gorge in Tanzania, the Leakeys themselves found tools, dubbed the “Oldowan” Industry, associated with the “Nutcracker Man” or Paranthropus boisei as well as Homo habilis.

In the 1960s, one of Leakey’s own protégés, Jane Goodall, showed that chimpanzees also used tools in the wild.

Only last week, Jill Pruetz at Iowa State University and her team reported that chimpanzees in Senegal used wooden spears for hunting, being the only example of a living non-human species to do so.

Over the last couple of decades we’ve also learnt that very different kinds of animals use tools, from ants to octopuses, crows to otters, and dolphins to monkeys and apes.

In South Africa, stone tools have also been recovered from various ancient cave sites alongside 2 million year old Homo and Paranthropus fossils.

Studies of the hand bones of Paranthropus by Randy Susman in the 1980s showed their anatomy was similar to humans: although, we have no idea if they had the motivation, need or smarts to make tools.

In 2010, cut-marks on large mammal bones made by early hominins using stone tools were reported from Ethiopia, dating to 3.39 million years ago, by Shannon McPherron and her team.

This pushed tool use well beyond Homo and Paranthropus to Kenyanthropus or even Australopithecus.

Last week, Sonia Harmand and her team at Stony Brook University announced at a conference in California that they’d found stone tools on the shores of Lake Turkana in Kenya dating to around 3.3 million years old.

This discovery pushes the oldest tools back a whopping 800,000 years!

This new tool industry differs from the Oldowan in being much chunkier, including tools weighing as much as 15 kilograms, according to a report in this week’s Nature by Ewen Callaway (the work has yet to be published in a scientific journal).

And, just this week, a new study published by Thomas Feix from Yale University and co-investigators proposed that the East African species Australopithecus afarensis living between roughly 3 and 4 million years ago may also have had a human-like hand grip.

This follows on from research reported in January by Matthew Skinner from the University of Kent and his team showing that the southern African species, Australopithecus africanus, may also have routinely made and used tools.

All of these discoveries call into question the whole Homo package: if tools were made by hominins like Australopithecus, then tool making can’t be tied exclusively to obligate bipedalism, brain enlargement, speech and language or the use of fire.

These are features we associate only with members of Homo.

Oldowan stone tools from Melka Kunture in Ethiopia dated to around 1.7 million years old. Until this latest discovery this tool type, associated with Homo habilis, was the earliest known.
Wikimedia Commons

A cultured ape?

Over the last few decades, the surprising finding of tool use by arthropods, molluscs and many species of chordate (including birds and mammals) has thrown a massive spanner in the works for scientists wrestling with the notion of “culture.”

In one camp – normally biologists – the definition of culture has been expanded to include most or all instances of non-human tool use.

In the other – comprising mostly archaeologists – it has been narrowed to include only behaviours seen in Homo sapiens and some of our early ancestors.

But, are there any unique features in the way humans, and our extinct biped relatives, made and used stone tools?

We humans stand out in making an extraordinary diversity of tools, including ones that are highly complex, comprising many components that interact in complicated ways, with some tools based in experimental research or deep insights about nature.

Just think about your car or smartphone as an example of a high-tech tool.

Archaeologists have had a hard time wrestling with the problem of the uniqueness of human culture, especially in the face of the complexity of ape cultures, as well as the use of tools across very diverse kinds of animals.

For example, passerine birds and most primates are known to use similar numbers of tools and in similarly complex situations. Yet, humans do things differently.

The archaeologists Iain Davidson, from the University of New England, and William McGrew, from Cambridge University, believe they have pinned down from a behavioural perspective what humans do that is so distinctive.

They believe we are far more creative, generating new kinds of behaviours from those learned from other people, and that human stone tool making has provided a “niche” for the coopting of tools and tool-making procedures across different contexts.

Such ideas prove extremely difficult to test from archaeology alone though.

Uniqueness reassessed

We now seem to be at the point in scientific history where the package of features normally associated with the beginnings of Homo no longer seems valid, after 50 years of their use.

Many of the hallmarks of the human genus have been convincingly found with other, much more ancient hominin species, and some even with other kinds of animals.

At the same time, finding the evolutionary origins of the very real ‘gulf’ that exists between us and other life – one that is admittedly getting smaller each year – seems to be an increasingly difficult challenge for science.

This article was originally published on The Conversation.

Read the original article.

Human Environmental Footprint Reaches Far Back in Time

You’d literally have to be a cave dweller to be oblivious to the major global environmental changes happening in the world today.

It reads like a litany of crimes against the planet:

  • The many and far reaching impacts of global warming.
  • Disruption of the planet’s chemical cycles, such as carbon, nitrogen, phosphorus and others.
  • Air pollution from combustive sources.
  • Light pollution from our 24-hour cities.
  • Clearing and cultivation of the land and the ensuing loss of biodiversity.
  • Erosion and siltation of waterways.
  • Overfishing of the seas and oceans.
  • Introduction of exotic species and their disruption of ecosystems.
  • Plastic pollution and acidification of the oceans.
  • Synthetic chemical and pharmaceutical pollution of land and water, such as antibiotics.

We’ve apparently even slightly altered the planet’s rotation because of the billions of tons of water we’ve locked up in dams over the last 40 years, with many more planned for the coming decades.

A new epoch

The changes made by us to the planet are so profound now that geologists are debating whether to recognise this period of human-induced planetary change as a whole new phase in the Earth’s timeline: dubbed the ‘Anthropocene’.

Next year the International Commission on Stratigraphy, a committee of the International Union of Geological Sciences, the group that compiles the Geological Timescale, will decide whether to adopt the Anthropocene as a new epoch in Earth’s history.

This proposed new phase in the more than four-and-a-half billion year history of the Earth would mark for the first time a catastrophic phase of disruption of the planet’s major systems as the result of the activity of a single species: us, Homo sapiens.

In the past, the Earth and life were rarely disrupted on a global scale, but when they were it was by extraterrestrial impacts such as asteroids and meteorites, or due to the cooling of the planet’s layers early in its history.

These led to the loss of large numbers of species, shaping the course of the history of life and the planet itself.

Without these major extinction events we simply wouldn’t be here as a species.

The beginning of the Anthropocene is generally regarded to be the year 1800, roughly coinciding with the start of the Industrial Revolution in Europe.

But, this seems like a rather arbitrary break point when we cast our eyes into the prehistoric past.

Too many mouths to feed

When did we start altering our environment to suit ourselves?

Well, we know that humans were responsible for changing the environment well before the Industrial Revolution began, many thousands of years before in fact.

One of the keys to understanding our growing impact on the environment is population growth.

Warren Hern of the University of Colorado estimated in the late 1990s the number of times the human population has doubled in its history.

He estimated that at 3 million years ago our population was doubling in size about every 500,000 years.

And by a million years ago the human population was doubling in size roughly every 100,000 years.

A big shift occurred around 25,000 years ago when our population began to double every 5,000 years or so; and the doubling time from then onwards started to shorten even more dramatically.

Between 10,000 years ago, roughly when farming began, and today, the human population has seemingly doubled almost 11 times, leading to the present total of more than 7 billion human inhabitants of the planet.

So, the human population has doubled as many times in the last 10,000 years as it did for the previous 390,000 years of our evolutionary history.
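As a back-of-the-envelope sanity check of my own (not Hern’s calculations), eleven doublings multiply a population by 2 to the power of 11, or 2,048, which lets us work backwards from today’s numbers:

```python
# Illustrative arithmetic only: 11 doublings multiply a population by 2**11 = 2048.
# Working backwards from roughly 7 billion people today gives the implied
# population size around 10,000 years ago, when farming began.
current_population = 7_000_000_000
doublings = 11
implied_start = current_population / 2 ** doublings
print(f"Implied population 10,000 years ago: ~{implied_start:,.0f}")  # ~3.4 million
```

A starting population of a few million people at the dawn of farming is broadly in line with commonly quoted estimates, so the doubling figures hang together.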

Demise of the megafauna

Now, this kind of modeling exercise might seem a little abstract, perhaps far removed from the realities of science or troubles of the planet.

Yet, some scientists think these massive increases in human population were the key reason why much of the world’s megafauna went extinct between around 50,000 and 10,000 years ago.

This major extinction ‘event’ in Earth’s history saw the loss of at least 200 mammal species weighing over 44 kilograms over a 40,000 year period.

This included our close evolutionary cousins the Neanderthals, Denisovans, and probably other bipeds as well.

As humans began to spread across the planet from our African birthplace around 70,000 years ago – probably because our population grew rapidly – we began to spill into Asia, Australia, Europe and eventually the Americas.

As we did so, our population grew even more; and we burned the landscape and introduced exotic species and diseases to most of the places where we settled.

We may even have taken advantage of the apparently unfearing megafauna, who probably didn’t recognise us as predators, and hunted them to oblivion.

The slow reproduction rate of the megafauna mammals meant they couldn’t adapt to human impacts and their populations probably collapsed; maybe not overnight; but certainly rather quickly on an evolutionary timescale.

Some scientists see parallels between the megafauna collapse and the Anthropocene.

A human population that has doubled in size more than 11 times since the end of the Ice Age has taken with it another 90 megafauna species; species that had managed to survive the great Ice Age megafauna collapse, but were unable to survive the havoc we have wreaked since the invention of farming 10,000 years ago.

Stone pavements in the desert

The evidence for human alteration of the environment goes back much further though, with the crafting of the earliest stone tools, and maybe even fire, at least 2.5 million years ago.

Earlier this month, Rob Foley and Marta Lahr from Cambridge University described in the journal PLoS One an archaeological site in remote Libya, called Messak Settafet, which shows earlier human populations were changing the environment for hundreds of thousands of years in the one place.

These early humans, belonging to species different to us – perhaps Homo erectus or Homo heidelbergensis – left stone tools and the waste of tool-making on the ground, building up over hundreds of thousands of years, perhaps beginning 900,000 years ago.

Remarkably, the ground at Messak Settafet is strewn in places with as many as 75 tools per square metre; it is, they suggest, a “pavement” of tools and manufacturing waste.

I’ve seen other sites in Africa myself where everywhere you turn you’re literally walking over hundreds of tools left by our ancestors hundreds of thousands of years ago.

You get a sense at these places, in Africa mostly, that humans have been around for a very long time, and sometimes in abundance in the landscape: we’ve left our mark everywhere.

Connections in deep time

The more we learn about our deep-time past, the more we realise that humans have been altering the environment for as long as we’ve existed, just as our ancestors did before us for hundreds of thousands, perhaps millions, of years.

The difference today is that there are more than 7 billion of us, and thanks to science, we’re acutely aware of what we’re doing; ignorance is no longer a defence.

Optimistically, with knowledge comes solutions and hope; we can learn from our past and take heart in the fact that change, in behavior, culture and technology, has served us well for millions of years.

Ages of Jawbones Mean the Origins of Humans Just Got Older

There’s plenty of excitement this month amid reports that scientists have identified the “dawn of humankind” in the Rift Valley of Ethiopia.

With reports that this discovery of a jawbone could “rewrite the history of human evolution”, my hyperbole detector, as you might well imagine, became instantly aroused.

Putting aside the hype that always accompanies discoveries in human origins research, this turned out to be, refreshingly, an important discovery that dramatically shifts debate on the beginnings of the human genus.

In fact, by uncanny coincidence I assume, two scientific studies bearing on this same issue were published in Nature and Science within a day of each other.

A handy find

In Nature we learnt from Fred Spoor and his team that the type fossil for the early species Homo habilis (the “handy man”), in this case a lower jaw dubbed Olduvai Hominid 7 (OH7), could be digitally repaired and its anatomy reconstructed more accurately using a 3D CT-scanning approach.

The fossils belonging to Homo habilis were mostly found at Olduvai Gorge in Tanzania by Louis and Mary Leakey during the 1950s and 1960s, with some similar looking bones found a couple of decades earlier in caves near Johannesburg by Robert Broom and John Robinson.

Spoor’s study found that OH7 was a lot more primitive than had been thought since it was described, and named the “handy man”, by Louis Leakey, Phillip Tobias and John Napier more than 50 years ago.

Spoor and his colleagues compared this roughly 1.8 million year old jawbone to other examples of primitive Homo, living humans and other apes. They did this to gauge just how variable these early pre-humans may have been, and whether they might sample multiple evolutionary lines, or species.

And they clearly identified several early species of Homo going back to before 2.3 million years ago, with some of them (not OH7 though) being very modern in the structure of their face and jaws, or remarkably human-like.

Pushing Homo back in time

The Science study published by Brian Villmoare and colleagues showed the origin of Homo could now be pushed back by almost half a million years, to at least 2.8 million years ago, with the discovery of a new jaw at Ledi-Geraru in the Rift Valley of Ethiopia during fieldwork in January 2013.

This fossil is a real beauty! It’s largely complete, and overall its physical traits are surprisingly rather modern. The jawbone is not very rugged and fine details of the tooth crowns and pattern of wear on the teeth are rather precocious, hinting at features seen in much later Homo species.

If you’re a human evolution buff like I am, then this is the kind of discovery that truly gets the heart racing. It shifts the ground dramatically and excitingly overnight.

Why? Well, first of all, we simply don’t have that many fossils from the crucial temporal window of 2-3 million years old, especially in East Africa.

Second, the human genus is now getting dangerously close to 3 million years old, much, much, older than current scientific models or textbooks state.

Third, we can largely throw the popular models of climate change used to explain the beginnings of Homo out the window: the Ledi-Geraru jaw is too old to be explained by the onset of global cooling or a shift to more open vegetation in Africa. But it may coincide with a regionally arid ecology.

Fourth, despite the record-breaking hyperbole that surrounded discoveries such as Australopithecus sediba, this South African species, dated to around 1.98 million years old, is clearly in the wrong place at the wrong time to be the ancestor of Homo, as I argued just two years ago.

The Ledi-Geraru fossil also sidelines the other chief contender for Homo ancestry, Australopithecus garhi, a 2.5 million year old species also found in the Ethiopian Rift, on the grounds of its younger age and different anatomy.

The Homo-like features of the bones of sediba and garhi do tell us something important though: there was a lot of diversity among these early pre-humans, and some of their features must have evolved in parallel with Homo, probably owing to these species facing similar ecological challenges.

Finally, the new study of OH7 and the discovery of the Ledi-Geraru jaw together show there were, as many of us in the business have been arguing for a long time, numerous species of Homo and Australopithecus in Africa in the period 1 to 3 million years ago.

Just 18 months ago I again found myself criticising another glib study, this one involving a new fossil from Dmanisi in Georgia, which claimed that more than a dozen species of Homo should be sunk into a single evolutionary line leading to living people.

It’s amazing to think that studies like this and the interpretations of the sediba discovery can be shown to be incorrect so quickly. But, then again, there is often a lot more at play than just the science.

Words, Genes, and the Science of Past Human Deeds

New research reveals a clear link between the ancient spread of European languages and the movement of people, while other findings challenge the reliability of a commonly used type of DNA for human ancestry studies altogether.

Human languages have long been known to mark cultural affiliation and biological ancestry, and have been used by linguists and archaeologists alike to trace human population movements in the deep past.

A textbook example of this is the spread thousands of years ago of Indo-European languages, which are today spoken by a diverse range of people spread from South Asia to the Middle East, and throughout much of Europe.

Linguists and archaeologists have developed two major and competing models to try to explain the origins of this large language family:

  1. The Indo-European languages may have begun 5500-6500 years ago with the early pastoralists (animal herders) of the Pontic-Caspian steppe: a vast region running from the northern shores of the Black Sea, east to the Caspian Sea, and covering countries like Moldova, western Ukraine, and parts of Russia and Kazakhstan.
  2. Alternatively, they may have their origins with the early farmers of Anatolia some 8000-10,000 years ago, spreading into Europe and other regions with the dispersal of early farming culture and domesticated plants and animals.

Two new studies examining the history of the Indo-European languages have independently confirmed the first of these ideas, the so-called ‘Great Steppe’ model.

The first new study, by Chang and co-workers published in the journal Language, used a rigorous new method to develop an ‘evolutionary’ language tree based on changes in vocabulary through time, incorporating both ancient and medieval languages.

Its strength lies in its statistical rigour, its more accurate estimates of the pace of change in these languages through time, and the more reliable ancestry tree it produces.
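To make the idea of an ‘evolutionary’ language tree a little more concrete, here’s a toy sketch in Python. It is emphatically not the Bayesian phylogenetic approach Chang and colleagues used; it simply clusters a handful of languages by an invented measure of shared vocabulary, to show how lexical similarity can be turned into a tree.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

# Invented fractions of a basic word list shared as cognates between pairs
# of languages (illustrative numbers only, not real linguistic data).
languages = ["English", "German", "Italian", "Spanish", "Hindi"]
shared = np.array([
    [1.00, 0.60, 0.25, 0.24, 0.15],
    [0.60, 1.00, 0.26, 0.25, 0.16],
    [0.25, 0.26, 1.00, 0.75, 0.18],
    [0.24, 0.25, 0.75, 1.00, 0.17],
    [0.15, 0.16, 0.18, 0.17, 1.00],
])

# Treat 'fraction of vocabulary NOT shared' as a crude distance and build an
# average-linkage tree: languages sharing more words join together earlier.
distances = squareform(1.0 - shared, checks=False)
tree = linkage(distances, method="average")
print(dendrogram(tree, labels=languages, no_plot=True)["ivl"])
# The tree pairs English with German and Italian with Spanish, with Hindi
# sitting outside both groups.
```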

In the second study, a large team led by Wolfgang Haak at the University of Adelaide’s Australian Centre for Ancient DNA, and published on the preprint server bioRxiv, extracted and analysed the nuclear DNA of 69 skeletons from Europe dating between 3000 and 8000 years old.

While, again, their work confirmed the Great Steppe model for the origins of Indo-European languages, it also showed that the genetic history of the people speaking these languages in Europe was far more complicated than anyone had thought before.

Around 8000 years ago, Western Europe seems to have been inhabited by farming people whose ancestry was distinct from the hunter-gatherers of the region. In the steppe region of Eastern Europe/Central Asia, the opposite situation applied, the area being occupied by nomadic foraging people.

By about 6000 years ago, people with hunter-gatherer ancestry had seemingly reoccupied Western Europe, while in the steppe region, the earlier hunter-gatherers had now been replaced by herders.

At about 4500 years ago, a massive migration of people from the great steppe region seems to have occurred into the centre of Europe, with the result being that today many European speakers of Indo-European languages are descended from these Eastern European/Central Asian animal herders.

For decades, archaeologists and linguists have debated whether the spread of Indo-European languages occurred through farmers replacing hunter-gatherers as they spread across Europe, or whether nomadic foragers simply threw away their spears and picked up a sickle instead.

These new studies show it may actually have been a complex and drawn-out mixture of both.

History misread from mitochondrial genes?

With the development of genetic sequencing technology in the 1980s, allowing the DNA code to be read directly, scientists began to test and refine models of human evolution that had previously been based on the fossil record alone.

The marker of choice at this time was mitochondrial DNA, championed by geneticists as providing a simple and reliable indicator of recent evolutionary history.

Mitochondrial DNA is abundant in the human body. It is found within the hundreds to thousands of mitochondria inside each cell, their number varying with tissue type (blood cells, liver cells, muscle and so on).

This kind of DNA sits within the mitochondria of our cells, organelles that act as the power stations for each cell, contrasting with nuclear DNA (or our ‘genome’) found only within the nucleus.

Human mitochondrial DNA is small, containing 37 genes, while our nuclear DNA houses close to 25,000 genes.

While nuclear DNA mixes up (recombines) during the production of new cells, some of which become eggs and sperm, mitochondrial DNA doesn’t shuffle about.

Furthermore, we all inherit the mitochondrial DNA contained within our mother’s ovum, making it a marker of maternal ancestry.

Pioneering early studies of the genetic origins of living people by Rebecca Cann and Alan Wilson during the mid-1980s used mitochondrial DNA to show that our species had evolved in Africa tens of thousands of years before we left the continent and settled the rest of the planet.

This was the so-called ‘mitochondrial Eve’ model, and it confirmed one of two major competing theories developed from painstaking studies of fossils – the ‘Out-of-Africa’ theory.

Since this time, mitochondrial DNA has also come to play a major role in forensics and the identification of criminals from human biological samples left at crime scenes or on clothing.

It is also now a major source of information for ‘mail order’ genotyping companies offering popular tests of genetic ancestry for a few hundred dollars, such as National Geographic’s Genographic Project.

Genetic ancestry or a molecular horoscope?

The widespread use of mitochondrial DNA for these purposes now sits under a cloud following a major study by Leslie Emery and colleagues published in the American Journal of Human Genetics.

They undertook a large-scale global comparison of human mitochondrial DNA with some nuclear genome sequences and got some very sobering results.

Emery and her team studied hundreds of DNA samples from people representing many of the world’s major geographic populations and discovered that in most cases their mitochondrial DNA actually provided very little information about ancestry compared with their nuclear DNA.

At least a third of the mitochondrial DNA markers from various populations across the globe failed to accurately predict people’s geographic continent of origin – say ‘African’ or ‘European’ – or ancestry within a particular continental region – like ‘Northeast Asian’ or ‘Southern African’.
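To give a feel for the kind of comparison involved (this is only an illustrative sketch with synthetic data, not a reproduction of Emery and colleagues’ analysis), the snippet below pits many weakly informative nuclear markers against a single mitochondrial haplogroup whose distribution overlaps between regions, and asks how well each predicts a person’s continent of origin.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_group, n_groups, n_snps = 200, 3, 100
continent = np.repeat(np.arange(n_groups), n_per_group)   # 0, 1, 2 = three made-up regions

# Synthetic nuclear data: many SNPs, each with somewhat different allele
# frequencies between the three regions.
allele_freqs = rng.uniform(0.2, 0.8, size=(n_groups, n_snps))
nuclear = rng.binomial(2, allele_freqs[continent])

# Synthetic mtDNA data: one haplogroup per person, with distributions that
# overlap heavily between regions (as real haplogroups often do).
hap_probs = np.array([[0.5, 0.3, 0.2],
                      [0.3, 0.4, 0.3],
                      [0.2, 0.3, 0.5]])
haplogroup = np.array([rng.choice(3, p=hap_probs[c]) for c in continent])
mtdna = np.eye(3)[haplogroup]          # one-hot encode the haplogroup label

model = LogisticRegression(max_iter=2000)
print("nuclear accuracy:", cross_val_score(model, nuclear, continent, cv=5).mean())
print("mtDNA accuracy:  ", cross_val_score(model, mtdna, continent, cv=5).mean())
# The many nuclear markers classify region almost perfectly; the single
# mitochondrial marker does only modestly better than chance.
```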

Emery and colleagues concluded that mitochondrial DNA offered an “incomplete” picture of people’s ancestry and called for greater information about the complexities and inaccuracies of DNA results from commercial ancestry-testing to be conveyed to consumers.

This follows an article published by the BBC in March 2013 that described some DNA ancestry testing as akin to “genetic astrology”, itself based on an article warning of the pitfalls of genetic ancestry testing posted on the Sense About Science website.

While this new study doesn’t go quite that far, its take-home message is a powerful one. It should resonate with scientists working on human evolution and forensic science, and with consumers parting with their cold hard cash for a ‘personal’ DNA history, where the technology promises much more than it can really deliver.

Baby with the bathwater?

Genetics will continue to surprise us with its major and often unpredictable insights into our collective, deep time, human past.

But sometimes, the surprises even threaten to overturn major branches of genetic research itself. Such is the self-correcting nature of this intellectual juggernaut we call science.