The palaeolithic diet and the unprovable links to our past

We still hear and read a lot about how a diet based on what our Stone Age ancestors ate may be a cure-all for modern ills. But can we really run the clock backwards and find the optimal way to eat and live? It’s a largely impossible dream based on a set of fallacies about our ancestors.

There are a lot of guides and books on the palaeolithic diet, the origins of which have already been questioned.

It’s all based on an idea that’s been around for decades in anthropology and nutritional science; namely that we might ascribe many of the problems faced by modern society to the shift by our hunter-gatherer ancestors to farming roughly 10,000 years ago.

Many advocates of the palaeolithic diet even claim it’s the only diet compatible with human genetics, and that it contains all the nutrients our bodies apparently evolved to thrive on.

While it has a real appeal, when we dig a little deeper into the science behind it we find the prescription for a palaeolithic diet is little more than a fad and might even be dangerous to our health.

Mismatched to the modern world

The basic argument goes something like this: over millions of years natural selection designed humans to live as hunter-gatherers, so we are genetically “mismatched” for the modern urbanised lifestyle, which is very different to how our pre-agricultural ancestors lived.

The idea that our genome isn’t suited to our modern way of life began with a highly influential article by Eaton and Konner published in the New England Journal of Medicine in 1985.

Advocates of the palaeolithic diet, traceable back to Eaton and Konner’s work, have uncritically assumed that a gene-culture mismatch has led to an epidemic of “diseases of civilisation”.

Humans are, it’s argued, genetically hunter-gatherers and evolution has been unable to keep pace with the rapid cultural change experienced over the last 10,000 years.

These assumptions are difficult to test, and some are outright wrong.

What did our Stone Age ancestors eat?

Proponents of the palaeolithic diet mostly claim that science has a good understanding of what our hunter-gatherer ancestors ate.

Let me disabuse you of this myth straight away – we don’t – and the further back in time we go, the less we know.

What we think we know is based on a mixture of ethnographic studies of recent (historical) foraging groups, reconstructions based on the archaeological and fossil records and, more recently, genetic investigations.

We need to be careful because in many cases these historical foragers lived in “marginal” environments that were not of interest to farmers. Some represent people who were farmers but returned to a hunter-gatherer economy while others had a “mixed” economy based on wild-caught foods supplemented by bought (even manufactured) foods.

The archaeological and fossil records are strongly biased towards things that will preserve or fossilise and in places where they will remain buried and undisturbed for thousands of years.

What this all means is that we know little about the plant foods and only a little bit more about some of the animals eaten by our Stone Age ancestors.

Many variations in Stone Age lifestyle

Life was tough in the Stone Age, with high infant and maternal mortality and short lifespans. Seasonal shortages in food would have meant that starvation was common and may have been an annual event.

People were very much at the mercy of the natural environment. During the Ice Age, massive climate changes would have resulted in regular dislocations of people and probably the extinction of whole tribes periodically.

Strict cultural rules would have made very clear the role played by individuals in society, and each group was different according to traditions and their natural environment.

This included gender-specific roles and even rules about what foods you could and couldn’t eat, regardless of their nutritional content or availability.

For advocates of the palaeolithic lifestyle, life at this time is portrayed as a kind of biological paradise, with people living as evolution had designed them to: as genetically predetermined hunter-gatherers fit for their environment.

But when ethnographic records and archaeological sites are studied we find a great deal of variation in the diet and behaviour, including activity levels, of recent foragers.

Our ancestors – and even more recent hunter-gatherers in Australia – exploited foods as they became available each week and every season. They ate a vast range of foods throughout the year.

They were seasonally mobile to take advantage of this: recent foraging groups moved camp on average 16 times a year, but within a wide range of two to 60 times a year.

There seems to have been one universal, though: all people ate animal foods. How much depended on where on the planet you lived: rainforests provided few mammal resources, while the arctic region provided very little else.

Studies show on average about 40% of their diet comprised hunted foods, excluding foods gathered or fished. If we add fishing, it rises to 60%.

Even among arctic people such as the Inuit whose diet was entirely animal foods at certain times, geneticists have failed to find any mutations enhancing people’s capacity to survive on such an extreme diet.

Research from anthropology, nutritional science, genetics and even psychology now also shows that our food preferences are partly determined in utero and are mostly established during childhood from cultural preferences within our environment.

The picture is rapidly emerging that genetics play a pretty minor role in determining the specifics of our diet. Our physical and cultural environment mostly determines what we eat.

Humans show remarkable flexibility and adaptability to a wide range of environments and diets, today and in the past.

Evolution didn’t end at the Stone Age

One of the central themes of any palaeolithic diet is the argument that our bodies have not evolved much over the past 10,000 years to adapt to agriculture-based food sources. This is actually quite wrong.

There is now abundant evidence for widespread genetic change that occurred during the Neolithic or with the beginnings of agriculture.

Large-scale genomic studies have found that more than 70% of protein-coding gene variants and around 90% of disease-causing variants in living people whose ancestors were agriculturalists arose in the past 5,000 years or so.

Textbook examples include genes associated with lactose tolerance, starch digestion, alcohol metabolism, detoxification of plant food compounds and the metabolism of protein and carbohydrates: all mutations associated with a change in diet.

The regular handling of domesticated animals, and crowded living conditions that eventually exposed people to disease-bearing insects and rodents, led to an assault on our immune system.

It has even been suggested that the light hair, eye and skin colour seen in Europeans may have resulted from a diet poor in vitamin D among early farmers, and the need to produce more of it through increased UV light exposure and absorption.

So again, extensive evidence has emerged that humans have evolved significantly since the Stone Age and continue to do so, despite some uninformed commentators still questioning whether evolution in humans has stalled.

A difficult choice

In the end, the choices we make about what to eat should be based on good science, not some fantasy about a lost Stone Age paradise.

In other words, like other areas of preventative medicine, our diet and lifestyle choices should be based on scientific evidence not the latest, and perhaps even harmful, commercial fad.

If there is one clear message from ethnographic studies of recent hunter-gatherers it’s that variation – in lifestyle and diet – was the norm.

There is no single lifestyle or diet that fits all people today or in the past, let alone the genome of our whole species.

Saga of the Hobbit: a decade in the making

First published in October 2014 by ABC Science.

It’s 10 years since the discovery of Homo floresiensis (aka the “Hobbit”) was announced. The bones – and debate they’ve generated – can tell us a lot about our evolution and the scientists who reconstruct it.

I remember the day well: 28 October 2004, the day the discovery of Homo floresiensis was announced, is firmly embedded in my memory. And I readily confess that my first reaction was disbelief.

It was a discovery that flew in the face of 150 years of understanding of human evolution.

I quickly rang my former PhD supervisor, and then mentor and close colleague, the late Alan Thorne at the Australian National University in Canberra, and he expressed similar disquiet.

We wondered and discussed whether some kind of disease, or combination of diseases, could have been responsible for its remarkable anatomy: a rather prophetic discussion with hindsight.

An incredible find

The first skeleton, published in 2004, was dubbed LB1 (Liang Bua cave find No. 1) and included a partial skull with most of the teeth in place, several lower limb bones, hand and wrist bones, some bones of the shoulders, ribs, and elements of the hips.

Through a combination of dating techniques on cave sediments it was estimated to be only about 17,000 years old.

LB1 was a two-footed (bipedal) ape, reconstructed to have stood just over a metre tall, and weighing close to 30 kilograms.

Even more striking, its estimated brain volume was among the smallest ever found in the hominin, or human evolutionary, group at just 380 cubic centimetres, although this estimate has now been revised up slightly.

In many other respects its face, teeth and limb bones combined features seen in Australopithecus from several million years ago in Africa, features seen in Homo erectus from hundreds of thousands of years ago in Indonesia, some anatomical traits like those of living humans, and a host of bizarre features new to science.

The next year, more bones were described in another article in Nature, this time their age range being extended from a fossil-bearing unit dating to 95,000-74,000 years old right up to another aged around 12,000 years old.

For many sceptics, this new evidence was important, for no longer was the Hobbit a single, aberrant and incomplete skeleton; it had become a long-lived population. Yet there was, and still is, only one skull.

These new fossils, especially the second lower jaw and its teeth, as well as a large number of spinal (vertebrae) and limb bones were similarly enigmatic in their anatomy and resembled the LB1 remains when they could be compared.

A slow conversion

Now, I readily confess that I was slower than many to accept the Homo floresiensis hypothesis; and all descriptions of a new species are just that, scientific hypotheses that require testing.

I never published anything questioning the find, and although a couple of journalists and many colleagues did ask my opinion, I declined at that stage to offer one, as advised by Thorne. It was very good advice.

The debate over the finds and their significance got very heated, and sadly, very personal. Long-held animosities began to rear their ugly heads in the media; intergenerational squabbles and cultural differences about the role and weight to be given to senior scientists played out publicly.

Accusations were made about carelessness in handling the bones. Fingers were pointed and individuals accused of unauthorised and unethical access to the remains.

In 2004, the first of a number of studies criticising the “new species” hypothesis was published by Thorne and Maciej Henneberg of the University of Adelaide, alongside a response from Peter Brown of the University of New England and the others who had described the remains.

This article began the assault by a group of scientists suggesting the Liang Bua remains, chiefly LB1, were simply from a diseased modern human, perhaps even an indigenous person from the island of Flores, given that short-statured people live there today.

My initial impression had been taken much further, but I was very uneasy. Evidence was quickly garnered by Thorne, Henneberg, Teuku Jacob, Etty Indriati, Charles Oxnard and other scientists from Australia, Indonesia, the United States and other places for pathology as the cause of the features others had come to regard as indicative of a new species.

In fact, over the last decade no fewer than eight diseases or syndromes have been suggested as possible explanations for the unusual features of LB1, to account for its resemblance to primitive hominins.

Yet, I began to have serious doubts. I had spent the first 10 years of my career working on early hominin fossils in Africa, and had in fact described with the late Phillip Tobias the most complete skull belonging to the earliest species of Homo from South Africa, as well as other early fossils from that country; and had worked on very ancient bones in Kenya as well.

The brain surface anatomy of LB1 was also studied on a virtual model made from CT scans by Dean Falk and her team, the shoulder and wrist bones were comprehensively examined, and then the foot bones were studied in detail.

All of the evidence was pointing to a very, very primitive brain and skeleton: one like that seen in our ancestors hundreds of thousands, if not millions, of years ago.

By 2007, I was no longer convinced that a diseased Homo sapiens could end up resembling primitive hominins in so many ways. And I told a journalist so, for the first time offering my opinion about the finds: I found the new species hypothesis convincing.

A small brain through arrested brain growth is one thing, short stature another, but the striking resemblance of so many features from across the skeleton would be too much of a coincidence and require far too much ad hoc explaining to ring true, for me anyway.

Clearly others disagreed, and continue to do so: the debate about which disease may have caused such gross disfigurement, nay evolutionary reversion, continues, the latest candidate being Down syndrome, published in 2014 by Henneberg and co-workers.

The bigger picture

While it’s true that palaeoanthropology – the science of human evolution – has a history of sometimes rather extraordinary claims, and even downright frauds such as the Piltdown Man, I think the Hobbit debate has by historical standards been rather unusual.

For a start, it has lacked a larger-than-life colonial figure at the centre of the discovery: historically we have often seen eccentric and exuberant characters like Eugene Dubois, Raymond Dart, Louis Leakey or Robert Broom taking on the world to show how they have more or less single-handedly solved a key riddle of human evolution; and in a place far from the centres of European colonial power.

While Homo floresiensis has had its enthusiastic public champions, including the late Michael Morwood, Australian co-leader of the project that discovered it, much of the ego and flamboyancy of earlier eras has been lacking, although the conviction certainly hasn’t been.

Also different this time has been the way a number of leading international scholars rallied around to support the find from its first announcement.

I guess one of the key differences has been that the scientific manuscripts describing the Liang Bua finds were anonymously reviewed by senior colleagues prior to their acceptance for publication in Nature.

When Dart described Australopithecus and the Leakeys discovered Zinjanthropus, this top-of-the-pops science journal didn’t subject articles to the kind of rigorous pre-publication review the Hobbit find received before they hit the printing presses. Instead, discussion and review of the science played out through correspondence sent to Nature.

Ten years on since the announcement of Homo floresiensis we are scarcely any closer to understanding the origins and evolutionary relationships of this very enigmatic species.

While it’s always a precarious thing to try to second-guess where the “silent majority” stands on any issue, I think it’s fair to say that a majority of specialists accept that the remains represent a new species of a very primitive human relative.

I also think history will show that the Hobbit stands as one of the most surprising, challenging and important discoveries made in the roughly 150-year history of palaeoanthropology: up there with Dubois’ Pithecanthropus, Black’s Sinanthropus, Dart’s Australopithecus and the Leakeys’ Zinjanthropus.

The extraordinary beginnings of human consciousness

Image credit: NASA. First published in October 2014 by ABC Science, based on a TEDx Brisbane Talk.

Our consciousness sets us apart from all other life. Yet, its evolutionary appearance highlights the accidental nature of our origins.

The beginning of our species is one of the most significant events in the Earth’s — some say the universe’s — history. At its centre are big questions, like how consciousness began.

The 20th century luminary of biology, Julian Huxley, believed the evolutionary arrival of humans was so profound an event in Earth’s history that he dubbed the geological period when it occurred the “Psychozoic Era”.

That is, the geological era of the soul or mind.

Contemporary cosmologists like Paul Davies have even argued that the evolution of humans gave the universe self-awareness.

We humans have always thought of ourselves as rather unique in the natural world — even special — a vast intellectual gulf seemingly separating us from all other life.

To reinforce this, we have constructed cosmologies placing humans at the centre of the cosmos: the Sun orbiting the Earth — as seen for example in Ptolemy’s geocentric model of the universe.

This view changed of course with Copernicus, who showed some 1,300 years later that the Sun was at the centre of the universe (well, the solar system, more accurately), the Earth being just one of several celestial or extraterrestrial bodies orbiting the Sun.

Four hundred years later came the space race. Humans, through the Apollo missions, ventured beyond our Earthly — our evolutionary — home, setting foot on our extraterrestrial neighbour.

We were struck by our seeming aloneness and insignificance in the universe: our pale blue dot of a home set against the vast black expanse of the universe.

This event also marked the serious search for life in outer space, and there’s something rather poignant about our desire to see just whether we ARE actually alone in the universe.

So far we seem to be one of a kind. Yet, it hasn’t always been this way, being alone I mean.

Living with the cousins

Our ancestors shared the planet with other intelligent life not so long ago — the blink of an eye in evolutionary time — with creatures a lot like us.

Our ancestors shared their world with them for most of our evolutionary history stretching back to around eight million years ago, to the beginning of two-footed apes.

Being alone, as we are today, is the unusual state of affairs.

You’ve undoubtedly heard of the Neanderthals, Homo neanderthalensis? They lived up until just 40,000 years ago.

Then there’s the so-called ‘Hobbit’ — or Homo floresiensis — from the island of Flores. It lived up until around 17,000 years ago.

Or, the Red Deer Cave people, one of my own discoveries with my colleague Ji Xueping, from southwest China. Cousins that lived even more recently, up until about 10,000 years ago.

Arrival of the mind

Our species evolved only about 200,000 years ago: probably the newest arrival on the evolutionary scene.

Yet, if we look at the evidence for the behaviour of our ancestors — the archaeological record — we can scarcely distinguish the behaviour of sapiens-humans from that of our cousins.

That is, until somewhere in the geological window of time around 50, 60 or 70 thousand years ago. Roughly three quarters of the way through our species’ evolution.

At this time, we saw a major event which archaeologists have dubbed the ‘Human Revolution’.

At this time we saw the first examples of jewellery being made.

Also at this time, humans took their first steps out of Africa — the humans who went on to found the world’s living populations across the globe.

People lived for the first time in previously unoccupied areas like rainforests, intensely arid zones including deserts, and high mountain ranges, and they quickly settled the Arctic region.

East Asia was also settled about 50,000 years ago for the first time by humans, as was the island continent of Australia.

All of this occurred about the time our kind left Africa. Not earlier, and sometimes a little later. And despite the fact we had existed as an unremarkable species for around 150,000 years.

We saw the first cave paintings at this time, in Europe, Asia and Australia. Symbolic representations of the internal and external world through vivid paintings of cave and rock shelter walls.

And we saw a much wider range of tools being made, with rapid innovation in tool form and use. Tools called ‘microliths’: tiny tools that replaced in many places the bigger, chunkier tools made by our earlier ancestors and relatives.

In short, we saw humans in all of our glory: with our vivid internal world and imagination, and living in virtually every nook and cranny the planet has to offer.

Gift from a departing relative

So, why the ‘Human Revolution’ then and not some other time during the 200,000-year span of our species?

We can piece together the evidence to develop a rather surprising scenario: a truly remarkable narrative of our origins, based on the latest science.

At about 60,000 years ago, when our human ancestors were beginning to make their journey to settle new parts of Africa and the rest of the Old World, the planet was a very different place to today.

It was a world inhabited by our close relatives: cousins living in parts of Africa, and in Asia and Europe.

Now, something rather extraordinary seems to have occurred about this time, as has been shown by the work of some very clever geneticists.

When our ancestors moved into these new places they did something that seems to be a first in human evolution — they mated with the locals.

Now our genome, it turns out, is like a patchwork quilt. It’s estimated that up to five per cent of the DNA of people living in North Africa and outside of Africa today comprises Neanderthal genes.

And a similar value also for the Denisovans — a mysterious species from Siberia we know from a single tooth and finger bone, but also its genome.

It might strike you as odd that different species interbreed. But, in fact, mating between species is common in nature and is actually an important source of evolutionary innovation right across life.

The Denisovans, for example, probably gave us a raft of genes associated with immune function and genes that allowed people living today in the Himalayas to survive at high altitude.

Accidental origin of us

There’s another really fascinating and potentially profound genetic gift they gave us on their way out: a variant of the microcephalin gene.

This gene plays a key role in brain size in humans and there is ample evidence it has been under strong selection in recent evolution.

Now, genetic studies suggest this gene may actually have been added to our genome through interspecies interbreeding with a close cousin. Maybe even with the Neanderthals.

I don’t wish to suggest this is THE gene for consciousness, for without doubt something as complex as the human mind or consciousness must involve multiple genes or even networks of genes.

But, the microcephalin gene is likely to be a key gene, without which consciousness might not exist.

So, it could be that the psychozoic of Huxley, or the universal consciousness of Davies, resulted from the incorporation of a gene we received from a close evolutionary relative.

Isn’t this the ultimate irony? We get the gene, send them to extinction, and claim universal consciousness while we’re at it!

Science constantly updates and knowledge progresses. And, without doubt, this story will change as well. But, in the end, this doesn’t really matter because it highlights one really important aspect of our evolution.

It is clear that we humans, and our remarkable consciousness, were not planned, nor inevitable, and not built into some design for the universe or the fabric of the cosmos.

Instead we were accidental, our evolution contingent.

The very feature we hold so dearly may in fact result from a chance encounter in a dark alley, even an evolutionary one-night stand.

Human races: biological reality or cultural delusion?

First published in August 2014 by The Conversation.

The issue of race has been in the news a lot lately with the canning of proposed amendments to Australia’s Racial Discrimination Act, attempts by extremists to commit genocide against cultural minorities in Iraq, and a new book by US author Nicholas Wade that has scientists claiming their work was hijacked to promote an ideological agenda.

The idea that races are part of our existence and daily experience, especially those of us living in multicultural societies, seems to be just taken for granted by many people.

But are races real or simply social/political constructs? Is there any scientific evidence they exist in humans? Or are some scientists just being politically correct in denying their existence?

Race in nature

Biologists have used the “race” category for hundreds of years to classify varieties of plants and animals and, of course, humans. It has normally been reserved for geographic populations belonging to a single species, and has often been used as a synonym of “subspecies”.

While the species concept, or definition, has also had its share of controversies, biologists agree that species are real, not arbitrary. They represent reproductively cohesive evolutionary units.

Yet the use of “race” in biology is far from straightforward. It has been controversial for many decades irrespective of which species it has been applied to; human or otherwise.

Ernst Mayr, one of the intellectual giants of biology during the 20th century and a pioneer of the classification of biological diversity, was critical of the use of races and subspecies by taxonomists.

Unlike species, races and subspecies are very fuzzy categories. They lack a clear definition as a biological rank, being arbitrarily and subjectively defined and applied.

Races have been identified on ecological, geographical, climatic, physiological and even seasonal criteria. There are subraces, local races, race populations and microgeographic and macrogeographic races; even “ethnic taxa”.

Races simply aren’t real like species are: species represent genuine “breaks” in nature while races are part of a continuum and can only ever have very arbitrary boundaries.

Their lack of favour in biology today has a great deal to do with a desire to remove subjectivity and fuzzy thinking from the enterprise of classifying nature.

A race to the bottom

Scientific racialism has a very chequered history. Many large-scale atrocities and instances of genocide were carried out in the name of race, usually involving notions of the superiority of one race over another, particularly from the 17th through to the 20th centuries.

Anthropology was obsessed with race from the 18th to 20th centuries and has a lot to answer for in terms of the part it played in justifying political and ideological racism.

If you doubt for a moment the impact that race has had on many people, just ask an indigenous person anywhere in the world what they think of race.

History doesn’t lie

Putting aside the ethics for a moment, is it legitimate from the biological perspective to apply race to humans? We might consider this from two viewpoints:

  1. How would we go about recognising races?
  2. How many races might we then identify?

Both questions were the source of regular consternation during the 20th century, and earlier, as anthropology struggled to make sense of – and pigeonhole – the geographic variation seen in humankind around the world.

What evidence was used to identify human races? Well, as it happens, just about anything, and most of it unscientific.

The book Races of Africa, published in three editions from 1930 to 1957, recognised six races inhabiting the African continent. Its author, British anthropologist C G Seligman, readily admitted that the races it described were defined on non-biological grounds, a fact “readers should appreciate in order to make necessary allowances and corrections”.

How were these races identified? Mostly using the languages people spoke: as Seligman further informed his readers, “linguistic criteria will play a considerable part in the somewhat mixed classification adopted.”

Seligman should be praised for his honesty. Many other anthropologists continued the ruse of biological objectivity well into the 1970s; some stick to it today. The reality is that most races were identified on cultural or linguistic grounds, or simply on account of educated intuition, not biology.

Another fascinating example of the arbitrariness of this category is the so-called “Negrito” or “pygmy” race, which sometimes still gets talked about by anthropologists and archaeologists with respect to the origins of indigenous people in East Asia and Australasia.

It has been defined to include people from the Congo of Africa, the Andaman Islands, several Southeast Asian countries, New Guinea and Australia. The Negrito race is not a biological reality reflecting history, but an artificial construct based on superficial similarities.

The skull measurements, brain size estimates, hair form, skin and eye colour, intelligence and blood group data used to justify races were simply retrofitted to each of them.

Moreover, these physical features were very far from flawless in reinforcing established notions of race. None of them has provided any evidence for discrete boundaries between human groups – or groups as genuine geographic entities – and many of them simply reflect the environment, not biological history.

Take skin colour, or pigmentation, as an example, a feature that has been used in almost every racial classification published. While anthropologists employed discrete categories such as “black,” “brown” and “white,” in actuality, pigmentation grades continuously along a geographic cline from the equator to northern and southern latitudes, regardless of race.

How many races have been recognised for living people? Well, there seems to have been no real limit in practice, reinforcing their arbitrary nature.

During the 20th century, estimates of the number of races varied from two to 200 across the globe. For Europe alone, one book published in 1950 estimated six, while another from the same year identified at least 30 races.

Sure, you might recognise races if you compare the skin pigmentation of people from a village in the Scottish Highlands to one in coastal Kenya. But you’d be kidding yourself because you would be ignoring all of the people who live along the thousands of kilometres that stretch between them who don’t fit into your concocted moulds.

Genetics: the final arbiter

Developments in the field of genetics from the 1960s onwards made new inroads into the question of race. In fact, genetics marked the death knell of the scientific race debate.

Geneticists have found a number of features about human diversity that just don’t fit the pattern expected for the ancient subdivisions we might anticipate if races actually existed.

Some important findings that show racial categories to be unfounded include:

  • humans are genetically much less diverse than most mammals, including our chimpanzee cousins
  • common estimates are that around 2%-8% of genetic variation occurs between large groups living on different continents; a pattern that again contrasts with most mammals, which show much greater differences on continental scales
  • living Africans possess substantially more genetic variation than other populations. This reflects the ancestry of our species in Africa – only a couple of hundred thousand years ago – and the establishment of all non-African populations by a small founder group from Africa – less than 60,000 years ago
  • most populations show high levels of mixed ancestry indicating that people have migrated regularly in the past, with most groups far from being isolated from each other for any great length of time.

Are we all the same then?

There is no denying that humans are variable. Some of that variation – a small amount – reflects our geographic origins. Genetic data show this unequivocally.

But this is simply not the same as claiming that this geographic variation has been partitioned by nature into discrete units we call races. Humans have simply refused to be classified along taxonomic grounds – beyond the fact that we all belong to the single species Homo sapiens.

The facts are that the races recognised by anthropologists during the 19th and 20th centuries simply don’t hold up to scrutiny from physical or genetic evidence; besides, races never were scientific to begin with.

Has human evolution come to an end?

First published in July 2014 by ABC Science.

Surprising discoveries that reveal how we have evolved since the Stone Age offer real and troubling insights into where we may be headed as a species.

Futurologists love to think they have the answers to questions about where we might be headed as humans: technologically, socially and biologically.

A recent example bandied about on the internet comes from Nickolay Lamm, a digital artist who makes ‘normal-sized’ Barbie-like dolls, and who has also speculated on how the human head may look in 100,000 years’ time.

He sees a world where human biological evolution will soon end and be replaced by evolution through genetic engineering.

A hundred millennia into the future, he imagines, humans will be alien-like, with balloon heads (housing larger brains), Manga-style oversized eyes, and large, flared nostrils.

Why this particular look? It has nothing to do with fashion or sex.

Lamm believes humans will have colonised distant parts of the solar system by this time and will have engineered various physical features to cope with low light, high solar radiation and low oxygen in our new extraterrestrial home, as well as a need for greater intelligence.

Now, I’m as much a fan of science fiction as the next guy. So, let’s be frank here, these kinds of portrayals of the ‘future’ of human evolution are just that, science fiction.

And, in fairness, Lamm apparently didn’t claim his ideas to be anything more than speculation.

As fun as thought experiments like these might be, it can be interesting to reflect on whether science has anything useful to say about what the future course of human evolution might look like.

A never-ending story

Our evolution didn’t stop with the end of the Stone Age and I doubt very much that humans will ever be able to or want to take complete control over it.

We are still evolving today and will continue to do so into the future because it’s built into the basic fabric of our biology.

Evolution will always continue because of the way our DNA is encoded and replicates, and because of the fact that we reproduce sexually.

I don’t expect that the random generation of gene mutations, DNA reshuffling that occurs with recombination during sperm and egg cell production or random genetic shifts that occur from generation to generation with sexual reproduction will cease any time soon.

This is evolution on a very small scale, but it’s happening each generation and is unpredictable. And this is evolution we simply can’t stop, and wouldn’t want to.

Under pressure

At the risk of being branded a futurist myself, I think we can also sketch out some larger possibilities about the sorts of evolutionary pressures that humans are now, and into the future will increasingly be, under.

What are some of the forces and changes we see that may act to drive our future evolution?

  1. Top among them has got to be industrially induced climate change, with its higher temperatures, greater extremes in weather, food and water shortages and wider spread of infectious diseases, among other likely profound changes
  2. The overuse of antibiotics, combined with much larger numbers of agricultural livestock and the high speed of global travel and disease transmission
  3. Resistance to pesticides by insects affecting agricultural crops and disease-carrying vectors like mosquitoes
  4. Rising pollution levels and our greater exposure to chemicals in the environment and in our food
  5. A lifestyle based around limited physical activity and focused more on technologically aided activities
  6. Delayed conception, especially by fathers, leading to more disease-causing mutations in newborns.

Any one of these — and the list is far from exhaustive — could lead to the sort of pressures that could accelerate evolution — beyond the background rate — in human populations.

All that’s required is that some people, especially children, because of their genes, are incapable of surviving and/or reproducing in the face of these potential evolutionary drivers.

New mutations may arise, or existing but rare ones increase in number, in a population owing to such pressures. Some of them will be beneficial, others detrimental.

As a consequence, those people who can cope, or even thrive under such conditions, would leave more of their genes to future generations: plain and simple Darwinian evolution.

Changes might include genes associated with our immune system, heat shock protein genes, mutations in germ-line cells (especially spermatozoa), eccrine (sweat) gland function, or even others affecting our skeletons, muscles or nervous system.

Sound far-fetched? Well actually we have a pretty good case study from recent human evolution that suggests just these sorts of changes could happen in the not-too-distant future or could even be happening now.

The great agricultural assault

Large-scale studies of the human genome have shown that the most rapid and important evolutionary shifts that have occurred since our species appeared about 200,000 years ago followed the invention of agriculture around 10,000 years ago.

The changes were much greater than when the earliest members of Homo sapiens left Africa less than 100,000 years ago and settled new and challenging environments in Asia, Europe and Australia founding the modern populations of these regions.

The signatures of these events are very clear: more than 70 per cent of protein-coding gene variants and almost 90 per cent of variants found to be disease-causing in living people — whose ancestors were agriculturalists — arose in the last 5000 to 10,000 years.

At this time we saw a remarkable shift in lifestyle, with the end of hunting and gathering and the adoption of agriculture in many places.

This was the so-called Neolithic or Agricultural Revolution, and it eventually led to the industrial and post-industrial economies we have today in the rich world.

Farming led to a dramatic shift in our diet; changed behavioural patterns, with less mobility but long daily hours cultivating and processing food; mounted a major assault on our immune system; and saw the global population rise exponentially from perhaps a few million people worldwide 10,000 years ago to more than 1 billion by AD 1800 and 7 billion today.

It led to large tracts of land being cleared, extensive natural species and habitat loss, changed local weather patterns, permanent human settlements, high-density living, exposure to many new infectious diseases from handling animals and from exposure to their faeces, contamination of water supplies and much poorer hygiene including exposure to diseases carried by pest species like rodents.

The sorts of changes we see going on in the world today as a result of human activity are in many ways similar to those during the agricultural revolution, only this time on steroids!

Who will be affected?

Given the breadth of these potential drivers of current and future human evolution, all human populations will likely be affected, but not equally.

While scientific and medical progress over the last century or so has been remarkable, much of it aimed at reversing the effects of the adoption of farming, the spoils of our efforts have not been shared equally and many people in the developing world today still suffer diseases largely eradicated from wealthier societies.

These are also the people most vulnerable to carbon and other pollution, with their various and wide-reaching effects, and the people who will be most at risk of the evolutionary pressures they bring.

Unlike any other time in our history, we head into this future knowing full well where we are going and without the veil of ignorance that surrounded our Stone Age or Neolithic ancestors.

These are not the glamorous changes the futurists like to focus on, but they are, I think, far more plausible ones, and ones with very real implications for science, medicine, multinational policy makers and governments around the world.

The evolutionary path to us: straight line or forks in the road?

First published in June 2014 by ABC Science.

The depiction of human evolution as a simple linear affair is not only laden with historical baggage, it incorrectly portrays the true complexity of our past.

Search “human evolution” in Google images and what you’ll get is an abundance of stereotypical images of an idea deeply embedded in our subconscious, the inevitable line or ladder of human evolution:

Step 1, crouching hairy ape resembling a chimpanzee with a bad back;
Step 2, ancient ape learns to squat;
Step 3, ape corrects bad posture;
Step 4, upright ape begins to lose skin colour;
Step 5, almost-human creature has picked up a spear, grown a beard and donned a roughly hewn leather skirt; and
Step 6, big-brained pale-skinned man wearing a tailored leather mini (or Armani suit if you prefer) arrives in crowning glory, carrying a beautifully crafted spear (or briefcase or even mobile phone).

Now, not only is this a woefully outdated and highly inaccurate portrayal of our evolutionary history, it’s one that plays right into the hands of wannabe scientists like creationists, showing our evolution to be a programmed series of steps leading inevitably to humankind.

This ridiculously simple image would also have appealed to the racialist anthropologists who dominated my field during the 19th and, sadly, a good part of the 20th Century — scientists like Samuel Morton, Carleton Coon and many other race supremacists.

We might well also ask the obvious question: what happened to the other 50 per cent of humanity, womankind? There’s more than a hint of Genesis (2:23) about it: “this is now bone of my bones, and flesh of my flesh, she shall be called Woman because she is taken out of Man.”

But, what I don’t really get is why this kind of drivel still pervades the internet well into the 21st Century and even on some pretty reputable sites that claim some kind of authority on evolution.

So, what’s the truth about how we evolved? How should we be portraying the broad sweep of our evolutionary history?

The ultimate twig

A giant of 20th Century biology, George Gaylord Simpson, poured cold water over the fledgling field of “exobiology” (what we today call “astrobiology”) in a 1964 article in the journal Science, observing that:

The fossil record shows very clearly that there is no central line leading steadily, in a goal-directed way, from a protozoan to man. Instead there has been continual and extremely intricate branching, and whatever course we follow through the branches there are repeated changes both in the rate and in the direction of evolution. Man is the end of one ultimate twig.

Sadly, 50 years after Simpson wrote these words, the public portrayal of human evolution hasn’t changed much, if the internet, many people’s font of all wisdom, is truly representative.

The portrayal of evolution as a ladder, just like the equally misleading term “missing link”, harks back to the Great Chain of Being of 17th and 18th Century philosophers who believed it was their divine duty to order and name nature in accordance with God’s plan: simple things at the bottom and humans, especially the white man, at the top, closest to God.

Carl Linnaeus, the 18th Century father of biological classification, whom we have to thank for the system of scientific names we use today to label all living things, was one such creationist.

He classified humans in the Order Primates and today this label still holds, humankind sitting in a biological group with the lemurs, lorises, tarsiers, monkeys and other apes.

Being dubbed a Primate is one of the highest honours a church can bestow on a clergyman, particularly a bishop, and Linnaeus’ classification reflected his bias that humans were also, like clerical primates, close to God.

But, while the label “primate” remains today, religious baggage no longer clouds our ideas about scientific classification.

Reading the fossil record

Beginning in the first half of the 19th Century, anthropologists began to amass thousands of fossils, now spanning a period of seven million years, and this record of our evolution recovered from the Earth’s crust shows unequivocally that diversity was the rule.

The latest count is more than 30 species, or twigs, of two-footed ape (or ‘hominin’) relatives in our evolutionary bush: many forks in the road to us, most of them dead ends.

It’s true that most of the fossils we have are broken skulls or teeth, sitting in or out of their respective jaws, but just occasionally nature throws up a more complete skull or even nearly complete skeleton for us to find: take Australopithecus sediba as a recent example.

Yet, what’s even more fascinating to me, as odd as it may seem, is what we don’t know by way of extinct species!

The fossil record is continually throwing up surprises for us when we look in places we’ve not looked before, or sediments spanning previously neglected periods of time.

Take the Hobbits from Flores (strictly Homo floresiensis), or my own discovery with Chinese colleague Ji Xueping, the ‘Red Deer Cave people’: anthropologists would never have predicted either of them to have existed based on what we previously knew.

That’s the joy of evolutionary science — not to be confused with another biological pastime — just when we think we know it all, along comes another big surprise to force us out of old habits!

We know very little about human evolution for most of the planet, especially for vast areas like Asia, and even for most of the massive African continent.

Similarly, there are big gaps in time: until the year 2000, the human fossil record ran out at about four million years ago, but then within a few years of each other, new discoveries in Kenya, Ethiopia and Chad pushed it back another three million years, filling a vast chasm.

While the image of the “bush of human evolution” promoted by a bell-bottom-wearing Stephen Jay Gould back in the 1970s might not be very glamorous, it is the perfect analogy on many levels.

Not only does it accurately portray the evolutionary history of a very diverse, and rather short-lived, group of two-footed apes, it shrinks our collective ego back to a more realistic and moderated place, right where it should be.

Brain versus brawn: evolution of the bubble-headed weakling

First published in June 2014 with an edited version published by The Conversation.

One of the most important questions we can ask, and one that continues to take up much of the time of scientists, philosophers and the religious minded alike is, why are humans so different to the rest of the living world?

Philosophers and physicists have even celebrated the appearance of humans 200,000 years ago on the African savannah as marking the arrival of consciousness or self-awareness for the universe.

Despite the remarkable promise and advances of science and technology over the past 155 years since Charles Darwin published his paradigm shifting book, On the Origin of Species, I find myself increasingly pessimistic about whether this ‘riddle to end all riddles’ will ever be solved.

Mind the gap

In our quest to disentangle it, our scientific gaze usually turns to examining the differences between our close living relative, the chimpanzee, and ourselves.

The physical and behavioural distinctions between us are obvious to all: pointedly, it is we who are destroying their habitat and threatening their very existence.

We share a common evolutionary ancestor with them some 7 or 8 million years ago – a mere ripple in the stream of time of Earth’s history – and more than reason enough to ensure their survival as a species.

Our genetic blueprint, our genome, is different by a mere 1-2 percent: barely enough to explain the “gap” between us.

Surprisingly, comparisons of our genomes have shown that chimpanzees have undergone more positive genetic change in their evolution than we have, undermining the widely held view that humans are dramatically different to other apes.

This also means that, despite its promise, our DNA seems unlikely to provide the answer to this most important of questions.

Bubble-headed apes

One of the most obvious physical differences is our massive brain: a typical chimp brain is close to 400 grams in weight while a human one, on average, weighs almost one and a half kilograms.

Geneticists have identified 15 genes differing between us that physiologically control our brain and nervous system.

Some of them probably underpin the profound changes in brain growth, size and function that set humans apart, but this remains unclear.

Fifteen genes is a very small number, bearing in mind the 21,000 genes present in our genome – most of which have no known function – and that seemingly ordinary differences between people such as stature are influenced by hundreds of genes.

Brawn over brain

Another major difference between us resides in our brawn: it is a well-known fact among zoo keepers and conservationists that angry chimpanzees have been known, from time to time, to rip a human’s arm out of its shoulder socket.

While the number, structure and function of their muscles are overall very similar to ours, there are a few important exceptions, such as the muscles involved in walking upright, chewing food and expressing emotions with facial gestures.

Our limb muscles are mostly a lot smaller and weaker than a chimp’s, and their joints are adapted for much greater agility and more rapid and complex movements than ours.

In short, human limbs have evolved for terrestrial running, while a chimpanzee’s are adapted for arboreal climbing.

The differences in our chewing muscles obviously reflect our diets, with humans preparing our food using techniques like cooking, a practice that may have begun a couple of million years ago among our Stone Age ancestors.

Another really interesting and long understood difference is in our muscles of facial expression, crucially used for non-verbal communication in both species.

Charles Darwin in his 1872 book The Expression of Emotions in Man and Animals wrote about how important facial gestures were in distinguishing humans from other animals.

Yet, recent research comparing these muscles has shown that earlier scientists exaggerated the differences between humans and chimpanzees, and that our facial muscles are very similar in number and function, although, not identical.

Chimpanzees do possess a very wide repertoire of facial expressions and gestures, but they are not as varied as ours: it seems that only humans express emotions like disgust with their faces.

Facing the question with new energy

The seeming failure of genetics to explain “the gap” seems to be inspiring some rather novel ways to address the problem.

Fascinating new research by Katarzyna Bozek and co-workers published in PLOS Biology has compared the so-called “metabolome” of humans and chimpanzees, with some surprising results.

The metabolome is the totality of small molecules produced in the body during metabolism (i.e. normal processes that occur within the cells).

They include amino acids, sugars, fats, vitamins, pigments, odorants, hormones and other signaling chemicals, enzymes and many others.

There are almost 42,000 such chemicals known for the human body, and Bozek and colleagues compared 10,000 of them across five body tissues, including the brain, kidney and muscle.

Importantly, they found that changes in these chemicals between species track evolution and reflect the amount of time and change between organisms since they shared a common ancestor.

Reassuringly, they found that the human metabolome for the brain, especially our frontal cortex, had changed four times more rapidly than in chimpanzees, reflecting major differences in brain size and function between us.

But the big surprise in their research was that the human muscle metabolome had changed more than eight times as much as the chimpanzee’s, hinting at major differences in the way our muscles work at the molecular level.

They very reasonably speculate that metabolism in the human brain and muscles could have evolved in tandem in such a way that the energy demands of our muscles reduced to allow our metabolically expensive brains to grow larger in evolution.

Alternatively, the shift in our ancestors to endurance running at much the same time that their brains began to enlarge may have forced a change in the major source(s) of energy used by the body as cellular fuel: perhaps relying much more on energy stored in body fat.

Undoubtedly, we still face a major chasm in knowledge about how we evolved to be so different to other apes, in the way we think and behave.

But new approaches like those comparing our metabolomes, made possible by recent developments in fields like biochemistry, offer powerful new insights that will add much to more traditional approaches such as anatomy, palaeontology and genetics.

Darren Curnoe's Blog – An Insider's View of the Science of Human Evolution