Biological evolution is change in the characteristics of living organisms over generations.
Despite your vastly superior tastes in music and fashion, you probably look vaguely like your parents, just as they look vaguely like their parents. For all of recorded history, people have looked more or less the same: two eyes, two ears, a nose, a mouth, a head sitting atop an oblong body with two arms and legs. Hairdos differed, but the basic body plan stayed the same for as far back as the history books go.
But if you were to go back further in time, a couple million years before anybody figured out how to write, your ancestors would still have two eyes, two ears, a nose, a mouth, a head sitting atop an oblong body with two arms and legs. But they would have more hair. And maybe not quite as much smarts. They would be much better at climbing trees, though. If you went further back in time, the arms would be front legs. Even further back in time, the fur would be scales. Really far back in time, there wouldn't be any legs at all, just fins for swimming.
Evolutionary theory argues that all the organisms alive on Earth today share a common ancestor. As unlikely as it sounds, life forms from spiders to spider monkeys belong to the same family tree. Even fungus merits an invitation to the family reunion. A simple way to think of evolution is "descent with modification" — over many generations, organisms change into something different.
People took a long time to figure out that evolution happened, and for many years, Western civilization relied largely on the Bible to understand how we got here. But about the same time that Anglo-Irish Archbishop James Ussher calculated his Earth-formation date of October 23, 4004 BC, some of his keenest contemporaries puzzled over just how the Bible could be literally true. Early on, the story that caused perhaps the most trouble was Noah's Flood.
One man who managed to stir some doubt was Niels Stensen (or Nicolaus Steno). A religious man, Steno wasn't out to disprove the Bible; he used it as the basis for his work in piecing together old landscapes. He figured out that rocks are deposited in layers with older rocks at the bottom, and in the landscape of Tuscany, he thought he saw the events described in Genesis. In the century following his death, however, people who searched rock layers for remains of Earth's earliest inhabitants found something odd. According to Genesis, God made Adam, Eve, and all the animals first. Then Adam and Eve started a family and left plenty of descendants. Only later came Noah's Flood, which was held responsible for depositing all those weird remains, like shells, in rocks on top of mountains. If that actually happened, human remains should have appeared in the oldest rocks at the bottom of the heap, but they didn't. Human remains showed up only in the newest layers. The oldest layers of rocks held different creatures, and the further down in the heap one looked, the weirder the creatures got.
Steno wasn't alone in inadvertently causing Noah trouble. All the naturalists who traveled to the New World and Australia to draw, collect, measure and catalog what lived there threatened to sink Noah's Ark with too many passengers. After all, fitting two of everything living in Europe was enough of a challenge. Squeezing in all these newly discovered creatures from newly discovered continents looked impossible. Biblical scholars went back to calculating the length of a cubit.
One more problem for Noah was a shift in philosophy in the 17th and 18th centuries. Rather than simply relying on miracles, scholars and savants (there were few actual scientists just yet) adopted the view that God played by His own rules, so understanding nature could contribute to understanding God. And if God played by the rules, then a worldwide flood that covered the highest mountain peaks with water (and left seashells behind) would require the sudden creation and subsequent cleanup of a tremendous amount of liquid. That didn't look likely. Leonardo da Vinci himself poured scorn on such "just so" stories.
By the early 19th century, a different picture had emerged of the planet's history. Led by scientists like Buffon, Georges Cuvier and James Hutton, the men (few women could participate then) who studied rocks and fossils accepted that the Earth had been around for thousands, if not millions, of years before people came on the scene. Furthermore, many animals — giant ground sloths in South America, monstrous reptiles in England — had gone extinct long ago. So even if people continued to look to the Bible for spiritual guidance, they began to doubt it was a literal account of the history of life on Earth.
Not everyone who accepted an ancient Earth necessarily accepted biological evolution. By the early 19th century, the professional scientists and gentlemen of leisure who dabbled in geology or comparative anatomy entertained a variety of explanations for what had come before humanity. Evolution was one explanation, but many savants believed a series of catastrophes had been followed by fresh creations.
One early evolutionist, Jean-Baptiste Lamarck, proposed a form of evolution in 1800. He suggested that organisms could acquire the characteristics they needed in changing environments and pass those traits along to their offspring, an idea that has been laughed off by history, but his views were actually more nuanced than modern accounts usually relate. Cuvier, regarded as the greatest comparative anatomist of his day, disdained Lamarckian "transmutation," but that didn't keep others from considering it, such as Erasmus Darwin, Charles Darwin's grandfather.
At the end of 1831, Charles Darwin set sail on a sea voyage. Over the next several years, he examined living and fossil organisms that gave him an insight into the ancient past. When he departed on his voyage, many of the people he knew believed in special creation, meaning that God specially created every organism for its environment. At the time, Darwin himself may have shared that belief. Early on, however, he noticed phenomena that special creation didn't explain very well. When introduced to islands, frogs often thrive, sometimes multiplying to nuisance proportions. Yet on the Galápagos, Darwin couldn't find any frogs. He instead affirmed what the French naturalist Jean Baptiste Geneviève Marcellin Bory de Saint-Vincent had said earlier: frogs aren't found on very many oceanic islands, even when they are well suited to those environments. That raises the question of why they weren't created for those locations. The situation is more easily explained by the theory that life on Earth evolved, had to migrate to new locations, and would be prevented from doing so by substantial barriers. Where barriers to migration exist, different animals fill the same ecological niches. This happens again and again, with rabbits and viscachas, beavers and capybaras, and jaguars and marsupial lions filling similar niches in different locations.
Sometime after he returned home, Darwin developed his theory of natural selection. He planned to publish his idea as early as the 1840s, but then Robert Chambers anonymously published Vestiges of the Natural History of Creation, a quasi-scientific account of organisms changing over time. The reaction was fierce. Darwin held back — and gathered more evidence to support his case. In the late 1850s, a young naturalist, Alfred Russel Wallace, began to see organisms in the same light as Darwin. He wrote Darwin about his idea, and Darwin realized that, if he waited any longer, he would be scooped. Darwin's notes and correspondence, and Wallace's paper were all read aloud at the same scientific meeting in 1858. They spurred little reaction, but a year later, Darwin published The Origin of Species. Biology was forever changed.
Darwin has gotten more modern attention than Wallace, but they both deserve credit for the theory. They're famous today not because they proposed evolution. That had already been done. They're famous for proposing a viable explanation of how evolution occurs: natural selection.
In the wild, most parents produce more babies than can possibly survive. When sea turtles hatch, hundreds of them poke out of their shells and head for the ocean, desperately flopping their tiny flippers as fast as they can. Many of them don't even reach the water; they're picked off by seagulls instead. Once in the water, many more baby turtles become snacks for their neighbors in their new watery neighborhood. Even if they avoid becoming a meal themselves, they have to find meals of their own, and food can be pretty limited in the wild.
Some life forms on Earth — such as some bacteria, plants, even animals — reproduce by cloning — making exact replicas of themselves. In such populations, everybody is the same. For the rest of us, sex introduces variation into a population. Darwin pondered this variation, particularly the differences between the animals that survive and pass their characteristics on to their offspring, and those that instead get snatched up in the jaws of defeat. He (and Wallace) found that the key to surviving and producing offspring has to do with suitability to the environment. Birds that need to see their prey from afar, then swoop down and snatch up a meal, benefit from keen eyesight. Hunting birds with lousy eyesight often go hungry. Lizards scurrying around in forest floor litter benefit from good camouflage and swift legs. In a (literal) pinch, the ability to part with one's tail can keep a lizard alive. Ditto an octopus arm.
In other words, the life forms best suited to their environments are likely to live the longest and produce the most offspring.
An assortment of tiny islands in the Bahamas recently gave researchers the chance to compare the roles of predation and competition in natural selection. The scientists stocked each island with lizards that they planned to observe, careful to ensure no reptilian island-hopping. Even before that, they measured and marked each lizard, and put each (un)willing experiment participant through a fitness test on a treadmill. Some islands were sparsely populated, meaning the lizards' biggest challenge was predation. Other islands were crowded, so the biggest challenge in those places was beating the other lizards to the most calories. At the end of breeding season, the researchers sifted through their results and discovered that competition for resources was a bigger driver of natural selection than predation. Local predators turned out to be unfussy eaters. The lizards that won the chow contest on crowded islands, however, were bigger and more athletic than the survivors on un-crowded islands.
Even when an organism is well-suited to a particular environment, and can out-compete others for limited resources, that's no guarantee of everlasting success. Environments can change. Lakes can dry up. Weather patterns can shift. A new species of vegetation can set up camp, favoring a new color of camouflage in lizards rustling around in the leaves. Any organism that happens to possess a characteristic well suited to the new environment will do better than its peers. Over time, the new characteristic may become so preferable to the old one that the population eventually looks different from how it used to look.
Darwin couldn't know about the discoveries of genetics that would follow in the 20th century. Genes — the inherited instructions that tell our cells what to do and tell our bodies what to look like — are passed from one generation to the next with remarkable accuracy. But every once in a while, something goes wrong. The instructions get botched. These random mistakes are mutations. People often think of mutations as invariably harmful, but that's not always the case. Many (perhaps most) mutations have little or no effect, and some mutations are life savers. One example in human DNA is known as delta 32. When the plague, often referred to as the Black Death, struck Europe, it devastated the population, but a lucky few survived. Some research suggests they carried the delta 32 mutation that conferred resistance to the disease. The same mutation appears to confer a resistance to HIV, the virus that causes AIDS, today. A 2022 study identified mutations in a gene dubbed ERAP2; an allele of the gene that boosted immunity against Yersinia pestis improved survival. A study of European skeletons of plague victims and survivors showed that survivors were more likely to carry the beneficial gene variant. In fact, researchers found the incidence of the variant surged in the wake of the plague — "the strongest surge of natural selection on the human genome documented so far," according to Science.
If your irritating little brother emptied a pail of sand over your school project, while you stood there thinking of ways to shorten his life, the little tyke might try to extricate himself with the argument that he had no idea the sand would actually fall down. After all, gravity's just a theory.
In casual conversation, we talk about our "theories" all the time, e.g., "Men who are chauvinists at work are really henpecked at home, and they're just trying to vicariously get even with their wives." A scientific theory is different. Rather than being simple speculation, a scientific theory tries to make sense of a broad range of observations. The germ theory of disease is a good example. Before scientists understood the role that microscopic pathogens play in spreading sickness and infection, people used to blame cold weather, warm weather, "vapors," foreigners, even evil spirits. Today, the germ theory of disease is so well accepted that a doctor who refused to wash up before performing surgery would face a lawsuit, and probably lose his or her license to practice medicine.
A scientific theory starts out as a hypothesis, a proposed explanation for some phenomenon, and where possible, the hypothesis is subjected to testing. Scientists document such work and publish their papers in peer-reviewed science journals. Other scientists in the same discipline review and comment on the paper before it sees the light of day, and other researchers should be able to replicate the experimental results.
As evolutionary biologists have pointed out, any of the experiments designed to test Darwin and Wallace's theory of natural selection since it was first proposed could have proven the hypothesis wrong. None of those tests did. So evolution is "just a theory" the same way that the theory of plate tectonics and the germ theory of disease are just theories. And gravity.
One body of evidence Darwin turned to for his theory was artificial selection: the changes humans have caused in crops, livestock and pets.
Humans have bred dogs, birds, cows and flowers to their desires for hundreds, if not thousands, of years. Dogs may be the first organism that people domesticated, and in the thousands of years that have passed since humans began breeding our canine companions, we have shaped dogs into a mind-boggling variety of colors, shapes and sizes. Depending on the breed, an adult dog might fit easily into your purse or backpack, or weigh more than you do. It might be specialized for speedy running, fetching, rat killing, thief watching or lap sitting. (Though they also show significant variety, domestic cats exhibit a smaller range of sizes and shapes than dogs, and no amount of human intervention in their breeding process has rendered cats as sweet and devoted as dogs.)
Human intervention in crops is easy to underestimate, but domestication has shaped the foods we eat as much as it has shaped dogs and cats. A 17th-century still life by Giovanni Stanchi hints at how much humans have changed plants in just a matter of a few centuries. The exotic-looking fruits in the lower right of Stanchi's painting are watermelons — hard to recognize for modern viewers. In fact, humans have been working on watermelons for millennia; genetic analysis of a watermelon leaf from a 3,500-year-old tomb shows New Kingdom Egyptians cultivated them. Recognizing the power of Old Master art combined with domesticated plant genomes, a pair of Belgian researchers launched a crowdsourced project, #ArtGenetics, in 2020, appealing to the public for help in identifying more examples of changes driven by domestication.
If farmers and breeders could bring about such big changes in the short time span of recorded history, Darwin reasoned, what could nature do?
Artificial selection shows the power of an outside selective force acting on a species, but that's not the only evidence for evolution. Other factors point to a common origin for life forms. Bats, dolphins and people are all mammals, but bats fly, dolphins swim, and humans type, dine and doodle. If each species were carefully designed from scratch, there wouldn't be much need for overlap in skeletal structure. Yet all three types of animals share the same general limb design. Humans, dolphins and bats all have upper arm bones, lower arm bones, wrists, hand bones, and fingers. In dolphins, these bones are shortened to make a stiff flipper. Bats, meanwhile, spread their wings out over their finger bones.
Why would evolution do this? Because it works with whatever it's got handy (pardon the pun). Evolution can't see the future and it can't change the past. It can only cope with the present.
Another example of evolution working with what it's got appears in the difference between how whales and fish swim. Fish and their ancestors have always lived in the water. Whales moved back into the water after their ancestors wandered around on land. When fish swim, they wriggle their tails sideways, rather like a snake slithers. In contrast, whales move their tails up and down, because they inherited spines that were once attached to running legs.
One of the arguments mustered against Darwin's theory was the complexity of the human eye, which couldn't function properly without all its components working just right. Surely something so complex couldn't evolve through gradual change, critics argued. But more primitive eyes can be found in nature. Freshwater flatworms get by with simple eyespots: aggregates of pigment cells that distinguish between light and darkness without the aid of nerves. Jellyfish use a range of strategies to sense light. Some use dispersed photoreceptor cells paired with dispersed neurons, and among the jellyfish that can see, eyes likely evolved several times. The tube feet of sea urchins have photoreceptor cells, meaning the entire animal operates as a crude compound eye.
Darwin pointed out the things he and his contemporaries could observe in the 19th century. In the 20th century, the neo-Darwinian synthesis combined Darwin's theory of natural selection with Gregor Mendel's discoveries related to genetics, and not long afterwards James Watson and Francis Crick uncovered the physical structure of DNA. This led to a whole new set of discoveries about common ancestry.
Reptiles have scales, birds have feathers, and we mammals have hair or fur. The cell structures that make these various skin appendages, known as anatomical placodes, employ the same developmental genes in all lineages, a 2016 study of embryos and gene expression found. The likeliest explanation is that animals with scales, fur and feathers all inherited the cell structures from a common ancestor.
People began domesticating rock pigeons, Columba livia, as long ago as the Neolithic, and by Darwin's day, pigeon breeds abounded. Darwin suspected that all the pigeon breeds, even the "fancy" pigeons with strange shapes and frilly feathers, descended from the rock pigeon. Genome sequencing of pigeon breeds in the early 21st century backed up Darwin's suspicions, showing that domestically bred pigeons are more genetically similar to each other than to another closely related pigeon species, Columba rupestris. The research also identified a mutation matching crests (feathers growing in reverse direction) on multiple breeds, likely a mutation that "occurred just once and spread to multiple breeds." The domestic pigeons had derived traits their wild ancestor lacked.
DNA studies have found that some organisms carry long-deactivated genes that were more useful to their ancestors. Dolphins and whales, for instance, carry hundreds of genes related to sensing smell in an air-based (not water-based) environment. But because these animals have returned to the water, those genes are no longer functional.
DNA studies help humans find our own spot in the tree of life. Comparing the genetic material of humans and other great apes shows that our DNA differs within our own species by about 0.1 percent, with that of the chimpanzees by about 1.2 percent, and with that of the gorillas by about 1.6 percent. Yet humans and African apes (including gorillas) all share more similar DNA than the African apes share with orangutans, which dwell in Asia. Darwin predicted that humans branched off from other apes in Africa and modern studies back up his hypothesis.
More striking than similarities among great apes are DNA similarities between a much wider range of organisms. Human DNA roughly matches 90 percent of the DNA of a domestic cat, 75 percent of the DNA of a mouse, 60 percent of the DNA of a fruit fly, 60 percent of the DNA of a banana, 38 percent of the DNA of a parasitic worm, and 18 percent of the DNA of yeast. Analysis of human versus fruit fly DNA indicates that we use basically the same genes, just in different ways with different combinations and different timing. Scientists have also pinpointed some specific genes that do different things in different organisms. Research led by Arhat Abzhanov of Harvard University suggests that a gene that geneticists have dubbed BMP4 both strengthens the jaws of fish that eat a robust shellfish diet, and bulks up beaks of Galápagos ground finches. A 2015 study by Sangeet Lamichhaney and coauthors found that a different gene, ALX1, plays a strong role in beak formation in Darwin's finches, as well as facial structure in humans. Another gene identified as FOXP2 helps young finches learn to sing, and young humans to speak.
Over the course of their evolutionary history, many organisms have lost genes, but sometimes the function of the discarded gene can be regained by repurposing genetic material the species has left. Many vertebrate species possess three genes related to taste, named T1R1, T1R2 and T1R3. Among this trio, T1R2 (paired with T1R3) gives humans and other animals the ability to taste sweet foods. Genetic studies indicate that birds have lost the gene that would enable them to appreciate a candy store. But nectar-sipping hummingbirds show a clear preference for a sugary diet. A study led by Maude Baldwin in 2014 concluded that hummingbirds had repurposed T1R1 and T1R3, which pair up to sense umami (savory) flavors, to sense sweet ones as well. Baldwin postulated that hummingbird ancestors might have been introduced to a sweet diet by hanging out near flowers to catch insects. If so, it would bolster Darwin's claim that taste "must be acquired by certain foods being habitual — [and] hence become hereditary," though Baldwin cautioned that more evidence, such as early hummingbird fossils, was needed to flesh out her hypothesis.
Gene studies have turned up groups of genes, sometimes called modules or termed "deep homology," that have continued working with each other for (literally) millions of years. Like steadfast bowling buddies, these genes have stuck together even as the organisms they created evolved and the genes' jobs changed. A hunt for genes that code for human blood vessels, for example, identified five genes — and turned up the same genes in an astonishing place: yeast. In these unicellular fungi, the genes repair damaged cell walls. A hunt for genes involved in the human nervous system found the same genes in completely nerveless protists known as choanoflagellates. Human eyes and jellyfish eyes use the same clusters of genes to detect light. So far, geneticists have detected almost 50 modules shared by people and plants.
And then there are genes that manage the big picture in a variety of organisms. First found in fruit flies, Hox genes regulate overall body plans for everything from bugs to birds. Although the genes differ between vertebrates and arthropods, they show remarkable similarities. They often occur together in a comprehensible order, in contrast with most other genes, and their order matters. Arguably grisly experiments with these genes show that moving them around creates fairly disgusting mutants, like flies with legs sprouting from their heads. Likewise, grafting mouse mouth tissue into a developing chick embryo demonstrates how Hox genes work across different animals; the resulting chicks hatch with teeth, though the teeth look dinosaurian.
Even more basic than Hox genes are the building blocks for all genes; DNA is short for deoxyribonucleic acid. With the exception of some viruses that rely on just ribonucleic acid (RNA), all life on Earth uses DNA. And in organisms with DNA, RNA helps transcribe genes into proteins. These building blocks of inheritance are common to all living things on this planet.
Evolution still happens today, and it's still driven by natural selection.
Finches in the Galápagos Islands have repeatedly demonstrated responsiveness to environmental change. A major drought hit the islands in 1977, leaving slim seed pickings for the finches. Easily chewed little seeds got eaten up fast, leaving only big, tough seeds that finches normally ignored. The finches with the big, tough beaks could munch those seeds, and fared better than the birds with little beaks. But not all droughts have the same impact. Another drought struck in 2004, and over the course of the next two years, medium ground finches with big beaks faced intense competition from another large-beaked species. In that drought, smaller-beaked finches fared better. Over the years, the researchers have identified a couple of genes, ALX1 and HMGA2, as drivers of variation. In other species, including humans, ALX1 plays a role in facial development, and HMGA2 plays a role in stature. Scientists have also discovered an important role for hybrids. In 1981, researchers noticed a hefty male. They assigned the bird the ID 5110, and nicknamed him Big Bird. Described as "unusually fit," Big Bird lived 13 years (much longer than a typical finch lifespan), formed a bond with six different females, and sired at least 18 baby birds who lived long enough to depart from the nest. Researchers suspect that the unusual bird migrated to the island of Daphne Major from nearby Santa Cruz, and resulted from mating between a cactus finch and a medium ground finch. Decades later, Big Bird offspring, with their intermediate-sized beaks, tend to stick to one area on Daphne Major and breed among themselves.
As the glaciers retreated at the end of the Pleistocene 10,000 years ago, migratory marine fish moved into lakes and streams in North America and Eurasia. Adapting to new foods, new threats and new water conditions has diversified these stickleback fish (Gasterosteus aculeatus) so much that their body plans vary more than those of entirely different genera of fish. Variations include dramatic differences in the size or number of bony plates, body build and tooth structure. Some sticklebacks sport fins that other sticklebacks completely lack. And because sticklebacks often stick to their own size when it comes to mating, the variation promotes reproductive isolation between different groups. In the shorter term, scientists put developmental pressure on stickleback fish in an experiment published in 2008. The researchers fed two different diets to fish in captivity. Some of the fish fed on bloodworms wriggling around the tank bottoms while others feasted on shrimp in the water column. The bloodworm eaters developed stumpy snouts with wide mouths, and the shrimp eaters developed long, skinny snouts.
The 2008 stickleback study did not result in the evolution of a new species, but it documented something intriguing about how evolution can work. One of the gadgets in many organisms' evolutionary toolkit is plasticity: the capacity to develop in different ways when faced with different conditions. Fish known as bichirs (Polypterus) inhabit lakes and rivers, but can "walk" short distances over land using their fins and breathing air with primitive lungs. An experiment published in 2014 raised the fish on land over the course of their lives. The fish developed a more efficient gait, placing their fins under their bodies. Their shoulder joints loosened, allowing for greater movement. The changes in bichirs resembled the changes seen in the fossil record of tetrapods.
Off Florida's Gulf Coast lies a geologically young island archipelago, inhabited by pale-hued mice, Peromyscus polionotus. The mice have light coats that blend in with the pale beach sand better than the darker coats of their mainland relatives, and the lighter coats result from small changes in the expression of a single gene during embryonic development. The beach mice's genetic mutation occurred after the beach islands formed some 6,000 years ago.
In the 2010s, a field experiment with deer mice in Nebraska demonstrated evolution in real time. Researchers built pens, some on light sand and some on dark soil, and loaded them up with mice of a uniform, intermediate color. Owls picked off the most noticeable rodents pretty quickly, and over time, surviving mice and their offspring better matched their local environments. The team followed up the experiment with DNA analysis, and found different mutation levels in a fur-pigmentation gene, named Agouti, with lighter mice showing more frequent mutations.
In September 2017, Hurricanes Irma and María devastated the Caribbean. They also afforded a small group of biologists a serendipitous study. Just four days before Irma struck, the researchers finished a study of a common lizard, Anolis scriptus, in the Turks and Caicos Archipelago. After the hurricanes swept through the area, the biologists returned to see if the lizard population had changed. It had. Hanging onto a branch can be tough for a little lizard in high winds, and big toepads can help. In addition, anole lizards tend to hang on fiercely with their forelimbs and just let their hindlimbs flap in the wind. Relative to the pre-hurricane population, surviving anoles collected after the storms had bigger toepads, longer forelimbs and shorter hindlimbs. The researchers cautioned that they only measured observable characteristics (phenotypic changes), not genetic changes.
Bird lovers who stock bird feeders for European songbirds known as blackcaps, or Sylvia atricapilla, have managed to encourage the formation of two distinct groups of birds: one that eats fruit and winters in the Mediterranean, and another that dines at bird feeders while wintering in the UK. The UK birds have rounder wings and skinnier beaks.
Between 1947 and 1977, General Electric released polychlorinated biphenyls (PCBs) into the Hudson River. PCBs can cause nasty deformities in fish larvae, such as missing jaws. But recent observations of one type of bottom-dwelling fish, tomcod, showed that they didn't suffer from PCB-induced deformities. It turns out that most tomcod in the New York region carry a beneficial mutation in a gene named AHR2. Normally PCBs latch onto the proteins encoded by AHR2, but the useful mutation enables the proteins to evade the PCBs' grasp. Farther away from the source of PCB pollution, fewer tomcod carry the mutated gene; hardly any tomcod in northern New England or Canada have it.
Studies of mosquitoes in the London Underground (subway system) have turned up new speciation — in Darwin's own neighborhood, and since the publication of Origin. Culex pipiens is the world's most widespread mosquito species. In London, it occurs in two varieties: C. pipiens pipiens, which lives above ground, and C. pipiens molestus, which lives underground. At high latitudes, the two varieties are quite distinct from each other, partly because only the surface variety goes into hiding and stores fat during the winter. In the 1990s, a pair of geneticists began studying mosquitoes in London's Underground. Like subterranean mosquitoes elsewhere, the Underground bugs stayed active in the winter months. Yet the geneticists found that the Underground mosquitoes showed the greatest genetic similarity to local mosquitoes that lived at the surface. Furthermore, the mosquitoes inhabiting the subway system showed far less genetic variation than their relatives up top — evidence of a much younger population. When the researchers tried mating Underground bugs from different parts of the subway system, they had no trouble making babies, but when the researchers tried breeding them with surface varieties, no babies resulted. The evidence indicates that a group of surface mosquitoes took up residence in the Underground (which first came into use in 1863) and, in reproductive isolation, established a new species.
Mosquitoes aren't the only bugs to provide clues to speciation. Chalkhill blue butterflies (Polyommatus coridon) settle in chalk- or limestone-rich environments, feeding on horseshoe vetch and laying well-camouflaged eggs on the plant leaves. Chalkhill blue larvae secrete a sugary solution prized by multiple species of ants. Guarding their source of ant candy, the tiny attendants protect the butterfly larvae from wasps and other troublemakers. Entomologists have identified multiple varieties of Polyommatus coridon, and studies of the butterfly's genes in the mid-1990s indicated that a variety in Sardinia was evolving away from continental populations, likely driven by isolation. The butterfly population in the Sardinian mountains was small, and had far less genetic variation than the continental varieties.
Creationists have often argued that evolution has never been observed, and even many biologists assumed that evolution happened too slowly to be observed within a human lifespan, but that assumption turns out to be wrong.
Two bug species probably made their way to the Hawaiian Islands around the turn of the last century: Ormia ochracea (a parasitic fly from North America) and Teleogryllus oceanicus (a cricket from Oceania). Like boys everywhere, the male crickets tried to win girlfriends with noise, in their case by rubbing their wings together. But in the presence of the flies, who find the sound as attractive as female crickets do, and whose larvae like to burrow into the males and kill them from the inside, silence turned absolutely golden. Within as few as 20 generations, male crickets on two islands got much quieter, actually losing their chirping abilities. About 50 percent of the males on Oahu, and about 95 percent of the males on Kauai lost the wing structures that make the love songs humans hear as chirps. Not only did entomologists observe these changes over a short time, thanks to the bugs' short lifespans, but they could also see the differences in the wings between the island populations. A study published in 2014 reported that DNA analysis showed different markers in the genomes of the Oahu and Kauai crickets — an example of convergent evolution, where different populations independently evolve similar traits.
Working in the Guanapo River Valley in Trinidad and Tobago, biologists developed an ingenious method for marking guppies with microscopic beads under their skin, making each guppy unique, then tracking what happened to them. Since the 1970s, the studies have led to a series of discoveries. One is that guppies living in largely predator-free stretches of the river turn out to be more flamboyantly colored than their drab cousins who hope to escape the dinner plate downstream. After transplanting drab guppies to the safer stretches upstream, the researchers found that the dull little fish evolved pretty color patterns within just five fish generations. Another finding was that fish transplanted into tough neighborhoods evolved to mature sooner at smaller sizes, putting more resources into making their own fast-growing, compact babies.
In the 1950s, engineers were building the Intracoastal Waterway, and their dredging off the coast of Florida created new islands as a byproduct. Carolina anole (Anolis carolinensis) lizards soon took up residence. In 1995, scientists conducted an experiment by introducing the brown anole (Anolis sagrei) — a closely related species and a competitor — to some of the islands. The scientists had two predictions: (1) The Carolina anoles would move to higher perches to avoid the unwelcome guests, and (2) This change in behavior would have "an evolutionary consequence." The first prediction came true within just a few months. The second came true in just 15 years. The Carolina anoles quickly took up residence in higher branches, and their toepads enlarged to enable a better grip. The change took about 20 lizard generations.
Human conflict can change animal populations in unexpected ways, and the changes can happen quickly. Between 1977 and 1992, ivory poaching helped finance the Mozambican Civil War. The ivory came from African elephants, Loxodonta africana. After the conflict ended, a lasting impact could be seen in the elephant population of Mozambique's Gorongosa National Park. Both male and female African elephants normally have tusks. Prior to the civil war, roughly one-fifth of the park's female elephants were tuskless, perhaps the result of earlier, poaching-fueled conflicts. Now, half the females are tuskless. Selection pressure favored the survival of elephants without tusks. Evolution enabled females to forego the features that made them poaching targets, though at the price of being less equipped to dig water holes or defend themselves. For the males, the consequences appear even more dire. A 2021 study identified the gene likely driving the tuskless state on the X chromosome. The absence of the gene in a female apparently means the absence of tusks. The absence of the gene in a male, possibly because genes adjacent to it also disappear, likely means premature death, maybe even before birth. While researchers have seen an increase in the incidence of tuskless females, they've seen fewer males overall, and they haven't seen any males without tusks. So, Gorongosa National Park provides evidence of rapid evolution, but it's not an unalloyed success story.
Bacteria generally reproduce so quickly that humans can observe multiple generations in the space of a day, and this has enabled scientists to observe evolution. Experiments published in 2013 tested multiple generations of a bacterial species that occurs everywhere from soil to human skin to human lungs, Pseudomonas aeruginosa. Placing the microbes in petri dishes with tasty bacteria food, the researchers let the microbes evolve, took samples after 24 hours, and let the sampled microbes start their own colonies. Within days, the strains had evolved into "hyperswarmers" that were incredibly fast at fanning out to find new food. The bacteria evolved extra tails (microbes in the original strain were single-tailed), thanks to mutations in a gene known as FleN. But not all the new bacterial strains were the same; the scientists identified three different types of tail configuration. The evolution in these bacteria was swift and unmistakable, but apparently came at a cost. To thrive, P. aeruginosa also needs the ability to build a film over its colonies — something that has made fighting infections from this bug very difficult. And when it comes to film building, single-tailed bacteria out-compete the multi-tailed fast movers.
An ingenious experiment published in 2016 examined bacterial evolution on a scale visible to the naked eye. Led by Michael Baym at the Harvard Medical School, researchers built a giant petri dish, roughly 2 feet by 4 feet. They filled the petri dish with agar (a nutrient-rich, gelatinous material that provides bacteria with a stable environment for growth). They divided the petri dish into nine bands. The outermost bands contained no antibiotic. The next bands in contained just barely enough antibiotic to kill Escherichia coli. The bands inside from those contained 10 times as much antibiotic, and the bands inside from those contained 100 times as much antibiotic. The innermost band contained 1,000 times as much antibiotic as E. coli could survive at the outset of the experiment. Within just 12 days, bacteria in the experiment had evolved enough resistance to survive the innermost band.
Besides the DNA overlap with apes, some of the strongest evidence for evolution comes from our own species. We humans long believed ourselves to be the highest form of life on Earth, made in the image of God. If so, our perfection needs a little work. Some of us have flat feet, heel spurs, and ankles prone to recurring sprains. Some of us have cracked vertebrae and slipped disks — unfortunate results of upending a spine that had long been horizontal in four-legged animals. Women got an especially raw deal: hips so wide they place extra strain on knees, yet not wide enough to make giving birth to those big-brained babies the least bit easy.
The smidgen of the electromagnetic spectrum visible to human eyes roughly corresponds with the part of the spectrum in which our Sun releases most of its energy: red, green, blue and ultraviolet wavelengths. But whereas dinosaur descendants (birds) can see ultraviolet light, we can't. Perhaps as the result of Mesozoic mammals adopting a nocturnal lifestyle to avoid predation, our ancestors lost the receptors needed to detect UV light. Evolution is often a process of unused traits eventually being pruned from the genetic tree, and at some point, our mousy little ancestors relinquished UV vision.
Human brains can pose problems at both the beginning and the ending of life. We are more intelligent than our ape relatives, but at a price. A study published in 2015 suggests that the same genes responsible for increased neuron connectivity may also lead to Alzheimer's — a condition not known to afflict any other animal, not even great apes.
Ever suffered from the hiccups? Thank the convoluted path that nerves must follow between your brain stem and your diaphragm; in fish, those nerves enjoy a less troublesome route. We humans have appendices, perhaps useful to our ancestors, but in us useless leftovers that occasionally inflame and rupture. Before the invention of surgery and the adoption of surgical hygiene, your own appendix could easily kill you.
Again, why did evolution do these annoying things? Because evolution doesn't have to be perfect, it merely has to be effective. It only has to be good enough to give you sufficient time to pass on your genes.
Think you picked up your fear of snakes from your skittish relatives? Maybe not. Two studies published in 2013 suggested that the way humans recoil from a coiled legless reptile may have been shaped by natural selection. One study examined the ability of children to rapidly spot snakes in photographs. Not only did the kids find the snakes with the same speed as adults, but the aptitude for identifying the reptiles was just as strong in city-dwelling kids who had likely not seen snakes outside of a zoo. Later in the year, a study of macaque monkeys used electrodes wired through the primates' brains to observe the firing of neurons in response to pictures of geometric shapes, monkey hands, monkey faces and snakes. Dozens of neurons that showed no response to the shapes or monkey parts fired strongly in response to the reptiles. Snakes probably counted among the deadliest threats to our primate ancestors, and that tough environment would have favored a brain able to identify such slithering dangers quickly.
Today, benefiting from shelter, sterile surgeries, antibiotics, vaccines, car ownership and convenience stores, we humans in the developed world must cope with new dangers: diabetes and obesity. Our bodies — better adapted to a hunter-gatherer world where calories were scarce and exercise was abundant, where they had to store fat to get us through long spells of little or no food — don't deal so well with the modern glut of cheeseburgers, bourbon and Twinkies. Today, overweight people outnumber underweight people the world over.
The mismatch between our hunter-gatherer bodies and modern diets is also apparent in dental health — or lack of it. Many of us harbor painful memories of wisdom tooth extraction, made necessary by jaws less robust than our ancestors'. (While many of us gave our third molars to our dentists, some populations have largely dispensed with those bothersome extra teeth. One study concluded that prehistoric people of Mexico had no wisdom teeth, and another study found 25 percent of Mexico's current population missing at least one wisdom tooth.) Besides more robust jaws, ancient humans had far fewer cavities. Studies of thousands of fossil humans found cavities in less than 2 percent of them. That began to change with the advent of farming and the consumption of cereal, but the problem got far worse with the introduction of affordable sugar within the last few centuries.
Studies of different human populations indicate that human evolution continued not only after we branched off from apes, but even long after Paleolithic times. Some studies suggest that people whose ancestors came from warm, lowland environments, who had to work hard for all their meals even after food became abundant elsewhere, tend to have slow metabolisms, and people whose ancestors came from cold, highland environments tend to have high metabolisms. While one group had to survive long periods of famine, the other group likely had to metabolize lots of animal fat. What both groups have in common is a tendency toward lifestyle diseases in response to a modern Western diet and sedentary mode of living. There is some debate about whether gene variants leading to type 2 diabetes really resulted from periods of food scarcity ("thrifty gene hypothesis") or whether they just resulted from genetic drift ("drifty gene hypothesis").
City life has apparently shaped human evolution, too. In centuries past — and in many parts of the world today — urban life has been tough, thanks to cramped and dirty living conditions. As a result, cities and tuberculosis go way back. Surviving such challenges over the long haul has entailed evolving resistance to the disease. A study published in 2010 correlated a genetic makeup that confers a resistance to tuberculosis and a few other maladies with a long duration of urban settlements. Iranians and Turks living around ancient cities had high resistance; Sami (or Saami) and Malawians living in places that weren't urbanized before the late 19th century had much lower resistance.
A study published in 2017 hinted, though it didn't prove, that natural selection might continue to weed harmful mutations out of the human genome today. The researchers combed through more than 8 million common mutations in a database of 215,000 people living in the United States and the United Kingdom. The scientists reasoned that, if a harmful genetic mutation shortens life, it will dwindle in older age groups. They found just two harmful mutations — an APOE gene variant linked to Alzheimer's, and a CHRNA3 gene mutation linked to heavy smoking in men — that were far less prevalent in the database's older members. To the researchers, their discovery of only two problematic mutations that fade away in older populations indicates that natural selection has already culled many other troublesome variants from the human gene pool.
Evolution of other organisms plays a role in human healthcare. Since humans learned about antibiotics, we've used them to fight infection, but we haven't always enjoyed the results we wanted. Staphylococcus (staph infection) evolved a resistance to penicillin in 1946, to methicillin in 1961, to vancomycin in 1986, and to Zyvox in 1999. Growing resistance of hospital "superbugs" to antibiotics spread across six European Union nations between October 2010 and March 2011. Meanwhile, HIV evolves so quickly that multiple strains can thrive in a single patient. During the 2013-2016 Ebola outbreak in West Africa, the disease claimed more than 11,000 lives. Besides human factors such as widespread travel and poor public health management, the outbreak might have owed some of its lethality to a mutation that scientists named GP-A82V. GP-A82V made the virus much more effective at infecting human cells. At the same time, virus strains with that mutation were less effective at infecting bats, the virus's typical hosts. The strain that evolved early in the 2013 epidemic was not only more likely to spread in the human population, but also much more likely to kill its victims. Scientists studying the mutation cautioned that another outbreak was probable.
In May 2018, Science Magazine devoted a special issue to the evolution of resistance among the germs, bugs, weeds and molds that we humans have long tried to eliminate. The introductory article lamented:
Almost as soon as antibiotics were discovered to be valuable in medicine, resistance emerged among bacteria. Whenever mutating or recombining organisms are faced with extirpation, those individuals with variations that avert death will survive and reproduce to take over the population. . . . Today, we find ourselves at the nexus of an alarming acceleration of resistance to antibiotics, insecticides, and herbicides. . . . Evolution will always circumvent head-on attack by new biocides, and we may not be able to invent all the new products that we need. We must therefore harness evolutionary approaches to find smarter ways to minimize the erosion of chemical susceptibility.
In the special issue, one paper chronicled weeds' and insects' growing resistance to herbicides and pesticides. Another paper tracked the 30-year spread of methicillin-resistant Staphylococcus aureus (MRSA), and suggested using whole-genome sequencing to better anticipate the evolution of antimicrobial resistance that we can't easily prevent.
On these fronts, at least, evolution appears to be working against us.
Despite his own extensive fossil collecting, when Darwin wrote The Origin of Species, he lamented the incompleteness of the fossil record. His friend T.H. Huxley disagreed, stating, "The primary and direct evidence in favor of evolution can be furnished only by paleontology. . . . if evolution has taken place, there will its mark be left; if it has not taken place, there will lie its refutation." In a way, both men were right.
To become a fossil, you have to be buried soon after you die — before weather or scavengers destroy your remains. The dirt and mud burying you then have to harden sufficiently to protect what's left of you over thousands or more likely millions of years. At some point, your remains have to find their way to the ground surface and erode out. And there's no guarantee that when your fossilized self sees the sun again that some human — let alone a human who knows or cares about fossils — will happen along at the right moment to find you. In other words, the odds of becoming a fossil are slim. The odds of being found as a fossil are slimmer still.
The fossil record is littered with exceptions, but in general, bones, teeth and exoskeletons are more likely to fossilize than soft tissue. (Your mineral-rich teeth are practically fossils-in-waiting the moment they erupt from your gums.) Trilobite shells, even the ones molted off, preserve reasonably well, as do the animals' calcite-crystal eyes. The vast majority of trilobites never left behind fossils, but they fossilize better than many other animals. Historically, paleontologists have been more likely to notice big fossils than little ones, though this is changing, and big plants or animals may have a better shot at being preserved in the first place.
Putting all of this together, you can reasonably guess that soft, fluttery butterflies have a pretty poor fossil record (they do) but the occasional butterfly left behind a beautiful fossil. And some Lagerstätten (sites with extraordinary preservation) can preserve not only big, hard parts, but also soft tissue. Certain kinds of mineralization — involving phosphate, pyrite or clay minerals — can do an excellent job of preserving soft tissues normally prone to decay. Once in a great while, the fossil record can even preserve colors. But these situations are rare.
The fossil record can be skewed by more than what fossilizes and what doesn't. What paleontologists notice and collect matters, too. Some paleontologists don't collect every rib they find, for instance, so fewer fossil ribs land in museum and university collections. Some fossil collecting involves pouring loose sediment from a fossil locality through a series of screens to trap the bones, and the fineness of the screens used can determine what size of small fossil gets collected versus left in a sediment heap in the field.
Where a fossil is found, whether in a rich or poor country, also affects its likelihood of getting noticed and collected. A 2022 study in Nature Ecology and Evolution discussed the lingering influence of European colonialism. The authors reported that researchers in relatively affluent countries have "a monopoly on palaeontological knowledge production," because those researchers contribute 97 percent of the fossil data. Besides noting instances of "parachute science," the paper pointed out that some countries don't even invite parachutes because they lack the infrastructure to support paleontology.
So the fossil record is spotty at best, but even with so many pieces missing, it tells us quite a bit about the history of life on Earth, and more discoveries happen all the time.
One of the most important ways in which the rock record supports evolutionary theory is the succession of fossils in older versus newer rock layers. As far back as the 18th century, scholars realized that fossils in older layers differed more from modern life forms than fossils in newer layers. While many fossils from the Pleistocene Ice Age resemble organisms living today, far fewer fossils from the Age of Reptiles do. If you venture back in the rock record to the Precambrian (prior to roughly 550 million years ago), you'll find few fossils of multicellular organisms at all, though you will find some. So striking has this fossil succession been that when asked what would disprove evolution, 20th-century British scientist J.B.S. Haldane quipped, "Fossil rabbits in the Precambrian." No such bunnies have ever been found.
Fossils that have been found in great abundance date back to the Cambrian Period, which began about 545 million years ago. Trilobites first show up in the fossil record over 500 million years ago, and fossil lovers have collected trilobites for centuries. Naturalist Edward Lhwyd published a description of a trilobite, calling it a "flatfish," in 1698. In the Victorian era, a well-polished trilobite was mounted in a gold pin known as the Dudley Locust Brooch, now on display at the Natural History Museum, London. Trilobite fossils vary so much in size that some are best measured in millimeters while others are better measured in feet. Some sport spines, others have eyes on the ends of stalks. Yet these ancient water bugs remain recognizable as trilobites thanks to their shared body plan: horizontal segments spanning three lengthwise sections. No matter how big and fancy some of them got, they shared the same ancestral shape.
Graced with hardy exoskeletons, trilobites preserve some of the oldest evidence of evolutionary innovations common today: heads, mouths, gills, legs and above all eyes. Described in 2017, a three-dimensional trilobite eye fossil preserved enough internal detail to show a work-in-progress compound eye belonging to Schmidtiellus reetae. Though it had the same general structure as modern bee and dragonfly eyes, the ancient peeper lacked evidence of the tightly packed lenses needed for image formation. Instead, the half-a-billion-year-old arthropod probably used its eyes for little more than movement or obstacle detection. This species was collected from near the base of the Cambrian Period. A slightly younger (geologically speaking) trilobite specimen showed eye structure closer to that of modern dragonflies.
Thousands of trilobite species evolved during the Paleozoic Era, the last of them dying out around 250 million years ago. Paleontologists have traced some of their more exuberant decorations to the evolution of more rapacious trilobite predators. This process probably started in the Ordovician Period, which began around 485 million years ago. Some trilobites evolved spikes to puncture would-be diners while others burrowed under the sea floor and left just the tips of their stalk eyes at the surface. Recent research has deduced the evolution of another defensive feature. Trilobites could roll up like a modern-day pill bug or armadillo. A 2013 study found that this useful posture evolved early on in trilobite history, though enrollment tricks got more sophisticated over time. Early trilobites had to maintain their rolled-up posture through muscular effort; later trilobites came outfitted with body parts that fit together like locks, frustrating bigger animals trying to pry the bugs open. Other researchers have found similarities between the features of ancient trilobite eyes and the peepers of modern horseshoe crabs — "a 'living fossil,' which probably retained this ancient basal system successfully until today."
Cephalopods — whose collective Greek name roughly means "your feet are on your head" — include the clever, nimble octopus and the elegantly shelled nautilus. In between them are squid and cuttlefish. Unlike nautiloids (and their extinct relatives, the ammonoids), squid and cuttlefish lack an external shell. But unlike octopuses, they do have some version of an internal shell. In modern squid, the internal shell is a long, skinny, streamlined feature known as a pen or gladius. Dating from the late Triassic through the early Jurassic, Phragmoteuthis conocauda is an early squid relative. A beautifully preserved specimen in Munich's Paleontological Museum has the overall squid shape, but it has a much more substantial shell. It also lacks any indication of the specialized tentacles common in squid today.
The fossil record documents some changes in squid body plans over time, but squid are also a good example of the record's limitations. Potential of hydrogen, better known as pH, is a measurement of how acidic or basic a substance is. Pure, pristine water is neutral. Lemon juice, vinegar and sulfuric acid are, not surprisingly, acids. Baking soda, bleach and drain cleaner are bases. For organisms to fossilize, the immediate surroundings need to be slightly acidic. To bolster their buoyancy, squid long ago evolved the use of ammonia. Basic ammonia wiped most potential squid fossils right out of the rock record.
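For readers who want to see the arithmetic, the pH scale is just the negative base-10 logarithm of the hydrogen-ion concentration, so each whole number on the scale marks a tenfold change. Here is a minimal sketch; the concentrations below are rough, illustrative values, not measurements from any study cited here.

```python
import math

def ph(hydrogen_ion_molarity):
    """pH is the negative base-10 log of the hydrogen-ion concentration (mol/L)."""
    return -math.log10(hydrogen_ion_molarity)

def classify(value):
    """Below 7 is acidic, above 7 is basic, exactly 7 is neutral."""
    return "acidic" if value < 7 else "basic" if value > 7 else "neutral"

# Rough, illustrative hydrogen-ion concentrations in mol/L.
samples = {
    "pure water": 1e-7,          # pH 7, neutral
    "lemon juice": 1e-2,         # about pH 2, acidic
    "household ammonia": 1e-11,  # about pH 11, basic
}

for name, concentration in samples.items():
    value = ph(concentration)
    print(f"{name}: pH {value:.0f} ({classify(value)})")
```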
Despite its limitations, the fossil record tells us a great deal. One argument raised about the fossil record is that it has no transitional forms. Yet two of the earliest, most famous fossils ever found were stunning examples of transitional fossils. In 1860, a limestone quarry in Solnhofen, Germany, yielded a fossil feather. A year later, the same quarry gave up a better prize: a partial skeleton of Archaeopteryx lithographica. Sir Richard Owen, in all fairness a brilliant comparative anatomist, described the fossil as "unequivocally a bird" and suggested that, if the whole creature were found, it would look like a modern bird. It certainly wouldn't have any teeth. In 1877, a complete specimen was found at Solnhofen, and this fossil sported a toothy grin that refuted Owen's prediction: it was a fossil at once lizard-like and bird-like.
Although itself a transitional fossil, Archaeopteryx followed millions of years of dinosaur miniaturization. A study published in 2014 examined 120 taxa and 1,549 skeletal characteristics of dinosaur fossils from the Jurassic and Cretaceous. The authors found that the ancient reptiles underwent about 50 million years of size reduction before Archaeopteryx. The authors hypothesized that shrinking size might have been driven by bird ancestors taking up residence in trees, perhaps to avoid predators on the ground. Life in trees favors small size and low weight, and the process may have prompted other traits: big eyes and big brains for negotiating in a three-dimensional environment (as opposed to a two-dimensional environment on the ground), and feathers for insulation.
Straddling the boundary between birds and dinosaurs, Oculudentavis khaungraae took miniaturization to an entirely new level. Preserved in Burmese amber and described in 2020, this 99-million-year-old fossil skull had big eyes (for its size) and a toothy beak characteristic of early birds. But it was by far the smallest early bird (or dinosaur) yet described. Based on the skull, paleontologists estimated that the bird weighed about 2 grams — as tiny as a bee hummingbird, the littlest living bird species today.
Archaeopteryx wasn't the only transitional avian fossil species found in the 19th century. In 1880, Othniel C. Marsh published Odontornithes, describing transitional bird fossils collected in and around Kansas, including Ichthyornis and the flightless Hesperornis. Both bird species had teeth and long, bony tails. Analysis of Ichthyornis dispar skulls, published in 2018, found that the braincase looked relatively modern, resembling that of extant birds, but the temporal region still retained dinosaurian characteristics.
Paleontologists long surmised that dinosaurs evolved into birds through millions of years of incremental changes and refinements, but in a 2017 Science perspective, paleontologist Stephen Brusatte recounted a surprisingly chaotic, messy evolution, with powered flight evolving multiple times through different adaptations. Still, the evolutionary link between dinosaurs and birds has revealed itself to paleontologists multiple times. In late 2021, Brusatte and colleagues described an oviraptorid theropod egg, complete with an exceptionally well-preserved embryo, displaying a prehatch posture remarkably like that of modern birds.
A 2022 study in Nature found a kind of fossil record preserved in modern organisms. Bhart-Anjan Bhullar, Christopher Griffin and colleagues combined CT scans and microscopy to create three-dimensional images of bird embryos in development.
Griffin, Bhullar and their coauthors examined embryos of Japanese quail, chickens, parakeets, Chilean tinamou and alligators. They found that, before they hatch, birds experience a dinosaurian stage, or at least their hips do. More evidence that baby birds are teeny, tiny dinosaurs.
It's important, though, to remember that this weird assortment of feathered dinosaurs wasn't simply a set of transitional forms en route to modern birds. As Julia Clarke of the University of Texas points out, feathered dinosaurs lasted tens of millions of years — far longer than humans.
Overlapping in time with avian dinosaurs, ichthyosaurs negotiated the three-dimensional space of the world ocean. These animals evolved, changed and died out millions of years ago, and fossil finds show how they changed between the Triassic and Jurassic Periods, as Donald Prothero writes:
All these intermediate forms gradually acquired the standard body plan of the fully advanced Jurassic ichthyosaur, such as Ophthalmosaurus: long toothy snout, small skull with huge eyes protected by sclerotic rings, completely streamlined body with a dorsal fin, large front flippers with extra finger bones, small hind flippers with extra bones as well, and the sharp downward kink of the tail vertebrae that indicates the fully symmetrical upper and lower lobes of the tail. This was the kind of creature that Mary Anning first brought to light in 1811, and now it can be traced back to reptiles that barely look like ichthyosaurs at all.
Although ichthyosaurs changed body shape over a longer period of time, at least some of the animals evolved to gigantic sizes quickly. A 2021 study led by P. Martin Sander described Cymbospondylus youngorum from early Middle Triassic strata. The authors proposed that, in contrast to whales, which took roughly 90 percent of their evolutionary history to acquire huge size, ichthyosaurs evolved into giants in the first 1 percent of theirs.
Creationists have long argued that you can't have just "half" of a turtle, and yet the fossil record does show a turtle with just half a shell. Described in 2008, Odontochelys semitestacea lived about 220 million years ago, and had a shell over its abdomen but no shell over its back. Ten million years later, Proganochelys quenstedti had a full shell, but also a spiky neck that couldn't retract under it.
Ichthyostega, Acanthostega, Tiktaalik and Elpistostege provide more examples of transitional fossils left by fish slowly transmogrifying into landlubbing tetrapods. Transitional forms also document the mammalian return to water. Paleontologist Phil Gingerich was overjoyed to find a whale ancestor, Rodhocetus, in Pakistan. Years earlier, he had been disappointed to find a fairly pedestrian fossil whale ancestor, Basilosaurus, in the same region. Yet he perked up considerably when he found the creature's tiny legs. Why hadn't anybody found its legs before? Because nobody had looked. Why had nobody looked? "Because whales don't have legs!" Years later, Gingerich found fossils of the whale species Maiacetus inuus, including a pregnant mother. The whales' teeth were well-suited to a diet of fish, but the fetus was positioned head-down, characteristic of life on land. Together, the fossils indicated another transitional form — a whale that lived most of its life in the sea, but had to return to land to give birth. Several million years more recent than Maiacetus, Peregocetus pacificus was unearthed in Peru. Sort of like a supersized otter, this 10-foot-long animal had hooves on the ends of its fingers and toes, and could move around on land far better than, say, a modern seal. But it could also stay in the water long enough to cross the Atlantic Ocean, which would have been a much smaller distance to travel over 40 million years ago. Whale fossils have shown us other examples of change. Today's sperm whales lack upper teeth and rely on suction to partake of their prey, but in 2010, paleontologists described the skull of a gigantic, 12 to 13-million-year-old sperm whale whose lower and upper jaws bore teeth resembling steak knives.
The fossil record has revealed plenty about our own mammalian and protomammalian ancestors. Unlike the largely uniform vertebrae of reptiles and amphibians, mammalian vertebrae vary with their position in the spine. In other words, neck vertebrae aren't like rib-bearing or hip-bearing vertebrae. Synapsids lived between 200 million and 300 million years ago, and include precursors of mammals. A 2018 study examining synapsid fossils and modern DNA found that the mammalian spine evolved in stepwise fashion, with different types of vertebrae evolving in tandem with other body parts. The relatively recent lumbar portion of the spine is associated with most human back pain.
In the Middle Ages and Renaissance everybody just knew you could leave some wheat and old rags in a warm spot in your barn and make mice in a single day. Over time, naturalists restricted this assumption to only lowly creatures like insects, and later microbes. In the 19th century, Louis Pasteur established that even microscopic life couldn't spontaneously arise from decaying matter (a process known as heterogenesis).
Don't Pasteur's findings make abiogenesis — a term coined by Huxley to mean life arising from non-life (what must have happened on early Earth) — impossible? Well, no. Pasteur only established what has become common knowledge about life as it exists on Earth today. On a young Earth, the environment was very different. For starters, the atmosphere lacked oxygen — a gas that would have poisoned most of the earliest life forms.
Understanding life's earliest history on Earth requires understanding what life is, and the definition turns out to be tricky. In general, life reproduces itself and passes along traits, life maintains itself by taking in and transforming energy and matter, and life has some physical boundary that separates it from the outside world. But the border between life and non-life is surprisingly fuzzy. Viruses have genetic code and some other trappings of living cells, but must rely on their hosts to reproduce. Prions (the culprits spreading mad cow disease) are just proteins folded in a weird way. Even crystals can reproduce via cloning. None of these things fit the definition of life, but they skulk along the margins. They are not unusual. They have been around a long time, including billions of years ago, when our planet was young.
Some scientists suspect that the organic molecules that acted as life's building blocks may have synthesized in Earth's early atmosphere and rained down onto the planet, or hitched rides on extraterrestrial bodies. In 1953, Stanley Miller and Harold Urey combined gases they believed common in the early atmosphere (water, methane, ammonia and hydrogen) and added electrical charges. Their experiments produced amino acids — not life, but impressive. A valid criticism of these experiments was that they didn't reflect a more modern understanding of Earth's early atmosphere. We now understand that volcanic discharges (including carbon dioxide, nitrogen, hydrogen sulfide, sulfur dioxide and water) probably played a major role. But later experiments with more realistic gas configurations still produced results similar to Miller's, and recent studies suggest that Earth's early atmosphere was probably quite conducive to producing pre-biotic compounds such as amino acids.
Charles Darwin speculated that life might have arisen in a lukewarm pool. Roughly 150 years later, evolutionary biologists suggested a completely different environment: hydrothermal vents on the sea floor — a harsh nursery for early life.
When the Earth was forming, harsh conditions were probably the norm on our planet, although when those conditions eased is a matter of ongoing study and debate. About 4 billion years ago, the solar system was young and conditions were violent, with giant chunks of rock careening into planets regularly. Our own moon may have been formed by one such impact. In 2014, a team of Czech scientists conducted an experiment to replicate the pressure-cooker conditions created by an asteroid slamming into the Earth. The researchers used a high-powered laser to blast a solution of formamide — a chemical that forms from the reaction of hydrogen cyanide and water, and was likely abundant in the early days of our planet. The experiment produced adenine, guanine, cytosine and uracil: the nucleobases of RNA. The experiment suggested that, rather than impeding the development of life, the violent impacts of our early solar system might have contributed to its formation.
Even with organic molecules handy, making cells isn't easy. Although simple compared to us, bacteria are complex. They can generate energy, eliminate waste, wriggle around and reproduce. The evolution of single-celled organisms required intermediary steps, and three scenarios enjoyed serious consideration at the start of the 21st century. One scenario is that metabolism came first; simple molecules powered by chemical energy from minerals preceded genetic material. Another scenario is the RNA World; perhaps given a helping hand by clays or certain types of carbon-based compounds, genetic molecules began self-replicating. This hypothesis has been bolstered by findings that RNA can store genetic information, copy itself, and even carry out metabolic functions. The third scenario is some kind of collaboration between metabolism and genetics.
Building on the RNA World hypothesis, researchers led by John Sutherland of the University of Cambridge reported in 2009 that reactions of relatively simple compounds (acetylene and formaldehyde) could produce two of the four nucleotides needed to build RNA. Six years later, Sutherland's team found that RNA could be created by a sequence of reactions of even simpler precursors, hydrogen cyanide and hydrogen sulfide, driven by ultraviolet light. Even better, the same sequence of reactions could make the precursors of amino acids and lipids. The research, Sutherland's team argued, helped resolve a conundrum that long vexed biologists: The genetic materials needed to make proteins also depended on those proteins, and everything needed lipids. This research indicated that all three components could have been created in a relatively short time. Sutherland's team argued that our planet's nascent environment would have provided the hydrogen cyanide, hydrogen sulfide and ultraviolet light necessary to start the process, but cautioned that the reactions would require different catalysts and probably wouldn't happen in the same place. They might, however, have been washed into a warm Darwinian pool by rainwater.
By late 2020, research by Sutherland and others pointed more toward a shallow Darwinian pool than the deep ocean, with key requirements including ultraviolet light and alternating wet and dry conditions. In a shallow, stream-fed water body on land, even in rainwater evaporating on rock, organic molecules might organize themselves into chains when conditions dried. When the water returned, those molecules able to withstand it (water can be hard on unprotected organic molecules) survived. But the debates about how life arose were far from resolved.
However they arose, replicating molecules likely took eventual refuge inside membranes. Just as your skin protects your insides from the outside world, a membrane would provide advantages over braving the cold, hard world completely naked. Besides growing a protective layer, single-celled organisms may have also gobbled and enslaved other single-celled organisms to do their grunt work. Both the gobblers and the gobbled benefited. These gobbled-up parts would become organelles — and would enjoy some protection from the outside environment inside the larger cells. Just as organs in your body — stomach, liver, lungs — do certain jobs, organelles would carry out specialized tasks, too. Some types of organelles today have double membranes, apparent remnants of the swallowed cell's original membrane plus the membrane of the bigger cell that ate it. And all the while, everybody probably swapped genetic material like MP3 files.
Biologists have divided life on Earth into three domains: archaea, bacteria, and eukaryotes (this last category includes us, as well as all other animals and plants). Lynn Margulis found that mitochondria — eukaryotic cells' power generators — sport genes more like those of bacteria than eukaryotes. Meanwhile, Carl Woese and his colleagues found that archaea (single-celled organisms that sometimes occur in extreme environments) share more genetic material with eukaryotes than bacteria do. A study published in 2015 identified Lokiarchaeota, a type of archaea excavated from beneath the Atlantic Ocean that shares more genes with eukaryotes than any other known archaea. Lokiarchaeota, the researchers found, has the same genes that build compartments inside eukaryote cells, and construct and destroy eukaryote "skeletons." The newly discovered type of archaea also has genes indicating that it could gobble up microbes and perhaps even turn some of those microbes into organelles. A study published in late 2022 offered another potential ancestor: tentacled microbes known as Asgards (named for Norse gods). Though categorized as archaea, the microbes carried genes previously thought exclusive to eukaryotes, and electron microscopy found they had complex internal structures.
In simplistic terms, the archaean parts of our cells primarily process information while the bacterial parts mostly handle housekeeping, though the division of labor isn't strict. The common ancestor of today's life may not have been a single organism at all, but the whole gene-swapping community of microbes on the young Earth.
How could we wind up with genetic material from different domains? It could be that, when life was young, RNA and DNA weren't yet very good at copying themselves accurately, and mutations ran rampant. Genetic replication eventually improved. Likewise, as cell membranes became more sophisticated, they would have gotten better at keeping out intruders (including genes). So the lateral gene transfer that was once commonplace, and still occurs in bacteria today, has probably become much rarer in multicellular organisms. In short, not only has life evolved, evolution has evolved.
Even after decades of research, however, most of the puzzle pieces of the earliest life on Earth still elude us. We may never find them. Billions of years of geologic processes and the planet-transforming power of life itself have probably wiped away many clues. But although the mysteries of Earth's early life forms are big, research continues.
Single-celled organisms still thrive all over our planet today, but well over half a billion years ago, some of them began organizing into multicellular colonies, the possible forerunners of multicellular life forms.
Photosynthesizing bacteria introduced oxygen into Earth's atmosphere over 3 billion years ago. Initially lapped up by iron in the world's ocean (shown in the rock record as banded iron formations), the oxygen eventually accumulated in the atmosphere sometime around 2.5 billion years ago.
This oxygen infusion into the Earth's atmosphere could have contributed to the onset of ice ages. In the 1960s, Brian Harland and Martin Rudwick hypothesized that the Earth experienced a tremendous ice age during the Neoproterozoic, a geological time period starting around a billion years ago. Renewed claims for a Neoproterozoic Snowball Earth hypothesis came from Paul Hoffman, Daniel Schrag and coauthors starting in the late 1990s. They argued that global snowball conditions would have alternated with hothouse conditions.
Although scientists differ on the extent of the ice, there is fairly widespread acceptance of the presence of glaciers at sea level at the equator, though some researchers prefer the term Slushball Earth. Many researchers have hypothesized the occurrence of two snowball events: the Sturtian (lasting from about 710 million to 670 million years ago) and the Marinoan (concluding about 635 million years ago).
Such extreme conditions would pose a challenge for life forms on our planet, but survival wouldn't be impossible. Sponges, for instance, might have survived. And studies of rusty rocks in ancient glacial deposits in Australia, Africa and North America indicate that glacial runoff along coastlines could have provided ample oxygen for some life forms to persist.
Roughly bridging the gap between unicellular and multicellular life, sponges (phylum Porifera) are actually aggregations of cells that can reassemble after passage through a sieve. They pump water to feed, but they lack traits such as muscles, nerves or mobility. A 2017 study suggested that all of today's animal life likely arose from a sponge ancestor. Both fossils and molecular clock data suggest that sponges predated and survived the Marinoan glaciation. And proponents of the Neoproterozoic Snowball Earth hypothesis argue that the extreme conditions — or the recovery from them — may have kick-started the evolution of more complex multicellular life forms.
Besides potentially triggering ice ages, the buildup of oxygen in the Earth's atmosphere did something else. It provided fuel for metazoans (also known as animals).
Biologists have pondered for decades how and when the first true animals emerged, and one way to figure out Earth's earliest animals is to look for genetic commonalities in modern animal life. Building upon earlier studies that showed older genes from eukaryotic ancestors took on new functions, a 2018 study found that animal life emerged thanks to plenty of novel DNA. Jordi Paps and Peter Holland examined modern organisms across the animal kingdom, and identified 25 groups of genes common to all modern animals but not found in other organisms. They used these gene groups to infer the genome of the "Urmetazoon" that gave rise to animal life. When did this ancestor common to all animal life evolve? That's tough to say, but almost certainly before 650 million years ago.
Earth's fossil record gives us wonderful glimpses into the past, but it doesn't give us a perfect picture, and there may be no better illustration of this than the earliest metazoa. Molecular clocks indicate that some of the earliest metazoa to evolve might have been cnidarians such as jellyfish or sea anemones. But these animals are usually soft, and unlikely to fossilize well. An exception was named in 2022. Found in Charnwood Forest in the English Midlands, Auroralumina attenboroughii was described as a possible medusozoan captured in its rooted, polyp stage. The fossil was dated at roughly 560 million years old.
Among the first multicellular organisms to leave clear fossils widely accepted as evidence of ancient life are those from the Ediacara Hills in South Australia. The fossils are known as Vendobionts or Ediacarans. Despite their great age, Ediacaran organisms sometimes had a better shot at preservation than later life forms, thanks to a dearth of mouth-bearing organisms to scavenge their carcasses. Based on these fossils, the International Union of Geological Sciences added the Ediacaran Period, spanning roughly 635 million to 542 million years ago, to the geologic timescale. But with rare exceptions such as cnidarian remains, paleontologists have struggled to relate many Ediacaran fossils to life forms that came later.
Biologists have categorized all life (from most general to most specific) into domain, kingdom, phylum, class, order, family, genus and species. Scientists are confident that the Ediacaran biota all fall into the domain of eukaryotes, but below that, classifying Ediacaran biota is challenging, even at the kingdom level. One fossil species, Kimberella, has been interpreted as an early mollusk. Dickinsonia, a plate-shaped fossil with a profusion of radial grooves, was probably only a few millimeters thick, but could spread out to 1.5 meters across. Fat molecules retrieved from Dickinsonia remnants have revealed cholesterol — found in cell membranes of modern animals. Still, many researchers maintain that much of the Ediacaran biota was vastly different from anything living today. Rangeomorphs, for example, look fernlike, but lived in water. These fractal, branching life forms maximized their surface area, perhaps to catch as many nutrients as possible from passing currents.
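To make those ranks concrete, here is the full ladder for a species whose placement is not in dispute, our own. The sketch is a simple illustration, not drawn from the studies discussed here.

```python
# The eight standard ranks, from most general to most specific,
# illustrated with Homo sapiens.
human_classification = [
    ("domain",  "Eukarya"),
    ("kingdom", "Animalia"),
    ("phylum",  "Chordata"),
    ("class",   "Mammalia"),
    ("order",   "Primates"),
    ("family",  "Hominidae"),
    ("genus",   "Homo"),
    ("species", "Homo sapiens"),
]

for rank, name in human_classification:
    print(f"{rank:>8}: {name}")
```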
As a whole, Ediacarans have been variously identified as mollusks, cnidarians, sponges, protists, fungi and life forms unlike anything alive today. A study published in 2013 argued that Ediacaran fossils were really land-based lichens — an interpretation that gained little support, either from other researchers or from follow-up studies. A paper published the next year bore the title, "There is no such thing as the 'Ediacara Biota.'" Researchers kept right on using the term, but there is probably no one-size-fits-all classification for the organisms that left fossils for humans to find some 600 million years later.
Despite the murkiness of how best to classify them, it's fairly clear that many Ediacarans left no descendants. Ediacaran fossils have been collected from dozens of fossil sites around the world, but with few exceptions, those iconic Ediacaran fossils don't turn up in Cambrian sediments, even in localities with exceptional preservation.
The Cambrian Period turned out quite unlike the Ediacaran. To give an idea of the contrast between these two periods, consider the history of the period designations. The Ediacaran Period was established in 2004, the first new geologic period designated in 120 years. The fossils that lent the period its name were first discovered in the 1940s. In contrast, the Cambrian Period was proposed by Adam Sedgwick in the early 19th century. Sedgwick squabbled for years with Roderick Impey Murchison, who named the Silurian, over whose geologic period name ought to prevail. Meanwhile, fossil collectors kept digging up trilobites (three-lobed marine arthropods whose closest modern living relatives are probably horseshoe crabs), which first appeared in the fossil record in the Cambrian. Nineteenth-century Bohemian paleontologist Joachim Barrande assembled enough trilobite fossils to piece together the trilobite life cycle. In short, paleontologists and gentlemen geologists identified fossils associated with the Cambrian about a century before the identification of Ediacaran fossils. Even now, the entirety of the Earth's history before the Cambrian is often referred to simply as the Precambrian.
Fossils from the Cambrian Period are among the first, though not the very first, to show evidence of animal traits that persist today: heads, eyes, gills and legs. And mouths. The spread of mouth-bearing organisms forever changed the makeup of marine life. Before the Cambrian Period, cyanobacteria (water-dwelling bacteria that use photosynthesis to make their own food) were so abundant and unmolested that they formed layer upon layer of slimy, bacteria-sized high rises. Animals with mouths found them very appealing. Stromatolites were widespread before the Cambrian Period. They have been rare ever since.
Although the Cambrian ushered in an unmistakable era of animal diversification, hints of animal traits can be found in some fossils preceding the Cambrian. In the history of life on Earth, bilateral symmetry — left and right sides of the body that mirror each other — is kind of a big deal because it gives rise to the most complex features of animal life, such as heads. What could be the oldest bilaterian fossils so far found date from the late Ediacaran Period. Described in 2020, the 555-million-year-old Ikaria wariootia was a wormlike creature ranging in size from 2 to 7 millimeters. The fossils have big ends and little ends, which is all paleontologists have to go on in telling the ancient animals' heads from their butts. Estimated at 540 million years old, Saccorhytus coronarius is about a millimeter long, and bears (for the animal's tiny size) a big mouth.
Once the Cambrian arrived, mouths made some animals grazers that ate defenseless cyanobacteria. Mouths made other Cambrian animals hunters, and that made yet other Cambrian animals prey. Animals that found themselves on other animals' menus evolved varying strategies to survive, such as body armor and burrowing.
Cambrian fossils also count among the earliest to sport hard shells. Because hard parts fossilize more easily than soft flesh, there's room for debate about whether the Cambrian Period's most conspicuous innovation, as far as modern-day fossil hunters are concerned, is the evolution of body parts that fossilize well.
Despite the abundance of trilobite fossils, geologists couldn't guess at the true diversity of Cambrian life for decades after the Cambrian Period was first named. One of the richest sites for the period is the Burgess Shale in the Rocky Mountains of Canada's British Columbia. Other geologists had excavated there earlier, but it was Smithsonian boss Charles Doolittle Walcott who made the site famous in the early 20th century. Episodes of extremely rapid fossil burial about a half a billion years ago preserved soft tissues, giving paleontologists unprecedented amounts of information about animal life from the Cambrian. Thousands of fossils have been excavated from the Walcott Quarry since 1909. Fossil sites with similar preservation and diversity have been found nearby (the Marble Canyon site about 40 kilometers away) and as far away as China. So many new body plans showed up in the fossil record that the period gained the nickname "Cambrian explosion." The term is somewhat controversial, and many paleontologists favor "diversification" or "radiation."
Lagerstätten (rich fossil sites) are impressive by themselves, but there's more to the Cambrian Period. Unlike the Ediacara biota, Cambrian life forms show much clearer affinities to modern life. Make no mistake; Burgess Shale fossils are weird. But they're also eerily similar to modern organisms.
The oldest trilobite fossils so far found date back 521 million years. Trilobite numbers were significantly reduced at the end of the Devonian Period some 360 million years ago, but some species survived throughout the Paleozoic (Ancient Life) Era, not going completely extinct until about 250 million years ago. Olenoides is the best-known trilobite from the Burgess Shale.
Like trilobites, brachiopods got their start in the Cambrian Period. Brachiopods (which bear a superficial resemblance to clams but are very different animals) were decimated hundreds of millions of years ago. As a group, though, they tiptoed past complete annihilation, and some remnants still live today. They are some of Earth's longest survivors. Brachiopod fossils, including Acanthotretella, Acrothyra and Diraphora have been collected from the Burgess Shale.
Paleontologists have found many examples of Cambrian species that are related in some way to modern species, but perhaps a clearer way to assess the importance of the Cambrian radiation is to look at phyla.
Most animal life falls into just eight phyla, or body plans: sponges, cnidarians, flatworms, annelids, arthropods, mollusks, echinoderms and chordates. The last category includes all vertebrates, from fish to people. It also includes vertebrate relatives such as tunicates, with free-swimming juveniles and unmoving adults. Other animal phyla identified by biologists arguably apply to obscure animals, well understood only by the specialists who study them. Jean-Bernard Caron of the Royal Ontario Museum explains that, depending on which biologist you consult, the animal kingdom consists of roughly 35 phyla, or major body plans. At least two-thirds of those phyla existed during the Cambrian, though some of them were in primitive form. Others, such as a fossil tunicate dating to the late Cambrian, look similar to modern forms.
Biologists and paleontologists have made a convincing case that life diversified tremendously during the Cambrian, and that the body plans apparent half a billion years ago showed significant similarities to body plans still thriving today. Figuring out why, and why then, is more challenging. Possible triggers include environmental changes, such as the retreat of the glaciers of Snowball (or Slushball) Earth, or the increased availability of oxygen, though those big changes happened before the Ediacaran Period. The development of predation might have spurred an evolutionary arms race, though that becomes a chicken-and-egg question of timing. Another possibility involves changes in developmental genes. Although evolution is not random, it can be driven by random changes in genes, and changes in Hox genes responsible for regulating overall body plans could have contributed to a rapid diversification of animal phyla over a short span of geologic time. Genetic changes might also have limited the diversification of phyla after that time. Evolution certainly continues today, but it doesn't appear to operate at the phylum level in the animal kingdom.
A few billion years after life first evolved on Earth, a half a billion years after the first animals evolved, one species on our planet grew intelligent enough to wonder about the past. No part of the story of evolution has aroused stronger emotions than the human story.
Naturalists began uncovering fossil humans before Darwin published On the Origin of Species. The first fossil human that naturalists found was perhaps the Red Lady of Paviland, identified by William Buckland in the 1820s. "Red" reflected the fact that the body was covered with red ochre, but Buckland misfired a bit; the skeleton belonged to a man. That fossil, though ancient, wasn't really a different kind of human. But in 1856, quarry workers found truly puzzling human remains in the Neander Valley, and Johann Fuhlrott and Hermann Schaaffhausen realized these remains belonged to someone quite different from modern Europeans.
Writing about human evolution, the 19th-century naturalist Ernst Haeckel confidently asserted that our evolution consisted of precisely 22 phases, and the 21st was a yet-to-be-identified "missing link." If Haeckel predicted the missing link, Eugène Dubois arguably found it, and he found it far from his European homeland. In the 1890s, in Indonesia, he oversaw the discovery of Pithecanthropus erectus, now known as Homo erectus. Returning home to Europe, Dubois expected a more enthusiastic reception than he got. His personality was likely part of his problem, but perhaps European scientists just weren't ready for a non-European ancestor. Decades later, Raymond Dart suffered similar disillusionment when his description of the Taung Child (Australopithecus africanus) from South Africa aroused mostly scorn. Dart's find conflicted with what some early 20th-century anthropologists thought they knew about Piltdown Man, not debunked until 1953.
Despite Charles Darwin's 1871 prediction that Africa would prove to be humanity's original homeland, reluctance to accept ancestors from Africa persisted for decades. Resolving to prove Dart correct, Robert Broom found another South African australopithecine, nicknamed Mrs. Ples, in 1947. Mary and Louis Leakey made significant finds in the latter half of the 20th century, and Donald Johanson's 1974 discovery of Lucy captivated the public, at least the part of the public that accepted evolution. As the cache of human fossils has grown, so has the challenge of distinguishing between ancient species.
But discoveries of fossil hominids — now often referred to by the more specific term "hominin" — didn't occur in the same order as the hominins themselves. So let's start at the beginning of the human story, at least the beginning of what we understand today.
The Eocene Epoch began roughly 10 million years after the last dinosaurs went extinct, and lasted from about 55 million to 34 million years ago. The Eocene brought primate-friendly greenhouse conditions, with rainforests covering huge expanses of our planet. Primates thrived not just in Africa and Asia, but also in Europe and North America. Primates lived in Wyoming! The ensuing epoch, the Oligocene (34 million to 23.5 million years ago), experienced cooler, drier conditions. On a planet less welcoming to primates, many species went extinct, though plenty survived in Africa and Asia. And even though the planet was cooler and drier than it had been in the Eocene, it was still generally warmer than it is now. Even in the Miocene Epoch (23.5 million to 5.3 million years ago), rainforests remained abundant, and the number of primate species living then likely dwarfed the number of primate species living today.
The primate fossil record indicates that the anthropoids — a subgroup of primates including monkeys, apes and the distant ancestors of humans — actually originated in Asia. But during the Eocene-Oligocene transition that pared down primate species, anthropoids somehow coped better in Africa, where they later diversified into more species. Toward the end of the Miocene, an apelike species apparently evolved into an upright-walking form that would eventually give rise to people.
Primate fossils aren't the only evidence of divergence between the ancestors of humans and apes. Scientists can also look at the rates of change in nuclear DNA, mitochondrial DNA, and even the DNA of our gut bacteria, which have a symbiotic relationship with their primate hosts and can speciate as we do. A 2016 study examined two types of gut bacteria: Bacteroidaceae and Bifidobacteriaceae, which colonize the digestive tracts of chimps, bonobos, gorillas and humans. Comparing these strains of bacteria in different hosts indicates that we split from gorillas about 15.6 million years ago (this puts the human-gorilla split slightly further back in time than what mitochondrial DNA indicates, but it's in line with nuclear DNA). Gut bacterial differences also indicate that we diverged from chimps around 5.3 million years ago (this is a little more recent than what nuclear DNA differences suggest, but it's in agreement with mitochondrial DNA).
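Whether the raw material is nuclear DNA, mitochondrial DNA or bacterial genomes, molecular-clock dating boils down to the same back-of-the-envelope arithmetic: if substitutions accumulate at a roughly steady rate along both diverging lineages, the observed genetic distance is about twice the rate multiplied by the time since the split. Here is a minimal sketch with made-up numbers; the rate and divergence below are illustrative assumptions, not values from the studies mentioned above.

```python
def divergence_time_myr(differences_per_site, rate_per_site_per_myr):
    """Estimate the time since two lineages split, in millions of years.

    Differences accumulate along both branches, hence the factor of two.
    """
    return differences_per_site / (2 * rate_per_site_per_myr)

# Illustrative values only: 1.2% sequence divergence, and a clock rate of
# 0.001 substitutions per site per million years.
print(divergence_time_myr(0.012, 0.001))  # -> 6.0 (million years)
```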
As of 2022, the three leading contenders for the earliest hominin title included Orrorin tugenensis, Sahelanthropus tchadensis and Ardipithecus kadabba. All three fossils have been dated at somewhere between 5 million and 7 million years old. Orrorin and Ardipithecus turned up in eastern Africa, the same general region as many other hominin fossils. Sahelanthropus was unusual in that it was found in central Africa. (So many hominin fossils have been found in eastern Africa not necessarily because most human evolution happened there, but because parts of eastern Africa preserve a rock record spanning the last several million years. In other words, eastern Africa is simply where that stretch of time happens to be well sampled.)
The respective discovery teams of Orrorin tugenensis, Sahelanthropus tchadensis and Ardipithecus kadabba have advanced competing arguments for why their species deserve the title of earliest known hominin, and naturally they don't totally agree with each other. (Members of team Ardipithecus kadabba suggest these fossils are all members of — you guessed it — Ardipithecus kadabba.) But paleoanthropologists agree on two characteristics of human ancestors: bipedalism (walking on two feet) and reduced canines. Bipedalism is fairly self-explanatory, although it's important to understand that early hominins may have had a very different gait than ours. As of 2022, the evidence for bipedalism in these three species was, like the totality of their fossil remains, fragmentary.
Canines might need a little more explanation.
If a dog has ever snarled at you, you might have noticed its fangs, which can look menacing even in a dainty, purse-sized dog. In most, though not all, species of non-human primates, the males also have big canines. These males often fight with each other for girlfriends by showing their teeth or, worse, using them. Outside of vampire stories, humans have small canines, and one of the early signs of human ancestry is a reduction in canine size. Darwin suggested that human ancestors lost their need for big canines because walking upright freed their hands to carry weapons, and they could clobber each other instead. A more commonly accepted hypothesis today is that those free hands enabled males to bring their sweethearts useful things like food, and the ladies wisely favored the sensible, provisioning males over the hotheads. Either way, smaller canines won the evolutionary battle in hominin teeth.
Led by Tim White, the research team that identified Ardipithecus kadabba as a candidate for earliest hominin also proposed it was the ancestor of a later member of that genus: Ardipithecus ramidus. White and colleagues first identified that hominin in 1994, initially naming it Australopithecus ramidus based on teeth, pieces of skull, and arm bones. A year later, they established the new Ardipithecus genus for the find. In 2009, 15 years after the original species description, White's team produced an in-depth description based on many more fossils, publishing 11 papers in the journal Science.
Dated at about 4.4 million years old, Ardipithecus ramidus, nicknamed Ardi, showed signs of tree climbing — including an opposable toe — and bipedalism — evidenced by hip and leg bones. But Ardi held a surprise for the researchers describing her; the fossil was strangely unlike a chimpanzee. For them, Ardi served as a reminder that, like humans, chimps have continued evolving away from our last common ancestor, too. But this genus has remained controversial, with some paleoanthropologists considering it a plausible ancestor of australopithecines, and others suggesting it's not a hominin but instead a relative of extinct apes.
Ardipithecus was followed by the genus Australopithecus, including Australopithecus anamensis, Australopithecus afarensis and Australopithecus africanus, to name just a few. The small-brained, bipedal Lucy (A. afarensis) received widespread attention in the mid-1970s, and in the late 1970s, a research team led by Mary Leakey uncovered australopithecine footprints preserved in volcanic ash at Laetoli, Tanzania.
Generally considered ancestral to Lucy and other australopithecines, A. anamensis gained a long-awaited face in August 2019 with the description of a 3.8-million-year-old skull nicknamed MRD (based on its collection number). Led by Yohannes Haile-Selassie, the research team concluded that MRD's braincase was primitive, but its forward-projecting cheekbones were more derived. This, and a possible temporal overlap with A. afarensis, led the team to question whether A. anamensis really did give rise to Lucy's species. But not everyone was ready for a rearrangement of the hominin family tree; paleoanthropologist Tim White argued that individual variation could explain MRD's seemingly derived traits.
Whatever their relations to each other, australopithecines were capable climbers who also walked upright. The partial skeleton of a 3.32-million-year-old juvenile from Dikika, Ethiopia supports this interpretation, with features of both bipedalism and pedal grasping.
Many anthropologists point out that what matters as much as our intelligence is our ability to manipulate our surroundings, in other words, our hands and opposable thumbs. Studies published in 2015 suggested that australopithecines had the same forceful precision grip as that employed by humans, and tools found at Lomekwi, Kenya — a core and corresponding, snugly fitting rock flake — could be as much as 3.3 million years old. A study published the following year, though, found that capuchin monkeys of Serra da Capivara National Park in Brazil smashed stones together and produced flakes looking very much like tools attributed to early hominins. The monkeys lacked the precision grip of humans and didn't use any of their sharp-edged "tools." They sometimes licked the flakes, perhaps for minerals, but the researchers couldn't say for sure why the monkeys broke the rocks. The researchers cautioned against interpreting ancient rock flakes as tools without supporting evidence such as cut marks on bones.
With those caveats in mind, a growing number of paleoanthropologists have concluded that australopithecines did make occasional use of stone tools. In a 2023 review article assessing the state of research, Zeresenay Alemseged observed:
For a long time, our knowledge of Australopithecus came from both A. africanus and Australopithecus afarensis, and the members of this genus were portrayed as bipedal creatures that did not use stone tools, with a largely chimpanzee-like cranium, a prognathic face and a brain slightly larger than that of chimpanzees. Subsequent field and laboratory discoveries, however, have altered this portrayal, showing that Australopithecus species were habitual bipeds but also practised arboreality; that they occasionally used stone tools to supplement their diet with animal resources; and that their infants probably depended on adults to a greater extent than what is seen in apes.
Fossils indicate that australopithecines came in two general varieties: gracile and robust. Gracile australopithecines probably led to our own genus, Homo, while the robust australopithecines constituted a side branch of the hominin evolutionary tree. Adapted to chewing extremely coarse, tough plants (or simply to chewing a lot), some robust australopithecine species sported molars as big around as a nickel or even a quarter. They also had crests on the tops of their skulls to anchor big chewing muscles. Figuring out exactly what these hominins actually ate can be tricky because their skull and tooth morphology indicates a diet of tough-to-crack nuts or tubers, but the microwear on their teeth suggests a softer diet of grasses and leaves. They may have subsisted on soft parts of plants most of the time, but been able to eat hard parts of plants when the going got (literally) tough.
Our own ancestors took a different approach to food, which included eating meat, and even processing food through cooking and/or grinding it before sitting down to dine. In a landmark paper published in 1997, Leslie Aiello laid out the expensive-tissue hypothesis, arguing that digesting meat doesn't take as much intestinal yardage as digesting leaves, so an animal that survives on a meaty diet can get by with a smaller gut. Both the brain and the digestive tract require a lot of energy, so there has been a tradeoff between brain size and gut size, with meat eating facilitating bigger brains.
The expensive-tissue hypothesis presents a chicken-and-egg conundrum since we can't be sure which — meaty diets or bigger brains — came first. Delicious, a 2021 book by Rob Dunn and Monica Sanchez, speculates that cooking, particularly cooking meat, was originally driven by the pursuit of tastier food, a quest led by those with the most discerning tastebuds and noses.
Some paleoanthropologists suspect that processing food likely came before cooking it. Despite evidence of controlled fire dating back 1.5 million years in South Africa's Cradle of Humankind, evidence of human-mastered fire is scarce before 500,000 years ago. Long before that, though, early members of our genus could have pounded rocks on their food (including meat, roots and tubers) to tenderize it, as well as used sharp stone flakes to slice meat, and remove skin and cartilage. A 2016 study concluded that, assuming our ancestors ate a diet of one-third meat, this pre-prandial processing would save the average hominin something like 2.5 million chews per year. Such savings would be significant, according to a 2022 study. By testing study participants chewing soft gum and tough gum (both versions unflavored, alas), a research team found that chewing elevates metabolic rates. Chewing soft gum elevated the energy expenditure by 10 percent above the basal rate. Chewing tough gum elevated the expenditure by 15 percent. Human relatives, Pan and Pongo, spend hours chewing their food. We humans spend roughly a half hour a day at the same activity.
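The per-year figure quoted above translates into a striking daily number, which is easy to check. The sketch below is just back-of-the-envelope arithmetic on the estimates in the preceding paragraph; nothing in it comes from the original papers beyond those figures.

```python
# Back-of-the-envelope arithmetic based on the estimates quoted above.
chews_saved_per_year = 2_500_000        # assumed savings for a one-third-meat diet
chews_saved_per_day = chews_saved_per_year / 365
print(f"roughly {chews_saved_per_day:,.0f} chews saved per day")  # about 6,800

# Chewing raises energy use above the resting rate: about 10 percent for
# soft food versus about 15 percent for tough food in the 2022 gum study.
soft, tough = 0.10, 0.15
print(f"tough food costs about {tough / soft:.1f}x the extra energy of soft food")
```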
When fire-cooked food became commonplace, fire likely presented a mixed blessing. Two studies released around the beginning of August 2016 described fire's possible pros and cons. A study led by Troy Hubbard found that modern humans possess a genetic mutation that facilitates the metabolism of toxins common in fire smoke. In short, our species might have thrived because we could cope with the smoke while reaping the benefits of cooking, nighttime light and warmth. But a study led by Rebecca Chisholm speculated that all that campfire togetherness might have produced ideal conditions for soil microbes to evolve into tuberculosis, which we started coughing at each other tens of thousands of years ago.
Whichever came first, meat eating or bigger brains, that extra gray matter is a characteristic of the genus Homo, although traits associated with this genus (long legs, reduced size differences between the sexes, and energy-hogging brains) probably didn't all arise at the same time. And though paleoanthropologists have long held that pelvises similar to our own evolved to accommodate bigger-brained babies, "modern" pelvises have been found in smaller-brained hominin species. A possible explanation for so much variation is that pelvic shape might actually change over a lifetime, even after an individual reaches maturity.
Paleoanthropologists estimate that australopithecines evolved into early forms of the genus Homo sometime between 3 million and 2.5 million years ago, but verifying this hypothesis has been complicated by a dearth of early Homo fossils. A 2.3-million-year-old maxilla (upper jaw) long served as the oldest example of the genus, and the type specimen for Homo habilis, or "handy man," is younger still. A computerized reconstruction of the mandible (lower jaw) from the Homo habilis type specimen showed australopithecine characteristics along with traits of our own genus. And comparisons of the Homo fossils from this period suggest "an evolutionary explosion at the dawn of our genus," according to Nature.
Over the lifetime of our own genus, paleoanthropologists have identified multiple Homo species, and one of them, Homo erectus, thrived not only in Africa, but also in Eurasia. Remember that Dubois found the first known example of this species in Indonesia, and the species was likely a long-time resident of Java. Another famous example of this species, Peking Man, went missing during World War II.
One of the most complete fossils in the hominin record is a juvenile Homo erectus skeleton found at Nariokotome, Kenya. Estimated to be seven or eight years old at death (but equivalent to an 11- to 13-year-old modern human), Nariokotome Boy (or Turkana Boy) was already over 5 feet tall. Had his future included an adolescent growth spurt, he would have become a tall man. But significant height might not have been a characteristic of Homo erectus overall so much as a trait of Homo erectus living in the tropics. An especially rich fossil site at Dmanisi, Georgia, was for years the oldest known Homo erectus site outside Africa, and the site preserved the remains of multiple individuals. Dated at around 1.8 million years old, the site shows remarkable variation among hominins living within a short (geologically speaking) time span. Skull size, brain size, and skeleton size all show wide disparities, suggesting that this was simply an extremely diverse species. Some H. erectus fossils have been assigned to different species, such as Homo ergaster in Africa, and Homo georgicus at Dmanisi, but whether these additional species definitions are necessary is debatable. In fact, a 2013 study on Dmanisi led by David Lordkipanidze argued that hominin species identified from other sites as well — Homo erectus, Homo habilis and Homo rudolfensis — were all one species with a lot of variability.
For years, Georgia's 1.8-million-year-old fossils from Dmanisi stood as the oldest evidence of hominins outside Africa, bolstered by a similarly old tooth from nearby Orozmani, found in 2022. But Dmanisi lost that title in 2018 with the description of stone tools at Shangchen, part of China's Loess Plateau. Dated at about 2.1 million years old, the Shangchen tools are older and much farther from Africa than Dmanisi, but tools alone can say little about hominin morphology. Even though the 2018 paper pushed back the apparent age of hominins outside of Africa, the likeliest candidate to leave Africa and eventually craft tools in China would still belong to the genus Homo.
Besides variability, Homo erectus fossils show something else: evidence of kindness. One of the fossils from Dmanisi is toothless. If you're toothless long enough (and don't starve), your jaw changes, eventually converting those lumpy tooth sockets to smooth bone. But if you lost all your teeth at Dmanisi, you couldn't get dentures. The owner of the toothless jaw may have had his or her food chewed by a family member or (very good) friend. Likewise, a Homo erectus female fossil from Africa is covered by bone that looks like it was woven. The deposition of such weird bone is characteristic of Vitamin A poisoning. This toxic condition, which has sometimes struck Antarctic explorers, results from eating too much carnivore liver, and it is exquisitely painful. If you were fending for yourself on the African savanna, you'd die long before the disease progressed as far as it did in this woman. For her to survive as long as she did meant that someone looked after her near the end.
Homo erectus survived as a species for well over 1.5 million years, and before it completely died out, a new species evolved that has loomed large in anthropology. Homo neanderthalensis arose in Eurasia about 200,000 years ago, although hominins with proto-Neanderthal characteristics may have arisen hundreds of thousands of years earlier. The earliest specimens that are arguably Neanderthal come from Spain's Sima de los Huesos Cave, a site preserving thousands of fossils from about 430,000 years ago. The fossils have alternately been classified as Neanderthal ancestors and early Neanderthals. Neanderthals and the direct ancestors of modern humans may have diverged somewhere between 550,000 and 765,000 years ago.
Neanderthals have a tortured history in anthropology, first identified as diseased modern humans, then as stooped and awkward because the scientist making that determination examined a Neanderthal riddled with arthritis. One of the early claims made about Neanderthals, that they buried their dead, has been alternately challenged and vindicated. Like Homo erectus fossils, Neanderthal fossils preserve evidence of people who must have been looked after by others. Finds associated with Neanderthal remains suggest symbolic behavior. Near Krapina, Croatia, 130,000-year-old eagle talons from a Neanderthal site bear cut marks and abrasion marks that might have resulted from their use in necklaces. Bruniquel Cave in France bears structures built from broken-off stalagmites, dated at about 176,000 years old, when Neanderthals were the only hominins known to inhabit Europe, though the purpose of these structures is a mystery.
And Neanderthals apparently admired fossils, or at least the animals that would become fossils. A hand axe from Norfolk, England, perhaps dating to 200,000 years ago, has been fashioned to give special prominence to a bivalve shell. Even more remarkable is an even older hand axe, perhaps the handiwork of an earlier species, Homo heidelbergensis, that shows off a fossil sea urchin.
Early 20th-century scientists such as Henry Fairfield Osborn considered Neanderthals direct human ancestors. That view no longer predominates. Instead, Neanderthals coexisted with direct ancestors of modern humans, who evolved in Africa. The most likely path out of Africa is through the Middle East, and Neanderthals and Homo sapiens were living in close proximity to each other about 55,000 years ago. But Neanderthals haven't been completely kicked out of the family tree; a slew of genetic studies, most of them supervised by Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology, indicate that Neanderthals interbred with ancestors of modern humans.
Pääbo and his team, as well as other researchers in the field, have made multiple breakthroughs in understanding Neanderthal contributions to modern human DNA.
Multiple studies show that people of European, East Asian and other non-African ancestry owe about 2 percent of their DNA to Neanderthals. Starting in the 2010s, genetic research indicated that humans with non-African ancestry carried some Neanderthal DNA and Africans carried none, but a 2019 study overturned that understanding, finding that Africans owe about 0.3 percent of their DNA to Neanderthals — a lower percentage, but not nothing. The 2019 study authors concluded that "remnants of Neanderthal genomes survive in every modern human population studied to date." The model that best described the Neanderthal DNA percentages held that some of that DNA was carried into African populations by Neanderthal-gene-bearing people migrating back into Africa over the last 20,000 years, and that some DNA previously ascribed to Neanderthals might instead consist of genes from early modern humans that both Neanderthal and non-Neanderthal populations retained.
Although the Neanderthal DNA retained in modern human populations tops out at about 2 percent or slightly higher, a far greater percentage of the Neanderthal genome has been preserved in modern populations because different people carry different genes. Some of those genes relate to hair and skin color, and to diseases such as lupus and diabetes. A few adaptations that likely served Neanderthals well are less beneficial to people living in modern industrialized societies. A gene variant that helps blood clot quickly likely helped Neanderthals avoid bleeding to death after hunting injuries or childbirth, but it also increases the risk of blood clots and strokes in longer-living people today. Genes that bolstered Neanderthal immune systems against troublesome microbes thousands of years ago might contribute to allergies and inflammation in our cleaner environments of today.
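To see why roughly 2 percent per person can translate into a much larger share of the Neanderthal genome across a whole population, it helps to think of the problem as a set union. The toy simulation below is only a sketch: the segment count, sample size, and the assumption that each person's Neanderthal segments land randomly and independently are all illustrative simplifications, but it shows the set-union logic.

```python
import random

random.seed(1)

SEGMENTS = 10_000    # treat the Neanderthal genome as 10,000 equal chunks (illustrative)
PER_PERSON = 200     # each person carries ~2% of those chunks (also illustrative)
PEOPLE = 1_000

union = set()
for _ in range(PEOPLE):
    # each simulated person inherits a random 2% of the chunks
    union |= set(random.sample(range(SEGMENTS), PER_PERSON))

print(f"Any one person: {PER_PERSON / SEGMENTS:.0%} of the Neanderthal genome")
print(f"Across {PEOPLE:,} people: {len(union) / SEGMENTS:.0%} of the Neanderthal genome")
```

In this idealized setup the union quickly approaches 100 percent; in reality, natural selection has scrubbed Neanderthal DNA from some genomic regions entirely, so the recoverable fraction is substantial but well short of the whole genome.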
Neanderthals and the ancestors of modern humans may have been "at the edge of biological compatibility," and their hybrid progeny may have eventually become infertile, according to one group of researchers. And yet DNA analysis of a roughly 40,000-year-old modern human jawbone from Pestera cu Oase Cave in Romania showed strong evidence of interbreeding. If you make a baby, you contribute large stretches of your DNA to your child, but over subsequent generations, recombination breaks those stretches into smaller and smaller pieces. The Oase jawbone showed long, uninterrupted stretches of Neanderthal DNA, prompting anthropologist Erik Trinkaus to argue in 2015 that the mandible apparently belonged to the great-great-great-grandson of a Neanderthal.
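The reasoning behind that estimate can be sketched with simple arithmetic. On average, each generation introduces about one crossover per Morgan of genetic map (100 centimorgans), so an ancestry block inherited from a single admixture event gets whittled down roughly in proportion to the number of generations elapsed. Here is a minimal back-of-the-envelope sketch, not drawn from the Oase analysis itself:

```python
def expected_block_length_cm(generations: int) -> float:
    """Rough expected length, in centimorgans, of an unbroken ancestry block
    `generations` after a single admixture event, assuming about one
    crossover per Morgan (100 cM) per generation."""
    return 100.0 / generations

# A 25-year generation time is a common rough assumption.
for gens in (5, 10, 40, 2_000):
    years = gens * 25
    print(f"{gens:>5} generations (~{years:>6,} years): "
          f"blocks average ~{expected_block_length_cm(gens):6.2f} cM")
```

A Neanderthal ancestor only a handful of generations back leaves blocks tens of centimorgans long, while admixture tens of thousands of years earlier leaves only tiny fragments, which is why the long unbroken stretches in the Oase jawbone pointed to a very recent Neanderthal in that individual's family tree.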
In 2010, the Natural History Museum at the Smithsonian opened its state-of-the-art, awesome hall of human origins. Within weeks, a new study led by Pääbo described a previously unknown hominin species from Denisova Cave in Siberia. Dubbed Denisovans, these enigmatic humans were identified from DNA analysis of fossils too fragmentary to give any idea of what the hominins looked like, but they apparently lived at the same time as Neanderthals and the direct ancestors of modern humans. Considering the Denisovan fossils turned up in Russia, you might expect Denisovan DNA to occur widely across Russia and China, but it has proven relatively scarce in modern human populations outside of Southeast Asia and Oceania.
Ongoing DNA studies present an increasingly complicated picture of human history. In rapid succession, two 2013 studies found that (1) Denisovans interbred with Neanderthals, with ancestors of people now living in East Asia and Oceania, and with another group of extinct archaic humans genetically dissimilar to both Neanderthals and modern humans, and (2) mitochondrial DNA (nearly always inherited through mothers) from Spain's Sima de los Huesos Cave showed a link to Denisovans, a link reaffirmed in a 2016 study. And in a 2019 paper describing the "last appearance" of Homo erectus in Java a little over 100,000 years ago, researchers speculated that the ancient hominin species might have been the source of the roughly 1 percent archaic DNA now found in some Southeast Asian populations — DNA that falls outside the modern-human-Neanderthal-Denisovan ancestral group.
A 2018 study pinpointed the first known direct descendant of a Neanderthal-Denisovan pairing. A long-bone fragment from Denisova Cave, likely belonging to a female at least 13 years old, owed about 40 percent of its DNA to Neanderthals and a roughly equal amount to Denisovans. Officially named Denisova 11, and nicknamed Denny, the 90,000-year-old hominin had a Neanderthal mother and a Denisovan father. By the time Denny was described, such interspecies breeding was well known, but the authors speculated that the offspring of such unions might have been infertile.
Ancient Eurasian hominin DNA has also shown evidence of occasional inbreeding, perhaps a common event when local populations got isolated by tough environments. In short, early humans were a randy bunch.
Given the varying levels of Denisovan DNA in modern populations, Pääbo suggests that early migrants out of Africa might have met Denisovans while traveling along the coast of southern Asia.
Just as Neanderthal DNA has contributed to some modern human adaptations, "Denisovan-like" DNA has also given some modern humans an adaptive edge. Tibetans able to breathe easily at high altitudes have a special version of a gene known as EPAS1. The "superathlete" version of this gene lets Tibetans make effective use of small amounts of oxygen typical at high altitudes. Comparisons between populations around the world and between modern and Denisovan DNA indicate that modern Tibetans likely inherited the useful version of the gene from Denisovans at some point in the last 40,000 years or so. The researchers remark, "With our increased understanding that human evolution has involved a substantial amount of gene flow from various archaic species, we are now also starting to understand that adaptation to local environments may have been facilitated by gene flow from other hominins that may already have been adapted to those environments." The argument that Denisovans were superathletes got additional support from the 2019 description of a 160,000-year-old partial jawbone. Found in 1980 by a monk in Baishiya Karst Cave on the Tibetan Plateau, the partial mandible was linked to Denisovans via protein analysis nearly 40 years later. Paleoanthropologist Chris Stringer expressed astonishment that ancient humans managed to survive at that altitude.
More support for Denisovans' adaptability came from a paper published in 2022, describing a molar discovered in Cobra Cave in Laos — a far warmer setting than Siberia or the Tibetan Plateau. Analysis of surrounding sediments dated the Cobra Cave tooth at 131,000 to 164,000 years old. The tooth's partially developed roots suggested it belonged to a juvenile. Its lack of peptides associated with the Y chromosome suggested it belonged to a female. Its overall morphology resembled a molar from the Tibetan jaw. Cobra Cave was stuffed with bones, perhaps stashed there by porcupines, which gnaw bones to sharpen their teeth. Tropical conditions unraveled the DNA of other bones sampled from the cave, so the research team opted not to sacrifice part of the tooth for a DNA sample that would likely show nothing. Without such a sample, the molar's affiliation with Denisovans wasn't certain, but its location and age fit with migration patterns that could explain Denisovan DNA in modern Southeast Asian and Pacific Islander populations.
In September 2019, when the only fossils definitively tied to Denisovans consisted of the pinkie bone fragment from Denisova Cave, the partial mandible from Baishiya Karst Cave, and a few teeth, an international team of researchers announced their study of the Denisovan epigenome. Looking at molecular modifications to DNA, they made predictions about Denisovan appearance. Specifically, they identified 56 Denisovan features, including 34 skull features, that might have diverged from those of modern humans and Neanderthals. In general, the researchers concluded that Denisovans had Neanderthal-like foreheads, craniums and jaws, but had wider faces, something the Baishiya Karst Cave find appeared to confirm. But other scientists cautioned against inferring too much about a species from the remains of a single individual. As always, more fossils would be welcome.
Yet another study overseen by Pääbo, published in 2016, found five instances of gene exchange between the direct ancestors of modern humans, Denisovans and Neanderthals. Science staff writer Ann Gibbons explains, "If you're an East Asian, you have three Neanderthals in your family tree; Europeans and South Asians have two, and Melanesians only one."
Another surprise came from a 2017 study of mitochondrial DNA from a 100,000-year-old Neanderthal femur collected in a southwestern German cave. Led by Cosimo Posth, the analysis of Neanderthal genetic material found a strong similarity to the mitochondrial DNA of modern humans. The research team contended that a female from the same stock that led to modern Homo sapiens in Africa must have bred with a Neanderthal male, and that the female's descendants bred so successfully, they eventually replaced the mitochondrial DNA of Neanderthals. Though other paleoanthropologists found the study intriguing, they asked for more Neanderthal genome studies before completely agreeing.
Denisovans are an amazing discovery. Neanderthals have long fascinated anthropologists and the general public alike. Studies uncovering ancient trysts between those populations and our direct ancestors will no doubt continue. But it's important to view Denisovans and Neanderthals in their proper perspective. In the autumn of 2013, Adam Van Arsdale, anthropology professor at Wellesley College and longtime researcher at Dmanisi Cave, explained to students of Wellesley's massive open online course (MOOC) on human evolution:
In the history of paleoanthropology, Neanderthals are central. A lot of our earliest fossil evidence for human evolution comes in the form of Neanderthals . . . And yet from an evolutionary standpoint, Neanderthals probably aren't as important as the attention we give them. If we think about the evolutionary reality of humans for the last million and a half years, as they've occupied large stretches of Africa, the Middle East, East Asia, Southeast Asia, Europe was probably always fairly marginal, on the peripheries of that environment. Europe was one of the last areas of the Old World to be occupied. And throughout the late Pleistocene, as we had major ice age events, it became an even smaller space, as populations were restricted to Southern Europe. So Neanderthals were probably always a fairly small population, probably only a tenth the size of the population in Africa, maybe even less. . . . But they're important in the history of our discipline because they've been the center of discussion for so long.
To give an example of how inhospitable Europe was during the Pleistocene Epoch, consider that, during the epoch's coldest periods, sea ice, which is mostly confined to the Arctic Ocean nowadays, spread as far south as the northern shores of the Iberian Peninsula.
Compared to other parts of the world, Europe has been more intensively studied, and this has skewed understanding of our evolution. In Our Human Story, Louise Humphrey and Chris Stringer point out that Africa, Asia and Southeast Asia are all poorly studied in terms of ancient human diversity.
One factor that has contributed to the public's fascination with Neanderthals is the mystery of their disappearance amidst a growing population of modern humans. Hypotheses for Neanderthal extinction have included cold-adapted bodies that coped poorly with warming interglacial conditions, disease, and violence directed at them by modern humans. One of the more broadly accepted hypotheses is that modern humans simply outcompeted Neanderthals, having better technologies for acquiring food and keeping warm. Paleoanthropologist Chris Stringer of the Natural History Museum, London, argues that the superior technology of modern humans probably resulted directly from their larger population size. A bigger population has more experts on gathering and storing food, building shelters, and fending off predators — and more methods of passing that expertise along to the next generation. Constrained by Europe's relatively small habitable area, Neanderthal populations would have been more likely to lose expertise in periodic die-offs, leaving the survivors with fewer survival tools.
Perhaps no place better illustrates Europe's shrinking and growing human habitat than the United Kingdom. Except for Earth's earliest days, the amount of water at the planet's surface has stayed more or less fixed. That means that — as more water has resided inside glaciers during ice ages, including the recurring ice ages of the Pleistocene Epoch (roughly 1.6 million to 10,000 years ago) — global ocean level has fallen. What is today the island of Britain was once a peninsula of northwestern Eurasia. Geologists surmise that during the Pleistocene, a lake at the tip of a glacier eventually broke through the chalky spine connecting Britain with mainland Europe and, as ocean level rose, water isolated the landmass. The same conditions that would have lowered sea level enough to reconnect Britain to the rest of Eurasia also covered the area in ice, making it an extremely uninviting habitat. Yet some hardy hominins did make it. In May 2013, violent storms at Happisburgh exposed an 800,000-year-old hominin trackway. The same storms that exposed the trackway soon eroded it away, but researchers used multi-angle photographs to produce a three-dimensional reproduction.
The Happisburgh tracks date from a period in which, according to a 2023 study, ancestors of modern humans dwindled to such a small population that extinction was a definite possibility. Wangjie Hu, Haipeng Li and coauthors used a statistical analysis method they called FitCoal to project past population sizes based on modern genetic data. Their conclusion was that, between 930,000 and 810,000 years ago, ancestors of modern humans — not necessarily Neanderthals or Denisovans — experienced a population bottleneck that reduced the number of breeding individuals to roughly 1,300. The authors posited sudden, dramatic climate change as the cause. Reactions to the study were mixed. Writing about the results, Nick Ashton and Chris Stringer cited archaeological evidence from the time of the proposed bottleneck, including the Happisburgh prints, to argue that Neanderthals and Denisovans had likely escaped serious losses, or that the effects of whatever drove the bottleneck had been short-lived.
Throughout these climate shifts, which often made Europe and Asia unattractive, Africa provided a much greater habitable area. The researchers involved in the Neanderthal DNA studies published in early 2014 agreed that the ancestors of modern Africans likely also interbred with archaic human populations that are now extinct, and that future studies should look for evidence of those groups. Pääbo has warned that such a task could prove daunting; Africa's greater human genetic variation could make genetic contributions from archaic populations harder, not easier, to detect than in Europe. But a 2016 study led by PingHsun Hsieh managed to provide some confirmation. Hsieh's team used statistical inference based on genetic data from two Western African Pygmy populations (Biaka and Baka). The researchers concluded that an archaic population had bred with anatomically modern humans at least once in the last 150,000 years, and likely within the last 30,000 years. Published in early 2020, a study by Arun Durvasula and Sriram Sankararaman concluded that members of some contemporary West African populations derive anywhere from 2 to 19 percent of their DNA from a "ghost" hominin population known only from the genetic traces it left in modern genomes. According to their model, an extinct hominin species diverged from our own lineage about a million years ago, and descendants of that species later interbred with modern humans about 50,000 years ago. The evidence, the researchers argued, lies in gene variants unseen in other modern humans, Neanderthals, or Denisovans.
Overall, genetic studies of African populations have lagged behind other investigations. In November 2018, Nature pointed out that geneticists "have devoted their attention almost exclusively to the small subset of Africans that migrated north to Europe," and expressed a little optimism that new studies were "beginning to address this imbalance." A 2020 review of genetic studies of human evolution in The Scientist reiterated the problem of underrepresented African DNA, and pointed out that non-African ancestry may be traceable to a single small population, perhaps just a few thousand individuals, who left Africa 60,000 to 70,000 years ago.
Because DNA survives better in temperate and high-latitude regions, ancient-DNA studies of African populations have been rarer. A 2015 study headed by M. Gallego Llorente focused on much newer material, but managed to detect that some Eurasians streamed back into Africa several thousand years ago. The DNA of a roughly 4,500-year-old Ethiopian male — named Mota after the cave where his remains were found — indicated that the Eurasian backflow into Africa originated with a population of Neolithic farmers who had settled in Europe some four millennia earlier. More extensive than previously recognized, the Eurasian backflow reached into Central, West and Southern Africa.
As for the evolution of our species, Homo sapiens, conventional wisdom had long held that eastern Africa gave rise to fully modern humans, evolving from either Homo heidelbergensis or Homo rhodesiensis. A 2017 paper on hominin fossils and stone tools discovered in Jebel Irhoud, Morocco, described "a mosaic of features" similar to those of anatomically modern humans. Dated at 280,000 to 350,000 years old, these "pre-modern" human fossils from northwestern Africa pushed back the earliest date of Homo sapiens by some 100,000 years, the authors argued. They didn't contend that these fossils came from direct ancestors of modern humans, but instead that they suggested a pan-African evolution of anatomically modern Homo sapiens.
A 2018 review paper, led by Eleanor Scerri, greatly expanded on the Jebel Irhoud study and other findings from fossil, archaeological, paleoenvironmental and genetic data to contest the longstanding view that Homo sapiens evolved from a single population or in a single region of Africa. Instead, the authors argued that populations across the continent spent long periods isolated from each other, coming into contact when favorable environmental changes opened up migration corridors through otherwise formidable habitats, then becoming isolated again. Rather than a tree or branching bush, the study authors presented the analogy of a braided stream. Scerri also stated that humans today look far more alike than our ancestors did. Not all the evidence examined in the review matched nicely; genetic studies of modern African populations indicate that they diverged from each other 100,000 to 150,000 years ago, a later divergence than that suggested by fossils and artifacts.
Several months after the Jebel Irhoud fossil study, another paper described "the earliest modern humans outside Africa," based on a maxilla and associated teeth found at Misliya Cave, Israel. The fossil was dated at 177,000 to 194,000 years old, and the study team explained that this age was consistent with genetic studies suggesting an emergence from Africa around 220,000 years ago. The Middle East's role as a geographical bridge between Africa and Eurasia was further underscored by the 2021 announcement of Nesher Ramla Homo fossils found in Israel. While avoiding a new species declaration for the fossils, the research team argued that modern humans traveling out of Africa and through the Levant might have met hominins with a "weird mix" of archaic looks and modern toolkits between 120,000 and 140,000 years ago.
The same climate shifts that rendered some areas temporarily uninhabitable may have done more than just encourage migration. A long-held view of human evolution was the savanna hypothesis: Humans diverged from apes when our ancestors adapted to grasslands. But hominin fossils turn up in more diverse environments. In the 1990s, Rick Potts of the Smithsonian Institution began to advance an alternative hypothesis: variability selection. He formulated the hypothesis after years of fieldwork at Olorgesailie, Kenya, where he found evidence of a lake, then no lake, a volcanic eruption, another lake, a severe drought and yet another lake. In brief, Potts concluded, our ancestors evolved key traits, such as cooperation and big brains, by adapting to environmental instability.
Multiple lines of evidence point to environmental instability over the last several million years. Oxygen isotopes in foraminifera provide proxy records of past sea levels and seawater temperatures. Gas bubbles trapped in ancient ice indicate past levels of carbon dioxide. Sapropels (dark, organic-rich stripes of sediment that alternate with lighter-colored layers) in the Mediterranean hint at times when wet conditions in Africa increased flow along the Nile River. Driving much of this climate instability were Milankovitch cycles, periodic variations in Earth's orbit and axial orientation that alter how sunlight is distributed over the planet and across the seasons.
In a 2015 paper, Potts and coauthor Tyler Faith argued that Milankovitch elements of eccentricity and precession worked together to affect seasonal monsoons. The authors identified eight periods of intense environmental instability, each lasting 192,000 years or longer, in the last 5 million years. These periods of intense instability may have bred human ancestors well equipped to cope with a variety of conditions, and able to settle foreign environments.
A review paper published in Nature early in 2021 discussed three "key phases" in human evolution: the global expansion of modern humans roughly 40,000 to 60,000 years ago, an African origin of modern human diversity between 60,000 and 300,000 years ago, and the separation of modern humans from archaic populations between 300,000 and 1 million years ago. The authors pointed out how much is still unknown about each of these phases. They listed multiple limitations on our current knowledge: Although evidence uncovered to date clearly identifies Africa as the birthplace of modern humans, nothing narrows the focus to a particular region in Africa. The emergence of modern humans can't be traced to a particular point in time, either. Some human ancestors belonged to groups that left evidence in the fossil record, but many didn't. And as counterintuitive as it sounds, the evolution of modern human traits, such as behavior and physiology, doesn't necessarily correspond to identifiable changes in genetic ancestry.
In the capacity to amaze, maybe nothing tops the 2004 announcement from Peter Brown, Mike Morwood and their colleagues about a 1-meter-tall "hobbit," Homo floresiensis. The researchers found Homo floresiensis in Liang Bua Cave on the Indonesian island of Flores. Controversy raged around the hobbits, with some alleging the bones belonged to malformed, small-brained modern humans, but braincase scans suggested otherwise. The species identification was bolstered by another find, made in 2014 and published in 2016, of a partial mandible and some isolated teeth from Mata Menge, about 50 miles east of Liang Bua. The later research team, led by Gerrit van den Bergh, dated the finds at 700,000 years old, and because the wisdom tooth had erupted, they identified the mandible as belonging to an adult. The teeth were even smaller than those of the hobbits. Understanding that future discoveries might change the picture, they provisionally classified the new finds in the same species, Homo floresiensis, though the newly discovered mandible was actually smaller than those described in 2004.
Homo floresiensis puzzles paleoanthropologists for a few reasons. In some respects, it shows the greatest similarity to Homo erectus, which could have undergone rapid island dwarfing. But some hobbit features, such as flat feet and unusual wrist bones, suggest a more primitive ancestor, Homo habilis or even Australopithecus. Could australopithecines have made it all the way to Indonesia? Even when Pleistocene glaciers locked up maximum amounts of water and ocean levels were at their lowest, you couldn't get to Flores without an ocean voyage. It's possible the hobbits' ancestors (un)willingly rode rafts of vegetation, perhaps in the wake of a tsunami. But Homo erectus is known to have colonized vast regions, so ancestors from that species are easier to envision making such a long journey.
A 2018 study describing evidence (butchered rhino bones and stone tools) of hominin habitation (whether by Homo erectus or another species) from more than 700,000 years ago in the Philippines adds weight to the hypothesis that early hominins crossed big water bodies by accident or design. A 2019 study assigned teeth, hand bones, foot bones and a femur to a new species from the Philippines: Homo luzonensis, found in Callao Cave on Luzon Island. Researchers dated the fossils at 50,000 to 67,000 years old. Although distinct from H. floresiensis, H. luzonensis shares a puzzling combination of characteristics: small stature, teeth like those of Homo but hands and feet like those of australopithecines. And, like the Flores hobbits, the Philippine hominins had to cross open ocean.
On the one hand, the tremendous variability at Dmanisi, all attributed to Homo erectus, provides a possible explanation for the unusual features of the small hominins, which could also reflect a strong founder effect — the oddness that occurs when a population is founded by a small group of atypical individuals. On the other hand, as anthropologist Matthew Tocheri argues:
However, explaining the many similarities that H. floresiensis and H. luzonensis share with early Homo and australopiths as independently acquired reversals to a more ancestral-like hominin anatomy, owing to evolution in isolated island settings, seems like a stretch of coincidence too far. . . . [O]ne thing can be said for certain — our picture of hominin evolution in Asia during the Pleistocene just got even messier, more complicated and a whole lot more interesting.
Regarding the hobbits, we can only wonder whether they ever encountered their much larger evolutionary cousins (us). Initial dating of the species provided an age as young as 11,000 years, but follow-up studies indicated the hominins were gone by 50,000 years ago, and were maybe even driven to extinction by modern humans. Paleoartist John Gurche, who produced hominin reconstructions for the Smithsonian's new human origins hall, speculates about the most complete hobbit specimen, a female he's nicknamed Flo:
What expression would be fitting for Flo if she were able to see us? What would she have thought of our kind? I didn't have to think too hard about this one. We Homo sapiens do not have a good record when it comes to how we treat people different from ourselves. If she had any knowledge of our kind (and she may have, as modern humans were in the region during her time) we might have seemed to her like a race of violent giants.
Flores hobbits, like the Neanderthals and Denisovans, are extinct today, though Neanderthals and Denisovans have left some traces in modern populations. Those traces have forced a rethink of the strict "out of Africa" model of modern human origins, one that argued for a single dispersal from Africa into Eurasia around 60,000 years ago. A 2017 review paper in Science called for a revision to that model, but didn't refute African origins of all modern humans. Instead, the review argued for multiple migrations out of Africa, and posited that a migration around 60,000 years ago by "larger and more demographically successful human populations" probably obscured earlier migrations. In short, traces of Neanderthal and Denisovan DNA notwithstanding, the vast majority of our ancestry lies in Africa, where multiple waves of hominin migrations began, and where some migrations eventually returned.
Genetic evidence is one piece of the puzzle of when and how humans migrated into Eurasia. Reconstructions of paleoclimate provide another piece. In 2021, an international team of researchers reconstructed climate over the past 300,000 years, examining conditions between northern Africa and the Levant. The research team assumed a minimum rainfall threshold of 90 millimeters per year, since no hunter-gatherer population has been recorded living in drier conditions. The team also looked at two routes out of Africa: a northern route across the Sinai Peninsula, and a southern route across the Strait of Bab-el-Mandeb (across the Red Sea). Acknowledging that questions remained about the capacity for maritime travel across the Red Sea, and that some ecological opportunities for migration preceded the earliest known modern human remains in the region, the team identified multiple migration windows over the past 300,000 years.
Weeks after the Levant study, another international team published on findings from the Arabian Peninsula. Sediments from long-ago lakes in the Nefud Desert include stone tools and remains of water-loving animals such as hippos. The stone-tool assemblages span the last 400,000 years. The authors argued that roughly 400,000; 300,000; 200,000; 130,000 to 75,000, and 55,000 years ago, high rainfall brought fleeting periods of "green Arabia" that would have facilitated human migration and habitation in the region.
Paleoanthropologists generally agree that modern humans derive most of our ancestry from a population that was living in Africa long after Homo erectus settled parts of Eurasia. In fact, the majority of genetic variation in modern humans is found not between African and other populations but within Africa, and this has always been the case with humans.
Skin color, long used to distinguish sub-Saharan Africans from Europeans, turns out to be a poor method of classification. A 2017 genomic study not only found a remarkable range of skin tones among modern African populations, but also found that gene variants associated with comparatively light European skin arose in Africa perhaps as much as 1 million years ago, and remain common among the San people of Southern Africa even now. Summarizing the findings, study coauthor Sarah Tishkoff remarked, "There is so much diversity in Africans that there is no such thing as an African race."
A 2020 review of human genetic research in The Scientist pointed out that the roughly 12 to 15 genes so far known to influence skin color in Eurasian populations account for less than a quarter of the variation found in modern Africans. The "deep racial divides" relied upon for centuries to decide who deserved slavery and who deserved freedom are indeed only skin deep.
Meanwhile, DNA analysis of European ancestry has found that hunter-gatherers living in Europe within the last 10,000 years retained dark skin, even if they had blue or green eyes. In early 2018, the Natural History Museum and University College London included those features in their reconstruction of Britain's famed Cheddar Man, who lived roughly 10,000 years ago. Although the findings specific to Cheddar Man weren't yet published in peer-reviewed research, they fit with earlier findings about the group known as Western hunter-gatherers, who migrated to Europe about 14,000 years ago. This argument gained more support from a novel study published in late 2019. By analyzing a 5,700-year-old clump of chewed birch pitch (perhaps chewed to soften it before using it as an adhesive, or for its antiseptic properties), researchers identified the DNA of the woman who chewed it — as well as the DNA of her microbiome. She probably had, in the words of one researcher, a "really striking combination of dark hair and dark skin and blue eyes."
Blue eyes and possibly dark skin may have predominated for a time in the western reaches of Europe, while light skin and dark eyes predominated to the east. This was one of multiple findings emerging from a pair of studies, one led by Cosimo Posth in Nature, the other led by Vanessa Villalba-Mouco in Nature Ecology and Evolution, released in March 2023. Collectively, the studies examined the genomes of 357 ancient Europeans from 34 countries, individuals who lived between 45,000 and 5,000 years ago. The researchers also found that some groups living in Europe at the same time differed genetically from each other more than modern Europeans and Asians do today. Living before and after the Last Glacial Maximum, the ancient populations wove a complex tapestry of sharing cultural practices, merging, not merging, and outright replacing each other. Challenging the long-held notion that modern humans simply out-competed Neanderthals, the research found that the earliest modern humans to migrate into Europe may have disappeared along with the Neanderthals. While praising the work, anthropobiologist Ludovic Orlando implored researchers to "redouble efforts outside Europe, to avoid developing a Eurocentric vision of human prehistory."
White skin, it appears, became common among Europeans only within the last several thousand years, some 40,000 years later than previously thought. More importantly, the findings of paleogenetics have been unkind to myths of European racial purity reaching back to the Pleistocene. Paleogenetics has been equally unkind to modern European anti-immigration fervor. Europe has always been a melting pot.
In 2019, the American Association of Biological Anthropologists (AABA) issued a statement on race and racism which read:
Humans are not divided biologically into distinct continental types or racial genetic clusters. Instead, the Western concept of race must be understood as a classification system that emerged from, and in support of, European colonialism, oppression, and discrimination. It thus does not have its roots in biological reality, but in policies of discrimination.
Reflecting on the AABA statement, Princeton University anthropology professor Agustín Fuentes observed that, although race is not a biological reality, as a social reality, including how societies are structured and how some people experience the world, race "is very real."
A better understanding of human genetic variability has largely overturned long-held beliefs about separate human races. Before giving up the notion of human races, the scientifically literate gave up the notion of separate human species (a belief still prevalent in the 19th century). Although multiple species of humans lived on this planet at various times in the hominin past, there remains only one human species today. Just when our species arose remains an open research question. A genetic study published in 2017 argued that modern Homo sapiens emerged somewhere between 350,000 and 260,000 years ago — earlier than previous estimates — though the research team acknowledged that its estimates of mutation rates could be debated. Consistent with earlier studies, though, the research team found by far the greatest genetic diversity and the most ancient population splits within Africa, with non-African populations arising much more recently.
Study after study confirms: We all have ancestors from Africa. Yet many questions remain, and new fossils, such as the hobbit and the Denisovan finger bone, often raise more questions than answers.
We can infer, but don't precisely know, what migration paths ancient humans followed — or exactly when they followed them. Improved dating techniques keep refining those estimates. Radiocarbon dating of archaeological remains had long suggested the maximum age of human colonization of Australia was roughly 45,000 years, but that figure is near the upper limit of what radiocarbon dating can measure. A 2017 study estimating the date an artifact-bearing sediment layer last saw sunlight changed the earliest-Australian-settlement figure to probably more than 65,000 years ago. Humans reaching Australia that long ago has implications for meeting and mingling with Neanderthals and Denisovans, and implies a longer overlap between modern humans and the hobbits of Flores. As if to reinforce those findings, another study published soon afterward concluded that anatomically modern humans reached Sumatra, Indonesia, between 63,000 and 73,000 years ago.
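The radiocarbon ceiling mentioned above is a matter of simple decay arithmetic: carbon-14 has a half-life of about 5,730 years, so after tens of thousands of years only a trace of the original isotope remains, and even slight contamination swamps the signal. A quick sketch of the numbers:

```python
HALF_LIFE = 5_730  # carbon-14 half-life in years

def c14_fraction_remaining(age_years: float) -> float:
    """Fraction of the original carbon-14 still present after age_years."""
    return 0.5 ** (age_years / HALF_LIFE)

for age in (10_000, 45_000, 65_000):
    print(f"{age:>6,} years old: {c14_fraction_remaining(age):.3%} of the C-14 left")
```

At 45,000 years less than half a percent of the carbon-14 remains, and at 65,000 years essentially nothing is left to measure, which is why the Australian redating had to rely on estimating when the sediments last saw sunlight rather than on radiocarbon.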
A study led by Anthony Wilder Wohns, published in Science in early 2022, inferred human ancestral lineages by combining modern and ancient human genomes. The result was "immensely complex" and spanned parts of Africa, Eurasia and Oceania. Among the findings, the researchers noted, were "signals of very deep ancestral lineages in Africa, the out-of-Africa event, and archaic introgression in Oceania."
We still don't know when our ancestors developed language, or what proto-languages might have proliferated in the past. One of the hallmarks of humanity is the ability to communicate with symbols, but although clear evidence of symbolic behavior has emerged in the form of abstract art and self-adornment, the artifacts that have survived are almost certainly not the oldest examples. A trio of papers published in Science in March 2018 catalogued new evidence of increasingly complex behavior at Olorgesailie Basin, Kenya, starting at least 320,000 years ago. The discoveries included long-distance transport of obsidian, and the use of red and black pigments. These more complex behaviors likely played out against a backdrop of rapidly changing climate, and a growing scale of social relationships, for instance to carry out trade, might have helped our ancestors cope with climate swings. Unfortunately, Olorgesailie's geologic record for the period between 500,000 and 320,000 years ago, when some of these technological innovations were probably first made, has been lost to erosion.
A 2021 study concluded that 105,000-year-old artifacts found at Ga-Mohana Hill, along the edge of the Kalahari Desert, indicate symbolic behavior among the local population. Dated to a time when the Kalahari Desert was lush and green, the artifacts include ostrich eggshell fragments and calcite crystals. The researchers examining them could find no other explanation for the assemblage of eggshells and crystals except that people collected them on purpose, and their purpose probably wasn't utilitarian. Another 2021 study meticulously documented evidence of the deliberate burial of a child in a cave at Panga ya Saidi, Kenya, roughly 78,000 years ago. The authors described the find as the earliest known burial in Africa.
Among the most significant developments in human evolution, Adam Van Arsdale identifies bipedality, the evolution of bigger brains compared to earlier hominins, and the adoption of agriculture.
Agriculture proved to be a mixed blessing.
Historians and anthropologists long believed that agriculture was a prerequisite for the establishment of settled communities, but people living in ecologically rich areas with plentiful, varied food sources could largely stay put without farming. And when farming took hold, it didn't improve the lives of everyone who farmed.
Two studies released online at the end of 2014 linked the adoption of agriculture to the evolution of lighter and unfortunately weaker joints in the human skeleton. In particular, the research found lower density in trabecular bone — the spongy tissue at the ends of bones such as the femur. The lower bone density likely evolved about 12,000 years ago.
Farming may have meant a more reliable food supply, and it eventually led to surplus stores of grain that supported exponential population growth and non-farming occupations we associate with civilization. But an agricultural diet isn't necessarily a nutritious one. Malnutrition and long work days must have prevailed for many early farmers. Make no mistake; hunter-gatherers have always gotten their exercise, but early farmers had to work far harder and longer for fewer nutrients. Skeletons of Neolithic women show telltale signs of grain-grinding drudgery: arthritic backs and knees, and deformed femurs and toes.
Agriculture brought women bigger problems than repetitive, tiring tasks. Breastfeeding, combined with the lean diet of a hunter-gatherer lifestyle, hampers fertility. Hunter-gatherer mothers typically space their offspring by several years, having three or four pregnancies total. Access to livestock milk meant farming mothers could wean their babies while they were still babies and then have more babies. Another year, another baby for 10 or 15 years isn't something the human female body evolved to handle well.
Human health suffered in other ways, too, in both sexes. Pre-farming human skeletons show considerable wear on the front teeth, understandable since hunter-gatherers likely used their teeth as tools when working with hides. But their back teeth were generally pristine white and cavity free. In farming humans, more carbohydrates feed the bacteria that cause teeth to rot. The concentration of people, livestock and surplus grain in settlements brought unwanted visitors that spread disease: mice, rats, ticks, fleas and mosquitoes. Anemia was far worse in farmers, especially women, than in hunter-gatherers. Some have argued that the species benefiting the most from the agricultural revolution isn't ours; it's wheat.
The adoption of agriculture altered human bodies and ultimately human language, not just because farmers needed words for "barn" and "rake." Human children typically have overbites. In hunter-gatherer societies, years of chewing unprocessed foods make the mandible grow robust enough that upper and lower teeth align. Agriculture produces more processed foods that keep the mandible relatively dainty well into adulthood. That has the downside of impacted wisdom teeth, but a 2019 study concluded that the smaller lower jaw makes pronouncing consonants known as labiodentals — such as F and V — easier. The study authors supported their conclusion with computer simulations of human mouths, the historical spread of labiodentals alongside the spread of processed foods, and the relative paucity of labiodentals in modern hunter-gatherer societies. As Science reporter Ann Gibbons explains, "Don't like the F-word? Blame farmers and soft food." So without the advent of farming, you would not, thousands of years later, be able to fully embrace the experience of driving at rush hour, especially if you're on your way to the dentist to have your wisdom teeth extracted.
While many paleoanthropologists agree with Van Arsdale's assessment of the most important events — bipedality, bigger brains and agriculture — others might choose different things. At a 2014 symposium at the Salk Institute for Biological Studies, anthropologists discussed what they termed "human self-domestication," highlighting shortened faces and diminished brow ridges over the past 80,000 years, and a drop in cranial capacity after the invention of agriculture. They likened traits in modern humans to those of some domesticated animals, including the retention of some juvenile physical traits in adults.
Though modern humans show less genetic diversity than our closest living relatives (chimpanzees), we show much greater cranial variation than chimps or other primate species do. Big contributors to that variation are the things we use most to identify each other: our faces. A review article published in 2019 in Nature Ecology and Evolution surveyed research about the evolution of the human face over the past 4 million years. Although some clear trends could be found, such as less facial projection in Homo than in australopithecines, and more gracile facial features in Homo erectus than in earlier Homo species, the overall picture was complex. Dietary factors undoubtedly mattered in the changing morphology of hominin faces, but they weren't all that mattered. Genetic drift and founder effects could partially drive facial evolution. So could climate, especially at high latitudes where, for instance, our Neanderthal cousins likely needed big noses to warm frigid air. And however different factors have mingled to produce facial variation, it stands to reason that eyebrows are much more expressive without brow ridges.
Migrations out of our shared ancestral homeland of Africa were hardly the end of human mass movement. Studies of DNA and artifacts indicate an unending process of migration, to the extent that beliefs in pure ethnicities are hardly more than myth. As recently as 4,500 years ago, migrants associated with bell-shaped pottery, or "Bell Beaker culture," elbowed out the likely architects of Stonehenge. With the exception of Australian aboriginal ancestors, hardly any population has stayed put for tens of thousands of years.
In short, while anthropologists may vigorously debate which traits matter most in defining modern humans, and precisely how human evolution occurred, they don't debate whether it occurred, nor do they debate our common ancestry.
In 1925, Tennessee schoolteacher John Thomas Scopes was tried for teaching evolution, in violation of the recently enacted Butler Act. The act outlawed the teaching of "any theory that denies the story of the Divine Creation of man as taught in the Bible." The American Civil Liberties Union offered to defend anyone who defied the act, in hopes of overturning the law. Instead, Scopes's conviction was thrown out on a technicality, and the law remained on the books until the late 1960s.
In the decades following the Scopes Monkey Trial, quite a lot changed, yet some things remained very much the same. Teaching evolution is no longer illegal in the United States, but millions of Americans still oppose the teaching of evolution in public school science classrooms. Of the world's industrialized nations, only Turkey's population shows greater opposition to evolutionary theory than the U.S. (In the Muslim world, support for creationism comes not from fundamentalists but, ironically, from relatively moderate clerics looking for a "middle way" between science and faith. One Muslim nation that appears to enjoy widespread support for evolution is Iran.)
The U.S. enjoys the dubious distinction of politicizing evolution. During the 1990s, Republican Party platforms called for the teaching of "creation science" in Alaska, Iowa, Kansas, Oklahoma, Oregon, Missouri and Texas. In a New York Times editorial in 2005, moderate Republican, Episcopal minister and former senator John C. Danforth described the U.S. as the only country where a political party has adopted a position on evolution. In June 2008, Governor Bobby Jindal signed into law the Louisiana Science Education Act, singling out evolution (along with global warming and human cloning) for what the act described as critical analysis. The move established something of a precedent. Between January 2004 and June 2013, state legislatures in the United States introduced 39 "academic freedom" bills that would open the door for creationism. Between January 2008 and June 2013, state legislatures introduced 15 bills targeting both evolution and climate change in public school science classrooms. Lawmakers in Tennessee, the same state that tried John Scopes, entertained legislation that would encourage "critical thinking" on the topics of "biological evolution, the chemical origins of life, global warming and human cloning."
By targeting other topics besides evolution (and thereby evading some charges of religious motivation), Louisiana's Science Education Act survived legal challenges better than other antievolution efforts. In January 2016, Nicholas Matzke published a phylogenetic tree of antievolution legislation, including efforts that also targeted climate change and human cloning, and described "strong evidence of bill-to-bill copying and 'descent with modification.'"
In extreme cases, some Americans endorse the teaching of biblical literalism in public schools. Others push for the instruction of Intelligent Design. Still others want teachers to "teach the controversy."
What controversy?
A widespread public perception in the United States (and to a lesser degree in some other nations) is that scientists disagree about whether evolution is real. In October and November of 2001, Seattle-based think tank the Discovery Institute placed advertisements in The New York Review of Books, The New Republic, and The Weekly Standard. These ads featured a list of 100 PhDs (!) who "dissented from Darwinism." Members of the American public were understandably persuaded that a genuine controversy existed.
In February 2003, theoretical physicist Lawrence Krauss, a keynote speaker at the annual meeting of the American Association for the Advancement of Science, unveiled a response to the list of 100 PhDs. The National Center for Science Education (NCSE) had collected a list of 200 PhDs who agreed with the following statement:
Evolution is a vital, well-supported, unifying principle of the biological sciences, and the scientific evidence is overwhelmingly in favor of the idea that all living things share a common ancestry. Although there are legitimate debates about the patterns and processes of evolution, there is no serious scientific doubt that evolution occurred or that natural selection is a major mechanism in its occurrence. It is scientifically inappropriate and pedagogically irresponsible for creationist pseudoscience, including but not limited to "intelligent design," to be introduced into the science curricula of our nation's public schools.
Besides finding twice as many scientists as the Discovery Institute, NCSE added a further twist: In order to support the pro-evolution statement, evolutionists not only had to hold a PhD in science. They also had to be named Steve (or Steven, Stephen, Esteban, Stephanie, or some derivative). "Steve" was chosen for two reasons: in honor of the late Stephen Jay Gould, and because people named Steve or some variation comprise roughly 1 percent of the American population. So the Steve list could be multiplied by 100 to get a ballpark figure of how many scientists think evolution is real. Steves armed with PhDs have continued to sign up since February 2003. Steve number 300 was Stephen Hawking. On September 5, 2008, the Steve-o-Meter reached 900. On February 13, 2009, Steven P. Darwin — professor of ecology and evolutionary biology at Tulane University in New Orleans, although no relation to the most famous evolutionist — became Steve number 1,000. As of early November 2009, the Steve-o-Meter sat above 1,100. As of January 28, 2021, it sat at 1,465.
So in comparing scientists with doubts about Darwin to scientists who accept evolution, the score isn't 100 to 200, or even 100 to 1,000. It's more like 100 to well over 140,000. Project those percentages onto an election, and evolution wins by a landslide. As Joel Achenbach remarks in a National Geographic article on topics such as evolution, climate change and vaccinations, "There aren't really two sides to all these issues."
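The ballpark behind that comparison is easy to reproduce. Here is a small sketch using the figures quoted above, namely the January 2021 Steve count and the assumption that Steves make up about 1 percent of the population:

```python
steves = 1_465              # Project Steve signatories as of January 28, 2021
steve_share = 0.01          # people named Steve (or a variant) are roughly 1% of Americans
dissenters = 100            # PhDs on the 2001 "dissent from Darwinism" list

implied_scientists = steves / steve_share
print(f"Implied pro-evolution scientists: about {implied_scientists:,.0f}")
print(f"Rough ratio: {implied_scientists / dissenters:.0f} to 1")
```

The extrapolation is deliberately crude (not every scientist named Steve has signed, and not every signer is American), but even as a rough estimate the ratio is overwhelming.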
That's not to say there is no debate about evolution. Clearly there is. But it's a cultural debate, a religious debate and — given recent religious influence in American government — a political debate. But it's not a scientific debate.
In 2004, William Dembski wrote in The Design Revolution:
Intelligent design is not an evangelical Christian thing, or a generally Christian thing or even a generically theistic thing. . . . Intelligent design is an emerging scientific research program. Design theorists attempt to demonstrate its merits fair and square in the scientific world — without appealing to religious authority.
But in 1999, the very same guy had written in Intelligent Design: The Bridge Between Science and Theology:
[A]ny view of the sciences that leaves Christ out of the picture must be seen as fundamentally deficient. . . . [T]he conceptual soundness of a scientific theory cannot be maintained apart from Christ.
Intelligent design (ID) argues that many features found in living organisms are so complex, they must have been planned by an intelligent designer. A cornerstone of the argument is "irreducible complexity": some multi-part features supposedly could not have evolved step by step, because removing any one part would collapse the whole system. Yet in a paper critiquing intelligent design, Elliott Sober observed that a horse with one, two or even three legs couldn't run very well, but horses didn't evolve one extra leg at a time. Appendages like four legs aren't controlled by four sets of genes, but by a single gene set that governs the development of appendages.
The leading think tank behind the intelligent design movement is the Discovery Institute in Seattle, Washington. In 2001, the institute's "Wedge Strategy" was leaked on the Internet. In part, it stated:
Design theory promises to reverse the stifling dominance of the materialist worldview, and to replace it with a science consonant with Christian and theistic convictions.
In 2005, intelligent design was put on trial in Dover, Pennsylvania, in Kitzmiller v. Dover Area School District, 400 F.Supp.2d 707 (M.D. Pa. 2005). In the trial, as before, ID supporters insisted that they were promoting science, not religion. But one of the more interesting pieces of evidence offered by the plaintiffs was the straightforward substitution of "design proponents" for "creationists" in a textbook recommended to introduce intelligent design. Philosophy professor and long-time ID tracker Barbara Forrest examined successive versions of the book, Of Pandas and People, and found one substitution that went horribly awry: "cdesign proponentsists." Evolutionists described this sloppy substitution as the missing link between creationism and ID. On December 20, 2005, the presiding judge, John E. Jones III (a lifelong church-going Republican appointed by George W. Bush, it is worth noting), reached this conclusion about intelligent design:
Teaching intelligent design in public school biology classes violates the Establishment Clause of the First Amendment to the Constitution of the United States (and Article I, Section 3 of the Pennsylvania State Constitution) because intelligent design is not science and "cannot uncouple itself from its creationist, and thus religious, antecedents."
Much of the anxiety about evolution stems from the fear that, in accepting the theory, one denies the existence of God. In fact, evolutionary theory makes no demands about religious faith (or lack of it). People who accept evolution are free to draw their own conclusions about a higher power in the universe.
Although some evolutionists such as Richard Dawkins are atheists, other evolutionists, such as Kenneth Miller (biology professor at Brown University, and author of Finding Darwin's God) and Francis Collins (head of the National Human Genome Research Institute) see no conflict between evolutionary theory and religious belief. Mainstream Protestantism, Roman Catholicism, branches of Judaism and Islam all accept evolution.
Michael Zimmerman, evolutionary biologist and dean of the College of Letters and Science at the University of Wisconsin at Oshkosh, recently initiated a letter stating:
We the undersigned, Christian clergy from many different traditions, believe that the timeless truths of the Bible and the discoveries of modern science may comfortably coexist. We believe that the theory of evolution is a foundational scientific truth, one that has stood up to rigorous scrutiny and upon which much of human knowledge and achievement rests. To reject this truth or to treat it as "one theory among others" is to deliberately embrace scientific ignorance and transmit such ignorance to our children. We believe that among God's good gifts are human minds capable of critical thought and that the failure to fully employ this gift is a rejection of the will of our Creator.
Scientists acknowledge that there are some questions science simply cannot answer. Whether Genesis as recorded in the King James Bible describes a literal truth about the physical universe is a testable hypothesis, and it turns out to be false. We know this from dating of rocks and fossils, for one thing. But whether or not there is a God is not a testable hypothesis. That's a matter of personal belief. As Stephen Jay Gould put it, science and religion are "nonoverlapping magisteria." While evolution doesn't require a supreme being, it doesn't preclude the existence or involvement of such a being, either.
In a word: no. Although there's virtually no debate among scientists about whether evolution happens, they continue to debate how it happens. The story gets more intriguing every day.
Niles Eldredge co-developed the theory of punctuated equilibria (rapid bursts of evolution separated by long periods of little or no change) with Stephen Jay Gould in the early 1970s. He has since articulated a more comprehensive "sloshing bucket" theory of evolution, arguing that most evolution occurs in regional turnovers of species. Small disturbances barely register: a migrating sandbar on the ocean floor will decimate the bottom-feeders in its way, but the same kinds of animals will repopulate the sea floor in the sandbar's wake. An asteroid hitting the Earth, on the other hand, is a complete game-changer. How hard the bucket of life gets kicked determines slosh size.
On a smaller — but perhaps more deeply felt — scale, some researchers have begun to rethink aspects of human childbirth. Anthropologists long subscribed to the obstetrical dilemma, a theory that evolution struck a compromise between women retaining the ability to walk and delivering big-brained babies. Such a compromise, with (literally) no wriggle room, would probably mean female pelvises are similar the world over. But a study published in 2018, measuring 348 skeletons from 24 regions of the globe, found considerable variation, with Native American birth canals diverging the most from those of sub-Saharan Africa. As with other aspects of human variation, populations farther away from Africa showed less within-population variability than those from Africa.
The more scientists examine life on Earth, the more they unravel complex webs of interaction. For example, leafcutter ants feed on a certain type of fungus, and scientists long thought that leafcutter ants kept their fungal gardens pest-free with careful weeding. Then a graduate student discovered that the gardens were under continual threat from a different kind of fungus, a parasitic fungus. He also discovered that the worker ants' abdomens were covered with bacteria that produce a substance commonly used in antibiotics. What had long looked like a simple partnership between two species (ants and the fungus they ate) was a more complicated relationship among four species (ants, their fungal food, an annoying fungal pest, and the antibiotic-producing bacteria).
Closer examination of genetics reveals more flexibility than once thought, too. This extra layer of flexibility in genetic material, in which factors such as lifestyle and stress nudge how genes are expressed without changing the underlying DNA sequence, is known as epigenetics.
The term was coined by developmental biologist Conrad Hal Waddington. He conducted a series of experiments on fruit flies, subjecting them to a burst of heat soon after pupa formation. A few of the adult flies that emerged from these experiments lacked wing support struts. Waddington bred these flies with each other, continuing to subject new pupae to heat bursts. After about 15 generations, the experiment began producing defective flies even without the heat shock.
Waddington's former student, geneticist Steve Jones, points to cave fish as a good example of how such epigenetic effects might work in nature:
Cave fish show how the system might work in nature. Surface fish put into water taken from underground (which has its own unique chemistry) find the experience stressful. In some individuals, developmental pathways are overwhelmed and they grow up with small eyes. By breeding from such animals, a small-eyed stock appears in a few generations. Perhaps that is how the eyeless state first arose. Rather than the earliest animals to venture into the depths having to wait for new mutations to adapt them to the dark, the stressful conditions of their new home released hidden variation that could be exploited at once.
Waddington's ideas didn't initially receive a warm welcome, but in the decades since he carried out his fruit fly experiments, new generations of researchers have found more evidence.
In August 2003, researchers at Duke University reported that the diet of a pregnant mouse mother could affect the functioning of her offspring's genes. By 2013, multiple studies had indicated that both exercise and diet could affect a process known as methylation, in which small clusters of carbon and hydrogen atoms, called methyl groups, attach either to DNA or to histones (the proteins around which DNA coils itself). DNA methylation appears to switch genes off, while histone methylation can either enhance or inhibit gene expression. In short, methylation may either help or hinder a gene's ability to respond to other things happening in the body.
In 2014, another paper described the results of an ingenious study in which 23 healthy men and women were instructed to ride exercise bikes for 45 minutes four times a week for three months. The trick was that the study participants pedaled with only one leg, leaving the other leg at rest. The researchers then performed muscle biopsies on both legs. The biopsies showed more than 5,000 muscle genome sites with new methylation patterns — but only in the exercised legs, not the slacking legs. At least some of the genes with changed methyl groups had already been linked to muscle inflammation, metabolism or insulin response.
Although methylation patterns don't change the DNA sequence itself, they can apparently be inherited. Intriguing evidence comes from Holland, where German blockades in the winter of 1944-45 left the local population nearly starving. Babies conceived during the famine, known as the "Hongerwinter," grew up with a greater susceptibility to obesity and heart disease. Remarkably, so did their children. Studies of Hongerwinter baby genomes indicated that methylation had occurred in genes connected to cholesterol and aging. Similar susceptibility to lifestyle diseases has arisen in babies conceived during other famines, such as the one that accompanied China's disastrous Great Leap Forward.
It may even be possible to uncover the effects of epigenetics in human ancestors. DNA degrades over time, but methylated DNA degrades differently from unmethylated DNA, and this finding allowed a team of international researchers to infer chemical tweaks to ancient genes in a 50,000-year-old Denisovan female and a somewhat older Neanderthal female. In April 2014, they announced that gene silencing might account for skeletal differences between modern humans and Neanderthals. Although the results were promising, scientists advised against drawing conclusions from such a small sample.
In recent decades, multiple studies have suggested something that might be even more amazing than epigenetics: de novo genes. In 2019, a feature in Nature explained:
[G]enomes contain much more than just genes: in fact, only a few percent of the human genome, for example, actually encode genes. Alongside are substantial stretches of DNA — often labelled "junk DNA" — that seem to lack any function. . . . It wasn't until the 21st century that scientists began to see hints that non-coding sections of DNA could lead to new functional codes for proteins.
Despite impressive breakthroughs in DNA analysis, scientists still struggle with DNA's fragility, especially in tropical regions, where it degrades quickly. Proteins, such as collagen in bones, are more abundant than DNA and persist better over time. By relying on protein sequencing, scientists might be able to peer further back into the history of life than is possible with DNA.
The relatively new field nicknamed evo-devo, which relates evolution to biological development, promises to change how we view the process of evolution. Researchers have found that changes in the timing of gene activation and deactivation during embryonic development can produce dramatic changes in body plans.
In January 2007, Nigel Goldenfeld and Carl Woese predicted that coming discoveries in how microbes swap genes — a process known as horizontal gene transfer, or HGT — would force a redefinition of organisms and "evolution itself." They wrote:
[M]icroorganisms have a remarkable ability to reconstruct their genomes in the face of dire environmental stresses, and ... in some cases, their collective interactions with viruses may be crucial to this. In such a situation, how valid is the very concept of an organism in isolation?
Recent studies have raised questions about how we define species. As part of the neo-Darwinian synthesis in the mid-20th century, Ernst Mayr identified reproductive isolation as a requirement for the formation of new species, but in late 2016, Science devoted a news feature to the role of hybrids in evolution, reporting:
[New] data belie the common idea that animal species can't hybridize or, if they do, will produce inferior or infertile offspring — think mules. Such reproductive isolation is part of the classic definition of a species. But many animals, it is now clear, violate that rule: Not only do they mate with related species, but hybrid descendants are fertile enough to contribute DNA back to a parental species — a process called introgression. . . . Biologists long ago accepted that microbes can swap DNA, and they are now coming to terms with rampant gene flow among more complex creatures. "A large percent of the genome is free to move around," notes Chris Jiggins, an evolutionary biologist at the University of Cambridge in the United Kingdom. This "really challenges our concept of what a species is."
Will textbooks be rewritten soon? Stay tuned.