
The Evolution of Everything: How Small Changes Transform Our World

Year written: 2019

Doubting Darwin still

Yet, despite this overwhelming evidence of emergence, the yearning for design still lures millions of people back into doubting Darwin. The American ‘intelligent design’ movement evolved directly from a fundamentalist drive to promote religion within schools, coupled with a devious ‘end run’ to circumvent the USA’s constitutional separation between Church and state. It has largely focused upon the argument from design in order to try to establish that the complex functional arrangements of biology cannot be explained except by God. As Judge John Jones of Pennsylvania wrote in his judgement in the pivotal case of Kitzmiller v. Dover Area School District in 2005, although proponents of intelligent design ‘occasionally suggest that the designer could be a space alien or a time-traveling cell biologist, no serious alternative to God as the designer has been proposed’. Tammy Kitzmiller was one of several Dover parents who objected to her child being taught ‘intelligent design’ on a par with Darwinism. The parents went to court, and got the school district’s policy overturned.

In the United States, fundamentalist Christians have challenged Darwinism in schools for more than 150 years. They pushed state legislatures into adopting laws that prohibited state schools from teaching evolution, a trend that culminated in the Scopes ‘monkey trial’ of 1925. The defendant, John Scopes, deliberately taught evolution illegally to bring attention to the state of Tennessee’s anti-evolution law. Prosecuted by William Jennings Bryan and defended by Clarence Darrow, Scopes was found guilty and fined a paltry $100, and even that was overturned on a technicality at appeal. There is a widespread legend that Bryan’s victory was pyrrhic, because it made him look ridiculous and Scopes’s punishment was light. But this is a comforting myth told by saltwater liberals living on the coasts. In the American heartland, Scopes’s conviction emboldened the critics of Darwin greatly. Far from being ridiculed into silence, the fundamentalists gained ground in the aftermath of the Scopes trial, and held that ground for decades within the educational system. Textbooks became very cautious about Darwinism.

It was not until 1968 that the United States Supreme Court struck down all laws preventing the teaching of evolution in schools. Fundamentalists then fell back on teaching ‘creation science’, a concoction of arguments that purported to find scientific evidence for biblical events such as Noah’s flood. In 1987 the Supreme Court effectively proscribed the teaching of creationism on the grounds that it was religion, not science.

It was then that the movement reinvented itself as ‘intelligent design’, focusing on the old Aquinas–Paley argument from design in its simplest form. Creationists promptly rewrote their textbook Of Pandas and People, using for intelligent design a definition identical to the one that had been used for creation science, and systematically replacing the words ‘creationism’ and ‘creationist’ with ‘intelligent design’ in some 150 places. The substitution went wrong in one case, producing a strange spelling mistake, ‘cdesign proponentsists’, which came to be called the ‘missing link’ between the two movements. This ‘astonishing’ similarity between the two schools of thought was crucial in persuading Judge John Jones to deem intelligent design religious rather than scientific when he struck down the Dover school district’s policy demanding equal time for intelligent design and evolution in 2005. Intelligent design, according to Of Pandas and People, held that species came into existence abruptly, through an intelligent agency, with their distinctive features already present: fish with fins and scales, birds with feathers.

Jones’s long Opinion in 2005 was a definitive and conclusive demolition of a skyhook, all the more persuasive since it came from a Christian, Bush-appointed, politically conservative judge with no scientific training. Jones pointed out that the scientific revolution had rejected unnatural causes to explain natural phenomena, rejected appeal to authority, and rejected revelation, in favour of empirical evidence. He systematically took apart the evidence presented by Professor Michael Behe, the main scientific champion of intelligent design testifying for the defendants. Behe, in his book Darwin’s Black Box and subsequent papers, had used two main arguments for the existence of an intelligent designer: irreducible complexity and the purposeful arrangement of parts. The flagellum of a bacterium, he argued, is driven by a molecular rotary motor of great complexity. Remove any part of that system and it will not work. The blood-clotting system of mammals likewise consists of a cascade of molecular steps, none of which makes sense without the others. And the immune system, he claimed, was so complex that a natural explanation of it was impossible.

It was trivial work for evolution’s champions, such as Kenneth Miller, to dismantle these cases in the Dover trial to the satisfaction of the judge. A fully functional precursor of the bacterial flagellum with a different job, known as the Type III secretory system, exists in some organisms and could easily have been adapted to make a rotary motor while still partly retaining its original advantageous role. (In the same way, the middle-ear bones of mammals, now used for hearing, are direct descendants of bones that were once part of the jaw of early fish.) The blood-clotting cascade is missing one step in whales and dolphins, or three steps in puffer fish, and still works fine. And the immune system’s mysterious complexity is yielding bit by bit to naturalistic explanations; what’s left no more implicates an intelligent designer, or a time-travelling genetic engineer, than it does natural selection. At the trial Professor Behe was presented with fifty-eight peer-reviewed papers and nine books about the evolution of the immune system.

As for the purposeful arrangement of parts, Judge Jones did not mince his words: ‘This inference to design based upon the appearance of a “purposeful arrangement of parts” is a completely subjective proposition, determined in the eye of each beholder and his/her viewpoint concerning the complexity of a system.’ Which is really the last word on Newton, Paley, Behe, and for that matter Aquinas.

More than 2,000 years ago Epicureans like Lucretius seem to have cottoned on to the power of natural selection, an idea that they probably got from the flamboyant Sicilian philosopher Empedocles (whose verse style was also a model for Lucretius), born in around 490 BC. Empedocles talked of animals that survived ‘being organised spontaneously in a fitting way; whereas those which grew otherwise perished and continue to perish’. It was, had Empedocles only known it, probably the best idea he ever had, though he never seems to have followed it through. Darwin was rediscovering an idea.

Gould’s swerve

Why was it even necessary, nearly 150 years after Darwin set out his theory, for Judge Jones to make the case again? This remarkable persistence of resistance to the idea of evolution, packaged and repackaged as natural theology, then creation science, then intelligent design, has never been satisfactorily explained. Biblical literalism cannot fully explain why people so dislike the idea of spontaneous biological complexity. After all, Muslims have no truck with the idea that the earth is 6,000 years old, yet they too find the argument from design persuasive. Probably fewer than 20 per cent of people in most Muslim-majority countries accept Darwinian evolution to be true. Adnan Oktar, for example, a polemical Turkish creationist who also uses the name Harun Yahya, employs the argument from design to ‘prove’ that Allah created living things. Defining design as ‘a harmonious assembling of various parts into an orderly form towards a common goal’, he then argues that birds show evidence of design, their hollowed bones, strong muscles and feathers making it ‘obvious that the bird is product of a certain design’. Such a fit between form and function, however, is very much part of the Darwinian argument too.

Secular people, too, often jib at accepting the idea that complex organs and bodies can emerge without a plan. In the late 1970s a debate within Darwinism, between a predominantly American school led by the fossil expert Stephen Jay Gould and a predominantly British one led by the behaviour expert Richard Dawkins, about the pervasiveness of adaptation, led to some bitter and high-profile exchanges. Dawkins thought that almost every feature of a modern organism had probably been subject to selection for a function, whereas Gould thought that lots of change happened for accidental reasons. By the end, Gould seemed to have persuaded many lay people that Darwinism had gone too far; that it was claiming a fit between form and function too often and too glibly; that the idea of the organism adapting to its environment through natural selection had been refuted or at least diminished. In the media, this fed what John Maynard Smith called ‘a strong wish to believe that the Darwinian theory is false’, and culminated in an editorial in the Guardian announcing the death of Darwinism.

Within evolutionary biology, however, Gould lost the argument. Asking what an organ had evolved to do continued to be the main means by which biologists interpreted anatomy, biochemistry and behaviour. Dinosaurs may have been large ‘to’ achieve stable temperatures and escape predation, while nightingales may sing ‘to’ attract females.

This is not the place to retell the story of that debate, which had many twists and turns, from the spandrels of the Cathedral of San Marco in Venice to the partial resemblance of a caterpillar to a bird’s turd. My purpose here is different – to discern the motivation of Gould’s attack on adaptationism and its extraordinary popularity outside science. It was Gould’s Lucretian swerve. Daniel Dennett, Darwin’s foremost philosopher, thought Gould was ‘following in a long tradition of eminent thinkers who have been seeking skyhooks – and coming up with cranes’, and saw his antipathy to ‘Darwin’s dangerous idea as fundamentally a desire to protect or restore the Mind-first, top–down vision of John Locke’.

Whether this interpretation is fair or not, the problem Darwin and his followers have is that the world is replete with examples of apparent design, from watches to governments. Some of them even involve deliberate design: the many different breeds of pigeons that Darwin so admired, from tumblers to fantails, were all produced by ‘mind-first’ selective breeding, just like natural selection but at least semi-deliberate and intentional. Darwin’s reliance on pigeon breeding to tell the tale of natural selection was fraught with danger – for his analogy was indeed a form of intelligent design.

Wallace’s swerve

Again and again, Darwin’s followers would go only so far, before swerving. Alfred Russel Wallace, for instance, co-discovered natural selection and was in many ways an even more radical enthusiast for Darwinism (a word he coined) than Darwin himself. Wallace was not afraid to include human beings within natural selection very early on; and he was almost alone in defending natural selection as the main mechanism of evolution in the 1880s, when it was sharply out of fashion. But then he executed a Lucretian swerve. Saying that ‘the Brain of the Savage [had been] shown to be Larger than he Needs it to be’ for survival, he concluded that ‘a superior intelligence has guided the development of man in a definite direction, and for a special purpose’. To which Darwin replied, chidingly, in a letter: ‘I hope you have not murdered too completely your own and my child.’

Later, in a book published in 1889 that resolutely champions Darwinism (which was also its title), Wallace ends by executing a sudden U-turn, just like Hume and so many others. Having demolished skyhook after skyhook, he suddenly erects three at the close. The origin of life, he declares, is impossible to explain without a mysterious force. It is ‘altogether preposterous’ to argue that consciousness in animals could be an emergent consequence of complexity. And mankind’s ‘most characteristic and noble faculties could not possibly have been developed by means of the same laws which have determined the progressive development of the organic world in general’. Wallace, who was by now a fervent spiritualist, demanded three skyhooks to explain life, consciousness and human mental achievements. These three stages of progress pointed, he said, to an unseen universe, ‘a world of spirit, to which the world of matter is altogether subordinate’.

The lure of Lamarck

The repeated revival of Lamarckian ideas to this day likewise speaks of a yearning to reintroduce mind-first intentionality into Darwinism. Jean-Baptiste de Lamarck suggested long before Darwin that creatures might inherit acquired characteristics – so a blacksmith’s son would inherit his father’s powerful forearms even though these were acquired by exercise, not inheritance. Yet people obviously do not inherit mutilations from their parents, such as amputated limbs, so for Lamarck to be right there would have to be some kind of intelligence inside the body deciding what was worth passing on and what was not. But you can see the appeal of such a scheme to those left disoriented by the departure of God the designer from the Darwinised scene. Towards the end of his life, even Darwin flirted with some tenets of Lamarckism as he struggled to understand heredity.

At the end of the nineteenth century, the German biologist August Weismann pointed out a huge problem with Lamarckism: the separation of germ-line cells (the ones that end up being eggs or sperm) from other body cells early in the life of an animal makes it virtually impossible for information to feed back from what happens to a body during its life into its very recipe. Since the germ cells were not an organism in microcosm, the message telling them to adopt an acquired character must, Weismann argued, be of an entirely different nature from the change itself. Changing a cake after it has been baked cannot alter the recipe that was used.

The Lamarckians did not give up, though. In the 1920s a herpetologist named Paul Kammerer in Vienna claimed to have changed the biology of midwife toads by changing their environment. The evidence was flaky at best, and wishfully interpreted. When accused of fraud, Kammerer killed himself. A posthumous attempt by the writer Arthur Koestler to make Kammerer into a martyr to the truth only reinforced the desperation so many non-scientists felt to rescue a top–down explanation of evolution.

It is still going on. Epigenetics is a respectable branch of genetic science that examines how chemical modifications to DNA, acquired early in life in response to experience, can affect the adult body. There is a much more speculative version of the story, though. Most of these modifications are swept clean when the sperm and egg cells are made, but perhaps a few just might survive the jump into a new generation. Certain genetic disorders, for example, seem to manifest themselves differently according to whether the mutant chromosome was inherited from the mother or the father – implying a sex-specific ‘imprint’ on the gene. And one study seemed to find a sex-specific effect on the mortality of Swedes according to how hungry their grandparents were when young. From a small number of such cases, none with very powerful results, some modern Lamarckians began to make extravagant claims for the vindication of the eighteenth-century French aristocrat. ‘Darwinian evolution can include Lamarckian processes,’ wrote Eva Jablonka and Marion Lamb in 2005, ‘because the heritable variation on which selection acts is not entirely blind to function; some of it is induced or “acquired” in response to the conditions of life.’

But the evidence for these claims remains weak. All the data suggest that the epigenetic state of DNA is reset in each generation, and that even if this fails to happen, the amount of information imparted by epigenetic modifications is a minuscule fraction of the information imparted by genetic information. Besides, ingenious experiments with mice show that all the information required to reset the epigenetic modifications themselves actually lies in the genetic sequence. So the epigenetic mechanisms must themselves have evolved by good old Darwinian random mutation and selection. In effect, there is no escape to intentionality to be found here. Yet the motive behind the longing to believe in epigenetic Lamarckism is clear. As David Haig of Harvard puts it, ‘Jablonka and Lamb’s frustration with neo-Darwinism is with the pre-eminence that is ascribed to undirected, random sources of heritable variation.’ He says he is ‘yet to hear a coherent explanation of how the inheritance of acquired characters can, by itself, be a source of intentionality’. In other words, even if you could prove some Lamarckism in epigenetics, it would not remove the randomness.

Culture-driven genetic evolution

In fact, there is a way for acquired characteristics to come to be incorporated into genetic inheritance, but it takes many generations and it is blindly Darwinian. It goes by the name of the Baldwin effect. A species that over many generations repeatedly exposes itself to some experience will eventually find its offspring selected for a genetic predisposition to cope with that experience. Why? Because the offspring that by chance happen to start with a predisposition to cope with that circumstance will survive better than others. The genes can thereby come to embody the experience of the past. Something that was once learned can become an instinct.
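The logic of the Baldwin effect can be made concrete with a toy simulation in the spirit of Hinton and Nowlan’s classic 1987 model – an illustration of my own, not something from the book, and the parameters are arbitrary. Each genome mixes innately correct alleles (‘1’), innately wrong ones (‘0’), and plastic, learnable slots (‘?’); individuals that can learn their way to the target reproduce more, and innately wrong alleles are purged from the population even though nothing an individual learns is ever written back into its genes:

```python
import random

random.seed(42)

GENOME_LEN = 10   # features of the recurring experience to be coped with
POP_SIZE = 200
TRIALS = 100      # learning attempts per lifetime
GENERATIONS = 30

# Alleles: '1' = innately correct, '0' = innately wrong, '?' = plastic (learnable)
def random_genome():
    return [random.choice(['1', '0', '?', '?']) for _ in range(GENOME_LEN)]

def fitness(genome):
    """An individual with any '0' can never cope; otherwise it tries to fill
    in its '?' slots by random trial, earning more fitness the sooner it
    succeeds. Learning changes behaviour, never the genome itself."""
    if '0' in genome:
        return 1.0
    plastic = genome.count('?')
    for trial in range(1, TRIALS + 1):
        if all(random.random() < 0.5 for _ in range(plastic)):
            return 1.0 + 19.0 * (TRIALS - trial) / TRIALS
    return 1.0

def next_generation(pop):
    """Fitness-weighted parent choice plus one-point crossover."""
    weights = [fitness(g) for g in pop]
    new = []
    for _ in range(len(pop)):
        a, b = random.choices(pop, weights=weights, k=2)
        cut = random.randrange(GENOME_LEN)
        new.append(a[:cut] + b[cut:])
    return new

def zero_fraction(pop):
    return sum(g.count('0') for g in pop) / (POP_SIZE * GENOME_LEN)

pop = [random_genome() for _ in range(POP_SIZE)]
start = zero_fraction(pop)
for _ in range(GENERATIONS):
    pop = next_generation(pop)
end = zero_fraction(pop)
print(f"'0' allele frequency: {start:.2f} -> {end:.2f}")
```

Run long enough, the plastic ‘?’ slots too give way to fixed ‘1’s: what was once learned each lifetime becomes innate, which is the Baldwin effect in miniature – all selection on chance variation, no message passed from body to germ line.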

A similar though not identical phenomenon is illustrated by the ability to digest lactose sugar in milk, which many people with ancestors from western Europe and eastern Africa possess. Few adult mammals can digest lactose, since milk is not generally drunk after infancy. In two parts of the world, however, human beings evolved the capacity to retain lactose digestion into adulthood by not switching off genes for lactase enzymes. These happened to be the two regions where the domestication of cattle for milk production was first invented. What a happy coincidence! Because people could digest lactose, they were able to invent dairy farming? Well no, the genetic switch plainly happened as a consequence, not a cause, of the invention of dairy farming. But it still had to happen through random mutation followed by non-random survival. Those born by chance with the mutation that caused persistence of lactose digestion tended to be stronger and healthier than their siblings and rivals who could digest less of the goodness in milk. So they thrived, and the gene for lactose digestion spread rapidly. On closer inspection, this incorporation of ancestral experience into the genes is all crane and no skyhook.
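The arithmetic of ‘random mutation followed by non-random survival’ is easy to check with a minimal haploid selection model – the numbers here (a single mutant among a thousand herders, a 5 per cent advantage) are purely illustrative, not estimates from the book:

```python
def allele_spread(p0, s, generations):
    """Frequency of an advantageous allele under simple haploid selection:
    carriers have relative fitness 1 + s, non-carriers 1."""
    history = [p0]
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        history.append(p)
    return history

# A lactase-persistence mutation arising in one of ~1,000 dairy farmers,
# conferring an illustrative 5% survival advantage:
traj = allele_spread(p0=0.001, s=0.05, generations=200)
print(f"after 200 generations: {traj[-1]:.3f}")  # prints: after 200 generations: 0.945
```

Even a modest advantage takes a one-in-a-thousand accident to near fixation in a couple of hundred generations – a few thousand years, comfortably inside the window since cattle were domesticated.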

So incredible is the complexity of the living world, so counterintuitive is the idea of boot-strapped, spontaneous intricacy, that even the most convinced Darwinian must, in the lonely hours of the night, have moments of doubt. Like Screwtape the devil whispering in the ear of a believer, the ‘argument from personal incredulity’ (as Richard Dawkins calls it) can be very tempting, even if you remind yourself that it’s a massive non sequitur to find divinity in ignorance.

4

The Evolution of Genes

For certainly the elements of things do not collect
And order their formations by their cunning intellect,
Nor are their motions something they agree upon or propose;
But being myriad and many-mingled, plagued by blows
And buffeted through the universe for all time past,
By trying every motion and combination, they at last
Fell into the present form in which the universe appears.

Lucretius, De Rerum Natura, Book 1, lines 1021–7

An especially seductive chunk of current ignorance is that concerning the origin of life. For all the confidence with which biologists trace the emergence of complex organs and organisms from simple proto-cells, the first emergence of those proto-cells is still shrouded in darkness. And where people are baffled, they are often tempted to resort to mystical explanations. When the molecular biologist Francis Crick, that most materialist of scientists, started speculating about ‘panspermia’ in the 1970s – the idea that life perhaps originated elsewhere in the universe and got here by microbial seeding – many feared that he was turning a little mystical. In fact he was just making an argument about probability: that it was highly likely, given the youth of the earth relative to the age of the universe, that some other planet got life before us and infected other solar systems. Still, he was emphasising the impenetrability of the problem.

Life consists of the capacity to reverse the drift towards entropy and disorder, at least locally – to use information to make local order from chaos while expending energy. Essential to these three skills are three kinds of molecule in particular – DNA for storing information, protein for making order, and ATP as the medium of energy exchange. How these came together is a chicken-and-egg problem. DNA cannot be made without proteins, nor proteins without DNA. As for energy, a bacterium uses up fifty times its own body weight in ATP molecules in each generation. Early life must have been even more profligate, yet would have had none of the modern molecular machinery for harnessing and storing energy. Wherever did it find enough ATP?

The crane that seems to have put these three in place was probably RNA, a molecule that still plays many key roles in the cell, and that can both store information like DNA, and catalyse reactions like proteins do. Moreover, RNA is made of units of base, phosphate and ribose sugar, just as ATP is. So the prevailing theory holds that there was once an ‘RNA World’, in which living things had RNA bodies with RNA genes, using RNA ingredients as an energy currency. The problem is that even this system is so fiendishly complex and interdependent that it’s hard to imagine it coming into being from scratch. How, for example, would it have avoided dissipation: kept together its ingredients and concentrated its energy without the boundaries provided by a cell membrane? In the ‘warm little pond’ that Charles Darwin envisaged for the beginning of life, life would have dissolved away all too easily.

Don’t give up. Until recently the origin of the RNA World seemed so difficult a problem that it gave hope to mystics; John Horgan wrote an article in Scientific American in 2011 entitled ‘Psst! Don’t Tell the Creationists, But Scientists Don’t Have a Clue How Life Began’.

Yet today, just a few years later, there’s the glimmer of a solution. DNA sequences show that at the very root of life’s family tree are simple cells that do not burn carbohydrates like the rest of us, but effectively charge their electrochemical batteries by converting carbon dioxide into methane or the organic compound acetate. If you want to find a chemical environment that echoes the one these chemiosmotic microbes have within their cells, look no further than the bottom of the Atlantic Ocean. In the year 2000, explorers found hydrothermal vents on the mid-Atlantic ridge that were quite unlike those they knew from other geothermal spots on the ocean floor. Instead of very hot, acidic fluids, as are found at ‘black-smoker’ vents, the new vents – known as the Lost City Hydrothermal Field – are only warm, are highly alkaline, and appear to last for tens of thousands of years. Two scientists, Nick Lane and William Martin, have begun to list the similarities between these vents and the insides of chemiosmotic cells, finding uncanny echoes of life’s method of storing energy. Basically, cells store energy by pumping electrically charged particles, usually sodium or hydrogen ions, across membranes, effectively creating an electrical voltage. This is a ubiquitous and peculiar feature of living creatures, but it appears it might have been borrowed from vents like those at Lost City.

Four billion years ago the ocean was acidic, saturated with carbon dioxide. Where the alkaline fluid from the vents met the acidic water, there was a steep proton gradient across the thin iron-nickel-sulphur walls of the pores that formed at the vents. That gradient had a voltage very similar in magnitude to the one in a modern cell. Inside those mineral pores, chemicals would have been trapped in a space with abundant energy, which could have been used to build more complex molecules. These in turn – as they began to accidentally replicate themselves using the energy from the proton gradients – became gradually more susceptible to a pattern of survival of the fittest. And the rest, as Daniel Dennett would say, is algorithm. In short, an emergent account of the origin of life is almost within reach.

All crane and no skyhook

As I mentioned earlier, the diagnostic feature of life is that it captures energy to create order. This is also a hallmark of civilisation. Just as each person uses energy to make buildings and devices and ideas, so each gene uses energy to make a structure of protein. A bacterium is limited in how large it can grow by the quantity of energy available to each gene. That’s because the energy is captured at the cell membrane by pumping protons across the membrane, and the bigger the cell, the smaller its surface area relative to its volume. The only bacteria that grow big enough to be seen by the naked eye are ones that have huge empty vacuoles inside them.
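The constraint here is pure geometry. For an idealised spherical cell (my illustration; the book gives no figures), membrane area grows with the square of the radius while the volume it must power grows with the cube, so the membrane available per unit of cytoplasm falls as 3/r:

```python
import math

def surface_to_volume(r):
    """Surface area / volume for a spherical cell of radius r:
    4*pi*r**2 / ((4/3)*pi*r**3), which simplifies to 3/r."""
    return (4 * math.pi * r**2) / ((4 / 3) * math.pi * r**3)

# Membrane area available per unit of cytoplasm shrinks as the cell grows:
for r in (1, 2, 10):
    print(f"radius {r:>2}: {surface_to_volume(r):.2f} units of membrane per unit of volume")
```

Double the radius and each unit of cell contents has only half the proton-pumping membrane to feed it, which is why energy, not raw materials, caps bacterial size.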

However, at some point around two billion years after life started, huge cells began to appear with complicated internal structures; we call them eukaryotes, and we (animals as well as plants, fungi and protozoa) are them.

Nick Lane argues that the eukaryotic (r)evolution was made possible by a merger: a bunch of bacteria began to live inside an archaeal cell (a different kind of microbe). Today the descendants of these bacteria are known as mitochondria, and they generate the energy we need to live. During every second of your life your thousand trillion mitochondria pump a billion trillion protons across their membranes, capturing the electrical energy needed to forge your proteins, DNA and other macromolecules.

Mitochondria still have their own genes, but only a small number – thirteen in us. This simplification of their genome was vital. It enabled them to generate far more surplus energy to support the work of ‘our genome’, which is what enables us to have complex cells, complex tissues and complex bodies. As a result we eukaryotes have tens of thousands of times more energy available per gene, making each of our genes capable of far greater productivity. That allows us to have larger cells as well as more complex structures. In effect, we overcame the size limit of the bacterial cell by hosting multiple internal membranes in mitochondria, and then simplifying the genomes needed to support those membranes.

There is an uncanny echo of this in the Industrial (R)evolution. In agrarian societies, a family could grow just enough food to feed itself, but there was little left over to support anybody else. So only very few people could have castles, or velvet coats, or suits of armour, or whatever else needed making with surplus energy. The harnessing of oxen, horses, wind and water helped generate a bit more surplus energy, but not much. Wood was no use – it provided heat, not work. So there was a permanent limit on how much a society could make in the way of capital – structures and things.