I was shocked and utterly delighted to have won, especially as I am the first woman to have done so in the prize’s 28-year history. The book has received some great media attention, and a lovely new review, so hopefully people will buy it and learn more about the remarkable people I found living with dramatic change around the planet at this extraordinary time. I will try to post articles about the book and the prize here: today I talked about it with Professor Brian Cox on The One Show on BBC One, I spoke to the BBC’s Jonathan Amos, and I was on the Guardian podcast.
It has been a long and often gruelling research and writing project to get here, so I want to thank all the people who have supported the book and this blog over the past few years. You are much appreciated!
And the winner of the Royal Society’s Science Book of the Year Prize is announced later today. You can read the first chapter of all the great shortlisted books here, including mine.
“She said, we’ve got to hold on to what we’ve got …” As Bon Jovi’s “Livin’ on a Prayer” came on in the bar, I joined in enthusiastically: “It doesn’t make a difference if we’re naked or not!” My friend burst out laughing. “It doesn’t make a difference if we make it or not,” she corrected me.
My head spun. For the past 20 years, my understanding of this song had been wrong. I hadn’t even questioned it, simply assuming that my interpretation was correct. And now, learning the true lyrics, my perspective shifted and it all made more sense.
I thought of this incident when I read a headline describing the findings of a recent survey: “Antibiotic resistance widely misunderstood by the British public.” How could this be misunderstood, I wondered, when it’s such a simple concept?
Survey researchers found that most people, if they had heard of antibiotic resistance at all, thought that it was their body that became resistant to antibiotics, rather than the bacteria. One person interviewed during the research said: “The more you take, the more your body becomes resistant to it. They stop working.”
This was another “a-ha!” moment for me. After all, our bodies do build up tolerance to certain drugs, thereby lessening their effectiveness.
This simple misunderstanding helps explain why many people prescribed antibiotics fail to complete the course, believing that limiting their exposure will prevent their bodies from developing resistance. In fact, failure to complete a course of antibiotics is a major factor in the development of drug-resistant infections: it exposes germs to enough of the drug to promote resistance, but not enough to kill them.
It’s an easy mistake to make, but clinicians and other health professionals, like me, long remained unaware of the misconception, with potentially deadly consequences. Antibiotic resistance is perhaps the biggest health threat we face in the developed world. Imagine tuberculosis and pneumonia once again spreading through our cities. Imagine dying of appendicitis, or of a simple scratch on the leg that leads to untreatable cellulitis.
People in rich countries already die of drug-resistant infections, which they usually acquire in the hospital. It would be nothing short of catastrophic if the constantly evolving bacteria that regularly infect us became resistant to our precious small arsenal of antibiotics. A major factor in drug resistance is the irresponsible use of antibiotics in cattle, but incorrect and excessive use by humans is likewise a large part of the problem.
If people don’t understand how they are part of the problem, they can’t become part of the solution. The survey researchers suggest doctors talk about “drug-resistant infections” or “antibiotic resistant germs,” rather than “antibiotic resistance.” The simple phrase could open a window of perception.
This column first appeared at The American Scholar.
The outrage that erupted this summer after a Minnesota dentist shot dead a lion in a Zimbabwe wildlife park shows how passionately people feel about the conservation of endangered species. Cecil the Lion briefly became more famous than Aslan, with his own Twitter hashtag and thousands of angry supporters on social media, some of whom even threatened violence against the dentist, forcing him to temporarily close his practice.
But why do we care? After all, as animal rights activists have pointed out, the average American eats about 30 land animals a year, most of which led far worse lives than Cecil before being slaughtered. Cows, pigs, and chickens are far from endangered species, however.
Lion populations have declined by half in the past 20 years. Although trophy hunting has contributed to this (scores of lions are killed this way every year in Zimbabwe alone), the main threats they face en route to extinction are other human impacts, such as habitat loss, conflict with farmers, and disease.
By contrast, the main threat that elephants face is hunting for their lucrative ivory tusks. Elephants are being poached at an alarming rate—more than 100,000 were killed between 2010 and 2012, and indeed, a family of five elephants was killed in the same park and in the same month as Cecil, although their deaths received little press coverage.
It’s clear that the rate at which we are exterminating species is unprecedented in human history and rare in the history of life on Earth. Scientists warn that we are heading for the planet’s sixth mass extinction event, in which as many as 75 percent of all current species could be wiped out. Unlike the previous five, which were the result of natural events, such as a supervolcanic eruption, this would be the first mass extinction caused by a living species—us.
Does this matter? Well, we don’t exactly need lions or elephants. If they were to go extinct, it would have negligible impact on human lives. Much more worrying from a human survival perspective is, say, the decline in bee populations. If pollinators went extinct, our food supply would be catastrophically limited. Conservation projects for bees are therefore understandable.
But what’s the point of trying to save endangered species that are not useful to us and may even harm us, such as lions or polar bears? After all, doing so costs billions of dollars at a time when human beings around the world are suffering for want of financial aid.
The usual argument is that all animals provide humans with an ecosystem service. Animals and plants exist as part of a web of biological activity that supports our way of life. As top predators, for example, lions keep in check animals further down the food chain. Remove lions, and this chain is broken: herbivore populations explode, and the landscape changes to reflect this, leading perhaps to more fires, or to a raised incidence of lethal tick-borne diseases that spread to humans or cattle.
Ecosystems are vital for humans—they clean our air and water and provide our food, clothes, and medicines. And this conservation argument holds true for many species. In truth, however, conservation is important to us for an entirely different reason—a reason that better explains the passionate response to Cecil’s untimely end. Quite simply, we like certain species. Lions are beautiful, we don’t want them to go extinct in the wild, and we want the assurance that they will be protected from hunters in a wildlife park.
Irrational or not, most of us don’t want to see one shot for the purpose of decorating a dentist’s living room.
This column first appeared at The American Scholar.
Nature is free for everyone to enjoy. So you’d think, but in truth there is a growing divide over who has access to the natural world. A British study found that families need to earn an annual income of around £45,000 ($51,000)—considerably more than the national average—before they feel they have access to areas of natural beauty.
The world is becoming increasingly urban. In 1800, just three percent of the world’s population lived in cities. Today more than half of us do, and by 2050, 75 percent of us will. Wealthier people tend to live in greener areas with gardens, nicely tended parks, and wide, tree-lined boulevards. They have the money and time to visit the countryside and to travel to areas with wildlife and natural beauty overseas. By contrast, poorer parts of cities tend to be uglier, more cramped, and blighted by litter, tagging, pollution, and decay. Poor people are less likely to have the funds and leisure time to visit more attractive places and may feel that national parks and the like are “not for them”—not aimed at their social class or group.
On my travels around the world, I have found some of the poorest people living in some of the most outstandingly beautiful places. A tourist finds it easier to recognize the extraordinary in a resident’s ordinary, but these people were not blind to their landscapes, and many natural features are worshipped by locals.
But in humanity’s great urban migration, we are increasingly privatizing nature or making it less accessible in other ways. We are in danger of creating a world in which nature is just another plaything to be enjoyed by the privileged few.
When I was a child, school trips were organized to the seaside, the countryside, and to museums—the only opportunities many poor children had to visit these places. Children played outside on school grounds or in vacant lots and parks. Now school trips are less frequent, after-school play is limited by parents who fear for their children’s safety, and vacant lots have all been built up as property values have risen. Children now gather in shopping malls, where even the vegetation is plastic.
Democratizing nature needn’t be difficult or expensive. Maintaining clean and safe parks for people to enjoy has multiple social benefits. A positive local experience in parks and gardens, for example, may encourage people to appreciate and learn more about the greater natural world.
The alternative is bleak. If people do not think nature is for them, they are unlikely to value it. And if they do not value their small, local, sanitized version, what hope remains for the rest of this wild world?
This column first appeared at The American Scholar.
A journalist called me the other day to ask if I would “speak about the unspeakable”: isn’t it true, she said, that if our human population weren’t so large, we wouldn’t have so many environmental problems?
More than seven billion people already inhabit the planet, and by 2050, there may be as many as 10 billion. It took us 50,000 years to reach the first billion, but barely more than a decade to add the most recent billion. The negative effects of so many people competing for Earth’s limited resources are everywhere to be seen.
So what do you propose, I teased, slaughtering half the world’s population? Of course not, she said. She described a charity that is trying to create sustainable villages in Madagascar by promoting family planning to reduce the villagers’ environmental footprint—shouldn’t that be practiced everywhere?
It already is. Population growth is slowing, with many countries now in negative growth. The rate peaked around 1968 (the year Paul Ehrlich’s The Population Bomb was published) and has since declined by about 50 percent. The average woman in a developing country (outside of China) now has three children rather than six. Globally, that number stands at 2.36, roughly equal to the “replacement rate” (2.33), which, accounting for child and maternal deaths, is the number of children a woman needs to have, on average, to maintain the current population.
Some countries have actively promoted contraceptive use, later motherhood, and female education to reduce family size. China, the most populous nation, introduced a controversial one-child policy in 1978, which has prevented hundreds of millions of potential births. However, social engineering has turned out to be less effective than economic growth in reducing family size: over the same period, Taiwan, in moving from “developing” to “developed” status, has seen a slightly larger reduction in fertility than China.
As people become richer, better educated, and more urban, and as resources such as family land become scarcer, women will continue to have fewer children. It may be that as fertility declines, the global population will fall. Such a shift is already happening in parts of the rich world, such as Japan. The social consequences are enormous: wealthy societies will increasingly have to rely on immigration to offset the imbalance between their older and younger generations.
Women still have large families in some places, and there, as elsewhere, they should have access to family planning as a fundamental human right. Smaller families may well bring environmental benefits, but promoting family planning programs on that basis alone makes me very uncomfortable.
Rather than focusing on population growth as the preeminent environmental problem, we need to accept our growing numbers and look to what we can acceptably change. And it’s no secret that it comes down to our use of resources. If product engineers were made to consider the 10-billion global population during the design phase, for example, they could create products that are more durable and more easily dismantled so that their materials can be efficiently recycled. Energy could be generated from nonpolluting sources. Instead of wasting 40 percent of our food, as we do now, we could farm, store, transport, and eat it more efficiently.
Until the next population-decimating pandemic sweeps the globe, we need to make our sheer numbers part of the environmental solution rather than the problem.
This column first appeared at The American Scholar.
Of all the changes humans have made to the planet, nothing is so startlingly obvious as our transformation of the landscape. We have razed forest and savannah to create monoculture farmland, flattened mountains, greened deserts, and built cities atop swathes of marshland.
Scientists looking for landscape changes in the distant past, such as those wrought during ice ages, when large temperate areas were covered with glaciers, can probe the rocks for fossils or seek answers in the shape of a valley or cliff. Core samples drilled into ice or sedimentary layers might likewise reveal a warmer past, populated by long-extinct species.
Previously, changes like these have always resulted from uncontrollable natural events, such as an asteroid strike or the eruption of supervolcanoes. But now humans are leaving equally profound marks on the world. Even in the geologically brief decades since the Industrial Revolution, and even since World War II, we have triggered transformations that will reveal themselves in chemical, physical, and biological signatures to geologists in the far future.
Human-made changes are exactly that: shaped by human factors such as population size and cultural practices. But perhaps the most important factor of all is politics, which determines whether a society is at war or peace, rural or urban, agrarian or industrializing, concentrated or spread out, and so on. Political decisions leave their traces in the rocks just as surely as the asteroid that wiped out the dinosaurs. Lake sediments in Romania, for example, reveal the intensive farming of Nicolae Ceaușescu’s communist regime, the Chernobyl accident in nearby Ukraine, and the agricultural changes of the early 1990s after Ceaușescu’s fall. And war continues to reshape our planet’s geological history, from the radioactive signature of atomic tests to craters left by explosives, as well as unexploded ordnance that will eventually fossilize within the rock strata.
Some of the most fundamental changes to the global landscape—melting glaciers, disappearing rainforests, drying continents—are the result of political decisions made in democratic countries. If we want to address some of these problematic changes, politics would be a good place to start. President Obama’s pledge to cut greenhouse gas emissions by almost one-third of 2005 levels may help protect glaciers. But it will take more, much more, than just that. Brazil’s decision to water down its successful environmental law and Forest Code, for example, has led to a recent explosion in deforestation in the Amazon—and stands as an example of the work that remains to be done.
This column first appeared at The American Scholar.