Somewhat related to the way "natural science" and "artificial science" are distinguished.
The Anthropocene
Science is recognising humans as a geological force to be reckoned with
May 26th 2011 | from the print edition
THE here and now are defined by astronomy and geology. Astronomy takes care of the here: a planet orbiting a yellow star embedded in one of the spiral arms of the Milky Way, a galaxy that is itself part of the Virgo supercluster, one of millions of similarly vast entities dotted through the sky. Geology deals with the now: the 10,000-year-old Holocene epoch, a peculiarly stable and clement part of the Quaternary period, a time distinguished by regular shifts into and out of ice ages. The Quaternary forms part of the 65m-year Cenozoic era, distinguished by the opening of the North Atlantic, the rise of the Himalayas, and the widespread presence of mammals and flowering plants. This era in turn marks the most recent part of the Phanerozoic aeon, the 540m-year chunk of the Earth’s history wherein rocks with fossils of complex organisms can be found. The regularity of celestial clockwork and the solid probity of rock give these co-ordinates a reassuring constancy.
Now there is a movement afoot to change humanity’s co-ordinates. In 2000 Paul Crutzen, an eminent atmospheric chemist, realised he no longer believed he was living in the Holocene. He was living in some other age, one shaped primarily by people. From their trawlers scraping the floors of the seas to their dams impounding sediment by the gigatonne, from their stripping of forests to their irrigation of farms, from their mile-deep mines to their melting of glaciers, humans were bringing about an age of planetary change. With a colleague, Eugene Stoermer, Dr Crutzen suggested this age be called the Anthropocene—“the recent age of man”.
The term has slowly picked up steam, both within the sciences (the International Commission on Stratigraphy, ultimate adjudicator of the geological time scale, is taking a formal interest) and beyond. This May statements on the environment by concerned Nobel laureates and the Pontifical Academy of Sciences both made prominent use of the term, capitalising on the way in which it dramatises the sheer scale of human activity.
The advent of the Anthropocene promises more, though, than a scientific nicety or a new way of grabbing the eco-jaded public’s attention. The term “paradigm shift” is bandied around with promiscuous ease. But for the natural sciences to make human activity central to their conception of the world, rather than a distraction, would mark such a shift for real. For centuries, science has progressed by making people peripheral. In the 16th century Nicolaus Copernicus moved the Earth from its privileged position at the centre of the universe. In the 18th James Hutton opened up depths of geological time that dwarf the narrow now. In the 19th Charles Darwin fitted humans onto a single twig of the evolving tree of life. As Simon Lewis, an ecologist at the University of Leeds, points out, embracing the Anthropocene as an idea means reversing this trend. It means treating humans not as insignificant observers of the natural world but as central to its workings, elemental in their force.
The most common way of distinguishing periods of geological time is by means of the fossils they contain. On this basis picking out the Anthropocene in the rocks of days to come will be pretty easy. Cities will make particularly distinctive fossils. A city on a fast-sinking river delta (and fast-sinking deltas, undermined by the pumping of groundwater and starved of sediment by dams upstream, are common Anthropocene environments) could spend millions of years buried and still, when eventually uncovered, reveal through its crushed structures and weird mixtures of materials that it is unlike anything else in the geological record.
The fossils of living creatures will be distinctive, too. Geologists define periods through assemblages of fossil life reliably found together. One of the characteristic markers of the Anthropocene will be the widespread remains of organisms that humans use, or that have adapted to life in a human-dominated world. According to studies by Erle Ellis, an ecologist at the University of Maryland, Baltimore County, the vast majority of ecosystems on the planet now reflect the presence of people. There are, for instance, more trees on farms than in wild forests. And these anthropogenic biomes are spread about the planet in a way that the ecological arrangements of the prehuman world were not. The fossil record of the Anthropocene will thus show a planetary ecosystem homogenised through domestication.
More sinisterly, there are the fossils that will not be found. Although it is not yet inevitable, scientists warn that if current trends of habitat loss continue, exacerbated by the effects of climate change, a dramatic wave of extinctions could follow before long.
All these things would show future geologists that humans had been present. But though they might be diagnostic of the time in which humans lived, they would not necessarily show that those humans shaped their time in the way that people pushing the idea of the Anthropocene want to argue. The strong claim of those announcing the recent dawning of the age of man is that humans are not just spreading over the planet, but are changing the way it works.
Such workings are the province of Earth-system science, which sees the planet not just as a set of places, or as the subject of a history, but also as a system of forces, flows and feedbacks that act upon each other. This system can behave in distinctive and counterintuitive ways, including sometimes flipping suddenly from one state to another. To an Earth-system scientist the difference between the Quaternary period (which includes the Holocene) and the Neogene, which came before it, is not just what was living where, or what the sea level was; it is that in the Neogene the climate stayed stable whereas in the Quaternary it swung in and out of a series of ice ages. The Earth worked differently in the two periods.
The clearest evidence for the system working differently in the Anthropocene comes from the recycling systems on which life depends for various crucial elements. In the past couple of centuries people have released quantities of fossil carbon that the planet took hundreds of millions of years to store away. This has given them a commanding role in the planet’s carbon cycle.
Although the natural fluxes of carbon dioxide into and out of the atmosphere are still more than ten times larger than the amount that humans put in every year by burning fossil fuels, the human addition matters disproportionately because it unbalances those natural flows. As Mr Micawber wisely pointed out, a small change in income can, in the absence of a compensating change in outlays, have a disastrous effect. The result of putting more carbon into the atmosphere than can be taken out of it is a warmer climate, a melting Arctic, higher sea levels, improvements in the photosynthetic efficiency of many plants, an intensification of the hydrologic cycle of evaporation and precipitation, and new ocean chemistry.
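Mr Micawber's arithmetic can be sketched directly. The toy calculation below uses illustrative round numbers (the flux sizes and the share of emissions absorbed by natural sinks are assumptions of roughly the right order, not figures from the article) to show how a one-way flow a tenth the size of the natural ones still piles up when the big flows roughly cancel each other and the small one does not.

```python
# A minimal sketch of the Micawber point: the natural carbon fluxes into and
# out of the atmosphere are large but roughly balanced, so even a comparatively
# small, one-way human addition accumulates year after year.
# All figures below are illustrative assumptions, not values from the article.

NATURAL_OUT = 120.0      # GtC/yr drawn out of the atmosphere by land and ocean (assumed)
NATURAL_IN = 120.0       # GtC/yr released back by respiration and outgassing (assumed)
HUMAN_IN = 10.0          # GtC/yr from fossil fuels and land use (assumed)
ABSORBED_FRACTION = 0.5  # rough share of the human addition soaked up by sinks (assumed)

atmosphere = 0.0  # cumulative change in atmospheric carbon, in GtC
for year in range(100):
    net_natural = NATURAL_IN - NATURAL_OUT           # roughly zero: the big flows cancel
    net_human = HUMAN_IN * (1 - ABSORBED_FRACTION)   # the small flow does not
    atmosphere += net_natural + net_human

print(f"Extra carbon in the atmosphere after a century: ~{atmosphere:.0f} GtC")
# Around 500 GtC on these assumptions, a large fraction of what the
# pre-industrial atmosphere held, even though the human flux is only about a
# tenth the size of the natural ones.
```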
All of these have knock-on effects both on people and on the processes of the planet. More rain means more weathering of mountains. More efficient photosynthesis means less evaporation from croplands. And the changes in ocean chemistry are the sort of thing that can be expected to have a direct effect on the geological record if carbon levels rise far enough.
At a recent meeting of the Geological Society of London that was devoted to thinking about the Anthropocene and its geological record, Toby Tyrrell of the University of Southampton pointed out that pale carbonate sediments—limestones, chalks and the like—cannot be laid down below what is called a “carbonate compensation depth”. And changes in chemistry brought about by the fossil-fuel carbon now accumulating in the ocean will raise the carbonate compensation depth, rather as a warmer atmosphere raises the snowline on mountains. Some ocean floors which are shallow enough for carbonates to precipitate out as sediment in current conditions will be out of the game when the compensation depth has risen, like ski resorts too low on a warming alp. New carbonates will no longer be laid down. Old ones will dissolve. This change in patterns of deep-ocean sedimentation will result in a curious, dark band of carbonate-free rock—rather like that which is seen in sediments from the Palaeocene-Eocene thermal maximum, an episode of severe greenhouse warming brought on by the release of pent-up carbon 56m years ago.
No Dickensian insights are necessary to appreciate the scale of human intervention in the nitrogen cycle. One crucial part of this cycle—the fixing of pure nitrogen from the atmosphere into useful nitrogen-containing chemicals—depends more or less entirely on living things (lightning helps a bit). And the living things doing most of that work are now people (see chart). By adding industrial clout to the efforts of the microbes that used to do the job single-handed, humans have increased the annual amount of nitrogen fixed on land by more than 150%. Some of this is accidental. Burning fossil fuels tends to oxidise nitrogen at the same time. The majority is done on purpose, mostly to make fertilisers. This has a variety of unwholesome consequences, most importantly the increasing number of coastal “dead zones” caused by algal blooms feeding on fertiliser-rich run-off waters.
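The "more than 150%" figure is easier to picture as a sum. In the sketch below the natural baseline is an assumed round number of roughly the right order; only the size of the increase comes from the text.

```python
# Illustrative arithmetic for the nitrogen paragraph above.
# The baseline is an assumed placeholder; the ">150%" increase is from the text.

natural_land_fixation = 100.0                 # TgN/yr fixed by microbes and lightning (assumed)
human_increase = 1.5 * natural_land_fixation  # "more than 150%" of the natural rate
total = natural_land_fixation + human_increase

print(f"Total nitrogen fixed on land: >{total:.0f} TgN/yr "
      f"(>{human_increase / total:.0%} of it now driven by people)")
# On these assumptions, more than half of all the nitrogen newly fixed on land
# each year traces back to fertiliser factories and fuel burning rather than
# to microbes alone.
```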
Industrial nitrogen’s greatest environmental impact, though, is to increase the number of people. Although nitrogen fixation is not just a gift of life—it has been estimated that 100m people were killed by explosives made with industrially fixed nitrogen in the 20th century’s wars—its net effect has been to allow a huge growth in population. About 40% of the nitrogen in the protein that humans eat today got into that food by way of artificial fertiliser. There would be nowhere near as many people doing all sorts of other things to the planet if humans had not sped the nitrogen cycle up.
It is also worth noting that unlike many of humanity’s other effects on the planet, the remaking of the nitrogen cycle was deliberate. In the late 19th century scientists diagnosed a shortage of nitrogen as a planet-wide problem. Knowing that natural processes would not improve the supply, they invented an artificial one, the Haber process, that could make up the difference. It was, says Mark Sutton of the Centre for Ecology and Hydrology in Edinburgh, the first serious human attempt at geoengineering the planet to bring about a desired goal. The scale of its success outstripped the imaginings of its instigators. So did the scale of its unintended consequences.
For many of those promoting the idea of the Anthropocene, further geoengineering may now be in order, this time on the carbon front. Left to themselves, carbon-dioxide levels in the atmosphere are expected to remain high for 1,000 years—more, if emissions continue to go up through this century. It is increasingly common to hear climate scientists arguing that this means things should not be left to themselves—that the goal of the 21st century should be not just to stop the amount of carbon in the atmosphere increasing, but to start actively decreasing it. This might be done in part by growing forests (see article) and enriching soils, but it might also need more high-tech interventions, such as burning newly grown plant matter in power stations and pumping the resulting carbon dioxide into aquifers below the surface, or scrubbing the air with newly contrived chemical-engineering plants, or intervening in ocean chemistry in ways that would increase the sea’s appetite for the air’s carbon.
To think of deliberately interfering in the Earth system will undoubtedly be alarming to some. But so will an Anthropocene deprived of such deliberation. A way to try and split the difference has been propounded by a group of Earth-system scientists inspired by (and including) Dr Crutzen under the banner of “planetary boundaries”. The planetary-boundaries group, which published a sort of manifesto in 2009, argues for increased restraint and, where necessary, direct intervention aimed at bringing all sorts of things in the Earth system, from the alkalinity of the oceans to the rate of phosphate run-off from the land, close to the conditions pertaining in the Holocene. Carbon-dioxide levels, the researchers recommend, should be brought back from whatever they peak at to a level a little higher than the Holocene’s and a little lower than today’s.
The idea behind this precautionary approach is not simply that things were good the way they were. It is that the further the Earth system gets from the stable conditions of the Holocene, the more likely it is to slip into a whole new state and change itself yet further.
The Earth’s history shows that the planet can indeed tip from one state to another, amplifying the sometimes modest changes which trigger the transition. The nightmare would be a flip to some permanently altered state much further from the Holocene than things are today: a hotter world with much less productive oceans, for example. Such things cannot be ruled out. On the other hand, the invocation of poorly defined tipping points is a well worn rhetorical trick for stirring the fears of people unperturbed by current, relatively modest, changes.
In general, the goal of staying at or returning close to Holocene conditions seems judicious. It remains to be seen if it is practical. The Holocene never supported a civilisation of 10 billion reasonably rich people, as the Anthropocene must seek to do, and there is no proof that such a population can fit into a planetary pot so circumscribed. So it may be that a “good Anthropocene”, stable and productive for humans and other species they rely on, is one in which some aspects of the Earth system’s behaviour are lastingly changed. For example, the Holocene would, without human intervention, have eventually come to an end in a new ice age. Keeping the Anthropocene free of ice ages will probably strike most people as a good idea.
That is an extreme example, though. No new ice age is due for some millennia to come. Nevertheless, to see the Anthropocene as a blip that can be minimised, and from which the planet, and its people, can simply revert to the status quo, may be to underestimate the sheer scale of what is going on.
Take energy. At the moment the amount of energy people use is part of what makes the Anthropocene problematic, because of the carbon dioxide given off. That problem will not be solved soon enough to avert significant climate change unless the Earth system is a lot less prone to climate change than most scientists think. But that does not mean it will not be solved at all. And some of the zero-carbon energy systems that solve it—continent-scale electric grids distributing solar energy collected in deserts, perhaps, or advanced nuclear power of some sort—could, in time, be scaled up to provide much more energy than today’s power systems do. As much as 100 clean terawatts, compared to today’s dirty 15TW, is not inconceivable for the 22nd century. That would mean humanity was producing roughly as much useful energy as all the world’s photosynthesis combined.
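That closing comparison can be sanity-checked with rough figures. In the sketch below both inputs (a global net-primary-production figure and the energy stored per gram of carbon fixed) are assumptions of about the right size rather than numbers from the article.

```python
# Rough check of the claim that ~100 TW of clean energy would rival global
# photosynthesis. Both input figures are assumptions of roughly the right size.

NPP_GTC_PER_YEAR = 100.0   # global net primary production, GtC/yr (assumed)
ENERGY_PER_GRAM_C = 39e3   # J stored per gram of carbon fixed as sugar (approximate)
SECONDS_PER_YEAR = 3.15e7

joules_per_year = NPP_GTC_PER_YEAR * 1e15 * ENERGY_PER_GRAM_C  # 1 GtC = 1e15 g
power_tw = joules_per_year / SECONDS_PER_YEAR / 1e12

print(f"Photosynthesis stores very roughly {power_tw:.0f} TW")
# A little over 100 TW on these assumptions: the same order as the 100 TW
# imagined for a 22nd-century energy system, and several times today's ~15 TW.
```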
In a fascinating recent book, “Revolutions that Made the Earth”, Timothy Lenton and Andrew Watson, Earth-system scientists at the universities of Exeter and East Anglia respectively, argue that large changes in the amount of energy available to the biosphere have, in the past, always marked large transitions in the way the world works. They have a particular interest in the jumps in the level of atmospheric oxygen seen about 2.4 billion years ago and 600m years ago. Because oxygen is a particularly good way of getting energy out of organic matter (if it weren’t, there would be no point in breathing) these shifts increased sharply the amount of energy available to the Earth’s living things. That may well be why both of those jumps seem to be associated with subsequent evolutionary leaps—the advent of complex cells, in the first place, and of large animals, in the second. Though the details of those links are hazy, there is no doubt that in their aftermath the rules by which the Earth system operated had changed.
The growing availability of solar or nuclear energy over the coming centuries could mark the greatest new energy resource since the second of those planetary oxidations, 600m years ago—a change in the same class as the greatest the Earth system has ever seen. Dr Lenton (who is also one of the creators of the planetary-boundaries concept) and Dr Watson suggest that energy might be used to change the hydrologic cycle with massive desalination equipment, or to speed up the carbon cycle by drawing down atmospheric carbon dioxide, or to drive new recycling systems devoted to tin and copper and the many other metals as vital to industrial life as carbon and nitrogen are to living tissue. Better to embrace the Anthropocene’s potential as a revolution in the way the Earth system works, they argue, than to try to retreat onto a low-impact path that runs the risk of global immiseration.
Such a choice is possible because of the most fundamental change in Earth history that the Anthropocene marks: the emergence of a form of intelligence that allows new ways of being to be imagined and, through co-operation and innovation, to be achieved. The lessons of science, from Copernicus to Darwin, encourage people to dismiss such special pleading. So do all manner of cultural warnings, from the hubris around which Greek tragedies are built to the lamentation of King David’s preacher: “Vanity of vanities, all is vanity…the Earth abideth for ever…and there is no new thing under the sun.” But the lamentation of vanity can be false modesty. On a planetary scale, intelligence is something genuinely new and powerful. Through the domestication of plants and animals intelligence has remade the living environment. Through industry it has disrupted the key biogeochemical cycles. For good or ill, it will do yet more.
It may seem nonsense to think of the (probably sceptical) intelligence with which you interpret these words as something on a par with plate tectonics or photosynthesis. But dam by dam, mine by mine, farm by farm and city by city it is remaking the Earth before your eyes.
--
The term Anthropocene was originally coined by the ecologist Eugene Stoermer and subsequently popularised by the Nobel Prize-winning scientist Paul Crutzen, by analogy with the word "Holocene". The Greek roots are anthropo-, meaning "human", and -cene, meaning "new". Crutzen has explained: "I was at a conference where someone said something about the Holocene. I suddenly thought this was wrong. The world has changed too much. So I said: 'No, we are in the Anthropocene.' I just made up the word on the spur of the moment. Everyone was shocked. But it seems to have stuck."[6] Crutzen first used the term in print in a 2000 newsletter of the International Geosphere-Biosphere Programme (IGBP), No.41. In 2008, Zalasiewicz suggested in GSA Today that an Anthropocene epoch is now appropriate.[7]