12 July 2008

Ditchley Foundation Annual Lecture XLIV

The Next Half-Century:  a Scientist's Hopes and Fears

Delivered by 

Lord Rees of Ludlow

Professor Martin Rees (Lord Rees of Ludlow), President of the Royal Society, is also Master of Trinity College, Cambridge and Astronomer Royal.  He is a Trustee of the Institute for Advanced Study at Princeton, and belongs to the US National Academy of Sciences and the American Philosophical Society.  He has authored or co-authored over five hundred research papers, and seven books for general readership.  He has lectured, broadcast and written widely on science and policy.


Last year, Brent Scowcroft stood at this podium as Ditchley Lecturer. It's daunting to follow him.  I'll take as my text his concluding words:

“If we behave wisely, prudently and in close strategic cooperation with each other, the 21st century could be the best yet in the rather dismal history of mankind.”

This is the 50th anniversary of the Ditchley Foundation, and I've been asked to offer a scientist's perspective on the next fifty years.  As an astronomer, I often get mistakenly described as an astrologer -- but I cast no horoscopes and have no crystal ball.  My message will be that the Promethean power of science offers greater opportunities than ever before -- for the developing and the developed world.  We can indeed be optimistic: we can surely expect huge economic and social advances, especially in Asia.  But there will be new challenges and vulnerabilities to contend with.

THE LAST 50 YEARS

Fifty years ago no-one here could confidently have predicted the geopolitical landscape of today.  And scientific forecasting is just as hazardous.  Three of today's most remarkable technologies had their gestation in the 1950s.  But nobody could then have guessed how pervasively they would shape our lives today.

It was in 1958 that Jack Kilby of Texas Instruments built the first integrated circuit, with Robert Noyce of Fairchild Semiconductor following months later -- the precursor of today's ubiquitous silicon chips, each containing billions of microscopic circuit elements.  This was perhaps the most transformative single invention of the past century.

A second technology with huge potential began in Cambridge in the 1950s, when Watson and Crick discovered the bedrock mechanism of heredity -- the famous double helix.  This discovery launched the science of molecular biology, opening exciting prospects in genomics and synthetic biology.

And it's just over 50 years since the launch of Sputnik.  This event started the ‘space race’, and led President Kennedy to inaugurate the programme to land men on the Moon.  Kennedy's prime motive was of course superpower rivalry -- cynics could deride it as a stunt.  But it was an extraordinary technical triumph -- especially as NASA's total computing power was far less than in a single mobile phone today.  And it had an inspirational aspect too: it offered a new perspective on our planet. Distant images of Earth -- its delicate biosphere of clouds, land and oceans contrasting with the sterile moonscape where the astronauts left their footprints -- have, ever since the 1960s, been iconic for environmentalists.

Most of us here are old enough to recall the Apollo programme.  But it's nearly 40 years since Neil Armstrong's ‘first small step’.  To young people today, however, this is ancient history: they know that the Americans went to the Moon, just as they know the Egyptians built pyramids, but the motives for these two enterprises may seem equally baffling.

There was no real follow-on after Apollo: there is no practical or scientific motive adequate to justify the huge expense of NASA-style manned spaceflight, and it has lost its glamour.  Unmanned space technology, by contrast, has flourished, giving us GPS, global communications, environmental monitoring and other everyday benefits, as well as an immense scientific yield.  There is, of course, a dark side.  The initial motivation of rocketry was to provide missiles to carry nuclear weapons. And those weapons were themselves the outcome of a huge enterprise, the Manhattan project, that was even more intense and focused than the Apollo programme.

Soon after World War II, some physicists who had been involved in the Manhattan project founded a journal called the Bulletin of the Atomic Scientists, aimed at promoting arms control.  The 'logo' on the Bulletin's cover is a clock, the closeness of whose hands to midnight indicates the Editorial Board's judgement on how precarious the world situation is.  Every year or two, the minute hand is shifted, either forwards or backwards.

It was closest to midnight at the time of the Cuban Missile Crisis.  Robert McNamara spoke frankly about that episode in his confessional movie 'The Fog of War'.  He said that “We came within a hairbreadth of nuclear war without realising it. It’s no credit to us that we escaped -- Khrushchev and Kennedy were lucky as well as wise”. Indeed on several occasions during the Cold War the superpowers could have stumbled towards Armageddon.

When the Cold War ended, the Bulletin's clock was put back to 17 minutes to midnight.  There is now far less risk of tens of thousands of H-bombs devastating our civilisation.  Indeed one clear reason for sharing Brent Scowcroft's optimism is that the greatest peril to confront the world from the 1950s to the 1980s -- massive nuclear annihilation -- has diminished.

But the clock has been creeping forward again.  There is increasing concern about nuclear proliferation, and about nuclear weapons being deployed in a localised conflict.  And Al Qaida-style terrorists might some day acquire a nuclear weapon. If they did, they would willingly detonate it in a city, killing tens of thousands along with themselves, and millions would acclaim them as heroes.

And the threat of a global nuclear catastrophe could be merely in temporary abeyance.  I'm diffident about even mentioning such matters to an audience where there's so much experience and expertise.  But during this century, geopolitical realignments could be as drastic as those during the last century, and could lead to a nuclear standoff between new superpowers that might be handled less well -- or less luckily -- than the Cuba crisis was. 

The nuclear age inaugurated an era when humans could threaten the entire Earth's future -- what some have called the ‘Anthropocene’ era.  We'll never be completely rid of the nuclear threat.  But the 21st century confronts us with new perils as grave as the bomb.  They may not threaten a sudden world-wide catastrophe -- the doomsday clock is not such a good metaphor -- but they are, in aggregate, worrying and challenging.

I want briefly to address some of these themes, and then, near the end of my lecture, to comment on the role of science and scientists in the policy arena.

ENERGY AND CLIMATE

High on the global agenda are energy supply and energy security. These are crucial for economic and political stability, and linked of course to the grave issue of long-term climate change.

Human actions -- mainly the burning of fossil fuels  -- have already raised the carbon dioxide concentration higher than it's ever been in the last half million years. Moreover, according to ‘business as usual’ scenarios, it will reach twice the pre-industrial level by 2050, and three times that level later in the century.  This much is entirely uncontroversial.  Nor is there significant doubt  that CO2 is a greenhouse gas, and that the higher its concentration rises, the greater the warming -- and, more important still, the greater the chance of triggering  something grave and irreversible:  rising sea levels due to the melting of Greenland's icecap; runaway greenhouse warming due to release of methane  in the tundra, and so forth.

There is substantial uncertainty in just how sensitive the temperature is to the CO2 level.  The climate models can, however, assess the likelihood of a range of temperature rises.  It is the ‘high-end tail’ of the probability distribution that should worry us most -- the small probability of a really drastic climatic shift.  Climate scientists now aim to refine their calculations, and to address questions like:  Where will the flood risks be concentrated?  Which parts of Africa will suffer the severest droughts?  Where will the worst hurricanes strike?

The ‘headline figures’ that the climate modellers quote -- 2, 3 or 5 degrees rise in the mean global temperature -- might seem too small to fuss about.  But two comments should put them into perspective.

First, even in the depth of the last ice age the mean temperature was lower by just 5 degrees.  Second, the prediction isn't a uniform warming:  the land warms more than the sea, and high latitudes more than low.  Quoting a  single figure glosses over shifts in global weather patterns that will be more drastic in some regions than in others, and could involve relatively sudden 'flips' rather than steady changes.

Nations can adapt to some of the adverse effects of warming.  But the most vulnerable people -- in, for instance, Africa or in Bangladesh -- are the least able to adapt. 

The science of climate change is intricate.  But it's a doddle compared to the economics and politics.  Global warming poses a unique political challenge for two reasons.  First, the effect is non-localised: the CO2 emissions from this country have no more effect here than they do in Australia, and vice versa.  That means that any credible regime whereby the 'polluter pays' has to be broadly international.

Second, there are long time-lags -- it takes decades for the oceans to adjust to a new equilibrium, and centuries for ice-sheets to melt completely.  So the main   downsides of global warming lie a century or more in the future.  Concepts of intergenerational justice then come into play:  How should we rate the rights and interests of future generations compared to our own?  What discount rate should we apply?

In his influential 2006 report for the UK government, Nicholas Stern argued that equity to future generations renders a ‘commercial’ discount rate quite inappropriate.  Largely on that basis, he argued that we should commit substantial resources now, to pre-empt much greater costs in future decades.
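
The arithmetic shows why the choice of rate dominates the whole debate.  Under standard exponential discounting, a cost incurred t years from now is weighted by a factor 1/(1+r)^t.  With round numbers chosen purely for illustration (the 5 percent figure is my own example; Stern's effective rate was roughly 1.4 percent):

\[
\frac{1}{(1+r)^{100}} \;\approx\;
\begin{cases}
0.25 & \text{for } r = 1.4\%\\[2pt]
0.008 & \text{for } r = 5\%
\end{cases}
\]

A 'commercial' rate of 5 percent thus shrinks a damage a century away by a factor of more than a hundred, whereas a near-zero rate of pure time preference leaves it weighing about a quarter as much as a present-day cost.  The case for spending now stands or falls on roughly that factor of thirty.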

There are of course precedents for long-term altruism.  Indeed, in discussing the safe disposal of nuclear waste, experts talk with a straight face about what might happen more than 10,000 years from now, thereby implicitly applying a zero discount rate.  To concern ourselves with such a remote ‘post-human’ era might seem bizarre. But all of us can surely empathise at least a century ahead.  Especially in Europe, we're mindful of the heritage we owe to centuries past; history will judge us harshly if we discount too heavily what might happen when our grandchildren grow old.

To ensure a better-than-evens chance of avoiding a potentially dangerous ‘tipping point’, global CO2 emissions must, by 2050, be brought down to half the 1990 level.  This is the target espoused by the G8. It corresponds to two tons of CO2 per year from each person on the planet. For comparison, the current European figure is about 10 tons, and the Chinese level is already 4.  To achieve this target without stifling economic growth -- to turn around the curve of CO2 emissions well before 2050 -- is a huge challenge.  The debates last week in Japan indicated the problems -- especially how to bring India and China into the frame.  The great emerging economies have not caused the present problem, but if they develop in as carbon-intensive a way as ours did, they could swamp and negate any measures taken by the G8 alone.
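
Expressed as simple reduction factors on the per-capita figures just quoted (the arithmetic is merely illustrative):

\[
\text{Europe: } \frac{10 \text{ tons}}{2 \text{ tons}} = 5\times, \qquad
\text{China: } \frac{4 \text{ tons}}{2 \text{ tons}} = 2\times
\]

In other words, Europeans must cut their per-capita emissions five-fold by 2050, while China, already at twice the target, must halve its own -- even before allowing for further growth.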

Realistically, however, there is no chance of reaching this target, nor of achieving real energy security, without drastically new technologies.  Though I'm confident that these will have emerged by the second half of the century, the worry is that this may not be soon enough.

Efforts to develop a whole raft of techniques for economising on energy, storing it and generating it by 'clean' or low-carbon methods deserve a priority and commitment from governments akin to that accorded to the Manhattan project or the Apollo moon landing.  Current R and D is far below what the scale and urgency of the problem demand.  To speed things up, we need a ‘shotgun approach’ -- trying all the options.  And we can afford it: the stakes are colossal.  The world spends around 7 trillion dollars per year on energy and its infrastructure.  The US imports 500 billion dollars worth of oil each year.

I can't think of anything that could do more to attract the brightest and best into science than a strongly proclaimed commitment -- led by the US and Europe --  to provide clean and sustainable energy for the developing and the developed world.

Even optimists about prospects in solar energy, advanced biofuels, fusion and other renewables have to acknowledge that it will be at least 40 years before they can fully ‘take over’.  Coal, oil and gas seem set to dominate the world's ever-growing energy needs for at least that long.  Last year the Chinese built 100 coal-fired power stations.  Coal deposits representing a million years’ accumulation of primeval forest are now being burnt in a single year.

Coal is the most ‘inefficient’ fossil fuel in terms of energy generated per unit of carbon released.  Annual CO2 emissions are rising year by year.  Unless this rising curve can be turned around soon, the atmospheric concentration will irrevocably reach a threatening level.
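
To put rough numbers on that inefficiency (approximate emission factors of the kind found in standard reference tables, not figures from this lecture):

\[
\text{coal} \approx 95, \qquad \text{oil} \approx 73, \qquad \text{natural gas} \approx 56
\quad (\text{kg CO}_2 \text{ per GJ of heat})
\]

So even a wholesale switch from coal to gas would cut the carbon released per unit of energy by only about 40 percent -- worthwhile, but nowhere near the five-fold per-capita cuts discussed above.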

So an immediate priority has to be a coordinated international effort to develop carbon capture and storage -- CCS.  Carbon from power stations must be captured before it escapes into the atmosphere, and then piped to some geological formation where it can be stored without leaking out.  It's crucial to agree a timetable, and a coordinated plan for the construction of CCS demonstration plants to explore all variants of the technology.  To jump-start such a programme would need up to 10 billion dollars a year of public funding worldwide (preferably as part of public-private partnerships).  But this is a small price to pay for bringing forward, by five years or more, the time when CCS can be widely adopted and the graph of CO2 emissions turned around.

What is the role of nuclear power in all this?  The concerns are well known -- it is an issue on which expert and lay opinion are equally divided.  I'm myself in favour of the UK and the US having at least a replacement generation of power stations -- and of R and D into new kinds of reactors.  But the non-proliferation regime is fragile, and before being relaxed about a world-wide programme of nuclear power, one would surely require the kind of fuel bank and leasing arrangement that has been proposed by Mohamed ElBaradei at the IAEA.

NATURAL RESOURCES AND POPULATION

Energy security and climate change are the prime ‘threats without enemies’ that confront us. But there are others.  High among these is the threat to biological diversity caused by rapid changes in land use and deforestation.  There have been 5 great extinctions in the geological past; human actions are causing a 6th.  The extinction rate is 1000 times higher than normal, and increasing.  We are destroying the book of life before we have read it.

Biodiversity -- manifested in forests, coral reefs, marine blue waters and all Earth's other ecosystems -- is often proclaimed as a crucial component of human wellbeing and economic growth.  It manifestly is: we're clearly harmed if fish stocks dwindle to extinction; there are plants whose gene pool might be useful to us.  And massive destruction of the rain forests would accelerate global warming.  But for environmentalists these ‘instrumental’ – and anthropocentric – arguments aren't the only compelling ones. For them, preserving the richness of our biosphere has value in its own right, over and above what it means to us humans.

Population growth, of course, aggravates all pressures on energy and environment.  Fifty years ago the world population was below 3 billion.  It has more than doubled since then, to 6.6 billion.  The percentage growth-rate has slowed, but the global figure is projected to reach 8 or even 9 billion by 2050.  The excess will almost all be in the developing world.
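
A quick check of the implied growth rate (simple arithmetic on the figures above): rising from 3 billion to 6.6 billion in fifty years corresponds to an average compound rate of

\[
\left(\frac{6.6}{3.0}\right)^{1/50} - 1 \;\approx\; 1.6\% \text{ per year.}
\]

Had that average pace continued, the total would approach 13 billion by 2050; the projected 8 to 9 billion already assumes a continuing slowdown.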

There is, incidentally, a global trend from rural towards urban living.  More than half the world's population is now urban -- and megacities are growing explosively.

There is an extensive literature on the ‘carrying capacity’ of our planet -- on how many people it can sustain without irreversible degradation.  The answer of course depends on lifestyle.  The world could not sustain its present population if everyone lived like present-day Americans or Europeans.  On the other hand, the pressures would plainly be eased if people travelled little and interacted via super-internet and virtual reality.  And, incidentally, if they were all vegetarians: it takes 13 pounds of corn to make one pound of beef. 

If population growth continues even beyond 2050, one can't be other than exceedingly gloomy about the prospects. However, there could be a turnaround.  There are now more than 60 countries in which fertility is below replacement level -- it's far below in, for instance, Italy and Singapore.  In Iran the fertility rate has fallen from 6.5 in 1980 to 2.1 today.  We all know the social trends that lead to this demographic transition -- declining infant mortality, availability of contraceptive advice, women's education, and so forth.

If the transition quickly extended to all countries, then the global population could start a gradual decline after 2050 -- a development that would surely be benign.

There is, incidentally, one ‘wild card’ in all these long-term forecasts. This is the possibility that the average lifespan in advanced countries may be extended drastically by some biomedical breakthrough. 

The prognosis is especially bleak in Africa, where there could be a billion more people in 2050 than there are today.  It's worth quoting some numbers here.  A hundred years ago, the population of Ethiopia was 5 million.  It is now 75 million (of whom 8 million need permanent food aid) and will almost double by 2050.  Quite apart from the problem of providing services, there is consequent pressure on the water resources of the Nile basin. 

Over 200 years ago, Thomas Malthus famously argued that populations would rise until limited by food shortages.  His gloomy prognosis has been forestalled by advancing technology, the green revolution and so forth, but he could be tragically vindicated in Africa.  Continuing population growth makes it harder to break out of the poverty trap -- Africa not only needs more food, but a million more teachers annually, just to keep standards level.  And just as today's population couldn't be fed by yesterday's agriculture, a second green revolution may be needed to feed tomorrow's population.

But the rich world has the resources, if the will is there, to enhance the life-chances of the world's billion poorest people -- relieving the most extreme poverty, providing clean water, primary education and other basics.  This is a precondition of achieving in Africa the demographic transition that has occurred elsewhere.  The overseas aid from most countries, including the US, is far below the UN's target of 0.7 percent of GNP.  It would surely be shameful, as well as against even our narrow self-interest, if the Millennium Goals set for 2015 were not met.

(To inject a pessimistic note in parenthesis, the persistent underfunding of overseas aid, even in a context where the humanitarian imperative seems so clear, augurs badly for the actual implementation of the measures needed to meet the 2050 carbon emission targets -- generally quoted as costing around 1 percent of GNP -- where the payoff is less immediately apparent.)

SOME NEW VULNERABILITIES

Infectious diseases are mainly associated with developing countries -- but in our interconnected world we are now all more vulnerable.  The spread of epidemics is aggravated by rapid air travel, plus the huge concentrations in megacities with fragile infrastructures.   

Whether or not a pandemic gets a global grip may hinge on the efficiency of worldwide monitoring -- how quickly a Vietnamese or Sudanese poultry farmer can diagnose or report any strange sickness.

In our everyday lives, we have a confused attitude to risk.  We fret about tiny risks: carcinogens in food, a one-in-a-million chance of being killed in train crashes, and so forth.  But we're in denial about others that should loom much larger.  If we apply to pandemics the same prudent analysis that leads us to buy insurance -- multiplying probability by consequences -- we'd surely conclude that measures to alleviate this kind of extreme event need higher priority.  A global pandemic could kill tens of millions and cost many trillions of dollars.
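
The insurance logic can be made explicit.  Writing p for the annual probability of the event and C for its cost, the expected annual loss is p times C.  With round numbers of my own choosing, purely for illustration:

\[
p = 0.01 \text{ per year}, \quad C = \$5 \text{ trillion}
\;\Rightarrow\;
p \times C = \$50 \text{ billion per year.}
\]

On that reckoning, even a one-in-a-hundred annual chance of such a pandemic would justify global preparedness spending of tens of billions of dollars a year -- far more, on most estimates, than is actually committed.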

This thought leads me to new vulnerabilities of a different kind: vulnerabilities stemming from the misuse of powerful technologies -- either through error or by design.  Biotechnology, for instance, holds huge promise for health care, for enhanced food production, even for energy. But there is a downside.

Here's a quote from the American National Academy of Sciences:  “Just a few individuals with specialised skills ... could inexpensively and easily produce a panoply of lethal biological weapons ... The deciphering of the human genome sequence and the complete elucidation of numerous pathogen genomes ... allow science to be misused to create new agents of mass destruction.”

Not even an organised network would be required: just a fanatic, or a weirdo with the mindset of those who now design computer viruses -- the mindset of an arsonist.  The techniques and expertise for bio or cyber attacks will be accessible to millions.

We're kidding ourselves if we think that technical expertise is always allied with balanced rationality:  it can be combined with fanaticism -- not just the traditional fundamentalism that we're so mindful of today, but new age irrationalities.  I'm thinking of cults such as the Raelians:  and of extreme eco-freaks, animal rights campaigners and the like.  The global village will have its village idiots. 

In a future era of vast individual empowerment, where even one malign act would be too many, how can our open society be safeguarded?  Will there be pressures to constrain diversity and individualism?  Or to shift the balance between privacy and intrusion?  These are stark questions, but I think they are deeply serious ones.   (Though -- to inject a slightly frivolous comment -- the  careless abandon with which younger people put their intimate details on Facebook, and the broad acquiescence in ubiquitous CCTV, suggests that in our society there will be surprisingly little resistance to loss of privacy.)

Developments in cyber, bio or nano-technology will open up new risks of error or terror.  Our global society is precariously dependent on elaborate networks -- electricity grids, air traffic control, the internet, just-in-time delivery and so forth -- whose collapse could stress it to breaking point.  It's crucial to ensure maximal resilience of all such systems.

At the start of this lecture, I cited three technologies that now pervade our lives in ways quite unenvisioned 50 years ago.  Likewise, by extrapolating from the present, I have surely missed the qualitatively greatest changes that may occur in the next 50. 

The great science-fiction writer Arthur C Clarke opined that any sufficiently advanced technology is indistinguishable from magic.  Everyday consumer items like Sony game stations, sat-nav and Google would have seemed magic 50 years ago.

In the coming decades, there could be qualitatively new kinds of change.  One thing that's been unaltered for millennia is human nature and human character.  But in this century, novel mind-enhancing drugs, genetics, and ‘cyborg’ techniques may start to alter human beings themselves.  That's something qualitatively new in recorded history.

And we should keep our minds open, or at least ajar, to concepts on the fringe of science fiction -- robots with many human attributes, computers that make discoveries worthy of Nobel prizes, bioengineered organisms, and so forth.  Flaky Californian futurologists aren't always wrong.

Opinion polls in England show that people are generally positive about science's role, but are concerned that it may ‘run away’ faster than we can properly cope with it.  Some commentators on biotech, robotics and nanotech worry that when the genie is out of the bottle, the outcome may be impossible to control.  They urge caution in ‘pushing the envelope’ in some areas of science.

The uses of academic research generally can't be foreseen: Rutherford famously said, in the mid-thirties, that nuclear energy was ‘moonshine’; the inventors of lasers didn't foresee that an early application of their work would be to eye surgery;  the discoverer of x-rays was not searching for ways to see through flesh.  A major scientific discovery is likely to have many applications -- some benign, others less so -- none of which was foreseen by the original investigator.

We can't reap the benefits of science without accepting some risks -- the best we can do is minimise them.  Most surgical procedures, even if now routine, were risky and often fatal when they were being pioneered.  In the early days of steam, people died when poorly designed boilers exploded.

But something has changed.  Most of the 'old' risks were localised.  If a boiler explodes, it's horrible but there's an 'upper bound' to just how horrible.   In our ever more interconnected world, there are new risks whose consequences could be so widespread that even a tiny probability is unacceptable.

There will surely be a widening gulf between what science enables us to do, and what applications it's prudent or ethical actually to pursue -- more doors that science could open but which are best kept closed.

There are already scientific procedures -- human reproductive cloning, synthetic biology and the rest -- where regulation is called for, on ethical as well as prudential grounds.  And there will be more.  Regulations will need to be international, and to contend with commercial pressures -- and they may prove as hard to enforce as the drug laws.  If one country alone imposed regulations, the most dynamic researchers and enterprising companies would migrate to another that was more permissive.  This is happening already, in a small way, in primate and stem cell research.

THE INTERNATIONAL SCIENTIFIC COMMUNITY

Some comments, now, on the role of the scientific community.  Science is the only truly global culture:  protons, proteins, and Pythagoras’s theorem are the same from China to Peru.  Research is international, highly networked, and collaborative.  And most science-linked policy issues are international, even global -- that's certainly true of those I've addressed in this lecture. 

This is primarily an Anglo-American gathering, so I hope it's not out of place to emphasise that our two countries have been the most successful in creating and sustaining world-class research universities.  These institutions are magnets for talent -- both faculty and students -- from all over the world, and are in most cases embedded in a 'cluster' of high-tech companies, to symbiotic benefit.

By 2050, China and India should have at least gained parity with Europe and the US -- they will surely become the ‘centre of gravity’ of the world's intellectual power.  We will need to aim high if we are to sustain our competitive advantage in offering cutting-edge ‘value added’.

It's a duty of scientific academies and similar bodies to ensure that policy decisions are based on the best science, even when that science is still uncertain and provisional; this is the Royal Society's role in the UK and that of the National Academy of Sciences in the US.  The academies of the G8 + 5 countries are playing an increasing role in highlighting global issues.  And one thinks of consortia like the IPCC, and bodies like the WHO.

In this country, an ongoing dialogue with parliamentarians on embryos and stem cells has led to a generally admired legal framework.  On the other hand, the GM crops debate went wrong here because we came in too late, when opinion was already polarised between eco-campaigners on the one side and commercial interests on the other.  I think we have recently done better on nanotechnology, by raising the key issues early.  It’s necessary to engage with the public 'upstream' of any legislation or commercial developments.

We need to point out that the resources and expertise devoted to applications of science are not deployed optimally.  Some subjects have had the 'inside track' and gained disproportionate resources; huge sums, for instance, are still devoted to new weaponry.  On the other hand, environmental projects, renewable energy, and so forth, deserve more effort.  In medicine, the focus is disproportionately on cancer and cardiovascular studies, the ailments that loom largest in prosperous countries, rather than on the infections endemic in the tropics. 

Policy decisions -- whether about energy, GM technology, mind-enhancing drugs or whatever -- are never solely ‘scientific’: strategic, economic, social, and ethical ramifications enter as well.  And here scientists have no special credentials.  Choices on how science is applied shouldn't be made just by scientists.  That's why everyone needs a ‘feel’ for science and a realistic attitude to risk -- otherwise public debate won't rise above the level of tabloid slogans.

Scientists nonetheless have a special responsibility.  We feel there is something lacking in parents who don't care what happens to their children in adulthood, even though this is largely beyond their control.  Likewise, scientists shouldn't be indifferent to the fruits of their ideas -- their intellectual creations.  They should try to foster benign spin-offs -- and of course help to bring their work to market when appropriate.  But they should campaign to resist, so far as they can, ethically dubious or threatening applications.  And they should be prepared to engage in public debate and discussion.

I mentioned earlier the atomic scientists in World War II.  Many of them -- and I've been privileged to know some, such as Hans Bethe and Joseph Rotblat -- set a fine example.  Fate had assigned them a pivotal role in history.  They returned with relief to peacetime academic pursuits.  But they didn't say that they were ‘just scientists’ and that the use made of their work was up to politicians.  They continued as engaged citizens -- promoting efforts to control the power they had helped unleash.  We now need such individuals -- not just in physics, but across the whole range of applicable science.

A COSMIC PERSPECTIVE

My special subject is astronomy -- the study of our environment in the widest conceivable sense.  And I'd like to end with a cosmic perspective.

It is surely a cultural deprivation to be unaware of the marvellous vision of nature offered by Darwinism and by modern cosmology -- the chain of emergent complexity leading from a still-mysterious beginning to atoms, stars, planets, biospheres and human brains able to ponder the wonder and the mystery.  And there's no reason to regard humans as the culmination of this emergent process.  Our Sun is less than halfway through its life.  Any creatures witnessing the Sun's demise, here on Earth or far beyond, won't be human -- they'll be as different from us as we are from bacteria.

But, even in this cosmic time-perspective -- extending billions of years into the future, as well as into the past -- this century may be a defining moment.  It's the first century in our planet's history in which one species -- ours -- has Earth's future in its hands.

I recalled earlier the image of our Earth viewed from space.  Suppose some aliens had been watching our planet -- a ‘pale blue dot’ in a vast cosmos -- for its entire history.  What would they have seen?

Over nearly all that immense time, 4.5 billion years, Earth's appearance would have altered very gradually.  The continents drifted; the ice cover waxed and waned; successive species emerged, evolved and became extinct.

But in just a tiny sliver of the Earth's history -- the last one millionth part, a few thousand years -- the patterns of vegetation altered much faster than before.  This signalled the start of agriculture.  The changes accelerated as human populations rose.

But then there were other changes, even more abrupt.  Within fifty years -- little more than one hundredth of a millionth of the Earth's age -- the carbon dioxide in the atmosphere began to rise anomalously fast.  The planet became an intense emitter of radio waves (the total output from all TV, cellphone and radar transmissions).
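
The fraction is easily checked, taking the 4.5 billion year age quoted above:

\[
\frac{50 \text{ years}}{4.5 \times 10^{9} \text{ years}} \;\approx\; 1.1 \times 10^{-8},
\]

that is, just over one hundredth of one millionth of the Earth's history.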

And something else unprecedented happened:  small projectiles lifted from the planet's surface and escaped the biosphere completely.  Some were propelled into orbits around the Earth; some journeyed to the Moon and planets.

If they understood astrophysics, the aliens could confidently predict that the biosphere would face doom in a few billion years when the Sun flares up and dies.  But could they have predicted this unprecedented spike less than halfway through the Earth's life -- these human-induced alterations occupying, overall, less than a millionth of the elapsed lifetime and seemingly occurring with runaway speed?

If they continued to keep watch, what might these hypothetical aliens witness in the next hundred years?  Will a final spasm be followed by silence?  Or will the planet itself stabilise?  And will some of the objects launched from the Earth spawn new oases of life elsewhere?

The answers will depend on us, collectively -- on whether we can, to quote Brent Scowcroft again, “behave wisely, prudently and in close strategic cooperation with each other”.

© The Ditchley Foundation, 2008.  All rights reserved.  Queries concerning permission to translate or reprint should be addressed to The Editor, The Ditchley Foundation, Ditchley Park, Enstone, CHIPPING NORTON, Oxfordshire OX7 4ER, England.