Tag Archives: Prediction

World needs early warning of climate-linked disasters

A leading French government minister says the number of natural disasters connected to climate change has doubled in two decades, and is urging a global early warning system.

LONDON, 15 March, 2015 − A senior French political leader, foreign minister Laurent Fabius, has told an international conference on reducing the risk from natural disasters that 70% of them are now linked to climate change, twice as many as twenty years ago.

Mr Fabius is the incoming president of this year’s round of negotiations by member states of the UN climate change convention, to take place in Paris in December. He said disaster risk reduction and the struggle against climate change went hand in hand: “It is necessary to tackle these problems together and not separately.”

He was speaking against the background of two events which occurred thousands of miles apart on 14 March, linked by nothing except tragic coincidence. In the Japanese city of Sendai, the third UN world conference on disaster risk reduction began a five-day meeting. In the South Pacific, Cyclone Pam brought death and devastation to the 83-island nation of Vanuatu on a scale seldom recorded in the region.

Vivien Maidaborn, executive director of Unicef New Zealand, said the disaster could prove one of the worst in Pacific history. “The sheer force of the storm, combined with communities just not set up to withstand it, could have devastating results for thousands across the region,” she said.

Hope shattered

A Unicef worker in Vanuatu described the cyclone as “15 to 30 minutes of absolute terror” for “everybody in this country” as it passed over. The president of Vanuatu, Baldwin Lonsdale, told the UN meeting: “I am speaking with you today with a heart that is so heavy… All I can say is that our hope for prospering into the future has been shattered.”

The UN Secretary-General, Ban Ki-moon, opened the Sendai meeting, attended by 4,000 people from 186 countries, with a reminder that annual economic losses from natural disasters are now estimated to exceed US$300 billion. He said: “We can watch that number grow as more people suffer. Or we can dramatically lower that figure and invest the savings in development. Six billion dollars allocated each year can result in savings of up to US$360 billion by 2030.”

A report released at the meeting, United for Disaster Resilience, prepared by insurance companies working with the UN Environment Programme’s Finance Initiative, said: “In the past decade, average economic losses from disasters were about US$190 billion per year, while average insured losses were about US$60 billion per year. This century, more than one million people have already lost their lives to disasters.”

Alert system

The UN Office for Disaster Risk Reduction, UNISDR, says global climate-related disasters between 1980 and 2011 included:

  • 3,455 floods
  • 2,689 storms
  • 470 droughts
  • 395 episodes of extreme temperature.

Mr Fabius said the creation of a worldwide early warning system for climate disasters could provide the most vulnerable countries, including small island developing states, with access to real-time weather and climate updates, information and communications technology, and with support for an SMS-based alert system. UNISDR’s PreventionWeb already links those working to protect communities against disaster risk.

Since the last such disaster risk conference in 2005, the UN says, at least 700,000 people have died, 1.7 billion more have been affected, and economic losses from major reported disasters total US$1.4 trillion. The conference is working to prepare a new plan for reducing the risks of disasters.

Margareta Wahlström, head of UNISDR, said: “After three years of consultation on a post-2015 framework which updates the current Hyogo Framework for Action, there is general agreement that we must move from managing disasters to managing disaster risk.” She said the framework would help to reduce existing levels of risk and avoid the creation of new ones. − Climate News Network

Heat is on to slow down faster rise in temperatures

New research warns that emissions will make drought conditions even more extreme as our climate moves into a period of rapid change.

LONDON, 12 March, 2015 – Analysis of temperature records and reconstructions of past climates indicates that the pace of global warming is about to accelerate. Although the question of a “pause” in warming during the 21st century is still under debate, climate scientists now warn that the Earth is about to enter a period of change faster than anything in the last thousand years.

Steven Smith, an integrated modelling and energy scientist at the US government’s Pacific Northwest National Laboratory, and colleagues decided to take a look at the short history of temperature records and the somewhat longer “proxy” reconstructions of past climates to look for patterns of the past that might be a guide to the future.

Baseline rates

They then matched the past and examined the future using computer model simulations. Climate periods were considered in 40-year blocks, and were compared to establish a baseline for natural rates of change.

The scientists report in Nature Climate Change that rates of temperature rise now in North America and many other parts of the world are greater than the natural range of change. And when they tested future emissions scenarios, they confirmed that global warming will pick up speed in the next 40 years in all cases − even in those projections in which the world reduces its greenhouse gas emissions. If the world doesn’t reduce these emissions, the rate of change in warming will remain high for the rest of the century.

“In these climate model simulations, the world is just now starting to enter a new place, where rates of temperature change are consistently larger than historical values over 40-year time spans,” Dr Smith says. “We need to better understand what the effects of this will be, and how to prepare for them.”

The research is based on simulation, and seems inconsistent with the story of the 21st century, which is that, after a relatively rapid decadal rise in global average temperatures between 1970 and 2000, the rate of rise seemed to slow. Although almost all the years of the new century so far have been warmer than any in the 20th century, and although 2014 was the warmest year on record so far, the notches on the thermometer each year have been smaller.

But as researchers have repeatedly warned, the real rise may be masked by some kind of natural variation. At least one group in 2014 found that the patterns of extremes of heat seem to be accelerating, even if the averages are not.
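The 40-year-block comparison the study describes can be sketched in a few lines: slide a 40-year window along a temperature series, fit a linear trend in each window, and compare recent rates against an earlier baseline. The series below is a hypothetical illustration, not the records, proxies or models the team actually used.

```python
# Sketch of the 40-year-block idea (illustrative only: synthetic data, not
# the study's records, proxies or models). Slide a 40-year window along a
# temperature-anomaly series, fit a linear trend in each window, and compare
# recent rates of change against a pre-1970 baseline.
import numpy as np

def window_rates(years, temps, window=40):
    """Linear warming rate (deg C per decade) for each 40-year block."""
    rates = []
    for i in range(len(years) - window + 1):
        slope = np.polyfit(years[i:i + window], temps[i:i + window], 1)[0]
        rates.append((years[i], slope * 10))  # per-year slope -> per decade
    return rates

# Hypothetical series: slow warming to 1970, faster warming afterwards.
years = np.arange(1900, 2015)
temps = np.where(years < 1970,
                 0.003 * (years - 1900),          # ~0.03 C per decade
                 0.21 + 0.015 * (years - 1970))   # ~0.15 C per decade

rates = window_rates(years, temps)
baseline = max(rate for start, rate in rates if start + 40 <= 1970)
latest = rates[-1][1]
print(f"pre-1970 baseline: {baseline:.2f} C/decade, latest window: {latest:.2f}")
```

On a series like this, the latest 40-year window shows a rate several times the baseline; the study's point is that under all tested emissions scenarios that gap keeps widening.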

“The finding is critical to understanding what the world will be like as the climate continues to change”

And now Rong Fu, professor of geological sciences at the University of Texas at Austin, US, has looked at a study by research scientists William Lau, of the University of Maryland, and Kyu-Myong Kim, of the NASA Goddard Space Flight Center, and seen signs of an intensified pattern of extreme droughts in Australia, the southwest and central US, and southern Amazonia. The Proceedings of the National Academy of Sciences has published both the original research and the commentary by Professor Fu.

At the heart of the issue is the impact of increased emissions of carbon dioxide on the pattern of wind circulation that largely dictates the climate of each hemisphere. This pattern is sometimes called the Hadley Circulation, named after the 18th-century English lawyer and amateur meteorologist, George Hadley, who first identified the mechanism behind the all-important Trade Winds that carried sailing ships across the Atlantic.

It can change with global temperatures. And as the winds change – and the prevailing Trade Winds move away from the tropics – they take the rainfall with them.

Ominous consequences

The guess has been that the Hadley Circulation varies naturally. And the PNAS study suggests that it is likely to intensify in a warmer world, with ominous consequences for some already naturally dry regions.

That both Australia and the American southwest are already feeling the heat is not news. But the significance of the research lies in a more detailed understanding of why even more is on the cards in future.

“This is the first study that suggests a possible intensification of droughts in the tropic-subtropical margins in warmer climate,” Professor Fu says. “The finding is critical to understanding what the world will be like as the climate continues to change.

“Will the Hadley Circulation continue to expand? Could the intensification of droughts over the tropics be a new norm? These are questions that need to be answered.” – Climate News Network

Science offers new view of human survival hopes

Astrophysicists say questions about the sustainability of civilisation on our high-tech planet may soon be answered scientifically, as a result of new data about the Earth and other planets in our galaxy.

LONDON, 15 November, 2014 − Two American scientists have sought a way of answering the ultimate global warming question: how long can any species last once it has discovered how to exploit fossil fuels and change the conditions under which it first evolved?

In doing so, they have sidestepped the great challenge of astrobiology: all thinking about life in the universe is handicapped by a simple problem. Because there is only one identified instance of life in the universe so far, it is impossible to arrive at a generalisation. But Adam Frank, assistant professor of astrophysics at the University of Rochester in New York, and Woodruff Sullivan, professor of astronomy at the University of Washington in Seattle, propose a way round the problem.

Energy intensive

They report in The Anthropocene journal that, since they were interested in the potential lifetimes of human, humanoid or other intelligent species with energy-intensive technology (SWEIT), they could start by using a famous equation to estimate the number of such species that exist now or have already gone extinct.

The Drake Equation is the intellectual basis of the search for extraterrestrial civilisation. It calculates the number of possible planetary systems in all the known galaxies, the proportion of these that might be hospitable to life, and the proportion of habitable planets that might be fit for the emergence of a technically advanced or SWEIT civilisation.

They reason that, even if the chances of a high-technology species are just one in a thousand trillion, a thousand such SWEIT civilisations exist or have existed in our local region of the universe. Prof Frank says: “That’s enough to start thinking about statistics − like what is the average lifetime of a species that starts harvesting energy efficiently and uses it to develop high technology?”
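The headline arithmetic here is simple expected-value reasoning, and can be checked in one line. The sketch below reproduces it under an assumption the article only implies: that "our local region of the universe" contains about 1e18 candidate planetary systems, a number inferred from the stated odds and result, not taken from the paper itself.

```python
# Back-of-envelope check of the Drake-style estimate quoted in the article.
# ASSUMPTION: ~1e18 candidate planetary systems in "our local region of the
# universe" -- inferred from the article's numbers, not from the paper.
def expected_sweit(n_candidates, p_sweit):
    """Expected number of SWEIT civilisations, past or present."""
    return n_candidates * p_sweit

N_LOCAL = 1e18   # assumed candidate planetary systems in the local region
P_SWEIT = 1e-15  # "one in a thousand trillion" odds per candidate

print(expected_sweit(N_LOCAL, P_SWEIT))  # roughly 1,000 civilisations
```

The point of the calculation is that even absurdly pessimistic odds, multiplied by astronomical numbers of candidates, still leave a sample large enough for statistics.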

“We have no idea how long a technological civilisation like our own can last”

But another part of the puzzle is also uncertain. “We have no idea how long a technological civilisation like our own can last,” Frank says. “Is it 200 years, 500 years or 50,000 years? Answering this question is at the root of all our concerns about the sustainability of human society. “Are we the first and the only technologically-intensive civilisation in the entire history of the universe? If not, shouldn’t we stand to learn something from the past successes and failures of these other species?”

Human threats

The two authors considered the ways in which human action could threaten human civilisation, including: the partial or complete collapse of 95% of all fish stocks in the last 50 years; the diminishing supplies of fresh water; the loss of rainforest habitat; the acidification of the oceans; and, of course, the change to the climate system. All are a consequence of the use of energy-intensive technology.

They also contemplated the relatively new science of sustainability: how long can such action continue? They note that 20,000 scientific papers addressing sustainability have appeared in the last 40 years, and the number of these articles has doubled every eight years.

Then they looked at what little could be known from astrobiology − the study of life beyond the solar system. None has been found, but in the last two decades a huge number of extrasolar planets have been identified. The local solar system has been explored in detail, and the Earth’s own history is now well studied. So astronomers could now be in a position to make judgments about the potential conditions for life on the “exoplanets” identified so far.

For the purpose of estimating an average lifetime for an extraterrestrial species, it wouldn’t much matter what form the life took: any form would affect entropy, the thermodynamic balance of order and disorder. “If they use energy to produce work, they’re generating entropy,” says Prof Frank. “There’s no way round that, whether they’re human-looking Star Trek creatures with antennae on their foreheads or they’re nothing more than single-cell organisms with collective mega-intelligence.

Feedback effects

“And that entropy will almost certainly have strong feedback effects on their planet’s habitability, as we’re beginning to see here on Earth.”

With this in mind, the report’s authors started to consider the sustainability lessons of Earth’s own history − marked by five mass extinction events in the past 500 million years − and a set of recent human-driven changes so pronounced that some geologists have labelled the present era the Anthropocene.

Their conclusions are less than optimistic. “Although such rapid changes are not a new phenomenon, the present instance is the first (we know of) where the primary agent of causation is knowingly watching it all happen and pondering options for its own future,” they conclude. “One point is clear: both astrobiology and sustainability science tell us that the Earth will be fine in the long run. The prospects are, however, less clear for Homo sapiens.” − Climate News Network

Flow chart unclear for glacial rivers

Glaciers in the high Himalayas and on the Tibetan Plateau are a vital source of water for millions of people in Asia, but scientists question what will happen to supplies if the rate of melting continues to rise due to climate-related factors.

LONDON, 19 June – A new study examining river basins in Asia suggests that amounts of water supplied to the region by glaciers and rainfall in the Himalayas will increase in the coming decades.

At first reading, that looks like good news, as an estimated 1.3 billion people in Pakistan, India, Bangladesh, Nepal, China and elsewhere depend for their water supplies on rivers fed by glaciers and snowmelt. But the less welcome news is that scientists are unsure what will happen after 2050 if the rate at which glaciers melt continues to increase as a result of climate change.

Scientists say rising temperatures and more intense rainfall patterns in the higher Himalayas are causing the retreat of the majority of glaciers in the region.

Heat build-up

They say glacier melt is also being caused by black carbon – particulate matter that, in South Asia, comes mainly from cooking fires, the burning of waste, coal burning and diesel exhausts. The black carbon, or soot, falls on the glaciers, reducing reflectivity and increasing heat build-up.

This latest study of glacier melt and water flows, appearing in the journal Nature Climate Change, was carried out by scientists at Future Water, a Netherlands-based research group, Utrecht University, and the Nepal-based International Centre for Integrated Mountain Development. It assesses the contribution of glacier and snowmelt to the region’s river basins, which incorporate some of the world’s mightiest rivers – the Indus, the Ganges, the Brahmaputra, the Mekong and the Salween.

The scientists say highly sophisticated modelling techniques were used to study the river basins in unprecedented detail. They report: “Despite large differences in runoff composition and regimes between basins and between tributaries within basins, we project an increase in runoff at least until 2050, caused primarily by an increase in precipitation in the upper Ganges, Brahmaputra, Salween and Mekong basins and from accelerated melt in the upper Indus Basin.

“These findings have immediate consequences for climate change policies where a transition towards coping with intra-annual shifts in water availability is desirable.”

Uncertain supplies

But while the study says that, up to mid-century, little change is likely in the amount of glacier melt water flowing into river basins, it is unclear what will happen thereafter to the water supplies for what is a significant portion of the world’s population.

“Our study does not include projections after 2050,” Arthur Lutz, lead author of the study, told Climate News Network. “However, at some point in time, the contribution of glacier melt to the total flow will decrease, because of the decreasing glacier extent. When this happens, it will differ for different river basins and sub-basins.”

The study says the long-term outlook is particularly uncertain for the upper Indus basin. While glacier melt contributes only 11.5% of the total runoff in the upper basin of the Ganges river, it contributes more than 40% of total water runoff in the upper Indus basin.

The Indus river, which flows for nearly 2,000 miles from high up in the Hindu Kush-Karakoram Himalaya mountain range down to the Arabian Sea, is vital to life in Pakistan, providing water for 90% of the country’s agricultural crops. Hydro plants along the Indus also supply about half the country’s electricity. – Climate News Network

Glaciers in the high Himalayas and on the Tibetan Plateau are a vital source of water for millions of people in Asia, but scientists question what will happen to supplies if the rate of melting continues to rise due to climate-related factors LONDON, 19 June – A new study examining river basins in the Asia region suggests that amounts of water supplied to the area by glaciers and rainfall in the Himalayas will increase in the coming decades. At first reading, that looks like good news, as an estimated 1.3 billion people in Pakistan, India, Bangladesh, Nepal, China and elsewhere are dependent for their water supplies on rivers fed by glaciers and snowmelt. But the less welcome news is that scientists are unsure what will happen after 2050 if the rate at which glaciers melt continues to increase as a result of climate change. Scientists say rising temperatures and more intense rainfall patterns in the higher Himalayas are causing the retreat of the majority of glaciers in the region.

Heat build-up

They say glacier melt is also being caused by black carbon – particulate matter that, in South Asia, comes mainly from cooking fires, the burning of waste, coal burning and diesel exhausts. The black carbon, or soot, falls on the glaciers, reducing reflectivity and increasing heat build-up. This latest study of glacier melt and water flows, appearing in the journal Nature Climate Change, was carried out by scientists at FutureWater, a Netherlands-based research group, Utrecht University, and the Nepal-based International Centre for Integrated Mountain Development. It assesses the contribution of glacier and snowmelt to the region’s river basins, which include some of the world’s mightiest rivers – the Indus, the Ganges, the Brahmaputra, the Mekong and the Salween. The scientists say that highly sophisticated modelling techniques were used to study the river basins in unprecedented detail. They report: “Despite large differences in runoff composition and regimes between basins and between tributaries within basins, we project an increase in runoff at least until 2050, caused primarily by an increase in precipitation in the upper Ganges, Brahmaputra, Salween and Mekong basins and from accelerated melt in the upper Indus Basin. “These findings have immediate consequences for climate change policies where a transition towards coping with intra-annual shifts in water availability is desirable.”

Uncertain supplies

But while the study says that little change is likely before mid-century in the amount of glacier melt water flowing into river basins, it is unclear what will happen thereafter to water supplies for a significant portion of the world’s population. “Our study does not include projections after 2050,” Arthur Lutz, lead author of the study, told Climate News Network. “However, at some point in time, the contribution of glacier melt to the total flow will decrease, because of the decreasing glacier extent. When this happens, it will differ for different river basins and sub-basins.” The study says the long-term outlook is particularly uncertain for the upper Indus basin. While glacier melt contributes only 11.5% of the total runoff in the upper basin of the Ganges river, it contributes more than 40% of total water runoff in the upper Indus basin. The Indus river, which flows for nearly 2,000 miles from high up in the Hindu Kush-Karakoram Himalaya mountain range down to the Arabian Sea, is vital to life in Pakistan, providing water for 90% of the country’s agricultural crops. Hydro plants along the Indus also supply about half the country’s electricity. – Climate News Network

Flat denial rejects 'very likely' science

FOR IMMEDIATE RELEASE
Warnings, predictions and statements of probability all have their uses, but only if people heed them. Research tells one story, human behaviour seems to offer another version.

LONDON, 28 April – The odds that global warming of almost 1°C since 1880 is just a natural fluctuation are very low: less than one in a hundred and probably less than one in a thousand, according to a study in the journal Climate Dynamics.

Shaun Lovejoy of McGill University in Canada didn’t play with computer simulations: he simply looked at the climate data since 1500 and subjected it to statistical analysis. The message from the historical data – records, tree rings, ice cores, lake sediments and so on – is that global warming is linked to fossil fuel-burning and to rising levels of greenhouse gases in the atmosphere.

“This study will be a blow to any remaining climate change deniers,” said the physicist. “Their two most convincing arguments – that the warming is natural in origin, and that the computer models are wrong – are either directly contradicted by this analysis, or simply do not apply to it.”

Lovejoy’s finding is unlikely to be the end of the story, perhaps because there are problems with words like “probability”. David Budescu of Fordham University in the US reports in Nature Climate Change that when people hear the words “very likely” used to describe a 95% chance that something is the case, they are more likely to interpret that as around 50% probability.

Budescu and his colleagues worked their way through the problems the Intergovernmental Panel on Climate Change (IPCC) had in presenting its findings. The researchers tested the terms very unlikely, unlikely, likely and very likely with respondents in 24 nations and 17 languages, and found that, on balance, while the IPCC intended “very likely” to mean a more than 90% chance, people tended to understand the phrase as closer to 50%. The researchers suggest that the IPCC should put the numbers in, or at least change the way it presents uncertainty.
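The gap Budescu measured is easy to make concrete. A minimal sketch (the numeric ranges below are the calibrated likelihood scale from the IPCC's published uncertainty guidance; the ~50% "perceived" figure is the reader interpretation reported in the article above, not an IPCC number):

```python
# IPCC calibrated likelihood scale (from the AR4/AR5 uncertainty guidance
# notes): each term maps to the probability range the authors intend.
IPCC_LIKELIHOOD = {
    "virtually certain": (0.99, 1.00),
    "very likely": (0.90, 1.00),
    "likely": (0.66, 1.00),
    "about as likely as not": (0.33, 0.66),
    "unlikely": (0.00, 0.33),
    "very unlikely": (0.00, 0.10),
    "exceptionally unlikely": (0.00, 0.01),
}

def intended_minimum(term: str) -> float:
    """Lower bound of the probability range the IPCC intends by `term`."""
    return IPCC_LIKELIHOOD[term][0]

# Budescu's reported reader interpretation of "very likely" (around 50%)
# versus the >= 90% the IPCC intends:
perceived = 0.50
gap = intended_minimum("very likely") - perceived
print(f"intended >= {intended_minimum('very likely'):.0%}, "
      f"perceived ~ {perceived:.0%}, gap = {gap:.0%}")
```

Printing the numbers alongside the words is essentially the remedy the researchers propose.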

A challenge too far

But there has been consistent evidence that people tend to think in unpredictable ways when contemplating an uncertain future predicted decades ahead. In 2010, psychologists at the University of California, Berkeley conducted an experiment on undergraduates and found that people tended to discount the most apocalyptic warnings if they challenged their view of a stable and orderly world.

“Fear-based appeals, especially when not coupled with a clear solution, can backfire and undermine the intended effects of the messages,” the researchers conclude in the journal Psychological Science.

And even when people were prepared to accept that climate change was a substantial threat, there could be resistance to meeting the costs of mitigation.

The problem is almost as old as the spectre of global warming itself. In his 2012 book The City and the Coming Climate (Cambridge University Press) Brian Stone recalls the spectacular US heat waves and drought of 1988, then the hottest year ever recorded.

Off the charts

At the time the Nasa scientist James Hansen put up a $100 wager that at least one of the first three years of the 1990s would surpass the 1988 record. Hansen was the man who in 1988 told a US Senate committee “it was time to stop waffling … the evidence is pretty strong that the greenhouse effect is here,” and thus put global warming on the political agenda for the first time.

Nobody took his money. “1988 not only was hot: it was off the charts in terms of historical extremes, including the Dust Bowl years of the 1930s,” writes Stone. “Yet the 1990s would render the ’88 record almost trivial.”

The temperature anomalies continued to mount: new temperature records were set every 30 months. Nine of the 10 hottest years ever recorded happened between 2001 and 2010, and the temperature anomaly in 2010 was twice that of 1988.

The statistical probability that such a string of increasingly hot years had nothing to do with climate change was effectively zero, Stone writes. “The implications of these trends should be apparent to every sentient person alive today: the Earth’s climate is changing.” – Climate News Network
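Stone's "effectively zero" can be illustrated with a back-of-envelope null model (a sketch under a stated assumption, not his own calculation): if annual global temperatures carried no trend and were exchangeable, the ten warmest years of the 131-year record from 1880 to 2010 would be a uniform random subset, and the chance that at least nine of them land in the final decade follows a hypergeometric tail:

```python
from math import comb

# Null model: no warming trend, so annual mean temperatures are exchangeable
# and the TOP warmest years of an N-year record form a uniform random subset.
N, WINDOW, TOP = 131, 10, 10   # years 1880-2010; final decade; ten hottest

def p_at_least(k: int) -> float:
    """P(at least k of the TOP warmest years fall inside the final WINDOW)."""
    return sum(
        comb(WINDOW, j) * comb(N - WINDOW, TOP - j)
        for j in range(k, TOP + 1)
    ) / comb(N, TOP)

print(f"P(>= 9 of 10 hottest years in final decade, by chance) "
      f"= {p_at_least(9):.1e}")
```

The tail probability comes out around one in a hundred billion, which is what "effectively zero" means in practice.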

'Forget the cost – tackle climate anyway'

FOR IMMEDIATE RELEASE
Forget the cost of mitigating climate change, say two researchers. It’s impossible to work out how much it will be – and whatever it is, we should do it anyway.

LONDON, 3 April – Two researchers who tried to work out the economics of reducing global climate change to a tolerable level have come up with a perhaps surprising answer: essentially, we do not and cannot know what it would cost.

Even more surprising, probably, is their conclusion: not knowing is no excuse for not acting. “Mitigating climate change must proceed regardless of long-run economic analyses”, they conclude, “or risk making the world uninhabitable.”

Their report, entitled The economics of mitigating climate change: What can we know?, is published online in Technological Forecasting and Social Change.

The pair are Dr Rich Rosen, who specialises in energy system planning and is a senior fellow of the Tellus Institute, based in Boston, Massachusetts, and Edeltraud Guenther, professor of environmental management and accounting at Dresden University of Technology in Germany.

In a densely argued analysis of the long-term economics of mitigating climate change, they say various kinds of uncertainty raise serious questions about whether the net costs and benefits of mitigation over periods as long as 50 years or a century can be known accurately enough to be useful to policymakers and citizens.

Crisis ‘trumps uncertainty’

Technological change, especially for energy efficiency technologies, is a key factor in making the net economic results of mitigation unknowable over the long term, they argue. So policymakers should not base mitigation policy on the estimated net economic impacts computed by integrated assessment models (IAMs – models that combine scientific and economic insights).

Instead, “mitigation policies must be forcefully implemented anyway given the actual physical climate change crisis, in spite of the many uncertainties involved in trying to predict the net economics of doing so”.

This argument directly challenges the many politicians and others who insist that governments should adopt policies designed to limit climate change only if they can make a strong economic case for doing so. Essentially, it shifts the ground of the debate from “what is affordable?” to “what is survivable?”

The authors say economic analyses of mitigating climate change rely on flawed sets of IAM results, which are invalidated by uncertainty over future technologies and their costs. They also believe changes in production and consumption patterns will affect mitigation costs.

‘Meaningless’ results

They write: “Since the Western lifestyle can probably not serve as a role model for the life styles of the nine billion people likely to inhabit our planet by 2050, significant but unpredictable changes to consumption and production patterns not incorporated in existing IAMs are likely to occur, adding another layer of uncertainty to the economic calculations made by these IAMs for the net costs and benefits of mitigating climate change.”

“The IPCC and other scientific bodies should no longer report attempts at calculating the net economic impacts of mitigating climate change…”

The authors do not hide their scorn for the results provided by existing IAM scenarios. These, they write, are “not useful because even the simplest comparison of model results yields meaningless results — the uncertainties are too profound.”

They end by posing a question: “Should these findings and conclusions about the inadequacies of current IAMs really matter to policymakers who are trying to figure out when, and to what extent, to implement effective climate change mitigation policies?”

Their response is terse: “Our answer is ‘no’, because humanity would be wise to mitigate climate change as quickly as possible without being constrained by existing economic systems and institutions, or risk making the world uninhabitable.” – Climate News Network

FOR IMMEDIATE RELEASE
Forget the cost of mitigating climate change, say two researchers. It’s impossible to work out how much it will be – and whatever it is, we should do it anyway.

LONDON, 3 April – Two researchers who tried to work out the economics of  reducing global climate change to a tolerable level have come up with a perhaps surprising answer: essentially, we do not and cannot know what it would cost.

Even more surprising, probably, is their conclusion: not knowing is no excuse for not acting. “Mitigating climate change must proceed regardless of long-run economic analyses”, they conclude, “or risk making the world uninhabitable.”

Their report, entitled The economics of mitigating climate change: What can we know?,is published online in Technological Forecasting and Social Change.

The pair are Dr Rich Rosen, who specialises in energy system planning and is a senior fellow of the Tellus Institute, based in Boston, Massachusetts, and Edeltraud Guenther, professor of environmental management and accounting at Dresden University of Technology in Germany.

In a densely-argued analysis of the long-term economics of mitigating climate change they say various kinds of uncertainties raise serious questions about whether or not the net costs and benefits of mitigation over periods as long as 50 years or a century can be known accurately enough to be useful to policymakers and citizens.

Crisis ‘trumps uncertainty’

Technological change, especially for energy efficiency technologies, is a key factor in making the net economic results of mitigation unknowable over the long term, they argue. So policymakers should not base mitigation policy on the estimated net economic impacts computed by integrated assessment models (IAM – models which combine scientific and economic insights).

Instead, “mitigation policies must be forcefully implemented anyway given the actual physical climate change crisis, in spite of the many uncertainties involved in trying to predict the net economics of doing so”.

This argument directly challenges the many politicians and others who insist that governments should adopt policies designed to limit climate change only if they can make a strong economic case for doing so. Essentially, it shifts the ground of the debate from “what is affordable?” to “what is survivable?”

The authors say economic analyses of mitigating climate change rely on flawed sets of IAM results, which are invalidated by uncertainty over future technologies and their costs. They also believe changes in production and consumption patterns will affect mitigation costs.

‘Meaningless’ results

They write: “Since the Western lifestyle can probably not serve as a role model for the life styles of the nine billion people likely to inhabit our planet by 2050, significant but unpredictable changes to consumption and production patterns not incorporated in existing IAMs are likely to occur, adding another layer of uncertainty to the economic calculations made by these IAMs for the net costs and benefits of mitigating climate change.”

“The IPCC and other scientific bodies should no longer report attempts at calculating the net economic impacts of mitigating climate change…”

The authors do not hide their scorn for the results provided by existing IAM scenarios. These, they write, are “not useful because even the simplest comparison of model results yields meaningless results — the uncertainties are too profound.”

They end by posing a question: “Should these findings and conclusions about the inadequacies of current IAMs really matter to policymakers who are trying to figure out when, and to what extent, to implement effective climate change mitigation policies?

Their response is terse: “Our answer is ‘no’, because humanity would be wise to mitigate climate change as quickly as possible without being constrained by existing economic systems and institutions, or risk making the world uninhabitable.” – Climate News Network

‘Forget the cost – tackle climate anyway’

FOR IMMEDIATE RELEASE Forget the cost of mitigating climate change, say two researchers. It’s impossible to work out how much it will be – and whatever it is, we should do it anyway. LONDON, 3 April – Two researchers who tried to work out the economics of  reducing global climate change to a tolerable level have come up with a perhaps surprising answer: essentially, we do not and cannot know what it would cost. Even more surprising, probably, is their conclusion: not knowing is no excuse for not acting. “Mitigating climate change must proceed regardless of long-run economic analyses”, they conclude, “or risk making the world uninhabitable.” Their report, entitled The economics of mitigating climate change: What can we know?,is published online in Technological Forecasting and Social Change. The pair are Dr Rich Rosen, who specialises in energy system planning and is a senior fellow of the Tellus Institute, based in Boston, Massachusetts, and Edeltraud Guenther, professor of environmental management and accounting at Dresden University of Technology in Germany. In a densely-argued analysis of the long-term economics of mitigating climate change they say various kinds of uncertainties raise serious questions about whether or not the net costs and benefits of mitigation over periods as long as 50 years or a century can be known accurately enough to be useful to policymakers and citizens.

Crisis ‘trumps uncertainty’

Technological change, especially for energy efficiency technologies, is a key factor in making the net economic results of mitigation unknowable over the long term, they argue. So policymakers should not base mitigation policy on the estimated net economic impacts computed by integrated assessment models (IAM – models which combine scientific and economic insights). Instead, “mitigation policies must be forcefully implemented anyway given the actual physical climate change crisis, in spite of the many uncertainties involved in trying to predict the net economics of doing so”. This argument directly challenges the many politicians and others who insist that governments should adopt policies designed to limit climate change only if they can make a strong economic case for doing so. Essentially, it shifts the ground of the debate from “what is affordable?” to “what is survivable?” The authors say economic analyses of mitigating climate change rely on flawed sets of IAM results, which are invalidated by uncertainty over future technologies and their costs. They also believe changes in production and consumption patterns will affect mitigation costs.

‘Meaningless’ results

They write: “Since the Western lifestyle can probably not serve as a role model for the life styles of the nine billion people likely to inhabit our planet by 2050, significant but unpredictable changes to consumption and production patterns not incorporated in existing IAMs are likely to occur, adding another layer of uncertainty to the economic calculations made by these IAMs for the net costs and benefits of mitigating climate change.” “The IPCC and other scientific bodies should no longer report attempts at calculating the net economic impacts of mitigating climate change…” The authors do not hide their scorn for the results provided by existing IAM scenarios. These, they write, are “not useful because even the simplest comparison of model results yields meaningless results — the uncertainties are too profound.” They end by posing a question: “Should these findings and conclusions about the inadequacies of current IAMs really matter to policymakers who are trying to figure out when, and to what extent, to implement effective climate change mitigation policies? Their response is terse: “Our answer is ‘no’, because humanity would be wise to mitigate climate change as quickly as possible without being constrained by existing economic systems and institutions, or risk making the world uninhabitable.” – Climate News Network

FOR IMMEDIATE RELEASE Forget the cost of mitigating climate change, say two researchers. It’s impossible to work out how much it will be – and whatever it is, we should do it anyway. LONDON, 3 April – Two researchers who tried to work out the economics of  reducing global climate change to a tolerable level have come up with a perhaps surprising answer: essentially, we do not and cannot know what it would cost. Even more surprising, probably, is their conclusion: not knowing is no excuse for not acting. “Mitigating climate change must proceed regardless of long-run economic analyses”, they conclude, “or risk making the world uninhabitable.” Their report, entitled The economics of mitigating climate change: What can we know?,is published online in Technological Forecasting and Social Change. The pair are Dr Rich Rosen, who specialises in energy system planning and is a senior fellow of the Tellus Institute, based in Boston, Massachusetts, and Edeltraud Guenther, professor of environmental management and accounting at Dresden University of Technology in Germany. In a densely-argued analysis of the long-term economics of mitigating climate change they say various kinds of uncertainties raise serious questions about whether or not the net costs and benefits of mitigation over periods as long as 50 years or a century can be known accurately enough to be useful to policymakers and citizens.

Crisis ‘trumps uncertainty’

Technological change, especially in energy efficiency technologies, is a key factor in making the net economic results of mitigation unknowable over the long term, they argue. So policymakers should not base mitigation policy on the estimated net economic impacts computed by integrated assessment models (IAMs – models that combine scientific and economic insights). Instead, “mitigation policies must be forcefully implemented anyway given the actual physical climate change crisis, in spite of the many uncertainties involved in trying to predict the net economics of doing so”. This argument directly challenges the many politicians and others who insist that governments should adopt policies designed to limit climate change only if they can make a strong economic case for doing so. Essentially, it shifts the ground of the debate from “what is affordable?” to “what is survivable?” The authors say economic analyses of mitigating climate change rely on flawed sets of IAM results, which are invalidated by uncertainty over future technologies and their costs. They also believe changes in production and consumption patterns will affect mitigation costs.

‘Meaningless’ results

They write: “Since the Western lifestyle can probably not serve as a role model for the life styles of the nine billion people likely to inhabit our planet by 2050, significant but unpredictable changes to consumption and production patterns not incorporated in existing IAMs are likely to occur, adding another layer of uncertainty to the economic calculations made by these IAMs for the net costs and benefits of mitigating climate change.” They add: “The IPCC and other scientific bodies should no longer report attempts at calculating the net economic impacts of mitigating climate change…” The authors do not hide their scorn for the results provided by existing IAM scenarios. These, they write, are “not useful because even the simplest comparison of model results yields meaningless results — the uncertainties are too profound.” They end by posing a question: “Should these findings and conclusions about the inadequacies of current IAMs really matter to policymakers who are trying to figure out when, and to what extent, to implement effective climate change mitigation policies?” Their response is terse: “Our answer is ‘no’, because humanity would be wise to mitigate climate change as quickly as possible without being constrained by existing economic systems and institutions, or risk making the world uninhabitable.” – Climate News Network

Human activities 'caused record Oz heat'

FOR IMMEDIATE RELEASE
Australia’s 2013 summer was the hottest on record only because of human influences on the climate, meteorologists say. They report that people’s activities raised the likelihood of a record by about five times.

LONDON, 24 March – Australian researchers are in no doubt about what happened there last year. The country’s Bureau of Meteorology is a model of clarity: “2013 was Australia’s warmest year on record. Persistent and widespread warmth throughout the year led to record-breaking temperatures and several severe bushfires. Nationally-averaged rainfall was slightly below average.”

Now two Australian scientists say it is virtually certain that no records would have been broken had it not been for the influence on the climate of humans. They even put a figure on it: people, they say, raised the stakes about five times.

The World Meteorological Organization devotes a section in its report, WMO statement on the status of the global climate in 2013, to the scientists’ peer-reviewed case study, undertaken by a team at the ARC Centre of Excellence for Climate System Science at the University of Melbourne. It was adapted from an article originally published in the journal Geophysical Research Letters.

The study used nine global climate models to investigate whether changes in the probability of extreme Australian summer temperatures were due to human influences.

More frequent extremes ahead

It concluded: “Comparing climate model simulations with and without human factors shows that the record hot Australian summer of 2012/13 was about five times as likely as a result of human-induced influence on climate, and that the record hot calendar year of 2013 would have been virtually impossible without human contributions of heat-trapping gases, illustrating that some extreme events are becoming much more likely due to climate change.”

The report also strikes a warning note: “These types of extreme Australian summers become even more frequent in simulations of the future under further global warming.”

It says last year was notable as well because it was marked by what scientists call “neutral to weak La Niña ENSO conditions”, which would normally be expected to produce cooler temperatures across Australia, not hotter. El Niño is characterized by unusually warm temperatures and La Niña by unusually cool ones in the equatorial Pacific.

Before 2013, six of the eight hottest Australian summers had occurred during El Niño years. The WMO says natural ENSO variations are very unlikely to explain the record 2013 Australian heat.

“There is no standstill in global warming…The laws of physics are non-negotiable”

Introducing the report, the WMO secretary-general, Michel Jarraud, said many of the extreme events of 2013 were consistent with what we would expect as a result of human-induced climate change. And he repeated his insistence that claims of a pause in climate change were mistaken.

“There is no standstill in global warming. The warming of our oceans has accelerated, and at lower depths. More than 90% of the excess energy trapped by greenhouse gases is stored in the oceans.

“Levels of these greenhouse gases are at record levels, meaning that our atmosphere and oceans will continue to warm for centuries to come. The laws of physics are non-negotiable.”

The report says 13 of the 14 warmest years on record have occurred during this century, and each of the last three decades has been warmer than the previous one, culminating in 2001-2010 as the warmest decade on record. It confirms that 2013 tied with 2007 as the sixth warmest year on record, continuing the long-term global warming trend.

Temperatures in many parts of the southern hemisphere were especially warm, and Australia was not the only country to feel the impact: Argentina had its second hottest year on record. – Climate News Network
