UK Met Office maps of 2070-2100 weather

The UK Met Office Hadley Centre has published a page showing expected temperatures about 75 years from now based on the best models of ocean-atmosphere interactions and given emissions trends. Speculative, of course, but interesting.

http://www.metoffice.com/research/hadleycentre/models/modeldata.html

If you want heat and agony, though, there's no need to go out for fine dining when burgers are already on the grill:

http://drought.unl.edu/dm/monitor.html

What is this, like year 3 of the great Texas drought?

Tim
 
A simple question: If these models are so good, why can't they produce skillful forecasts at 70 days? And, if they cannot consistently produce skillful 70 day forecasts, why do we give them any credibility at 70 years?
 
Not such a simple question, I think. The implicit assumption is that short-term variability is greater than long-term variability. That's pretty clearly true of weather vs. short-term climate per your (serious) comment.

But when you get into the realm of these long-range atmo-ocean coupled models, that's a whole other ball game. And I think they recognize it with rather wide significance windows. They try to refine them with backcasting. I'd say, though, that it's pretty hard to get a good handle on some phenomena that have just started to become significant.

But what is this saying? I think it's saying that the actualized magnitude of things they don't have a good handle on is likely greater than the things they do. So we may either slow-cook, parboil, or fry -- TBD. IMO in no way does this uncertainty diminish the importance of the work.
 
Global Warming....think of the alternative!

Do you think these models take into account the potential global warming spoilers? Like a shutdown of the Gulf Stream, or a large volcanic eruption like Tambora/Krakatoa?
 
Actually, I think it is simple.

Unless something has changed recently, none of the climate models accurately reproduced the mid-20th Century cooling without being "tuned" to do so. If they don't accurately reproduce past weather without fudging, why do we think they will accurately forecast the future?

If a model can forecast the weather 70 years into the future, it should be able to forecast the weather seven months into the future. Since most everyone agrees they cannot, why do we want to bet $1,000,000,000,000 that they are correct? See: http://science.infoshop.org/index.php?name=News&file=article&sid=255 . Consider all the ways that money could be spent that are CERTAIN to have a positive return on investment.

Fact: The sun has burned "hotter" (see the Max Planck Institute studies) over the last 20 years than at any time in the last millennium. We have no theory for predicting (let alone a methodology for doing so) how solar energy will change in the next 5 years. We don't understand (let alone model) the radiative properties of clouds. We can't forecast volcanoes. The rapid ocean cooling of the last year was not forecast in advance.

Remember: NOAA and Gray thought 2006 was going to be a bad hurricane season. That forecast was for a period 30 days to 200 days into the future (the length of the hurricane season). Those forecasts were busts. I state this not to criticize those forecasters. I point this out to illustrate what I believe is the folly of forecasting the weather 36,000 days into the future.

Mike
 
Fudging is a good word to use. One of the greatest physicists of the 20th century decided to fudge in a number to make his equations come out the way he wanted them to, and regretted that mistake later. Not the same as this climate "issue," of course, but it does apply to one person, or a group of people, making things look the way they want them to look.

Like Mike Smith wrote, no person can predict future "solar heat output" with any degree of accuracy. The same thinking applies to accurate weather prediction beyond a certain number of days. Seasonal outlook accuracy is little better than a five-year-old flipping a coin and being told to forecast warm if heads, cold if tails. Now the UK Met Office is going to tell all of us how things will be 36,000 days in the future... get real! North Korea could nuke the planet before then - forgive me for drifting, but then again, I'm no fan of Nostradamus either :)

Expect a positive charge, get a positive charge. Expect to see global heat, get global heat. I realize that is a terrible perversion of a wonderfully accurate theory. Who is to say that that isn't precisely what UK MET has done?
 
Mike,

Forecasting the number and intensity of hurricanes is VERY different from forecasting average temperatures decades down the road. These global climate models do not (by design and intention) model 'weather' at specific locations in hopes of 'predicting the weather on August 5th, 2085'. There is a lot of parameterization involved, but the goal is to model average conditions. High-frequency variability doesn't matter much the way it does in the day-to-day forecasting we're most familiar with. In short-term forecasts (a few days), there can be a lot of variability (both in space and time). If we average everything out over a relatively large area and a relatively long time, however, this small-scale variability disappears (it "averages out"). In the long term, global energy balance drives the climate and global weather patterns, and it would be quite an event for a single, day-long local weather event to significantly affect the climate of an entire region.
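The "averages out" argument is just the law of large numbers. A toy numerical sketch (the signal and noise magnitudes here are invented for illustration, not taken from any climate model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each day's local temperature anomaly = a small, fixed
# "climate signal" plus large random "weather" noise.
signal = 0.5        # deg C, the slow climate signal (hypothetical)
weather_sd = 5.0    # deg C, day-to-day weather variability (hypothetical)

daily = signal + rng.normal(0.0, weather_sd, size=365 * 30)  # 30 years of days

# A single day is dominated by noise; a long-run mean recovers the signal.
one_day_error = abs(daily[0] - signal)
mean_error = abs(daily.mean() - signal)

print(f"single-day anomaly error: {one_day_error:.2f} C")
print(f"30-year-mean error:       {mean_error:.3f} C")
```

With ~11,000 days, the standard error of the mean is weather_sd / sqrt(N), roughly a twentieth of a degree here, which is why the averaged quantity is far more predictable than any individual day.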

Yes, there is a lot of politics behind the global warming campaign, but that (and the fact that some in Hollywood are pushing it as if it's a fad) does not mean that it isn't happening. Actually, I should rephrase that -- few argue that global warming ISN'T occurring; the vast majority of data show that the globe is warming. The biggest point of contention is whether human activity is having a noticeable and appreciable impact on that warming. I recently saw a study from NCAR meteorologists noting that, per their work, the world could not have warmed as quickly and as much as it has by natural influences alone. Say what you will, but the majority of global collaborative studies have concluded that global warming is being aided by anthropogenic factors. There seem to be many more studies supporting the idea of humans significantly influencing global warming than studies showing otherwise. Of course, this doesn't prove that humans are influencing the global climate, but I certainly think it's a "more likely than not" situation.


The response to global warming is highly non-linear, so an increase in average global temperature does not mean that everyone will see warmer weather. Likewise, it does not mean that the world will continue to warm. The ocean / thermohaline circulation is quite complex, and there are a lot of secondary effects that may produce results directly opposite of those that originally seem obvious. Heck, at some point, the world may get so warm that it triggers some other highly non-linear process that causes a rapid cool-down. Or, we could just hope for one or two massive eruptions from a supervolcano to cast us into an ice age (as has happened in the past).
 
I would like to point out that my own seasonal remarks from above refer particularly to climatic temperature “predictions.” I was not trying to suggest in any way tropical cyclone predictions are similar in accuracy.
 
Jeff,

Thank you for your response, but you are not telling me anything regarding the models I did not know. Please re-read my response. At no point did I express an opinion regarding whether global warming is real or whether humans are driving climate change. My comments were directed to what appears to be an unwarranted belief in the skill of these ultra long range forecasts -- even though they have been less than terrific in "pastcast" mode.

A challenge: Let's have the most sophisticated model start making daily forecasts for the mean conditions from day 150 to day 180 for several specific cities and start verifying them for a year or two. Not the daily weather, but the 30 day mean temperature and precipitation departure from normal for each location. If your premise is correct that it "averages out" then the mean forecasts will be consistently skillful. Want to make any bets as to the outcome?
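The verification Mike proposes amounts to scoring long-lead forecasts against a climatology baseline. A minimal sketch of that bookkeeping, with entirely hypothetical forecast and observation numbers:

```python
import numpy as np

def skill_score(forecasts, observations, climatology):
    """Mean-squared-error skill score relative to a climatology baseline.
    1.0 = perfect, 0.0 = no better than climatology, negative = worse."""
    forecasts = np.asarray(forecasts, dtype=float)
    observations = np.asarray(observations, dtype=float)
    mse_forecast = np.mean((forecasts - observations) ** 2)
    mse_climo = np.mean((climatology - observations) ** 2)
    return 1.0 - mse_forecast / mse_climo

# Hypothetical 30-day mean temperature departures (deg C) for one city,
# verified over 12 forecast cycles. Climatology's departure forecast is 0.
obs  = np.array([1.2, -0.4, 0.8, -1.5, 0.3, 2.1, -0.9, 0.5, -0.2, 1.0, -1.1, 0.6])
fcst = np.array([0.5,  0.2, 0.4, -0.3, 0.1, 0.8, -0.2, 0.3,  0.0, 0.4, -0.5, 0.2])

print(f"skill vs. climatology: {skill_score(fcst, obs, 0.0):.2f}")
```

A model with genuine skill at this range would need to score consistently above zero across many cities and many cycles; a single good season proves nothing.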

You say of NCAR's work: "the world could not have warmed as quickly and much as it has by natural influences alone." Given the uncertainties I just listed (solar output, radiative forcing, the unpredicted ocean cooling, among many others), how could anyone say "could not"?!

Various global warming scientists have forecast warming, cooling, drought and flood (I can document this comment), so essentially, at least one of their forecasts will be correct no matter what happens. Vindication!

For the record, I am a human-induced global warming agnostic. I am perfectly willing to believe it when there is good science that indicates it. But 36,000 day forecasts that have not been proven skillful in a FORECAST, let alone pastcast, mode are not at all convincing to me.

Again, thank you for the response.

Mike
 
Over time, any forecasting model worth its jock should work out to climatology (for temperature, precip, etc.) for any given location. It may miss events (resulting in bad forecasts in the short term), but over the long term it should be correct (for climate). The climate models aren't simply taking WRF/GFS out to 100000000000 hours and saying yes, the climate will be warming.

As far as global warming goes, I'm not going to get into an in-depth discussion, but look at CO2 graphs over time. CO2 flux lines up nicely with warmer and colder periods. I think it is indisputable that a) we're increasing CO2 and b) humans have an impact on global processes (look at the ozone hole, for example). Draw your own conclusions from that, but we're at least impacting the Earth system. I thought the plots from the UK looked pretty good for North Dakota... I think I'll stick around up here for a while ;)

Aaron
 
Thanks for your response, Aaron.

Actually, that is exactly what the climate models are doing. Because we don't know what the future state of the climate will be, we don't know whether they are accurately averaging to climate, especially since climate naturally changes over time. And because we don't verify these models in forecast mode, we have no idea whether they have any skill at short-term (let alone long-term) climate prediction.

I agree the graphs of CO2 and world temperature show a high correlation with each other, but the graphs I have seen show the temperature rising before the CO2 levels do, which would imply that CO2 is not the cause.
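The lead/lag question is usually examined with a lagged cross-correlation: slide one series against the other and see which offset correlates best. A sketch on synthetic data (real ice-core records are far messier, and correlation at a lag still doesn't settle causation either way):

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Correlation of x against y shifted by `lag` steps.
    Positive lag means x leads y (x at time t vs. y at time t + lag)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

# Synthetic series for illustration only: y is a copy of the driver x,
# delayed by 3 steps, plus noise.
rng = np.random.default_rng(1)
x = rng.normal(size=500)                          # the driver series
y = np.roll(x, 3) + rng.normal(0.0, 0.3, size=500)  # lagged, noisy response

best_lag = max(range(-10, 11), key=lambda k: lagged_corr(x, y, k))
print(f"lag with highest correlation: {best_lag}")
```

On this synthetic pair the method recovers the built-in 3-step delay; applied to paleoclimate data, the sign of the best lag is exactly what the "temperature leads CO2" argument turns on.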

You mention the ozone hole. A great deal of money and inconvenience have been expended trying to fix it. Five years ago, there were various proclamations of "victory" -- chlorofluorocarbons had been banned. Well, it turned out not to be that simple. The ozone hole is worse than ever. See: www.theozonehole.com/ozonehole2006.htm . In the long run, it may turn out that the CFC ban was a good thing and the ozone hole will "heal." But, in view of the worsening ozone hole (in spite of the ban), it is questionable whether banning CFCs will turn out to have a favorable benefit-to-cost ratio, or whether it will have had any significant effect on the ozone hole at all.

The worsening of the ozone hole after spending all of the money and effort to ban CFCs should be a cautionary note for atmospheric scientists in the global warming arena. The sun-earth-atmosphere-ocean system is extraordinarily complex. I simply don't buy the idea that unverified models can make accurate 36,000-day forecasts or that we should spend one quadrillion dollars based on them.

Mike
 
Considering CFCs can last for ~100 years in the atmosphere, I wouldn't expect the ozone hole to simply go away after the banning.

[Attached image: oz_hole_area.jpg -- plot of ozone hole area over time]

I'd say the rate of change has decreased.
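The ~100-year lifetime figure can be put into numbers: if atmospheric removal is roughly exponential (a simplifying assumption; real CFC lifetimes vary by species), then a decade after emissions stop, almost all of the existing burden is still aloft:

```python
import math

def fraction_remaining(years, lifetime=100.0):
    """Fraction of a gas still in the atmosphere after `years`,
    assuming simple exponential decay with the given e-folding lifetime."""
    return math.exp(-years / lifetime)

# The phase-out began in the late 1980s; roughly a decade later,
# when "victory" was being declared, most of the burden remained.
for t in (10, 50, 100):
    print(f"after {t:3d} years: {fraction_remaining(t):.0%} remaining")
```

About 90 percent remains after 10 years under this assumption, so a still-large ozone hole shortly after the ban is consistent with the ban working as intended.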
 
Mike,

I was a little vague in my last post regarding the natural forcings research -- the research only looked at variations in the sun's brightness. Here's the article on the UCAR website about the study:

“Our results imply that, over the past century, climate change due to human influences must far outweigh the effects of changes in the Sun's brightness,” says Wigley.

...

Brightness variations are the result of changes in the amount of the Sun’s surface covered by dark sunspots and by bright points called faculae. The sunspots act as thermal plugs, diverting heat from the solar surface, while the faculae act as thermal leaks, allowing heat from subsurface layers to escape more readily. During times of high solar activity, both the sunspots and faculae increase, but the effect of the faculae dominates, leading to an overall increase in brightness.
The new study looked at observations of solar brightness since 1978 and at indirect measures before then, in order to assess how sunspots and faculae affect the Sun’s brightness. Data collected from radiometers on U.S. and European spacecraft show that the Sun is about 0.07 percent brighter in years of peak sunspot activity, such as around 2000, than when spots are rare (as they are now, at the low end of the 11-year solar cycle). Variations of this magnitude are too small to have contributed appreciably to the accelerated global warming observed since the mid-1970s, according to the study, and there is no sign of a net increase in brightness over the period.



To assess the period before 1978, the authors used historical records of sunspot activity and examined radioisotopes produced in Earth's atmosphere and recorded in the Greenland and Antarctic ice sheets. During periods of high solar activity, the enhanced solar wind shields Earth from cosmic rays that produce the isotopes, thus giving scientists a record of the activity.


Changes in Solar Brightness Too Weak to Explain Global Warming
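The 0.07 percent brightness figure quoted above can be converted into a rough top-of-atmosphere radiative forcing to see why the study calls it small. A back-of-envelope sketch using standard textbook values (the solar constant and albedo here are not taken from the article):

```python
# Rough conversion of a 0.07% change in solar brightness into a
# globally averaged radiative forcing.
solar_constant = 1361.0    # W/m^2, standard textbook value
albedo = 0.30              # fraction of sunlight reflected, textbook value
brightness_change = 0.0007 # the 0.07% peak-to-trough variation

# Divide by 4 (sunlit disk vs. whole sphere) and remove the reflected part.
absorbed = solar_constant * (1 - albedo) / 4.0
forcing = absorbed * brightness_change

print(f"mean absorbed solar flux:   {absorbed:.0f} W/m^2")
print(f"forcing from a 0.07% change: {forcing:.2f} W/m^2")
```

The result is on the order of 0.2 W/m^2, and it oscillates with the 11-year cycle rather than accumulating, which is the quantitative basis for the study's conclusion.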




Then, there's this one:
Previous efforts to understand the causes of changes in SSTs have focused on temperature changes averaged over very large ocean areas, such as the entire Atlantic or Pacific basins. The new research specifically targets SST changes in much smaller hurricane formation regions.
For the period 1906-2005, the researchers found an 84 percent probability that human-induced factors—primarily an increase in greenhouse gas emissions—account for most of the observed rise in SSTs in the Atlantic and Pacific hurricane formation regions.
"The important conclusion is that the observed SST increases in these hurricane breeding grounds cannot be explained by natural processes alone," says Wigley. "The best explanation for these changes has to include a large human influence."


Human Activities Are Boosting Ocean Temperatures in Areas Where Hurricanes Form, New Study Finds
 