Sunday, June 29, 2008

Phytoplankton surprisingly destroys a lot of ozone

Dr Katie Read and fifteen mostly U.K. and U.S. co-authors have studied the mechanisms destroying ozone (O₃) in the lower atmosphere above the ocean:
Extensive halogen-mediated ozone destruction over the tropical Atlantic Ocean (scientific paper in Nature, abstract)
Recall that ozone in the lower atmosphere is a highly potent greenhouse gas. Despite its small amount, it is responsible for almost 2/3 of the effect we attribute to CO₂. (The absolute size of the effect remains uncertain, mostly due to unknown feedbacks, but most of these feedbacks are universal multiplicative factors for all greenhouse gases.) When you divide the "shared absorption" between the different overlapping gases, the percentages of the total greenhouse effect are as follows (source):

67%+ H₂O (water)
15% CO₂ (carbon dioxide)
10% O₃ (ozone)
3% CH₄ (methane)
3% N₂O (nitrous oxide)
Water itself would be able to cause a much higher percentage of the effect than 67%, but some of its spectral lines are absorbed by - and attributed to - the competing gases. At any rate, you see that O₃ and CH₄, when added together, are almost as important as CO₂, so we should care about them.

Is ozone good? The popular cliché is that lower-atmosphere ozone is a health hazard while higher-atmosphere ozone is helpful, protecting us against ultraviolet radiation.
There are all kinds of processes that create and destroy ozone. Dr Katie Read et al., the authors of the present peer-reviewed paper, spectroscopically observed the gases at the Cape Verde Observatory (Atlantic Ocean islands, 500 miles west of Senegal in West Africa).
See also: Other lab problems facing the mainstream theory of the ozone cycle
They found out that phytoplankton apparently creates a lot of bromine monoxide and iodine monoxide. These compounds subsequently destroy a lot of ozone and lead to additional products that destroy some methane, too.

The existing conventional mainstream models of Earth's chemistry completely neglect halogens (F, Cl, Br, I, At). That's why they end up with a wrong figure for how much ozone is destroyed above the ocean - i.e. a wrong figure for the concentration of an important greenhouse gas. The authors conclude that the actual amount of ozone destroyed in this way is 50% higher than the models would lead you to believe.

I want to point out that the rates of the (halogen-dominated) reactions they propose to explain the spectroscopic observations are generally increased by man-made (or other) CO₂ production and by "global warming". Remember that e.g. coccolithophores, a type of phytoplankton, thrive when CO₂ levels increase. Also, higher temperatures lead (or would lead) to increased water vapor above the ocean, which helps the halogen oxides escape from the ocean. All these relationships are examples of Le Chatelier's principle, i.e. Nature's natural ability to self-regulate. Negative feedbacks always win in the end.

A pseudoscientific reaction from RealClimate.ORG

RealClimate.ORG's Gavin Schmidt clearly doesn't like the fact that the findings show another mildly serious problem with the contemporary climate models. So he spreads some fog. The most breathtaking demonstration of his incompetence (or zeal) is the following quote:
... Yet this is completely misleading since neither climate sensitivity nor CO₂ driven future warming will be at all affected by any revisions in ozone chemistry - mainly for the reason that most climate models don't consider ozone chemistry at all. Precisely zero of the IPCC AR4 model simulations (discussed here for instance) used an interactive ozone module in doing the projections into the future.
Wow! He seems to be so proud that their models neglect virtually everything. So if a mechanism happens to destroy another greenhouse gas that is as important as CO₂, partially as a result of the presence of CO₂ itself, the sensitivity will not be affected "at all"! Who would have thought? Has Mr Schmidt ever heard of feedbacks? Or does he think that there is no interaction (or causal relationship) between the concentrations of different chemical compounds (and temperature, too)? Has he ever heard of so-called chemical reactions?

What he says is such a flagrant denial of the basics of science that I believe most people who learned some science in elementary school will know what's wrong with his opinions.

Political activists like him love to talk about positive feedbacks all the time - especially the production of water vapor indirectly caused by CO₂-induced warming - but when it comes to negative feedbacks such as the destruction of other greenhouse gases like O₃ and CH₄, they shouldn't be looked at "at all"! Is this what you call science and how you want to obtain correct (...) answers, Mr Schmidt? I am stunned.

The climate models that try to emulate the greenhouse effect but don't reproduce the correct chemistry are simply wrong models, because the chemistry that involves the greenhouse gases on either side of the formulae is completely crucial for the greenhouse effect. Because O₃ and CH₄ are also greenhouse gases, it is damn important to know whether they exist in the atmosphere, whether they are being destroyed, and whether (and by how much) they will be destroyed in the future. So Schmidt's statement
Precisely zero of the IPCC AR4 model simulations (discussed here for instance) used an interactive ozone module in doing the projections into the future.
implies that you should now throw precisely all IPCC AR4 models into the trash bin or, to put the same thing more moderately, work very hard to correct the bug and introduce the previously neglected important effect that was pointed out by Dr Katie Read et al.

Many similar problems with the models have been found in the past and many more will be found in the future. Science listens - and has to listen - to all these new insights; otherwise it would not be science and could make no progress. Looking at new data and insights and eliminating (or adjusting) existing theories is what scientists are really paid for.

Mr Schmidt's own attitude to the error-correcting procedures is clarified by the last sentence of his text:
But it seems that the "climate models will have to be adjusted" meme is just too good not to use - regardless of the context.
Very good. More precisely, what an astonishingly misguided person.

So Mr Schmidt finds error correction in science too good and prefers not to use it and not to correct errors in the climate models. In fact, he even prefers not to talk about adjustments at all because it could indicate that science and models are not the infallible and eternally valid verses from the Holy Scripture that he knows, believes, and uses to evangelize. Instead, they could become temporary insights that could be influenced (or even refuted!?) by every new observation or a new scientific paper - and that would be nothing short of hell! :-)

I happen to use the word "science" for this "hell".

Well, this approach of Mr Schmidt's might be one of the reasons why his personal opinions about the climate and the opinions of his comrades at RealClimate.ORG are scientifically worthless piles of crap. The more science learns about Nature, the crappier the opinions of similar zealots who are not ready to adjust their views will become. If you try to quantify how much this particular paper changes the numbers relevant for the climate sensitivity, it is fair to say that 5-10 papers like this one could change the numbers by something of order 100%. In a year or two, our understanding may be very different if we're doing things right. It's therefore damn important for climate science to (critically) read and (rationally) process such papers!

People like Schmidt are neither willing nor able to correct mistakes in their models and theories. Fortunately for them and unfortunately for the society, they are being paid for something completely different - for a blind promotion of wrong theories and politically convenient conclusions that are irrationally extracted from these wrong, never-updated, obsolete theories.

See also Science Daily, Nude Socialist, and Discover Magazine.

Destruction of greenhouse gases over tropical Atlantic

Large amounts of ozone – around 50% more than predicted by the world's state-of-the-art climate models – are being destroyed in the lower atmosphere over the tropical Atlantic Ocean. Published today (26th June '08) in the scientific journal Nature, this startling discovery was made by a team of scientists from the UK's National Centre for Atmospheric Science and the Universities of York and Leeds. It has particular significance because ozone in the lower atmosphere acts as a greenhouse gas and its destruction also leads to the removal of the third most abundant greenhouse gas, methane.

The findings come after analysing the first year of measurements from the new Cape Verde Atmospheric Observatory, recently set up by British, German and Cape Verdean scientists on the island of São Vicente in the tropical Atlantic. Alerted by these Observatory data, the scientists flew a research aircraft up into the atmosphere to make ozone measurements at different heights and more widely across the tropical Atlantic. The results mirrored those made at the Observatory, indicating major ozone loss in this remote area.

This image is of the Cape Verde Atmospheric Observatory, where instruments for monitoring the atmosphere are stationed. It has been operating for a year and the first year's data...

So, what's causing this loss? Instruments developed at the University of Leeds, and stationed at the Observatory, detected the presence of the chemicals bromine oxide and iodine oxide over the ocean in this region. These chemicals, produced by sea spray and emissions from phytoplankton (microscopic plants in the ocean), attack the ozone, breaking it down. As the ozone is destroyed, a chemical is produced that attacks and destroys the greenhouse gas methane. Up until now it has been impossible to monitor the atmosphere of this remote region over time because of its physical inaccessibility. Including this new chemistry in climate models will provide far more accurate estimates of ozone and methane in the atmosphere and improve future climate predictions.

Professor Alastair Lewis, Director of Atmospheric Composition at the National Centre for Atmospheric Science and a lead scientist in this study, said: "At the moment this is a good news story - more ozone and methane being destroyed than we previously thought - but the tropical Atlantic cannot be taken for granted as a permanent 'sink' for ozone. The composition of the atmosphere is in fine balance here - it will only take a small increase in nitrogen oxides from fossil fuel combustion, carried here from Europe, West Africa or North America on the trade winds, to tip the balance from a sink to a source of ozone."

Professor John Plane, University of Leeds said: "This study provides a sharp reminder that to understand how the atmosphere really works, measurement and experiment are irreplaceable. The production of iodine and bromine mid-ocean implies that destruction of ozone over the oceans could be global".

Dr Lucy Carpenter, University of York and UK co-ordinator of the Observatory added: "This observatory is a terrific facility that will enable us to keep an eye on the chemical balance of the atmosphere and feed this information into global climate models to greatly improve predictions for this region in the future".

Saturday, June 28, 2008

Friday, June 27, 2008

Twenty years ago, taking advantage of a very hot summer in the northeastern U.S., home of the political elite, James Hansen, already then director of the Goddard Institute for Space Studies (GISS), presented an alarmist report to Congress on the dangers of global warming. The report included a graph of the possible evolution of the global temperature (in black), on which the author of the blog "Climate Skeptic" has now superimposed the evolution recorded by satellites (in red). From the satellites, which have global coverage, temperature is measured with an instrument called MSU or AMSU (Advanced Microwave Sounding Units), which measures the air temperature at different levels. These measurements (UAH) go back about 28 years, to December 1979, and over this period the trend calculated for the lowest layer is only 0.13°C per decade, by no means catastrophic, much smaller than the one Hansen forecast, and with the irregularity evident in the red curve besides.

For Hansen it was horrifying, and he said as much in his report, that the global mean temperature might rise to the levels supposedly reached in two warm periods of the past: the Altithermal, 6,000 years ago, and the Eemian, 125,000 years ago. But the warmth of those times had nothing to do with CO2, whose concentration was considerably lower than today's. Moreover, these two periods, which were indeed warm, were also much wetter than the present, which was not bad at all.

P.S. More on the Altithermal and the Eemian:
Characteristics of the Eemian
Humidity and warmth in the first half of the Holocene
and on the Book of Jeremiah

Thursday, June 26, 2008

Warming on 11 year hiatus? How about cooling?

June 23, 2008

A guest post by Basil Copeland

Lucia, at rankexploits.com, has been musing over Tilo Reber’s posting of a graph showing flat 11 year trends in the HadCRUT land-ocean global temperature anomaly and the two MSU satellite data sets, UAH and RSS. In answer to the question whether global warming is on an 11 year hiatus, “not quite,” says Lucia. She challenges Tilo’s omission of the GISS data set, because notwithstanding questions about the reliability of GISS, it still shows a positive trend over the 11 year period in question. Unless all the measures show a flat trend, Lucia’s not ready to conclude that global warming has been on an 11 year hiatus.

I understand the desire to look at as many metrics as possible in trying to divine what is going on with globally averaged temperature. I also understand the reasons for questioning the reliability of GISS. What I don't understand is why the only measure of trend that seems to count is a trend derived from linear regression. William Briggs recently had an interesting post on his blog about the relationship between trends in CO2 and temperature, in which he introduced the use of loess lines to track trends that are not represented well by linear regression. Loess refers to a type of locally weighted regression that in effect fits a piecewise linear or quadratic trend through the data, showing how the trend changes over time. Especially in an environment where the charge of cherry-picking the data - choosing starting and ending points to produce a particular result - is routinely made, loess lines are a relatively robust alternative to simple trend lines from linear regression.
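For readers who want to experiment, here is a minimal sketch of this kind of smoothing in Python, using the lowess smoother from statsmodels; the anomaly series below is a random placeholder standing in for a real monthly data set.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

months = np.arange(132, dtype=float)        # 132 months = the 11 year window
anom = 0.1 * np.random.randn(132)           # placeholder for real anomalies

# lowess performs locally weighted linear regression; frac is the share of
# the data used for each local fit (larger frac = smoother curve).
smoothed = lowess(anom, months, frac=0.5)   # returns an (N, 2) array of (x, fit)
trend = smoothed[:, 1]

# Compare the endpoints of the loess line instead of relying on one OLS slope:
print(f"net change over the period: {trend[-1] - trend[0]:+.3f} C")
```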




Figure 1 fits a loess line through the GISS data for the same 11 year period used by Tilo Reber (except that I've normalized all anomalies in this discussion relative to their 11 year mean, to facilitate comparison to a common baseline). The red line is the GISS anomaly for this period, about its mean, and the blue line is the loess line. While it varies up and down over the period in question, I would argue that the overall trend is essentially flat, or even slightly negative: the value of the line at the end of the period is slightly lower than at the beginning. What this loess line shows is that a linear regression trend is not a particularly good way to represent the actual trend in the data. Without actually fitting a linear trend line, we can reasonably guess that it will trend upwards, because the loess line is lower in the first half of the period and higher in the second half. Linear regression will fit a positive, but misleading, slope through the data, implying that at the end of the period GISS is on an upward trend when in fact the trend peaked around 2006 and has since declined.




Figure 2 is a rainbow of colors comparing all four of the metrics we tend to follow here on WUWT. Not surprisingly, the loess lines of HadCRUT, UAH and RSS all track closely together, while GISS is the odd duck of the lot. So what does this kaleidoscope of colors tell us about whether global warming has gone on an 11 year hiatus? I think it tells us rather more than even Tilo was claiming. All of the loess lines show a net decline in the trend over the 11 year period in question. It is relatively minor in the case of GISS, but rather pronounced in the case of the other three. For those three, the median anomaly at the beginning of the period, as represented by the loess lines, was 0.125; at the end of the period, the median anomaly had dropped to -0.071, for a total decline of 0.196, or almost 0.2°C.

Global warming on hiatus? It looks to me like more evidence of global cooling. Will it continue? Neither linear regression nor loess lines can answer that question. But the loess lines certainly warn us to be cautious in naively extrapolating historical trends derived by simple linear regression.

Not even GISS can support the conclusion from the last 11 years of data that global warming continues to march upward in unrelenting fashion.

Surprise: Explosive volcanic eruption under the Arctic ice found
June 25, 2008

I posted on a similar story about volcanic eruptions under Antarctic ice earlier this year. What is unique about this situation is that it was a large eruption that went completely undetected, under pressures at which eruptions were thought not possible. The big question, then, is: where did the heat from the volcano go, and what effect did it have on the sea ice environment? Research has been going on looking at volcanism in the ridge, but this discovery of a significant eruption in 1999 is new and unexpected.

From Science and The Sea: “In the last few years, for example, scientists have found that a long ridge beneath the north polar ice cap is dotted with volcanoes, and with vents of superheated water that could be home to many new species.”

More info on the Gakkel Ridge here

Today’s Press release from EurekAlert:

International expedition discovers gigantic volcanic eruption in the Arctic Ocean

A "lonely" seismometer drifts with the sea ice.


An international team of researchers was able to provide evidence of explosive volcanism in the depths of the ice-covered Arctic Ocean for the first time. Researchers from an expedition to the Gakkel Ridge, led by the American Woods Hole Oceanographic Institution (WHOI), report in the current issue of the journal Nature that they discovered, with a specially developed camera, extensive layers of volcanic ash on the seafloor, which indicates a gigantic volcanic eruption.

"Explosive volcanic eruptions on land are nothing unusual and pose a great threat for whole areas," explains Dr Vera Schlindwein of the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association. She participated in the expedition as a geophysicist and has been, together with her team, examining the earthquake activity of the Arctic Ocean for many years. "Vesuvius erupted in 79 AD and buried thriving Pompeii under a layer of ash and pumice. Far away in the Arctic Ocean, at 85° N 85° E, a similarly violent volcanic eruption happened almost undetected in 1999 - in this case, however, under a water layer 4,000 m thick." So far, researchers had assumed that explosive volcanism cannot happen at water depths exceeding 3 kilometres because of the high ambient pressure. "These are the first pyroclastic deposits we've ever found in such deep water, at oppressive pressures that inhibit the formation of steam, and many people thought this was not possible," says Robert Reves-Sohn, staff member of the WHOI and lead scientist of the expedition carried out on the Swedish icebreaker Oden in 2007.

A major part of Earth's volcanism happens at the so-called mid-ocean ridges and, therefore, goes completely undetected on the seafloor. There, the tectonic plates drift apart; liquid magma intrudes into the gap and constantly forms new seafloor through countless volcanic eruptions. Accompanied by smaller earthquakes, which go unregistered on land, lava flows onto the seafloor. These unspectacular eruptions usually last for only a few days or weeks.

The installation of a seismometer on an ice floe.

Volcanic ashes on the sea bed of Gakkel Ridge (Photo: WHOI)

Bathymetric chart of the Gakkel Ridge at 85°E. Photographic bottom surveys were conducted along profiles shown as thin, black lines. The photo showing volcanic ashes on the sea bed was taken at the site marked with a red star and the letter a.
Thursday, June 19, 2008

A Window on Water Vapor and Planetary Temperature

Here is some interesting news: according to data from NOAA's Earth System Research Laboratory, atmospheric water vapor is on the decline globally.

You’ve probably heard many times how water vapor is actually the most important “greenhouse gas” for keeping our planet warm, with an effectiveness far greater than that of CO2.

It is generally accepted that the ranking of important greenhouse gases is:
water vapor and clouds, which cause up to 70% of the greenhouse effect on Earth;
carbon dioxide, which causes 9–26%;
methane, which causes 4–9%;
ozone, which causes 3–7%.

Note the range of uncertainties: for water vapor, some say the percentage goes up to 90%, with correspondingly reduced numbers for the other three.

It is absolutely true that water vapor is the gas most responsible for the “greenhouse effect” of our atmosphere. Greenhouse gases let short-wave solar radiation through the atmosphere, but impede the escape of long-wave radiation from the Earth’s surface. This process keeps the planet at a livable temperature: Without a suitably balanced mixture of water vapor, CO2, methane, and other gases in the atmosphere, Earth’s average surface temperature would be somewhere between -9 and -34 degrees Fahrenheit, rather than the balmy average 59 degrees it is today.

This graph from NOAA's Earth System Research Laboratory, showing the specific humidity of the atmosphere up to the 300 millibar pressure level (about 8 miles altitude), is interesting for its trend:


[UPDATE: After reading comments from our always sharp readers, and collaborating with three other meteorologists on the graph, I'm now of the opinion that this graph from ESRL, while labeled as "up to 300mb only", is misleading due to that label. My first impression was that it meant from the surface up to 300mb, i.e. the "up" portion of the label, but on second thought I believed the label was intended to be numerical, meaning "zero to 300mb", i.e. from the top of the atmosphere down rather than from the surface up as we normally think of it. The values looked like anomaly values, but are in the range of absolutes for that elevation also.

Thanks to some work by commenter Ken Gregory, looking at other ways this and similar graphs can be generated from the site, it has become clear that this is a level, not a range from a level. The label "up to 300mb" that ESRL placed was intended to list the availability of all data levels. Thus there is no 200mb data.

This demonstrates the importance of labeling a graph: without any supplementary description, it can be read differently than the authors intend. A better label would be "at 300mb", which would be unambiguous. ESRL should correct this to prevent others from falling into this trap.]

For some background on the atmospheric absorption efficiency across the electromagnetic spectrum, this graph is valuable:
Note the CO2 peak at 15 microns is the only significant one, as the 2.7 and 4.3 micron CO2 peaks have little energy to absorb in that portion of the spectrum. But H2O (water vapor) has many peaks from 0.8 to 8 microns, two of them fairly broad, and H2O absorbs almost continuously from 10 microns on up, making it overwhelmingly the major "greenhouse gas".

Here is another graph looking at it in a different way:

Sunday, June 15, 2008

More Signs Of The Sun Slowing Down



In my post from yesterday, I highlighted a paragraph from a NASA press release which touched on one of the final findings of the soon-to-be-ended Ulysses spacecraft mission to study the sun:

“Ulysses ends its career after revealing that the magnetic field emanating from the sun’s poles is much weaker than previously observed. This could mean the upcoming solar maximum period will be less intense than in recent history. “


A few months ago, I plotted the Average Geomagnetic Planetary Index (Ap), which is a measure of the solar magnetic field strength: a daily index determined by averaging the eight 3-hourly ap index values for the day. Call it a common yardstick (or meterstick) for solar magnetic activity. A minimal sketch of that definition follows.
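For concreteness, here is that definition in Python (a sketch of the convention, not SWPC's actual code):

```python
import numpy as np

def daily_Ap(ap_3hourly):
    """Daily Ap index: the average of the eight 3-hourly ap values
    recorded over one UT day."""
    ap_3hourly = np.asarray(ap_3hourly, dtype=float)
    assert ap_3hourly.shape == (8,), "expected eight 3-hourly ap values"
    return float(ap_3hourly.mean())

# Example: a geomagnetically quiet day
print(daily_Ap([3, 4, 5, 4, 3, 2, 4, 3]))  # 3.5
```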



solar-geomagnetic-Ap Index



I had noted that there was a curious step function in 2005, almost as if something had “switched off”.


Today, since it is Father's Day and I get to do whatever I want, I chose to revisit this graph. Later I plan to take my children to launch model rockets, but for now, here are some interesting new things I've found.


First, I’ve updated the original Ap graph to June 2008 as you can see below.






Source data, NOAA Space Weather Prediction Center:

http://www.swpc.noaa.gov/ftpdir/weekly/RecentIndices.txt


As you can see, the Ap Index has continued along at the low level (slightly above zero) that was established during the drop in October 2005. As of June 2008, we now have 32 months of the Ap hovering around a value just slightly above zero, with occasional blips of noise.



Since it is provided in the same dataset, I decided to also plot the smoothed Ap Index. I had noted to myself back in February that the smoothed Ap Index had dropped to minus 1.0. I figured it was just an artifact of the smoothing algorithm, but today that number remains there, and there doesn’t appear to be any change even though we’ve had a bit of noise in March that put the Ap Index back up to 10 for that month.


I also plotted my own 24 month smoothing window plot, shown in magenta.






I find it curious that the smoothed value provided by SWPC remains at -1. I figure if it were a software error, they would have noted and fixed it by now; if they haven't, then perhaps they are standing by the number. Odd. One possibility may be that they are using a 12 month fixed window, instead of a moving window month to month. If so, then why show the -1.0 data values? Put nulls in the dataset instead.


UPDATE: Astute reader Jorma Kaskiseiväs points out something I missed. The explanation is in the header of the dataset file, a short note: "# Missing data: -1". I was looking in the companion readme file for an explanation. Thanks for pointing this out. Surprising, though, that SWPC does not use a running average. Easy to do, as I've shown.
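For what it's worth, here is a minimal sketch of the 24 month trailing average behind my magenta curves, assuming the monthly values have already been parsed out of RecentIndices.txt into an array, with -1 marking missing months as noted above:

```python
import numpy as np

def trailing_mean(values, window=24, missing=-1):
    """Trailing running mean that treats the `missing` sentinel as a gap
    instead of averaging it in as if it were data."""
    x = np.array(values, dtype=float)       # copy, so the input is untouched
    x[x == missing] = np.nan
    out = np.full(x.shape, np.nan)
    for i in range(window - 1, len(x)):
        out[i] = np.nanmean(x[i - window + 1:i + 1])
    return out
```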



While I was searching for something that could explain this, I came across this plot from NOAA’s NGDC which was used to illustrate solar storm frequency related to sunspots:






But what I found was most interesting was the data file they provided, which had the number of days in a year where the Ap Index exceeded 40. You can view that data file yourself here via FTP link. The accompanying readme file for the data is also available here.


What is most striking is that since 1932, there have not been ANY years prior to 2007 with zero days of Ap exceeding 40. The closest was 1996:


YEAR JAN FEB MAR APR MAY JUN JUL AUG SEP OCT NOV DEC TOTAL
1996   0   0   0   0   0   0   0   0   0   1   0   0     1
2005   3   0   2   1   3   2   2   2   3   0   0   0    18
2006   0   0   1   2   0   0   0   1   0   0   0   1     5
2007   0   0   0   0   0   0   0   0   0   0   0   0     0
2008   0   0   0   0   0   0


Now we have almost two years.
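The counting behind the table is straightforward. A sketch, assuming you have a year's worth of daily Ap values in hand (the NGDC file itself already stores these monthly counts):

```python
import numpy as np

def storm_days(daily_ap, threshold=40):
    """Count the days whose Ap index exceeded the threshold, the criterion
    used for the NGDC storm-day counts above."""
    daily_ap = np.asarray(daily_ap, dtype=float)
    return int(np.sum(daily_ap > threshold))
```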


Here is my plot of the above dataset:







I also decided to plot the 10.7 centimeter band solar radio flux, another metric of solar activity. It is in the same SWPC dataset file as the Ap Index, in columns 8 and 9. Oddly, the smoothed 10.7 cm flux value provided by SWPC has also dropped precipitously and stayed there. I also provide my own 24 month window smoothed value, which is plotted in magenta.






Like the smoothed Ap Index, it has also stayed that way for a few months. NOTE: The data past Dec 2007 on the blue line from SWPC is not valid. The smoothed 24 month window is.


Either way, it appears we continue to slide into a deeper than normal solar minimum, one not seen in decades. Given the signs, I think we are about to embark upon a grand experiment, over which we have no control.


Friday, June 13, 2008

NASA to Probe Sun “in situ”

Until the SOHO satellite was launched, astronomers had to be content to look at the sun through earthbound telescopes. Now that the sun is key to "the biggest threat facing mankind - climate change", it seems only sensible that NASA send a probe for direct measurement.

Now if we can just get Jim Hansen out of his office to look at some of the weather stations he keeps using in the GISS surface temperature database, we’ll really have something.

No word yet on whether Quizno’s will be putting a “Mmm…Toasty!” bumper sticker on the probe in exchange for scientific funding assistance.
From NASA Science News

For more than 400 years, astronomers have studied the sun from afar. Now NASA has decided to go there.
Right: An artist’s concept of Solar Probe Plus.

The name of the mission is Solar Probe+ (pronounced "Solar Probe plus"). It's a heat-resistant spacecraft designed to plunge deep into the sun's atmosphere where it can sample solar wind and magnetism first hand. Launch could happen as early as 2015. By the time the mission ends 7 years later, planners believe Solar Probe+ will solve two great mysteries of astrophysics and make many new discoveries along the way.

The probe is still in its early design phase, called "pre-phase A" at NASA headquarters, says NASA program scientist Lika Guhathakurta. "We have a lot of work to do, but it's very exciting."

Johns Hopkins' Applied Physics Lab (APL) will design and build the spacecraft for NASA. APL already has experience sending probes toward the sun. APL's MESSENGER spacecraft completed its first flyby of the planet Mercury in January 2008 and many of the same heat-resistant technologies will fortify Solar Probe+. (Note: The mission is called Solar Probe plus because it builds on an earlier 2005 APL design called Solar Probe.)

At closest approach, Solar Probe+ will be 7 million km or 9 solar radii from the sun. There, the spacecraft's carbon-composite heat shield must withstand temperatures greater than 1400°C and survive blasts of radiation at levels not experienced by any previous spacecraft. Naturally, the probe is solar powered; it will get its electricity from liquid-cooled solar panels that can retract behind the heat shield when sunlight becomes too intense. From these near distances, the Sun will appear 23 times wider than it does in the skies of Earth.

Above: A simulated view of the Sun illustrating the trajectory of Solar Probe+ during its multiple near-Sun passes.

The two mysteries prompting this mission are the high temperature of the sun's corona and the puzzling acceleration of the solar wind:

Mystery #1—the corona: If you stuck a thermometer in the surface of the sun, it would read about 6000°C. Intuition says the temperature should drop as you back away; instead, it rises. The sun's outer atmosphere, the corona, registers more than a million degrees Celsius, hundreds of times hotter than the star below. This high temperature remains a mystery more than 60 years after it was first measured.

Mystery #2—the solar wind: The sun spews a hot, million mph wind of charged particles throughout the solar system. Planets, comets, asteroids—they all feel it. Curiously, there is no organized wind close to the sun's surface, yet out among the planets there blows a veritable gale. Somewhere in between, some unknown agent gives the solar wind its great velocity. The question is, what?

"To solve these mysteries, Solar Probe+ will actually enter the corona," says Guhathakurta. "That's where the action is."

The payload consists mainly of instruments designed to sense the environment right around the spacecraft—e.g., a magnetometer, a plasma wave sensor, a dust detector, electron and ion analyzers and so on. "In-situ measurements will tell us what we need to know to unravel the physics of coronal heating and solar wind acceleration," she says.

Right: The re-designed Solar Probe+ spacecraft.

Solar Probe+'s lone remote sensing instrument is the Hemispheric Imager. The "HI", for short, is a telescope that will make 3D images of the sun's corona, similar to medical CAT scans. The technique, called coronal tomography, is a fundamentally new approach to solar imaging and is only possible because the photography is performed from a moving platform close to the sun, flying through coronal clouds and streamers and imaging them as it passes by and through them.

With a likely launch in May 2015, Solar Probe+ will begin its prime mission near the end of Solar Cycle 24 and finish near the predicted maximum of Solar Cycle 25 in 2022. This would allow the spacecraft to sample the corona and solar wind at many different phases of the solar cycle. It also guarantees that Solar Probe+ will experience a good number of solar storms near the end of its mission. While perilous, this is according to plan: Researchers suspect that many of the most dangerous particles produced by solar storms are energized in the corona—just where Solar Probe+ will be. Solar Probe+ may be able to observe the process in action and show researchers how to forecast Solar Energetic Particle (SEP) events that threaten the health and safety of astronauts.

Solar Probe+’s repeated plunges into the corona will be accomplished by means of Venus flybys. The spacecraft will swing by Venus seven times in six years to bend the probe’s trajectory deeper and deeper into the sun’s atmosphere. Bonus: Although Venus is not a primary target of the mission, astronomers may learn new things about the planet when the heavily-instrumented probe swings by.

“Solar Probe+ is an extraordinary mission of exploration, discovery and deep understanding,” says Guhathakurta. “We can’t wait to get started.”

Surprise: Leaves Maintain Temperature, new findings may put dendroclimatology as metric of past temperature into question


Dendroclimatology: thermometer or hygrometer?

Hot climate or cold, tree leaves stay in comfort zone

From the Google Climate Discussion Group, see an article also in Science News

Paris, June 11; Agence France-Presse

A new study showing that their internal temperature remains constant at 21.4°C could challenge the way trees are used to determine historical climate data.

The internal temperature of leaves, whether in the tropics or a cold-clime forest, tends toward a nearly constant 21.4 degrees Celsius, reports a study released today.

It had long been assumed that actively photosynthesising leaves - using energy from sunlight to convert carbon dioxide and water into sugar - are nearly as cold or hot as the air around them.

The new findings not only challenge long-held precepts in plant biology, but could upend climate models that use tree rings to infer or predict past and present temperature changes.

For decades, scientists studying the impact of global warming have measured the oxygen isotope ratio in tree-rings to determine the air temperature and relative humidity of historical climates.

Oxygen atoms within water molecules evaporate more or less quickly depending on the number of neutrons they carry, and the ratio between these differently weighted atoms in tree trunk rings has been used as a measure of year-to-year fluctuations in temperatures and rainfall.
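For reference, this isotope signal is conventionally reported in delta notation; a minimal sketch of the standard definition (the VSMOW reference standard is my assumption, since the article doesn't name one):

```python
def delta_18O(r_sample, r_standard=2.0052e-3):
    """Per-mil deviation of a sample's 18O/16O ratio from a reference
    standard (the default is the approximate VSMOW ratio)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example: a sample ratio 0.5% below the standard gives about -5 per mil.
print(delta_18O(0.995 * 2.0052e-3))  # -5.0
```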

“The assumption in all of these studies was that tree leaf temperatures were equal to ambient temperatures,” lead researcher Brent Helliker told AFP. “It turns out that they are not.”

Helliker and University of Pennsylvania colleague Suzanna Richter turned those assumptions upside down in examining 39 tree species, across 50 degrees of latitude ranging from subtropical Colombia to boreal Canada.

They compared current observed records of humidity and temperature against the isotope ratios in the trees, and found that tree leaves were internally cooler than surrounding air temperatures in warm climes, and warmer in cool climes.

Even more startling was that in all cases the average temperature - over the course of a growing season - was about 21°C.

“It is not surprising to think that a polar bear in northern Canada and a black bear in Florida have the same internal body temperature,” because both animals have internal thermostats to prevent overheating or freezing to death, he said.

“But to think that a Canadian black spruce and a Caribbean Pine have the same average leaf temperature is quite astonishing,” he added.

Tree leaves keep cool through constant evaporation and reducing sun exposure through leaf angles or reflective qualities. Warmth is gained by decreasing evaporation and increasing the number of leaves per branch.

All these tricks should be seen as evolutionary adaptations that help the trees attain a maximum of nutrients through optimal photosynthesis, Helliker said.

The fact that part of this adaptation occurs at the level of entire forest canopies, and not just within individual leaves, is one reason direct measurements of tree temperatures have been so hard.

The new findings, published in the British journal Nature, are bolstered by a recent study of a mixed species forest in Switzerland based on infrared thermal imaging.

Measured across an entire growing season, the forest canopy temperatures were found to be 4°C to 5°C higher than the cool, ambient air in the Swiss Alps.

Wednesday, June 11, 2008

The last ten years

This is the graph of the evolution of the global temperature over the last ten years and a few months (January 1998 - May 2008), published by NASA's GISS institute under the direction of James Hansen (one of the great chiefs of the galaxy).

To find the figure you have to dig around a bit. You enter this page, click on the graphs window, and scroll down until you come across it.

It is a double graph. One curve refers to air temperatures over the continents (black) and the other to the global temperature (red), land and sea, where the sea values are the temperatures of the surface waters. The graphs show the differences (or anomalies) of the monthly temperatures with respect to the monthly means of the 1951-1980 reference period (during which the global mean temperature was roughly 14°C). The anomaly values vary between approximately 0°C and 1°C.
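To make the anomaly convention concrete, a trivial sketch using the roughly 14°C baseline mean mentioned above:

```python
# GISS plots anomalies relative to the 1951-1980 monthly means; with the
# roughly 14 C global mean of that baseline, an anomaly converts to an
# approximate absolute temperature by simple addition.
baseline_mean_c = 14.0   # approximate 1951-1980 global mean temperature
anomaly_c = 0.5          # example value read off the chart
print(f"absolute: {baseline_mean_c + anomaly_c:.1f} C")  # 14.5 C
```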

The newspapers, which are supposed to be interested in current events, never publish it. They can't find it.

posted by Antón Uriarte @ 1:12 AM

Monday, June 9, 2008

Surprise: Earth's Biosphere is Booming, Satellite Data Suggests CO2 the Cause


Eco Worriers: “CO2 is a pollutant!” Gaia: “Tell that to the biosphere.” Biosphere: “Yumm, burp!”

The SeaWiFS instrument aboard the Seastar satellite has been collecting ocean data since 1997. By monitoring the color of reflected light via satellite, scientists can determine how successfully plant life is photosynthesizing. A measurement of photosynthesis is essentially a measurement of successful growth, and growth means successful use of ambient carbon. This animation shows an average of 10 years worth of SeaWiFS data. Dark blue represents warmer areas where there tends to be a lack of nutrients, and greens and reds represent cooler nutrient-rich areas which support life. The nutrient-rich areas include coastal regions where cold water rises from the sea floor bringing nutrients along and areas at the mouths of rivers where the rivers have brought nutrients into the ocean from the land.
In praise of CO2
With less heat and less carbon dioxide, the planet could become less hospitable and less green
Lawrence Solomon
Financial Post, Don Mills, Ontario
Saturday, June 07, 2008

Planet Earth is on a roll! GPP is way up. NPP is way up. To the surprise of those who have been bearish on the planet, the data shows global production has been steadily climbing to record levels, ones not seen since these measurements began.

GPP is Gross Primary Production, a measure of the daily output of the global biosphere – the amount of new plant matter on land. NPP is Net Primary Production, an annual tally of the globe's production. Biomass is booming. The planet is the greenest it's been in decades, perhaps in centuries.

Until the 1980s, ecologists had no way to systematically track growth in plant matter in every corner of the Earth — the best they could do was analyze small plots of one-tenth of a hectare or less. The notion of continuously tracking global production to discover the true state of the globe’s biota was not even considered.

Then, in the 1980s, ecologists realized that satellites could track production, and enlisted NASA to collect the data. For the first time, ecologists did not need to rely on rough estimates or anecdotal evidence of the health of the ecology: They could objectively measure the land’s output and soon did — on a daily basis and down to the last kilometer.

The results surprised Steven Running of the University of Montana and Ramakrishna Nemani of NASA, scientists involved in analyzing the NASA satellite data. They found that over a period of almost two decades, the Earth as a whole became more bountiful by a whopping 6.2%. About 25% of the Earth’s vegetated landmass — almost 110 million square kilometres — enjoyed significant increases and only 7% showed significant declines. When the satellite data zooms in, it finds that each square metre of land, on average, now produces almost 500 grams of greenery per year.

Why the increase? Their 2004 study, and other more recent ones, point to the warming of the planet and the presence of CO2, a gas indispensable to plant life. CO2 is nature’s fertilizer, bathing the biota with its life-giving nutrients. Plants take the carbon from CO2 to bulk themselves up — carbon is the building block of life — and release the oxygen, which along with the plants, then sustain animal life. As summarized in a report last month, released along with a petition signed by 32,000 U. S. scientists who vouched for the benefits of CO2: “Higher CO2 enables plants to grow faster and larger and to live in drier climates. Plants provide food for animals, which are thereby also enhanced. The extent and diversity of plant and animal life have both increased substantially during the past half-century.”

From the 2004 abstract: Our results indicate that global changes in climate have eased several critical climatic constraints to plant growth, such that net primary production increased 6% (3.4 petagrams of carbon over 18 years) globally. The largest increase was in tropical ecosystems. Amazon rain forests accounted for 42% of the global increase in net primary production, owing mainly to decreased cloud cover and the resulting increase in solar radiation.

Lush as the planet may now be, it is as nothing compared to earlier times, when levels of CO2 and Earth temperatures were far higher. In the age of the dinosaur, for example, CO2 levels may have been five to 10 times higher than today, spurring a luxuriantly fertile planet whose plant life sated the immense animals of that era. Planet Earth is also much cooler today than during the hothouse era of the dinosaur, and cooler than it was 1,000 years ago during the Medieval Warming Period, when the Vikings colonized a verdant Greenland. Greenland lost its colonies and its farmland during the Little Ice Age that followed, and only recently started to become green again.

This blossoming Earth could now be in jeopardy, for reasons both natural and man-made. According to a growing number of scientists, the period of global warming that we have experienced over the past few centuries as Earth climbed out of the Little Ice Age is about to end. The oceans, which have been releasing their vast store of carbon dioxide as the planet has warmed — CO2 is released from oceans as they warm and dissolves in them when they cool — will start to take the carbon dioxide back. With less heat and less carbon dioxide, the planet could become less hospitable and less green, especially in areas such as Canada’s Boreal forests, which have been major beneficiaries of the increase in GPP and NPP.

Doubling the jeopardy for Earth is man. Unlike the many scientists who welcome CO2 for its benefits, many other scientists and most governments believe carbon dioxide to be a dangerous pollutant that must be removed from the atmosphere at all costs. Governments around the world are now enacting massive programs in an effort to remove as much as 80% of the carbon dioxide emissions from the atmosphere.

If these governments are right, they will have done us all a service. If they are wrong, the service could be all ill, with food production dropping world wide, and the countless ecological niches on which living creatures depend stressed. The second order effects could be dire, too. To bolster food production, humans will likely turn to energy intensive manufactured fertilizers, depleting our store of non-renewable resources. Techniques to remove carbon from the atmosphere also sound alarms. Carbon sequestration, a darling of many who would mitigate climate change, could become a top inducer of earthquakes, according to Christian Klose, a geohazards researcher at Columbia University’s Lamont-Doherty Earth Observatory. Because the carbon sequestration schemes tend to be located near cities, he notes, carbon-sequestration-caused earthquakes could exact an unusually high toll.

Amazingly, although the risks of action are arguably at least as real as the risks of inaction, Canada and other countries are rushing into Earth-altering carbon schemes with nary a doubt. Environmentalists, who ordinarily would demand a full-fledged environmental assessment before a highway or a power plant can be built, are silent on the need to question proponents or examine alternatives.

Earth is on a roll. Governments are too. We will know soon enough if we’re rolled off a cliff.

Radiosondes vs models

Finally, I want to show the status of the "direct" predictions and measurements of the temperature and to mention two graphs from Climate Audit.

This chart, taken from RealClimate.ORG, shows the overall warming expected from the doubling of CO2 concentrations from 280 ppm before the industrial revolution to 560 ppm expected around the year 2100 (assuming business-as-usual, i.e. a pretty constant rate of CO2 emissions in the future), as predicted by the GISS model E, dominated by the greenhouse effect.

Look in the middle of the picture, above the equator. You see the dark red "hot spot" over there. At the height (y-axis) corresponding to the pressure of 200 hPa, you are in the middle of the dark red cloud where the total warming should be not only higher than 3 °C (between 3 °C and 14.6 °C) but much higher than that, probably around 5 °C or so. This figure (5 °C) is roughly 1.5 times the surface warming (around 3 °C according to the IPCC's central figure and around 1.8 °C according to the picture above) - a classical feature of the greenhouse models.

Now, in 50 years, we add about 100 ppm of CO2 and we should therefore induce more than 1/3 of the effect of the CO2 doubling. So in 50 years, the place above the equator where the pressure is 200 hPa should heat up by more than 5 °C / 3 = more than 1.5 °C. (The CO2 emissions in this 50-year period were actually closer to the "earlier" emissions that should have a higher warming impact, because of the logarithmic slowdown: so my figure is probably an underestimate.) Does this significant warming actually occur in reality?
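A quick check of this arithmetic, assuming the standard logarithmic dependence of the greenhouse forcing on concentration:

```python
import math

# Fraction of a full CO2 doubling realized by going from 280 ppm to 380 ppm,
# under the logarithmic forcing law F ~ ln(C / C0):
c0, c1 = 280.0, 380.0
fraction = math.log(c1 / c0) / math.log(2.0)
print(f"fraction of a doubling: {fraction:.2f}")             # ~0.44 > 1/3

# Implied warming of the 200 hPa tropical "hot spot", if a full doubling
# produces about 5 C there as read off the model chart above:
print(f"expected hot-spot warming: {fraction * 5.0:.1f} C")  # ~2.2 C
```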
This is the actual graph of this tropospheric temperature record as measured by the Hadley Center's radiosondes (balloons). The net warming during the last 50 years is at most of order 0.2 °C and probably much smaller than that: Steve McIntyre calculated that since the beginning of the "satellite era" in 1979, the balloon trend has actually been negative (cooling). At any rate, it is very close to zero - and it is certainly much smaller than the 1.5 °C of warming predicted by the greenhouse models, as explained in the previous paragraph.

And yes, March 2008, the most recent month they have released, was the HadAT2 200 hPa tropical radiosondes' coolest month at least since the late 1950s (since January 1958) - one that was only matched by 1 month in the early 1970s, namely January 1972 (both Jan 1972 and Mar 2008 had -1.4 °C in the column, check the link in this paragraph).

This place above the equator is the most natural place where the temperature should be measured if your aim is to verify the greenhouse theory of the climate - simply because the signal is predicted to be maximized in this region. And the observations are smaller than the predictions by an order of magnitude or more.

Now, one order of magnitude is not a detail. If you accept that it is fair to compare economics and climatology because their "typical" predictions are comparably inaccurate (and I think that it is fair), the order-of-magnitude discrepancy between the theory and the observations is similar to a prediction by a group of economists who use their "consensus models" to forecast that the GDP will grow (or drop) by 40% a year - because of some effect - but the reality is only 4%. It's a pretty bad prediction, even in the fuzzy context of economics, isn't it?

Now, I want to emphasize that we must be a priori very open-minded because there can exist problems both with the observations as well as with the models. On the other hand, when you look at the weather balloons and the radiosondes they carry (see the picture on the left side), it is not too easy to imagine that there is some serious problem with them. These radiosondes measure the temperature (and the wind speed/direction) by thermometers and transmit the resulting numbers to the terrestrial radio receivers; see the Wikipedia text about radiosondes.

Try to think hard and invent an explanation why such a simple system would be sending warming trends that are 10 times smaller than the "real" ones (predicted by the models). I don't know of such an explanation. But once you find one, you should be ready to solve 1 or 2 similar puzzles - namely why the completely different satellite methodologies also lead to the same negligible warming trend if the "real" trend prescribed by the IPCC should be approximately 10 times faster.

I think it is sensible to expect an explanation what's wrong with the balloon and satellite numbers before someone's presentation of certain numbers from some computer games has a chance to be considered as "reality" by sane people.

Good luck. Before you find your ingenious method to solve these key puzzles, I will continue to think that the IPCC predictions have pretty much been falsified because the order-of-magnitude discrepancy we observe is pretty much the most serious discrepancy that we could a priori expect and if that were not enough for falsification, nothing would be enough. The qualitative agreement in one quantity of minor importance (related to winds) is not enough to confirm your models if more important predictions (temperature) fail.

Now, the balloon data may be very non-uniform and the "local noise" in them can be high. But it is fair to say that if the actual (accurate) thermometers can't demonstrate any significant warming trend over there, the "life on Earth" probably won't die because of such a warming either. A deadly fever is usually strong enough to be visible by thermometers, especially by the most accurate ones created for this purpose. ;-)

So think hard but try to imagine that your assumptions could be incorrect, after all.

Sherwood, Allen, and radiosondes

The media recently wrote far-reaching comments about the latest Nature Geoscience article by Steven Sherwood and Robert Allen (Yale University):
Warming maximum in the tropical upper troposphere deduced from thermal winds.
The authors - or at least the media - have claimed that a new method to "measure" the tropical tropospheric temperatures has removed all contradictions between the theoretical and empirical warming rates in the troposphere.

Recall that the greenhouse-dominated models predict rapid warming in the troposphere, roughly 10 km above the equator. The satellite measurements (UAH MSU, RSS MSU) show an actual warming rate that is at least 10 times slower than the theoretical predictions. The data from balloons and radiosondes they carry, for example the Hadley Center data, confirm the satellite figures. Detailed numbers will be discussed below.

That seems to be a problem. Every acceptable solution to this problem must either find serious errors in both the satellite and balloon data or a serious error in the theoretical models (or both).

Steven Sherwood and his pre-PhD student, Robert Allen, use a different strategy. They pretend that the discrepancy doesn't exist at all. How do they do it? Well, they want you to believe that the measurements of the temperatures don't exist. Instead, they propose their own, idiosyncratic, elaborated "measurement" of the tropospheric temperatures. Well, there is one additional immediate problem: it's not really their own method, as we will see. ;-)


Sherwood & Allen vs Pielke Sr

They look at some patterns in the thermal (?) westerly winds, manipulate them to obtain a rather continuous function, and claim that this function of the winds data is ... a measurement of temperature that is apparently better than the thermometers. Their method is not really original: it is a small subset of the methods discussed by Roger Pielke Sr and two co-authors in 2006 and especially by Pielke Sr and four co-authors in 2001. See also Pielke's comments about his priority.

So the idea is that instead of proving global warming, you prove global blowing :-) and then you argue that blowing and warming sound similar, especially according to your model that links the two. This strategy has the advantage that when the climate begins to cool down, you can also say that global blowing is the same thing as global cooling and the cataclysmic warming can continuously "rotate" into a new kind of catastrophic cooling. :-)

The problems with the particular conclusions by Sherwood and Allen have been discussed by Roger Pielke Sr, too. He is preparing a technical manuscript on that issue. The main drawback of their approach is circular reasoning. They want to demonstrate that the models are consistent with reality but what they actually call "reality" is extracted from the models, too.

More precisely, the relationship between the winds and the temperature is derived from the very same models that are shown to disagree with the actual temperature measurements by the balloons and satellites. So the arguments they show only support the compatibility of one particular theoretical prediction with the observations - namely the quantity describing winds as predicted by the very same models.

But a correct model should agree not only with one but with all observed quantities - especially with the temperature if this quantity is the main focus of your models. ;-)

The models also link this wind-related quantity to the temperature, but the real measurements actually falsify the predicted temperature trend, and the Yale authors don't change anything about that. To a large extent, they only demonstrate a "self-consistency" - which really means the uniqueness of one prediction by the models that characterizes the winds. It is not shocking that the predictions of such a model are self-consistent; it is a much more non-trivial constraint that they should also be consistent with the real data.

Sunday, June 8, 2008

The Pacific Oscillation

For many months now, the waters of the North Pacific near the southern coast of Alaska and the west coast of Canada and the United States have shown a negative thermal anomaly. They are colder than normal. The map below shows the anomalously cold waters in blue tones and the anomalously warm waters in yellow and orange tones. The map corresponds to this past June 5, 2008, and I clipped it from this link.
When that thermal configuration of the waters exists in this region of the Pacific, the global mean temperature is believed to fall or, at least, to stop rising.

Until not long ago, what prevailed there, with its ups and downs, for some 30 years was an almost opposite thermal configuration, whose most typical expression is the one shown in the map below (here the values of the map and of the scale are the sea-height anomaly rather than the temperature; warm waters expand and raise the sea level, and the opposite happens with cold waters):

This oceanic oscillation, whose period is several decades, is called the PDO (Pacific Decadal Oscillation). It now seems that we have entered a phase that tends to be negative. According to some, or quite a few, this would be one of the reasons why the global climate has not warmed so far in the 21st century and possibly will not do so in the coming decades either. In that case, schoolchildren will be told that it was thanks to the Kyoto Protocol.

Below I show the evolution of the PDO index.
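
For readers who want to plot the index themselves, here is a minimal sketch. The URL and the file format (one year per line followed by twelve monthly values) are my assumptions based on the publicly distributed JISAO monthly PDO index, not something specified in this post:

```python
import urllib.request
import matplotlib.pyplot as plt

# Assumed source of the monthly PDO index (JISAO); adjust if the
# location or the layout of the file differs.
URL = "http://research.jisao.washington.edu/pdo/PDO.latest"

text = urllib.request.urlopen(URL).read().decode("ascii", "ignore")

years, values = [], []
for line in text.splitlines():
    parts = line.split()
    # Keep only data lines: a 4-digit year followed by monthly values.
    if parts and parts[0][:4].isdigit():
        year = int(parts[0][:4])
        for month, field in enumerate(parts[1:13]):
            try:
                values.append(float(field))
                years.append(year + month / 12.0)
            except ValueError:
                pass  # skip missing or malformed entries

plt.plot(years, values, lw=0.5)
plt.axhline(0.0, color="black")
plt.xlabel("Year")
plt.ylabel("PDO index")
plt.title("Monthly PDO index (positive = warm phase, negative = cool phase)")
plt.show()
```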

sábado, 7 de junio de 2008

NOAA reports on our cooler-than-normal spring

NOAA: U.S. Has 36th Coolest Spring on Record
June 6, 2008

(Map credit: NOAA)
The March-May spring season was the 36th coolest on record for the contiguous United States, according to an analysis by NOAA’s National Climatic Data Center in Asheville, N.C. Separately, last month ended as the 34th coolest May for the contiguous United States, based on records dating back to 1895.
The average spring temperature of 51.4 degrees F was 0.5 degree F below the 20th century average. The average May temperature of 60.3 degrees F was 0.7 degree F below the 20th century mean, based on preliminary data.
U.S. Temperature Highlights
The March-May temperatures were cooler than average from the Northwest through the central Plains and upper Mississippi Valley. In all, 19 states had a cooler-than-average spring.
Twenty-five states were cooler than average for May. Pennsylvania was much cooler than average and ranked eighth coolest.
The unusually cool temperatures kept the nation’s overall temperature-related residential energy demand for May above average. Based on NOAA’s Residential Energy Demand Temperature Index, contiguous U.S. temperature-related energy demand was approximately 3.5 percent above average in May, but near average for the spring season.
Florida, Texas, and Washington were warmer than average for May.

RSS: Global Temperature Also Cooler in May


A few days ago I highlighted the drop in global temperatures as measured by satellite from UAH, the University of Alabama in Huntsville. They published their satellite-derived Advanced Microwave Sounding Unit data set for the lower troposphere for May 2008, and it showed that it is significantly colder globally, colder even than the significant drop to -0.046 °C seen in January 2008.

The global ∆T for UAH from April to May 2008 was -0.195 °C.

The RSS (Remote Sensing Systems of Santa Rosa, CA) Microwave Sounding Unit (MSU) lower-troposphere global temperature anomaly data for May 2008 have been published, and the anomaly has moved below the zero line, with a value of -0.083 °C, a change (∆T) of -0.163 °C globally from April 2008.

RSS global lower-troposphere anomaly (°C)
2008  1   -0.070
2008  2   -0.002
2008  3   +0.079
2008  4   +0.080
2008  5   -0.083

I had predicted when I posted the UAH data that the RSS value for the global temperature anomaly of the lower troposphere would end up around +0.05 to -0.15 °C. Coming in at -0.083 °C, I was on target.

This value is greater in magnitude than the drop to -0.07 °C seen in January 2008.
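
As a sanity check, the month-on-month changes quoted above follow directly from the table (a trivial sketch; the values are simply copied from the RSS column):

```python
# RSS lower-troposphere global anomalies for 2008 (°C), from the table above.
rss_2008 = {1: -0.070, 2: -0.002, 3: 0.079, 4: 0.080, 5: -0.083}

# Month-on-month change ∆T between consecutive months.
for month in range(2, 6):
    delta = rss_2008[month] - rss_2008[month - 1]
    print(f"2008-{month:02d}: ∆T = {delta:+.3f} °C")

# April to May: -0.083 - 0.080 = -0.163 °C, as quoted in the text.
```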

miércoles, 4 de junio de 2008

UAH MSU: May 2008 cooler than April by 0.19 °C

UAH MSU have released their new satellite data for May 2008. The global anomaly was -0.17 °C, the coldest reading since January 2000 and the third coldest monthly figure since September 1993.
Yes, I mean that anomaly-wise, May was even colder than all the other cool months of 2008, despite the dramatically weakening La Niña that now seems likely to change to ENSO-neutral conditions this month. For example, the month-on-month cooling from April 2008 was 0.19 °C, while May 2008 was more than 0.75 °C cooler than January 2007. See the trend since January 2007 above.

The average anomaly for the first five months of 2008 is negative. 1994 was the last year whose average annual anomaly was negative.

Regional details

The month-on-month cooling included both hemispheres - by 0.21 °C in the NH and by 0.17 °C in the SH. Also, it covered the ocean as well as the land (in both hemispheres) pretty symmetrically.

The middle troposphere reveals a similar recent cooling trend. In some cases, it is not just recent. For example, the warming trend for the Southern Hemisphere during the last 30 years is 0.00 °C per decade, while the warming trend for the troposphere above the world's oceans is 0.03 °C per decade, about 15 times slower than the IPCC-predicted trend. Note that the middle troposphere is where the greenhouse theory predicts the most rapid warming.
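
For those who want to verify such decadal trends themselves, nothing more than a least-squares line through the monthly anomalies is involved. Here is a minimal sketch with placeholder data (substitute the real UAH mid-troposphere series for the region of interest):

```python
import numpy as np

# Placeholder: 30 years of monthly anomalies in °C with a weak trend plus
# noise. Replace this array with the real satellite anomaly series.
rng = np.random.default_rng(0)
months = np.arange(360)                              # 30 years × 12 months
anomalies = 0.003 * (months / 12.0) + rng.normal(0.0, 0.1, months.size)

# Least-squares linear fit: slope in °C per month, converted to °C/decade.
slope_per_month = np.polyfit(months, anomalies, 1)[0]
trend_per_decade = slope_per_month * 12 * 10
print(f"trend: {trend_per_decade:+.3f} °C/decade")
```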

The Sun

The Sun has been spotless for at least 9 days. The standardized May sunspot number was 2.9, equal to April's, and the solar flux was even slightly lower than in April, namely 68.4.

Very cold satellite temperatures in May

The satellite has carried out its thermal measurements: temperatures were decidedly low in May 2008, with a mean thermal anomaly of -0.29 °C, making it the coldest month at least since January 2000.
Earth's temperatures keep moving backwards like crabs, substantially retreating from the great warmth measured during the 2000s.

Last May, too, showed a considerable cooling of our atmosphere according to the satellite sensors: -0.29 °C for the middle troposphere and -0.17 °C according to the lower-troposphere sensors.

As for the middle troposphere, a thermal anomaly of -0.29 °C had not been recorded since January 2000, when temperatures came in 0.31 °C below the norm.

The last six months, starting from last December, have shown a negative thermal anomaly, while the first 5 months of 2008 were on average 0.17 °C colder than the norm; to find such a cold start of a year, one has to go back all the way to 1993, when Earth's temperatures were affected by the veil of volcanic dust from Mount Pinatubo.

For the lower troposphere, too, the thermal anomaly of -0.17 °C had not been reached since January of that very year.

In this case the thermal anomaly of the first five months of 2008 was more modest, equal to -0.02 °C.

But for this type of satellite-measured temperatures as well, these were the coldest first five months of a year since 1997.

It was therefore a genuine "clean sweep" of the warmth measured during the 2000s, and, suddenly, the climate has gone back ten years!

A final word about the satellite measurements.

The lower-troposphere sensors measure air temperatures at altitudes below 3,000 meters.

This means they report the thermal variations of a layer of air close to the one in which we live; however, large areas of the Earth's surface that exceed this altitude are thereby excluded, such as Tibet, the Himalayas, the Alps, the Andes and, above all, the entire Antarctic continent.

The middle-troposphere sensors, on the other hand, have the advantage of excluding no region of the Earth, thus giving a more balanced view of the climate changes under way.

In May 2008, too, human activity made its thermal influence felt: the Northern Hemisphere showed a lower-troposphere thermal anomaly of -0.04 °C, against the -0.31 °C of the non-industrialized Southern Hemisphere.

Likewise, the land areas of the Northern Hemisphere, at +0.11 °C, were markedly warmer than the oceanic ones, which showed a thermal anomaly of -0.18 °C.

Attached below is the evolution of the lower-troposphere temperatures over the period 1979-2007.

One notices the strong warming of the North Pole and, in general, of the Northern Hemisphere, and the cooling of the Antarctic Circle.

One also notices, however, that the lower-troposphere sensors exclude all of Antarctica, Tibet and the Andes from the count.