By: JeffMasters, 4:40 PM GMT on March 31, 2009
Twenty years ago this month, on March 13, 1989, I was aboard NOAA's P-3 weather research aircraft, bumping through a turbulent portion of a fierce winter storm in a remote ocean area between Greenland and Norway. We were searching for clues on how to make better weather forecasts for the regions of Norway and the northern British Isles battered by these great storms. Our 2-month project, based in Bodø, Norway, was called the Coordinated Eastern Arctic Research Experiment (CEAREX). Today's flight took us through the heart of an extratropical storm developing at the edge of the sea ice that covered the ocean waters east of Greenland.
As I looked over at the white-capped, forbidding waters of the Greenland Sea, I reflected that today's flight was not particularly dangerous by Hurricane Hunter standards, though the storm's tropical storm-force winds made the ride a bit rough at times. However, we were a long way from civilization. Should an emergency require us to ditch the aircraft in the ocean or on the remote island of Jan Mayen, we'd be tough to find unless we were able to radio back our position before going down. Far from any land areas, our communication life-line to the outside world was HF radio (ham radio), which relied on bouncing signals off Earth's ionosphere. Three hours into the flight, this life-line abruptly stopped working.
Figure 1. Sea ice swirls in ocean eddies off the coast of Labrador, Canada, in this photo I took during a 1989 CEAREX flight.
"Jeff, can you come up to the cockpit?" Aircraft Commander Dan Eilers' voice crackled over the intercom. I took a break from monitoring our weather instruments, took off my headset, and stepped forward into the cockpit of the P-3.
"What's up, Dan?" I asked.
"Well, HF radio reception crapped out about twenty minutes ago, and I want to climb to 25,000 feet and see if we can raise Reykjavik Air Traffic Control to report our position. We're flying at low altitude in hazardous conditions over 500 miles from the nearest airport, and it's not good that we're out of communication with the outside world. If we were to go down, search and rescue would have no idea where to look for us."
I agreed to work out an alteration to the flight plan with our scientists, so that we could continue to collect good data on the storm while we climbed higher. The scientists weren't too happy with the plan, since they were paying $20,000 for this flight, and wanted to stay low at 1,500 feet to better investigate the storm's structure. Regardless, we climbed as high as we could and orbited the storm, issuing repeated calls to the outside world over our HF radio. No one answered.
"I've never seen such a major interruption to HF radio!" Commander Eilers said, worriedly. "We can go back down to 1,500 feet and resume the mission, but I want to periodically climb to 25,000 feet and continue trying to establish communications. If we can't raise Air Traffic Control, we should consider aborting the mission."
I agreed to work with the scientists to accommodate this strategy. They argued hotly against a possible cancellation of this mission, which was collecting some unique data on a significant winter storm. So, for the next four hours, we periodically climbed to 25,000 feet, issuing futile calls over our HF radio. Finally, after an uncomfortable eight hours, it was time to go home to our base in Norway. As twilight sank into Arctic darkness, a spectacular auroral display--shimmering curtains of brilliant green light--lit up the sky. It began to dawn on us that the loss of our HF radio reception was probably due to an unusual kind of severe weather--a "Space Weather" storm. An extremely intense geomagnetic storm was hitting the polar regions, triggering our brilliant auroral show and interrupting HF radio communications.
The geomagnetic "Superstorm" of March 13, 1989
As it turned out, the geomagnetic storm of March 13, 1989 was one of the most intense such "Space Weather" events in recorded history. The storm developed as a result of a Coronal Mass Ejection (CME) from the sun four days previously. The CME event blasted a portion of the Sun's plasma atmosphere into space. When the protons and electrons from the Sun arrived at the Earth, the planet's magnetic field guided the highly energetic particles into the upper atmosphere near the magnetic poles. As a result, the lower levels of the polar ionosphere became strongly ionized, severely absorbing HF radio signals--hence my uncomfortable flight over the Greenland Sea with no communications. The geomagnetic storm didn't stop there--the storm's charged particles triggered a strong magnetic impulse that caused a voltage depression in five transmission lines in the Hydro-Quebec power system in Canada. Within 90 seconds, automatic voltage compensation equipment failed, resulting in a generation loss of 9,450 MW. With a load of about 21,350 MW, the system was unable to withstand the generation loss and collapsed. The entire province of Quebec--six million people--was blacked out for approximately nine hours. The geomagnetic storm also triggered the failure of a large step-up transformer at the Salem Nuclear Power Plant in New Jersey, as well as 200 other failures on the North American power system. Auroras were observed as far south as Florida, Texas, and Cuba during this geomagnetic "superstorm".
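The scale of that collapse can be checked with a line of arithmetic (my own illustration, using only the megawatt figures quoted above):

```python
# Illustrative arithmetic, using the figures quoted above: how large the
# Hydro-Quebec generation loss was relative to the total system load.
generation_loss_mw = 9_450   # generation lost within 90 seconds
system_load_mw = 21_350      # approximate system load at the time

fraction_lost = generation_loss_mw / system_load_mw
print(f"Fraction of system load lost: {fraction_lost:.1%}")  # about 44%
```

With over 40% of its generation gone in 90 seconds, the system had no realistic chance to rebalance, which is why the entire grid collapsed.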
Figure 2. Red and green colors predominate in this view of the Aurora Australis (Southern Hemisphere aurora) photographed from the Space Shuttle in May 1991 at the peak of the geomagnetic maximum that also brought us the March 13, 1989 geomagnetic "superstorm". The payload bay and tail of the Shuttle can be seen on the left hand side of the picture. Auroras are caused when high-energy electrons pour down from the Earth's magnetosphere and collide with atoms. Red aurora occurs from 200 km to as high as 500 km altitude and is caused by the emission of 6300 Angstrom wavelength light from oxygen atoms. Green aurora occurs from about 100 km to 250 km altitude and is caused by the emission of 5577 Angstrom wavelength light from oxygen atoms. The light is emitted when the atoms return to their original unexcited state. Image credit: NASA.
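As a footnote to the caption's physics, the two oxygen emission lines correspond to different photon energies, which is easy to verify (a quick sketch of my own, not from the post):

```python
# Photon energy E = hc / wavelength, with hc expressed in eV*nm,
# for the two auroral oxygen lines mentioned in the caption.
H_C_EV_NM = 1239.84

def photon_energy_ev(wavelength_angstrom):
    """Photon energy (eV) for a wavelength given in Angstroms."""
    return H_C_EV_NM / (wavelength_angstrom / 10.0)

print(f"Red  6300 A line: {photon_energy_ev(6300):.2f} eV")   # ~1.97 eV
print(f"Green 5577 A line: {photon_energy_ev(5577):.2f} eV")  # ~2.22 eV
```

The shorter-wavelength green line carries the more energetic photons of the two.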
Solar Maximum is approaching
The sun waxes and wanes in activity in a well-documented 11-year cycle of sunspots and their associated Coronal Mass Ejections. We just passed through solar minimum--the sun is quiet, with no sunspots. We are headed towards a solar maximum, forecast to occur in 2012. Geomagnetic storms are at their peak during solar maximum, and we'll have to be on the lookout for severe "Space Weather" starting in 2010. I'll talk more about severe "Space Weather" storms in my next post, when I'll discuss the greatest Space Weather storm in recorded history--the famed "Carrington Event" of 1859--and what damage it might wreak were it to happen today. An extraordinary report funded by NASA and issued by the U.S. National Academy of Sciences (NAS) in 2008 says that a repeat of the Carrington Event could result in the most costly natural disaster of all time.
MetaTech Corporation's animation of the March 13, 1989 geomagnetic "superstorm".
NOAA's Space Weather Prediction Center (SWPC)
By: JeffMasters, 2:56 PM GMT on March 26, 2009
The QuikSCAT satellite, launched in 1999, provides crucial measurements of surface wind speed and direction over Earth's oceans twice per day. Forecasters world-wide have come to rely on data from QuikSCAT to issue timely warnings and make accurate forecasts of tropical and extratropical storms, wave heights, sea ice, aviation weather, iceberg movement, coral bleaching events, and El Niño. But QuikSCAT is ailing. Originally expected to last just 2-3 years, QuikSCAT is now entering its tenth year, and is definitely showing its age. The spacecraft's primary transmitter, power control unit, and battery have all failed over the years. The loss of the spares for any of these components will mean the end of QuikSCAT--a satellite that likely provides hundreds of millions of dollars of benefit each year to the public. As just one example of QuikSCAT's value, a recent study (H. Kite-Powell, 2008) found that wind data from QuikSCAT and the resulting improvements to warning and forecast services save the container and bulk shipping industry $135 million annually by reducing their exposure to hurricane-force wind conditions in non-tropical storms by 44% over the North Pacific and North Atlantic. Loss of QuikSCAT would result in an 80 - 90% loss in detection capability for hurricane-force conditions in extratropical cyclones.
Figure 1. NASA's QuikSCAT satellite, launched in 1999. Image credit: NASA.
Alternatives to QuikSCAT
Two valuable alternatives to QuikSCAT are available, but neither can come close to making up for the loss of QuikSCAT. The Windsat instrument aboard the Coriolis satellite (launched in 2003) measures wind speed and wind direction using a different technique. Evaluation of these data at NHC and NOAA's Ocean Prediction Center (OPC) has shown the winds to be unreliable in and around the storm environment. There's also the European ASCAT satellite, launched in 2007. Like QuikSCAT, ASCAT can measure global wind speed and direction twice per day. However, the data is available at 25 km resolution (two times coarser than the 12.5 km QuikSCAT), and ASCAT covers only 60% of the area covered by QuikSCAT in the same time period. QuikSCAT sees a swath of ocean 1800 km wide, while ASCAT sees two parallel swaths 550 km wide, separated by a 720 km gap. I find it frustrating to use ASCAT to monitor tropical storms, since the passes miss the center of circulation of a storm of interest more than half the time. On the plus side, ASCAT has the advantage that the data is not adversely affected by rain, unlike QuikSCAT.
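The coverage difference follows directly from the swath widths quoted above (a back-of-envelope check of my own):

```python
# Swath widths from the text: QuikSCAT sees one 1800-km swath; ASCAT sees
# two 550-km swaths with an unobserved 720-km gap between them.
quikscat_swath_km = 1800
ascat_swath_km = 2 * 550  # the 720-km nadir gap contributes nothing

coverage_ratio = ascat_swath_km / quikscat_swath_km
print(f"ASCAT observes about {coverage_ratio:.0%} of QuikSCAT's swath width")
```

That ratio (about 61%) matches the roughly 60% coverage figure cited above, and helps explain why ASCAT passes miss a storm's center so often.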
The need for a new QuikSCAT
Since the loss of QuikSCAT would be such a significant blow, and the alternative sources of ocean surface wind data are of significantly lower quality, NOAA has been pushing for a QuikSCAT replacement for years. Former National Hurricane Center director Bill Proenza laudably made a big push in 2007 for a new QuikSCAT satellite. Unfortunately, he made claims about the usefulness of QuikSCAT for improving hurricane track forecasts that were not supported by scientific research, an error that may have ultimately led to his downfall. While there is evidence that QuikSCAT data may improve hurricane track forecasts of some computer models, NHC uses many models to make hurricane track forecasts, and some of these models are not helped by QuikSCAT data. For example, a 2009 model study by Dr. Jim Goerss of the Naval Research Lab found that QuikSCAT winds made no improvement to hurricane track forecasts of the NOGAPS model, one of the key models used by NHC to predict hurricane tracks. QuikSCAT is extremely valuable for many other aspects of hurricane forecasting, though. It provides early detection of surface circulations in developing tropical depressions, and is essential for defining gale (34 kts) and storm-force (50 kts) wind radii. The information on wind radii from QuikSCAT is especially important for tropical storms and hurricanes outside the range of aircraft reconnaissance flights conducted in the Atlantic and Eastern Pacific basins, and for the regions where there are no reconnaissance flights (Central Pacific, Western Pacific, and Indian Ocean). Accurate wind radii are critical to the National Hurricane Center (NHC), Central Pacific Hurricane Center (CPHC), and Guam Weather Forecast Office (WFO) watch and warning process, since they affect the size of tropical storm and hurricane watch and warning areas.
Between 2003 and 2006, QuikSCAT data were used at NHC 17% of the time to determine the wind radii, 21% of the time for center fixing, and 62% of the time for storm intensity estimates.
Figure 2. Comparison of simulated wind measurements for the Alaska coast, including Juneau and Sitka. Left: the next-generation QuikSCAT XOVWM satellite accurately retrieves winds in the Inside Passage, including a jet of tropical-storm force winds (yellow colors) along one channel. Right: the current QuikSCAT instrument cannot cover the coast or Inside Passage due to its limited resolution, and underestimates the area covered by winds of 42+ knots by a factor of 2 - 3 (orange colors). There is heavy shipping traffic in the areas missed by QuikSCAT coverage. Image credit: NASA QuikSCAT Follow-on Study.
The best solution: a next-generation QuikSCAT
QuikSCAT is 15-year-old technology, and has significant limitations in what it can do. The needs of the weather forecasting community would best be served by launching a next-generation QuikSCAT satellite, called the Extended Ocean Vector Winds Mission (XOVWM). This is the solution recommended by the National Research Council in their 2007 decadal survey, Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond, page 456.
Some of the limitations of the current QuikSCAT that would be solved by a next-generation QuikSCAT:
1) QuikSCAT has limited spatial resolution, and cannot "see" winds within 20-30 km of the coast. This is where the bulk of ship traffic and fishing occurs. The proposed next-generation QuikSCAT XOVWM satellite would be able to "see" winds within 5 km of the coast.
2) QuikSCAT cannot measure winds greater than approximately 65 mph (a Category 1 hurricane has winds of 74 mph or greater). A next-generation QuikSCAT XOVWM satellite would be able to measure winds up to Category 5 hurricane strength (>155 mph).
3) QuikSCAT cannot "see" through heavy rain. A next-generation QuikSCAT XOVWM satellite would.
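To put the wind-speed ceilings in item 2 into Saffir-Simpson terms, here is a small sketch of my own (the helper function is hypothetical; the thresholds are the approximate Saffir-Simpson category minimums, in mph, as used at the time):

```python
# Approximate Saffir-Simpson category thresholds (mph):
# Cat 1 >= 74, Cat 2 >= 96, Cat 3 >= 111, Cat 4 >= 131, Cat 5 >= 156.
SAFFIR_SIMPSON_MPH = [(1, 74), (2, 96), (3, 111), (4, 131), (5, 156)]

def max_resolvable_category(ceiling_mph):
    """Highest hurricane category whose wind threshold an instrument's
    measurement ceiling can reach."""
    cat = 0
    for category, threshold in SAFFIR_SIMPSON_MPH:
        if ceiling_mph >= threshold:
            cat = category
    return cat

print(max_resolvable_category(65))   # current QuikSCAT: 0 (no hurricane winds)
print(max_resolvable_category(100))  # proposed replacement: 2
print(max_resolvable_category(160))  # next-generation XOVWM: 5
```

In other words, the current instrument tops out below hurricane force entirely, while XOVWM could resolve winds through Category 5.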
Obviously, all of these capabilities would be a huge boost for determining the size and strength of a hurricane, and would reduce the amount of uncertainty in hurricane forecasts. The cost of a next-generation QuikSCAT XOVWM satellite was estimated by NASA in a 2008 study to be about $500 million. Ideally, a constellation of two satellites would be launched, to prevent the gaps in coverage that occur with the current single satellite. A two-satellite system would cost an estimated $723 million, according to the 2008 Jet Propulsion Laboratory study, QuikSCAT Follow-on Mission Concept Study (JPL Publication 08-18, April 2008).
The second best solution: a QuikSCAT replacement
A second, cheaper solution that is being considered is to launch a replacement QuikSCAT satellite that has similar capabilities to the current one. NOAA and NASA are exploring a partnership with the Japanese Space Agency (JAXA) to fly a QuikSCAT instrument on their GCOM-Water Cycle satellite, scheduled to launch in 2016. Funding must begin in 2010 in order to meet this launch deadline. The proposed QuikSCAT replacement would be able to measure winds as high as 100 mph (Category 2 hurricane strength), and have improved ability to measure winds in heavy rain. The new satellite would have a 20% improvement in spatial resolution. The cost would be less than that of a next-generation QuikSCAT, since the rocket and satellite are already paid for. However, there are additional costs involved in adapting QuikSCAT to the Japanese engineering requirements. The final costs of such a replacement QuikSCAT have not been determined yet, but would probably be several hundred million dollars.
This is the type of cause that we as citizens should lobby Congress for. Write your Senators and Representatives! The earliest a new QuikSCAT could be launched is 2015, and the current satellite will probably die well before then. Feel free to use the information above, or come up with your own. Thanks!
Contact info for your House Representative
Contact your Senator
2007 NOAA QuikSCAT user impact study:
By: JeffMasters, 1:27 PM GMT on March 24, 2009
Alaska's Redoubt Volcano continues to erupt, with the latest blast coming just after midnight Eastern time (7:41pm AKDT). The latest eruption threw ash 50,000 feet into the air, but the ash has settled to the ground and the ashfall advisory for cities to the north and northwest of Anchorage such as Talkeetna has expired. Redoubt is located about 100 miles southwest of Alaska's most populous city, Anchorage. The prevailing southerly winds deposited a swath of ash about 200 miles long to the north of the volcano (Figure 1). Redoubt last erupted between December 1989 - April 1990, and its ash clouds presented a major hazard to aviation. On December 16, 1989, Redoubt's eruption spewed ash into the air to a height of 14,000 m (45,000 ft), catching KLM Royal Dutch Airlines flight 867, a Boeing 747 aircraft, in the plume. All four engines stalled and the aircraft plummeted 13,000 feet before the pilot was able to restart the engines and land safely in Anchorage. The total costs to the aviation industry from the 1989 - 1990 eruption were about $100 million. Eighty percent of these costs were due to damaged equipment. For more information on the Redoubt eruption, check out the Alaska Volcano Observatory home page.
Figure 1. Ash on the snow to the north of Alaska's Mt. Redoubt in this true-color image from NASA's Terra satellite. Image taken 21:49 GMT March 23, 2009. Image credit: Johnathan Dehn, Geographic Information Network of Alaska.
Redoubt's effect on the climate should be minimal
Many historic volcanic eruptions have had a major cooling impact on Earth's climate. However, Redoubt is very unlikely to be one of them. To see why this is, let's examine recent volcanic eruptions that have had a significant cooling effect on the climate. In the past 200 years, Mt. Pinatubo in the Philippines (June 1991), El Chichon (Mexico, 1982), Mt. Agung (Indonesia, 1963), Santa Maria (Guatemala, 1902), Krakatoa (Indonesia, 1883), and Tambora (Indonesia, 1815) all created noticeable cooling. As one can see from a plot of the solar radiation reaching Mauna Loa in Hawaii (Figure 2), the Mt. Pinatubo and El Chichon eruptions caused a greater than 10% drop in sunlight reaching the surface. The eruption of Tambora in 1815 had an even greater impact, triggering the famed Year Without a Summer in 1816. Killing frosts and snow storms in May and June 1816 in Eastern Canada and New England caused widespread crop failures, and lake and river ice were observed as far south as Pennsylvania in July and August. Volcanic eruptions cause this kind of climate cooling by throwing large amounts of sulfur dioxide gas into the stratosphere. This gas reacts with water to form sulphuric acid droplets (aerosol particles), which are highly reflective, and reduce the amount of incoming sunlight.
Figure 2. Reduced solar radiation due to volcanic aerosols as measured at Mauna Loa Observatory, Hawaii. Image credit: NOAA/ESRL.
You'll notice from the list of eruptions above that all of these climate-cooling events were from volcanoes in the tropics. Over the tropics, the stratosphere's circulation features rising air, which pulls the sulfur-containing volcanic aerosols high into the stratosphere. Upper-level winds in the stratosphere tend to flow from the Equator to the poles, so sulfur aerosols from equatorial eruptions get spread out over both hemispheres. These aerosol particles take a year or two to settle back down to earth, since there is no rain in the stratosphere to help remove them. However, if a major volcanic eruption occurs in the mid-latitudes or polar regions, the circulation of the stratosphere in those regions generally features poleward-flowing, sinking air, and the volcanic aerosol particles are not able to penetrate high into the stratosphere or get spread out around the entire globe. Redoubt is located near 59° north latitude, far from the tropics, and thus is unlikely to be able to inject significant amounts of sulfur aerosols into the stratosphere. Furthermore, the previous 1989 - 1990 eruption of Redoubt (Figure 3) put only about 1/100 of the amount of sulfur into the air that the 1991 eruption of Mt. Pinatubo did, according to the TOMS Volcanic Emissions Group. We can expect the current eruption of Redoubt to be similar in sulfur emissions to the 1989 - 1990 eruption, and have an insignificant impact on global climate.
Figure 3. Amount of sulfur gases put into the air by recent volcanic eruptions. Note that the 1989 eruption of Redoubt put only 1/100 the amount of sulfur dioxide (SO2) into the air that the 1991 eruption of Mt. Pinatubo did. Image credit: TOMS Volcanic Emissions Group.
For more information
Realclimate.org has a nice article that goes into the volcano-climate connection in greater detail. One interesting quote from the article: "There can be some exceptions to the tropics-only rule, and at least one high latitude volcano appears to have had significant climate effects; Laki (Iceland, 1783-1784). The crucial factor was that the eruption was almost continuous for over 8 months, which led to significantly elevated sulphate concentrations for that whole time over much of the Atlantic and European regions, even though stratospheric concentrations were likely not particularly exceptional."
scienceblog.com has an interesting article about the largest volcanic eruption of the 20th century--the 1912 eruption of Alaska's Mt. Novarupta, located in the same chain of volcanoes as Mt. Redoubt. According to a NASA computer model, Novarupta's climate-cooling aerosols stayed north of 30°N latitude, and did not cause global cooling. However, the model indicates that the eruption may have indirectly weakened India's summer monsoon, producing an abnormally warm and dry summer over northern India.
By: JeffMasters, 2:09 PM GMT on March 23, 2009
After many months of rumbling, Alaska's Redoubt Volcano finally exploded beginning at 11:38pm last night. Four separate eruptions have sent clouds of ash up to 50,000 feet high into the air. Redoubt is located about 100 miles southwest of Alaska's most populous city, Anchorage. Ash fall advisories were issued for the cities of Talkeetna, Willow, and Cantwell to the north and west of Anchorage until 8am AKDT this morning, and light ash has already been reported at Skwentna. The prevailing southerly winds are expected to carry the ash west of Anchorage today. However, if the volcano has ejected significant ash at a height of 35,000 - 40,000 feet, the southwesterly winds at that altitude would carry the ash over Anchorage (Figure 2). Redoubt last erupted between December 1989 - April 1990, and its ash clouds presented a major hazard to aviation. On December 16, 1989, Redoubt's eruption spewed ash into the air to a height of 14,000 m (45,000 ft), catching KLM Royal Dutch Airlines flight 867, a Boeing 747 aircraft, in the plume. All four engines stalled and the aircraft plummeted 13,000 feet before the pilot was able to restart the engines and land safely in Anchorage. For more information on the Redoubt eruption, check out the Alaska Volcano Observatory home page.
Figure 1. The summit crater of Alaska's Mt. Redoubt, showing a rapidly melting glacier and enlarged "ice piston" feature on Saturday, March 21, 2009. Image credit: Cyrus Read, Alaska Volcano Observatory/U.S. Geological Survey.
Figure 2. Plot of ash trajectories originating at the Redoubt volcano (black star) at 8 am EDT Monday March 23, 2009. The initial eruption carried ash to a height of 20,000 feet (green line), so ash is expected to move NNE, passing west of Anchorage. Ash has already fallen at Skwentna (SKW) to the north of Anchorage. If the newer blasts were able to carry significant ash to 40,000 feet (pink line), the prevailing southwesterly winds at that altitude would carry the ash over Anchorage (ANC). Image credit: NOAA.
Significant tornado outbreak possible today
NOAA's Storm Prediction Center (SPC) is predicting a "Moderate Risk" of severe weather and tornadoes over eastern Kansas and northern Oklahoma today, as a strong springtime low pressure system tracks across the Midwest. "Moderate Risk" is SPC's second-highest level of risk, and they expect severe thunderstorms with possible strong (EF2 or EF3) tornadoes to form late this afternoon along the cold front extending south from the low. This severe weather outbreak will be hampered somewhat by a lack of moisture, though. It's been very dry the first three months of 2009, which has made this year's tornado season about 50% less active than usual. Follow the outbreak today on our Interactive Tornado Page and Severe Weather Page.
By: JeffMasters, 12:35 PM GMT on March 20, 2009
Global temperatures in February remained about where they've been the past year, with Earth recording its 9th warmest February on record, according to statistics released by the National Climatic Data Center. This past winter was the eighth warmest winter on record (December-February), and the January-February year-to-date period was also the eighth warmest. The most notable extreme February heat occurred February 7 in southern Australia. Many locations set new all-time hottest temperature records, including an all-time state record for Victoria when temperatures reached 48.8°C (119.8°F) in Hopetoun, shattering the previous record of 47.2°C (117.0°F) set in January 1939. The extreme heat was accompanied by very dry conditions that contributed to the development of deadly wildfires that killed 210 people. The most notable cold conditions for the winter of 2008/2009 occurred in the United Kingdom, which had its coldest winter since 1995/1996.
Figure 1. Departure of temperature from average for the month of February 2009. Image credit: NOAA's National Climatic Data Center.
A dry and warm February for the U.S.
For the contiguous U.S., February temperatures were the 27th warmest in the 114-year record, according to the National Climatic Data Center. The month was very dry, ranking as the 8th driest February. New Jersey and Delaware had their driest February ever recorded. The winter of 2008/2009 (December - February) ranked as the 5th driest winter on record, and the year-to-date January - February period was the driest ever such period. Texas recorded its driest winter. Thanks to all the dry weather, the U.S. has only seen about 50% of normal tornado activity in 2009, according to NOAA's Storm Prediction Center. On March 19, 2009, 21% of the contiguous United States was in moderate-to-exceptional drought. This is unchanged from January.
La Niña conditions continue
La Niña conditions continued in the Eastern Pacific Ocean in February, and NOAA's Climate Prediction Center is continuing their La Niña Advisory. They define La Niña conditions as occurring when the 1-month mean temperature anomaly in the equatorial Eastern Pacific (the area 5°N - 5°S, 120°W - 170°W, also called the "Niño 3.4 region") cools below -0.5°C and is expected to persist for three consecutive months. In addition, the atmospheric response typically associated with a La Niña must be observed over the equatorial Pacific Ocean. Sea surface temperature anomalies peaked at 1.1°C below average in the Niño 3.4 region during early January. It appears that La Niña has peaked, as ocean temperatures in the Niño 3.4 region have warmed to 0.4°C below average as of March 15. Nearly all the model forecasts for the Niño 3.4 region show that La Niña will dissipate between May - July 2009, and neutral conditions are expected for the August - October peak of hurricane season. Only three out of 16 El Niño models are predicting an El Niño event for hurricane season. The number of Atlantic hurricanes is typically reduced in an El Niño year, due to increased wind shear from strong high-level winds, but that doesn't look like it will happen this year.
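The ocean-temperature part of the La Niña criterion described above reduces to a simple running check (a sketch of my own; the example anomaly values are made up for illustration, not official CPC data):

```python
def la_nina_active(monthly_anomalies_c, threshold=-0.5, run_length=3):
    """True if `run_length` consecutive monthly Nino 3.4 temperature
    anomalies are at or below `threshold` (degrees C)."""
    run = 0
    for anomaly in monthly_anomalies_c:
        run = run + 1 if anomaly <= threshold else 0
        if run >= run_length:
            return True
    return False

print(la_nina_active([-0.6, -0.9, -1.1, -0.8, -0.4]))  # True: a 3-month cool run
print(la_nina_active([-0.3, -0.6, -0.4, -0.2]))        # False: no 3-month run
```

Note that the full advisory criteria also require the matching atmospheric response mentioned above; this sketch checks only the sea surface temperature condition.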
Expected impacts of the current La Niña conditions during March - May 2009 include above-average precipitation over Indonesia, and below-average precipitation over the central equatorial Pacific. Compared to the Northern Hemisphere winter, La Niña impacts over the United States are typically less pronounced. For the contiguous United States, potential impacts include below-average precipitation across the southern states. Other potential impacts include below-average temperatures in the Pacific Northwest and above-average temperatures across much of the southwestern and south-central United States.
Sea ice extent
February 2009 Northern Hemisphere sea ice extent was the 4th lowest on record, according to the National Snow and Ice Data Center, and is currently near its annual maximum. The record February low was set in 2005. Since today is the Spring Equinox, the sun will rise at the North Pole, bringing 24-hour daylight and warming conditions that will begin melting Arctic sea ice.
Portlight Relief Walk this weekend
Saturday March 21 in New Orleans marks the inaugural event in a series of 18 fundraising Relief Walks sponsored by Portlight.org. A hearty thanks goes to all the organizers and participants in this effort!
By: JeffMasters, 2:04 PM GMT on March 17, 2009
Last week, I introduced the National Climatic Data Center's Climate Extremes Index, which uses temperature and precipitation records to see if the U.S. climate is getting more extreme. Today, I'll focus on how the drought and precipitation extremes that go into the Climate Extremes Index have changed over the past century. The three precipitation-related factors that go into the Climate Extremes Index are:
1) The sum of (a) percentage of the United States in severe drought (equivalent to the lowest tenth percentile) based on the Palmer Drought Severity Index (PDSI) and (b) percentage of the United States with severe moisture surplus (equivalent to the highest tenth percentile) based on the PDSI.
2) Twice the value of the percentage of the United States with a much greater than normal proportion of precipitation derived from extreme (equivalent to the highest tenth percentile) 1-day precipitation events.
3) The sum of (a) percentage of the United States with a much greater than normal number of days with precipitation and (b) percentage of the United States with a much greater than normal number of days without precipitation.
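The three components above are simple sums and doublings of areal percentages; here is a schematic of my own (the actual NCDC methodology derives these percentages from gridded station data, which this sketch takes as given):

```python
def cei_precip_components(pct_severe_drought, pct_moisture_surplus,
                          pct_extreme_1day, pct_many_wet_days,
                          pct_many_dry_days):
    """Return the three precipitation-related CEI components,
    each expressed as a percent of the contiguous U.S."""
    pdsi_extremes = pct_severe_drought + pct_moisture_surplus  # item 1
    heavy_1day = 2 * pct_extreme_1day                          # item 2
    wet_dry_days = pct_many_wet_days + pct_many_dry_days       # item 3
    return pdsi_extremes, heavy_1day, wet_dry_days

# e.g. a year with 6% of the U.S. in severe drought, 4% in moisture surplus,
# 5% seeing extreme 1-day events, 3% with many wet days, 7% with many dry days:
print(cei_precip_components(6, 4, 5, 3, 7))  # (10, 10, 10)
```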
Items 1 and 3 have shown no change in annual average value over the past century, but there has been a marked increase in the number of heavy 1-day precipitation events in recent decades. Thus, the record values of the Climate Extremes Index in recent years are due to a combination of the increase in heavy 1-day precipitation events and the increase in maximum and minimum temperatures.
Figure 1. The Annual Climate Extremes Index (CEI) for heavy 1-day precipitation events shows that these events, on average, have affected 10% of the U.S. over the past century (black line). However, heavy precipitation events have increased recently, with seven of the top ten years on record having occurred since 1995. Image credit: National Climatic Data Center.
Heavy precipitation events
Global warming theory predicts that global precipitation will increase, and that heavy precipitation events--the ones most likely to cause flash flooding--will also increase. This occurs because as the climate warms, evaporation of moisture from the oceans increases, resulting in more water vapor in the air. According to the 2007 Intergovernmental Panel on Climate Change (IPCC) report, water vapor in the global atmosphere has increased by about 5% over the 20th century, and 4% since 1970. The Climate Extremes Index plot for extreme 1-day precipitation events (Figure 1) does indeed show a sharp increase in heavy precipitation events in recent decades, with seven of the top ten years for these events occurring since 1995. The increases in heavy precipitation events have primarily come in the Spring and Summer, when the most damaging floods typically occur. This mirrors the results of Groisman et al. (2004), who found an increase in annual average U.S. precipitation of 7% over the past century, which has led to a 14% increase in heavy (top 5%) and 20% increase in very heavy (top 1%) precipitation events. Kunkel et al. (2003) also found an increase in heavy precipitation events over the U.S. in recent decades, but noted that heavy precipitation events were nearly as frequent at the end of the 19th century and beginning of the 20th century, though the data is not as reliable back then.
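The warming-to-water-vapor link can be made concrete with the Magnus approximation to the Clausius-Clapeyron relation (a sketch of my own; the post itself does not perform this calculation):

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

e_15 = saturation_vapor_pressure_hpa(15.0)
e_16 = saturation_vapor_pressure_hpa(16.0)
increase_pct = 100.0 * (e_16 / e_15 - 1.0)
print(f"+1 C of warming raises saturation vapor pressure by ~{increase_pct:.1f}%")
```

Roughly 6-7% more water-holding capacity per degree of warming is consistent with the expectation that heavy precipitation events intensify faster than mean precipitation does.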
Drought and extreme wetness
Global warming theory predicts that although global precipitation should increase in a warmer climate, droughts will also increase in intensity, areal coverage, and frequency (Dai et al., 2004). This occurs because when the normal variability of weather patterns brings a period of dry weather to a region, the increased temperatures due to global warming will intensify drought conditions by causing more evaporation and drying up of vegetation. Increased drought is my number one concern regarding climate change for both the U.S. and the world in the coming century. Two of the three costliest U.S. weather disasters since 1980 have been droughts--the droughts of 1988 and 1980, which cost $71 billion and $55 billion, respectively. The heat waves associated with these droughts claimed over 17,000 lives, according to the National Climatic Data Center publication, Billion-Dollar Weather Disasters. Furthermore, the drought of the 1930s Dust Bowl, which left over 500,000 people homeless and devastated large areas of the Midwest, is regarded as the third costliest U.S. weather disaster on record, behind Katrina and the 1988 drought. (Ricky Rood has an excellent book on the Dust Bowl that he recommends in his latest blog post).
Figure 2. The Annual Climate Extremes Index (CEI) for drought. The worst U.S. droughts on record occurred in the 1930s and 1950s. There has been no trend in the amount of the U.S. covered by drought conditions (blue bars) or by abnormally moist conditions (red bars) over the past century. About 10% of the U.S. is typically covered by abnormally dry or wet conditions (black lines). Image credit: National Climatic Data Center.
The good news is that the intensity and areal coverage of U.S. droughts have not increased in recent decades (blue bars in Figure 2). The portion of the U.S. experiencing abnormal drought and exceptionally wet conditions has remained nearly constant at 10% over the past century. A recent paper by Andreadis et al. (2006) summed up 20th century drought in the U.S. as follows: "Droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the country over the last century. The main exception is the Southwest and parts of the interior of the West, where, notwithstanding increased precipitation (and in some cases increased soil moisture and runoff), increased temperature has led to trends in drought characteristics that are mostly opposite to those for the rest of the country especially in the case of drought duration and severity, which have increased."
The rest of the globe has not been so lucky. Globally, Dai et al. (2004) showed that areas experiencing the three highest categories of drought--severe, extreme, and exceptional--more than doubled (from ~12% to 30%) since the 1970s, with a large jump in the early 1980s due to an El Niño-related precipitation decrease over land, and subsequent increases primarily due to warming temperatures. According to the Global Drought Monitor, 50 million people worldwide currently live in areas experiencing the highest level of drought (exceptional).
The future of U.S. drought
As the climate continues to warm, I expect the frequency, severity, and areal coverage of droughts to increase over the U.S. We're certainly off to a dry start in 2009--the period January - February this year was the driest such period in U.S. history, according to the National Climatic Data Center.
Andreadis, K.M., and D.P. Lettenmaier, 2006: "Trends in 20th century drought over the continental United States", Geophys. Res. Lett., 33, L10403, doi:10.1029/2006GL025711.
Dai, A., K.E. Trenberth, and T. Qian, 2004: "A global data set of Palmer Drought Severity Index for 1870-2002: Relationship with soil moisture and effects of surface warming", J. Hydrometeorol., 5, 1117-1130.
Gleason, K.L., J.H. Lawrimore, D.H. Levinson, T.R. Karl, and D.J. Karoly, 2008: "A Revised U.S. Climate Extremes Index", J. Climate, 21, 2124-2137.
Groisman, P.Y., R.W. Knight, T.R. Karl, D.R. Easterling, B. Sun, and J.H. Lawrimore, 2004, "Contemporary Changes of the Hydrological Cycle over the Contiguous United States: Trends Derived from In Situ Observations," J. Hydrometeor., 5, 64-85.
Kunkel, K. E., D. R. Easterling, K. Redmond, and K. Hubbard, 2003, "Temporal variations of extreme precipitation events in the United States: 1895-2000", Geophys. Res. Lett., 30(17), 1900, doi:10.1029/2003GL018052.
By: JeffMasters, 1:54 PM GMT on March 13, 2009
Is the climate in the U.S. getting more extreme? The answer to this question depends upon how one defines "extreme". For example, the number of extreme tornadoes (violent EF-4 and EF-5 twisters) has not increased in recent years. We lack the data to judge whether there has been an increase in severe thunderstorms and hail. There has been a marked increase in Atlantic hurricane activity since 1995 (though the possible contribution of human-caused global warming to this increase is not something hurricane scientists agree upon). Since it is difficult to quantify how severe storms like tornadoes and hurricanes are changing, a better measure of how climate extremes are changing is to look at temperature and precipitation, which are well-measured. NOAA's National Climatic Data Center (NCDC) has developed a Climate Extremes Index to attempt to quantify whether or not the U.S. climate is getting more extreme. The Climate Extremes Index (CEI) is based upon three parameters:
1) Monthly maximum and minimum temperature
2) Daily precipitation
3) Monthly Palmer Drought Severity Index (PDSI)
The temperature data is taken from 1100 stations in the U.S. Historical Climatology Network (USHCN), a network of stations that have a long period of record, with little missing data. The temperature data is corrected for the Urban Heat Island effect, as well as for station and instrument changes. The precipitation data is taken from 1300 National Weather Service Cooperative stations. The Climate Extremes Index defines "much above normal" as the highest 10% of data, "much below normal" as the lowest 10%, and is the average of these five quantities:
1) The sum of (a) percentage of the United States with maximum temperatures much below normal and (b) percentage of the United States with maximum temperatures much above normal.
2) The sum of (a) percentage of the United States with minimum temperatures much below normal and (b) percentage of the United States with minimum temperatures much above normal.
3) The sum of (a) percentage of the United States in severe drought (equivalent to the lowest tenth percentile) based on the Palmer Drought Severity Index (PDSI) and (b) percentage of the United States with severe moisture surplus (equivalent to the highest tenth percentile) based on the PDSI.
4) Twice the value of the percentage of the United States with a much greater than normal proportion of precipitation derived from extreme (equivalent to the highest tenth percentile) 1-day precipitation events.
5) The sum of (a) percentage of the United States with a much greater than normal number of days with precipitation and (b) percentage of the United States with a much greater than normal number of days without precipitation.
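To make the arithmetic of the index concrete, here is a minimal sketch (not NCDC's actual code) of how the five quantities above combine into a single CEI value. The function name and the example percentages are hypothetical; each input is the percentage of U.S. area meeting the stated condition in a given year.

```python
# Hypothetical sketch of the CEI combination step. Each argument is the
# percentage of U.S. area in the named condition for one year; the real
# NCDC calculation derives these percentages from station data.

def climate_extremes_index(
    tmax_much_below, tmax_much_above,      # quantity 1
    tmin_much_below, tmin_much_above,      # quantity 2
    pdsi_severe_drought, pdsi_severe_wet,  # quantity 3
    extreme_1day_precip,                   # quantity 4 (doubled)
    many_wet_days, many_dry_days,          # quantity 5
):
    quantities = [
        tmax_much_below + tmax_much_above,
        tmin_much_below + tmin_much_above,
        pdsi_severe_drought + pdsi_severe_wet,
        2 * extreme_1day_precip,
        many_wet_days + many_dry_days,
    ]
    # The CEI is the average of the five quantities
    return sum(quantities) / len(quantities)

# A "typical" year, with every 10%-tail at its expected 10% coverage,
# yields the long-term baseline of about 20%:
print(climate_extremes_index(10, 10, 10, 10, 10, 10, 10, 10, 10))  # -> 20.0
```

Because each tail is defined as the most extreme 10% of the historical record, a climatologically average year scores near 20%, which is why 20% appears as the baseline in the CEI plots.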
Figure 1. The Annual Climate Extremes Index (CEI), updated through 2008, shows that U.S. climate has been getting more extreme since the early 1970s. On average since 1910, 20% of the U.S. has seen extreme conditions in a given year (thick black line). Image credit: National Climatic Data Center.
As summarized by Gleason et al. (2008), the National Climatic Data Center concludes that based on the Climate Extremes Index, the percentage of the U.S. seeing extreme temperatures and precipitation has generally increased since the early 1970s. These increases were most pronounced in the summer. No trend in extremes was noted for winter. The annual CEI index plot averaged for all five temperature and precipitation indices (Figure 1) showed that five of the fifteen most extreme years on record occurred since 1997. Shorter-lived periods with high CEI values occurred in the 1930s and 1950s, in association with widespread extreme drought and above-average temperatures. The most extreme year in U.S. history was 1998, with 1934 a close second. The year 1998 was the hottest year in U.S. history, with a record 78% of the U.S. experiencing minimum temperatures much above normal. That year also had a record 23% of the U.S. with much greater than normal precipitation from extreme 1-day precipitation events. The 1934 extreme in CEI was due in large part to the most widespread drought of the century--a full 52% of the U.S. was affected by severe or extreme drought conditions. That year also saw a record 64% of the U.S. with much above normal maximum temperatures.
The impact of maximum and minimum temperatures on the Climate Extreme Index
It is very interesting to look at the five separate indices that go into the Climate Extremes Index. Today I'll look at temperature, and next week, I'll focus on drought and precipitation. The portion of the U.S. experiencing month-long maximum temperatures either much above normal or much below normal has been about 10% over the past century (black lines in Figure 2). However, over the past decade, about 20-25% of the U.S. has been experiencing monthly maximum temperatures much above normal, and the portion of the U.S. experiencing much colder than normal high temperatures has been near zero. Minimum temperatures show a similar behavior, but have increased more than the maximums (Figure 3). Over the past decade, minimum temperatures much above normal have affected 25-35% of the U.S. This means that the daily range of temperature (the difference between minimum and maximum) has decreased over the past decade, which is what global warming theory predicts should happen if greenhouse gases are primarily to blame for the rise in temperatures.
While there have been a few years (1921, 1934) when the portion of the U.S. experiencing much above normal maximum temperatures was greater than anything observed in the past decade, the sustained lack of maximum temperatures much below normal over the past decade is unique. The behavior of minimum temperatures over the past decade is clearly unprecedented--both in the lack of minimum temperatures much below normal, and in the abnormal portion of the U.S. with much above normal minimum temperatures. Remember that these data ARE corrected for the Urban Heat Island effect, so we cannot blame the increase in temperatures on increased urbanization. Recall that the all-time record maximum and minimum temperature data, which I presented in a post in February, are not corrected for the Urban Heat Island Effect, but look very similar to the CEI maximum and minimum temperature trends presented here.
A lot of people have told me that they believe we are experiencing more wild swings of temperature from hot to cold from day to day in recent years, but the CEI data does not answer this question. To my knowledge, a study of this kind has not been done.
Figure 2. The Annual Climate Extremes Index (CEI) for maximum temperature, updated through 2008, shows that 20-25% of U.S. has had maximum temperatures much above normal over the past decade. Image credit: National Climatic Data Center.
Figure 3. The Annual Climate Extremes Index (CEI) for minimum temperature, updated through 2008, shows that 25-35% of U.S. has had minimum temperatures much above normal over the past decade. Image credit: National Climatic Data Center.
Gleason, K.L., J.H. Lawrimore, D.H. Levinson, T.R. Karl, and D.J. Karoly, 2008: "A Revised U.S. Climate Extremes Index", J. Climate, 21, 2124-2137.
Annual WeatherDance contest ready for registration!
Armchair forecasters, now's your chance to shine! WeatherDance, based on teams in the men's and women's NCAA basketball tournaments, allows players to predict which team's city will be hotter or colder on game day in each round of the Big Dance. Beginning today, players can make their forecasts at the Weather Dance Web site at: www.weatherdance.org. The site will be updated with cities promptly after NCAA seeding announcements. First round Weather Dance selections must be entered by 11:59 p.m. EDT Wednesday, March 18.
"Officially, Weather Dance began as a class project to get students involved in weather forecasting, but we kept it around because it got popular. People think they can do better forecasting than the meteorologists. Well, here's their shot!" said Perry Samson, WeatherDance creator, co-founder of The Weather Underground, Inc., and Professor in the Department of Atmospheric, Oceanic and Space Sciences at the University of Michigan.
This is the fifth year for the game. Last year more than 2,000 people played. Most play merely for the thrill, but many K-12 science teachers involve their classes as part of meteorology units. The winning teacher will receive an expense-paid trip to join the Texas Tech/University of Michigan Storm Chasing team this spring for a day of tornado chasing in Tornado Alley. Other winners will receive a Weather Underground umbrella, "Extreme Weather" mugs, or a copy of the book "Extreme Weather," by Christopher C. Burt.
I'll talk about drought and precipitation trends in my next post, Monday or Tuesday.
By: JeffMasters, 3:46 PM GMT on March 10, 2009
At last week's 63rd Interdepartmental Hurricane Conference, a number of notable news items surfaced regarding doings at the National Hurricane Center (NHC). Some of these are detailed on the NHC web site, and others I learned by talking to the people at the conference and via emails. Of note:
Saffir-Simpson Scale being redefined
NHC is considering removing any mention of storm surge from the familiar Category 1-2-3-4-5 Saffir-Simpson scale, starting this June. The current definition is primarily based upon wind speed, but storm surge flooding is included as well. The new definition will make the Saffir-Simpson scale exclusively keyed to wind speeds. This change will help pave the way for the proposed Storm Surge Warning, discussed next.
New Storm Surge Warning product proposed
The impact of Hurricane Ike on the Texas coast in 2008 underscored the inadequacy of the Saffir-Simpson scale to characterize storm surge threat. Ike was a strong Category 2 hurricane on the Saffir-Simpson scale, yet brought a storm surge characteristic of a strong Category 3 hurricane to the coast. Very high storm surges in excess of ten feet were recorded along portions of the Louisiana coast, in regions that did not get hurricane force winds. The water level rose four feet above normal at Pascagoula, MS, some 170 miles to the east of the eastern edge of the Hurricane Warning, well before that warning was issued. To address these concerns, NHC is considering issuing a separate storm surge warning. This is a great idea, but there are a number of major technical hurdles to leap before this product can be made operational. NHC director Bill Read indicated that official storm surge warnings are probably 3 - 5 years in the future. Among the concerns:
1) What level of water qualifies? Should it be different depending on the location?
2) Should a level of certainty be used (e.g., a 40% chance of the surge reaching 5 feet)?
3) Would a "storm surge watch" be issued beforehand?
4) The storm surge can stay elevated for several days after a storm passes. How long would the surge warning stay in effect?
Figure 1. Example of how the proposed new Storm Surge Warning and Hurricane Warning areas would have looked for Hurricane Katrina. NHC is also considering unifying the "Inland Hurricane Wind Warning" and Hurricane Warning (currently only issued for the coast) into one unified Hurricane Warning. Image credit: National Hurricane Center.
Expanded lead times for hurricane watches and warnings
Currently, NHC issues a Hurricane Watch 36 hours before the potential arrival of hurricane force winds at the coast, and a Hurricane Warning 24 hours in advance. As early as the 2010 season, it is proposed that these lead times be extended to 48 hours for a Watch and 36 hours for a Warning. This would give increased time for people to prepare, at the expense of warning more people unnecessarily. However, hurricane track forecasts have improved so markedly (50% in the past 20 years, with record accuracy again in 2008) that the number of people being over-warned would not significantly change compared to the 1990s.
Cone of Uncertainty reduced in size
For the Atlantic, official NHC forecasts for track in 2008 were the best ever, for both short range (12, 24, 48 hour) and longer range (3 - 5 days). As a result, NHC will be modestly reducing the size of the "cone of uncertainty" for 2009. Recall that the "cone of uncertainty" is sized so that 2/3 of all track errors over the past five years fall inside the cone. You're definitely not safe if you're in the cone, and even if you're outside it, remember that 1/3 of the time, storms will deviate beyond the cone!
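The cone-sizing rule above is just a percentile calculation: at each forecast hour, the cone radius is set to the error distance that contains 2/3 of the past five years' track errors. The sketch below illustrates the idea; the function name and the sample error values (in nautical miles) are made up for illustration and are not NHC's actual data or code.

```python
import math

def cone_radius(track_errors_nm, coverage=2 / 3):
    """Smallest radius containing the desired fraction of historical
    track errors; a hypothetical sketch of the cone-sizing rule."""
    errors = sorted(track_errors_nm)
    # Index of the smallest error that covers the desired fraction
    k = max(math.ceil(coverage * len(errors)) - 1, 0)
    return errors[k]

# Fabricated sample of 48-hour track errors (nautical miles):
errors_48h = [20, 35, 40, 55, 60, 70, 85, 90, 120]
print(cone_radius(errors_48h))  # -> 70 (6 of the 9 errors are <= 70 nm)
```

Shrinking the cone for 2009 follows directly from this rule: as the five-year error sample improves, the 2/3-coverage radius at each forecast hour gets smaller.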
NHC staffing changes
In response to recommendations made by a panel investigating morale problems at NHC in the wake of the July 2007 revolt by NHC employees against then-director Bill Proenza, a new branch chief position--head of the hurricane forecasters--has been created and filled. The new branch chief of the Hurricane Specialists Unit is former senior hurricane forecaster James Franklin. He will now devote 80% of his time to administrative matters, and will perform only one hurricane forecasting shift per week. Senior hurricane specialist Dr. Rick Knabb left last year to head the Central Pacific Hurricane Center, and NHC has hired Dr. Michael Brennan, who came on board during the tail end of the 2008 season, to fill this vacancy. NHC will be short one hurricane forecaster during the 2009 season, as senior hurricane forecaster Stacy Stewart has been called up for military reserve duty.
I'll have a new post Thursday or Friday.
By: JeffMasters, 3:21 PM GMT on March 06, 2009
At the 63rd Interdepartmental Hurricane Conference (IHC) in St. Petersburg, Florida this week, the latest results from the 2008 hurricane research program and plans for the upcoming 2009 campaign were presented by a number of scientists involved in the National Oceanic and Atmospheric Administration's (NOAA's) P-3 weather research aircraft program. NOAA flies three state-of-the-art flying laboratories into hurricanes each year, and 2008 saw their second busiest season ever. The two low-altitude P-3 aircraft flew 39 missions, and the high-altitude G-IV jet flew 23 missions. A total of 1156 dropsondes and 525 expendable ocean probes were launched on these missions, with a particular emphasis on studying the interaction of the Gulf of Mexico ocean currents with Hurricane Ike and Hurricane Gustav. These interactions play a key role in rapid hurricane intensification in the Gulf of Mexico. NOAA's aircraft also caught the genesis of two hurricanes from the tropical wave stage--Dolly and Fay. These data will be used to help improve modeling studies to better forecast when a tropical wave will turn into a tropical depression, something the models aren't very good at now.
For the first time, wind data from the P-3 tail Doppler radar was sent in real time in 2008 for ingestion into an experimental computer forecast model, the WRF-ARW. There are high hopes that this data will lead to a significant improvement in short term (24-48 hour) hurricane forecasts, beginning in 2010. For this year, the testing phase of this project will continue, with the real-time Doppler radar data from both P-3s being ingested into a non-operational version of the HWRF model. The HWRF is one of the most reliable models used by the National Hurricane Center (NHC) to produce the official forecast, and is receiving a huge amount of development effort in the coming years. If the 2009 test phase goes well, the Doppler P-3 data may go into the operational version of the HWRF model as early as 2010.
NOAA jet getting major upgrades
The NOAA G-IV jet, "Gonzo", is in the process of having a tail Doppler radar installed. The protective radome has already been installed, and the guts of the radar are expected to be added later this year. By 2010, it is expected that Gonzo will have its Doppler radar operational, which should greatly increase scientists' ability to see into the heart of a hurricane, due to the lofty vantage point this radar will have (40,000 feet, as opposed to the 25,000-foot maximum altitude of the Doppler radars on the P-3s). Gonzo has also been fitted with a special version of the Stepped Frequency Microwave Radiometer (SFMR), the extremely valuable surface wind speed instrument carried on all the Air Force C-130 and NOAA P-3 hurricane reconnaissance aircraft. Alan Goldstein of NOAA's Aircraft Operations Center reported that testing of Gonzo's SFMR during 2008 showed that the instrument performed well, and NHC will begin receiving the data during the 2009 hurricane season (but the data will probably be for internal NHC use only until more testing is performed).
Figure 1. The NOAA G-IV high altitude weather research jet, "Gonzo". Image credit: NOAA/AOC.
A new P-3 for NOAA
The other big news from the NOAA Hurricane Hunters is the addition of a new P-3, N44RF. In keeping with the theme of naming their aircraft after Jim Henson Company's Muppets characters, the new aircraft will be dubbed "Animal". The "new" P-3 was reclaimed from the "Boneyard" of disused P-3s at Tucson's Davis-Monthan Air Force Base, and "Animal" is currently being refurbished and fitted with state-of-the-art weather research instrumentation. Animal is not being fitted with a Doppler radar at present, and it is expected that the aircraft will primarily fly air pollution research missions, freeing up the other P-3s (Kermit and Miss Piggy) for exclusive hurricane work during the six months of hurricane season. Crew for the new P-3 are already beginning to arrive at NOAA's Aircraft Operations Center, and it is anticipated that the new aircraft will be ready to fly in 2010.
Figure 2. Nose art from the WP-3Ds, N42RF and N43RF. Copyright © The Jim Henson Company.
By: JeffMasters, 12:39 PM GMT on March 05, 2009
I'm at the 63rd Interdepartmental Hurricane Conference (IHC) in St. Petersburg, Florida this week, catching up on the latest hurricane research results and plans. About 150 scientists and administrators from all the major U.S. hurricane research agencies are here, and I'll present a few of the highlights of the conference in my next few posts.
We can make crude measurements of atmospheric wind and temperature using remote sensing instruments mounted on aircraft or satellites, but we currently have no way to take such measurements of pressure. Accurate pressure measurements of the initial state of the atmosphere are key to making accurate computer model forecasts, since it is differences in pressure that drive all winds, as air flows from high pressure to low pressure in an attempt to equalize the pressure. Currently, sea surface air pressure measurements can only be obtained from in-situ observations including buoy, ship and dropsonde measurements, which are expensive and sparse in spatial coverage. Dr. Roland Lawrence of Old Dominion University and Qilong Min of SUNY Albany presented a new technique to make remote pressure measurements, in a talk titled "Flight Test Results of a Differential Microwave Radar for Remote Sensing of Atmospheric Pressure". Scientists at NASA's Langley Research Center have built a prototype instrument that has successfully taken remote pressure measurements on aircraft test flights over the ocean. The instrument can measure surface pressure to an accuracy of 4 mb, and possibly as good as 1 mb, when averaged over several kilometers of ocean. While flight tests into a hurricane and real-time assimilation of the data in hurricane forecast models are still probably several years away, this is one technology that has the potential to make a big improvement in hurricane track and intensity forecasts. One potential problem with using the instrument in hurricanes is that since the device is measuring the total amount of oxygen in the air along its beam to derive the air pressure, one also needs to know the amount of water vapor along this beam (since water vapor contains oxygen). This quantity will usually need to be modeled, since we don't have detailed humidity measurements available in hurricanes.
I'll post another postcard from IHC on Friday.
By: JeffMasters, 3:58 PM GMT on March 04, 2009
I'm at the 63rd Interdepartmental Hurricane Conference (IHC) in St. Petersburg, Florida this week, catching up on the latest hurricane research results and plans. About 150 scientists and administrators from all the major U.S. hurricane research agencies are here, and I'll present a few of the highlights of the conference in my next few posts.
One new technology discussed involves the release of hundreds of "superpressure" helium balloons into the atmosphere surrounding an approaching hurricane. Justyna Nicinska and Alexander MacDonald of the Oceanic and Atmospheric Research branch of the National Oceanic and Atmospheric Administration (NOAA) presented a talk on this concept--the WISDOM project (Weather In Situ Deployment Optimization Method). The WISDOM project uses hundreds of "superpressure" balloons (Figure 1) to take data around the periphery of a hurricane. These balloons carry a 100 gram GPS receiver and satellite radio transmitter, and are launched from the ground. The balloons quickly rise to an altitude of 12,000 or 26,000 feet, where they remain for a period of 2 - 5 days. They are designed to stay at a constant pressure as they blow across the Atlantic, radioing back their GPS position every 15 - 30 minutes. This position information can then be used to derive the winds in the vicinity of the balloon. This wind information can then be fed into computer forecast models, and the hundreds of new data points over the data-poor ocean regions surrounding the hurricane should help to make improvements in hurricane track and intensity forecasts. The system was tested in November 2008 during Hurricane Paloma, when 57 balloons were launched from locations in the Caribbean and along the U.S. coast. The balloons remained in the air for up to a week, successfully transmitting their position and providing wind information. The data was not ingested in real time into any computer forecast models, though.
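The wind derivation described above is conceptually simple: if the balloon drifts with the flow, the wind is just the displacement between successive GPS fixes divided by the elapsed time. Here is an illustrative sketch (not WISDOM's actual processing); the function name is hypothetical, and it uses a flat-earth approximation that is reasonable over the short distances a balloon covers between reports.

```python
# Illustrative sketch: estimate the wind from two successive balloon GPS
# fixes, assuming the balloon moves with the flow. Uses a flat-earth
# approximation, adequate for the 15-30 minute reporting interval.

import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def wind_from_fixes(lat1, lon1, lat2, lon2, dt_seconds):
    """Return (east, north) wind components in m/s from two GPS fixes
    (degrees) separated by dt_seconds."""
    mean_lat = math.radians((lat1 + lat2) / 2)
    north = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    east = math.radians(lon2 - lon1) * EARTH_RADIUS_M * math.cos(mean_lat)
    return east / dt_seconds, north / dt_seconds

# A balloon drifting 0.1 degree of longitude eastward at the equator in
# 15 minutes implies a wind of roughly 12 m/s from the west:
u, v = wind_from_fixes(0.0, 0.0, 0.0, 0.1, 15 * 60)
print(round(u, 1), round(v, 1))
```

In practice the derived winds would also need quality control (e.g., discarding fixes taken while the balloon is still rising to its float altitude) before being fed to a forecast model.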
Figure 1. Launch of a WISDOM balloon from Miami. Image credit: NOAA.
For 2009, a full-scale test launch of hundreds of balloons into the atmosphere surrounding a high-impact hurricane is planned. Again, the data will not be ingested in real time into hurricane forecast models, but will be available for "hindcast" studies to see how the additional data helps forecasts. The balloons cost several thousand dollars each. Additional costs include the deployment of eight trained student teams (two people per team via commercial air transport) to potential launch sites:
* Barbados, Cayman Islands, St. Croix, and Mexico in the Caribbean
* Cape Verde Islands in the far eastern Atlantic
* U.S. East Coast: Miami FL, Charleston SC, Morehead City, NC
* Central U.S.: Denver CO, Twin Cities MN, Wilmington OH, Dodge City KS
* Western Gulf Coast: Jackson MS, Corpus Christi TX
The current costs of a WISDOM deployment are similar to the costs of flying the Hurricane Hunters for two days into a hurricane. The thought is that costs will fall by a factor of ten, to several hundred dollars per balloon, once the project moves into production mode. The WISDOM project is currently funded (by the Department of Homeland Security) through 2014. The WISDOM team hopes to develop a new data package which will carry a temperature and humidity sensor for future work. It is unknown how much improvement the WISDOM project might make for hurricane track forecasts. The NOAA jet can improve hurricane track forecasts by 25% when it flies one of its dropsonde missions around a hurricane, but these dropsondes provide a more 3-dimensional picture of the atmosphere than the WISDOM balloons will provide, and it is unlikely that the WISDOM balloons will be able to affect hurricane track forecasts to that degree. Wind data similar to the WISDOM data come from cloud tracking by the GOES satellites, and this data has been shown to improve hurricane track forecasts by 7% to 24% in one model study. It is uncertain how much additional improvement might result from using the WISDOM data, since GOES is already providing some data of a similar nature.
I'll have more Thursday from the Interdepartmental Hurricane Conference.
The views of the author are his/her own and do not necessarily represent the position of The Weather Company or its parent, IBM.