Tuesday, June 27, 2006

Poor Prognosis For The Long Range Forecast

Well, with it still raining like crazy, what better time than this to evaluate our little experiment on long range weather forecasting.

(Here in Washington, we broke the record for single day precipitation twice in two days. We suspect that by the end of the day we will have broken the record for monthly precipitation as well. We hope our golf course doesn't get flooded!)

For those who are just tuning in, a brief recap. A couple weeks ago (June 9, to be precise) we started a little experiment on the accuracy of those long range weather forecasts. In our case, we used Weather.com's 10-day and 5-day forecasts for Arlington, Virginia and compared them to the actual weather on the forecast days.

For purposes of our experiment, we asked the question: What if I want to plan an outdoor activity in Arlington between 1 and 6 pm on the day in question? If the forecast called for a 50% or greater chance of rain, we would plan to stay indoors; if less than 50%, we'd plan to be outdoors. We also gave Weather.com a generous +/- 5 degrees on the temperature forecast before we'd say the forecast was wrong. (That is, if the actual temperature was within 5 degrees of the forecast temperature, we said the forecast was accurate.)

For those days where the forecast was accurate, by our definition, we assigned a green Y (meaning "Yes, the forecast was useful"); otherwise, we assigned a red N (meaning the forecast was "Not useful").
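For readers who like to see the rule written out, here is a minimal sketch of that scoring in Python. It assumes, as described above, that a forecast earns a Y only when both the rain call (the 50% indoors/outdoors threshold) and the temperature (within 5 degrees) pan out; the function name and inputs are our own invention, not anything from Weather.com.

def forecast_useful(forecast_rain_pct, forecast_temp, it_rained, actual_temp):
    """Return True (a green Y) if the forecast would have guided the
    1-6 pm plan correctly, False (a red N) otherwise."""
    planned_indoors = forecast_rain_pct >= 50             # the 50% rain threshold
    rain_call_correct = (planned_indoors == it_rained)    # did the indoor/outdoor call pan out?
    temp_within_tolerance = abs(forecast_temp - actual_temp) <= 5  # the +/- 5 degree allowance
    return rain_call_correct and temp_within_tolerance

# Example: forecast said 30% rain and 78 degrees; it stayed dry and hit 81 -> Y
print(forecast_useful(30, 78, it_rained=False, actual_temp=81))    # True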

So, how did the experiment turn out?

Five Day Okay

Well, the 5-day forecast did okay, if you don't mind being wrong 39% of the time. Over an 18-day period, the 5-day forecast earned a Y 11 times and an N 7 times. It was slightly better than simply flipping a coin. Still, if you were planning something important, you might not take too much comfort in the 5-day forecast.

Ten Day No Way

The 10-day forecast, however, was much worse. As we anticipated, the 10-day forecast was worse than flipping a coin: it was accurate on only 6 of 14 days, an accuracy rate of just 43%. Indeed, if you had simply assumed the opposite of the 10-day forecast, you would have done better.

Granted, the experiment didn't go on very long, and we've had some unusually unstable weather during this period. So, to be fair, the Curmudgeon will continue the experiment (without boring you daily with the results) for a couple months to see if things change.

It would be interesting to see someone do a more sophisticated version of this experiment. For example, one could collect data over an entire year for each of the 10 days of the long-range forecast and generate some interesting statistics. Presumably, the shortest-range forecast (the one for the next day) would be the most accurate, with accuracy falling off steadily as one moved out toward the 10-day forecast.
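As a rough sketch of what that larger experiment might look like, the snippet below tallies accuracy separately for each forecast lead day; the record format is hypothetical, and "useful" is simply the Y/N verdict from the scoring rule sketched earlier.

from collections import defaultdict

def accuracy_by_lead_day(records):
    """records: iterable of (lead_day, useful) pairs, where lead_day is 1..10
    and useful is the True/False (Y/N) verdict for that day's forecast."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for lead_day, useful in records:
        totals[lead_day] += 1
        if useful:
            hits[lead_day] += 1
    return {day: hits[day] / totals[day] for day in sorted(totals)}

# With a year's worth of data, one would expect the 1-day accuracy to be
# highest, tapering off toward the 10-day forecast.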

Here's the real rub: we suspect that The Weather Channel folks, as well as those at the National Weather Service, Weather Underground, Accuweather and all those other fun weather sites, know full well that the longer-range forecasts are less accurate than a coin toss. The question, then, is this: why issue forecasts that you know are, more likely than not, MIS-leading?

We suspect that someday in the future, everyone will know weeks or even months in advance exactly what the weather will be on a given day at a given time in a given location. Then, what will we have to talk about?
