Mark Ellinwood
EF2
I also picture the computer recognizing the measurement error and correcting for it when the warning is issued. How do you think the meteorologist knew there was an error? His training and experience. Not some "feeling" in his gut. Well, the computer can be trained to recognize errors too, and also compensate for said errors.
As has been noted already - computers do. They are still early in the development cycle, but they can take a storm that has formed (and even one that has not) and "nowcast" from that. I'd suggest looking up "Warn On Forecast" for more info on what exists now and where it's going. http://journals.ametsoc.org/doi/abs/10.1175/2009BAMS2795.1
No - it doesn't at all. It can spit out an ALL CAPITAL TORNADO WARNING if it wants, or it can notify cell phones in the area of the storm that a tornado is likely to form, and post a note on Twitter, long before the human met can do all of those.
And that's a detriment - not a feature. The computer can analyze EVERY storm on the radar scope, and storms that aren't even on the screen yet, and see which ones are moving into better/worse environments. A human can do a handful, but if you have 12 supercells out there - the PC will win.
Actually those private forecasters are using models that people outside of their environment don't have access to. It's cheaper to pay AW $100K per year than it is to develop a $100M modeling system.
I appreciate your concern for the plight of the human forecaster, but I'm just not sure you are up to speed on where things are now (and what that means for the future).
I'm completely aware of warn-on-forecast. To get to the level of human forecasters, it requires having very accurate model data many minutes/hours before storms form. Again, if one storm is off by 10 miles, you could be tornado warning a population of 1,000 vs. a population of 50,000 in some cases.
You're telling me that a model will be able to analyze the live radar and release warnings immediately? I would love to see a live, minute-by-minute, scan-by-scan model that's constantly updating with up-to-the-minute data. I know that's ultimately the goal of warn-on-forecast, but I have my doubts about how well the models will do with accurate detection (both location and intensity) and the false alarm rate. At WeatherBug, we have DTAs (Dangerous Thunderstorm Alerts) that are automatic warnings for storms based on live radar and lightning, but it certainly has its limitations.
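The kind of automated, threshold-driven alerting described above (a storm flagged from live radar and lightning data) can be sketched in a few lines. This is a hypothetical illustration of the general idea, not WeatherBug's actual DTA algorithm; the function name and every threshold here are made up for the example:

```python
# Toy sketch of an automated severe-storm alert rule driven by live radar
# reflectivity and lightning flash rate. This is NOT the real DTA logic --
# all thresholds below are invented for illustration only.

def should_alert(max_reflectivity_dbz: float, flashes_per_min: float) -> bool:
    """Flag a storm when both the radar core and the lightning rate look severe."""
    STRONG_CORE_DBZ = 60.0    # assumed reflectivity threshold for a severe core
    HIGH_FLASH_RATE = 30.0    # assumed total-lightning flash-rate threshold
    return (max_reflectivity_dbz >= STRONG_CORE_DBZ
            and flashes_per_min >= HIGH_FLASH_RATE)

print(should_alert(63.5, 45.0))  # strong core, rapidly flashing storm
print(should_alert(63.5, 5.0))   # strong core, but little lightning
```

The limitations mentioned in the post fall out of exactly this structure: fixed thresholds can't weigh storm environment or trend, which is where false alarms and missed events come from.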
What models are we (private forecasters) using that others don't have access to? The Euro? You can get that data for as little as $15-20/month from places like WeatherBell and StormVista. Some companies (like MDA Weather Services/EarthSat, which I used to work for) have other Euro data available at a higher premium, but even those higher rates would be cheaper than hiring an actual met. MDA has a proprietary "superensemble" that combines select model data and automatically weights each one, but even then, the human forecasters who work there can beat this proprietary product. About 95% of the forecasting tools I use at my private forecasting job are free to the public.
I live in the private forecasting world, and try to keep up with new forecasting model developments as much as I can. I even took classes on NWP and atmospheric modeling when I was in school. I know where the model limitations are, and can exploit them accordingly to put out a better forecast. That is my knowledge base. Do I know every little detail about all of the latest research? No. I'm not directly involved with it. Do I know what they're trying to accomplish, and where some of the flaws are? Certainly.