Accuracy of barometric pressure sensor in mobile devices?

I just upgraded to a Samsung Galaxy S3 this past weekend, and today I come to find it has a built-in barometric pressure sensor. There are a few apps out there that will display the data reported by it, but I'm wondering how accurate it may be, or if it will even be useful while chasing/spotting.

Example: 986.09 mBars in my office

Does anyone have experience with this type of thing?
 
I have a small, cheap wristwatch that I use for hiking and climbing. Its pressure sensor is extremely accurate, to within a few feet or a small fraction of a millibar. If you walk up a flight of stairs, it will read 10 feet higher. If the surface pressure is steady and the temperature profile is near standard (that's a big if), it is more accurate than a GPS. Whatever instrument error it might have is orders of magnitude less than the normal variation in pressure over time and space.
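
As a rough sanity check on that 10-foot figure (my own back-of-the-envelope numbers, assuming standard sea-level density, not anything from the watch's specs), the hydrostatic relation puts a flight of stairs at roughly a third of a millibar:

```python
# Rough sanity check: how much does pressure change over a 10 ft climb?
# Hydrostatic relation dP = -rho * g * dz with typical sea-level values.

RHO = 1.225    # air density near sea level, kg/m^3
G = 9.80665    # gravity, m/s^2

def pressure_drop_hpa(climb_m):
    """Approximate pressure decrease (hPa) for a small climb near sea level."""
    return RHO * G * climb_m / 100.0   # Pa -> hPa

stairs_m = 10 * 0.3048                 # 10 ft in metres
print(f"10 ft climb ~ {pressure_drop_hpa(stairs_m):.2f} hPa drop")
# -> about 0.37 hPa, easily within reach of a sensor that resolves ~0.1 hPa
```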
 
I agree, that is one device that is remarkably accurate and apparently cheap to make. My old Suunto watch easily registers a 10-foot climb up the stairs.
 
An interesting question. Makes me wonder why, if such devices are accurate and don't need calibrating, do airplanes have to input the current local altimeter setting (pressure) to ensure their altimeter is properly calibrated for that airport, at that time? And how does such a device know whether a pressure change is altitude related or atmospheric-pressure related? I guess it has a built-in GPS chip and makes allowances based on the derived altitude [change]?
 

Altimeters are set locally, and on hand-offs controllers will usually give the local altimeter setting (if you are below 18,000 ft; above that, 29.92" is used). Pressure changes from weather systems can throw the reading off; hence the phrase "from high to low, look out below," where high/low refers to the pressure. (I hope this makes sense, I am way too tired to type correctly.)
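
To put a hedged number on "from high to low, look out below" (the 27 ft-per-hPa figure below is just the usual near-sea-level approximation, not an FAA value):

```python
# "From high to low, look out below": fly into lower pressure without updating
# the altimeter setting and the altimeter over-reads -- you are lower than
# indicated. Near sea level, 1 hPa of pressure is roughly 27 ft of altitude.

FT_PER_HPA = 27.3          # ~8.3 m per hPa near sea level (hydrostatic)
HPA_PER_INHG = 33.8639     # unit conversion

def altitude_error_ft(setting_inhg, actual_qnh_inhg):
    """Feet by which the altimeter over-reads when it is still set to
    setting_inhg but the local altimeter setting is actual_qnh_inhg."""
    return (setting_inhg - actual_qnh_inhg) * HPA_PER_INHG * FT_PER_HPA

# Set 30.12" at departure, arrive where the local setting is 29.82"
print(f"{altitude_error_ft(30.12, 29.82):.0f} ft lower than indicated")
# -> roughly 280 ft, which is why controllers keep handing out new settings
```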
 
Accuracy and sensitivity are two different things. Ten feet of altitude is about .01" of pressure, and both the Suunto watch I had and the Casio I have now can resolve that change. Since altitude, station pressure, and sea-level pressure are all related, you have to know two of them accurately to calculate the third.

Airplanes need to know their altitude above sea level accurately, so they need to be supplied the accurate sea-level pressure at their location to set their accurately calibrated altimeter (barometer). A mobile observer who wants accurate sea-level pressure needs an accurate altitude and an accurately calibrated barometer (see the rough reduction sketch at the end of this post).

Unless you're stationary for a bit and have a good satellite constellation, GPS-derived altitude seems too variable to give an accurate altitude, so I end up, for example, checking my watch against the map altitude. When I do that I've noticed a significant lag before it comes up with an accurate reading, FWIW.

Personally, I have trouble understanding the utility of having a barometer (or an anemometer) unless you're stationary (when a hand-held Kestrel works pretty well) or collecting research data for post-processing. But, hey, different strokes for different folks....
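
For what it's worth, here is a minimal sketch of that sea-level reduction, using the simple isothermal-layer form of the hypsometric equation; the 190 m station elevation (roughly Milwaukee's) and 20 C are assumptions, and official reductions apply fancier corrections:

```python
import math

# Reduce a raw station pressure to sea-level pressure, given the station
# altitude and a representative temperature for the fictitious layer below.

R_D = 287.05   # gas constant for dry air, J/(kg K)
G = 9.80665    # gravity, m/s^2

def sea_level_pressure(station_hpa, altitude_m, temp_c):
    """Estimate sea-level pressure (hPa) from station pressure, altitude, temp."""
    t_mean_k = temp_c + 273.15 + 0.0065 * altitude_m / 2   # crude mean-layer temp
    scale_height_m = R_D * t_mean_k / G
    return station_hpa * math.exp(altitude_m / scale_height_m)

# The 986.09 mb office reading from the first post, with assumed inputs
print(f"{sea_level_pressure(986.09, 190.0, 20.0):.1f} hPa reduced to sea level")
# -> about 1008 hPa; an error of 10 m in the assumed altitude moves this ~1.2 hPa
```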
 
Ya, I should say that the accuracy of the sensor is extremely precise. To take direct pressure values and convert them into useful information often takes a bit of effort and knowledge, as the atmosphere is almost always changing or non-standard.

Makes me wonder why, if such devices are accurate and don't need calibrating

The sensor itself is extremely accurate. To get altitude above a reference level you need constant calibration.
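
To illustrate why the calibration matters more than the sensor, here is the usual standard-atmosphere pressure-to-altitude formula (I'd guess phone altimeter apps use something very similar, but that's an assumption) evaluated with three different sea-level references:

```python
# Pressure-to-altitude with an adjustable sea-level reference: the sensor
# reading is fixed, but the "altitude" moves whenever the reference moves.
# Standard-atmosphere (ISA) constants.

def pressure_altitude_m(p_hpa, p0_hpa=1013.25):
    """Altitude (m) above the level where the pressure equals p0_hpa."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

reading = 986.09   # the office reading from the first post

for p0 in (1013.25, 1010.0, 1016.0):   # three plausible "calibrations"
    print(f"p0 = {p0:7.2f} hPa -> {pressure_altitude_m(reading, p0):6.1f} m")
# A few hPa of drift in the reference shifts the result by tens of metres,
# so the sensor is fine but the altitude is only as good as its calibration.
```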
 
"accuracy of the sensor is extremely precise"
Accuracy has nothing to do with precision. Accuracy has to do with whether a measurement agrees with the actual property being measured. Precision has to do with the number of significant digits used in the measurement. A pressure measurement of 1009.246 +/- 3 mb is very precise, but not very accurate. 1009.246 +/- 0.01 mb is both precise and accurate. 1010 +/- 10 mb (measurement to the nearest 10 mb) is neither precise nor accurate.
 

Assuming the true pressure value isn't 1009.2. If it was, those would all be quite accurate, with errors on the order of 1% or less.
 
The type of precision you defined is more of a computer precision (number of decimal points resolved). I was taught in all my physical science classes that precision was basically the uncertainty in the measurement, i.e., the spread among measurements of a quantity assuming the measured quantity is not changing. For example, if I use two different thermometers to take 100 measurements of the temperature of a block of ice at 0 C, and the standard deviations among the measurements are 1 C and 5 C, then I will say the thermometer that had a 1 C spread among the measurements was more precise. The accuracy is how close to the actual value the "guess" was. If they both give me average temps of -0.174 C, then both have the same accuracy. If one reads only tenths of a degree while the other reads only unit degrees, then I might say one is more precise than the other, but I would know I'm talking about a different kind of precision. I might instead use the term "specific" to describe the number of decimal points the device outputs.

Here's a reference for the discussion: http://en.wikipedia.org/wiki/Accuracy_and_precision
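
A small simulated version of that ice-block example (the bias and spreads are invented, chosen only to mirror the numbers in the post above):

```python
import random

# Two simulated thermometers measuring a block of ice at exactly 0.0 C:
# the same small bias (accuracy), very different spread (precision).

random.seed(1)
TRUE_TEMP = 0.0
BIAS = -0.174      # both instruments read slightly cold on average
N = 100

def simulate(sigma):
    readings = [random.gauss(TRUE_TEMP + BIAS, sigma) for _ in range(N)]
    mean = sum(readings) / N
    spread = (sum((r - mean) ** 2 for r in readings) / (N - 1)) ** 0.5
    return mean, spread

for sigma in (1.0, 5.0):
    mean, spread = simulate(sigma)
    print(f"sigma {sigma} C: mean {mean:+.3f} C, sample spread {spread:.2f} C")
# Both share the same built-in bias (similar accuracy); the sigma-5 instrument
# simply scatters five times as much (far lower precision).
```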
 
The thermometer with the 5° C sigma would have a larger spread in its readings; its variations over repeat measurements would be larger than those of the thermometer with the 1° C sigma. The odds of both thermometers reading exactly the same for a single measurement would be low. If you were trying to find the average temperature of a static system, you would have to take many more readings with the less precise thermometer to get an accurate result. So yes, you could say they have the same accuracy for an average of temperatures, but not for single measurements.
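
Putting a number on "many more readings", under the usual assumption of independent random errors (standard error of the mean = sigma / sqrt(N)):

```python
import math

# A thermometer with 5x the spread needs 25x the readings to pin down the
# average temperature equally well, since the standard error falls as sqrt(N).

def readings_needed(sigma, target_se):
    """Smallest N for which sigma / sqrt(N) <= target_se."""
    return math.ceil((sigma / target_se) ** 2)

TARGET = 0.5   # want the average known to about +/- 0.5 C (one standard error)
print(readings_needed(1.0, TARGET))   # -> 4 readings for the 1 C-sigma instrument
print(readings_needed(5.0, TARGET))   # -> 100 readings for the 5 C-sigma instrument
```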
 
If two thermometers with different precisions both give you a reading of 75 F for a measurement, then both are equally accurate for that measurement, regardless of the spread among other measurements. If you're talking about mean absolute error or RMSE for a collection of measurements, then you would definitely want the thermometer with better precision. However, one instrument only takes one measurement at a time.
 
No. You aren't going to get much value out of a single point pressure reading while chasing or spotting.

I beg to differ. You can learn a lot about the atmosphere by driving around with an accurate and quick-to-update pressure sensor at your disposal. What's the pressure change due simply to elevation over this hill coming up? What was the pressure increase across this gust front boundary? How strong must that tornado have been if I recorded a 3 mb pressure drop when I was 500 yards away from it at its closest passage? Or, if you're in western Kansas, where surface-based observation stations are few and far between, you could use your sensor to estimate where the exact center of the low is.
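
One hedged sketch of how that could work on the move (invented numbers and a made-up helper, not a tested chase tool): strip the elevation component out of the logged trace so the meteorological jumps stand out from the hills.

```python
# Separate "I drove up a hill" from "a gust front just hit me": add back the
# hydrostatic pressure lost to elevation so what's left is meteorology.
# Altitudes would come from GPS or a map; all values here are invented.

HPA_PER_M = 0.12   # ~1.225 kg/m^3 * 9.81 m/s^2 near sea level, in hPa per metre

def elevation_corrected(samples, ref_alt_m):
    """Adjust each (pressure_hpa, altitude_m) sample to a common reference altitude."""
    return [p + HPA_PER_M * (alt - ref_alt_m) for p, alt in samples]

# (pressure from the phone, altitude from GPS/map), logged while driving
log = [(986.1, 190.0), (985.0, 199.0), (984.1, 207.0), (986.3, 206.0)]
print([round(p, 1) for p in elevation_corrected(log, ref_alt_m=190.0)])
# -> the raw trace falls ~2 hPa going up the hill, but the corrected trace is
#    flat until the last sample, where a genuine ~2 hPa jump (the boundary)
#    stands out.
```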
 