Meaning of Measurement
In the last few years I have had to make quite a lot of accurate temperature measurements - I am now even in the business of calibrating thermometers myself - and I have realized that it is by no means easy.
An accuracy better than 0.1 °C (i.e. 0.1 K) requires quality, calibrated instruments - and the only true reference for this is the old mercury thermometer. Electronic sensors must be used with due care because of their intrinsic errors and the further distortion introduced by signal processing - it is not unusual for digital thermocouple thermometers to be off by a few degrees.
I'm fairly sure that the ones used in meteorology perform better than that; however, these days even tenths of a degree seem to be politically important - so instrumental issues become prominent.
The implicit assumption, especially among supporters of the AGW hypothesis, is that temperature measurements are all sufficiently precise and accurate. But is that really so? Is it possible that at least part of the observed warming - which seems to have settled around 0.6 ± 0.2 °C, a 33% relative uncertainty - may be due to inaccurate instruments?
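As a back-of-the-envelope check of that figure (the 0.6 °C and ±0.2 °C values are the ones quoted above; the script itself is only an illustration):

```python
# Relative uncertainty of the quoted warming figure:
# central estimate 0.6 °C, error bar ±0.2 °C.
warming = 0.6        # central estimate, in °C
uncertainty = 0.2    # half-width of the error bar, in °C

relative = uncertainty / warming
print(f"relative uncertainty: {relative:.0%}")  # → 33%
```

One part in three is a large relative error for a quantity on which so much argument rests.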
Another aspect is inhomogeneity - in simpler words, the temperature of the atmosphere varies greatly from one region to another. For that matter, temperature varies appreciably even within the volume of a laboratory oven, and the arithmetic mean of the temperatures, which I then used as an input for my equation of state, was a barely satisfactory compromise. Again I wonder: does it make sense to condense something as huge and complex as the atmosphere into a two-digit number and a linear temperature trend?
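To make the point concrete, here is a minimal sketch with invented readings (the numbers are purely illustrative, not real station data): two very different temperature fields can collapse into exactly the same average.

```python
# Two hypothetical sets of readings (invented numbers, purely
# illustrative): very different spatial spreads, identical mean.
spread_out = [30.0, 28.0, 25.0, 5.0, -8.0]   # wide range of temperatures, °C
uniform    = [16.0, 16.0, 16.0, 16.0, 16.0]  # perfectly homogeneous field, °C

mean_spread = sum(spread_out) / len(spread_out)
mean_uniform = sum(uniform) / len(uniform)

print(mean_spread, mean_uniform)  # → 16.0 16.0 - the single number hides the spread
```

The mean alone cannot tell these two situations apart, which is exactly the compromise I faced with the oven.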
Yes, the average temperature can be seen as an index of total energy content, and maybe in that regard it is meaningful, but one must not think that any given day is now roughly 0.6 °C hotter than it was a couple of centuries ago. That would be a very wrong way to take it.