Hello All,
I am trying to estimate land surface temperature (LST) from Landsat 5 data. The study area is on the US East Coast, and it conveniently includes a temperature station with hourly data on the day of the satellite overpass. I've converted the thermal band DNs to radiance and inverted the Planck function to get at-satellite brightness temperature; this is all pretty standard stuff.
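For reference, here's roughly what that part of the calculation looks like (a Python sketch; K1/K2 are the published Landsat 5 TM band 6 constants, but the DN and gain/bias numbers below are just placeholders, since the real ones come from the scene metadata):

```python
import numpy as np

# Landsat 5 TM band 6 thermal calibration constants
K1 = 607.76   # W / (m^2 sr um)
K2 = 1260.56  # Kelvin

def dn_to_radiance(dn, grescale, brescale):
    """Band 6 DN to at-sensor spectral radiance (gain/bias form)."""
    return grescale * dn + brescale

def brightness_temperature(radiance):
    """Invert the Planck function: T = K2 / ln(K1 / L + 1), in Kelvin."""
    return K2 / np.log(K1 / radiance + 1.0)

# Placeholder values: the actual gain/bias come from the scene header/MTL
dn = np.array([130.0])                         # band 6 DN at the station pixel
L = dn_to_radiance(dn, grescale=0.0552, brescale=1.2378)
Tb_k = brightness_temperature(L)               # at-satellite brightness temp (K)
Tb_f = (Tb_k - 273.15) * 9.0 / 5.0 + 32.0      # degrees F, to compare with the station
```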
Here's my problem: the uncorrected at-satellite brightness temperature at the station's location is about 7 degrees F higher than what the station actually measured. My general understanding is that the East Coast is rather muggy, so I would expect atmospheric correction to raise LST relative to the at-satellite brightness temperature, since there's probably a fair bit of atmospheric moisture absorbing radiance between the land surface and the satellite. The fact that the uncorrected at-satellite temperature is already higher than the measured temperature seems problematic.
Indeed, when I use the Atmospheric Correction Parameter Calculator (http://atmcorr.gsfc.nasa.gov/) with a standard radiative transfer equation, my land surface radiance increases relative to the uncorrected at-satellite radiance, and the corrected temperature comes out nearly 12 degrees F above what the station measured.
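For what it's worth, the correction itself is just the usual single-channel radiative transfer inversion using the calculator's transmission and up/downwelling radiances (the numbers below are placeholders; the emissivity is my own assumption for the land cover at the station):

```python
import math

K1, K2 = 607.76, 1260.56  # Landsat 5 TM band 6 constants, as above

def surface_radiance(L_toa, tau, L_up, L_down, emissivity):
    """Invert the single-channel RTE:
    L_toa = tau * (eps * L_surf + (1 - eps) * L_down) + L_up
    and solve for the surface-leaving radiance L_surf."""
    return (L_toa - L_up - tau * (1.0 - emissivity) * L_down) / (tau * emissivity)

# Placeholder inputs: tau, L_up and L_down come straight from the atmcorr
# calculator output; emissivity is assumed for the station's land cover.
L_surf = surface_radiance(L_toa=8.41, tau=0.65, L_up=2.4, L_down=3.8, emissivity=0.97)
T_surf_k = K2 / math.log(K1 / L_surf + 1.0)         # corrected surface temperature (K)
T_surf_f = (T_surf_k - 273.15) * 9.0 / 5.0 + 32.0   # degrees F, to compare with the station
```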
I've checked and rechecked my radiance and Planck-inversion calculations multiple times, and I'm confident those equations are fine. I've tried calculating radiance with both the gain/bias (Grescale/Brescale) and LMIN/LMAX equations, and I've even gone back to previous calibrations (e.g., Chander & Markham, 2003), but each of these approaches changes the temperature only slightly.
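Concretely, these are the two radiance formulas I compared (the LMIN/LMAX and Qcal range values are placeholders here; I take the real ones from the header and the published calibration tables):

```python
def radiance_gain_bias(qcal, grescale, brescale):
    # L = Grescale * Qcal + Brescale
    return grescale * qcal + brescale

def radiance_lmin_lmax(qcal, lmin, lmax, qcalmin=1.0, qcalmax=255.0):
    # L = ((LMAX - LMIN) / (QCALMAX - QCALMIN)) * (Qcal - QCALMIN) + LMIN
    return (lmax - lmin) / (qcalmax - qcalmin) * (qcal - qcalmin) + lmin

# When the gain/bias are derived from the same LMIN/LMAX pair, the two forms
# are algebraically identical, which is consistent with switching between them
# (or between calibration vintages) barely moving the temperature.
lmin, lmax = 1.238, 15.303                            # placeholder band 6 values
grescale = (lmax - lmin) / 254.0
brescale = lmin - grescale * 1.0
print(radiance_gain_bias(130, grescale, brescale))    # ~8.38
print(radiance_lmin_lmax(130, lmin, lmax))            # identical value
```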
Any ideas? I'm pretty much at a loss here. I would really appreciate any advice/suggestions.
Thanks,
Anthony
ETA: Oh, I was wondering if this could be a time issue. My understanding is that Landsat image acquisition times are listed in GMT. Is that correct?
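In case it matters, this is how I've been lining up the overpass time with the hourly station record (the scene-center time here is made up; the real one is in the metadata):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical scene-center time; the real one comes from the scene metadata
scene_center_utc = datetime(2010, 7, 15, 15, 30, tzinfo=ZoneInfo("UTC"))

# Convert to local Eastern time (zoneinfo handles daylight saving)
scene_center_local = scene_center_utc.astimezone(ZoneInfo("America/New_York"))
print(scene_center_local)   # 2010-07-15 11:30:00-04:00
```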