Submitted by AspGuy25 t3_10k4ov8 in askscience
aspheric_cow t1_j5ostme wrote
Infrared cameras measure temperature by measuring the amount of infrared radiation emitted by a surface. But metal surfaces inherently emit less (they have low emissivity), so the infrared camera reads low. Try putting a contact temperature sensor (RTD, thermistor, or thermocouple) on the metal and it should read 105C. Or put a dab of paint or a piece of tape on the metal and aim the IR camera at that.
shlepky t1_j5p2k9q wrote
Infrared thermometers usually have to be calibrated. Out of the box, they measure radiation as if the surface has an emissivity of 1 (a black body, meaning all of the body's heat is radiated out). If you know the actual temperature, you iteratively adjust the emissivity setting until the reading matches. When you measure a different surface, you have to repeat the same process though. Cc: /u/AspGuy25
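The effect the two comments above describe can be sketched with the Stefan-Boltzmann law. This is a simplified grey-body model: the function names and the example emissivity of 0.1 are illustrative, and real cameras also pick up reflected ambient radiation, so actual readings are less extreme than this.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def apparent_temp_k(actual_k: float, emissivity: float) -> float:
    """Temperature a camera fixed at emissivity 1 would report for a
    grey body at actual_k kelvin (ignoring reflected ambient radiation)."""
    radiance = emissivity * SIGMA * actual_k ** 4  # what the camera receives
    return (radiance / SIGMA) ** 0.25              # inverted assuming e = 1

def emissivity_from_known_temp(actual_k: float, reading_k: float) -> float:
    """Emissivity setting that makes the camera reading match a
    temperature known from a contact sensor (the calibration step)."""
    return (reading_k / actual_k) ** 4

# Bare metal at 105 C (378.15 K) with an assumed emissivity around 0.1
# reads far colder than 105 C when the camera assumes emissivity 1:
reading = apparent_temp_k(378.15, 0.10)
print(round(reading - 273.15, 1))  # well below 105 C
```

This is also why the paint or tape trick works: both have emissivity close to 1, so the camera's black-body assumption is nearly correct for them.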
LitLitten t1_j5pufhq wrote
Iirc this is also why they can’t really “see” through glass: glass blocks the long-wave infrared range these cameras typically detect. It’s one reason cars get ungodly hot. A bunch of sunlight is absorbed inside, re-emitted as infrared, but that infrared is unable to pass back out through the windows.