Relative humidity and dew point are often mistaken for the same thing, and the two terms are a frequent source of confusion. But what are the differences between them, and which is more relevant in everyday life?

Relative humidity and dew point both give us an idea of how much moisture is in the atmosphere; however, only dew point is a true measurement of atmospheric moisture.

Relative humidity is defined by the National Weather Service as the amount of atmospheric moisture present relative to the amount that would be present if the air were saturated. But what does that really mean? In short, relative humidity is a function of both the moisture content of the atmosphere and the actual air temperature. By itself, it does not directly indicate how much moisture is present in the atmosphere.

Relative humidity is calculated using meteorological variables such as vapor pressure, saturation vapor pressure, temperature and dew point. Without explaining the equation variable by variable, we can simply state that the relative humidity is determined by how close the air temperature is to the dew point temperature. The closer these values are, the higher the humidity. The further apart these values are, the lower the humidity.
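
To make that relationship concrete, here is a minimal Python sketch. It assumes the common Magnus/Bolton approximation for saturation vapor pressure (the constants 6.112, 17.67 and 243.5 come from that approximation, not from any formula quoted by the National Weather Service) and estimates relative humidity from the air temperature and the dew point.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure in hPa (Magnus/Bolton approximation)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_c: float, dew_point_c: float) -> float:
    """Relative humidity (%): actual vapor pressure over saturation vapor pressure."""
    actual = saturation_vapor_pressure(dew_point_c)   # vapor pressure the air actually holds
    saturation = saturation_vapor_pressure(temp_c)    # vapor pressure if the air were saturated
    return 100.0 * actual / saturation

# The closer the dew point is to the air temperature, the higher the humidity.
print(round(relative_humidity(30.0, 15.0)))  # warm air, low dew point: about 40%
print(round(relative_humidity(30.0, 28.0)))  # warm air, dew point nearly as high: about 89%
```

If the dew point equals the air temperature, the function returns 100 percent, which is saturation.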

Dew point is the term most meteorologists use to describe the amount of moisture in the air. The National Weather Service defines dew point as the temperature to which air must be cooled in order to reach saturation, assuming air pressure and moisture content are constant. In short, the dew point is an accurate measurement of the moisture content of the air. When describing a day that feels "muggy" or "sticky," the dew point temperature is the more accurate term to use.
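
For readers who want to see that definition in action, here is a companion sketch built on the same Magnus/Bolton approximation used above (again an approximation, not an official National Weather Service formula). It runs the calculation in the other direction: given an air temperature and a relative humidity, it estimates the temperature the air would need to be cooled to before it saturates, which is the dew point.

```python
import math

def dew_point(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (degrees C) from air temperature and relative humidity (%)."""
    gamma = math.log(rh_percent / 100.0) + 17.67 * temp_c / (243.5 + temp_c)
    return 243.5 * gamma / (17.67 - gamma)

# Two days with the same 90 F (about 32 C) air temperature can feel very different;
# the day with the higher dew point is the "muggy" one.
print(round(dew_point(32.0, 40.0)))  # drier air: dew point near 17 C (low 60s F)
print(round(dew_point(32.0, 70.0)))  # moister air: dew point near 26 C (upper 70s F)
```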

Graphic by Brett Collar