Relative humidity is the ratio of the partial pressure of water vapor to the saturation pressure of water vapor at the current temperature and pressure. If the temperature or pressure changes, the relative humidity changes as well.
You are correct that higher temperatures allow the atmosphere to hold more water. That means that the saturation pressure of water vapor has increased while the current vapor pressure has remained the same, causing the relative humidity to drop.
We think of humidity as how hot and sticky it feels outside. The closer the water vapor pressure is to its saturation point, the more hot and sticky we feel. We associate humidity with heat because that is when we are uncomfortable, but rain occurs when the relative humidity rises to 100%: the humid air cools until the saturation pressure dips below the current vapor pressure (or other pressure changes do the same, or a combination of both).
I hope this helps.
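The ratio described above is easy to compute. The sketch below uses the Magnus approximation for saturation vapor pressure; the specific formula and its coefficients are an assumption added here, not something stated in the answer:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure in hPa via the Magnus approximation.
    The coefficients (6.112, 17.62, 243.12) are one common choice for
    liquid water; they are an assumption, not from the answer above."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(vapor_pressure_hpa, temp_c):
    """Relative humidity (%) = actual vapor pressure / saturation pressure."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# Same moisture content (12 hPa of water vapor) at two temperatures:
print(relative_humidity(12.0, 15.0))  # roughly 70% at 15 degrees C
print(relative_humidity(12.0, 30.0))  # roughly 28% at 30 degrees C: warmer air, lower RH
```

With the vapor pressure held fixed, warming the air raises the saturation pressure in the denominator, so the relative humidity drops, exactly as the answer says.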
Increase
That would be "dewpoint"...When the air temperature falls to the dewpoint (or dewpoint rises to the air temperature), then you have 100% relative humidity.
When temperature rises, the capacity of air to hold water vapor increases. Consequently, the relative humidity decreases: the amount of moisture present in the air remains the same, but the amount the air could hold has grown.
If the absolute humidity remains constant while the temperature rises, the relative humidity will decrease (and vice versa). This is because the air's capacity to hold water increases as the temperature increases, so the constant amount of water represents a smaller and smaller percentage of the maximum amount the air can hold. In other words: as air temperature goes up, the maximum amount of water vapor it can hold goes up. Thus if the water content stays constant, the humidity goes down; if the humidity stays constant, then the water vapor content goes up.
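The second half of that answer, that constant relative humidity at a higher temperature implies more actual water vapor, can be checked numerically. This sketch again assumes the Magnus approximation for the saturation curve:

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation (hPa); the coefficients are an assumed common choice.
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def vapor_pressure(rh_percent, temp_c):
    """Actual vapor pressure (hPa) implied by a given RH at a given temperature."""
    return rh_percent / 100.0 * saturation_vapor_pressure(temp_c)

# Hold relative humidity fixed at 50% while the temperature rises:
print(vapor_pressure(50.0, 10.0))  # about 6.1 hPa of water vapor at 10 degrees C
print(vapor_pressure(50.0, 25.0))  # about 15.8 hPa at 25 degrees C: same RH, more water
```

Same 50% relative humidity, roughly two and a half times the water vapor, which is why a "50% humidity" summer day feels far damper than a "50% humidity" winter day.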
Because land heats up and cools down very fast. At night, when the sun is gone, the land starts to cool down quickly, and the excess heat is sent back out as long-wave radiation.
When air is cooled, the relative humidity (RH) increases. The moisture content of the air remains the same until the RH rises to the point of 100% saturation and condensation occurs.
The relative humidity rises. A cold parcel of air has little capacity for water vapor, so the moisture it already carries sits much closer to the saturation point.
Cold air cannot hold as much water vapor as warm air. As temperature drops, relative humidity rises. Absolute humidity remains constant until the dewpoint temperature is reached, then decreases with temperature as water condenses out of the air. Below the dewpoint temperature, relative humidity remains constant at 100%.
Yes. Relative humidity is the amount of water vapor actually in the air compared to the amount that could be in the air (saturation point) at the existing temperature. So, if the temperature of the air changes and the amount of water vapor in it does not, the relative humidity will be different. But, if the temperature of the air changes and so does the amount of water vapor in it, then the relative humidity could be the same as before the temperature change. That is to say that the air could contain the same percentage of water vapor that it could hold at each temperature, even though the actual amounts are different.
"Because relative humidity is related with the temperature of the air. Relative humidity is the rate of water vapour to the maximum amount of water vapour can air hold at that temperature. The amount of water vapour that air can hold is increses as the temperature of the air increases. If the air holds same amount of water while the temperature is incresing, relative humidity of the air decreses because maximum amount of water that air can hold increases and the rate of humidity to tha maximum humidity decreses."

Someone had given this answer, and it is partially correct; however, the garbled English and grammar make it hard to understand. I think what they meant was that relative humidity is the amount of water vapor in the air, compared to what the air can "hold" at a given temperature. As temperature increases, so does the amount of water vapor or moisture the air can hold. So, after the sun rises and the temperature of the air increases, the amount of moisture the air can hold rises while the actual amount of water vapor in the air may stay the same, thus decreasing the relative humidity. The opposite happens at night.

Relative humidity = (actual vapor density / saturation density) x 100%
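That closing formula can be coded directly. The sketch below derives vapor density from vapor pressure with the ideal gas law; the Magnus coefficients and the specific gas constant for water vapor are assumptions added here, not values from the answer:

```python
import math

R_WATER_VAPOR = 461.5  # J/(kg*K), specific gas constant for water vapor (assumed)

def saturation_vapor_density(temp_c):
    """Saturation vapor density (g/m^3): Magnus approximation for the
    saturation pressure, then the ideal gas law to convert to density."""
    e_s_pa = 100.0 * 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    return 1000.0 * e_s_pa / (R_WATER_VAPOR * (temp_c + 273.15))

def relative_humidity(actual_density_g_m3, temp_c):
    # Relative humidity = (actual vapor density / saturation density) x 100%
    return 100.0 * actual_density_g_m3 / saturation_vapor_density(temp_c)

# Air carrying 10 g/m^3 of water vapor, before and after the sun warms it:
print(relative_humidity(10.0, 15.0))  # high RH on a cool morning
print(relative_humidity(10.0, 30.0))  # much lower RH in the afternoon heat
```

The saturation density at 20 degrees C comes out near the familiar 17 g/m^3 figure, which is a quick sanity check on the conversion.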
Relative humidity expresses the humidity in the air as a percentage of the maximum amount of humidity that could be in the air. For example, when the temperature rises the air is able to hold much more water vapor, so the relative humidity drops.
Relative humidity is the amount of water vapor in a sample of air, compared to the maximum that the air can hold at the given temperature, expressed as a percentage. Water can "dissolve" in air: as the temperature of the air goes up, the amount of water that can be held in the air increases.

We can measure the "absolute" humidity of air, but this isn't especially useful. It is more handy to know how much water is IN the air as a percentage of the amount of water the air COULD hold. That's "relative humidity". It is important because as the temperature rises, the air COULD hold more water, so the relative humidity falls; as the temperature falls, the relative humidity rises.

As the air cools to the point where it can't hold any more water than it has now, the relative humidity is 100%, and we call this temperature the "dew point", when dew will begin to settle onto the grass. If the air gets much colder, the water will condense out of the air and form FOG. In some cases, the air can hold more moisture than it ordinarily would, which is referred to as supersaturation; it is much more common at temperatures below the freezing point.
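The dew point described above can be computed by inverting the saturation curve. This sketch assumes the Magnus approximation; its coefficients are a common choice added here, not something from the answer:

```python
import math

# Magnus coefficients for liquid water (assumed, not from the answer above)
A, B, C = 6.112, 17.62, 243.12

def dew_point(temp_c, rh_percent):
    """Temperature (deg C) to which air must cool for RH to reach 100%.
    Derived by solving RH * e_s(T) = e_s(Td) for Td under the Magnus form."""
    gamma = math.log(rh_percent / 100.0) + B * temp_c / (C + temp_c)
    return C * gamma / (B - gamma)

# Saturated air is already at its dew point:
print(dew_point(20.0, 100.0))  # about 20 degrees C
# Drier air must cool further before dew or fog forms:
print(dew_point(20.0, 50.0))   # about 9.3 degrees C
```

Cooling below the first value gives dew; cooling well below it gives the fog the answer mentions.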