A clinical thermometer needs to be an accurate maximum thermometer so that it registers and retains the highest temperature the patient reaches during the measurement. This allows healthcare providers to monitor fever spikes accurately and make appropriate medical decisions based on the highest temperature recorded.
The constriction in a clinical thermometer keeps the mercury from falling back into the bulb when the thermometer is taken out of the patient's mouth, so the maximum reading is held until the thermometer is shaken down. A laboratory thermometer has no constriction because it must follow the current temperature continuously, rising and falling freely over a much wider range.
So the working fluid doesn't flow back into the bulb when the thermometer is removed from the heat source. This makes a clinical thermometer "sticky"; it retains the reading of the highest temperature experienced until it is "reset" by shaking.
The constriction prevents the mercury from returning to the bulb when the thermometer is removed from the patient's body.
You would likely have different ranges and accuracies for different thermometers. I'm using Fahrenheit here; many newer thermometers are marked in Celsius, including those used in hospitals and clinics and in most countries outside the USA.

A clinical thermometer might read from about 80°F to 110°F and be accurate to one or two tenths of a degree. They can be digital, mercury, or even disposable plastic, and they normally have some way to lock in the maximum temperature (like the old ones you had to shake down). Modern clinical thermometers either take a disposable plastic cover or are entirely disposable, and infrared ear (tympanic) thermometers are a newer type.

A household thermometer might read from -20°F to 120°F and may only be accurate to 1 or 2 degrees. If it is a glass thermometer, the scale is usually printed on the backing rather than on the glass itself.

A scientific thermometer might have a range up to the boiling point of water, is often marked in Celsius (-10°C to 110°C), and is accurate to about a degree Celsius (roughly 2 degrees Fahrenheit). A cooking thermometer might have a range of 100°F to 500°F; some cooking thermometers are glass or disposable, but many are metal for durability.
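To make the unit comparison concrete, here is a minimal Python sketch. The ranges are just the rough figures quoted above (treated as assumptions, not specifications), and the conversion is the standard Fahrenheit/Celsius formula.

```python
# Rough figures quoted above, treated here as assumptions, not specifications.
RANGES_F = {
    "clinical": (80.0, 110.0),     # about 80°F to 110°F
    "household": (-20.0, 120.0),
    "scientific": (14.0, 230.0),   # -10°C to 110°C converted to °F
    "cooking": (100.0, 500.0),
}

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def in_range(kind, reading_f):
    """True if a Fahrenheit reading falls inside the thermometer's quoted range."""
    low, high = RANGES_F[kind]
    return low <= reading_f <= high

print(f_to_c(98.6))                       # 37.0 °C, normal body temperature
print(c_to_f(110))                        # 230.0 °F, top of the scientific range
print(in_range("clinical", c_to_f(95)))   # False: ~203 °F is far beyond a clinical scale
```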
The thermometer retains the heat it absorbed in the mouth for a short while after removal, which keeps the reading high temporarily. The bulb and liquid also have some thermal inertia, so there is a slight delay before the reading reflects the true temperature change.
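As a rough illustration of that lag, here is a small Python sketch using Newton's law of cooling; the time constant and temperatures are invented for illustration and are not properties of any real thermometer.

```python
import math

def indicated_temperature(t_seconds, start=37.0, ambient=22.0, time_constant=60.0):
    """Newton's law of cooling: the reading decays exponentially from the
    temperature it reached in the mouth toward the surrounding temperature.
    The 60-second time constant is an assumed, illustrative value."""
    return ambient + (start - ambient) * math.exp(-t_seconds / time_constant)

for t in (0, 30, 60, 120, 300):
    print(f"{t:3d} s after removal: {indicated_temperature(t):.1f} °C")
# 0 s -> 37.0 °C, 60 s -> about 27.5 °C, 300 s -> about 22.1 °C: the drop is gradual, not instant.
```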
It stays at the maximum point, so you get an accurate reading after the thermometer has been removed.
No, the temperature of hot tea is substantially higher than the maximum that a clinical thermometer is designed for.
No, a clinical thermometer is not suitable for measuring the temperature of hot tea as it is designed for measuring human body temperature. The high temperature of the hot tea could damage the clinical thermometer or give inaccurate readings. It is better to use a food thermometer designed for measuring the temperature of liquids.
A clinical thermometer typically shows a maximum reading, while a laboratory thermometer typically shows the temperature right now. When you take a thermometer out of a patient's mouth (or wherever else you are measuring the temperature), you usually want the maximum temperature to keep showing until you reset the instrument. When using a laboratory thermometer, you want the instrument to react as quickly as possible so that changes (up and down) can be noticed and recorded.
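A minimal Python sketch of that behavioural difference (the temperature sequence is invented for illustration): the clinical thermometer acts like a running maximum that holds until reset, while the laboratory thermometer simply follows the latest value.

```python
class ClinicalThermometer:
    """Holds the highest value seen until reset (like shaking the mercury back down)."""
    def __init__(self):
        self.reading = None

    def measure(self, temp):
        if self.reading is None or temp > self.reading:
            self.reading = temp
        return self.reading

    def reset(self):
        self.reading = None


class LabThermometer:
    """Always shows the current value, rising and falling with it."""
    def measure(self, temp):
        self.reading = temp
        return self.reading


samples = [36.5, 37.8, 38.4, 37.9, 36.9]   # invented sequence of readings in °C
clinical, lab = ClinicalThermometer(), LabThermometer()
for t in samples:
    print(f"actual {t:.1f}  clinical shows {clinical.measure(t):.1f}  lab shows {lab.measure(t):.1f}")
# The clinical column stops at 38.4 and stays there; the lab column tracks every change.
```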
The kink in a mercury or alcohol clinical thermometer helps to prevent the mercury or alcohol from flowing back into the bulb once the thermometer is removed from a patient's body. This ensures that the maximum temperature reached during measurement is retained for reading.
James Six invented the maximum and minimum thermometer in 1782.
What is the principle of Six's minimum and maximum thermometer?