.000345 is just simple math: to the right of the decimal point the places are tenths, hundredths, thousandths, and so on, so you can read the digits straight off. But how did you determine the .000345 in the first place? In other words, how do you read the caliper? How did you get those numbers?
A Vernier caliper gives two readings: the main scale and the Vernier scale. First read the main scale at the Vernier's zero mark, then add the Vernier value at the point where a line on the Vernier scale lines up with a line on the main scale.
The same as if you were not standing on your head.
Apply a known load to the scale and measure its change in length. According to Hooke's law, as long as the spring isn't overstressed, the force it provides is proportional to the extension. As long as you don't overstress the spring, this means that if a 1 Newton carrot extends the spring 1 centimeter, then a 2 Newton apple will extend the spring 2 cm. You will need to mark numerous readings on the frame of the spring scale, such that you can measure various values of force
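The calibration idea above can be sketched in a few lines. This is just an illustration of Hooke's law using the carrot/apple numbers from the answer; the function name and spring constant are made up for the example.

```python
def spring_force(extension_cm, k_n_per_cm=1.0):
    """Hooke's law: within the elastic limit, force is proportional to extension.

    k_n_per_cm is the spring constant found during calibration
    (here 1 N/cm, because a known 1 N load stretched the spring 1 cm).
    """
    return k_n_per_cm * extension_cm

# An apple that stretches the same spring 2 cm therefore weighs about 2 N.
print(spring_force(2.0))
```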
A micrometer is one millionth of a meter, a kilometer is 1000 meters. So you can see there are no kilometers in a micrometer, but there are one billion micrometers in a kilometer
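The arithmetic behind "one billion micrometers in a kilometer" is just two conversion factors multiplied together:

```python
MICROMETERS_PER_METER = 1_000_000  # a micrometer is one millionth of a meter
METERS_PER_KILOMETER = 1_000       # a kilometer is 1000 meters

micrometers_per_kilometer = MICROMETERS_PER_METER * METERS_PER_KILOMETER
print(micrometers_per_kilometer)  # 1000000000, i.e. one billion
```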
It depends on how accurately you can read the scale. A ruler is only accurate to about 1 mm. If the caliper has a vernier scale, it should be ten times better, about 0.1 mm.
http://wiki.answers.com/Q/How_do_you_read_a_micrometer_and_caliber
The smallest division on the main scale of a micrometer gauge typically corresponds to 0.5 mm or 0.025 inches. This is the precision at which the main scale can be read.
To read a measurement on a micrometer, observe the main scale and the Vernier scale. The main scale represents the whole millimeters, while the Vernier scale indicates the fraction of a millimeter. The measurement is obtained by combining the values shown on both scales where they align.
Difficult to explain without diagrams, but the micrometer relies on an accurate screw which advances the caliper a precise amount with each revolution. So you turn the screw until the object is lightly held, then read the axial scale and add on for the number of screw turns above the nearest scale reading. The most accurate type also have a vernier scale for very small distances. I suggest you look at Wikipedia 'Micrometer' which has a thorough explanation with diagrams.
To read a micrometer, first ensure the instrument is calibrated and set to zero. Place the object between the anvil and spindle, then rotate the thimble until it gently contacts the object. Read the measurement by combining the main scale reading (on the sleeve) with the thimble scale reading (where the thimble aligns with the main scale). The main scale typically measures whole millimeters or inches, while the thimble scale provides fractional measurements, usually in hundredths of a millimeter or thousandths of an inch.
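Combining the sleeve and thimble readings described above can be sketched as follows, assuming a typical metric micrometer with 0.5 mm sleeve divisions and a thimble carrying 50 divisions of 0.01 mm each (the function name and example numbers are for illustration only):

```python
def micrometer_reading_mm(sleeve_half_mm_marks, thimble_division):
    """Combine the two scales of a typical metric micrometer.

    sleeve_half_mm_marks: number of 0.5 mm marks exposed on the sleeve
    thimble_division: the thimble mark (0-49) aligned with the sleeve's
        reference line, each worth 0.01 mm
    """
    return sleeve_half_mm_marks * 0.5 + thimble_division * 0.01

# Example: 11 half-mm marks exposed (5.5 mm) plus thimble at 28 (0.28 mm)
print(round(micrometer_reading_mm(11, 28), 2))  # 5.78
```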
It is the scale on a micrometer.
A digital micrometer is the easiest to read as it displays the exact reading on a screen.
least count of a micrometer = pitch / number of divisions on the circular scale
The principle of a micrometer is based on the rotation of a screw to precisely measure small distances. The screw moves a spindle, which is connected to a scale that indicates the measurement. By calibrating the micrometer scale, accurate readings can be obtained.
The smallest part of a centimeter that can typically be read with a micrometer caliper is 0.001 centimeters, or 10 micrometers (µm). This is because most micrometers have a minimum scale division of 0.01 mm, which corresponds to 0.001 centimeters. However, skilled users may estimate between divisions to even finer resolution, depending on the micrometer's design and their experience.
An imperial micrometer can measure to within 0.001 in (a thousandth of an inch).