Best Answer

The Treaty of the Meter was an agreement signed by 20 countries in 1875 that established the International Bureau of Weights and Measures in France to provide standards of measurement for use throughout the world.

The idea of defining a unit of length in terms of the wavelength of light had been floated early in the 19th century (J. Babinet, 1827), before there was any way of realizing the idea in practice. By the end of the century this was no longer so. "White" light is a mixture of light with different wavelengths; to define a unit of length in terms of wavelength, one needs light that is all of the same wavelength. Light consisting of only one wavelength (any wavelength, provided it is visible) appears colored to a human, and is called monochromatic. Fortunately, monochromatic light is not hard to produce: sprinkle some salt on the gas flames of a kitchen range. When the sodium atoms in the salt are excited, they give off a yellow light that is very nearly all of the same wavelength, the same yellow as the light from sodium vapor street lamps. The wavelength is characteristic of the sodium atom.

In 1892-3, A. A. Michelson and J. R. Benoit succeeded in measuring the meter in terms of the wavelength of red light given off by excited cadmium atoms. Benoit and others refined the measurement in 1905-7, and in 1907 the International Solar Union (now the IAU) defined the international angstrom, a unit of distance to be used in measuring wavelengths, by making 6438.4696 international angstroms equal to the wavelength of the red line of cadmium. This value was taken from Benoit's experiments, and was chosen so that one angstrom was approximately 10^-10 meter. (In 1927, the 7th CGPM provisionally sanctioned measuring distances in terms of the red line of cadmium, taking its wavelength to be 0.643 846 96 micrometers.)

Meanwhile, much had been learned since 1892. Even in the best of spectroscopes, the red line of cadmium was somewhat fuzzy.
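The arithmetic linking the 1907 angstrom definition to the 1927 CGPM value can be checked directly: fixing the cadmium red line at 6438.4696 international angstroms, with one angstrom equal to 10^-10 meter, reproduces the wavelength the 7th CGPM sanctioned. A minimal sketch in Python (the numbers come from the answer above; the variable names are illustrative):

```python
# 1907 International Solar Union definition: the red line of cadmium
# is exactly 6438.4696 international angstroms.
CADMIUM_RED_LINE_ANGSTROMS = 6438.4696
ANGSTROM_IN_METERS = 1e-10  # made exact by the 10th CGPM (1954)

wavelength_m = CADMIUM_RED_LINE_ANGSTROMS * ANGSTROM_IN_METERS
wavelength_um = wavelength_m * 1e6  # convert meters to micrometers

# Matches the value sanctioned by the 7th CGPM in 1927:
# 0.643 846 96 micrometers.
print(f"{wavelength_um:.8f} um")  # 0.64384696 um
```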
In fact, it turned out to be composed of many lines (physicists refer to its "hyperfine structure"), which limited how precisely the light's wavelength could be determined. When the existence of isotopes was discovered, it became clear that part of the reason for the fuzziness was that the light was not coming from a single kind of atom, but from a mixture of isotopes: cadmium atoms with the same number of protons but different numbers of neutrons. Investigating light from pure isotopes, physicists found that if an atom had an even number of protons, and the sum of its protons and neutrons was also even, the light from it had no hyperfine structure. (Such atoms have no nuclear spin, hence no coupling of nuclear spin to electron spins, and the light comes from the electrons.)

The 9th CGPM (1948) acknowledged that the meter might eventually be defined in terms of light from such an isotope. Three isotopes were intensively investigated to see which would be most suitable as the basis for a standard of length: krypton-86 (36 protons), mercury-198 (80 protons), and cadmium-114 (48 protons). The committee in charge of following these developments recommended that any new definition be stated in terms of the wavelength in a vacuum rather than in air, and that the length of the wavelength be specified by comparison with the already determined wavelength of the red line of cadmium, not with the International Prototype of the Meter. The 10th CGPM (1954) accepted these recommendations, in effect making the angstrom exactly equal to 10^-10 meter and defining the meter in terms of light, although this was not formally acknowledged until 1960.
The advisory committee declared krypton-86 the winner in 1957, and in 1960 the 11th CGPM (Resolution 6), noting that "the International Prototype does not define the meter with an accuracy adequate for the present needs of metrology," redefined the meter as "the length equal to 1 650 763.73 wavelengths in vacuum of the radiation corresponding to the transition between the levels 2p10 and 5d5 of the krypton 86 atom."

Defined this way, it proved impossible to realize the meter with an accuracy better than 4 parts in 10^9, and eventually that was not accurate enough. In the meantime, however, the laser had been invented, and the light it produced (not only all one wavelength, but all in phase) opened up new possibilities for metrology. In 1983 the 17th CGPM (Resolution 1) redefined the meter in terms of the speed of light in a vacuum. The value for the speed of light, 299,792,458 meters per second, had already been recommended in 1975 by the 15th CGPM (Resolution 2); it was chosen to fall within the limits of uncertainty of the best existing measurements. Thus the second, rejected as too arbitrary in 1791, has become the basis of the meter. We have probably not seen the last redefinition of the meter; the current definition may need tuning if even more accuracy becomes necessary. For example, the speed of light is affected by the strength of the gravitational field, and the 1983 definition does not take such factors into account.

The Metric Act of 1866 was significant because it recognized the metric system as a legal system of measurement in the United States; in effect, the United States declared the metric system reliable enough for use in the U.S. A little extra information: it is sometimes referred to as the Kasson Act, after Congressman John A. Kasson of Iowa, who chaired the House Committee on Coinage, Weights, and Measures.
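Both later definitions can also be verified with a line of arithmetic: the 1960 krypton-86 definition fixes the wavelength implicitly at 1/1 650 763.73 meter, and the 1983 definition makes the meter the distance light travels in vacuum in 1/299 792 458 of a second. A small Python sketch using the numbers from the resolutions quoted above (variable names are illustrative):

```python
# 1960 (11th CGPM): 1 meter = 1 650 763.73 wavelengths of the
# krypton-86 2p10 -> 5d5 line, so the wavelength is fixed implicitly.
KRYPTON_WAVELENGTHS_PER_METER = 1_650_763.73
krypton_wavelength_m = 1 / KRYPTON_WAVELENGTHS_PER_METER
print(f"krypton-86 line: {krypton_wavelength_m * 1e9:.3f} nm")  # 605.780 nm

# 1983 (17th CGPM): the meter is the distance light travels in vacuum
# in 1/299 792 458 of a second, making the speed of light exact by definition.
SPEED_OF_LIGHT_M_PER_S = 299_792_458
light_travel_time_s = 1 / SPEED_OF_LIGHT_M_PER_S
print(f"light crosses one meter in {light_travel_time_s * 1e9:.4f} ns")  # 3.3356 ns
```

Note the inversion: before 1983 the meter was defined and the speed of light measured; afterward, the speed of light is exact and the meter is derived from it and the second.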

Wiki User · 15y ago
More answers

Wiki User · 13y ago

Early measurements of length were based on parts of the human body: the length of a foot, the distance of a pace, a hand span, and the width of a finger or thumb. Because these lengths differ from one individual to another, at first there were no standard length measurements. Around 3000 BC the Egyptians introduced their royal cubit, based on the length of the Pharaoh's forearm from his elbow to the tip of his extended middle finger. A master standard was cut from a granite rod, against which others could make their own measuring rods. Smaller lengths required subdivisions of the royal cubit. The hand, having a width of about 4 inches, is still used in measuring the height of horses.

Weights were based on quantities of familiar objects such as seeds and beans; the grain is still used as a unit for small amounts of a substance. The stone, equal to 14 pounds, also remains in use in some places.


Q: What is the earliest form of measurement?