It is called the magnitude scale, and it is a logarithmic scale.
Magnitude. "First magnitude" describes many bright stars, and a span of five magnitudes represents a difference of a hundred times in a star's brightness. The dimmest stars seen by a perfect human eye in perfect conditions are 6th magnitude.
Magnitude
In order to compare the brightness of two stars, a measure of their brightness was needed. Determining what is termed a star's 'magnitude' cleared this hurdle. The term 'magnitude' describes the brightness of a star as viewed from the Earth.

Early astronomical findings
The origin of this concept can be traced back to Ptolemy, the astronomer of antiquity from Alexandria. In the course of his studies of the stars he divided all the visible ones into six categories on the basis of their brightness: the first magnitude contained the brightest stars, and the second, third, fourth, fifth and sixth contained stars of progressively diminishing brightness. Those of the sixth magnitude were barely visible to the human eye. Later astronomers improved upon Ptolemy's classification as better tools for studying the mysteries of space became accessible to them. Astronomers in the 17th century used the telescope to classify stars that were hitherto unknown, owing to their not being visible to the naked eye. Then, as the need to improve the system was felt, a standard system of magnitudes was adopted in the 19th century.
According to this system, a star is 2.512 times brighter than a star of the next magnitude. There is some mathematics involved here: 2.512 is the fifth root of 100, so looked at the other way round, a first-magnitude star is 100 times brighter than a sixth-magnitude star.

About twenty stars are brighter than magnitude 1.5; these are the brightest stars visible from the Earth. The higher the magnitude attributed to a star, the dimmer it appears when seen from the Earth. Until the nineteenth century, this ranking was the only way to describe a star's brightness. Astronomers now have high-precision instruments that help them measure even minute differences between the magnitudes of different stars, because they are able to measure the actual amount of light from a star that reaches the Earth.

Magnitudes - absolute, apparent and visual
Absolute magnitude refers to the magnitude a star would have if viewed from a standard distance of 10 parsecs (about 32.6 light-years). Unless otherwise specified, the given magnitude of a star is its apparent magnitude. When a star is studied with the help of a telescope, its light is often recorded on photographic film. Photographic film is more sensitive to blue light, whereas the human eye is more sensitive to yellow light, so the perceived brightness differs. Hence a separate classification, visual magnitude, was found necessary to indicate this difference.

Some stars and their magnitudes
The Sun has a magnitude of -26.7 and appears roughly ten billion times as bright as Sirius.
Sirius, the brightest star outside the solar system, has a magnitude of about -1.5.
Alpha Centauri, the third-brightest star, has a magnitude of about -0.3.
Arcturus, the fourth-brightest star in the sky, has a visual magnitude of about -0.05.
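The arithmetic above can be sketched in a few lines of Python; `brightness_ratio` is a hypothetical helper name used here for illustration, not a standard library function:

```python
def brightness_ratio(m1, m2):
    """Ratio of the brightness of star 1 to star 2, given their magnitudes.
    A lower magnitude means a brighter star; one step is a factor of
    100 ** (1/5), i.e. about 2.512."""
    return 100 ** ((m2 - m1) / 5)

# One magnitude step is a factor of about 2.512:
print(round(brightness_ratio(1, 2), 3))  # -> 2.512
# Five magnitudes is exactly a factor of 100:
print(brightness_ratio(1, 6))            # -> 100.0
```

Plugging in the Sun (-26.7) and Sirius (about -1.5) gives a ratio on the order of ten billion, which is why the Sun so completely dominates the daytime sky.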
The brightness as seen from Earth is called the "apparent magnitude". The real brightness (defined as the apparent brightness as seen from a standard distance) is called the "absolute magnitude".
The measure of a star's brightness is its magnitude. A star's brightness as it appears from Earth is called its apparent magnitude.
The brightness of a star is called its magnitude.
A nova is a star that suddenly increases in brightness.
Absolute brightness.
The brightness of a star to an observer on Earth is called its apparent magnitude. The intrinsic brightness of a star is known as its absolute magnitude.
Its distance from Earth and the star's actual brightness.
Two factors that affect a star's apparent brightness are:
1.) The distance between the Earth and the star
2.) The absolute magnitude (the actual brightness) of the star
Hope that helps :P
That refers to its actual brightness, not to how we see it. The apparent brightness depends on the real ("absolute") brightness, but also on the distance.
Apparent magnitude.