Two stars with different luminosities may appear equally bright to an observer because the intrinsically brighter one may be more distant.
This illustrates the need in astronomy for ways to gauge the distances of stars, since apparent magnitude alone does not yield enough information. A "standard candle", an object of known brightness, can be used for comparison; standard candles can be established through various means, including statistical models, observation of variable stars, the behavior of nearby supernovae, etc. Once the distance of a star is known, its absolute magnitude can be derived from its apparent magnitude using the inverse-square law.
If a star is nine times as luminous but three times as far away as another, the two stars will have the same apparent magnitude. Brightness, like gravity, decreases as the square of the distance.
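The nine-times-as-bright, three-times-as-far example can be checked with a short sketch of the inverse-square law (a minimal illustration; the unit luminosity and distance are arbitrary):

```python
import math

def apparent_flux(luminosity, distance):
    """Flux received by an observer, by the inverse-square law."""
    return luminosity / (4 * math.pi * distance**2)

# A star 9x as luminous but 3x as far away delivers the same flux,
# because the extra distance dims it by a factor of 3^2 = 9.
near = apparent_flux(1.0, 1.0)
far = apparent_flux(9.0, 3.0)
print(math.isclose(near, far))  # True
```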
There are many factors that explain why one star may appear brighter than another. One star may simply be younger.
Another Answer
The brighter star may be farther away or obscured by space-borne dust.
That's because the less luminous star may be a lot closer to the Earth.
This is usually because they are at different distances. The brightness of light decreases as the square of the distance between source and observer.
The apparent magnitude also depends on the distance.
"First magnitude" usually means the brightest 21 stars, as seen from Earth. Another definition is stars with apparent magnitudes 0.5 to 1.5. This definition excludes the very brightest stars, like Sirius. They are the first stars that become visible after sunset and they all have names. Examples are Altair, Aldebaran, Capella, Spica, Antares, Fomalhaut, Deneb, Regulus, Sirius, etc. There can be confusion because First Magnitude stars are not stars with an "apparent magnitude" of exactly "one". They are just the brightest stars, but naturally their magnitudes are close to one.
The apparent magnitude is what we see, and this can be measured directly. The absolute magnitude must be calculated, mainly on the basis of (1) the apparent magnitude, and (2) the star's distance. So, to calculate the absolute magnitude, you must first know the star's distance.
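The calculation described above is the standard distance-modulus relation, M = m - 5·log10(d) + 5 with d in parsecs. A short sketch (Sirius's values are approximate, used here only as a worked example):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance-modulus relation: M = m - 5*log10(d) + 5, d in parsecs."""
    return apparent_mag - 5 * math.log10(distance_pc) + 5

# Sirius: apparent magnitude about -1.46 at roughly 2.64 pc
print(round(absolute_magnitude(-1.46, 2.64), 2))  # 1.43
```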
Magnitudes of stars can be negative, and the brightest star as seen from Earth is of course the Sun, with an apparent magnitude of -26.74 (note the negative), whereas Polaris (the North Star) has an apparent magnitude of +1.97. See the related question for the difference between apparent and absolute magnitude.
It is the absolute magnitude, as opposed to the apparent magnitude, which is how much light a star appears to give off.
The brightest stars were traditionally magnitude 1; the weakest that could still be seen with the naked eye, 6. This system has been formalized and refined; as a result, there are now not only magnitudes with decimals, but also negative magnitudes for the very brightest stars and planets. For example, Venus has a magnitude of approximately minus 4.
Negative. The apparent magnitude of our Sun is -26.74, whereas Vega's is +0.03.
Magnitude: dim stars have positive magnitudes and the very brightest objects have negative magnitudes.
In the context of stars, a magnitude is not a measure of size but of brightness or apparent brightness. The apparent magnitude of the Sun is about -27, while Sirius, the brightest star, has a magnitude of only -1.4: negative magnitudes are brighter, and stars with magnitudes greater than 6.5 are not visible to the naked eye. However, the Sun is a star of modest size compared with some of the giants and supergiants.
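The magnitude scale is logarithmic: a difference of 5 magnitudes corresponds to a brightness factor of exactly 100, so each magnitude is a factor of about 2.512. A small sketch of that conversion:

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 is than one of m2."""
    return 100 ** ((m2 - m1) / 5)

# Five magnitudes apart is exactly a factor of 100:
print(brightness_ratio(1.0, 6.0))  # 100.0
# The Sun (about -26.74) vs Sirius (about -1.46): roughly ten billion times brighter
print(f"{brightness_ratio(-26.74, -1.46):.2e}")
```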
The apparent magnitude is how bright the star appears to us, but stars are all at different distances so that a star that is really bright might look dim because it is very far away. So the absolute magnitude measures how bright the star would look if it was placed at a standard distance of 10 parsecs. When the absolute magnitude is greater than the apparent magnitude, it just means that it is closer than 10 pc. The brightest stars have absolute magnitudes around -7.
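The 10-parsec convention above can be demonstrated numerically: at exactly 10 pc the two magnitudes coincide, and a star closer than 10 pc looks brighter than its absolute magnitude suggests (a minimal sketch; the magnitude -7 is just the bright-star figure quoted above):

```python
import math

def apparent_magnitude(absolute_mag, distance_pc):
    """m = M + 5*log10(d/10): how bright a star looks from d parsecs."""
    return absolute_mag + 5 * math.log10(distance_pc / 10)

# At exactly 10 pc, apparent and absolute magnitude are equal:
print(apparent_magnitude(-7.0, 10.0))  # -7.0
# Closer than 10 pc, the apparent magnitude is numerically smaller (brighter),
# i.e. the absolute magnitude is greater than the apparent magnitude:
print(apparent_magnitude(-7.0, 5.0) < -7.0)  # True
```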
There are two terms used to describe a star's brightness: absolute magnitude and apparent magnitude. The one you want is absolute magnitude; this is where the star's distance from us is taken out of the equation, effectively comparing stars side by side from a set distance (10 parsecs, or 32.6 light-years). Apparent magnitude is the other measure: how bright a star apparently looks from Earth. The huge distances and range of distances involved mean that a very luminous star can apparently look as bright as a much closer but intrinsically dimmer star; their apparent magnitudes might be similar, but they may have vastly different absolute magnitudes.
There are 86 stars in the constellation which appear in the Bayer/Flamsteed catalogues. There are no stars brighter than an apparent magnitude of 3. The brightest are Alrescha and Kullat Nunu (both mag 3.62).
Yes, if the matter surrounding one star is more dense than that surrounding the other it would appear to be less bright.
The apparent magnitude of the Sun is listed as -26.74. What formula is used to compute this? How is the figure of -26.74 arrived at? Can the same formula be employed for calculating the apparent magnitudes of stars of different spectral types too?
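One way to arrive at the figure is the same distance-modulus relation used for any star, m = M + 5·log10(d/10 pc), plugging in the Sun's absolute visual magnitude (about 4.83) and its distance of 1 AU expressed in parsecs. The relation is spectral-type-agnostic, so it applies to any star whose absolute magnitude and distance are known. A sketch:

```python
import math

AU_IN_PARSECS = 1 / 206_264.806  # one astronomical unit in parsecs
SUN_ABSOLUTE_MAG = 4.83          # the Sun's absolute visual magnitude

def apparent_magnitude(absolute_mag, distance_pc):
    """Distance-modulus relation: m = M + 5*log10(d/10 pc)."""
    return absolute_mag + 5 * math.log10(distance_pc / 10)

print(round(apparent_magnitude(SUN_ABSOLUTE_MAG, AU_IN_PARSECS), 2))  # -26.74
```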
The less luminous one is closer to the observer, just as a candle in the same room can seem as bright as a sodium vapor lamp down the street.