Q: The distance from the Earth to a star that has identical apparent and absolute magnitudes is?

Best Answer

The standard distance used for defining absolute magnitude is 10 parsecs (about 32.6 light-years), so a star whose apparent and absolute magnitudes are identical is 10 parsecs from Earth.

Continue Learning about Natural Sciences

What variable of apparent brightness is eliminated when discussing absolute brightness?

Distance. "Absolute magnitudes" are all calculated as if viewed from the same distance, while "apparent magnitude" is how bright the star appears to be as seen from Earth.


Does a red giant or a white dwarf star have greater absolute magnitude?

Red giants have typical absolute magnitudes 10-15 magnitudes brighter (numerically lower) than those of white dwarfs, which means the red giants are 10,000-1,000,000 times more luminous, after due allowance for distance.


What are the relationships between apparent magnitude and absolute magnitude?

Apparent magnitude is the brightness of an object as seen from Earth (ignoring atmospheric absorption). Absolute magnitude is the brightness of an object as seen from a predetermined standard distance, which depends on the type of object: for planets the distance used is 1 AU (astronomical unit), while stars and galaxies use 10 parsecs, which is about 32.6 light-years.

The dimmer an object is, the larger its positive magnitude; the brighter an object is, the more negative its magnitude.

Examples: the Sun has an apparent magnitude of -26.74 but an absolute magnitude of +4.83; Sirius has an apparent magnitude of -1.46 but an absolute magnitude of +1.42. This means that from Earth the Sun looks far brighter, but if the Sun were replaced by Sirius, Sirius would be roughly 25 times more luminous.
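To make the numbers above concrete, here is a minimal Python sketch of the standard magnitude-to-brightness conversion (a difference of 5 magnitudes corresponds to a factor of 100 in brightness); the magnitude values are the ones quoted in the answer above.

```python
def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """How many times brighter object A is than object B,
    given their magnitudes (smaller magnitude = brighter)."""
    # 5 magnitudes = a factor of 100, so the ratio is 100**(delta_m / 5).
    return 100 ** ((mag_b - mag_a) / 5)

# Absolute magnitudes quoted above: Sun +4.83, Sirius +1.42.
print(brightness_ratio(1.42, 4.83))     # ~23: Sirius is roughly 25x as luminous as the Sun
# Apparent magnitudes quoted above: Sun -26.74, Sirius -1.46.
print(brightness_ratio(-26.74, -1.46))  # ~1.3e10: the Sun appears vastly brighter from Earth
```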


Why is the absolute magnitude of some stars greater than their apparent magnitude?

The apparent magnitude is what we see, and it can be measured directly. The absolute magnitude must be calculated, mainly from (1) the apparent magnitude and (2) the star's distance; so to calculate the absolute magnitude, you must first know the star's distance. For stars closer than the standard distance of 10 parsecs, moving them out to 10 parsecs would make them look fainter, so their absolute magnitude is numerically greater than their apparent magnitude.
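As an illustration of that calculation (a sketch, not part of the original answer), the distance-modulus relation M = m - 5*log10(d / 10 pc) gives the absolute magnitude once the apparent magnitude and the distance in parsecs are known; the example values below are hypothetical.

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Absolute magnitude from apparent magnitude and distance in parsecs,
    using the distance modulus m - M = 5 * log10(d / 10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Hypothetical star: apparent magnitude 2.0 at 25 parsecs (farther than 10 pc).
print(absolute_magnitude(2.0, 25))  # ~0.0: intrinsically brighter than it appears
# The same apparent magnitude at only 5 parsecs (closer than 10 pc).
print(absolute_magnitude(2.0, 5))   # ~3.5: absolute magnitude is numerically greater
```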


How do you measure the distance to a Cepheid?

Cepheids have a definite relationship between their pulsation period and their absolute luminosity. Thus, their absolute luminosity can be determined from the period. Comparing this with their apparent luminosity allows us to calculate their distance.
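As a rough Python sketch of that procedure: the coefficients in the period-luminosity relation below are placeholders of the form M = A*log10(P) + B, not a definitive calibration (real calibrations depend on the photometric band and the study used), and the example Cepheid is hypothetical.

```python
import math

# Placeholder period-luminosity coefficients (M = A * log10(P_days) + B);
# real calibrations vary by band and by study.
A, B = -2.8, -1.4

def cepheid_distance_pc(period_days: float, apparent_mag: float) -> float:
    """Estimate the distance to a Cepheid from its pulsation period and apparent magnitude."""
    absolute_mag = A * math.log10(period_days) + B   # period-luminosity relation
    distance_modulus = apparent_mag - absolute_mag   # m - M
    return 10 ** (distance_modulus / 5 + 1)          # d = 10**((m - M + 5) / 5) parsecs

# Hypothetical Cepheid: 10-day period, apparent magnitude 12.
print(cepheid_distance_pc(10.0, 12.0))  # ~17,000 pc with these placeholder coefficients
```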

Related questions


What is the absolute and apparent magnitude of a giant star?

Magnitudes depend on a star's luminosity and its distance, so a specific star must be named to give numbers.


At what distance would the absolute magnitude and the apparent magnitude of a star be equal?

The standard distance is 10 parsecs; at this distance the star's apparent magnitude equals its absolute magnitude. A star 100 parsecs away has an absolute magnitude 5 magnitudes brighter than its apparent magnitude. 1 parsec is 3.26 light-years.
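Rearranging the distance modulus for distance, d = 10**((m - M + 5) / 5) parsecs, gives a quick check of both statements above; here is a minimal Python sketch with made-up magnitudes.

```python
def distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from the distance modulus m - M = 5 * log10(d / 10)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

print(distance_pc(3.0, 3.0))  # 10.0  -> equal magnitudes put the star at 10 parsecs
print(distance_pc(8.0, 3.0))  # 100.0 -> apparent 5 magnitudes fainter than absolute: 100 parsecs
```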


What is a star's brightness as if it were at a standard distance?

There's `Absolute Magnitude`, which is the brightness of a star at a set distance (10 parsecs). Then there is `Apparent Magnitude`, which is the apparent brightness from Earth, regardless of distance.


Why does Arcturus have a brighter absolute magnitude than the Sun but a much fainter apparent magnitude?

The apparent magnitude is how bright the star appears to us, but stars are all at different distances, so a star that is really luminous can look dim because it is very far away. The absolute magnitude measures how bright the star would look if it were placed at a standard distance of 10 parsecs. Arcturus is intrinsically far more luminous than the Sun, but it lies about 37 light-years away while the Sun is only 1 AU away, so the Sun appears overwhelmingly brighter. The brightest stars have absolute magnitudes around -7.


Which one tells us how bright the stars would appear if all stars were at the same distance from the Earth?

There are two terms used to describe a star's brightness: absolute magnitude and apparent magnitude. The one you want is absolute magnitude - this is where the star's distance from us is taken out of the equation, effectively comparing the stars' brightness side by side from a set distance (10 parsecs, or 32.6 light-years). Apparent magnitude is the other measure; this is how bright a star appears to be from Earth. The huge distances and range of distances involved mean that a very luminous star can appear just as bright as a much closer but intrinsically dimmer star - their apparent magnitudes might be similar, but their absolute magnitudes can be vastly different.


Why is the absolute magnitude of some stars greater than their apparent magnitude?

The apparent magnitude is how bright the star appears to us, but stars are all at different distances, so a star that is really bright might look dim because it is very far away. The absolute magnitude measures how bright the star would look if it were placed at a standard distance of 10 parsecs. When the absolute magnitude is greater than the apparent magnitude, it just means that the star is closer than 10 pc. The brightest stars have absolute magnitudes around -7.


How does apparent magnitude of a star differ from absolute magnitude?

Apparent magnitude is the brightness as observed from Earth, while absolute magnitude is the brightness of a star at a set distance (10 parsecs). The apparent magnitude reflects the star's actual brightness as well as its distance from us, but absolute magnitude takes the distance factor out so that star brightnesses can be directly compared.



Compare and contrast apparent magnitude and absolute magnitude?

Apparent magnitude is the star's brightness as it appears from Earth. Absolute magnitude is the apparent brightness the star would have if viewed from a distance of 32.6 light-years (10 parsecs) away.


Compare apparent magnitude and absolute magnitude?

Apparent magnitude is the star's brightness as it appears from Earth, while absolute magnitude is the apparent brightness the star would have if viewed from a distance of 32.6 light-years (10 parsecs) away.


What is the difference between apparent brightness and absolute brightness?

Absolute brightness: how bright a star would appear at a standard distance (10 parsecs). Apparent brightness: the brightness of a star as seen from Earth.