
Astronomers commonly measure a star's brightness with photometry and express it on the magnitude scale: apparent magnitude describes how bright the star looks from Earth, while absolute magnitude describes its intrinsic brightness at a standard distance of 10 parsecs.


Wiki User

14y ago


Continue Learning about Astronomy

What is a graph that shows the relationship between a star's magnitude and temperature?

The Hertzsprung-Russell (HR) diagram is a graph that shows the relationship between a star's magnitude (luminosity) and temperature. It plots stars based on their color (temperature) and brightness (magnitude), allowing astronomers to classify stars and understand their evolutionary stage.


Which scientists discovered the magnitude of the stars?

The magnitude scale for stars, which measures their brightness, was developed by the ancient Greek astronomer Hipparchus in the 2nd century BCE. Later, the modern system of stellar magnitude was refined by astronomers such as Norman Pogson in the 19th century, who established a more precise logarithmic scale. Additionally, the work of astronomers like Johann Heinrich von Mädler and others contributed to the understanding of stellar brightness and its measurement.
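Pogson's refinement can be shown with a quick calculation: his scale defines a difference of 5 magnitudes as exactly a factor of 100 in brightness, so each magnitude step is a factor of 100^(1/5) ≈ 2.512. A minimal sketch:

```python
# Pogson's logarithmic magnitude scale: a 5-magnitude difference
# corresponds to a brightness (flux) ratio of exactly 100.
def flux_ratio(m_faint, m_bright):
    """Return how many times brighter the m_bright star is than the m_faint one."""
    return 100 ** ((m_faint - m_bright) / 5)

print(flux_ratio(6, 1))  # faintest naked-eye star vs. a 1st-magnitude star -> 100.0
print(flux_ratio(1, 0))  # one magnitude step -> about 2.512
```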


How is a star's brightness measured from Earth?

Astronomers use a special term to talk about the brightness of stars: "magnitude". The magnitude scale was invented by the ancient Greeks around 150 B.C. The Greeks put the stars they could see into six groups. They put the brightest stars into group 1 and called them magnitude 1 stars. Stars they could barely see were put into group 6. So, in the magnitude scale, bright stars have lower numbers.


What do astronomers use to measure stars?

Their eyes; refracting telescopes (ones with glass lenses); reflecting telescopes (ones with mirrors); radio telescopes; imaging computer chips in conjunction with telescopes; space-based telescopes; underground telescopes (to detect high-energy or exotic particles from stars); and gravity-wave detectors.


What is a star's brightness known as?

A star's brightness is known as its magnitude. Stars with lower magnitude numbers are brighter than stars with higher magnitude numbers.

Related Questions

What are the different ways astronomers use to study stars?

One way astronomers study stars is by measuring their absolute magnitude.


Who is the Greek astronomer who set up a system for measuring the magnitude of stars?

Hipparchus, in the 2nd century BCE.


How are stars brightness compared to each other?

Astronomers use the term magnitude to compare the brightnesses of stars. Really bright stars are 1st magnitude while the faintest we can see with the naked eye are about magnitude 6. A 12 inch telescope can see down to about magnitude 14 or 15. Hubble Space Telescope can see down to about magnitude 27.


What is the standard distance that astronomers use to find the absolute magnitude of stars?

Astronomers use a standard distance of 10 parsecs (approximately 32.6 light-years) to determine the absolute magnitude of stars. This allows for a consistent measurement of a star's intrinsic brightness, independent of its distance from Earth. The absolute magnitude represents the brightness a star would have if it were placed at this standard distance.
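The definition above can be turned into a small worked example using the standard distance-modulus formula, M = m − 5·log₁₀(d_pc) + 5, which gives the magnitude a star would have at 10 parsecs. A minimal sketch (the Sun's figures match those quoted elsewhere on this page):

```python
import math

# Distance modulus: M = m - 5*log10(d_pc) + 5,
# the magnitude the star would have at the standard 10-parsec distance.
def absolute_magnitude(apparent_m, distance_pc):
    return apparent_m - 5 * math.log10(distance_pc) + 5

# The Sun: apparent magnitude -26.8 at 1 AU (about 1/206265 pc).
sun_M = absolute_magnitude(-26.8, 1 / 206265)
print(round(sun_M, 1))  # -> 4.8, the absolute magnitude quoted on this page
```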


How do astronomers label some stars within constellations?

Astronomers label stars within constellations using Greek letters, such as Alpha, Beta, Gamma, and so on, roughly in order of their brightness within the constellation. For example, Betelgeuse in Orion is labeled Alpha Orionis.


What system classifies stars according to their brightness?

The system that classifies stars according to their brightness is called the magnitude scale. This scale measures the apparent brightness of stars as seen from Earth, with lower numbers indicating brighter stars; for example, a star with a magnitude of 1 is brighter than one with a magnitude of 5. Additionally, the absolute magnitude scale measures the intrinsic brightness of stars at a standard distance of 10 parsecs. Together, these systems help astronomers categorize and compare stars based on their luminosity.


Rocks are to geologists as stars are to?

Rocks are to geologists as stars are to astronomers.


Why is it important to know the absolute magnitude of stars?

Knowing the absolute magnitude of stars is crucial because it allows astronomers to determine their intrinsic brightness, independent of their distance from Earth. This helps in comparing the true luminosities of different stars and understanding their evolutionary stages. Additionally, absolute magnitude is essential for calculating distances to stars using methods like the distance modulus, which enhances our understanding of the structure and scale of the universe.
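The distance-modulus method mentioned above can be sketched directly: inverting M = m − 5·log₁₀(d) + 5 gives d = 10^((m − M + 5)/5) parsecs. A minimal example using Sirius's figures quoted elsewhere on this page:

```python
# Inverting the distance modulus to recover distance from
# apparent magnitude m and absolute magnitude M.
def distance_pc(apparent_m, absolute_M):
    return 10 ** ((apparent_m - absolute_M + 5) / 5)

# Sirius: m = -1.46, M = +1.4 -> about 2.7 parsecs (~8.7 light-years),
# close to the 8.6 light-year figure quoted on this page.
print(distance_pc(-1.46, 1.4))
```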


How does the Sun compare to other stars in terms of surface temperature, apparent and absolute magnitude, and size?

Temperatures in the Sun's middle atmosphere, the chromosphere, range from 4,225°C to 6,000°C. In the Sun's outer atmosphere, or corona, temperatures may reach 2,000,000°C. A star's surface temperature is related to its colour: stars with the lowest surface temperatures (below 3,500°C) are red, while stars with the highest surface temperatures (above 25,000°C) are blue.

The Sun's apparent magnitude is lower (brighter) than that of the other stars. Some stars are actually more luminous, or brighter, than the Sun; if these stars are located far from Earth, they may not appear bright to us. Using only their eyes, ancient astronomers described star brightness by magnitude. They called the brightest stars they could see 'first magnitude' and the faintest stars they could see 'sixth magnitude'. Astronomers using telescopes see many stars that are too dim to see with the naked eye. Rather than replacing the magnitude system, astronomers extended it. Today, the brightest stars have a magnitude of about -2, and the faintest stars we can see through a telescope have a magnitude of about +30.

Sirius, the brightest star in the night sky, has an apparent magnitude of -1.46. To the naked eye, the Sun has an apparent magnitude of -26.8, even though it is not as luminous a star as Sirius; the Sun is simply located closer to Earth. While the Sun's apparent magnitude is -26.8, its absolute magnitude is +4.8, which is typical of many stars. Now compare the Sun, located 8.3 light-minutes from Earth, to Sirius, located 8.6 light-years from Earth. Sirius has an apparent magnitude of -1.46 and an absolute magnitude of +1.4. Therefore, Sirius is much more luminous than the Sun.
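The final comparison can be checked numerically: a difference in absolute magnitude ΔM translates into a luminosity ratio of 100^(ΔM/5). Using the figures quoted above (Sun +4.8, Sirius +1.4):

```python
# Luminosity ratio from absolute magnitudes: 100 ** (delta_M / 5).
sun_abs_mag = 4.8
sirius_abs_mag = 1.4
ratio = 100 ** ((sun_abs_mag - sirius_abs_mag) / 5)
print(round(ratio))  # -> 23: Sirius is roughly 23 times as luminous as the Sun
```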

