The apparent magnitude of a star depends on the star's size, temperature, and distance from the observer. The absolute magnitude is determined by the same three factors, but with the distance fixed at 10 parsecs.
There are three factors, actually. The star's size and temperature determine its absolute magnitude, or how bright the star really is, so those two factors can be rolled into one: the absolute magnitude. The absolute magnitude combined with our distance from the star determines its apparent magnitude, or how bright the star appears to be from Earth. So a big, hot, super bright star very far away may have the same apparent magnitude as a small, cool star that's fairly close to the Earth.
Its real (absolute) magnitude; its distance from Earth; the amount of light that's absorbed by matter between the star and us (extinction); and distortions due to gravitational lensing.
how dense the star is
Three physical factors that determine a star's brightness are its temperature (hotter stars are brighter), size (larger stars are generally brighter), and distance from Earth (the closer a star is, the brighter it appears).
It isn't clear what you want to determine about the star.
The main difference is brightness: a twelfth magnitude star is brighter than a fifteenth magnitude star. Magnitude is a logarithmic scale, so each step in magnitude represents a difference in brightness of about 2.512 times. This means a twelfth magnitude star is approximately 2.512 x 2.512 x 2.512, or about 16 times, brighter than a fifteenth magnitude star.
The brightness of a star is called its magnitude.
The magnitude is the brightness of the star.
A magnitude 1 star is 100 times brighter than a magnitude 6 star.
Triangulation using the base of Earth's orbit and parallax observations. Use of Cepheid variable stars. Use of the star's colour spectrum to determine its luminosity, and comparison of that with its apparent magnitude.
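The first of those methods, parallax triangulation, can be sketched in a few lines: a star's distance in parsecs is simply the reciprocal of its parallax angle in arcseconds. A minimal sketch (the function name and the example value for Proxima Centauri, roughly 0.768 arcseconds, are illustrative):

```python
def distance_parsecs(parallax_arcsec: float) -> float:
    """Distance in parsecs from an annual parallax angle in arcseconds."""
    if parallax_arcsec <= 0:
        raise ValueError("parallax must be positive")
    # The parsec is defined so that d (pc) = 1 / p (arcsec).
    return 1.0 / parallax_arcsec

# Proxima Centauri's parallax is about 0.768 arcseconds:
print(distance_parsecs(0.768))  # about 1.30 parsecs
```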
The absolute magnitude of a star is a measure of its true brightness: the magnitude it would have if it were placed at a standard distance of 10 parsecs from Earth. To calculate the absolute magnitude from an apparent magnitude (m) of 6, you would need to know the star's distance. Without this information, we cannot determine the absolute magnitude.
The model for measuring the apparent magnitude (brightness from Earth) of a star says that a magnitude 1 star is 100 times brighter than a magnitude 6 star (just visible with the naked eye). This means that a magnitude 1 star is 2.512 times brighter than a magnitude 2 star, which is 2.512 times brighter than a magnitude 3 star. To jump two places up the scale, use 2.512 x 2.512 as a multiplier, i.e. a magnitude 1 star is 6.31 times brighter than a magnitude 3 star. To jump three places, use 2.512 x 2.512 x 2.512 (or 2.512 cubed) = 15.85. So a magnitude 4 star will be 15.85 times brighter than a magnitude 7 star. Working the other way, a magnitude 7 star will appear 6.3% as bright as a magnitude 4 star (1/15.85, multiplied by 100 to get a percentage).
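The repeated multiplication above can be folded into one formula: a difference of n magnitudes corresponds to a brightness ratio of 100^(n/5), which is the same as 2.512^n. A short sketch reproducing the worked numbers:

```python
def brightness_ratio(mag_fainter: float, mag_brighter: float) -> float:
    """Factor by which the brighter star outshines the fainter one.

    Five magnitude steps are defined as a factor of exactly 100,
    so one step is 100**(1/5), approximately 2.512.
    """
    return 100 ** ((mag_fainter - mag_brighter) / 5)

print(brightness_ratio(7, 4))  # about 15.85 (three steps)
print(brightness_ratio(6, 1))  # 100.0 (five steps, by definition)
```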