This has nothing to do with shape. Apparent magnitude means how bright a star looks to us from Earth. Absolute magnitude means how bright the star really is, expressed as how bright it would look from a standard distance.
The two types are apparent magnitude, the magnitude of a star as it appears to us, and absolute magnitude, which is what a star's apparent magnitude would be at a standard distance of ten parsecs.
The brightness of a star depends on its temperature, size and distance from the Earth. The measure of a star's brightness is called its magnitude. Bright stars are first-magnitude stars; second-magnitude stars are dimmer. The larger the magnitude number, the dimmer the star. The magnitude of a star may be apparent or absolute.
Our Sun is pretty much average. It's larger than about 60 to 70% of the other stars in the Milky Way; the estimate rises as we keep discovering more and more very small, very dim brown dwarf "stars" (objects right on the boundary between "star" and "not a star").
The Hertzsprung-Russell diagram (H-R diagram) shows the relationship between absolute magnitude, luminosity, classification, and effective temperature of stars. The diagram as originally conceived displayed the spectral type (effectively the surface temperature) of stars on the horizontal axis and the absolute magnitude (their intrinsic brightness) on the vertical axis.
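A minimal Python/matplotlib sketch (my own illustration, using made-up data points) of the axis convention described above: temperature runs along the horizontal axis with hotter stars on the left, and absolute magnitude runs along the vertical axis with brighter (more negative) values at the top.

# Schematic H-R diagram layout; the star values below are illustrative only.
import matplotlib.pyplot as plt

stars = {
    "hot blue star": (25000, -4.0),
    "Sun-like star": (5800, 4.8),
    "red dwarf": (3200, 12.0),
}

fig, ax = plt.subplots()
for name, (temp, abs_mag) in stars.items():
    ax.scatter(temp, abs_mag)
    ax.annotate(name, (temp, abs_mag))

ax.invert_xaxis()   # hotter stars traditionally sit on the left
ax.invert_yaxis()   # brighter (more negative magnitude) at the top
ax.set_xlabel("Effective temperature (K)")
ax.set_ylabel("Absolute magnitude")
ax.set_title("Schematic H-R diagram layout")
plt.show()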
The apparent magnitude is what we see, and this can be measured directly. The absolute magnitude must be calculated, mainly on the basis of (1) the apparent magnitude, and (2) the star's distance. So, to calculate the absolute magnitude, you must first know the star's distance.
Distance from the Earth. The apparent magnitude of a star is how bright it appears from Earth, while the absolute magnitude is how bright the star would be if it were located at a standard distance of 10 parsecs from Earth. The difference between the two is set by the star's distance: closer stars have a smaller difference and more distant stars have a larger difference between their apparent and absolute magnitude.
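A small sketch (my own illustration, not part of the answer above) of how that difference depends only on distance: m - M = 5 * log10(d / 10 pc), often called the distance modulus.

import math

def distance_modulus(distance_pc):
    """Return m - M for a star at the given distance in parsecs."""
    return 5 * math.log10(distance_pc / 10.0)

for d in (1, 10, 100, 1000):
    print(f"{d:>5} pc: m - M = {distance_modulus(d):+.2f}")
# At 10 pc the difference is zero; closer stars give a smaller (negative)
# difference, more distant stars a larger positive one.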
Apparent magnitude is the brightness of an object as seen from Earth, without any atmosphere. Absolute magnitude is the brightness of an object as seen from a predetermined distance, which depends on the object. For planets, the distance used is 1 AU (astronomical unit); stars and galaxies use 10 parsecs, which is about 32.616 light-years. The dimmer an object is, the larger its positive magnitude; the brighter an object is, the more negative its magnitude. Examples: the Sun has an apparent magnitude of -26.74 but an absolute magnitude of 4.83; Sirius has an apparent magnitude of -1.46 but an absolute magnitude of +1.42. This means that from Earth the Sun looks far brighter, but if the Sun were replaced by Sirius, Sirius would be about 25 times more luminous.
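A quick check of that comparison (my own sketch, not from the answer above): a difference of dM in absolute magnitude corresponds to a luminosity ratio of 100 ** (dM / 5).

sun_abs_mag = 4.83
sirius_abs_mag = 1.42   # note: positive

ratio = 100 ** ((sun_abs_mag - sirius_abs_mag) / 5)
print(f"Sirius is roughly {ratio:.0f} times as luminous as the Sun")  # ~23, close to the quoted ~25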
The scale of star brightness is the 'magnitude'. The definition of the magnitude scale is: a change of five magnitudes equals a factor of 100 in brightness. So one magnitude change is a factor equal to the 5th root of 100, about 2.512 (rounded).
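A minimal sketch of that definition (my own illustration): five magnitudes correspond to a factor of 100, so one magnitude is a factor of 100 ** (1/5).

one_magnitude_factor = 100 ** (1 / 5)
print(f"1 magnitude -> factor {one_magnitude_factor:.3f}")   # about 2.512

def brightness_ratio(mag_a, mag_b):
    """How many times brighter object A is than object B (smaller magnitude = brighter)."""
    return 100 ** ((mag_b - mag_a) / 5)

print(brightness_ratio(1.0, 6.0))   # a 1st-magnitude star is ~100x brighter than a 6th-magnitude star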
The apparent magnitude of the Sun is -26.73 (yes, negative). The absolute magnitude of the Sun is 4.83. See the related question for the difference between absolute and apparent magnitude. For comparison, at maximum brightness:
Full Moon -12.6
Venus -3.8
Mars -3
Sirius -1.47
Ganymede 4.6
Faintest object visible with the naked eye 6.5
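A worked sketch (my own, not part of the answer above) recovering the Sun's absolute magnitude of about +4.83 from its apparent magnitude and its distance of 1 AU; I use the commonly quoted value -26.74, which agrees with the -26.73 above to within rounding.

import math

AU_IN_PARSECS = 1 / 206265          # 1 AU expressed in parsecs (~4.848e-6 pc)

m_sun = -26.74                      # apparent magnitude of the Sun
d_pc = AU_IN_PARSECS                # Earth-Sun distance in parsecs

M_sun = m_sun - 5 * (math.log10(d_pc) - 1)
print(f"Absolute magnitude of the Sun ~ {M_sun:.2f}")   # ~ +4.83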
The basic idea is:
* Measure the star's apparent magnitude.
* Determine the star's distance.
* The absolute magnitude can then be calculated directly from these two pieces of information.
However, adjustments may need to be made for extinction: if there is a lot of dust or gas between the star and us, it looks dimmer than it would without the dust or gas. Ignoring extinction, Wikipedia gives the following formula: M = m - 5*(log10(DL) - 1), where M is the absolute magnitude, m is the apparent magnitude, and DL is the distance in parsecs.
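A direct translation of that formula into code (my own sketch, ignoring extinction); the Sirius numbers used in the example are the commonly quoted ones.

import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs."""
    return apparent_mag - 5 * (math.log10(distance_pc) - 1)

# Example: Sirius has m ~ -1.46 and lies about 2.64 pc away.
print(f"M(Sirius) ~ {absolute_magnitude(-1.46, 2.64):+.2f}")   # roughly +1.4, matching the value quoted above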
I assume you mean the absolute magnitude (brightness) of stars. The problem is that it can't be measured directly. What astronomers can measure is the apparent magnitude. To draw conclusions about the absolute magnitude, they also have to know the distance to the star, as well as data about extinction, i.e. how much dust and gas there is between us and the star, which may make its light look fainter. Note that the absolute magnitude is very important for characterizing a star, but it may be difficult to calculate with much precision.
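A hedged sketch (my own, not part of the answer above) of how extinction is usually folded into the calculation: the star looks fainter by A magnitudes because of dust and gas, so A is subtracted when recovering the absolute magnitude.

import math

def absolute_magnitude_with_extinction(apparent_mag, distance_pc, extinction_mag=0.0):
    """M = m - 5*(log10(d) - 1) - A, with A the extinction in magnitudes (assumed known)."""
    return apparent_mag - 5 * (math.log10(distance_pc) - 1) - extinction_mag

# With 0.5 magnitudes of dust extinction, the same observed brightness implies
# an intrinsically brighter (more negative M) star.
print(f"{absolute_magnitude_with_extinction(8.0, 500, extinction_mag=0.5):.2f}")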
Its real (absolute) magnitude; its distance from Earth; the amount of light absorbed by matter between the star and us (extinction); and distortions due to gravitational lensing.
Does it mean that the star is a main-sequence star? No: whether the apparent and absolute magnitudes are equal has nothing to do with whether the star is on the main sequence. A star can be a blue supergiant, or on the main sequence, and still be too far away to be visible to us, in which case its apparent and absolute magnitudes would not be the same. There isn't a special name for the case where they are equal; it simply means the star lies at the standard distance of 10 parsecs (about 32.6 light-years), so placing it at that distance would not change its apparent brightness.
The absolute magnitude is the magnitude (brightness) an object would have at a standard distance, i.e. how bright it would look from that distance. For a star or galaxy, a standard distance of 10 parsecs is commonly used.