Scientists talk about "Apparent Magnitude" and "Absolute Magnitude".
The first definition refers to how bright a star appears to us on Earth.
The second definition is related to how luminous that star really is.
Oddly enough, the brighter a star is, on either scale, the lower the number for its magnitude.
The scales are a bit complicated because they are "logarithmic".
However, one simple fact is that a difference of 5 magnitudes means a difference of 100 times in brightness.
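The rule above can be sketched in a few lines of Python: because 5 magnitudes correspond to a factor of exactly 100, a magnitude difference of Δm corresponds to a brightness ratio of 100^(Δm/5). The function name here is just illustrative.

```python
# Brightness ratio implied by a magnitude difference.
# A 5-magnitude difference is a factor of exactly 100, so each
# single magnitude step is a factor of 100**(1/5), about 2.512.

def brightness_ratio(delta_mag):
    """Return how many times brighter the lower-magnitude star is,
    given the magnitude difference delta_mag."""
    return 100 ** (delta_mag / 5)

print(brightness_ratio(5))  # 5 magnitudes -> exactly 100x brighter
print(brightness_ratio(1))  # 1 magnitude  -> roughly 2.512x brighter
```

This also shows why the scale is called logarithmic: equal steps in magnitude multiply the brightness by the same factor.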
Absolute Magnitude
Scientists use the term magnitude to describe a star's brightness.
One way to describe a star's brightness is by its apparent magnitude, which is how bright it appears from Earth. Another way is by its absolute magnitude, which measures how bright a star would appear if it were placed at a standard distance of 10 parsecs from Earth.
Stars of the first magnitude are some of the brightest stars in the night sky. They typically have a visual magnitude that is lower than +1.5, making them easily visible to the naked eye. Examples of first magnitude stars include Sirius, Canopus, and Arcturus.
"First magnitude" usually means the brightest 21 stars, as seen from Earth. Another definition is stars with apparent magnitudes 0.5 to 1.5. This definition excludes the very brightest stars, like Sirius. They are the first stars that become visible after sunset and they all have names. Examples are Altair, Aldebaran, Capella, Spica, Antares, Fomalhaut, Deneb, Regulus, Sirius, etc. There can be confusion because First Magnitude stars are not stars with an "apparent magnitude" of exactly "one". They are just the brightest stars, but naturally their magnitudes are close to one.
Telescopes combined with spectroscopy are used to determine colors. Apparent brightness can be measured using a telescope fitted with a special CCD camera. To measure the "real" brightness (absolute magnitude), you also need to be able to work out the distance to the star.
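The last step can be made concrete with the standard distance-modulus formula, which relates apparent magnitude m, absolute magnitude M, and distance d in parsecs: M = m − 5·log10(d/10). A minimal Python sketch (the Sirius figures are its commonly quoted apparent magnitude and distance):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in
    parsecs, using the distance modulus M = m - 5*log10(d / 10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude -1.46, distance about 2.64 parsecs.
print(round(absolute_magnitude(-1.46, 2.64), 2))  # -> 1.43
```

Note that a star closer than 10 parsecs, like Sirius, has an absolute magnitude fainter (numerically higher) than its apparent magnitude, because moving it out to the standard distance would dim it.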
A star's brightness is known as its magnitude. Stars with lower magnitude numbers are brighter than stars with higher magnitude numbers.
No. The Richter scale is a way for scientists to describe how much energy was released by an earthquake (this is known as the earthquake's magnitude).
The scale most commonly referred to by the press and the public is the Richter scale for measuring earthquake magnitude. However, it was actually superseded in the 1970s by the moment magnitude scale, which is the magnitude scale favoured and used by seismologists.
Stars can be described by their temperature, size (diameter), brightness (luminosity), color, composition, and age. These characteristics help scientists classify and study stars in the universe.
Scientists actually use two measurements to describe a star's brightness. One is luminosity, the total energy the star puts out. The other is magnitude, which describes the star's brightness on a logarithmic scale.
Scientists classify stars by size based on their mass. Stars can be categorized as dwarf stars (like our Sun), giant stars, or supergiant stars, with the size increasing as the mass of the star increases. The classification can also include specific categories such as red dwarfs, white dwarfs, or blue giants, depending on additional characteristics.