The amount of light we receive from a star - other things being equal - is inversely proportional to the square of the distance. For example, from a star that is ten times as far from us as another star (of the same type), we will only receive 1/100 of the light that we receive from the closer star.
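The inverse-square relationship above is easy to sketch in code. Here is a minimal Python illustration (the function name and numbers are my own, for demonstration):

```python
# Sketch of the inverse-square law for apparent brightness.
# Received light falls off as 1/d^2, so two identical stars at
# distances d1 and d2 deliver light in the ratio (d1/d2)^2.

def flux_ratio(d1, d2):
    """Fraction of light received from a star at distance d2,
    relative to an identical star at distance d1."""
    return (d1 / d2) ** 2

# A star ten times farther away delivers one hundredth of the light:
print(flux_ratio(1, 10))
```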
The "actual" brightness of the star is called "absolute magnitude". It is calculated as if all stars were at the same distance, so that we can compare stars directly. The "apparent magnitude" of the star is how bright it appears in our own night sky.
Two stars can have the same apparent magnitude even though one star is small, dim and close, while the other star is huge, bright and distant.
As far as astronomy is concerned, not much! However, magnitude uses an actual, mathematically-defined system whereas brightness is a more 'fuzzy' idea. For instance, you could say 'Sirius is brighter than Polaris' (which is true!) but magnitude enables you to actually say how much brighter it is. The magnitude of Sirius is about -1.5 whereas Polaris, though slightly variable, is about 2.1.
In case you don't know, the magnitude system works 'backwards' so to speak, so the smaller the number the brighter the star. Sirius is so bright its magnitude value is negative. The magnitude of the Sun is about -26.
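The magnitude scale is defined so that a difference of 5 magnitudes is exactly a factor of 100 in brightness. Using the Sirius and Polaris figures quoted above, a short Python sketch (function name my own) turns a magnitude difference into a brightness ratio:

```python
# A difference of 5 magnitudes = a factor of exactly 100 in brightness,
# so the ratio for any difference is 100 ** (difference / 5).

def brightness_ratio(m_bright, m_faint):
    """How many times brighter the object with magnitude m_bright
    appears than the object with magnitude m_faint."""
    return 100 ** ((m_faint - m_bright) / 5)

# Sirius (about -1.5) versus Polaris (about 2.1):
print(round(brightness_ratio(-1.5, 2.1), 1))  # about 27.5 times brighter
```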
To complicate things, astronomers use several magnitude systems. The one above (and the one I guess you're asking about) is called APPARENT magnitude - that is, a measure of how bright a star looks in the sky. But Sirius only appears brighter than Polaris because it's much, much closer to us (about 8.6 light years versus Polaris's roughly 430 light years). To account for this, astronomers use another magnitude system called ABSOLUTE magnitude. We choose a standard distance of 10 parsecs (about 33 light years) and work out how bright a star would be if it were at that distance. It's a way of eliminating the 'closer is brighter' effect I just mentioned, and gives a guide to the actual physical parameters of each star.
The full answer is lengthy; the short answer is "black-body radiation". In short: the hotter an object is, the higher the frequency (and the shorter the wavelength) at which its spectrum peaks.
At the lowest temperatures a black body glows only in the infrared; as it heats up it passes roughly through red, orange and yellow to white. It never looks pure green, because the emitted light is always a mixture of many frequencies.
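The temperature-to-wavelength relationship mentioned above is quantified by Wien's displacement law, peak wavelength = b / T. A minimal Python sketch (function name mine; the example temperature is illustrative):

```python
# Wien's displacement law: the wavelength at which a black body emits
# most strongly is b / T, where b is Wien's displacement constant.

WIEN_B = 2.898e-3  # Wien's constant, metre-kelvins

def peak_wavelength_nm(temperature_k):
    """Peak emission wavelength, in nanometres, of a black body
    at the given temperature in kelvins."""
    return WIEN_B / temperature_k * 1e9

# A Sun-like surface (~5800 K) peaks near 500 nm, yet the broad mix
# of wavelengths looks white rather than green:
print(round(peak_wavelength_nm(5800)))  # about 500
```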
One is light; the other is how fast it travels.
Three factors that affect a star's brightness are its distance from Earth, its age and its luminosity. The farther a star is from Earth, the less bright it appears. As a star ages, its brightness changes (main-sequence stars slowly brighten over time). Finally, its brightness depends on its luminosity, which is the amount of energy the star emits per second.
True!
Its apparent brightness, as well as the star's gravitational field.
Apparent brightness: how bright an object - such as a star - looks to us. True brightness: how bright such an object really is. Defined as: how bright it would look at a standard distance.
To find the number of light years between two celestial objects, we first find the distance from each object to Earth. If we connect the dots between Earth and the two objects, we have a triangle. We know two side lengths of that triangle (the distances from Earth to the objects), and we can measure one angle (the angle at the vertex where Earth is). This is enough information to find the distance between the objects using trigonometry (in this case, the law of cosines).

Finding the distance from Earth to an object can be a bit complex. One commonly used method is to look for a pulsating star. We can figure out the absolute brightness (how bright it is without factoring in distance) of these stars from how often they pulse. Then we can measure the apparent brightness (how bright it looks to us) and use both values to find the distance to the star. (This also works for some supernovae.)

Another method is to use objects that are considered 'standard candles'. These objects do not pulse, but we know the relationship between their absolute brightness, apparent brightness, and distance.
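The triangle step described above can be sketched directly with the law of cosines. Here is a minimal Python version (the function name and the two example stars are hypothetical):

```python
import math

def separation(d1, d2, angle_deg):
    """Distance between two objects, given their distances from Earth
    (d1, d2, in the same units) and the angle between them as seen
    from Earth, via the law of cosines:
    c^2 = a^2 + b^2 - 2ab*cos(C)."""
    theta = math.radians(angle_deg)
    return math.sqrt(d1**2 + d2**2 - 2 * d1 * d2 * math.cos(theta))

# Two hypothetical stars 10 and 12 light years away,
# 60 degrees apart in the sky:
print(round(separation(10, 12, 60), 2))
```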
Absolute brightness: how bright a star would appear at a standard distance. Apparent brightness: the brightness of a star as seen from Earth.
Hertzsprung and Russell.
Brightness and temperature are closely linked: for a star of a given size, the temperature largely determines the brightness, and brightness in turn gives a clue to the temperature of the star we are talking about.
"Apparent magnitude" is the star's brightness after the effects of distance. "Absolute magnitude" is the star's brightness at a standard distance.
Absolute brightness.
There's `Absolute Magnitude`, which is the brightness of a star at a set distance. Then there is `Apparent Magnitude`, which is how bright the star appears from Earth, without correcting for distance.
Other things being equal, the farther the star, the less bright it will seem to us. Specifically, the apparent brightness is inversely proportional to the square of the distance.
Two factors that affect a star's apparent brightness are: 1) the distance between Earth and the star, and 2) the absolute magnitude (the actual brightness) of the star. Hope that helps :P
The brightness as seen from Earth is called the "apparent magnitude". The real brightness (defined as the apparent brightness, as seen from a standard distance) is called the "absolute magnitude".