What happens if two stars have the same absolute magnitude?
For two stars with the same absolute magnitude but different apparent magnitudes, one star must be farther away than the other. Extinction (the absorption or scattering of light by interstellar matter) also affects apparent magnitude by making a star appear dimmer than its distance alone would predict.
What does it mean when two stars have the same magnitude?
apparent magnitude
Two objects that have the same apparent magnitude, as seen from the Earth, may either be at the same distance from the Earth with the same luminosity, or at different distances with luminosities that compensate: a more luminous object farther away can match a less luminous one nearby.
How could two stars have the same apparent magnitude but different absolute magnitudes?
Absolute magnitude measures how much light a star actually emits. Two stars can appear equally bright from Earth yet lie at different distances and have genuinely different luminosities: a dim star nearby can look just as bright as a luminous star far away.
What would be the distance of a star whose apparent magnitude and absolute magnitude were the same?
If the star is exactly 10 parsecs away (rare, but it does happen), the absolute magnitude will be the same as the apparent magnitude. The apparent magnitude is actually a good indicator of true luminosity. Thus, if m – M = 0, then the distance D = 10 pc.
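The relation described above is the distance modulus, m − M = 5 log₁₀(d) − 5, which can be inverted to get the distance in parsecs. A minimal sketch (the Sun's magnitudes used below are standard reference values):

```python
import math

def distance_pc(m, M):
    """Distance in parsecs from the distance modulus: m - M = 5*log10(d) - 5."""
    return 10 ** ((m - M + 5) / 5)

# When apparent and absolute magnitude are equal, the star is exactly 10 pc away.
print(distance_pc(4.83, 4.83))   # → 10.0

# The Sun (m = -26.74, M = 4.83) gives a tiny distance in parsecs,
# roughly 4.8e-6 pc, which is about one astronomical unit.
print(distance_pc(-26.74, 4.83))
```

Plugging in m = M makes the exponent (0 + 5)/5 = 1, so d = 10¹ = 10 pc, matching the statement above.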
Which star do we find most in the night sky?
Sirius is bright enough to dominate the night sky after sunset, particularly when no bright planets are up. It is so bright because Sirius is a blue-white star about twice the mass of our Sun and roughly 25 times as luminous, and it lies just 8.6 light-years from the Solar System, making it the seventh-closest star system to Earth.
What is absolute magnitude and what does it depend on?
An object’s absolute magnitude is defined to be equal to the apparent magnitude that the object would have if it were viewed from a distance of exactly 10 parsecs (32.6 light-years), without extinction (or dimming) of its light due to absorption by interstellar matter and cosmic dust. …
Which is brighter a star or a magnitude?
First confusing point: Smaller magnitudes are brighter! The magnitude scale was originally defined by eye, but the eye is a notoriously non-linear detector, especially at low light levels. So a star that is two magnitudes fainter than another is not twice as faint, but actually about 6 times fainter (6.31 to be exact).
Which is fainter a star or two magnitudes?
So a star that is two magnitudes fainter than another is not twice as faint, but actually about 6 times fainter (6.31 to be exact). Second confusing point: Magnitude is a logarithmic scale! A difference of one magnitude between two stars corresponds to a constant brightness ratio of about 2.512 (the fifth root of 100), so a difference of five magnitudes is a factor of exactly 100 in brightness.
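The logarithmic scale described above can be sketched in a few lines: a magnitude difference Δm corresponds to a flux ratio of 100^(Δm/5).

```python
def brightness_ratio(delta_m):
    """Flux ratio corresponding to a magnitude difference delta_m:
    defined so that 5 magnitudes = a factor of exactly 100."""
    return 100 ** (delta_m / 5)

print(round(brightness_ratio(1), 3))  # → 2.512 (one magnitude)
print(round(brightness_ratio(2), 2))  # → 6.31  (two magnitudes, as stated above)
print(brightness_ratio(5))            # → 100.0 (five magnitudes)
```

This is why two magnitudes is a factor of 2.512² ≈ 6.31, not a factor of 2.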
Which is brighter a red star or a blue star?
If a red star and a blue star appear equally bright and are the same distance from Earth, they must have the same luminosity. The blue star is hotter, and by the Stefan–Boltzmann law a hotter surface radiates much more energy per unit area, so the cooler red star must have the larger radius to emit the same total amount of light.
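The Stefan–Boltzmann reasoning can be made quantitative: since L = 4πR²σT⁴, equal luminosities imply R_red / R_blue = (T_blue / T_red)². A small sketch, using illustrative temperatures of 10,000 K for a blue star and 3,500 K for a red star (assumed values, not from the text):

```python
def radius_ratio(T_hot, T_cool):
    """For two stars of equal luminosity, L = 4*pi*R^2*sigma*T^4
    implies R_cool / R_hot = (T_hot / T_cool)**2."""
    return (T_hot / T_cool) ** 2

# Illustrative temperatures: a 10,000 K blue star vs. a 3,500 K red star.
print(round(radius_ratio(10000, 3500), 1))  # → 8.2
```

So the red star's radius would need to be roughly eight times larger than the blue star's for the two to match in luminosity.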
How did Hipparchus measure the brightness of the stars?
The Greek astronomer Hipparchus cataloged the stars in the night sky, defining their brightness in terms of magnitudes (m), where the brightest stars were first magnitude (m=1) and the faintest stars visible to the naked eye were sixth magnitude (m=6). First confusing point: Smaller magnitudes are brighter!