In message , Robert writes:

> Ok, so the brightness of a star is supposed to be proportional to the
> luminosity / distance squared. If a star's luminosity suddenly
> increases by 600 then the apparent magnitude should be +600 (since the
> distance is unchanged), right? Yet apparently the correct answer is
> that the apparent magnitude decreases by 7. Can someone please
> 'splain? Thanks.

Magnitude is a logarithmic scale, not a linear one, and is defined such
that brighter objects have lower magnitudes. A decrease of 5 magnitudes
is defined as a 100-fold increase in brightness, i.e. 1 magnitude
corresponds to a factor of approximately 2.512 in brightness.
--
Stewart Robert Hinsley
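
To make the arithmetic explicit, here is a minimal sketch in Python of the
relation that follows from that definition (delta_m = -2.5 * log10 of the
brightness ratio); the function name magnitude_change is just illustrative:

import math

def magnitude_change(flux_ratio):
    """Change in apparent magnitude for a given brightness (flux) ratio.

    Pogson relation: delta_m = -2.5 * log10(F_new / F_old), so a brighter
    object (ratio > 1) gets a more negative, i.e. smaller, magnitude.
    """
    return -2.5 * math.log10(flux_ratio)

# A 600-fold jump in luminosity at fixed distance is a 600-fold jump in
# received flux, so the apparent magnitude decreases by roughly 7.
print(magnitude_change(600))   # about -6.94
print(magnitude_change(100))   # exactly -5.0, by definition

Running this gives -6.94 for a factor of 600, which is the "decreases by 7"
in the quoted answer after rounding.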
Thread | Thread Starter | Forum | Replies | Last Post |
Luminosity Functions | PoorRichard | Research | 5 | August 30th 07 08:43 AM |
ASTRO: NGC 281 Luminosity | Rick Johnson[_2_] | Astro Pictures | 0 | December 24th 06 11:24 PM |
apparent magnitude through scope | Peter Michelson | UK Astronomy | 5 | February 23rd 05 03:57 PM |
Apparent & absolute magnitude | Alexander Duerloo | Astronomy Misc | 9 | July 18th 03 06:52 AM |
Apparent & absolute magnitude | Alexander Duerloo | UK Astronomy | 0 | July 12th 03 05:24 PM |