OK, so the brightness of a star is supposed to be proportional to luminosity / distance squared. If a star's luminosity suddenly increases by a factor of 600, then the apparent brightness increases by a factor of 600 (since the distance is unchanged), so shouldn't the apparent magnitude change by 600 as well? Yet apparently the correct answer is that the apparent magnitude decreases by 7. Can someone please 'splain? Thanks.
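In case a worked number helps: the magnitude scale is logarithmic and runs backwards, with a flux ratio F2/F1 corresponding to a magnitude difference m2 - m1 = -2.5 * log10(F2/F1) (Pogson's relation). A quick Python sketch of that arithmetic, using the factor of 600 from the question:

import math

# Pogson's relation: a flux (brightness) ratio maps to a magnitude
# difference of m2 - m1 = -2.5 * log10(F2 / F1).
flux_ratio = 600.0  # luminosity up 600x at fixed distance -> flux up 600x
delta_m = -2.5 * math.log10(flux_ratio)
print(delta_m)  # about -6.94: the magnitude *decreases* by roughly 7

So a star that gets 600 times brighter has its apparent magnitude drop by about 7; the factor of 600 shows up inside the logarithm, not as the magnitude change itself.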