#1
Hi,
I have read that the people working behind the Mars missions appear to have successfully upgraded the Mars Relay link to transmit at 256 kilobits/sec, allowing approximately 250 megabits of data to be returned in a single relay session. (I'm speaking only of data rates from the Mars surface to Mars orbit.) I have also read several documents on the Net, including an old 1990s specification document for the Mars Global Surveyor relay antenna, which gave me the impression that the data rate was limited to around 128 kilobits per second.

This harkens back to a bandwidth-upgrade accomplishment that I remember well. In the 1990s, I recall the breakthroughs that were necessary to cram data through Galileo's low-gain antenna at Jupiter, after the high-gain antenna failed to open despite repeated attempts. If I remember correctly, the 16 bits/sec (max) low-gain link was upgraded well beyond spec via software to a 160 bits/sec (max) uncompressed rate, and various compression algorithms (both lossy and lossless) were implemented on top, achieving the equivalent of roughly 1 kilobit/sec on average. A genuine engineering accomplishment on a very difficult "cram as many elephants as possible through a drinking straw as quickly as possible" problem, so to speak.

So I would like to ask: how was the relay bandwidth upgrade accomplished, given that the original specification appears to have permitted only 128 kilobits per second? Was a 256 kilobits/sec mode provisioned in advance, just in case it turned out to be possible, or was this a software accomplishment that extended the spec while the probes were already in orbit? A new modulation/codec algorithm? Hand-optimized, clock-cycle-exact code? Modulating the power supplied to the antenna? Turning off other instruments to free up enough power for the higher data rate? (Just throwing out unorthodox ideas that might have been invented to exceed an antenna system's spec; I am not in the space industry myself, but I have a basic understanding of concepts like these.)

This leads to another interesting question: how much room is there for further improvement? Would an upgrade to 512 kilobits/sec or 1024 kilobits/sec be theoretically possible using any of the existing orbiters, especially if a future surface mission carried a steerable UHF dish antenna that tracked the relay satellite? Are the satellites flexible enough to support even higher data rates for future missions on any existing orbiter (Surveyor/Climate/Express as of 2003), assuming their missions were extended to cover such a theoretical future surface mission?

Don't be afraid to be technical; I'm a software engineer, although not in the space industry.

Mark Rejhon
www.marky.com (personal webpage)
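P.S. Just to sanity-check the numbers I quoted above, here is a quick back-of-the-envelope script. The 250-megabit session figure is the one from my sources; everything else is simple arithmetic, not anything official:

    # Rough relay-pass arithmetic (back-of-the-envelope, not from any official document).
    SESSION_DATA_BITS = 250e6  # ~250 megabits returned per relay session, as quoted above

    for rate_bps in (128e3, 256e3, 512e3, 1024e3):
        seconds = SESSION_DATA_BITS / rate_bps
        print(f"{rate_bps / 1e3:6.0f} kbps -> {seconds / 60:5.1f} minutes to move 250 Mbit")

At 256 kbps that works out to roughly 16 minutes of contact per pass, so doubling the rate again would either halve the pass time needed or double the data returned per pass, assuming the link could sustain it.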
#2
If nothing of the kind of upgrade you suggested has occurred, I believe the rate was accomplished with the Odyssey orbiter, as both the MER rovers and that orbiter fully support 256 kbps.

Sincerely,
Bjørn Ove
#4
In article , Mark Rejhon wrote:

> This harkens back to a bandwidth-upgrade accomplishment that I remember well.
> In the 1990s, I recall the breakthroughs that were necessary to cram data
> through Galileo's low-gain antenna at Jupiter, after the high-gain antenna
> failed to open despite repeated attempts. If I remember correctly, the
> 16 bits/sec (max) low-gain link was upgraded well beyond spec via software to
> a 160 bits/sec (max) uncompressed rate, and various compression algorithms
> (both lossy and lossless) were implemented on top, achieving the equivalent
> of roughly 1 kilobit/sec on average. A genuine engineering accomplishment on
> a very difficult "cram as many elephants as possible through a drinking straw
> as quickly as possible" problem, so to speak.

When they originally announced plans for the low-gain-antenna Galileo mission, three techniques were emphasized:

- improving reception by arraying DSN antennas and installing more sensitive receivers;
- lossless and lossy data compression for non-imaging and imaging data, respectively;
- better error-correcting coding, allowing higher data rates and lower error rates.

AIUI compression is pretty standard on contemporary missions.

Jon
__@/
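P.S. In round numbers, here is how those factors multiply up to the figures Mark quoted. Each individual factor below is an illustrative guess on my part; only the rough product is meant to line up with the 160 bps / ~1 kbps numbers upthread:

    # Illustrative only: how several modest gains multiply into a large one.
    # Every factor below is a round guess, not a mission figure.
    baseline_bps = 16.0  # original Galileo low-gain rate quoted upthread

    link_gains = {
        "arrayed DSN antennas / better receivers": 4.0,  # assumed
        "improved error-correcting coding": 2.5,         # assumed
    }
    compression_ratio = 6.0  # assumed average across lossless + lossy data

    raw_bps = baseline_bps
    for technique, factor in link_gains.items():
        raw_bps *= factor
        print(f"after {technique}: {raw_bps:.0f} bps on the link")

    effective_bps = raw_bps * compression_ratio
    print(f"with ~{compression_ratio:.0f}:1 compression: ~{effective_bps:.0f} bps science-equivalent")

The point is just that none of the tricks had to be spectacular on its own; the product of several modest improvements is what took the mission from 16 bps to a usable return rate.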
#5
Also, some things can be upgraded because the chips are EPROM (erasable programmable read-only memory) and similar parts that can be reprogrammed, within reason, much like how some newer BIOSes can be updated by a program.

Mike
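P.S. Purely to illustrate the "reprogrammable chip" idea, here is a made-up sketch. It is not based on any real flight software, just the general pattern of a rate table held in rewritable memory that an uplinked patch can extend, the way a BIOS flash changes settings on a PC:

    # Hypothetical sketch of a relay-rate table in rewritable (EPROM/flash-like) memory.
    # Not based on any actual spacecraft software.
    relay_rates_bps = [8_000, 32_000, 128_000]  # rates assumed to be baked in at launch

    def apply_uplinked_patch(rate_table, new_rate_bps):
        """Pretend 'reflash': add a new rate, assuming the radio hardware can
        actually keep up with it (the hard part in real life)."""
        if new_rate_bps not in rate_table:
            rate_table.append(new_rate_bps)
        rate_table.sort()
        return rate_table

    print(apply_uplinked_patch(relay_rates_bps, 256_000))  # [8000, 32000, 128000, 256000]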