Old September 26th 06, 09:25 PM posted to sci.physics.research,sci.astro.research
[email protected]
Ranging and Pioneer

Oh No wrote:
Thus spake George Dishman

"Oh No" wrote in message
k...
Thus spake Spud
...
gr-qc/0104064

Many thanks. If I understand Anderson correctly it is merely an
engineering constraint. With a (perhaps greatly) amplified signal it
should be possible to reduce integration times to achieve correlation so
that the signal can be returned without a substantial range delay.


The problem was more fundamental. Reading between the lines,
and from some personal correspondence, the rate of change of
the carrier frequency was I believe limited in choice by the
design of the exciter or correlator equipment. The narrow
bandwidth required to get an acceptable SNR at the craft
meant that even the minimum available sweep rate was too
fast for the craft's receiver to follow, and the craft lost
lock on the uplink. Increased gain at the craft wouldn't help,
only more transmit power, but that was already in the hundreds
of kilowatts and being routed through their largest (70 m) dishes.
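The connection between a narrow loop bandwidth and a maximum tolerable sweep rate is standard PLL theory: a frequency ramp produces a steady-state phase error proportional to the ramp rate divided by the square of the loop's natural frequency, so the maximum trackable ramp scales roughly as the bandwidth squared. A rough sketch with illustrative numbers (the loop bandwidths and the rule of thumb are assumptions, not Pioneer's actual receiver parameters):

```python
import math

# Maximum trackable frequency ramp for a second-order PLL.
# Rule of thumb: lock is lost when the ramp rate (rad/s^2)
# approaches omega_n^2. All numbers here are illustrative.
def max_ramp_hz_per_s(loop_bw_hz, damping=0.707):
    # Natural frequency from the noise bandwidth B_L (Hz):
    # B_L = (omega_n / 2) * (damping + 1 / (4 * damping))
    omega_n = 2.0 * loop_bw_hz / (damping + 1.0 / (4.0 * damping))
    return omega_n ** 2 / (2.0 * math.pi)  # rad/s^2 -> Hz/s

for bw in (0.1, 1.0, 10.0):
    print(f"B_L = {bw:5.1f} Hz -> max ramp ~ {max_ramp_hz_per_s(bw):.3g} Hz/s")
```

The quadratic scaling is the point: narrowing the loop from 10 Hz to 0.1 Hz to dig the signal out of the noise cuts the tolerable sweep rate by a factor of ten thousand, which is why a fixed minimum sweep rate in the ground equipment could break lock.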


Considering that on Earth we pick up a signal of 8kW from Pioneer,


That is 8 W, not 8 kW. The anomaly is equivalent to
the radiation pressure of a signal of just 63 W; a
light bulb would do it.
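That 63 W figure is easy to check from published numbers: the anomalous acceleration a_P = 8.74e-10 m/s^2 and a spacecraft mass of roughly 241 kg are the commonly quoted Anderson et al. values, and a perfectly collimated beam is the simplifying assumption:

```python
# Radiation pressure needed to mimic the Pioneer anomaly.
# Assumes a perfectly collimated beam; a_P and the mass are
# the commonly quoted values, not derived here.
c = 2.998e8          # speed of light, m/s
a_P = 8.74e-10       # anomalous acceleration, m/s^2
mass = 241.0         # approximate spacecraft mass, kg

force = mass * a_P   # required force, N (~2.1e-7 N)
power = force * c    # equivalent collimated beam power, W

print(f"force = {force:.2e} N, power = {power:.1f} W")  # ~63 W
```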

it
doesn't sound to me as though the amplitude of the signal sent to
Pioneer was really too low. Loss of lock on frequency modulation sounds
interesting. It might be helpful to understand this more precisely. Can
you give more detail on how the signal is encoded?


AIUI they had a dedicated piece of equipment that
they used. It provided an output which was used to
modulate the uplink carrier and it also then looked
at the downlink frequency and tried to run a
correlation. Once it found a match, it would measure
the time delay.

I don't know the exact modulation scheme but I
believe it involved "ramping" the transmitter frequency,
possibly by generating commands to the local signal
generator rather than producing a frequency itself. The
ATDF format includes "ramp" records and that facility
was used as part of the signal acquisition strategy.

My guess is that linear frequency ramps were used,
possibly with variable ramp durations to resolve any
ambiguity.
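If linear ramps were indeed used, the range ambiguity works the same way as in FMCW radar: the round-trip delay maps to a frequency offset between the transmitted and received ramps, and one ramp period of delay is the ambiguity interval, so two different ramp durations pin down the true delay. A hypothetical sketch (the ramp periods are invented for illustration, not DSN values):

```python
# Range ambiguity for linear frequency ramps (FMCW-style).
# Ramp parameters are hypothetical, not Pioneer's actual values.
C = 2.998e8  # speed of light, m/s

def delay_from_offset(freq_offset_hz, ramp_rate_hz_per_s):
    """Round-trip delay implied by the Tx/Rx frequency offset."""
    return freq_offset_hz / ramp_rate_hz_per_s

def unambiguous_range_m(ramp_period_s):
    """One ramp period of delay -> maximum unambiguous one-way range."""
    return C * ramp_period_s / 2.0

# Two different ramp durations resolve the ambiguity: the true
# delay must be consistent with the offsets measured on both.
for period in (60.0, 75.0):
    print(f"T = {period:4.0f} s -> unambiguous range "
          f"{unambiguous_range_m(period) / 1.496e11:.3f} AU")
```

With short ramp periods the unambiguous range is a tiny fraction of the Earth-Pioneer distance, which is exactly why varying the ramp duration (or using the predicted ephemeris to pick the right cycle) would be needed.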

Anyway that is the basis on which I am working at the moment.

But I am not an engineer, and I was hoping this might be confirmed. My
arguments would take quite a different form if this was a fundamental
constraint.


It was purely an equipment limitation, possibly exacerbated
by radiation damage during the Jupiter flyby though that is
my speculation.


I am slightly sceptical about the equipment failure theory, because it
would mean the equipment on both Pioneers failed in the same way at
much the same place. I would also be a little surprised if radiation
from Jupiter were energetic enough to cause such damage. After all,
other spacecraft have flown close to the Sun and survived.


Jupiter is worse because the craft got much closer and,
at the time, the teams were shocked at the levels. Remember
the craft were some of the first to encounter these fields,
and subsequent craft have had much better protection
as a result.

Equipment limitation seems probable. It would be more satisfying if one
could identify a cause for the limitation, since clearly it was not
expected from the design parameters. Your response has caused me to
wonder if there is an unmodeled effect in the uplink which might
contribute to loss of lock. Clearly there is a Doppler variation in the
signal received by Pioneer due to the orbital motion of the Earth at
30,000 m/s, which must have been taken into account.


The effect from the rotation of the Earth is more significant
due to the shorter period.
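The relative sizes of the two effects can be sketched: orbital motion gives a large shift that varies over a year, while Earth rotation gives a much smaller shift that cycles daily and therefore has a far larger rate of change. The ~2.11 GHz S-band uplink frequency is the Pioneer value; the site latitude is an assumption, and the shifts are one-way:

```python
import math

C = 2.998e8         # speed of light, m/s
F_UPLINK = 2.11e9   # Hz, S-band uplink (approximate Pioneer value)

def doppler_hz(velocity_m_s, freq_hz=F_UPLINK):
    """One-way Doppler shift for line-of-sight velocity v << c."""
    return freq_hz * velocity_m_s / C

# Earth's orbital speed: ~30 km/s, varying over a year.
orbital = doppler_hz(30_000.0)

# Surface rotation speed at ~35 deg latitude (a Goldstone-like
# site -- the latitude is an assumption): 465 m/s at the equator.
v_rot = 465.0 * math.cos(math.radians(35.0))
diurnal = doppler_hz(v_rot)

# Peak rate of change of the diurnal term, d/dt of A*sin(omega*t).
omega_day = 2.0 * math.pi / 86164.0   # sidereal day, rad/s
print(f"orbital shift ~ {orbital / 1e3:.0f} kHz")
print(f"diurnal shift ~ {diurnal / 1e3:.1f} kHz, "
      f"peak rate ~ {diurnal * omega_day * 3600:.0f} Hz/hour")
```

The orbital shift is two orders of magnitude larger, but it changes over months; the diurnal term swings through its full range twice a day, which is why its rate of change dominates the tracking problem.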

But how narrow is the
bandwidth, and how accurately does this need to be predicted? Could an
unmodelled effect as small as ±2 m/s (varying annually) contribute to
loss of lock?


For the initial search I know the DSN uses FFTs over
a fairly wide range - 16 kHz from memory with a check
in adjacent bands. The diurnal Doppler is around 8 kHz
(and is known of course) but produces a rate of change
in the kHz-per-hour range. The FFT then told a PLL
where to look with an initial bandwidth in the low Hz,
which then dynamically reduced to as low as 0.01 Hz
as lock was obtained. However, the craft hardware
was probably much simpler. I'm sorry I can't help
more on that but I don't have details on the craft
receiver design.
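The acquisition sequence described above (coarse FFT search over the span, then hand the bin estimate to a tracking stage that refines it) can be sketched on synthetic data. All parameters here are illustrative, not DSN values, and the fine-grid correlation merely stands in for the PLL pulling into lock:

```python
import numpy as np

# Toy model of acquisition: coarse FFT search, then refinement.
fs = 16_000.0        # sample rate / search span, Hz (illustrative)
n = 4096
t = np.arange(n) / fs
f_true = 1234.5      # unknown carrier offset, Hz

rng = np.random.default_rng(0)
signal = np.exp(2j * np.pi * f_true * t)
noisy = signal + 0.5 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Step 1: FFT gives a coarse estimate, resolution fs/n (~3.9 Hz).
spectrum = np.fft.fft(noisy)
freqs = np.fft.fftfreq(n, 1.0 / fs)
coarse = freqs[np.argmax(np.abs(spectrum))]

# Step 2: refine by maximizing correlation on a fine grid around
# the coarse bin -- standing in for the PLL narrowing its bandwidth.
fine_grid = coarse + np.linspace(-fs / n, fs / n, 401)
power = [abs(np.vdot(np.exp(2j * np.pi * f * t), noisy)) for f in fine_grid]
fine = fine_grid[int(np.argmax(power))]

print(f"coarse = {coarse:.1f} Hz, fine = {fine:.2f} Hz (true {f_true} Hz)")
```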

Against that, the uplink power was 400 kW compared
to just 8 W on the downlink so I would have expected
a much stronger lock. Anyway the bottom line is that
I expect it was the rate of change of frequency that
would break the uplink lock. It's a shame they couldn't
hack the correlator box to run a smaller frequency
deviation but since it was probably dedicated hardware
in those days, it might not have been practical.

BTW, I am a digital and systems engineer in
a company working significantly in HF comms.


It's clear you know a lot about this kind of thing, and your input is
much appreciated.


Thanks, I don't pretend to know everything but I hope
it is helpful.

George