A Space & astronomy forum. SpaceBanter.com


Instantaneous vs. Time-Averaged Seeing



 
 
#1  December 4th 04, 10:05 AM
Mike Simmons

On Fri, 3 Dec 2004 19:02:23 +0000 (UTC), Pierre Vandevenne
wrote:

I have a very basic understanding of digital signal processing at the
mathematical level, and what is achieved here in terms of resolution is
apparently beyond what can be achieved in the framework of my limited
understanding.

I would appreciate it immensely if someone specialized in signal
processing could explain how this works.


I can't explain the math, Pierre, but it makes sense to me in terms of the
amount of information collected. The mistake is in thinking that the
CCD's five available pixels will always record the same thing. If
multiple images record the same information over and over, nothing is
gained by taking more images. If the camera and subject never move, so
that the pixels always record the same information, there is nothing to
be gained by combining the images -- average-combining them will give the
same result as a single image: a single value with no variance.

But in reality there is motion of the image (as well as real variance),
so you aren't just recording the same information over and over on the
few pixels you have available to record the light from Titan. You're
taking a new sample of the light coming from Titan each time, and each
sample is slightly different. The amount of information is increased not
by increasing the resolution of the instrument but by sampling multiple
times with the same instrument. Even if you recorded exactly the same
part of Titan on the same pixel each time (i.e., if there were zero
movement in the image), you would still be getting a distribution of
values from a sample rather than a single value, and that increases the
precision of the measurement, although it doesn't increase spatial
resolution. But you're not always going to get the same part of the image
recorded on the same pixel each time (especially in mediocre seeing), so
you're getting a spatial distribution as well. In a sense, one image lets
you see between the pixels of a previous image.
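This idea can be sketched numerically (a toy model with invented numbers, not anything measured from the Titan images): a five-pixel "camera" reads a fine-grained scene at random sub-pixel offsets, and spreading each frame's pixel values back onto the fine grid localizes a feature more precisely than any single frame can.

```python
import random

random.seed(1)

# Toy 1D "scene": 20 fine cells with a single bright feature at cell 9.
FINE = 20
scene = [0.0] * FINE
scene[9] = 1.0

PIX = 4              # one coarse pixel spans 4 fine cells
N_PIX = FINE // PIX  # the camera's five available pixels

def expose(shift):
    """Read the scene with the 5-pixel camera, offset by `shift` fine cells."""
    return [sum(scene[i] for i in range(p * PIX + shift, p * PIX + shift + PIX)
                if 0 <= i < FINE)
            for p in range(N_PIX)]

# A single frame only says "the light is somewhere inside pixel 2".
single = expose(0)

# Many frames at random sub-pixel offsets: spread each pixel's value back
# onto the fine grid at its shifted position, then average.
accum = [0.0] * FINE
hits = [0] * FINE
for _ in range(400):
    shift = random.randrange(PIX)   # the image wanders by 0..3 fine cells
    for p, v in enumerate(expose(shift)):
        for i in range(p * PIX + shift, p * PIX + shift + PIX):
            if 0 <= i < FINE:
                accum[i] += v / PIX
                hits[i] += 1

recon = [a / h if h else 0.0 for a, h in zip(accum, hits)]
```

The stacked reconstruction peaks at the feature's fine-grid position, even though each individual frame only places it somewhere within a 4-cell pixel.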

I used to do something similar with a manual 35mm SLR camera that had an
averaging behind-the-lens exposure meter. The meter was a single-element
detector, giving a reading averaged over the entire scene, i.e., zero
spatial resolution across the scene. But by moving the camera around and
watching the reading change I could get an idea of the variations of light
in two dimensions -- based on multiple readings with a zero-resolution
instrument -- and estimate what exposure was necessary for a particular
image element within the scene. Thus I appeared to be exceeding the
theoretical resolving limit (zero) of the meter. But that limit applies
to a single reading.
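The panning trick is easy to mimic in a toy model (invented numbers): a "meter" that returns only the average over its metering area still localizes a bright element once you scan it across the scene.

```python
# Toy model of the panning trick: the meter returns only the average
# brightness under it -- zero spatial resolution per reading.
scene = [1, 2, 9, 2, 1, 1]   # a bright image element at position 2

def meter(window):
    """One reading: the average over the metered area."""
    return sum(window) / len(window)

# One reading of the whole scene says nothing about *where* the light is.
overall = meter(scene)

# Pan a narrower metering area across the scene and watch the reading
# change: the sequence of zero-resolution readings finds the bright spot.
readings = [meter(scene[i:i + 3]) for i in range(len(scene) - 2)]
center_of_best = readings.index(max(readings)) + 1   # center of best window
```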

All of this has similarities to interferometry -- basically the same idea
of collecting more information and combining it -- but while it's been
explained to me by someone at an interferometry facility, my ability to
regurgitate it on demand is severely limited. The information was
collected (it made sense at the time) but scrambled in the poor "seeing"
of my brain, I guess.

And if this actually works, my next question will be, "why did those NASA
guys put their scope in space?"


There are larger telescopes on Earth, and with adaptive optics and other
techniques they can record finer resolution than the telescopes in space.
But the ones in space don't have weather or daytime, aren't limited to a
small area of the field for the best resolution, don't require good seeing
for diffraction-limited observing the way AO systems do, and don't have
skyglow from the atmosphere (I suppose imaging through the gegenschein is
limiting) -- among many other factors that make it harder to do these
things from Earth. They can take longer exposures and don't have to take
multiple exposures and stack and combine the images (maybe something
similar is still done?). The information arriving at the telescope in
space hasn't been spread around to where you have to go chasing it and
putting it back in order.

Mike Simmons
#2  December 4th 04, 05:22 PM
Tim Auton

"matt" wrote:
Mike Simmons wrote in message ...

[snip]
I can't explain the math, Pierre, but it makes sense to me in terms of the
amount of information collected. The mistake is in thinking that the
CCD's five available pixels will always record the same thing.

[snip]
the process of increasing the spatial resolution of a CCD sensor beyond
the limit dictated by pixel size is called dithering. It involves moving
the image in sub-pixel steps and taking multiple exposures at these
slightly changed positions.

For details see:
http://www.stsci.edu/instruments/wfpc2/drizzle.html


However, the Drizzle algorithm was developed for use with
*under*-sampled images. I don't think it claims resolution of high
spatial frequency detail beyond the limitations imposed by the optics
of the instrument (Rayleigh criterion, Dawes limit or Sparrow limit -
but that's another argument).

The Titan image apparently shows spatial resolution at a frequency
higher than the Sparrow limit of the instrument. Show me a
peer-reviewed paper that says that is possible in the general case and
I might believe it's not noise.


Tim
--
Foo.
#3  December 4th 04, 06:35 PM
Martin Brown

Tim Auton wrote:
"matt" wrote:

Mike Simmons wrote in message ...


I can't explain the math, Pierre, but it makes sense to me in terms of the
amount of information collected. The mistake is in thinking that the
CCD's five available pixels will always record the same thing.


the process of increasing the spatial resolution of a CCD sensor beyond
the limit dictated by pixel size is called dithering. It involves moving
the image in sub-pixel steps and taking multiple exposures at these
slightly changed positions.

For details see:
http://www.stsci.edu/instruments/wfpc2/drizzle.html


However the Drizzle algorithm was developed for use with
*under*-sampled images. I don't think it claims resolution of high
spatial frequency detail beyond the limitations imposed by the optics
of the instrument (Rayleigh criterion, Dawes limit or Sparrow limit -
but that's another argument).


No, but some of the deconvolution techniques in routine use for the past
couple of decades can and do. The worry is that they can also produce
artifacts if used improperly. They are a double-edged sword.

The Titan image apparently shows spatial resolution at a frequency
higher than the Sparrow limit of the instrument. Show me a
peer-reviewed paper that says that is possible in the general case and
I might believe it's not noise.


It isn't possible in the general case. But it is possible with good
signal-to-noise data on a high-contrast target with a well-qualified
point spread function. I think the first one is publicly accessible.

Please don't hammer ADS abstracts for downloads unless you really want
to see the deeper mathematical detail. It is not light reading and they
are big files.

See for example:

STSDAS Users Guide
http://stsdas.stsci.edu/documents/SUG/UG_29.html

Tim Cornwell's paper in A&A on the VLA Maxent deconvolution code VM
http://adsabs.harvard.edu/cgi-bin/np...1a5e6d34902655

NASA data analysis team peer reviewed article
http://adsabs.harvard.edu/cgi-bin/np...1a5e6d34902655

There *are* reasons to worry about the detail on a bright planetary disk
if the deconvolution code is allowed to overfit the data. It may cause
ringing effects on edge transitions that can lead to spurious artefacts.
And I suspect that some of these examples have been overcooked.

The bright ring round the left edge of Jupiter on the following example
is almost certainly due to overfitting the data (or algorithmic
instability).

http://www.buytelescopes.com/gallery...o.asp?pid=2073
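The failure mode described above is easy to reproduce in miniature. The sketch below uses my own toy numbers, not the Jupiter image: a step edge (a stand-in for a planetary limb) is blurred with a boxcar PSF. A plain inverse filter recovers the noiseless edge exactly, but add even a little noise and the frequencies where the transfer function is nearly zero get amplified enormously, rippling the reconstruction.

```python
import numpy as np

rng = np.random.default_rng(42)

# A sharp "limb": a step edge, like the bright edge of a planetary disk.
n = 64
truth = np.zeros(n)
truth[n // 2:] = 1.0

# Blur it with a 5-sample boxcar PSF (circular convolution via the FFT).
psf = np.zeros(n)
psf[:5] = 1.0 / 5.0
H = np.fft.fft(psf)
blurred = np.fft.ifft(np.fft.fft(truth) * H).real

# With *noiseless* data, dividing by the transfer function inverts the
# blur exactly (this PSF has no exact spectral zeros).
exact = np.fft.ifft(np.fft.fft(blurred) / H).real

# Add a little detector noise and try the same naive inversion: the
# frequencies where |H| is tiny get their noise amplified enormously,
# which appears as spurious ripples ("ringing") around the edge.
sigma = 5e-3
noisy = blurred + sigma * rng.standard_normal(n)
naive = np.fft.ifft(np.fft.fft(noisy) / H).real

# How much larger is the reconstruction error than the injected noise?
amplification = np.std(naive - truth) / np.std(noisy - blurred)
```

Regularized methods (maximum entropy, Richardson-Lucy with early stopping, Wiener filtering) exist precisely to tame this amplification; push them too hard and the ringing comes back.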

Regards,
Martin Brown
#4  December 4th 04, 07:35 PM
Thierry Legault

Yes! Drizzling works with under-sampled images, with a FWHM smaller than
2.0 pixels (and the benefits really appear under 1.5 pixels), and that
cannot be the case at the F-ratio used here.
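For scale (my illustrative numbers, not Thierry's): the diffraction-limited FWHM at the focal plane is roughly 1.03 λF, so whether a frame is under-sampled depends only on the F-ratio and the pixel pitch.

```python
# Back-of-the-envelope check of the sampling criterion. The values below
# are assumptions for illustration: visible light and a common CCD pitch.
WAVELENGTH_UM = 0.55   # green light, in microns
PIXEL_UM = 7.4         # assumed pixel pitch, in microns

def fwhm_pixels(f_ratio):
    """Approximate Airy-core FWHM in pixels: FWHM ~ 1.03 * lambda * F."""
    return 1.03 * WAVELENGTH_UM * f_ratio / PIXEL_UM

fast = fwhm_pixels(10)   # f/10: FWHM well under 1.5 px -> under-sampled
slow = fwhm_pixels(40)   # f/40 planetary work: several px -> over-sampled
```

At the long effective F-ratios used for planetary imaging the PSF already spans several pixels, so there is nothing left for drizzle to recover.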

"Martin Brown" wrote in message
news: ...
However the Drizzle algorithm was developed for use with
*under*-sampled images. I don't think it claims resolution of high
spatial frequency detail beyond the limitations imposed by the optics
of the instrument (Rayleigh criterion, Dawes limit or Sparrow limit -
but that's another argument).



#5  December 4th 04, 07:43 PM
Bill Meyers



Mike Simmons wrote:

(snip of a good post)

Mike,
I always look forward to your posts on s.a.a. Knowledgeable and sensible.
Ciao,
Bill Meyers


#6  December 5th 04, 12:05 AM
Brian Tung

Tim Auton wrote:
The Titan image apparently shows spatial resolution at a frequency
higher than the Sparrow limit of the instrument. Show me a
peer-reviewed paper that says that is possible in the general case and
I might believe it's not noise.


No paper, but a plausibility argument. The output image is the
source image (i.e., the object) convolved with the PSF of the instrument.
Construct the inverse of the PSF, and deconvolve. You should improve
your image beyond the ordinary resolution constraints.
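A minimal sketch of that plausibility argument (toy data, and a regularized Wiener-style inverse rather than a literal inverse of the PSF, since the literal one blows up on noise): two point sources closer together than the PSF width merge into one bump in the blurred image, and deconvolution separates them again.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x = np.arange(n)

# Two point sources 3 samples apart -- closer than the PSF is wide.
truth = np.zeros(n)
truth[30] = truth[33] = 1.0

# Circular Gaussian PSF, sigma = 2 samples (FWHM ~ 4.7 samples).
d = np.minimum(x, n - x)
psf = np.exp(-0.5 * (d / 2.0) ** 2)
psf /= psf.sum()
H = np.fft.fft(psf)

# Output image = source convolved with the PSF, plus a little noise.
blurred = np.fft.ifft(np.fft.fft(truth) * H).real
noisy = blurred + 1e-4 * rng.standard_normal(n)

# Deconvolve with a regularized inverse of the transfer function.
eps = 1e-6
deconv = np.fft.ifft(
    np.fft.fft(noisy) * np.conj(H) / (np.abs(H) ** 2 + eps)).real
```

As Brian notes, this only works while the noise and the PSF error stay small; the regularization constant `eps` is exactly the knob that trades recovered detail against amplified noise.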

Possible problems are that the PSF is not sufficiently well-known to do
this (I'm sure the noise level in the output is highly sensitive to
errors in estimating the PSF), that the object is too large and hence the
PSF varies too much over the field of view, that the seeing is too bad,
etc. I'm sure there's lots of stuff I'm not thinking of.

Brian Tung
The Astronomy Corner at http://astro.isi.edu/
Unofficial C5+ Home Page at http://astro.isi.edu/c5plus/
The PleiadAtlas Home Page at http://astro.isi.edu/pleiadatlas/
My Own Personal FAQ (SAA) at http://astro.isi.edu/reference/faq.txt
#7  December 5th 04, 08:04 PM
Pierre Vandevenne

Mike Simmons wrote in news:
On Fri, 3 Dec 2004 19:02:23 +0000 (UTC), Pierre Vandevenne


I can't explain the math, Pierre, but it makes sense to me in terms of
the amount of information collected. The mistake is in thinking that


Yes, intuitively I tend to 'feel' just as you do. It is nice to think
that one photon coming from here and one photon coming from there could be
spatially reorganized to augment the effective resolution. But it may also
simply be impossible. That's why I am looking for mathematically verifiable
stuff...

It's just like wheels appearing to rotate backwards: totally
counter-intuitive, but once you see the math, it is crystal clear.


---
Pierre Vandevenne - DataRescue sa/nv - www.datarescue.com
The IDA Pro Disassembler & Debugger - world leader in hostile code analysis
PhotoRescue - advanced data recovery for digital photographic media
latest review: http://www.pcmag.com/article2/0,1759,1590497,00.asp
#8  December 7th 04, 03:26 AM
William C. Keel

Mike Simmons wrote:
On Fri, 3 Dec 2004 19:02:23 +0000 (UTC), Pierre Vandevenne
wrote:


I have a very basic understanding of digital signal processing at the
mathematical level, and what is achieved here in terms of resolution is
apparently beyond what can be achieved in the framework of my limited
understanding.

I would appreciate it immensely if someone specialized in signal
processing could explain how this works.


....snip...

And if this actually works, my next question will be, "why did those NASA
guys put their scope in space?"


There are larger telescopes on Earth, and with adaptive optics and other
techniques they can record finer resolution than the telescopes in space.
But the ones in space don't have weather or daytime, aren't limited to a
small area of the field for the best resolution, don't require good seeing
for diffraction-limited observing the way AO systems do, and don't have
skyglow from the atmosphere (I suppose imaging through the gegenschein is
limiting) -- among many other factors that make it harder to do these
things from Earth. They can take longer exposures and don't have to take
multiple exposures and stack and combine the images (maybe something
similar is still done?). The information arriving at the telescope in
space hasn't been spread around to where you have to go chasing it and
putting it back in order.


There is still interesting processing to do on space images. One certainly
does want to combine multiple images -- if nothing else, to reject false
structures from cosmic-ray impacts in the detectors. This is why
individual Hubble exposures almost never go above half an orbit even when
something is visible longer -- the amount of the image compromised starts
to become too large. Standard procedure is to combine two with rejection
of wild pixels that are high in one, and for critical applications more.
As was done on the Deep Fields, there is also a gain to be had from
multiple observations slightly offset from one another, since the cameras
had to incorporate compromises between pixel sampling and field of view.
This technique, "drizzling", regains a bit of the resolution lost by
undersampling the PSF core. And, incidentally, it gives images which look
just beautiful when magnified to the level of the original pixels.
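Keel's two-exposure case in miniature (toy numbers of my own): a cosmic ray spikes a different pixel in each frame, and rejecting the value that is wildly high in only one frame amounts, for two frames, to a min-combine.

```python
import random

random.seed(3)

# A flat bit of sky plus one real bright source in the last two pixels.
scene = [10.0] * 8 + [50.0, 50.0]

def exposure(cr_pixel):
    """One frame: scene + read noise, with a cosmic-ray hit on one pixel."""
    frame = [v + random.gauss(0, 1) for v in scene]
    frame[cr_pixel] += 500.0   # the cosmic-ray spike
    return frame

a = exposure(2)   # cosmic ray lands on pixel 2 in the first frame...
b = exposure(6)   # ...and on pixel 6 in the second.

# Rejecting the pixel that is wildly high in only one frame is, for two
# frames, just keeping the lower of the two values at each pixel.
combined = [min(x, y) for x, y in zip(a, b)]
```

With three or more frames the same idea becomes median or sigma-clipped combining, which rejects the outliers without biasing every pixel low.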

So some things are still the same all over...

Bill Keel
#9  December 8th 04, 07:12 PM
Mike Simmons

On 6 Dec 2004 21:26:10 -0600, William C. Keel
wrote:

There is still interesting processing to do on space images...

So some things are still the same all over...


Thanks for the out-of-this-world reality for those of us stuck on Earth,
Bill.

Mike Simmons
 



