A Space & astronomy forum. SpaceBanter.com


ASTRO: OT Residual Bulk Image: its root cause and the best known method for curing it: and a comment about Thermal Diffusion image smear



 
 
January 19th 08, 07:30 PM, posted to alt.binaries.pictures.astro
Richard Crisp

I keep hearing of people having problems with Residual Bulk Image (RBI) in
their CCD cameras. I'd like to share some information with the group that
may be instructive to many.

I have prepared a one-page explanation of the phenomenon of trapping sites
at the interface between the epi and substrate in a typical front-illuminated
CCD.

The link is here:
http://www.narrowbandimaging.com/ima...mage_traps.jpg

It shows two diagrams: one is a band diagram overlaid on a cross section
of the CCD and the other is a plot of photon penetration depth as a function
of wavelength.
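To get a feel for what that penetration-depth plot implies, here is a rough sketch applying the Beer-Lambert law. The absorption depths are ballpark figures I have assumed for illustration, not values taken from the linked diagram:

```python
import math

# Approximate 1/e absorption depths in silicon at room temperature
# (microns). Ballpark assumptions for illustration only.
ABSORPTION_DEPTH_UM = {
    400: 0.2,    # blue: absorbed very near the surface
    550: 1.5,    # green
    650: 3.5,    # red
    800: 12.0,   # NIR: penetrates past a typical epi layer
    1000: 100.0, # deep NIR: much of the light reaches the substrate
}

def fraction_absorbed(depth_um, wavelength_nm):
    """Beer-Lambert fraction of photons converted within depth_um."""
    d = ABSORPTION_DEPTH_UM[wavelength_nm]
    return 1.0 - math.exp(-depth_um / d)

# Assuming a nominal 10 um epi layer: blue light converts almost
# entirely inside the epi, while most 1000 nm photons survive down to
# the epi/substrate interface region where the traps sit.
for wl in sorted(ABSORPTION_DEPTH_UM):
    print(wl, "nm:", round(fraction_absorbed(10.0, wl), 3))
```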

At the interface between the lightly doped p- epi device layer and the
heavily doped P+ substrate there is a zone of trapping sites that creates a
small well. That well is not influenced by the gate voltages like the
"wells" we think of in the CCD. Since the average depth of penetration of a
photon before interacting with the silicon is inversely related to
wavelength, the longer wavelength photons penetrate deeper into the silicon
than do the shorter wavelength photons. Where they finally interact with the
silicon is where the photoelectron is generated.
If the wavelengths are long enough, there is a statistical probability that
the generated photoelectron will be trapped in one of these interface traps
rather than being collected in the well. Once trapped, the charge carriers
will eventually escape from these trapping sites due to random thermal
motion, with a time constant that turns out to be exponentially related to
temperature.
Note that it is not necessary to have a saturated sensor to exhibit RBI.
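That exponential temperature dependence can be sketched with a toy Arrhenius-style model. The prefactor and trap energy below are placeholder numbers chosen only to show the shape of the dependence, not measurements of any particular sensor:

```python
import math

K_B_EV_PER_K = 8.617e-5  # Boltzmann constant, eV/K

def trap_hold_time_s(temp_k, tau0_s=1e-9, trap_energy_ev=0.4):
    """Toy Arrhenius model: tau = tau0 * exp(E_t / kT).
    tau0_s and trap_energy_ev are placeholder values, not measured
    parameters of any sensor."""
    return tau0_s * math.exp(trap_energy_ev / (K_B_EV_PER_K * temp_k))

# Cooling from +20 C to -40 C lengthens the hold time by well over an
# order of magnitude with these placeholder numbers.
print(trap_hold_time_s(293.15))  # +20 C
print(trap_hold_time_s(233.15))  # -40 C
```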

The longer the subsequent exposure taken after trapping photoelectrons in
these interface traps, the more charge will leak out into the following
frame. This is the source of RBI (Residual Bulk Image).
Since the desired photoelectrons are winding up in trapping sites instead of
the pixel's well, the QE is adversely affected. But as these interface traps
fill, more of that charge winds up in the pixel's wells. This results in a
variation of QE that affects the linearity of the camera and is bad for
photometry. The effect is called Quantum Efficiency Hysteresis.
If you run the sensor warmer to shorten the time constant of the trap
holding time, you are not avoiding the trapping; you are simply reducing the
time the charge remains trapped, and this comes to some extent at the
expense of your maximum exposure time (the commonly accepted upper limit in
the literature is reached when dark current shot noise exceeds read noise).
Additionally, there are hot pixels in the typical array that are thermally
activated, and at warmer temperatures there are more of them. Since running
the array warmer doesn't avoid the trapping and the associated QEH, but only
shortens the time constant for the subsequent leak-out, that leak-out can
produce tails in the image that may or may not be noticed by the naked eye
but that would show up in a CTE measurement. Any residual charge will
introduce error into photometry measurements, for example, so it should be
avoided.

The approach used in the Galileo and Cassini cameras was to fill the traps
prior to exposures and run the cameras cold to lengthen the time constant of
the subsequent leak-out. They want to run them cold anyway because of the
need for minimal dark current for their applications. Additionally, the
sensors "clean up" significantly at reduced temperatures. My KAF6303E has a
bad column that turns good somewhere between -38 and -42C, for example.
Cassini needed even colder temperatures to get the cosmetic quality they
required. The point is that you aren't limited in exposure time when you run
colder: the traps start out filled, remain filled throughout the exposure
and then are "topped off" again before the next exposure. They don't get
QEH, they don't get RBI and they can take arbitrarily long exposures.

The traps are filled by flooding the sensor with NIR light: about 100x over
full well is sufficient. Next the array needs to be flushed. Then the
exposure can be started.
So the procedure is Flood, Flush, Integrate.
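In control-software terms the sequence might look like the sketch below. The camera object and its method names (nir_flood, flush, expose) are hypothetical placeholders, as is the assumed full-well figure; substitute your driver's real calls. The 100x-over-full-well flood level follows the figure given above:

```python
FULL_WELL_E = 100_000   # assumed full-well capacity (electrons)
FLOOD_FACTOR = 100      # ~100x over full well, per the text above

def flood_flush_integrate(camera, exposure_s):
    """RBI mitigation sequence: Flood, Flush, Integrate.
    `camera` and its method names are hypothetical placeholders."""
    camera.nir_flood(target_electrons=FLOOD_FACTOR * FULL_WELL_E)  # Flood
    camera.flush()                                                 # Flush
    return camera.expose(exposure_s)                               # Integrate

# Minimal stand-in camera so the sketch runs end to end.
class MockCamera:
    def __init__(self):
        self.calls = []
    def nir_flood(self, target_electrons):
        self.calls.append("flood")
    def flush(self):
        self.calls.append("flush")
    def expose(self, exposure_s):
        self.calls.append("expose")
        return "frame"

cam = MockCamera()
frame = flood_flush_integrate(cam, 300.0)
print(cam.calls)  # the order matters: flood, then flush, then integrate
```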

Here are examples of non-saturated-sensor RBI, saturated-sensor RBI and the
complete elimination of RBI by following the procedure used by Cassini and
Galileo:

http://www.narrowbandimaging.com/rbi_page.htm

The procedure needs to be done for every sort of frame: lights, darks,
flats, etc.

Back-illuminated devices generally do not exhibit RBI because the bulk
substrate is etched away.

A sensor with a vertical antiblooming gate also doesn't exhibit RBI: the
signal goes into the substrate. But a standard front-side illuminated
full-frame CCD built on epitaxial wafers will exhibit it under the right
conditions. Those conditions are easily encountered with KAF series sensors
from Kodak in my experience (KAF3200ME, KAF6303E, KAF401E to name three I
have personally seen it occur on).


Thermal Diffusion Image Smear:

A related but different phenomenon results in smearing of images due to
thermal diffusion.

Again, with longer-wavelength light such as deep red to NIR, photon-to-charge
conversion may occur outside of the potential wells, depending on the
specifics of the junction engineering of the sensor. Since the charge
packets are in a field-free region of the substrate, they aren't confined to
a potential well and are free to move around under random thermal
diffusion. Some proportion of those photoelectrons will wind up being
captured in pixel wells, but not in the pixel they should land in. That
results in a smearing of the image that is observed at the longer
wavelengths but not the shorter wavelengths. So a NIR image may be smeared
or look out of focus while a green- or blue-filtered image will look normal.

This is an issue with the design of the sensor and the wafer fab process:
the more highly doped the device layer, the shallower the depletion regions
forming the wells; and the shallower the depletion regions, the more charge
carriers are created outside the wells, leading to more charge-diffusion
smearing and more potential RBI.
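That doping-vs-well-depth tradeoff can be sketched with the standard one-sided abrupt-junction depletion-width formula. The bias voltage and the doping levels below are illustrative assumptions, not parameters of any specific sensor:

```python
import math

EPS_SI_F_PER_CM = 11.7 * 8.854e-14  # silicon permittivity, F/cm
Q_COULOMB = 1.602e-19               # elementary charge, C

def depletion_width_um(doping_per_cm3, bias_v=5.0):
    """One-sided abrupt-junction depletion width:
    W = sqrt(2 * eps * V / (q * Na)).
    The 5 V bias and the doping levels used below are illustrative
    assumptions, not parameters of any specific sensor."""
    w_cm = math.sqrt(2.0 * EPS_SI_F_PER_CM * bias_v /
                     (Q_COULOMB * doping_per_cm3))
    return w_cm * 1e4  # cm -> um

# Lightly doped epi vs. heavily doped logic-process material: raising
# the doping 1000x shrinks the well depth by about sqrt(1000) ~ 32x,
# leaving far more of the NIR absorption volume field-free.
print(depletion_width_um(1e14))  # roughly 8 um
print(depletion_width_um(1e17))  # roughly 0.25 um
```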

While the RBI can be cured by operating the device as described above,
thermal diffusion can only be addressed by stopping carriers from being
generated outside the depletion region. Unfortunately that usually means an
IR cut filter.

Cheap CMOS sensors are especially bad about thermal-diffusion image
smearing: typically the wafer fab process has doping concentrations
optimized for logic products to avoid latchup, and that means
low-resistivity substrates/device layers (epi). Low resistivity is another
way to say highly doped, and highly doped means the wells do not extend very
deeply into the device layer. The end result is much greater image smear in
the red and longer wavelengths. Again, that's why IR cut filters are
typically used on CMOS-based DSLRs. There are other reasons to use them as
well, including avoiding the need to correct the lens over the full range of
silicon wavelength response. But the key reason that cheap CMOS sensors use
an IR cut filter is to avoid the charge diffusion problem.

Here is an example of image smear due to charge diffusion:

http://www.narrowbandimaging.com/field_free_a_page.htm

So if you want to image in the NIR, you may have some problems with
thermal diffusion if you are using Kodak KAF series sensors. A better choice
would be a sensor made on a high-resistivity bulk substrate that is
optimized for NIR response. If you have a back-illuminated sensor that is a
bit on the thicker side, then you can get reasonable QE and avoid both the
thermal diffusion and the RBI completely. It is helpful if it is a bit
thicker than ones optimized for blue response, since you need to accommodate
the deeper penetration depth of the longer-wavelength photons: you want them
to interact with the silicon and not pass completely through the sensor. And
of course it is easy to avoid the correction problems by using reflective
instead of refractive optics.

So if you have an interest in NIR imaging, I suggest not using a KAF sensor,
not using refractive optics and sticking with reflective optics and
back-illuminated sensors.





 



