February 22nd, 2007, 11:59 PM posted to sci.image.processing, sci.astro.ccd-imaging, comp.graphics.algorithms
Roberto Waltman
Matching images from different sources.


Looking for information, algorithms, etc. on how to match images of
the same object obtained from different sources.

(Also, what would be the proper terminology for this problem?
I'm sure I am describing it poorly here.)

For example, I may take pictures of a cloud formation using three
cameras sensitive to the visible, infrared, and ultraviolet spectra.
The cameras, although close to each other, may be far enough apart to
introduce parallax errors; they may have different resolutions; the
image captures may not be simultaneous, so the cloud shapes may change
slightly from one image to the next; etc.

By 'matching' I mean scaling and rotating the images so that they can
be overlaid in such a way that all the data in any area of the screen
is coming from the same 'region' in the physical world.
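
In other words, that amounts to estimating a similarity transform
(rotation, uniform scale, translation) between each pair of images and
warping one onto the other's pixel grid. A minimal sketch of that step,
assuming OpenCV in Python and hypothetical file names (feature matching
like this only works when the two bands share enough common detail,
which is not guaranteed across spectra):

  # A minimal sketch, assuming OpenCV and two roughly overlapping
  # grayscale images; "cam_a.png" / "cam_b.png" are hypothetical names.
  import cv2
  import numpy as np

  img_a = cv2.imread("cam_a.png", cv2.IMREAD_GRAYSCALE)  # reference
  img_b = cv2.imread("cam_b.png", cv2.IMREAD_GRAYSCALE)  # to be aligned

  # Detect and describe local features in both images.
  orb = cv2.ORB_create(2000)
  kp_a, des_a = orb.detectAndCompute(img_a, None)
  kp_b, des_b = orb.detectAndCompute(img_b, None)

  # Match descriptors and keep the best candidates.
  matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
  matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)[:200]

  pts_b = np.float32([kp_b[m.queryIdx].pt for m in matches])
  pts_a = np.float32([kp_a[m.trainIdx].pt for m in matches])

  # Estimate scale + rotation + translation (a similarity transform),
  # rejecting mismatches with RANSAC.
  M, inliers = cv2.estimateAffinePartial2D(pts_b, pts_a, method=cv2.RANSAC,
                                           ransacReprojThreshold=3.0)

  # Warp image B onto image A's pixel grid so the two can be overlaid.
  aligned_b = cv2.warpAffine(img_b, M, (img_a.shape[1], img_a.shape[0]))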

The matching process should be based only on the images themselves; I
may not have enough information about the cameras' physical locations
and orientations.

I understand that in the most general case the images could be so
different that this problem is unsolvable, but I still expect to be
able to find (partial) solutions when some minimal level of
correlation exists.
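
When the bands differ this much and no camera geometry is available,
feature descriptors often fail to correspond across images; an
intensity-statistics metric such as mutual information is a common
alternative, since it only assumes a statistical relationship between
the two bands rather than similar pixel values. A minimal sketch,
assuming SimpleITK and hypothetical file names:

  # A minimal sketch, assuming SimpleITK; "visible.png" / "infrared.png"
  # are hypothetical file names.
  import SimpleITK as sitk

  fixed  = sitk.ReadImage("visible.png", sitk.sitkFloat32)   # reference band
  moving = sitk.ReadImage("infrared.png", sitk.sitkFloat32)  # band to align

  # Start from a rough, center-aligned similarity transform
  # (rotation, isotropic scale, translation).
  initial = sitk.CenteredTransformInitializer(
      fixed, moving, sitk.Similarity2DTransform(),
      sitk.CenteredTransformInitializerFilter.GEOMETRY)

  reg = sitk.ImageRegistrationMethod()
  reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
  reg.SetMetricSamplingStrategy(reg.RANDOM)
  reg.SetMetricSamplingPercentage(0.2)
  reg.SetInterpolator(sitk.sitkLinear)
  reg.SetOptimizerAsRegularStepGradientDescent(
      learningRate=1.0, minStep=1e-4, numberOfIterations=300)
  reg.SetOptimizerScalesFromPhysicalShift()
  reg.SetInitialTransform(initial, inPlace=False)

  final_transform = reg.Execute(fixed, moving)

  # Resample the moving image onto the fixed image's grid for overlay.
  aligned = sitk.Resample(moving, fixed, final_transform,
                          sitk.sitkLinear, 0.0, moving.GetPixelID())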

Thanks,


Roberto Waltman

[ Please reply to the group,
return address is invalid ]