January 17th 05, 08:38 PM
Peter Abrahams
robotic telescopes & machine learning


Computer control of telescopes is expanding to include machine
learning techniques. It is difficult to guess where this will lead, but
the incorporation of machine learning is making the field very interesting.

A recent journal issue is dedicated to the topic of robotic telescopes:
Astronomische Nachrichten 325:6-8 (2004), Special Issue: Third Potsdam
Think Shop on Robotic Astronomy.

Definitions used in this publication:

Automatic telescopes have a 'goto' system capable of automatic
acquisition of targets; examples include amateur Schmidt-Cassegrains,
which still require a person to confirm alignment and perform other tasks.

Automated telescopes include a computer capable of executing a night's
observing program or observations of a list of objects. These require
an operator to start them at dusk, stop them at dawn, and correct for
errors or incoming clouds. The instrument must find & acquire the
target, confirm it is the correct target (in a crowded field), focus,
and make the observation or measurement ... repeatedly.
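
As a rough illustration, here is a minimal sketch of such a night's
observing loop in Python, assuming hypothetical mount and camera hooks
(slew_to, plate_solve, autofocus, expose) that real control software
would provide in its own way:

# A minimal sketch of an automated night's observing loop; the hardware
# hooks are stand-ins, not a real telescope control interface.

import time

def run_night(targets, slew_to, plate_solve, autofocus, expose):
    """Work through a target list: acquire, confirm, focus, observe."""
    results = []
    for target in targets:
        slew_to(target["ra"], target["dec"])          # find & acquire
        if not plate_solve(target["ra"], target["dec"]):
            continue                                  # wrong field; skip it
        autofocus()                                   # focus
        results.append(expose(target["exposure_s"]))  # make the observation
    return results

# Example run with stub hardware functions, just to show the call pattern.
if __name__ == "__main__":
    targets = [{"ra": 83.82, "dec": -5.39, "exposure_s": 60}]
    run_night(
        targets,
        slew_to=lambda ra, dec: time.sleep(0.01),
        plate_solve=lambda ra, dec: True,
        autofocus=lambda: None,
        expose=lambda s: {"exposure_s": s, "image": None},
    )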

Remote telescopes are automated telescopes operated from a distant
location, and the systems include weather detection instruments & web
cams to determine observing conditions.
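
For the weather side of a remote installation, a minimal go/no-go check
might look like the following sketch; the sensor thresholds are made up
for illustration:

# A minimal sketch of a weather safety check for a remote telescope.
# Thresholds are invented placeholders, not recommendations.

def safe_to_observe(cloud_fraction, wind_kph, humidity_pct, rain_detected):
    """Return True only if all weather sensors report acceptable conditions."""
    if rain_detected:
        return False
    return cloud_fraction < 0.3 and wind_kph < 40 and humidity_pct < 85

print(safe_to_observe(cloud_fraction=0.1, wind_kph=12, humidity_pct=60,
                      rain_detected=False))   # True -> keep the dome open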

Robotic telescopes are unmanned: an operator initiates an observing
program and the telescope completes it. They are also described as
'autonomous'. There are levels of robotic autonomy. Generally the
operator is active at the beginning of a night, but less frequent
initiation is possible, and recovery from system errors might require
human intervention. Automatic scheduling involves selection of targets
based on an optimization determined by the operator's program; multiple
observing programs utilizing a single telescope involve prioritizing &
equitably distributing observing time.
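
A minimal sketch of that kind of automatic scheduling, assuming an
invented score that trades program priority against telescope time
already used (field names are illustrative, not from any real
scheduler):

# A minimal scheduling sketch: favour high-priority requests, but
# penalize programs that have already consumed a lot of telescope time,
# so time is distributed roughly equitably.

def pick_next(requests, time_used):
    """requests: dicts with 'program', 'target', 'priority' (higher is
    more urgent). time_used: seconds already spent per program."""
    def score(req):
        return req["priority"] - 0.001 * time_used.get(req["program"], 0.0)
    return max(requests, key=score) if requests else None

requests = [
    {"program": "A", "target": "SN candidate", "priority": 5.0},
    {"program": "B", "target": "variable star", "priority": 4.0},
]
# Program A has already used an hour tonight, so B wins this round.
print(pick_next(requests, time_used={"A": 3600.0, "B": 0.0}))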

Adaptive optics are automatic by nature, but system calibration is
critical & complex; the immediate use on robotic telescopes is to
shorten integration time on stellar sources.

Utility: classification of objects as they are observed; spectroscopy
(automated spectral analysis seems to be a large field now); photometry.
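
On the photometry side, a minimal sketch of the arithmetic an automated
pipeline might apply, converting counts to instrumental magnitudes and
differencing a target against a comparison star (the numbers are
invented):

# A minimal differential photometry sketch: instrumental magnitudes
# from count rates, then the target's offset from a comparison star.

import math

def instrumental_mag(counts, exposure_s):
    """Instrumental magnitude from total counts and exposure time."""
    return -2.5 * math.log10(counts / exposure_s)

target_counts, comp_counts = 48000.0, 120000.0
delta_mag = (instrumental_mag(target_counts, 60.0)
             - instrumental_mag(comp_counts, 60.0))
print(f"target is {delta_mag:+.3f} mag relative to the comparison star")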

Locations in Antarctica are claimed to have the best seeing on Earth;
these & other inhospitable sites are opened up by robotic telescopes.


Robotic telescopes are being developed using machine learning
technology. (One definition of 'machine learning': computer systems
acquire knowledge from previous performance & results, and improve their
performance over time. Raw data are externally supplied; training
examples are supplied by a previous stage of the process. Such systems
use pattern recognition software.)
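
To make the 'learn from previous results' idea concrete, here is a
minimal sketch of a nearest-neighbour classifier over two invented
image features; its predictions improve as confirmed examples are fed
back into its training set:

# A minimal pattern-recognition sketch: classify detections by the
# label of the nearest stored example, and learn from confirmed results.

import math

class NearestNeighbourClassifier:
    def __init__(self):
        self.examples = []        # list of (feature_vector, label)

    def train(self, features, label):
        """Add a confirmed example; later predictions use it."""
        self.examples.append((features, label))

    def predict(self, features):
        """Label of the closest stored example (Euclidean distance)."""
        return min(self.examples,
                   key=lambda ex: math.dist(features, ex[0]))[1]

clf = NearestNeighbourClassifier()
clf.train((0.9, 0.1), "star")       # invented features, e.g. roundness
clf.train((0.2, 0.8), "cosmic ray")
print(clf.predict((0.85, 0.15)))    # -> 'star'
clf.train((0.5, 0.5), "galaxy")     # feedback from a later confirmation
print(clf.predict((0.45, 0.55)))    # -> 'galaxy'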

These telescopes could recognize celestial transients (survey
operations); slew to fast transients such as gamma-ray bursts; and
monitor variations in persistent sources (recognizing changes as they
happen). 'Time domain astronomy' is a broad classification that
organizes events by their temporal characteristics. Nearly all
transients shorter than a few minutes in duration are terrestrial, so
target selection is a challenge.
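
A minimal sketch of that kind of duration cut on candidate transients;
the thresholds are invented for illustration:

# Keep only long-lived, significant brightenings; very short events are
# assumed terrestrial (satellites, aircraft, cosmic-ray hits).

def flag_transients(detections, min_duration_s=180.0, min_delta_mag=0.5):
    """detections: dicts with 'id', 'delta_mag' (brightening relative to
    a reference image) and 'duration_s' (time span the source was seen)."""
    return [
        d for d in detections
        if d["delta_mag"] >= min_delta_mag and d["duration_s"] >= min_duration_s
    ]

detections = [
    {"id": 1, "delta_mag": 2.0, "duration_s": 15.0},    # likely satellite glint
    {"id": 2, "delta_mag": 1.2, "duration_s": 900.0},   # plausible transient
]
print(flag_transients(detections))   # only id 2 survives the cuts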

Also: networks of autonomous robotic telescopes, possibly using diverse
instruments.
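
A minimal sketch of dispatching a target across such a network, picking
the site where the target currently stands highest above a minimum
altitude (site names and numbers are placeholders):

# Choose a site from a telescope network by current target altitude.

def dispatch(target_altitudes_deg, min_altitude_deg=30.0):
    """target_altitudes_deg: site name -> current target altitude (deg).
    Return the site with the highest altitude above the cut, or None."""
    visible = {site: alt for site, alt in target_altitudes_deg.items()
               if alt >= min_altitude_deg}
    return max(visible, key=visible.get) if visible else None

print(dispatch({"Chile": 65.0, "Canary Islands": 20.0, "Australia": 41.0}))
# -> 'Chile'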

Are there papers about the future development of telescopes combined
with machine learning?

--
=============================================
Peter Abrahams telscope.at.europa.dot.com
The history of the telescope and the binocular:
http://home.europa.com/~telscope/binotele.htm