

                   HOW WE GET PICTURES FROM SPACE

Since  the  first  cave  dweller ventured out to gaze up at the night
sky, people have sought to know more about the mysterious images  and
lights  seen  there.  Being  limited  by  what could be seen with the
unaided eye, that early stargazer relied on intellect and imagination
to depict the universe, etching images in stone  by  hand,  measuring
and  charting  the  paths  of the wanderers, and becoming as familiar
with the sky as the limited technology would allow.

Although stargazers frequently took the wrong paths in attempting  to
explain  what  they saw, many of them developed new tools to overcome
their limitations. Galileo crafted a fine telescope for observing the
heavens. His hand-drawn pictures of the satellites  of  Jupiter,  the
"cup  handles" of Saturn, and the phases of Venus, when combined with
the possible reasons for those facts, shook the very  foundations  of
the  European  society  in  the Middle Ages. Bigger and more powerful
telescopes, combined with even newer tools, such as spectroscopes and
cameras, have answered most of the questions of those ancient
stargazers. But in doing so, they have unfolded even newer mysteries.

Beginning  in  the  1960s, our view of the heavens reached beyond the
obscuring atmosphere of Earth as unmanned spacecraft carried  cameras
and  other  data  sensors  to probe the satellites and planets of the
Solar System. Images those spacecraft sent back to the Earth provided
startling clarity to details that are  only  fuzzy  markings  on  the
planets'  surfaces when seen from Earth-based telescopes. Only two of
the presently known planets, Neptune and Pluto, remain unexplored  by
our  cameras.  In  August  1989, Voyager 2 will snap several thousand
closeup frames of the  planet  Neptune  and  its  largest  satellite,
Triton. By the end of the 20th century, only Pluto will not have been
visited by one of our spacecraft.

The knowledge humans have today of outer space would astound Galileo.
Spacecraft  have  sent  back  pictures  of  a  cratered and moon-like
surface of the planet Mercury and revealed  circulation  patterns  in
the  atmosphere  of  Venus.  From Mars, they have sent back images of
craters, giant  canyons,  and  volcanoes  on  the  planet's  surface.
Jupiter's atmospheric circulation has been revealed, active volcanoes
on the Jovian moon Io have been shown erupting, and previously unknown
moons and a ring circling the planet have been discovered.
New moons were found orbiting Saturn and  the  Saturnian  rings  were
resolved  in  such  detail  that  over 1,000 concentric ring features
became apparent. At Uranus, Voyager sent back  details  of  a  planet
that  is  covered  by  a featureless, bluish-green fog. The planet is
encircled by rings darker than charcoal  and  shaped  by  shepherding
satellites,  accompanied  by five large satellites, and immersed in a
magnetic field.

Those discoveries, and thousands  of  others  like  them,  were  made
possible  through  the  technology  of  telemetry,  the  technique of
transmitting data by means of radio  signals  to  distant  locations.
Thus,  the  spacecraft  not  only  carries data sensors but must also
carry a telemetry system to convert the data from the various sensors
into radio pulses. These pulses are received by a huge  dish  antenna
here  on  Earth.  The  signals  are  relayed  to  data  centers where
scientists and engineers can convert the radio pulses back  into  the
data the sensors originally measured.

A camera system on board the spacecraft measures reflected light from
a planet or satellite as it enters the spacecraft's optical system. A
computer  converts  the  measurements  into numerical data, which are
transmitted to  a  receiver  on  Earth  by  radio  waves.  On  Earth,
computers reassemble the numbers into a picture.

Because  the  measurements  are taken point by point, the images from
space are not considered "true" photographs,  or  what  photographers
call a "continuous tone," but rather facsimile images composed of a
pattern of dots assigned various shades from white to black. The
facsimile image is much like the halftones newspapers use to recreate
photographs. (If you examine a newspaper photograph with a magnifying
glass, you will see that it is composed of many small, variously
shaded dots.)

Even more closely related to the way images are received  from  space
is  the  way  a  television  set  works. For a picture to appear on a
television set, a modulated electron beam rapidly illuminates long
rows of tiny phosphor dots, filling in one line and then the next
until a picture forms. These dots are called picture elements, or
pixels for short, and the screen surface where they are located is
called a raster. Raster scanning refers to the way the beam hits the
individual pixels at various intensities to recreate the original
picture.  Of course, scanning happens very  fast,  so  it  is  hardly
perceptible to the human eye. Images from space are drawn in much the
same manner on a television-like screen (a cathode-ray tube).

Although  cameras  on a spacecraft probing the Solar System have much
in common with those in television  studios,  they  also  have  their
share  of  differences.  For  one,  the space-bound cameras take much
longer to form and transmit an image. While  this  may  seem  like  a
disadvantage,  it  is  not.  The images produced by the slow-scanning
cameras are of a much higher quality and contain more than twice  the
amount of information present in a television picture.

The  most  enduring  image  gatherer  in space has been the Voyager 2
spacecraft. Voyager carries a dual television  camera  system,  which
can  be  commanded  to  view  an  object  with either a wide-angle or
telephoto lens. The system is mounted on a science platform that  can
be  tilted  in any direction for precise aiming. Reflected light from
the  object  enters  the  lenses  and  falls  on  the  surface  of  a
selenium-sulfur  vidicon  television  tube,  11 millimeters square. A
shutter in the camera controls the amount of light reaching the  tube
and can vary exposure times from 0.005 second for very bright objects
to  15  seconds  or  longer  when searching for faint objects such as
unknown moons.

The vidicon tube temporarily holds the image on its surface until  it
can  be  scanned  for  brightness  levels. The surface of the tube is
divided into 800 parallel lines, each containing 800 pixels, giving a
total of 640,000. As each pixel is  scanned  for  brightness,  it  is
assigned a number from 0 to 255.
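
In modern terms, each digitized frame amounts to an 800-by-800 grid of
brightness numbers. A minimal sketch of that idea, written here in
Python purely for illustration (the names and the rounding rule are
conveniences, not Voyager's actual flight software):

    # One frame treated as an 800 x 800 grid of 8-bit brightness values.
    WIDTH, HEIGHT = 800, 800                 # pixels per line, lines per frame

    def quantize(brightness):
        # Map a measured brightness between 0.0 and 1.0 onto the range 0-255.
        return max(0, min(255, round(brightness * 255)))

    frame = [[quantize(0.0)] * WIDTH for _ in range(HEIGHT)]
    print(WIDTH * HEIGHT)                    # 640000 pixels, as described above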

The  range  (0  to 255) was chosen because it coincides with the most
common counting unit in computer systems, a unit called  a  byte.  In
computers,  information  is  stored in bits and bytes. The bit is the
most fundamental counting or storage unit, while a byte is  the  most
useful  one.  A bit contains one of two possible values, and can best
be thought of as a tiny on-off switch on  an  electrical  circuit.  A
byte,  on  the  other hand, contains the total value represented by 8
bits. The value can be interpreted in many ways, such as a  numerical
value,  an  alphabet  character  or symbol, or a pixel shaded between
black and white. In a byte, the position of  each  bit  represents  a
counting power of 2. (By convention, bit patterns are read from right
to left.) Thus, the first bit (the rightmost bit) of the eight-bit
sequence represents 2 to the zero power, the second bit represents 2
to the first power, and so on. For each bit in a byte that has a one in
it, you add the value of that power of two (the sequence value) until
all eight bits are counted.  For example, if the  byte  has  the  bit
value of 00101101, then it represents the number 45. The binary table
at  the  end of this document shows how translation of bits and bytes
to numbers is done.
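
That arithmetic is easy to check. A few illustrative lines of Python
reproduce the worked example:

    # Each bit position counts a power of two, reading the pattern right to left.
    bits = "00101101"
    value = sum(2 ** power
                for power, bit in enumerate(reversed(bits))
                if bit == "1")
    print(value)                 # 45
    print(int(bits, 2))          # 45, using Python's built-in binary conversion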

If all the bits in an eight-bit  sequence  are  ones,  then  it  will
correspond  to  the  value 255. That is the maximum value that a byte
can count to. Thus, if a byte is used to represent shades of gray  in
an  image,  then by convention the lowest value, zero, corresponds to
pure black, while the highest value, 255, corresponds to pure white.
All other values are intermediate shades of gray.

When  the  values  for  all  the  pixels have been assigned, they are
either sent directly to a receiver on Earth  or  stored  on  magnetic
tape to be sent later. Data are typically stored on tape on board the
spacecraft when the signals are going to be temporarily blocked, such
as  when  Voyager  passes  behind  a  planet or a satellite. For each
image, and its total of 640,000 pixels, 5,120,000 bits of  data  must
be  transmitted  (640,000  x  8). When Voyager flew close to Jupiter,
data were transmitted back to Earth at a rate of  more  than  100,000
bits  per  second.  This  meant  that  once  data  began reaching the
antennas on Earth's surface,  information  for  complete  images  was
received in about 1 minute for each transmission.
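
The arithmetic behind that figure is straightforward; using the round
100,000 bits-per-second rate quoted above, a short illustrative
calculation works it out:

    pixels_per_image = 800 * 800               # 640,000 pixels in one frame
    bits_per_image   = pixels_per_image * 8    # 5,120,000 bits in one frame
    bits_per_second  = 100_000                 # round figure for the Jupiter data rate
    print(bits_per_image / bits_per_second)    # 51.2 seconds, roughly 1 minute per image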

As  the  distance of the spacecraft from Earth increases, the quality
of the radioed data stream decreases and the rate of transmission  of
data  has  to  be  slowed  correspondingly.  Thus, at the distance of
Uranus, the data have to be transmitted some six to eight times slower
than could be done at Jupiter. That means that only one  picture  can
be  transmitted  in  the  time  six  pictures  were taken at Jupiter.
However, for the Uranus encounter, scientists and engineers devised a
scheme to get around that limitation.  The  scheme  was  called  data
compression.

To  do  that,  they  reprogrammed the spacecraft en route. Instead of
having Voyager transmit the full 8 bits for each pixel, its computers
were instructed to send back only the differences between  brightness
levels of successive pixels. That reduced the data bits needed for an
image  by  about 60 percent. Slowing the transmission rate meant that
noise did not interfere with the image reception, and by  compressing
the data, a full array of striking images was received. The computers
at  NASA's  Jet  Propulsion  Laboratory  (JPL)  restored  the correct
brightness  to  each  pixel,  producing  both   black-and-white   and
full-color images.
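
The principle can be sketched in a few lines. The toy encoder and
decoder below, written in Python purely for illustration (Voyager's
flight software and JPL's decoding programs were, of course, far more
elaborate), show how the difference between neighboring pixels is
usually a small number that needs fewer bits than a full 8-bit
brightness value:

    def encode_differences(pixels):
        # Keep the first pixel in full; replace each later pixel with its
        # change from the pixel before it.
        encoded = [pixels[0]]
        for previous, current in zip(pixels, pixels[1:]):
            encoded.append(current - previous)
        return encoded

    def decode_differences(encoded):
        # Rebuild the original brightness values by adding the differences back up.
        pixels = [encoded[0]]
        for difference in encoded[1:]:
            pixels.append(pixels[-1] + difference)
        return pixels

    line = [120, 121, 121, 119, 118, 118, 120]
    assert decode_differences(encode_differences(line)) == line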

The  radio  signals  that a spacecraft such as Voyager sends to Earth
are received by a system of large dish antennas called the Deep Space
Network (DSN). The DSN  is  designed  to  provide  command,  control,
tracking  and  data  acquisition  for deep space missions. Configured
around the globe at locations approximately 120  degrees  apart,  DSN
provides 24-hour line-of-sight coverage.

Stations  are  located  at  Goldstone,  California,  and near Madrid,
Spain, and Canberra,  Australia.  The  DSN,  managed  by  NASA's  Jet
Propulsion  Laboratory  in  Pasadena,  California,  consists of three
64-meter  (210-ft)  diameter  dish-shaped  antennas,   six   34-meter
(111-ft)  diameter  antennas, and three 26-meter (85-ft) antennas. As
antennas at one  station  lose  contact,  due  to  Earth's  rotation,
antennas  at  the next station rotate into view and take over the job
of receiving spacecraft data. While one station is  tracking  a  deep
space  mission,  such  as  Voyager,  the  other two are busy tracking
spacecraft elsewhere in the sky.

During Voyager's contact with Saturn, the DSN recovered more than  99
percent of the 17,000 images transmitted. That accomplishment required
the  use of a technique known as "antenna arraying." Arraying for the
Saturn encounter was accomplished by  electronically  adding  signals
received  by two antennas at each site. Because of the great distance
Uranus is from the Earth, the signal received from Voyager 2 was only
one-fourth as strong as  the  signal  received  from  Saturn.  A  new
arraying  technique,  which  combined signals from four antennas, was
used during the Uranus encounter to allow up to 21,600 bits  of  data
to be received each second.
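
The idea behind arraying can be illustrated with a toy example. The
sketch below, in Python with invented numbers (the DSN's real signal
combining is far more sophisticated), simply averages several noisy
copies of the same waveform, sample by sample, so that the common
signal stands out while the independent noise partly cancels:

    import random
    random.seed(1)

    signal = [1.0, -0.5, 0.25, 0.75] * 64            # a made-up reference waveform

    def received_copy(s, noise_level=1.0):
        # What one antenna might record: the signal plus its own random noise.
        return [sample + random.gauss(0.0, noise_level) for sample in s]

    copies = [received_copy(signal) for _ in range(4)]        # four antennas
    combined = [sum(samples) / len(copies) for samples in zip(*copies)]
    # 'combined' follows the true waveform more closely than any single copy does.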

Arraying's   biggest  payoff  came  in  Australia,  whose  government
provided its Parkes Radio Astronomy Observatory 64-meter  antenna  to
be  linked  with  the  DSN's three-antenna complex near Canberra. The
most critical events of the encounter, including Voyager's closest
approaches  to Uranus and its satellites, were designed to occur when
the spacecraft would be transmitting to the complex in Australia. The
data were successfully relayed to JPL through that array.

The DSN was able to  track  Voyager's  position  at  Saturn  with  an
accuracy of nearly 150 kilometers (about 90 miles) during its closest
approach.   This   accuracy  was  achieved  by  using  the  network's
radiometric system, the spacecraft's cameras, and a technique  called
Very  Long  Baseline  Interferometry,  or  VLBI.  VLBI determines the
direction  of  the  spacecraft  by  precisely  measuring  the  slight
difference  between  the time of arrival of the signal at two or more
ground antennas. The same technique was used at  Uranus  to  aim  the
spacecraft so accurately that the deflection of its trajectory caused
by the planet's gravity would send it on to Neptune.
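
The geometry behind VLBI reduces to a single triangle. The simplified
calculation below, in Python with invented numbers, ignores the many
corrections a real measurement requires; it shows only how an
arrival-time difference fixes the angle between the incoming signal
and the line joining two antennas:

    import math

    c        = 299_792_458.0     # speed of light, meters per second
    baseline = 8.0e6             # separation of two ground antennas, meters (invented)
    delay    = 0.0133            # difference in signal arrival times, seconds (invented)

    # The extra path length to the farther antenna is c * delay; dividing by
    # the baseline gives the sine of the arrival angle measured from the
    # plane perpendicular to the baseline.
    angle = math.degrees(math.asin(c * delay / baseline))
    print(round(angle, 1))       # about 29.9 degrees for these example numbers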

When  the  DSN  antennas receive the information from the spacecraft,
computers at the Jet Propulsion Laboratory store it  for  future  use
and reassemble it into images. To recreate a picture from data that
have been sent across the vacuum of space, computers read the data bit
by bit, calculating the values for each pixel and converting the value
into a small square of light. The squares are displayed on a
television screen on Earth. The resulting image is a
black-and-white facsimile of the object being measured.

Color images can be made by taking three  black-and-white  frames  in
succession  and  blending  ("registering") them on one another in the
three color-planes of a television screen. In order for that to work,
however, each of the three frames has to be taken by  the  camera  on
board the spacecraft through different filters. On Voyager, one frame
is  taken through a blue filter, one through a green, and one through
an orange.
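
A rough sketch of that reconstruction, in Python for illustration only
(the function name and data layout are conveniences for this example),
treats each frame as a grid of 0-to-255 values and uses the
orange-filter frame as the red plane of the final picture:

    def combine_color(orange_frame, green_frame, blue_frame):
        # Stack three registered black-and-white frames as the red, green,
        # and blue planes of one color image.
        height, width = len(orange_frame), len(orange_frame[0])
        return [[(orange_frame[y][x], green_frame[y][x], blue_frame[y][x])
                 for x in range(width)]
                for y in range(height)]

    # A tiny 2 x 2 example in place of full 800 x 800 Voyager frames.
    orange = [[200, 180], [160, 140]]
    green  = [[100,  90], [ 80,  70]]
    blue   = [[ 50,  40], [ 30,  20]]
    print(combine_color(orange, green, blue)[0][0])   # (200, 100, 50)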

Filters have varying effects on the amount of light  being  measured.
For example, light passing through a blue filter will favor the blue
values in the image, making them appear brighter or more transparent,
whereas red or orange values will appear much darker than normal. On
Earth the three images  are  given  the  appropriate  colors  of  the
filters through which they were measured and then blended together to
give a color image.

An  important  feat  the interplanetary spacecraft must accomplish is
focusing on its target while  traveling  at  extremely  high  speeds.
Voyager sped past Uranus at more than 40,000 miles an hour. To get an
unblurred  image,  the  cameras  on board had to steadily track their
target while the camera shutters were open. The technique to do this,
called  image-motion  compensation,  involves  rotating  the   entire
spacecraft  under  the  control  of  the  stabilizing gyroscopes. The
strategy was used successfully both at Saturn's satellite Rhea and at
Uranus.  Both  times,   cameras   tracked   their   targets   without
interruption.

Once  the  image is reconstructed by computers on Earth, it sometimes
happens that objects appear nondescript  or  that  subtle  shades  in
planetary  details  such  as  cloudtops cannot be discerned by visual
examination alone. This can be overcome, however, by adding a final
"contrast enhancement" step to the processing. Contrast enhancement
is like adjusting the contrast and brightness controls on a television
set. Because the shades of the image are broken down into picture
elements, the computer can increase or decrease the brightness values
of individual pixels, thereby exaggerating their differences and
sharpening even the tiniest details.

For  example,  suppose  a  portion  of  an  image returned from space
reveals an area of subtle gray tones. Data from the computer indicate
that the brightness values range between 98 and 120 and are fairly
evenly distributed. To the unaided eye, the portion appears as a
blurred gray patch because the shades are too similar to be discerned.
To eliminate this visual handicap, the brightness values can be
assigned new numbers. The shades can be spread farther apart, say five
shades apart rather than the single step that separates them now.
Because the data are already stored on
computers, it is a fairly  easy  task  to  isolate  the  twenty-three
values and assign them new ones: 98 could be assigned 20, 99 assigned
25,  and so on. The resulting image is "enhanced" to the unaided eye,
while the information is the same accurate data transmitted from  the
vicinity of the object in space.
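
A remapping of that kind is easy to state in code. The sketch below,
in Python for illustration only (it is not the actual procedure used
at JPL), follows the example above: 98 becomes 20, 99 becomes 25, and
so on:

    def stretch(pixel, low=98, base=20, step=5):
        # Spread a narrow band of gray levels onto a wider one: 98 -> 20, 99 -> 25, ...
        return base + (pixel - low) * step

    patch = [98, 103, 110, 117, 120]
    print([stretch(p) for p in patch])    # [20, 45, 80, 115, 130]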

The  past  25 years of space travel and exploration have generated an
unprecedented quantity of data from planetary systems.  Images  taken
in  space and telemetered back to Earth have greatly aided scientists
in formulating better and more accurate theories about the nature and
origin of our Solar System. Data gathered at close  range,  and  from
above  the  distorting  effects of Earth's atmosphere, produce images
far more detailed than pictures taken by even the largest Earth-bound
telescopes.

In our search to understand the world as  well  as  the  universe  in
which  we live, we have in one generation reached farther than in any
other generation before us.  We  have  overcome  the  limitations  of
looking  from  the surface of our planet and have traveled to others.
Whatever yearning drew those first stargazers from  the  security  of
their  caves  to  look up at the night sky and wonder still draws men
and women to the stars.

_____________________________________________________________________



BINARY TABLE

Bit of Data         8     7     6     5     4     3     2     1
----------------------------------------------------------------------
Sequence Value    128    64    32    16     8     4     2     1
Binary Value        0     0     1     0     1     1     0     1
Byte Value          0    +0   +32    +0    +8    +4    +0    +1  = 45



Sequence Value    128    64    32    16     8     4     2     1
----------------------------------------------------------------------
Brightness Values               Binary Values
----------------------------------------------------------------------
  0   (black)       0     0     0     0     0     0     0     0
  9   (dark gray)   0     0     0     0     1     0     0     1
 62   (gray)        0     0     1     1     1     1     1     0
183   (pale gray)   1     0     1     1     0     1     1     1
255   (white)       1     1     1     1     1     1     1     1

______________________________________________________________________



BRIEF HISTORY OF PICTURES BY UNMANNED SPACECRAFT

NAME:     Pioneer 4
YEAR:     1959
MISSION:  Moon: measured particles and fields in a flyby, entered
          heliocentric orbit.

NAME:     Ranger 7
YEAR:     1964
MISSION:  Moon: 4,316 high-resolution TV pictures of Sea of Clouds;
          impacted.

NAME:     Ranger 8
YEAR:     1965
MISSION:  Moon: 7,137 pictures of Sea of Tranquility; impacted.

NAME:     Ranger 9
YEAR:     1965
MISSION:  Moon: 5,814 pictures of Crater Alphonsus; impacted.

NAME:     Surveyor 1
YEAR:     1966
MISSION:  Moon: 11,237 pictures, soft landing in Ocean of Storms.

NAME:     Surveyor 3
YEAR:     1967
MISSION:  Moon: 6,315 pictures, first soil scoop; soft landed in Sea
          of Clouds.

NAME:     Surveyor 5
YEAR:     1967
MISSION:  Moon: more than 19,000 pictures; first alpha scatter
          analyzed chemical structure; soft landed in Sea of
          Tranquility.

NAME:     Surveyor 6
YEAR:     1967
MISSION:  Moon: 30,065 pictures; first lift off from lunar surface,
          moved ship 10 feet, soft landed in Central Bay region.

NAME:     Surveyor 7
YEAR:     1968
MISSION:  Moon: returned television pictures, performed alpha scatter,
          and took surface sample; first soft landing on ejecta
          blanket beside Crater Tycho.

NAME:     Lunar Orbiter 1
YEAR:     1966
MISSION:  Moon: medium and high-resolution pictures of 9 possible
          landing sites; first orbit of another planetary body;
          impacted.

NAME:     Lunar Orbiter 2
YEAR:     1966
MISSION:  Moon: 211 frames (422 medium and high-resolution pictures);
          impacted.

NAME:     Lunar Orbiter 3
YEAR:     1967
MISSION:  Moon: 211 frames including picture of Surveyor 1 on lunar
          surface; impacted.

NAME:     Lunar Orbiter 4
YEAR:     1967
MISSION:  Moon: 167 frames; impacted.

NAME:     Lunar Orbiter 5
YEAR:     1967
MISSION:  Moon: 212 frames, including 5 possible landing sites and
          micrometeoroid data; impacted.

NAME:     Mariner 4
YEAR:     1964
MISSION:  Mars: 21 pictures of cratered moon-like surface, measured
          planet's thin, mostly carbon dioxide atmosphere; flyby.

NAME:     Mariners 6 and 7
YEAR:     1969
MISSION:  Mars: verified atmospheric findings: no nitrogen present,
          dry ice near polar caps; both flybys.

NAME:     Mariner 9
YEAR:     1971
MISSION:  Mars: 7,400 pictures of both satellites and planet's
          surface; orbited.

NAME:     Mariner 10
YEAR:     1973
MISSION:  First multiple planet encounter.

          Venus: first full-disc pictures of planet; ultraviolet
          images of atmosphere, revealing circulation patterns;
          atmosphere rotates more slowly than planetary body; flyby.

          Mercury: pictures of moon-like surface with long, narrow
          valleys and cliffs; flyby; three Mercury encounters at
          6-month intervals.

NAME:     Pioneer 10
YEAR:     1972
MISSION:  Jupiter: first close-up pictures of Great Red Spot and
          planetary atmosphere; carries plaque with intergalactic
          greetings from Earth.

NAME:     Pioneer 11 (Pioneer Saturn)
YEAR:     1973
MISSION:  Jupiter: pictures of planet from 42,760 km (26,725 mi) above
          cloudtops; only pictures of polar regions; used Jupiter's
          gravity to swing it back across the Solar System to Saturn.

          Saturn: pictures of planet as it passed through ring plane
          within 21,400 km (13,300 mi) of cloudtops; new discoveries
          were made; spacecraft renamed Pioneer Saturn after leaving
          Jupiter.

NAME:     Pioneer Venus 1
YEAR:     1978
MISSION:  Venus: studied cloud cover and planetary topography;
          orbited.

NAME:     Pioneer Venus 2
YEAR:     1978
MISSION:  Venus: multiprobe, measuring atmosphere top to bottom;
          probes were designed to impact the surface but continued to
          return data for 67 minutes.

NAME:     Viking 1
YEAR:     1975
MISSION:  Mars: first surface pictures of Mars as well as color
          pictures; landed July 20, 1976; remained operating until
          November 1982.

NAME:     Viking 2
YEAR:     1975
MISSION:  Mars: showed a red surface of oxidized iron; landed
          September 3, 1976.

NAME:     Voyager 1
YEAR:     1977
MISSION:  Jupiter: launched after Voyager 2 but on a faster
          trajectory; took pictures of Jupiter's rapidly changing
          cloudtops; discovered ring circling planet, active volcano
          on Io, and first moons with color: Io, orange; Europa,
          amber; and Ganymede, brown; flyby.

          Saturn: pictures showed atmosphere similar to Jupiter's, but
          with many more bands and a dense haze that obscured the
          surface; found new rings within rings; increased known
          satellite count to 17; flyby.

NAME:     Voyager 2
YEAR:     1977
MISSION:  Jupiter: color and black-and-white pictures to complement
          Voyager 1; time-lapse movie of volcanic action on Io; flyby.

          Saturn: cameras with more sensitivity resolved ring count to
          more than 1,000; time-lapse movies studied ring spokes;
          distinctive features seen on several moons; 5 new satellites
          were discovered; flyby.

          Uranus: first encounter with this distant planet;
          photographed surface of satellites, resolved rings into
          multicolored bands showing anticipated shepherding satellites;
          discovered 10 new moons, 2 new rings, and a tilted magnetic
          field; flyby.

          Neptune: encounter scheduled for 1989.

---
NASA FACTS, HOW WE GET PICTURES FROM SPACE, Haynes, NF-151/7-87