Incredible photographs of deep space (20 photos). How space photographs are taken


Take a few minutes to enjoy 25 truly breathtaking photos of the Earth and Moon from space.

This photograph of Earth was taken by astronauts on the Apollo 11 spacecraft on July 20, 1969.

Spacecraft launched by humanity take in views of the Earth from distances of thousands, even millions, of kilometers.


Captured by Suomi NPP, a US weather satellite operated by NOAA.
Date: April 9, 2015.

NASA and NOAA created this composite image using photos taken from the Suomi NPP weather satellite, which orbits the Earth 14 times a day.

These continuous observations allow us to monitor the state of our world, including during rare alignments of the Sun, Moon and Earth.

Captured by the DSCOVR Sun and Earth Observing Spacecraft.
Date: March 9, 2016.

The DSCOVR spacecraft captured 13 images of the Moon's shadow racing across Earth during the 2016 total solar eclipse.

But the deeper we go into space, the more the view of the Earth fascinates us.


Taken by the Rosetta spacecraft.
Date: November 12, 2009.

The Rosetta spacecraft was designed to study comet 67P/Churyumov-Gerasimenko. In November 2014, its Philae lander made a soft landing on the comet's surface, and the main probe completed its mission on September 30, 2016. This photo shows the South Pole and sunlit Antarctica.

Our planet looks like a shiny blue marble, shrouded in a thin, almost invisible layer of gas.


Taken by the Apollo 17 crew.
Date: December 7, 1972.

The crew of the Apollo 17 spacecraft took this photograph, entitled "The Blue Marble," during the last manned mission to the Moon. It is one of the most widely circulated photographs of all time. It was taken from a distance of approximately 29 thousand km from the surface of the Earth. Africa is visible in the upper left of the image, and Antarctica in the lower left.

And it drifts alone in the blackness of space.


Taken by the Apollo 11 crew.
Date: July 20, 1969.

Crew members Neil Armstrong, Michael Collins and Buzz Aldrin took this photo on the way to the Moon, from a distance of about 158 thousand km from Earth. Africa is visible in the frame.

Almost alone.

About twice a year, the Moon passes between the DSCOVR satellite and its primary target, the Earth. Then we get a rare chance to look at the far side of our natural satellite.

The Moon is a cold, rocky ball, roughly 50 times smaller than the Earth by volume. It is our largest and closest celestial companion.


Taken by William Anders of the Apollo 8 crew.
Date: December 24, 1968.

The famous Earthrise photograph taken from the Apollo 8 spacecraft.

One hypothesis is that the Moon formed after a proto-Earth collided with a planet the size of Mars about 4.5 billion years ago.


Taken by the Lunar Reconnaissance Orbiter (LRO).
Date: October 12, 2015.

In 2009, NASA launched the robotic interplanetary probe LRO to study the cratered surface of the Moon, but it seized the opportunity to capture this modern version of the Earthrise photograph.

Since the 1950s, humanity has been launching people and robots into space.


Taken by Lunar Orbiter 1.
Date: August 23, 1966.

The robotic unmanned spacecraft Lunar Orbiter 1 took this photo while searching for a site to land astronauts on the Moon.

Our exploration of the Moon is a mixture of the pursuit of technological conquest...


Photographed by Michael Collins of the Apollo 11 crew.
Date: July 21, 1969.

Eagle, the lunar module of Apollo 11, returns from the surface of the Moon.

and insatiable human curiosity...


Taken by the Chang'e 5-T1 lunar probe.
Date: October 29, 2014.

A rare view of the far side of the Moon taken by the China National Space Administration's lunar probe.

and search for extreme adventures.

Filmed by the Apollo 10 crew.
Date: May 1969.

This video was taken by astronauts Thomas Stafford, John Young and Eugene Cernan during a non-landing test flight to the Moon on Apollo 10. Obtaining such an image of Earthrise is only possible from a moving spacecraft.

It always seems that the Earth is not far from the Moon.


Taken by the Clementine 1 probe.
Date: 1994.

The Clementine mission was launched on January 25, 1994, as a joint initiative of NASA and the US Department of Defense's Ballistic Missile Defense Organization. On May 7, 1994, the probe malfunctioned and went out of control, but it had previously transmitted this image, which shows the Earth and the north pole of the Moon.


Taken by Mariner 10.
Date: November 3, 1973.

A combination of two photographs (one of the Earth, the other of the Moon) taken by NASA's robotic interplanetary station Mariner 10, which was sent to study Venus and Mercury (photographing the Moon on the way) and launched atop a rocket derived from an intercontinental ballistic missile.

And the farther away we get, the more amazing our home looks...


Taken by the Galileo spacecraft.
Date: December 16, 1992.

On its way to study Jupiter and its moons, NASA's Galileo spacecraft captured this composite image. The Moon, which is about three times darker than the Earth, is in the foreground, closer to the viewer.

and the lonelier it seems.


Taken by the Near Earth Asteroid Rendezvous Shoemaker spacecraft.
Date: January 23, 1998.

NASA's NEAR spacecraft, sent to the asteroid Eros in 1996, captured these images of the Earth and Moon. Antarctica is visible at the South Pole of our planet.

Most images do not accurately depict the distance between the Earth and the Moon.


Taken by the Voyager 1 robotic probe.
Date: September 18, 1977.

Most photographs of the Earth and Moon are composite images, made up of several images, because the objects are far apart. But above you see the first photograph in which our planet and its natural satellite are captured in one frame. The photo was taken by the Voyager 1 probe on its way to its “grand tour” of the solar system.

Only after traveling hundreds of thousands or even millions of kilometers, then returning, can we truly appreciate the distance that lies between the two worlds.


Taken by the Mars Express automatic interplanetary station.
Date: July 3, 2003.

The European Space Agency's robotic interplanetary station Mars Express took this image of Earth from millions of kilometers away on its way to Mars.

This is a huge and empty space.


Captured by NASA's Mars Odyssey orbiter.
Date: April 19, 2001.

This infrared photograph, taken from a distance of 2.2 million km, shows the enormous distance between the Earth and the Moon - about 385 thousand kilometers, or about 30 Earth diameters. The Mars Odyssey spacecraft took this photo as it headed toward Mars.

But even together, the Earth-Moon system looks insignificant in deep space.


Taken by NASA's Juno spacecraft.
Date: August 26, 2011.

NASA's Juno spacecraft captured this image during its nearly 5-year journey to Jupiter, where it is conducting research on the gas giant.

From the surface of Mars, our planet appears to be just another “star” in the night sky, which puzzled early astronomers.


Taken by the Spirit Mars Exploration Rover.
Date: March 9, 2004.

About two months after landing on Mars, the Spirit rover captured a photograph of Earth appearing as a tiny dot. NASA says it is "the first ever image of Earth taken from the surface of another planet beyond the Moon."

The Earth is lost in the shining icy rings of Saturn.


Taken by the Cassini automatic interplanetary station.
Date: September 15, 2006.

NASA's Cassini spacecraft took 165 photos while in Saturn's shadow to create this backlit mosaic of the gas giant. The Earth crept into the image on the left.

Billions of kilometers from Earth, as Carl Sagan quipped, our world is just a “pale blue dot,” a small and lonely ball on which all our triumphs and tragedies are played out.


Taken by the Voyager 1 robotic probe.
Date: February 14, 1990.

This image of Earth is one of a series of "solar system portraits" that Voyager 1 took about 4 billion miles from home.

From Sagan's speech:

“There is perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world. To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we've ever known.”

Sagan's message endures: there is only one Earth, and we must do everything in our power to protect it, above all from ourselves.

Japan's Kaguya lunar orbiter (also known as SELENE) captured this video of the Earth rising above the Moon, sped up 1,000 percent, to commemorate the 40th anniversary of the Earthrise photograph taken by the Apollo 8 crew.

Our ancestors who lived on this planet a thousand years ago did not have the technology and resources we now use to study our Universe and what lies beyond it. In those days, many lovers of astronomy spent their nights looking at the sky, devising theories and telling tales about the heavens above them. Little did they know that what was happening many light-years away was more extraordinary than the most exciting stories they could invent.

Over the past 50 years, NASA has opened the door to space exploration through its sophisticated, state-of-the-art telescopes and robotic research stations that allow us to explore the various nooks and crannies of space. The only way for us to see the Universe is through photographs produced by NASA, unless, of course, you are one of the lucky ones who can afford to buy a ticket to space from Virgin Galactic.

It is very important to note that choosing a list of just twenty photographs from hundreds of thousands is very difficult and impossible to do without a certain amount of subjectivity. The photos we show you below are some of the most breathtaking images obtained through satellite imaging during planetary exploration and space missions. If you think we missed any great shots, post them in the comments!

So, we present to your attention a list of twenty of the most amazing photographs taken by NASA:

20. Hubble Extreme Deep Field

In its early years, the Hubble Telescope was used to image a seemingly empty patch of very distant space in a small region of the constellation Ursa Major. NASA later released a successor to such images called the "Hubble Extreme Deep Field," assembled from about 2,000 exposures with a total exposure time of roughly two million seconds. Nearly every blob, swirl and point of light in this image is an entire galaxy. Just imagine the scale: galaxies of many billions of stars, each compressed into only a few pixels of this photograph.

19. Chaos in the Heart of Orion


This combined image from the Spitzer and Hubble space telescopes shows the chaos of newborn stars 1,500 light-years away in the heart of the Orion Nebula. It is the closest massive cluster of young stars to us, and astronomers believe it contains more than 1,000 young stars, scattered throughout the nebula. The Orion Nebula is the brightest part of the Sword of Orion, an asterism in the constellation Orion, the Hunter.

18. Yuri Malenchenko’s spacewalk


Any time an astronaut exits a vehicle in space, it is called a spacewalk. On September 11, 2000, cosmonaut Yuri Malenchenko was photographed during his spacewalk, giving us this breathtaking photo. That day, Malenchenko and astronaut Edward T. Lu spent more than six hours in open space, working on the exterior of the International Space Station.

17. “Eye of God”


The Helix Nebula, also known as the "Eye of God," is located approximately 650 light-years from the Sun in the constellation Aquarius and spans roughly 2.5 light-years. Knots of gas, whose composition and origin are not fully understood, are visible at the inner edge of the nebula. The photo is a composite of images from the Hubble Space Telescope and wide-angle images from the Mosaic Camera at the Kitt Peak National Observatory.

16. Rosette Nebula


The Rosette Nebula is a huge, roughly spherical region of gas adjoining a giant molecular cloud in the constellation Monoceros, approximately 5,200 light-years from Earth in the Milky Way galaxy. The nebula is enormous, covering an area of sky about six times that of the full Moon. As the image above shows, the Rosette Nebula is a region of active star formation that glows under the ultraviolet radiation of the young, hot blue stars whose winds blow outward from its center.

15. Hercules A – Black Hole


At first glance, Hercules A appears to be a typical elliptical galaxy, but it is unique in that its center harbors a black hole so massive that the one in our galaxy looks insignificant by comparison. The Hercules A galaxy lies just over 2 billion light-years from the Milky Way and has a total mass approximately 1,000 times that of our galaxy. The pinkish-purple plumes seen in this photo are enormous jets of high-energy particles, detected at radio wavelengths, blasted outward by matter heating up as it falls toward the black hole.

14. Crab Nebula


The stunningly beautiful death of a star in the constellation Taurus was first observed by Chinese astronomers as a supernova in 1054. Nearly a thousand years later, the superdense object left behind by the explosion, a neutron star, continues to spew high-energy particles into the expanding cloud of debris known as the Crab Nebula.

This composite image was created by combining photographs from three observatories. Optical images from the Hubble Space Telescope are red and yellow, Chandra X-ray images are blue, and Spitzer infrared images are purple. Like many other telescopes, Chandra has observed the Crab Nebula frequently since the mission began.

13. Two spiral galaxies


This image of two galaxies was created from three separate Hubble exposures. Powerful tidal forces from the larger galaxy, NGC 2207, on the left have distorted the shape of the smaller galaxy, IC 2163, flinging out gas and stars in long streamers that stretch over 100,000 light-years. IC 2163 does not have enough energy to escape NGC 2207's gravitational pull, so it will keep being pulled back. The small galaxy is trapped in their shared orbit, and the two galaxies will continue to distort and disturb each other. Most likely, after billions of years, they will merge into one huge galaxy. There is a theory that many of the galaxies that exist today, including the Milky Way, were formed through a similar process of smaller galaxies merging over billions of years.

12. MAVEN (Mars Atmosphere and Volatile Evolution) Mars orbiter


The image above shows NASA's MAVEN orbiter, which is studying the upper atmosphere of Mars to help understand climate change on the Red Planet. Early findings from the newly launched satellite have begun to reveal key details of how Mars' atmosphere has been stripped away to space over time. The data obtained from MAVEN include the discovery of a new process by which the solar wind can penetrate deep into the planet's atmosphere.

11. Inside the Flame Nebula


This image of the remarkable star-forming region known as NGC 2024, or the Flame Nebula, combines X-ray data from the orbiting Chandra observatory with infrared observations. The nebula lies in the constellation Orion, approximately 1,400 light-years from Earth. According to scientists' calculations, the stars at the center of the cluster are roughly 200,000 years old, while those at its outer edge may be up to 1.5 million years old.

10. Light Echoes

Hubble captured another stunning image, this time of a light echo. Since January 2002, scientists have been closely monitoring a rather unusual star called V838 Monocerotis, located approximately 20,000 light-years from Earth. During its outburst, which lasted several weeks, the star was 600,000 times brighter than the Sun. The star then began to fade, but the light it had emitted kept traveling outward, illuminating the nebula of gas and dust surrounding the star and reflecting off it in several places. This spreading ring of reflected light is what the European Space Agency (ESA) calls "the most spectacular light echo in the history of astronomy."

9. Comet C/2011 W3 (Lovejoy)

Comet C/2011 W3 (Lovejoy) is a long-period comet, one whose highly elongated orbit brings it back to the inner solar system only once every several hundred years or more; Lovejoy's orbital period is estimated at approximately 8,000 years. In late 2011 the comet passed very close to the Sun and afterwards appeared as a very bright object in the sky. As the photo shows, Comet Lovejoy has a finely structured ion tail, consisting of gas ionized by ultraviolet radiation from the Sun and pushed outward by the solar wind; this is what gives the tail its beautifully sculpted shape.

8. Martian sunset


While this image may not be as stunning and eye-popping as some of the others on this list, the photo above is just one of many amazing images taken by the Curiosity Rover on Mars. Watching the sun set on our own planet can be a wonderful and impressive experience, but is there anything more exciting than watching the sun set on another world? Maybe one day people will be able to witness this spectacle not in photographs, but with their own eyes.

7. Blue Marble


This stunning image, called Blue Marble, is the most detailed and true-color photograph of Earth yet. Scientists combined months of observations of the Earth's surface, sea ice, oceans and clouds into a single mosaic showing every square kilometer of our planet in true color. Much of the data in this image came from a single NASA remote-sensing instrument, the Moderate Resolution Imaging Spectroradiometer (MODIS).

6. Curiosity's selfie

The 1-ton robot made an unprecedented landing on Mars on August 5, 2012. Curiosity's arrival was extraordinary for many reasons, among them its mission to search for evidence of ancient life on the Red Planet and of a biosphere based on chemotrophic and autotrophic microorganisms, findings that could one day make human settlement on Mars a real possibility. Since arriving, the rover has been doing some very interesting things, and this photo is a great example: who would have thought we could send a robot to Mars that can take selfies?

5. First notable solar flare of 2015


On January 12, 2015, the Sun emitted a mid-level solar flare, and the event was captured by the Solar Dynamics Observatory, which continuously monitors the Sun. Solar flares are extremely powerful bursts of radiation. Harmful radiation from a flare cannot pass through the Earth's atmosphere to reach the ground, but if the flare is intense enough it can disturb the upper atmosphere, the layer through which GPS and communications signals travel.

4. Pillars of Creation

The original photograph on which this image is based was taken by the Hubble Space Telescope on April 1, 1995. To mark the upcoming 20th anniversary, astronomers created the high-resolution image of the Pillars of Creation shown above, released to the delight of the public in January 2015. The "Pillars of Creation" lie in the Eagle Nebula, 7,000 light-years away. They are composed of molecular hydrogen and dust and are slowly being eroded by photoevaporation under the ultraviolet radiation of nearby hot stars.

3. Europa is an icy satellite of Jupiter


This stunning photograph is the best image yet of Jupiter's icy moon. Europa has long intrigued scientists due to the fact that it shows signs of an ocean beneath the surface, and also because cracks are clearly visible on its surface. The ocean appears to be protected from harmful radiation, making Europa one of the most likely cosmic bodies in the solar system to harbor alien life. This is one of the main reasons why it is so tempting to scientists. Europa contains all the elements that, according to scientists, are necessary for the origin of life: energy, water and organic matter.

2. Saturn Smiles for Cassini


The automated Cassini-Huygens spacecraft was launched in October 1997 and arrived at Saturn in 2004. Since then it has captured many amazing images, but this mosaic of Saturn simply defies description. It shows something that happens very rarely: the Sun illuminated Saturn from behind while the probe was closer to the planet than usual. This allowed it to capture a stunning image with incredible colors and remarkable detail in Saturn's rings.

1. Amazing solar eruption


Our Sun is a reservoir of extremely hot plasma threaded with magnetic fields. On August 31, 2012, NASA's Solar Dynamics Observatory was watching as, during a solar storm, powerful magnetic fields at the Sun's surface flung a long spiral of plasma outward. The plasma left the surface at a speed of about 1,400 kilometers per second and reached a distance of some 300,000 kilometers.



The moment that astronomers around the world have been eagerly awaiting for many years is approaching: the launch of the new James Webb space telescope, considered a kind of successor to the famous Hubble.

Why are space telescopes needed?

Before we begin to consider the technical features, let's figure out why space telescopes are needed at all and what advantages they have over complexes located on Earth. The fact is that the earth's atmosphere, and especially the water vapor contained in it, absorbs the lion's share of radiation coming from space. This, of course, makes it very difficult to study distant worlds.

But the atmosphere of our planet, with its distortions and cloud cover, as well as noise and vibration at the Earth's surface, is no obstacle for a space telescope. In the case of the automatic Hubble observatory, the absence of atmospheric interference gives it a resolution roughly 7-10 times higher than that of comparable telescopes on Earth. Many photographs of distant nebulae and galaxies invisible to the naked eye were obtained thanks to Hubble. Over 15 years of operation in orbit, the telescope collected more than a million images of 22 thousand celestial objects, including numerous stars, nebulae, galaxies and planets. With Hubble's help, scientists have shown, in particular, that planet formation occurs around most of the stars in our Galaxy.

But Hubble, launched in 1990, will not last forever, and its technical capabilities are limited. Indeed, over the past decades, science has made great progress, and now it is possible to create much more advanced devices that can reveal many of the secrets of the Universe. The James Webb will become just such a device.

James Webb capabilities

As we have already seen, a full-fledged study of space is impossible without instruments such as Hubble. Now let us look at the concept behind the James Webb. It is an orbital infrared observatory: its task will be to study the thermal radiation of space objects. Recall that all bodies, solid or liquid, heated above absolute zero emit energy in the infrared. The wavelengths emitted depend on the temperature: the higher the temperature, the shorter the peak wavelength and the higher the intensity of the radiation.
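To make that relationship concrete, here is a small illustrative sketch in Python based on Wien's displacement law (which the article does not spell out; the temperatures chosen are examples, not mission data):

```python
# A quick illustration (not from the article) of Wien's displacement law,
# which underlies the statement above: peak wavelength ~ 1 / temperature.
WIEN_B_UM_K = 2898.0  # Wien's constant, in micrometre-kelvins

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength (in micrometres) at which a blackbody of the given
    temperature radiates most strongly."""
    return WIEN_B_UM_K / temperature_k

for name, t in [("Sun (~5800 K)", 5800),
                ("Earth-like planet (~300 K)", 300),
                ("Cold outer-system body (~100 K)", 100)]:
    print(f"{name}: peak near {peak_wavelength_um(t):.1f} um")
# Sun: ~0.5 um (visible); 300 K: ~9.7 um and 100 K: ~29 um (infrared),
# which is why an infrared observatory is needed for such cold targets.
```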

Among the main tasks of the future telescope is to detect the light of the first stars and galaxies that appeared after the Big Bang. This is extremely difficult, because light traveling for millions and billions of years undergoes significant changes; the visible radiation of a particular star, for example, can be completely absorbed by a dust cloud. Exoplanets are even harder, since these objects are extremely small (by astronomical standards, of course) and dim. For most planets, the average temperature rarely exceeds 0°C, and in some cases it drops below -100°C. Such objects are very difficult to detect. But the equipment installed on the James Webb Telescope will make it possible to identify exoplanets with surface temperatures of around 300 K (comparable to Earth's), lying further than 12 astronomical units from their stars, at distances of up to 15 light-years from us.

The new telescope is named after NASA's second administrator. James Webb was at the helm of the US space agency from 1961 to 1968, overseeing the first American manned launches into space. He also made a major contribution to the Apollo program, whose goal was to land a man on the Moon.

In total, it will be possible to observe planets around several dozen stars neighboring our Sun. Moreover, James Webb will be able to see not only the planets themselves but also their satellites. In other words, we can expect a revolution in the study of exoplanets, and perhaps more than one. As for the solar system, important new discoveries may be made here too: the telescope's sensitive equipment will be able to detect and study objects in the system with temperatures of -170°C.

The capabilities of the new telescope will make it possible to understand many of the processes that occurred at the dawn of the Universe, to look toward its very origins. Let's consider this in more detail: as you know, we see stars that are 10 light-years away exactly as they were 10 years ago. Consequently, we observe objects more than 13 billion light-years away as they appeared almost immediately after the Big Bang, which is believed to have occurred 13.7 billion years ago. The instruments on the new telescope will make it possible to see about 800 million years further back than Hubble, which set the record in its day. So it will be possible to see the Universe as it was just 100 million years after the Big Bang. Perhaps this will change scientists' ideas about its structure. All that remains is to wait for the telescope to begin operation, currently scheduled for 2019. The observatory is expected to work for 5-10 years, so there will be plenty of time for new discoveries.

General design

To launch the James Webb, the plan is to use the European Ariane 5 launch vehicle. In general, despite the dominant role of the US space agency, the project can be called international. The telescope itself was developed by the American companies Northrop Grumman and Ball Aerospace, and in total experts from 17 countries took part in the program. In addition to specialists from the US and EU, Canada also made significant contributions.

After launch, the device will be in a halo orbit at the L2 Lagrange point of the Sun-Earth system. This means that, unlike Hubble, the new telescope will not orbit the Earth: the constant “flickering” of our planet could interfere with observations. Instead, the James Webb will orbit the Sun. At the same time, to ensure effective communication with the Earth, it will move around the star synchronously with our planet. The distance of the James Webb from the Earth will reach 1.5 million km: due to such a large distance, it will not be possible to modernize or repair it like the Hubble. Therefore, reliability is at the forefront of the entire James Webb concept.

But what is the new telescope, exactly? Before us is a spacecraft weighing 6.2 tons. For comparison, Hubble weighs 11 tons, almost twice as much, yet Hubble is much smaller in size: it is comparable to a bus, while the new telescope is about as long as a tennis court and as tall as a three-story house. The largest part of the telescope is the sunshield, roughly 20 meters long and 7 meters wide, which looks like a huge layer cake. It is made of a special polymer film coated with a thin layer of aluminum on one side and silicon on the other. The gaps between the layers of the shield are evacuated, which impedes the transfer of heat to the "heart" of the telescope. The purpose of all this is to block sunlight and cool the telescope's ultra-sensitive detectors to about -220°C. Without this, the telescope would be "blinded" by the infrared glow of its own components, and observing faint, distant objects would be out of the question.

What catches the eye most is the new telescope's mirror. A mirror is needed to gather and focus the incoming light into a sharp image free of color distortion. James Webb will get a main mirror 6.5 m in diameter; for comparison, Hubble's is 2.4 m. The diameter was not chosen at random: it is what is needed to detect the light of the most distant galaxies. The telescope's sensitivity, like its resolution, depends on the area of the mirror that collects light from distant objects, in this case about 25 m².
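As a rough back-of-the-envelope check (treating both mirrors as simple filled circles, which the real segmented, partly obstructed Webb mirror is not), the quoted sizes translate into light-gathering areas like this:

```python
import math

# Illustrative only: compare ideal circular apertures with the 25 m^2 figure
# quoted in the text for the segmented Webb mirror.
def disk_area_m2(diameter_m: float) -> float:
    return math.pi * (diameter_m / 2) ** 2

webb_geometric = disk_area_m2(6.5)  # ~33 m^2 for an ideal 6.5 m circle
hubble_area = disk_area_m2(2.4)     # ~4.5 m^2
webb_effective = 25.0               # collecting area quoted in the text

print(f"Ideal 6.5 m disk: {webb_geometric:.1f} m^2")
print(f"Hubble 2.4 m disk: {hubble_area:.1f} m^2")
print(f"Quoted Webb collecting area vs Hubble: x{webb_effective / hubble_area:.1f}")
# Roughly a factor of ~5.5 more collecting area than Hubble.
```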

The Webb mirror is made from a special grade of beryllium produced as a fine powder. The powder is placed in a stainless steel container and pressed into a flat shape. After the steel container is removed, the block of beryllium is cut in two to make mirror blanks, each of which is used to create one segment. Each segment is ground and polished, then cooled to -240°C. Its dimensions are then checked, it receives a final polish, and gold is applied to the front surface. Finally, the segment is tested again at cryogenic temperatures.

Scientists considered several materials for the mirror, but in the end the experts settled on beryllium, a light and relatively hard metal that is very expensive. One reason for this choice is that beryllium holds its shape at cryogenic temperatures. The mirror itself is roughly circular in outline, which allows the light to be focused onto the detectors as compactly as possible; if James Webb had, say, an oval mirror, the image would be elongated.
The main mirror consists of 18 segments, which will unfold after the observatory is launched. If the mirror were a single piece, it would simply be physically impossible to fit the telescope on the Ariane 5 rocket. Each segment is hexagonal, which makes the best use of the available area. The mirror elements are gold in color: gold plating gives the best reflection in the infrared, effectively reflecting radiation with wavelengths from 0.6 to 28.5 micrometers. The gold layer is 100 nanometers thick, and the coating weighs 48.25 grams in total.

In front of the 18 segments, a secondary mirror is installed on a special mount: it will receive light from the main mirror and direct it to scientific instruments located at the rear of the device. The secondary mirror is much smaller than the primary mirror and has a convex shape.

As with many ambitious projects, the price of the James Webb Telescope turned out to be higher than expected. Initially the space observatory was expected to cost $1.6 billion, but newer estimates say the cost could rise to $6.8 billion. Because of this, in 2011 there was even talk of abandoning the project, but it was later decided to continue, and the James Webb is no longer under threat.

Scientific instruments

To study space objects, the following scientific instruments are installed on the telescope:

- NIRCam (near infrared camera)
- NIRSpec (near-infrared spectrograph)
- MIRI (mid-infrared instrument)
- FGS/NIRISS (Fine Guidance Sensor / Near-Infrared Imager and Slitless Spectrograph)

James Webb Telescope / ©wikimedia

NIRCam

The near-infrared camera NIRCam is the main imaging instrument, a kind of "main eyes" for the telescope. Its operating range is 0.6 to 5 micrometers. The images it takes will then be studied by the other instruments. It is with NIRCam that scientists hope to see light from the earliest objects in the Universe at the dawn of their formation. In addition, the instrument will help study young stars in our Galaxy, map dark matter, and much more. An important feature of NIRCam is its coronagraph, which makes it possible to see planets around distant stars by suppressing the stars' light.

NIRSpec

Using the near-infrared spectrograph, it will be possible to gather information about both the physical properties of objects and their chemical composition. Spectroscopy takes a long time, but thanks to microshutter technology the instrument can observe hundreds of objects at once over a 3 × 3 arcminute field of sky. Each NIRSpec microshutter cell has a lid that opens and closes under the influence of a magnetic field. Each cell is controlled individually: depending on whether it is open or closed, light from the corresponding patch of sky is either admitted or blocked.

MIRI

The mid-infrared instrument operates in the range of 5-28 micrometers. It includes a camera with a 1024 × 1024 pixel sensor as well as a spectrograph. Three arrays of arsenic-doped silicon detectors make MIRI the most sensitive instrument in the James Webb Telescope's arsenal. It is expected that MIRI will be able to discern newly forming stars, many previously unknown Kuiper Belt objects, the redshifts of very distant galaxies, and perhaps the mysterious hypothetical Planet X (the so-called ninth planet of the solar system). MIRI's nominal operating temperature is 7 K, which the passive cooling system alone cannot provide, so cooling is done in two stages: a pulse-tube cooler first brings the instrument down to 18 K, and a Joule-Thomson throttling heat exchanger then lowers the temperature to 7 K.

FGS/NIRISS

FGS/NIRISS combines two instruments: a fine guidance sensor and a near-infrared imager with a slitless spectrograph. NIRISS, in effect, duplicates some functions of NIRCam and NIRSpec. Operating in the range of 0.8-5.0 micrometers, it will detect the "first light" from distant objects and will also be useful for detecting and studying exoplanets. The FGS fine guidance sensor, for its part, will be used to point the telescope itself so that it can obtain sharper images. The FGS camera can form an image from two adjacent patches of sky, each 2.4 × 2.4 arcminutes in size, and reads out small 8 × 8 pixel subarrays 16 times per second: this is enough to find a suitable guide star with 95% probability anywhere in the sky, including at high latitudes.

The equipment installed on the telescope will allow for high-quality communication with the Earth and transmit scientific data at a speed of 28 Mbit/s. As we know, not all research vehicles can boast of this capability. The American Galileo probe, for example, transmitted information at a speed of only 160 bps. This, however, did not prevent scientists from obtaining a huge amount of information about Jupiter and its satellites.
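For a sense of what such a data rate means in practice, here is a rough, purely illustrative comparison (the 100 MB data product is an assumption; only the 28 Mbit/s and 160 bit/s figures come from the text):

```python
# A rough comparison of downlink times at the two rates quoted above.
PRODUCT_BITS = 100 * 8 * 10**6   # a hypothetical 100 MB batch of science data

def transfer_time_s(rate_bits_per_s: float) -> float:
    return PRODUCT_BITS / rate_bits_per_s

webb_rate = 28e6       # 28 Mbit/s, as quoted for the James Webb downlink
galileo_rate = 160.0   # 160 bit/s, as quoted for the Galileo probe

print(f"At 28 Mbit/s: ~{transfer_time_s(webb_rate):.0f} s")
print(f"At 160 bit/s: ~{transfer_time_s(galileo_rate) / 86400:.0f} days")
# ~29 seconds versus roughly two months for the same amount of data.
```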

The new spacecraft promises to be a worthy successor to Hubble and should help answer questions that remain unsolved to this day. Among the possible discoveries of James Webb are worlds similar to Earth and suitable for habitation. The data obtained by the telescope could also be useful for projects investigating the possible existence of alien civilizations.

This is how scientists first “saw” Mars

51 years ago, on July 14, 1965, the Mariner 4 space probe approached Mars and took the first close-up photographs of another planet in human history. The pictures were taken with a bulky analog television camera mounted at the bottom of the spacecraft. After the camera took a photograph, the image was transmitted to Earth as a stream of digital code. Once the code was received, it had to be run through a decoder, and that machine took several hours to process an image.

But these were the first close-up images of Mars in history, and NASA employees did not want to wait, so they decided to decode the image themselves, by hand.

Since the correspondence between the numeric codes and shades of grey was known, the specialists colored in the printed strips of numbers with pencils, using shades ranging from yellow to brown. As a result, the world's first close-up image of Mars turned out to be not a photograph but a hand-colored, paint-by-numbers sketch.
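For illustration only, here is a minimal sketch of what that "decoding" amounts to: mapping a stream of transmitted brightness codes back onto a grid of shades. The numbers below are made up, not Mariner 4 data.

```python
# Hypothetical brightness codes, 0 = dark .. 7 = bright, sent line by line.
raw_stream = [
    0, 1, 2, 4, 7, 7,
    1, 2, 4, 6, 7, 6,
    2, 4, 6, 7, 6, 4,
]
WIDTH = 6  # assumed number of samples per scan line

shades = " .:-=+*#"  # one printable "pencil shade" per code value
for row_start in range(0, len(raw_stream), WIDTH):
    row = raw_stream[row_start:row_start + WIDTH]
    print("".join(shades[value] for value in row))
```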


Enlarged area of the image

The image shows a section of the surface of Mars near the equator. From this angle it looks as if the picture was taken from the surface of the Red Planet, but in fact the "slope" in the middle of the frame is the curved limb of the planet. Here is a black-and-white version that makes the spacecraft's actual position clear.

Mariner 4 was an automatic interplanetary station intended to carry out scientific studies of Mars from a flyby trajectory and to transmit data about interplanetary space and the environment near Mars. The plan was to obtain images of the surface and to perform a radio-occultation experiment, in which the planet blocks the station's signal, to gather information about the atmosphere and ionosphere. The lead organization for design, manufacturing and testing was NASA's Jet Propulsion Laboratory (JPL); individual systems were developed by various industrial companies and universities.


This is what Mariner 3 and 4 looked like. At the bottom is not a cannon, as it might seem, but the television camera (Image: NASA)

This spacecraft became the first to take close-up images of another planet and transmit them to Earth. Mariner 4 took 21 complete photographs of Mars and one incomplete one. The incomplete photograph resulted from Mars passing between the spacecraft and Earth and interrupting radio communication; this happened on the night of July 14-15.

As with Venus, photographs of whose atmosphere and surface humanity obtained only a few years after Mariner 4's encounter with the Red Planet, the pictures of Mars made it possible to move from speculation about the surface to facts and theories. The myth of canals on the surface of Mars, whose unwitting authors were the astronomers Giovanni Schiaparelli and Percival Lowell, had existed for a very long time, and it led scientists and laypeople alike to regard the canals as the work of Martians. Observing Mars, Schiaparelli called the lines he discovered by the Italian word "canali," which denotes channels of any kind, natural or artificial, and can be rendered in English as "channels," "canals" or "grooves." When his works were translated into English, the word "canals" was used, which in English implies artificial construction. Schiaparelli himself never specified exactly what he meant, but few questioned the habitability of Mars: someone, after all, had to have built these planetary-scale canals.


A map of Mars created in 1962 by US Air Force specialists showed canals on its surface. NASA used this map to plan Mariner's route; the rectangles mark the areas photographed by Mariner 4's camera.

But the spacecraft saw no canals, neither artificial nor natural. The photos and instrument data showed that Mars is a dry and cold planet with surface temperatures below zero Celsius, bathed in cosmic radiation because it has no global magnetic field to shield it from high-energy particles. Mariner 4 found no traces of civilization on Mars, and so, in 1965, the myth of canals on the planet's surface was dispelled.

Now, half a century later, humans have plenty of tools for studying Mars. Curiosity and Opportunity are working on its surface, and several spacecraft, including the Mars Reconnaissance Orbiter and Mangalyaan, are in orbit. All this allows us to study Mars closely and make interesting discoveries; orbiters, for example, have revealed the periodic appearance of liquid water on the surface of the Red Planet.

This study began with Mariner 4. Its 50th anniversary coincided with the New Horizons flyby of Pluto.

Just half a century ago, scientists painted encoded images received from space with pencils. And now astronomers are receiving detailed images of objects distant from Earth, such as Pluto and comet Churyumov-Gerasimenko, Charon and Ceres. I wonder what will happen in another 50 years?

This material was prepared as promised in the comments to my post "Why are rovers on Mars!", where questions were asked about space photographs, images of space objects, the stitching of photographs, and how rovers take "selfies".

So: "Let's go!"))

Photos from space published on the websites of NASA and other space agencies often attract the attention of those who doubt their authenticity - critics find traces of editing, retouching or color manipulation in the images. This has been the case since the birth of the “moon conspiracy,” and now photographs taken not only by Americans, but also by Europeans, Japanese, and Indians have come under suspicion. Together with the N+1 portal, we are looking into why space images are processed at all and whether, despite this, they can be considered authentic.

In order to correctly assess the quality of space images that we see on the Internet, it is necessary to take into account two important factors. One of them is related to the nature of interaction between agencies and the general public, the other is dictated by physical laws.

Public relations

Space images are one of the most effective means of popularizing the work of research missions in near and deep space. However, not all footage is immediately available to the media.

Images received from space can be divided into three groups: "raw," scientific and public. Raw, or original, files from spacecraft are sometimes available to everyone and sometimes not. For example, images taken by the Mars rovers Curiosity and Opportunity, or by the Cassini probe at Saturn, are released in near real time, so anyone can see them at the same time as the scientists studying Mars or Saturn. Raw photographs of the Earth from the ISS are uploaded to a separate NASA server: astronauts upload them by the thousands, and no one has time to pre-process them. The only thing added on Earth is a geographic tag to make searching easier.

It is usually the public images attached to press releases from NASA and other space agencies that are criticized for retouching, since they are the ones Internet users notice first. And if you go looking, you can indeed find plenty there. Color manipulation, for one:

The landing platform of the Spirit rover photographed in visible light and in near-infrared light. (c) NASA/JPL/Cornell

And overlaying several images:

Earthrise over Compton Crater on the Moon. (c) NASA/Goddard/Arizona State University

And copy-paste:

Fragment of Blue Marble 2001. (c) NASA/Robert Simmon/MODIS/USGS EROS

And even direct retouching, with erasing some image fragments:

Retouched version of image GPN-2000-001137 from the Apollo 17 expedition. (c) NASA

NASA’s motivation in the case of all these manipulations is so simple that not everyone is ready to believe it: it’s more beautiful.

But it's true: the bottomless blackness of space looks more impressive when it isn't marred by debris on the lens or specks left by charged particles on the film. A color frame is indeed more attractive than a black-and-white one, and a panorama stitched from photographs is better than individual frames. What matters is that in NASA's case it is almost always possible to find the original footage and compare the two. For example, the original version (AS17-134-20384) and the "print" version (GPN-2000-001137) of this Apollo 17 image, which is cited as practically the main evidence of retouching of lunar photographs:

Comparison of frames AS17-134-20384 and GPN-2000-001137 (c) NASA

Or find the rover's "selfie stick," which "disappeared" when it took its self-portrait:

Physics of Digital Photography

Typically, those who criticize space agencies for manipulating color, using filters, or publishing black-and-white photographs "in this digital age" do not consider the physical processes involved in producing digital images. They assume that if a smartphone or camera produces color frames straight away, a spacecraft should be all the more capable of it, and they have no idea what complex operations are needed to put a color image on the screen at all.

Let us explain the theory of digital photography. The sensor of a digital camera is, in essence, a solar panel: when there is light there is current, when there is no light there is no current. Only the sensor is not one big panel but a multitude of tiny ones, the pixels, and the current from each of them is read out separately. Optics focus light onto the sensor, and the electronics record how much energy each pixel releases. From this data an image is built in shades of gray, from zero current in darkness to maximum in bright light; in other words, the output is black and white. To make it color, color filters have to be applied. It turns out, oddly enough, that color filters are present in every smartphone and every digital camera from the nearest store. (For some this is trivial, but in the author's experience it will be news to many.) In conventional photographic equipment, alternating red, green and blue filters are placed over individual pixels of the sensor; this is the so-called Bayer filter.

In a Bayer filter, half the pixels are green, while red and blue each occupy a quarter of the area. (c) Wikimedia
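To make the idea concrete, here is a toy Python sketch (purely illustrative, not any camera's actual pipeline): it simulates an RGGB Bayer sensor and then crudely reconstructs a color image from the single-color samples.

```python
import numpy as np

def bayer_mosaic(rgb: np.ndarray) -> np.ndarray:
    """Simulate an RGGB Bayer sensor: keep only one colour channel per pixel."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_nearest(mosaic: np.ndarray) -> np.ndarray:
    """Crude demosaic: reuse each 2x2 cell's samples for all three channels
    of that cell (real cameras interpolate far more carefully)."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3), dtype=mosaic.dtype)
    for y in range(0, h - 1, 2):
        for x in range(0, w - 1, 2):
            r = mosaic[y, x]
            g = (int(mosaic[y, x + 1]) + int(mosaic[y + 1, x])) // 2
            b = mosaic[y + 1, x + 1]
            out[y:y + 2, x:x + 2] = (r, g, b)
    return out

scene = (np.random.rand(8, 8, 3) * 255).astype(np.uint8)  # stand-in "scene"
reconstructed = demosaic_nearest(bayer_mosaic(scene))
print(reconstructed.shape)  # (8, 8, 3): full colour rebuilt from single-colour samples
```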

To repeat: navigation cameras produce black-and-white images because such files are smaller and because color is simply not needed for navigation. Scientific cameras, however, let us extract more information about space than the human eye can perceive, and so they use a wider range of color filters:

Matrix and filter drum of the OSIRIS instrument on Rosetta (c) MPS

Using a filter for near-infrared light, which is invisible to the eye, in place of red made Mars look redder than it really is in many of the images that reached the media. Not every republication carried the explanation about the infrared range, which spawned a separate debate that we also covered in the article "What color is Mars."

However, the Curiosity rover has a Bayer filter, which allows it to shoot in colors familiar to our eyes, although a separate set of color filters is also included with the camera.

Filters on the mast camera of the Curiosity rover (c) NASA/JPL-Caltech/MSSS

Using individual filters is more convenient for selecting the ranges of light in which you want to examine an object. But if the object moves quickly, its position changes between exposures taken in different ranges. In the Elektro-L footage this was noticeable in fast-moving clouds, which shifted in the few seconds it took the satellite to change filters. On Mars, the same thing happened when the Spirit and Opportunity rovers, which have no Bayer filter, shot sunsets:

Sunset taken by Spirit on Sol 489. Overlay of images taken with 753, 535 and 432 nanometer filters. (c) NASA/JPL/Cornell

At Saturn, Cassini faces similar difficulties:

Saturn's moons Titan (behind) and Rhea (front) in Cassini images (c) NASA/JPL-Caltech/Space Science Institute

At the Lagrange point, DSCOVR faces the same situation:

To get a beautiful photo from this shoot suitable for distribution in the media, you have to work in an image editor.
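Roughly speaking, that editor step amounts to stacking the separate filter exposures into the channels of one color image. Below is a simplified, purely illustrative sketch (random arrays stand in for real frames; the 753/535/432 nm filter set is borrowed from the Spirit caption above):

```python
import numpy as np

def stretch(frame: np.ndarray) -> np.ndarray:
    """Rescale a raw frame to 0..255 so the channels are comparable."""
    lo, hi = frame.min(), frame.max()
    return ((frame - lo) / max(hi - lo, 1e-9) * 255).astype(np.uint8)

# stand-ins for three single-filter exposures of the same scene
frame_753nm = np.random.rand(64, 64)
frame_535nm = np.random.rand(64, 64)
frame_432nm = np.random.rand(64, 64)

composite = np.dstack([stretch(frame_753nm),   # mapped to the red channel
                       stretch(frame_535nm),   # mapped to the green channel
                       stretch(frame_432nm)])  # mapped to the blue channel
print(composite.shape)  # (64, 64, 3): one false-colour image from three exposures
# If the scene moved between exposures, the channels no longer line up,
# which is exactly the colour fringing described above.
```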

There is another physical factor that not everyone knows about: black-and-white photographs have higher resolution and clarity than color ones. These are so-called panchromatic images, which take in all the light reaching the camera without cutting any of it off with filters. That is why many "long-range" satellite cameras shoot only in panchromatic mode, which for us means black-and-white footage. The LORRI camera on New Horizons is like this, as is the NAC camera on the LRO lunar orbiter. In fact, all telescopes shoot panchromatically unless special filters are used. (This is where "NASA is hiding the true color of the Moon" comes from.)

A multispectral "color" camera, equipped with filters but with much lower resolution, can be paired with a panchromatic one. Its color frames can then be overlaid on the panchromatic ones, yielding high-resolution color photographs.
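Here is a simplified sketch of that idea, often called pan-sharpening (illustrative arrays and a basic brightness-substitution scheme; operational pipelines are more sophisticated):

```python
import numpy as np

def upsample2x(img: np.ndarray) -> np.ndarray:
    """Nearest-neighbour upsampling of an (h, w, 3) image by a factor of two."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

pan = np.random.rand(128, 128)            # stand-in high-resolution panchromatic frame
color_lowres = np.random.rand(64, 64, 3)  # stand-in low-resolution multispectral frame

color_up = upsample2x(color_lowres)                 # bring colour to the pan grid
intensity = color_up.mean(axis=2, keepdims=True)    # its brightness component
ratio = color_up / np.maximum(intensity, 1e-9)      # colour relative to brightness
pansharpened = ratio * pan[..., None]               # re-apply colour to the sharp pan detail

print(pansharpened.shape)  # (128, 128, 3): pan-resolution detail with multispectral colour
```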
