The Hubble Space Telescope can "only see the universe in shades of gray," according to NASA's Goddard Space Flight Center. Learn how its imagery is processed into amazing color views of the cosmos.

Credit: NASA's Goddard Space Flight Center
Miranda Chabot: Lead Producer, Writer, Narrator
Paul Morris: Support

Music & Sound:
Music Credit: “A Woven Narrative,” Matthew James Jude Anderson [PRS], Ninja Tune Production Music, Universal Production Music

Category: Tech
Transcript
00:00 As a cosmic photographer, NASA's Hubble Space Telescope has taken over a million snapshots
00:06 documenting the universe.
00:08 These images illustrate, explain, and inspire us with their grandeur, but may not match
00:14 what we'd see with our own eyes.
00:16 That's because Hubble sees light beyond our sensitivity.
00:21 Our eyes only sense a small fraction of the universe's light.
00:26 This tiny band of wavelengths, called the visible spectrum, holds every color in the rainbow.
00:32 Light outside that span, with longer or shorter wavelengths, is invisible to our eyes.
00:38 But those invisible wavelengths can tell us so much more about the universe.
00:43 Hubble houses six scientific instruments that observe at different wavelengths.
00:47 Together, they expand our vision into infrared and ultraviolet light.
00:52 That doesn't mean Hubble can show us never-before-seen colors.
00:56 In fact, the telescope can only see the universe in shades of gray.
01:02 Seeing in black and white allows Hubble to detect subtle differences in the light's intensity.
01:07 If one wavelength is brighter than another, that tells us something about the science of
01:11 that object.
01:13 But because color helps humans interpret what we see, NASA specialists work to process and
01:19 colorize publicly available Hubble data into more accessible images.
01:24 When Hubble snaps a photo, it puts a filter in front of its detector, allowing specific wavelengths
01:29 to pass through.
01:31 Broadband filters let in a wide range of light.
01:34 Narrowband filters are more selective, isolating light from individual elements like hydrogen, oxygen,
01:41 and sulfur.
01:43 Hubble observes the same object multiple times using different filters.
01:47 Image processors then assign those images a color based on their filtered wavelength.
01:52 The longest wavelength becomes red, medium becomes green, and the shortest blue, corresponding
01:58 to the light sensors in our eyes.
02:00 Combining them gives us a color image, showcasing characteristics we can't make out in black
02:05 and white.
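A minimal sketch of that channel assignment, assuming three already-aligned grayscale exposures saved as FITS files; the file names, filter choices, and percentile stretch are illustrative, not Hubble's actual processing pipeline:

```python
import numpy as np
from astropy.io import fits

def stretch(img, lo_pct=0.5, hi_pct=99.5):
    """Rescale a grayscale frame to the 0-1 range with a simple percentile clip."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

# Chromatic ordering: longest filtered wavelength -> red, medium -> green, shortest -> blue.
red   = stretch(fits.getdata("f814w.fits"))  # hypothetical long-wavelength broadband exposure
green = stretch(fits.getdata("f555w.fits"))  # hypothetical mid-wavelength exposure
blue  = stretch(fits.getdata("f435w.fits"))  # hypothetical short-wavelength exposure

rgb = np.dstack([red, green, blue])          # (ny, nx, 3) array ready to display or save
```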
02:08 Adding color reveals the underlying science in every image.
02:12 It's like translating words into another language, making sure no information is lost.
02:19 Some words have an exact counterpart.
02:21 The meaning remains the same when you swap them.
02:24 Hubble's true-color photos are like that.
02:26 They are a direct translation, using broad filters and wavelengths we can see.
02:32 Other words can't be translated directly.
02:35 When we use narrowband filters or peer outside the visible spectrum, it's like translating
02:40 words with no one-word replacement.
02:43 It can still be done, but it requires more work.
02:47 Narrowband images highlight the concentration of important elements.
02:52 Infrared images are like heat maps, helping us spot newborn stars in dark, dusty clouds,
02:58 and peer further back in time and space.
03:01 In ultraviolet, we uncover active aurorae on Jupiter and learn how young, massive stars
03:07 develop.
03:09 Image processors also clean up artifacts, signatures in an image that aren't produced by the observed
03:15 target.
03:16 As sensors age, some pixels become imperfect, returning too much electrical charge or not
03:22 enough.
03:24 Artifacts can leave behind odd shapes, or return images without any true black.
03:30 These effects can be calibrated and removed.
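One simplified illustration of that kind of cleanup, assuming a precomputed bad-pixel mask is available; real calibration relies on instrument reference files and more careful corrections, so the function below is only a sketch of the idea:

```python
import numpy as np
from scipy.ndimage import median_filter

def repair_bad_pixels(image, bad_mask, size=3):
    """Replace pixels flagged in bad_mask with the median of a small surrounding window."""
    smoothed = median_filter(image, size=size)   # local median over a size x size window
    repaired = image.copy()
    repaired[bad_mask] = smoothed[bad_mask]      # swap in the median only where flagged
    return repaired
```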
03:34 Other artifacts come from the dynamic environment of space.
03:38 Even the best photographers get photobombed.
03:40 In Hubble's case, the culprits are asteroids, spacecraft or debris trails, and high-energy particles
03:47 called cosmic rays.
03:49 By combining and aligning multiple observations, image processors can identify them and piece
03:55 together an artifact-free image.
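In spirit, a pixel-wise median across several exposures that have already been registered to the same grid suppresses anything that appears in only one frame, such as a cosmic-ray hit or a satellite trail. This is a toy version of the idea, not the actual processing software:

```python
import numpy as np

def median_combine(aligned_frames):
    """Pixel-wise median of 2-D exposures already registered to the same grid."""
    stack = np.stack(aligned_frames, axis=0)  # shape (n_frames, ny, nx)
    return np.median(stack, axis=0)           # transient bright pixels fall out of the median
```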
03:58 Without processing, many Hubble images would be divided down the middle.
04:03 This line, called a chip gap, is the tiny space between some camera sensors.
04:08 Hubble moves slightly with each observation, allowing image processors to fill the gap and
04:13 replace faulty pixels.
04:15 This process is called dithering.
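Conceptually, the dithered exposures are shifted back onto a common grid and averaged, with each frame contributing only where its detector actually saw sky. The toy function below assumes whole-pixel registration and explicit validity masks; real dithered combination uses sub-pixel resampling (for example, drizzling) and is far more careful than this sketch:

```python
import numpy as np

def combine_dithered(frames, masks):
    """Average registered exposures, honoring each frame's validity mask.

    frames: list of 2-D arrays on a common pixel grid.
    masks:  matching list of boolean arrays, False over chip gaps or flagged pixels.
    """
    data  = np.stack(frames, axis=0)
    valid = np.stack(masks, axis=0).astype(float)
    summed = np.sum(data * valid, axis=0)     # only valid pixels contribute
    counts = np.sum(valid, axis=0)            # how many exposures covered each pixel
    return np.divide(summed, counts, out=np.zeros_like(summed), where=counts > 0)
```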
04:18 And because there's no natural up or down in space, processors decide how to rotate and
04:24 frame the image.
04:27 It's a time-consuming procedure.
04:29 Simple images take about a week, while large mosaics stitched together from many observations
04:35 can take a month to process.
04:40 Hubble images may not be what we'd see first-hand.
04:43 Instead, they are tools for understanding science at a glance, shedding light on otherwise invisible
04:49 views of our universe.
