NASA Admits They’re Photoshopped! Are All Space Images We See Not Real?

 

This is the spiral galaxy NGC 3982.

What a gorgeous, naturally occurring object in our universe. Well, I mean, it ends up being gorgeous after all this work. Naturally, pretty much every photo you’ve ever seen of space has been Photoshopped to perfection. And I’m sure you’re wondering why they do it. But the methodology behind how they do it is just as interesting, if not more so.

Okay! So the most important thing to get out of the way here is that none of these photos are fake. According to the space agencies that took them, they’re just interpretations of our reality. These interpretations, however, are edited: NASA, the ESA, the Canadian Space Agency, and most if not all other space agencies Photoshop the images they take of the universe around us.

While the term Photoshopping bears a lot of negative connotations, like models getting airbrushed to perfection or people faking experiences they never had, this is not the case with space. Everything you get from NASA is real. It’s real information about the real universe. Yes, the pictures are enhanced or altered, or the colors are changed, but it’s done for real scientific reasons, not just because we feel like it. And definitely not because we feel like making things up. That’s according to Paul Sutter, an astrophysicist at The Ohio State University and host of the podcast Ask a Spaceman. He helped me understand what the scientific reasons for altering photos of space actually are.

First off, images of space are taken not for beauty, but to gain information and insight about the universe around us. Most of the photos taken of space come to us in grayscale, a monochromatic color range that’s pretty bland and boring if you’re talking to the average person. The reason for this is two-fold.

First, color cameras are just lower resolution than black and white cameras. To produce any image, a camera requires a sensor to capture light. For a black and white camera, this sensor is working with all the light that comes into the camera. But with a color camera, you need red, blue, and green receptors to get all the different colors that make up the world. Each receptor is only picking up one-third of the light that it normally would, which, when put together, creates a less sharp image than you’d get if you were capturing all of the light you’re capable of.
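That trade-off can be sketched with a toy model. This is not any real sensor’s layout (real one-shot color cameras typically use a Bayer pattern that devotes half its pixels to green); it just follows the article’s one-third framing to show why each color channel sees less of the scene than a monochrome sensor does:

```python
import numpy as np

# Toy model: the same scene captured by a monochrome sensor and a
# one-shot color sensor. The mono sensor records light at every pixel;
# the color sensor dedicates each pixel to a single red, green, or blue
# receptor, so each channel only samples a third of the grid.
# (Assumed even R/G/B split; real Bayer sensors use twice as many greens.)

rng = np.random.default_rng(0)
scene = rng.random((6, 6))  # incoming light intensity, one value per pixel

# Monochrome: a full-resolution measurement of the scene.
mono = scene.copy()

# Color: assign pixels to R/G/B receptors in a repeating diagonal
# pattern, then mask out everything each channel cannot see.
channel_of = np.indices(scene.shape).sum(axis=0) % 3  # 0=R, 1=G, 2=B
color = np.full((3,) + scene.shape, np.nan)
for c in range(3):
    mask = channel_of == c
    color[c][mask] = scene[mask]

print("pixels measured by mono sensor:", np.isfinite(mono).sum())         # 36
print("pixels measured per color channel:", np.isfinite(color[0]).sum())  # 12
```

Each color channel ends up with measurements at only a third of the pixel grid, which is exactly the sharpness penalty the paragraph describes.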

So flatter, less colorful photos are ideal when it comes to getting the most information you can out of a photo. But of course, monochromatic photos aren’t as visually appealing as color photos, both when it comes to aesthetics and when it comes to quickly grasping information. The human eye has a limit to how many shades of gray it can distinguish between. So scientists composite multiple filtered photos into one image to approximate the real colors you’d get with a color camera. And then, since they’re already there in Photoshop, you figure you might as well tune it up just a little bit. We are visual creatures and we like looking at pretty pictures. We like putting pretty pictures in our papers and we like showing off pretty pictures to the public. We could share charts and graphs, which, yeah, are perhaps more scientifically useful, but just aren’t as interesting. So, here’s the cool picture that led to that chart and graph.
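The compositing step can be sketched in a few lines, assuming the three filtered exposures arrive as plain NumPy arrays. The simple min-max stretch here is an illustrative stand-in for the calibration a real pipeline would apply, not any agency’s actual processing:

```python
import numpy as np

# Sketch: three separate grayscale exposures, each taken through a
# different color filter, stacked into one RGB image.

def stretch(img):
    """Rescale a raw exposure into the displayable 0..1 range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

def composite(red_filter, green_filter, blue_filter):
    """Stack three filtered grayscale exposures into an RGB image."""
    return np.stack([stretch(red_filter),
                     stretch(green_filter),
                     stretch(blue_filter)], axis=-1)

# Stand-in exposures; real ones would come from the telescope.
rng = np.random.default_rng(1)
r, g, b = (rng.random((4, 4)) for _ in range(3))
rgb = composite(r, g, b)
print(rgb.shape)  # (4, 4, 3)
```

The point is that “color” never existed in any single exposure; it appears only when the grayscale layers are assigned to channels and combined.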

 

Pillars of Creation

Colorizing photos is about more than just trying to increase the appeal of an image. Frequently, colorization is also used for categorization, just like how we color-code Excel spreadsheets, wires, and maps. Extracting all the information you can out of an image to learn more about the universe around us means gathering light across a larger spectrum than what the human eye can see. Because we can only see this little bit. With technology, we can see all these things, and those would be pretty much invisible to the naked eye or a regular camera. Check out this image of clouds where stars are forming. These clouds are somewhat opaque and would be very difficult to see with a regular camera. But thanks to infrared telescopes, we can parse them out and build a better map of our universe.

Next, scientists have to decide how they’re going to color-code it. Sometimes, that can be a little bit challenging. What color is 143 megahertz? There’s no color for that. So we assign a color to those frequencies, just like the human eye assigns a color to particular wavelengths of light. So what we’re doing when we’re collecting data, or collecting light that is outside the visible spectrum, is assigning colors to it so we can actually look at these images.
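A minimal sketch of that assignment: take measured intensities at some non-visible frequency and map them onto an invented color gradient. The black-to-orange-to-white ramp below is an arbitrary choice of mine, which is exactly the point of false color — any gradient would do:

```python
import numpy as np

# There is no "color" for a radio frequency, so we invent one: map each
# measured intensity onto a chosen color gradient.

def false_color(intensity, stops=((0, 0, 0), (255, 128, 0), (255, 255, 255))):
    """Map normalized intensities (0..1) onto a piecewise-linear gradient."""
    stops = np.asarray(stops, dtype=float)
    t = np.clip(intensity, 0.0, 1.0) * (len(stops) - 1)
    i = np.minimum(t.astype(int), len(stops) - 2)  # lower gradient stop
    frac = (t - i)[..., None]                      # position between stops
    return (1 - frac) * stops[i] + frac * stops[i + 1]

# A made-up map of 143 MHz radio intensities, normalized to 0..1.
radio_map = np.array([[0.0, 0.5],
                      [1.0, 0.25]])
print(false_color(radio_map))
```

An intensity of 0 lands on black, 0.5 on the middle orange stop, and 1.0 on white; the data is untouched, only its visual encoding is invented.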

Elemental Color Codes

Colorizing photos can also help teams of scientists understand more about the composition, or elemental make-up, of the universe. Different elements give off different wavelengths of light. So once scientists recognize a certain element, hydrogen or sulfur for instance, it’s helpful to single it out so that everyone else on the team can identify it just from looking at the color it’s painted as. If you look at a picture of a typical nebula, like the Orion Nebula or Cat’s Eye Nebula, this isn’t necessarily what you’d see with the naked eye, because the colors that have been added are used to highlight where certain elements are. Like, oh, there’s a lot of oxygen over here, we wanna paint that up like a nice pretty blue. And if there’s a bunch of iron over here, we wanna see some red. We’re assigning colors to those various elements so we can see where they are in relation to everything else, so we can try to understand this nebula, how it was formed, and how it will evolve.
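That element-to-color painting can be sketched as a weighted blend of per-element intensity maps. The oxygen and iron colors follow the article’s own examples; the hydrogen color and the blending scheme are assumptions added for the illustration:

```python
import numpy as np

# Sketch of element color-coding: each narrowband exposure isolates the
# emission from one element, and each element gets a display color so
# its distribution is visible at a glance.

ELEMENT_COLORS = {
    "oxygen":   (0.0, 0.4, 1.0),  # "a nice pretty blue"
    "iron":     (1.0, 0.1, 0.1),  # "we wanna see some red"
    "hydrogen": (0.1, 1.0, 0.2),  # assumed green, chosen for contrast
}

def paint_elements(exposures):
    """Blend per-element intensity maps (0..1) into one color-coded image."""
    shape = next(iter(exposures.values())).shape
    image = np.zeros(shape + (3,))
    for element, data in exposures.items():
        image += data[..., None] * ELEMENT_COLORS[element]
    return np.clip(image, 0.0, 1.0)

# Stand-in narrowband exposures for a pretend nebula.
rng = np.random.default_rng(2)
nebula = {el: rng.random((4, 4)) for el in ELEMENT_COLORS}
print(paint_elements(nebula).shape)  # (4, 4, 3)
```

A pixel bright in the oxygen exposure comes out blue, one bright in iron comes out red, and mixtures blend, so the finished image doubles as a map of where each element sits.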

All these methods of Photoshopping are in the effort to get the most information possible out of space, and to interpret it with the least amount of effort. So NASA and pretty much every other space agency out there Photoshop their images, not because nothing is real, and the Earth is actually flat, and everything you know is a lie, but for the purpose of doing a better job at being astronomers. The colorization of photos, I would say, is 10 percent for the benefit of the public, 90 percent for the benefit of scientists. We are intentionally colorizing the images because we’re trying to understand them. And sometimes, natural color, or what you would see with your eye, just isn’t gonna give a lot of information. That’s why we have these big giant telescopes in the first place, so we can do stuff that we can’t do with our eyes.

And so yeah, we’re gonna process them, we’re gonna Photoshop them. But every single thing we do in astronomy is based on the data. We’re not making anything up. We’re not faking anything. We’re not inserting anything. We’re highlighting what’s already there, so that we can, you know, do our job as scientists.
