Traditional cameras don't work well in space, and cosmic radiation is part of the reason.
Some of the white dots are in places where there are no stars, and there are several sources of the dots. Many of them are from cosmic rays – high energy particles emitted by the Sun, or by distant objects like supernovas or quasars that randomly run into the camera's sensors and light up one or more pixels. When a cosmic ray hits a pixel head-on, it will light up just that pixel. But if it strikes at an angle to the sensor, it can light up a line of pixels, creating a streak in the image. The longest streaks come from cosmic rays that hit the sensors at grazing angles.
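One standard way to deal with those hits (in astronomy image processing generally, not something the FAQ describes) is to take repeat exposures of the same scene and median-combine them. A rough NumPy sketch of the idea, purely my own illustration:

```python
# Sketch: rejecting cosmic-ray hits by median-combining repeat exposures.
# A pixel lit by a cosmic ray in one frame is unlikely to be lit in the
# others, so the per-pixel median keeps the real signal and drops the spike.
# Illustrative only -- real pipelines also handle alignment, calibration, etc.
import numpy as np

rng = np.random.default_rng(0)

# Three simulated 64x64 exposures of the same faint scene.
scene = rng.normal(loc=100.0, scale=5.0, size=(64, 64))
frames = np.stack([scene + rng.normal(0, 2, scene.shape) for _ in range(3)])

# A head-on hit lights a single pixel in frame 0...
frames[0, 30, 30] += 4000
# ...while a grazing hit lights a line of pixels in frame 1.
frames[1, 10, 10:20] += 3000

clean = np.median(frames, axis=0)      # per-pixel median across the frames
print(clean[30, 30], clean[10, 15])    # both back near the ~100-count scene
```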
They don't just take normal photographs.
Each image is taken with one or more filters positioned in front of the camera sensor. This is similar to how photographers on Earth attach filters to their cameras to achieve special effects in their photos. Another analogy is red and blue lenses in 3D glasses that allow only wavelengths of red or blue light to enter your eyes.
Most digital cameras carried by spacecraft take monochrome images using different color filters, and multiple images can be combined later by scientists on Earth to produce processed image products, including color images. The filters are specially designed to allow only certain wavelengths of light to pass through, including red, green, blue, infrared, or ultraviolet. Some cameras also have polarized light filters.
Humans see color in visible light. Consumer digital cameras, including those in smartphones, can take color images meant to capture scenes similar to how we would see them with our eyes. But space imaging cameras are specially designed for science and engineering tasks. They use color filters (also called spectral filters) to reveal how a scene looks in different colors of light. Studying how the Sun, a planet, a moon, asteroid, or comet appears in different colors makes these filters a powerful tool in imaging science.
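For instance, turning three of those monochrome filter frames into a color picture is basically just stacking them into the red, green, and blue channels. A minimal Python/Pillow sketch of the idea (the file names and the simple scaling are my own placeholders; real pipelines calibrate, align, and color-balance first):

```python
# Sketch: building a color composite from three monochrome filter frames.
import numpy as np
from PIL import Image

def load_gray(path):
    """Load one monochrome frame as a float array in [0, 1]."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

# One frame per spectral filter (hypothetical file names).
red   = load_gray("filter_red.png")
green = load_gray("filter_green.png")
blue  = load_gray("filter_blue.png")

# Stack the three frames into an RGB cube and save the composite.
rgb = np.stack([red, green, blue], axis=-1)
Image.fromarray((rgb * 255).astype(np.uint8)).save("color_composite.png")
```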
There are lots of reasons for using these filters, but I think the main one is that different elements emit different wavelengths, so imaging the same target through different filters gives a clue as to what they are looking at.
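If that's right, even a simple ratio between two filter frames is informative: regions that brighten or darken differently between two wavelengths probably differ in composition. A toy band-ratio sketch along the same lines (again with made-up file names):

```python
# Sketch: a simple band ratio between two filter frames.
# Dividing one filter image by another cancels much of the overall
# illumination and highlights regions that reflect or emit differently
# at the two wavelengths -- a rough compositional clue.
import numpy as np
from PIL import Image

def load_gray(path):
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

infrared = load_gray("filter_ir.png")
blue     = load_gray("filter_blue.png")

ratio = infrared / np.clip(blue, 1e-3, None)   # avoid dividing by zero

# Rescale to 8 bits for a quick look; brighter areas are relatively "redder".
scaled = np.clip(ratio / ratio.max(), 0.0, 1.0)
Image.fromarray((scaled * 255).astype(np.uint8)).save("ir_blue_ratio.png")
```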
How Are Images From Space Different Than My Camera's Images?
Images from space missions are a type of scientific or engineering data. At their simplest level, images are a grid of pixels of varying brightness. The full-quality (archival-quality) images scientists and engineers use for their work are uncompressed. This means two things: no information is thrown away, so every pixel value is exactly what the camera recorded, and the files are much larger than typical compressed images.
Can Raw Images Be Used for Science?
Yes and no. Since some original information is lost during compression, raw images are generally not suitable for detailed scientific analysis, as fine detail and precise pixel measurements matter.
But raw images display a significant amount of detail and provide a useful first look at the image data from a mission, in a way anyone can access. They make it easy to see, at a glance, the abundance of different surface features, and the shapes and structure of features revealed in the images. Trained researchers don't use raw images for detailed scientific measurements, but there's a lot of useful, high-level information in such images that might motivate deeper inquiry.
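To see why the compression matters, here's a quick Pillow sketch (mine, not from the FAQ) showing that a lossy JPEG round trip shifts pixel values while a lossless PNG round trip does not:

```python
# Sketch: why lossy-compressed browse images aren't used for precise measurements.
import io
import numpy as np
from PIL import Image

# Synthetic 8-bit grayscale frame: a smooth gradient with one bright "feature".
frame = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
frame[100:110, 100:110] = 255
original = Image.fromarray(frame)

def roundtrip(img, fmt, **kwargs):
    """Encode to `fmt` in memory, decode again, return as an int16 array."""
    buf = io.BytesIO()
    img.save(buf, format=fmt, **kwargs)
    buf.seek(0)
    return np.asarray(Image.open(buf).convert("L"), dtype=np.int16)

jpeg = roundtrip(original, "JPEG", quality=75)   # lossy
png  = roundtrip(original, "PNG")                # lossless

print("max JPEG error:", np.abs(jpeg - frame.astype(np.int16)).max())  # > 0
print("max PNG error: ", np.abs(png - frame.astype(np.int16)).max())   # == 0
```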
With all that being said, it certainly is strange that we don't have a camera on, say, something like the Deep Space Climate Observatory. I don't know.
It seems using traditional cameras is not exactly an easy thing. Here's an article where they used glass to protect a very tiny digital camera:
Alongside the satellite, the TRISAT-R team sent up super small cameras made with clear borosilicate glass lenses (a highly durable form of glass) mounted directly onto 320x320 pixel image sensors, according to the statement. That's where we get our wonderfully faulted view.
You could send up a camera sealed in vacuum, but that would be a lot of weight, I guess.
Right, at least outside the magnetosphere. We used to send film cameras up on massively heavy satellites. Spying purposes, you know. When the film was full, the vessels containing them would be conveyed to a return capsule that detached from the satellite (now worthless) and fell back to Earth to be retrieved and developed.
Transistors did a lot for spaceflight. Specifically: they destroyed human spaceflight.
What a meaningless distinction.
Yeah, it's a technical definition, but it gets the point across, does it not?
We've talked about this before. I try to keep an open mind. I personally believe space exists, or something like space, anyway.
This sounds like a reasonable explanation:
https://science.nasa.gov/solar-system/multimedia/raw-images-faq/
https://www.space.com/trisatr-satellites-earth-image-space-camera
https://www.vice.com/en/article/the-first-spy-satellites-had-to-drop-gigantic-buckets-of-film-back-to-earth/
:shrugs:
"massively heavy satellites" = balloons
Back to the Qult board with you, degenerate. You have no place here.