Reply To: GoogleEarth image: SLAM-ER caught in flight?
It’s surprising how many people will fail color separation tests of similar difficulty to spotting the wing and tail outlines against the cluttered background in the original pic.
Most people have never taken these tests and don't know they have this slight deficiency, so you will have a hard time convincing them that what they cannot see is actually there. Although not 100% applicable in this instance, this page shows what I mean: http://www.toledo-bend.com/colorblind/Ishihara.html
I am well aware of color blindness. With a background in science, aviation, and electronics, I know how important normal color vision is. About 9% to 12% of the male population has some degree of color blindness; females have a much lower likelihood.
However, I really don't think it applies much in this situation. For some reason the color seems washed out on Google. The image I originally posted as a Google Maps link is nearly drained of color, almost black-and-white, yet when you zoom out far enough it transitions to fairly realistic color.
I think most of it is a mind's-eye thing. People who are used to looking at aircraft and analyzing photographs are more likely to recognize it as an aircraft.
It could also have something to do with software, video card, monitor, video settings, etc.
So there are many reasons that people don’t see the same thing.
I don't know if contrailjj altered his image, but I took the liberty of clipping it and inserting it into a montage to show how the same aircraft can look different under different circumstances. The rest of the photographs were all viewed on the same computer, yet there is still variation. Color quality seems to vary drastically on Google; I suspect it is because the photographs are taken at different times, under different weather conditions, with different cameras.
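For anyone who wants to see the washout effect for themselves, here's a rough sketch using Python's Pillow library that desaturates and brightens a photo the way some Google tiles appear. The filename and the enhancement factors are just placeholders I picked for illustration, not anything Google actually does:

```python
from PIL import Image, ImageEnhance

# "aircraft.jpg" is a hypothetical stand-in for any of the montage photos.
src = Image.open("aircraft.jpg")

# Near-grayscale, slightly lifted version, like the almost
# black-and-white Google tile (factors chosen by eye).
washed = ImageEnhance.Color(src).enhance(0.1)          # 10% saturation
washed = ImageEnhance.Brightness(washed).enhance(1.2)  # brighter

# Fuller-color version, like the zoomed-out view.
vivid = ImageEnhance.Color(src).enhance(1.3)

washed.save("aircraft_washed.png")
vivid.save("aircraft_vivid.png")
```

Put the two outputs side by side and the same airplane looks like two different photos, which is all the montage was meant to show.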
Anyway, the Google image of the airplane in flight has roughly the same color and density as the background. The background also acts as camouflage, so the faint outline of the wing and tail blends into it. People who are used to looking at photographs and aircraft will have an easier time discerning the wings.
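You can even put a rough number on that blending. Here's a quick sketch, again with Pillow, that compares the average luminance of a patch over the wing against a nearby patch of terrain; the filename and crop coordinates are hypothetical, picked by eye in an image viewer:

```python
from PIL import Image, ImageStat

# Convert to grayscale so we compare luminance only.
img = Image.open("google_tile.png").convert("L")

wing = img.crop((310, 220, 360, 250))        # (left, upper, right, lower)
background = img.crop((400, 220, 450, 250))  # nearby patch of terrain

wing_mean = ImageStat.Stat(wing).mean[0]
bg_mean = ImageStat.Stat(background).mean[0]

# A small difference on the 0-255 scale means low contrast,
# which is exactly why the outline is so hard to pick out.
print(f"wing: {wing_mean:.1f}, background: {bg_mean:.1f}, "
      f"difference: {abs(wing_mean - bg_mean):.1f}")
```

When the difference is only a few levels out of 255, it's no surprise that people who aren't used to studying aircraft photos can't see the outline at all.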
So I think it's partly software, hardware, settings, eyesight, and the mind's eye.

Can you tell me why some aircraft starboard navigation lights look blue and some look green? ;)