Original image - upper left horizon
It's been making the rounds on the 'net, and "awfuldumb" has not been very forthcoming with answers. The answers they have parted with have been disingenuous and ridiculous. This is how they typically respond when they don't have an answer, or don't want to give one: they reduce the question to an absurdity - an appeal to ridicule, the kind of fallacious argument favored by people who think they are superior to those asking the questions.
I really hate when these puffed-up jerks think they are smarter than all the rest of us.
So, how about some real answers? After all, they've pulled out everything from cosmic rays to alien bonfires to avoid giving the folks who paid for the toys any real information, because they think you and I are too stupid to 1) know the difference, and 2) figure it out for ourselves without "awfuldumb" spoon-feeding us what we are to believe.
To begin with, a bit of my credentials: I have a degree in Radio/TV/Film (before they called it mass media) and 35 years of experience as a photographer, videographer and cinematographer. I am proficient in lens theory and have a solid working knowledge of CCD imaging sensors. In other words, I know a little about the subject.
Next, the "light on Mars" is an imaging artifact, but not in the way they are telling us. It is an imaging artifact caused by an actual point-source of light on the horizon overloading the sensors. If you want the longer explanation, please stick around and I'll tell you how this is true.
First, the MastCam instrument on the rover is actually two cameras, commonly called the "left eye" and "right eye," with different focal lengths and f-stops. The "left eye" is a 34mm lens with an f-stop of 8 (f/8). The focal length is the distance from the optical center of the lens to the focal plane, where the image is in sharpest focus. For a frame of reference, the human eye sees roughly the equivalent of a 50mm lens (on a standard 35mm camera). A smaller focal length is what we call "wide angle," and a larger one is "narrow angle" or "telephoto." A "fish eye," or extreme wide angle, is around 8mm. So the "left eye" is the wider-angle camera of the pair, and at f/8 you would expect a good exposure on a partly cloudy day with diffuse sunlight.
The "right eye" is a 100mm lens with f-stop of 10 (f/10). What we know now is that the lens is 10x longer than the "left eye" and the amount of glass blocks more light, so the iris aperture (f-stop) must open more to provide the same illumination that the "left eye" does. Therefore, the "right eye" is a zoom lens that requires more light to make an image because the amount of glass used in the lens blocks more light.
One other bit of important information is the "contrast ratio." All imaging devices, including the human eye, have a range of light to dark within which detail can be seen. You can demonstrate this by walking from the bright outdoors into a darkened room: at first you see almost nothing, but as your eyes adjust, more and more detail emerges. The contrast ratio is a sliding window of light to dark in which the eye or sensor can see detail. Below the window everything is black, above it everything is pure white, and at both extremes no detail can be seen. The human eye has a ratio of somewhere between 400:1 and 10,000:1, depending on which source you check; the commonly used value is 2,000:1, meaning that in any scene, the brightest thing the eye can still see detail in is about 2,000 times brighter than the darkest. Typically, if something goes off the top of the scale, we call that "glare," "flare" or "glint." The average HD CCD chip has a contrast ratio of about 2,500:1, roughly equal to the human eye (the old analog TV system was about 30:1).
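To put those ratios into more familiar photographic terms, here's a quick conversion to "stops," where each stop is a doubling of light. Only the figures quoted above go in; nothing else is assumed.

```python
import math

# Convert a contrast ratio (brightest-to-darkest range with visible detail)
# into photographic "stops" -- each stop is a doubling of light.
def ratio_to_stops(ratio):
    return math.log2(ratio)

for label, ratio in [("Human eye (common figure)", 2_000),
                     ("Typical HD CCD", 2_500),
                     ("Old analog TV", 30)]:
    print(f"{label}: {ratio}:1 is about {ratio_to_stops(ratio):.1f} stops")
# Human eye (common figure): 2000:1 is about 11.0 stops
# Typical HD CCD: 2500:1 is about 11.3 stops
# Old analog TV: 30:1 is about 4.9 stops
```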
It's important to note that sensors designed to record visible light are not sensitive to "cosmic rays". Those types of radiation generally cause a fogging effect in the image where focus appears to be lost. They don't cause bright flares or glints.
The important thing to remember about the contrast ratio is that it is dynamic: the window slides up and down depending on the exposure (f-stop and shutter). Contrast is where we get detailed information - texture, depth and so on - and at the extreme ends of the ratio we only see pure black or pure white with no detail.
So now we have a working idea of how cameras and sensors work and we can analyze the Mars photo - and NASA's disingenuous non-answers - intelligently.
First, they tell us that there are two photos - one from the "left eye" and one from the "right eye". The "right eye" shows a bright spot on the horizon line, while the "left" does not, even though the images were taken either at the same time (phys.org) or a second apart (nasa.gov). Either way, it doesn't matter.
Zoom lenses (like the right eye) compress distance so that objects at different distances appear stacked on top of each other. Since the right eye is "zoomed in," it makes distant objects appear larger and closer than a normal or wide-angle lens would. Thus, simultaneous photos of the same scene through a wide lens and a zoom lens will "see" different things, including point sources of light. What shows up as a large, bright object in the zoom image would be tiny - probably less than one pixel - in the wide image. A point source in the zoom image might look large and fall off the top of the contrast ratio, while in the wide image it may not even be visible, much less exceed the contrast ratio.
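Here's a rough sketch of that point-source argument using simple pinhole-camera math. The object size, distance and 7.4-micron pixel pitch below are hypothetical numbers I picked for illustration - not measurements from the Mars photo - but they show how the same small surface can span a couple of pixels in the 100mm eye while staying below one pixel in the 34mm eye.

```python
# How big does a small reflecting surface appear, in pixels, at each focal
# length? Simple pinhole-camera (small-angle) model:
#   size_on_sensor = object_size * focal_length / distance
#   pixels_covered = size_on_sensor / pixel_pitch
# The object size, distance and 7.4-micron pixel pitch are hypothetical
# numbers chosen for illustration, not measurements from the Mars photo.

PIXEL_PITCH_MM = 0.0074

def pixels_covered(object_size_m, distance_m, focal_length_mm):
    size_on_sensor_mm = (object_size_m * 1000.0) * focal_length_mm / (distance_m * 1000.0)
    return size_on_sensor_mm / PIXEL_PITCH_MM

obj_size, distance = 0.05, 300   # a 5 cm shiny surface, 300 m away (made up)
print(f"Left eye  (34 mm):  {pixels_covered(obj_size, distance, 34):.2f} px across")
print(f"Right eye (100 mm): {pixels_covered(obj_size, distance, 100):.2f} px across")
# Left eye  (34 mm):  0.77 px -- smaller than a single pixel, easily lost
# Right eye (100 mm): 2.25 px -- spans a few pixels and registers as a bright spot
```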
Furthermore, the "left" and "right" eyes may have the same contrast ratio, but the different f-stops and exposures mean that the top and bottom of their scales sit at different scene brightnesses. Something that records as normal in the "left eye" can flare or glare in the "right eye," because the white end of the scale sits higher in the "left" than in the "right."
So, there are two primary mechanisms at work here. Both cameras have the same resolution (1600x1200 pixels), but they have very different focal lengths (34mm versus 100mm) and f-stops (f/8 versus f/10). That means the two imagers have different distance compression and different effective contrast windows, so both could image the same scene at the same moment and produce glare in the "right eye" but not in the "left."
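To make those two mechanisms concrete, here's a toy simulation - emphatically not MastCam's actual processing pipeline - of one bright glint landing on a single pixel in each camera. The radiances and fill fractions are invented for illustration; the point is simply that averaging within a pixel plus hard clipping at the top of the scale can leave the same glint unremarkable in one frame and blown out in the other.

```python
# Toy model -- NOT MastCam's real processing pipeline -- of one bright glint
# landing on a single pixel in each camera. Because the 100 mm "right eye"
# packs about 3x less scene (linearly) onto each pixel than the 34 mm
# "left eye", the same glint fills roughly 9x more of a right-eye pixel.
# The radiances and fill fractions below are invented for illustration.

FULL_SCALE = 255   # 8-bit ceiling: anything above this comes out pure white

def pixel_value(glint, ground, fill_fraction, gain=1.0):
    """Average the light falling on one pixel, apply exposure, clip at the ceiling."""
    radiance = fill_fraction * glint + (1.0 - fill_fraction) * ground
    return min(FULL_SCALE, round(radiance * gain))

GLINT, GROUND = 50_000.0, 80.0   # glint roughly 600x brighter than the terrain

right = pixel_value(GLINT, GROUND, fill_fraction=0.010)    # fills 1% of a right-eye pixel
left  = pixel_value(GLINT, GROUND, fill_fraction=0.0011)   # ~1/9 of that in the left eye

print(f"Plain terrain pixel:   {pixel_value(GROUND, GROUND, 1.0)}")   # 80
print(f"Left-eye glint pixel:  {left}")    # ~135 -- a bit brighter, easy to overlook
print(f"Right-eye glint pixel: {right}")   # 255  -- clipped: pure white, no detail
```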
One last point: having spent many hours trying to figure out where glare is coming from in a frame, I know that glare has different characteristics depending on whether it is caused by a reflection or by a direct source. In electronic sensors, a reflection generally produces glare with spikes in even numbers (2, 4, 6, and so on) around it. Reflections also tend to take the shape of either the source or the reflecting surface, so square sources appear square and round sources appear round. The sun or a lamp bouncing off a surface will make a star-burst pattern and will often have a definable shape. A direct source, by contrast, looks like a blob, because the sensor is picking up the full amount of light rather than the reduced amount scattered off a reflecting surface.
Now look at the photo at the top of this piece - the glare from Mars. Notice that the glare appears to be a vertical rectangle with two spikes coming out from either side. I interpret this as a reflection off a square or rectangular surface that exceeds the contrast ratio of the sensor. Since the lighting looks fairly low (not mid-day), the source must be more than 2,500 times brighter than the ambient light in the scene - something like a large spotlight, or the sun hitting a surface at the right angle. And since the glare is geometric, the reflecting surface must be geometric, and small enough that the entire surface is reflecting light toward the camera.
Crop from original - note shape
Speculation: The reflecting surface is not a natural object - it is geometric. The light source being reflected is very bright - it could be the sun (if the angle is right) or a large studio light designed to illuminate wide areas (a 5kW or 10kW Fresnel instrument). The glare disappeared in subsequent frames because 1) the rover or MastCam moved, 2) the source moved, 3) the source was turned off, or 4) the reflecting surface changed in some way. Given the small size of the glare relative to the overall image, it would not take much movement in any of those components to eliminate it.
In order to create a geometric glare, the surface would have to be very smooth and shiny. A rough or uneven surface scatters light in too many directions and would not create a glare in the first place, especially at the distance implied by the image. Materials that behave this way would be polished metal, glass and similar materials not found in Nature.
Therefore, either the source, the reflecting surface, or both are artificial. One, some or all of them moved in subsequent frames to eliminate the glare (sun angle, etc.). Finally, all of the arguments put forth by NASA and their mouthpieces so far are either deliberate obfuscation or the mark of complete incompetence (take your pick).