• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Question for the Science Minds here . . .

Zachary Smith

Why do things which are farther away look smaller?

Seriously, I'm wondering. I realize that we see, unless we are looking at an actual light source, by capturing photons reflected off an object onto our retinas. I'm guessing (trying to figure this out) that the closer an object is to our eyes, the more photons we capture from any item we are looking at. Obviously the act of interacting with what we are seeing affects the photons in order for them to communicate the information of VISION, i.e. shape, color, distance, etc.

Does it follow, then, that photons reflect RANDOMLY off an object, parallel to the surface of an object, or what? It seems to me that if our retinas are pulling in fewer photons reflected off a given surface, then rather than making the object look smaller, it would simply be more out of focus and indistinct. I realize there is a quality of "hazing" which comes with distance, but that is an atmospheric phenomenon, if I'm not mistaken. In a vacuum you wouldn't have interference as a factor, so you would simply be receiving fewer photons as others headed off in other directions. How MUCH information does a photon carry? Does it convey only the information contained within its point of contact, i.e. photon-sized specks of information from the object we are looking at, as if they were pixels?

It makes common sense that objects farther away look smaller, because it is our universal experience that they do. What I'm trying to understand is the mechanism of WHY more distant objects look smaller.

Can anyone help?
 
Is this just a question of geometry?

What you are talking about is plane perspective. It is simple to demonstrate with drawings if somebody here can be bothered. It has nothing to do with the quantity of photons, but with the angle at which they arrive at your eye. Light coming from a distant object spans a shallower angle.

If something is moved 400 times further away, it appears 400 times smaller. That is why the moon eclipses the sun with roughly the same sized disc: even though the sun is about 400 times bigger, it is also about 400 times farther away, so the two appear the same size from Earth.

Apparent (angular) size ≈ Actual size / Distance away.

Your 3D graphics card does the same geometric calculation (the perspective divide) to recreate these effects.
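The sun/moon coincidence can be checked with a quick back-of-the-envelope sketch in Python. The diameters and distances below are the standard round astronomical figures, not taken from this thread:

```python
import math

def angular_size_deg(actual_size, distance):
    """Angular size in degrees subtended by an object of a given
    diameter at a given distance (same units for both)."""
    return math.degrees(2 * math.atan((actual_size / 2) / distance))

# Standard round values: the sun is ~400x wider but ~400x farther away.
sun_diameter_km, sun_distance_km = 1_392_000, 149_600_000
moon_diameter_km, moon_distance_km = 3_474, 384_400

sun_deg = angular_size_deg(sun_diameter_km, sun_distance_km)
moon_deg = angular_size_deg(moon_diameter_km, moon_distance_km)
```

Both come out to roughly half a degree of sky, which is why total eclipses are possible at all.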

But yes, there will be fewer photons too. This is the inverse-square law of light attenuation, roughly meaning that things further away appear less intense because the same light must spread over a larger area.

Furthermore, if that light passes through an atmosphere it is scattered, making things appear hazy and lowering their colour saturation.
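A minimal sketch of the inverse-square law, assuming a point source radiating evenly in all directions:

```python
def relative_intensity(distance, reference_distance=1.0):
    """Inverse-square law: received intensity scales as 1/d^2,
    relative to the intensity measured at reference_distance."""
    return (reference_distance / distance) ** 2

# Double the distance -> a quarter of the light per unit area.
# Ten times the distance -> one hundredth.
```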


Photons do reflect randomly from an object (well, not truly randomly).
From a mirror they reflect perfectly, as you know. From other surfaces it varies. Close up, most surfaces look like moon landscapes, which scatter photons in all directions. Reflection is most likely perpendicular to the surface and least likely parallel to it.
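That "most likely perpendicular, least likely parallel" behaviour is Lambert's cosine law for ideal matte surfaces; a tiny sketch:

```python
import math

def lambert_intensity(theta_deg):
    """Lambert's cosine law: apparent intensity of light leaving an
    ideal diffuse (matte) surface falls off as cos(theta), where
    theta is the angle measured from the surface normal."""
    return math.cos(math.radians(theta_deg))

# Strongest straight out from the surface (theta = 0),
# half strength at 60 degrees, zero along the surface (theta = 90).
```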
 
Well, it's not just a question of the geometry, I suppose. There seems to be an unequal loss of data. I am, for example, at the moment looking at a bottle of fish-oil capsules across the room from me. I can see the shape, color, etc. Now, while I can see there is lettering on the bottle, I cannot read the label from here. The point I'm wondering over is that, while I lose some of the fine details, such as the individual lettering, there is NO loss of information regarding the color, shape, etc. The margins are distinct. I can clearly see the perimeter of it. The coloration is as vivid as if it were inches from me.

Yet it looks smaller. I recognize that to a certain extent we are looking at limitations inherent in the biology of the human eye. The information IS reaching me; I simply cannot discern it. This could be readily demonstrated with the use of a magnifying application like a lens or such.

What makes me curious is that, while I lose particular details, others remain extant to me. More, while the object evidently appears smaller (I'm certain it hasn't actually changed size, though I can't demonstrate this) as a result of--what?--photon spreading, in connection with your description of plane perspective, the appearance of the item remains totally clear to me. It just LOOKS smaller. Even looking with just one eye, to eliminate binocular vision, it STILL looks smaller. But ONLY smaller. It does not look less distinct. It does NOT look less intense. Only the details of the writing on the label appear significantly different due to distance. Otherwise, the only notable difference is that a bottle which is about 7 inches tall looks to be about 1/4 inch tall from where I'm sitting. Thus it seems the loss of information I'm getting as a consequence of distance is not uniform.

It seems very weird to me.
 
Is this just a question of geometry?

It is. Electromagnetic waves at the frequencies of visible light propagate in straight lines. Things appear smaller farther away because they subtend a smaller visual angle.

The quality of what one sees, if I am to hazard a guess, is probably limited by our visual system's ability to process the signals it receives, disregarding regular bad eyesight. It's our brain's fault: there's a limit to how finely it can sample the visual signal for perfect reconstruction. Smaller details (as an object moves further away, for instance) result in smaller changes in the visual signal, requiring finer sampling in order for the brain to extract the information from the signal. I think.
 
The point I'm wondering over is that, while I lose some of the fine details, such as the individual lettering, there is NO loss of information regarding the color, shape, etc. The margins are distinct. I can clearly see the perimeter of it. The coloration is as vivid as if it were inches from me.

What you've lost is resolution. Your eyes have a sampling rate, defined by the number of rods and cones distributed across your retina. So yes, it is a matter of perspective, because this determines which rays of light actually are picked up by your eye... the farther away something is, the fewer rays you'll see. Each individual ray still contains all the information it had before (intensity, wavelength); you're just seeing fewer of them.

And even then, our eyes are much, much worse than we generally believe, with only the very center of our visual field having enough resolution to read normal-sized letters. Our brain is just very good at combining all the details our constantly scanning eyes pick up.
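This explains why the bottle's lettering disappears before its outline does: compare the visual angle of each detail against the eye's resolving limit. A sketch assuming the commonly quoted ~1 arcminute limit for 20/20 vision (the millimetre figures are made-up round numbers, not measurements from the thread):

```python
import math

ACUITY_ARCMIN = 1.0  # commonly quoted resolving limit of a 20/20 eye

def subtended_arcmin(size_mm, distance_mm):
    """Visual angle, in arcminutes, subtended by a detail of the
    given size viewed from the given distance."""
    return math.degrees(math.atan(size_mm / distance_mm)) * 60

def resolvable(size_mm, distance_mm):
    """True if the detail subtends at least the acuity limit."""
    return subtended_arcmin(size_mm, distance_mm) >= ACUITY_ARCMIN

# A 1 mm letter stroke is resolvable at arm's length (~500 mm)
# but not from across a room (~4000 mm), while a ~70 mm bottle
# outline still subtends about a degree from across the room.
```

So the shape and colour survive the trip across the room while the label does not, exactly the "non-uniform loss" described earlier: nothing is lost from any individual ray, but details smaller than the sampling grid can no longer be separated.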
 
So yes, it is a matter of perspective...

:lol: you trekker you.

I think the loss of information you're talking about is to do with the eye's retina. It contains an array of sensors, which combine exactly like the pixels of a bitmap picture. Things further away appear smaller, which means the information carried (as it passes through the retina to the brain) is quantized into a smaller number of pixels. Information is lost exactly like information is lost when a bitmap is resized smaller: 1 MP has less information than 10 MP, etc.

What is preserved are colours and forms.

Resizing the image doesn't change the colour of the photons. And only when you get down to really small sizes does form become lost.

The reason we don't see pixels is threefold:

(1) The layout of the sensors in our retina (the pixels of the image) is irregular, something like the arrangement of seeds in a sunflower head. So there are no horizontal or vertical axes to the picture.



(2) Light hitting the retina is slightly diffused because the optics of the eye are never perfect, and one bright spot can illuminate the interior of the eye, which is why bright lights have a soft glow/aura around them. It is like softening an image in Photoshop.



(3) The brain doesn't give you the information as a bitmap image. The pixels are blended, processed, and distilled into forms of shape and colour, mostly in your central field of vision.



There is too much information in full video and you'd be overwhelmed by it. Think of the bandwidth. The visual cortex is one of the largest energy consumers in the brain. Your brain runs at roughly 20 watts: a little less when you sleep, and a little more when you perceptually analyse.

Remember that vision evolved for hunting/gathering/evasion. It extracts details relevant to this such as movement and shape, and estimated distances. The belief that you're seeing a complete image is largely an illusion. Detail is only added when you shift your consciousness to that area.
 