Um, yeah, "interact". Superposition is a special kind of interaction from which the original waves eventually emerge unchanged. At least that's the way articles in the SIAM Journal talk about it. (SIAM = Society for Industrial and Applied Mathematics.) That model gives a nice transition to what happens with nonlinear media, in which the original waves may not emerge unchanged. More of those pesky "background and terminology" issues.
Anyway, you wrote:
Amen to that! It really is difficult to explain these sorts of issues when people's background is limited. The way in which things tend to be explained is also overly complicated and impenetrable, with maths thrown in for good measure. On the other hand, it is very easy to enter into empty explanations, the kind of stuff that TV and school are very good at churning out; they use terminology to replace real explanation. You can find yourself without enough information to truly understand, having instead to just take it for granted. I even find myself unable to understand explanations for things I already understand.
So let's see if we can do better...
This has turned into a great discussion.
I think we're all on the same page about diffraction. That model using superposition of many circular waves is equivalent to what my grid cell simulation does. Information spreads in all directions all the time, in accordance with a second-order differential equation, and it's a bizarre feature of the way those zillions of contributions add up that causes waves to propagate. I am always fascinated that such interesting behavior comes from a calculation that just says, in its entirety, that each cell accelerates in proportion to how far it lags behind the average of its neighbors.
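For anyone who wants to play with the idea, here is a minimal sketch of that kind of grid-cell update. This is my own toy version, assuming a standard leapfrog discretization of the 2D wave equation; it is not the actual simulation code from my posts, just an illustration of how little arithmetic the rule requires:

```python
import numpy as np

# Toy grid-cell wave simulation (illustrative sketch, not the original code).
# Each cell's next value depends only on its current value, its previous
# value, and the average of its four nearest neighbors -- the whole
# "calculation" is the single leapfrog update line below.

N = 64                      # grid size
c2 = 0.25                   # (c*dt/dx)^2, kept <= 0.5 for numerical stability
prev = np.zeros((N, N))
curr = np.zeros((N, N))
curr[N // 2, N // 2] = 1.0  # one initial disturbance in the middle

def step(curr, prev):
    # Sum the four nearest neighbors; edge cells stay clamped at zero.
    neighbors = np.zeros_like(curr)
    neighbors[1:-1, 1:-1] = (curr[:-2, 1:-1] + curr[2:, 1:-1] +
                             curr[1:-1, :-2] + curr[1:-1, 2:])
    # Acceleration is proportional to (neighbor average - own value).
    new = 2 * curr - prev + c2 * (neighbors - 4 * curr)
    return new, curr

for _ in range(100):
    curr, prev = step(curr, prev)
```

Run it for a few hundred steps and plot `curr`, and an expanding circular wave appears out of nothing but that one update line.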
But we're not really here to discuss diffraction, we're here to discuss "why" images get fuzzy when you stop down too far. I put the "why" in scare quotes to emphasize that we're talking about models here. Clearly nothing more than the differential equation is required to predict the result, but, um, that doesn't mean it's the best way to think about it.
The more I think about the "spatial filtering" model, the more I like the feel of it. I'm still not very satisfied with those particular words, since both "spatial" and "filtering" mean something different to most photographers than they do in this conversation. But I think one can do a lot with a little using the general concept that the ability of waves to represent detail depends on their angle of incidence.
Something like this, perhaps...
"Light is a wave. If you look close enough, a beam of light has peaks and valleys. Visual detail in the subject is carried to the camera's sensor in the pattern of those peaks and valleys. The peaks and valleys in the light beam are all the same distance apart, but when the light beam hits the sensor, the distance between the peaks and valleys on the sensor depends on the angle of the beam. When the beam strikes the sensor at a fairly sharp angle, the peaks and valleys are close together, and this allows fine detail to be captured. But as the beam becomes more perpendicular to the sensor, the peaks and valleys move farther apart and some of their ability to represent fine detail is lost. If the beam is too close to perpendicular, then the peaks and valleys move too far apart to represent even the detail you want to see, and the image begins to look fuzzy. Stopping down the aperture too far blocks the light at steep angles that can form sharp images, leaving only the light that produces fuzzy and fuzzier images as it becomes more and more perpendicular."
I like the flavor of this explanation. It relies just enough on waves to get the idea across, without getting bogged down in superposition and tiny circular angels dancing on the head of a wavefront. (Sorry, a bit of passing whimsy there... )
No doubt this can be improved upon. Have at it!
BTW, I chose to say "beam" for a definite reason. I think that most photographers understand "ray" to be something with no width. But the light has to have width for these pictures to make sense. Just trying to think about it from the consumer's standpoint...
--Rik