What really causes "diffraction blur" ?


What really causes "diffraction blur" ?

Post by rjlittlefield »

In another thread, starting here, we began to revisit the issue of "diffraction" and why it is that stopping your aperture down too far causes the image to become blurred.

For context, let me quote the most relevant snippets from the last several postings in that other thread.
rjlittlefield wrote:One caution about stopping down farther: diffraction blur.
...
Stopping down farther would not have exposed much more detail in areas that are now OOF, and it would have softened even farther areas that are now fairly sharp.
...
...inserting a 2X teleconverter also doubles the effective f-number, very much like extending the lens to get the same magnification. There's no free lunch -- the tradeoff between DOF and diffraction blur is the same no matter how you get the magnification.
Harold Gough wrote:On the matter of diffraction in lenses in general, I have always understood that the concern is about that arising from the edges of the diaphragm. As you close down to a smaller and smaller aperture, that part of the image with the worst of the diffraction effects contributes an incrementally increasing proportion of the image. I don't see how that can be increased by using a teleconverter, which uses the part of the image furthest from the diaphragm.
rjlittlefield wrote:The edges of the diaphragm really have no significant effect.

What matters is that the aperture restricts the maximum angle between light rays, which in turn restricts their ability to interfere with each other to produce differences in intensity that the sensor can detect.

Graham Stabler and I had a long discussion about this some months ago. The discussion is hard to follow, and then takes off on another point, but take a look at the illustration at http://www.photomacrography.net/forum/v ... 0&start=47 .

The point is that light rays striking the sensor at relatively steep angles with respect to each other can interact to form fine patterns of intensity, while light rays striking the sensor at narrower angles can only form coarser patterns.

DOF depends on the same angles, hence my comment that there is no free lunch -- the tradeoff between DOF and sharpness is the same no matter how you get the magnification. The nominal setting of the lens to get a particular DOF/sharpness will vary depending on whether you use extension or a teleconverter, but the combination of DOF and sharpness will not.
Harold Gough wrote:I'm not sure that all this information about interference has much to do with diffraction, the latter being about change of direction, in which scattering would be included.
What we're wrestling with here is how to explain and understand what happens. The facts of what happens are quite straightforward and pretty easy to confirm by experiment: at fixed magnification, DOF and diffraction blur both depend on the effective aperture, and it makes no difference how that magnification and effective aperture are achieved.

The challenge is how to make sense of the facts. Here we are not helped by decades of photography literature that have used the word "diffraction" in several different ways, and have seldom used the word "interference" at all.

I wish I knew how to write an explanation that is simultaneously short, clear, and easily understood by people of many backgrounds. If I did, I could make a bundle of money selling copies to put in textbooks. But I don't. Nonetheless, I'm going to try again anyway, right now. Bear with me...

-------------------------------------------------------------------

The effects we're talking about are all due to the "wave nature" of light.

Like other waves, light waves do several interesting things --- they scatter off obstacles, they bend around corners, and they interfere with each other.

All of these effects are due to the same underlying physics. The different words reflect our human need to have simple models that adequately describe situations we care about.

Most explanations of image formation start at the subject and work their way toward the sensor.

I would like to try it in the other direction -- start at the sensor and work our way back to the subject.

At the sensor, what is important is "interference". All of our current sensors respond only to the intensity of light, not its phase. To form an image that the sensor can detect, light waves must come simultaneously from several different directions and interfere with each other to form a "standing wave" pattern in which regions of high and low intensity stay in the same place while the waves continue to oscillate. The high intensity regions are then "bright", the low intensity regions are "dark", and so on. In the words of most textbooks, "a real image has been formed".
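
For anyone who likes to see numbers, here is a toy sketch of that idea in Python (the wavelength and angles are just illustrative choices, not anything measured). Two plane waves crossing the sensor normal at plus and minus theta interfere to give fringes spaced lambda/(2 sin theta) apart, so a wider angle between the waves means finer detail:

Code:
import numpy as np

# Toy model: two monochromatic plane waves hit the sensor at +/- theta to its
# normal.  Their interference gives fringes spaced lambda / (2*sin(theta)) apart,
# so a wider angle between the waves produces finer detail.
# The wavelength and angles are illustrative choices only.
lam = 550e-9                        # green light, in metres
x = np.linspace(0, 20e-6, 2000)     # 20 micrometres across the sensor
k = 2 * np.pi / lam

for theta_deg in (2.0, 10.0, 30.0):
    theta = np.radians(theta_deg)
    # Sum of the two unit-amplitude waves, evaluated along the sensor plane.
    field = np.exp(1j * k * np.sin(theta) * x) + np.exp(-1j * k * np.sin(theta) * x)
    intensity = np.abs(field) ** 2                   # what the sensor records
    # Measure the fringe spacing from the pattern itself (distance between minima).
    interior = intensity[1:-1]
    is_min = (interior < intensity[:-2]) & (interior < intensity[2:])
    minima = x[1:-1][is_min]
    measured = np.mean(np.diff(minima)) if len(minima) > 1 else float("nan")
    predicted = lam / (2 * np.sin(theta))
    print(f"half-angle {theta_deg:4.1f} deg: measured fringe spacing {measured*1e6:.2f} um, "
          f"predicted {predicted*1e6:.2f} um")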

To repeat, the process of forming a real image has everything to do with "interference", and nothing to do with "scattering" or "bending around an obstacle".

OK, now back up one step. Where do the light waves come from, that interfere with each other to form an image?

Of course they come from the lens, having gone through the aperture.

The primary action of the aperture is to restrict the range of directions from which light rays can reach the sensor. This has three principal effects:
1. It reduces the average intensity at the sensor.
2. It increases depth of field (by reducing the size of the blur circle for out of focus points).
3. It reduces the achievable resolution of the image formed at the sensor.

That last effect is what photographers normally call "diffraction blur", even though it's really an interference effect.

Perhaps this illustration will help:
Image

Notice the caption on the image.
Light coming from multiple directions forms a "real image" through interference that produces a standing wave pattern.
The size of detail that can appear in the image depends on the range of angles that the light comes from ---
a wider range of angles (larger aperture) can form a more detailed image.
The famous "Airy disk" is simply the real image formed by a point source imaged through a circular aperture, producing a restricted range of angles. A narrower range of angles produces a larger Airy disk, just as in the illustration above a narrower range of angles produces a wider band spacing.

To add some more context...

It's true that if you look very closely at the edge of the aperture, you can see some effects that would probably be called "scattering" and "bending":

Image

But it's important to note that these "scattering" and "bending" effects are what you see when you focus your attention on the edge of the aperture. When you focus your attention on the image being formed at the sensor, it's the interference effect that matters.

OK, working our way back toward the subject, we run into the lens also.

For this discussion, let's treat the lens as just a lump of stuff that "changes the direction of light rays so they come to a focus" back at the sensor. That lump of stuff has some width (restricted by the aperture), and it's that width that provides the range of angles that we need to form a real image.

But there's a subtle detail here. Consider a single point on the sensor, representing a single focused point on the subject. Light exiting the back of the lens, at various angles and positions so as to focus on that point, had to enter the front of the lens at various angles and positions as well. In addition, all of that light has to be coherent enough to form a real image by interference.

So, how does it happen that a single point on the subject manages to send out a set of light waves that can travel different distances, be redirected by the lens, pass through the aperture, and end up sufficiently coherent at the sensor to form that real image?

The answer to that question really is "scattering", but this time it happens at the subject, and it's critical to image formation.

To cement this picture, let's run through the process again, but in the usual order -- subject to sensor.

Incoming illumination strikes the subject. Through scattering, each point on the subject produces waves of light that propagate as expanding spheres centered on that point. Some of these waves strike the front of the lens, where they get bent by refraction at the lens surface. Continuing on, the waves strike the aperture. At the aperture, a small fraction of the waves very near the edge of the aperture do get scattered/bent by diffraction, but most of the waves either get blocked entirely or make it through the aperture with no significant change. The waves that do get through the aperture with no significant change get refracted some more by other lens surfaces, then continue on to the sensor, where they form a real image by interference.

I hope this explanation clarifies the physics.

The photography community uses the term "diffraction" in a couple of different and incompatible ways. Sometimes it means "scattering and bending around obstacles", and sometimes it means "anything related to the wave nature of light". In the phrase "diffraction blur", the meaning is "wave nature of light". The blur is due to limiting the range of angles available for interference, not due to scattering or bending around edges of the aperture.

--Rik

PS. Thanks are due to Graham Stabler for pointing out this way of thinking about the process. It took me a while to switch gears, too, but now I find this model to be more clear than anything I had seen before. Graham will probably twitch a little at a couple of my phrasings in this post, but I've tried to strike a balance between precision and clarity.


Post by mgoodm3 »

So basically, the width of the diffraction pattern (Airy disc) is inversely proportional to the angle that the edges of the aperture make with the detector.


Post by rjlittlefield »

Correct.


Post by Graham Stabler »

looks good to me!

Graham


Post by rjlittlefield »

Ah, now that's encouraging to hear! Thanks for checking in, Graham. :D

--Rik


Post by Harold Gough »

Effect 3 is, of course, the issue.

After reading your explanation I really need to lie down in a darkened room, free of interference, for some time!

Your theory seems to work only for the wave form of light and not for particulate (photon) form. Or have I got that wrong?

How does this fit in with the centre of lenses giving better resolution than the edges?

Harold


Post by DaveW »

I am very thick, Rik, but I am trying to understand, being a joiner and not a scientist, so you can put me right in simple terms when you say:

"It's true that if you look very closely at the edge of the aperture, you can see some effects that would probably be called "scattering" and "bending":

As I weighed it up in the past, when you stop down, the ratio of diaphragm edge to the clear centre volume that light rays pass through increases. I had therefore always presumed that your "scattering and bending" effects would be proportionately greater at smaller apertures than at larger ones, and so would degrade an image taken at a smaller aperture more than one taken at a larger aperture?

Putting the clear centre volume to diaphragm edge ratio another way: if you could close the diaphragm right down so the central hole closed altogether, it would be 100% diaphragm edge to 0% centre volume, and as you open the diaphragm up, the centre volume percentage becomes larger and the diaphragm edge percentage smaller. Therefore any diaphragm edge effects decrease proportionately as the diaphragm is opened up?

That does not argue with your light waves idea, but it does mean the diaphragm edge effects become proportionately more degrading to the image as you stop down? So are you not slightly underplaying the effect that diaphragm edge scattering and bending have on degrading the image as you stop down?

Keep the rebuttal simple so a Secondary School boy who left at 15 in the 1950's can understand it! :?

DaveW :)


Post by Harold Gough »

As a grammar school boy who was not very good at physics, but is now retired after a career in science (biology/ecology), my mind is somewhat blown by these explanations, but my contention was much the same as Dave's more comprehensible account.

By coincidence, I have just been dealing (as executor of the estate) with old school reports, for myself and my siblings, found among our late mother's personal papers.

Harold


Post by mgoodm3 »

I take it that the edges of the diaphragm have very little to do with the resulting pattern other than restricting the angles of light that intercept the detector. Light striking the detector at limited angles (i.e., similar angles) will produce a wider diffraction pattern.

Kinda like the interference of sound waves. Waves of similar frequency will produce a longer wavelength interference pattern.


Post by mgoodm3 »

Does this sound reasonable?

The Airy disc is the summation of a multitude of individual diffraction patterns caused by the various angles of incoming light. With a narrow aperture the range of angles is limited to those that are relatively similar and a wider diffraction pattern dominates (wider Airy disc). With an open aperture you will add in much narrower patterns from more widely angled waves and those will tend to narrow the summation (smaller Airy disc).


Post by rjlittlefield »

Harold Gough wrote:Your theory seems to work only for the wave form of light and not for particulate (photon) form. Or have I got that wrong?
The wave equations give the probability of sensing a photon at any particular place.

Perhaps you are thinking of "ray theory"? Ray theory is great for designing lenses, but it says nothing at all about diffraction. Essentially, ray theory tells you where the center of the Airy disk will be, but nothing about its width.
Harold Gough wrote:How does this fit in with the centre of lenses giving better resolution than the edges?
The edges of lenses suffer from aberrations. That's why most lenses do better stopped down a notch or two, contrary to what would be predicted by diffraction theory ignoring the aberrations. What you need to get best resolution is both a wide aperture and a lens with small aberrations so that its edges can contribute properly.
DaveW wrote:are you not slightly underplaying the effect the diaphragm edge scattering and bending does have on degrading the image as you stop down?
No, I'm not.

Remember what I wrote earlier, "All of these effects are due to the same underlying physics."

Everything that happens everywhere is due to a continuous process of diffraction and interference.

What an aperture does is to remove some wave components that would otherwise contribute to make the interference pattern look like simple wave propagation. Removing those components causes the pattern to become complicated.

What the sensor observes is the complicated pattern at the sensor position.

You can interpret that complicated pattern in terms of "scattering and bending" around edges of the aperture, or you can interpret it in terms of adding up contributions from light coming in from various angles. But there's only one pattern. If you do the math carefully enough, both interpretations produce the same result.

Your model of looking at "clear centre volume to diaphragm edge ratio" works fine for many purposes. Suppose we reduce the diameter of the aperture by 2X. Then the clear centre volume reduces by 4X while the edge reduces by only 2X, so the ratio reduces by 2X, which predicts that the diffraction blur will increase by 2X.

That answer (2X increase in diffraction blur) is correct. Other answers produced by your model will also be correct, and it's easy to see why: change^2/change = change. So if you're happy thinking in terms of the ratio, then by all means keep using it.
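
Spelled out as a toy sketch (the diameters are arbitrary illustrative values): for a circular aperture the clear centre is an area, which scales as the square of the diameter, while the edge is a circumference, which scales with the diameter, so the centre-to-edge ratio scales with the diameter and the predicted blur scales as its inverse:

Code:
import math

# DaveW's "clear centre to diaphragm edge" ratio for a circular aperture of
# diameter d:  area / circumference = (pi*d*d/4) / (pi*d) = d/4.
# Halving d halves this ratio, and the predicted blur (proportional to 1/ratio,
# hence to 1/d) doubles -- the same 1/d scaling the angle picture gives.
# The diameters are arbitrary illustrative values.
for d in (8.0, 4.0, 2.0, 1.0):
    ratio = (math.pi * d * d / 4) / (math.pi * d)    # simplifies to d / 4
    print(f"diameter {d:4.1f}: centre/edge ratio = {ratio:4.2f}, relative blur ~ {1/ratio:4.2f}")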

However, when you're doing that, be careful not to fall into traps like trying to use "the part of the image furthest from the diaphragm". That idea is promising if you're thinking in terms of bending and scattering, but in fact it will make diffraction blur worse, not better. Cutting off the part of the image near the diaphragm is equivalent to just using a smaller diaphragm -- you've eliminated the previous problematic areas, but added new ones that are even worse.

Personally I now find it simpler to skip the intervening steps and just think directly in terms of the angle. Both strategies yield the same result, but thinking in terms of the angles produces a nice unification of DOF and diffraction blur that makes it simpler to think about the tradeoffs.
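
To illustrate that unification with a toy calculation (the wavelength, the focus error, and the f-numbers are illustrative, and adding the two blurs in quadrature is only a rough approximation): the defocus blur shrinks as you stop down while the diffraction blur grows, so the combined blur has a minimum at some intermediate effective aperture.

Code:
import math

# One angle, two blurs.  For a focus error delta measured at the image plane:
#   geometric (defocus) blur diameter ~ delta / N_eff          (shrinks stopping down)
#   diffraction (Airy) blur diameter  ~ 2.44 * lambda * N_eff  (grows stopping down)
# Combining them in quadrature is a common rough approximation.
# The wavelength, focus error, and f-numbers are illustrative values only.
lam_mm = 0.55e-3      # 0.55 micrometre wavelength, in mm
delta_mm = 0.5        # 0.5 mm focus error at the image plane

for n_eff in (4, 8, 16, 32, 64):
    geo_um = delta_mm / n_eff * 1000
    diff_um = 2.44 * lam_mm * n_eff * 1000
    total_um = math.hypot(geo_um, diff_um)
    print(f"N_eff {n_eff:3d}: defocus blur {geo_um:6.1f} um, diffraction blur {diff_um:5.1f} um, "
          f"combined ~ {total_um:6.1f} um")

In that list the combined blur is smallest around N_eff = 16; the exact optimum depends on the focus error you assume, but the shape of the tradeoff does not.
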
mgoodm3 wrote:Does this sound reasonable?

The Airy disc is the summation of a multitude of individual diffraction patterns caused by the various angles of incoming light. With a narrow aperture the range of angles is limited to those that are relatively similar and a wider diffraction pattern dominates (wider Airy disc). With an open aperture you will add in much narrower patterns from more widely angled waves and those will tend to narrow the summation (smaller Airy disc).
That sounds fine.
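
If it helps, here is a one-dimensional toy version of exactly that summation (the wavelength, the angles, and the uniform weighting across the cone are idealised, illustrative choices): add up plane waves arriving over a cone of half-angle theta_max, and watch the central bright peak narrow as the cone widens.

Code:
import numpy as np

# 1D version of the summation picture: coherently add plane waves arriving over
# a range of angles +/- theta_max and see how wide the central bright peak is.
# Uniform weighting in sin(theta) is an idealisation; the wavelength is illustrative.
lam = 550e-9
k = 2 * np.pi / lam
x = np.linspace(-5e-6, 5e-6, 4001)          # position across the sensor, metres
centre = len(x) // 2                        # index of x = 0

for theta_max_deg in (5.0, 15.0, 45.0):
    s = np.sin(np.radians(theta_max_deg))
    u = np.linspace(-s, s, 801)             # sin(theta) sampled across the cone
    field = np.exp(1j * k * np.outer(u, x)).sum(axis=0)
    intensity = np.abs(field) ** 2
    # Half-width of the central peak: first point where the intensity has fallen
    # essentially to zero, which here lands near lambda / (2*sin(theta_max)).
    right = intensity[centre:]
    half_width = x[centre:][np.argmax(right < right.max() * 1e-4)]
    print(f"theta_max {theta_max_deg:4.1f} deg: central peak half-width ~ "
          f"{half_width*1e6:.2f} um (lambda/(2 sin theta_max) = {lam/(2*s)*1e6:.2f} um)")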

--Rik


Post by Harold Gough »

Yes, ray theory. Rays must represent photon paths.

A general point about theories, not specific to physics but probably its area of most relevance, is :

If we want to understand our physical world we are faced with reality and actuality.

Reality is our best understanding of how things are, and is expressed as theories. As we better understand the perceived world, we change our idea of reality, again expressed as theories. The model of atomic structure is a good example of this.

Actuality is the underlying cause of our reality. This involves such considerations as whether matter has any physical existence (my words) at particle scales, or whether it is all energy. These are phenomena, often observable only indirectly by their effects on reality. In this "world" particles, or parcels of energy, can spin in two directions at once and/or be in two places at once, in the latter case being connected (quantum effects). It is possible that we can never know actuality.

I told you I was not good at physics. Now you have proof.

:)

Harold


Post by DaveW »

The trouble about "facts" is they often change generation to generation. It was once a "fact" that the earth was flat and the sun revolved around the earth.

Stephen Hawking has since said that some of the theories in his first acclaimed book are incorrect. All "facts" are based on present knowledge, and we never know which "facts" will be disproved in the future. In fact (pun intended), all "facts", as far as we can really prove, are only theories we use to try to explain what we observe and fit the situation.

http://www.prospect-magazine.co.uk/arti ... hp?id=3490

http://news.bbc.co.uk/1/hi/sci/tech/3913145.stm

The only thing we really know for a fact is that we don't know, and probably never will know, what the facts are. We only know that we have some theories that seem to fit what we observe.

DaveW


Post by NikonUser »

Donald Rumsfeld said it best:

As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don’t know
We don’t know.

Feb. 12, 2002, Department of Defense news briefing


Post by Harold Gough »

And we know that closing down the aperture too far is a no-no!

:D

Harold
