Making scale bars with no calculations (OT-->diffraction)

Epidic
Posts: 137
Joined: Fri Aug 04, 2006 10:06 pm
Location: Maine

Post by Epidic »

Graham Stabler wrote:Will,

The objective has a certain NA and hence a certain resolving power, and you are correct that this is not necessarily the resolution you will get at the image plane; the resolution at the image plane comes down to the system NA.

Typically you may have a high-NA objective and a low-NA tube lens forming the image; however, the effective NA of the second lens is higher because of the magnification. Basically, because the objective tends to collimate the light, the angles into the tube lens are much lower, so it should not reduce resolution.

That is a specific setup, but in general resolution comes down to the angles at the object plane, and then keeping that information when the light is collected; closing the aperture removes some of the light from the higher angles and hence reduces resolution. The angles in the illumination matter as well, of course.

Graham S
The angular size of the exit pupil is how you can define the f-number of a system - or, as you say, the angle of incidence of the light cone. The f-number of a system can be used to calculate (or predict) the resolving power at the image plane. NA cannot. This is simply the effects of diffraction.
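
As a rough numeric illustration of that claim (my own sketch, assuming green light at 550 nm and the standard Airy formula, spot diameter = 2.44 * lambda * f-number):

Code:

# Diffraction-limited spot size at the image plane, from the f-number alone.
# Wavelength and f-numbers are illustrative assumptions, not measured values.
wavelength = 550e-9  # green light, in metres

for f_number in (2.8, 8.0, 22.0):
    airy_diameter = 2.44 * wavelength * f_number  # metres
    print(f"f/{f_number}: Airy disk diameter ~ {airy_diameter * 1e6:.1f} um")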

Where are you getting the term "system NA"? Numerical aperture is based on the medium of the object and the angular size of the entrance pupil of the objective. It is not related to anything else. Tube NA??? How can components of an optical system beyond the objective and condenser have a numerical aperture?

If you read my post, you would see that I stated that resolution at the object plane is related to NA (angular size of the entrance pupil).

To answer your next post, this is not spatial filtering. Spatial filters are mechanical stops used with lasers to form coherent light for illumination (not image forming). Apertures are not added to do this. Also, you cannot know the resolving power of a system based solely on the exit pupil, as other factors contribute to the actual performance of a lens. While you may find it convenient to think that an aperture limits spatial frequencies, you would be mistaken to link it with spatial filtering.
Will

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

The NA can be used to predict the resolving power at the image plane as long as you do not lose further information in subsequent optics. Spot size is 1.22·lambda/NA; take that size, multiply it by your magnification, and ensure you have at least 4 pixels to sample it (Nyquist theorem).
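
To put numbers on that recipe (a quick sketch; the NA, magnification, and pixel size here are just assumed values):

Code:

# Spot size 1.22*lambda/NA at the object, scaled by the magnification,
# then sampled with at least 4 pixels. All values are assumptions.
wavelength = 550e-9    # metres
na = 0.25              # objective NA
magnification = 10
pixel_pitch = 6e-6     # camera pixel size, metres

spot_object = 1.22 * wavelength / na        # spot size at the object plane
spot_image = spot_object * magnification    # spot size at the image plane
pixels_per_spot = spot_image / pixel_pitch

print(f"object-plane spot: {spot_object * 1e6:.2f} um")
print(f"image-plane spot:  {spot_image * 1e6:.2f} um")
print(f"pixels per spot:   {pixels_per_spot:.1f}  (want >= 4)")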

The term system NA (I got it from working in the applied optics group at Nottingham University): if you have a finite-conjugate objective which images from the object plane directly to the image plane (much like the typical camera lens), then there is no place that information/angles/resolution can be lost except in that lens. If on the other hand you have some relay optics, or a system with an infinite-conjugate objective and a tube lens, then you must ensure that none of those optics remove any further information. Of course the tube lens is probably just a low-NA doublet, but because at that point in the system all the angles are reduced, it can still maintain the resolution of the system to that of the objective. Of course you still have to sample the image plane sufficiently, and I'm ignoring any aberration etc.
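
The arithmetic behind that point is short (a sketch; the objective NA and magnification are assumed values, and I'm using the Abbe sine condition, image-side NA ~ object-side NA / M):

Code:

# Why a low-NA tube lens loses nothing: with magnification M, the image-side
# cone is far gentler than the object-side cone.
na_object = 0.65      # assumed objective NA
magnification = 40

na_image = na_object / magnification   # Abbe sine condition
print(f"image-side NA needed: {na_image:.4f}")   # ~0.016, a very gentle cone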

Filtering a laser beam using a pinhole (which is an aperture) is an example of spatial filtering, but it is just one example, and it works in just the same way. By putting a pinhole at the center you allow only the DC component of the light through; if you enlarge the hole (use an iris) you let more and more of the light through, and more spatial information.
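
A toy numpy version of that iris (my sketch; the checkerboard test pattern and the iris radii are arbitrary): mask the Fourier plane with a circular opening and watch the fine detail vanish as the opening shrinks.

Code:

import numpy as np

n = 256
y, x = np.mgrid[:n, :n]
pattern = ((x // 4 + y // 4) % 2).astype(float)   # fine checkerboard test object

spectrum = np.fft.fftshift(np.fft.fft2(pattern))
fy, fx = np.mgrid[:n, :n]
dist = np.hypot(fx - n // 2, fy - n // 2)          # distance from the DC term

for radius in (2, 16, 64):                         # iris opening, frequency bins
    kept = spectrum * (dist <= radius)             # only low angles pass
    filtered = np.fft.ifft2(np.fft.ifftshift(kept)).real
    contrast = filtered.max() - filtered.min()
    print(f"iris radius {radius:3d}: remaining contrast {contrast:.3f}")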

"Also, you cannot know the resolving power of a system based solely on the exit pupil as other factors contribute to the actual performance of a lens."

I didn't say you could, although the quoted NA of a lens should take other factors into account; at least it takes account of that lens.

"While you may find it convenient to think that an aperture limits spatial frequencies, you would be mistaken to link it with spatial filtering."

I disagree; remember that the original point I was making was about what causes the reduction in resolution as you stop down, not what defines the resolution at the lens's maximum aperture.

My only point is that to just blame diffraction is not correct; it suggests that as you stop down you are getting some strange edge effects, when in fact you are simply removing light that came into the objective at high angles, which is what represents the high diffraction orders, or fine detail. If you don't want to call that spatial filtering, then that is fine by me.

What I really don't understand is that the other Graham's explanation actually supported what I said, was congratulated, and has since been contradicted:

"If you make the aperture very small indeed, then you will collect very little of the information in the rays diffracted by the object, hence the resolution will be poor. You may also introduce diffraction effects from the very small aperture and these will further degrade the image."

Graham

Epidic
Posts: 137
Joined: Fri Aug 04, 2006 10:06 pm
Location: Maine

Post by Epidic »

Graham, I was simply confused, as I thought you were talking about image resolving power. In that regard, Graham's explanation is not complete. However, it seems you are only discussing resolution at the object plane. (You would be correct in thinking there is no difference between an object and an image, but that is not a usual view among photographers.)

I am confused that you are thinking of diffraction as simply some kind of edge effect. It is more confusing that you think diffraction has nothing to do with the creation of the orders of diffraction. It would seem, from my point of view, that diffraction is directly related to those orders, and they illustrate the problem of aperture very well. Nor do I understand why the angular size of a pupil (or, as you put it, light coming from high angles) is unrelated to the effects of diffraction when they can be directly related. I suspect that we most likely have a vocabulary problem. I take it you work in physics. I have a difficult time talking with the physics department here, as they use a different nomenclature from my field, which is imaging.
Will

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

I am fully aware of what diffraction is, and I don't think stopping down causes an edge effect. But when people start talking about light interacting with the aperture, as you did, or just say "it's diffraction", it makes it sound like there is some special case, and I thought it could be misunderstood: most people, when they think of diffraction, are thinking of the spreading of the edge of a plane wave when it passes an aperture (because of Huygens' principle).

I see the object resolving power as the maximum potential resolving power for the image: get the info, and try to keep the info.

The high diffracted orders (which I have at no point said have nothing to do with resolution) end up at the outer edges of the optical system and are hence attenuated or filtered by the aperture. But to me that does not warrant the explanation that diffraction is the cause of a loss of resolution. Diffraction is the cause of any image and resolution whatsoever, even if you don't stop down; the cause of the loss of resolution should not be described as diffraction, and some mention must be made of the removal of the higher orders.

To put it another way: if you put a coat on you get warmer, but you would not say the cause of the temperature increase was heat; you would say it was insulation.

I suspect we are totally on the same page really, but language is getting in the way, as is the way a physicist type looks at things: I'm all NA, BFP and Fourier planes, and you guys are all about f-numbers and apertures.

My background is a degree in electronics, a PhD in applied optics (high-resolution wide-field surface plasmon microscopy, don't ask), and now I am trying to design robot flying insects for a living, which is why I am trying to get some really good images of insect flight apparatus and find myself on this forum.

Graham

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Graham Stabler wrote:Reduction in resolution with decreasing aperture is predicted by ray theory if you understand that fine detail requires high angles into the object lens.
Sure, but there's an important issue of background and terminology hiding here.

The "ray theory" I've always read about is the one that says light propagates in straight lines unless it gets reflected or refracted. That's all -- nothing about spreading, or about detail depending on angle. In fact nothing about detail at all -- just point sources and ray fans. Certainly that theory can be extended by adding in the concept that the amount of detail carried by a ray depends on angle -- and on color, to avoid mentioning waves. But I've never seen that done before and I presume that most people who read what I write won't have either.

Since we're sharing backgrounds, my degrees are in math and computer science. I've worked professionally in digital image processing, synthetic aperture focusing, and computational holography (3D Fourier transforms of digitized ultrasound -- don't ask). When I see "spatial filter", I'm liable to think anything from a carefully formed piece of plastic that distorts a physical wavefront, to a computational algorithm that inputs an array of numbers, outputs an array of numbers, and does pretty much the same computation for every array element. I'm pretty sure that most other people's use of the term fits in there someplace, but I might be a bit hazy exactly where.

--Rik

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

By the way, it's a separate issue, but please note that members of this forum have a wide diversity of backgrounds and specialities.

As a matter of policy -- and I get to say this since I'm the Editor -- I want to see posters "seek first to understand, and then to be understood".

It really doesn't help the discussion to jump on somebody because they happen to use a term differently from what happens to be custom in your speciality.

Clarifying the differences is very helpful, however. Just describe what you understand a term to mean, and inquire what the other person means, if their usage doesn't make sense to you.

This discussion definitely seems to be moving in that direction.

Thanks very much!

Carry on... :D

--Rik

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Graham Stabler wrote:...trying to get some really good images of insect flight apparatus...
Everything that I can recall seeing in this forum is either completely static or a single short exposure. I've seen a few references posted to literature involving dynamics, mostly for dragonflies, but that's all that comes to mind.

Are you looking for images, techniques, collaborators, or maybe all three and anything else as it pops up?

--Rik

puzzledpaul
Posts: 414
Joined: Sun Aug 06, 2006 4:15 am
Location: UK
Contact:

Post by puzzledpaul »

<< trying to get really good images of insect flight apparatus >>

No doubt you're already aware of this kit - but (having watched the 'Planet Earth' series) it's the first thing that came to mind.

No idea of your budget, of course - and assume that this sort of rig probably costs the usual 2 limbs :)

pp

http://news.thomasnet.com/companystory/521165

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

When I say I am trying to get really good images, I mean I am trying to produce them myself; I'm on this forum like everyone else, to learn how to do that. I have already found out about Helicon Focus software, the use of ping-pong/wiffle balls as diffusers, object movies, and a number of other things, and I will give back what I can. A lot of my images are of dissected parts, so I'm not expecting to see much of that here :)

I have access to a high-speed video camera (4000 fps at 800×400 pixels) with a Nikon 1:1 macro lens for dynamic work.

Graham

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

gpmatthews wrote:Simple explanations of complex phenomena are rarely comprehensive!
Now there is a pithy quote if I've ever heard one.

This issue of "Is it diffraction or is it spatial filtering?" has been bouncing around in my head since you guys raised the question.

It got so bad that I finally went back to the basic math & physics and programmed up some simulations from first principles. When I say "basic" and "first principles", I mean really low level stuff -- brute force solution of the wave equation using purely local calculations on a fine grid. No formal analysis, nothing Fourier, not even any phase and amplitude calculations -- nothing but grid cells and finite approximations to 2nd derivatives -- subtract, multiply, add.
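
For the curious, the heart of such a simulation fits in a few lines (a stripped-down sketch, not the code I actually ran; the grid size, source, and screen geometry are arbitrary, and the grid edges wrap):

Code:

import numpy as np

n, steps = 200, 300
c2 = 0.25                        # (c*dt/dx)^2, kept small for stability
u_prev = np.zeros((n, n))
u = np.zeros((n, n))

for t in range(steps):
    # discrete Laplacian: nothing but neighbors -- subtract, multiply, add
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u_next = 2 * u - u_prev + c2 * lap       # leapfrog step of the wave equation
    u_next[5, n // 2] = np.sin(0.3 * t)      # point source injecting waves
    u_next[n // 2, : n // 2 - 10] = 0.0      # hard screen with a 20-cell gap:
    u_next[n // 2, n // 2 + 10 :] = 0.0      # an aperture
    u_prev, u = u, u_next

# nonzero amplitude deep in the geometric shadow = waves bent behind the screen
shadow = np.abs(u[n // 2 + 20 :, : n // 4]).mean()
print(f"mean amplitude in the shadow region: {shadow:.4f}")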

Then I looked at the simulation results and asked, "How would I describe what I see here?"

The results provided an interesting perspective on terminology.

When I started with plane or spherical waves and allowed them to propagate freely, they just marched right across the simulation grid. I looked at them and said to myself "Light travels in straight lines."

But it doesn't, really. That's just my human interpretation of the solution of the wave equation.

When I imposed an obstacle, I observed that waves appeared even in areas "behind" the obstacle, in what would be "shadow" areas when I imagined that light travels in straight lines. I looked at those waves and said to myself, "Oh, this is diffraction -- the light is bending or spreading behind the obstacle."

Again, this is just my human interpretation.

When I ran plane waves through an aperture, I observed interference patterns, and I said to myself "Oh, this is diffraction too -- it looks just like the pictures in Wikipedia."

Then I tried converging spherical waves, corresponding to a point source imaged through a perfect lens. (That's "perfect" as in no aberrations. I don't want to get into the other kind of "perfect" that has to do with negative refraction.)

When I ran those converging waves through an aperture, I observed interference patterns and noted that there was no sharp focal point. Instead, in what should have been the focal plane, I observed a fuzzy concentration of waves.

When I made the aperture smaller, I observed that the interference patterns got more obvious, and in the focal plane, the fuzzy concentration spread out farther. But still, I'm looking down on the whole simulation and the interference pattern is blatantly obvious. "Diffraction", I said. "Diffraction, diffraction, diffraction!"

Then I went one more step. (This one was a thought experiment because actually coding it up and getting it to run would have taken way too long.) Instead of a point source imaged through a perfect lens, I imagined an array of point sources, each having a different intensity so as to construct a patterned "subject". And I further imagined that rather than looking down on the whole simulation, I could only see the pattern of energy projected on the focal plane.

You know what happened? That pattern in the focal plane looked remarkably like what I'd get if I took the sharp pattern of the original subject, and just ran a blurring filter over it.

Suddenly my brain wanted to switch gears and think in terms of spatial filtering.

I think what's determining my favorite model is the complexity of the pattern being imaged. If the pattern is simple -- a point source or a plane wave -- then I find it feasible to think about the waves directly. When the pattern gets complex the waves become too messy to deal with, but the spatial filtering model works fine.

So is the observed blurring "because of diffraction" or "because of spatial filtering"???

To play devil's advocate, I don't think it's either one.

I think the observed blurring is just because that's what light does, and you can choose to call it "diffraction" or "spatial filtering" as you prefer. If you do the mathematics carefully enough, you get the same result from either standpoint.
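
In fact the agreement is easy to check numerically (a toy sketch with a random test pattern, coherent-amplitude model): masking in the frequency domain and convolving with the corresponding PSF in the image domain give the same answer to machine precision.

Code:

import numpy as np

rng = np.random.default_rng(0)
n = 128
subject = rng.random((n, n))              # an arbitrary patterned "subject"

fy, fx = np.mgrid[:n, :n]
pupil = (np.hypot(fx - n // 2, fy - n // 2) <= 20).astype(float)
mask = np.fft.ifftshift(pupil)            # the aperture, in FFT layout

# "spatial filtering": chop off the high frequencies with the mask
filtered = np.fft.ifft2(mask * np.fft.fft2(subject))

# "diffraction": convolve the subject with the PSF the aperture produces
psf = np.fft.ifft2(mask)                  # PSF = transform of the pupil
blurred = np.fft.ifft2(np.fft.fft2(subject) * np.fft.fft2(psf))

print("max difference:", np.abs(filtered - blurred).max())   # ~1e-15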
Graham Stabler wrote:To put it another way, if you put a coat on you get warmer but you would not say the cause of the temerature increase was heat you would say it was insulation.
That sounds reasonable at first read.

But on the other hand, suppose you swathe yourself in layers of really good insulation, and end up dying because your body temperature gets too high.

In that case, I'll wager the medical examiner's certificate will list the cause of death as "overheating", not "overinsulating".

By analogy, the images must get fuzzy because of too much diffraction, not too much filtering.

Ah, language -- you gotta love it! :roll:

--Rik

Edit: fix spelling error.
Last edited by rjlittlefield on Sun Jan 20, 2008 7:56 pm, edited 1 time in total.

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

For the terminally curious, here is a reduced-size animation showing the effect of changing aperture size, for the case of converging spherical waves (point source imaged with a perfect lens). Orange line marks the aperture; cyan line marks the center of the incoming spherical waves.

This aperture happens to be reflective, so there's an interference pattern above the aperture also. That doesn't affect the waves that come down through the aperture.

It's interesting that as the aperture shrinks, the point of sharpest "focus" also moves closer to the aperture. It's not unreasonable, given the limiting cases of very wide and very narrow aperture, but I don't recall seeing a theoretical treatment of this effect.

--Rik

[Image: animated GIF of the aperture simulation -- your browser must be set to allow animation.]

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

rjlittlefield wrote:
But on the other hand, suppose you swathe yourself in layers of really good insulation, and end up dying because your body temperature gets too high.

In that case, I'll wager the medical examiner's certificate will list the cause of death as "overheating", not "overinsulating".
But mine was an argument of logic, not of language. What needs to be explained is why the resolution is reduced with decreasing aperture. As diffraction occurs all of the time, diffraction is not a description of what causes the loss in resolution, even if it is possible to explain the loss of resolution in terms of diffraction. So my objection is to a one-word answer, which is why I said it was akin to saying the propagation of light caused the reduction in resolution.

The other thing about looking at the spreading effect of diffraction is that it only considers the light that does get through the system; it totally ignores all of the light which is blocked by the aperture, and all of the information which is used to create a sharp image. Even if there were no spreading, the resolution would be lost, so it is more than something being done to the light that remains; which light remains counts as well.

Graham

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

"my objection is to a one-word answer"
Agreed -- one-word answers are seldom adequate, and even less often helpful.

You wrote earlier, "I am fully aware of what diffraction is".

But I don't know what your awareness is. In particular I don't know exactly what you mean by the word, even after all these postings.

"Diffraction" has a lot of aspects. It is associated with spreading behind an obstacle, blurring and ringing in an image behind an aperture, redirection of a monochromatic plane wave by a grating, creation of spherical waves when a plane wave hits a small reflector, and appearance of regular interference patterns when one shines X-rays on a crystal. When I just now Googled on "define: diffraction", I found 27 more-or-less respectable definitions that mention those aspects.

However, one thing I cannot find in any of those definitions is the concept that diffraction happens "all of the time", as you just wrote. So it looks to me that you're using the word in some way different from all those 27 definitions, but I can't tell exactly how. Do you mean simply "behaving in accordance with the wave equation", even in free space?

To tell the truth, I'm not even quite sure what you mean by "spreading". Do you mean the near-field effect in which waves appear in places that can't "see" the source on a straight-line path? Do you mean the far-field effect in which an aberration-free lens nonetheless turns a point source into an Airy disk? Or are you talking about what happens when you run a collimated beam through an aperture, and observe that its diameter increases downstream?

Let's see if we can reach some common ground.

Start with the near-field effect.

In my simulations, one might imagine that the appearance of waves behind the edge of the aperture is simply due to numbers that propagate through the grid cells in the gap next to it ("the light that gets through the aperture").

But the same numbers come through those same cells even when the aperture is removed.

The difference is that removing the aperture allows additional numbers to propagate through other grid cells. Those additional numbers combine with the ones that come through the gap in such a way as to exactly cancel any puzzling disturbances, so that without the aperture, the wavefront just marches across the grid. In other words, what causes spreading into the shadow regions of the near-field is removing information that otherwise would have come in from regions that the obstacle now occludes.
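
That cancellation is easy to demonstrate with a toy Huygens-style sum (a sketch with arbitrary geometry): the contributions through the gap, plus the contributions through the region the obstacle occludes, add up exactly to the unobstructed field.

Code:

import numpy as np

k = 2 * np.pi                             # wavelength = 1
xs = np.linspace(-50, 50, 2001)           # secondary sources along the screen plane
gap = np.abs(xs) < 5                      # the open part of the screen

def field_at(px, pz, open_mask):
    r = np.hypot(px - xs[open_mask], pz)
    return np.sum(np.exp(1j * k * r) / np.sqrt(r))   # sum of circular wavelets

p = (20.0, 30.0)                          # a point in the geometric shadow
through_gap = field_at(*p, gap)
through_rest = field_at(*p, ~gap)
free = field_at(*p, np.ones_like(gap))

print(abs(through_gap + through_rest - free))   # ~0: the parts sum to the whole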

In the far field, exactly the same thing happens although it may be harder to recognize. The simple naive "expected" pattern may be that the spherical wavefronts should converge to a point. The fact that they do not can be explained by the absence of contributions that would otherwise cancel the observed differences from the naive expectation.

I think this example illustrates that you and I are agreed that the interesting effects are caused by the removal of information.

And so, in the context of this conversation, I'm actually quite happy to agree that the stopping-down effect is "caused by spatial filtering".

At the same time, I am quite sure that the next time a photographer asks "Why does stopping down too far make my image fuzzy?", I will not answer the question by saying "Oh, because it does too much spatial filtering."

Those words ("too much spatial filtering") only make sense to someone who already has a firm concept of light as waves that can interact with each other in interesting ways depending on angle.

To someone who is still operating (at best) at the level of rays and refraction, the important message is more like "It's a wave effect. Trust me for now, and go read about diffraction when you have time."

While preparing this response, I went back and carefully re-read all of the earlier postings in this thread and the one that I linked to.

It's only a guess, but perhaps your initial question was prompted by my statement in this post that "All of the lenses have lousy resolution when stopped down too far, mostly due to diffraction."

I'm sorry if that sentence struck one of your hot buttons, but even after all this discussion, I still think it was the best choice in context. "Spatial filtering" simply would not have communicated correctly there.

--Rik

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

I probably should not have said "it is not diffraction, it is spatial filtering"; I should have said what my problem with this one-word explanation was. It didn't help that it was countered with the response, "it is not spatial filtering, it is diffraction".

I understand diffraction in terms of Huygens' principle of superposition; that is, you can consider a wave front as being the superposition of many circular waves.

https://byjus.com/physics/the-huygens-p ... ave-front/

I have seen much better diagrams in books like Optics by Hecht. So if you took an infinitely long line and drew an infinite number of circles on it, then the front defined by those circles would be straight too. This is a plane wave. If you then add an aperture, you get a bending of the wave front, because the wavelets that would ensure a straight wavefront are no longer there. From this basic phenomenon you can get lots of optical effects, such as diffraction orders from gratings, interference, etc. But in my mind even a plane wave, at least in reality, is diffracting, because it is finite; if diffraction is considered a phenomenon associated with light bending, then a non-spreading wave might be considered the limiting case. In any case, the principle of superposition is the driving force of propagation.
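
To make the "infinite circles give a straight front" claim concrete, here is a little numeric sketch (a long but finite line of sources, wavelength 1; all numbers are arbitrary):

Code:

import numpy as np

# Sum circular wavelets from a long line of sources and check the phase along
# a parallel line downstream -- it comes out (nearly) constant: a plane wave.
k = 2 * np.pi                            # wavelength = 1
xs = np.linspace(-200, 200, 8001)        # a long (not infinite) line of sources

def field(px, pz):
    r = np.hypot(px - xs, pz)
    return np.sum(np.exp(1j * k * r) / np.sqrt(r))

phases = [np.angle(field(px, 50.0)) for px in (-5.0, 0.0, 5.0)]
print(phases)    # nearly equal: the summed front is flat, far from the edges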

If someone asked me why resolution was lost I would say this:

"The information defining the finer structures of the object is contained in the light that enters the lens at greater angles to the optical axis, roughly the higher angles that the lens can accept the higher its potential resolution. However this light can be removed from the system by stopping down the aperture so it is no longer present at the image plane, when this is done resolution is reduced."

And if anyone wanted to know why it was that higher angles related to high resolution, I would say this:

"If you drop a stick in to a pond you get a straight wave propagating along, if you drop two sticks in to the pond so that their waves cross each other you get interference. When one wave is going up at the same time as the other you get an especially high crest, when one goes down at the same time as the other you get an especially low trough. How fine the peaks and troughs are depends on the angle between the two sticks. If they are parallel then you see no interference, as you move to higher angles you see a very fine pattern (overlaying two transparencies is a great demo). Imaging is the reverse of this effect, you can have a fine pattern from which two waves emerge (diffracted orders), the finer the pattern the greater the angle between the two waves, if you wish to make an image you must collect this light and keep it!"

For my short version of all this I would say

"you block the light that defines the fine structures"

It really is difficult to explain these sorts of issues when people's backgrounds are limited; the way in which things tend to be explained is also overly complicated and impenetrable, with maths thrown in for good measure. On the other hand, it is very easy to enter into empty explanations, the kind of stuff that TV and schools are very good at churning out; they use terminology to replace real explanation. You can find yourself without enough information to truly understand, having instead to just take it for granted. I even find myself unable to understand explanations for things I already understand.

If I was any good with animation I'd love to make some animations of the things I have learnt that, once you cut through the bull, are so extremely simple and powerful. Like Fourier optics: it sounds like a nightmare, even the word Fourier, but you can understand it from the point of view of plane waves adding up and collimation.

Graham

Admin edit, rjl, 1/3/2023, to replace broken link http://www.cmmp.ucl.ac.uk/~ahh/teaching ... node4.html

Graham Stabler
Posts: 209
Joined: Thu Dec 20, 2007 11:22 am
Location: Swindon, UK

Post by Graham Stabler »

rjlittlefield wrote:
Those words ("too much spatial filtering") only make sense to someone who already has a firm concept of light as waves that can interact with each other in interesting ways depending on angle.
Interact?

OK now I'm a pedant :)
