Hi!
Photomacrography is a relatively new thing for me but I've been asked to explore it further for my job. I am just not sure where to go next with my given requirements.
I am trying to achieve less than 2um per pixel in an image.
Attached is an image of a calibrated micrometer (the scale is in um) I took with our current equipment (a Nikon D5300 and an old Nikon 60mm micro lens, plus extenders).
SOOC there are 2.5um per pixel.
The most I've done with macrophotography on my own is go outside and photograph plants and insects with my 50mm macro (plus extenders), but nothing more than that.
I'm wondering if I need to look at possibly getting an actual microscope objective and adapter. But I am just not sure to what extent or what our best route would be for this.
Thank you so much for any advice you might have!!
Equipment Suggestions
Moderators: rjlittlefield, ChrisR, Chris S., Pau
Kind Regards,
Laura
IG: @lauramckenziecombs
- enricosavazzi
- Posts: 1514
- Joined: Sat Nov 21, 2009 2:41 pm
- Location: Västerås, Sweden
- Contact:
I would suggest that, as a first step, you should learn to do as much as possible with the equipment you already have, possibly with one or a few accessories that are not too expensive. You can then gradually move on to equipment for higher magnification, which is more difficult to use as well as more expensive.
The subject width in your test picture seems to be quite close to 35 mm, while the sensor of the D5300 is 23.5 mm wide. This means you are photographing at less than the 1x magnification of which the Micro Nikkor 60 mm is capable without extension rings.
Since you mention using extension rings, this means that you did not extend the focusing helicoid of the lens fully.
Assuming that your goal is to reach the maximum magnification of which your equipment is capable, you should mount the lens on extension rings and focus it to the highest magnification, which for this lens is 1x (without extension rings). In this way, depending on the length of the extension rings, you will reach a total magnification above 1x.
Since the focal length of this lens shortens when focusing close, it is difficult to tell exactly which magnification you will achieve with a given length of the extension rings, but a rough guess is that you will probably reach around 2x with 45-50 mm of extension rings and the lens focused at 1x. The subject may however be too close to the front element of the lens for practical use.
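As a back-of-envelope check, the thin-lens rule of thumb is that each millimeter of extension adds 1/f to the magnification. A minimal sketch (this ignores the focal-length shortening at close focus mentioned above, so treat the result as an optimistic estimate, not an exact value):

```python
# Thin-lens estimate of magnification gained from extension tubes.
# Real macro lenses shorten their focal length at close focus, so the
# true figure will differ somewhat from this approximation.

def magnification_with_extension(base_mag, extension_mm, focal_length_mm):
    """Each mm of extension adds roughly 1/f to the magnification."""
    return base_mag + extension_mm / focal_length_mm

# 60 mm lens focused to 1x, with 48 mm of extension tubes:
print(magnification_with_extension(1.0, 48, 60))  # → 1.8
```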
Above 1x, most camera lenses, including this one, will perform better reversed, which requires a reversing ring like the BR-2A and a filter adapter to match the filter mount of the lens. This also improves the working distance, which should in this case be around 45-50 mm, i.e. quite usable.
I don't know whether the D5300 will be capable of automatic exposure with a reversed lens, but there are workarounds if it cannot.
If you have the Micro Nikkor 60 mm AF-D, it has a mechanical aperture ring and can be used reversed (I have not used Nikkors for about a decade, and don't remember if you also need something to force the aperture of a reversed lens to close). More recent Nikon lenses, without a mechanical aperture ring, cannot be used reversed (unless one goes through shenanigans to make them kind of work).
--ES
The 60mm D does need to be modified slightly to make the aperture always stay at the marked aperture. Otherwise it snaps wide open if there is no pressure on the aperture lever. There are special Nikon rings for that, but I disassembled the lens mount and I think all I had to do was remove a spring and that solved the problem. But that was more than 15 years ago, so I don't remember exactly what I did.
- rjlittlefield
- Site Admin
- Posts: 23964
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Laura, welcome aboard!
It's not often that somebody starts by specifying their target as some number of microns per pixel, so immediately I'm wondering exactly what's driving that specification.
I assume that it relates somehow to the size of detail that you want to resolve, but then I wonder if you have already factored in the impact of Bayer filtering and assorted degradations due to spatial sampling.
As a general rule, to get reliable rendering, you'd normally like to have at least four pixels per cycle, each "cycle" being a black/white line pair so that BWBW would span 8 pixels. Some illustration and discussion of why this is so can be found at http://www.photomacrography.net/forum/v ... php?t=2439.
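As a sketch of that sampling arithmetic: if a "cycle" is one black/white line pair and you want at least four pixels per cycle, then resolving lines of a given width needs a pixel pitch of at most half that width on the subject:

```python
# "Four pixels per cycle" as arithmetic: a cycle is one black/white
# line pair, so lines of width w span a cycle of 2*w, and the pixel
# pitch on the subject should be at most (2 * w) / 4 = w / 2.

def required_um_per_pixel(line_width_um, pixels_per_cycle=4):
    cycle_um = 2 * line_width_um          # one black/white line pair
    return cycle_um / pixels_per_cycle

# To reliably render 4 um wide lines, aim for about 2 um per pixel:
print(required_um_per_pixel(4.0))  # → 2.0
```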
Anything further you can say about what you're really trying to accomplish would be helpful.
For the image that you've posted, you wrote that "SOOC there are 2.5um per pixel". I assume that SOOC means "straight out of camera". But the image as posted is different from that. I measure it as 290 pixels for 1000 microns, which would be more like 3.45 microns per pixel. If you can explain that discrepancy, it would also help a lot.
Now, taking your "less than 2 um per pixel" as an appropriate spec, and noting that the D5300 has pixels that are 3.92 microns wide (23.5 mm sensor width divided by 6000 pixels), it seems that you need at least 2X magnification, which would give a total field width of 11.75 mm.
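For reference, that arithmetic spelled out as a few lines of Python:

```python
# Numbers from the paragraph above: D5300 sensor is 23.5 mm wide, 6000 pixels.
sensor_width_mm = 23.5
sensor_width_px = 6000

pixel_pitch_um = sensor_width_mm * 1000 / sensor_width_px  # ≈ 3.92 um

# To put <= 2 um of subject on each pixel, magnification must be at least
# 3.92 / 2 ≈ 1.96, i.e. about 2X in round numbers:
magnification = 2.0
field_width_mm = sensor_width_mm / magnification           # 11.75 mm
print(pixel_pitch_um, field_width_mm)
```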
Getting pixel-sharp images all across an APS-C sensor at exactly 2X is actually a pretty challenging problem, so it will be helpful to know if there's some "give" in your specifications.
For example, if in fact you only need to cover a field that is 4 or 5 mm wide, then the obvious approach is to go for an optical magnification someplace in the range of 4X to 5X, for which low power microscope objectives are a good match. If you need a sharper image, and you can live with per-shot DOF of only around 8 microns, then you can even use certain 10X microscope objectives and push them down to 5X by using a shorter than normal "tube lens". See Lenses for use at 4-5X on an APS-sized sensor for a range of possibilities. BTW, the basic reference on using an objective with a DSLR is our FAQ: How can I hook a microscope objective to my camera?
If your subjects are 3-dimensional, then most likely you'll be needing to use "focus stacking" in order to get everything sharp. That in turn implies that you'll need to have some sort of focus-stepping mechanism capable of movements in the range of 5-50 microns. There are several good mechanisms to do that, both DIY and commercially packaged, but of course there are impacts on the budget. What is most appropriate will depend on time/cost tradeoffs, basically how many subjects you want to image, and how quickly.
I apologize for raising so many different issues, but there really are a lot of different facets to the problem. Again, anything further you can tell us about your application will be helpful.
--Rik
Good morning and thank you all for your responses!
Let me start by saying: I have greatly underestimated the complexity of this field and have much to learn! Haha.
Rik, the information you have given to me and to others is incredible! How did you learn all of this!? I have an associate's in photographic technology, but all this is so far beyond anything we were taught, and I am eager to learn more. Being a visual learner, though, so much of the information given is beyond my current comprehension.
I work for an engineering firm; we get most of our work through SBIR contracts that we obtain by sending proposals to... We are considering making a proposal for one that involves particle counting & sizing while suspended in fluid. This is something the company has done before, but the size we need to get down to is much smaller than before. We need to be able to size particles that are 20nm to 10um. The images do not need to be 3D.
I'm almost wondering if a DSLR would not be our best option... especially as the device capturing the images needs to be able to be handheld. I know we are going to need a bit of light as well. I just started at this firm in December, but they have shown me some of their work that involved illumination by laser. Should've sprung for the degree in optical engineering.
Any advice given is greatly appreciated.
Thank you all again,
Hope everyone has a great week!
Laura
Kind Regards,
Laura
IG: @lauramckenziecombs
- rjlittlefield
- Site Admin
- Posts: 23964
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Laura,
Thank you for the further information and the kind words. My formal training is mostly in mathematics and computer science. In optics I have mostly learned from reading and experimentation. Making sure that physical results match theoretical predictions is an excellent substitute for an instructor grading tests. If you want some quick lessons, the best explanatory material I know right now comes from Edmund Optics. See http://www.photomacrography.net/forum/v ... hp?t=36667 for discussion and links.
Now, getting back to your specific problem...
20nm to 10um is quite a wide range. The lower end of it is far below the diffraction limit for any light that you'd like to be using. So, it is clearly not accessible to optical imaging in the sense of "shape resolved on an array of pixels".
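To see why: the Abbe diffraction limit d = λ/(2·NA) puts a floor on the resolvable feature size. A quick check, deliberately using green light and a very high numerical aperture as an optimistic best case:

```python
# Abbe diffraction limit d = wavelength / (2 * NA). Even with green
# light (550 nm) and an oil-immersion NA of 1.4, the smallest
# resolvable feature is ~200 nm -- an order of magnitude above the
# 20 nm lower bound, which is why direct imaging can't reach it.

def abbe_limit_nm(wavelength_nm, numerical_aperture):
    return wavelength_nm / (2 * numerical_aperture)

print(abbe_limit_nm(550, 1.4))  # ≈ 196 nm
```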
On the other hand, most or all of the range should be accessible to optical scattering measurements.
The basic idea is simple: shine a bright light on the sample and measure the amount of light that is scattered toward the sensor. The amount of light scattered by a particle varies directly with particle size, so in principle all you have to do is measure the amount of light scattered from each particle to get some indication of its overall size. I say "some indication" because the measurement does not tell you particle shape and must assume some intrinsic reflectivity.
Scattering has the advantage that it works even below the diffraction limit of resolution. The measurement also does not depend on perfect focus, so it's great for particles that are not well localized. See HERE for a particularly gee-whiz demonstration of scattering from a single atom dynamically suspended in vacuum. Google search on cell phone particle size analysis will also get you some interesting and relevant hits.
Of course you can combine this type of measurement with array-of-pixels sensing and an imaging lens. I expect this would be helpful for measuring multiple particles at once, and for identifying at least some cases where two particles are so close together that they would be treated as one by a simpler detector.
If you do decide to go this route, then be sure to get a sensor whose output values have a simple relationship to absolute light intensity. In a DSLR, shooting as raw and converting to a linear color profile would do the trick. In that case a simple sum of pixel values is also the sum of light intensity. If the camera can only output JPEG or any other format that has a nonlinear mapping between intensity and pixel value, then you'll have more headaches inverting the mapping before doing the sum. In any case I don't really recommend a DSLR or any other consumer camera that has a Bayer filter, unless you decide to just look at the data from one color of pixels. Simpler would be to use a monochrome camera where all the pixel positions are the same.
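A minimal sketch of that inversion step, assuming the nonlinear mapping is the standard sRGB transfer curve (a real camera JPEG applies its own tone curve on top, so in practice you'd need to characterize the camera; raw data converted to a linear profile skips all of this):

```python
# Invert the standard sRGB transfer curve so that summed pixel values
# are proportional to light intensity. Assumes pure sRGB encoding --
# an actual camera JPEG tone curve will deviate from this.

def srgb_to_linear(v):
    """Map an sRGB-encoded value v in [0, 1] back to linear intensity."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# Sum of linearized values over a region of interest is then
# proportional to the total scattered light:
pixels_8bit = [10, 200, 128, 255]
total = sum(srgb_to_linear(p / 255.0) for p in pixels_8bit)
```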
As an aside, I used to work in contract R&D so I have some feel for what your concept design meetings might be like. But for me that was a prior life that I have no desire to go back to. Your project sounds like a fun challenge. When it's all over, I would be interested to hear something about what your team came up with, to the extent that you can comfortably describe that in public.
--Rik