More Bdelloid Rotifer
Moderators: rjlittlefield, ChrisR, Chris S., Pau
- Posts: 693
- Joined: Mon Aug 14, 2006 6:42 pm
- Location: South Beloit, Ill
Bdelloid rotifer
Leitz Ortholux in brightfield.
45X Leitz achromat (upper image)
25X Leitz achromat (lower image)
10X Leitz Periplan GF projection eyepiece plus 1/3X Sp. Reflex lens
Canon 10D camera
Strobe illumination
Photoshop enhancements
Same subject as my last post.
Walt
- Charles Krebs
- Posts: 5865
- Joined: Tue Aug 01, 2006 8:02 pm
- Location: Issaquah, WA USA
- Contact:
- rjlittlefield
- Site Admin
- Posts: 23564
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Walter,
You'll have to wait a while to buy it, but technology to do this is working in the lab.
It's a variation of "wavefront coding". A specially constructed filter is placed in the optical path. As seen by the sensor, the whole image is horribly fuzzy, but a sharp image with extended depth of field can be recovered from it by computation. The source image is a single exposure, so it should work fine with flash.
See "Applications of extended depth of focus technology to light microscope systems" at http://www.colorado.edu/isl/papers/microscope.pdf . The paper claims 8X improvement in DOF.
--Rik
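To make the recovery step concrete, here is a toy numpy sketch of the idea (my own illustration, not the optics from the paper): a plain Gaussian blur stands in for the real wavefront-coding PSF, and a lightly damped inverse filter backs the blur out in the frequency domain.

```python
import numpy as np

# Toy sketch of "recover a sharp image by computation": blur a scene with
# a known PSF, then divide the PSF back out in the frequency domain.
# A plain Gaussian stands in for the real wavefront-coding PSF (an
# assumption made purely for illustration).
n = 64
scene = np.zeros((n, n))
scene[20:40, 28:36] = 1.0                    # a bright bar as the "specimen"

y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
psf = np.exp(-(x**2 + y**2) / (2 * 1.0**2))  # Gaussian PSF, sigma = 1 px
psf /= psf.sum()

# Forward model: the sensor sees the scene convolved with the PSF -- fuzzy.
H = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))

# Inverse filter, lightly damped so division by tiny values can't blow up.
eps = 1e-12
recovered = np.real(np.fft.ifft2(
    np.fft.fft2(blurred) * np.conj(H) / (np.abs(H)**2 + eps)))

print("max error, blurred:  ", np.abs(blurred - scene).max())
print("max error, recovered:", np.abs(recovered - scene).max())
```

With no noise in the picture, the recovery is essentially exact; the size of that damping constant is what matters once real sensor noise enters.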
Walt,
It's related and probably complementary.
Both techniques use deconvolution to back out the blurring effects of optics. For astronomy, what's backed out is the PSF (point spread function) of the lenses, mirrors, and atmosphere. The trick with wavefront coding is that the special filter is designed so that it blurs a wide range of depths pretty much equally, so that backing out the blur recovers an image with extended DOF.
The problem with deconvolution is that in its pure form, it's vulnerable to noise.
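Here is a quick 1-D numpy illustration of that fragility (toy numbers, with an assumed Gaussian PSF): a pure 1/H inverse filter turns 0.1% sensor noise into garbage, while a damped (Wiener-style) divide stays sane.

```python
import numpy as np

# Why pure deconvolution is fragile: dividing by near-zero frequency
# responses amplifies even tiny sensor noise enormously.
rng = np.random.default_rng(1)
n = 256
signal = np.zeros(n)
signal[100:140] = 1.0                        # a 1-D "bar" to image

f = np.fft.fftfreq(n)
H = np.exp(-2 * np.pi**2 * (4.0 * f)**2)     # Gaussian blur, sigma = 4 px

blurred = np.real(np.fft.ifft(np.fft.fft(signal) * H))
noisy = blurred + rng.normal(0, 1e-3, n)     # 0.1% sensor noise

pure = np.real(np.fft.ifft(np.fft.fft(noisy) / H))              # raw 1/H
wiener = np.real(np.fft.ifft(np.fft.fft(noisy) * H / (H**2 + 1e-4)))

print("pure inverse, max error:", np.abs(pure - signal).max())
print("damped,       max error:", np.abs(wiener - signal).max())
```

The damped version gives up the frequencies the blur destroyed instead of trying to divide them back out of pure noise; that trade-off is exactly what the regularization schemes below formalize.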
Maximum entropy is a refinement that allows one to compute what in some sense is the "most likely" scene that could have produced the sensor image, given what you know about the potential scenes and the system's noise properties. See http://www.astro.princeton.edu/~gk/A542/mario.ppt#263,11,Regularization procedures . (You may have to copy and paste this URL into your browser. Neither phpBB nor TinyURL seems to handle the special characters very well.)
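For a flavor of that likelihood-driven approach, here is a small numpy sketch using the Richardson-Lucy iteration, a maximum-likelihood cousin of maximum entropy (swapped in here only because it fits in a few lines, and my own toy rather than anything from the linked slides). It repeatedly nudges the estimate toward scenes more likely to have produced the sensor image, while keeping the estimate positive.

```python
import numpy as np

# Richardson-Lucy iteration: a maximum-likelihood deconvolution that,
# like maximum entropy, regularizes by favoring plausible (here: strictly
# positive) scenes instead of naively inverting the blur.
n = 128
truth = np.full(n, 0.05)
truth[50:70] = 1.0                           # bright bar on a dim background

f = np.fft.fftfreq(n)
H = np.exp(-2 * np.pi**2 * (2.0 * f)**2)     # Gaussian PSF, sigma = 2 px

def conv(x):
    return np.real(np.fft.ifft(np.fft.fft(x) * H))

observed = conv(truth)                       # noiseless sensor image

est = np.full(n, observed.mean())            # flat, positive starting guess
for _ in range(300):
    ratio = observed / np.maximum(conv(est), 1e-12)
    est *= conv(ratio)                       # symmetric PSF, so H^T = H

print("blurred mean error:", np.abs(observed - truth).mean())
print("RL mean error:     ", np.abs(est - truth).mean())
```

Multiplicative updates from a positive starting guess can never go negative, which is the built-in prior knowledge ("scenes are nonnegative") doing its regularizing work.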
I suspect that the microscopy problem is harder than the astronomy problem, because the range of possible scenes is a lot larger. But I'd also be surprised if some form of maximum entropy calculation did not improve the wavefront coding process too.
--Rik