Effect of Misalignment Between Optical Axis and Motion Axis

Have questions about the equipment used for macro- or micro- photography? Post those questions in this forum.

Moderators: rjlittlefield, ChrisR, Chris S., Pau

mjkzz
Posts: 1681
Joined: Wed Jul 01, 2015 3:38 pm
Location: California/Shenzhen
Contact:

Effect of Misalignment Between Optical Axis and Motion Axis

Post by mjkzz »

About a week ago, someone who is pretty good at stacking asked me why his final stacked image always had a streaking pattern on the right side. The optical system was composed of a 5x Mitutoyo and a Raynox 150; the rail used does not have much shift and was securely mounted.

The 5x Mitty is almost telecentric, so there should be very little magnification change, and the streaking pattern did not suggest that either. This was pretty puzzling until we examined some details of his setup and the answer emerged: misalignment between the motion axis and the optical axis. After correcting that, the images look normal.

So I spent some time analyzing it, and here are my findings. I think they could be beneficial to our community:

[Image: diagram of the motion axis and optical axis forming angle a]

In the figure above, the assumption is that the motion axis and optical axis are misaligned, forming an angle a. The camera moves from position P to position P' over a stacking distance d. D is the focal length of the optical system, and M is its magnification factor.

So, the analysis is this: offset on sensor = M x d x tan(a). Also note that unless we intend to misalign on purpose, the angle a will be very small, and we keep it that way by good intention. So tan(a) is very close to a measured in radians. The final result is:

offset on sensor = M x d x a

What this means is that the offset on the sensor is linearly proportional to the angle a when a is very small. Given the pixel size on the sensor, we can calculate the number of pixels and thus the width of the streaking pattern (in one dimension).

One example: using a Canon 550D with a misalignment angle a of one degree (0.0174533 radians), a stacking distance d of 5mm, and an optical magnification of 5x, the offset in pixels is:

5 x 5000 x 0.0174533 / 4.3 = 101.47 pixels.

Note, the pixel size on the Canon 550D sensor is 4.3um.

What it means is that a one-degree misalignment can cause a 101.47-pixel shift over a 5mm stacking distance at 5x magnification.
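The arithmetic above can be sketched as a quick check (a minimal Python sketch; the example values and the 4.3um pixel pitch are taken from this post):

```python
import math

def offset_pixels(magnification, stack_depth_um, angle_deg, pixel_um):
    """Offset on the sensor, in pixels, from a motion/optical axis misalignment.

    offset = M x d x a, using the small-angle approximation tan(a) ~= a (radians).
    """
    a = math.radians(angle_deg)
    return magnification * stack_depth_um * a / pixel_um

# Example from the post: 5x, 5 mm (5000 um) stack, 1 degree, 4.3 um pixels
print(round(offset_pixels(5, 5000, 1, 4.3), 2))  # -> 101.47
```

The same function gives the 50x over 0.3mm case mentioned below by changing the arguments.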

One can also calculate it for 50x over 0.3mm (300um) . . .

I am a bit busy lately, but I will finish up this writing as a blog with some experimental pictures in the future. But I think many of us have made this mistake (misalignment) and I definitely did, so this might be beneficial to many of us.

Beatsy
Posts: 2105
Joined: Fri Jul 05, 2013 3:10 am
Location: Malvern, UK

Post by Beatsy »

Thanks very much for posting this - very interesting. I really like "proper" numeric reasoning for problems or issues encountered :)

I only "eyeball" alignment on my rig, which is almost always plenty accurate enough. I get only a very thin band of 'streakies', if any at all. But if I have a really tight framing for a subject, these streakies can sometimes encroach on it, which is a blimmin' annoyance. Is there any systematic way to ensure perfect alignment of the optics along the rail movement axis?

ray_parkhurst
Posts: 3412
Joined: Sat Nov 20, 2010 10:40 am
Location: Santa Clara, CA, USA
Contact:

Post by ray_parkhurst »

Why isn't this pixel shift fixed by frame alignment?

mjkzz
Posts: 1681
Joined: Wed Jul 01, 2015 3:38 pm
Location: California/Shenzhen
Contact:

Post by mjkzz »

Beatsy wrote:Is there any systematic way to ensure perfect alignment of the optics along the rail movement axis?
I thought about making a laser pointer that can be screwed onto either RMS or other types of threads; then you can try to align it by moving from one end of the rail to the other, letting the laser pointer point at a white wall far away, and seeing if the dot shifts.

Another way is using a laser pointer shooting through the viewfinder; this was done by Chris (forgot which Chris). That method might not work for cameras with electronic viewfinders, but could be very useful for other types, as you can make the laser come out at the center of the lens. With this, you do not need to use an objective, just a regular lens.

mjkzz
Posts: 1681
Joined: Wed Jul 01, 2015 3:38 pm
Location: California/Shenzhen
Contact:

Post by mjkzz »

ray_parkhurst wrote:Why isn't this pixel shift fixed by frame alignment?
Not sure what you mean. If you meant fixing it by software alignment, you cannot, because it is caused by a physical shift of the image due to the misalignment, and it is the very alignment algorithm that is making the streakies visible, and rightfully so.

ray_parkhurst
Posts: 3412
Joined: Sat Nov 20, 2010 10:40 am
Location: Santa Clara, CA, USA
Contact:

Post by ray_parkhurst »

mjkzz wrote:
Why isn't this pixel shift fixed by frame alignment?
Not sure what you mean. If you meant fixing it by software alignment, you cannot, because it is caused by a physical shift of the image due to the misalignment, and it is the very alignment algorithm that is making the streakies visible, and rightfully so.
Assuming telecentricity (since you mentioned it in the OP), there is no perspective change due to a lateral shift caused by the axes being misaligned. So as long as the software is able to compensate for the misalignment, I don't understand the issue. If the problem is that the software can't do a proper alignment, then this is a software problem.

That said, I have always aligned my systems so the two axes are parallel. It's a bit of trial and error, but not too difficult. It just requires two photos to check. My process is:

- Set stepper rail within top 25% of its range

- Adjust bellows for an appropriate magnification, critically focus on a subject with fine detail, and snap the shot

- Adjust the focus rail within bottom 25% of its range

- Re-adjust the bellows focus (keeping same magnification) and snap the shot

A close examination of these shots will show if there is misalignment. Figuring out which way to go is sometimes confusing, so simply giving the bellows a little "push" in X or Y will confirm which way to make the adjustment and give an idea of the magnitude. The actual magnitude needed is half of the total shift.

- Shim or tilt the bellows mount a little in the desired direction

- Repeat the process until desired level of alignment is achieved

Even after this process, I still find it necessary with stepper-based rails to let the software align frames due to small movements caused by rail imperfections or because the subject vibrates and moves on the stage. For the voice-coil rail, I have been able to eliminate software alignment completely.

enricosavazzi
Posts: 1474
Joined: Sat Nov 21, 2009 2:41 pm
Location: Västerås, Sweden
Contact:

Post by enricosavazzi »

I don't have an out-of-the-box solution. However, I think there are two parts to this problem. The following discussion contains several points that should be obvious to most readers, but it is better to state them, just so that we can identify possible problems and areas for improvement.

The first part of the problem is how to detect the misalignment. Ideally it should be done by using the actual lens and camera sensor that will be used for imaging, but in practice this is feasible for a sufficient amount of travel along the Z axis only if the lens is focusable. The idea is to place an optical target on the stage and adjust it in X and Y dimensions so that its image is at the center of the sensor, then move the camera + lens (or the stage, depending on the mechanical construction of the setup) along the Z axis, refocus the lens and check that the target is still centered on the sensor. If it isn't, the optical axis and Z travel axis are not parallel.

When using a repurposed telephoto lens as tube lens, refocusing the lens may allow enough Z travel to test the alignment (the point is that image quality should remain good enough for alignment even when the tube lens is focused away from infinity, since curvature of field and CA don't affect a point target at the center of the image circle).

This way to detect misalignment involves a few assumptions, foremost that the objective, tube lens, camera, and lens focusing mechanism are properly centered and aligned with respect to each other. This may not be a given for homemade setups that include e.g. an infinity objective and a repurposed tube lens, connected together and to the camera with long stacks of adapter rings and extension tubes of unknown precision. If we cannot trust the reciprocal alignment of camera and optics, the number of variables increases and alignment without precision instruments becomes orders of magnitude more difficult. Another assumption is that we can tell with precision when the target is at the exact center of the sensor (easy enough in recorded images, not equally easy in live view unless the live view can display a cross reticle or equivalent). Yet another assumption is that the Z travel is not accompanied by non-repeatable X and Y, or angular, shifts.

The second part is providing mechanical adjustments to align the optical axis parallel to the travel axis. This involves quite a bit of trial and error, and after each adjustment one has to start the whole test procedure again from the beginning.
--ES

mjkzz
Posts: 1681
Joined: Wed Jul 01, 2015 3:38 pm
Location: California/Shenzhen
Contact:

Post by mjkzz »

Ray, I will draw some pix to illustrate the point that it is the very alignment algorithm that makes the streakies visible. Rik fills in the streak with some existing pixels (or there could be an option in Zerene), and my algorithm fills it with black dots.

While thinking about how to answer Ray's question, it occurred to me that the actual formula for the shift of the FOV of the lens is:

FOV offset = (s' - s) x cos(a)
FOV offset = d x tan(a) x cos(a)
FOV offset = d x sin(a)

But again, as the angle a is small enough, sin(a) ≈ a in radians.

I was thinking about the geometry of it, then forgot to rotate it to get the FOV offset. You can draw a line perpendicular to the optical axis to arrive at this.
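A quick numerical check of the three forms above (a Python sketch; the 1-degree angle and 5mm stack depth are example values) shows they agree to well under a pixel's worth of distance at small angles:

```python
import math

d = 5000.0           # stack depth in um (5 mm, as in the earlier example)
a = math.radians(1)  # 1 degree misalignment

exact  = d * math.tan(a) * math.cos(a)  # d x tan(a) x cos(a)
same   = d * math.sin(a)                # identical: tan(a)*cos(a) == sin(a)
approx = d * a                          # small-angle approximation

# All three are about 87.27 um; the approximation error is ~0.004 um here
print(exact, same, approx)
```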

elf
Posts: 1416
Joined: Sun Nov 18, 2007 12:10 pm

Post by elf »

For even more entertainment add movements to the lens and camera :P

BugEZ
Posts: 850
Joined: Sat Mar 26, 2011 7:15 pm
Location: Loves Park Illinois

Post by BugEZ »

With my old Pentax *ist DS camera I was plagued with hot pixels. As annoying as they were, the streaks they created in a stack did a marvelous job of pointing out the axis of the actuator. When the streaks pointed at the center of the image, the axes were aligned. If they wobble, the actuator wobbles.

My newer cameras don't have hot pixels to the same extent and misalignment is harder to identify.

Keith

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

BugEZ wrote:My newer cameras don't have hot pixels to the same extent and misalignment is harder to identify.
Have you considered batch editing your test stacks to add some markers in fixed pixel positions? A Photoshop action would be pretty simple.

--Rik

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Re: Effect of Misalignment Between Optical Axis and Motion Axis

Post by rjlittlefield »

mjkzz wrote:offset on sensor = M x d x a

What this means is that the offset on sensor is linearly proportional to the angle a where a is very small. Taking pixel size on sensor, we can calculate # of pixels and thus the width of the streaking pattern (in one dimension).
I agree with this calculation.

But I prefer to think of the shift as a fraction of frame width, in which case the corresponding formula is
Rik wrote:shift as fraction of frame width = d/w * a

where
d is stack depth
w is frame width at subject
a is the misalignment angle in radians
So...
If your stack is equally wide and deep, the shift is about 1.75% per degree, independent of magnification, sensor size, or sensor resolution.

Deeper or shallower stacks have more or less shift, in proportion to the stack depth.


As an example,
mjkzz wrote:One example: using Canon 550D and the misalignment angle a is one degree (0.0174533 radians), stacking distance d is 5mm, and optical magnification is 5x, so the # of pixels of the offset is:

5 x 5000 x 0.0174533 / 4.3 = 101.47 pixels.

Note, pixel size on Canon 550D sensor is 4.3um.

What it means is that, a one degree misalignment can cause 101.47 pixels shift over 5mm stacking distance at 5x magnification.
Alternately, the subject width is 22.3/5 = 4.46 mm, so the shift as a fraction of frame width is 5/4.46*0.0174533 = 1.96% .

The resulting value is the same to within rounding error, but the calculation as fraction of frame width seems simpler and makes more intuitive sense to me.
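The equivalence of the two calculations can be sketched numerically (a Python sketch; the variable names are mine, and the 22.3mm sensor width and 4.3um pixel pitch for the Canon 550D are taken from the posts above):

```python
import math

a = math.radians(1)  # misalignment angle, 1 degree
d = 5.0              # stack depth, mm
M = 5                # magnification
sensor_w = 22.3      # sensor width, mm (Canon 550D)
pixel = 4.3e-3       # pixel pitch, mm (4.3 um)

# Pixel-based calculation: offset on sensor = M x d x a
shift_px = M * d * a / pixel

# Fraction-of-frame-width calculation: d/w * a, with w the frame width at the subject
w = sensor_w / M
fraction = d / w * a
shift_px_via_fraction = fraction * (sensor_w / pixel)

# Both routes give ~101.47 pixels; fraction is ~1.96% of frame width
print(shift_px, fraction, shift_px_via_fraction)
```

Algebraically the two routes are the same expression, M x d x a / pixel, so they must agree to within rounding.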

--Rik

Smokedaddy
Posts: 1951
Joined: Sat Oct 07, 2006 10:16 am
Location: Bigfork, Montana
Contact:

Post by Smokedaddy »

I find this very interesting since I have almost finished my manual horizontal rig. At the moment I will only be using an MP-E and a few reversed enlarger lenses. I was wondering how 'critical' it is that my lens be perpendicular to my specimen, and if it is, how I would verify that.

Chris S.
Site Admin
Posts: 4042
Joined: Sun Apr 05, 2009 9:55 pm
Location: Ohio, USA

Post by Chris S. »

mjkzz wrote:Another way is using a laser pointer shooting through view finder, this was done by Chris (forgot which Chris). That method might not work for electronic view finder type camera, but could be very useful for other type, as you can make the laser to come out at center of the lens. With this, you do not need to use an objective, just a regular lens.
It was I, in the post Laser aiming and focus in photomacrography.

I have indeed used this laser to align my rig. With the tube lens assembly on the camera, but with no objective attached, the laser projects nicely onto a white piece of foam core mounted on the subject stand. Make a dot where the laser hits, then slew the rig through its full range of focusing movement. If the laser doesn't move, I consider the rig to be aligned. If the laser does move, it's obvious which way to adjust the rig to correct it.

I do recall meeting some frustrations with this approach, such that I didn't think my method mature enough to post out. But as it's been a few years, I don't recall what those frustrations were. I do think the approach has merit.

--Chris S.

mjkzz
Posts: 1681
Joined: Wed Jul 01, 2015 3:38 pm
Location: California/Shenzhen
Contact:

Post by mjkzz »

rjlittlefield wrote:If your stack is equally wide and deep, the shift is about 1.75% per degree, independent of magnification, sensor size, or sensor resolution.

Deeper or shallower stacks have more or less shift, in proportion to the stack depth.
Agree, if the stacking distance is approximately the same as the width of the FOV, then 1.75% per degree is a good estimate.

For vertical streakies under the same condition, i.e., stacking distance roughly the same as the width of the FOV, the shift is about 3/2 x 1.75% = 2.63% per degree of frame height, since the frame height is 2/3 of the width on a 3:2 sensor.
