Pupil factor and linear perspective
Moderators: Chris S., Pau, Beatsy, rjlittlefield, ChrisR
Dear users in this forum
Recently, when I searched the Internet for the term "Pupil Factor", I was referred to an article by rjlittlefield on this site. Reading it, I feel certain that this person understands the concept very well. I should mention that I am a high school teacher from Denmark with a master's degree in mathematics and physics. Besides my job, I have also become increasingly interested in photography. One question I have been investigating is whether a photographic lens preserves linear perspective (the central projection model). My own clear understanding is that it does, at least for "ordinary" lenses; not fisheye lenses, of course, because there straight lines are not depicted as straight lines (or a point). So in order to understand it better, I turned my attention to the physics of lenses. I read "Physics of Digital Photography" by D. A. Rowlands, and in that book I stumbled upon the concept of "pupil factor" or "pupil magnification".
Now, before continuing with this special concept, I need to explain the idea of linear perspective. In the central projection model, a 3D object is projected onto a plane, and an Eye Point of the viewer is given. A 3D point is mapped to the point in the image plane that is the intersection of the image plane with the line passing through the 3D point and the Eye Point. Of course, "information" is lost in this projection. One consequence is that the final image on the image plane has to be viewed from that special Eye Point in order to look like the real 3D object. So a "virtual Eye Point" is associated with every image! If viewed from another distance, the image may look unnatural and/or distorted. Now, what I would like is to find the distance associated with a print of a photo shot with a given lens of a given focal length mounted on a specific camera. From what I have read, one gets a good approximation to that distance using the formula:
dist = f*Lp/Ls
where:
dist: Distance associated with the particular print of a photo.
f: Focal Length (nominal - as written on the lens)
Lp: Length of printed image
Ls: Length of camera sensor
The conditions here are that focus is at infinity and that the pupil factor can be set to 1. If focus is on an object at distance s (different from infinity) from the camera, one needs to take the magnification m into account. It can be calculated as m = f/(s-f). But the pupil factor P also appears in the more advanced formula:
dist = f*(m+P)*Lp/Ls
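To make the two formulas concrete, here is a small Python sketch. All numeric values (focal length, sensor width, print width, object distance) are invented for illustration, not taken from any specific camera or lens:

```python
# Sketch of the two viewing-distance formulas above.
# All numeric values are invented for illustration.

def viewing_distance_infinity(f, Lp, Ls):
    """dist = f * Lp / Ls  (focus at infinity, pupil factor taken as 1)."""
    return f * Lp / Ls

def viewing_distance_general(f, s, P, Lp, Ls):
    """dist = f * (m + P) * Lp / Ls, with magnification m = f / (s - f)."""
    m = f / (s - f)              # Gaussian magnification for object distance s
    return f * (m + P) * Lp / Ls

f = 50.0     # nominal focal length, mm
Ls = 23.6    # sensor width, mm (APS-C-like)
Lp = 300.0   # print width, mm

print(viewing_distance_infinity(f, Lp, Ls))              # ~636 mm
print(viewing_distance_general(f, 2000.0, 1.0, Lp, Ls))  # slightly larger at s = 2 m
```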
Now my questions: How much can the pupil factor vary? Is it usually close to 1 for ordinary lenses? I use a Fujifilm camera with prime lenses, and the manufacturer does not seem to publish the pupil factor for a specific prime lens. Is there another way I can figure out the pupil factor for a given prime lens?
I would much appreciate it if someone could comment on these questions.
Kind regards,
Erik V.
- rjlittlefield
- Site Admin
- Posts: 24434
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Re: Pupil factor and linear perspective
Erik, welcome aboard! I am honored that the internet sent you to read my article about pupil factor.
I will try to keep this reply brief. So, by the bullet points:
- Please take time to study another of my articles, Theory of the “No-Parallax” Point in Panorama Photography. This will explain in detail several key insights about the concepts that you're considering.
- Yes, ordinary lenses do preserve the essence of the central projection model.
- But lenses do not follow all the assumptions that people often make when drawing diagrams and deriving formulas from them. The result is that published formulas are often quite far from accurate.
- The center of perspective for the captured image is simply the center of the entrance pupil of the lens. The entrance pupil is just the limiting aperture, as seen through whatever optics are in front of it. So, if you stop down your lens and look into the front of it, that small hole formed by the iris will be the center of perspective for the scene as it is captured by the lens.
- For a lens that is focused at infinity, the formula dist = f*Lp/Ls looks OK to me.
- However, the formula dist = f*(m+P)*Lp/Ls looks definitely not correct in general. If it were correct in general, then it would continue to be correct even for distant scenes, where m goes to zero. But then the formula would reduce to dist = f*P*Lp/Ls, which is exactly a factor of P different from the earlier formula. This conflicts with the fact that when focused at infinity and shooting a distant scene, all lenses of focal length 100mm will capture exactly the same image, regardless of what their pupil factors happen to be.
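The limit argument in the last bullet can be checked numerically. A tiny Python sketch, with all values invented for illustration (P = 2 stands in for a strongly retrofocus-like pupil factor):

```python
# Numeric check of the limit argument: as s grows, m = f/(s-f) -> 0,
# so dist = f*(m+P)*Lp/Ls -> f*P*Lp/Ls, which differs from the
# infinity-focus formula f*Lp/Ls by exactly the factor P unless P == 1.
# Values are illustrative only.
f, P, Lp_over_Ls = 100.0, 2.0, 10.0

simple = f * Lp_over_Ls                  # infinity-focus formula
for s in (1e3, 1e5, 1e7):                # object distance, mm
    m = f / (s - f)
    general = f * (m + P) * Lp_over_Ls
    print(s, general / simple)           # ratio tends to P (= 2), not 1
```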
Backing up to basic concepts, the requirement for distortion-free viewing is that angular distances between features in the viewing process must be the same as angular distances between those same features in the capturing process. However, that requirement is in direct conflict with the reasons that photographers use long and short lenses in the first place. It says that if you shoot a distant scene with 50mm and 100mm lenses, then you have to view the 100mm image from twice as far away. But if you actually do that, then the viewing process makes the 100mm image look, um, exactly the same as the 50mm except for covering only half as wide an area. Somehow the mathematical goal of "avoiding distortion" has gotten in the way of the photographic goal of rendering the subject. If your goal is something other than just understanding the math, then you'll have to figure out how to resolve this conflict.
I hope this helps. If you see errors in my reasoning, then please let me know because I would much rather catch errors sooner than later.
--Rik
Re: Pupil factor and linear perspective
Thank you for your quick reply. I appreciate it.
The formula dist = f*(m+P)*Lp/Ls is from the book "Physics of Digital Photography (Second Edition)" by D. A. Rowlands. I own a copy of the book myself, but the first chapter can be viewed online on this page:
https://iopscience.iop.org/book/mono/97 ... 2558-5.pdf
Click on Preview to get to this first chapter. The formula I refer to is stated on page 1-38. However, I use a different notation: the scaling factor X in this book is the same as Lp/Ls above. To see how it is derived, one needs to read a number of earlier pages, but everything should be there.
I agree with you that if focus is on a distant object (object distance s), the magnification will be small and tend to 0 as s approaches infinity. In general m = f/(s-f), where f is the nominal focal length of the lens (formula (1.19) in the book). If focus is at infinity, then I agree that the formula reduces to dist = f*P*Lp/Ls. I still think this formula holds true, of course given the setup of Gaussian optics for thick lenses. My simplified formula dist = f*Lp/Ls is used under the assumption that P = 1 is a useful approximation. To be honest, I don't know how much P can vary for different lenses. I tried to find some data for a few of my Fujifilm prime lenses, but I was not able to find it. Do you know where/how to look for it, or is it something the manufacturers keep to themselves? Without this knowledge, I just used the approximation P = 1.
Now back to linear perspective. Firstly, the word "perspective" is used in a very broad sense. When I use it, I will always mean the linear perspective coming from the central projection. A lot of people use the word "distortion" in a very loose way when describing a specific photograph, and some of them obviously confuse things. In particular, photos shot with wide-angle lenses may look unnatural, and quite a few people talk about lens errors and distortion in that connection, when the real explanation has to do with viewing the photo from the wrong distance. If a photo is shot with a very wide angle lens, say 15 mm, the associated Eye Point will be very close to the image, often so close that a person won't even be able to focus his/her eyes at that distance. When the photo is then viewed at a more natural (and wrong) distance, the image looks distorted, although it really isn't. I was happy to find an old Danish photography book from 1979 which summed it up this way (translated into English):
----------
The perspective is always correct
All focal lengths reproduce the subject with the same perspective as the photographer's own eye saw it in direct view at the time of shooting. The reproduction of the image will therefore always be correct in perspective if it is viewed at the correct viewing distance. That is, if the eye takes the same position in relation to the image as the lens took for the negative when recording.
----------
Obviously, fisheye lenses are exceptions. I am not interested in lens errors (the seven classical lens errors), because today they are dealt with rather accurately by the manufacturers; they are usually not the origin of a distorted-looking photo. Now, this site is about macro photography, and some of the lenses used there might be rather extreme too, so maybe I also have to make some exceptions here from lenses that are true to linear perspective (if Gaussian optics doesn't apply)? I am not sure.
I made an experiment with a camera with a zoom lens on a tripod. First I shot a photo with the lens set at 25 mm, and afterwards a photo set at 75 mm (same position and direction). As expected, the photo shot at 75 mm was a perfect cropped version of the photo taken with the lens set at 25 mm: I cropped the image shot at 25 mm and enlarged it by a factor of 3. Those two images were as identical as I could hope for (depth of field disregarded), in full agreement with the theory that the lens delivers true linear perspective in both cases. The theory about the associated viewing distances also works: the viewing distance associated with the 25 mm image is one third of the viewing distance associated with the 75 mm image, BUT when I enlarged the 25 mm photo by a factor of 3, the viewing distances became the same!
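The crop experiment can be mimicked with a tiny central-projection (pinhole) calculation; the 3D coordinates below are invented for illustration:

```python
# The zoom experiment, mimicked with a pinhole (central projection) model:
# image coordinates are f * (x/z, y/z), so changing only f rescales the
# whole image uniformly. Coordinates are invented for illustration.
points = [(1.0, 2.0, 10.0), (-0.5, 0.3, 4.0), (2.0, -1.0, 25.0)]  # (x, y, z), camera at origin

def project(f_mm, pts):
    return [(f_mm * x / z, f_mm * y / z) for x, y, z in pts]

img25 = project(25.0, points)
img75 = project(75.0, points)
for (u1, v1), (u2, v2) in zip(img25, img75):
    assert abs(u2 - 3 * u1) < 1e-9 and abs(v2 - 3 * v1) < 1e-9
print("the 75 mm image is exactly the 25 mm image enlarged 3x")
```

In other words, under the pinhole model a longer focal length from the same position gives exactly an enlarged crop, which is what the tripod experiment showed.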
NB! Thank you for the article. I will print it out and read it very soon.
Re: Pupil factor and linear perspective
Thank you for the reference. I look forward to studying it.
--Rik
erikV wrote: ↑Fri Dec 15, 2023 10:10 am
    I made an experiment with a camera with a zoom-lens on a tripod. First I shot a photo with the lens set at 25 mm, afterwards a photo set at 75 mm (same position and direction). [...] BUT when I enlarged the 25mm-photo with a factor 3, the viewing distances became the same!

What are the values of the pupil factors for the 25mm and 75mm lenses that you used?
--Rik
Re: Pupil factor and linear perspective
Sorry, I overlooked this question in my previous posting.

erikV wrote: ↑Fri Dec 15, 2023 10:10 am
    To be honest I don't know how much P can vary for different lenses. I tried finding some data for a few of my Fujifilm prime lenses but I was not able to find it. Do you know where/how to look for it, or is it something the manufacturer keeps for themselves? So without this knowledge, I just used the approximation P=1.
The easy way to get an accurate value for P is to just measure the two pupil diameters and do the division.
The procedure described at viewtopic.php?p=61516#p61516 (and the few posts after that) will work fine for most ordinary lenses. Just be sure to stop down far enough that you can see the inside edge of the iris while taking the measurement. Some lenses have such large pupils when wide open that they cannot be seen in their entirety from many viewing distances.
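In code the division is trivial; the diameters below are made-up example measurements, not data for any real lens:

```python
# Pupil factor from two measured diameters, per the procedure above:
# P = (exit pupil diameter) / (entrance pupil diameter).
# The diameters below are made-up example measurements, not real lens data.
def pupil_factor(exit_pupil_mm, entrance_pupil_mm):
    return exit_pupil_mm / entrance_pupil_mm

print(pupil_factor(12.0, 10.0))  # 1.2  (P > 1, typical of retrofocus wide angles)
print(pupil_factor(6.5, 10.0))   # 0.65 (P < 1, typical of telephoto designs)
```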
--Rik
Re: Pupil factor and linear perspective
Hi again. I am finally back. It was indeed a great idea to simply measure the pupil diameters myself following your description. I took a photo of some lenses from the front and from the back, each at a distance of 1 meter from the camera.
[image: the lenses photographed from the front and from the back]
The wide-angle prime lenses had pupil factors ranging from 1.16 to 1.91, whereas the prime telephoto lenses had pupil factors down to 0.65. As expected, the zoom lenses had varying pupil factors depending on the focal length.
The classic book "Optics in Photography" by Kingslake from 1992 clearly states that photographic lenses "preserve linear perspective", meaning that if one views the final print of a photo from the correct distance, it should deliver the same viewing experience as the real 3D object does when viewed from the position where the photograph was taken. This confirms what I read in the Danish book mentioned earlier. Now, what puzzles me is the following: if figure 1.29 on page 1-38 in "Physics of Digital Photography" by Rowlands (see above) is correct, I don't see how the lens can deliver a true linear perspective (central projection), when the angles alpha and alpha' in object space and in image space are different. I mean, if the lens delivers true perspective, those angles need to be the same. Imagine the photo printed on a transparent sheet of plastic. One should then be able to place that transparent sheet between the eye and the 3D object in such a way that the photographic image on the sheet overlaps with the 3D object. As I see it, this requires those two angles to be the same. What are your thoughts about this reasoning?
With the measured pupil factors and the formulas in [Rowlands] I get in one case 40 degrees for the angle alpha and 52 degrees for the angle alpha'. Very different angles!
Regards, Erik
Re: Pupil factor and linear perspective
Erik, welcome back! It looks like you've made some good progress exploring this problem.
You wrote, and I have highlighted:

erikV wrote:
    I don't see how that lense can deliver a true linear perspective (central perspective), when the angles alpha and alpha' in object space and in image space are different. I mean if the lense delivers true perspective, those angles need to be the same. Imagine the photo printed on a transparent sheet of plastic. One should then be able to place that transparent sheet between the eye and the 3D object in such a way that the photographic image on the sheet overlap with the 3D object. As I see, this require those two angles to be the same. What is your thoughts about this reasoning?

The boldfaced part is an excellent way of expressing the requirement. But the reasoning about the angles is not sound. This aspect, probably in combination with some misleading diagrams in the literature, is what's causing the confusion.
Let's see if I can explain. (Be warned: some of this may hurt your head a little, because the explanation is designed to break current thought patterns that are causing problems.)
Here is a simple diagram like one might find in a typical explanation:
[diagram]
With this diagram in hand, one might think the situation is obvious: all the corresponding angles will be equal, so surely if the little stick man simply turns around, he will see an image with correct perspective.
Of course that's not quite correct, because at first he will see the image upside down.
That particular problem is easily fixed by turning himself upside down.
But then the stick man is shocked to see that the letter ꟻ is backwards, as if seen in a mirror.
How did the mirroring happen?
Aha! The mirroring happened when the stick man turned around to look at the image.
So, if he simply moves around to the backside of the image and faces forward, then surely the image will look OK.
The diagram now looks like this:
[diagram]
Fair enough, but there are still devils in the details!
If we obsess about the angles of the light rays as they strike the sensor, then we might extend those rays to get this diagram:
[diagram]
Now you can see an important problem: the extended rays cannot be seen by the observer. In fact only a very few rays will be seen by the viewer: the ones that connect the entrance pupil and the viewer's own pupil. This is quite severe vignetting!
The problem of rays missing the observer can be easily solved by inserting a flat diffuser at the image plane. Then at least some of the rays striking each point on the image plane will get redirected into the viewer's eye. Or if we don't like the idea of losing a lot of light in diffusion, then we can stick a sensor at the image plane, and construct an image from that.
Indeed, the insertion of that diffuser or sensor does solve the problem of letting the viewer see the image with correct perspective. Progress is made!
But there's more -- a LOT more. In addition to redirecting the rays that I've drawn, the insertion of that diffuser or sensor will also redirect a bunch of other rays that I have not drawn.
And that's where a critical insight appears -- in the modified system, there is no longer any required relationship between the angles of rays as they hit the sensor, and the angles of rays as they enter the viewer's eye!
The diagram now looks like this:
[diagram]
When Kingslake writes that lenses preserve linear perspective, what he means is that the lens creates a pattern of light at the image plane that is consistent with linear perspective as projected through the entrance pupil. With most lenses, the actual ray angles are not equal on object and image sides of the lens.
The only challenge then is to identify the proper view point corresponding to that pattern, and that turns out to be quite simple. Quoting from John Bercovitz [*]:

    "The distance from an image to its correct perspective point is numerically equal to the magnification of the in-focus object at the image plane times the distance from the entrance pupil of the lens to the in-focus object."

The same numerical result can be obtained in any of several other ways, for example by trigonometry or by optics formulas involving physical lens characteristics.
However, given the same magnification and the same entrance pupil position, all correct calculations must produce the same unique viewing point, regardless of other optical parameters such as pupil magnification.
This immediately rules out correctness of the formulas on Rowlands' page 1-38, which indicate that the distance must be scaled by a factor that involves the pupil magnification. I made that point in an earlier posting in this thread, when I noted that the formula "conflicts with the fact that when focused at infinity and shooting a distant scene, all lenses of focal length 100mm will capture exactly the same image, regardless of what their pupil factors happen to be."
If we need further real-life demonstration of that idea, then I offer the following animated comparison: two images of a fairly distant scene, shot with the same camera and the same lens(!), varying only in how the lens was stopped down. In one case I used the internal iris of the lens, and in the other case I left the lens iris wide open but placed a black card with a hole in it at the front of the lens. This resulted in effective f/8 in both cases, but with pupil magnification factors alternating between 1.1 and 2.9. The images show a slight difference in sharpness and pincushion distortion, due to the aperture substitution causing the light to pass through less well corrected parts of the glass. But I think it is clear that these images have essentially identical geometry and therefore identical viewing distance for correct perspective. Pupil magnification simply does not matter.
[animated comparison of the two images]
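Bercovitz's rule quoted earlier (viewing distance = magnification of the print times the distance from the entrance pupil to the in-focus subject) can be sketched in a few lines; all values below are hypothetical:

```python
# Sketch of Bercovitz's rule: the correct viewing distance for a print equals
# the total magnification of the print times the distance from the lens's
# entrance pupil to the in-focus subject. Values are hypothetical.
def correct_viewing_distance(m_sensor, enlargement, subject_to_pupil_mm):
    total_m = m_sensor * enlargement      # magnification of the final print
    return total_m * subject_to_pupil_mm

# e.g. m = 0.025 at the sensor, print enlarged 12.7x, subject 2 m from the pupil
print(correct_viewing_distance(0.025, 12.7, 2000.0))  # distance in mm
```

Note that the pupil factor does not appear anywhere in the rule, in line with the argument above.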
It's interesting to speculate how Rowlands' treatment went off track. Most difficulties that I see regarding optics formulas trace to either using the same words for different things, or to assumptions that may not be met, or occasionally to typos or clerical errors in the math. But this one looks like a conceptual glitch.
Anyway, to summarize: the center of perspective is the entrance pupil, and the pupil factor has no effect on the correct viewing distance.
--Rik
[*] The following bibliographic reference is copied from the "No-parallax" paper: http://www.angelfire.com/ca2/tech3d/images/persp.pdf “IMAGE-SIDE PERSPECTIVE AND STEREOSCOPY”, by John Bercovitz. A mathematical treatment of many of the same ideas discussed here. The official 1998 SPIE publication can be accessed at http://www.imaging.org/IST/store/epub.cfm?abstrid=29536 .
You wrote, and I have highlighted:
The boldfaced part is an excellent way of expressing the requirement.I don't see how that lense can deliver a true linear perspective (central perspective), when the angles alpha and alpha' in object space and in image space are different. I mean if the lense delivers true perspective, those angles need to be the same. Imagine the photo printed on a transparent sheet of plastic. One should then be able to place that transparent sheet between the eye and the 3D object in such a way that the photographic image on the sheet overlap with the 3D object. As I see, this require those two angles to be the same. What is your thoughts about this reasoning?
But the reasoning about the angles is not sound. This aspect, probably in combination with some misleading diagrams in the literature, is what's causing the confusion.
Let's see if I can explain. (Be warned: some of this may hurt your head a little, because the explanation is designed to break current thought patterns that are causing problems.)
Here is a simple diagram like one might find in a typical explanation:

With this diagram in hand, one might think the situation is obvious: all the corresponding angles will be equal, so surely if the little stick man simply turns around, he will see an image with correct perspective.
Of course that's not quite correct, because at first he will see the image upside down.
That particular problem is easily fixed by turning himself upside down.
But then the stick man is shocked to see that the letter ꟻ is backwards, as if seen in a mirror.
How did the mirroring happen?
Aha! The mirroring happened when the stick man turned around to look at the image.
So, if he simply moves around to the backside of the image and faces forward, then surely the image will look OK.
The diagram now looks like this:

Fair enough, but there are still devils in the details!
If we obsess about the angles of the light rays as they strike the sensor, then we might extend those rays to get this diagram:

Now you can see an important problem: the extended rays cannot be seen by the observer. In fact only a very few rays will be seen by the viewer: the ones that connect the entrance pupil and the viewer's own pupil. This is quite severe vignetting!
The problem of rays missing the observer can be easily solved by inserting a flat diffuser at the image plane. Then at least some of the rays striking each point on the image plane will get redirected into the viewer's eye. Or if we don't like the idea of losing a lot of light in diffusion, then we can stick a sensor at the image plane, and construct an image from that.
Indeed, the insertion of that diffuser or sensor does solve the problem of letting the viewer see the image with correct perspective. Progress is made!
But there's more -- a LOT more. In addition to redirecting the rays that I've drawn, the insertion of that diffuser or sensor will also redirect a bunch of other rays that I have not drawn.
And that's where a critical insight appears -- in the modified system, there is no longer any required relationship between the angles of rays as they hit the sensor, and the angles of rays as they enter the viewer's eye!
The diagram now looks like this:

When Kingslake writes that lenses preserve linear perspective, what he means is that the lens creates a pattern of light at the image plane that is consistent with linear perspective as projected through the entrance pupil. With most lenses, the actual ray angles are not equal on object and image sides of the lens.
The only challenge then is to identify the proper view point corresponding to that pattern, and that turns out to be quite simple. Quoting from John Bercovitz [*]:
“The distance from an image to its correct perspective point is numerically equal to the magnification of the in-focus object at the image plane times the distance from the entrance pupil of the lens to the in-focus object.”
The same numerical result can be obtained in any of several other ways, for example by trigonometry or by optics formulas involving physical lens characteristics.
However, given the same magnification and the same entrance pupil position, all correct calculations must produce the same unique viewing point, regardless of other optical parameters such as pupil magnification.
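Bercovitz's rule lends itself to a one-line computation. A minimal sketch, with made-up illustration values (not measurements of any particular lens):

```python
def correct_viewing_distance(magnification, pupil_to_subject):
    """Bercovitz's rule: the correct viewing distance for the image equals
    the image-plane magnification times the distance from the entrance
    pupil to the in-focus subject."""
    return magnification * pupil_to_subject

# Illustration: subject 1000 mm from the entrance pupil, imaged at 0.1x.
# The image at the sensor should then be viewed from 100 mm.
print(correct_viewing_distance(0.1, 1000.0))  # 100.0
```

Scaling the image up by some factor scales the correct viewing distance by the same factor, which is what makes the rule carry over from sensor to print.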
This immediately rules out correctness of the formulas on Rowlands' page 1-38, which indicate that the distance must be scaled by a factor that involves the pupil magnification. I made that point in an earlier posting in this thread, when I noted that the formula "conflicts with the fact that when focused at infinity and shooting a distant scene, all lenses of focal length 100mm will capture exactly the same image, regardless of what their pupil factors happen to be."
If we need further real-life demonstration of that idea, then I offer the following animated comparison: two images of a fairly distant scene, shot with the same camera and the same lens(!), varying only in how the lens was stopped down. In one case I used the internal iris of the lens, and in the other case I left the lens iris wide open but placed a black card with a hole in it at the front of the lens. This resulted in effective f/8 in both cases, but with pupil magnification factors alternating between 1.1 and 2.9. The images show a slight difference in sharpness and pincushion distortion, due to the aperture substitution causing the light to pass through less well corrected parts of the glass. But I think it is clear that these images have essentially identical geometry and therefore identical viewing distance for correct perspective. Pupil magnification simply does not matter.

It's interesting to speculate how Rowlands' treatment went off track. Most difficulties that I see regarding optics formulas trace to either using the same words for different things, or to assumptions that may not be met, or occasionally to typos or clerical errors in the math. But this one looks like a conceptual glitch. The text says that
“A photographic print or screen image should be viewed from a position that portrays the true perspective. This again requires that the photograph be viewed from the centre of perspective, but in image space rather than object space.”
The analysis then proceeds from that point, using exit pupil as the image-space equivalent of the entrance pupil. Unfortunately those words "in image space" capture the wrong concept. If one uses the phrasing "with respect to the image rather than with respect to the object", then the correct result emerges. But the troublesome wording leads to the wrong result so seductively that I can easily see how it got past proofreading and review.
Anyway, to summarize:
- Perspective is not determined by the path the light takes on its way to the sensor. It is determined entirely by the pattern that the light makes when it strikes the sensor.
- In most practical situations, lenses do not preserve the angles of light rays. They just preserve the patterns of light on the sensor that would be produced by lenses that did preserve the angles.
- The view point for correct perspective does not depend on the pupil magnification in any fundamental way.
- Again, I recommend spending some time studying Theory of the “No-Parallax” Point in Panorama Photography. The ideas developed in that article bear directly on the issues discussed in this thread.
--Rik
[*] The following bibliographic reference is copied from the "No-parallax" paper: John Bercovitz, “Image-side Perspective and Stereoscopy”, http://www.angelfire.com/ca2/tech3d/images/persp.pdf . A mathematical treatment of many of the same ideas discussed here. The official 1998 SPIE publication can be accessed at http://www.imaging.org/IST/store/epub.cfm?abstrid=29536 .
Re: Pupil factor and linear perspective
Excellent! Thank you for the thorough explanation. I really appreciate it.
I see what you write: there is no reason why the center of the exit pupil should be the eye point from which to view the image on the sensor on the image side of the compound lens. And as you state, from that point the image would even be a mirror image. The difference in angles really bothered me, but I guess I was deferring to authority and was not ready to call out the error in the formula.
Besides your great experiments showing the independence of the pupil factor, thank you for referring me to the John Bercovitz article as well: "Image-side Perspective and Stereoscopy". Here Bercovitz delivers a formula for the distance from which to view the image on the sensor. Using the usual lens formulas, this formula is quite easily derived. By multiplying with a scaling factor X, I finally have a formula for the distance from which to view a final print of a photograph:
dist = f*X*(s+PN)/(s-f)
where:
X = Lp/Ls
dist: Viewing distance associated with the particular print of a photo.
f: Focal Length (nominal - as written on the lens)
Lp: Length of printed image
Ls: Length of camera sensor
s: Distance from the center of the entrance pupil in the camera to the object in focus
PN: Distance from the primary principal point to the center of the entrance pupil of the lens
I guess the distance PN has to come with a sign: positive if the primary principal point is closer to the object than the center of the entrance pupil, and negative otherwise. Right?
NB! Alongside the Gaussian version of the formula for the viewing distance to the sensor, he also provides a "Newtonian" version, which includes a variable x: the distance from the object in focus to the front focal plane. I hadn't seen the Newtonian version of the formulas before. The formula x*x' = f^2 is mathematically pleasing because of the inverse proportionality, but is this way of handling lens optics common or advantageous? For now, I guess I will stick with the "Gaussian way".
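For what it's worth, the Gaussian and Newtonian forms are algebraically equivalent, which is easy to confirm numerically. A small sketch using the thin-lens model (illustrative numbers only):

```python
def image_distance_gaussian(s, f):
    # Thin-lens equation 1/s + 1/s' = 1/f, solved for the image distance s'
    return s * f / (s - f)

f = 50.0    # focal length in mm (illustrative)
s = 300.0   # object distance from the front principal plane, mm
s_img = image_distance_gaussian(s, f)

# Newtonian coordinates measure from the focal planes instead:
x = s - f          # object to front focal plane
x_img = s_img - f  # image to rear focal plane
print(x * x_img, f ** 2)  # both 2500.0, confirming x * x' = f^2
```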
Of course, using the formula one needs to know where the center of perspective (= the center of the entrance pupil) is located within the lens. Bercovitz even demonstrates how it can be measured by carrying out a few experiments. For my purposes, where the object distance will often be rather large compared to PN and f, the formula reduces to dist = f*X, which was my first approximate formula mentioned in this thread. I assume the precise formula will be needed when using macro lenses, right?
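The formula above is straightforward to put into code. A sketch with invented numbers (50 mm lens, subject 2 m away, PN assumed to be 20 mm, a 36 mm wide sensor printed 360 mm wide):

```python
def print_viewing_distance(f, s, PN, L_print, L_sensor):
    """dist = f * X * (s + PN) / (s - f), with X = L_print / L_sensor.
    All lengths in the same unit (mm here)."""
    X = L_print / L_sensor
    return f * X * (s + PN) / (s - f)

exact = print_viewing_distance(f=50, s=2000, PN=20, L_print=360, L_sensor=36)
approx = 50 * (360 / 36)  # far-field approximation: dist = f * X
print(round(exact), approx)  # 518 500.0 -- already close at 2 m
```

As the object distance shrinks toward macro range, the exact and approximate values diverge, which supports the guess that the precise formula matters mainly for close-up work.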
Viewing distance is associated with every photographic image
As I wrote earlier, few people understand or acknowledge the importance of this fact. Many websites write about "distortion" of lenses when that is really not the case. Every lens, except for a few (like fish-eye lenses), renders the real 3D object in correct perspective (up to small lens errors). Those lenses present the world as we see it. Kingslake also clearly points this out.
Now, apart from camera optics: I wrote a small book about linear perspective back in the 90s. I looked at mathematical properties of the central projection and also wrote about painters using linear perspective. It all started out in the Renaissance in Italy, as we know. In Denmark, in the Golden Age of art (the first half of the 1800s), there were experts in linear perspective. They had a rule of thumb when depicting three-dimensional scenes: for example, an X-perspective had to be depicted with a ratio of at least 3:1. This made the painting less sensitive to not being viewed from the correct distance.
In the new year, 2024, I am actually going to give a lecture about linear perspective at the local photo club I attend. As one item, I will demonstrate how pictures may look "apparently distorted" when viewed from the wrong distance. As an example, I have had a "stiff poster" made, which is the perspective image of a tiled floor made of squares. The associated viewing distance is 40 cm. When viewed from a longer distance, it looks like the individual tiles are more rectangular. The reason is that a large angle is displayed! I could easily create perspective images with a very large angle and make a stiff poster of that, making it look totally unreal when viewed from the wrong distance. Unfortunately, it is not easy to demonstrate with a stiff poster, because of the limitations of the human eye: when viewed from the correct position, the image covers such a big angle that parts of it fall outside the field of view of a human eye (visual acuity decreases with angle). Or one could give up the idea of viewing the entire image and just look at parts of it from a skew angle -- like an anamorphosis!

When all that's said, it can be quite fun to use wide-angle lenses.
Again big thanks for clarifying things for me!
- rjlittlefield
- Site Admin
- Posts: 24434
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Re: Pupil factor and linear perspective
Very good -- I'm glad that my musings have been of some help.
You asked earlier for my thoughts, and indeed I have found this to be a thought-provoking discussion.
So, here are some more.
First...
You wrote:
“As an example I have had a "stiff poster" made, which is the perspective image of a tiled floor made of squares. The associated viewing distance is 40 cm.”
The poster looks very useful for in-person demonstration, where the viewer can approach the image to the indicated distance.
But let me suggest some annotation that may be helpful:

Why is this true? It is because those two vanishing points correspond to axes that are 90 degrees apart in the real world, and would be seen that way by the viewer in the real scene.
This would have been a very useful insight to me when I was learning about 2-point perspective, as a child some 60 years ago. Somehow the placement of vanishing points always seemed rather arbitrary. It would have been reassuring to know that there really is a simple rule that produces an exact result, so the arbitrariness comes in deciding how much non-exactness is still OK.
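In fact the exact rule can be stated compactly: since the two vanishing points correspond to perpendicular scene directions, the eye point must see them 90 degrees apart, and the right-triangle altitude relation then gives the viewing distance as the geometric mean of the distances from the principal point to the two vanishing points. A sketch, with invented distances:

```python
import math

def viewing_distance_two_point(d1, d2):
    """Distance from the picture plane to the eye point, given the distances
    d1 and d2 (same units) from the principal point to the two vanishing
    points of perpendicular scene directions.  The 90-degree angle at the
    eye gives d = sqrt(d1 * d2) (altitude-on-hypotenuse relation)."""
    return math.sqrt(d1 * d2)

# Illustration: vanishing points 20 cm and 80 cm from the principal point
print(viewing_distance_two_point(20, 80))  # 40.0 cm
```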
Second...
Earlier in this thread, I wrote that
“Backing up to basic concepts, the requirement for distortion-free viewing is that angular distances between features in the viewing process must be the same as angular distances between those same features in the capturing process. However, that requirement is in direct conflict with the reasons that photographers use long and short lenses in the first place. It says that if you shoot a distant scene with 50mm and 100mm lenses, then you have to view the 100mm image from twice as far away. But if you actually do that, then the viewing process makes the 100mm image look, um, exactly the same as the 50mm except for covering only half as wide an area. Somehow the mathematical goal of "avoiding distortion" has gotten in the way of the photographic goal of rendering the subject. If your goal is something other than just understanding the math, then you'll have to figure out how to resolve this conflict.”

I would like to illustrate this point with an example.
Here is an image that I made some years ago, as a keepsake from a hiking trip.

For the sake of discussion, let us pretend that this was shot on APS-C with a 14 mm lens. (Really it is a stitched panorama, but the horizontal angle of view is about the same as what a 14 mm lens would give.)
Also for the sake of discussion, let us pretend that we have printed this image and set it up to view from the correct perspective distance.
I now ask a question: if we had shot parts of the same scene, from the same place, with lenses of focal length 28 mm, 56 mm, and 112 mm, and we made prints from those images and set them up to view from their own correct perspective distances, what would those images look like to the viewer?
The answer is this:
28 mm

56 mm

112 mm

This appearance may be surprising, but it falls directly out of the mathematics. The requirement for correct perspective is that angular spans as seen by the viewer of the image must match angular spans as seen by a viewer of the scene. The longer lenses have smaller angular fields of view, so to be viewed "correctly" their images must look correspondingly smaller as well.
A corollary of this analysis is that when viewed with perfectly correct perspective, no telephoto image can reveal more detail than the viewer could see with his naked eyes when the picture was taken. The argument is straightforward: the viewers' vision in direct view is limited to their eyes' angular resolution, and to get perfectly correct perspective when viewing images, the angular spans of the subject must be preserved.
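The shrinkage falls directly out of the angle-of-view formula. Assuming an APS-C sensor about 23.6 mm wide, the horizontal angles of view for the four pretend focal lengths are:

```python
import math

def horizontal_aov_deg(focal_mm, sensor_width_mm=23.6):
    """Horizontal angle of view of a rectilinear lens focused at infinity."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for f in (14, 28, 56, 112):
    print(f"{f:4d} mm -> {horizontal_aov_deg(f):5.1f} deg")
# 14 mm -> 80.3, 28 mm -> 45.7, 56 mm -> 23.8, 112 mm -> 12.0 degrees
```

Viewed from its own correct distance, each print subtends exactly its angle of view, so the 112 mm print appears only about one-eighth as wide as the 14 mm print.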
In contrast, here is what the 112 mm lens would actually show when used as is typical.

I don't want to go any farther down this particular rabbit hole, so I hope the point is clear: photographers use less-than-perfect perspective almost all the time, and the reasons for doing that are totally compelling.
For macro, the situation is often even more extreme. One of my favorite macro lenses, when focused at 1:1, has its entrance pupil over 300 mm away from the focused subject. On APS-C, with a linear field of view around 23 mm wide, that gives an angular width of view roughly 4 degrees. The "proper" viewing distance would make the image look so small that I would do better to look at the real subject with just my eyes. Or going literally to the limit, we often use telecentric lenses to simplify stack-and-stitch to get very high resolution, but those lenses have an angle of view of 0 degrees, for which correct perspective would imply showing no detail at all!
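The 4-degree figure is just geometry; here is the computation, using the numbers quoted above (23 mm wide field, entrance pupil a bit over 300 mm away):

```python
import math

# Angular width of a 23 mm wide field seen from 300 mm away
angle = math.degrees(2 * math.atan((23 / 2) / 300))
print(round(angle, 1))  # 4.4 degrees, i.e. "roughly 4 degrees"
```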
So...
From my standpoint, all these issues are matters of context and tradeoffs, often benefiting from intuition but seldom from exact calculations. If I'm doing photogrammetry, or creating VR displays, or writing panorama stitching software, then I'll knock myself out to get the math exactly correct. But most of the time it's perfectly fine (and a lot more efficient) to just know the standard heuristics regarding the appearance of images shot with short and long lenses.
Again, I hope this helps.
--Rik
Re: Pupil factor and linear perspective
Interesting discussion. Your additions to my photo are correct. The actual image span is 90 degrees, which is not recommended for an X-perspective, because it becomes sensitive to not being viewed from the correct distance. When viewed from the correct distance (actually a point), it looks perfectly correct, though.

“This would have been a very useful insight to me when I was learning about 2-point perspective, as a child some 60 years ago. Somehow the placement of vanishing points always seemed rather arbitrary. It would have been reassuring to know that there really is a simple rule that produces an exact result, so the arbitrariness comes in deciding how much non-exactness is still OK.”

Yes, I usually distinguish between perspective drawings without measures and perspective drawings with measures. The former is often used by cartoonists or illustrators. They use the general rules of perspective, for example that parallel "depth lines" (lines not in a plane parallel to the image plane) must meet in a vanishing point. An example:

The final image on the right is a crop of the image on the left, in order not to display too large an angle. Cartoonists don't use any measures at all (to my knowledge). They have, however, developed impressive skills to make things look natural after all.
Another interesting realization is that one cannot reconstruct the 3D object from the perspective image without extra information. This fact is demonstrated in a spectacular way in certain science centers (e.g., the Experimentarium). Viewed from one point the scene looks natural, but if one moves to the side it looks entirely different than expected, revealing that the original "3D object" was a complete illusion!
One can, however, also create perspective drawings from an object with given measures. Here, information about the location of the eye point is needed. The measures of the 3D object can be given by a floor plan and an elevation plan. The drawing proceeds as indicated in the figure below.

A third way to create a perspective drawing or image is by using software that can do it, like Rhino. There are many other options. Such software can create impressive renderings. Since I am especially interested in the subject of linear perspective from an educational viewpoint, I have had a pretty simple wireframe application made. I provided the math needed, and a clever Ukrainian software engineer programmed it for me. The exact viewing position, etc. can be controlled.

Regarding your considerations about taking photos of the trees with different focal lengths: at my lecture in the local photo club I am actually going to present an important, related point. When taking photos from a fixed point in a fixed direction (tripod), the image from a telephoto lens is a pure crop (and scaling) of the image from a wide-angle lens. This is a consequence of the fact that a photographic lens preserves linear perspective, as we have now concluded in this thread. I made a real experiment which confirmed this, as written previously in this thread:
“I made an experiment with a camera with a zoom lens on a tripod. First I shot a photo with the lens set at 25 mm, afterwards a photo set at 75 mm (same position and direction). As expected, the photo shot at 75 mm was a perfect cropped version of the photo taken at 25 mm: I cropped the image shot at 25 mm and enlarged it by a factor of 3. The two images were as identical as I could anticipate (depth of field not regarded), in full agreement with the theory that the lens delivers true linear perspective in both cases. The theory about the associated viewing distances also works: the viewing distance associated with the 25 mm image is 3 times smaller than the viewing distance associated with the 75 mm image, BUT when I enlarged the 25 mm photo by a factor of 3, the viewing distances became the same!”

I haven't considered depth of field in the two images. I am not sure if the cropped and scaled image from the wide-angle lens will have the same DOF as the one taken with the telephoto lens. What do you think?
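The viewing-distance bookkeeping in this experiment is easy to check numerically, using the far-field approximation dist = f * total enlargement (the sensor and print widths below are assumed purely for illustration):

```python
sensor_width = 36.0   # mm, assumed sensor width for illustration
print_width = 360.0   # mm, so base enlargement X = 10

X = print_width / sensor_width
d_75 = 75 * X             # 75 mm shot, printed at enlargement X
d_25 = 25 * X             # 25 mm shot, printed at enlargement X
d_25_crop = 25 * (3 * X)  # 25 mm shot, cropped and enlarged a further 3x
print(d_75, d_25, d_25_crop)  # 750.0 250.0 750.0 -- crop matches the tele
```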
You also write about panorama images. I have created a few panoramas in Capture One, the software I use, but I haven't yet studied the mathematics of how they are made. It has to be some kind of mathematical projection, but it is certainly not one preserving linear perspective: I have panorama images in which straight lines are depicted as very curvy. Not surprising, because the panorama is made up of images with different image planes, even when the individual photos are taken from the same point.
Macro lenses: unfortunately I don't own any macro lens at the moment, which must look strange considering the focus of this site. I moved from Canon to Fujifilm a few years ago. I hope to get a macro lens at some point in the future. What should I look for in a macro lens?

That's all for now.
Kind regards, Erik.
Re: Pupil factor and linear perspective
The cropped and scaled images will have the same DOF if the respective lenses were set so as to give the same entrance pupil diameter when shooting. More precisely, the entrance cones must have the same angular width. This will of course require different f-numbers, which you will have to compensate for by adjusting the exposure time or the ISO setting.
There is a type of analysis, called "equivalent images", in which you shoot the same subject field, with the same illumination, through the same entrance pupil size and location, for the same exposure time, while scaling the ISO setting in proportion to the sensor area, and then scale the resulting images to be the same size and pixel count. The surprising result is that the images you get will also have the same DOF, the same amount of photon shot noise, and the same amount of diffraction blur. I summarize this by saying "same light, same image".
The result is notable because it essentially says that sensor size does not matter, as long as you can adjust the lens and ISO settings appropriately. The standard wisdom that small sensors give large DOF simply reflects the fact that small sensors naturally come with short lenses that have small diameter apertures. It is the small aperture diameter, and not anything else, that produces the large DOF.
“panorama images ... certainly not preserving linear perspective”
Panoramas are another huge topic that provides some unexpected possibilities. Fundamentally you can turn any projection into any other projection. The most common practice is to use lenses that do preserve straight lines, to make panoramas that do not. But one can also make panoramas that preserve straight lines, simply by selecting the appropriate output projection. One can even use lenses that do not preserve straight lines, like fisheyes, to make panoramas or wide-angle photos that do preserve them! On one summer vacation trip I carried only a fisheye lens, but used it to make many ordinary snapshots by running the fisheye images through panorama software with appropriate settings. The advantage of carrying the fisheye was that I always had a sufficiently wide angle. Of course there were tradeoffs in resolution, and that's why I said "snapshots".
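The projection-swapping that panorama software does is, at its core, trigonometry. As a toy model (assuming an ideal equidistant fisheye, where radius on the sensor is proportional to the ray angle off-axis, and ignoring real-lens calibration), a rectilinear output point maps back to a fisheye input point like this:

```python
import math

def rectilinear_to_fisheye(x, y, f_rect, f_fish):
    """Map a rectilinear image point (x, y) to the point on an ideal
    equidistant-fisheye image that sees the same ray.
    f_rect: rectilinear focal length; f_fish: fisheye focal length."""
    r_rect = math.hypot(x, y)
    if r_rect == 0:
        return (0.0, 0.0)
    theta = math.atan2(r_rect, f_rect)  # ray angle off the optical axis
    r_fish = f_fish * theta             # equidistant model: r = f * theta
    scale = r_fish / r_rect
    return (x * scale, y * scale)

# A ray 45 degrees off-axis: rectilinear radius f_rect maps to
# fisheye radius f_fish * pi/4 (about 6.28 for f_fish = 8)
print(rectilinear_to_fisheye(8.0, 0.0, f_rect=8.0, f_fish=8.0))
```

Resampling every output pixel through such a mapping is what lets the software render straight-line (rectilinear) views from fisheye source images.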
“What should I look for in a macro lens?”
First decide what you want to do, then choose a lens to match. For most cameras, you can now get lenses that focus to 1X optical magnification, sometimes as much as 2X, and that also have automatic focus, automatic diaphragm, and often image stabilization. That combination makes a good lens for low-magnification field work. For use in the studio, where manual operation is OK, there are many more options.
--Rik
Re: Pupil factor and linear perspective
Interesting, this "equivalent images" theory across different sensor sizes. I hadn't looked at that question before. Fixed position, direction, and angle of view, as well as shutter speed; then the focal length f, the f-number N, and the ISO value S can be adjusted so as to give the same DOF, the same (order of) photon shot noise, and the same level of diffraction softening. I Googled this concept, and an article by Rowlands actually popped up!
https://www.spiedigitallibrary.org/jour ... 10801.full#_=_
From the assumptions he deduces that the diameters of the entrance pupils need to be the same, exactly as you pointed out. Quite pleasing that DOF, overall noise, and diffraction level mainly depend on a physical property like the amount of light, and less on the photographic equipment.
In the case of focus at infinity he presents the following simple formulas, where R is the linear scaling factor of the sensor size from camera setup 2 to camera setup 1:
f2 = f1/R
N2 = N1/R
S2 = S1/R^2
Example: If I want the equivalent situation on my APS-C sensor compared to a full-frame sensor with, say, f1 = 50 mm, N1 = 4.0 and S1 = 800, then I will need f2 = 32.9 mm, N2 = 2.6 and S2 = 346 (crop factor 1.52). Rowlands also provides formulas for the more advanced situation where focus is NOT at infinity. These formulas actually contain the pupil factor!
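For anyone who wants to experiment with these scalings, here is a small sketch of the infinity-focus formulas (the helper function is my own, not from Rowlands). It also confirms numerically that the entrance-pupil diameter f/N is preserved, which is the physical reason the equivalence works:

```python
def equivalent_settings(f1, N1, S1, R):
    """Infinity-focus equivalence scalings from format 1 to format 2,
    where R is the linear scaling factor of the sensor size
    (e.g. R = 1.52 for APS-C relative to full frame)."""
    return f1 / R, N1 / R, S1 / R**2

# Full-frame f = 50 mm, N = 4.0, ISO 800 -> APS-C equivalent:
f2, N2, S2 = equivalent_settings(50.0, 4.0, 800.0, 1.52)
print(round(f2, 1), round(N2, 1), round(S2))  # 32.9 2.6 346

# The entrance-pupil diameter f/N is unchanged, as expected:
print(50.0 / 4.0, round(f2 / N2, 2))          # 12.5 12.5
```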
I will probably look at the mathematics involved in panorama images at some point.
Regarding macro lenses, I guess it is most appropriate to ask questions in another thread when I am ready for that.
Big thanks for your replies. They helped a lot!
Erik
- rjlittlefield
- Site Admin
- Posts: 24434
- Joined: Tue Aug 01, 2006 8:34 am
- Location: Richland, Washington State, USA
- Contact:
Re: Pupil factor and linear perspective
erikV wrote: ↑Thu Dec 28, 2023 4:45 pm
https://www.spiedigitallibrary.org/jour ... 10801.full#_=_
From the assumptions he deduces that the diameters of the entrance pupils need to be the same, exactly as you pointed out. Quite pleasing that DOF, overall noise, and diffraction level mainly depend on a physical property like the amount of light and less on the photographic equipment.

Thank you for the link. That is the most comprehensive treatment that I have seen.
There is one place where the paper temporarily raised a warning flag for me. In section 2.1, "Same perspective", the words say that:

"Precise equivalence between different formats is possible with focus set at any chosen object-plane distance only if the lens designs have the same symmetry and therefore the same pupil magnification."

The warning flag was because the part about "only if" is not correct; it is an unnecessary restriction. As we have discussed earlier in this thread, pupil magnification has no effect on the perspective of the projected image. To maintain precise equivalence, it is sufficient to use exactly the same entrance pupil (size, location, and shape) for all the different formats and pupil magnifications.
erikV wrote: Rowlands also provides formulas for the more advanced situation where focus is NOT at infinity. These formulas actually contain the pupil factor!

Yes, and in this paper I have no complaint about the formulas. That is because, quickly scanning the math, it looks to me like the pupil factor is used only to calculate what the values of some other parameters must be, in order to accomplish the fundamental requirement of maintaining the same entrance pupil while image size scales in proportion with the sensor format. I am quite confident that, barring clerical error in manipulating the symbols, the math would show that maintaining the entrance pupil and image scaling also results in proper scaling of the image-side effective aperture, and thus DOF and diffraction, independent of pupil factor.
--Rik
Re: Pupil factor and linear perspective
I haven't dug into the formulas provided by Rowlands. They are quite involved.
I have, however, made some progress on the question of preserving linear perspective in photographic images. Earlier in this thread it was stated (Bercovich) that rectilinear photographic lenses do preserve linear perspective. As a consequence, using a telephoto lens instead of a wide-angle lens will result in a cropped image. I wanted to test that experimentally. I needed a place with some depth, so I went to a supermarket and took a few photos down the aisles after the shop was closed and there were no more people left.
I took three photos in the following way:
In situations A and B the position and direction of the camera were fixed; only the focal length was changed. I used a zoom lens on a tripod. In situation A I used a focal length of 16 mm and in situation B a focal length of 68 mm. In the final photo from situation A (photoA) I located the part also covered by photoB, then cropped photoA to that part to get photoA_crop. Then I compared photoA_crop and photoB, displayed at the same size (photoA_crop being scaled to the size of photoB). The two images were very close to one another - the same within experimental error, I would say. This confirmed the theory, at least when it comes to the different items, their sizes, and their locations in the images. I wasn't really careful to think about depth of field and using an appropriate focus point. What do you think here: does the equivalence theory imply that one can choose an appropriate F-number, shutter speed, and ISO value for situation B so that photoB will have the same DOF and illuminance as the cropped and scaled photoA? And what about noise and diffraction level? Can anything be said here, do you think? I guess the cropping and scaling will mean bigger noise ...
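Treating the crop as a smaller sensor, the same equivalence scalings give a rough answer to my own question. This is only a sketch under the infinity-focus formulas, and the exposure settings for the 16 mm shot are hypothetical numbers of my own:

```python
# Cropping the 16 mm frame to the 68 mm framing is a linear crop
# factor of R = 68/16 = 4.25, i.e. like shooting on a sensor 4.25x
# smaller. For photoB (full sensor, 68 mm) to match the DOF of the
# crop, scale the F-number by R and the ISO by R^2:
R = 68 / 16                  # 4.25
N_A, S_A = 8.0, 400.0        # hypothetical settings of the 16 mm shot
N_B = N_A * R                # 34.0 -- beyond the range of most lenses
S_B = S_A * R**2             # 7225.0

# Sanity check: the entrance-pupil diameter f/N is the same:
print(16 / N_A, 68 / N_B)    # 2.0 2.0
```

The striking part is that f/34 is usually not even available on the lens, which already suggests one practical limit on making the telephoto shot equivalent to a deep-DOF wide-angle crop.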
Besides situations A and B, I also shot a photo with the wide-angle 16 mm setting from a position closer to the scene, with the same framing as in situation B. I call it situation C. Quite a few people inexperienced at photography think that instead of shooting a scene from a near position, one can just as well shoot it from further away with a telephoto lens and receive the same result. This is of course wrong. Not only can new objects become visible when one moves closer to a scene, it will also result in stronger perspective foreshortening. The photos from situations B and C showed just that, of course. Regarding the latter: I guess the reason for many people's loose and incorrect handling of the concept of distortion has to do with situations like that, especially when wide-angle lenses are used. It is often pointed out that telephoto lenses make a scene look flatter, while wide-angle lenses make it look "deeper". But as we know, this is only apparent!
Erik
Re: Pupil factor and linear perspective
erikV wrote: ↑Sat Dec 30, 2023 10:11 pm
Does the Equivalence Theory imply that one can choose an appropriate F-number, shutter speed and ISO value for situation B so that photoB will have the same DOF and illuminance as the cropped and scaled photoA? And what about noise and diffraction level? Can anything be said here, do you think? I guess the cropping and scaling will mean bigger noise ...

When you crop, it is just like having used a smaller sensor in the first place.
So yes, equivalence theory does say that you can make the crop-and-enlarge image have the same DOF, diffraction, noise, etc. as the one that used the whole sensor with a longer lens.
In practice there will be second-order effects caused by the physical system not perfectly meeting the assumptions of the theory, for example due to lens aberrations and sampling artifacts. Those effects usually favor the larger sensor.
--Rik