Photography Asked by Randomgenerator on February 7, 2021
How do I convert from the percent magnification that many photo editing apps use to times magnification? For example, if I have taken a photo at 30mm and then zoom to 100% in Lightroom, what focal length would that correspond to?
100% = "actual pixels". A pixel on your monitor equals a pixel from your camera. OK, not exactly, if your computer display scaling is not 100%, then there is another complication, but that can be ignored for this calculation.
You need to start with the zoom level that makes the 30mm image fill your monitor or editing window, then multiply the focal length by the ratio of the percentages. E.g. if your image was taken with a 30mm lens and 10% zoom fills your window, then 100% zoom will be the equivalent of 30mm * (100%/10%) = 300mm.
This is an approximation that works for normal lenses. For an ultrawide it won't be nearly as accurate; there may be a lens geek here who knows how to convert for any focal length.
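Expressed as a minimal Python sketch (the function and variable names are mine, purely for illustration; the fit percentage is whatever your editor reports when the whole image fits the window):

```python
def equivalent_focal_length(actual_mm, fit_percent, zoom_percent=100.0):
    """Focal length whose full-frame view matches the current zoom level.

    actual_mm    -- focal length the photo was taken at
    fit_percent  -- zoom percentage at which the whole image fills the window
    zoom_percent -- zoom percentage you are viewing at (default 100%)
    """
    return actual_mm * (zoom_percent / fit_percent)

# 30mm shot, 10% zoom fills the window -> 100% looks like a 300mm shot
print(equivalent_focal_length(30, 10))  # 300.0
```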
Edit: a test with my camera. All images were taken from exactly the same spot and none are cropped; some are zoomed, of course.
(Image 1) 26mm shot, display percentage set to fill the screen.
(Image 2) The same 26mm shot displayed at 100%.
(Image 3) 73mm shot, with the display percentage adjusted until the checkerboard was the same size as in the previous (26mm) image; 34% made them match.
Now calculate the ratio to predict what focal length should give the same field of view: 100%/34% * 26mm = 76.5mm. Close to 73mm. Not perfect, but fairly good evidence that this method works.
Answered by Mattman944 on February 7, 2021
The percentage is the scaling applied to render the image onto the display. The size of the image on screen will therefore depend on your screen. Conceptually this can be thought of as:

- 100%: each image pixel is shown as one screen pixel.
- 50%: each 2x2 block of image pixels is shown as one screen pixel.
- 200%: each image pixel is shown as a 2x2 block of screen pixels.

For those who take issue with the simplification of the above list: strictly speaking, the mapping can be more complicated in order to provide a clearer zoom, but in terms of scale, that is what happens.
The percentage zoom is unrelated to focal length. It is simply a pixel-to-screen scaling.
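A toy illustration of those three cases using nearest-neighbour scaling (numpy and the crude slicing are my choices; as noted above, real viewers use smarter resampling):

```python
import numpy as np

img = np.arange(16).reshape(4, 4)   # stand-in for a tiny 4x4 image

at_100 = img                        # 100%: one image pixel -> one screen pixel
at_50 = img[::2, ::2]               # 50%: keep one pixel per 2x2 block (no interpolation)
at_200 = img.repeat(2, axis=0).repeat(2, axis=1)  # 200%: one image pixel -> 2x2 screen pixels

print(at_100.shape, at_50.shape, at_200.shape)    # (4, 4) (2, 2) (8, 8)
```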
Answered by Itai on February 7, 2021
There's no direct constant relationship between the two things.¹
This is because different cameras have different pixel densities as well as different sensor sizes.
It's also because different display devices have different sizes and pixel densities.
You can view the same image at 100% on two different monitors and the total magnification factor will be different if the monitors have different pixel densities.
Or you can view images from two different cameras on the same monitor at 100% and have different magnification factors because the two cameras have different pixel densities.
When you view an image at 100% on a display device, you are using one pixel group (one RGB set) on the monitor to display one pixel of the image, a 1:1 correspondence. 100% is not an expression of a specific amount of magnification; it's an expression of the ratio, in linear terms, between the number of screen pixels used and the number of image pixels being displayed.
If you have a 1920 x 1080 monitor and an image that is 1920 x 1080 pixels, then viewing the image full screen would give a 100% view. But if you take a 3840 x 2160 pixel image and view it at 100% on the same monitor, you'll only see 1/4 of the entire image: half of the width and half of the height. That's because the second image is twice as many pixels wide and twice as many pixels high, which gives four times as many pixels in the image compared to the number of pixels the screen has.
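A quick sketch of that arithmetic (the function and its name are mine, for illustration only):

```python
def visible_fraction(image_w, image_h, screen_w, screen_h, zoom=1.0):
    """Fraction of the image visible on screen at a given zoom (1.0 = 100%)."""
    vis_w = min(1.0, screen_w / (image_w * zoom))
    vis_h = min(1.0, screen_h / (image_h * zoom))
    return vis_w * vis_h

print(visible_fraction(1920, 1080, 1920, 1080))  # 1.0  -> the whole image fits
print(visible_fraction(3840, 2160, 1920, 1080))  # 0.25 -> a quarter of the image
```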
¹ That is, between focal length and screen magnification.
Answered by Michael C on February 7, 2021
This percentage is a ratio of the displayed image to the full-size image.
If your viewing application is in "dot-for-dot" mode (one pixel in the source image is one pixel on the screen, which is usually the default) then at 100% you actually have a one-to-one mapping between pixels on screen and in the source image (de-mosaiced camera sensor "pixels"). At 50%, a pixel on screen is the interpolation of a 2x2 square from the source, and at 200% a 2x2 square on the display represents one pixel of the source.
In non "dot-for-dot" viewing modes, the viewing application checks the "print definition" in the image, which indicates how big the image should be in print (print size = pixels ÷ print definition
) so the zoom ratio is the ratio of the image on the screen to the image in print (this assumes that the application knows the pixel density of your screen but it can find this out). However, since by default the print definition is 72PPI (and this is not specified in pictures from cameras) the default print size of your image is about twice bigger than that of the dot-for-dot view (because modern "standard" displays are around 120-150PPI) so the 100% in dot-for-dot produces roughly the same size as 50% in "print size" in mode.
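A sketch of the two modes' effective scale factors; the 72 PPI default and the 120-150 PPI screen range come from the description above, while the function names and the 144 PPI example value are my own assumptions:

```python
def dot_for_dot_scale(zoom_percent):
    """Dot-for-dot mode: zoom is a direct image-pixel-to-screen-pixel ratio."""
    return zoom_percent / 100.0

def print_size_scale(zoom_percent, screen_ppi=144, print_ppi=72):
    """Print-size mode: the image is rendered at its print size on screen.

    print size (inches) = pixels / print_ppi, drawn on a screen_ppi display,
    so each image pixel covers screen_ppi / print_ppi screen pixels at 100%.
    """
    return (zoom_percent / 100.0) * (screen_ppi / print_ppi)

# With a 144 PPI screen and the 72 PPI default, 100% dot-for-dot matches
# 50% in print-size mode:
print(dot_for_dot_scale(100))  # 1.0
print(print_size_scale(50))    # 1.0
```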
Answered by xenoid on February 7, 2021
There are some "broken links" and some unrelated steps among the variables you would need to know to work out the magnification of an image on your screen. Let me summarize them:
A) Size of the object
B) Shape of the object
C) Distance to the object
D) Focal length
E) Internal optics of the lens
F) Sensor size
G) Megapixels of the sensor
G1) The megapixel setting actually configured on the camera
H) Resolution on your screen
I) The resolution configured on the graphics card
J) Or, if you use a projector, the resolution of the projector
K) The optics of the projector
L) The projection distance
M) The final size
And the only variable we actually know is that the software is displaying at 100% zoom, that is, 1 image pixel mapped to 1 screen pixel.
If we knew all the variables, we could work out the real magnification ratio.
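As a hedged sketch of how some of those variables (A, C, D, F, G, H) chain together in the simple on-screen case, under a thin-lens approximation; every name and number below is an assumed example, not data from the answer:

```python
def on_screen_magnification(object_mm, distance_mm, focal_mm,
                            sensor_width_mm, sensor_width_px,
                            screen_ppi, zoom=1.0):
    """Ratio of the object's on-screen size to its real size.

    Thin-lens approximation: size on sensor ~ object_mm * focal_mm / distance_mm.
    At 100% zoom (zoom=1.0), one image pixel maps to one screen pixel.
    """
    size_on_sensor_mm = object_mm * focal_mm / distance_mm
    pixels_covered = size_on_sensor_mm * (sensor_width_px / sensor_width_mm)
    screen_pixel_mm = 25.4 / screen_ppi          # physical size of one screen pixel
    size_on_screen_mm = pixels_covered * zoom * screen_pixel_mm
    return size_on_screen_mm / object_mm

# 100mm-wide object, 1m away, 30mm lens, 36mm/6000px sensor, 110 PPI screen
print(round(on_screen_magnification(100, 1000, 30, 36, 6000, 110), 2))  # ~1.15
```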
One example of what "magnification" means at one stage of this chain is here: What does "magnification" mean?
Answered by Rafael on February 7, 2021