Photography Asked on May 19, 2021
I would expect out-of-focus areas to look like a smooth Gaussian blur, ideally. But I've found they look more like looking through frosted glass: colors around edges blend with a grainy effect instead of gradually.
I've seen this effect in a lot of web images and confirmed it with my iPhone camera.
I guess it may be related to some imperfection of the camera, but I'm not sure which part causes it, or how.
Well, the subject area you indicate does look like you may have photographed it through glass (from inside a room to outside).
But that aside, most of the "frosted glass" look you see here is noise reduction/interpolation. Sensors in phones are small and comparatively noisy (particularly in limited light), but the processing power in phones makes it comparatively attainable to do significant computational processing that glosses over that noise. This smearing effect is not unique to phone pictures; it is typical for smaller-sensor cameras with high pixel counts under limited light. The resulting overly uniform areas are often likened to a "watercolor" effect. Your phone is apparently going more for a Monet effect. Adaptive algorithms mean that this kind of smearing tends to get applied to regions without obvious edges, and for edges to register as obvious over small areas, the image needs to be quite in focus.
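To see that trade-off in isolation, here is a minimal sketch (Python with OpenCV and NumPy; the file names are placeholders and the noise level is made up, not any phone's actual pipeline) that adds sensor-like noise to an image and then denoises it aggressively with an edge-preserving filter. Regions without obvious edges come out smeared into watercolor-like patches while hard edges mostly survive:

```python
# Minimal sketch of the noise-reduction trade-off, not a real phone
# pipeline: add noise, then denoise aggressively with an edge-preserving
# filter. Requires OpenCV and NumPy; "photo.jpg" is a placeholder.
import cv2
import numpy as np

img = cv2.imread("photo.jpg")

# Simulate the noisy output of a small sensor in limited light.
noise = np.random.normal(0, 25, img.shape)
noisy = np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)

# Strong bilateral filtering smooths regions without obvious edges into
# uniform patches while leaving hard edges mostly intact -- the same
# trade-off that produces the "watercolor" look.
denoised = cv2.bilateralFilter(noisy, 9, 75, 75)
for _ in range(3):  # repeated passes exaggerate the effect
    denoised = cv2.bilateralFilter(denoised, 9, 75, 75)

cv2.imwrite("denoised.jpg", denoised)
```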
If you want a less impressionistic rendition of the out-of-focus areas, you need a better camera (the remaining noise makes it obvious that there is a reason for this level of noise reduction) or more exposure.
However, if this is a phone photograph we are talking about, the reason may be a completely different one: the kind of blur you show here is more than the small optics of a phone camera would usually deliver. Granted, there is a close subject in front that the focus is on, and that is sort of the best case for blurring. But a lot of "smart" smartphone camera processing these days computationally adds blur to out-of-focus areas that the optics simply don't deliver, in order to create a more "photographic" look.
This computational blur depends on a 3D model of the scene, estimated from the small amounts of optical blur and possibly from an additional camera with a different perspective, and that model will again be blotchy due to noise.
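As a rough illustration (NumPy/SciPy, with synthetic stand-ins for the photograph and the depth estimate; this is not how any particular phone implements it), the sketch below blends a sharp image with a heavily blurred copy according to a per-pixel depth map. Making the depth estimate noisy shows where the blotchy blur boundaries come from:

```python
# Sketch of depth-based computational blur with made-up data: blend sharp
# and blurred copies of an image by an estimated depth map. Noise in the
# depth estimate turns directly into blotches in the rendered "bokeh".
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
img = rng.random((256, 256))  # stand-in for a photograph

# Idealized depth map: 0 = in-focus foreground, 1 = distant background.
depth = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))

# A real depth estimate is noisy; smoothing the noise makes it patchy,
# roughly like the errors of stereo or ML depth estimation.
depth_noise = gaussian_filter(rng.normal(0.0, 0.5, depth.shape), sigma=8)
noisy_depth = np.clip(depth + depth_noise, 0.0, 1.0)

blurred = gaussian_filter(img, sigma=6)  # heavy synthetic background blur

# Per-pixel blend: the deeper the (estimated) pixel, the more of the
# blurred copy it takes. Blotches in noisy_depth become blur blotches.
fake_bokeh = (1.0 - noisy_depth) * img + noisy_depth * blurred
```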
This computational blurring is only approximate: for example, if a point light source is hidden behind a small in-focus object, actual optics "looking around" the object will render the point light's bokeh as a halo around it. Smartphone blurring, in contrast, cannot make visible by blurring what the camera never saw.
If you use cameras with large/sensitive sensors, and/or older cameras that don't do a lot of smart processing, the results will be a lot closer to what you expect.
By the way: you don't get "Gaussian blur": that is more characteristic of the effects of inexact optics. Out-of-focus bokeh takes the shape of the entrance pupil as visible from the respective point in the scene (this is relevant for wide angles and/or "cat's eye bokeh"), scaled in proportion to how far the scene point is from the focus plane (bokeh in front of and behind the focus plane is inverted in orientation).
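For a rough sense of scale, the diameter of that defocus disc follows from standard thin-lens geometry (a sketch with illustrative numbers, not something taken from the answer above):

```python
# Thin-lens estimate of the defocus ("bokeh") disc diameter on the sensor
# for a point at distance s2 when a lens of focal length f and f-number n
# is focused at s1. Standard geometry; the example numbers are made up.
def blur_disc_diameter(f_mm, n, s1_mm, s2_mm):
    aperture = f_mm / n                    # entrance pupil diameter
    magnification = f_mm / (s1_mm - f_mm)  # magnification at focus distance
    return aperture * magnification * abs(s2_mm - s1_mm) / s2_mm

# Phone-like lens (f = 6 mm, f/1.8) focused at 0.5 m, background at 3 m:
print(blur_disc_diameter(6, 1.8, 500, 3000))    # ~0.034 mm on the sensor

# Full-frame 50 mm f/1.8 at the same distances (roughly similar framing):
print(blur_disc_diameter(50, 1.8, 500, 3000))   # ~2.6 mm on the sensor
```

Assuming a typical sensor width of about 6 mm for the phone and 36 mm for full frame, that works out to roughly half a percent of the frame width versus about seven percent, which is why the purely optical blur of a phone rarely looks "creamy" on its own.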
Answered by user98068 on May 19, 2021