Physics, asked on May 29, 2021
I have a hardware and software setup that uses a sensor to give good per-pixel estimates of depth, i.e. a depth map – think Kinect or similar.
Now assume I can access individual pixel values that give an estimate of depth, i.e. distance from the camera.
My question is: can I use this information to infer or estimate horizontal distances? For example, I want to know how wide an aisle is, or the approximate horizontal distance between two points.
For example, I would want to estimate the length spanned by the horizontal arrow in the image.
Hope this question is appropriate here; if not, let me know a better Stack Exchange site. Thanks!
If you know your distance and your horizontal field of view, then you know what the span from the left side of your screen to the right side of your screen represents at any given distance from the camera, using simple trig: $Y = d\tan(u)$.
You then just use the ratio of how wide the object is on the screen relative to the width of the screen, $\frac{X}{Y}$, to figure out the width of the object.
Works for height too.
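Here is a rough numerical sketch of that in Python, assuming a pinhole camera and writing the full visible width at depth $d$ as $2d\tan(u/2)$, with $u$ the full horizontal field of view (the $Y = d\tan(u)$ above, with $u$ read as the half-angle); the function name and the Kinect-like numbers are only illustrative.

```python
import math

def estimate_width(depth_m, hfov_deg, object_px, image_width_px):
    """Estimate the real-world width of something seen in a depth image.

    Pinhole-camera assumption: the full visible width at distance depth_m
    is Y = 2 * depth_m * tan(hfov / 2); the object covers object_px of the
    image_width_px pixels across the frame, so its width is X = Y * ratio.
    """
    visible_width = 2.0 * depth_m * math.tan(math.radians(hfov_deg) / 2.0)
    return visible_width * (object_px / image_width_px)

# Example: an aisle 3 m away spanning 400 of 640 pixels,
# seen with a roughly 57 degree horizontal field of view.
print(estimate_width(depth_m=3.0, hfov_deg=57.0, object_px=400, image_width_px=640))
```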
You can also measure your field of view the same way: film something of known width at a known distance and work backwards.
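And the reverse calibration, under the same pinhole assumption: film an object of known width at a known distance, count the pixels it spans, and solve the same relation for the field of view (again, the names and numbers are only illustrative).

```python
import math

def calibrate_hfov(depth_m, known_width_m, object_px, image_width_px):
    """Recover the horizontal field of view from an object of known size.

    Inverts Y = 2 * d * tan(hfov / 2): scale the object's known width up to
    the full screen width Y, then solve for the angle.
    """
    full_width = known_width_m * (image_width_px / object_px)
    return math.degrees(2.0 * math.atan(full_width / (2.0 * depth_m)))

# Example: a 1 m wide board filmed 2 m away that spans 320 of 640 pixels.
print(calibrate_hfov(depth_m=2.0, known_width_m=1.0, object_px=320, image_width_px=640))
```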
Answered by DKNguyen on May 29, 2021