This repository has been archived by the owner on Aug 31, 2021. It is now read-only.
Hi, in order to get an artifact-free video from my recorded footage I have a few questions:
We shot a scene where a person walks towards and around the camera
at a distance of 1 - 1.5 meters. In some of those frames, the person
starts to partly disappear (it looks like some image columns of the person vanish),
or heavy distortions appear on the person's head (the point at minimum distance to the cameras).
How does the minimum distance of an object to the ODS system affect the stitching?
Is it the optical flow that fails because the objects are too near, resulting in
large horizontal disparity? Have you observed something similar?
Also, in the rendering pipeline, when I stack the optical flow fields horizontally
and visualize them with e.g. Middlebury color coding or normalized
vertical disparity, it is clearly visible that the image consists of 14 vertical
stripes.
Should the stacked optical flow field visualizations look consistent with one another?
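One way to quantify the stripe effect described above: stack the per-camera-pair vertical flow fields side by side and compare the mean vertical disparity of neighbouring stripes. A minimal sketch, assuming a 14-stripe layout and synthetic flow data (the function name and shapes are illustrative, not from the actual pipeline):

```python
import numpy as np

def stripe_consistency(v_flows):
    """v_flows: list of 2-D arrays, one vertical-flow field per camera pair.
    Returns the per-stripe mean vertical disparity and the largest jump
    between neighbouring stripes; a large jump would show up as a visible
    seam when the stripes are stacked and color-coded."""
    means = [float(np.mean(f)) for f in v_flows]
    jumps = [abs(a - b) for a, b in zip(means, means[1:])]
    return means, (max(jumps) if jumps else 0.0)

# Toy example: 14 stripes whose mean vertical flow alternates by 0.1 px,
# mimicking the kind of per-stripe inconsistency that makes the 14
# vertical bands stand out in a visualization.
flows = [np.full((64, 32), 0.1 * (i % 2)) for i in range(14)]
means, max_jump = stripe_consistency(flows)
```

If the rig were perfectly calibrated and the flow converged, the per-stripe means should be close and `max_jump` small; visible banding suggests either residual calibration error or per-pair flow differences.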
Hi @smdr2670. Optical flow becomes increasingly more difficult the closer you get to the camera rig, exactly for the reason you mention. We have success with objects at > 8ft, and we start having issues closer (depends on the type of object and movement).
Regarding the optical flow visualization, if you can attach a screenshot I can try to make sense of it :)
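The distance dependence above can be sketched with a back-of-the-envelope pinhole model (hypothetical numbers, not the actual rig geometry): for two adjacent cameras with baseline B and focal length f in pixels, a point at depth Z has horizontal disparity of roughly f * B / Z, so halving the distance doubles the displacement the optical flow has to bridge:

```python
def disparity_px(baseline_m, focal_px, depth_m):
    """Approximate horizontal disparity, in pixels, of a point at
    depth_m seen by two cameras separated by baseline_m (small-angle
    pinhole approximation)."""
    return focal_px * baseline_m / depth_m

f_px = 1000.0  # assumed focal length in pixels
B = 0.10       # assumed 10 cm baseline between adjacent cameras

d_far = disparity_px(B, f_px, 2.4)   # roughly the 8 ft working distance
d_near = disparity_px(B, f_px, 1.2)  # half the distance -> twice the disparity
```

Under these assumed numbers, moving from ~2.4 m to ~1.2 m doubles the disparity the flow must match, which is consistent with near objects being where the stitching starts to break down.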