Recently, we have been trying to set up a framework to compensate for latency in either video pass-through or optical see-through glasses, based on IMU data. If we can find patterns in head movement, especially rotational movement, we can predict where the head's rotation will be and pre-rotate the world for the user to produce better-registered rendered results. Oculus already applies similar prediction-based techniques in VR to reduce latency, re-projecting images to where head movement says they will visibly land (Asynchronous Timewarp, later extended by Asynchronous Spacewarp (ASW)).
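To make the pre-rotation idea concrete, here is a minimal sketch: given a predicted head yaw some milliseconds from now, rotate the scene by the inverse so the rendered frame lands where the head will actually be looking. The 5-degree angle, the y-up camera convention, and the forward vector are all assumptions for illustration, not values from our pipeline.

```python
import numpy as np

def yaw_matrix(theta):
    """Rotation about the y (up) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

# Assumed prediction: the head will have yawed 5 degrees further by display time.
predicted_yaw = np.deg2rad(5.0)

# Pre-rotate the world by the inverse rotation (transpose, for rotation matrices)
# so that, once the head arrives at the predicted pose, the image lines up.
pre_rotation = yaw_matrix(predicted_yaw).T

forward = np.array([0.0, 0.0, 1.0])
corrected_forward = pre_rotation @ forward
```

In a real renderer this matrix would be composed into the view transform just before scan-out, which is essentially what timewarp-style reprojection does.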
Attached is a very rough result from our initial prediction. We applied a straightforward regression algorithm to predict the gyro reading 30 ms ahead from its history. In this experiment, I start yawing to the left from center, then yaw to the right, and then repeat. The featured figure visualizes the gyro on all three axes: green dots are the ground truth and red dots are the predicted results. As expected, only one axis shows large changes while I am yawing. The top two figures show the rotation applied to a forward vector (0, 0, 1) from two different viewpoints. Figure (a), a top-down view, clearly shows the yaw motion; the circle marks the start point. Figure (b), a front view, shows that I also drifted slightly up and down during the yaw.
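The "straightforward regression" above can be sketched as a per-axis linear model: a short window of past gyro samples predicts the sample 30 ms ahead. The sample rate, window length, and the synthetic sinusoidal test signal below are assumptions for illustration; our actual features and fitting details differ.

```python
import numpy as np

FS = 200       # assumed IMU sample rate (Hz)
HORIZON = 6    # 30 ms ahead at 200 Hz
WINDOW = 10    # number of past samples used as features

def fit_predictor(history, horizon=HORIZON, window=WINDOW):
    """Least-squares fit: `window` past samples -> the sample `horizon` ahead."""
    X, y = [], []
    for t in range(window, len(history) - horizon):
        X.append(history[t - window:t])
        y.append(history[t + horizon])
    X = np.hstack([np.asarray(X), np.ones((len(X), 1))])  # bias column
    w, *_ = np.linalg.lstsq(X, np.asarray(y), rcond=None)
    return w

def predict(w, recent):
    """Predict 30 ms ahead from the most recent `window` samples."""
    return np.append(recent, 1.0) @ w

# Synthetic yaw-axis gyro trace: a smooth left-right-left oscillation,
# standing in for the real recorded data.
t = np.arange(0.0, 4.0, 1.0 / FS)
gyro_yaw = np.sin(2 * np.pi * 0.5 * t)

w = fit_predictor(gyro_yaw)
next_sample = predict(w, gyro_yaw[-WINDOW:])
```

In practice one such model runs per axis, and the predicted angular velocities are integrated into the pre-rotation applied at render time.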
Next, let us see how it performs once combined with rendering.