This blog is an update on the full body tracking project. In my last blog, I introduced a full body tracking solution, including finger tracking, built on the hardware combo of the Rokoko Smartsuit and Leap Motion.
In the past two weeks, Ken and I talked about the possibility of adapting this mocap solution for our SIGGRAPH show. If all the related work can be done in time, we may present the solution as part of the show. Because the primary stage of our show takes up most of the space, we need a way to demonstrate full body tracking in a limited area, essentially a standing-only space. An intuitive method is to have the player step in place physically and convert that stepping into walking in VR. The problem can therefore be simplified to stepping recognition, plus mapping the up/down leg gestures onto natural walking.
Stepping recognition detects repeated lifting/lowering gestures of the legs and, in VR, moves the avatar in the direction the pelvis is facing. Mapping stepping to walking is trickier. What I did was define a fixed step length and, whenever a step is detected, move the corresponding foot by that step length along its direction. The leg segments between the pelvis and the feet are then taken care of by inverse kinematics.
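The detection and mapping described above can be sketched roughly as follows. This is an illustrative Python sketch, not the project's actual (likely Unity) code; the class name, thresholds, and step length are all assumptions for demonstration.

```python
import math

# Assumed values for illustration, not from the original project.
LIFT_THRESHOLD = 0.08  # meters a foot must rise above rest height to count as "lifted"
STEP_LENGTH = 0.5      # fixed step length in meters per detected step

class StepWalker:
    """Detects lift-then-lower foot gestures and advances the avatar
    along the pelvis-facing direction by a fixed step length."""

    def __init__(self, rest_foot_height=0.0):
        self.rest_height = rest_foot_height
        self.foot_lifted = {"left": False, "right": False}
        self.position = [0.0, 0.0]  # avatar (x, z) on the ground plane

    def update(self, foot, foot_height, pelvis_yaw_rad):
        """Call once per frame per foot with the tracked foot height.
        Returns True when a complete step (lift then lower) is detected."""
        lifted = foot_height - self.rest_height > LIFT_THRESHOLD
        stepped = self.foot_lifted[foot] and not lifted  # lift -> lower = one step
        self.foot_lifted[foot] = lifted
        if stepped:
            # Move a fixed step length in the direction the pelvis faces.
            self.position[0] += STEP_LENGTH * math.sin(pelvis_yaw_rad)
            self.position[1] += STEP_LENGTH * math.cos(pelvis_yaw_rad)
        return stepped
```

In a real engine the feet would be repositioned relative to the avatar and an IK solver would pose the legs between pelvis and feet; here the sketch only shows the gesture detection and the forward translation.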
Some other VR platforms (https://youtu.be/qh2UdRKNqH4), such as the Virtuix Omni, use treadmill-like hardware to accomplish the same goal. But those devices cost a few thousand dollars each, while this solution adds no extra cost and no extra setup. It may not be as realistic as natural walking, but for a project with limited space where accuracy is not the main focus, it is a good way to balance the space limitation against extra cost.