Earlier this year, our lab got Notch, a lightweight IMU-based wearable motion capture system. The system is a set of six waterproof IMU sensors designed to be mounted on the body. Unlike external-camera systems such as OptiTrack or Vicon, which use a group of infrared cameras to triangulate the positions of retroreflective markers, it relies on Inertial Measurement Units: each IMU combines an accelerometer, which senses acceleration (including the reaction to gravity, giving an absolute reference for tilt), with a gyroscope, which measures angular velocity. Fusing the two signals yields an estimate of the sensor's orientation.
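
To make that sensor-fusion idea concrete, here is a minimal sketch of a complementary filter, one of the simplest common ways to blend the two signals. This is a generic illustration, not Notch's actual fusion algorithm, and every name in it (`complementary_filter`, the `alpha` blend weight) is invented for this example.

```python
import math

def complementary_filter(pitch, roll, accel, gyro, dt, alpha=0.98):
    """One update step of a complementary filter (illustrative sketch only).

    accel: (ax, ay, az) in g      -- senses the gravity direction, so it gives
                                     an absolute but noisy pitch/roll reference.
    gyro:  (gx, gy, gz) in rad/s  -- angular velocity, smooth but drifts when
                                     integrated over time.
    """
    ax, ay, az = accel
    gx, gy, gz = gyro

    # Absolute tilt from the direction of gravity seen by the accelerometer.
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    accel_roll = math.atan2(ay, az)

    # Short-term orientation change from integrating the gyroscope.
    gyro_pitch = pitch + gy * dt
    gyro_roll = roll + gx * dt

    # Blend: trust the gyro for fast motion, let the accelerometer
    # slowly pull the estimate back and cancel the drift.
    pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
    roll = alpha * gyro_roll + (1 - alpha) * accel_roll
    return pitch, roll


# Example: one update at 100 Hz, device nearly level, rotating slowly about x.
pitch, roll = complementary_filter(0.0, 0.0,
                                   accel=(0.0, 0.02, 0.99),
                                   gyro=(0.1, 0.0, 0.0),
                                   dt=0.01)
```

The gyroscope integration is smooth but accumulates drift, while the accelerometer's gravity reference is absolute but noisy during motion; the `alpha` weight trades the two against each other.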
With the help of the straps and clip mounts that come with the kit, we can pair the sensors with a smartphone and conveniently obtain the position and orientation of the body parts the sensors are mounted on. Due to the nature of the design, IMU-based motion capture is not as accurate as marker-based motion capture, but it is much more convenient to use at home, since it needs neither the extra space nor the work of installing external cameras.
Most major VR systems have a headset to track the head and two controllers to track the hands. With the help of machine learning and/or inverse kinematics (IK), adding an IMU to each foot is enough to drive a full-body-rigged avatar that represents the user in VR. This could be a great solution for the next generation of VR systems.
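
For a sense of what the IK side of such a sparse-tracker setup involves, below is a minimal analytic two-bone solve in a 2D sagittal plane, recovering hip and knee angles from a tracked foot position via the law of cosines. The function, its parameters, and the planar simplification are my own illustration under those assumptions, not the solver used by any particular VR runtime.

```python
import math

def two_bone_ik(hip, foot, thigh_len, shin_len):
    """Analytic two-bone IK in a 2D plane (illustrative sketch).

    Given a hip position and a tracked foot target (e.g. from a foot IMU),
    return hip and knee angles (radians) that place the ankle at the target.
    """
    dx = foot[0] - hip[0]
    dy = foot[1] - hip[1]
    dist = math.hypot(dx, dy)
    # Clamp so the target stays reachable (avoids domain errors when the
    # tracker reports a point farther away than the leg can stretch).
    dist = min(dist, thigh_len + shin_len - 1e-6)

    # Law of cosines gives the knee bend.
    cos_knee = (thigh_len**2 + shin_len**2 - dist**2) / (2 * thigh_len * shin_len)
    knee_angle = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))

    # Hip angle: direction to the target plus the interior angle at the hip.
    cos_hip = (thigh_len**2 + dist**2 - shin_len**2) / (2 * thigh_len * dist)
    hip_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_hip)))
    return hip_angle, knee_angle


# Example: 45 cm thigh and shin, foot tracked 30 cm forward and 70 cm down.
print(two_bone_ik(hip=(0.0, 0.0), foot=(0.3, -0.7), thigh_len=0.45, shin_len=0.45))
```

A real full-body solver works in 3D, handles joint limits, and blends in learned pose priors, but the core of turning a few tracked end effectors into joint angles is the same kind of constrained solve.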
Qualisys Motion Systems launched a new 6DOF product at SIGGRAPH called Traqr, designed for LBE/LBVR. With six 6DOF active devices on a person, the system automatically identifies the units, displays their positions, runs the Skeleton Solver software, and streams into iClone, MotionBuilder, Unity, or Unreal in real time to animate. Of course, multiple subjects and props can be markered up, and the system can also use normal passive and active markers together.