This week, we were able to do some mocap tests with actors to get ready for our next production for SIGGRAPH. The tests went well, and most of the data was easy to clean, except for a few problems around the actors' feet; they probably grazed some of the markers on their legs and accidentally shifted them. All in all, it was a good test.
The new thing we did, in addition to the standard mocap, was perform facial capture at the same time. We thought about the problems we had in our earlier tests with the Faceware cameras and tried to mitigate them. One of the problems we had before was that turning the camera on and off for each take would move the camera slightly, which meant I had to work on each clip individually. This time, we kept the camera recording the entire session. But we still ran into problems. In our previous tests, the actor had not been moving around much, only turning her head slightly when she needed to. For our new scene, the actor was moving around the area as he acted. That meant the camera had a high chance of shifting on his head. It also meant that the lighting on his face would change dramatically depending on whether he was facing the windows or not. The dramatic change in lighting seems to have the same effect on Faceware's tracking algorithm as a changing camera angle does. A few times, the lighting obscured the actor's face and the software could not figure out where his mouth was.
I think that to avoid this problem in the future, we need to mount lights on the camera rig itself so that the actors' faces get more consistent lighting, like in film productions. We should also paint tracking markers on the actors' faces to make it easier for me to match their facial poses.