I hope it is still not too late to talk about the conference. I was planning to write about it two weeks ago but… Anyway, there were not many papers this year that interested me. Instead, I caught up with quite a few old friends who only show up at conferences, and that was the most fun part. Once I lowered my expectations, I found quite a few courses and experiences in the exhibition hall worth trying. Today I will talk about what happened in Hall G-K. First of all, the job fair was not great this year. Now let's begin.
Motion Capture. I saw motion capture everywhere this year. Previously, it was just something I saw every day in the lab; I had used OptiTrack, so I assumed I knew motion capture. Things changed slightly this year because of my internship: now I pay more attention to the tracking, the capture, and the experience. In my mind, content and experience are the key to promoting a technology. There was a circus performer giving performances in the OptiTrack booth, and you could see how they handle self-occlusion and how low the latency is. Similarly, FTRACK was capturing golf in real time. Recently, I also saw the news about Siren and the HoloLens translation demo. Motion capture is an essential part of realistic modeling.
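I do not actually know what OptiTrack does when a marker is hidden, but the basic idea of bridging short dropouts can be sketched in a few lines. Below is a minimal, hypothetical example that fills brief occlusions of a single marker with constant-velocity extrapolation; the function name and the NaN-for-occluded convention are my own assumptions, not anything from the booth.

```python
import numpy as np

def fill_occlusions(positions, max_gap=5):
    """Fill short marker dropouts with constant-velocity extrapolation.

    positions: (T, 3) array of one marker's positions; occluded frames are NaN.
    max_gap: only bridge gaps up to this many frames; longer gaps stay NaN.
    """
    filled = positions.copy()
    velocity = np.zeros(3)
    gap = 0
    for t in range(1, len(filled)):
        if np.isnan(filled[t]).any():
            gap += 1
            if gap <= max_gap and not np.isnan(filled[t - 1]).any():
                # Marker briefly hidden: carry it forward along its last velocity.
                filled[t] = filled[t - 1] + velocity
        else:
            if not np.isnan(filled[t - 1]).any():
                velocity = filled[t] - filled[t - 1]
            gap = 0
    return filled

# Toy track: two occluded frames in the middle get bridged.
track = np.array([[0.0, 0, 0], [1, 0, 0], [np.nan] * 3, [np.nan] * 3, [4, 0, 0]])
print(fill_occlusions(track))
```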
GauGAN. That was the most impressive real-time live demo. It is quite convincing to me that GauGAN received the award, and I am glad they had a booth so we could try it. Well, it proved that even when the painting tool is really smart, some people, such as me, still cannot paint a beautiful natural landscape. First, the resolution is quite low; I can't imagine why that was not noticed during the Real-Time Live session. Second, the transitions between different elements are really smooth, but sometimes they are too smooth and the result ends up very blurry. I was surprised by how foggy my drawing turned out. But the idea is still attractive to me.
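For anyone curious what the painting interface actually feeds the network: GauGAN (the SPADE model) takes a semantic label map, where each painted region is a class such as sky, mountain, or water, and synthesizes the photo from that. Here is a minimal sketch of building such a label map and one-hot encoding it; the class ids are made up for illustration, and the commented-out generator call is hypothetical, standing in for a pretrained SPADE-style generator.

```python
import numpy as np

# Made-up label ids for a few landscape classes; the real demo uses its own label set.
SKY, MOUNTAIN, WATER = 0, 1, 2
NUM_CLASSES = 3

def paint_to_label_map(height=256, width=256):
    """Build a toy segmentation map: sky on top, a mountain band, water below."""
    label = np.full((height, width), WATER, dtype=np.int64)
    label[: height // 3] = SKY
    label[height // 3 : height // 2] = MOUNTAIN
    return label

def one_hot(label, num_classes=NUM_CLASSES):
    """Convert an (H, W) label map into the (C, H, W) one-hot tensor that a
    SPADE-style generator consumes instead of an input photo."""
    return (np.arange(num_classes)[:, None, None] == label[None]).astype(np.float32)

seg = one_hot(paint_to_label_map())   # shape (3, 256, 256)
# image = generator(seg)              # hypothetical pretrained generator call
```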
Unreal VR Experience. It is not new to us that we can apply retargeting operations in VR, but this was the first time I tried time retargeting. ERATO played pre-recorded captures when the user was not focusing on them. Strictly speaking, it is not a VR experience; they are doing video pass-through. Eye trackers attached inside the headset tell the system where the user's focus is, so physical objects can seem to fly or disappear. That is very inspiring. In another dimension, I tried an MR experience with head-mounted cameras in which the regular human beings inside the view might become aliens. It felt so real…
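I can only guess how ERATO gates the swap, but the core logic sounds simple: as long as the eye tracker reports the gaze inside the object's screen region, show the live pass-through; once the gaze leaves it, switch to the pre-recorded capture in which the object flies away or disappears. A toy sketch of that logic, with a made-up eye-tracker interface:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def choose_frame(gaze_xy, object_region: Rect, live_frame, prerecorded_frame):
    """Gaze-gated pass-through: while the user is looking at the object,
    show the live camera feed; once the gaze leaves the region, swap in the
    pre-recorded clip so the object can change unnoticed.

    gaze_xy: (x, y) gaze point in screen pixels from the eye tracker,
             or None if tracking is lost (hypothetical interface).
    """
    if gaze_xy is not None and object_region.contains(*gaze_xy):
        return live_frame
    return prerecorded_frame

# Example: the user looks away from the object's region, so the swap happens.
print(choose_frame((10, 10), Rect(100, 100, 50, 50), "live", "prerecorded"))
```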
Optical display. I saw quite a few optical displays. The Optical Cloaking Display showed a virtual object rendered in front of a 2D background with high resolution and brightness. The booth next to it (I forgot the name) showed how to hide an object by adding an identical copy of the background over the foreground. Also, Adobe provided a quite impressive transparent display.
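The "identical background over the foreground" trick can be thought of as additive compositing: the optics add an image of the background on top of the region where the object sits, so the object is washed out into its surroundings. This is only a toy numerical sketch of that additive idea, not the booth's actual optical setup.

```python
import numpy as np

def cloak(foreground, background, mask, strength=1.0):
    """Additively overlay the background onto the masked object region.

    foreground, background: float images in [0, 1], shape (H, W, 3).
    mask: (H, W) array, 1 where the object should be hidden, 0 elsewhere.
    """
    composited = foreground + strength * background * mask[..., None]
    return np.clip(composited, 0.0, 1.0)
```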
Interaction Beyond the Dimension. AIR showed how they manipulate models in mid-air between 3D printers and phones. It is not new, but it always charms me.
Mica. I tried it, twice. Since most of our lab is way more familiar with it than I am, I will save that topic.