This is the first blog post since I officially came back to the lab! I'm so excited that I can finally talk about the details of the ongoing project.
The lab has been working on bringing Chalktalk into immersive environments for a while, and many different devices have been involved: the Vive, the Oculus Rift, the Oculus Go, Google Tango/Lenovo phones, and the Mirage AR headset. We are eager to have the creatures in Chalktalk come to life and play with us in our 3D world.
Here is a really rough sketch I made a long time ago to explain the architecture that combines the behavior server and the game engine.
As you can see, at that time we used a line renderer to visualize all the data in Unity, especially point data. However, as we all know, Chalktalk is capable of much more than that, and we really wanted to make it fully functional. So, step by step: months ago we got filled shapes working. Instead of a line renderer, we created a customized mesh so that shapes could be filled in.

Then we started giving real presentations with Chalktalk in Unity, and in that scenario we found that text matters just as much as the sketches. So we figured out how to send, receive, and visualize text in Unity. TextMesh was the best option we could find in Unity, so we chose it as the third optional component. Text, like everything else, is sent as a byte array, which we have used all along to save network bandwidth. A lot of detailed work went into understanding the position of text in Chalktalk, and thanks to Karl we have a working version of that.
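To make the "customized mesh" idea a bit more concrete: a filled shape needs a vertex list plus a flat list of triangle indices (that is exactly what a Unity Mesh expects in its `vertices` and `triangles` fields). Here is a minimal, hypothetical sketch in Python of the simplest approach, fan triangulation, which works for convex outlines; this is an illustration of the general technique, not the lab's actual implementation:

```python
def fan_triangulate(outline):
    """Return triangle index triples that fill a convex polygon.

    `outline` is a list of (x, y) vertices in order around the shape;
    each returned triple indexes into that list. This only works for
    convex outlines -- concave sketches need a real triangulation
    algorithm such as ear clipping.
    """
    return [(0, i, i + 1) for i in range(1, len(outline) - 1)]

# A unit square splits into two triangles sharing vertex 0.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(fan_triangulate(square))  # → [(0, 1, 2), (0, 2, 3)]
```

In Unity the same index triples would be flattened into a single `int[]` and assigned to the mesh, with a matching `Vector3[]` of vertices.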
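And to illustrate the byte-array idea for text: one common way to put a string on the wire compactly is a length-prefixed UTF-8 encoding. The sketch below shows that pattern in Python; the exact field layout (a 4-byte little-endian length header) is my assumption for illustration, not the lab's actual wire format:

```python
import struct

def pack_text(text: str) -> bytes:
    """Encode a string as a compact length-prefixed UTF-8 byte array."""
    data = text.encode("utf-8")
    # 4-byte little-endian length header, followed by the raw bytes.
    return struct.pack("<I", len(data)) + data

def unpack_text(payload: bytes) -> str:
    """Recover the original string from a length-prefixed payload."""
    (length,) = struct.unpack_from("<I", payload, 0)
    return payload[4:4 + length].decode("utf-8")

msg = pack_text("Chalktalk")
print(unpack_text(msg))  # round-trips back to "Chalktalk"
```

The length prefix lets the receiver pull one message at a time out of a stream, and sending raw bytes instead of, say, JSON keeps the per-stroke network overhead low.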
Right now a complicated lecture can be viewed in a fully immersive 3D environment!
Thanks to Karl for holding the headset and finding a Star Wars viewport.