Recently we have been wrapping up the ChalktalkVR code and optimizing it to some extent. First, we got the “splitting” feature integrated into the main branch, which means ChalktalkVR can render however many lines Chalktalk produces. (If not, report it to KTR.) However, we found that when the number of lines (or points) was very large, both Chalktalk and ChalktalkVR became very laggy, so we started thinking about how to improve this in future designs.
The number of objects in Unity is an important factor, since every extra GameObject adds per-object overhead. One alternative is to combine different sketches into a single object, i.e. one mesh. So we started looking for line-rendering assets and found Vectrosity, which ships demos for rendering different types of lines. We gave it a quick try; the featured image shows a single object rendering multiple lines with welded joints.
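To illustrate the one-object idea, here is a minimal sketch (not Vectrosity's or ChalktalkVR's actual code; the component and the `strokes` input are hypothetical): every stroke is packed into a single Unity `Mesh` with line topology, so the whole sketch set renders as one object.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: merge every stroke into one Mesh so all lines render as a
// single Unity object instead of one object per sketch.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class CombinedStrokeMesh : MonoBehaviour
{
    public void Rebuild(List<Vector3[]> strokes)
    {
        var vertices = new List<Vector3>();
        var indices = new List<int>();

        foreach (var stroke in strokes)
        {
            // Each consecutive pair of points becomes one line segment.
            for (int i = 0; i < stroke.Length - 1; i++)
            {
                indices.Add(vertices.Count);
                indices.Add(vertices.Count + 1);
                vertices.Add(stroke[i]);
                vertices.Add(stroke[i + 1]);
            }
        }

        var mesh = new Mesh();
        // Allow more than 65k vertices for very dense sketches.
        mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
        mesh.SetVertices(vertices);
        mesh.SetIndices(indices.ToArray(), MeshTopology.Lines, 0);
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

Note that `MeshTopology.Lines` only gives the thin, basic lines mentioned below; an asset like Vectrosity builds wider camera-facing quads instead, but the object-count benefit is the same.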
The visualization of the lines is high on our priority list. The result of Unity's native line renderer is quite basic, which is another reason we looked at other assets. In Chalktalk, the lines change their facing direction based on the camera, so they look the same no matter where the viewpoint is. We took a look at how Vectrosity implements that. The idea is as expected: every frame, each quad is generated from the positions of a line's endpoints, the line width, and the camera transformation. However, probably because I ran it in VR, the lines seemed to drift as I moved. I wonder whether that means we need to generate two results, one for each eye.
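Here is a rough sketch of that per-frame billboarding as I understand it (my reading of the approach, not Vectrosity's actual source): each segment's endpoints are pushed sideways by half the width, and the view direction from the camera decides which way “sideways” is. In stereo, the two eyes have slightly different positions, so a quad built from a single camera transform faces each eye slightly differently, which may be what the drift was.

```csharp
using UnityEngine;

// Sketch of camera-facing quad generation for one line segment.
public static class BillboardLine
{
    // Returns the four corners of a quad of the given width that
    // spans a..b and faces the camera.
    public static Vector3[] SegmentQuad(Vector3 a, Vector3 b, float width, Camera cam)
    {
        Vector3 dir = (b - a).normalized;
        // View direction from the camera toward the segment's midpoint.
        Vector3 view = ((a + b) * 0.5f - cam.transform.position).normalized;
        // Sideways offset: perpendicular to both the segment and the view ray.
        Vector3 side = Vector3.Cross(dir, view).normalized * (width * 0.5f);

        return new[] { a - side, a + side, b + side, b - side };
    }
}
```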
Welded joints are on our todo list as well. Vectrosity does support them, but the joints appeared to shift a lot when the VR camera moved.
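For reference, a welded joint is usually built with the standard miter construction; the sketch below is that textbook version, not Vectrosity's code. The shared vertex at the corner is offset along the bisector of the two segments' side directions, scaled so the line keeps its width through the corner.

```csharp
using UnityEngine;

// Sketch of a welded (miter) joint between consecutive segments
// a->b and b->c: offset the shared vertex at b along the bisector of
// the two side directions instead of letting the quads gap or overlap.
public static class WeldJoint
{
    public static Vector3 MiterOffset(Vector3 a, Vector3 b, Vector3 c, float width, Vector3 viewDir)
    {
        Vector3 d1 = (b - a).normalized;
        Vector3 d2 = (c - b).normalized;
        // Side (offset) directions of the two segments, facing the camera.
        Vector3 s1 = Vector3.Cross(d1, viewDir).normalized;
        Vector3 s2 = Vector3.Cross(d2, viewDir).normalized;
        // The miter direction bisects the two side directions; scale it so
        // the joint keeps the requested width along both segments.
        // The clamp avoids huge spikes at very sharp corners.
        Vector3 miter = (s1 + s2).normalized;
        float scale = (width * 0.5f) / Mathf.Max(Vector3.Dot(miter, s1), 0.1f);
        return miter * scale;
    }
}
```

Because the side directions depend on the camera, the welded corner also gets recomputed every frame, which may be why the joints looked unstable as the VR camera moved.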
Well, yes, this is really just a first impression after one day of trying Vectrosity. It is powerful and fun, and it even supports text!