Aruco Detection with Built-in Cameras


Recently Alex and I were working on Aruco board detection. In brief, our results were very unstable and seemed wrong. We didn't understand why, especially since the code we were using was basically the official sample code from the OpenCV for Unity plugin. So we split the work: she started writing the sensor fusion part, and I decided to work on Aruco detection from scratch again.

Step 1: Get the Aruco board working with a USB camera

One step at a time: I tried to get the Aruco code fully working first. The last time I used Aruco was around 9 months ago. At that time, I remember, I thought I completely understood all the equations, variables, and coordinate systems; apparently I have forgotten them, so I need to get familiar with all of that again. Getting the USB camera working was quite simple. With a standard Matlab calibration, the Aruco detection worked fine even without the refineDetectedMarkers step, which is really confusing.
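For reference, here is a minimal sketch of what this step looks like with OpenCV's Python bindings and the classic cv2.aruco API (OpenCV <= 4.6 with the contrib module). The dictionary, board geometry, camera index, and calibration file name are placeholders, not the values we actually used.

```python
# Minimal sketch: Aruco grid-board pose from a USB camera, using the classic
# cv2.aruco API (OpenCV <= 4.6 with the contrib module). Dictionary, board
# geometry, camera index, and calibration file are placeholders.
import cv2
import numpy as np

# Intrinsics from the calibration, exported to a NumPy archive (assumed format).
calib = np.load("usb_camera_calibration.npz")
camera_matrix, dist_coeffs = calib["camera_matrix"], calib["dist_coeffs"]

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_6X6_250)
# 5x7 markers, 4 cm markers with 1 cm gaps -- placeholder geometry.
board = cv2.aruco.GridBoard_create(5, 7, 0.04, 0.01, dictionary)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, rejected = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None and len(ids) > 0:
        # The refinement step mentioned above: it tries to recover markers that
        # were rejected in the first pass, using the known board layout.
        corners, ids, rejected, _ = cv2.aruco.refineDetectedMarkers(
            gray, board, corners, ids, rejected, camera_matrix, dist_coeffs)
        rvec, tvec = np.zeros((3, 1)), np.zeros((3, 1))
        used, rvec, tvec = cv2.aruco.estimatePoseBoard(
            corners, ids, board, camera_matrix, dist_coeffs, rvec, tvec)
        if used > 0:
            # rvec/tvec give the board pose in the camera frame.
            cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, 0.1)
    cv2.imshow("aruco", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```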

Step 2: Seriously and carefully calibrate the iPhone camera

We took a bunch of photos with the iPhone's front camera (well, last time Alex took 10 with the back camera…). The resolution is not standard; it is something like 1376×1798 (from memory). We calibrated with those anyway and ran the new Aruco detection code. Weirdly, the rotation seemed reasonable, but the position was way off. I talked to Connor about it because he has a mobile app with control over the camera center, i.e. he can shift the detection result. He told me that the shift basically comes down to the camera matrix. Well, that means something is wrong with the calibration.
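We did the calibration in Matlab, but for completeness here is roughly the same step with OpenCV's calibrateCamera. This is only a sketch: the checkerboard dimensions, square size, and photo folder are assumptions, not our actual setup.

```python
# Sketch of the calibration step with OpenCV instead of Matlab, assuming a
# 9x6 inner-corner checkerboard and a folder of iPhone photos (both placeholders).
import glob
import cv2
import numpy as np

pattern = (9, 6)            # inner corners per row/column -- assumption
square = 0.024              # square size in metres -- assumption

# One 3D object-point grid, reused for every image in which the board is found.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points, image_size = [], [], None
for path in glob.glob("iphone_front/*.jpg"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]          # (width, height)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("camera matrix:\n", camera_matrix)
np.savez("iphone_front_calibration.npz", camera_matrix=camera_matrix,
         dist_coeffs=dist_coeffs, image_size=np.array(image_size))
```

The focal lengths (fx, fy) and the principal point (cx, cy) live in camera_matrix; if the principal point is wrong, the estimated translation shifts, which is exactly the kind of offset Connor was describing.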

Step 3: Run a capture app with the built-in camera in Unity

I started to realize that the built-in camera in Unity may well behave differently from the iPhone's own camera app. And this is the key! Once we captured images from the Unity app and calibrated the camera with those images, the Aruco detection code worked more or less perfectly.
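The practical lesson is that the intrinsics are only valid for frames coming from the same capture pipeline and resolution they were calibrated with. A small, hypothetical sanity check along those lines (it assumes the calibration was saved in the same .npz format as the sketch above, with the image size included):

```python
# Sanity check before running pose estimation: refuse to use intrinsics on frames
# whose size does not match the calibration images. File name and keys are the
# ones assumed in the calibration sketch above, not an established convention.
import numpy as np

calib = np.load("unity_capture_calibration.npz")   # placeholder file name
camera_matrix = calib["camera_matrix"]
dist_coeffs = calib["dist_coeffs"]                 # passed on to estimatePoseBoard later
calib_size = tuple(int(v) for v in calib["image_size"])   # (width, height)

def check_frame(frame):
    h, w = frame.shape[:2]
    if (w, h) != calib_size:
        raise ValueError(
            f"frame is {w}x{h} but calibration was done at "
            f"{calib_size[0]}x{calib_size[1]}; recalibrate with frames from "
            "this capture pipeline instead of the native camera app")
    # The principal point should sit near the image centre; a large offset is a
    # hint that the intrinsics belong to a different crop or sensor mode.
    cx, cy = camera_matrix[0, 2], camera_matrix[1, 2]
    if abs(cx - w / 2) > 0.1 * w or abs(cy - h / 2) > 0.1 * h:
        print("warning: principal point far from image centre:", cx, cy)
```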
