Tracky
Introducing Tracky, our open-source camera tracking tool

Today we’re excited to introduce a new open-source tool named Tracky! Tracky is a two-part kit aimed at helping AR prototypers, VFX artists, and developers. The first part is an iOS app that works like a video camera app, but also scans the room and records the device’s orientation (and more!) alongside each video recording. The second part is a Blender plugin that imports these recordings and recreates the device’s experience within Blender as faithfully as possible.

On our team at Shopify we do a lot of AR prototyping. Sometimes this is a high-fidelity task like building a real-time interactive experience, but sometimes we can explore an idea with concept sketches alone. Most often we find ourselves wanting to explore and express ideas as videos. Tracky was developed for our own use: now we can record a prototype interaction video with the iOS app, sketch the AR concept with Blender’s tools, then import the recording and render out the final video without ever leaving Blender.

VFX artists will be interested in all the aligned data that Tracky records. It records video from the back camera, audio from the microphone, the camera’s orientation in space, its focal length, its sensor height, a depth map captured via LiDAR, and a foreground segmentation mask, all aligned to the Blender timeline. It also scans the world for horizontal and vertical planes such as floors and walls, so you can snap things to them easily. Tracky also lets you tap to place anchors, then brings those anchors into Blender as empties, so you can track important points in space.
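
As a rough sketch of what that looks like on the Blender side, here’s one way tapped anchors could be turned into empties. The anchor list and its "transform" field here are illustrative assumptions, not the plugin’s actual data layout:

```python
import bpy
from mathutils import Matrix

def import_anchors(anchors):
    """Create one empty per recorded anchor.

    Assumes each anchor is a dict with a hypothetical "transform" key
    holding a flat, row-major list of 16 floats (a 4x4 world matrix).
    """
    for i, anchor in enumerate(anchors):
        # An object created with no data block is an empty in Blender.
        empty = bpy.data.objects.new(f"TrackyAnchor.{i:03d}", None)
        empty.empty_display_type = 'PLAIN_AXES'
        bpy.context.collection.objects.link(empty)
        flat = anchor["transform"]
        empty.matrix_world = Matrix([flat[j:j + 4] for j in range(0, 16, 4)])
```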

Developers will note that the .bren file that stores all of this information is actually JSON (shh, don’t tell @brennan!). You can access the data directly; the Blender plugin shows one way to parse the JSON and read the various properties, like the camera’s transform matrix and focal length over time. There’s also extra data stored alongside the recordings, like the .arworldmap files that can be used to realign an interactive AR session to its recorded environment.
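
Since the format is plain JSON, you don’t even need Blender to poke at a recording. Here’s a rough sketch using only Python’s standard library; the key names ("camera_frames", "transform", "focal_length") are illustrative assumptions, so check the plugin source for the real schema:

```python
import json

# .bren is plain JSON under the hood, so the standard library is enough.
with open("recording.bren") as f:
    bren = json.load(f)

# Hypothetical layout: one entry per video frame, each with a 4x4
# camera-to-world transform and the focal length at that moment.
for i, frame in enumerate(bren["camera_frames"]):
    transform = frame["transform"]    # assumed: nested 4x4, row-major
    focal_mm = frame["focal_length"]  # assumed: millimetres
    x, y, z = (row[3] for row in transform[:3])  # translation column
    print(f"frame {i}: focal={focal_mm}mm, camera at ({x:.2f}, {y:.2f}, {z:.2f})")
```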

We see this as a starting point for Tracky. From our own work, we already know some features we’d like to see. Can we extract environment maps from the AR session? Can we do skeletal tracking and/or eye tracking and add that to the captured output? What about importing 3D models into the live preview for reference? Can we record the LiDAR depth data as lossless HDR? We’ll probably develop some of these for our own use, but what we’re really excited to see is where the community takes it. If you add support for Unity, if you make a compatible Android app, heck, even if you just make a video with the tool, please let us know; we’d love to hear about it!

March 9, 2023