Camera and object tracking

LightAct supports a wide range of 3rd-party tracking solutions. For the latest list, go to the Devices window and check the Trackers category in the right-click menu.

Good to know: when we say 'camera tracking', we mean that a particular protocol can transmit lens data, such as FOV and lens distortion, on top of the regular position and rotation data. 'Object tracking' usually refers to tracking data that does not include this camera-specific data.
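To make the distinction concrete, here is a minimal, hypothetical sketch of the two kinds of payloads (these class names and fields are illustrative assumptions, not LightAct's API): object tracking carries position and rotation only, while camera tracking adds lens data on top.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ObjectTrackingSample:
    """Position and rotation only, as typically sent by object-tracking protocols."""
    position: Tuple[float, float, float]   # x, y, z in meters
    rotation: Tuple[float, float, float]   # pitch, yaw, roll in degrees


@dataclass
class CameraTrackingSample(ObjectTrackingSample):
    """Camera tracking adds lens data on top of position and rotation."""
    fov: float = 50.0                                       # field of view in degrees
    lens_distortion: Optional[Tuple[float, float]] = None   # e.g. k1, k2 coefficients
```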

Good to know: setting up any kind of tracking starts in the Devices window, where you receive the data. Then, if you want to track a moving object, you can go to CalTrack or use node-based tracking workflows for further processing.

Receive Tracking Data

To receive camera/object data via one of these protocols, we first need to create a Receiver device. Right-click in the Devices window and choose the specific Receiver device.
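Inside LightAct this is purely a GUI step, but conceptually a Receiver device listens for tracking packets arriving over the network. The sketch below is a protocol-agnostic illustration of that idea; the port number and packet layout are assumptions for the example, not the format of any specific tracker or of LightAct itself.

```python
import socket
import struct

# Assumed UDP port for this example; use whatever port your tracking system sends to.
TRACKING_PORT = 6301

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", TRACKING_PORT))

while True:
    packet, sender = sock.recvfrom(1024)
    # Assumed layout: six little-endian floats (x, y, z, pitch, yaw, roll).
    if len(packet) >= 24:
        x, y, z, pitch, yaw, roll = struct.unpack("<6f", packet[:24])
        print(f"{sender[0]}: pos=({x:.2f}, {y:.2f}, {z:.2f}) "
              f"rot=({pitch:.1f}, {yaw:.1f}, {roll:.1f})")
```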

Visualize Tracking Data

LightAct has a feature called Tracking Visualizer which allows you to visualize tracking data.

Tracking Visualizer

CalTrack

Among LightAct's tracking technologies, CalTrack enables imported 3D models to mimic the movements of their real-world counterparts.

Read more about it in the link below:

CalTrack
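To illustrate the general idea behind this kind of tracking (not CalTrack's internals), the sketch below applies an incoming position and rotation sample to a model's 4x4 transform so that the virtual model follows its real-world counterpart. The Euler order and units are assumptions for the example.

```python
import numpy as np


def rotation_matrix(pitch: float, yaw: float, roll: float) -> np.ndarray:
    """Build a 3x3 rotation matrix from Euler angles in degrees (ZYX order assumed)."""
    p, y, r = np.radians([pitch, yaw, roll])
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return rz @ ry @ rx


def model_transform(position, rotation) -> np.ndarray:
    """Combine a tracked position and rotation into a 4x4 model transform."""
    m = np.eye(4)
    m[:3, :3] = rotation_matrix(*rotation)
    m[:3, 3] = position
    return m


# Each new tracking sample updates the transform so the model mirrors the real object.
transform = model_transform(position=(1.0, 0.5, 2.0), rotation=(0.0, 90.0, 0.0))
```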

Node-based workflows

LightAct also allows you to bring the tracking data into Layer Layouts, where you can fire various triggers based on this data or process it further.

Node-based Tracking Workflows
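As a rough illustration of the kind of logic you might build with nodes in a Layer Layout (plain Python here, not actual LightAct nodes; the zone bounds and trigger name are made up for the example), the sketch below fires a trigger whenever a tracked position enters a defined zone.

```python
def inside_zone(position, zone_min, zone_max) -> bool:
    """Return True if a tracked position falls inside an axis-aligned zone."""
    return all(lo <= p <= hi for p, lo, hi in zip(position, zone_min, zone_max))


def process_sample(position, fire_trigger) -> None:
    """Fire a trigger when the tracked object is inside the zone, mirroring
    the compare-and-trigger nodes you would wire up in a Layer Layout."""
    if inside_zone(position, zone_min=(-1.0, 0.0, -1.0), zone_max=(1.0, 2.0, 1.0)):
        fire_trigger("object_in_zone")


# Usage: feed a tracking sample in and print when the trigger fires.
process_sample((0.2, 1.0, 0.5), fire_trigger=lambda name: print(f"Trigger: {name}"))
```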
