Camera and object tracking

LightAct has integrated the following camera tracking protocols: MoSys, FreeD, Stype, Ncam and SPNet (Stage Precision). Most of them use the UDP protocol under the hood, while Ncam can use either UDP or TCP.

LightAct has integrated the following object tracking protocols: PSN, BlackTrax, OptiTrack, SPNet and Vive. All of them use the UDP protocol under the hood.
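
To give a feel for what these UDP streams look like on the wire, here is a minimal Python sketch that listens for and decodes FreeD "D1" camera packets, one of the simpler camera tracking protocols listed above. This is purely illustrative: LightAct's Receiver devices do all of this internally, and the port number (40000) is an arbitrary assumption you would match to your tracker's settings.

```python
import socket

def s24(b: bytes) -> int:
    """Decode a big-endian signed (two's complement) 24-bit integer."""
    return int.from_bytes(b, "big", signed=True)

def parse_d1(pkt: bytes) -> dict:
    """Parse a 29-byte FreeD 'D1' camera position/orientation message."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # FreeD checksum: 0x40 minus the sum of all preceding bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) & 0xFF != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        # Angles in degrees: 24-bit signed fixed point, 15 fractional bits.
        "pan":  s24(pkt[2:5])  / 32768,
        "tilt": s24(pkt[5:8])  / 32768,
        "roll": s24(pkt[8:11]) / 32768,
        # Position in millimetres: 24-bit signed fixed point, 6 fractional bits.
        "x": s24(pkt[11:14]) / 64,
        "y": s24(pkt[14:17]) / 64,
        "z": s24(pkt[17:20]) / 64,
        # Zoom and focus are raw encoder counts; their meaning is rig-specific.
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 40000))  # assumed port; use whatever your tracker sends to
while True:
    data, _ = sock.recvfrom(1024)
    print(parse_d1(data))
```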

In the table below, you can find which protocols support camera tracking, which support object tracking, and which support both.

Good to know: when we say 'camera tracking', we refer to whether a particular protocol is able to transmit lens data, such as FOV and lens distortion, on top of regular position and rotation data.

| Protocol  | Camera tracking | Object tracking |
| --------- | --------------- | --------------- |
| PSN       |                 | ✓               |
| MoSys     | ✓               |                 |
| FreeD     | ✓               |                 |
| BlackTrax |                 | ✓               |
| OptiTrack |                 | ✓               |
| Stype     | ✓               |                 |
| Ncam      | ✓               |                 |
| SPNet     | ✓               | ✓               |
| Vive      |                 | ✓               |

Good to know: setting up any kind of tracking is done in two steps: first in the Devices window, where you receive the data, and then in one of the Layer Layouts, where you decide what you want to do with this data.

Set up Tracking in the Devices window

To receive camera or object data via one of these protocols, we first need to create a Receiver device. Right-click in the Devices window and choose the specific Receiver device.
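
If you don't have tracking hardware at hand, one way to check that a Receiver device is picking up data is to feed it synthetic packets. The sketch below makes the same assumptions as the parser above (FreeD D1 format, port 40000, localhost); the IP and port are hypothetical and must match the Receiver device's settings.

```python
import math
import socket
import time

def s24(v: float, scale: int) -> bytes:
    """Encode a value as a big-endian signed 24-bit fixed-point integer."""
    return (round(v * scale) & 0xFFFFFF).to_bytes(3, "big")

def freed_d1(cam_id, pan, tilt, roll, x, y, z) -> bytes:
    """Build a 29-byte FreeD 'D1' packet (zoom/focus left at zero)."""
    body = bytes([0xD1, cam_id])
    body += s24(pan, 32768) + s24(tilt, 32768) + s24(roll, 32768)  # degrees
    body += s24(x, 64) + s24(y, 64) + s24(z, 64)                   # millimetres
    body += bytes(6)  # zoom (3 bytes) + focus (3 bytes), zeroed
    body += bytes(2)  # spare
    return body + bytes([(0x40 - sum(body)) & 0xFF])  # checksum

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
t = 0.0
while True:
    # Sweep pan between -45 and +45 degrees so movement is easy to spot.
    pkt = freed_d1(1, 45 * math.sin(t), 0.0, 0.0, 0.0, 1500.0, 0.0)
    sock.sendto(pkt, ("127.0.0.1", 40000))  # must match the Receiver device
    t += 0.05
    time.sleep(0.02)
```

With this running, the Receiver device's incoming values should update continuously, which confirms the network path before any real tracker is connected.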

Setup in Layer Layouts

To use the values we received in the Devices window, we need to create a reader node in one of the layers. Open the layout of one of the layers (by double-clicking on it), right-click in the Layer window and choose the specific Layer reader node.

In the next chapters, we will create the camera and object tracking devices for demonstration purposes. We will also describe the unique specifics that differentiate each device from the others.