Tracking Follower

Tracking Follower is a feature exclusive to primitive objects, video screens, and imported 3D objects. It lets you visualize the location of environment trackers by moving the chosen Viewport Object in response to incoming tracking data.

Tracking Follower Setup

To apply the transformation of incoming tracking device data to an Object, all you have to do is follow four simple steps:

  1. Insert an Object onto the Viewport,

  2. choose a Tracking device,

  3. place at least three different Tracking points on the Object, and

  4. assign the Trackers to the Tracking points.

1. Place an Object onto the Viewport

In the Viewport window, choose a Primitive object, Video Screen, or imported 3D Object from the Toolbar and drag it to the desired spot on the Viewport floor.

In practice, this should be the virtual twin of the physical 3D Object you will be tracking.

2. Choose a Tracking Device

In the Tracker device dropdown menu, choose one of the Device receiver nodes placed in the Devices window.

3. Select three different points from that Object

Next, choose at least three different points on the virtual Object that match the positions of the physical Trackers relative to the physical 3D Object.

To do so, click the Add point button under the Tracking follower section. Upon clicking, you will enter a special mode, indicated by the Pick a point label in the top-left corner of the Viewport window.

Good to know: In the Pick a point mode, the chosen Viewport Object will be isolated and all other Viewport objects will become invisible.

If the Render spheres checkbox is enabled, a sphere is drawn wherever you hover over the chosen object.

To add a point to the object, all you have to do is click on it.

Good to know: You can also exit the Pick a point mode without adding any points by pressing the ESC key or left-clicking somewhere else in the Viewport.
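
Why at least three points? A single Tracker pins down the Object's position but not its orientation; three non-collinear points are the minimum needed to determine a unique rigid-body pose. As an illustration only (this is not a LightAct API), here is a small Python sketch, assuming numpy, that checks whether three chosen points are usable, i.e. not all on one line:

```python
import numpy as np

def points_span_a_plane(p1, p2, p3, tol=1e-6):
    """True if the three points are non-collinear (they span a plane)."""
    v1 = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    v2 = np.asarray(p3, dtype=float) - np.asarray(p1, dtype=float)
    # The cross product vanishes exactly when the points lie on one line.
    return np.linalg.norm(np.cross(v1, v2)) > tol
```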

4. Assign the Trackers to the points

Under the Tracker column, enter the name of the physical tracker for each corresponding point.

Once you enable the Apply transformation override checkbox, the virtual Object will jump to the position in the Viewport that matches the position of the physical Object in your tracking environment.
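
Conceptually, the Apply transformation override solves for the rigid transform, a rotation plus a translation, that best maps the virtual points onto the tracked Tracker positions. The sketch below shows that underlying idea using the Kabsch algorithm in Python with numpy; it is illustrative only and not LightAct's actual implementation:

```python
import numpy as np

def rigid_transform(virtual_pts, tracked_pts):
    """Least-squares rotation R and translation t with R @ p + t ~= q
    for each virtual point p and tracked point q (Kabsch algorithm).
    Illustrative only: LightAct computes this internally."""
    P = np.asarray(virtual_pts, dtype=float)   # Nx3 points picked on the virtual Object
    Q = np.asarray(tracked_pts, dtype=float)   # Nx3 physical Tracker positions
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)    # centroids of both point sets
    H = (P - cP).T @ (Q - cQ)                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Example: three points on the virtual Object and their tracked counterparts
virtual = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
tracked = [(2, 3, 0), (3, 3, 0), (2, 4, 0)]   # same shape, shifted by (2, 3, 0)
R, t = rigid_transform(virtual, tracked)
print(np.round(R, 3), np.round(t, 3))          # identity rotation, t = [2, 3, 0]
```

Applying R and t to the virtual Object moves it to the pose of its physical counterpart, which is the jump you see when the override is enabled.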