This wiki is intended for older versions of Motive.

OptiTrack Unreal Engine 4 Plugin


Revision as of 15:42, 11 April 2019


Plugin Overview

The OptiTrack Unreal Engine Plugin allows you to stream real-time rigid body, skeleton, and HMD tracking data from Motive into Unreal Engine 4 (UE4). This article focuses on how to set up those different streaming workflows. For instructions on setting up the motion capture system, please refer to the Getting Started guide instead.

The OptiTrack Unreal Engine Plugin is divided into two plugins:

  • The OptiTrack - NatNet Streaming Client plugin, which allows you to stream rigid body and skeleton data from Motive into Unreal Engine.
  • The OptiTrack - OculusVR plugin, which allows you to implement HMD tracking in your scenes.

Once the plugins are enabled, they receive the streaming data from Motive through an OptiTrack Client Origin object in your Unreal Engine level.

Version Requirements

This plugin is supported for the following versions of the Unreal Engine, and it also requires the corresponding version of the Oculus runtime as indicated in the table.

Plugin Version UE Version Oculus SDK
1.0 UE 4.10 0.8
1.1 UE 4.11 1.3
1.2 UE 4.12 Latest
1.3 UE 4.13 Latest
1.4 UE 4.14 Latest
1.5 UE 4.15 Latest
1.6 UE 4.16 Latest
1.7 UE 4.17 Latest
1.8 UE 4.18.3 Latest
1.9 UE 4.19.2 Latest
1.20 UE 4.20.3 Latest
1.21 UE 4.21.2 Latest


As of OptiTrack Unreal Engine 4 Plugin 1.2.0, Oculus DK2, CV1, and Gear VR HMD models are supported.

Motive Data Streaming Setup (Server)

First, follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will broadcast tracking data onto a designated network interface where client applications can receive it.

Streaming in Motive

Broadcast Frame Data set to true for streaming.

Open the streaming settings in Motive and set the values below:

  • Enable the Broadcast Frame Data setting at the top.
  • From the Local Interface drop-down menu, select the network address that you wish to stream the data over.
  • Select the desired data types to stream under the streaming options. Rigid body streaming is required. Skeleton streaming is optional and needed only for skeleton tracking. For tracking HMDs, disabling Marker streaming is advised.
  • Set the Skeleton Coordinates to Local.
  • When streaming skeletons, set the appropriate bone naming convention for the client application. (If the character uses the FBX naming convention, this will need to be set to FBX.)
  • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.


Additional Tips

  • In order to stream data from Edit mode, a capture recording must be playing back in Motive.
  • For best results, it is advised to run Motive and Unreal Engine separately on different computers, so that they are not competing for processing resources.
  • When streaming the data over a Wi-Fi network, Unicast transmission must be used.
  • For additional information on data streaming in general, read through the Data Streaming page.
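As a quick way to verify that Motive is actually broadcasting on the chosen interface, you can listen for NatNet data with a plain UDP socket. The sketch below is a minimal, hypothetical debugging helper, not a replacement for the plugin or the NatNet SDK; it assumes Motive's default streaming parameters (multicast group 239.255.42.99, data port 1511), so adjust them if you have changed the defaults in the Data Streaming pane.

```python
import socket
import struct

# Motive's default NatNet streaming parameters (assumptions; confirm in the
# Data Streaming pane if the defaults were changed).
NATNET_MULTICAST_GROUP = "239.255.42.99"
NATNET_DATA_PORT = 1511

def make_natnet_data_socket(group=NATNET_MULTICAST_GROUP,
                            port=NATNET_DATA_PORT,
                            join_group=True):
    """Create a UDP socket for receiving Motive's streamed frame data.

    With join_group=True the socket subscribes to the multicast group; for
    Unicast transmission (e.g. over Wi-Fi) set join_group=False and simply
    bind to the data port.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    if join_group:
        # Join the multicast group on the default interface.
        mreq = struct.pack("4s4s", socket.inet_aton(group),
                          socket.inet_aton("0.0.0.0"))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

Calling sock.recvfrom(65535) with a timeout on the returned socket should yield raw frame packets while Motive is streaming; if it times out, re-check the Broadcast Frame Data setting and the selected local interface.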

Unreal Engine Setup (Client)

The next step is to set up the client. Follow the instructions below to install and configure the Unreal Engine plugin to receive the streamed tracking data.

Setting Up the Plugin

  1. Download the plugin ZIP file.
  2. Extract the contents from the ZIP file.
  3. Open the OptiTrack folder and transfer the OptiTrackNatNet and OptiTrackOculusRift folders into the Unreal Engine plugin directory. By default, this directory is located at C:\Program Files\Epic Games\4.##\Engine\Plugins. There will be other plugins installed in this folder already; simply copy and paste the OptiTrack plugin folders into it.
  4. Open/Create a UE4 project.
  5. Under the Edit menu, open the Plugin Panel.
  6. Browse to the Virtual Reality section and enable the OptiTrack plugins (OptiTrack - NatNet Streaming Client and OptiTrack - OculusVR) that were added to the folder.
UE4 plugins panel listing the enabled OptiTrack plugins.


Note: When using the Oculus plugin, make sure to disable the built-in Oculus Rift (or Gear VR) and SteamVR HMD plugins under the Virtual Reality plugins category. Otherwise, the editor may crash, or those plugins may end up with a higher priority than the OptiTrack Oculus plugin and, if enabled, will be used in preference to it.

Client Origin Setup

OptiTrack Client Origin

Once the plugin is properly added, the Client Origin object can be found under the Modes panel in UE4.

The OptiTrack Client Origin enables Unreal Engine (the client) to communicate with the NatNet server from which Motive's rigid body, skeleton, and HMD tracking data are streamed.

Add this object by simply dragging and dropping the OptiTrack Client Origin from the Modes panel into the scene. Once the OptiTrack Client Origin object is placed in the scene, its position and orientation will represent the global origin of Motive in Unreal Engine. In other words, the tracking data will be represented according to where this Client Origin object is positioned and how it is oriented.


Global Origin: Both position and orientation of the OptiTrackClientOrigin will represent the global origin of the tracking volume within Motive.

Connecting Unreal Engine to Motive

  1. [Motive] First of all, make sure the Broadcast Frame Data entry is checked in the Data Streaming pane of Motive.
  2. [UE4] Once the plugin is added and enabled in the project, OptiTrack Client Origin class will be available from the Modes panel.
  3. [UE4] Drag and drop the OptiTrack Client Origin into the scene.
  4. [UE4] Place the OptiTrack Client Origin to the desired tracking origin within the scene.
  5. [UE4] Select the instantiated OptiTrackClientOrigin object from the World Outliner panel.
  6. [UE4] In the Details panel, make sure its Auto Connect setting is checked. This configures the client origin to automatically search the network and connect to Motive.
  7. Now that the client origin is set, it will attempt to connect to Motive and start receiving the tracking data whenever the scene is played.
  • Data streaming settings in Motive
  • OptiTrack Client Origin properties defined with corresponding server and client address.


Connecting to a designated IP address

If you wish to connect to a server on a specific network address, you can uncheck the Auto Connect setting and enter the IP address of the client machine where UE is running, as well as the IP address of the server designated in the Data Streaming pane in Motive. You may need to run the ipconfig command in the command prompt to obtain the appropriate client IP address.
UEP DesignateIP.png


Advanced settings: Auto-initialize

In plugin version 1.8 and above, you can set up the project so that the client origin gets initialized at a user-defined event. This setting is enabled by default, but when there is a need to provide the server IP address dynamically within the game, you can disable the auto-initialize feature and use a blueprint to configure how and when the client origin gets initialized.
UEP AutoInitialize 420.png

Animating Rigid Body

OptiTrack Rigid Body Actor

Actor objects in Unreal Engine can be animated using rigid body tracking data from Motive. Once the OptiTrack - NatNet Streaming Client plugin is enabled in the project, the OptiTrack Rigid Body component will be available to use. By attaching this component to an actor, you can animate its child actors according to the movement of a rigid body in Motive. Each rigid body component is given a Tracking ID value, and the rigid body in Motive with the matching Streaming ID will be used to update the transform of the target actor in Unreal Engine. Please note that the Hide on Invalid Definition component property is set to true by default, so the rigid body actor will be visible within the scene only when the corresponding rigid body data is streamed and received by the client.
You can also use the OptiTrack Rigid Body Actor, which will be available on the Modes panel. This actor is simply a blank actor with just the OptiTrack Rigid Body component attached to it, and you can use this to easily associate the objects within the scene:


  1. [UE4] From the Modes panel, search for OptiTrack Rigid Body Actor. Then drag and drop the actor into the scene.
  2. [UE4] Under this rigid body actor, attach the target actor that you wish to animate.
  3. [UE4] Set the relative location and rotation to zero on this target actor, which should be a child of the rigid body actor at this point.
  4. [Motive] In Motive, assign a value to Streaming ID property for the target rigid body.
  5. [UE4] In the properties of the OptiTrack Rigid Body Actor component, match the Tracking ID with the Streaming ID of the rigid body asset in Motive.
  6. Make sure both Motive and the OptiTrack Client Origin are set up for streaming, then hit Play; the attached actor object will be animated according to the live-streamed rigid body tracking data.
  • Streaming ID of a selected rigid body asset in Motive.
  • Once the OptiTrack NatNet streaming client plugin is properly installed, the OptiTrack Rigid Body component will be available within the components list in UE4.
  • Within the OptiTrack Rigid Body Component, input the Streaming ID for the corresponding rigid body asset in Motive.


RigidBodyComponent Properties

Tracking ID
ID of the rigid body used to derive the position and orientation transform of the attached actor. This ID must match the Streaming ID of the respective rigid body in Motive.
Hide on Invalid Definition
When this is checked, the corresponding rigid body actor will be hidden from the level until the associated rigid body data is streamed out from Motive and received by the plugin.
Disable Low Latency Update
The low latency update feature allows the rigid body position and orientation transform to be updated immediately before rendering, minimizing latency. This is enabled by default. For debugging, you can check this setting to disable this behavior.
Tracking Origin
This sets a specific client origin to use for receiving tracking data. When this is unset, the plugin will default to the first client origin that it finds in the scene.
Respect Parent Transform
When this is set to true, the rigid body transform data from Motive will be applied with respect to the parent actor's pivot coordinates. By default, this is set to false, and all of the tracking data will be applied with respect to the pivot axis of the client origin.
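To make the Respect Parent Transform distinction concrete, the illustrative Python sketch below (not plugin code; the function names are hypothetical) places a streamed rigid body position relative to an anchor transform. With the property unchecked, the anchor is the client origin; with it checked, the anchor is the parent actor's pivot. Only yaw rotation is shown for brevity.

```python
import math

def yaw_rotate(p, deg):
    """Rotate point p = (x, y, z) about the Z (up) axis by deg degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    x, y, z = p
    return (x * c - y * s, x * s + y * c, z)

def world_position(anchor_pos, anchor_yaw_deg, streamed_pos):
    """Place a streamed rigid body position relative to an anchor transform.

    With Respect Parent Transform unchecked, the anchor is the client origin;
    with it checked, the anchor is the parent actor's pivot instead.
    """
    rx, ry, rz = yaw_rotate(streamed_pos, anchor_yaw_deg)
    ax, ay, az = anchor_pos
    return (ax + rx, ay + ry, az + rz)

# Client origin at (100, 0, 0), yawed 90 degrees; Motive streams (0, 50, 0):
print(world_position((100.0, 0.0, 0.0), 90.0, (0.0, 50.0, 0.0)))
```

Moving or rotating the anchor therefore shifts every streamed pose with it, which is exactly why the client origin's placement defines the tracking volume's location in the level.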

Animating Skeleton

Characters in Unreal Engine can be animated using skeleton tracking data from Motive. There are two different approaches to this: skeleton bone mapping and skeleton retargeting.

Bone Mapping

For plugin versions above 1.9.2, the skeleton component can be used to stream skeletons via bone mapping. This means that the motion of each skeleton bone in Motive can be projected onto the bones of the character model. Please note that this is different from skeleton retargeting: the bone segments in Motive get mapped directly to the character model (bone mapping) instead of being translated into something usable by a more abstract biped model (retargeting). The instructions for setting up the skeleton component are documented on the following page:

Go to: Unreal Engine Skeleton Streaming

Skeleton Retargeting

Skeleton tracking data can also be integrated into Unreal Engine via a 3D retargeting process. This can be accomplished through third-party software that retargets the 3D data to produce realistic motion in a scene. For more information, contact us.


Notes for IKinema LiveAction for UE 4.18.

When using IKinema LiveAction for UE 4.18 along with the OptiTrack UE plugin for UE 4.18.3, you will need to replace the NatNet library files (NatNetLib.dll / NatNetLib.lib) included in the IKinema LiveAction plugin directory with the library files included in the OptiTrack UE plugin package. This is required only for this particular version of LiveAction because it uses a slightly different version of the NatNet library in its package. Copy both the LIB and DLL files for the corresponding platform, 32-bit or 64-bit, from the OptiTrack plugin and replace the library files included in the LiveAction plugin package:
  • The OptiTrack UE plugin library files exist under the following folder:
C:\Program Files\Epic Games\UE_4.18\Engine\Plugins\UE_4.18\OptiTrack\OptitrackNatNet\ThirdParty\NatNetSDK\lib
  • The LiveAction plugin library files exist under the following folder:
C:\Program Files\Epic Games\UE_4.18\Engine\Plugins\LiveAction\Source\ThirdParty\NatNet\NatNetSDK\lib\

Drawing Markers in the Scene

When needed, you can also draw streamed labeled marker data from Motive into the scene in UE. In most applications, you do not have to draw the markers into the scene, as the rigid body and skeleton data will be mainly used; however, drawing the markers in the scene may be helpful for debugging purposes. To enable drawing of the markers, expand the OptiTrackClientOrigin component properties and set Draw Markers to true; Labeled Markers in Motive's data streaming pane must be set to true as well.

Labeled markers shown in UE.

Integrating HMD (Oculus DK2, CV1)

To integrate an HMD, make sure the OptiTrack – OculusVR plugin is enabled in your project under the Plugins panel, then follow the steps below to link the HMD tracking data.

UEP OculusVRPlugin.png

Linking the HMD Tracking Data

  1. [UE4] Enable the OptiTrack – Oculus Rift plugin from the plugins panel.
  2. [UE4] Open the World Settings panel under the Window tab.
  3. [UE4] If not done already, create a new GameMode by adding a new blueprint under the Game Mode section.
  4. [UE4] Change the Default Pawn Class to OptiTrackHmdPawn. You may need to restart the editor before this change takes effect.
  5. [UE4] Select the OptitrackClientOrigin and access its settings under the Details panel.
  6. [UE4] Expand the OptiTrack properties section.
  7. [UE4] In the HMD Rigid Body ID entry, input the respective streaming ID of the HMD rigid body in Motive.
  8. [UE4] In the HMD Orientation section, make sure the axis orientation matches the axis orientation of the HMD rigid body in Motive.
  9. [UE4] Now the tracking data is linked to the Unreal Engine project, and the HMD will display the scene when it is played.
  • Defining the default pawn class from World Settings: GameMode in UE4.
  • Inputting the Streaming ID information for tracking the HMD.

Integrating HMD (Android: GearVR)

Client origin configuration for GearVR

The plugin also supports integration of Gear VR HMDs. The HMD tracking data can be streamed over a Wi-Fi network; to do so, a router with sufficient bandwidth must be used. The required bandwidth will vary depending on many factors (e.g. router type, the number of tracked objects, etc.). The following settings must be configured in addition to the above HMD settings for developing Gear VR experiences using the plugin. For more information on developing Gear VR applications in Unreal Engine, refer to the UE4 documentation.


  • [UE 4] For developing Android applications in Unreal Engine, make sure the project environment is set up for Android development: Android Quick Start Guide .
  • [UE4] Follow the GearVR Quick Start Guide and configure the project for GearVR development.
  • [UE4] Generate and include the OSIG file (oculus signature) in the directory. See: Oculus Signature File Generator.
  • [Motive] Under the Data Streaming pane, set the Network Type setting under the Advanced Settings to Unicast. Note that the plugin currently supports only Unicast streaming onto Android; multicasting will be supported in future releases.

Connecting via Wi-Fi

1. [Android] Connect the smartphone to the internet router to which the host PC (server) is connected.
2. [UE4] Configure the Client object.
- Use Connection Type: Unicast.
- Enable Auto Connect, unless there is a need to designate the network address.
3. [UE4] Connect the smartphone to the PC, then build and run the project.
4. Make sure Motive is streaming, and the Gear VR HMD tracking data will be streamed into the UE4 application.


Connecting to a Designated Network

- Set the Local IP address if needed. (The local address is not required for unicast connections.)
- For the Server Address, enter the IP address of the host PC (server). This can also be found from the Android OS when the smartphone is connected to the Wi-Fi router.

HMD Tracking Setup

First of all, set up and optimize the motion capture volume as explained in the Getting Started guide or the Hardware Setup documentation. If you plan to install any obstacles (e.g. walls) within the capture volume, make sure they are non-reflective, and place and orient the cameras so that every corner is thoroughly captured by multiple cameras. For typical rigid body tracking, attach the rigid body markers as instructed in the Rigid Body Tracking page.

General Setup Steps

  1. Attach the markers on the HMD
  2. Create a rigid body asset
  3. Calibrate the Pivot Point of the rigid body
  4. Configure the rigid body settings in Motive

Marker Attachment

There are two different types of markers that can be used to track an object with OptiTrack systems: passive markers and active markers. Passive markers are retroreflective markers that reflect infrared light emitted from the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light themselves. Either type of marker can be used to track HMDs.

Use appropriate adhesives and marker posts to securely attach the markers to the HMD. For active markers, an active Tag must also be attached in order to synchronize and power the LEDs. Oculus CV1 HMDs can use the active CV1 clip enclosure to easily place 8 active markers on the HMD. Depending on which markers you are using to track the HMD, the setup instructions will vary slightly.

Active Markers

Attachment & Placement

With the HMD bracket, simply attach the active marker bracket onto a CV1 HMD and use the active markers on it to define a rigid body.

When using active markers, each marker is individually labeled from its own unique illumination pattern, and only the matching active markers contribute to the rigid body solve. The biggest benefit of the individual labeling is that markers can be placed in perfectly symmetrical arrangements among multiple rigid body assets.

  • HMD markers with passive markers
  • Putting the HMD bracket on
  • When detaching the bracket from the HMD, first pull up the two latches behind the bracket and slowly pull it from the top.

Passive Markers


HMD markers with passive markers

When attaching retroreflective markers, make sure markers are securely attached and readily captured by the cameras. For attaching the markers, we recommend using our 20 mm wide and 30 mm tall M4 threaded plastic marker bases with Acrylic adhesives, available at the webstore, to attach the markers onto the HMD.


A markered HMD will be defined as a rigid body in Motive. When placing markers, make sure the arrangement on the HMD is asymmetrical. Also, the marker arrangements between multiple HMDs must be incongruent. For more details, read about marker placement on the Rigid Body Tracking page.

Create an HMD Rigid Body

Creating Active Clip HMD

Creating an HMD rigid body in the Builder pane.

This feature can be used only with HMDs that have the OptiTrack Active HMD clips mounted.

When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD rigid body is placed at the appropriate location, which is at the root of the nose, in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. It utilizes the known marker configurations on the clip to precisely position the pivot point and set the desired orientation.


HMDs with passive markers can utilize the External Pivot Alignment tool to calibrate the pivot point.


  1. First of all, make sure Motive is configured for tracking active markers.
  2. Open the Builder pane under View tab and click Rigid Bodies.
  3. Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD rigid body.
  4. If the selected markers match one of the Active clips, it will indicate which type of Active Clip is being used.
  5. Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming is +Z forward for Unity and +X forward for Unreal Engine; you can also specify the expected orientation axis on the client plugin side.
  6. Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.
  7. Select the 8 active markers in the 3D viewport.
  8. Click Create. An HMD rigid body will be created from the selected markers and it will initiate the calibration process.
  9. During calibration, slowly rotate the HMD to collect data samples in different orientations.
  10. Once all necessary samples are collected, the calibrated HMD rigid body will be created.

Creating HMD from the markers

In Motive, select the markers that are attached to the HMD and create a rigid body (Hotkey: CTRL + T) as you would do for other rigid bodies. Once this is set up, use the External Pivot Alignment tool in the Builder pane (rigid body → Edit) to calibrate the pivot point of the created rigid body:

When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD rigid body is placed at the appropriate location, which is at the root of the nose, in between the eyes. The External Pivot Alignment tool utilizes both the OptiTrack mocap system and the HMD's own tracking system (Oculus Tracker / Vive Base Stations) to precisely calculate the location of the pivot point of an HMD. When using this feature, both systems must be tracking the HMD simultaneously.
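The idea behind the alignment can be sketched numerically. The toy function below is an illustration, not Motive's actual algorithm: it simply averages the offset between paired pivot positions reported by the mocap system and by the HMD's own tracker, assuming the HMD is held in place. Motive's tool additionally uses rotation samples to express the offset in the rigid body's local frame.

```python
def estimate_pivot_offset(mocap_positions, hmd_positions):
    """Average world-space offset (per axis) between paired pivot samples
    from the two tracking systems. Toy version: assumes the HMD is held
    stationary, so the offset is constant across samples."""
    n = len(mocap_positions)
    return tuple(
        sum(h[i] - m[i] for m, h in zip(mocap_positions, hmd_positions)) / n
        for i in range(3)
    )

# Two paired samples with a constant (1, 2, 0) offset between the systems:
mocap = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
hmd = [(1.0, 2.0, 0.0), (11.0, 2.0, 0.0)]
print(estimate_pivot_offset(mocap, hmd))  # (1.0, 2.0, 0.0)
```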


HMD Compatibility Notes:

  • For Motive versions 2.1 Final and later: Oculus and Vive HMDs are supported.
  • For older versions of Motive: Only Oculus HMDs are supported.
  • The corresponding native HMD tracker must be tracking the HMD along with the motion capture system simultaneously.
  • For CV1 active HMD clips, the rigid body definition will be provided separately.


External Pivot Calibration Tips:

  • Do not move the HMD around during calibration. Keep it at one position and slowly rotate the HMD so that it can be calibrated from different orientations. When rotating, please rotate it slowly.
  • Camera Exposure: Lower the camera exposure setting to the minimum at which the HMD is still tracked well in Motive. This will help prevent IR interference between the VR HMD's own tracking system and the motion capture system during the pivot calibration process. Set this to around 30–100 µs.
  • Frame rate: Lower the system frame rate to 30 FPS for better sampling.
  • Calibration Result: For the best VR experience, please aim for an error result below ~1 mm per marker.
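The ~1 mm/marker target can be read as a mean residual over the markers. Below is a hypothetical helper (not part of Motive) for checking logged expected vs. measured marker positions against that threshold:

```python
import math

def mean_marker_error_mm(expected, measured):
    """Mean distance, in mm, between expected and measured marker positions.
    Aim for a mean below ~1 mm per marker for a good VR calibration."""
    return sum(math.dist(e, m) for e, m in zip(expected, measured)) / len(expected)

expected = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
measured = [(0.0, 0.6, 0.0), (10.0, 0.0, 0.8)]
print(round(mean_marker_error_mm(expected, measured), 3))  # 0.7
```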


  1. First of all, connect the HMD Tracker (Oculus Tracker / Vive Base Stations) and have its corresponding software running in the background.
  2. Launch Motive.
  3. [Motive] Create a rigid body asset from the markers on the HMD.
  4. [Motive] Open the Builder pane and access editing options for rigid bodies.
  5. [Motive] Select the HMD rigid body in Motive.
  6. [Motive → Builder pane] Under Detected Type, it should indicate the type of HMD you will be using.
  7. [Motive → Builder pane] Set the desired number of sample counts (Default: 750) and the desired orientation.
  8. [Motive → Builder pane] Press start to initiate the sampling process.
  9. [Motive → Builder pane] During calibration, make sure both the mocap system and the HMD tracker (Oculus Tracker / Vive Base Stations) are tracking the HMD.
  10. [Motive → Builder pane] Once it starts collecting the samples, point the HMD towards the tracker and slowly rotate it and sample different orientations.
  11. [Motive → Builder pane] When it finishes collecting samples, it will display the offset distance, in mm, between positions of the pivot point detected by the HMD tracking system and the motion capture system.
  12. [Motive → Builder pane] If the results are satisfactory, press Apply to calibrate the HMD rigid body.

Rigid Body Properties

Once the rigid body is created, select the asset and open the Properties pane. The following settings need to be configured for HMD tracking.

Rigid body properties for the HMD asset.


The Deflection setting is the tolerable distance, in millimeters, that a rigid body marker may deviate from its expected position before it is unlabeled and dissociated from the rigid body. The deflection is set to 4 mm by default. For HMD tracking applications, we recommend lowering this value to 3 or 2. This will reduce the amount of computation required for labeling, and overall latency may be reduced.
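The deflection rule can be pictured as a simple distance test. The sketch below illustrates the concept only, not Motive's implementation:

```python
import math

def within_deflection(expected, measured, deflection_mm=4.0):
    """True if the marker is within the deflection tolerance of its expected
    rigid-body position; beyond the tolerance, Motive unlabels the marker."""
    return math.dist(expected, measured) <= deflection_mm

marker = (0.0, 2.5, 0.0)  # 2.5 mm away from its expected spot at the origin
print(within_deflection((0.0, 0.0, 0.0), marker))       # True at default 4 mm
print(within_deflection((0.0, 0.0, 0.0), marker, 2.0))  # False when tightened
```

Tightening the tolerance rejects stray rays and reflections earlier, which is why a lower value can reduce labeling computation for HMD tracking.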

Tracking Algorithm

The tracking algorithm setting determines which protocol to use for solving the rigid body in Motive.
Passive Markers
Use the ray-based tracking algorithm for HMDs with passive markers. This algorithm is recommended because it makes tracking less susceptible to marker occlusions and solves rigid bodies more robustly without introducing trajectory gaps. Ray-based solving increases processing load, but since it substantially stabilizes tracking, it is more suitable for tracking passive-marker HMDs, where even minor occlusions can acutely affect the experience.
Active Markers
Use the marker-based tracking algorithm for HMDs using active marker clips. Ray-based tracking is not supported for active markers since active markers themselves already provide stable tracking from active labeling using unique illumination patterns. Thus, marker-based tracking will provide robust tracking for HMDs with active markers.

Streaming ID

The User Data value for each rigid body is used as an identification number to reference the rigid body in external applications. Log this value for each rigid body (including the HMDs) in the scene. This number will be used to associate the assets in client applications using the plugin.

Min Marker Count (Passive Markers only)

This setting may need to be modified when tracking HMDs with passive markers. To prevent swapping of rigid body definitions, set this to 4. When tracking multiple HMDs, the variety of unique marker arrangements that can be achieved may be limited. If this value is set lower, a set of three markers on one HMD may be congruent to a set on a different HMD, and the rigid body definitions may be switched in Motive.
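Why congruent marker sets cause swaps can be seen from pairwise distances: two arrangements with the same set of inter-marker distances are indistinguishable to a marker-based solver. The sketch below compares arrangements that way; it is an illustration of the geometry, not Motive's labeling algorithm.

```python
import math
from itertools import combinations

def distance_signature(markers, tol_mm=1.0):
    """Sorted pairwise distances, quantized to tol_mm, as a comparable tuple."""
    dists = sorted(math.dist(a, b) for a, b in combinations(markers, 2))
    return tuple(round(d / tol_mm) for d in dists)

def are_congruent(set_a, set_b, tol_mm=1.0):
    return distance_signature(set_a, tol_mm) == distance_signature(set_b, tol_mm)

tri = [(0.0, 0.0, 0.0), (30.0, 0.0, 0.0), (0.0, 40.0, 0.0)]
mirrored = [(0.0, 0.0, 0.0), (-30.0, 0.0, 0.0), (0.0, 40.0, 0.0)]
stretched = [(0.0, 0.0, 0.0), (30.0, 0.0, 0.0), (0.0, 45.0, 0.0)]
print(are_congruent(tri, mirrored))   # True: mirror images share all distances
print(are_congruent(tri, stretched))  # False: a distinct distance signature
```

Note that even mirror-image arrangements share a distance signature, which is why both asymmetry within an HMD and incongruence between HMDs matter.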

Recalibrating Pivot Point

After the rigid body definition has been created for the HMD, the position and orientation of the rigid body pivot point should be placed precisely at the root of the player's nose, with the respective orientation axis pointing in the forward direction. If needed, you can use the Builder pane edit tools to recalibrate the HMD pivot point position.

  • The pivot point is placed on the bridge of the nose; specifically, right at the midpoint between two eyes.
  • Orientation axes of the rigid body should precisely align with the physical HMD component.
  • Uncalibrated HMD rigid body definition. The pivot point is not positioned nor oriented properly.
  • Calibrated HMD rigid body definition. x-axis forward for UE4 integration.

Manually Calibrating the HMD Pivot Point

In most cases, you will be utilizing the previously described calibration tool for precisely positioning and orienting the HMD rigid body. However, if, for some reason, you want to manually position the pivot point without using the HMD Calibration feature, follow the instruction on this page: Manually Calibrating the HMD Pivot Point.

Export the Rigid Body


Tip: Once you have the rigid body asset for the HMD configured, you can export the asset into a MOTIVE file (*.motive) for future use. Importing the MOTIVE file (e.g. CV1.tra) will load the rigid body (HMD) asset and make it available for use; however, the marker placement must remain unchanged in order to re-load previously created rigid bodies.

Live Streaming

Once you have connected the client object and configured the components on the assets in the scene, play the scene. If the properties are correctly configured, the assets in the game scene will be animated according to the position and orientation of the rigid bodies and the HMD in Motive. When the scene is played, the HMD and the assigned actor objects will be placed with respect to their location in Motive's coordinate system, regardless of their position in the editor.

Wireless Multiplayer Setup

When setting up multiplayer games with wireless clients, it is more beneficial for each client to make a direct connection to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game-related actions that interact with the tracking data can then be processed on the game server, which can send out the corresponding updates to the wireless clients. This allows the wireless clients to only receive tracking data and updates without having to send back any information; in other words, it minimizes the number of data transfers needed. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer of data over the wireless network is at risk of latency or lost packets.