
Syncker Plugin

Syncker is a multi-node rendering system that makes it easy to create an immersive CAVE system or a large-scale visualization wall on a computer cluster synchronized over the network in real-time.

Simply connect multiple computers into one high-resolution seamless display over LAN. It does not matter if these computers run Windows, Linux, and Mac OS X at the same time, as Syncker works across different platforms. Moreover, the created virtual environments can have any monitor configuration, since all viewports are fully configurable and multi-monitor grids on slave computers are supported.

Computers in a LAN connected via Syncker

See Also

Syncker API:

  • The Manager interface article for more details on managing Syncker via code (C++, UnigineScript).
  • The Master interface article for more details on managing Syncker master via code (C++, UnigineScript).
  • The Slave interface article for more details on managing Syncker slave via code (C++, UnigineScript).

Structure and Principles

Syncker allows you to render a world across different computers synchronized over LAN. These computers can be of two types:

Master
The master application is the application running on the main computer. Apart from rendering, it also calculates physics and animations, and controls the slaves.
Notice
The main computer should have the most advanced hardware configuration, as it performs most of the calculations.
  • Nodes, materials and cameras are created/deleted and moved according to the application logic.

All other applications are synchronized with the master.

Slaves
Slave applications are all other applications running on computers connected over the network. There can be an unlimited number of slaves connected to one master (as long as the network bandwidth allows it). The purpose of such applications is to render all nodes seen in their viewports.
  • All rendering is done directly on a slave GPU.
  • Physics simulation is performed by the master.
    Notice
    Physics simulation in slave applications should be disabled to improve performance.
  • Objects of the following types are synchronized automatically: ObjectWaterGlobal, ObjectCloudLayer, WorldLight, if they are present in the *.world file on all computers.
  • Animations are updated on the master computer, while slaves only get the calculated bone transformations.
  • Particle systems are updated on the slaves, but according to the parameters sent from the master.
  • Physics simulation in slave applications should be disabled so that it does not override positions and orientations of nodes received over the network from the master.

The configuration of slaves is set up to match the configuration of monitors.

Launching Order

The order of launching master and slave applications does not matter: you can launch several slaves, then the master, and then the rest of the slaves - synchronization will start automatically. This is achieved by keeping a buffer of object creation/deletion calls. Each time a new slave connects to the network, the master tells it which objects are to be created or deleted.
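
The sketch below illustrates this replay mechanism; on_object_created(), on_slave_connected(), and send_create_message_to() are hypothetical names used for illustration only, not the actual Syncker code:

Source code (UnigineScript)
// A simplified illustration of the replay mechanism, not the actual Syncker code.
string create_log[0];	// IDs of objects created by the application so far

void on_object_created(string id) {
	// remember the creation call so that it can be replayed later
	create_log.append(id);
}

void on_slave_connected(int slave) {
	// replay the whole log, so a late-joining slave reproduces the same scene state
	foreach(string id; create_log) {
		log.message("replaying creation of %s for slave %d\n",id,slave);
		// send_create_message_to(slave,id);	// hypothetical network call
	}
}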

Interpolation and Extrapolation

Unigine Syncker uses Interpolated Snapshots (IS) to tackle the problem of lost packets. It works by taking two old but known positions and interpolating the object between them. This is accomplished by keeping a buffer of received positions and rotations along with the time they represent. We take the current local time minus some predefined amount - the interpolation period (40 ms by default) - then go into the buffer, find the two entries just before and just after this time, and interpolate between them.

If we don't have a received position and rotation for the time we're looking for, extrapolation (guessing) is used. It is also limited in time - the extrapolation period (200 ms by default). If the extrapolation period is over but no packets have been received, all objects freeze.

In most cases this method provides a very accurate representation of the world to each of the slaves, since only already known positions of remote objects are rendered, and only in rare cases does the system try to extrapolate (guess) where an object is. This, however, comes at a cost: we always render 40 ms (the interpolation period) behind the current time, so that new packets have time to arrive with data.
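
A minimal sketch of this interpolated-snapshots scheme is given below; Snapshot, buffer, and sample_state() are hypothetical names used for illustration only, not the actual Syncker code:

Source code (UnigineScript)
float INTERPOLATION_PERIOD = 0.04f;	// render 40 ms behind the current time

class Snapshot {
	float time;		// local time the sample corresponds to
	vec3 position;
	quat rotation;
};

Snapshot buffer[0];	// received samples, ordered by time

// returns the object state for the "render time" (current time minus the interpolation period)
Snapshot sample_state(float now) {
	
	if(buffer.size() == 0) return NULL;
	
	float render_time = now - INTERPOLATION_PERIOD;
	
	// find the two samples that bracket the render time and interpolate between them
	for(int i = 0; i < buffer.size() - 1; i++) {
		Snapshot s0 = buffer[i];
		Snapshot s1 = buffer[i + 1];
		if(s0.time <= render_time && render_time <= s1.time) {
			float k = (render_time - s0.time) / (s1.time - s0.time);
			Snapshot result = new Snapshot();
			result.time = render_time;
			result.position = lerp(s0.position,s1.position,k);
			result.rotation = slerp(s0.rotation,s1.rotation,k);
			return result;
		}
	}
	
	// no bracketing samples: fall back to the newest known state (this is where
	// extrapolation starts; Syncker limits it by the extrapolation period, 200 ms by default)
	return buffer[buffer.size() - 1];
}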

Optimized Synchronization

The scene contains both static and dynamic objects, and "dynamic" does not mean that an object's state changes constantly. Taking this into account, Syncker uses the following strategy:

  • For static objects, synchronization packets are sent only on their creation and deletion.
  • For dynamic objects, packets are sent only when their state has changed.

Moreover, Syncker offers flexible customization of the whole synchronization process based on custom user TCP/UDP messages. Instead of sending full transformation data for objects that can only rotate, or huge amounts of transformation data for all parts of complex objects whose positions can be determined by just a handful of simple parameters, you can send a single quaternion or a small set of parameters.
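
As an illustration of the idea (not the actual Syncker API), the sketch below packs a single rotation quaternion for an object that can only rotate, such as a radar antenna; send_user_message() is a hypothetical placeholder for the custom user-message call:

Source code (UnigineScript)
// pack a rotation-only state into a small custom message instead of a full transformation
void sync_antenna_rotation(float angle) {
	
	// rotation of the antenna around the Z axis by the given angle (in degrees)
	quat rot = quat(0.0f,0.0f,1.0f,angle);
	
	Blob blob = new Blob();
	blob.writeFloat(rot.x);
	blob.writeFloat(rot.y);
	blob.writeFloat(rot.z);
	blob.writeFloat(rot.w);
	
	// send_user_message("antenna_rotation",blob);	// hypothetical placeholder call
	
	delete blob;
}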

All packets are compressed before sending (LZ4 or ZLIB). You can choose the desired algorithm and compression quality.

Thus, excessive network load is minimized while performance is kept high.

Two-Way Communication

Slaves are not silent anymore. If a slave fails to receive a message via UDP, it immediately reports this to the master (via TCP) and requests that the message be re-sent. In this case the master sends the current state of all synchronized objects via UDP to perform the update.

Notice
All critical messages are sent via TCP protocol with guaranteed delivery, others are sent via UDP.

Moreover, a slave can control the master and other slaves (e.g. run the profiler on all computers or shut down all applications) by sending the syncker_console command.
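
For example, entering the following in the console of any slave could enable the performance profiler on all computers and then shut all applications down; show_profiler and quit are standard engine console commands, while the exact argument syntax of syncker_console shown here is an assumption:

Console commands
syncker_console "show_profiler 1"
syncker_console "quit"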

Multiple Cameras and Multi-Monitor Slaves

Syncker allows you to synchronize views from multiple cameras. There are two types of cameras:

  • Main master camera - a single camera that corresponds to the main viewer's position. The configuration mesh determines viewports relative to this camera. Example: a camera in the plane's cockpit, corresponding to the pilot's point of view.
  • Auxiliary camera - an additional camera (static or dynamic) that can be placed anywhere in the scene. You can have as many cameras of this type as necessary. Example: a ground-based surveillance camera or a thermal imaging camera mounted on the plane's wing.
By default, the main master camera is used. You can create auxiliary cameras and specify their viewports to be displayed by selected slaves. Such cameras will be synchronized automatically.

Syncker offers you extreme flexibility of viewport configuration. You can use up to 6 monitors on a single slave, each with its own viewport assigned. For this purpose you will have to use the AppWall plugin.

Screen Configurations

The screen configuration for Syncker is determined by the configuration mesh, which has to meet the following requirements:

  • Each surface of the mesh should correspond to one of the monitors.
  • Each surface of the mesh must be represented as a rectangle.
  • Coordinate system of the mesh must be expressed in meters.
  • Position and orientation of each monitor must be relative to the main master camera. The closer the monitor is to the viewer, the higher its FOV value (see the note right after this list).
  • The pivot point of the mesh must correspond to the viewer's position.
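
As a side note, the horizontal FOV of a flat screen facing the viewer follows from simple trigonometry; the generation script shown further below uses the same relation:

Source code (UnigineScript)
// horizontal FOV (in degrees) of a flat screen of the given width (in meters)
// placed at the given distance (in meters) in front of the viewer
float screen_fov(float width,float distance) {
	return atan(width * 0.5f / distance) * 2.0f * RAD2DEG;
}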

Two types of configuration mesh can be used:

  • Standard configuration mesh. The pivot point of such a mesh corresponds to a fixed camera position.
    Notice
    Projection and modelview matrices are automatically calculated on the basis of the camera position.

    Screen configuration (the right picture) represented by a mesh (the left picture)
  • CAVE (Cave Automatic Virtual Environment) configuration mesh. This is a special case of a mesh created in a 3rd-party 3D editor (see the previous item). The pivot point of such a mesh corresponds to the origin of coordinates for the viewer's head position and is automatically updated as this position changes.

Syncker Pipeline

To connect computers and synchronize them, the following pipeline is used.

  1. After being launched, the master starts broadcasting messages over UDP to announce that it is available.
  2. After being launched, all slaves in the LAN listen to the selected UDP port. Upon receiving this message, they connect to the master via TCP.
    Notice
    Actually 2 TCP ports are used:
    • TCP port - for messages.
    • TCP ping port - for latency measurement (to ensure correct interpolation).
  3. After the connection is established, both UDP and TCP are used to exchange data between the master and the slaves.
    • Non-critical messages are sent over UDP. These include messages on render parameters (for example, ambient color or HDR exposure) and on positions of the player and of all synchronized nodes and materials. They also include frame-related information (whether the game is enabled, the current frame number, the frame duration, the time scale and even the game seed for random number generators).
    • Critical messages are sent over TCP. These are messages on creation and deletion of nodes, as well as user messages.

Using Syncker

The basic workflow is as follows:

  1. Implement Syncker logic for your master and slave application (you can use the same application on both master and slave sides).
  2. Prepare a configuration mesh.

    Basically, there are two ways you can create a configuration mesh:

    • Using a 3rd-party 3D editor.

      In this case you can choose any 3rd-party 3D editor and create a mesh meeting the requirements above.

    • Using a generation script (UnigineScript).

      In this case you should write a special script that generates the mesh. Here is an example of such a script that creates a configuration mesh for 3 viewports:

      Source code (UnigineScript)
      #!/usr/bin/env usc
      // uncomment the following line to angle the side screens towards the viewer
      //#define ROTATE_SCREENS
      
      float screen_bezels[] = ( 0.01f, 0.01f, 0.01f, 0.01f );	// bottom, right, top, left
      
      // creates a rectangular screen surface (one monitor) of the given size, taking screen bezels into account
      int create_display(Mesh mesh,string name,float w,float h,float bezels[]) {
      	
      	int num = mesh.addSurface(name);
      	
      	mesh.addVertex(vec3(0.0f,-w * 0.5f + bezels[3],-h * 0.5f + bezels[0]),num);
      	mesh.addVertex(vec3(0.0f, w * 0.5f - bezels[1],-h * 0.5f + bezels[0]),num);
      	mesh.addVertex(vec3(0.0f, w * 0.5f - bezels[1], h * 0.5f - bezels[2]),num);
      	mesh.addVertex(vec3(0.0f,-w * 0.5f + bezels[3], h * 0.5f - bezels[2]),num);
      	
      	mesh.addIndex(0,num);
      	mesh.addIndex(1,num);
      	mesh.addIndex(3,num);
      	mesh.addIndex(1,num);
      	mesh.addIndex(2,num);
      	mesh.addIndex(3,num);
      	
      	mesh.createTangents();
      	mesh.createBounds(num);
      	
      	return num;
      }
      
      void main() {
      	
      	Mesh mesh = new Mesh();
      	
      	// distance from front display and player's head (meters)
      	float player_distance = 0.35f;
      	
      	// size of displays (meters)
      	float width = 0.7f;
      	float height = 0.392f;
      	
      #ifdef ROTATE_SCREENS
      	float rad = atan(width * 0.5f / player_distance) * 2.0f;
      	
      	int c1 = create_display(mesh, "0", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -cos(-rad) * player_distance, sin(-rad) * player_distance, 0.f) * rotateZ(rad * RAD2DEG), c1 );
      	int c2 = create_display(mesh, "1", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -player_distance, 0.0f, 0.f), c2 );
      	int c3 = create_display(mesh, "2", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -cos(rad) * player_distance, sin(rad) * player_distance, 0.f) * rotateZ(-rad * RAD2DEG), c3 );
      #else
      	int c1 = create_display(mesh, "0", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -player_distance, -width, 0.0f), c1 );
      	int c2 = create_display(mesh, "1", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -player_distance, 0.0f, 0.0f), c2 );
      	int c3 = create_display(mesh, "2", width, height, screen_bezels);
      	mesh.setSurfaceTransform( translate( -player_distance, width, 0.0f), c3 );
      #endif
      	
      	mesh.save("../data/views.mesh");
      	mesh.save("views.mesh");
      }

      When your script is ready, you can use it to generate the mesh:

      Shell commands
      usc_x64.exe mesh_generator.usc

      The following mesh will be generated:

  3. Prepare your environment.

    It is recommended to use at least a 1 Gb LAN. Otherwise, you may experience network lags (see the Troubleshooting section).

    All applications you use must have access to all Unigine files and project data. So, you should copy your project to all computers. If some nodes are missing in the world file on a local computer, they will not be rendered.

  4. Run the master application.

    To run the master application, you should provide the necessary startup command-line options, e.g.:

    Shell commands
    <your_app_name> -extern_plugin "Syncker" -sync_master 1 -sync_broadcast_address xxx.xxx.xxx.xxx -sync_mesh <mesh_name>.mesh -sync_view <mesh_surface_name>
    Notice
    The order of launching master and slave applications does not matter.
  5. Run a slave application.

    To run a slave application, you should provide the necessary startup command-line options, e.g.:

    Shell commands
    <your_app_name> -extern_plugin "Syncker" -sync_master 0 -sync_slave_name <slave_name> -sync_mesh <mesh_name>.mesh -sync_view <mesh_surface_name>

    If you want a slave with a multi-monitor configuration to render several viewports, you should use the AppWall plugin and assign viewports to monitors via the -sync_view_N options, e.g.:

    Shell commands
    <your_app_name> -extern_plugin "AppWall,Syncker" -sync_master 0 -sync_slave_name <slave_name> -sync_mesh <mesh_name>.mesh -sync_view_0 <mesh_surface_0> -sync_view_1 <mesh_surface_1>
    Notice
    The order of plugins in the list matters: AppWall must be specified before Syncker.

Troubleshooting

If the network latency is too high despite 1 Gb bandwidth or higher, it can be caused by a 100 Mb or 10 Mb device connected to the network. The data exchange rate will drop to the maximum rate supported by such a device, slowing down the Syncker connection.

  • Some 100 Mb or 10 Mb devices can keep their network interface active even when they are turned off.
  • It is also possible that, when turned off, 1 Gb devices keep a network interface working at a 100 Mb rate, which will slow down the connection in the LAN.

Useful Tool

If you have a source SDK, you can use a simple tool to monitor the network message exchange rate: server.usc, found in source/tools/Interpreter/scripts/network/.

Last update: 2017-10-20