Getting Started with VR

Notice
This article covers the creation of C++ VR projects only. You can switch to the C# version in the upper right corner of the page.

This article is for anyone who wants to start developing Virtual Reality projects in UNIGINE and is highly recommended for all new users. We're going to look into the VR Sample demo to see what's inside and learn how to use it to create our own VR project. We're also going to consider some simple examples of modifying and extending the basic functionality of this sample.

So, let's get started!

VR Sample#

With VR developers in mind, we created the VR Sample demo, enabling you to jump straight in and start creating projects of your own. We recommend using this demo as the basis for your VR project.

The demo is based on the VR Template, which supports all SteamVR-compatible devices out-of-the-box and provides automatic loading of controller models at run time. Additionally, the template includes the implementation of basic mechanics such as grabbing and throwing objects, pressing buttons, opening/closing drawers, and a lot more.

Notice
The world in this sample project has its settings optimized for the best performance in VR and includes a .render asset that can be loaded at any time by simply double-clicking it in the Asset Browser, resetting any changes you have made back to the default optimized values.

The sample project is created using the Component System, so the functionality of each object is determined by the components attached.

You can extend an object's functionality simply by adding more components. For example, the laser pointer object combines several components that define how it can be grabbed, thrown, and used.
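
To illustrate this component-centric approach, here is a minimal sketch of querying a node's component at run time; it reuses the ObjMovable component and the ComponentSystem::get()->getComponent<T>() call shown later in this article, while the node name "laser_pointer" is only a hypothetical example.

Source code (C++)
#include "Framework/Components/Objects/ObjMovable.h"
#include <UnigineWorld.h>

using namespace Unigine;

// ...

// a hypothetical node name, used here for illustration only
NodePtr pointer_node = World::getNodeByName("laser_pointer");

// the Component System returns the logic attached to the node:
// if the movable property is assigned to it, an ObjMovable instance is available
ObjMovable *movable = ComponentSystem::get()->getComponent<ObjMovable>(pointer_node);
if (movable)
{
	// the node supports the grabbing/throwing mechanics provided by ObjMovable
}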

1. Making a Template Project#

So, we have a demo with some useful content inside, but how do we use it as a template? It's simple: open your SDK Browser, go to the Samples tab, and select Demos.

Find the VR Sample in the Available section and click Install. After installation, the demo will appear in the Installed section, and you can click Copy as Project to create a project based on this sample.

In the Create New Project window that opens, enter the name for your new VR project in the corresponding field and click Create New Project.

2. Setting Up a Device and Configuring the Project#

Suppose you have successfully installed your Head-Mounted Display (HMD) of choice.

Learn more on setting up devices for different VR platforms. If you have any difficulties, please visit Steam Support. For Vive devices, this Troubleshooting guide might be helpful.

Notice
For Mixed Reality application development, you will need to download and install Varjo Base in addition to SteamVR.

All SteamVR-compatible devices are supported out-of-the-box.

By default, VR is not initialized. So, you need to perform one of the following:

  • If you run the application via UNIGINE SDK Browser, set the Stereo 3D option to the value that corresponds to the installed HMD (OpenVR or Varjo) in the Global Options tab and click Apply.

  • If you run the application from the command line, specify the -vr_app command-line option at application start-up. For OpenVR and Oculus, it should be as follows:
    Shell commands
    your_app_name -vr_app openvr
    For Varjo:
    Shell commands
    your_app_name -vr_app varjo
    Alternatively, you can specify this command-line option in the Customize Run Options window when running the application via the SDK Browser:

Notice
The integration of OpenVR functionality in the UNIGINE VR system enables the implementation of applications for Oculus HMDs. That's why openvr should be specified when using Oculus devices.

3. Opening the Project's Source#

To open your VR project in an IDE, select it on the Projects tab of the UNIGINE SDK Browser and click Open Code IDE.

When the IDE opens, you can see that the project contains a lot of different classes. This brief overview will give you a hint of what they are all about.

Don't forget to set the appropriate platform and configuration settings for your project before compiling your code in Visual Studio.

Now, we can try and build our application for the first time.

Build your application in Visual Studio (Build → Build Solution) or otherwise, and launch it by selecting the project on the Projects tab of the UNIGINE SDK Browser and clicking Run.

Before running your application via the UNIGINE SDK Browser, make sure that the appropriate Customize Run Options (the Debug version in our case) are selected by clicking the ellipsis under the Run button.

4. Attaching Objects to HMD#

Sometimes it might be necessary to attach an object to the HMD so that it follows it (e.g., a hat). All movable objects (i.e., objects having the movable.prop property assigned and the Dynamic flag enabled) have a switch enabling this option.

For example, if you want to make the cylinder on the table attachable to the HMD, just select the corresponding node named "cylinder" in the World Hierarchy, click Edit in the Reference section, and enable the Can Attach to Head option.

Then select the parent node reference, and click Apply.

The same thing can be done via code at run time:

Source code (C++)
#include "Framework/Components/Objects/ObjMovable.h"
#include <UnigineWorld.h>

using namespace Unigine; 

...

// retrieving a NodeReference named "cylinder" and getting its reference node
NodeReferencePtr node_ref = checked_ptr_cast<NodeReference>(World::getNodeByName("cylinder"));
if (node_ref)
{
	NodePtr node = node_ref->getReference();

	// checking if this node is a movable object by trying to get its ObjMovable component
	ObjMovable *obj = ComponentSystem::get()->getComponent<ObjMovable>(node);
	if (obj != nullptr)
	{
		// making the object attachable to the HMD
		obj->can_attach_to_head = 1;
	}
}

5. Accessing Mixed Reality Features (Optional)#

Notice
The Mixed Reality features are available out-of-the-box for applications running on Varjo headsets (Varjo XR-3 and Varjo XR-4). The Varjo Mixed Reality features supported by the UNIGINE VR system are listed here.

To start developing a mixed reality application in UNIGINE, you should set up the environment as described above. Do not forget to install Varjo Base and ensure that VR is initialized successfully.

The VR Template provides the MixedRealityMenuGui property assigned to the head_menu GUI node. It demonstrates the operation of the available Mixed Reality settings: you can tweak them to see how they affect the rendered image.

Via this menu, you can toggle the video signal from the real-world view and adjust various camera settings, such as white balance correction, ISO, and others. The widgets for the menu are initialized at run time, but the node itself is created via UnigineEditor.

To manage mixed reality, use methods of the VRMixedReality and VRMarkerObject classes of UNIGINE API.

6. Accessing Eye-Tracking Feature (Optional)#

Notice
Eye-tracking is also available out-of-the-box for applications running with Varjo headsets.

In the VR Sample, the eyetracking_pointer property is assigned to the node of the same name; it shows the name of the node towards which the gaze is directed. The implementation of the property is available in the vr_sample/Demo/Global folder. You can extend or change this functionality by modifying the EyetrackingPointer component.
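
If you only need to hide the pointer at run time, here is a minimal sketch; it assumes the node keeps the eyetracking_pointer name mentioned above and that disabling the node is enough to suppress the pointer and its component logic.

Source code (C++)
#include <UnigineWorld.h>

using namespace Unigine;

// ...

// retrieving the node the eyetracking_pointer property is assigned to
NodePtr eyetracking_node = World::getNodeByName("eyetracking_pointer");
if (eyetracking_node)
{
	// disabling the node to hide the gaze pointer; re-enable it with setEnabled(1) when needed
	eyetracking_node->setEnabled(0);
}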

7. Attaching Objects to Controllers#

If you need to attach an object loaded at run time to a controller (e.g., a menu), you can assign the AttachToHand property to this object.

For example, if you have a GUI object and want to attach it to the controller, select this object in the World Hierarchy, click Add New Property in the Parameters window, and specify the AttachToHand property.

The property settings allow specifying the controller to which the object should be attached (either left or right), as well as the object's transformation.

In the VR Sample, this component is attached to the hand_menu node and initializes its widgets at run time.
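
For objects spawned at run time, the same assignment can be sketched in code. This is only a sketch under two assumptions: your SDK version exposes Node::addProperty(), and the Component System used in the template picks up properties assigned at run time; the node name "runtime_menu" is hypothetical.

Source code (C++)
#include <UnigineWorld.h>

using namespace Unigine;

// ...

// a hypothetical node loaded at run time (e.g., a menu object)
NodePtr menu_node = World::getNodeByName("runtime_menu");
if (menu_node)
{
	// assigning the AttachToHand property so that the corresponding component is attached;
	// the property name must match the .prop asset in your project
	menu_node->addProperty("AttachToHand");
}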

8. Switching Nodes by Gesture (Optional)#

When the application is launched with the Ultraleap integration plugin, you can control the objects with your hands.

Notice
Hand-tracking in VR is available for Varjo VR and XR headsets.

The VR Template provides the NodeSwitchEnableByGesture property assigned to the vr_layerVRUltraleap dummy node; it is available when using a VR device with hand-tracking support. The property settings allow specifying the number of nodes you can switch between, the nodes to switch, and the gesture type used for switching.

When you hold your left wrist with your right hand, the menu appears:

9. Adding a New Interaction#

Suppose we want to extend the functionality of the laser pointer in our project, which we can currently grab, throw, and use (turn on), by adding an alternative use action: changing the material of the object being pointed at when a certain button is pressed.

So, we're going to add a new altUseIt() method to the VRInteractable class for this new action and map it to the state of a certain controller button.

VRInteractable.h

Source code (C++)
#pragma once
#include <UnigineNode.h>
#include <UniginePhysics.h>
#include "../Framework/ComponentSystem.h"
#include "Players/VRPlayer.h"

using namespace Unigine;
using namespace Math;

class VRPlayer;

class VRInteractable : public ComponentBase
{
public:
	// ... 

	// interact methods
	virtual void grabIt(VRPlayer* player, int hand_num) {}
	virtual void holdIt(VRPlayer* player, int hand_num) {}
	virtual void useIt(VRPlayer* player, int hand_num) {}
	virtual void altUseIt(VRPlayer* player, int hand_num) {} //<-- method for the new alternative use action
	virtual void throwIt(VRPlayer* player, int hand_num) {}
};

Declare and implement an override of the altUseIt() method for our laser pointer in the ObjLaserPointer.h and ObjLaserPointer.cpp files respectively:

ObjLaserPointer.h

Source code (C++)
#pragma once
#include <UnigineWorld.h>
#include "../VRInteractable.h"

class ObjLaserPointer : public VRInteractable
{
public:
	// ... 

	// interact methods
	// ...
	// alternative use method override
	void altUseIt(VRPlayer* player, int hand_num) override;

	// ...

private:
	// ...
	int change_material;	//<-- "change material" state

	// ...
};

ObjLaserPointer.cpp

Source code (C++)
// ...

void ObjLaserPointer::init()
{
	// setting the "change material" state to 0
	change_material = 0;

	// ...
}

void ObjLaserPointer::update()
{
	if (laser->isEnabled())
	{
		// ...
		// show text

		if (hit_obj && hit_obj->getProperty() && grabbed)
		{
			//---------CODE TO BE ADDED TO PERFORM MATERIAL SWITCHING--------------------
			if (change_material) // if the "alternative use" button was pressed
			{
				// change the object's material to mesh_base
				hit_obj->setMaterialPath("Unigine::mesh_base", "*");
			}
			//---------------------------------------------------------------------------
			// ...
		}
		else
			obj_text->setEnabled(0);
	}
	// unsetting the "change material" state
	change_material = 0;
}

// ...

// alternative use method override
void ObjLaserPointer::altUseIt(VRPlayer* player, int hand_num)
{
	// setting the "change material" state
	change_material = 1;
}

// ...

Now, we're going to map this action to the state of the YB controller button. For this purpose, we should modify the VRPlayer class (the base class for all VR players) by adding the following code to its postUpdate() method:

VRPlayer.cpp

Source code (C++)
// ...

void VRPlayer::postUpdate()
{
	for (int i = 0; i < getNumHands(); i++)
	{
		int hand_state = getHandState(i);
		if (hand_state != HAND_FREE)
		{
			auto &components = getGrabComponents(i);

			// ...
			//-------------CODE TO BE ADDED--------------------------
			// alternative use of the grabbed object
			if (getControllerButtonDown(i, BUTTON::YB))
			{
				for (int j = 0; j < components.size(); j++)
					components[j]->altUseIt(this, i);
				// add callback processing if necessary
			}
			//--------------------------------------------------------
		}
	}
	update_button_states();
}

// ...

10. Adding a New Interactable Object#

The next step in extending the functionality of our VR Sample is adding a new interactable object.

Let's add a new type of interactable object that we can grab, hold, and throw, with an additional feature: the object will change its form (to a certain preset) when we grab it and restore it back when we release it. It will also print certain text to the console if the corresponding option is enabled.

So, we're going to use the following components:

  • ObjMovable - to enable the basic grabbing and throwing functionality
  • a new ObjTransformer component - to enable the form-changing and log-message printing functionality

The following steps are to be performed:

  1. Add a new ObjTransformer class inherited from VRInteractable. In Visual Studio, we can do it by choosing Project → Add Class from the main menu, clicking Add, specifying the class name and base class in the window that opens, and clicking Finish:

  2. Implement the functionality of transforming into the specified node on grabbing and restoring the previous form on releasing the node.

    Below you'll find the header and implementation files for our new ObjTransformer class:

    ObjTransformer.h

    Source code (C++)
    #pragma once
    #include <UnigineNode.h>
    #include "Components/VRInteractable.h"
    #include "Framework/Utils.h"
    class ObjTransformer :
    	public VRInteractable
    {
    	public:
    		ObjTransformer(const NodePtr &node, int num) : VRInteractable(node, num) {}
    		virtual ~ObjTransformer() {}
    
    		// property name
    		UNIGINE_INLINE static const char* getPropertyName() { return "transformer"; }
    
    		// parameters
    		PROPERTY_PARAMETER(Toggle, show_text, 1);			// Flag indicating if messages are to be printed to the console
    		PROPERTY_PARAMETER(String, text, "TRANSFORMATION");	// Text to be printed to the console when grabbing or releasing the node
    		PROPERTY_PARAMETER(Node, target_object);			// Node to be displayed instead of the transformer-node, when it is grabbed
    
    		// interact methods
    		void grabIt(VRPlayer* player, int hand_num) override;	// override grab action handler
    		void throwIt(VRPlayer* player, int hand_num) override;	// override throw action handler
    		void holdIt(VRPlayer* player, int hand_num) override;	// override hold action handler
    
    	protected:
    		void init() override;
    };

    ObjTransformer.cpp

    Source code (C++)
    #include "ObjTransformer.h"
    
    REGISTER_COMPONENT( ObjTransformer );		// macro for component registration by the Component System
    
    // initialization
    void ObjTransformer::init(){
    				
    	// hiding the target object (if any)
    	if (target_object){
    		target_object->setEnabled(0);
    	}
    }
    
    // grab action handler
    void ObjTransformer::grabIt(VRPlayer* player, int hand_num)
    {
    	// if a target object is assigned, showing it, hiding the original object and displaying a message in the log
    	if (target_object){
    		target_object->setEnabled(1);
    
    		// hide original object's surfaces without disabling components
    		ObjectPtr obj = checked_ptr_cast<Object>(node);
    		for (int i = 0; i < obj->getNumSurfaces(); i++)
    			obj->setEnabled(0, i);
    				
    		if (show_text)
    			Log::message("\n Transformer's message: %s", text.get());
    	}
    }
    
    // throw action handler
    void ObjTransformer::throwIt(VRPlayer* player, int hand_num)
    {
    	// if a target object is assigned, hiding it, and showing back the original object
    	if (target_object){
    		target_object->setEnabled(0);
    					
    		// show original object's surfaces back
    		ObjectPtr obj = checked_ptr_cast<Object>(node);
    		for (int i = 0; i < obj->getNumSurfaces(); i++)
    			obj->setEnabled(1, i);
    	}
    }
    
    // hold action handler
    void ObjTransformer::holdIt(VRPlayer* player, int hand_num)
    {
    	// if a target object is assigned, making it follow the hand holding the node
    	if (target_object)
    		target_object->setWorldPosition(player->getHandNode(hand_num)->getWorldPosition());
    }
  3. Build your application and launch it as we did earlier; a new property file (transformer.prop) will be generated for our new component.
  4. Open the world in the UnigineEditor, create a new box primitive (Create → Primitive → Box) and place it somewhere near the table, then create a sphere primitive (Create → Primitive → Sphere) to be used for transformation.
  5. To add components to the box object, select it and click Add New Property in the Node Properties section, then drag the movable.prop property to the new empty field that appears. Repeat the same for the transformer.prop property, and drag the sphere from the World Hierarchy window to the Target Object field.

  6. Save your world and close the UnigineEditor.
  7. Launch your application.

11. Restricting Teleportation#

By default, it is possible to teleport to any point in the scene. To avoid user interaction errors in VR (e.g., teleporting into walls or ceilings), you can restrict teleportation to certain areas. To do so, perform the following:

  1. Create a mesh defining the area you want to restrict user teleportation to.
  2. Set an intersection mask to the desired surface(s) of this mesh either in the UnigineEditor or using the setIntersectionMask() method:
    Source code (C++)
    // defining the teleportation mask as a hexadecimal value (e.g. with only the last bit enabled)
    int teleport_mask = 0x80000000;
    
    // setting the teleportation mask to the MyAreaMesh object's surface with the num index
    MyAreaMesh->setIntersectionMask(num, teleport_mask);
  3. Set the same intersection mask for the teleport ray using the following method: VRPlayerVR::setTeleportationMask(teleport_mask).
Notice
Multiple meshes can be used to define the teleportation area; a consolidated code sketch is given below.
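
For reference, here is a consolidated sketch of steps 2 and 3; it assumes MyAreaMesh is an ObjectPtr pointing to your restriction mesh and player is a pointer to your VRPlayerVR-based player, and it simply applies the same mask to every surface of the mesh.

Source code (C++)
// defining the teleportation mask (only the highest bit enabled)
int teleport_mask = 0x80000000;

// the same setIntersectionMask() call as in step 2, applied to every surface of the mesh
for (int i = 0; i < MyAreaMesh->getNumSurfaces(); i++)
	MyAreaMesh->setIntersectionMask(i, teleport_mask);

// making the teleport ray use the same mask
player->setTeleportationMask(teleport_mask);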

Where to Go From Here#

Congratulations! Now you know how to create your own VR project on the basis of the VR Sample demo and extend its functionality, so you can continue developing it on your own. Here are some recommendations that might be useful:

  • Analyse the source code of the sample further to figure out how it works, and use it as a reference when writing your own.
  • Read the Virtual Reality Best Practices article for more information and useful tips on preparing content for VR and making user experience better.
  • Read the Component System article for more information on working with the Component System.
  • Check out the Component System Usage Example for more details on implementing logic using the Component System.
Notice
You can select the VR project template when creating a new application via the SDK Browser to create an empty VR application from scratch (with the demo content and code stripped off).
Last update: 2024-04-04