This article covers UNIGINE-specific terminology and concepts and is highly recommended for all new users.
How Are Virtual Worlds Organized?#
When you create an application in UNIGINE, it is represented by a project. A project is a "container" for the application's code, content, and meta-data.
A project can consist of one or several complex 3D scenes, which are called worlds.
All projects are managed via the UNIGINE SDK browser.
A UNIGINE-based virtual world is a 3D scene that contains a set of different scene graph nodes (e.g. static meshes, lights, cameras, etc.) placed into specified positions, and global settings (rendering, physics, etc.) applied to the whole scene.
A scene graph in UNIGINE is a multi-root tree (hierarchy) of nodes.
Each world is represented by an XML file with the .world extension.
When you create a new project via the SDK Browser, a new world is also created with the same name as your project.
To edit your existing worlds and create new ones, use the UnigineEditor. You can create and use as many worlds as you need.
In terms of UNIGINE, all objects added to the scene are called nodes. Nodes can be of different types, determining their visual representation and behavior.
There is a rich set of built-in node types. Though this set covers almost all required cases, it can be manually extended by the user.
You can also extend basic functionality of any node by adding components to it.
Every node has a transformation matrix, which encodes position, rotation, and scale of the node in the world.
Some node types have a visual representation: Objects, Decals, and Effects. Other node types (Light Sources, Players, etc.) are invisible.
By default, node parameters are stored in the .world file, but they can also be saved into a separate XML file with the .node extension (and later referenced from the .world file via special nodes called Node References, which are used for node instancing).
One of the most important node types is Object. Objects represent imitations of entities existing in the real world: objects (people, trees, cars, planes, etc.), sky, terrains, water, and so on. Objects have a single surface or a set of surfaces defining their visual appearance. They can have a shape to represent the volume they occupy in 3D space and a body to participate in physical interactions.
A mesh is a collection of polygons defining the object's geometry. It has the following properties:
- Each mesh has one or several surfaces.
- Maximum number of polygons per surface is 2,147,483,647 (ObjectMeshStatic, ObjectMeshSkinned) or 65,535 (ObjectMeshDynamic).
- There are 2 UV channels for texturing.
- Mesh supports vertex colors.
Animation in UNIGINE can be performed by skinned meshes (bone-based), morph targets (keyframe) or dynamic meshes (code-controlled).
At run time, meshes are stored in proprietary UNIGINE formats with .mesh (static mesh + optional animation data) and .anim (external animation data) extensions.
When importing an FBX model to the Engine, it is automatically converted to the .mesh format.
A surface is a named non-overlapping subset of the object geometry (i.e. object mesh). Each surface can have its own material or property assigned. It is also possible to enable/disable surfaces independently from each other.
Surfaces can be organized in a hierarchy within a mesh (it can be used for LOD switching).
In the picture below, a soldier 3D mesh consists of 4 surfaces: eyes, skin, bags (body armor, radio set, bag), and body (overalls, shoes, hard hat, gloves).
In terms of UNIGINE, a material is a rule defining how the surface will look: its interaction with lights, reflection parameters, etc. It is based on:
- Vertex, fragment and geometry shaders that actually draw the material based on different conditions.
- User-defined textures passed to shaders.
- States that specify conditions, based on which the appropriate shaders will be applied.
- Parameters defining how shaders will be used.
UNIGINE provides a rich set of built-in base materials out of the box. The recommended way is to inherit a new user material from a base one and tweak it. You can also create custom shaders either using UUSL (Unified UNIGINE Shader Language) or HLSL/GLSL, but in the latter case you would need to migrate your custom shaders with every SDK release by yourself.
In addition to regular materials applied to certain surfaces, there is a special type of materials called post materials, which are applied on top of the final screen composition (e.g. to create a night or thermal vision effect). You can also create your own post-effects with custom post-materials and custom shaders.
Materials are organized in a hierarchy with parameter inheritance and overriding (much like in object-oriented programming). When a material is inherited, all its parameters are inherited from the parent. If a parameter value is changed in the parent, it is automatically changed in the child as well, unless it was overridden (set to a different value) in the child before that.
Example: material A has two parameters (color: blue, specular power: 1.4); material B is inherited from material A and has the color parameter overridden (red). If we change the specular power to 2.0 in material A, material B will then have the following parameters: red color (overridden) and a specular power of 2.0 (inherited).
Parameter inheritance makes it very convenient to control the parameter values of multiple materials at once.
A property is a "material" for application logic. It specifies the way the object will behave and interact with other objects and the scene environment. Properties can have parameters of various types — from a simple integer representing your character's hit points, to node, material, file (for textures, meshes, sounds, etc.), or property, which simplifies access to various resources.
Properties can be used to build components to extend the functionality of nodes.
Properties, like materials, are organized in a hierarchy with parameter inheritance. But, unlike materials, they can be applied either per-surface or per-node.
How to Add Content to the Virtual World?#
Every piece of content that can be used in your world or project is an asset. An asset may come from a file created in a third-party application, such as a 3D model, an audio file, an image, or any other file type supported by the UNIGINE Engine.
The main front-end tool of the Asset System is the Asset Browser in UnigineEditor. It is used to organize content in your project: create, import, view, and rename your assets, move them between folders, and manage their hierarchy.
Each time an asset is created or imported, UnigineEditor does all the necessary work, including conversion of your assets (be it a JPG texture or an FBX model) to its native format (such as compressed .dds textures, .mesh geometry, .anim animations, etc.) to be used by the Engine at run time. Such files, generated as a result of conversion, are called "run-time files", and they are updated by UnigineEditor each time you modify the corresponding asset.
It is recommended that you familiarize yourself with the Assets Workflow to learn all the details.
UnigineEditor allows you to assemble a virtual world: import and set up nodes, assign materials and properties to them, set up lighting, adjust global settings (physics, rendering, etc.), and more. It features a What You See Is What You Get approach: you can instantly see the scene at final quality (as at run time).
Watch the tutorial below to learn how to import 3D models to UNIGINE:
How Do We See the Virtual World?#
For visual representation, UNIGINE uses a standard perspective projection; an orthographic projection is also available.
In UNIGINE, the way the world is seen is based on three entities:
- A camera is a structure containing 2 matrices: view and projection. Through this structure, you set the camera parameters: field of view, masks, near and far clipping planes, and post materials. Then, the camera is passed to a Viewport that will render an image from this camera. The camera is assigned to a Player, which will further control its position.
- A viewport receives a camera and renders an image from it on the screen. In addition, it provides all functions of the main renderer, for example, cube maps rendering, stereo rendering, panoramic rendering and so on.
- A player is a node controlled via input devices (keyboard, mouse, joystick). It has a camera assigned. Once a player has changed its position, its internal camera's view matrix is changed as well.
UNIGINE features several types of players: Player Dummy, Player Actor, Player Persecutor and Player Spectator.
Lighting in your worlds is created by placing Light Sources. These nodes contain parameters, which determine various light characteristics, such as brightness, color, etc. You can also use physically-based parameters, like color temperature and illuminance, to set up your lights.
There are different kinds of lights and they emit light in different ways. A light bulb, for example, emits light in all directions — in UNIGINE it is represented by the omni light. A projector or car headlights emit a cone of light in a certain direction — projected light. Light beams that come from the sun appear to be parallel, as their source is located so far away. To simulate this type of lighting in UNIGINE, the world light is used.
To learn more about lighting in UNIGINE, see the Lighting Video Tutorial.
UNIGINE combines a full deferred renderer with forward rendering techniques:
- All opaque (non-transparent) geometry is rendered in the deferred pass.
- Transparent geometry is rendered in the forward pass.
To learn more about the applied rendering techniques, see the Rendering Sequence article.
In UnigineEditor, all rendering settings (such as global illumination, shadows, environment, anti-aliasing, post-effects, etc.) can be adjusted via the Rendering Settings section and saved to a *.render file to be used later. Each new project contains settings for low, medium, high, and ultra quality presets, as well as settings optimized for best performance in VR. You can simply double-click any of them in the Asset Browser to apply the corresponding settings.
How Do We Hear Sounds?#
Next to the visual component, sound is a very important domain of real-time technologies. It is a key resource for creating a proper feeling of immersion in the virtual world. For example, a rolling echo cues us to expect a scene set in a lofty dome. The soft tapping of a stone hit by an incautious foot, or a car whooshing by at lightning speed: all of that can be modelled aurally. UNIGINE offers a multi-channel sound system with support for binaural HRTF-based sound, various 3D effects, sound occlusion, and multiple reverberation zones. You can assign any MP3, WAV, or OGA sound to be played on contact between objects, simulating their physical properties at the level of sound. For moving sources, like a car, the Doppler effect is applied so that the movement of the sound source relative to the listener is authentically imitated.
All sound settings can be adjusted via the Sound Settings section in UnigineEditor. You can also save your presets to *.sound files to be used later.
How Is Physical Behavior Defined?#
UNIGINE features built-in, simplified, game-style Newtonian physics. The use cases for physics properties (where physics is preferable to hard-coding object animation) are the following:
- Collision detection (preventing moving through walls)
- Simulating perfectly elastic collisions (redistribution of kinetic energy)
- Simulation of simple mechanisms by rigid bodies and destructible joints
- Simulation of basic physical phenomena: gravity, friction, buoyancy
- Procedural destruction of meshes
- Simulation of cloth and ropes movement
Physics is simulated with its own update FPS and takes effect within the physics simulation distance. Physical properties can be applied only to objects.
To assign physical properties to an object, so that it can participate in interactions with other objects or external physical forces, it should have a body. There are several types of bodies: Rigid, Rag Doll, Fracture, Rope, Cloth, Water, Path.
Much like in the real world, the virtual physics of a body follows the concepts of velocity, force, mass, density, and so on.
While a body simulates various types of physical behavior, a shape represents the volume (sphere, capsule, cylinder, box, convex hull) that a rigid body occupies in space. A physically simulated object usually has a single body and one or several shapes which allow objects to collide with each other.
A collision detection algorithm detects contact points between shapes and prevents them from penetrating each other. Contact points and normals are accessible via API.
Joints are used to connect several objects with respect to the mass balance and represent constraints removing certain degrees of freedom. There are different types of joints: Fixed, Hinge, Ball, Prismatic, Cylindrical, Wheel.
Global Physics Settings#
There are global physics settings (gravity, penetration factor, etc.) affecting all physical objects present in the world.
How to Control the Virtual World?#
To implement your project's logic in UNIGINE, you can use the following programming languages:
- C# for a good balance between speed and ease of use. The recommended approach here is the C# (.NET Core) API. It allows using the C# Component System, which is enabled by default and integrated into UnigineEditor. This is the easiest way to implement your application logic in components and assign them to any node to be executed. Moreover, the .NET Core API is cross-platform, unlike the .NET Framework, which is supported on Windows only.
- C++ for maximum performance and seamless integration with the existing code base.
- UnigineScript, a fast iterative scripting language featuring instant compilation and thousands of useful functions.
All the APIs are unified: every class, method, and constant is accessible via any API. However, there are minor language-specific differences.
To learn more, see the following usage examples articles:
- UnigineScript API, C++ API and C# API usage examples
- Examples of UnigineScript extension using C++ API
- Examples of UnigineScript extension using C# API
You can stick to a single language: C++ if maximum performance is a key factor, or C# for an optimum balance. In the case of C# (.NET Core), UNIGINE provides the C# Component System integrated into UnigineEditor. This approach is deemed the most convenient, ensuring good performance for complex applications with elaborate logic.
Alternatively, you can use different programming languages (C++, C#, and UnigineScript) for different pieces of your project: for example, you can use C++ for base classes and performance-critical operations, and implement some simple application logic in UnigineScript. You can also call methods from one API when using another, and manually extend API functionality.
Every UNIGINE-based application has a life cycle that consists of certain stages; some of them are performed once, while others are repeated every frame.
UNIGINE has three main logic components. Each of them has a set of functions (named init(), update(), postUpdate(), etc.) that contain actions to be performed at the corresponding stages of the Engine's working cycle. These components are:
System Logic is the code that is run during the whole application life cycle (its scope exists even when switching between worlds).
- For applications written using UnigineScript, the system logic is written to the system script file (unigine.usc).
- For applications that use C++, AppSystemLogic.cpp is created; for C# applications, AppSystemLogic.cs. This file is stored in the source/ folder of your project and contains implemented methods where you can put your logic code.
World Logic is the logic of the virtual world. The logic takes effect only when the world is loaded.
- For applications written using UnigineScript, the world logic is written to the world script file (*.usc named after your project) and is loaded and unloaded together with the corresponding world.
- For applications that use C++, AppWorldLogic.cpp is created; for C# applications, AppWorldLogic.cs. This file is stored in the source/ folder of your project and stays loaded during the whole Engine runtime. It contains implemented methods where you can put your logic code.
Editor Logic is the logic of the Editor. The logic takes effect only when UnigineEditor is loaded.
- You should create an editor logic file to put your logic code in and add it using the command-line argument.
It is highly recommended that you familiarize yourself with the execution sequence to know the details, including the multi-threaded mode.
The Component System enables you to implement your application's logic via a set of building blocks called components, and assign these blocks to nodes, extending their basic functionality. You can add a component to a node via code or via UnigineEditor.
Each component's logic is implemented inside the following functions: init(), update(), postUpdate(), etc. Just like for the main logic components, these functions are executed at corresponding stages of the Engine's working cycle.
The logic of a certain component is active only when the corresponding node and property are enabled. Thus, you can enable/disable logic of each particular component at run time, when necessary.
You can assign several properties corresponding to different components to a single node. The order in which the logic of components is executed can be changed at run time to fit your needs, giving you exceptional flexibility and the freedom to implement any functionality you have in mind.
In UNIGINE, there are two implementations of the Component System:
- The C# Component System, available for C# projects. In this case, a logic component integrates a node and a C# class containing the logic implementation (actions to be performed) and defining a set of additional parameters to be used.
For projects using C# (.NET Core) API, the Component System is enabled automatically and integrated into UnigineEditor.
- The C++ Component System, available for C++ projects. In this case, a logic component integrates a node, a C++ class containing the logic implementation (actions to be performed), and a property defining a set of additional parameters to be used.
UNIGINE features a fast intersection detection algorithm. It can be used for ray-casting, calculating line of sight (LOS) or height above terrain (HOT), etc.
Intersections can also be found between the node's bounding volume and another bounding volume of the specified size.
Samples and Demos#
UNIGINE provides a rich set of built-in samples and demo projects covering basic principles of working with the Engine (operations with the built-in nodes, output rendering, GUI setting, etc.). There are different samples for each of the three programming languages. All samples come with full source code. To check them out, go to SDK Browser -> Samples.
An extended set of art samples is also included for your convenience to illustrate various aspects of working with content: you can learn how to use UNIGINE's built-in objects with different settings, set up LODs and materials, adjust rendering settings, or work with vegetation. These art samples are available via UNIGINE SDK Browser as a demo: go to Samples -> Demos and install the Art Samples demo.
How to Optimize Your Project's Performance?#
UNIGINE offers a wide range of various optimizations to ensure that your application runs at its best and fastest. But in order to enable them, you should do your part of the job: follow recommendations when preparing and setting up your project's content, e.g., use LODs, bit masking, and other techniques.
Smooth alpha-blended levels of detail (LODs) are used to decrease the geometry complexity of 3D objects as they move away from the camera, making it possible to lighten the load on the renderer.
Usually, LODs are used for:
- Switching high-polygonal surfaces of the mesh to low-polygonal ones.
- Switching surfaces with complex materials to surfaces with simplified optimized ones.
- Switching several high-polygonal surfaces to a single simplified surface.
- Switching one node type to another (for example, switching a high-polygonal mesh to a billboard, a two-polygon object that always faces the camera).
Switching between LODs can depend not only on the distance from the camera, but also on the distance from a certain object.
UNIGINE offers two mechanisms of LOD switching:
- Disable one LOD and enable another at the specified distance defined by two values: maximum visibility distance of the first LOD (Max Visibility) and minimum visibility distance of the second LOD (Min Visibility).
- Smoothly fade one LOD into another at the specified interval defined by two values: minimum fade distance of the first LOD (Min Fade) and maximum fade distance of the first LOD/minimum fade distance of the second LOD (Max Fade).
See the Setting up object LODs article for the details.
Bit masking defines whether two entities affect each other or not. It can be used to:
- Render some objects to the viewport and not others.
- Apply collision to some bodies and not others.
- Render shadows for some objects and not others.
- Apply physics interaction to some objects and ignore others.
The bit masking algorithm compares the 32-bit masks of two nodes using the binary AND operator. If the masks match, one object will influence the other; if they don't, the objects won't interact.
If two masks have at least one matching bit, the masks match.
You can learn more about content optimization by watching the following tutorial:
Now you are prepared to start your experience with UNIGINE. Enjoy!