Migrating to Unigine from Unity
This section gives a brief overview of UNIGINE from a Unity user's perspective and provides basic information to help you transfer your Unity experience to UNIGINE.
This chapter matches common Unity terms on the left with their UNIGINE equivalents (or rough equivalents) on the right. UNIGINE keywords are linked directly to related articles to provide you with more in-depth information.
| Category | Unity | UNIGINE |
|---|---|---|
| Project and SDK management | Hub | SDK Browser |
| Editor UI | Hierarchy Panel | World Hierarchy window |
| | Project Browser | Asset Browser window |
| | Scene View | Scene Viewport |
| Gameplay Types | Component | Component System |
| Meshes | Mesh Renderer | Static Mesh, Dynamic Mesh |
| | Skinned Mesh Renderer | Skinned Mesh |
| Effects | Particle System | Particle System |
| | Lens Flares | Lens Flares |
| | Projector / Decal Projector (HDRP) | Decals |
| | Trees / Grass | Mesh Clutter, Grass, Vegetation Add-On |
| | Wind Zones | Animation Field |
| Lighting | Light Sources | Light Sources |
| | Reflection Probes | Environment Probes |
| | Custom Shaders: HLSL, Shader Graph | HLSL / GLSL / UUSL |
| | Compute Shaders | UUSL Compute Shaders |
| | Rendering Paths | Rendering Sequence |
| | Multi-Display Rendering | Multi-Monitor Rendering plugins, Syncker Plugin for Multi-Node Rendering |
| Programming | C# | C++ / C# |
| | Scriptable Render Pipeline (URP / HDRP) | Rendering Sequence (fully accessible from API) / Scriptable Materials |
| | Rigid Body | Rigid Body |
| Navigation and Pathfinding | NavMesh, NavMeshAgent, Off-Mesh Link, NavMesh Obstacle | Navigation Areas, Obstacles |
Project and SDK Management#
As a Unity user, you are accustomed to using Unity Hub — the application that streamlines the way you find, download, and manage your projects and installations.
UNIGINE SDK Browser is the first step to start working with UNIGINE Engine. This application enables you to manage your projects and installed SDKs, as well as gives you access to the samples and the knowledge base.
UNIGINE provides several programming workflows. To adapt your Unity scripting experience most easily, it is recommended to use the C# Component System. Creating a project using this workflow in the SDK Browser is done as follows:
- Click Create New in the My Projects section.
- Among other options, choose the C# (.NET Core) project type in the API + IDE field.
- If you want to make the project compatible with one of supported VR headsets, proceed to the Plugins section, check the required plugins in the Stereo 3D section and click Ok (more about VR-compatibility).
- Click Create New Project.
- Upon completion, click Edit Content to run UnigineEditor.
Below you can see interfaces of the Unity Editor and UnigineEditor. Interface elements in the images are color-coded to indicate common functionality. Each element has a label to show UNIGINE's equivalent. The layout of UnigineEditor is fully customizable by resizing, dragging and dropping the tabbed windows. UnigineEditor has the dark color scheme theme by default.
To learn more about the UnigineEditor interface, read this article.
You may find that the Scene View and the Scene Viewport controls look very much alike.
Unity Scene View
UNIGINE Scene Viewport
You can use:
- Camera panel to switch between cameras and configure the current one.
- Rendering Debug to display the contents of rendering buffers the same way as using the Draw Mode in Unity.
- Navigation panel to quickly set up and switch between camera speed presets and change the camera position.
Navigation inside the Scene Viewport is pretty much the same as in Unity's Scene View. However, get familiar with Scene Navigation so as not to miss details.
Also, a set of global switchers is available in the top toolbar:
- Helpers panel provides quick access to auxiliary visualizers, such as icons, gizmos and wireframes.
- Precompile All Shaders toggle is used for Forced Shader Compilation.
- Animation toggle
- Physics toggle
- Audio toggle
- Play controls
You can use as many Scene Viewports as you need.
Game View and Play Mode#
In Unity, you are used to the Play mode inside the Game View, which is rendered from the camera(s) in your scene and is representative of your final build.
In UNIGINE, the Play button is used to run an instance of the application in a separate window. You can switch between Play presets to change essential parameters of the play mode, such as VR Mode to enable compatibility with one of supported VR headsets, as an example.
By default, the mouse cursor is grabbed when you click in the Play mode. There are two ways to set up default Input Bindings (key states and the mouse behaviour):
Console#
The Console is available both in UnigineEditor and in a running application. To open the Console window in the Editor, go to the Windows -> Console menu:
In a running application, the built-in console is called by pressing the grave accent (`) key:
You can use it to print user messages from code.
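For example, a minimal C++ sketch of printing user messages to the console (assuming UNIGINE's Log class; the header name and exact signatures may differ between SDK versions):

```cpp
#include <UnigineLog.h>

using namespace Unigine;

void reportWorldState(int objectCount)
{
	// Log::message() uses printf-style formatting and writes
	// to the console as well as the log file.
	Log::message("World contains %d objects\n", objectCount);

	// Warnings and errors are highlighted in the console output.
	Log::warning("Object count is a placeholder value\n");
}
```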
Building a Project#
In Unity, you are used to building your projects via the editor. In UNIGINE, building a project is also done via UnigineEditor.
Projects and Files#
Directories and Files#
A project in UNIGINE, just like a Unity project, is stored in its own folder; project settings are stored in the *.project file. The project's folder contains various subfolders where your content and source files, as well as configuration files and binaries, are stored. The most important are the data and source subfolders.
In UNIGINE, each project has a data folder. Similar to a Unity project's Assets folder, this is where your project's assets are stored. To import assets into your project, simply drop files into your project's data directory and they will be automatically imported and appear in the Asset Browser. The assets in the editor will update automatically as you make changes to the files using an external program.
Supported File Types#
Unity supports a wide range of file formats, while UNIGINE supports the most commonly used and some specific ones:
| Asset Types | Supported Formats |
|---|---|
| Geometry | .fbx, .obj, .3ds, .dae, .glb/.gltf, .stp/.step, .igs/.iges, .brep, .stl |
| Textures | .png, .jpeg, .tif, .tga, .rgba, .psd, .hdr, .dds, and more |
| Sound and Video | .wav, .mp3, .oga/.ogg, .ogv |
Bringing Your Assets from Unity#
UNIGINE uses the same units (meters) as Unity, so you don't need to rescale meshes.
Similar to Unity, UNIGINE works with PBR materials and supports both Metalness and Specular workflows (similar to Unity materials based on the Standard and Standard (Specular Setup) shaders). A rich out-of-the-box library of materials enables you to create almost any material. So, materials created for a model in Unity can be re-created in UNIGINE using the mesh_base material, the base material in UNIGINE used for physically based materials.
Textures can be imported as part of a model or separately and then applied to a mesh. To import textures, you might have to make some adjustments in advance. For example, the shading texture in UNIGINE stores the metalness, roughness, specular, and microfiber maps in its respective channels, so you may need to assemble the texture using third-party software, such as GIMP or Photoshop, and then import it to UNIGINE.
You can import an animated model you used in Unity into UNIGINE if you have it as an FBX model. While importing the FBX model, enable the Import Animations option and fine-tune the import using additional options.
For more details, see import recommendations.
The concept of the Scene in both engines is the same. However, Unity3D and UNIGINE use different coordinate systems.
Unity:
- Left-handed coordinate system; the vertical direction is usually represented by the +Y axis.
- One unit is one meter.
- Positive rotation angle sets the rotation clockwise.
- File format: *.scene

UNIGINE:
- Right-handed coordinate system; the vertical direction is usually represented by the +Z axis.
- One unit is one meter.
- Positive rotation angle sets the rotation counterclockwise.
- File format: *.world
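Because the up axis and handedness differ, a position exported from Unity needs its Y and Z components swapped to match UNIGINE's convention (swapping two axes also flips handedness). The helper below is a minimal sketch with a name of our own choosing; verify the mapping against your asset pipeline, since model importers may already handle the conversion:

```cpp
#include <array>

// A position in Unity's left-handed, Y-up coordinate system:
// x = right, y = up, z = forward.
using Vec3 = std::array<double, 3>;

// Convert to UNIGINE's right-handed, Z-up system
// (x = right, y = forward, z = up) by swapping Y and Z.
Vec3 unityToUnigine(const Vec3 &u)
{
	return {u[0], u[2], u[1]};
}
```

For example, a point 2 m above the ground and 3 m in front of the Unity origin, (1, 2, 3), becomes (1, 3, 2) in UNIGINE.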
This section gives a brief description of basic scene objects in both engines as well as their basic similarities and differences.
Unity:
- GameObjects are containers for all other Components. Components add functionality to the GameObject.
- Every GameObject has the Transform component by default.
- GameObjects can be organized into a hierarchy (parent-child relation).

UNIGINE:
- All scene objects added to the scene, regardless of their type, are called nodes.
- Each node has a transformation matrix, which encodes its position, rotation, and scale in the world.
- Nodes can be organized into a hierarchy (parent-child relation) in the World Hierarchy window.
The workflow in Unity is based on prefabs. You typically assemble a complex object from GameObjects with certain components and properties and create a prefab from such an object. Prefabs can then be placed in your world via the Editor, or instantiated at run time.
UNIGINE's workflow is based on Node References, which are very similar to prefabs. To make a complex object that can be instanced in your world, you simply build the desired hierarchy from nodes, assign materials and properties to them, and save it as a Node Reference. Then you can use this Node Reference as many times as necessary and, just like with prefabs, modify the Node Reference by changing any of its instances.
To learn more about creating Node References and managing them, please follow the link below:
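As a sketch, spawning instances of a Node Reference at run time (similar to `Instantiate()` with a prefab in Unity) might look like this in C++; the asset path `nodes/barrel.node` is a hypothetical example, and the exact class and method names should be checked against your SDK version's API reference:

```cpp
#include <UnigineNodes.h>

using namespace Unigine;

// Spawn a row of instances from a saved Node Reference asset.
// "nodes/barrel.node" is a hypothetical asset path.
void spawnBarrels(int count)
{
	for (int i = 0; i < count; i++)
	{
		NodeReferencePtr barrel = NodeReference::create("nodes/barrel.node");
		barrel->setWorldPosition(Math::Vec3(i * 2.0, 0.0, 0.0));
	}
}
```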
How to Collaborate?#
Unity provides the Smart Merge tool and support for custom tools for resolving conflicts when merging results of teamwork. Scenes and other files should use the YAML format in order to be merged.
In UNIGINE, all native file formats are text-based by default, so you can use any VCS you are used to and merge worlds, nodes, and other assets. You can extend the file system to keep shared assets by using Mount Points. A common workflow is also to split the work of different team members using separate Node Layers, so there will be no need to resolve conflicted files when merging project modifications.
Check out the related article for more details:
- Configuring Version Control
Cameras, the entities essential for rendering, are treated slightly differently in the two engines.
In Unity, the Camera component is responsible for capturing a view and sending it to be rendered. All enabled cameras present in the scene are rendered in the viewport (Game View) and may overlap each other. To switch between cameras, one usually needs to toggle off the current camera and enable the other one.
In UNIGINE, the Camera is a rendering-related object implemented by Player nodes in the world. There are several player types with different behaviour, provided to simplify the creation of the most commonly used cameras controlled via input devices (keyboard, mouse, joystick):
- Dummy is a simple camera wrapper. You can use it for static cameras or enhance with custom logic.
- Spectator is a free flying camera.
- Persecutor is a flying camera that has a target and orbits it at a specified distance. It is a ready-to-use simple solution for a third-person camera.
- Actor is a player that is capable of providing physical interaction with scenery. It has a rigid physical body, which is approximated with a capsule shape. It is a ready-to-use simple solution for a first-person character similar to Unity's Character Controller.
Only one player can be rendered into the viewport at a time. To switch between cameras in the Scene Viewport of the UnigineEditor, use the Camera panel:
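At run time, switching the active camera amounts to making another Player the current one. A minimal C++ sketch (assuming the `Game::setPlayer()` call; check your SDK version's API reference):

```cpp
#include <UnigineGame.h>

using namespace Unigine;

// Make the given Player the one rendered to the viewport,
// roughly equivalent to disabling one Camera and enabling
// another in Unity.
void activateCamera(const PlayerPtr &player)
{
	Game::setPlayer(player);
}
```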
Overall project settings adjustment in Unity is usually done via the Project Settings window (menu: Edit > Project Settings). The Audio, Graphics, Physics, Quality levels and other settings affect the whole project.
In UNIGINE, the Common Settings and Preferences are available via the Windows -> Settings menu in the Runtime section. The World settings are set for each world separately.
In Unity, Asynchronous Shader compilation is toggled on and off in the Editor settings (menu: Edit > Project Settings > Editor > Shader Compilation).
In UNIGINE, a similar editor feature, Forced Shader Compilation, is available via both the toolbar and the Editor section of the Settings window.
You use Presets in the Unity Editor when you need to reuse property settings across different tasks, be it component settings, import settings or, especially, Project Settings. You can save the settings preset for a certain section of the Project Settings as a .preset asset and reuse it in development later.
Presets are an Editor-only feature in Unity.
In UNIGINE, you can save and load presets for general physics, sound and render settings. The presets are stored as assets with the *.physics, *.sound and *.render extensions respectively. Use Load and Save .* asset buttons of the Settings window to work with presets of the corresponding settings section.
Saved assets appear in the Asset Browser. You can load the render settings by double-clicking the required .render asset.
Presets are not an Editor-only feature in UNIGINE. You can use Physics, Sound and Render classes to manage presets for corresponding settings, for example, to switch between quality levels at run time.
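For example, switching to a lower-quality render preset at run time might look like this in C++. This is a sketch under the assumption that the Render class exposes a `loadSettings()`-style method and that a `presets/low.render` asset exists; verify both against your SDK version's API reference:

```cpp
#include <UnigineRender.h>

using namespace Unigine;

// Apply a saved render preset at run time, e.g. when the user
// picks a quality level in a settings menu. Both the method name
// and the asset path are assumptions to verify against your SDK.
void applyLowQualityPreset()
{
	Render::loadSettings("presets/low.render");
}
```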
In Unity, settings for graphics quality are mostly gathered in the following sections:
- The Graphics section contains global settings for graphics. The Tier Settings provide platform-specific adjustments to rendering and shader compilation. One of the three Tier levels to be applied is defined automatically based on the platform used.
- The Quality section handles levels of graphical quality defined for each platform.
In UNIGINE, the rendering settings of the world can be found in the Render section of the Settings window. You can also toggle on and off the most common render features by using the Rendering menu:
There are no platform-dependent quality adjustments in UNIGINE; you should write your own logic to control the quality levels. You can use Render Presets for this purpose.
Let's consider the most commonly used rendering settings in Unity and their corresponding analogs in UNIGINE:
| Unity | UNIGINE |
|---|---|
| HDR Mode | Render -> Buffers -> Color 16F |
| Rendering Path | see below |
| Shaders Preloading | Render -> Streaming -> Preload at World Loading |
| Pixel Light Count | Forward Per-Object Limits |
| Texture Quality | Render -> Textures -> Quality |
| Anisotropic Textures | Render -> Textures -> Anisotropy |
| Anti Aliasing | Render -> Antialiasing -> Supersampling |
| Soft Particles | particles_base -> Soft Interaction |
| Realtime Reflection Probes | Menu: Rendering -> Dynamic Reflections -> Enabled |
| Texture Streaming | Render -> Streaming Settings |
| Shadows | Render -> Shadows Settings |
| Shadow Cascades | set per each World Light source |
| VSync Count | Runtime -> Video settings |
Unity provides two non-legacy lighting and shading workflows, the Deferred and Forward Rendering Paths, which define the shading fidelity as well as the rendering cost and hardware requirements. You can choose the rendering path that your project uses in the Graphics window, and you can override that path for each Camera.
UNIGINE has a fixed Rendering Sequence, which combines a full deferred renderer with forward rendering techniques:
- All opaque (non-transparent) geometry is rendered in the deferred pass.
- Transparent geometry is rendered in the forward pass.
You can reduce computational load by skipping certain rendering stages. Watch the dedicated video tutorial on using the Microprofile tool to optimize rendering:
- Video tutorial: Microprofile for Artists
In Unity, availability of post-processing effects is determined by the render pipeline used. In UNIGINE, similar effects are not a part of Post-processing but are integrated into the Rendering Sequence. Thus, Unity's High Definition Render Pipeline (HDRP) is much closer to the rendering workflow in UNIGINE, than other render pipelines.
In Unity the Volume framework is used to define the volumes where post-processing parameters and effects are locally (or globally) overridden. In UNIGINE you will have to write your own logic to smoothly interpolate between settings at different spaces (if such a requirement appears).
This section lists all common Unity post-processing techniques that can be achieved in UNIGINE as well regardless of render pipeline.
| Unity | UNIGINE |
|---|---|
| Ambient Occlusion | Screen-Space Ambient Occlusion |
| Auto Exposure | Camera Effects |
| Chromatic Aberration | Postprocess Materials |
| Depth of Field | Depth of Field |
| Motion Blur | Motion Blur |
| Screen Space Reflection | SSR |
| Contact Shadows | Screen Space Shadows |
| Micro Shadows | Cavity of SSAO |
Where to Go from Here?#
Thank you for reading the guide! You can proceed to the following sections for further learning: