
Render

Camera to Texture

This sample shows how to capture the output of a camera in real time and project it onto a material's albedo texture using Viewport::renderTexture2D().

A renderable 2D texture is created and set as the albedo texture of the material assigned to the plane. Each frame, the active camera's view is rendered into this texture, updating the appearance of the object it's applied to. Texture sampling settings like linear filtering and anisotropy are configured to ensure visual quality. The UV transform is also adjusted to correct for platform-dependent flipping.
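A minimal sketch of this flow is shown below. It assumes a component-style init/update pair; the texture size, sampling flags, and the way the plane's material and its texture slot are obtained (findTexture("albedo")) are illustrative rather than the sample's exact code.

Source code (C++)
#include <UnigineViewport.h>
#include <UnigineTextures.h>
#include <UnigineMaterials.h>
#include <UnigineCamera.h>
#include <UniginePlayers.h>
#include <UnigineGame.h>

using namespace Unigine;

ViewportPtr viewport;
TexturePtr render_texture;
MaterialPtr plane_material;	// material of the target plane, obtained elsewhere

void init_capture()
{
	viewport = Viewport::create();
	render_texture = Texture::create();
	// renderable 2D texture; size and sampling flags are illustrative
	render_texture->create2D(512, 512, Texture::FORMAT_RGBA8,
		Texture::FILTER_LINEAR | Texture::ANISOTROPY_4 | Texture::USAGE_RENDER);
}

void update_capture()
{
	// render the active camera's view into the texture every frame
	CameraPtr camera = Game::getPlayer()->getCamera();
	viewport->renderTexture2D(camera, render_texture);

	// bind the result to the albedo slot of the plane's material
	// (the sample also adjusts the UV transform for platform-dependent flipping)
	plane_material->setTexture(plane_material->findTexture("albedo"), render_texture);
}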

This method can be used for features such as security monitors, live camera feeds, dynamic mirrors, portals, or mini-maps in your worlds.


SDK Path: <SDK_INSTALLATION>source/render/camera_to_texture

Compute Shader

This sample demonstrates a GPU-based particle system implemented with compute shaders. It initializes a dynamic particle renderer, updates particle positions and velocities each frame using a ping-pong texture mechanism, and uses UV-mapped static meshes to influence particle behavior.
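The ping-pong mechanism itself is simple and can be sketched as follows; how the compute pass is dispatched is left to the sample's compute material, so only the texture swap is shown here.

Source code (C++)
#include <UnigineTextures.h>
#include <algorithm>

using namespace Unigine;

// two identically sized textures holding the particle state (positions, velocities)
TexturePtr state_read;	// sampled by the compute pass this frame
TexturePtr state_write;	// written by the compute pass via unordered access

void update_particles_gpu()
{
	// 1. the compute material reads from state_read and writes the updated
	//    particle state into state_write (dispatch omitted in this sketch)

	// 2. swap the roles of the textures so this frame's output becomes
	//    the next frame's input
	std::swap(state_read, state_write);
}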

These mesh-to-particle transformations are well-suited for real-time visual effects, such as fluid or smoke simulations or interactive art installations.


SDK Path: <SDK_INSTALLATION>source/render/compute_shader

Compute Shader Image

This sample demonstrates how to create a texture at runtime and update it entirely on the GPU using compute shaders with read-write (unordered) access. The 2D texture is dynamically created and assigned to a material's albedo slot.

Each frame, the compute shader updates the texture using simulation parameters such as the frame time (ifps), dispatching GPU threads in 32×32 groups. The shader modifies the texture content via unordered access, with all calculations and writes handled entirely on the GPU; the CPU is only responsible for initial setup and parameter updates.
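The per-frame dispatch could look roughly like the snippet below. The 32×32 group size follows the description above; the parameter lookup and the renderCompute() call are assumptions about the API rather than the sample's exact code.

Source code (C++)
#include <UnigineMaterials.h>
#include <UnigineTextures.h>
#include <UnigineRender.h>
#include <UnigineGame.h>

using namespace Unigine;

MaterialPtr compute_material;	// material with the compute pass
TexturePtr image_texture;		// RW texture assigned to the albedo slot

void update_image_gpu()
{
	// pass the frame time to the shader
	compute_material->setParameterFloat(compute_material->findParameter("ifps"), Game::getIFps());

	// one thread per texel, 32x32 threads per group: round the group count up
	int groups_x = (image_texture->getWidth() + 31) / 32;
	int groups_y = (image_texture->getHeight() + 31) / 32;

	// run the compute pass (call name assumed)
	compute_material->renderCompute(Render::PASS_POST, groups_x, groups_y, 1);
}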

The compute shader is used to perform custom operations on textures, allowing for real-time image manipulation or procedural content generation.


SDK Path: <SDK_INSTALLATION>source/render/compute_shader_image

FFP Depth

This sample demonstrates how to render custom visual elements (lines) using the fixed-function pipeline (FFP) with depth testing enabled.

The elements are drawn during the visualizer stage and properly sorted with respect to scene geometry using the depth buffer.

The sample sets up a render callback via Render::getEventEndVisualizer(), which is used to draw a simple line segment in camera space. Drawing is performed using the FFP, with blending and depth testing configured manually via the RenderState class.
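Hooking into the visualizer stage might look like the following sketch; the event-connection syntax is assumed, and the drawing body (which in the sample goes through the FFP and RenderState classes) is only outlined in comments.

Source code (C++)
#include <UnigineRender.h>

using namespace Unigine;

EventConnections connections;

void on_end_visualizer()
{
	// save the current render state, enable blending and depth testing via
	// RenderState, submit the line vertices in camera space through the FFP,
	// then restore the state (omitted in this sketch)
}

void init_callbacks()
{
	// called every frame once the visualizer stage has finished
	Render::getEventEndVisualizer().connect(connections, on_end_visualizer);
}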

The scene features a red line intersecting the object. The projection matrix is modified to account for reverse depth and range remapping, ensuring correct visual sorting. The line vertices are transformed into camera space using the current modelview matrix, and rendered in screen space.

This sample can be used for fast rendering of various additional or debug elements (such as semi-transparent objects, frames, 3D grids and coordinate systems, path traces, motion trails, line-of-sight visualization etc.) while ensuring consistency with the scene content.


SDK Path: <SDK_INSTALLATION>source/render/ffp_depth

GBuffer Read

This sample demonstrates how to access G-buffer textures at different stages of the rendering process by configuring a custom Viewport and intercepting its output at the G-buffer rendering stage.

A temporary viewport renders the selected node with a simplified pipeline that stops after the G-buffer is filled. When the rendering reaches this stage, a callback is triggered to fetch G-buffer textures such as depth, albedo, normals, metalness, and roughness. These textures are then copied to user-defined render targets and displayed on screen planes using custom materials.
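Inside the stage callback, the G-buffer textures can be fetched from the Renderer, roughly as sketched below; the exact getter names are assumptions, and the copy into user-defined render targets is only indicated in a comment.

Source code (C++)
#include <UnigineRender.h>
#include <UnigineTextures.h>

using namespace Unigine;

void on_end_opacity_gbuffer()
{
	// the G-buffer is filled at this point; grab the textures of interest
	TexturePtr albedo = Renderer::getTextureGBufferAlbedo();
	TexturePtr normal = Renderer::getTextureGBufferNormal();
	TexturePtr depth = Renderer::getTextureCurrentDepth();

	// copy them into user-owned render targets here so they can be displayed
	// later on the screen planes via custom materials (omitted in this sketch)
}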

This setup is useful for debugging, material analysis, or developing post-processing effects that require direct access to the intermediate data in the rendering pipeline.


SDK Path: <SDK_INSTALLATION>source/render/gbuffer_read

GBuffer Write

This sample demonstrates how to modify G-buffer textures at different stages of the rendering process by injecting a custom material at the end of the G-buffer pass.

An event handler is registered using Render::getEventEndOpacityGBuffer(), which is triggered once all G-buffer textures (albedo, normal, etc.) are populated. In this callback, a custom post-material is applied to modify the contents of these textures using dynamic parameters such as influence, plasticity, and color. Temporary render targets are used to perform intermediate writes, and the modified textures are then swapped back into the pipeline.
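A sketch of the registration and per-frame parameter updates is shown below; the material and parameter names follow the description above and are illustrative, and the actual G-buffer rewrite through temporary render targets is only indicated in a comment.

Source code (C++)
#include <UnigineRender.h>
#include <UnigineMaterials.h>

using namespace Unigine;

EventConnections connections;
MaterialPtr post_material;	// custom post-material that modifies the G-buffer

void on_end_opacity_gbuffer()
{
	// all opaque G-buffer textures are populated at this point;
	// update the dynamic parameters of the post-material
	post_material->setParameterFloat(post_material->findParameter("influence"), 0.5f);
	post_material->setParameterFloat(post_material->findParameter("plasticity"), 1.0f);

	// the post-material then writes into temporary render targets, and the
	// modified textures are swapped back into the pipeline (omitted here)
}

void init_callbacks()
{
	Render::getEventEndOpacityGBuffer().connect(connections, on_end_opacity_gbuffer);
}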

This technique allows real-time manipulation of G-buffer data during the rendering process, enabling custom surface effects or advanced material pre-processing.


SDK Path: <SDK_INSTALLATION>source/render/gbuffer_write

Gui to Texture

This sample demonstrates how to render GUI elements into a texture using Gui::render(). Instead of drawing directly to the screen, the GUI is redirected to a custom framebuffer, which isolates its rendering pipeline and allows the resulting texture to be applied to materials.

The GuiToTexture component supports two update modes (sketched below):

  • In Manual mode, the texture is updated only when explicitly calling renderToTexture(). This is used in the WidgetClock example, where the GUI (a digital clock) is re-rendered once per second, only when the displayed time changes.
  • Automatic mode is enabled by default and updates the GUI texture every frame. This is demonstrated in the WidgetNoSignal example, where a "No Signal" label moves across the screen like a screensaver. Because the position of the widget changes every frame, the texture must be continuously updated to reflect those changes.
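The mode switch itself boils down to very little logic; a schematic version of the component's update is sketched below. Member names here are illustrative; only renderToTexture() corresponds to the method mentioned above.

Source code (C++)
// schematic update logic of a GuiToTexture-style component
class GuiToTextureSketch
{
public:
	bool automatic = true;	// Automatic mode: re-render the GUI every frame
	bool dirty = false;		// Manual mode: set when the widget content changes

	void update()
	{
		if (automatic)
		{
			renderToTexture();	// e.g. the moving "No Signal" label
		}
		else if (dirty)
		{
			renderToTexture();	// e.g. the clock, re-rendered once per second
			dirty = false;
		}
	}

private:
	void renderToTexture()
	{
		// redirect GUI rendering into the texture (see the flow described below)
	}
};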

The render flow involves saving and clearing the current render state, binding a texture, configuring the viewport, rendering the GUI widgets, and restoring the render state. Mipmaps are also generated to ensure proper filtering at different scales and distances.

You can use this sample to display dynamic GUI elements on in-game monitors, control panels, or other similar surfaces.


SDK Path: <SDK_INSTALLATION>source/render/gui_to_texture

Node to Texture

This sample demonstrates how to render a specific node and its children into a texture using Viewport::renderNodeTexture2D(). Instead of rendering the whole scene, a custom viewport captures only the target node as seen from a camera, and outputs the result into a 2D texture.

The texture is then assigned as the material's albedo texture. To create clean output, the viewport is configured to skip transparent objects, velocity buffers, post-effects, and debugging visualizers. Lighting is inherited from the world, and the environment is overridden to a black cubemap to prevent background influence.
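A condensed version of this setup could look like the sketch below; the skip flags and texture settings are assumptions about the exact API, and the node, camera, and material are taken from the sample's world.

Source code (C++)
#include <UnigineViewport.h>
#include <UnigineTextures.h>
#include <UnigineMaterials.h>
#include <UnigineCamera.h>
#include <UnigineNode.h>

using namespace Unigine;

ViewportPtr viewport;
TexturePtr node_texture;
CameraPtr camera;		// camera looking at the target node
NodePtr target_node;	// node (with children) to capture
MaterialPtr material;	// material receiving the texture

void init_node_capture()
{
	viewport = Viewport::create();
	// skip everything that would pollute the isolated view (flag names assumed)
	viewport->setSkipFlags(Viewport::SKIP_TRANSPARENT | Viewport::SKIP_VELOCITY_BUFFER |
		Viewport::SKIP_POSTEFFECTS | Viewport::SKIP_VISUALIZER);

	node_texture = Texture::create();
	node_texture->create2D(512, 512, Texture::FORMAT_RGBA8,
		Texture::FILTER_LINEAR | Texture::USAGE_RENDER);
}

void update_node_capture()
{
	// render only the node hierarchy, not the whole scene
	viewport->renderNodeTexture2D(camera, target_node, node_texture);
	material->setTexture(material->findTexture("albedo"), node_texture);
}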

This sample can be useful for rendering character previews, dynamic item thumbnails, or isolated object views directly onto surfaces.


SDK Path: <SDK_INSTALLATION>source/render/node_to_texture

Render Target

This sample demonstrates how to draw directly into a texture using the RenderTarget class. The system initializes multiple textures, including a background layer and a dynamic result texture, which is then assigned to the object's material. The RenderTarget is used to redirect rendering output from the main framebuffer to a texture, allowing full control over how sprites are composited.

Each draw operation includes a UV-based transform to place and scale the sprite, and uses a custom material with direct access to the involved textures. After each draw, the result is blended with the background and mipmaps are generated for correct appearance at various distances.

Render states are saved and restored around each operation to isolate the off-screen rendering flow from the rest of the pipeline.
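The save/draw/restore bracket can be outlined as follows; the unbind and mipmap calls are assumptions about the exact method names, and the sprite draw with the custom material is omitted.

Source code (C++)
#include <UnigineRender.h>
#include <UnigineTextures.h>

using namespace Unigine;

RenderTargetPtr render_target;
TexturePtr result_texture;	// texture sampled by the object's material

void draw_sprite_to_texture()
{
	// isolate the off-screen pass from the rest of the pipeline
	RenderState::saveState();
	RenderState::clearStates();

	render_target->bindColorTexture(0, result_texture);
	render_target->enable();
	{
		// ... draw the sprite with its UV-based transform and custom material,
		// then blend the result with the background layer (omitted) ...
	}
	render_target->disable();
	render_target->unbindAll();

	RenderState::restoreState();

	// regenerate mipmaps so the result filters correctly at various distances
	result_texture->createMipmaps();
}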

This technique is suitable for creating effects like decal placement, user-driven painting, hit markers, or procedural texture overlays in real time.


SDK Path: <SDK_INSTALLATION>source/render/render_target

Screenshot

This sample is a simple demonstration of how to take a screenshot by grabbing the final image from the rendering sequence.

SDK Path: <SDK_INSTALLATION>source/render/screenshot

Screenshot Advanced

This sample showcases how to capture high-resolution screenshots from focused windows, individual cameras, the main player camera, or grouped windows in real time, with optional GUI overlays.

At runtime, several additional windows are created, each with its own dedicated camera for multi-view rendering. All windows are stackable. To create a group, arrange the windows by dragging them close to each other; a green alignment grid will appear to assist with layout. Once grouped, screenshots can be captured from the group view.

Captured images are saved with timestamped filenames to a user-defined directory. By default, this is the folder displayed in the Parameters section.

Use Cases:

  • Capturing viewport content (GUI and non-GUI screenshots)
  • Testing multi-camera setups

SDK Path: <SDK_INSTALLATION>source/render/screenshot_advanced

Split-Screen Texture

This sample demonstrates how to capture views from two different cameras into separate textures and implement a split-screen layout using the C++ API.

Each camera renders its output to a texture using Viewport::renderTexture2D(). These textures are displayed in a vertical split-screen layout using WidgetSprite elements, and at the same time are applied to surfaces by assigning them to the albedo slot of static mesh materials. The layout adjusts dynamically to screen size changes.
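The per-frame part of this setup can be sketched as follows; widget creation and layout handling are omitted, and the WidgetSprite::setRender() call used to display a render texture is an assumption.

Source code (C++)
#include <UnigineViewport.h>
#include <UnigineTextures.h>
#include <UnigineCamera.h>
#include <UnigineWidgets.h>

using namespace Unigine;

ViewportPtr viewport;
CameraPtr camera_top, camera_bottom;		// the two split-screen cameras
TexturePtr texture_top, texture_bottom;		// their render textures
WidgetSpritePtr sprite_top, sprite_bottom;	// sprites forming the vertical split

void update_split_screen()
{
	// capture each camera into its own texture
	viewport->renderTexture2D(camera_top, texture_top);
	viewport->renderTexture2D(camera_bottom, texture_bottom);

	// show the textures in the two halves of the screen
	// (the same textures are also assigned to mesh materials)
	sprite_top->setRender(texture_top);
	sprite_bottom->setRender(texture_bottom);
}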

This setup can be used for multiplayer screen sharing, camera comparisons, or in-game monitors rendered from multiple viewpoints.


SDK Path: <SDK_INSTALLATION>source/render/split_screen_texture

Structured Buffer

This sample demonstrates how to compress a texture into DXT1 format using compute shaders and structured buffers.

At initialization, a compute material is loaded, and a source texture is prepared. A DXT1Block structure is used to hold the compressed data of each 4×4 pixel block, so the output size is derived from the texture's dimensions. A structured buffer is created on the GPU to store the output.
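For reference, a DXT1 block encodes a 4×4 pixel tile in 8 bytes (two 16-bit endpoint colors plus 32 bits of 2-bit indices), so the buffer size follows directly from the texture dimensions:

Source code (C++)
#include <cstdint>
#include <cstddef>

// one DXT1 block: a 4x4 texel tile compressed into 8 bytes
struct DXT1Block
{
	uint16_t color0;	// RGB565 endpoint 0
	uint16_t color1;	// RGB565 endpoint 1
	uint32_t indices;	// 16 x 2-bit palette indices
};

// number of blocks and total buffer size for a given texture
inline size_t dxt1_buffer_size(int width, int height)
{
	const int blocks_x = (width + 3) / 4;
	const int blocks_y = (height + 3) / 4;
	return size_t(blocks_x) * size_t(blocks_y) * sizeof(DXT1Block);
}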

The compute shader runs in two stages: a warm-up to trigger shader compilation and the actual compression pass. The number of compute groups is calculated dynamically from the texture size and thread group size to achieve efficient parallel execution.

When compression is complete, the GPU buffer is transferred to CPU memory asynchronously. A new image with the compressed data is created in DXT1 format and saved to a *.dds file.

This method offloads all heavy processing to the GPU, minimizes CPU overhead, and takes advantage of structured memory access for optimal performance.


SDK Path: <SDK_INSTALLATION>source/render/structured_buffer

Textures

This sample demonstrates how to create and update procedural textures on static meshes in real time using the C++ API.

Each mesh is assigned a custom-generated image, which is updated every frame using a time-based color pattern. The pixel data is written manually into an Image class instance, and the result is applied to the albedo texture slot of the mesh's material using Material::setTextureImage().
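A simplified frame update along these lines is sketched below; the raw-pixel access via getPixels() and the texture-slot lookup are assumptions, and the color pattern is just an example of a time-based fill.

Source code (C++)
#include <UnigineImage.h>
#include <UnigineMaterials.h>
#include <UnigineGame.h>
#include <UnigineMathLib.h>

using namespace Unigine;

ImagePtr image;
MaterialPtr material;	// material of the mesh being updated

void init_procedural_texture()
{
	image = Image::create();
	image->create2D(256, 256, Image::FORMAT_RGBA8);
}

void update_procedural_texture()
{
	float time = Game::getTime();
	unsigned char *pixels = image->getPixels();	// raw RGBA8 data (layout assumed)
	for (int y = 0; y < image->getHeight(); y++)
		for (int x = 0; x < image->getWidth(); x++)
		{
			unsigned char *p = pixels + (y * image->getWidth() + x) * 4;
			p[0] = (unsigned char)(127.5f * (1.0f + Math::sin(time + x * 0.1f)));	// R
			p[1] = (unsigned char)(127.5f * (1.0f + Math::cos(time + y * 0.1f)));	// G
			p[2] = 128;	// B
			p[3] = 255;	// A
		}

	// upload the image into the albedo slot of the mesh material
	material->setTextureImage(material->findTexture("albedo"), image);
}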

This method allows generating dynamic patterns, noise, or other procedural visuals without using compute shaders or external image files.


SDK Path: <SDK_INSTALLATION>source/render/textures

Weapon Clipping

In first-person games, weapon models can clip through walls or geometry when the camera gets too close. This sample solves the problem by rendering the weapon separately into a texture and compositing it over the main scene.

Two cameras with identical transforms are used to ensure alignment. Their Viewport masks are set up in UnigineEditor so that one camera renders the environment without the weapon, and the other renders only the weapon. The weapon view is drawn into a texture using Viewport::renderTexture2D() and then overlaid onto the screen with Render::renderScreenMaterial(). The system handles window resizing and render-state isolation, and provides optional settings such as skipping shadows in the weapon rendering pass.
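The per-frame flow can be sketched as follows; the Viewport masks are assumed to already be configured in the editor per the description, and the compositing call is indicated only as a comment.

Source code (C++)
#include <UnigineViewport.h>
#include <UnigineTextures.h>
#include <UnigineCamera.h>

using namespace Unigine;

ViewportPtr weapon_viewport;
CameraPtr scene_camera;		// renders the environment without the weapon
CameraPtr weapon_camera;	// same transform, renders only the weapon
TexturePtr weapon_texture;	// overlay texture holding the weapon view

void update_weapon_overlay()
{
	// keep both cameras perfectly aligned (one possible way)
	weapon_camera->setModelview(scene_camera->getModelview());

	// draw the weapon into its own texture (shadows can optionally be skipped)
	weapon_viewport->renderTexture2D(weapon_camera, weapon_texture);

	// the texture is then composited over the main image with a fullscreen
	// material via Render::renderScreenMaterial() (see the sample's code)
}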

This approach keeps the weapon always visible, even when the camera is close to walls, without interfering with the main rendering pipeline.


SDK Path: <SDK_INSTALLATION>source/render/weapon_clipping

The information on this page is valid for UNIGINE 2.20 SDK.

Last update: 2025-06-30