Search the Community

Showing results for tags 'shaders'.

Found 6 results

  1. I'm trying to write some shaders, but I can't find the includes or the shaders the engine uses. For example, if you open "evaluation_sim_windows_2.9.0.2\data\core\materials\default\mesh\mesh_base.basemat", it references many shaders, such as:

         vertex="core/shaders/mesh/transparent/ambient.shader" fragment="core/shaders/mesh/transparent/ambient.shader"
         vertex="core/shaders/mesh/transparent/lights.shader" fragment="core/shaders/mesh/transparent/lights.shader"/>
         vertex="core/shaders/mesh/opacity/deferred.shader" fragment="core/shaders/mesh/opacity/deferred.shader"/>
         ...

     But these files don't appear anywhere in the SDK; even in a generated project, the "core/shaders" folder is missing. If you search for *.shader you can find some samples, but they contain includes I can't find either:

         #include <core/shaders/mesh/common/common.h>
         #ifdef VERTEX
             #include <core/shaders/mesh/common/vertex.h>
         #elif FRAGMENT
             #include <core/shaders/mesh/common/fragment.h>
         #endif

     These headers should be in the "core/shaders/" folder too. /roberto
  2. Material Layers

    Hi everyone, I saw an article on 80.lv about UNIGINE, and I'm really loving the possibilities. I looked at it way back in 1.0, and it has progressed significantly since then. One workflow that's becoming more and more common is blending textures at the material level, rather than baking the result out as a unique texture. The best example of this I've seen as a supported feature is in the Xenko Engine: https://doc.xenko.com/latest/en/manual/graphics/materials/material-layers.html Is this workflow possible in UNIGINE, or is this something that will need a custom shader/shader feature? Thank you for the help!
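    At its core, the material-layers workflow being asked about is a per-pixel blend of two texture sets driven by a mask. A minimal sketch in generic GLSL (not UNIGINE's own shader dialect; all sampler names are hypothetical):

         #version 330 core
         uniform sampler2D albedo_layer_a;  // base layer albedo (hypothetical name)
         uniform sampler2D albedo_layer_b;  // top layer albedo (hypothetical name)
         uniform sampler2D blend_mask;      // grayscale mask driving the blend

         in vec2 uv;
         out vec4 out_color;

         void main() {
             float mask = texture(blend_mask, uv).r;
             // Linear blend of the two layers; normals, roughness, etc. would be blended the same way.
             vec3 albedo = mix(texture(albedo_layer_a, uv).rgb,
                               texture(albedo_layer_b, uv).rgb,
                               mask);
             out_color = vec4(albedo, 1.0);
         }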
  3. Hi all, I removed a post earlier looking for help with GUI WidgetSpriteVideo objects, but I've since been experimenting and found it's not quite what I'm looking for. I have been following the WidgetSpriteVideo documentation page in C#, but I can't seem to wrap my head around applying shaders to anything other than world-space objects; I've realized I need a screen-space GUI object rather than a world-space one. The application is to render a webcam view on top of UNIGINE (in VR, so I will need separate cameras per eye for depth) and use a green screen to mask out areas of the webcam feed so the digital scene can come through (i.e. the user can see their hands in front of them, with UNIGINE in the background). My main requirements are:
     1. Bring in the video feed from the camera (in OGV format) and stream it to the GUI.
     2. Use a chroma-key shader to mask off and create transparency in the webcam feed.
     3. Do this for two separate cameras and have them render to each eye individually (or, technically, hide each one from the opposing eye).
     If anyone can point me towards documentation, other references, libraries, or plugins to assist with this, I would be very appreciative.
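     The chroma-key step (requirement 2) is a standard fragment-shader technique. A minimal sketch in generic GLSL (not UNIGINE-specific; the sampler name and threshold values are hypothetical):

         #version 330 core
         uniform sampler2D webcam_feed;                    // current video frame (hypothetical name)
         uniform vec3  key_color = vec3(0.0, 1.0, 0.0);    // green-screen color
         uniform float threshold = 0.4;                    // color distance below which pixels are keyed out
         uniform float softness  = 0.1;                    // width of the soft edge

         in vec2 uv;
         out vec4 out_color;

         void main() {
             vec4 src = texture(webcam_feed, uv);
             // Distance to the key color; production keyers usually work in YCbCr rather than RGB.
             float dist = distance(src.rgb, key_color);
             // Alpha ramps from 0 (fully keyed) to 1 across the softness band.
             float alpha = smoothstep(threshold, threshold + softness, dist);
             out_color = vec4(src.rgb, alpha);
         }

     Rendered with alpha blending enabled, pixels close to the key color become transparent and the scene behind shows through.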
  4. Hello, I have written a material:

         <material name="my_custom_material" parent="mesh_base">
             <options object="mesh_dynamic" transparent="2" order="1"/>
             <state name="workflow">1</state>
             <state name="auxiliary" defines="name" pass_defines="auxiliary"/>
             <blend src="src_alpha" dest="one_minus_src_alpha"/>
             <state name="deferred">0</state>
             <state name="ambient" type="switch" items="none,opacity,transparent" transparent="2">2</state>
             <parameter name="auxiliary_color" auxiliary="1" type="color">1.0 1.0 1.0 1.0</parameter>
             <parameter name="my_param_a" workflow="1" shared="1" type="slider">1</parameter>
             <parameter name="my_param_b" workflow="1" shared="1" type="slider">0</parameter>
             <parameter name="my_param_c" workflow="1" shared="1" type="slider">0.0</parameter>
             <parameter name="my_param_d" workflow="1" shared="1" type="slider">0</parameter>
             <shader pass="ambient" ambient="2" object="mesh_dynamic" defines="BASE_AMBIENT,AMBIENT,ZERO_DEPTH"
                 vertex="shaders/my_custom_material.vert"
                 fragment="shaders/my_custom_material.frag"
                 geometry="shaders/my_custom_material.geom"/>
         </material>

     But the shaders specified in the shader tag are not used while rendering; when I remove the parent material "mesh_base", they are used. Can you help me? Can we override the shaders in a child material?
  5. Hi. Can you give an example of using compute shaders?
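     For context, here is what a minimal compute shader looks like in generic GLSL (the engine-side binding and dispatch are UNIGINE-specific and not shown; the buffer contents are hypothetical):

         #version 430
         // One work group of 64 threads; each thread doubles one element of a storage buffer.
         layout(local_size_x = 64) in;

         layout(std430, binding = 0) buffer Data {
             float values[];  // hypothetical data to process
         };

         void main() {
             uint i = gl_GlobalInvocationID.x;
             if (i < uint(values.length())) {
                 values[i] *= 2.0;
             }
         }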
  6. Another similar question. From my understanding, depth = getDeferredDepth(deferred_depth) * s_depth_range.y returns the distance between the camera position and the point (even in ortho), unlike "traditional" depth values, which give z in camera coordinates. How can I reconstruct z in eye coordinates? The method I came up with was to multiply the inverse projection matrix by the point (IN.texcoord_0.x*2.0 - 1.0, 2.0*(1.0 - IN.texcoord_0.y) - 1.0, 0.0, 1.0) (the resulting x and y values don't depend on the depth value). In the result, vertex.x/vertex.w and vertex.y/vertex.w are the x and y coordinates of the point in eye space, so z*z equals depth*depth - x*x - y*y. This is not a very straightforward method. Is there an easier way to do this in UNIGINE?
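     For a perspective projection, the quadratic can be avoided: unproject the pixel once to get a view ray, normalize it, and scale by the distance; the eye-space position, including z, falls out directly. A sketch in generic GLSL (not UNIGINE's shader dialect; the function and parameter names are hypothetical):

         // Reconstruct the eye-space position from the pixel's NDC coordinates and camera distance.
         // inv_proj: inverse projection matrix; ndc in [-1, 1]; dist: distance from the camera (the "depth" above).
         vec3 eye_pos_from_distance(mat4 inv_proj, vec2 ndc, float dist) {
             vec4 p = inv_proj * vec4(ndc, 0.0, 1.0);  // some point on the pixel's view ray
             vec3 ray = normalize(p.xyz / p.w);        // unit direction from the eye through the pixel
             return dist * ray;                        // eye-space position; .z is the eye-space z
         }

     Note this assumes a perspective projection; it does not hold in ortho, where all view rays are parallel.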