
Using Shaders with Screen Space GUI objects



Hi all, 

I removed an earlier post asking for help with WidgetSpriteVideo GUI objects, but I've since been experimenting and found it's not quite what I'm looking for.

I have been following the WidgetSpriteVideo documentation page in C#, but I can't seem to wrap my head around applying shaders to anything other than world-space objects.

I've realized I need to use a screen-space GUI object rather than a world-space one. The application is to render a webcam view over the top of Unigine (in VR, so I will need a separate camera per eye for depth) and use a green screen to mask out areas of the webcam feed so the digital scene comes through (i.e. the user can see their hands in front of them, with Unigine in the background).

My main requirements are: 

1. Bring in the video feed from the camera (in OGV format) and stream it to the GUI.

2. Use a chroma-key/shader to mask off and create transparency in the webcam feed.

3. Do this for two separate cameras and have them render to each eye individually (or, technically, hide each one from its opposing eye).

 

If anyone can point me towards the documentation to assist with this, or other references or libraries and plugins, I would be very appreciative.


Here is a good visual example of what I am trying to achieve in Unigine. We have example code of the Chroma Key Shader working in Unity. Can you please advise how we can achieve similar results in Unigine? Thanks. 


 

 


de-Roo.Lukas

WidgetSpriteVideo will not fit this case, I'm afraid. It supports playback only of *.ogv videos from disk. If you need to get a stream from a camera attached to the PC, you need to find a way to map the video stream to a DirectX 11 / OpenGL texture.

According to your video, some Wizapply software is doing this. There is an SDK available on GitHub: https://github.com/Wizapply that can probably map the video stream from some(?) devices into a texture.

You can also check how texture access is done in our plugins: <SDK>/source/plugins/App/AppOculus / <SDK>/source/plugins/App/AppVive, and in the samples in the SDK Browser: Samples -> C++ API -> Render -> GBufferRead / GBufferWrite.

Once you have the live video stream mapped to a texture, you can assign it to an internal texture inside the engine and then apply shaders to it.

Thanks!
