trey.coursey Posted August 1, 2018

I wasn't sure where to put this; if this isn't the best spot, please move it or let me know. I found a reference from @silent showing how to include video on a GUI object. I'm now trying to reverse engineer what that sample created so I can apply the sprite video texture to an object I've already placed in my world. I think the only thing I'm missing is how to reference the appropriate object that already exists in my world. See below.

This first block I will get rid of, as it creates the object at runtime and my object will already be in the correct place in my world:

    ObjectGui object_gui = addToEditor(new ObjectGui(20.0f,15.0f));
    object_gui.setWorldTransform(Mat4(translate(31.162f,1.029f,8.0f) * rotateY(90.0f) * rotateZ(90.0f)));
    object_gui.setMaterial("gui_base","*");
    object_gui.setSurfaceProperty("surface_base","*");
    object_gui.setMaterialState("mode",1,0);

This is the code I think I need to change, but I'm not sure of the proper way to get the object from my world. The first line gets a reference to the object that was just created above; if I could reference the object that is already in the world, I would use that reference instead:

    Gui gui = object_gui.getGui();
    sprite_video = new WidgetSpriteVideo(gui,"worship.ogv");
    gui.addChild(sprite_video,GUI_ALIGN_EXPAND);

I hope that makes sense. Anyone with any thoughts? Thanks.
fox Posted August 2, 2018

Hi Trey!

Suppose you have a Gui object named "MyGUI" in your world. You can play your video by attaching a WidgetSpriteVideo to the object's gui like this:

    // Declaring a Gui object and a video sprite
    ObjectGui object_gui;
    WidgetSpriteVideo sprite_video;

    int init() {
        // Code here is called on world initialization: initialize resources for your world scene during the world start.
        /*...*/

        // getting your existing ObjectGui by name
        object_gui = node_cast(engine.editor.getNodeByName("MyGUI"));

        // getting your object's gui
        Gui gui = object_gui.getGui();

        // creating a sprite video and adding it to your object's gui
        sprite_video = new WidgetSpriteVideo(gui,"worship.ogv");
        gui.addChild(sprite_video,GUI_ALIGN_EXPAND);

        // disable GPU YUV conversion
        sprite_video.setYUV(0);

        // set looped playback mode
        sprite_video.setLoop(1);

        // play video
        sprite_video.play();

        return 1;
    }

Hope this helps! Thanks!
trey.coursey Posted August 2, 2018

Fox, thank you so much! I think that first line with the node_cast is what I was missing; I wasn't sure how to do that in Unigine. In Unity we do something similar, and I knew we had to tell Unigine which object we wanted to add the video to. I really appreciate the full code. Thanks again.
trey.coursey Posted August 2, 2018

@fox or @silent Is there any way to INSTANCE one screen and have TWO? I have set up the code to put video on two different GUIs, but the system slows down, and more importantly the videos can get out of sync when moving around. I think there is instancing in Unigine, but I don't know how that would work for this kind of case. Thoughts?
fox Posted August 3, 2018

Hi Trey,

Unfortunately, instancing won't work in this case, and WidgetSpriteVideo can become a slowdown if you use it for multiple screens playing video at once.

A more performance-friendly option would be to inherit a new material from mesh_base, assign it to all of your screens, and modify the albedo texture of this material via the API. This approach is fast, works for streamed video as well, and is free of any synchronization problems. It is a bit tricky, though, as you'll have to manually get frame data from the video (stream or file) and put it into the albedo texture via the API. An example demonstrating the concept of working with a material's albedo texture (but using an image rendered from a camera, not video frames from a file or stream) can be found here.
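A minimal sketch of that idea in UnigineScript, assuming a material named "screen_mat" inherited from mesh_base and assigned to all screens. The material lookup and texture-upload method names (findMaterial, findTexture, setImageTextureImage) and the frame-decoding step are assumptions here, not confirmed by this thread; check the sample fox linked and your SDK version's Material API for the exact calls:

    Material screen_material;   // material inherited from mesh_base, shared by all screens
    Image frame;                // CPU-side image holding the current video frame

    int init() {
        // "screen_mat" is a placeholder; use whatever you named your inherited material
        screen_material = engine.materials.findMaterial("screen_mat");
        frame = new Image();
        return 1;
    }

    int update() {
        // 1. Decode the next frame of your video (file or stream) into "frame".
        //    This part is your own code; WidgetSpriteVideo is not used here.

        // 2. Upload the frame into the material's albedo slot.
        //    Method names below are assumptions; verify them against your SDK docs.
        int albedo = screen_material.findTexture("albedo");
        if(albedo != -1) screen_material.setImageTextureImage(albedo, frame);

        return 1;
    }

Because every screen shares the one material, all of them show the same frame on the same tick, which is what keeps them in sync.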
trey.coursey Posted August 3, 2018

That sounds very interesting. When you say STREAM, I'm hoping you mean something from the internet, as in a URL. That would be fantastic; we had posted a question/feature request about that exact thing, wanting to include a LIVE stream on a TV in a virtual world! Do you think this is something you would be able to program yourself? If so, would you consider doing that for pay? Once it's programmed, do you think it would be easy enough to modify for other projects? Not an easy UI plugin type thing, of course, but change a few parameters in the setup of the code and it could work on another project? Let me know your thoughts. Thanks again for the idea.
t.coursey Posted August 29, 2018

@fox or @silent, do either of you know if I can use this technique on a GUI Mesh instead of a GUI object? I'd like to export my SCREEN from my modeling program so it's perfectly placed in my scene, rather than adding a GUI object in Unigine and trying to ALIGN it. I'm also wondering, if I use a GUI Mesh with more than one screen sharing the same UVs, whether updating those WidgetSprites would be any faster. What would I need to change to use a GUI Mesh instead of Gui? Thanks for any thoughts.

Attached is what I'm trying to set up screens for. The center cluster would be difficult since it's really a video wall spread apart.
t.coursey Posted August 29, 2018

OK, so I found that if I use ObjectGuiMesh, have all the screens in one object, and apply the sample code from gui_01, the performance is much improved and all the screens play the same video with simple code.

What could I do to get the EMISSIVE property applied with the same video file as the emission texture? That's what makes the screens look like video.
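For anyone following along, here is a minimal sketch of that ObjectGuiMesh setup. The node name "Screens" and the video file are placeholders; the pattern otherwise mirrors fox's ObjectGui example above, and it assumes ObjectGuiMesh exposes its gui the same way via getGui():

    ObjectGuiMesh screens;
    WidgetSpriteVideo sprite_video;

    int init() {
        // get the existing ObjectGuiMesh (all screens live in one mesh and share its UVs)
        screens = node_cast(engine.editor.getNodeByName("Screens"));

        // the mesh's gui is used the same way as ObjectGui's gui
        Gui gui = screens.getGui();

        sprite_video = new WidgetSpriteVideo(gui,"worship.ogv");
        gui.addChild(sprite_video,GUI_ALIGN_EXPAND);

        sprite_video.setYUV(0);
        sprite_video.setLoop(1);
        sprite_video.play();

        return 1;
    }

Since there is only one gui and one WidgetSpriteVideo, every screen surface mapped to those UVs plays the same video without any extra per-screen cost.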
morbid Posted August 31, 2018

On 8/30/2018, t.coursey said:
    What could I do to get the EMISSIVE property applied with the same video file as the emission texture?

If static light from the screen is enough, you can use a Light Source with a blurred texture that imitates an image on the screen: https://developer.unigine.com/en/docs/2.7.2/objects/lights/proj/ You can adjust the light shape according to your in-scene displays.

Any dynamic screen emission in real time will require coding: you'll have to grab a texture from the ObjectGuiMesh and assign it to a light source.
t.coursey Posted August 31, 2018

That would probably work, thanks for the idea. If we need the dynamic portion later, it shouldn't be too hard to grab that texture in the update section and assign it to the light source. Thanks again.