christian.wolf2 Posted August 20, 2020

Hi, I just want to know if there is a way to show a different camera viewport on the desktop screen while somebody else has a VR device on. We have a customer who wants some kind of supervisor view for a VR training: the supervisor gets a top-down view of the world, with the VR player shown in it. A GUI interface also needs to be added to that viewport, so that the supervisor can trigger events as well. Is that possible with the current engine version, or do we need to build something over the network, where we synchronize states and events between two systems?

Many thanks in advance,
Christian
rohit.gonsalves Posted August 20, 2020

Dear Christian,

I am achieving the same thing as Program and Preview for my Mixer application. I have two virtual cameras rendered through two separate viewports, and I can add as many cameras as I need, until the frame rate drops to zero. One of those viewports is exactly the scenario you need, so technically it is possible, within one and the same engine. You could develop a GUI on top of this; in my case the GUI is an MFC application connected to the engine over the network.

So the main player will show the scene seen by the VR player, and you can add an additional viewport for the top-down camera. For that camera you would set the X/Y position from your VR tracking data, wouldn't you?

One more important thing: I am currently rendering all these views with Viewport->render(camera) into custom textures, and then using custom post-processing to combine them into a single texture for display. In the past I instead used a separate rendering context to show the result on two separate windows.

```cpp
RenderState::saveState();
RenderState::clearStates();
RenderState::flushStates();
m_CameraOutputUnit.texturerender->bindColorTexture(0, texture);
m_CameraOutputUnit.texturerender->enable();
m_CameraOutputUnit.viewport->render(m_CameraOutputUnit.player->getCamera());
m_CameraOutputUnit.texturerender->flush();
m_CameraOutputUnit.texturerender->disable();
m_CameraOutputUnit.texturerender->unbindColorTexture(0);
RenderState::restoreState();
```

Hope this helps as a proof of concept.

Rohit
christian.wolf2 (Author) Posted August 25, 2020

Hi Rohit, thanks for the detailed example. Just to make sure I understand your approach correctly: are you using two different applications, where one receives different viewport renderings than the other (which, for example, runs the VR application and updates the headset)? Because in your pictures above it looks like you are rendering two different viewports in the same application, which would essentially be a kind of split screen. I am a little unsure whether that works together with the output to the VR device.

Best,
Christian
rohit.gonsalves Posted August 25, 2020

Dear @christian.wolf2,

Quote: "Because in your above pictures it looks like you are rendering two different view ports in the same application, which might be the same as some kind of split screen. I am a little bit unsure if that fits with the output to the VR device."

Yes, in the approach above I am rendering a split screen. You can render the same kind of thing for a VR headset: one viewport renders the left and right eyes side by side (or only one of the eyes), and a second viewport renders the top-down view. Then you post-process these textures onto another texture and present it, like in the example above. You should be able to achieve this just by modifying the AppVive plugin. I do wonder about the frame rate, though, as this involves exactly three renders.

It is also possible to render onto two separate window handles from the same application, but in that case you need two separate render contexts. In my opinion you should prototype your idea with the first approach; you can take it to the next level afterwards.

Rohit
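Rohit's earlier remark about setting the top-down camera's X/Y position from the VR tracking data can be sketched in a few lines. This is a hypothetical helper, independent of any engine: the supervisor camera simply follows the tracked head position on the ground plane at a fixed height, looking straight down. The `Vec3` type, the up-axis convention (Z up), and the height parameter are all assumptions for illustration.

```cpp
#include <cassert>

// Minimal vector type standing in for the engine's math library.
struct Vec3 {
    float x, y, z;
};

// Given the tracked head position of the VR player, compute where the
// top-down supervisor camera should sit: directly above the player at
// a fixed height. The camera orientation (looking down -Z) is set
// separately in the engine and is not shown here.
Vec3 top_down_camera_position(const Vec3 &head, float height_above_player) {
    return Vec3{head.x, head.y, head.z + height_above_player};
}
```

Each frame, the supervisor viewport's camera would be moved to `top_down_camera_position(tracked_head, 10.0f)` (or whatever height suits the scene) before that viewport is rendered.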
christian.wolf2 (Author) Posted August 25, 2020

Hi Rohit, now I got it. Unfortunately AppVivePlugin and AppOculusPlugin only provide reflections to the source, so changing the viewports won't be that easy. I think the approach we currently prefer is two different applications that synchronize some variables over the network. Anyway, thanks for your help. Much appreciated.
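For the two-application route, the "synchronize some variables over the network" part boils down to packing a small state struct into bytes on one side and unpacking it on the other. Here is a minimal sketch under that assumption; the struct fields, function names, and transport choice (e.g. UDP, not shown) are illustrative, and a real protocol would also handle endianness and versioning.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical shared state: the VR app sends the player position,
// the supervisor app sends back an event id triggered from its GUI.
struct SyncState {
    float player_x = 0.0f;
    float player_y = 0.0f;
    uint32_t triggered_event = 0;
};

// Serialize to a flat byte buffer. memcpy keeps the sketch simple and
// is valid here because SyncState is trivially copyable.
std::vector<uint8_t> pack(const SyncState &s) {
    std::vector<uint8_t> buf(sizeof(SyncState));
    std::memcpy(buf.data(), &s, sizeof(SyncState));
    return buf;
}

// Deserialize on the receiving side.
SyncState unpack(const std::vector<uint8_t> &buf) {
    assert(buf.size() == sizeof(SyncState));
    SyncState s;
    std::memcpy(&s, buf.data(), sizeof(SyncState));
    return s;
}
```

Each application would call `pack` before sending over its socket and `unpack` on every received datagram, then apply the fields to its local scene.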