
Cloud streaming for VR



Recommended Posts

Thinking about VR applications for our simulator, I would like to know if anyone has experience with cloud streaming services, and whether any of them support Unigine.

Since ever-changing VR headsets also mean ever-changing GPU hardware requirements, the idea is to avoid investing in a new GPU every time if we can simply rent a new headset for demos / end-user testing. It would also give us more flexibility if we are able to integrate more than one user into the virtual world.

Link to comment

Hi Davide, at the moment we have no built-in integration with streaming services. This can change in the future.

Unigine will run on a Linux-based server with dedicated GPUs, but current streaming solutions show noticeable input lag and performance issues.

UPD
You may be interested in this discussion:

Thanks.


Link to comment

Reading a bit here and there, it seems cloud VR is still something for the future.

Switching a bit within the same VR topic: does Unigine support any headset using SteamVR, such as the new Valve Index, or is a specific integration needed? Same question for the VRgineers XTAL, which I read in the news is being integrated but didn't find in the documentation.

Edited by davide445
Link to comment

We support all SteamVR-compatible HMDs out of the box. As for the XTAL: this device also works with SteamVR.

4 hours ago, davide445 said:

Reading a bit here and there, it seems cloud VR is still something for the future.

Could you please share what you've read on the topic?

Thanks.


Link to comment

Speaking with research institutions, users, and a Varjo rep, their experience / ideas about remote VR are not positive.

The Shadow service mentioned in the Google Stadia thread is not available in our country.

Still, I want to dig deeper into the topic, since (even if I'm also exploring the portable hardware alternative, as in the other topic) I find this a game-changing option.

Discussing with Olga at Elecard, there are two key questions to answer: what is the best way to grab the render output for then encoding the video stream (do we just need to access the HDMI output? Use the Unigine API? Use offline rendering?), and how to decode the stream client-side and pass it to the HMD (I will ask Valve).
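
To make the encoding half a bit more concrete, here is a minimal sketch of that step using FFmpeg's libavcodec. This is only an assumption on my side (a software x264 encoder tuned for low latency; in practice a hardware encoder like NVENC would probably be the real choice), and the frame grabbing / colour conversion from the engine is not shown:

// Minimal low-latency H.264 encoder sketch (hypothetical piece of the pipeline).
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

AVCodecContext *open_encoder(int width, int height, int fps)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    ctx->width = width;
    ctx->height = height;
    ctx->time_base = AVRational{1, fps};
    ctx->framerate = AVRational{fps, 1};
    ctx->pix_fmt = AV_PIX_FMT_YUV420P;   // grabbed BGRA frames need a sws_scale conversion first
    ctx->gop_size = fps;                 // one keyframe per second
    av_opt_set(ctx->priv_data, "preset", "ultrafast", 0);   // favour latency over quality
    av_opt_set(ctx->priv_data, "tune", "zerolatency", 0);
    avcodec_open2(ctx, codec, nullptr);
    return ctx;
}

// Push one frame in, pull any finished packets out and hand them to the network layer.
void encode_frame(AVCodecContext *ctx, AVFrame *frame, AVPacket *pkt)
{
    avcodec_send_frame(ctx, frame);
    while (avcodec_receive_packet(ctx, pkt) == 0) {
        // send_packet_to_client(pkt->data, pkt->size);   // hypothetical network call
        av_packet_unref(pkt);
    }
}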

Link to comment

Putting more things together: apart from the frame-grabbing topic, the whole pipeline seems to have latency problems.

Testing both retail ADSL and a 4G+ network, we get a latency of around 30 ms or more. The enterprise-grade optical fiber Internet connection in our office shows a more promising 4 ms latency to nearby servers. 5G network testing in the only city where it is currently deployed shows 20-30 ms latency, even if 1-2 ms is promised for the future.

On top of the network latency we need to add the encoder and decoder. For the decoder I was reading about Parsec, where they declare a 7 ms latency for their client. Not sure how much the encoding will add, but there is also the latency added by the host platform, etc.

This considering the Valve Index is capable of 80, 90, 120, and 144 fps, meaning down to about 7 ms per frame at 4.6 Mpixel resolution (2x 1440x1600). A Varjo will also be challenging, with 8.6 Mpixel every 11 ms at 90 fps.
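
To put these numbers together, a quick back-of-the-envelope calculation (the encode figure is just my guess; the other figures are the ones quoted above):

#include <cstdio>
#include <initializer_list>

int main()
{
    // Per-frame budget at the refresh rates the Valve Index supports.
    for (int fps : {80, 90, 120, 144})
        std::printf("%3d fps -> %.1f ms per frame\n", fps, 1000.0 / fps);

    // Rough one-way pipeline latency, figures taken from this thread or assumed.
    double render_ms  = 1000.0 / 144;   // best case: one frame at 144 fps
    double encode_ms  = 5.0;            // pure guess for a hardware encoder
    double network_ms = 4.0;            // office fibre figure above
    double decode_ms  = 7.0;            // Parsec's published client latency
    std::printf("total ~%.1f ms, before display scan-out and HMD reprojection\n",
                render_ms + encode_ms + network_ms + decode_ms);
}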

A topic I was reading about is Google Stadia's research into a "negative latency" solution, meaning predicting the next frame and sending it along with the current one, so as to avoid waiting for the next input, using the (more than) available bandwidth.

So my question: would it be possible to run multiple copies of the same project on different GPUs, rendering different frames (i.e. the "right" one, plus 1 degree to the left, 1 degree to the right, etc.)? Then I could send them remotely and choose which one to show.

Also, all of this would be easier with something such as Variable Rate Shading (on headsets with eye tracking); being a DX12 feature it surely won't be in the current release, but maybe there is already something similar that I didn't find in the documentation.

Edited by davide445
Link to comment

Hi Davide,

Stadia also has noticeable lag and right now is not suitable for normal use. Surprisingly, GeForce Now has the best performance among the competitors - did you have a chance to take a look at this software?

Regarding rendering the same project on different GPUs - you can, in theory, run two instances of the engine on the same PC on different GPUs, but CPU performance and PCI-E bandwidth will not let you achieve the highest FPS.

Thanks!


Link to comment
  • 2 weeks later...

I'm trying to address the various elements of cloud VR, so today I was discussing with a Vodafone Group-level research team member to better understand latencies and possible solutions.

I will check GeForce Now, but the possibility to render different frames in parallel might be a basic first step.

So, considering CPU and PCIe not to be a problem (with AMD servers you have plenty of cores and PCIe lanes at a reasonable cost), would it be possible to have a single Unigine engine (or multiple synchronized ones) simultaneously rendering different views (the current one plus, let's say, 1° left, 1° right, etc.) on more than 2 GPUs in a VR setting? Since I was reading that multi-GPU rendering is not supported in VR, I was not sure about this.
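
Just to make the "extra views" idea concrete, here is a small sketch of how the candidate camera directions could be generated. This is plain math and not any Unigine API; the 1° step and the number of views are only examples:

#include <cmath>
#include <vector>

// A candidate view: the base camera yawed by a small offset.
struct CandidateView {
    double yaw_offset_deg;   // e.g. -2, -1, 0, +1, +2
    double forward[3];       // resulting view direction
};

// Rotate the base forward vector around the vertical axis (world Z here, as an example)
// by small yaw steps; each candidate would be rendered by its own GPU / engine instance.
std::vector<CandidateView> make_candidates(const double fwd[3], int per_side, double step_deg)
{
    const double kPi = 3.14159265358979323846;
    std::vector<CandidateView> views;
    for (int i = -per_side; i <= per_side; ++i) {
        double a = i * step_deg * kPi / 180.0;
        CandidateView v;
        v.yaw_offset_deg = i * step_deg;
        v.forward[0] = fwd[0] * std::cos(a) - fwd[1] * std::sin(a);
        v.forward[1] = fwd[0] * std::sin(a) + fwd[1] * std::cos(a);
        v.forward[2] = fwd[2];
        views.push_back(v);
    }
    return views;
}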

As before, the idea might be to also render the possible next movements in advance and cache them locally, near the user, for future use.

Edited by davide445
Link to comment
10 hours ago, davide445 said:

would it be possible to have a single Unigine engine (or multiple synchronized ones) simultaneously rendering different views (the current one plus, let's say, 1° left, 1° right, etc.) on more than 2 GPUs in a VR setting?

We have never done this, but in theory it is possible. DirectX allows you to choose which GPU the application will use. Then you'll have to somehow synchronize the rendered images for each eye. It could be a challenging task that requires a lot of research. For VR we use the industry standard, SteamVR, and we haven't planned to develop other custom solutions for VR rendering.
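
For reference, a minimal sketch (plain D3D11/DXGI, outside of Unigine) of how an application can pick a specific adapter; error handling is simplified and you would link against d3d11.lib and dxgi.lib:

#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a D3D11 device on the adapter with the given index (0 = first GPU).
ComPtr<ID3D11Device> create_device_on_adapter(UINT adapter_index)
{
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(adapter_index, &adapter)))
        return nullptr;   // no such adapter

    ComPtr<ID3D11Device> device;
    D3D11CreateDevice(adapter.Get(),
                      D3D_DRIVER_TYPE_UNKNOWN,   // required when an explicit adapter is passed
                      nullptr, 0, nullptr, 0,
                      D3D11_SDK_VERSION,
                      &device, nullptr, nullptr);
    return device;
}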

Thanks.



Link to comment
  • 2 weeks later...

Proceeding on the topic: let's say we want to render on 6 different GPUs in one server. I'm not sure whether multiple Unigine instances or a single instance is the right solution, but I want to have a basic idea of what kind of hardware we will need.

My idea was a 24-core / 48-thread CPU, 128 GB RAM, 6x RTX 2070 SUPER, 2x 2TB NVMe SSD, 2x 4TB SATA HDD.

The server needs to render all the VR frames with Unigine at a Vive Pro-like resolution as a headless solution (a GPU-intensive task) and encode the resulting frames (CPU-intensive) for remote transmission.

Does such a configuration make sense, or might it be overkill?

Edited by davide445
Link to comment

Hi Davide,

This setup looks powerful, but I can't say whether it will be enough or not, because we're talking about a non-existent solution. I mean that to make decisions and forecasts you need a prototype that will show the resulting performance of the whole system.

As Andrew mentioned, there could be a big limitation with PCI-E bandwidth.

Thanks.


Link to comment

Hi morbid

An hour ago I talked with a specialized HPC system designer who provides solutions to CERN, among others; they have a lab where we can test different configurations to figure out what will be needed.

Link to comment
  • 2 weeks later...

Just noticed the Syncker plugin 

https://developer.unigine.com/en/docs/2.10/code/cpp/plugins/syncker/

Would this enable us to render different views? It would be similar to having a very high-res screen and rendering different directions starting from the central view, if this reasoning is correct.

If this makes sense, would it be possible to run the different instances on the same server? I suppose we can set up a virtual IP for each of them if the plugin requires it.

Another option would be to integrate Nvidia VRWorks, AMD LiquidVR, or the OpenGL multiview extension, but I suppose it will be easier to use an already integrated tool such as the Syncker plugin.

Edited by davide445
Link to comment

Hi Davide,

Syncker will introduce some additional latency between the input and the final image in the VR headset (which will already be pretty large due to the cloud rendering), so I don't think this is a good approach.

VRWorks and LiquidVR require crucial changes to how the engine renders the final frame. The engine already has some built-in optimizations, like single-pass shadows in stereo, that allow us to achieve a decent framerate in complex scenes. In the 2.11 update we will also improve the CPU rendering side to get more room for the GPU.


Link to comment

It depends on the number of objects that you need to synchronize. Right now I don't have the exact numbers for Be-200, but depending on the network conditions it can vary between 1 ms and 10 ms (though there are not that many dynamic objects).

Another drawback of using Syncker: you need to write additional logic for each object that may be modified at run time. You will no longer be able to simply add a particle system and expect it to work (as it does right now for VR, right out of the box), because you need to synchronize every particle's position, rotation, and scale on the other instance. It complicates the development process.


Link to comment

Also in our case (power plant sim) there will be few dynamic objects. 

And I suppose it will always be better, in terms of effort and risk, than basically rebuilding the rendering pipeline.

Is there any demo project using Syncker, in case we want to do a quick test on the HPC server?

Edited by davide445
Link to comment

The HPC server we can test on will probably have only enterprise-class GPUs; we will get no benefit in Unigine, but would there be any problem? E.g. with a P6000.

Edited by davide445
Link to comment

We are just testing, not purchasing, so it's not my problem :) I just wanted to know whether using an enterprise GPU would penalize performance, so as to have a valid reference if we later use the "right" one.

Link to comment

A very passionate discussion, but all these theoretical VR cloud streaming ideas will be, at least for some time, blown away by reality (latency, jitter, ...). Even highly stable multi-channel IG rendering within a LAN (master + n render client PCs, each with its own high-end GPU) is still tricky in practice.

Link to comment
18 minutes ago, ulf.schroeter said:

A very passionate discussion, but all these theoretical VR cloud streaming ideas will be, at least for some time, blown away by reality (latency, jitter, ...). Even highly stable multi-channel IG rendering within a LAN (master + n render client PCs, each with its own high-end GPU) is still tricky in practice.

Personally, I have doubts about the use of streaming overall. For some kinds of games it is usable, but for anything with high FPS and fast response (FPS shooters) you are always limited by physics (the speed of light); you simply can't get under a certain latency no matter what... You need 20 FPS? Probably doable. You need 100+ with ultra-low response times so you don't vomit (VR)? Probably never...
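
Just as a rough illustration of that physical floor (assuming ~200,000 km/s for light in optical fibre, ignoring routing, encoding, and rendering entirely):

#include <cstdio>
#include <initializer_list>

int main()
{
    const double fibre_km_per_ms = 200.0;   // ~200 km per millisecond in fibre
    for (double km : {50.0, 300.0, 1000.0})
        std::printf("%6.0f km round trip -> %.1f ms minimum\n",
                    km, 2.0 * km / fibre_km_per_ms);
}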

Edited by demostenes
Link to comment