vacious Posted September 23, 2019
Are there plans to support OpenXR in Unigine?
silent Posted September 24, 2019
vacious OpenXR looks promising, however not all HMDs support it yet (Varjo, for example). In that case we would still need to maintain a native plugin alongside an OpenXR implementation, which is quite awkward. As soon as more HMDs support OpenXR out of the box (including Windows / Linux integrations) we will certainly look into it. Most likely that would happen next year (we haven't discussed a timeframe for OpenXR integration yet, so I can't give you an exact ETA right now, sorry). Is there any specific feature in OpenXR that you are looking for? Thanks!
How to submit a good bug report
---
FTP server for test scenes and user uploads: ftp://files.unigine.com (user: upload, password: 6xYkd6vLYWjpW6SN)
vacious Posted September 25, 2019 (Author)
Thanks for the info. We need to support the Oculus Rift S and Vive Cosmos in our application. From what I understand, Unigine's support for both Oculus and HTC Vive targets the previous generation of devices, which used base stations; it seems (please correct me if I'm wrong) that the current generation of devices, which use inside-out tracking, is not supported. OpenXR seemed like a shortcut to add support for these devices.
silent Posted September 25, 2019
Right now we don't have such devices on our test farm, so I can't say that everything will work smoothly with the current SDK version. Also, I can't see any significant differences in the new OpenVR and Oculus APIs, so I assume they should work with the current AppVive and AppOculus implementations. The only minor issue that may occur is the new controller mappings, but those can easily be added on your side by modifying the AppVive sources. Do you have these devices on hand right now? It would be nice to see whether they work or not.
vacious Posted September 25, 2019 (Author)
Good to hear that the differences in the new APIs don't appear significant! We should be getting the Rift S next week and the Vive Cosmos in 2-3 weeks' time. I'll post our findings here.
vacious Posted October 1, 2019 (Author)
I can confirm that head-tracking (rotation & movement) works out of the box with the Rift S and Unigine (2.5). We've yet to try the controllers; I'll update in a couple of weeks.
silent Posted October 1, 2019
That's good to hear :)
davide445 Posted December 19, 2019
I'm going to be using a Vive Cosmos for a while. Will this work as well?
morbid Posted December 20, 2019
The Vive Cosmos supports SteamVR, so most likely Unigine will work with this HMD; however, we have never tested it with the Cosmos.
Lales.Charles Posted November 9, 2020
Hi, Any follow-up on the OpenXR integration? I can see that there is some progress on the Microsoft side: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/development?tabs=native
Kind regards, Charles
silent Posted November 11, 2020
Hi Charles, OpenXR integration is not yet planned; we need to do some additional research first. I can see that more HMD vendors are starting to support this API, so it's definitely a good sign :)
Lales.Charles Posted November 11, 2020
Ok, good. That could maybe bring AR on board ;-) https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/openxr
Regards, Charles
sebastian.vesenmayer Posted October 14, 2021
Hello silent, are there any updates on OpenXR support? :) Thanks.
silent Posted October 14, 2021
We are tracking news regarding OpenXR updates, and it looks like next year is the best time to start our initial OpenXR integration. A drawback for Linux would be a raised minimum spec (Debian 10+ because of the Monado runtime). We also need to decide what to do with the existing SteamVR (AppVive) integration. We have some doubts about the Varjo code (it is very custom); right now we are not sure everything will work fine with OpenXR. We also have plans for a Custom App refactoring and unification that would make it possible to run VR without any plugins (this would allow using VR in custom apps such as the Editor). OpenXR looks promising right now; I hope it doesn't end up as something like OpenGL :)
christian.wolf2 Posted October 15, 2021
18 hours ago, silent said:
"We are tracking news regarding OpenXR updates, and it looks like next year is the best time to start our initial OpenXR integration."
Does this also mean that we might get support for AR devices such as the HoloLens 2? AR integration into UNIGINE would be a huge step, because those products are the only reason we need to stick with Unity3D.
silent Posted October 15, 2021
That's highly unlikely. I can only see a way to stream video (you would render the picture on a PC and send the final result to the HMD via some protocol) with all its negative costs: latency, quality, and so on.
christian.wolf2 Posted October 15, 2021
Thanks for the info, silent. Unfortunately we need full control of and interaction with the HoloLens 2, and our clients work in areas with sparse internet connectivity, so even a local PC is not an option. Could you please elaborate a little more on why a HoloLens 2 integration wouldn't be possible for the engine team? Could such a thing be implemented on our own in some way (e.g. via CustomApp)?
Best, Christian
silent Posted October 15, 2021
The HoloLens 2 is basically a mobile ARM-based device (SoC: Qualcomm Snapdragon 850), so there is not enough computational power for high-quality rendering (only very simple mobile shaders can be used there). Since we don't have mobile support in UNIGINE 2, it would be quite hard to make the engine work on this platform.
christian.wolf2 Posted October 15, 2021
Okay, I already thought so; thanks for the confirmation. Then we will see how we proceed further with that.
sebastian.vesenmayer Posted October 18, 2021 (edited)
We are trying to integrate OpenXR right now as a prototype.
silent Posted October 19, 2021
That's interesting :) Do you maybe have intermediate results / feedback regarding the current state of this API?
sebastian.vesenmayer Posted October 19, 2021
I used the guide and example from Microsoft to integrate OpenXR into our application with Unigine: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/openxr-getting-started
It is still under development, but basically you get the estimated display times, eye poses, FOVs and textures for the frame, and you render the image for each eye into its swapchain. This will then work with any HMD which supports the extensions you have defined.
sebastian.vesenmayer Posted October 19, 2021 (edited)
We are using two WidgetSpriteViewport widgets to render both eyes, but we are getting a lot of jitter. I am not sure if this is due to depth reprojection in OpenXR. Is the depth buffer also written to the render target view with WidgetSpriteViewport? Or maybe it is the one frame of latency: Unigine presents the last rendered frame, if I'm right. Would it be better to render to a texture to get around this? Is the depth buffer available for copying in any way? Or maybe it is the device itself...
silent Posted October 19, 2021
Are you using 2.14 right now? It looks like there is some issue with WidgetSpriteViewport rendering at the moment. In the meantime you can write your own implementation based on WidgetSprite and Viewport:

WidgetSpritePtr w_sprite;
TexturePtr texture;
ViewportPtr viewport;

void begin_render();

int init()
{
    w_sprite = WidgetSprite::create(Gui::get());
    Gui::get()->addChild(w_sprite, Gui::ALIGN_FIXED | Gui::ALIGN_OVERLAP);

    texture = Texture::create();
    texture->create2D(width, height, Unigine::Texture::FORMAT_RG11B10F /* just an example */,
        Texture::FILTER_LINEAR | Texture::ANISOTROPY_16 | Texture::USAGE_RENDER);

    viewport = Viewport::create();
    viewport->setSkipFlags(Viewport::SKIP_VELOCITY_BUFFER);
    viewport->setRenderMode(Viewport::RENDER_DEPTH_GBUFFER_FINAL);

    Engine::get()->addCallback(Engine::CALLBACK_BEGIN_RENDER, MakeCallback(&begin_render));
    return 1;
}

void begin_render()
{
    viewport->renderTexture2D(camera_for_viewport, texture);
    w_sprite->setRender(texture, !Render::isFlipped());
}

void shutdown()
{
    viewport.deleteLater();
    texture.deleteLater();
    w_sprite.deleteLater();
}

In init() you create a regular sprite, texture and viewport, and in the begin_render() callback you render the viewport into the texture and set it on the sprite. Hope that helps :)
Thanks!
sebastian.vesenmayer Posted October 20, 2021
Ok, we disabled XR_KHR_composition_layer_depth in OpenXR, and now we get a smooth picture. It works, but reprojection would be better with the depth buffer. Is it possible to get the depth buffer texture from the WidgetSpriteViewport somehow?