
OpenXR




vacious

OpenXR looks promising; however, not all HMDs support it yet (Varjo, for example). In that case we would still need to ship a native plugin alongside an OpenXR implementation, which is quite odd.

As soon as more HMDs support OpenXR out of the box (including Windows / Linux integrations), we will certainly take a look at it. Most likely that will happen next year (we haven't discussed a timeframe for OpenXR integration yet, so I can't give you an exact ETA right now, sorry).

Is there any specific feature in OpenXR that you are looking for?

Thanks!

How to submit a good bug report
---
FTP server for test scenes and user uploads:


Thanks for the info.

We need to support the Rift S & Vive Cosmos in our application. From what I understand, UNIGINE support for both Oculus & HTC Vive covers the previous-generation devices which used base stations; it seems (please correct me if I'm wrong) that the current-generation devices which use inside-out tracking are not supported. OpenXR seemed like a shortcut to add support for these devices.

 


Right now we don't have such devices on our test farm, so I can't guarantee that everything will work smoothly with the current SDK version.

Also, I can't see any significant difference between the new OpenVR and Oculus APIs, so I assume they should work with the current AppVive and AppOculus implementations. The only minor issue that may occur is the new controller mappings, but those can easily be added on your side by modifying the AppVive sources.

Do you have these devices on hand right now? It would be nice to see whether they work or not.



Good to hear that the differences with the new APIs don't appear significant! We should be getting the Rift S next week and the Vive Cosmos in 2-3 weeks' time - I'll post our findings here.

 


I can confirm that head tracking (rotation & movement) works out of the box with the Rift S & UNIGINE (2.5). We've yet to try the controllers - will update in a couple of weeks.

  • 2 months later...
  • 10 months later...
  • 11 months later...

We are tracking news regarding OpenXR updates, and it looks like next year is the best time to start our initial OpenXR integration. A drawback on Linux would be a raised minimum spec (Debian 10+, because of the Monado runtime). We also need to decide what to do with the existing SteamVR (AppVive) integration.

We also have some doubts about the Varjo code (it is very custom); right now we are not sure everything will work fine with OpenXR.

We also have plans for a Custom App refactoring and unification that would make it possible to run VR without any plugins (this will allow using VR in custom apps such as the Editor).

OpenXR looks promising right now; let's hope it doesn't end up like OpenGL :)


18 hours ago, silent said:

We are tracking news regarding OpenXR updates and it looks like the next year is the best time for starting our initial OpenXR integration.

 

Does this also mean that we might get support for AR devices such as HoloLens 2? AR integration into UNIGINE would be a huge step, because those products are the only reason we need to stick with Unity3D.


Thanks for the info silent. Unfortunately, we need full control of and interaction with the HoloLens 2, and our clients work in areas with sparse internet connectivity, so even a local PC is not an option. Could you please elaborate a little more on why a HoloLens 2 integration wouldn't be possible for the engine team? Could such a thing be implemented on our own in some way (like a CustomApp)?

 

Best

Christian


HoloLens 2 is basically a mobile ARM-based device (SoC: Qualcomm Snapdragon 850), so there is not enough computational power for high-quality rendering (only very simple mobile-grade shaders can be used there).

Since we don't have mobile support in UNIGINE 2, it would be quite hard to make the engine work on this platform.




I used the guide and example from Microsoft to integrate OpenXR into our application with UNIGINE.

https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/openxr-getting-started

It is still under development. But basically you get the predicted display time, eye positions, FOVs, and swapchain textures for the frame, and render the image for each eye into its swapchain.

This will then work with any HMD which supports the extensions you have defined.
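To illustrate the per-eye part: OpenXR reports each view's asymmetric field of view as four half-angles in radians (angleLeft/angleRight/angleUp/angleDown in XrFovf), and the per-eye projection matrix is built from their tangents. Below is a minimal, self-contained sketch of that math; the Fov struct is a stand-in for the real XrFovf, and the construction mirrors the XrMatrix4x4f_CreateProjectionFov helper from the OpenXR SDK's xr_linear.h (OpenGL-style clip depth assumed).

```cpp
#include <array>
#include <cmath>

// Stand-in for OpenXR's XrFovf: asymmetric half-angles in radians
// (angleLeft and angleDown are typically negative).
struct Fov { float angleLeft, angleRight, angleUp, angleDown; };

// Column-major 4x4 projection matrix from tangent-space FOV bounds,
// following the same construction as the OpenXR SDK's xr_linear.h.
std::array<float, 16> projectionFromFov(const Fov &fov, float nearZ, float farZ)
{
    const float l = std::tan(fov.angleLeft);
    const float r = std::tan(fov.angleRight);
    const float d = std::tan(fov.angleDown);
    const float u = std::tan(fov.angleUp);

    std::array<float, 16> m{};      // zero-initialized
    m[0]  = 2.0f / (r - l);         // x scale
    m[5]  = 2.0f / (u - d);         // y scale
    m[8]  = (r + l) / (r - l);      // x offset (asymmetric frustum)
    m[9]  = (u + d) / (u - d);      // y offset
    m[10] = -(farZ + nearZ) / (farZ - nearZ);        // depth mapping
    m[14] = -(2.0f * farZ * nearZ) / (farZ - nearZ);
    m[11] = -1.0f;                  // perspective divide
    return m;
}
```

xrLocateViews fills in these angles per eye each frame; feeding the resulting matrix to the renderer for each swapchain image is what produces the correctly asymmetric stereo projection.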


We are using two WidgetSpriteViewports to render both eyes, but we are getting a lot of jitter.

I am not sure if this is due to depth reprojection in OpenXR. Is the depth buffer also written to the render target view with WidgetSpriteViewport?

Or maybe it is the one frame of latency; UNIGINE presents the last rendered frame, if I am right.

Would it be better to render to a texture to get around this?

Is the depth buffer available for copying in any way?

And maybe it is the device itself...

Edited by sebastian.vesenmayer

Are you using 2.14 right now?

Looks like there is some issue with WidgetSpriteViewport rendering at the moment. In the meantime, you can write your own implementation based on WidgetSprite and Viewport:

WidgetSpritePtr w_sprite;
TexturePtr texture;
ViewportPtr viewport;


int init()
{
	w_sprite = WidgetSprite::create(Gui::get());
	Gui::get()->addChild(w_sprite, Gui::ALIGN_FIXED | Gui::ALIGN_OVERLAP);

	texture = Texture::create();
	texture->create2D(width, height, Unigine::Texture::FORMAT_RG11B10F /* just an example */, Texture::FILTER_LINEAR | Texture::ANISOTROPY_16 | Texture::USAGE_RENDER);

	viewport = Viewport::create();
	viewport->setSkipFlags(Viewport::SKIP_VELOCITY_BUFFER);
	viewport->setRenderMode(Viewport::RENDER_DEPTH_GBUFFER_FINAL);

	Engine::get()->addCallback(Engine::CALLBACK_BEGIN_RENDER, MakeCallback(&begin_render));

	return 1;
}

void begin_render()
{
	viewport->renderTexture2D(camera_for_viewport, texture);
	w_sprite->setRender(texture, !Render::isFlipped());
}

void shutdown()
{
	viewport.deleteLater();
	texture.deleteLater();
	w_sprite.deleteLater();
}

In init() you create a regular sprite, a texture, and a viewport; in the begin_render() callback you render the actual viewport into the texture and assign it to the sprite.

Hope that helps :)

Thanks!



OK, we disabled XR_KHR_composition_layer_depth in OpenXR; now we get a smooth picture.
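For readers hitting the same jitter: which composition-layer features the runtime sees is decided at instance creation, via the extension names passed in XrInstanceCreateInfo::enabledExtensionNames, so disabling depth submission starts with simply not requesting XR_KHR_composition_layer_depth (and not chaining XrCompositionLayerDepthInfoKHR to the projection views). A small self-contained sketch of filtering the requested-extension list; the actual xrCreateInstance call is only indicated in comments:

```cpp
#include <algorithm>
#include <cstring>
#include <vector>

// Drop one extension from the list we would pass to
// XrInstanceCreateInfo::enabledExtensionNames; all others stay requested.
std::vector<const char *> withoutExtension(std::vector<const char *> exts,
                                           const char *name)
{
    exts.erase(std::remove_if(exts.begin(), exts.end(),
                              [name](const char *e) { return std::strcmp(e, name) == 0; }),
               exts.end());
    return exts;
}

// Usage (instance creation itself omitted):
//   auto exts = withoutExtension(
//       {"XR_KHR_D3D11_enable", "XR_KHR_composition_layer_depth"},
//       "XR_KHR_composition_layer_depth");
//   XrInstanceCreateInfo info{XR_TYPE_INSTANCE_CREATE_INFO};
//   info.enabledExtensionCount = (uint32_t)exts.size();
//   info.enabledExtensionNames = exts.data();
//   xrCreateInstance(&info, &instance);
```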

It works, but reprojection would be better with the depth buffer.

Is it possible to get the depth buffer texture from the WidgetSpriteViewport somehow?

 
