AntoineAB Posted June 19, 2024

Hello,

I recently downloaded the VR Demo sample in 2.18.1 and copied it as a C++ project. I have two questions regarding the sample:

1 - In the code there is a file named VRPlayerVive.h. This header includes AppViveProxy.h and EyetrackingProxy.h, which are not in the project. What is that header used for? (It doesn't seem to be compiled with the project.)

2 - In the editor, the world vr_sample.world contains the vr_node, but that vr_node has no reference to vive_pad.node. How are the Vive pad models put into the scene when I use it in VR with my Vive controller? (For example, when I press Run in the editor for the vr_sample project.)

Regards.
bmyagkov Posted June 20, 2024

Hello!

17 hours ago, AntoineAB said:
    1 - In the code there is a file named VRPlayerVive.h. This header includes AppViveProxy.h and EyetrackingProxy.h, which are not in the project. What is that header used for?

These headers are leftovers from the previous version of the VR template. With the transition to the engine's built-in VR capabilities, these proxies are no longer used; all the necessary functionality is now available through the engine's API. You can safely ignore these files. They will be completely removed in the upcoming 2.19 update this summer.

17 hours ago, AntoineAB said:
    2 - In the editor, the world vr_sample.world contains the vr_node, but that vr_node has no reference to vive_pad.node. How are the Vive pad models put into the scene when I use it in VR with my Vive controller?

Starting from SDK 2.18, the VR system is integrated into the engine and fetches the controller models directly from OpenVR. Take a look at the InputVRDevice class and its subclass InputVRController. The models are retrieved using the InputVRDevice::getModelMesh method. For more details, you can refer to the documentation here: https://developer.unigine.com/en/docs/2.18.1/api/library/controls/class.inputvrdevice?rlang=cpp#getModelMesh_int_Mesh

Thanks!
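For illustration, a hedged sketch of how that method might be used to turn a device's render model into a visible object. The method name getModelMesh(int, Mesh) comes from the documentation linked above; everything else (how the InputVRDevicePtr is obtained, the meaning of index 0, the success check) is an assumption to verify against your SDK version:

```cpp
#include <UnigineInput.h>
#include <UnigineMesh.h>
#include <UnigineObjects.h>

using namespace Unigine;

// Fetch the render model of a VR input device and wrap it in a dynamic mesh
// object. `device` is assumed to come from the engine's Input system.
ObjectMeshDynamicPtr create_device_model(const InputVRDevicePtr &device)
{
	MeshPtr mesh = Mesh::create();
	device->getModelMesh(0, mesh); // index 0 = whole model (assumption)
	if (mesh->getNumSurfaces() == 0)
		return ObjectMeshDynamicPtr(); // model not available (yet)
	return ObjectMeshDynamic::create(mesh);
}
```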
AntoineAB Posted June 27, 2024

Hello,

Thank you for the information, it was helpful. I have two other questions regarding mixed reality:

- In the vr_sample project, when I enable mixed reality, then enable Blend Masking and set the blend masking mode to "Restrict Video to Mask", I see a cube behind the wooden desk that shows the "real world", but I can't find that cube in the editor. Do you know where it is (or where the node is created) and how I can disable it? (See screenshots for reference.)

- I want to get the same behavior as that cube on other meshes. I understand I need a special material for Varjo to recognize some meshes as "masks", but how do I configure that material? Is there an example in the vr_sample? (I saw in this forum post that I should create some kind of "varjo_mesh_mask"?)

Regards.
bmyagkov Posted June 28, 2024

Hello!

17 hours ago, AntoineAB said:
    - In the vr_sample project, when I enable mixed reality, then enable Blend Masking and set the blend masking mode to "Restrict Video to Mask", I see a cube behind the wooden desk that shows the "real world", but I can't find that cube in the editor. Do you know where it is (or where the node is created) and how I can disable it?

This specific object is created from the code in the MixedRealityMenuGui.cpp component:

    // box for blend masking
    {
        ObjectMeshDynamicPtr mask_mesh = Unigine::Primitives::createBox(vec3(1.0f, 1.0f, 1.0f));
        mask_mesh->setWorldPosition(Math::Vec3(0.0f, 3.0f, 0.0f));

        MaterialPtr mat = Materials::findManualMaterial("Unigine::mixed_reality_mesh_blend");
        if (mat)
        {
            mat = mat->inherit();
            mat->setParameterFloat4("mask_color", vec4(0.7f, 0.9f, 0.0f, 1.0f));
            mask_mesh->setMaterial(mat, 0);
        }
    }

The material "mixed_reality_mesh.basemat" is set through the code and can also be found in the Asset Browser, in the core -> materials -> base -> vr -> objects -> mesh folder. So, basically, there should be no issue creating such an object directly in the editor.

Thanks!
AntoineAB Posted July 5, 2024

Hello!

I have another question regarding the controllers. In SDK 2.16 I had a vr_layer with the controller nodes in it, so I could modify it to add a rigid body and play with collisions. But now in 2.18.1 the controller models are retrieved from OpenVR, as you said:

On 6/20/2024 at 9:52 AM, bmyagkov said:
    The models are retrieved using the InputVRDevice::getModelMesh method.

So how could I add a rigid body to the controllers to make them collide with other meshes? Should I use the old way and create my own controllers instead of using the mesh from OpenVR? Or is there a way to add a BodyRigid of the right shape to the controllers retrieved from OpenVR?

Regards.
silent Posted July 5, 2024

Hi Antoine,

Both ways are possible. For simplicity, I would recommend sticking with the previous approach from version 2.16 and using the old meshes and physics setup that worked for you. For that, you would need to modify VRPlayerVR::controller_update() for your needs and replace the getCombinedModelMesh() calls with the code you were using before.

Thanks!

How to submit a good bug report
---
FTP server for test scenes and user uploads: ftp://files.unigine.com user: upload password: 6xYkd6vLYWjpW6SN
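A hedged sketch of that 2.16-style setup: use your own controller mesh and give it physics so it can collide with other bodies. The class names (ObjectMeshStatic, BodyRigid, ShapeBox) are real UNIGINE API, but the exact creation signatures, setter names, and the asset path are assumptions to check against your SDK version:

```cpp
#include <UnigineObjects.h>
#include <UniginePhysics.h>

using namespace Unigine;

// Create a controller object backed by your own mesh asset, with a rigid
// body and an approximate box collision shape.
ObjectMeshStaticPtr create_physical_controller()
{
	ObjectMeshStaticPtr controller = ObjectMeshStatic::create("vr/vive_pad.mesh"); // hypothetical asset path
	BodyRigidPtr body = BodyRigid::create(controller);
	body->setGravity(false); // the pose comes from tracking, not from physics (setter name is an assumption)
	ShapeBoxPtr shape = ShapeBox::create(body, Math::vec3(0.1f, 0.1f, 0.25f)); // rough controller-sized box
	// In VRPlayerVR::controller_update(), drive the node from the tracked
	// transform each frame, e.g. controller->setWorldTransform(tracked_transform);
	return controller;
}
```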
AntoineAB Posted July 8, 2024

Hello,

Thank you for your recommendation, it helped a lot.

On another topic, I've noticed that when using Mixed Reality with the Depth Test checkbox enabled, there is a mismatch between how a virtual object intersects another virtual object and how a real object intersects a virtual object. The real object needs to be placed further away than the virtual object to "collide" with a virtual element. I made a video to better explain the situation: you can see that the virtual controller goes through the virtual desk before my real hand (right hand) does, even though my hands and the controller are at the same height (on the same plane). MR_compare_occlusion.mp4

When using Mixed Reality, I can feel that there is an offset between real and virtual objects. Do you know if this can be changed? Are there functions I can enable to avoid this issue? I suppose this offset is linked to the offset between the eye position and the camera position on the Varjo XR-3. Varjo has a feature called "Eye Reprojection" that could maybe solve the problem (see: https://developer.varjo.com/docs/native/adjusting-camera-settings#eye-reprojection), but I didn't manage to find that camera property in the VRMixedReality class.

Regards.
sweetluna Posted July 10, 2024

Hi AntoineAB,

There is a VRMixedReality::setViewOffset function that may correct this offset. You can refer to this page from Varjo. Hope this helps.

May RenderDoc/Nsight Graphics/Intel GPA bless you
AntoineAB Posted July 16, 2024

Hello,

Thank you for the help. This function reduces the offset by "bringing" the virtual content closer to the user's eye, but there is still an offset between real and virtual when using depth test. The issue this brings is a feeling that the scale of the 3D models is wrong when using depth test and trying to reach virtual parts with our real hands (they seem to be further away than they look). Maybe I missed a configuration on my headset. Have you experienced this issue when using the VR demo in XR with Unigine SDK 2.18.1? Or do you know any other functions that could help fix it?

Best regards.
silent Posted July 18, 2024

On our test stand there is almost no offset between the real controller and the virtual model with the following console command: vr_mixed_reality_view_offset 1

However, due to how this technology works, there will always be a slight difference between the real and virtual controller positions. Try to power off the HMD, put it in front of you, and recalibrate in SteamVR. Maybe that will improve the situation.

Thanks!
AntoineAB Posted July 19, 2024

Hello,

Yes, on that point there is no offset between real and virtual (when setting the view offset to 1)! But my issue is not there. When using Depth Test, you see your real hands mixed with the virtual content. But when you want to "touch" some of the virtual content with your hands, the intersection happens further away than it looks.

For example, I want to put my real hand below the desk in the vr_sample. I have to place it well below where the virtual desk appears to be before my hand disappears below it. For comparison, if I take a virtual object with a controller (for example a cube) and place it below the desk, the cube disappears at the right position.

Here is another video showing my issue: MixedRealityDepthTestOffset.mp4

You can see in the video that my real hand is hidden by the desk much lower than the virtual controller is. The view offset is at 1, so the real-life model and the virtual model match perfectly (you can see it on the controller). Do you also get that offset when doing the same test?

Best regards.

Edited July 19, 2024 by AntoineAB
silent Posted July 19, 2024

Well, I think that's what we get with the current depth sensor. Don't expect too much from it :)

We will try to understand what's going on here next week, but the chances that everything will work as you might expect are not very high. IMHO the only really useful MR mode is when you are seated inside an actual vehicle and the outside view is generated in the HMD by the 3D engine (referred to as Masking in Varjo). In that case, you don't need to struggle with low-accuracy depth, and you can interact with real physical elements inside the vehicle without any hassle.

Thanks!
AntoineAB Posted July 22, 2024

Hello,

Thank you for the quick response. I have tried the same experiment inside a Unity project with the Varjo XR-3 and the same Varjo parameters, and there was no offset like the one I showed in the video. The only parameter that was different was the EyeReprojection camera property mode (https://developer.varjo.com/docs/native/adjusting-camera-settings#eye-reprojection), which I didn't find in the Unigine::VRMixedReality class.

And yes, you are right, seeing a real vehicle with a virtual outside environment is a nice use case for Mixed Reality! But I also want to do the opposite (a virtual vehicle inside a real environment).

Best regards.
silent Posted July 22, 2024

Thanks for the update. We will see what we can do to improve this behavior. At the moment, all the developers are focused on the 2.19 update. Shortly after that, I believe we can spend more time investigating this Varjo XR-3 behavior. Could you please tell us how urgent this task is, or provide a deadline?
AntoineAB Posted July 22, 2024

Thank you. Don't worry, this is not an urgent task; it can wait a few months.

Best regards.
silent Posted August 23, 2024

@AntoineAB We've made some adjustments for Varjo in 2.19 and added the missing VST Reprojection configuration. So far, the best results we can achieve are with vr_mixed_reality_camera_vst_reprojection_mode set to 0 in the console: varjo_capture_2024-08-10_18-51-35-224.mp4

Could you please check whether you can now achieve better results with this mixed reality mode in the 2.19 release?

Thanks!