
Projected Texture for Mixed Reality




Hello There,

I am working on a mixed reality implementation. I have a tracking device that tracks the position and orientation of a real camera, and the real camera feed comes into the PC, where I mix it with the virtual render using a matching FOV. I first implemented a layer-based approach in which the AR objects from Unigine are rendered on top of the real feed in post-processing. There is one issue: I don't get the shadows of the virtual objects, or SSR, on the real feed. For this approach I have implemented a projected texture material. Please find it attached.

The mesh_projected_texture_base is a replica of mesh_base with a few modifications for projecting a texture from the current camera position with the same camera projection matrix. The actual changes are in fragment.h (line 406) and emission.shader only; the rest are just path changes for this material. The idea is that I will create a cube big enough to match the dimensions of my tracking area, or bigger, and place my world inside this cube. When my tracking camera moves it will never leave this cuboid, so I will always see the full cuboid rendered, and I project the incoming input feed onto it. Later I will implement an Object Mesh Dynamic to hold the cube so I can resize the mesh at runtime. With this approach I will always see a full-screen, camera-aligned input video, and in between, virtual objects will cast deferred shadows and other effects will also be possible. A rough sketch of the projection idea is below.
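In rough terms, the projection is standard projective texturing: take the clip-space position of the cube as seen by the tracked camera and remap it to texture space. A simplified HLSL-style sketch of the idea (camera_viewproj, vertex_world_pos, video_feed and video_sampler are placeholder names, not the exact identifiers from the attached material):

// Vertex stage: clip-space position of the cube vertex as seen by the tracked (real) camera.
float4 position = mul(camera_viewproj, float4(vertex_world_pos, 1.0f));

// Fragment stage: perspective divide and remap from [-1, 1] NDC to [0, 1] UV space.
float2 projectTexCoord;
projectTexCoord.x =  position.x / position.w * 0.5f + 0.5f;
projectTexCoord.y = -position.y / position.w * 0.5f + 0.5f; // flipped because texture V grows downward

// Sample the incoming video feed only when the pixel falls inside the projector frustum.
float3 color = float3(0.0f, 0.0f, 0.0f);
if (saturate(projectTexCoord.x) == projectTexCoord.x &&
    saturate(projectTexCoord.y) == projectTexCoord.y)
	color = video_feed.Sample(video_sampler, projectTexCoord).rgb;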

Now I have a few issues.

1] Check the "Lit problem" video. I actually require an unlit texture, so there are no seams in between and the lighting of the virtual scene does not change the uniformity of the texture. Can I achieve this by changing some things in the material and getting rid of some passes?

2] See the second video. If I use alpha blend, which will be required if the incoming video is chroma-keyed, I get flicker on other objects in front of the cuboid. This effect is not there with the opaque state.

3] When I change to the opaque state, the UVs change in a way I have not accounted for in the projected textures. Where should I look for this?

Please advise. 

A fast reply would be much appreciated.

Rohit

 

texture_projection_test.zip


Hello Rohit,

The issues are visible, thanks for the videos and the sample project. We can't provide you a fast response on this, sorry. Most of our team has left for vacation; they'll be back next week. The other thing that bothers me is switching the context from "regular" desktop rendering to AR/mixed reality. Investigating this case might take some time.



Dear Morbid,

Patience it is, then. But my situation is like this:

Quote

 Patience is all about concealing your impatience. - Guy Kawasaki

Only one point... 

Quote

 The other thing that bothers me is switching the context from "regular" desktop rendering to AR/mixed reality.

This is just a normal desktop rendering technique used in mixed reality to get shadows and SSR on real objects, or reflections of real things on virtual ones.

Hoping for the best. Anyway, I am seriously impatient for my implementation to work, and I am working on it. If I succeed, I will keep you posted.

Regards,

Rohit.


Dear @morbid,

Here are a few pointers after yesterday's work.

In the first video there is lighting; I want something unlit. With alpha blend the blur issue was not there because I had mistakenly turned on overlap. As soon as you turn overlap off, TAA comes into the picture, and its caching is probably what causes those weird UV transformations.

In the second video, after getting rid of overlap, the flickering on object edges goes away, and if I turn off TAA, I get smooth UVs and a clean projected texture.

What I want to achieve is:

1] How can I project a texture onto the cube so that it behaves like a plain albedo texture and stays unlit?

2] Can I get rid of TAA caching for a particular material?

3] All deferred effects like shadows and SSR should still work.

I hope this is a clear definition of my requirement now.

Rohit


Hello Rohit,

Here is my proposal:

1) For unlit output you can use the ambient pass and do all the shading there. It can be as simple as outputting the albedo texture.
2) You can't get rid of TAA, but you can make it work correctly. To achieve that, your opacity/deferred.shader should output zero velocity, because the texture is bound to the screen, not to the object's surface (see the sketch after point 3).
3) SSR is a little bit tricky. It uses the screen color from the previous frame but the gbuffer from the current frame, which contains only opaque geometry. So the workaround I came up with here is to make a copy of every object and assign an opaque material to it. That fills the gbuffer with the needed info, but the lighting is overridden by the alpha blend material after that.
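To illustrate point 2: the velocity a deferred shader normally writes is the screen-space offset between the current and previous frame positions, which TAA uses to reproject its history. Since the projected texture is bound to the screen, history should be fetched from the same pixel, so that offset must be zero. A minimal sketch of the idea (OUT_VELOCITY is a placeholder name, not the exact output used in deferred.shader):

// Normally the velocity is the screen-space motion of the surface point:
//   float2 velocity = prev_position.xy / prev_position.w - position.xy / position.w;
// The projected texture is bound to the screen, not to the surface,
// so TAA should take its history from the same pixel: write zero instead.
float2 velocity = float2(0.0f, 0.0f);
OUT_VELOCITY = velocity; // placeholder: write to whatever velocity target deferred.shader actually uses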

Modified source code is attached. Hope it helps!

texture_projection_test_v2.zip


Dear @andrey-kozlov,

Thank you very much for the answer above. I took some time to work on this. Setting the velocity buffer to zero is amazing stuff, and it solves the velocity artifacts on the projected texture that stays in the same place on screen. And that second cuboid workaround is awesome to understand and know about. So we are actually fooling the rendering pipeline to achieve good results; computer graphics is always all about this.

Now I have made a few changes, which are attached. Things look promising now, but there are two small issues.

What I have done and why:

Actually, for my implementation I will always remain inside the cube rather than outside of it, as keeping the tracking area inside the cube always gives a screen-filling texture, with the whole video texture mapped onto the screen. I will map the incoming video feed to this material. By going inside the cube I lose many things from the previous approach and there are no shadows, so I changed the following in the ambient shader. The emission path is kept for safety, in case one wants a clean texture.

	OUT_COLOR = float4_zero;
	gbufferSRGB(gbuffer);
	OUT_COLOR.rgb = gbuffer.albedo;
	OUT_COLOR.a = gbuffer.transparent;
	
	Data data;
	dataCalculateAll(data, gbuffer, DATA_POSITION, IN_POSITION.xy);
	
	#ifdef LIGHT_WORLD
		float3 diffuse_light = float3_zero;
		float3 specular_light = float3_zero;
		float3 light_modulation;
		
		getWorldLight(diffuse_light, specular_light, gbuffer, data, light_modulation);
		
		OUT_COLOR.rgb += diffuse_light * gbuffer.albedo * data.dielectric + specular_light * gbuffer.occlusion;
	#endif
	
	#ifdef EMISSION && (EMISSION_BAKE_GI || (!BAKING_LIGHTING))
		//[AroRT] Change for Projected Texture [AroRT] ****************************/
		//float3 emission = TEXTURE_EMISSION(TEX_EMISSION).rgb * m_emission_color.rgb;
		float3 emission = float3(0.0f, 0.0f, 0.0f);
		
		// Sample the projected texture only when the projected coordinates are in the 0..1 range,
		// i.e. this pixel lies inside the projector's viewport.
		if ((saturate(projectTexCoord.x) == projectTexCoord.x) && (saturate(projectTexCoord.y) == projectTexCoord.y))
		{
			emission = TEXTURE_BIAS_ZERO(TEX_EMISSION, projectTexCoord).rgb * m_emission_color.rgb;
		}
		//[AroRT] Change for Projected Texture [AroRT] ****************************/
		
		#ifdef VERTEX_COLOR && VERTEX_EMISSION
			emission *= DATA_VERTEX_COLOR.rgb;
		#endif
		OUT_COLOR.rgb += srgbInv(emission) * m_emission_scale;
	#endif

1. Used the code as you supplied it.

2. Got rid of the second cuboid and changed the normal mapping to object space. Made a small change to the Translucent parameter to get rid of one specular shade.

3. The ambient shader is changed as shown above so I can also get shadows. You may change the textures to baby.png, ab.png and outdoor to check it.

4. Actually, the real scenario will have a transparent image, and that's why the second cuboid was creating a problem inside the cube with the opaque workflow.

5. Still, with the attached project I get small changes in the texture (acceptable at this moment), with shadows and SSR on other objects such as the material ball. But the reflection is not coming onto the floor.

6. So I disabled the floor SSR and tried to use a planar reflection. It works, but it projects the same material. What could be a possible way in fragment.h to change the direction when it is rendering for a planar reflection? Probably something like the following would solve it, but which define should I use, and where should I map it?

// Calculate the projected texture coordinates.
float2 projectTexCoord;
projectTexCoord.x = position.x / position.w / 2.0f + 0.5f;
#ifdef (somehow PLANAR reflection)
	projectTexCoord.y = position.y / position.w / 2.0f + 0.5f;
#else
	projectTexCoord.y = -position.y / position.w / 2.0f + 0.5f;
#endif

The other problem is how to increase the intensity of the shadows if required.

If these two issues are solved, I think I will get acceptable behavior for matching the real and virtual content.

Regards,

Rohit

 

texture_projection_test_v3.zip


Hello Rohit,

1) We don't have a separate state for planar reflections, but you can check whether the transformation matrix contains a reflection with this snippet (a combined example follows after point 2):

float4 row_0 = s_transform[0];
float4 row_1 = s_transform[1];
float4 row_2 = s_transform[2];

bool is_reflection = dot(cross(row_0.xyz, row_1.xyz), row_2.xyz) < 0.0f;

2) Shadows are a visibility term and conceptually don't have an intensity. If you want more contrast between shadowed and non-shadowed surfaces, you can do that by tweaking the light intensity. Maybe our artists can help you set up the lighting of your scene if you need additional consulting on it.
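For point 1, here is a sketch of how that check could replace the compile-time branch from your earlier snippet (the sign convention is an assumption and may need flipping for your scene):

// A reflection matrix flips handedness, so the determinant of the rotation part is negative.
float4 row_0 = s_transform[0];
float4 row_1 = s_transform[1];
float4 row_2 = s_transform[2];
bool is_reflection = dot(cross(row_0.xyz, row_1.xyz), row_2.xyz) < 0.0f;

// Calculate the projected texture coordinates, mirroring Y for the planar reflection pass.
float2 projectTexCoord;
projectTexCoord.x = position.x / position.w * 0.5f + 0.5f;
if (is_reflection)
	projectTexCoord.y =  position.y / position.w * 0.5f + 0.5f;
else
	projectTexCoord.y = -position.y / position.w * 0.5f + 0.5f;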


I probably misunderstood the question about shadows. If you were talking about applying shadows in your ambient pass, then you can extract the world light shadows and apply them manually. But the principle is the same: the shadowed output should be much less bright to get dramatic shadows. The modified source is attached.
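As a rough illustration of what "much less bright" means in the ambient pass (world_shadow is a hypothetical 0..1 visibility term extracted from the world light; the actual extraction is in the attached ambient.shader):

// Hypothetical example: world_shadow = 1 means fully lit, 0 means fully shadowed.
// Darkening the projected texture by it produces dramatic shadows;
// the floor value controls how dark shadowed areas may get.
float shadow_floor = 0.2f; // tweak to taste
OUT_COLOR.rgb *= lerp(shadow_floor, 1.0f, world_shadow);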

ambient.shader
