
How to export camera path from Softimage



Recommended Posts

I've read the manual. As I mentioned when I mailed support earlier, the XSI export plugin seems to use the wrong matrix to transform from Softimage to Unigine.

 

According to the manual, Unigine uses a right-handed coordinate system, and Softimage also uses a right-handed coordinate system. The difference is that Unigine is Z-up while Softimage is Y-up.

 

I've tried to use the same matrix the export plugin uses to export the camera's animation, but the result is always wrong.

 

So my question is a small one: how do I convert from a Y-up to a Z-up right-handed coordinate system with the correct rotation?

 

Softimage is exactly the same as Unigine except that SI uses Y-up; its default rotation order is XYZ.
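
Roughly, the conversion I have in mind looks like the sketch below (plain C++, not UnigineScript, and not the plugin's actual code). It treats Y-up to Z-up as a +90 degree rotation about X, applies it to positions directly, and multiplies the orientation quaternion by it on the left. It assumes the camera's local axis convention is the same in both packages; if it isn't, an extra correction factor on the right would also be needed.

	struct Vec3 { float x, y, z; };
	struct Quat { float x, y, z, w; };	// x y z w, same order as the exported file

	// Hamilton product a*b.
	Quat mul(const Quat &a, const Quat &b) {
		Quat r;
		r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
		r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
		r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
		r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
		return r;
	}

	// Y-up -> Z-up is a +90 degree rotation about X: (x, y, z) -> (x, -z, y).
	Vec3 convertPosition(const Vec3 &p) {
		Vec3 r = { p.x, -p.z, p.y };
		return r;
	}

	// Apply the same +90 degree X rotation to the orientation: q' = c * q,
	// where c is the unit quaternion of that rotation (sin/cos of 45 degrees).
	Quat convertRotation(const Quat &q) {
		const float s = 0.70710678f;
		Quat c = { s, 0.0f, 0.0f, s };
		return mul(c, q);
	}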

Link to comment

The problem with the XSI export plugin is that it assumes -X is the forward direction and Y is the right direction.

 

Although the result looks the same, it is actually different. I asked about this before and Ulf told me I was wrong; now I've finally found that the problem is the export plugin. See the sketch below for what I mean.
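
For illustration only (plain C++, the helper and the example directions are made up, not the plugin code): a camera basis is fixed by its assumed right and forward directions, so hard-coding the wrong pair means every exported rotation is off by the same fixed rotation, which is easy to miss because a static scene can still look plausible.

	struct Vec3 { float x, y, z; };

	Vec3 cross(const Vec3 &a, const Vec3 &b) {
		Vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
		return r;
	}

	// Hypothetical helper: complete a right-handed camera basis from the
	// assumed right and forward (view) directions; up = right x forward.
	struct Basis { Vec3 right, up, forward; };

	Basis basisFromAxes(const Vec3 &right, const Vec3 &forward) {
		Basis b = { right, cross(right, forward), forward };
		return b;
	}

	// basisFromAxes({1,0,0}, {0,0,-1}) (+X right, -Z forward) and
	// basisFromAxes({0,1,0}, {-1,0,0}) (+Y right, -X forward) are both valid
	// right-handed bases, but they differ by a fixed rotation; whichever one
	// the exporter assumes gets baked into every camera key.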

Link to comment

I saw this in core/scripts/camera.h

		quat rotation = inverse(lerp(rot[p0],rot[p1],time));
		float fov = lerp(fov[p0],fov[p1],time);
		vec3 z = normalize(rotation * vec3(0.0f,0.0f,1.0f));
		vec3 x = normalize(cross(rotation * vec3(0.0f,1.0f,0.0f),z));
		vec3 y = normalize(cross(z,x));
		mat4 transform = translate(position);
		transform.m00m10m20 = x;
		transform.m01m11m21 = y;
		transform.m02m12m22 = z;

 

and this in the manual:

Each selected camera path will be exported in a separate texture file (without any extension) with the following format:

 

Output

N

position0.x position0.y position0.z rotation0.x rotation0.y rotation0.z rotation0.w fov0
...
positionN.x positionN.y positionN.z rotationN.x rotationN.y rotationN.z rotationN.w fovN

 

Here N is the number of time units (frames). The camera in each time unit is described by a single line containing a position, a rotation, and a field of view.
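
As a point of reference, here is a tiny sketch (plain C++, not the plugin's code; the struct, the field names, and the assumption that the file is plain text laid out exactly as quoted above are all mine) of writing a path in that format:

	#include <cstdio>
	#include <vector>

	// Hypothetical per-frame sample; the field names are made up for this sketch.
	struct CameraSample {
		float px, py, pz;		// position
		float rx, ry, rz, rw;	// rotation quaternion (x y z w)
		float fov;				// field of view
	};

	// Writes the format quoted from the manual: first line is N (the number of
	// frames), then one "position rotation fov" line per frame.
	bool writeCameraPath(const char *name, const std::vector<CameraSample> &samples) {
		FILE *f = std::fopen(name, "w");
		if (f == NULL) return false;
		std::fprintf(f, "%d\n", (int)samples.size());
		for (size_t i = 0; i < samples.size(); i++) {
			const CameraSample &s = samples[i];
			std::fprintf(f, "%f %f %f %f %f %f %f %f\n",
				s.px, s.py, s.pz, s.rx, s.ry, s.rz, s.rw, s.fov);
		}
		std::fclose(f);
		return true;
	}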

 

From the code, it seems the script doesn't use the original quaternion from the camera's transform directly, but the manual says the file contains the camera's quaternion for each frame.

 

I've figured out that if I rotate 90 degrees around the X axis, I get the correct transform matrix in Unigine (including all SRT). But if I just export the position, rotation (quaternion), and FOV in that format and use the script to play the path, the camera is always wrong. If I change the code to this:

		vec3 position = lerp(pos[p0],pos[p1],time);
		quat r = lerp(rot[p0],rot[p1],time);
		float fov = lerp(fov[p0],fov[p1],time);
//			vec3 z = normalize(rotation * vec3(0.0f,0.0f,1.0f));
//			vec3 x = normalize(cross(rotation * vec3(0.0f,1.0f,0.0f),z));
//			vec3 y = normalize(cross(z,x));
		mat4 transform = translate(position) * rotation(r);
//			transform.m00m10m20 = x;
//			transform.m01m11m21 = y;
//			transform.m02m12m22 = z;
		player.setWorldTransform(transform);
		player.setFov(fov);

 

it worked perfectly. Why? What kind of rotation quaternion does this script expect?
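
My own reading of the quoted camera.h fragment, which may well be wrong: the script takes inverse() of the stored quaternion and then copies that rotation's basis vectors into the columns of the transform, so the matrix it builds is the inverse of whatever rotation is in the file. That would mean the file is expected to hold the world-to-camera (view) rotation, while the quaternion taken straight from the camera's transform is camera-to-world, which would explain why dropping the inverse() and the basis rebuild makes the transform quaternion work. If that reading is right, conjugating the quaternion before writing it out should make the unmodified script behave the same; a minimal sketch in plain C++ (the function name is made up):

	struct Quat { float x, y, z, w; };

	// For a unit quaternion the inverse is just the conjugate, so this turns a
	// camera-to-world rotation into the world-to-camera rotation the unmodified
	// script appears to expect.
	Quat toViewRotation(const Quat &worldRotation) {
		Quat r = { -worldRotation.x, -worldRotation.y, -worldRotation.z, worldRotation.w };
		return r;
	}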

Link to comment