
How to control unsteady rotation and unwanted flips of brain slices generated via IDL bridged to python using HTC Vive Controller? #18

Open
vitthal-bhandari opened this issue Jun 21, 2019 · 6 comments

Comments

@vitthal-bhandari

We are trying to generate 2D brain slices from a 3D model of the brain using a VR controller (HTC Vive). We want to control the brain's orientation with the controller's rotation and be able to generate slices in all possible directions by moving the controller. Our current code is able to extract the slices, but traversing them and changing the orientation of the brain is difficult because the controls are unstable. The slices flip randomly after a slight change in the controller's position, and the slice is not read in the orientation that matches the controller's pose. This needs to be corrected.

We have written the code in Python and are extracting the brain slice using IDL bridged to Python. To track the controller's movement and link it to the 3D model of the brain we are using the Python library triad_openvr. To get the controller's coordinates and rotation we are using get_pose_euler(). We subtract the initial coordinates and rotation so that the controller's starting pose acts as the origin.

The slices appear as expected most of the time; it is the traversal that causes the problem. Moving the controller sometimes makes the slices behave erratically: they tend to flip past a certain rotation of the controller, and the movement is not smooth. It has become difficult to synchronize the axes of the brain with the real-world axes.

import triad_openvr
import time
import sys
from idlpy import *

x=0.0
y=0.0
z=0.0
xrot = 0.0
yrot=0.0
zrot=0.0
sign = ""
v = triad_openvr.triad_openvr()
v.print_discovered_objects()

#Getting the initial pose of the controller so that everything can be reset to that pose as origin
initX=100*v.devices["controller_1"].get_pose_euler()[0]
initY=100*v.devices["controller_1"].get_pose_euler()[1]
initZ=100*v.devices["controller_1"].get_pose_euler()[2]
initXRot=v.devices["controller_1"].get_pose_euler()[3]
initYRot=v.devices["controller_1"].get_pose_euler()[4]
initZRot=v.devices["controller_1"].get_pose_euler()[5]


if len(sys.argv) == 1:
	interval = 1/10.0 # originally 1/250
	print("1")
elif len(sys.argv) == 2:
	interval = 1/float(sys.argv[1]) # update rate from the command line; sys.argv[0] is the script name
	print("2")
else:
	print("Invalid number of arguments")
	interval = False
print(interval)
if interval:
	IDL.run("file = 'C:\Program Files\Exelis\IDL85\examples\data\head.dat'")
	IDL.run("vol = READ_BINARY(file, DATA_DIMS = [80, 100, 57])")
	IDL.run("LOADCT, 5")
	IDL.run("DEVICE, DECOMPOSED = 0, RETAIN = 2")
	while(True):
		start = time.time()
		x=100*v.devices["controller_1"].get_pose_euler()[0]
		y=100*v.devices["controller_1"].get_pose_euler()[1]
		z=100*v.devices["controller_1"].get_pose_euler()[2]
		newx = x - initX + 40
		newy = y - initY + 50
		newz = z - initZ + 28
		xrot=v.devices["controller_1"].get_pose_euler()[3]
		yrot=v.devices["controller_1"].get_pose_euler()[4]
		zrot=v.devices["controller_1"].get_pose_euler()[5]
			 
		newXRot = xrot - initXRot 
		newYRot = yrot - initYRot
		newZRot = zrot - initZRot

		
		commandToRun = "slice = EXTRACT_SLICE(vol, 110, 110,"+ str(newx) + "," + str(newy)+","+str(newz)+","+ str(newXRot) + "," +str(newYRot) + "," + str(newZRot) + ", OUT_VAL=0B)"

		#print(str(newXRot) + "," + str(newYRot) + "," + str(newZRot))
		
		IDL.run(commandToRun)
		IDL.run("TVSCL, CONGRID(slice, 800, 800,/INTERP)")
		sleep_time = interval - (time.time() - start) # time left in this frame
		if sleep_time>0:
			time.sleep(sleep_time)
@lgbeno
Copy link
Collaborator

lgbeno commented Jun 21, 2019

This appears to be related to how this library converts the pose matrix to Euler angles. It has been suspected many other times as well and has to do with how the sin and cos functions used in that conversion handle the transition from 0° to 360°.
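
As a minimal illustration of that seam problem (the numbers below are hypothetical, the wrap_deg helper is not part of triad_openvr, and angles are assumed to be in degrees): subtracting the initial angle from the current one can jump by roughly 360° even though the controller barely moved, which looks exactly like a sudden flip of the slice. Wrapping the difference keeps it continuous:

# Sketch only: why raw Euler angle differences jump at the 0/360 seam.
def wrap_deg(delta):
	# map any angle difference into [-180, 180)
	return (delta + 180.0) % 360.0 - 180.0

init_yaw = 179.0        # yaw reported when the origin was captured (degrees)
current_yaw = -179.0    # controller actually rotated only ~2 degrees further

raw_delta = current_yaw - init_yaw   # -358.0 -> looks like a huge flip
smooth_delta = wrap_deg(raw_delta)   # 2.0    -> the real, small rotation
print(raw_delta, smooth_delta)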

Also in your code, you are calling .get_pose_euler() 6 times and accessing each element. This could mean that the pose information comes from separate samples. Better to do this as:
x,y,z,xrot,yrot,zrot = v.devices["controller_1"].get_pose_euler()

If possible can you adapt your IDL script to use a pose matrix instead of euler angles? How about quaternions?
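
A small sketch of that single-call pattern applied to the loop above; the None check is an assumption about what get_pose_euler() returns when tracking drops, so verify it against your version of triad_openvr:

# Sketch: read the whole pose once per frame so all six values come from one sample.
pose = v.devices["controller_1"].get_pose_euler()
if pose is not None:   # assumed guard for a momentarily untracked controller
	x, y, z, xrot, yrot, zrot = pose
	newx = 100 * x - initX + 40    # same centre offsets as the original loop
	newy = 100 * y - initY + 50
	newz = 100 * z - initZ + 28
	newXRot, newYRot, newZRot = xrot - initXRot, yrot - initYRot, zrot - initZRot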

@vitthal-bhandari
Copy link
Author

Thanks @lgbeno. We'll try it and let you know what happens.

@vitthal-bhandari
Copy link
Author

@lgbeno we tried implementing the above method:
x,y,z,xrot,yrot,zrot = v.devices["controller_1"].get_pose_euler()
but it didn't work. Now we are stuck at getting the pose matrix of the controller. We came across this code snippet:

def get_pose_radians(self):
	pose = self.vr.getControllerStateWithPose(openvr.TrackingUniverseStanding, self.index, openvr.k_unMaxTrackedDeviceCount)
	return convert_to_radians(pose[2].mDeviceToAbsoluteTracking)

However, self.index is throwing an error:
triad_openvr instance has no attribute 'index'
But we have already imported openvr. What should be done? Is there an alternative method to get the pose matrix?

@lgbeno
Copy link
Collaborator

lgbeno commented Jun 25, 2019

I added a "get_pose_matrix" function however will leave it to you to test and see if it works: https://github.com/TriadSemi/triad_openvr/blob/master/triad_openvr.py#L113

self.index is a member of the vr_tracked_device class, so you have to define the function inside that class.

Now you should be able to do:

mat = v.devices["controller_1"].get_pose_matrix()
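
If get_pose_matrix() works, one possible route back to the three angles EXTRACT_SLICE expects is to take the 3x3 rotation block of the 3x4 pose matrix and convert it with one fixed convention. This is only a sketch: it assumes the returned matrix can be indexed as mat[row][col] (check what get_pose_matrix() actually returns in your copy), it uses SciPy, which is not otherwise part of this setup, and the 'xyz' order is an arbitrary choice that you would need to match to EXTRACT_SLICE's rotation order.

import numpy as np
from scipy.spatial.transform import Rotation

mat = v.devices["controller_1"].get_pose_matrix()   # assumed to be a 3x4 [R | t] matrix

# Copy into a NumPy array, then split rotation and translation.
m = np.array([[mat[i][j] for j in range(4)] for i in range(3)])
R = m[:, :3]   # 3x3 rotation part
t = m[:, 3]    # translation (x, y, z)

# One explicit Euler convention instead of the library's conversion.
xrot, yrot, zrot = Rotation.from_matrix(R).as_euler('xyz', degrees=True)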

@vitthal-bhandari
Copy link
Author

Hey @lgbeno! Thanks for the "get_pose_matrix" function. We tested it and it works just fine.

Upon moving ahead to use quaternions for displaying the slices, we are facing a difficulty. We do not know the exact usage of the "r_w" value obtained from the "convert_to_quaternion" function. When we use Euler coordinates, we get six parameters (x, y, z, r_x, r_y, r_z), which are passed directly into the
EXTRACT_SLICE( Vol, Xsize, Ysize, Xcenter, Ycenter, Zcenter, Xrot, Yrot, Zrot )
function of IDL, which displays the slice. However, when using quaternions we get an extra parameter, "r_w", which does not fit into the "EXTRACT_SLICE" function. Consequently we do not obtain the correct output.

  1. How do we use the extra parameter "r_w" in our code? What is the difference between the six parameters returned by "convert_to_euler" and the seven returned by "convert_to_quaternion"?

  2. We have tried a lot of the "quaternion stuff" available in the issues section here, but none of it gives a stable output. Could you suggest something concrete?

@westin123
Copy link
Contributor

@Vitthal98, maybe you have found your answer by now, but quaternions are an alternative to Euler angles that avoids an issue known as gimbal lock. Quaternions are commonly used in 3D rendering because of this advantage, so ideally you would use the four-component quaternion for orientation and not worry about Euler angles.
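
For the EXTRACT_SLICE question above: r_w is the scalar (real) part of the quaternion (w, x, y, z), not a fourth rotation axis, so it is not an extra angle to pass along. A hedged sketch of turning the four quaternion components back into three Euler angles for EXTRACT_SLICE, again assuming SciPy and an 'xyz' convention that would have to be checked against IDL's rotation order:

from scipy.spatial.transform import Rotation

# r_w, r_x, r_y, r_z stand in for whatever convert_to_quaternion returns.
r_w, r_x, r_y, r_z = 0.966, 0.259, 0.0, 0.0   # example: about 30 degrees around x

# SciPy expects scalar-last order (x, y, z, w).
rot = Rotation.from_quat([r_x, r_y, r_z, r_w])
xrot, yrot, zrot = rot.as_euler('xyz', degrees=True)   # three angles for EXTRACT_SLICE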
