-
Hi Ede, At the moment the optical sensor data is read from the device as an array of int16 values [xDelta1, yDelta1, brightness1, xDelta2, yDelta2, brightness2], which is both parsed in Bonsai and transmitted directly to the motor control board via the slip ring. I'm not that familiar with the requirements for a HID (human interface device, I assume?) output, but from a quick read of the description it seems like we could probably package the data packet into the appropriate format. Is the idea that the HARP device would be read as a HID device over USB, communicating the optical flow x, y?
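In case it is useful, here is a minimal Python sketch of how that packet could be unpacked on the receiving side. The little-endian byte order, the 12-byte framing, and the function name are assumptions for illustration; only the field order comes from the description above.

```python
import struct

# Assumed layout: six little-endian int16 values per packet, in the order
# [xDelta1, yDelta1, brightness1, xDelta2, yDelta2, brightness2].
PACKET_FORMAT = "<6h"
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)  # 12 bytes

def parse_packet(raw: bytes):
    """Split one raw packet into (xDelta, yDelta, brightness) per sensor."""
    x1, y1, b1, x2, y2, b2 = struct.unpack(PACKET_FORMAT, raw[:PACKET_SIZE])
    return (x1, y1, b1), (x2, y2, b2)

# Example with a dummy packet:
sensor1, sensor2 = parse_packet(struct.pack(PACKET_FORMAT, 3, -1, 120, 2, 0, 118))
print(sensor1, sensor2)
```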
-
Thanks Andrew, The idea is to have a parallel output of the optical sensors go to a separate visual stimulus delivery device on the platform. The reason for this parallel development is that the stimulus device will be ready to go in a few weeks, and we are keen to start experiments (we will do this on the original platform, but it would be nice to be compatible with the HARP). One simple solution could be a separate optical sensor with its own control board, feeding its USB output to the Raspberry Pi of the visual stimulus device. The practical problem with that could be consistency between the optical sensors, i.e. ensuring that the visual update and the motor commands reflect the same movement.
-
OK - I understand a bit better now, I think. Since we have a direct tx/rx output on the sensor board, we should be able to run a parallel stream to the Pi. I will have a think about other potential solutions this week.
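To sketch what that parallel stream could look like on the Pi side: assuming the tx/rx output shows up as a USB-serial port and carries the same six-int16 packets described earlier, something like the following would turn it into per-sensor optical-flow values. The port name, baud rate, and the lack of any resynchronisation logic are placeholders, not the board's actual configuration.

```python
import struct
import serial  # pyserial

PACKET_FORMAT = "<6h"                          # assumed 6 x int16 layout, little-endian
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)   # 12 bytes

# "/dev/ttyUSB0" and 115200 baud are placeholders; the real values depend on
# how the sensor board's UART is configured and how it enumerates on the Pi.
with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    while True:
        raw = port.read(PACKET_SIZE)
        if len(raw) < PACKET_SIZE:
            continue  # timed out; real code would need proper packet framing/resync
        x1, y1, b1, x2, y2, b2 = struct.unpack(PACKET_FORMAT, raw)
        # Hand the optical-flow deltas to the visual stimulus update here.
        print(x1, y1, x2, y2)
```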
-
Hello,
I have a question regarding the output from the HARP board. How will the optical sensor data be communicated? Is it possible to get a HID-type output?
I am asking because we are building a simpler visual stimulation platform based on this (https://github.com/sn-lab/mouseVRheadset) and PsychoPy.
The idea is to 1. use a simpler and cheaper visual stimulus delivery for simple experiments before moving the mouse onto the multi-screen VR on a second rotation platform, and 2. start experiments in a month or so.
cheers,
Ede
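To give a concrete idea of what I mean by a HID-type output, reading it from Python on our side (e.g. alongside PsychoPy) could look roughly like the sketch below, using the hidapi bindings. The vendor/product IDs and the report layout here are entirely hypothetical and do not reflect the actual HARP firmware.

```python
import struct
import hid  # hidapi Python bindings

VENDOR_ID, PRODUCT_ID = 0x1234, 0x5678   # hypothetical IDs, for illustration only

device = hid.device()
device.open(VENDOR_ID, PRODUCT_ID)
try:
    while True:
        report = bytes(device.read(12))  # assuming a 12-byte report of six int16 values
        if len(report) < 12:
            continue
        x1, y1, b1, x2, y2, b2 = struct.unpack("<6h", report)
        # Use the x/y deltas to drive the visual stimulus update.
        print(x1, y1, x2, y2)
finally:
    device.close()
```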