# H1 --> H2 Motor control signal #53
@filcarv here are some example flow sensor values from the rig as requested.
Expanding on the algorithm in more detail here.
Required registers (can be renamed):
Algorithm (if ENABLE_DIRECT_CONTROL):

Update:
@ederancz @EleonoraAmbrad Could you summarise any issues you have with the current Bonsai implementation here?
The current Bonsai implementation works well when the flowYtomotor gain is set to values between 4000 and 8000 and runGain = 0.0008. I have also tried setting runGain to 1 and the flowYtomotor gain to 6.4 (corresponding to 8000 * 0.0008) and that was also good. One thing that would be nice is the possibility to smooth the optic flow sensor signal before converting it into motor commands. I have noticed that when a sudden acceleration is produced, the motor starts running very abruptly, and this is quite disturbing for the animal. Reducing the gain helps, but not entirely. Is this something that could be fixed by changing MAX_PULSE_INTERVAL?
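The two settings mentioned above are interchangeable because the two gains simply multiply into one effective gain. A minimal sketch of that arithmetic (the function name is hypothetical, not from the actual workflow):

```python
import math

def flow_to_motor_command(flow_y, flow_y_to_motor_gain, run_gain):
    """Hypothetical sketch: convert an optic-flow Y reading into a
    motor command. The two gains multiply into one effective gain."""
    return flow_y * flow_y_to_motor_gain * run_gain

# (8000, 0.0008) and (6.4, 1.0) give the same effective gain of 6.4:
a = flow_to_motor_command(10, 8000, 0.0008)
b = flow_to_motor_command(10, 6.4, 1.0)
assert math.isclose(a, b)
```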
Edit on previous comment: I repositioned the optic flow sensor and it got better, but actually I think that smoothing the optic flow sensor signal would solve the issues. Another minor thing: if the block is over during a rotation, the motor will keep spinning regardless of the optic flow signal.
let me butt in:
1. The optic flow signal should be smoothed; I thought this was already implemented. Ideally a running average over ~10 (?) points (we can experiment), but if it can only be done by averaging chunks of data (and losing temporal resolution), then fewer may suffice.
2. H2 motor command should already get clamped to 0 when the workflow is stopped, but perhaps it is missing with the new block based experimental control
On 1: the optical flow signal smoothing is not currently implemented (either in software or firmware). A running average is definitely possible though, with the number of points set via the schema or an H1 register.
great! thanks.
For 1, I think going through the schema is better, unless you think there are substantial advantages to setting it in firmware.
On 2: this is an oversight by me; if the workflow closes with the last motor command > 0, it will keep running at that speed. I will add a clamp-to-0 command in the shutdown procedure.
I wanted to retract a previous comment where I said that setting runGain to 1 and the flowYtomotor gain to 4-6 is OK. It is actually not; the motor commands are not smooth at all then.
Wait, if I stop the workflow the motor stops, but if the block finishes while the motor is spinning, it will stay spinning. Sorry for the misunderstanding.
@EleonoraAmbrad - ah OK, I may be mistaken then; I may need to add the clamp between blocks rather than on shutdown. @ederancz - In direct control mode (direct H1 --> H2 communication for closed loop) the motion control algorithm bypasses Bonsai entirely, so we would need an implementation in both software and firmware for parity between the two modes. I will implement the software version first and we can use that to inform how it should be done in firmware.
ah, got it!
yes, let's focus on software before finalising what goes into the H1 --> H2 closed loop
Re: the motor continuing to spin - is this when the Hall sensor is not plugged in? The default behaviour of the workflow is to run the homing routine after each block, which rotates the platform until the magnet is detected. If no sensor is present, I imagine it will just keep rotating at the end of each block.
PR #86 contains an implementation of the motion smoothing. The smoothing parameter (buffer size for the running average) is set per-block in the experiment schema. The default value of 1 means no smoothing.
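The per-block running average described here could look something like the following sketch (class and parameter names are hypothetical; PR #86 is the actual implementation). With a buffer size of 1 each sample is its own average, so values pass through unchanged:

```python
from collections import deque

class FlowSmoother:
    """Running average over the last `buffer_size` flow samples.

    Hypothetical sketch of the smoothing added in PR #86;
    buffer_size=1 (the default) means no smoothing.
    """
    def __init__(self, buffer_size=1):
        self.buffer = deque(maxlen=buffer_size)

    def update(self, flow_y):
        self.buffer.append(flow_y)
        return sum(self.buffer) / len(self.buffer)

smoother = FlowSmoother(buffer_size=4)
smoothed = [smoother.update(v) for v in [0, 0, 8, 8]]
# a sudden step from 0 to 8 is spread over the window:
# the final value is (0 + 0 + 8 + 8) / 4 = 4.0
```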
The smoothing works very well. For saving, would it make sense to save the smoothed flow sensor signal?
Smoothing is merged into main with #92. I think saving the raw values rather than the smoothed ones might be better: since you log the smoothing window anyway, you can always reconstruct the smoothed motion in postprocessing, and you don't throw away any raw values.
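Because the raw values plus the logged window size fully determine the smoothed trace, the postprocessing reconstruction is straightforward. A sketch, assuming the online smoother is a causal running average with partial windows at the start (function name hypothetical):

```python
def reconstruct_smoothed(raw, window):
    """Causal running average: mean of the last `window` samples,
    using shorter windows at the start of the recording. This
    assumes the online smoother behaves the same way."""
    out = []
    for i in range(len(raw)):
        lo = max(0, i - window + 1)
        chunk = raw[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

With `window=1` this reproduces the raw trace exactly, matching the "default 1 = no smoothing" convention.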
We're starting to work on implementing the motion smoothing on the firmware side. A few points to consider:
* Where should the smoothing be implemented? It can either be in the H1 firmware (directly smoothing the flow sensor signal as it is collected) or on the H2 side after it has been received via serial. It probably makes more sense to do this on the H2 side so that we don't lose the raw H1 flow events and don't need to smooth signals that are not being used to drive the motor.
* What control registers should be added? At minimum just one: the number of samples in the window to be smoothed.

Assuming we smooth on the H2 side we need to implement:
Questions for Filipe:
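On the firmware side, a running average is usually kept at O(1) work per sample with a ring buffer and a running sum, sized from the proposed window register. A sketch of that pattern (in Python standing in for the firmware code; the register and class names are hypothetical):

```python
class RingAverager:
    """O(1)-per-sample running average, in the style a firmware
    implementation might use: fixed ring buffer plus running sum,
    sized from a hypothetical smoothing-window register."""
    def __init__(self, window):
        self.buf = [0] * window
        self.idx = 0      # next slot to overwrite
        self.count = 0    # samples seen, capped at window
        self.total = 0    # running sum of buffered samples

    def update(self, sample):
        if self.count < len(self.buf):
            self.count += 1
        else:
            self.total -= self.buf[self.idx]  # drop the oldest sample
        self.buf[self.idx] = sample
        self.total += sample
        self.idx = (self.idx + 1) % len(self.buf)
        return self.total // self.count       # integer divide, as firmware might
```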
Hi @RoboDoig,
I was thinking about this again. What is the advantage of the direct hardware mode?
If it works fine in software, why not just keep it there? Do you expect delay or jitter in Bonsai?
If we want it in firmware (H2), the only other register I can think of is the gain.
Hi @ederancz, there will be some extra delay, as we need to read H1 via USB and then write to H2 again via USB. Latency will be pretty low (<2 ms) but potentially with some jitter. Going on the original designs, the idea was that we'd have the option for direct hardware mode to reduce latency as much as possible. We do have that currently, but not with any kind of motion smoothing. If you're happy with the software implementation then we can also leave things as they are on the firmware side.

One easy option would be to run a benchmark on H1 --> H2 jitter and latency via USB, since they are on the same clock. E.g. we could set a flow sensor threshold, write a motor command to H2 once the threshold is crossed, and then directly compare the timestamps. If the latency + jitter are not within an acceptable range then we could proceed with the firmware update.
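Given paired timestamps for the threshold crossing on H1 and the resulting motor command on H2 (valid to subtract directly because both devices share a clock), the benchmark reduces to simple statistics. A sketch with made-up illustrative timestamps, not real benchmark data:

```python
from statistics import mean, stdev

def latency_stats(flow_ts, motor_ts):
    """Mean latency, jitter (stdev), and per-event latencies,
    assuming flow_ts[i] and motor_ts[i] are paired events on
    the same shared clock."""
    latencies = [m - f for f, m in zip(flow_ts, motor_ts)]
    return mean(latencies), stdev(latencies), latencies

# Illustrative timestamps only (seconds):
avg, jitter, lat = latency_stats([0.000, 0.100, 0.200],
                                 [0.002, 0.1025, 0.2015])
```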
Thanks @RoboDoig, 2 ms is more than adequate for us, even with ~10 ms of jitter. If it is easy, please run a benchmark; otherwise we'll kick this into the long grass.
From meeting today: the more accurate benchmark for the delay would be time from threshold crossing to actual motor movement, so we should add a camera on the platform or take the encoder signal as movement reference. |
Plus add an ONIX node in a separate Bonsai instance to stress-test it. The output would be flow sensor timestamp to encoder timestamp, average delay, and jitter.
Average latency is just under 10 ms; it seems that 3 ms of that is just the time between the motor command and getting a reading on the encoder. There seems to be a hard cap at the lower end around 3 ms, which is probably from the USB round-trip latency. I added the analysis script on the latency-benchmark branch if you want to test with other processes running.
that sounds very reasonable! do you mind uploading / pointing to the data?
Benchmark plots: <https://github.com/user-attachments/assets/fefb6b1e-1cc7-4c19-94dd-64cef205f8c2> and <https://github.com/user-attachments/assets/2f3e963d-5253-4bc8-b92c-9ac9d78a6cc5>
`D:\BenchmarkData\MotionLatencyTest` + whatever date/time is in the notebook
Now that the motor control is implemented in Bonsai, we should add the option to implement the motion control algorithm via direct communication from H1 --> H2.
The basic algorithm is as follows:
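The algorithm steps did not survive in this export, but based on the rest of the thread (flow-to-motor gain conversion, optional running-average smoothing, and clamping the command to 0 between blocks and on shutdown), one loop iteration presumably looks something like this sketch (all names hypothetical):

```python
from collections import deque

class Smoother:
    """Minimal running average, window=1 means no smoothing."""
    def __init__(self, window=1):
        self.buf = deque(maxlen=window)

    def update(self, v):
        self.buf.append(v)
        return sum(self.buf) / len(self.buf)

def direct_control_step(flow_y, smoother, gain, enabled):
    """Hypothetical single step of the H1 --> H2 loop: smooth the
    flow reading, apply the gain, and clamp the command to 0 when
    control is disabled (between blocks or on shutdown), so the
    motor never keeps spinning on a stale command."""
    if not enabled:
        return 0
    return smoother.update(flow_y) * gain
```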