JAX support
#1714
Replies: 1 comment
-
We already support the Just-In-Time paradigm on CPU, see the paper. I don't think using JAX would lead to better performance in this respect, mostly because the library has been tailored for ML applications and does not properly support operations on Lie algebras. In addition, JAX focuses on Python usage, which would make it harder to integrate than classic JIT frameworks. External help is more than welcome.
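To illustrate the Lie-algebra point, here is a minimal sketch (purely illustrative, not code from Pinocchio or from JAX itself) of the SO(3) exponential map written in plain JAX. JAX ships no Lie-group primitives, so even this basic building block of rigid-body algorithms has to be implemented and numerically safeguarded by hand:

```python
import jax
import jax.numpy as jnp

def skew(w):
    # Map a 3-vector to its 3x3 skew-symmetric matrix [w]x.
    return jnp.array([[0.0, -w[2], w[1]],
                      [w[2], 0.0, -w[0]],
                      [-w[1], w[0], 0.0]])

@jax.jit
def so3_exp(w):
    # exp([w]x) via Rodrigues' formula, guarding the theta -> 0 limit by hand.
    theta = jnp.linalg.norm(w)
    safe = jnp.where(theta < 1e-8, 1.0, theta)                       # avoid 0/0
    A = jnp.where(theta < 1e-8, 1.0, jnp.sin(safe) / safe)
    B = jnp.where(theta < 1e-8, 0.5, (1.0 - jnp.cos(safe)) / safe**2)
    W = skew(w)
    return jnp.eye(3) + A * W + B * (W @ W)

R = so3_exp(jnp.array([0.1, 0.2, 0.3]))  # a traced, jitted rotation matrix
```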
-
I understood from a quick discussion with @perrin-isir that JAX is becoming the standard for high-performance machine learning research, offering significant performance gains compared to PyTorch and TensorFlow.
I was wondering whether Pinocchio can be interfaced as-is with JAX (see the sketch below for what that might look like in practice), or if some additional development work is needed.
/cc @duburcqa
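For context, a minimal sketch of what interfacing "as-is" could look like today, assuming one hides an existing Pinocchio call (here `pin.rnea` on the built-in sample manipulator) behind `jax.pure_callback`. The call runs, but it is opaque to JAX: XLA cannot optimize across it and it is not differentiable, which is why a proper integration would need dedicated development work:

```python
import jax
import jax.numpy as jnp
import numpy as np
import pinocchio as pin

jax.config.update("jax_enable_x64", True)  # Pinocchio works in double precision

model = pin.buildSampleModelManipulator()  # small built-in example model
data = model.createData()

def rnea_numpy(q, v, a):
    # Plain NumPy/Pinocchio call, invisible to JAX's tracer.
    return np.asarray(pin.rnea(model, data,
                               np.asarray(q), np.asarray(v), np.asarray(a)))

@jax.jit
def rnea_jax(q, v, a):
    # pure_callback escapes the traced computation to run the C++ code;
    # the result is a black box from XLA's point of view.
    out = jax.ShapeDtypeStruct((model.nv,), jnp.float64)
    return jax.pure_callback(rnea_numpy, out, q, v, a)

tau = rnea_jax(jnp.zeros(model.nq), jnp.zeros(model.nv), jnp.zeros(model.nv))
# Works, but no gradients through the dynamics and no fusion with JAX ops.
```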