How can I compile pytorch3d inference to run it in production? #936
Unanswered
Alexankharin asked this question in Q&A
Replies: 1 comment · 1 reply
-
I've trained a PyTorch3D model that I want to serve for inference on a server with GPUs (AWS). It is highly desirable for me to keep as few dependencies as possible. Is it possible to convert the PyTorch3D model (I only use the point-cloud rasterizer function from the library) into a JIT-traced/scripted model or into ONNX format?
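For context, the route I have in mind is roughly the sketch below. The `RasterizerWrapper` module and all of the camera/raster settings are illustrative placeholders, and I am not sure the trace will actually succeed, because the rasterizer bottoms out in PyTorch3D's compiled C++/CUDA extension, which (as far as I know) is not a registered TorchScript or ONNX operator.

```python
import torch
from pytorch3d.renderer import (
    FoVPerspectiveCameras,
    PointsRasterizationSettings,
    PointsRasterizer,
)
from pytorch3d.structures import Pointclouds


class RasterizerWrapper(torch.nn.Module):
    """Hypothetical wrapper: takes plain tensors so torch.jit.trace has
    inputs it understands, and builds the Pointclouds structure inside."""

    def __init__(self, image_size=256, radius=0.01, points_per_pixel=8):
        super().__init__()
        cameras = FoVPerspectiveCameras()
        settings = PointsRasterizationSettings(
            image_size=image_size,
            radius=radius,
            points_per_pixel=points_per_pixel,
        )
        self.rasterizer = PointsRasterizer(cameras=cameras, raster_settings=settings)

    def forward(self, points, features):
        # points: (N, P, 3), features: (N, P, C)
        clouds = Pointclouds(points=points, features=features)
        fragments = self.rasterizer(clouds)
        # Return plain tensors instead of the PointFragments namedtuple.
        return fragments.idx, fragments.zbuf


model = RasterizerWrapper().eval()
pts = torch.rand(1, 1000, 3)
feats = torch.rand(1, 1000, 3)

# Tracing may fail or record the rasterization opaquely, since the
# underlying op is a compiled extension rather than a TorchScript op.
# Even if it succeeds, the saved module still needs pytorch3d installed
# at load time, and torch.onnx.export would have no symbolic for the op.
traced = torch.jit.trace(model, (pts, feats), strict=False)
traced.save("rasterizer_traced.pt")
```

Is this the right direction, or is there a supported way to export the rasterizer without carrying the full PyTorch3D dependency?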
-
Hi Alexander, I was wondering if you had any luck with running inference on a PyTorch3D model with ONNX?