Batch inference? #124
mengmeng233 asked this question in Q&A
Is it possible to run inference in batches? I've observed that inference on a single instance often takes at least 60 seconds, so for a large dataset the cumulative waiting time becomes very long.
Answered by arogozhnikov · Oct 22, 2024
Hi @mengmeng233, no, that's not practically feasible: the typical bottleneck is memory. If you only have small proteins, you can start several processes in parallel, but don't expect a significant speedup, since GPU utilization is already quite good.
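For readers who want to try the parallel-processes route, here is a minimal sketch that fans a folder of FASTA files out to a small pool of worker processes. The `predict.py` entry point, its `--input`/`--output` flags, and the `inputs/`/`outputs/` paths are hypothetical placeholders, not from this repo; substitute your actual single-instance inference command, and keep the worker count low so each process fits in GPU memory.

```python
# Sketch: run several single-instance inference processes in parallel.
# Threads only supervise subprocesses here; the actual work happens in
# the child processes, so the GIL is not a concern.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

N_WORKERS = 2  # keep small: each process needs its own slice of GPU memory

def run_one(fasta: Path) -> int:
    out_dir = Path("outputs") / fasta.stem
    out_dir.mkdir(parents=True, exist_ok=True)
    # Hypothetical command; replace with the real inference entry point.
    cmd = ["python", "predict.py", "--input", str(fasta), "--output", str(out_dir)]
    return subprocess.run(cmd).returncode

fastas = sorted(Path("inputs").glob("*.fasta"))
with ThreadPoolExecutor(max_workers=N_WORKERS) as pool:
    for fasta, code in zip(fastas, pool.map(run_one, fastas)):
        print(f"{fasta.name}: {'ok' if code == 0 else 'failed'}")
```

Two workers is a conservative starting point; as noted in the answer above, GPU utilization is already quite good for a single process, so expect diminishing returns beyond that.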
Answer selected by mengmeng233