
Batch inference? #124

Answered by arogozhnikov
mengmeng233 asked this question in Q&A
Oct 22, 2024 · 1 comment · 1 reply

Hi @mengmeng233

No, that's not practically feasible: the typical bottleneck is memory.

If you only have small proteins, you can simply start several processes in parallel, but don't expect a significant speed-up: GPU utilization is already quite good.
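The "several processes in parallel" suggestion can be sketched as below. This is a minimal, hypothetical illustration: `run_inference` is a placeholder standing in for the actual single-protein prediction call (it is not part of the real API), and the worker count is kept small because, as noted above, memory is the typical bottleneck.

```python
# Sketch: run independent single-protein inference jobs in parallel
# processes. `run_inference` is a placeholder for the real model call.
from concurrent.futures import ProcessPoolExecutor

def run_inference(fasta_path: str) -> str:
    # Placeholder: a real job would load the model and predict a structure.
    return f"predicted:{fasta_path}"

def run_parallel(fasta_paths, max_workers=2):
    # Keep max_workers small: each process holds its own copy of the
    # model in (GPU) memory, so too many workers will exhaust it.
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_inference, fasta_paths))

if __name__ == "__main__":
    results = run_parallel(["a.fasta", "b.fasta", "c.fasta"])
    print(results)
```

Since each process is independent, this parallelism only helps when the GPU is otherwise underutilized, which is why no large speed-up should be expected here.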

Answer selected by mengmeng233
This discussion was converted from issue #123 on October 22, 2024 06:37.