low-memory
#266
Replies: 1 comment
-
Hi,
I have a GPU with 24 GB and have been able to predict structures over 700 aa. I tried --low-memory on a 1200 aa structure, but it ran out of memory. I believe --low-memory is the reason I was able to get a 900 aa prediction.
Has anybody explored how maximum sequence length and memory usage scale for normal inference versus inference with the --low-memory flag?
Does using ESM embeddings significantly affect memory usage? I'm also interested in your thoughts on whether these embeddings improve the output; from limited experience, and simply by the looks of it, they seem to.
Thanks,
Dan
-
It doesn't affect memory at all (unless you can't run it, but with 24 GB you're OK).
Yes, they improve results, especially when mining relevant MSAs is hard.
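
For anyone trying to reason about the scaling question above: a minimal back-of-the-envelope sketch, assuming the model keeps an AlphaFold-style L x L pair representation in memory. The channel width and dtype below are illustrative assumptions, not values taken from this codebase, and the figures ignore attention intermediates.

```python
# Rough estimate of how pair-representation memory grows with sequence length.
# Assumption: one L x L x C pair tensor in half precision (2 bytes/element);
# pair_channels=128 is an illustrative default, not read from this tool.

def pair_memory_gb(seq_len: int, pair_channels: int = 128, bytes_per_el: int = 2) -> float:
    """Approximate size (GiB) of a single L x L x C pair tensor."""
    return seq_len * seq_len * pair_channels * bytes_per_el / 1024**3

for L in (700, 900, 1200):
    print(f"L={L:>5} aa: ~{pair_memory_gb(L):.2f} GiB per pair tensor")
```

The point is the quadratic term: a 1200 aa input needs roughly (1200/700)^2 ≈ 3x the pair-tensor memory of a 700 aa input before counting any attention intermediates, so peak usage climbs much faster than sequence length.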