It's tight on RAM, but I thought I might get away with setting up fast swap. I want to run LocalAI, but most of the models require 10 GB of RAM at minimum. We'll see if swap is enough.
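For context, a minimal sketch of what "setting up fast swap" on Linux might look like, assuming a swap file rather than a dedicated partition (the path and size here are illustrative, not from the thread):

```shell
# Hypothetical example: create and enable an 8 GB swap file on Linux.
# Requires root; size and path are assumptions for illustration.
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile        # swap files must not be world-readable
sudo mkswap /swapfile           # format the file as swap space
sudo swapon /swapfile           # enable it immediately
# To persist across reboots, add an entry to /etc/fstab:
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

Note that even fast NVMe swap is orders of magnitude slower than RAM, so model inference that spills heavily into swap will be very slow, which is likely why the answer below discourages this setup.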
Answered by Darin755 (Jan 16, 2024):
This is not a good use case.