
How much RAM needed? #146

Open
kuronico59 opened this issue Nov 13, 2024 · 5 comments

@kuronico59 commented Nov 13, 2024

Hi, I have 32 GB of RAM and a 4060 Ti 16 GB. I can't load the model; my RAM is at 100% and loading takes too long. Is this "normal"? Do I need more RAM?

@bubbliiiing (Collaborator)

For now you may need some swap memory. We are trying to develop a smaller model to fit a lower memory limit (30 GB).
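
Swap is an OS-level workaround, but model-side options can also cut peak RAM at load time. A minimal sketch, assuming a diffusers-style pipeline (the model id and pipeline class are placeholders, not this project's documented API):

```python
# Low-RAM loading sketch, assuming a diffusers-style pipeline;
# the checkpoint id below is hypothetical, not this repo's.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "some-org/some-video-model",  # hypothetical checkpoint id
    torch_dtype=torch.float16,    # fp16 weights: half the RAM of fp32
    low_cpu_mem_usage=True,       # avoid a second full copy of the weights in RAM
)
# Keep submodules on the CPU and move each one to the GPU only while it
# runs, trading speed for a much smaller peak-memory footprint.
pipe.enable_sequential_cpu_offload()
```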

@kuronico59 (Author)

Thanks for the reply. Even with swap memory it crashes; I will wait patiently.

@bubbliiiing (Collaborator)

Thank you, we will hurry.

@nitinmukesh commented Nov 15, 2024

Thank you @bubbliiiing for looking into the possibility of reducing the memory requirements.

It would be helpful if you could provide an fp8, quantized, or GGUF model (not sure whether that is possible here). I have seen tools whose inference requirement was 4x H100 now running on consumer GPUs with as little as 12 GB of VRAM.

One such example:
https://huggingface.co/Kijai/Mochi_preview_comfy/tree/main
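
For context on why fp8 helps: weight-only quantization stores parameters in 8 bits and upcasts just before compute. A rough, self-contained sketch in plain PyTorch (torch >= 2.1; the FP8Linear class is hypothetical, not part of this repo):

```python
# Illustration of fp8 weight-only quantization, not this project's
# implementation: Linear weights are stored as float8_e4m3fn and
# upcast to bf16 just before the matmul.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FP8Linear(nn.Module):
    def __init__(self, linear: nn.Linear):
        super().__init__()
        # float8 storage halves memory versus fp16/bf16 weights
        self.weight = nn.Parameter(
            linear.weight.detach().to(torch.float8_e4m3fn),
            requires_grad=False,
        )
        self.bias = (
            None if linear.bias is None
            else nn.Parameter(linear.bias.detach().to(torch.bfloat16),
                              requires_grad=False)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Dequantize on the fly; compute still happens in bf16.
        return F.linear(x.to(torch.bfloat16),
                        self.weight.to(torch.bfloat16), self.bias)
```

Swapping such a module in for each nn.Linear trades a small per-call upcast cost for roughly half the weight memory relative to fp16.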

@bubbliiiing (Collaborator)

30 GB RAM will be supported in #154.
