
Got Error torch.OutOfMemoryError: Allocation on device #48

Open
Omario92 opened this issue Nov 29, 2024 · 4 comments

Comments

@Omario92

I tried an example file for flux_regional but got this error:
[screenshot: torch.OutOfMemoryError: Allocation on device traceback]

My GPU is an RTX 4090, so I don't think this is a GPU limitation. Any suggestions?
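For reference, here's a quick check that the card's full VRAM is visible to torch (a minimal sketch using only standard PyTorch calls; device index 0 is an assumption):

```python
import torch

# Ask the CUDA driver how much VRAM torch can see.
# Device index 0 is an assumption; adjust for multi-GPU setups.
free, total = torch.cuda.mem_get_info(0)
print(f"free: {free / 1024**3:.1f} GiB / total: {total / 1024**3:.1f} GiB")
```

`mem_get_info` reports free and total device memory in bytes straight from the CUDA driver, so if it shows the expected ~24 GiB total, the OOM comes from the allocation pattern rather than the card's capacity.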

@logtd
Owner

logtd commented Nov 29, 2024

This is an issue with how ComfyUI handles masks when xformers is enabled. You can start ComfyUI with xformers disabled by passing the argument `--disable-xformers`.
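For example, assuming you launch ComfyUI from its repo root with the standard entry point (your launch script or path may differ):

```
python main.py --disable-xformers
```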

@PixelWunderwerk

Is there another workaround? I am not able to disable xformers on my VM.
It always crashes with an OOM error when I use the "Configure Modified Flux" node, even with the lightweight fp8 model and 48 GB of VRAM.

@whitepapercg

> Is there another workaround? I am not able to disable xformers on my VM. It always crashes with an OOM error when I use the "Configure Modified Flux" node, even with the lightweight fp8 model and 48 GB of VRAM.

Disabling xformers didn't help at all. I keep cycling between two errors (at the ModelSampling or KSampler stage), no matter what I do.

@EnragedAntelope

Confirming that disabling xformers fixed the memory allocation error on my 4090, but I'm hoping there will be a solution that doesn't require it. Is this an issue with xformers or with ComfyUI? Do you know if there is already an error report filed that we can add to?
