Got error: torch.OutOfMemoryError: Allocation on device #48
Comments
This is an issue with how ComfyUI handles masks when xformers is enabled. You can start ComfyUI with xformers disabled by passing the `--disable-xformers` argument; see the launch example below.
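For anyone looking for the exact invocation: a minimal launch sketch, assuming a standard ComfyUI checkout where `main.py` is the entry point (ComfyUI does expose a `--disable-xformers` flag):

```
# Launch ComfyUI with xformers disabled; attention falls back to PyTorch's implementation
python main.py --disable-xformers
```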
Is there another workaround? I am not able to disable xformers on my VM.
Disabling xformers didn't help at all. I keep cycling between two errors (at the ModelSampling or KSampler stage), no matter what I do.
Confirming that disabling xformers fixed the memory allocation error on a 4090, but I'm hoping for a solution that doesn't require it. Is this an issue with xformers or with ComfyUI? Do you know if there is already a bug report that we can add to?
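For those who can't disable xformers: one generic mitigation for fragmentation-related CUDA OOMs (not specific to this repo, and only available on recent PyTorch builds) is the caching-allocator setting sketched below. Whether it helps with this particular error is untested:

```
# Set before launching ComfyUI; expandable_segments requires a recent PyTorch build
export PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True
python main.py
```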
I tried an example file for flux_regional but got this error.
My GPU is an RTX 4090, so I don't think this is a hardware problem. Any suggestions?
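When reporting an error like this, a quick snapshot of VRAM state helps distinguish a genuinely full GPU from allocator fragmentation. A minimal sketch using standard PyTorch APIs, run in the same Python environment as ComfyUI:

```python
import torch

# Free and total device memory as reported by the CUDA driver
free_bytes, total_bytes = torch.cuda.mem_get_info()

# Memory PyTorch's caching allocator is actively using vs. holding in reserve
allocated = torch.cuda.memory_allocated()
reserved = torch.cuda.memory_reserved()

gib = 1024 ** 3
print(f"free:      {free_bytes / gib:.2f} GiB / {total_bytes / gib:.2f} GiB")
print(f"allocated: {allocated / gib:.2f} GiB")
print(f"reserved:  {reserved / gib:.2f} GiB")
```

A large gap between reserved and allocated memory at the time of the OOM suggests fragmentation rather than true exhaustion.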