
CUDA VRAM requirements and tweaking #226

Open
alexandervlpl opened this issue Nov 24, 2024 · 0 comments

alexandervlpl commented Nov 24, 2024

I'm unable to run this on my GTX 1650 with 4 GB of VRAM. The stage with the progress bar (which I guess is plain Whisper inference) runs fine even with the large and turbo models, but the next post-processing stage crashes every time, even with the medium model:

```
torch.OutOfMemoryError: CUDA out of memory
```

Is this expected? If so, what are the minimum VRAM requirements for whisper_timestamped, and is there anything I can do to get it running? Maybe there's a PYTORCH_CUDA_ALLOC_CONF setting that would help, or a way to use CUDA only for the Whisper part and the CPU for the rest? I have plenty of system RAM. A rough sketch of what I mean is below.
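
For reference, here's roughly what I have in mind. This is an untested sketch: the allocator value and the audio file name are just placeholders, and I don't know whether `expandable_segments` is the right knob (or even supported on my setup).

```python
import os

# PYTORCH_CUDA_ALLOC_CONF has to be set before torch initializes the
# CUDA allocator, so it goes above any torch / whisper import.
# "expandable_segments:True" is a commonly suggested OOM mitigation;
# treating it as a guess here, not a known fix.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import whisper_timestamped as whisper

audio = whisper.load_audio("audio.wav")  # placeholder file name

# Attempt 1: everything on CUDA, hoping the allocator tweak avoids
# running out of memory during the post-processing stage.
model = whisper.load_model("medium", device="cuda")
result = whisper.transcribe(model, audio)

# Attempt 2 (fallback): run the whole pipeline on CPU. Slow, but I
# have plenty of system RAM. What I'd really like is CUDA for the
# transcription and CPU only for the post-processing, if possible.
# model = whisper.load_model("medium", device="cpu")
# result = whisper.transcribe(model, audio)
```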
