How to run inference on a GPU with less RAM? I want to work with 8 GB. #12
Comments
Well, inference uses a batch size of one, so BATCH_SIZE_PER_GPU has no effect. Image size and GPU memory size are the only things that cause OOM. This means you should reduce the image size or get a GPU with more memory. |
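A minimal sketch of the "reduce the image size" workaround: downscale the input before inference, then resize the prediction back. The model call and the 0.5 factor are assumptions to tune against available memory; note that heavy downscaling can also degrade prediction quality.

```python
import torch.nn.functional as F

def infer_downscaled(model, image, scale=0.5):
    """image: (1, C, H, W) tensor already on the GPU.

    Assumes `model` returns a per-pixel prediction map (hypothetical here).
    """
    small = F.interpolate(image, scale_factor=scale, mode="bilinear",
                          align_corners=False)
    pred = model(small)
    # Resize the prediction back to the original resolution.
    return F.interpolate(pred, size=image.shape[-2:], mode="bilinear",
                         align_corners=False)
```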
You mean H and V size of tested image files? OK. P.S. What is IMAGE_SIZE properties of TRAIN and TEST sections of CAT_full.yaml file? |
In train.py: TRAIN.BATCH_SIZE_PER_GPU and TRAIN.IMAGE_SIZE determine the batch size and image crop size, respectively. |
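For illustration, a small sketch of shrinking those training settings in the config file before running on a small GPU. The key names (TRAIN.BATCH_SIZE_PER_GPU, TRAIN.IMAGE_SIZE) come from the thread; the exact layout of CAT_full.yaml and the [256, 256] value are assumptions.

```python
import yaml

with open("CAT_full.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg["TRAIN"]["BATCH_SIZE_PER_GPU"], cfg["TRAIN"]["IMAGE_SIZE"])

# Halving the crop side roughly quarters activation memory per sample.
cfg["TRAIN"]["IMAGE_SIZE"] = [256, 256]  # assumed width/height pair

with open("CAT_full_small.yaml", "w") as f:
    yaml.safe_dump(cfg, f)
```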
Very interesting! Sometimes I want to test 4000x6000 images, and no GPU with that much RAM is available. Conclusion: the best way to run inference is to crop the large image into several grid-aligned small images, test them separately, and combine them back into one big image. Right? |
Yes. |
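A minimal sketch of the tiling approach confirmed above, assuming a PyTorch model whose output has the same spatial size as its input (e.g., a per-pixel prediction map). `model`, the tile size, and the overlap are assumptions, not part of the repo's infer.py; overlapping predictions are averaged to hide seams.

```python
import torch

def infer_tiled(model, image, tile=1024, overlap=64, device="cuda"):
    """Run inference tile-by-tile and stitch predictions back together.

    image: float tensor of shape (C, H, W).
    Returns a tensor of shape (out_channels, H, W), averaged over overlaps.
    """
    model.eval()
    c, h, w = image.shape
    stride = tile - overlap
    # Probe one tile to learn the number of output channels.
    with torch.no_grad():
        probe = model(image[:, :min(tile, h), :min(tile, w)]
                      .unsqueeze(0).to(device))
    out = torch.zeros(probe.shape[1], h, w)
    weight = torch.zeros(1, h, w)  # overlapping predictions per pixel
    for top in range(0, max(h - overlap, 1), stride):
        for left in range(0, max(w - overlap, 1), stride):
            bottom, right = min(top + tile, h), min(left + tile, w)
            patch = image[:, top:bottom, left:right].unsqueeze(0).to(device)
            with torch.no_grad():
                pred = model(patch)[0].cpu()
            out[:, top:bottom, left:right] += pred
            weight[:, top:bottom, left:right] += 1
    return out / weight
```

The overlap plus averaging is a design choice: predictions near tile borders lack context, so blending overlapped regions usually gives cleaner results than butting tiles edge-to-edge.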
I now have only an 8 GB RAM GPU. How can I run infer.py? I have tried playing with the TEST section of CAT_full.yaml (BASE_SIZE and BATCH_SIZE_PER_GPU), but I still get an Out of GPU Memory error.