
How to infer on less RAM GPU? I want to work on 8 GB. #12

Open
Vadim2S opened this issue Nov 16, 2021 · 5 comments

Comments

@Vadim2S

Vadim2S commented Nov 16, 2021

I only have an 8 GB GPU now. How can I run infer.py? I tried playing with CAT_full.yaml (TEST section, BASE_SIZE and BATCH_SIZE_PER_GPU), but I still get an out-of-GPU-memory error.

@CauchyComplete
Collaborator

Well, inference uses a batch size of one, so BATCH_SIZE_PER_GPU has no effect. Image size and GPU memory size are the only factors that cause OOM. This means you should reduce the image size or buy a better GPU.

@Vadim2S
Author

Vadim2S commented Nov 18, 2021

You mean the horizontal and vertical size of the tested image files? OK.

P.S. What are the IMAGE_SIZE properties in the TRAIN and TEST sections of the CAT_full.yaml file?

@CauchyComplete
Collaborator

In train.py: TRAIN.BATCH_SIZE_PER_GPU and TRAIN.IMAGE_SIZE determine the batch size and the image crop size, respectively.
In infer.py: those are ignored. The batch size is fixed to 1, and the image size is simply the size of the input image.
Also, note that reducing the image size by resizing is itself another image manipulation, so performance may deteriorate. One remedy is cropping. Please use grid-aligned cropping as described in the paper, and avoid random cropping. Cropped images lose the overall content, so performance may also deteriorate, but less than with resizing, I think.
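A minimal sketch of what grid-aligned cropping could look like, assuming the grid in question is the 8×8 JPEG block grid (the exact grid and crop size depend on the paper and your setup, and `grid_aligned_crops` is a hypothetical helper, not part of this repo):

```python
import numpy as np

def grid_aligned_crops(img: np.ndarray, crop: int = 1024, grid: int = 8):
    """Yield (y, x, patch) crops whose top-left corners lie on the grid.

    Assumption: the grid is the 8x8 JPEG block grid, so every crop
    offset is a multiple of 8. Edge tiles may be smaller than `crop`.
    """
    h, w = img.shape[:2]
    step = crop - crop % grid  # keep the stride grid-aligned too
    for y in range(0, h, step):
        for x in range(0, w, step):
            yield y, x, img[y:y + crop, x:x + crop]
```

Because every offset is a multiple of the grid size, each crop sees the same compression-block phase as the original image, which is the point of grid alignment as opposed to random cropping.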

@Vadim2S
Author

Vadim2S commented Nov 18, 2021

Very interesting! Sometimes I want to test 4000x6000 images, and no GPU with that much memory is available.

Conclusion: the best way to infer is to grid-align crop the large image into several small images, test them separately, and combine the results back into one big image. Right?
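The crop, infer separately, and recombine loop could be sketched as below. Here `infer_fn` is a hypothetical stand-in for the model's forward pass, and the sketch assumes the model returns a per-pixel prediction map the same size as its input tile (neither the name nor that contract comes from this repo):

```python
import numpy as np

def infer_tiled(img: np.ndarray, infer_fn, crop: int = 1024, grid: int = 8):
    """Run infer_fn on grid-aligned tiles and paste the results back.

    Assumptions: infer_fn(tile) returns a 2-D float map with the same
    height/width as the tile, and crop offsets must stay on the 8x8 grid.
    """
    h, w = img.shape[:2]
    out = np.zeros((h, w), dtype=np.float32)
    step = crop - crop % grid  # grid-aligned stride
    for y in range(0, h, step):
        for x in range(0, w, step):
            tile = img[y:y + crop, x:x + crop]
            out[y:y + tile.shape[0], x:x + tile.shape[1]] = infer_fn(tile)
    return out
```

Peak GPU memory is then bounded by the largest tile rather than the full 4000x6000 image; the trade-off is that each tile is scored without the surrounding context, as noted above.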

@CauchyComplete
Collaborator

Yes.

@CauchyComplete CauchyComplete pinned this issue Jan 22, 2024
@CauchyComplete CauchyComplete changed the title How to infer on less RAM GPU? I am want work on 8 GB. How to infer on less RAM GPU? I want work on 8 GB. Jan 22, 2024
@CauchyComplete CauchyComplete changed the title How to infer on less RAM GPU? I want work on 8 GB. How to infer on less RAM GPU? I want to work on 8 GB. Jan 22, 2024