
gpu out of memory

Open sybilWsybil opened this issue 4 years ago • 5 comments

Hi: I only have a V100 with 16 GB, not 32 GB. How can I test on faces, and how should I change xxx.json for my 16 GB GPU? Thank you very much.

sybilWsybil avatar Apr 06 '22 09:04 sybilWsybil

If you want to use a 16 GB V100, there are multiple tricks to reduce memory usage. For example, you can reduce the StyleGAN model size to 256×256, use fewer StyleGAN features and reduce the DatasetGAN input feature size as indicated in https://github.com/nv-tlabs/editGAN_release/blob/release_final/experiments/datasetgan_car.json#L6, or reduce the number of ensemble models as indicated in https://github.com/nv-tlabs/editGAN_release/blob/release_final/experiments/datasetgan_car.json#L15.

However, we didn't test that.

arieling avatar Apr 07 '22 18:04 arieling
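A minimal sketch of how those config edits could be scripted. The key names `dim` and `model_num` and the values below are taken from this thread, not verified against the repo's JSON schema, and the path is hypothetical; check them against the actual `experiments/*.json` before relying on this:

```python
import json

def shrink_config(cfg_path, dim=64, model_num=1):
    """Edit a DatasetGAN experiment JSON to lower its memory footprint.

    `dim` (input feature size, see #L6 in the linked config) and
    `model_num` (ensemble size, see #L15) are assumed key names.
    """
    with open(cfg_path) as f:
        cfg = json.load(f)
    cfg["dim"] = dim              # smaller per-pixel feature vector
    cfg["model_num"] = model_num  # fewer ensemble classifiers
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=2)
    return cfg

# Hypothetical usage:
# shrink_config("experiments/datasetgan_face.json")
```

This only shrinks the DatasetGAN branch; reducing the StyleGAN resolution itself would require a different (retrained) checkpoint.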

> If you want to use 16G GPU memory V100, there are also multiple tricks to reduce memory usage. [...] However we didn't test that.

So would this imply that the existing checkpoints are unusable? Also, can multiple GPUs be used to avoid this problem?

Regardless of the fact that this is mostly research-driven, one ought not expect people to have a 12K GPU lying around ;)

TSTsankov avatar Apr 22 '22 16:04 TSTsankov

I have changed dim=64, batch_size=1, model_num=1 and run `python run_app.py`, but it still goes out of memory, and the "tried to allocate 5.88 GB" in the error has not changed. Should I retrain the model? Thanks

sybilWsybil avatar Apr 24 '22 07:04 sybilWsybil

You can use PyTorch half-precision inference, which saves a lot of GPU memory.

silaopi avatar Jun 08 '22 08:06 silaopi
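A minimal sketch of what half-precision conversion looks like, using a stand-in module rather than the actual EditGAN networks (the thread doesn't pin down which script to edit):

```python
import torch

def to_half(model: torch.nn.Module) -> torch.nn.Module:
    """Cast a model's weights to float16, halving parameter memory."""
    return model.half()

def param_bytes(model: torch.nn.Module) -> int:
    """Total bytes occupied by the model's parameters."""
    return sum(p.numel() * p.element_size() for p in model.parameters())

# Illustration with a stand-in layer (not the EditGAN generator):
model = torch.nn.Linear(1024, 1024)
before = param_bytes(model)  # float32: 4 bytes per parameter
model = to_half(model)
after = param_bytes(model)   # float16: 2 bytes per parameter
```

In the repo you would presumably apply `.half()` to the generator and classifiers after their checkpoints are loaded, and cast inputs to match (`x.half()`). `torch.cuda.amp.autocast()` is an alternative that keeps weights in fp32 while running ops in fp16, trading a bit less memory savings for better numerical safety.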

> You can use pytorch half precision inference, which saves lots of GPU memory

Could you tell me where you changed the scripts?

MrTornado24 avatar Aug 21 '22 15:08 MrTornado24