
RuntimeError: CUDA out of memory

Open danltw opened this issue 5 years ago • 4 comments

Having some issues with CUDA memory allocation. Is there a way around this, or can I just train on the CPU instead? What do I need to comment out?

RuntimeError: CUDA out of memory. Tried to allocate 14.00 MiB (GPU 0; 3.95 GiB total capacity; 2.56 GiB already allocated; 10.88 MiB free; 2.57 GiB reserved in total by PyTorch)

I'm using a GeForce GTX 1050 Mobile card, so I understand it's not exactly built for high-end processing.

danltw avatar Sep 18 '20 07:09 danltw
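(A general workaround, not specific to text2text: you can hide the GPU from PyTorch by setting the `CUDA_VISIBLE_DEVICES` environment variable before `torch` is imported, which forces everything onto the CPU without editing the library's code.)

```python
import os

# Hide all CUDA devices from PyTorch; this must happen before `import torch`.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

import torch

# With no visible GPU, CUDA is reported as unavailable,
# so the usual device-selection idiom resolves to the CPU.
print(torch.cuda.is_available())
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)
```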

Can you describe the setup? Have you tried the demo Colab notebook?

Yes, it should work on CPU.

artitw avatar Sep 19 '20 16:09 artitw

> Can you describe the setup? Have you tried the demo Colab notebook?
>
> Yes, it should work on CPU.

How do I change the code to force it to run on the CPU?

danltw avatar Sep 21 '20 01:09 danltw

> Can you describe the setup? Have you tried the demo Colab notebook?
>
> Yes, it should work on CPU.

Sorry, I accidentally closed the issue. I followed the README as shown on the repo page and even decreased the number of questions generated. My card is not built for heavy processing like this, as it only has 2 GB of VRAM.

danltw avatar Sep 21 '20 01:09 danltw

@danltw please see this line of code: https://github.com/artitw/text2text/blob/e6bc1fbd24346b470168837797346f08d88736d9/text2text/text_generator.py#L89

Currently, we attempt to use the GPU and fall back to CPU if one is not available. Would you be interested in submitting a pull request that adds functionality for specifying which device to use, so that users can force CPU if necessary?

artitw avatar Sep 26 '20 22:09 artitw
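(The device-selection feature proposed above could be sketched along these lines. This is a hypothetical illustration, not text2text's actual API; at the time of this thread the library does not accept a `device` argument.)

```python
import torch

def select_device(device=None):
    # Hypothetical helper for the requested feature: honor an explicit
    # device string if given; otherwise prefer CUDA when available
    # and fall back to CPU, mirroring the library's current behavior.
    if device is not None:
        return torch.device(device)
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A user hitting CUDA OOM could then force CPU explicitly:
cpu_device = select_device("cpu")
print(cpu_device)  # cpu
```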