Is SeeKeR in interactive chat mode supposed to use the GPU?
Hi, I just installed SeeKeR in a new environment for ParlAI v1.6.0 and have been testing its responses to interactive chat queries. It seems pretty slow, and I noticed that GPU memory usage does not appear to change when I run it. I am using this command...
parlai i -mf zoo:seeker/seeker_dialogue_3B/model -o gen/seeker_dialogue --search-server 127.0.0.1:8080
where the search server I am running is the same one I use for BlenderBot2. Is there a command option that needs to be set to use the GPU? I am running under Windows 10.
Thanks!
do you have gpus available on your machine?
what is the output of python -c "import torch; print(torch.cuda.device_count())"
Hi! Yes, I have an NVIDIA GeForce RTX 2070 Super. Below is the result of the test you suggested...
(D:\conda_env\parlai160) D:\>python -c "import torch; print(torch.cuda.device_count())"
1
(D:\conda_env\parlai160) D:\>
Hi,
Same problem: SeeKeR's GPU memory usage won't change on Windows 10. I'm checking with gpustat --watch
Tried both no_cuda: True and no_cuda: False
Edit:
gpustat returns [0] NVIDIA GeForce GTX 1080 Ti | 47°C, 0 % | 1024 / 11264 MB
are you both able to get other models running on the GPU on your machine via ParlAI?
I checked Blenderbot2 on both my Parlai 1.5.1 and 1.6.0 anaconda environments using this command...
parlai interactive -mf zoo:blenderbot2/blenderbot2_400M/model --search-server 127.0.0.1:8080 --n-docs 2 --fp16 true
I got the same answer to the question I asked ("Who is the current President of the US?"), and verified that both use the search server (the second time I asked the question). However, ParlAI 1.6.0 did not use the GPU and ParlAI 1.5.1 did. Hopefully this information is helpful.
can you try with just parlai interactive -mf zoo:blender/blender_3B/model on 1.6.0 and see if it uses GPU?
I ran parlai interactive -mf zoo:blender/blender_3B/model on 1.6.0 and saw no change in GPU usage, while under 1.5.1 GPU usage went from 1.1 GB to 7 GB. One difference I noticed is that I installed the 1.6.0 repo as a Development Installation, and the model was downloaded into the data folder under that repo folder
D:\conda_env\ParlAI\data\models\blender\blender_3B\BST3B.tgz
and not the 1.6.0 environment folder which is D:\conda_env\parlai160
But I installed 1.5.1 as a Standard Installation and it downloaded the model to
D:\conda_env\parlai151\Lib\site-packages\data\models\blender\blender_3B\BST3B.tgz
where the 1.5.1 environment folder is D:\conda_env\parlai151
The bot appears to work and gives the same answer to the same interactive question in both cases, but could using a Development Installation be the source of the problem?
hmm yes perhaps try installing as a standard install
with the dev install, it's likely your PyTorch CUDA build did not install correctly
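One quick way to confirm this is to check whether the installed torch wheel was even built with CUDA support. A minimal sketch (the helper name cuda_report is just for illustration; torch.version.cuda is None on CPU-only wheels):

```python
def cuda_report():
    """Summarize whether the installed PyTorch build can see a GPU."""
    try:
        import torch
    except ImportError:
        # torch not installed at all
        return {"torch": None, "cuda_built": None, "cuda_available": None}
    return {
        "torch": torch.__version__,           # installed torch version
        "cuda_built": torch.version.cuda,     # None => CPU-only wheel
        "cuda_available": torch.cuda.is_available(),
    }

print(cuda_report())
```

If cuda_built is None, the wheel has no CUDA support and no ParlAI flag will make the model use the GPU; reinstalling a CUDA-enabled build is the fix.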
That fixed it!
But I should note that on Windows you can't just do a pip install parlai for a standard install.
I needed to make a local copy of the requirements.txt file (I called it requirements-local.txt), where I commented out the sh==1.12.14 package to avoid an error (sh is Linux-only) and changed pyzmq==18.1.0 to pyzmq==18.1.1 to avoid an invalid-wheel error. Then I installed with
pip install --no-deps -r requirements-local.txt parlai
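The two edits above can also be scripted; a minimal sketch (the exact pins to patch are the ones from my setup and may differ in other ParlAI versions):

```python
import os

def patch_requirements(lines):
    """Produce a Windows-friendly copy of ParlAI's requirements:
    comment out the Linux-only sh pin and bump pyzmq to a version
    with a valid Windows wheel."""
    patched = []
    for line in lines:
        if line.startswith("sh=="):
            patched.append("# " + line + "  # Linux-only, skipped on Windows")
        elif line.strip() == "pyzmq==18.1.0":
            patched.append("pyzmq==18.1.1")
        else:
            patched.append(line)
    return patched

# Only rewrite the file if we're actually sitting in a ParlAI checkout
if os.path.exists("requirements.txt"):
    with open("requirements.txt") as f:
        lines = f.read().splitlines()
    with open("requirements-local.txt", "w") as f:
        f.write("\n".join(patch_requirements(lines)) + "\n")
```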
But there were still missing packages that I needed to install...
pip install iopath charset-normalizer idna certifi packaging
before the parlai command would work. The parlai install brought in torch but no cudatoolkit, so to be safe I also reinstalled PyTorch with the CUDA toolkit using the command from the PyTorch website.
conda install pytorch torchvision torchaudio cudatoolkit=11.6 -c pytorch -c conda-forge
For good measure, I also installed cudnn, but I don't know if it is needed.
Glad that worked, and thank you for the Windows install instructions! I'll go ahead and close this for now, but please reopen if there are lingering concerns.