Robin Li

4 comments by Robin Li

> > 0 only use CPU for inference
>
> Are you using just the CPU for inference, or CUDA 0 and the CPU?

I changed the code so that it uses only the CPU...

The code change looks like this: ![image](https://user-images.githubusercontent.com/58796726/231371510-7355321e-3c52-4592-8e8a-08bea26f735e.png) ![image](https://user-images.githubusercontent.com/58796726/231371610-93985b50-f33f-4f73-b070-aced3a4a16df.png)
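For reference, here is a minimal sketch of what a CPU-only inference change typically looks like, assuming a PyTorch / Hugging Face Transformers setup; the model name (`gpt2`) and the surrounding code are placeholders, and the actual edit is the one shown in the screenshots above.

```python
# Hypothetical sketch of forcing CPU-only inference (placeholder model name;
# the real change is in the screenshots above and may differ).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = torch.device("cpu")  # force CPU instead of "cuda:0"

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
model.eval()

inputs = tokenizer("Hello, world", return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```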

> > > > 0 only use CPU for inference
> > >
> > > Are you using just the CPU for inference or CUDA 0...

> I found that this was due to the tokenizer not having `truncation` and `max_length` set correctly. Once I set them to appropriate values, I never saw this error...
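For context, here is a minimal sketch of setting those tokenizer options, assuming a Hugging Face Transformers tokenizer; the model name and the `512` limit are placeholder values, not taken from the original issue.

```python
# Hypothetical sketch of enabling truncation with a max_length on the tokenizer
# (512 is only an example; use the context length of the actual model).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

encoded = tokenizer(
    "some very long input text ...",
    truncation=True,   # cut off inputs that exceed max_length
    max_length=512,    # keep inputs within the model's context window
    return_tensors="pt",
)
print(encoded["input_ids"].shape)
```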