THUMT
Use CPU for inference
Hi, how can I set the parameters or modify "translator.py" to run inference on the CPU?
Unfortunately, the PyTorch implementation currently does not support CPU for inference.
Thanks. But I am using the TensorFlow implementation. Does the TensorFlow implementation support CPU inference?
@qpzhao Yes, the TensorFlow implementation supports inference on the CPU.
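For the TensorFlow implementation, one common way to force CPU inference is to hide all GPUs from TensorFlow before launching the translator. This is generic TensorFlow behavior, not a THUMT-specific option, and the script path and arguments below are illustrative placeholders for whatever you normally pass:

```bash
# Force TensorFlow onto the CPU by hiding all GPUs (generic TF behavior,
# not a THUMT flag). The script path and arguments are placeholders.
CUDA_VISIBLE_DEVICES="" python translator.py \
    --models transformer \
    --input input.txt \
    --output output.txt
```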
The PyTorch implementation now supports CPU inference. You can add the --cpu parameter to make translator.py run on the CPU. Note that when running inference on the CPU, you cannot use the --half parameter.
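For reference, here is a minimal invocation sketch of the PyTorch translator with the new flag. Only --cpu and --half come from this thread; the remaining arguments (model, input/output paths) are hypothetical placeholders for whatever you normally pass:

```bash
# Run translator.py on the CPU (PyTorch implementation).
# Only --cpu is the flag discussed above; other arguments are placeholders.
python translator.py \
    --models transformer \
    --input input.txt \
    --output output.txt \
    --cpu
# Do NOT combine --cpu with --half: half precision is not supported on the CPU.
```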