Im2Text
CPU deployments?
This library looks excellent, and it works great when training with a GPU. Eventually, though, I'll need to embed the model in an app that won't have GPU capability. Is it possible to run the forward pass on a CPU? If not, could you provide any pointers on what to tweak to make it work?
Thanks for open sourcing such a great project!
Thanks for your interest! Currently we use cuDNN in the CNN part, so the model cannot run in a pure-CPU environment as-is. However, I think you can use cudnn.convert
(https://github.com/soumith/cudnn.torch#conversion-between-cudnn-and-nn) to convert the cuDNN modules to their nn equivalents.
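For reference, a minimal sketch of that conversion might look like the following. This assumes the trained checkpoint is a single serialized network (the filenames here are illustrative, not the ones Im2Text actually produces):

```lua
require 'cudnn'
require 'nn'

-- Load the GPU-trained checkpoint (hypothetical filename).
local model = torch.load('model.t7')

-- Replace cudnn.* modules with their nn.* equivalents in place.
cudnn.convert(model, nn)

-- Move the weights from CUDA tensors to CPU float tensors.
model:float()

-- Save a CPU-only checkpoint for deployment.
torch.save('model_cpu.t7', model)
```

Note that cudnn.convert only swaps modules that have nn counterparts; anything cuDNN-specific without an nn equivalent would still need manual handling.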
@da03 In OpenNMT we have a tool called release_model.lua
for exactly this use case: https://github.com/OpenNMT/OpenNMT/blob/master/tools/release_model.lua . Can we check whether that works for Im2Text?
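If it does apply here, the invocation would presumably follow the usual OpenNMT tools pattern, something like the sketch below (the flag names are assumptions and should be checked against the options declared in tools/release_model.lua):

```shell
# Run from the OpenNMT root; -model points at the trained checkpoint,
# -gpuid selects the GPU used to load it before conversion.
th tools/release_model.lua -model model_checkpoint.t7 -gpuid 1
```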
Sure I'll check.