Use CPU for inference #95

Hi, how can I set parameters or modify "translator.py" to use the CPU for inference?

Comments
Unfortunately, the PyTorch implementation currently does not support CPU for inference.

Thanks. But I am using the TensorFlow implementation. Does the TensorFlow implementation support CPU for inference?

@qpzhao Yes, the TensorFlow implementation supports inference on the CPU.

Now the PyTorch implementation also supports CPU for inference. You can add a parameter
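The thread does not show which parameter was added to translator.py, so here is only a minimal sketch of what CPU inference looks like in plain PyTorch. The model class, checkpoint path, and device handling below are illustrative placeholders, not this repository's actual API.

```python
# Minimal sketch of CPU-only inference in PyTorch.
# NOTE: the model (nn.Linear) and the "model.pt" path are hypothetical
# stand-ins; the real project would build its translation model here.
import torch
import torch.nn as nn

device = torch.device("cpu")  # force CPU instead of CUDA

# Stand-in for the real translation model (hypothetical).
model = nn.Linear(8, 8)

# If the checkpoint was saved on a GPU, map_location moves its tensors to
# the CPU so it can be loaded on a machine without CUDA:
# state = torch.load("model.pt", map_location=device)
# model.load_state_dict(state)

model.to(device)
model.eval()

with torch.no_grad():
    dummy_input = torch.randn(1, 8, device=device)
    output = model(dummy_input)
    print(output.shape)
```

The key points are loading the checkpoint with map_location set to the CPU device and moving both the model and its inputs to that device before calling the model.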