
use cpu to inference #95

Open
qpzhao opened this issue Jul 23, 2020 · 4 comments

Comments

qpzhao commented Jul 23, 2020

Hi, how can I set parameters or modify "translator.py" to run inference on the CPU?

Playinf (Collaborator) commented Aug 19, 2020

Unfortunately, the PyTorch implementation currently does not support CPU for inference.

qpzhao (Author) commented Aug 24, 2020

Unfortunately, the PyTorch implementation currently does not support CPU for inference.

Thanks. But I use the TensorFlow implementation. Does the TensorFlow implementation support CPU inference?

GrittyChen (Member) commented

@qpzhao Yes, the TensorFlow implementation supports inference with CPU.
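One common way to force TensorFlow onto the CPU without touching the code is to hide all CUDA devices before TensorFlow initializes; with no GPU visible, every op is placed on the CPU. A minimal sketch (this relies on the standard `CUDA_VISIBLE_DEVICES` mechanism, not on anything specific to this repository):

```python
import os

# Hiding all CUDA devices before TensorFlow initializes forces it to
# place every op on the CPU; no change to translator.py is required.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# import tensorflow as tf  # must come AFTER the environment variable is set
```

Equivalently, you can set the variable on the command line when launching the script, e.g. `CUDA_VISIBLE_DEVICES= python translator.py ...`.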

GrittyChen (Member) commented

Hi, How can I set params or modify "translator.py" for using cpu to inference?

The PyTorch implementation now supports CPU inference. You can add the --cpu flag to make translator.py run on the CPU. Note that when you run inference on the CPU, you are not allowed to use the --half flag.
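The flag behavior described above can be sketched with argparse. This is a hypothetical illustration of how such a parser might enforce the --cpu / --half restriction; the actual argument handling in translator.py may differ:

```python
import argparse

def parse_args(argv=None):
    # Hypothetical sketch of the two flags described above; the real
    # translator.py parser may define them differently.
    parser = argparse.ArgumentParser(description="translate with a trained model")
    parser.add_argument("--cpu", action="store_true",
                        help="run inference on the CPU instead of the GPU")
    parser.add_argument("--half", action="store_true",
                        help="use FP16 inference (GPU only)")
    args = parser.parse_args(argv)
    # FP16 kernels are generally unavailable on CPU, so the two flags
    # are mutually exclusive.
    if args.cpu and args.half:
        parser.error("--half cannot be combined with --cpu")
    return args

args = parse_args(["--cpu"])
print(args.cpu, args.half)  # → True False
```

Passing both flags makes the parser exit with an error, matching the restriction noted in the comment above.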
