
Sequence to Sequence Modeling for English-German translation

Sequence-to-sequence modeling has achieved strong performance on tasks where the input is a sequence of tokens (for example, words) and the output is also a sequence of tokens. These models are widely used for machine translation, abstractive summarization, image captioning, and more. The notebook SageMaker-Seq2Seq-Translation-English-German.ipynb provides an end-to-end example of training an English-German translation model.
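
As a rough orientation (not the notebook's exact code), a training job for the built-in SageMaker seq2seq algorithm can be launched with the SageMaker Python SDK roughly as sketched below. The bucket paths are placeholders, and the hyperparameter values and instance type are illustrative assumptions rather than the notebook's exact configuration:

```python
# A minimal sketch, assuming SageMaker Python SDK v2 and data already
# prepared in S3. Bucket names and hyperparameter values are placeholders,
# not the notebook's exact setup.
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = sagemaker.get_execution_role()  # assumes a SageMaker notebook/role

# Resolve the container image for the built-in seq2seq algorithm.
container = image_uris.retrieve("seq2seq", session.boto_region_name)

estimator = Estimator(
    image_uri=container,
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",  # GPU instance; training still takes hours
    output_path="s3://<your-bucket>/seq2seq/output",  # placeholder bucket
    sagemaker_session=session,
)

# Illustrative hyperparameters for the built-in seq2seq algorithm.
estimator.set_hyperparameters(
    max_seq_len_source=60,
    max_seq_len_target=60,
    optimized_metric="bleu",
    batch_size=64,
)

# The algorithm expects protobuf-encoded data plus vocabulary files,
# supplied through the "train", "validation", and "vocab" channels.
estimator.fit(
    {
        "train": "s3://<your-bucket>/seq2seq/train",
        "validation": "s3://<your-bucket>/seq2seq/validation",
        "vocab": "s3://<your-bucket>/seq2seq/vocab",
    }
)
```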

Also, because training takes a long time to complete, the notebook includes a section that works with a pre-trained model (a model we trained with the exact same setup by running training for roughly 10 hours), which you can use to test the inference experience.
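
For the inference path, the general shape looks like the sketch below: deploy the pre-trained artifacts to an endpoint and send it a JSON request. This is a hedged illustration, not the notebook's exact code; the S3 path to the model artifacts and the instance type are placeholders:

```python
# A minimal sketch, assuming pre-trained model artifacts (model.tar.gz)
# are already in S3. The S3 path below is a placeholder, not the actual
# location used by the notebook.
import sagemaker
from sagemaker import image_uris
from sagemaker.model import Model
from sagemaker.serializers import JSONSerializer
from sagemaker.deserializers import JSONDeserializer

session = sagemaker.Session()
role = sagemaker.get_execution_role()
container = image_uris.retrieve("seq2seq", session.boto_region_name)

model = Model(
    image_uri=container,
    model_data="s3://<your-bucket>/seq2seq/pretrained/model.tar.gz",  # placeholder
    role=role,
    sagemaker_session=session,
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    serializer=JSONSerializer(),
    deserializer=JSONDeserializer(),
)

# The built-in seq2seq container accepts JSON with an "instances" list of
# tokenized source sentences and returns translations.
payload = {"instances": [{"data": "you are so good !"}]}
response = predictor.predict(payload)
print(response)  # e.g. {"predictions": [{"translation": "..."}]}

# Delete the endpoint when done to avoid ongoing charges.
predictor.delete_endpoint()
```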