NMT Pre-trained Models


The Neural Machine Translation (NMT) models downloadable from this page are in-domain models, meaning they are trained and tested only on specialized data, so they can perform better than generic models for the stated domain. In other words, in-domain models adhere to the domain's terminology and generate translations that are much more consistent with the specialized context.

How to translate with pre-trained models

There are three methods to use the downloadable pre-trained NMT models.

• First Method:

You can use OpenNMT-py (PyTorch version) from the command line as follows:

python3 OpenNMT-py/translate.py -model model.pt -src source.txt -output target.txt -replace_unk

• Second Method:

You can run a simple OpenNMT-py REST API server and send it translation requests over HTTP, as sketched below.
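For example, once the REST server is running (OpenNMT-py ships a server.py script that is typically started with a command such as python3 server.py --ip 0.0.0.0 --port 5000 --url_root /translator --config available_models/conf.json, where conf.json points to your model.pt; the exact flags and config layout can differ between OpenNMT-py versions), a minimal Python client could look like the sketch below. The server URL, port, and model id are assumptions here and should be adjusted to your own configuration.

import requests  # third-party HTTP client; install with: pip install requests

# Assumed endpoint: a server running locally on port 5000 with url_root "/translator".
SERVER_URL = "http://localhost:5000/translator/translate"

def translate(text, model_id=100):
    # "id" must match a model id defined in your conf.json (100 is only an example).
    payload = [{"src": text, "id": model_id}]
    response = requests.post(SERVER_URL, json=payload, timeout=60)
    response.raise_for_status()
    # The server returns a nested list of results; the translation is in "tgt"
    # (the exact response shape may vary slightly between versions).
    return response.json()[0][0]["tgt"]

if __name__ == "__main__":
    print(translate("Bonjour le monde"))

Note that the source text should be preprocessed (tokenized or subword-segmented) in the same way as the training data, unless your server configuration applies the tokenization for you.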

• Third Method:

You can use a stand-alone executable GUI. It is still basic, but it works. Currently, I (Yasmin) am working on improving the GUI translator and expanding its features. If you have suggestions, please feel free to contact me.

Framework

Currently, all the models on this page were trained with OpenNMT-py, which is based on PyTorch.

Licence of Models

The models are for internal use only. If you use them for research, please attribute them as follows:
Yasmin Moslem, 2020, OpenNMT-py Pre-trained Models, MachineTranslation.io

Download NMT Pre-trained Models

 
Language: English-Arabic
Domain: Software Localization
Configuration: OpenNMT-py RNN-LSTM, default options
Data: MS, ≈500k segments
BLEU: 48

Language: French-English
Domain: International Organizations
Configuration: OpenNMT-py Transformer, standard options, 140k steps
Data: UN Corpus, ≈20M segments
BLEU: 49

Language: French-English
Domain: International Organizations
Configuration: OpenNMT-py Transformer, standard options
Data: MultiUN, ≈13M segments
BLEU: 37