nvidia/megatron-1b-nmt
Enable smooth global interactions across 33 languages.
Description:
The Megatron Multilingual 1.5B Neural Machine Translation model translates text in any-to-any directions across the 33 supported languages, including non-English-centric translation (such as French to Chinese). The supported languages are: English (en), Czech (cs), Danish (da), German (de), Greek (el), Spanish (es), Finnish (fi), French (fr), Hungarian (hu), Italian (it), Lithuanian (lt), Latvian (lv), Dutch (nl), Norwegian (no), Polish (pl), Portuguese (pt), Romanian (ro), Russian (ru), Slovak (sk), Swedish (sv), Chinese (zh), Japanese (ja), Hindi (hi), Korean (ko), Estonian (et), Slovenian (sl), Bulgarian (bg), Ukrainian (uk), Croatian (hr), Arabic (ar), Vietnamese (vi), Turkish (tr), Indonesian (id). This model is ready for commercial use.
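Since any-to-any pairs over these 33 codes are valid, client code can validate requested language pairs against a small lookup table before sending a request. A minimal sketch (the dict is transcribed from the list above; the helper name is illustrative, not part of any NVIDIA API):

```python
# Supported language codes, transcribed from the model card's language list.
SUPPORTED_LANGUAGES = {
    "en": "English", "cs": "Czech", "da": "Danish", "de": "German",
    "el": "Greek", "es": "Spanish", "fi": "Finnish", "fr": "French",
    "hu": "Hungarian", "it": "Italian", "lt": "Lithuanian", "lv": "Latvian",
    "nl": "Dutch", "no": "Norwegian", "pl": "Polish", "pt": "Portuguese",
    "ro": "Romanian", "ru": "Russian", "sk": "Slovak", "sv": "Swedish",
    "zh": "Chinese", "ja": "Japanese", "hi": "Hindi", "ko": "Korean",
    "et": "Estonian", "sl": "Slovenian", "bg": "Bulgarian", "uk": "Ukrainian",
    "hr": "Croatian", "ar": "Arabic", "vi": "Vietnamese", "tr": "Turkish",
    "id": "Indonesian",
}

def validate_language_pair(source: str, target: str) -> None:
    """Raise ValueError if either code is unsupported.

    Any-to-any pairs are allowed, so source and target are checked
    independently against the same set.
    """
    for code in (source, target):
        if code not in SUPPORTED_LANGUAGES:
            raise ValueError(f"Unsupported language code: {code!r}")
```

Validating a non-English-centric pair such as French to Chinese then reduces to `validate_language_pair("fr", "zh")`.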
Model Architecture
Architecture Type: Transformer
Network Architecture: Megatron
The model is based on the Transformer architecture originally presented in the "Attention Is All You Need" paper [1]. In this particular instance, the model has 24 layers in the encoder and 24 layers in the decoder. It uses a SentencePiece tokenizer [2].
Input:
Input Type(s): Text String
Input Format(s): List
Other Properties Related to Input: No Pre-Processing Needed; No Tokenization required; 512 Character Text String Limit (No non-textual characters)
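The 512-character limit above means longer documents must be split client-side before translation. A minimal sketch of one way to do this (the sentence-boundary heuristic is an assumption; the model card only states the character limit):

```python
import re

def split_for_translation(text: str, max_chars: int = 512) -> list[str]:
    """Split text into chunks of at most max_chars characters.

    Prefers sentence boundaries so each chunk stays a coherent
    translation unit; falls back to hard splits for oversized
    sentences. The boundary heuristic is illustrative only.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current = [], ""
    for sentence in sentences:
        if len(sentence) > max_chars:
            # Oversized sentence: flush the buffer, then hard-split it.
            if current:
                chunks.append(current)
                current = ""
            chunks.extend(sentence[i:i + max_chars]
                          for i in range(0, len(sentence), max_chars))
        elif len(current) + len(sentence) + 1 <= max_chars:
            current = f"{current} {sentence}".strip()
        else:
            chunks.append(current)
            current = sentence
    if current:
        chunks.append(current)
    return chunks
```

The resulting list of strings matches the model's expected input format (a list of text strings, each within the 512-character limit).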
Output:
Output Type(s): Text String
Output Format: List
Output Parameters: Selected Language
Other Properties Related to Output: Outputs are not tokenized or processed to hide sensitive input information
Training & Evaluation Dataset:
Data Collection Method by dataset:
- Human
Labeling Method by dataset:
- Automated
Properties (Quantity, Dataset Descriptions, Sensor(s)): This model is trained on open-source datasets and on synthetic datasets of parallel text corpora generated via back-translation.
References:
[1] Vaswani, Ashish, et al. "Attention is all you need." arXiv preprint arXiv:1706.03762 (2017).
[2] https://github.com/google/sentencepiece
[3] https://en.wikipedia.org/wiki/BLEU
[4] https://github.com/mjpost/sacreBLEU
[5] NVIDIA NeMo Toolkit
Software Integration
Runtime Engine(s):
- Riva 2.15.0 or Higher
Supported Operating System(s):
- Linux
Model Version(s):
nmt_megatron_1b_any_any:2.15.1
Inference
Engine: Triton
Test Hardware:
- NVIDIA H100 GPU
- NVIDIA A100 GPU
- NVIDIA L40 GPU
Ethical Considerations (For NVIDIA Models Only):
NVIDIA believes Trustworthy AI is a shared responsibility and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their internal model team to ensure this model meets requirements for the relevant industry and use case and addresses unforeseen product misuse. For more detailed information on ethical considerations for this model, please see the Model Card++ Explainability, Bias, Safety & Security, and Privacy Subcards. Please report security vulnerabilities or NVIDIA AI Concerns here.
GOVERNING TERMS:
This trial is governed by the NVIDIA API Trial Terms of Service (found at https://assets.ngc.nvidia.com/products/api-catalog/legal/NVIDIA%20API%20Trial%20Terms%20of%20Service.pdf). The use of this model is governed by the AI Foundation Models Community License Agreement (found at NVIDIA Agreements | Enterprise Software | NVIDIA AI Foundation Models Community License Agreement).