Neural Machine Translation with Word Embedding Transferred from Language Model
Article Information

Type
Academic journal
Author
Chanung Jeong (Handong University), Heeyoul Choi (Handong University)
Journal
Journal of Digital Contents Society, Vol.20 No.11 (KCI Accredited Journal)
Published
2019.11
Pages
2,211 - 2,216 (6 pages)
DOI
10.9728/dcs.2019.20.11.2211


Abstract · Keywords

Neural machine translation (NMT) has become the new paradigm in machine translation, but it relies on large amounts of parallel corpora to train neural networks, whereas language models (LMs) can be trained on abundant monolingual corpora. Thus, there have been a few approaches to using monolingual corpora in training NMT systems. In this paper, we propose to use pretrained LMs for NMT. After training two LMs for the source and target languages of NMT, we transfer the word embedding matrices from the LMs to the target NMT model. In experiments on En-De and En-Fi translation tasks, the proposed method keeps translation quality the same or slightly better (up to +0.57 BLEU) while using only around 40% of the previous model size.
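The transfer step described in the abstract can be sketched as follows. This is a minimal illustration only, assuming the LM and NMT models share a tokenization; the function and variable names are hypothetical and not taken from the paper. Rows of the pretrained LM embedding matrix are copied into the NMT embedding for tokens the two vocabularies share, and the remaining rows are randomly initialized.

```python
import numpy as np

rng = np.random.default_rng(0)

def transfer_embeddings(lm_embedding, lm_vocab, nmt_vocab):
    """Initialize an NMT embedding matrix from a pretrained LM embedding.

    Tokens present in the LM vocabulary get their pretrained vectors;
    all other tokens get a small random initialization.
    """
    dim = lm_embedding.shape[1]
    nmt_embedding = rng.normal(scale=0.01, size=(len(nmt_vocab), dim))
    lm_index = {tok: i for i, tok in enumerate(lm_vocab)}
    for j, tok in enumerate(nmt_vocab):
        if tok in lm_index:
            # Copy the pretrained vector for a shared token.
            nmt_embedding[j] = lm_embedding[lm_index[tok]]
    return nmt_embedding

# Toy example: a 4-token LM vocabulary with 3-dimensional embeddings.
lm_vocab = ["<unk>", "the", "cat", "sat"]
lm_emb = rng.normal(size=(4, 3))
nmt_vocab = ["<unk>", "cat", "dog"]   # "dog" is unseen by the LM
nmt_emb = transfer_embeddings(lm_emb, lm_vocab, nmt_vocab)
```

In practice the transferred embedding layers would then be plugged into the source and target sides of the NMT network before training on the parallel corpus; whether they are frozen or fine-tuned is a design choice discussed in the paper itself.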

Contents

[Abstract (Korean)]
[Abstract]
Ⅰ. Introduction
Ⅱ. Background: LM and NMT
Ⅲ. Word Embedding Transfer
Ⅳ. Experiment
Ⅴ. Conclusion
References

References (22)
