Metody adaptace neuronového strojového překladu
| Thesis title in Czech: | Metody adaptace neuronového strojového překladu |
|---|---|
| Thesis title in English: | Adaptation methods of Neural Machine Translation |
| Key words: | hluboké neuronové sítě, neuronový strojový překlad |
| English key words: | deep neural networks, neural machine translation |
| Academic year of topic announcement: | 2020/2021 |
| Thesis type: | dissertation |
| Thesis language: | |
| Department: | Institute of Formal and Applied Linguistics (32-UFAL) |
| Supervisor: | doc. RNDr. Pavel Pecina, Ph.D. |
| Author: | hidden |
| Date of registration: | 08.09.2020 |
| Date of assignment: | 08.09.2020 |
| Confirmed by Study dept. on: | 30.09.2020 |
Guidelines
Neural methods have become the state of the art in machine translation and have brought about significant improvements in translation quality. This thesis will focus on methods for adapting neural machine translation to specific languages (e.g., dialects) and/or domains for which sufficient amounts of parallel training data are not available.
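For illustration only (this sketch is not part of the official assignment and does not represent the thesis's eventual method), one common adaptation strategy in this setting is to continue training ("fine-tune") a pretrained NMT model on a small in-domain or in-language parallel corpus. The minimal sketch below assumes the Hugging Face transformers library and the pretrained Helsinki-NLP/opus-mt-en-cs MarianMT model; the example sentence pair, learning rate, and number of passes are illustrative placeholders.

```python
# Illustrative sketch: domain adaptation of a pretrained NMT model by
# continued training (fine-tuning) on a small in-domain parallel corpus.
# Model choice, data, and hyperparameters are placeholders, not the thesis's method.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-cs"   # assumed pretrained English-Czech model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Tiny in-domain parallel corpus (placeholder sentences).
src_sentences = ["The patient was administered 5 mg of the drug."]
tgt_sentences = ["Pacientovi bylo podáno 5 mg léku."]

# Tokenize source and target; the returned batch contains input_ids,
# attention_mask, and labels for the cross-entropy loss.
batch = tokenizer(src_sentences, text_target=tgt_sentences,
                  return_tensors="pt", padding=True, truncation=True)
# (In a larger batch, pad token ids in the labels would typically be
#  replaced with -100 so that padding is ignored by the loss.)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)  # small LR to limit forgetting
model.train()
for _ in range(3):                # a few passes over the small in-domain data
    outputs = model(**batch)      # forward pass returns the training loss
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

A small learning rate and few update steps are typical here, since aggressive fine-tuning on little data risks degrading the model's general-domain performance (catastrophic forgetting).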
References
Goodfellow, I., Y. Bengio, and A. Courville. 2016. Deep Learning. Cambridge, MA, USA: MIT Press.
Firat, O., K. Cho, and Y. Bengio. 2016. Multi-way, Multilingual Neural Machine Translation with a Shared Attention Mechanism. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 866–875, San Diego, California. Association for Computational Linguistics.
Vaswani, A., N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin. 2017. Attention Is All You Need. In Advances in Neural Information Processing Systems 30, pages 6000–6010. Curran Associates, Inc.