Review on Machine Translation Model under Low Resource Condition
- DOI
- 10.2991/978-94-6463-370-2_30
- Keywords
- Machine Translation; Natural Language Processing; Low Resource Condition
- Abstract
This paper provides a comprehensive review of the development, challenges, and future prospects of machine translation. It traces the evolution from rule-based systems to neural machine translation (NMT) models, using recurrent neural networks (RNNs) and convolutional neural networks (CNNs) as examples of approaches to improving accuracy and fluency. Multilingual translation, domain adaptation, and decoding acceleration are also discussed as promising directions for development. Despite this progress, challenges remain, such as handling rare words and long sentences. The paper emphasizes the importance of conducting research across a variety of languages to overcome these limitations. Overall, machine translation will continue to evolve toward greater accuracy, efficiency, and intelligence.
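To make the RNN-based NMT architecture mentioned in the abstract concrete, below is a minimal sketch of a GRU encoder-decoder translation model. This is an illustrative assumption of the general approach, not code from the reviewed paper; the vocabulary sizes, dimensions, and the `Seq2Seq` class itself are hypothetical placeholders.

```python
# Minimal sketch of an RNN encoder-decoder for machine translation.
# All sizes and names here are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the source sentence into a hidden state.
        _, hidden = self.encoder(self.src_emb(src_ids))
        # Decode the target sentence conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), hidden)
        return self.out(dec_out)  # logits over the target vocabulary

# Usage example with random token ids: a batch of 2 sequences of length 5.
model = Seq2Seq(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(0, 8000, (2, 5))
tgt = torch.randint(0, 8000, (2, 5))
logits = model(src, tgt)  # shape: (2, 5, 8000)
```

Compressing the whole source sentence into a single fixed-size state is exactly what makes long sentences and rare words difficult, which motivates the attention-based and multilingual extensions the review surveys.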
- Copyright
- © 2024 The Author(s)
- Open Access
- This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
Cite this article
TY  - CONF
AU  - Yihang Feng
PY  - 2024
DA  - 2024/02/14
TI  - Review on Machine Translation Model under Low Resource Condition
BT  - Proceedings of the 2023 International Conference on Data Science, Advanced Algorithm and Intelligent Computing (DAI 2023)
PB  - Atlantis Press
SP  - 270
EP  - 282
SN  - 1951-6851
UR  - https://doi.org/10.2991/978-94-6463-370-2_30
DO  - 10.2991/978-94-6463-370-2_30
ID  - Feng2024
ER  -