Sequential Models for Text Classification Using Recurrent Neural Network
- DOI
- 10.2991/aisr.k.200424.050
- Keywords
- sequential, text classification, glove, multilabel
- Abstract
Neural network-based models have recently shown promising results for text classification. However, it remains challenging for such models to capture local features and word order within the context of a sentence. This work proposes a deep learning approach that leverages preceding text to classify a subsequent sentence more precisely. The deep learning method used is a Recurrent Neural Network (RNN) with a Long Short-Term Memory (LSTM) architecture. Four variants of a 1-layer LSTM model were trained on balanced, pre-processed datasets of 20,000, 25,000, 30,000, 35,000, 40,000, and 45,000 records, using the Adam and RMSProp optimizers. The results show that, first, accuracy increases with the amount of input data and, second, Adam performs better than RMSProp as the optimizer in this research. The highest Precision, Recall, and F1-score obtained are 97.
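The core of the approach described in the abstract is a 1-layer LSTM that reads word vectors (e.g. GloVe embeddings) one step at a time and classifies the text from its final hidden state. A minimal numpy sketch of that forward pass is shown below; the function names, weight shapes, and softmax output layer are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # One LSTM time step using the standard gate equations.
    # W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    # with gates stacked in the order: input, forget, cell candidate, output.
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    g = np.tanh(z[2 * H:3 * H])  # candidate cell state
    o = sigmoid(z[3 * H:4 * H])  # output gate
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

def lstm_classify(seq, W, U, b, W_out, b_out):
    # Run a 1-layer LSTM over a sequence of word vectors (e.g. GloVe
    # embeddings) and classify from the final hidden state via softmax.
    H = b.shape[0] // 4
    h = np.zeros(H)
    c = np.zeros(H)
    for x_t in seq:
        h, c = lstm_step(x_t, h, c, W, U, b)
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()
```

In a real implementation the weights would be learned by backpropagation through time with an optimizer such as Adam or RMSProp, as compared in the paper; this sketch only shows the inference path.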
- Copyright
- © 2020, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Winda Kurnia SARI
AU  - Dian Palupi RINI
AU  - Reza Firsandaya MALIK
AU  - Iman Saladin B. AZHAR
PY  - 2020
DA  - 2020/05/06
TI  - Sequential Models for Text Classification Using Recurrent Neural Network
BT  - Proceedings of the Sriwijaya International Conference on Information Technology and Its Applications (SICONIAN 2019)
PB  - Atlantis Press
SP  - 333
EP  - 340
SN  - 1951-6851
UR  - https://doi.org/10.2991/aisr.k.200424.050
DO  - 10.2991/aisr.k.200424.050
ID  - SARI2020
ER  -