A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism
Authors
WeiJun Yang, ZhiCheng Tang, XinHuai Tang
Corresponding Author
WeiJun Yang
Available Online May 2018.
- DOI
- 10.2991/amcce-18.2018.89
- Keywords
- Neural Abstractive Summarization, Self-Attention Mechanism
- Abstract
Recently, attentional seq2seq models have made remarkable progress on abstractive summarization. However, most of these models do not consider the relations between the original sentences, which are an important feature in extractive methods. In this work, we propose a hierarchical neural model to address this problem. First, we use self-attention to discover the relations between the original sentences. Second, we use a copy mechanism to solve the out-of-vocabulary (OOV) problem. Experiments demonstrate that our model achieves state-of-the-art ROUGE scores on the LCSTS dataset.
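The paper's implementation details are not reproduced on this page, but the sentence-level self-attention it describes is commonly formulated as scaled dot-product attention over one vector per sentence (the second level of the hierarchy, above a word-level encoder). The following PyTorch sketch illustrates that idea only; the class name, layer choices, and dimensions are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of sentence-level scaled dot-product self-attention.
# Assumption: each sentence has already been encoded into a single
# vector by a word-level encoder (the first level of the hierarchy).
import math
import torch
import torch.nn as nn

class SentenceSelfAttention(nn.Module):
    """Relates every sentence representation to every other one."""
    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, sents: torch.Tensor) -> torch.Tensor:
        # sents: (batch, num_sentences, dim), one vector per sentence.
        q, k, v = self.query(sents), self.key(sents), self.value(sents)
        # Pairwise sentence-to-sentence scores, scaled by sqrt(dim).
        scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(sents.size(-1))
        weights = torch.softmax(scores, dim=-1)  # inter-sentence relations
        # Each output vector is a relation-weighted mix of all sentences.
        return torch.matmul(weights, v)

# Usage: 2 documents, 5 sentences each, 128-dim sentence vectors.
attn = SentenceSelfAttention(dim=128)
out = attn(torch.randn(2, 5, 128))
print(out.shape)  # torch.Size([2, 5, 128])
```

The enriched sentence vectors would then feed the decoder, where a copy mechanism (as in pointer-generator models) can select source words directly to handle OOV tokens.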
- Copyright
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - WeiJun Yang
AU  - ZhiCheng Tang
AU  - XinHuai Tang
PY  - 2018/05
DA  - 2018/05
TI  - A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism
BT  - Proceedings of the 2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)
PB  - Atlantis Press
SP  - 514
EP  - 518
SN  - 2352-5401
UR  - https://doi.org/10.2991/amcce-18.2018.89
DO  - 10.2991/amcce-18.2018.89
ID  - Yang2018/05
ER  -