Proceedings of the 2017 2nd International Conference on Automatic Control and Information Engineering (ICACIE 2017)

Fast Associative Attentive Memory Network

Authors
Xiaomin Wang, Samuel Cheng
Corresponding Author
Xiaomin Wang
Available Online August 2017.
DOI
10.2991/icacie-17.2017.37
Keywords
Cloze Style, Fast Weights, Attention, Memory Network
Abstract

To solve the Cloze-style reading comprehension task, a challenging task that tests a model's understanding and reasoning abilities, we propose a general and novel model, the Fast Associative Attentive Memory Network. Unlike a regular language model, we use fast weights to store an associative memory of the recent past, rather than relying on hidden-unit activities that attend to the recent past. Our preliminary experiments indicate that our model outperforms regular RNNs and LSTMs.
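The fast-weights idea the abstract refers to can be sketched as follows. This is an illustrative NumPy implementation of a generic fast-weights recurrent step in the style of Ba et al. (2016), not the authors' exact model: a fast weight matrix `A` decays each step and accumulates the outer product of the hidden state, acting as an associative memory of the recent past; an inner loop then lets `A` refine the hidden state. All names and hyperparameters (`lam`, `eta`, `inner_steps`) are assumptions for illustration.

```python
import numpy as np

def fast_weights_step(x, h_prev, A_prev, W_h, W_x,
                      lam=0.9, eta=0.5, inner_steps=1):
    """One recurrent step with fast associative weights (illustrative).

    x      : input vector, shape (d_in,)
    h_prev : previous hidden state, shape (d_h,)
    A_prev : previous fast weight matrix, shape (d_h, d_h)
    W_h    : slow recurrent weights, shape (d_h, d_h)
    W_x    : slow input weights, shape (d_h, d_in)
    """
    # Preliminary hidden state from the slow weights.
    z = W_h @ h_prev + W_x @ x
    h = np.tanh(z)

    # Fast weights: decay the old associative memory, then store
    # the outer product of the new hidden state in it.
    A = lam * A_prev + eta * np.outer(h, h)

    # Inner loop: the fast weights repeatedly refine the hidden
    # state, retrieving associations with recent hidden states.
    for _ in range(inner_steps):
        h = np.tanh(z + A @ h)

    return h, A

# Minimal usage sketch on random data.
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
h, A = np.zeros(d_h), np.zeros((d_h, d_h))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
W_x = rng.normal(scale=0.1, size=(d_h, d_in))
for _ in range(5):                      # run a short sequence
    h, A = fast_weights_step(rng.normal(size=d_in), h, A, W_h, W_x)
```

Because `A` is updated at every time step rather than by gradient descent, it can bind information from the recent past much faster than the slow weights `W_h` and `W_x`, which is the property the paper exploits.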

Copyright
© 2017, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).


Volume Title
Proceedings of the 2017 2nd International Conference on Automatic Control and Information Engineering (ICACIE 2017)
Series
Advances in Engineering Research
Publication Date
August 2017
ISBN
978-94-6252-398-2
ISSN
2352-5401
DOI
10.2991/icacie-17.2017.37

Cite this article

TY  - CONF
AU  - Xiaomin Wang
AU  - Samuel Cheng
PY  - 2017/08
DA  - 2017/08
TI  - Fast Associative Attentive Memory Network
BT  - Proceedings of the 2017 2nd International Conference on Automatic Control and Information Engineering (ICACIE 2017)
PB  - Atlantis Press
SP  - 158
EP  - 161
SN  - 2352-5401
UR  - https://doi.org/10.2991/icacie-17.2017.37
DO  - 10.2991/icacie-17.2017.37
ID  - Wang2017/08
ER  -