Proceedings of the 2019 International Conference on Computer, Network, Communication and Information Systems (CNCI 2019)

An Improved Mechanism for Universal Sentence Representations Learnt from Natural Language Inference Data Using Bi-directional Information

Authors
Dian Jiao, Sheng Gao, Baodong Zhang
Corresponding Author
Dian Jiao
Available Online May 2019.
DOI
10.2991/cnci-19.2019.26
Keywords
Universal Sentence Encoder, Supervised, SNLI, Transfer Tasks, Pooling, Attention.
Abstract

BiLSTM with max pooling has been adopted as a well-performing supervised universal sentence encoder. Max pooling is a common mechanism for obtaining a fixed-size sentence representation, but we find that max pooling in the sentence encoder discards useful backward and forward information at each time step and depends on a large number of parameters. In this paper, we propose an improved pooling mechanism based on max pooling for a universal sentence encoder. The proposed model uses three methods to refine the backward and forward information at each time step, and then applies a max-pooling layer or an attention mechanism to obtain a fixed-size sentence representation from the variable-length refined hidden states. Experiments are conducted on the Stanford Natural Language Inference (SNLI) corpus, and the resulting encoder is used as a pretrained universal sentence encoder for transfer tasks. Results show that our model performs better with fewer parameters.
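For readers unfamiliar with the pooling step the abstract refers to, the sketch below illustrates how a BiLSTM encoder can collapse variable-length hidden states into a fixed-size sentence vector via max pooling or a simple attention mechanism. This is a minimal illustration assuming PyTorch; class and parameter names (BiLSTMSentenceEncoder, hidden_dim, att) are illustrative and do not reproduce the authors' refinement methods.

```python
# Minimal sketch (PyTorch assumed): BiLSTM encoder whose per-time-step hidden
# states are pooled into a fixed-size sentence representation, either by
# element-wise max pooling or by additive attention. Illustrative only.
import torch
import torch.nn as nn


class BiLSTMSentenceEncoder(nn.Module):
    def __init__(self, embed_dim=300, hidden_dim=512, pooling="max"):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.pooling = pooling
        # scoring layer for additive attention over the 2*hidden_dim states
        self.att = nn.Linear(2 * hidden_dim, 1)

    def forward(self, embeddings):
        # embeddings: (batch, seq_len, embed_dim) word vectors
        hidden, _ = self.lstm(embeddings)   # (batch, seq_len, 2*hidden_dim)
        if self.pooling == "max":
            # element-wise max over time steps -> (batch, 2*hidden_dim)
            return hidden.max(dim=1).values
        # attention pooling: softmax-weighted sum of the hidden states
        weights = torch.softmax(self.att(hidden), dim=1)  # (batch, seq_len, 1)
        return (weights * hidden).sum(dim=1)              # (batch, 2*hidden_dim)


# Usage: encode a batch of 4 sentences of length 10 with random embeddings.
encoder = BiLSTMSentenceEncoder()
sentence_vec = encoder(torch.randn(4, 10, 300))
print(sentence_vec.shape)  # torch.Size([4, 1024])
```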

Copyright
© 2019, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).


Volume Title
Proceedings of the 2019 International Conference on Computer, Network, Communication and Information Systems (CNCI 2019)
Series
Advances in Computer Science Research
Publication Date
May 2019
ISBN
978-94-6252-713-3
ISSN
2352-538X
DOI
10.2991/cnci-19.2019.26

Cite this article

TY  - CONF
AU  - Dian Jiao
AU  - Sheng Gao
AU  - Baodong Zhang
PY  - 2019/05
DA  - 2019/05
TI  - An Improved Mechanism for Universal Sentence Representations Learnt from Natural Language Inference Data Using Bi-directional Information
BT  - Proceedings of the 2019 International Conference on Computer, Network, Communication and Information Systems (CNCI 2019)
PB  - Atlantis Press
SP  - 191
EP  - 198
SN  - 2352-538X
UR  - https://doi.org/10.2991/cnci-19.2019.26
DO  - 10.2991/cnci-19.2019.26
ID  - Jiao2019/05
ER  -