Cell-aware Stacked LSTMs for Modeling Sentences

Information

Title Cell-aware Stacked LSTMs for Modeling Sentences
Authors Jihun Choi, Taeuk Kim, Sang-goo Lee
Year 2019 (November)
Keywords natural language processing, text classification, sentence encoder
Publication Type International Conference
Publication The 11th Asian Conference on Machine Learning (ACML 2019)

Abstract

We propose a method of stacking multiple long short-term memory (LSTM) layers for modeling sentences. In contrast to conventional stacked LSTMs, where only hidden states are fed as input to the next layer, our architecture accepts both the hidden and memory cell states of the preceding layer and fuses information from the left (the previous time step) and the lower (the preceding layer) contexts using the soft gating mechanism of LSTMs. The proposed stacked LSTM architecture thus modulates the amount of information delivered not only along horizontal recurrence but also along vertical connections, so that useful features extracted in lower layers are effectively conveyed to upper layers. We dub this architecture Cell-aware Stacked LSTM (CAS-LSTM) and show experimentally that our models achieve state-of-the-art results on benchmark datasets for natural language inference, paraphrase detection, and sentiment classification.
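
The abstract describes each upper-layer cell receiving both the hidden state and the memory cell state of the layer below and fusing them with the left context via soft gating. Below is a minimal PyTorch sketch of one plausible formulation of such a cell; the class name `CASLSTMCell`, the extra gate `g`, and the exact cell update are illustrative assumptions based on the abstract, not the paper's published equations.

```python
import torch
import torch.nn as nn


class CASLSTMCell(nn.Module):
    """One step of a cell-aware stacked LSTM layer (illustrative sketch)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Five gates instead of the usual four: input (i), forget (f),
        # output (o), candidate (u), plus an extra gate (g) that modulates
        # the memory cell state coming from the layer below.
        self.linear = nn.Linear(input_size + hidden_size, 5 * hidden_size)

    def forward(self, h_below, c_below, h_prev, c_prev):
        # h_below: hidden state of the layer below at the current time step
        #          (the word embedding at the bottom layer).
        # c_below: cell state of the layer below (zeros at the bottom layer).
        # h_prev, c_prev: this layer's states from the previous time step.
        z = self.linear(torch.cat([h_below, h_prev], dim=-1))
        i, f, o, u, g = z.chunk(5, dim=-1)
        i, f, o, g = (torch.sigmoid(x) for x in (i, f, o, g))
        u = torch.tanh(u)
        # Fuse the left context (f * c_prev) and the lower context
        # (g * c_below) via soft gating, then add the new candidate.
        c = f * c_prev + g * c_below + i * u
        h = o * torch.tanh(c)
        return h, c


# Usage: one time step of an upper layer, batch size 8, 300-d states.
cell = CASLSTMCell(input_size=300, hidden_size=300)
h_below = torch.randn(8, 300)   # from the layer below
c_below = torch.randn(8, 300)
h_prev = torch.zeros(8, 300)    # this layer's previous states
c_prev = torch.zeros(8, 300)
h, c = cell(h_below, c_below, h_prev, c_prev)
```

Note that with `g` fixed to zero this reduces to a standard stacked LSTM layer, which is one way to read the abstract's claim that the vertical connection is an additional gated pathway rather than a replacement for the usual one.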