Dynamic Compositionality in Recursive Neural Networks with Structure-aware Tag Representations

Information

Title Dynamic Compositionality in Recursive Neural Networks with Structure-aware Tag Representations
Authors Taeuk Kim, Jihun Choi, Daniel Edmiston, Sanghwan Bae, Sang-goo Lee
Year 2019 / 1
Keywords natural language processing, text classification, tree-structured neural network
Acknowledgement HPC
Publication Type International Conference
Publication Thirty-Third AAAI Conference on Artificial Intelligence (AAAI 2019)
Link url

Abstract

Most existing recursive neural network (RvNN) architectures utilize only the structure of parse trees, ignoring syntactic tags, which are provided as by-products of parsing. We present a novel RvNN architecture that can provide dynamic compositionality by considering comprehensive syntactic information derived from both the structure and linguistic tags. Specifically, we introduce a structure-aware tag representation constructed by a separate tag-level tree-LSTM. With this, we can control the composition function of the existing word-level tree-LSTM by feeding the representation as a supplementary input to the gate functions of the tree-LSTM. In extensive experiments, we show that models built upon the proposed architecture obtain superior or competitive performance on several sentence-level tasks, such as sentiment analysis and natural language inference, when compared against previous tree-structured models and other sophisticated neural models.
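To illustrate the core idea of tag-controlled composition, here is a minimal NumPy sketch of a binary tree-LSTM cell whose gates receive a tag vector alongside the child hidden states. All names, dimensions, and initializations below are illustrative assumptions, not details taken from the paper; in particular, the paper builds the tag vector with a separate tag-level tree-LSTM, whereas this sketch just takes it as a given input.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

D, T = 4, 3  # hidden size and tag-representation size (illustrative)

# Hypothetical parameters: one weight matrix per gate, each acting on the
# concatenation of the two child hidden states and the tag representation.
W = {g: rng.standard_normal((D, 2 * D + T)) * 0.1
     for g in ("i", "fl", "fr", "o", "u")}
b = {g: np.zeros(D) for g in W}

def compose(hl, cl, hr, cr, tag):
    """Binary tree-LSTM composition whose gates also condition on a tag vector."""
    x = np.concatenate([hl, hr, tag])
    i  = sigmoid(W["i"]  @ x + b["i"])   # input gate
    fl = sigmoid(W["fl"] @ x + b["fl"])  # forget gate, left child
    fr = sigmoid(W["fr"] @ x + b["fr"])  # forget gate, right child
    o  = sigmoid(W["o"]  @ x + b["o"])   # output gate
    u  = np.tanh(W["u"]  @ x + b["u"])   # candidate cell state
    c = i * u + fl * cl + fr * cr
    h = o * np.tanh(c)
    return h, c

# Compose two leaf states under a hypothetical "NP" tag embedding.
hl, cl = rng.standard_normal(D), rng.standard_normal(D)
hr, cr = rng.standard_normal(D), rng.standard_normal(D)
tag_np = rng.standard_normal(T)
h, c = compose(hl, cl, hr, cr, tag_np)
print(h.shape, c.shape)
```

Because the tag vector enters every gate, the cell can modulate how children are mixed depending on the syntactic category being built, which is the sense in which composition becomes "dynamic".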