Slot Filling with Delexicalized Sentence Generation

Information

Title Slot Filling with Delexicalized Sentence Generation
Authors
Youhyun Shin, Kang Min Yoo, Sang-goo Lee
Year 2018 / 09
Keywords spoken language understanding, slot filling, delexicalization, encoder-decoder, attention model
Acknowledgement HPC
Publication Type International Conference
Publication Interspeech 2018
Link url

Abstract

We introduce a novel approach that jointly learns slot filling and delexicalized sentence generation. There have been recent attempts to tackle slot filling as a sequence labeling problem using an encoder-decoder attention framework. We further improve this framework by training the model to generate delexicalized sentences, in which the words corresponding to slot values are replaced with their slot labels. Slot filling with delexicalization shows better results than models trained with the single objective of filling slots. The proposed method achieves state-of-the-art slot filling performance on the ATIS dataset. We experiment with different variants of our model and find that delexicalization encourages generalization by sharing weights among words with the same labels and helps the model further leverage certain linguistic features.
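To make the delexicalization target concrete, the sketch below replaces slot-value words with their slot labels in a BIO-tagged utterance. The example sentence, labels, and helper function are illustrative assumptions in the style of ATIS annotations, not taken from the paper.

```python
def delexicalize(tokens, bio_labels):
    """Replace each slot-value span with its slot label (illustrative sketch)."""
    out = []
    for token, label in zip(tokens, bio_labels):
        if label == "O":
            out.append(token)      # keep non-slot words as-is
        elif label.startswith("B-"):
            out.append(label[2:])  # emit the slot label once at span start
        # "I-" tokens continue a span whose label was already emitted, so skip them
    return out

# Hypothetical ATIS-style utterance with BIO slot labels
tokens = ["show", "flights", "from", "new", "york", "to", "boston"]
labels = ["O", "O", "O", "B-fromloc.city_name", "I-fromloc.city_name",
          "O", "B-toloc.city_name"]

print(" ".join(delexicalize(tokens, labels)))
# show flights from fromloc.city_name to toloc.city_name
```

In the delexicalized sentence, "new york" and "boston" collapse to their slot labels, so the generation objective shares weights across all city names rather than memorizing individual values.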