A Syllable-based Technique for Word Embeddings of Korean Words

Information

Title A Syllable-based Technique for Word Embeddings of Korean Words
Authors Sanghyuk Choi, Taeuk Kim, Jinseok Seol, Sang-goo Lee
Year 2017 / 08
Keywords convolutional neural network, word embedding, natural language processing
Acknowledgement HPC
Publication Type International Conference
Publication The 1st Workshop on Subword and Character level models in NLP (SCLeM) at EMNLP 2017
Link url

Abstract

Word embedding has become a fundamental component of many NLP tasks such as named entity recognition and machine translation. However, popular models that learn such embeddings are unaware of the morphology of words, so they are not directly applicable to highly agglutinative languages such as Korean. We propose a syllable-based learning model for Korean that uses a convolutional neural network, in which a word representation is composed from trained syllable vectors. Our model produces morphologically meaningful representations of Korean words compared to the original Skip-gram embeddings. The results also show that it is quite robust to the out-of-vocabulary (OOV) problem.
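The abstract only outlines the idea of composing a word vector from syllable vectors with a convolutional network. Below is a minimal sketch of that composition step, assuming a PyTorch-style setup; the class name `SyllableCNN`, the kernel sizes, and the dimensions are illustrative assumptions, not the authors' reported architecture, and the Skip-gram-style training objective mentioned in the abstract is omitted.

```python
import torch
import torch.nn as nn

class SyllableCNN(nn.Module):
    """Illustrative sketch: compose a word vector from its syllable vectors
    with 1D convolutions and max-over-time pooling (hyperparameters assumed)."""
    def __init__(self, n_syllables, syl_dim=50, word_dim=300, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.syl_emb = nn.Embedding(n_syllables, syl_dim, padding_idx=0)
        self.convs = nn.ModuleList([
            nn.Conv1d(syl_dim, word_dim // len(kernel_sizes), k, padding=k - 1)
            for k in kernel_sizes
        ])

    def forward(self, syl_ids):                    # syl_ids: (batch, max_syllables)
        x = self.syl_emb(syl_ids).transpose(1, 2)  # (batch, syl_dim, max_syllables)
        # Max-over-time pooling of each convolution output, then concatenate.
        pooled = [conv(x).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)            # (batch, word_dim)

# A Hangul syllable block is a single code point, so a Korean word splits
# into syllables by simple character iteration.
word = "먹었다"
syllables = list(word)                             # ['먹', '었', '다']
syl2id = {s: i + 1 for i, s in enumerate(sorted(set(syllables)))}  # 0 = padding
ids = torch.tensor([[syl2id[s] for s in syllables]])

model = SyllableCNN(n_syllables=len(syl2id) + 1)
word_vec = model(ids)                              # (1, 300) composed word vector
```

In the paper's setting, word vectors composed this way would be trained against context words with a Skip-gram-style objective rather than looked up from a fixed vocabulary, which is why unseen (OOV) words can still receive representations from their syllables.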