Dependency structure language model for topic detection and tracking
Authors: Changki Lee, Gary Geunbae Lee, Myunggil Jang
Institution: 1. Knowledge Mining Laboratory, Speech/Language Technology Research Department, Electronics and Telecommunications Research Institute, 161 Gajeong-dong, Yuseong-gu, Daejeon 305-350, South Korea; 2. Department of Computer Science and Engineering, Pohang University of Science and Technology, San 31 Hyoja-dong, Nam-gu, Pohang 790-784, South Korea
Abstract: In this paper, we propose a new language model, a dependency structure language model, for topic detection and tracking (TDT) that compensates for the weaknesses of unigram and bigram language models. The dependency structure language model is based on Chow expansion theory and on the dependency parse tree produced by a linguistic parser, so long-distance dependencies between words are captured naturally. We carried out extensive experiments to verify the proposed model on the topic tracking and link detection tasks of TDT. In both cases, the dependency structure language models outperform strong baseline approaches.
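The Chow expansion factorizes a joint distribution over a tree: the unigram product is multiplied by a pairwise correction term for each tree edge, here the edges of a dependency parse. The sketch below is only an illustration of that factorization, not the paper's implementation; the function name, toy probability tables, and edge format are all assumptions.

```python
import math

def tree_log_prob(words, edges, unigram, pair):
    """Log-probability of a word sequence under a Chow-expansion
    (tree-factorized) model:

        P(d) = prod_i P(w_i) * prod_{(i,j) in tree} P(w_i, w_j) / (P(w_i) P(w_j))

    `edges` are (head, dependent) index pairs from a dependency parse tree;
    `unigram` and `pair` are pre-estimated probability tables (assumed inputs).
    """
    # Start from the independence (unigram) baseline.
    logp = sum(math.log(unigram[w]) for w in words)
    # Each dependency edge adds a mutual-information-style correction.
    for i, j in edges:
        wi, wj = words[i], words[j]
        logp += math.log(pair[(wi, wj)] / (unigram[wi] * unigram[wj]))
    return logp

# Toy example: "stock" and "market" co-occur more often than independence
# predicts, so linking them with a dependency edge raises the score.
unigram = {"stock": 0.1, "market": 0.1, "falls": 0.05}
pair = {("stock", "market"): 0.05}
doc = ["stock", "market", "falls"]
dependent = tree_log_prob(doc, [(0, 1)], unigram, pair)
independent = tree_log_prob(doc, [], unigram, pair)
```

With these toy numbers the edge contributes log(0.05 / 0.01) = log 5, so the tree-structured score exceeds the plain unigram score whenever the pair probability beats the independence product; in a TDT setting such scores would be compared between a story and a topic model.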
Keywords: Dependency structure language model; Term dependence; Dependency parse tree; Topic detection and tracking
This article is indexed in ScienceDirect and other databases.