Learning and optimization of an aspect hidden Markov model for query language model generation.
Bruza, Peter D.
HUANG, Q., SONG, D., RUGER, S. and BRUZA, P. D., 2007. Learning and optimization of an aspect hidden Markov model for query language model generation. In: S. DOMINICH and F. KISS, eds. Proceedings of the 1st International Conference on the Theory of Information Retrieval (ICTIR 2007). 18-20 October 2007. Budapest, Hungary: Infota. pp. 157-164.
The Relevance Model (RM) incorporates pseudo-relevance feedback to derive a query language model and has shown good performance. Generally, it is based on unigram models of individual feedback documents, from which query terms are sampled independently. In this paper, we present a new method that builds the query model with a latent state machine (LSM), which captures the inherent term dependencies within the query as well as the term dependencies between the query and the documents. First, our method splits the query into subsets of query terms (i.e., not only single terms, but different combinations of multiple query terms). Second, these query term combinations are treated as weighted latent states of a hidden Markov model, from which a new query model is derived over the pseudo-relevant documents. Third, our method integrates the Aspect Model (AM) with the EM algorithm to estimate the model parameters. Specifically, the pseudo-relevant documents are segmented into chunks, and different chunks are associated with different weights in relation to a latent state. Our approach is empirically evaluated on three TREC collections and demonstrates statistically significant improvements over a baseline language model and the Relevance Model.
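The core idea above — treating combinations of query terms as weighted latent states and mixing chunk-level unigram statistics by those weights — can be illustrated with a simplified sketch. This is not the authors' full aspect hidden Markov model: it replaces the HMM with a flat mixture over term-subset states and runs a toy EM loop, and all function names and the chunk representation (lists of tokens) are illustrative assumptions.

```python
from itertools import combinations
from collections import Counter

def subset_states(query_terms):
    # Every non-empty combination of query terms acts as a latent state,
    # mirroring the paper's use of multi-term query subsets (an assumption
    # about the state space, simplified from the HMM formulation).
    states = []
    for r in range(1, len(query_terms) + 1):
        states.extend(combinations(query_terms, r))
    return states

def query_model(query_terms, chunks, iters=10):
    """Estimate p(w | query) from pseudo-relevant document chunks.

    chunks: list of token lists, one per document chunk.
    Returns a dict mapping terms to probabilities.
    """
    states = subset_states(query_terms)
    pi = {s: 1.0 / len(states) for s in states}  # uniform initial state weights
    resp = {}
    for _ in range(iters):
        # E-step: responsibility of each state for each chunk, proportional
        # to the state weight times the fraction of state terms the chunk covers.
        for i, chunk in enumerate(chunks):
            scores = {s: pi[s] * sum(t in chunk for t in s) / len(s)
                      for s in states}
            z = sum(scores.values()) or 1.0
            resp[i] = {s: v / z for s, v in scores.items()}
        # M-step: re-estimate state weights from the responsibilities.
        for s in states:
            pi[s] = sum(resp[i][s] for i in range(len(chunks))) / len(chunks)
        z = sum(pi.values()) or 1.0
        pi = {s: v / z for s, v in pi.items()}
    # Mix chunk unigram counts, weighting each chunk by how strongly the
    # latent states explain it, to obtain the final query language model.
    counts = Counter()
    for i, chunk in enumerate(chunks):
        weight = sum(resp[i][s] * pi[s] for s in states)
        for w in chunk:
            counts[w] += weight
    total = sum(counts.values()) or 1.0
    return {w: c / total for w, c in counts.items()}
```

For example, with the query terms `["hidden", "markov"]` and three chunks, a chunk containing both query terms pulls co-occurring terms such as `model` into the query model, while a chunk sharing no query terms contributes nothing.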