Abstract
The hierarchical hidden Markov model (HHMM) extends the hidden Markov model with a hierarchy of hidden states. This form of hierarchical modeling has been found useful in applications such as handwritten character recognition, behavior recognition, video indexing, and text retrieval. However, the state hierarchy in the original HHMM is restricted to a tree structure: two different states cannot have the same child, so common substructures cannot be shared within the model. In this paper, we present a general HHMM in which the state hierarchy can be a lattice, allowing arbitrary sharing of substructures. Furthermore, we provide a method for numerical scaling to avoid underflow, an important issue in dealing with long observation sequences. We demonstrate our method in a simulated environment where a hierarchical behavioral model is automatically learned and later used for recognition.
History
Event
Nineteenth National Conference on Artificial Intelligence: Sixteenth Innovative Applications of Artificial Intelligence Conference (2004: San Jose, Calif.)
Pagination
324-329
Publisher
[American Association for Artificial Intelligence]
Location
San Jose, Calif.
Place of publication
[San Jose, Calif.]
Start date
2004-07-25
End date
2004-07-29
Language
eng
Notes
Proceedings - Nineteenth National Conference on Artificial Intelligence (AAAI-2004): Sixteenth Innovative Applications of Artificial Intelligence Conference (IAAI-2004); San Jose, Calif.
Publication classification
E1.1 Full written paper - refereed
Title of proceedings
Proceedings of the National Conference on Artificial Intelligence