Discovering semantics from multiple correlated time series stream
conference contribution
posted on 2013-12-01, 00:00 authored by Z Qiao, Guangyan Huang, J He, P Zhang, L Guo, J Cao, Y Zhang

In this paper, we study the challenging problem of mining data-generating rules and state-transforming rules (i.e., semantics) underlying multiple correlated time series streams. A novel Correlation field-based Semantics Learning Framework (CfSLF) is proposed to learn these semantics. In the framework, we use the Hidden Markov Random Field (HMRF) method to model the relationship between latent states and observations in multiple correlated time series in order to learn the data-generating rules. The transforming rules are learned from the corresponding latent state sequences of the multiple time series based on Markov chain properties. The reusable semantics learned by CfSLF can be fed into various analysis tools, such as prediction or anomaly detection. Moreover, we present two algorithms based on these semantics, which can be applied to next-n step prediction and anomaly detection. Experiments on real-world data sets demonstrate the efficiency and effectiveness of the proposed method. © Springer-Verlag 2013.
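The abstract describes learning state-transforming rules from latent state sequences via Markov chain properties and using them for next-n step prediction. As a rough illustration only (not the authors' CfSLF implementation, which infers the latent states with an HMRF), the sketch below assumes the latent state sequence has already been obtained, estimates a transition matrix by counting consecutive state pairs, and propagates a state distribution n steps forward:

```python
import numpy as np

def estimate_transition_matrix(states, n_states):
    """Estimate a Markov transition matrix from a latent state sequence
    by counting consecutive state pairs and row-normalising."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # leave rows of unseen states as zeros
    return counts / row_sums

def predict_next_n(transition, current_state, n):
    """Return the distribution over states n steps ahead of current_state."""
    dist = np.zeros(transition.shape[0])
    dist[current_state] = 1.0
    for _ in range(n):
        dist = dist @ transition  # one Markov step
    return dist

# Toy latent state sequence over 3 states (hypothetical data)
states = [0, 0, 1, 2, 0, 1, 2, 0, 1]
T = estimate_transition_matrix(states, n_states=3)
dist = predict_next_n(T, current_state=0, n=2)
```

An anomaly score in the same spirit could flag observed transitions whose estimated probability falls below a threshold, though the paper's actual scoring is defined within the CfSLF framework.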
History
Volume: 7819
Pagination: 509-520
Location: Gold Coast, Qld.
Publisher DOI:
Start date: 2013-04-14
End date: 2013-04-17
ISSN: 0302-9743
eISSN: 1611-3349
ISBN-13: 9783642374562
Language: eng
Publication classification: E Conference publication, E1.1 Full written paper - refereed
Copyright notice: 2013, Springer
Editor/Contributor(s): Pei J, Tseng VS, Cao L, Motoda H, Xu G
Title of proceedings: 17th Pacific-Asia Conference, PAKDD 2013, Gold Coast, Australia, April 14-17, 2013, Proceedings, Part II
Event: Pacific-Asia Conference on Knowledge Discovery and Data Mining (17th ; 2013 : Gold Coast, QLD)
Issue: Part 2
Publisher: Springer
Place of publication: Berlin, Germany
Series: Lecture Notes in Artificial Intelligence