Alleviating the independence assumptions of averaged one-dependence estimators by model weighting
Journal contribution, posted on 2021-01-01.
Authors: Li-Min Wang, Peng Chen, Musa Mammadov, Yang Liu, Si-Yuan Wu
Of the numerous proposals to refine naive Bayes by weakening its attribute independence assumption, averaged one-dependence estimators (AODE) has been shown to achieve significantly higher classification accuracy at a moderate cost in classification efficiency. However, all one-dependence estimators (ODEs) in AODE are assigned equal weights and treated uniformly. To address this issue, model weighting, which assigns discriminative weights to ODEs and then linearly combines their probability estimates, has proved to be an efficient and effective approach. Most information-theoretic weighting metrics, including mutual information, the Kullback-Leibler measure, and information gain, place more emphasis on the correlation between the root attribute (value) and the class variable. We argue that the topology of each ODE can be divided into a set of local directed acyclic graphs (DAGs) based on the independence assumption, and we introduce multivariate mutual information to measure the extent to which these DAGs fit the data. On this basis, we propose a novel weighted AODE algorithm, called AWODE, that adaptively selects weights to alleviate the independence assumption and make the learned probability distribution fit each instance. The proposed approach is validated on 40 benchmark datasets from the UCI machine learning repository. The experimental results reveal that AWODE achieves a bias-variance trade-off and is a competitive alternative to single-model Bayesian learners (such as TAN and KDB) and other weighted AODE variants (such as WAODE).
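To make the model-weighting scheme concrete: a weighted AODE scores class y for instance x = (x_1, ..., x_n) as P(y | x) proportional to sum_i w_i * P(y, x_i) * prod_{j != i} P(x_j | y, x_i), where each term is one ODE rooted at attribute X_i. The sketch below is a minimal illustration of this standard weighted-AODE combination, not the AWODE algorithm itself; for brevity it uses the WAODE-style weight w_i = I(X_i; Y) (mutual information between root attribute and class), which is exactly the kind of root-attribute-centric metric the abstract contrasts with AWODE's adaptive weight selection. All names and the shared-cardinality assumption are illustrative.

```python
import numpy as np

class WeightedAODE:
    """Minimal sketch of weighted AODE over integer-coded discrete attributes.

    ODE i scores class c as P(c, x_i) * prod_{j != i} P(x_j | c, x_i);
    the final estimate is the weighted sum of the n ODE scores.
    Weights here are I(X_i; Y), as in WAODE (an assumption for this sketch;
    AWODE selects weights adaptively, as described in the abstract).
    """

    def __init__(self, alpha=1.0):
        self.alpha = alpha  # Laplace smoothing parameter

    def fit(self, X, y):
        self.n, self.d = X.shape
        self.C = int(y.max()) + 1          # number of classes
        self.V = int(X.max()) + 1          # shared value cardinality, for brevity
        # joint[c, i, v]      = count of (y=c, X_i=v)
        # pair[c, i, v, j, u] = count of (y=c, X_i=v, X_j=u)
        self.joint = np.zeros((self.C, self.d, self.V))
        self.pair = np.zeros((self.C, self.d, self.V, self.d, self.V))
        for xs, c in zip(X, y):
            for i, v in enumerate(xs):
                self.joint[c, i, v] += 1
                for j, u in enumerate(xs):
                    self.pair[c, i, v, j, u] += 1
        self.w = np.array([self._mi(i) for i in range(self.d)])
        return self

    def _mi(self, i):
        # Mutual information I(X_i; Y) from empirical joint P(y, x_i)
        pxy = self.joint[:, i, :] / self.n
        px = pxy.sum(axis=0, keepdims=True)
        py = pxy.sum(axis=1, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px * py)[nz])).sum())

    def predict_proba(self, xs):
        scores = np.zeros(self.C)
        for c in range(self.C):
            for i, v in enumerate(xs):
                # ODE rooted at X_i: smoothed P(y=c, x_i=v) ...
                ode = (self.joint[c, i, v] + self.alpha) / (
                    self.n + self.alpha * self.C * self.V)
                for j, u in enumerate(xs):
                    if j == i:
                        continue
                    # ... times smoothed P(x_j=u | y=c, x_i=v)
                    ode *= (self.pair[c, i, v, j, u] + self.alpha) / (
                        self.joint[c, i, v] + self.alpha * self.V)
                scores[c] += self.w[i] * ode   # weighted linear combination
        return scores / scores.sum()
```

A usage example under the same assumptions: fit on an integer-coded design matrix and query one instance, e.g. `WeightedAODE().fit(X, y).predict_proba(X[0])`. Setting all weights to 1/n recovers plain AODE; AWODE differs precisely in how this weight vector is chosen.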