Discriminative structure learning of Bayesian network classifiers from training dataset and testing instance

Wang, Limin, Liu, Yang, Mammadov, Musa, Sun, Minghui and Qi, Sikai 2019, Discriminative structure learning of Bayesian network classifiers from training dataset and testing instance, Entropy, vol. 21, no. 5, doi: 10.3390/e21050489.


Title Discriminative structure learning of Bayesian network classifiers from training dataset and testing instance
Author(s) Wang, Limin
Liu, Yang
Mammadov, Musa
Sun, Minghui
Qi, Sikai
Journal name Entropy
Volume number 21
Issue number 5
Total pages 26
Publisher MDPI
Place of publication Basel, Switzerland
Publication date 2019-05
ISSN 1099-4300
Summary © 2019 by the authors. Over recent decades, the rapid growth of data has made ever more urgent the quest for highly scalable Bayesian networks with better classification performance and expressivity (that is, the capacity to describe dependence relationships between attributes in different situations). To reduce the search space of possible attribute orders, the k-dependence Bayesian classifier (KDB) simply applies mutual information to sort attributes. This sorting strategy is very efficient, but it neglects the conditional dependencies between attributes and is sub-optimal. In this paper, we propose a novel sorting strategy and extend KDB from a single restricted network to unrestricted ensemble networks, i.e., the unrestricted Bayesian classifier (UKDB), in terms of Markov blanket analysis and target learning. Target learning is a framework that takes each unlabeled testing instance P as a target and builds a specific Bayesian network classifier (BNC), BNC_P, to complement BNC_T learned from training data T. UKDB accordingly introduces UKDB_P and UKDB_T to flexibly describe the change in dependence relationships for different testing instances and the robust dependence relationships implicated in the training data. Both use UKDB as the base classifier and apply the same learning strategy while modeling different parts of the data space, so they are complementary in nature. Extensive experimental results on the Wisconsin breast cancer database, used as a case study, and on 10 other datasets, involving classifiers with different structure complexities such as Naive Bayes (0-dependence), Tree-Augmented Naive Bayes (1-dependence) and KDB (arbitrary k-dependence), demonstrate the effectiveness and robustness of the proposed approach.
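The KDB sorting step mentioned in the summary (ranking attributes by mutual information with the class before building the network) can be sketched as follows. This is an illustrative sketch only, not the paper's implementation; the dataset, function names, and column layout are assumptions.

```python
# Hypothetical sketch of KDB's attribute-sorting step: rank attributes
# by decreasing empirical mutual information I(X_i; C) with the class.
from collections import Counter
from math import log2

def mutual_information(xs, cs):
    """Empirical mutual information I(X; C) in bits."""
    n = len(xs)
    px = Counter(xs)            # marginal counts of X
    pc = Counter(cs)            # marginal counts of C
    pxc = Counter(zip(xs, cs))  # joint counts of (X, C)
    mi = 0.0
    for (x, c), nxc in pxc.items():
        p_joint = nxc / n
        mi += p_joint * log2(p_joint / ((px[x] / n) * (pc[c] / n)))
    return mi

def kdb_attribute_order(data, class_col):
    """Return attribute indices sorted by decreasing I(X_i; C)."""
    cs = [row[class_col] for row in data]
    attrs = [i for i in range(len(data[0])) if i != class_col]
    return sorted(attrs,
                  key=lambda i: mutual_information([r[i] for r in data], cs),
                  reverse=True)

# Toy dataset: columns = (attr0, attr1, class). attr0 tracks the class
# exactly, attr1 is mostly noise, so attr0 should be ranked first.
data = [(0, 0, 0), (0, 1, 0), (1, 0, 1),
        (1, 1, 1), (0, 1, 0), (1, 0, 1)]
print(kdb_attribute_order(data, class_col=2))  # → [0, 1]
```

As the summary notes, this ordering is efficient because each attribute is scored independently, but it ignores conditional dependencies between attributes, which is the sub-optimality the paper's novel sorting strategy addresses.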
Language eng
DOI 10.3390/e21050489
Field of Research 01 Mathematical Sciences
02 Physical Sciences
HERDC Research category C1 Refereed article in a scholarly journal
Copyright notice ©2019, The Authors
Persistent URL http://hdl.handle.net/10536/DRO/DU:30122710


Created: Thu, 13 Jun 2019, 13:40:06 EST
