An objective function based on Bayesian likelihoods of necessity and sufficiency for concept learning in the absence of labeled counter-examples
conference contribution
posted on 2004-01-01, 00:00, authored by Andrew Skabar
Supervised machine learning techniques generally require that the training set on which learning is based contain sufficient examples representative of the target concept, as well as known counter-examples of the concept; however, in many application domains it is not possible to supply a set of labeled counter-examples. This paper proposes an objective function based on Bayesian likelihoods of necessity and sufficiency. This function can be used to guide search towards the discovery of a concept description given only a set of labeled positive examples of the target concept, together with a corpus of unlabeled examples. Results of experiments performed on several datasets from the UCI repository show that the technique achieves accuracy comparable to conventional supervised learning techniques, despite the fact that the latter require a set of labeled counter-examples to be supplied. The technique can be applied in many domains in which the provision of labeled counter-examples is problematic.
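The paper itself gives the precise Bayesian formulation; as a rough illustration of the general idea, a necessity-and-sufficiency style score for positive-and-unlabeled learning can be sketched as below. The function name, the specific estimators (necessity as the fraction of labeled positives a hypothesis covers; a sufficiency proxy as the fraction of covered examples that are labeled positive), and their product as the combined score are assumptions for illustration, not the objective function defined in the paper.

```python
import numpy as np

def necessity_sufficiency_score(h_covers, is_labeled_pos):
    """Illustrative (not the paper's) objective for learning from
    positives plus unlabeled data.

    h_covers       : boolean array, True where candidate hypothesis h covers x
    is_labeled_pos : boolean array, True where x is a labeled positive example

    Necessity   ~ P(h covers x | x is a labeled positive):
                  rewards covering the known positives.
    Sufficiency ~ estimated P(x is positive | h covers x), proxied here by
                  the labeled-positive fraction among covered examples:
                  penalises hypotheses that sprawl over the unlabeled corpus.
    """
    covered = np.asarray(h_covers, dtype=bool)
    pos = np.asarray(is_labeled_pos, dtype=bool)

    # Necessity: share of labeled positives that h covers.
    necessity = covered[pos].mean() if pos.any() else 0.0

    # Sufficiency proxy: share of covered examples that are labeled positive.
    sufficiency = pos[covered].mean() if covered.any() else 0.0

    # Combine the two likelihood estimates into a single search objective.
    return necessity * sufficiency

# Toy usage: positives cluster at x > 0; h is the threshold rule x > 0.
x = np.array([1.0, 2.0, 3.0, -1.0, -2.0, 0.5])
labeled_pos = np.array([True, True, True, False, False, False])
score = necessity_sufficiency_score(x > 0, labeled_pos)
# necessity = 3/3, sufficiency proxy = 3/4, so score = 0.75
```

A search procedure (e.g. over decision-rule or network parameters) would then maximise this score in place of a conventional supervised loss, since no labeled counter-examples are available.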
History
Title of proceedings
IC-AI & MLMTA 2004 : Proceedings of the International Conference on Artificial Intelligence & Proceedings of the International Conference on Machine Learning : Models, Technologies & Applications
Event
International Conference on Artificial Intelligence and International Conference on Machine Learning: Models, Technologies and Applications (2004 : Las Vegas, Nev.)