File(s) under permanent embargo
BoostML: An adaptive metric learning for nearest neighbor classification
conference contribution
Posted on 2010-12-01, authored by Nayyar Zaidi, D. M. G. Squire, D. Suter

Abstract: A Nearest Neighbor (NN) classifier assumes class-conditional probabilities to be locally smooth. This assumption is often invalid in high dimensions, and significant bias can be introduced when using the nearest neighbor rule. This effect can be mitigated to some extent by using a locally adaptive metric. In this work we propose an adaptive metric learning algorithm that learns an optimal metric at the query point. We learn a distance metric using a feature relevance measure inspired by boosting. The modified metric results in a smooth neighborhood that leads to better classification results. We tested our technique on major UCI machine learning databases and compared the results to state-of-the-art techniques. Our method resulted in significant improvements in the performance of the K-NN classifier and also performed better than other techniques on major databases. © 2010 Springer-Verlag Berlin Heidelberg.
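The abstract's core idea of learning a distance metric at the query point can be sketched in code. The following is a minimal illustrative sketch, not the authors' BoostML algorithm: the per-feature relevance weights below come from a simple one-feature decision-stump accuracy heuristic (a hypothetical stand-in for the paper's boosting-inspired relevance index), and the weights then define a locally adaptive weighted Euclidean metric for k-NN. Labels are assumed to be integers 0..C-1.

```python
import numpy as np

def local_feature_weights(X, y, query, k0=10):
    """Estimate per-feature relevance near `query`.

    Stand-in heuristic (not the paper's index): for each feature, score
    how well the best one-feature threshold split (a decision stump, the
    weak learner typical of boosting) separates the classes among the
    k0 Euclidean-nearest training points. Scores are normalized to sum
    to 1 and used as metric weights.
    """
    d = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d)[:k0]
    Xn, yn = X[idx], y[idx]
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        yj = yn[np.argsort(Xn[:, j])]  # labels ordered by feature j
        best = 0.0
        for s in range(1, len(yj)):
            left, right = yj[:s], yj[s:]
            # stump accuracy: majority vote on each side of the split
            acc = (np.bincount(left).max() + np.bincount(right).max()) / len(yj)
            best = max(best, acc)
        scores[j] = best
    return scores / scores.sum()

def adaptive_knn_predict(X, y, query, k=3, k0=10):
    """k-NN under a query-local weighted Euclidean metric."""
    w = local_feature_weights(X, y, query, k0)
    dw = np.sqrt(((X - query) ** 2 * w).sum(axis=1))
    nn = np.argsort(dw)[:k]
    return int(np.bincount(y[nn]).argmax())

# Toy data: feature 0 is informative (class means 0 vs 3),
# feature 1 is high-variance noise for both classes.
rng = np.random.default_rng(0)
X = np.vstack([
    np.column_stack([rng.normal(0, 1, 50), rng.normal(0, 5, 50)]),
    np.column_stack([rng.normal(3, 1, 50), rng.normal(0, 5, 50)]),
])
y = np.array([0] * 50 + [1] * 50)
print(adaptive_knn_predict(X, y, np.array([2.8, 0.0])))
```

The intent of the sketch is that the noisy feature receives a low weight in the neighborhood of the query, so the adaptive metric stretches the neighborhood along the irrelevant dimension, which is the bias-reduction effect the abstract describes.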
History
Volume: 6118 (LNAI)
Issue: Part 1
Pagination: 142-149
Publisher DOI:
ISSN: 0302-9743
eISSN: 1611-3349
ISBN-13: 9783642136566
ISBN-10: 3642136567
Publication classification: E1.1 Full written paper - refereed
Title of proceedings: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Keywords
Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science, Information Systems; Computer Science, Theory & Methods; Computer Science; Adaptive Metric Learning; Nearest Neighbor; Bias-Variance Analysis; Curse-of-Dimensionality; Feature Relevance Index; Artificial Intelligence and Image Processing