

BoostML: An adaptive metric learning for nearest neighbor classification

conference contribution
posted on 2010-12-01, 00:00 authored by Nayyar Zaidi, D M G Squire, D Suter
A Nearest Neighbor (NN) classifier assumes that class-conditional probabilities are locally smooth. This assumption is often invalid in high dimensions, and significant bias can be introduced when the nearest neighbor rule is applied. The effect can be mitigated to some extent by using a locally adaptive metric. In this work we propose an adaptive metric learning algorithm that learns an optimal metric at the query point. We learn a distance metric using a feature relevance measure inspired by boosting. The modified metric yields a smoother neighborhood and hence better classification results. We tested our technique on major UCI machine learning databases and compared the results to state-of-the-art techniques. Our method resulted in significant improvements in the performance of the K-NN classifier and also performed better than other techniques on major databases. © 2010 Springer-Verlag Berlin Heidelberg.
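The abstract describes learning a locally adaptive, per-feature weighted metric at the query point and then classifying with K-NN under that metric. The sketch below is a minimal Python illustration of that general idea only; the local relevance estimate used here (ratio of between-class to total spread per feature) is a placeholder assumption, not the paper's boosting-inspired measure, and the function names and the `k0` neighborhood size are illustrative choices rather than part of the published BoostML algorithm.

```python
import numpy as np

def local_feature_weights(X_train, y_train, query, k0=50):
    """Estimate per-feature relevance weights in a neighborhood of `query`.

    Placeholder relevance measure (between-class / total spread per feature),
    NOT the boosting-based measure of the paper; it only illustrates the idea
    of a locally adaptive diagonal metric.
    """
    # Initial neighborhood around the query under the plain Euclidean metric.
    d0 = np.linalg.norm(X_train - query, axis=1)
    nbhd = np.argsort(d0)[:k0]
    Xn, yn = X_train[nbhd], y_train[nbhd]

    overall_mean = Xn.mean(axis=0)
    between = np.zeros(X_train.shape[1])
    for c in np.unique(yn):
        Xc = Xn[yn == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
    total = ((Xn - overall_mean) ** 2).sum(axis=0) + 1e-12

    rel = between / total                 # per-feature relevance in [0, 1]
    if rel.sum() == 0:                    # degenerate neighborhood: fall back
        return np.ones(X_train.shape[1])  # to the unweighted metric
    return rel * (X_train.shape[1] / rel.sum())  # rescale to average weight 1

def weighted_knn_predict(X_train, y_train, query, weights, k=5):
    """K-NN prediction under a diagonal (per-feature) weighted Euclidean metric."""
    dists = np.sqrt((((X_train - query) ** 2) * weights).sum(axis=1))
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Example usage (assumes X_train is an (n, d) float array, y_train an (n,) label array):
# w = local_feature_weights(X_train, y_train, x_query)
# y_hat = weighted_knn_predict(X_train, y_train, x_query, w, k=5)
```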

History

Volume

6118 LNAI

Issue

PART 1

Pagination

142-149

ISSN

0302-9743

eISSN

1611-3349

ISBN-13

9783642136566

ISBN-10

3642136567

Publication classification

E1.1 Full written paper - refereed

Title of proceedings

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)