Best-Case Results for Nearest-Neighbor Learning

Steven Salzberg, David Heath, Simon Kasif, Arthur L. Delcher

Research output: Contribution to journal › Article

Abstract

In this paper we propose a theoretical model for the analysis of classification methods, in which the teacher knows the classification algorithm and chooses examples in the best way possible. We apply this model to the nearest-neighbor (NN) learning algorithm, and develop upper and lower bounds on sample complexity for several different concept classes. For some concept classes, the sample complexity turns out to be exponential even under this best-case model, which implies that the concept class is inherently difficult for the NN algorithm. We identify several geometric properties that make learning certain concepts relatively easy. Finally, we discuss the relation of our work to helpful-teacher models, its application to decision tree learning algorithms, and some of its implications for current experimental work.
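As a rough, hypothetical illustration of the setting (the paper presents theory, not code), the Python sketch below implements the 1-NN classification rule and a tiny "best-case teaching set" for a one-dimensional threshold concept: two well-placed examples suffice for NN to reproduce the concept. The function names, the 0.5 threshold, and the example placements are assumptions for this sketch, not material from the paper.

```python
import math

def nearest_neighbor_predict(examples, query):
    """Classify `query` by the label of the closest stored example.

    `examples` is a list of (point, label) pairs, where each point is a
    tuple of coordinates. Hypothetical 1-NN sketch; not the paper's code.
    """
    _, label = min(examples, key=lambda ex: math.dist(ex[0], query))
    return label

# A "best-case teacher" for the threshold concept (x >= 0.5 is positive):
# placing one example at 0.25 (negative) and one at 0.75 (positive) makes
# every query closer to the example on its own side of the threshold,
# so two examples teach the concept exactly (up to the boundary point).
teaching_set = [((0.25,), 0), ((0.75,), 1)]
print(nearest_neighbor_predict(teaching_set, (0.9,)))   # -> 1
print(nearest_neighbor_predict(teaching_set, (0.1,)))   # -> 0
```

In the paper's best-case model, the quantity of interest is how many such teacher-chosen examples are needed across all concepts in a class; as the abstract notes, for some classes this count is exponential even with an optimal teacher.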

Original language: English (US)
Pages (from-to): 599-608
Number of pages: 10
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 17
Issue number: 6
State: Published - Jun 1995

Keywords

  • Machine learning
  • geometric concepts
  • nearest-neighbor

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
