Resource limitations have played an important role in the development and verification of theories about intelligent behaviour. This paper is a step towards answering the question of what effect limited memory has on the ability of intelligent machines to learn from data. Our analysis applies to many existing learning methods, especially those that incrementally construct a generalization by making repeated passes through a set of training data (e.g. some implementations of perceptrons, neural nets, and decision trees). Most of these methods do not store the entire training set: they allow themselves only limited storage, a restriction that forces them to produce a compressed representation. The question we address is how much (if any) additional processing time methods with limited storage require. We measure the processing time of a learning algorithm by the number of passes through the data set needed to obtain a correct generalization. Researchers have observed that for some learning methods (e.g. neural nets) the number of passes through a data set shrinks as the size of the network is increased; however, no analytical study explaining this behaviour has been published. We examine limited-storage algorithms for a particular concept class, nested hyperrectangles, and prove bounds that illustrate the fundamental trade-off between the storage and the processing time required to learn an optimal structure. Our lower bounds also apply to other algorithms and concept classes (e.g. decision trees). We discuss applications of our analysis to learning and to problems in human perception, and briefly discuss parallel learning algorithms.
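The storage/passes trade-off described above can be illustrated with a toy sketch. This is not the paper's algorithm; it is a hypothetical limited-memory learner for nested hyperrectangles, in which each labelled example is a point tagged with the deepest rectangle containing it, and a learner with room for only `storage` rectangles must revisit the data once per group of levels, so a nesting of depth `d` costs roughly `ceil(d / storage)` passes.

```python
def bounding_box(points):
    # Smallest axis-aligned rectangle containing the given points.
    d = len(points[0])
    lo = tuple(min(p[i] for p in points) for i in range(d))
    hi = tuple(max(p[i] for p in points) for i in range(d))
    return lo, hi

def learn_nested_rectangles(stream, depth, storage):
    """Hypothetical limited-memory learner. `stream` yields (point, level)
    pairs, where `level` is the deepest rectangle containing the point.
    With room for only `storage` rectangles in memory at once, the learner
    fits `storage` levels per pass, needing ceil(depth / storage) passes."""
    rects = {}
    passes = 0
    for first in range(0, depth, storage):
        passes += 1  # one full sweep over the data per group of levels
        for lvl in range(first, min(first + storage, depth)):
            # Points at level >= lvl lie inside rectangle lvl (nesting).
            pts = [p for p, l in stream if l >= lvl]
            rects[lvl] = bounding_box(pts)
    return rects, passes

# Toy data: nested intervals [0, 8] > [2, 6] > [3, 5] on the line.
stream = [((0.0,), 0), ((8.0,), 0), ((2.0,), 1), ((6.0,), 1),
          ((3.0,), 2), ((5.0,), 2)]
_, p = learn_nested_rectangles(stream, depth=3, storage=1)
print(p)  # 3 passes when only one rectangle fits in memory
_, p = learn_nested_rectangles(stream, depth=3, storage=3)
print(p)  # 1 pass when all three rectangles fit
```

Shrinking the available storage multiplies the number of passes, which is the qualitative behaviour the paper's bounds make precise.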
Original language: English (US)
Number of pages: 19
Journal: Journal of Experimental and Theoretical Artificial Intelligence
State: Published - April 1996
Keywords
- Incremental learning
ASJC Scopus subject areas
- Artificial Intelligence