Lazy learning

D. W. Aha, Lazy Learning, 1997, Springer
Lazy learning algorithms exhibit three characteristics that distinguish them from other learning algorithms (i.e., algorithms that lead to performance improvement over time). First, they defer processing of their inputs until they receive requests for information; they simply store their inputs for future use. Next, they reply to information requests by combining their stored (e.g., training) data. Finally, they discard the constructed answer and any intermediate results. In contrast, eager learning algorithms greedily compile their inputs into an intensional concept description (e.g., represented by a rule set, decision tree, or neural network), and in this process discard the inputs. They reply to information requests using this a priori induced description, and retain it for future requests.

This lazy/eager distinction exhibits many interesting tradeoffs. For example, while lazy algorithms have lower computational costs than eager algorithms during training, they typically have greater storage requirements and often have higher computational costs when answering requests. For the first time, this distinction, and its implications, are the focus of a (quintuple) special issue; AI Review has brought together 14 articles that review and/or investigate state-of-the-art learning algorithms that display lazy behaviors.
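To make the three characteristics concrete, the following is a minimal sketch (not taken from the article) of a lazy learner in the form of a simple k-nearest-neighbour classifier: training only stores the inputs, all processing is deferred to query time, and the constructed answer and its intermediate results are discarded after each request. The class and method names are illustrative.

```python
# Illustrative sketch of a lazy learner (k-nearest neighbours).
# Names and structure are assumptions for illustration, not from the article.
from collections import Counter
import math


class LazyKNN:
    def __init__(self, k=3):
        self.k = k
        self.examples = []  # stored (input, label) pairs

    def train(self, x, y):
        # Characteristic 1: defer processing of inputs -- simply store them.
        self.examples.append((x, y))

    def predict(self, query):
        # Characteristic 2: answer a request by combining the stored data.
        neighbours = sorted(
            self.examples,
            key=lambda ex: math.dist(ex[0], query),
        )[: self.k]
        answer = Counter(label for _, label in neighbours).most_common(1)[0][0]
        # Characteristic 3: the answer and intermediate results (the sorted
        # neighbour list) are not retained; they are discarded on return.
        return answer


if __name__ == "__main__":
    clf = LazyKNN(k=3)
    for x, y in [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]:
        clf.train(x, y)
    print(clf.predict((1, 0)))  # expected: "a"
    print(clf.predict((5, 6)))  # expected: "b"
```

The training/query cost asymmetry described above is visible here: `train` is constant time per example, while every `predict` call scans and sorts the stored data, and the whole training set must be kept in memory.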
There is a risk involved in propagating jargon in any scientific discipline. To be useful, such jargon must clarify an important concept that is not otherwise easily described, and perhaps is also not well recognized. "Lazy learning" fulfills this requirement. It focuses attention on the lazy/eager distinction, which is often neglected. This is important; lazy algorithms offer a powerful alternative perspective on how to solve learning tasks. For example, lazy methods often use local approaches (Bottou and Vapnik 1992), which yield highly adaptive behavior not usually found in eager algorithms, and lazy algorithms are the basis of many industrial applications (e.g., Jabbour et al. 1987;
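As a hedged illustration of the "local" flavour mentioned above (in the spirit of local learning, not a method prescribed by the article), each query can be answered from nearby stored points only, here via a distance-weighted average; the Gaussian kernel and bandwidth parameter `h` are assumed choices for the example.

```python
# Sketch of a local, lazy estimator: a distance-weighted (kernel) average
# computed per query from the stored data.  All names are illustrative.
import math


def locally_weighted_mean(data, query, h=1.0):
    """data: list of (x, y) pairs with scalar x and y; query: scalar x."""
    weights = [math.exp(-((x - query) ** 2) / (2 * h * h)) for x, _ in data]
    total = sum(weights)
    if total == 0.0:
        raise ValueError("no support near query; increase bandwidth h")
    return sum(w * y for w, (_, y) in zip(weights, data)) / total


if __name__ == "__main__":
    data = [(0.0, 0.0), (1.0, 1.1), (2.0, 3.9), (3.0, 9.2)]  # roughly y = x**2
    print(locally_weighted_mean(data, 1.5, h=0.5))
```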