Because computation is postponed until a new instance is observed, these algorithms are sometimes referred to as "lazy".[2] It is called instance-based because it constructs hypotheses directly from the training instances themselves.[3] This means that the hypothesis complexity can grow with the data:[3] in the worst case, a hypothesis is a list of n training items and the computational complexity of classifying a single new instance is O(n).
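The O(n) cost described above can be seen in a minimal 1-nearest-neighbour classifier, sketched here to illustrate the lazy pattern: "training" merely stores the instances, and each query performs a linear scan over all n stored items. The function names (`train`, `predict`) are illustrative, not from any particular library.

```python
def train(instances, labels):
    # "Training" is deferred: simply store the data as-is.
    return list(zip(instances, labels))

def predict(model, query):
    # Linear scan over all n stored instances: O(n) per query.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(model, key=lambda item: dist(item[0], query))
    return label

model = train([(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)], ["a", "a", "b"])
print(predict(model, (0.9, 1.2)))  # nearest stored point is (1.0, 1.0)
```

Note that all work happens at prediction time; there is no model fitting step, which is exactly what makes the method "lazy".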
One advantage that instance-based learning has over other methods of machine learning is its ability to adapt its model to previously unseen data: a new instance can simply be stored, or an old instance discarded, without retraining the model.
To combat the memory cost of storing all training instances, as well as the risk of overfitting to noise in the training set, instance reduction algorithms have been proposed.
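One simple instance-reduction idea can be sketched in the spirit of Hart's condensed nearest neighbour rule: start from a small subset and add only the instances that the current subset misclassifies, so the stored set shrinks while classification behaviour on the training data is largely preserved. This is an illustrative simplification under that assumption, not a reference implementation of any specific algorithm.

```python
def nearest_label(store, query):
    # 1-NN lookup over the retained subset.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(store, key=lambda item: dist(item[0], query))
    return label

def condense(instances, labels):
    # Seed with the first instance, then repeatedly absorb any
    # instance the current subset still misclassifies.
    store = [(instances[0], labels[0])]
    changed = True
    while changed:
        changed = False
        for x, y in zip(instances, labels):
            if nearest_label(store, x) != y:
                store.append((x, y))
                changed = True
    return store

data = [(0.0,), (0.1,), (0.2,), (5.0,), (5.1,)]
labels = ["a", "a", "a", "b", "b"]
reduced = condense(data, labels)
print(len(reduced))  # only a few representative instances are kept
```

Here the five training instances collapse to two stored representatives, one per cluster, which both lowers memory use and smooths over isolated noisy points.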