The training algorithm for an OvR learner constructed from a binary classification learner L proceeds as follows: for each class k, apply L to the training set with the samples of class k labeled as positive and all remaining samples labeled as negative, yielding a per-class classifier f_k. Making decisions means applying all classifiers to an unseen sample x and predicting the label k for which the corresponding classifier reports the highest confidence score. Although this strategy is popular, it is a heuristic that suffers from several problems.
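Concretely, a minimal OvR sketch might look like the following (using scikit-learn's LogisticRegression as a stand-in for the binary learner L; any binary classifier exposing a confidence score would do, so this is illustrative rather than the source's pseudocode):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression  # stand-in for the binary learner L

class OneVsRest:
    """One-vs.-rest: train one binary classifier per class."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.classifiers_ = []
        for k in self.classes_:
            # Relabel: samples of class k are positive, all others negative.
            z = (y == k).astype(int)
            self.classifiers_.append(LogisticRegression().fit(X, z))
        return self

    def predict(self, X):
        # Collect the confidence score of each per-class classifier f_k ...
        scores = np.column_stack(
            [clf.decision_function(X) for clf in self.classifiers_])
        # ... and predict the label whose classifier is most confident.
        return self.classes_[np.argmax(scores, axis=1)]
```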
Like OvR, one-vs.-one (OvO) suffers from ambiguities in that some regions of its input space may receive the same number of votes.
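The voting mechanism, and how ties can arise, can be seen in a minimal OvO sketch (again using LogisticRegression as an illustrative binary learner; not taken from the source):

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression  # illustrative binary learner

class OneVsOne:
    """One-vs.-one: train K*(K-1)/2 pairwise classifiers, then vote."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.pairs_ = list(combinations(self.classes_, 2))
        self.classifiers_ = []
        for a, b in self.pairs_:
            # Binary subproblem: only samples of class a or class b.
            mask = (y == a) | (y == b)
            clf = LogisticRegression().fit(X[mask], (y[mask] == b).astype(int))
            self.classifiers_.append(clf)
        return self

    def predict(self, X):
        index = {c: i for i, c in enumerate(self.classes_)}
        votes = np.zeros((len(X), len(self.classes_)), dtype=int)
        for (a, b), clf in zip(self.pairs_, self.classifiers_):
            pred = clf.predict(X)  # 1 means "b wins", 0 means "a wins"
            winners = np.where(pred == 1, index[b], index[a])
            votes[np.arange(len(X)), winners] += 1
        # np.argmax breaks ties by lowest index -- this arbitrariness is
        # exactly the ambiguity noted above for tied regions.
        return self.classes_[np.argmax(votes, axis=1)]
```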
This section discusses strategies for extending existing binary classifiers to solve multi-class classification problems.
Neural network-based classification has brought significant improvements and opened up new perspectives on the problem.
Naive Bayes is a successful classifier based upon the principle of maximum a posteriori (MAP).
This approach is naturally extensible to the case of more than two classes, and has been shown to perform well in spite of the underlying simplifying assumption of conditional independence.
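Written out, the MAP decision rule for a naive Bayes classifier over K classes is (a standard formulation, not quoted from the source):

\[
\hat{y} = \underset{k \in \{1,\dots,K\}}{\arg\max}\; P(C_k) \prod_{i=1}^{n} P(x_i \mid C_k)
\]

The factorization of the likelihood into per-feature terms P(x_i | C_k) is the conditional-independence assumption, and nothing in the rule restricts K to two, which is why the extension beyond binary classification is immediate.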
Decision tree learning infers splits of the training data based on the values of the available features, aiming to produce rules that generalize well.
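As an example of how such a split can be chosen (one common criterion among several; this sketch assumes information gain over a single numeric threshold):

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, y, threshold):
    """Entropy reduction achieved by splitting on feature <= threshold."""
    left = feature <= threshold
    right = ~left
    if left.sum() == 0 or right.sum() == 0:
        return 0.0
    weighted = left.mean() * entropy(y[left]) + right.mean() * entropy(y[right])
    return entropy(y) - weighted

# Pick the threshold on one feature that best separates the classes.
feature = np.array([1.0, 2.0, 3.0, 8.0, 9.0, 10.0])
labels = np.array([0, 0, 0, 1, 1, 2])
best = max(feature[:-1], key=lambda t: information_gain(feature, labels, t))
print(best, information_gain(feature, labels, best))
```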
In these extensions of the support vector machine, additional parameters and constraints are added to the optimization problem to handle the separation of the different classes.
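For example, the Crammer–Singer formulation (one standard multi-class SVM, stated here as background rather than as the source's own choice) jointly optimizes one weight vector w_k per class:

\[
\min_{w_1,\dots,w_K,\;\xi}\ \frac{1}{2}\sum_{k=1}^{K}\lVert w_k \rVert^2 + C\sum_{i=1}^{m}\xi_i
\]
subject to
\[
w_{y_i}\cdot x_i - w_k\cdot x_i \ge 1 - \xi_i \quad \text{for all } k \ne y_i, \qquad \xi_i \ge 0.
\]

Here the per-class weight vectors w_k are the additional parameters, and the pairwise margin inequalities are the additional constraints.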
Multi expression programming (MEP) encodes multiple programs in a single chromosome; each of these programs can be used to generate the output for a class, thus making MEP naturally suitable for solving multi-class classification problems.
Online learning algorithms, on the other hand, incrementally build their models in sequential iterations: in iteration t, the learner receives a sample x_t and predicts its label; it is then told the true label y_t and updates its model accordingly.
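A classic instance is the multiclass perceptron (shown here as an illustrative sketch, not as the source's specific algorithm), which keeps one weight vector per class and updates only on mistakes:

```python
import numpy as np

class MulticlassPerceptron:
    """Online multiclass perceptron: one weight vector per class."""

    def __init__(self, n_classes, n_features):
        self.W = np.zeros((n_classes, n_features))

    def predict(self, x):
        # Predict the class whose weight vector scores highest.
        return int(np.argmax(self.W @ x))

    def update(self, x, y):
        # One sequential iteration: predict, then learn from the true label.
        y_hat = self.predict(x)
        if y_hat != y:
            self.W[y] += x       # reinforce the true class
            self.W[y_hat] -= x   # penalize the mistaken prediction

# Samples arrive one at a time; the model is refined after each.
model = MulticlassPerceptron(n_classes=3, n_features=2)
stream = [(np.array([1.0, 0.0]), 0), (np.array([0.0, 1.0]), 1),
          (np.array([-1.0, -1.0]), 2)]
for x_t, y_t in stream:
    model.update(x_t, y_t)
```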