Error-correcting output coding (ECOC) is a method for constructing a multi-valued classifier from a combination of given binary classifiers. Drawing on the framework of coding theory, ECOC can recover the correct category from the remaining binary classifiers even when some binary classifiers produce incorrect outputs. The code word table, which specifies how the binary classifiers are combined, is central to ECOC. ECOC is known to perform well experimentally on real data; however, the complexity of the classification problem makes it difficult to analyze the classification performance in detail, and for this reason a theoretical analysis of ECOC has been lacking. In this study, a binary classifier is said to be noisy if it outputs the estimated posterior probability with errors, and noiseless if it outputs the true posterior probability. Toward a theoretical analysis of ECOC, we discuss the optimality of the code word table with noiseless binary classifiers and the error rate of the table with noisy binary classifiers. The evaluation shows that the Hamming distance of the code word table is an important indicator.
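The decoding step described above can be sketched as follows. This is a minimal illustration, not the paper's construction: the 4-class, 7-bit code word table below is a hypothetical example chosen so that its minimum Hamming distance is 4, which lets the decoder correct any single erroneous (noisy) binary classifier output.

```python
# Hypothetical 4-class code word table (one row per class, one column per
# binary classifier). Illustrative only, not taken from the paper; its
# minimum pairwise Hamming distance is 4, so any single bit error is corrected.
CODEWORDS = [
    (0, 0, 0, 1, 1, 1, 1),  # class 0
    (0, 1, 1, 0, 0, 1, 1),  # class 1
    (1, 0, 1, 0, 1, 0, 1),  # class 2
    (1, 1, 0, 1, 0, 0, 1),  # class 3
]

def hamming(a, b):
    """Number of positions where two bit sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def ecoc_decode(bits):
    """Return the class whose code word is nearest to `bits` in Hamming distance."""
    return min(range(len(CODEWORDS)), key=lambda k: hamming(CODEWORDS[k], bits))

# Class 1's code word with one noisy classifier output flipped (position 4):
print(ecoc_decode((0, 1, 1, 0, 1, 1, 1)))  # → 1
```

The corrupted input differs from class 1's code word in one position but from every other code word in at least three, so nearest-code-word decoding still recovers class 1; this is the error-correction mechanism whose limits the abstract relates to the table's Hamming distance.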
DOI: http://dx.doi.org/10.1142/S0129065723500041