New initial basic probability assignments for multiple classifiers
2002
10 pages, 9 references
Conference paper
English

In the field of pattern recognition, more specifically in the area of supervised, feature-vector-based classification, various classification methods exist, but none of them always returns correct results for every kind of data. Each classifier behaves differently, having its own strengths and weaknesses; some are more efficient than others in particular situations. The performance of these individual classifiers can be improved by combining them into one multiple classifier. In order to make more realistic decisions, the multiple classifier can analyze internal values generated by each classifier and can also rely on statistics learned from previous tests, such as reliability rates and confusion matrices. The individual classifiers studied in this project are Bayes, k-nearest-neighbor, and neural network classifiers. They are combined using Dempster-Shafer theory (Dempster 1967; Shafer 1976). The problem reduces to finding the weights that best represent each individual classifier's evidence. A dedicated approach was developed for each classifier, and in every case relying on the classifier's internal information proved better than relying on statistics. When tested on a database of 8 different kinds of military ships, each represented by 11 features extracted from FLIR images, the resulting multiple classifier gave better results than the other classifiers reported in the literature and tested in this work.
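As a minimal illustration of the combination step described in the abstract, the following Python sketch implements Dempster's rule of combination for two basic probability assignments (BPAs). The ship-class names and mass values are hypothetical examples, not taken from the paper; in the paper's setting, each classifier's internal values or learned statistics would first be mapped to such a BPA.

    from itertools import product

    def combine(m1, m2):
        # Dempster's rule of combination: each BPA is a dict mapping
        # frozenset(hypotheses) -> mass, with masses summing to 1.
        combined = {}
        conflict = 0.0
        for (b, mass_b), (c, mass_c) in product(m1.items(), m2.items()):
            intersection = b & c
            if intersection:
                combined[intersection] = combined.get(intersection, 0.0) + mass_b * mass_c
            else:
                conflict += mass_b * mass_c  # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: the BPAs cannot be combined")
        # Normalize by 1 - K, where K is the total conflicting mass.
        return {a: m / (1.0 - conflict) for a, m in combined.items()}

    # Hypothetical example: evidence from two classifiers over three ship classes.
    theta = frozenset({"frigate", "destroyer", "cruiser"})
    m_bayes = {frozenset({"frigate"}): 0.6, theta: 0.4}
    m_knn = {frozenset({"frigate", "destroyer"}): 0.7, theta: 0.3}
    print(combine(m_bayes, m_knn))
    # -> mass 0.60 on {frigate}, 0.28 on {frigate, destroyer}, 0.12 on theta

Note how evidence that agrees (both classifiers support "frigate") is reinforced, while mass on broader sets expresses each classifier's remaining uncertainty.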
Optimization of multiple objective gate assignments (Online Contents, 2001)
Learning Classifiers from Imbalanced Data Based on Biased Minimax Probability Machine (British Library Conference Proceedings, 2004)
UAV cooperative multiple task assignments using genetic algorithms (Tema Archiv, 2005)
Combining Multiple Classifiers Based on Third-Order Dependency (British Library Conference Proceedings, 2003)