Combining Diverse Classifiers by Learning Weights Robust to the Presence of Class Label Noise
In this paper, we introduce a classifier ensemble approach that combines heterogeneous classifiers in the presence of class label noise in the datasets. To enhance the performance of the proposed ensemble, we apply a preprocessing step that filters out class label noise from the dataset. The filtered data are then used to train the individual classifier models. Next, a weight learning method is introduced that assigns a weight to each individual classifier to construct the ensemble. We apply a genetic algorithm to search for the weight vector on which the ensemble is expected to achieve the best accuracy. The proposed approach is evaluated on a variety of real-life datasets. It is also compared with standard ensemble techniques such as AdaBoost, Bagging, and the Random Subspace Method (RSM), demonstrating both the superiority of the proposed ensemble over its competitors and the sensitivity of those competitors to class label noise. We also analyze the performance of the proposed approach on datasets with imbalanced and overlapping classes.
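The weight-search step described above can be sketched as a small genetic algorithm. The sketch below is illustrative only and is not the paper's actual implementation: the toy validation labels, the three hypothetical base classifiers, and all GA hyperparameters (population size, truncation selection, one-point crossover, random-reset mutation) are assumptions made for the example. Fitness is the weighted-majority-vote accuracy of the ensemble on a held-out validation set.

```python
import random

# Toy held-out validation set (illustrative assumption, not from the paper):
# true labels and the predictions of three hypothetical base classifiers.
y_true = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
preds = [
    [0, 1, 0, 0, 1, 0, 1, 1, 1, 0],  # base classifier 1
    [0, 0, 1, 0, 1, 1, 0, 1, 0, 0],  # base classifier 2
    [1, 1, 1, 0, 0, 0, 0, 1, 1, 1],  # base classifier 3
]

def ensemble_accuracy(weights):
    """Fitness: accuracy of the weighted majority vote under `weights`."""
    correct = 0
    for i, y in enumerate(y_true):
        score = {}
        for w, p in zip(weights, preds):
            score[p[i]] = score.get(p[i], 0.0) + w
        # Ties are broken by first-inserted label; fine for a sketch.
        if max(score, key=score.get) == y:
            correct += 1
    return correct / len(y_true)

def genetic_weight_search(n_classifiers, pop_size=30, generations=50,
                          mutation_rate=0.2, seed=0):
    """Search for a weight vector maximizing validation accuracy."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_classifiers)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=ensemble_accuracy, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_classifiers)  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:       # random-reset mutation
                child[rng.randrange(n_classifiers)] = rng.random()
            children.append(child)
        pop = survivors + children
    return max(pop, key=ensemble_accuracy)

best = genetic_weight_search(n_classifiers=3)
print("best weights:", best)
print("validation accuracy:", ensemble_accuracy(best))
```

In this toy setup every validation sample is classified correctly by at least two of the three base classifiers, so any weight vector in which no single weight exceeds the sum of the other two reproduces plain majority voting; the GA only needs to reach that region. On real data, the fitness would be computed on a noise-filtered validation split, as the preprocessing step in the paper suggests.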