Multistage neural network ensemble: adaptive combination of ensemble results

Yang, Shuang (2003) Multistage neural network ensemble: adaptive combination of ensemble results. Doctoral thesis, London Metropolitan University.

425920.pdf - Published Version

Download (40MB)

Abstract / Description

In the past decade, a growing body of research has shown that ensembles of neural networks (sometimes referred to as committee machines or classifier ensembles) can outperform single neural network models in terms of the generalization performance achieved on the same datasets. Combining a set of neural network classifiers whose error distributions are diverse can yield results superior to those achieved by any single classifier. Common strategies for combining the results of individual ensemble members are simple averaging, weighted averaging, majority voting and ranking. These are categorized as static combination schemes, which require no prior training. One deficiency of such schemes is that weightings for the importance of each ensemble member's output must be chosen in advance and then applied to produce the combined result. It is therefore attractive to make the combination process adaptive, so that no a priori (and possibly incorrect) combination weightings need to be chosen. To this end, a model is proposed in which the procedure of combining ensemble classifiers is turned into the training of another neural network. In this thesis, we propose a novel trainable neural network ensemble combination scheme: the multistage neural network ensemble (MNN). Two stages of neural network models are constructed. In the first stage, neural networks are used to generate the ensemble candidates. The second-stage neural network model approximates a combination function based on the results generated by the first-stage ensemble members. A sample of data sets from the UCI Machine Learning Repository and human gene splice data sets were modelled using MNNs, and significant improvements were obtained by the MNN in comparison with the performance of a majority voting scheme. The results suggest that the MNN approach can be used as an alternative ensemble combination method.
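The two-stage idea described in the abstract can be sketched in code. The following is an illustrative stacking-style sketch, not code from the thesis: the toy data set, network sizes, learning rate, and training loop are all assumptions chosen to make the example self-contained. First-stage networks are trained on the raw features; a second-stage network is then trained on the first stage's outputs to approximate the combination function, alongside a majority-voting baseline. (For simplicity the combiner here is trained on the same data as the members; the thesis's experimental protocol may differ.)

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # clip to avoid overflow in exp for saturated units
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60, 60)))

class TinyMLP:
    """One-hidden-layer network trained with full-batch gradient descent
    on binary cross-entropy loss."""
    def __init__(self, n_in, n_hidden, seed, lr=0.5, epochs=2000):
        r = np.random.default_rng(seed)
        self.W1 = r.normal(0.0, 1.0, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = r.normal(0.0, 1.0, (n_hidden, 1))
        self.b2 = 0.0
        self.lr, self.epochs = lr, epochs

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2).ravel()

    def fit(self, X, y):
        for _ in range(self.epochs):
            p = self.forward(X)
            d_out = (p - y)[:, None] / len(y)      # dL/dz at the output
            d_h = (d_out @ self.W2.T) * self.h * (1 - self.h)
            self.W2 -= self.lr * (self.h.T @ d_out)
            self.b2 -= self.lr * d_out.sum()
            self.W1 -= self.lr * (X.T @ d_h)
            self.b1 -= self.lr * d_h.sum(axis=0)
        return self

# Toy two-class data: two Gaussian blobs in 2D
n = 300
X = np.vstack([rng.normal([-1.5, -1.5], 1.0, (n // 2, 2)),
               rng.normal([1.5, 1.5], 1.0, (n // 2, 2))])
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]
perm = rng.permutation(n)
X, y = X[perm], y[perm]
X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]

# Stage one: ensemble members differing only in their random initialization
members = [TinyMLP(2, 8, seed=s).fit(X_tr, y_tr) for s in (1, 2, 3)]

def member_outputs(X):
    return np.column_stack([m.forward(X) for m in members])

# Static baseline: majority voting over thresholded member outputs
votes = (member_outputs(X_te) > 0.5).sum(axis=1)
acc_vote = np.mean((votes >= 2) == (y_te == 1))

# Stage two: a network trained to combine the members' outputs
combiner = TinyMLP(3, 4, seed=7).fit(member_outputs(X_tr), y_tr)
acc_mnn = np.mean((combiner.forward(member_outputs(X_te)) > 0.5) == (y_te == 1))
print(f"majority voting: {acc_vote:.2f}  trained combiner: {acc_mnn:.2f}")
```

On this easy toy problem both schemes do well; the point of the trained combiner is that its weightings are learned from the members' actual error patterns rather than fixed in advance.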

Item Type: Thesis (Doctoral)
Additional Information: uk.bl.ethos.425920
Uncontrolled Keywords: neural networks (computer science); committee machines; classifier ensembles; neural network models; neural network classifiers
Subjects: 000 Computer science, information & general works
Department: School of Computing and Digital Media
Depositing User: Mary Burslem
Date Deposited: 22 Mar 2022 15:02
Last Modified: 22 Mar 2022 15:02
URI: http://repository.londonmet.ac.uk/id/eprint/7263
