Towards Optimally Diverse Randomized Ensembles of Neural Networks

About The Book

Ensemble learning has become exceptionally popular over the last two decades because a group of base classifiers trained on the same problem often achieves higher accuracy than a single model. The key idea behind an ensemble that outperforms a single model is to combine a set of diverse classifiers. This work focuses on neural networks as base classifiers and explores which network parameters, when randomized, yield diverse ensembles with better generalisation ability than a single model. To stimulate disagreement among the members of an ensemble of neural networks, we apply a sampling strategy similar to the one used by Random Forests, together with variation of the network parameters. Experimental results demonstrate that randomly varying different network parameters can induce diversity in an ensemble of neural networks, but this does not necessarily lead to an accuracy improvement. This work will be useful for readers interested in ensemble methods and in artificial neural networks as base classifiers.
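The approach described above can be illustrated with a minimal sketch (not the book's exact method): a bagged ensemble of small neural networks whose hidden-layer width is randomized to induce diversity, combined by majority vote. The dataset, ensemble size, and hyperparameters here are assumptions chosen for brevity.

```python
# Minimal sketch: Random-Forest-style bootstrap sampling plus randomized
# network parameters to build a diverse neural-network ensemble.
# All concrete values (10 members, widths 8-64, etc.) are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = []
for i in range(10):
    # Bootstrap sample of the training set, as in Random Forests' bagging step.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    # Randomize a network parameter (here, hidden-layer width) to add diversity.
    width = int(rng.integers(8, 64))
    net = MLPClassifier(hidden_layer_sizes=(width,), max_iter=300,
                        random_state=i).fit(X_tr[idx], y_tr[idx])
    ensemble.append(net)

# Combine the members by majority vote over their predictions.
votes = np.stack([net.predict(X_te) for net in ensemble])
pred = (votes.mean(axis=0) >= 0.5).astype(int)
acc = (pred == y_te).mean()
print(f"ensemble accuracy: {acc:.2f}")
```

As the book's experiments suggest, such randomization reliably produces disagreement among members, but the voted accuracy should be compared against a single well-tuned network before concluding the ensemble helps.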
