CNN ACTIVATION




About The Book

Deep neural networks are characterized by their weights, biases, and activation functions. An activation function decides whether a neuron should be activated by computing a weighted sum of the neuron's inputs plus a bias. In this book I present an experimental review of eight different activation functions for the convolutional layers of neural networks.

For my experiments I selected eight activation functions and three different datasets. The activation functions are Sigmoid, Softmax, tanh, Softplus, Softsign, ReLU, ELU, and SELU. I also experimented with networks that use no activation function in the convolutional layers. After analyzing the results, I found that models with three different activation functions achieved the highest performance on the three datasets. Interestingly, the best average performance across the three datasets was achieved by using the Softmax activation function for the convolutional layers. ReLU and ELU are currently the most widely used activation functions, yet tanh, Softplus, and Softsign achieved better average performance on the three datasets. In this book I focus only on the activation function of the convolutional layers in order to test the performance of convolutional neural networks.
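As an illustration of the kind of experiment the book describes, here is a minimal sketch in tf.keras that builds the same small CNN once per candidate activation for the convolutional layers. The framework, network depth, input shape, and training setup are assumptions for illustration, not the book's actual code.

```python
import tensorflow as tf

# The eight activations compared in the book, plus None (no activation
# in the convolutional layers, i.e. a linear response).
CONV_ACTIVATIONS = ["sigmoid", "softmax", "tanh", "softplus",
                    "softsign", "relu", "elu", "selu", None]

def build_cnn(conv_activation, input_shape=(28, 28, 1), num_classes=10):
    """Small CNN whose convolutional layers use the given activation.

    Only the convolutional layers' activation varies; the output layer
    stays Softmax for classification, as is standard.
    """
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation=conv_activation),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation=conv_activation),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

for act in CONV_ACTIVATIONS:
    model = build_cnn(act)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Train and evaluate on each dataset to compare activations, e.g.:
    # model.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=5)
    print(f"conv activation={act}: {model.count_params()} parameters")
```

Holding the rest of the architecture fixed and varying only the convolutional activation isolates its effect on performance, which is the comparison the book carries out across three datasets.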

Details