
Convolutional Neural Networks (CNN) — Architecture Explained

Introduction

Kernel or Filter or Feature Detectors

Formula = (i − k) + 1

i → input size, k → kernel size
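For example, a 6×6 input convolved with a 3×3 kernel produces a feature map of size (6 − 3) + 1 = 4, i.e. 4×4.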

Stride

Formula = ⌊(i − k) / s⌋ + 1

i → input size, k → kernel size, s → stride
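For example, a 7×7 input with a 3×3 kernel and stride 2 gives ⌊(7 − 3) / 2⌋ + 1 = 3, i.e. a 3×3 feature map.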

Padding

Formula = ⌊(i − k + 2p) / s⌋ + 1

i → input size, k → kernel size, s → stride, p → padding
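All three formulas above are special cases of this last expression. A small Python helper (an illustrative sketch, not from the original article) makes them easy to check:

```python
def conv_output_size(i, k, s=1, p=0):
    """Spatial output size of a convolution: floor((i - k + 2p) / s) + 1."""
    return (i - k + 2 * p) // s + 1

print(conv_output_size(6, 3))            # 4 -> kernel only
print(conv_output_size(7, 3, s=2))       # 3 -> with stride 2
print(conv_output_size(6, 3, s=1, p=1))  # 6 -> "same" padding for a 3x3 kernel
```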

Pooling

Flatten
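Flattening reshapes the stack of pooled feature maps into a single one-dimensional vector so it can be fed into the fully-connected layers. A minimal PyTorch sketch (the map sizes are illustrative assumptions):

```python
import torch
import torch.nn as nn

pooled = torch.randn(1, 16, 5, 5)  # 16 feature maps of size 5x5
flat = nn.Flatten()(pooled)
print(flat.shape)                  # torch.Size([1, 400]): 16 * 5 * 5 = 400
```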

Layers used to build a CNN

  • Convolutional layer
  • Pooling layer
  • Fully-connected (FC) layer
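To see how these three layer types fit together, here is a minimal sketch in PyTorch (the 32×32 RGB input and 10 output classes are illustrative assumptions, not from the article):

```python
import torch
import torch.nn as nn

# Convolution -> pooling -> flatten -> fully connected, in order.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # convolutional layer
    nn.ReLU(),
    nn.MaxPool2d(2),                             # pooling layer halves each spatial dim
    nn.Flatten(),                                # 16 x 16 x 16 features -> one vector
    nn.Linear(16 * 16 * 16, 10),                 # fully-connected (FC) layer
)
print(model(torch.randn(1, 3, 32, 32)).shape)    # torch.Size([1, 10])
```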

Convolutional layer

This is the first layer of the network and is used to extract features from the input image. A filter (also called a kernel) slides over the image and computes a dot product with each local patch of pixels, producing a feature map.
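For instance (a minimal PyTorch sketch; the channel counts and image size are arbitrary choices, not from the article):

```python
import torch
import torch.nn as nn

# 16 filters of size 3x3 scan a 3-channel (RGB) image and produce 16 feature maps.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3)

image = torch.randn(1, 3, 32, 32)  # batch of one dummy 32x32 RGB image
feature_maps = conv(image)
print(feature_maps.shape)          # torch.Size([1, 16, 30, 30]): (32 - 3) + 1 = 30
```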

Pooling layer

The primary aim of this layer is to shrink the convolved feature map and so reduce computational cost. It operates on each feature map independently, summarizing each local region with a single value, which also reduces the number of connections to subsequent layers. Depending on the method used, there are several types of pooling operations; the most common are max pooling (which keeps the largest value in each window) and average pooling (which keeps the mean).
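For example (a minimal PyTorch sketch; the 4×4 input values are made up for illustration):

```python
import torch
import torch.nn as nn

x = torch.tensor([[[[1., 3., 2., 4.],
                    [5., 6., 1., 2.],
                    [7., 2., 8., 3.],
                    [4., 9., 5., 1.]]]])  # shape (1, 1, 4, 4)

# Each 2x2 window is reduced to a single value, halving both spatial dimensions.
print(nn.MaxPool2d(2)(x))  # maxima of each window: [[6, 4], [9, 8]]
print(nn.AvgPool2d(2)(x))  # means of each window:  [[3.75, 2.25], [5.5, 4.25]]
```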

Fully-connected layer

The fully connected (FC) layer consists of neurons together with their weights and biases, and connects every neuron in one layer to every neuron in the next. These layers usually form the last few layers of a CNN, placed just before the output layer.
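A minimal sketch (the layer sizes below are illustrative assumptions):

```python
import torch
import torch.nn as nn

# Each Linear layer holds a weight matrix and a bias vector connecting
# every input neuron to every output neuron.
fc = nn.Sequential(
    nn.Linear(400, 120),  # 400 flattened features -> 120 neurons
    nn.ReLU(),
    nn.Linear(120, 10),   # final FC layer producing 10 class scores
)
print(fc(torch.randn(1, 400)).shape)  # torch.Size([1, 10])
```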

Dropout
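Dropout randomly disables a fraction of the neurons during training, which helps prevent overfitting; it is switched off at inference time. A minimal PyTorch sketch (the rate p=0.5 is an illustrative choice):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # each activation is zeroed with probability 0.5
x = torch.ones(1, 8)

drop.train()
print(drop(x))  # about half the values are 0; survivors are scaled by 1 / (1 - p)

drop.eval()
print(drop(x))  # identity at inference time: dropout is only active during training
```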

Activation Functions

Sigmoid — used for binary classification in a CNN; it squashes its input into the range (0, 1), so the output can be read as a probability.

Tanh — the tanh function is very similar to the sigmoid function; the main difference is that it is symmetric around the origin (zero-centered). Its range of values is −1 to 1.

Softmax — used in multinomial logistic regression, and often as the last activation function of a neural network, normalizing the output into a probability distribution over the predicted classes.

ReLU — the main advantage of the ReLU function over other activation functions is that it does not activate all the neurons at the same time: inputs below zero are mapped to zero, so only a subset of neurons fire for any given input.
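The four functions above can be compared side by side (a minimal PyTorch sketch with made-up input values):

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

print(torch.sigmoid(x))         # squashes each value into (0, 1)
print(torch.tanh(x))            # zero-centered, squashes into (-1, 1)
print(torch.relu(x))            # negatives become 0, positives pass through
print(torch.softmax(x, dim=0))  # normalizes the vector into probabilities summing to 1
```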