Backpropagation Neural Networks: Classification and Regression from Scratch in Python
Get the code on GitHub:
Follow me on GitHub: https://github.com/durveshshah
Designing a Backpropagation Neural Network
Backpropagation is simply the process of updating the weights. In plain terms, when we backpropagate we take the derivative of our activation function and use it to adjust the weights. This will become clearer when we walk through an example.
When designing any neural network, the number of hidden layers and neurons is not fixed at all. To design a good model, we need many rounds of trial and error to keep our model away from overfitting and underfitting. A few ideas are fundamental for building our backpropagation model:
- Input (Features)
- Activation Function
- Number of Hidden Layers and neurons
- Output
- Derivation of the activation function
Activation Function
The sigmoid is usually used as the activation function in classification scenarios. However, the choice also depends on the number of hidden layers and neurons. If our model fits well with two to three hidden layers, then the sigmoid is a good choice because its output ranges between 0 and 1. If our model has more hidden layers and neurons, I suggest other activation functions such as the hyperbolic tangent (tanh), which ranges from -1 to 1.
We need to choose the activation function carefully, because the wrong one can hurt our model's accuracy and therefore its predictions.
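As a quick sketch, the sigmoid and tanh activations and their derivatives might look like this in NumPy. The derivatives are written in terms of the activation's own output, which is the form backpropagation uses; the function names here are my own:

```python
import numpy as np

def sigmoid(x):
    # Squashes any input into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(a):
    # Derivative expressed via the sigmoid output a = sigmoid(x)
    return a * (1.0 - a)

def tanh_derivative(a):
    # Derivative expressed via the tanh output a = np.tanh(x)
    return 1.0 - a ** 2
```

Expressing the derivatives in terms of the activation's output means the forward-pass values can be reused during backpropagation instead of recomputing anything.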
Backpropagation of neural network. Source: [1]
Working of Backpropagation Neural Networks
Steps:-
- As we can see in the image above, the inputs are nothing but features; in other words, the columns of our dataset.
- These features are passed through the hidden layer, where an activation function is applied, and the result is passed on to the output layer.
- At the output layer, we compute the error, take the derivative of the activation function, and backpropagate toward the input.
- The resulting gradient is then divided by the output size (the number of samples).
- Finally, the weights are updated by subtracting the scaled gradient from the previous weights. For example, layer4 = layer4 - eta * alpha * weights4
- The above steps are repeated until the model is successfully trained.
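The steps above can be sketched as a minimal one-hidden-layer network. The XOR data, layer sizes, random seed, and learning rate here are illustrative stand-ins of my own choosing, not the article's exact code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny stand-in dataset (XOR): 2 input features, 1 binary target
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights: input -> hidden (2x4), hidden -> output (4x1)
w_hidden = rng.normal(size=(2, 4))
w_output = rng.normal(size=(4, 1))
eta = 0.5  # learning rate

losses = []
for epoch in range(5000):
    # Forward pass: features -> hidden layer -> output layer
    hidden = sigmoid(X @ w_hidden)
    output = sigmoid(hidden @ w_output)

    error = output - y
    losses.append(float(np.mean(error ** 2)))

    # Backward pass: multiply by the sigmoid derivative at each layer
    delta_out = error * output * (1 - output)
    delta_hidden = (delta_out @ w_output.T) * hidden * (1 - hidden)

    # Divide by the output size (number of samples) to average the gradient
    grad_out = hidden.T @ delta_out / len(y)
    grad_hidden = X.T @ delta_hidden / len(y)

    # Update the weights by subtracting the scaled gradient
    w_output -= eta * grad_out
    w_hidden -= eta * grad_hidden
```

After training, the mean squared error in `losses` should have dropped from its initial value, showing that the repeated forward/backward passes are in fact fitting the data.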
Use Case — Classification:
I will be predicting the authenticity of banknotes, that is, whether the notes are genuine or forged.
UCI Dataset: https://archive.ics.uci.edu/ml/datasets/banknote+authentication
A classification problem requires an example to be classified into one of two or more classes [2].
Number of Features in the dataset:
- The variance of the image
- The skewness of the image
- Kurtosis of the image
- The entropy of the image
Output:
- Class (whether the note is genuine or forged)
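As an illustrative sketch of this classification setup, here is a single-layer sigmoid classifier (simpler than the full network above) trained on a synthetic stand-in with the same shape as the banknote data: four features and one binary class. In practice the real CSV would be downloaded from the UCI link above; the synthetic coefficients below are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the banknote data: 4 features, binary class
n = 400
X = rng.normal(size=(n, 4))
# Class depends on a linear combination of the features (separable)
y = (X @ np.array([1.5, -1.0, 0.8, -0.5]) > 0).astype(float).reshape(-1, 1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w = np.zeros((4, 1))
b = 0.0
eta = 0.1  # learning rate

for _ in range(2000):
    p = sigmoid(X @ w + b)            # predicted probability of class 1
    grad_w = X.T @ (p - y) / n        # average gradient over the samples
    grad_b = float(np.mean(p - y))
    w -= eta * grad_w
    b -= eta * grad_b

accuracy = float(np.mean((sigmoid(X @ w + b) > 0.5) == y))
```

Because the synthetic classes are linearly separable, this simple model already reaches high training accuracy; the full hidden-layer network is needed when the decision boundary is not linear.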
Accuracy Graph
Use Case — Regression:
UCI Dataset Link: https://archive.ics.uci.edu/ml/datasets/combined+cycle+power+plant
The dataset contains 9,568 rows collected from a Combined Cycle Power Plant over six years (2006–2011). We will use it to predict the plant's net electrical energy output from its ambient features.
A regression problem is the prediction of a continuous quantity; in this case, how much electrical energy the plant will produce [2].
Number of Features in the dataset:
- Temperature (T)
- Ambient Pressure (AP)
- Relative Humidity (RH)
- Exhaust Vacuum (V)
Output:
- Net hourly electrical energy output (PE)
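For regression, the output layer is linear rather than sigmoid, so the output delta is simply the raw error and the loss is the mean squared error. A minimal sketch on a synthetic stand-in with the same shape as the power-plant data (four features, one continuous target; the coefficients and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in: 4 features (T, AP, RH, V) and a continuous target (PE)
n = 500
X = rng.normal(size=(n, 4))
true_w = np.array([[-1.8], [0.4], [-0.2], [-0.9]])
y = X @ true_w + 450.0 + rng.normal(scale=0.5, size=(n, 1))

# Linear output layer: identity activation, so delta = prediction error
w = np.zeros((4, 1))
b = 0.0
eta = 0.05  # learning rate

mses = []
for _ in range(3000):
    pred = X @ w + b
    err = pred - y
    mses.append(float(np.mean(err ** 2)))
    w -= eta * (X.T @ err) / n        # gradient averaged over the samples
    b -= eta * float(np.mean(err))
```

The bias term matters here because the target is centred far from zero (around 450 in this stand-in, much like PE values in megawatts); without it the model could never reach the target range.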
Woohoo! We have finally completed our model and applied it to both a classification and a regression dataset.
References
[1] Admin. “How Do Neural Networks Update Weights and Biases during Back Propagation?” I2tutorials, 18 Oct. 2019, www.i2tutorials.com/how-do-neural-networks-update-weights-and-biases-during-back-propagation.
[2] Brownlee, Jason. “Difference Between Classification and Regression in Machine Learning.” Machine Learning Mastery, 22 May 2019, machinelearningmastery.com/classification-versus-regression-in-machine-learning/#:%7E:text=Classification%20is%20the%20task%20of,of%20predicting%20a%20continuous%20quantity.