Multi-Layer Neural Networks with Sigmoid Function — Deep Learning for Rookies (2)

Review from DLFR 1

Graph 1: Procedures of a Single-layer Perceptron Network

Breakthrough: Multi-Layer Perceptron

Geoffrey Hinton, one of the most important figures in Deep Learning. Source: thestar.com

Neural Networks with Hidden Layers

Graph 2: Left: Single-Layer Perceptron; Right: Perceptron with Hidden Layer

Graph 3: We label input layer as x with subscripts 1, 2, …, m; hidden layer as h with subscripts 1, 2, …, n; output layer with a hat

Definition of a Dot Product

Graph 4: Example

Graph 5: Procedure to Hidden Layer Outputs

Graph 6: Procedure to Final Output
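The dot-product steps in Graphs 4–6 can be sketched directly in code. This is a minimal NumPy sketch of one forward pass; the weight and input values are made up for illustration, not taken from the graphs:

```python
import numpy as np

# Hypothetical 3-input, 2-hidden-neuron network (all values illustrative)
x = np.array([1.0, 0.5, -1.0])        # input layer x1..xm

W_h = np.array([[0.2, -0.4, 0.1],     # one row of weights per hidden neuron
                [0.7,  0.3, -0.6]])
b_h = np.array([0.1, -0.2])

h = W_h @ x + b_h                     # dot products give the hidden-layer outputs

W_o = np.array([0.5, -0.5])           # hidden -> output weights
b_o = 0.05
y_hat = W_o @ h + b_o                 # final output (before any activation)

print(h)                              # hidden-layer outputs
print(y_hat)                          # final output
```

Each neuron's output is just the dot product of its weights with the previous layer's outputs, plus a bias; stacking the weight rows into a matrix computes the whole layer at once.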

Sigmoid Neurons: An Introduction

Graph 7: Step Function

Graph 8: We Want Gradual Change in Weights to Gradually Change Outputs

Sigmoid Function

Graph 9: Sigmoid Function using Matplotlib
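The curve in Graph 9 comes from σ(z) = 1 / (1 + e⁻ᶻ). A minimal sketch of the function, with the Matplotlib plotting lines left as comments so the snippet runs headlessly:

```python
import numpy as np

def sigmoid(z):
    """Smooth squashing function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-10, 10, 201)
y = sigmoid(z)

print(sigmoid(0.0))   # the curve crosses 0.5 exactly at z = 0

# To reproduce Graph 9:
# import matplotlib.pyplot as plt
# plt.plot(z, y)
# plt.show()
```

Unlike the step function in Graph 7, the sigmoid changes smoothly, so a small change in weights produces a small change in the output (Graph 8), which is exactly what gradient-based learning needs.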

Graph 10: Representation of Neural Networks with Hidden Layers to Classify XOR Gate. Source: http://colinfahey.com/
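The XOR network of Graph 10 can be written out by hand. The weights below are hand-picked for illustration (large magnitudes so the sigmoids saturate near 0 or 1), not learned; one hidden neuron approximates OR, the other approximates AND, and the output combines them as "OR and not AND", which is XOR:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked weights (illustrative, not trained):
# h1 ~ OR(x1, x2), h2 ~ AND(x1, x2), output ~ h1 AND NOT h2 = XOR
W_h = np.array([[20.0, 20.0],
                [20.0, 20.0]])
b_h = np.array([-10.0, -30.0])
W_o = np.array([20.0, -20.0])
b_o = -10.0

def xor_net(x1, x2):
    h = sigmoid(W_h @ np.array([x1, x2]) + b_h)
    return sigmoid(W_o @ h + b_o)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(float(xor_net(a, b))))
```

A single-layer perceptron cannot represent XOR because the classes are not linearly separable; the hidden layer is what makes this network possible.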

Multi-Layer Neural Networks: An Intuitive Approach

Graph 11: MNIST dataset of Hand-written Digits. Source: http://neuralnetworksanddeeplearning.com/

Graph 12: MNIST digit 5, which consists of 28×28 pixel values between 0 and 255. Source: http://neuralnetworksanddeeplearning.com/

Graph 13: Multi-Layer Sigmoid Neural Network with 784 input neurons, 16 hidden neurons, and 10 output neurons
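The 784–16–10 architecture of Graph 13 amounts to two matrix multiplications, each followed by a sigmoid. A shape-checking sketch with random (untrained, purely illustrative) weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.random(784)                  # one flattened 28x28 image, pixels scaled to [0, 1]

W1 = rng.standard_normal((16, 784))  # input -> hidden weights
b1 = np.zeros(16)
W2 = rng.standard_normal((10, 16))   # hidden -> output weights
b2 = np.zeros(10)

h = sigmoid(W1 @ x + b1)             # 16 hidden activations
y_hat = sigmoid(W2 @ h + b2)         # 10 output activations, one per digit 0-9

print(h.shape, y_hat.shape)          # (16,) (10,)
```

The predicted digit would be `y_hat.argmax()`; with random weights it is of course meaningless until the network is trained.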

Graph 14: An Intuitive Example to Understand Hidden Layers

Graph 15: Neural Networks are Black Boxes. Each Time is Different.

Our quote this week 😉

Graph 16: Each Hidden Layer Learns More Complex Features. Source: Scientific American

Recap