
S6

Welcome to the notes explaining the major steps taken to emulate backpropagation in Excel for a neural network with a single hidden layer.

Neural Network (architecture diagram)

Steps for the Process

  1. Layer Initialisation: The hidden- and output-layer node values are computed from the inputs using the initial values of the 8 weights W1 through W8.
  2. Loss Calculation: The error for each output is calculated with the squared-error loss E1 = (1/2)(t1 - Ao1)^2 and E2 = (1/2)(t2 - Ao2)^2, and the total loss is E = E1 + E2.
  3. Backpropagation: The partial derivative of the loss with respect to each weight is calculated using the chain rule.
  4. Updating the Weights: Each weight is updated as W_new = W_old - N * (dE/dW), where N is the learning rate (see the sketch after this list).
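
As a reference for the four steps above, here is a minimal Python sketch of the same 2-input, 2-hidden, 2-output network. The sigmoid activation, the exact wiring of W1..W8, and the input/target/initial-weight values are assumptions for illustration; the actual numbers live in the Excel sheet.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Inputs, targets and initial weights (illustrative values only).
i1, i2 = 0.05, 0.10
t1, t2 = 0.5, 0.5
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30   # input  -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55   # hidden -> output weights
eta = 0.5                                  # learning rate (N in the loss graphs below)

for epoch in range(100):
    # 1. Layer initialisation / forward pass with the current weights W1..W8
    h1 = w1 * i1 + w2 * i2
    h2 = w3 * i1 + w4 * i2
    a_h1, a_h2 = sigmoid(h1), sigmoid(h2)
    o1 = w5 * a_h1 + w6 * a_h2
    o2 = w7 * a_h1 + w8 * a_h2
    a_o1, a_o2 = sigmoid(o1), sigmoid(o2)

    # 2. Loss: E1 = 1/2 (t1 - Ao1)^2, E2 = 1/2 (t2 - Ao2)^2
    E1 = 0.5 * (t1 - a_o1) ** 2
    E2 = 0.5 * (t2 - a_o2) ** 2
    E_total = E1 + E2

    # 3. Backpropagation via the chain rule
    d_o1 = (a_o1 - t1) * a_o1 * (1 - a_o1)      # dE/do1
    d_o2 = (a_o2 - t2) * a_o2 * (1 - a_o2)      # dE/do2
    dE_dw5, dE_dw6 = d_o1 * a_h1, d_o1 * a_h2   # hidden -> output gradients
    dE_dw7, dE_dw8 = d_o2 * a_h1, d_o2 * a_h2
    d_h1 = (d_o1 * w5 + d_o2 * w7) * a_h1 * (1 - a_h1)   # propagate to hidden layer
    d_h2 = (d_o1 * w6 + d_o2 * w8) * a_h2 * (1 - a_h2)
    dE_dw1, dE_dw2 = d_h1 * i1, d_h1 * i2       # input -> hidden gradients
    dE_dw3, dE_dw4 = d_h2 * i1, d_h2 * i2

    # 4. Weight update: W_new = W_old - N * dE/dW
    w1 -= eta * dE_dw1; w2 -= eta * dE_dw2
    w3 -= eta * dE_dw3; w4 -= eta * dE_dw4
    w5 -= eta * dE_dw5; w6 -= eta * dE_dw6
    w7 -= eta * dE_dw7; w8 -= eta * dE_dw8

    if epoch % 10 == 0:
        print(f"epoch {epoch:3d}  E_total = {E_total:.6f}")
```

Each pass through this loop plays the role of one row of the spreadsheet: a forward pass, the loss, the eight gradients, and the eight updated weights.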

Variation of Loss

Loss Graph_0pt1: loss vs. epochs at learning rate N = 0.1

Loss Graph_0pt2: loss vs. epochs at learning rate N = 0.2

Loss Graph_0pt5: loss vs. epochs at learning rate N = 0.5

Loss Graph_0pt8: loss vs. epochs at learning rate N = 0.8

Loss Graph_1: loss vs. epochs at learning rate N = 1

Loss Graph_2: loss vs. epochs at learning rate N = 2
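
The curves above differ only in the learning rate N. Below is a small sketch of how such a sweep can be reproduced in code, using the same network as before in vectorised form (values again illustrative, not the ones in the sheet).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(eta, epochs=200):
    """Train the 2-2-2 network and return the total loss per epoch."""
    i = np.array([0.05, 0.10])           # inputs  (illustrative)
    t = np.array([0.5, 0.5])             # targets (illustrative)
    W_ih = np.array([[0.15, 0.20],       # input  -> hidden weights
                     [0.25, 0.30]])
    W_ho = np.array([[0.40, 0.45],       # hidden -> output weights
                     [0.50, 0.55]])
    history = []
    for _ in range(epochs):
        a_h = sigmoid(W_ih @ i)                       # forward pass
        a_o = sigmoid(W_ho @ a_h)
        history.append(0.5 * np.sum((t - a_o) ** 2))  # E_total = E1 + E2
        d_o = (a_o - t) * a_o * (1 - a_o)             # chain rule, output layer
        d_h = (W_ho.T @ d_o) * a_h * (1 - a_h)        # chain rule, hidden layer
        W_ho -= eta * np.outer(d_o, a_h)              # W_new = W_old - N * dE/dW
        W_ih -= eta * np.outer(d_h, i)
    return history

# One run per learning rate shown in the graphs above.
for eta in (0.1, 0.2, 0.5, 0.8, 1.0, 2.0):
    losses = train(eta)
    print(f"N = {eta}: final loss after {len(losses)} epochs = {losses[-1]:.6f}")
```

Plotting each returned history against the epoch index gives a family of loss-vs-epoch curves of the same kind as the graphs above.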

Final Look

FinalBPTable: the completed backpropagation table from the Excel sheet.
