Solution: Write your own Backpropagation method - Python Tutorial
From the course: Training Neural Networks in Python
- So here's my solution. Step one is the simplest. In line 93, we just run X through the network and assign the result to a new array called outputs. Step two is where we calculate the mean squared error. So first, I save the simple errors in an array called errors, and notice that I'm using NumPy vector operations, like the subtraction Y minus outputs. Then the mean squared error is the sum of the squared values in errors divided by the number of neurons in the last layer. Step three is also done in vector operations, just following the equation; notice that the result goes to the last element in our D array. Now for step four, pay attention to the loop starting at line 103. First, I calculate the weighted sum of the forward error terms and then use that sum for the current neuron's error term. Notice that the outputs are not recalculated; they are fetched from the values cache. All this is assigned to each element in the D…
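The source file itself isn't reproduced on this page, so the following is only a minimal sketch of what a backpropagation method along these lines might look like for a sigmoid multilayer perceptron. The class and attribute names (MLP, W, values, d, eta) are illustrative rather than the course's exact code, the line numbers mentioned in the transcript refer to the instructor's file rather than to this sketch, and the final weight-update step goes beyond the truncated excerpt above and is included only so the sketch can actually train.

```python
import numpy as np

class MLP:
    """Illustrative sigmoid multilayer perceptron; attribute names are
    assumptions, not the course's exact code."""

    def __init__(self, layers, eta=0.5):
        self.layers = layers          # e.g. [2, 2, 1]: inputs, hidden, outputs
        self.eta = eta                # learning rate
        # One weight matrix per non-input layer; the extra column is the bias.
        self.W = [np.random.uniform(-1, 1, (layers[i], layers[i - 1] + 1))
                  for i in range(1, len(layers))]
        self.values = [np.zeros(n) for n in layers]   # cached layer outputs
        self.d = [np.zeros(n) for n in layers]        # error terms (deltas)

    def sigmoid(self, z):
        return 1.0 / (1.0 + np.exp(-z))

    def run(self, x):
        # Forward pass: cache every layer's outputs in self.values.
        self.values[0] = np.asarray(x, dtype=float)
        for i, W in enumerate(self.W, start=1):
            inputs = np.append(self.values[i - 1], 1.0)   # append bias input
            self.values[i] = self.sigmoid(np.dot(W, inputs))
        return self.values[-1]

    def bp(self, x, y):
        # Step 1: run x through the network.
        outputs = self.run(x)
        # Step 2: mean squared error from the simple errors y - outputs.
        errors = np.asarray(y, dtype=float) - outputs
        mse = np.sum(errors ** 2) / self.layers[-1]
        # Step 3: error terms of the output layer go to the last element of d.
        self.d[-1] = errors * outputs * (1 - outputs)
        # Step 4: walk backwards through the hidden layers.
        for i in reversed(range(1, len(self.layers) - 1)):
            # Weighted sum of the forward layer's error terms
            # (the bias column of the forward weight matrix is dropped).
            fwd_error = np.dot(self.W[i][:, :-1].T, self.d[i + 1])
            o = self.values[i]        # fetched from the cache, not recomputed
            self.d[i] = fwd_error * o * (1 - o)
        # Step 5 (beyond the excerpt): gradient-descent weight update.
        for i in range(len(self.W)):
            inputs = np.append(self.values[i], 1.0)
            self.W[i] += self.eta * np.outer(self.d[i + 1], inputs)
        return mse

# Usage sketch: one training pass over the XOR patterns, printing the MSE.
net = MLP([2, 2, 1])
for x, y in [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]:
    print(net.bp(x, y))
```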