-
Well well well, I did read the maths behind it. What we actually do is compute the gradient of the loss with respect to each weight, dL/dw, and then update the weight with it (w_new = w - lr * dL/dw), and that's how we get the new weights.
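In case it helps, here is a minimal PyTorch sketch of that update (the toy values for `lr`, `x` and `y_true` are made up for the example, not from the course): `backward()` fills `w.grad` with dLoss/dw, and then we step the weight manually.

```python
import torch

torch.manual_seed(0)
w = torch.tensor(2.0, requires_grad=True)   # a single "weight"
x, y_true = torch.tensor(3.0), torch.tensor(9.0)
lr = 0.01  # learning rate (illustrative value)

y_pred = w * x
loss = (y_pred - y_true) ** 2  # simple squared error
loss.backward()                # computes dLoss/dw and stores it in w.grad

with torch.no_grad():
    w -= lr * w.grad           # w_new = w - lr * dLoss/dw
    w.grad.zero_()             # clear the gradient before the next step
```

This is exactly what `torch.optim.SGD` does for you under the hood when you call `optimizer.step()`.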
-
Hey,
I wanted to confirm whether I am understanding this correctly. It was mentioned in the YouTube video at 6:15:40 that backpropagation (step 4: loss backward) moves backwards through the network to calculate the gradients of the parameters of our model w.r.t. the loss.
But as I cross-checked, and have always understood, we calculate the gradients of the loss w.r.t. the parameters, not vice versa.
Please clarify whether you are referring to something different, or am I right?
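For reference, a minimal sketch (illustrative values, not from the video) confirming the direction: `loss.backward()` stores ∂loss/∂w in `w.grad`, which matches the analytic derivative of the loss with respect to the parameter.

```python
import torch

w = torch.tensor(2.0, requires_grad=True)
x, y = torch.tensor(3.0), torch.tensor(9.0)

loss = (w * x - y) ** 2
loss.backward()

# Analytic derivative of the loss w.r.t. w: dL/dw = 2 * (w*x - y) * x
analytic = 2 * (w.detach() * x - y) * x
print(w.grad, analytic)  # both print tensor(-18.), i.e. dLoss/dw
```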