Comments:
If you found this video helpful, then hit the like button👍, and don't forget to subscribe ▶ to my channel as I upload a new Machine Learning Tutorial every week.
let bro cook
Nicely explained. Keep up the good work!
wait you haven't explained backpropagation at all
Your videos are very helpful. It would be great if you sorted the videos. Thank you😇😇😇
great video
Fantastic explanation. Thank you
Why no subtitles?
Literally best. Crisp and clear!! Thank you
Very helpful and to the point and correct!
Brother, your explanation was great, but there are some mistakes I have pointed out.
best explanation, best playlists
I don't usually interact with the algorithm much by giving likes or dropping comments, but you beat me into submission with this. Hopefully I understand the rest of it too lol.
You explain better than the popular deep learning course instructors
Best explanation I've seen so far
Are B1 and B2 initialized randomly too?
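For readers with the same question: a common convention is to initialize the weights randomly (to break symmetry between units) while starting the biases at zero, though random biases also work. A minimal sketch; the layer sizes are made up for illustration and don't come from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 3, 4, 1  # illustrative layer sizes

# Weights are initialized randomly to break symmetry between units.
W1 = rng.standard_normal((n_hidden, n_in)) * 0.01
W2 = rng.standard_normal((n_out, n_hidden)) * 0.01

# Biases are commonly initialized to zero; symmetry is already broken
# by the random weights, so random biases are optional.
B1 = np.zeros((n_hidden, 1))
B2 = np.zeros((n_out, 1))
```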
Absolutely loved the way you explain. So easy to understand. Thank you
Excellent explanation, jazakallah bro
Please share the code for the backpropagation algorithm
so glad I found this channel!!
Super, sir. I have learned a lot from this, including the way the calculations work. It's very useful for our studies. Thank you, sir
Can A* actually be Z*, e.g. A1 = Z1?
I've always felt as if I was on the cusp of understanding neural nets but this video brought me past the hump and explained it perfectly! Thank you so much!
Sir, it's w₁₁^[1] * a₁^[0], right? You've written it as w₁₁^[1] * a₁^[1] in the matrix multiplication; can you verify whether I'm wrong?
What is this B1?
Such a simple and neat explanation.
This was actually pretty straight forward
Small doubt: what is f(z1)? I am assuming these are just different types of activation functions, where the input is the weights of the current layer times the input from the previous layer. Is that correct?
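The commenter's reading is essentially right: f(z1) denotes an activation function applied to the weighted sum of the previous layer's outputs plus a bias. A minimal sketch; sigmoid is assumed as the activation and the numeric values are made up for illustration:

```python
import numpy as np

def sigmoid(z):
    """One common choice of activation function f."""
    return 1.0 / (1.0 + np.exp(-z))

# Previous layer's activations (this layer's input), shape (2, 1).
a_prev = np.array([[0.5], [0.1]])

# This layer's weights and bias (illustrative values).
W = np.array([[0.2, -0.4],
              [0.7,  0.3]])
b = np.array([[0.1], [0.0]])

z = W @ a_prev + b   # weighted input:  z = W·a_prev + b
a = sigmoid(z)       # activation:      a = f(z)
```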
This video should be titled "Explain Forward and Backward Propagation to Me Like I'm Five". Thanks man, you saved me a lot of time.
Good job. But in gradient descent, W2 and W1 must be updated simultaneously.
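The point in the comment above can be made concrete: compute every gradient from the same snapshot of the parameters first, and only then apply all the updates, rather than updating W1 and then deriving W2's gradient from the already-changed W1. A schematic sketch with made-up scalar values; the gradient computation itself is stubbed out, since it depends on the network:

```python
def gradient_step(params, grads, lr=0.1):
    """Apply one simultaneous gradient-descent update.

    All gradients in `grads` are assumed to have been computed from the
    *same* snapshot of `params`, so updating one parameter cannot
    contaminate the gradient of another.
    """
    return {name: params[name] - lr * grads[name] for name in params}

# Illustrative scalar "weights" and their gradients (made-up numbers).
params = {"W1": 0.5, "W2": -0.3}
grads  = {"W1": 0.2, "W2": -0.1}

new_params = gradient_step(params, grads)
```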
Amazing work, keep it going :)
Thanks man. The slides were amazingly put up.
Good explanation!!
You explained it in a very clear and easy way. Thank you, this is so helpful!
Isn't the equation Z = W·X + B actually Z = transpose(W)·X + B? Hence the weight matrix you have given is wrong, right?
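Whether Z = W·X + B or Z = Wᵀ·X + B is correct depends purely on how the weight matrix is laid out; both conventions appear in textbooks, and neither is wrong on its own. A quick sketch (illustrative shapes, not from the video) showing that the two layouts give the same result:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 3, 2
X = rng.standard_normal((n_in, 1))
B = rng.standard_normal((n_out, 1))

# Convention 1: W has shape (n_out, n_in), so Z = W @ X + B.
W = rng.standard_normal((n_out, n_in))
Z1 = W @ X + B

# Convention 2: store the same weights transposed, shape (n_in, n_out),
# so Z = W_t.T @ X + B.  Same numbers, different bookkeeping.
W_t = W.T
Z2 = W_t.T @ X + B
```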
Hi, can you add a caption option?
Great video, and great explanation, thanks dude!
I'm a bit confused by the exponent notations, since some of them don't correspond to the others
You are great. It will be very good if you continue.
Your videos on neural networks are really good. Can you please also upload videos for generalized neural networks? That would really be helpful. P.S. Keep up the good work!!!
Awesome, really helpful! Thank you
Thank you sir, it was really helpful
You are really awesome. Love your teaching ability
Hi, how do you calculate the cost?
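For the cost question: a common choice in tutorials like this is the mean squared error between predictions and targets (cross-entropy is the other frequent choice for classification). A minimal sketch, assuming MSE with a 1/(2m) scaling; the numbers are made up for illustration:

```python
import numpy as np

def mse_cost(y_pred, y_true):
    """Mean squared error cost over m training examples."""
    m = y_true.shape[0]
    return np.sum((y_pred - y_true) ** 2) / (2 * m)

# Illustrative targets and network outputs.
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.8])

cost = mse_cost(y_pred, y_true)
```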
great video as always
Great video!
Please also make a video on SVM as soon as possible