Comments:
Thank you so much for this playlist! Got to learn a lot of things in a very clear manner. TRIPLE BAM!!!
The best video I've seen on this topic so far. Great content! Congrats!!
Thanks!
I admire your work a lot. Salute from Brazil.
Hi Josh! Thank you so much for the clear explanation! I'm just having trouble understanding why we DON'T want to predict "abandon" while we are still updating the weights that lead to it. Shouldn't it be that we WANT to predict "abandon", and Negative Sampling selects a subset of words that we WANT TO PREDICT?
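On the question above, a rough sketch of the idea (the vocabulary and sample count here are made up for illustration, not taken from the video): negative sampling keeps the weights for a few words we do NOT want to predict precisely so that training can push their outputs toward 0, while the one word we DO want to predict is pushed toward 1. Only those few rows of output weights get updated per step.

```python
import random

# Toy vocabulary, purely illustrative.
vocab = ["abandon", "aardvark", "statquest", "is", "great"]

def training_targets(context_word, k=2, rng=random.Random(0)):
    """Return one positive target (1) plus k sampled negative targets (0)."""
    negatives = rng.sample([w for w in vocab if w != context_word], k)
    targets = {context_word: 1}                 # the word we WANT to predict
    targets.update({w: 0 for w in negatives})   # words pushed toward 0
    return targets

print(training_targets("great"))
```

So the "negative" words are still part of the loss; their weights are adjusted toward an output of 0 rather than being ignored.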
Hey Josh, I'm a Brazilian student and I love watching your videos. Every one of the concepts gets such a good and fun-to-watch explanation. I just wanted to say thank you, because over the last few months you've made me smile in the middle of studying. So, thank you!!! (sorry for the bad English hahaha)
Hi Josh, thank you for your excellent work! I just discovered your videos and I'm consuming them like a pack of crisps. I was wondering about the desired output when using the skip-gram model. When we have a word as input, is the desired output to have all the words found within the window size, in any sentence of the corpus, activate to 1 at the same time on the output layer? It isn't said explicitly, but I guess it's the only way it can be.
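On the question above, a minimal sketch of how skip-gram training data is usually built (the toy sentence and window size are my own assumptions): rather than asking all context words to activate to 1 simultaneously, training typically generates one (center, context) pair per context word and treats each pair as its own training example.

```python
def skipgram_pairs(tokens, window=2):
    """Yield one (center, context) pair per context word inside the window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

tokens = "statquest is triple bam".split()
print(skipgram_pairs(tokens))
```

Each pair then contributes its own update, so the network never needs every context word to light up at once.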
Todaaay "guitar"
I ordered "guitar"
The-Illustrated-Guide-To-Machine-Learning by.... "guitar"
StatQueeeeest "guitar guitar"
Squash is cute <3
Thank you so much for these videos. The visuals really help because I am dyslexic… Quadruple BAM!!!! lol 😊
Thanks for enlightening us, Master.
The input: "great!"
What is the output?
Can you do this for images?
Bro, I have my master's degree in ML, but trust me, you explain it better than my teachers ❤❤❤
Big thanks
Watching this, I had only one question: why did all the others fail to explain this, if they fully understood the concept?
Wow, awesome. Thank you so much!
Funny and very nicely explained.
I have a question: is the number of outputs the softmax generates at the end of word2vec only between 2 and 20? Is that why the number of parameters is calculated as 3M × 100 × 2? If it were to predict probabilities for all 3M words, would it have been 3M × 100 × 3M?
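For what it's worth, my reading of the arithmetic in the question above (the 3M vocabulary and 100 activations come from the comment itself): the ×2 counts the two weight matrices of the same shape (input→embedding and embedding→output), not the number of softmax outputs. Negative sampling only updates a handful of output rows per step, but both matrices still hold vocab_size × embedding_dim weights each.

```python
vocab_size = 3_000_000   # from the comment: ~3M word vocabulary
embedding_dim = 100      # from the comment: 100 activations per word

# Two weight matrices of the same shape:
# input -> embedding and embedding -> output.
total_params = vocab_size * embedding_dim * 2
print(f"{total_params:,}")  # → 600,000,000
```

So the parameter count is fixed by the matrix shapes; negative sampling changes how many of those parameters are touched per training step, not how many exist.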