Word Embedding and Word2Vec, Clearly Explained!!!

StatQuest with Josh Starmer

1 year ago

278,381 views

Comments:

Muthu Aiswaryaa Swaminathan - 07.11.2023 23:00

Thank you so much for this playlist! Got to learn a lot of things in a very clear manner. TRIPLE BAM!!!

gustavo - 06.11.2023 19:21

The best video I've seen on this topic so far. Great content! Congrats!!

Joel Gsponer - 06.11.2023 09:15

Thanks!

Rayner GS - 30.10.2023 17:08

I admire your work a lot. Salute from Brazil.

Yuhan Zhou - 17.10.2023 01:00

Hi Josh! Thank you so much for the clear explanation! I'm just having trouble understanding why it is that we DON'T want to predict "abandon", yet we still optimize the weights that lead to it. Shouldn't it be that we WANT to predict "abandon", and Negative Sampling selects a subset of words that we WANT TO PREDICT?
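
For what it's worth, the question above can be pictured with a tiny toy sketch of negative sampling (my own illustration with made-up words and names, not the video's code): the word we want to predict gets a target of 1, while a few randomly sampled words like "abandon" get a target of 0. Their weights are still optimized, just optimized toward outputting 0.

```python
import random

# Toy sketch of negative sampling (illustration only; vocabulary and
# function names here are assumptions, not from the video).
vocab = ["aardvark", "abandon", "great", "gymkata", "is", "statquest"]

def training_examples(center, context, k=2, seed=0):
    """Build one positive example and k negative examples."""
    rng = random.Random(seed)
    examples = [(center, context, 1)]  # the word we WANT to predict -> target 1
    negatives = [w for w in vocab if w not in (center, context)]
    for word in rng.sample(negatives, k):
        examples.append((center, word, 0))  # words the network should output 0 for
    return examples

# "abandon" can show up as a negative sample: its output weights still get
# trained, but toward 0 -- which is why they are optimized even though we
# don't want to predict it.
for example in training_examples("statquest", "great"):
    print(example)
```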

Marvin Mendes Cabral - 07.10.2023 03:31

Hey Josh, I'm a Brazilian student and I love watching your videos; they're such good and fun-to-watch explanations of every one of these concepts. I just wanted to say thank you, because in the last few months you've made me smile in the middle of studying. So, thank you!!! (sorry for the bad English hahaha)

Guillaume Barreau - 05.10.2023 21:32

Hi Josh, thank you for your excellent work! I just discovered your videos and I'm consuming them like a pack of crisps. I was wondering about the desired output when using the skip-gram model: when we have a word as input, is the desired output for all the words found within the window size, in any sentence of the corpus, to activate to 1 at the same time on the output layer? It isn't said explicitly, but I guess it is the only way it can be.
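
On the question above: in most skip-gram implementations (a detail the video leaves implicit), each word inside the window becomes its own separate (input, target) training pair, rather than all context words being set to 1 in a single output vector at once. A toy sketch of that pair generation, with a made-up example sentence:

```python
# Toy sketch of skip-gram pair generation (my own illustration, not the
# video's code): every word within `window` positions of the center word
# yields its own separate (input, target) pair.
def skip_gram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# "statquest" pairs with both "is" and "great", but as two separate
# training examples, not one example with two 1s in the output.
print(skip_gram_pairs(["statquest", "is", "great"]))
```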

G - 03.10.2023 14:34

Todaaay "guitar"
I ordered "guitar"
The-Illustrated-Guide-To-Machine-Learning by.... "guitar"

StatQueeeeest "guitar guitar"

whataquirkyguy - 03.10.2023 03:11

Squash is cute <3

Colin Timmins - 26.09.2023 09:52

Thank you so much for these videos. The visuals really help because I am dyslexic… Quadruple BAM!!!! lol 😊

Danish - 25.09.2023 10:34

Thanks for enlightening us Master.

Luận Hồ - 24.09.2023 16:36

The input: "great!"
What is the output?

Dennis Huber - 21.09.2023 17:13

Can you do this for images?

fouad boutaleb - 20.09.2023 10:48

Bro, I have my master's degree in ML, but trust me, you explain it better than my teachers ❤❤❤
Big thanks

AM - 15.09.2023 14:42

When I watched this, I had only one question: why did all the others fail to explain this, if they fully understood the concept?

Pakapon Wiwat - 15.09.2023 06:37

Wow, Awesome. Thank you so much!

steve samson - 13.09.2023 09:58

funny and very nicely explained.

Pratham Gupta - 09.09.2023 18:29

I have a question: is the number of outputs the softmax generates at the end of word2vec between 2 and 20? Is that why the number of params is calculated as 3M × 100 × 2? If it were to predict probabilities for all 3M words, would it have been 3M × 100 × 3M?
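
On the arithmetic in the question above, as I read the video's example (treat the exact numbers as assumptions): the ×2 counts the two weight matrices of the network, inputs→activations and activations→outputs, each of size 3M × 100, so it is a count of weights to optimize, not of softmax outputs. The 2-to-20 figure is the number of negative samples per training step, which limits how many output weights get updated at a time.

```python
# Hedged back-of-envelope using the example's numbers (assumptions, not
# anything official): a word2vec network with a 3M-word vocabulary and
# 100 activations has two weight matrices of the same size.
vocab_size = 3_000_000
embedding_dims = 100
num_matrices = 2           # inputs -> activations, and activations -> outputs

total_weights = vocab_size * embedding_dims * num_matrices
print(f"{total_weights:,}")   # 600,000,000 weights to optimize

# Negative sampling doesn't shrink the softmax to 2-20 outputs overall;
# each step just updates the output weights of 1 positive word plus
# k sampled negatives (k is often somewhere in the 2-20 range).
k = 5
weights_touched = embedding_dims + (1 + k) * embedding_dims
print(f"{weights_touched:,}")  # 700 weights updated in one step when k=5
```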
