Comments:
Would love to request an in person version
Sexy
Excellent video, thank you!
Gotta echo the other comments here. Very succinct and understandable. You brought in the linear algebra without getting bogged down in it. Folks who don't have a strong grasp of that subject will still probably be able to get the main points of your presentation. Nicely done!
PS: Video is targeted at people who already have a deep knowledge of what the video is trying to explain.
Thank you for this amazing video
I have always dreaded statistics, but this video made these concepts so simple while connecting them to linear algebra. Thank you so much ❤
Why did you stop making videos?
I thought PCA was a hard concept. Your video is so great!
Thank you for this amazing and simple explanation
Thank you! Very nice video, well explained!
Great concise presentation, much appreciated! 👍
Good job, no wasted time
Awesome explanation!! Nobody did it better!
Around minute 1:36, you said "we divide by n for covariance", but we divide by n-1 instead. Please do check on that. Thanks for the video. Maybe I should say the estimated (sample) covariance has the n-1 division.
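The n vs n-1 point raised in the comment above is easy to check numerically. A minimal sketch in Python with NumPy (made-up data; note that `np.cov` defaults to the n-1 sample convention, with `ddof=0` giving the divide-by-n population version):

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])
n = len(x)
d = x - x.mean()  # deviations from the mean

# Population covariance (divide by n) vs sample covariance (divide by n-1)
pop_var = (d @ d) / n           # 5.0
sample_var = (d @ d) / (n - 1)  # 6.666...

# np.cov defaults to the n-1 (unbiased sample) convention
print(np.isclose(float(np.cov(x)), sample_var))       # True
print(np.isclose(float(np.cov(x, ddof=0)), pop_var))  # True
```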
Very nice video. I plan to use it for my teaching. What puzzles me a bit is that the PCs you give as an example are not orthogonal to each other.
Thank you. It was beautiful
Great video! Can anyone tell how she decided that PC1 is spine length and PC2 is body mass? Should we guess (hypothesize) this in real-world scenarios?
Very nice, please keep posting
Good lecture
Graphical interpretation of covariance is very intuitive and useful for me. Thank you.
Good explanation
Great explanation
I do understand that eigenvalues represent the factor by which the eigenvectors are scaled, but how do they signify "the importance of certain behaviors in a system"? What other information do eigenvalues tell us other than a scaling factor? Also, why do eigenvectors point towards the spread of data?
No one explains why they use the covariance matrix. Why not use the actual data and find its eigenvectors/eigenvalues? I have been watching hundreds of videos and books. No one explains that. It just doesn't make sense to me to use the covariance matrix. Covariance is a very useless parameter. It doesn't tell you much at all.
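On the question above: a data matrix is generally not square, so it has no eigenvectors at all (you would use the SVD on it instead); the covariance matrix is used because its eigenvectors are the directions of maximal variance and its eigenvalues are exactly the variances along those directions. A minimal sketch in Python with NumPy (made-up correlated 2-D data) that verifies this:

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: second coordinate is roughly 0.5*x plus noise
x = rng.normal(size=500)
data = np.column_stack([x, 0.5 * x + 0.3 * rng.normal(size=500)])

centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)  # 2x2 covariance matrix

# Eigenvectors of the covariance matrix = principal directions;
# eigenvalues = variance of the data along those directions.
eigvals, eigvecs = np.linalg.eigh(cov)

proj = centered @ eigvecs  # project the data onto each eigenvector
print(np.allclose(proj.var(axis=0, ddof=1), eigvals))  # True
```

The check at the end is the whole point: the variance of the data projected onto each eigenvector equals the corresponding eigenvalue, which is why sorting eigenvalues ranks the principal components by importance.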
Believe it or not, I've been wondering a lot about the concept of covariance because every video seems to miss the reason behind the idea. But I think I kind of figured it out today before watching this video, and I drew the exact same thing that is in the thumbnail. So I guess I was thinking correctly :))
Thanks for this simple yet very clear explanation
Babe, var(x,x) makes no sense. Either you say var(x) or cov(x,x)
Best PCA Visual Explanation! Thank You!!!
Please do more videos
Great clarity. You clearly understand your stuff at a deep level, so it's easy to teach.
Poggers explanation, thank you
Thank you, Ma'am!
Wow, that was quite a good explanation.
Beautiful, thanks a lot!
Great explanation. Thank you.
Congratulations Emma, your work is excellent!
Hello Emma, great job! Very nicely explained.
Great explanation!
Nice job, was always kinda confused by this.
I just love the voice🙄😸
Thank you for this great lecture.
Investigate hedge/hogs
This video needs a golden buzzer.
Great video, thank you!
Very nice explanation!