Comments:
Thanks for your class...
Well explained!!!
Lucid explanation!!
Very useful explanation.
Hello, can someone help me solve this?
1) Create the core kernel functions:
- polynomial(c, h, x, y)
- Gaussian(standard deviation, x, y)
- sigmoid(alpha, beta, x, y)
- chi-squared(x, y)
2) Create a function that constructs a Gram matrix.
3) Kernel k-means clustering algorithm.
Disappointed with Stanford - it needed to show k-means with pictures, graphs, and diagrams; a dry explanation alone is no use. Hope you will use this feedback. Thanks for the attempt.
Very helpful and easy to understand. Keep up the good work!
Good.
Let's say I have a high-dimensional dataset with 100 features. If I follow approach 2 (dispersed) for picking the initial k points, how can I manually judge the distances, given that I cannot plot this high-dimensional dataset on a 2D graph?
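On the question above: no plot is needed, because Euclidean distance is just a formula over the feature vector and works for 100 features exactly as for 2. A minimal sketch of the "dispersed" seeding idea (farthest-first traversal; the function name and seeding rule are my assumptions, not from the video):

```python
import numpy as np

def farthest_first_init(X, k, seed=0):
    """Pick k mutually far-apart rows of X as initial centers."""
    rng = np.random.default_rng(seed)
    centers = [int(rng.integers(len(X)))]        # first center: random point
    # distance from every point to its nearest chosen center so far
    d = np.linalg.norm(X - X[centers[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))                  # point farthest from all centers
        centers.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return X[centers]

# Same call whether there are 2 or 100 features; no plotting involved:
X = np.random.default_rng(1).normal(size=(500, 100))
print(farthest_first_init(X, k=5).shape)  # (5, 100)
```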
Nice video :-)
Best tutorial ever!
Why is there so much saliva?
Thank you, wonderful video.
Excellent start, cleared my initial doubts. Keep it up.
Unfortunately useless.
Can anyone help me out with how to merge clusters if they are close to each other or lie in a particular direction?
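One common post-processing heuristic for the question above (my assumption, not something from the video): run k-means as usual, then greedily merge any pair of clusters whose centroids fall within a chosen distance threshold. The threshold and centroid-distance criterion are assumptions; for direction-based merging you could instead compare the centroid difference vectors against a reference direction.

```python
import numpy as np

def merge_close_clusters(X, labels, threshold):
    # Repeatedly merge the closest pair of clusters while their centroids
    # are within `threshold`; centroids are recomputed after every merge,
    # so chains of nearby clusters collapse correctly.
    labels = labels.copy()
    while True:
        ids = np.unique(labels)
        cents = np.array([X[labels == c].mean(axis=0) for c in ids])
        best, best_d = None, np.inf
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                d = np.linalg.norm(cents[a] - cents[b])
                if d < best_d:
                    best_d, best = d, (ids[a], ids[b])
        if best is None or best_d >= threshold:
            return labels                     # nothing close enough to merge
        labels[labels == best[1]] = best[0]   # absorb cluster b into cluster a
```

After the call, any two clusters whose centroids started closer than the threshold share one label, while well-separated clusters keep their own.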
I love this video. Can you please send me the link to the next video?
Nice explanation. Thank you.
What about a convergence proof, etc.? This is too naive.
Excellent explanation. Very concise but covered the right details.
Using another clustering algorithm to pick the K points seems to me like an oxymoron.
Nice explanation!!! Thank you.
Very well done.
Awesome!!! Great explanation: simple, useful, efficient! I feel that I really understood the topic... Thank you!
It's helpful if you show the calculations...