Comments:
You are the best!!!
Man, you prepared me for a killer interview, thanks
It's Always Sunny in Philadelphia: the gang starts a law firm
Hi Justin, really a very nice and insightful video, but when we have a multiple linear regression with hundreds of features with heteroscedasticity, how do we figure out which ones to log and which not?
You are too good, bro... wish every university had a professor like you.
I don't attend my stats lectures anymore, I just watch Zed's videos and pass my tests. Thanks for the vid, bro!! <3 from S. Africa
I don't understand any more 😢😢
Thank you so much!
How can we not be certain of the coefficients and still have predictive power? Where is the predictive power coming from, if not from the coefficients themselves?
I never thought I'd stumble upon a statistics video with not only a Sufjan Stevens vinyl in the background but also It's Always Sunny references; immediately subscribed.
Why is VIF = 1/(1−R²)? If a higher R² simply means a higher VIF, why define VIF at all?
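On the question above: VIF is defined as 1/(1−R²) because that is exactly the factor by which a coefficient's sampling variance is inflated, where R² comes from the auxiliary regression of that predictor on the other predictors. A minimal NumPy sketch with made-up data (all variable names here are illustrative, not from the video):

```python
import numpy as np

# Made-up data: x2 is nearly a multiple of x1, so its VIF should be large.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 2 * x1 + rng.normal(scale=0.1, size=200)  # almost collinear with x1

# Auxiliary regression of x2 on x1 (with an intercept column),
# then R^2 of that fit.
X = np.column_stack([np.ones_like(x1), x1])
beta, *_ = np.linalg.lstsq(X, x2, rcond=None)
resid = x2 - X @ beta
r2 = 1 - resid.var() / x2.var()

vif = 1 / (1 - r2)  # blows up as the auxiliary R^2 approaches 1
print(f"auxiliary R^2 = {r2:.4f}, VIF = {vif:.1f}")
```

So VIF is not just a relabeling of R²: it is the multiplier on the coefficient's variance, which makes it directly interpretable (VIF = 5 means the standard error is √5 times larger than it would be with uncorrelated predictors).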
Just discovered this channel!!! I'm having fun
Thanks!
Thank you!
Genuinely, thank you
thank you
I really appreciate the intuitive explanation! Very valuable! Thanks a lot!
Thank you for going into the small details that are usually not explained well in schools. I appreciate your efforts. And I really enjoy your accent - charming. ;)
Janet looks skeptical rather than cynical.
Great job! Thank you!
underrated channel...
Thanks Justin, always a pleasure and transformative to watch your stats videos.
Best explanation of multicollinearity I've come across
Thanks for providing such clarity in such simple words. Love from 🇮🇳
Great stuff
Very nice set of lectures. Thank you for your demo at the end!
So basically speaking, when the "X" variables have the same unit, one of them can be removed, whether it is distance, time, or cucumbers?
Exceptional explanation 👍
Thank you! Thank you! Thank you!
You are so good, love you and your videos
This is a complete breakdown of the concept of multicollinearity. Thanks so much, man. Now, about the 3rd remedy for multicollinearity you provided in the video: I was studying the effect of education on health status, and to measure education I used gross enrollment at the primary, secondary, and tertiary levels, but after running my regression I noticed the 3 variables were correlated. Would I be right to combine the 3 variables by taking their average to get what I termed "gross enrolment rate Total"?
I'm in a class and not understanding the way they are trying to teach me these concepts. I really appreciate your work and the way you explain things. Not sure why I'm taking a $2,000 class when I can learn more for the price of a "like" on your videos.
Excellent teaching skills, really nice presentation. Thanks, man
THANK YOU SO MUCH
Brilliant video. You are just great. Very well explained.
Your videos are amazziiinggggg!!!!!!!
This was great. I've been teaching myself linear algebra over the past 6 months, and it's cool to see a real-world example of ill-conditioned matrices and, in the final example, of a coefficient matrix whose columns don't form a basis. The real-world context really helps cement those more abstract concepts.
Hi, great video. What is the little epsilon at the end of the regression equation? I assume beta0 already is the intercept?
great explanation!
Just from watching the example at the beginning, I knew what multicollinearity is!!! Thank you
The way you explained multicollinearity is awesome. Thank you for clearing up my doubts.
Can we disregard multicollinearity when developing a prediction model? My model is getting quite complex (with some higher-order terms to satisfy linearity on the logit scale [fractional polynomials]), and it's getting tricky to deal with this issue.
Why did I go to school!!!!!
Thank you so much, mate. :)
Very useful. Thank you!
great content. learned a lot. thanks, man
This video is perfect! Thank you :D
Can we use VIF when assessing multicollinearity (association) between categorical X variables (multinomial)? Thank you!
Indeed a great video. Thank you
I have a question: Pearson's correlations between the variables are statistically significant, but in the multiple linear regression they are not... why?
Sometimes the signs (positive or negative) are also inconsistent between the correlation and the regression... why?