Comments:
good good good good
Don't mind if this is my ring tone, drives the confusion away, BAM...
Wonderful clarity. Well done!
Does the bias apply to all data or only the test data? Same question for variance.
khouya sir t7wa
A silly doubt, please clarify if possible. You said that after a certain weight, mice don't get any taller. Practically, that isn't a necessary condition. Say the limit is 80 kg: some people grow taller even after 80 kg. So are you asking us to assume a dataset where this holds?
good good great
I have a feeling if you were Japanese you'd be a great haiku poet
The inability of a machine learning method, like linear regression, to capture the true relationship is called bias. A large amount of bias means a great inability to capture the data trend in the model.
The difference in fits between data sets is called variance.
We can compare how well the straight line and the squiggly line fit the training set by calculating their sums of squares (square the distance between the fit line and the data points, then add them up).
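The sum-of-squares comparison described above can be sketched in a few lines. This is a minimal illustration with made-up data; the curved true relationship, the seed, and the polynomial degrees are my own assumptions, not from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up training data: a curved true relationship plus noise
x = np.linspace(0, 10, 20)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

def sum_of_squares(coeffs, x, y):
    # Square the distance between the fitted curve and each point, then add them up
    return float(np.sum((np.polyval(coeffs, x) - y) ** 2))

straight = np.polyfit(x, y, 1)  # straight line: can't bend, so high bias
squiggly = np.polyfit(x, y, 9)  # flexible polynomial: low bias on the training set

print(sum_of_squares(straight, x, y))  # larger training error
print(sum_of_squares(squiggly, x, y))  # smaller training error
```

Because the degree-9 fit contains the straight line as a special case, its training sum of squares can only be smaller, which is exactly the "squiggly line wins on the training set" point.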
Wtf bro?! How on earth do you answer all these comments?
StatQuest so cool man
The way you explain it, priceless. Thank you so much.
man
I love how you explained it, so easy to understand, like butter 🔥🔥🔥🔥
Just cool
One way I remember this is:
Bias: the loss associated with training
Variance: the loss associated with testing
If your training loss is low, you have low bias, and vice versa.
If your testing loss is low, you have low variance, and vice versa.
the intros are amazing, and so are the videos, thanks!
I came here after taking a grad level course, but this simple explanation often stays longer in the mind :)
this learning is fun, I like the way you create videos.
I'm amazed how many bams I've reached just in a couple of hours. Your videos have been enlightening, thank YOU very much!
Double BAM !!
Very simply and amazingly explained, saw many tutorials but this was by far the best. Thank you :)
THANK YOU SO MUCH!
OMG, pls join as a prof at my university hehe
Thanks!
- Bias measures how well the model fits the training set; the smaller the bias, the better it fits the training set
- Variance measures how much the fits differ between data sets, i.e., how well the model carries over to the testing set
- We want to find the sweet spot where both bias and variance are low
- Three commonly used methods for finding it: regularization, boosting, and bagging
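On the regularization point, here is a tiny sketch of how ridge-style shrinkage trades a little bias for less variance. It uses a one-feature closed form with made-up data (true slope 3, lambda = 5 chosen arbitrarily); boosting and bagging are not shown:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 15)

def ridge_slope(y, lam):
    # Closed-form ridge regression for one feature with no intercept:
    # slope = sum(x*y) / (sum(x^2) + lambda); lam = 0 is ordinary least squares
    return float(x @ y / (x @ x + lam))

# Many made-up training sets drawn from the same underlying line (true slope 3)
datasets = [3 * x + rng.normal(scale=0.5, size=x.size) for _ in range(200)]

slopes_ols = [ridge_slope(y, lam=0.0) for y in datasets]
slopes_ridge = [ridge_slope(y, lam=5.0) for y in datasets]

# Ridge slopes are shrunk toward zero (more bias) but vary less
# from training set to training set (less variance)
print(np.var(slopes_ols), np.var(slopes_ridge))
```

Because every ridge slope here is the corresponding least-squares slope multiplied by a fixed factor below 1, the spread of the fits across training sets shrinks by that factor squared, which is the variance reduction regularization buys.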
when you say high variance on different datasets, does that mean different test datasets?
I replayed this video - not because the explanations weren't clear. I just wanted to hear the song again haha
this is probably how education is gonna be in the future, thanks a lot!
thank you
so well explained!! way better than my ML prof :D thx, good examples, good vid
Thanks!
Is the intro song inspired by Smelly Cat by Phoebe Buffay? :P
One of the best videos I have come across so far
perfect
Hi, as per some textbooks, bias is given to the network. Is it given, or does it arise from the network?
Keep making videos like this… ❤
StatQuest using the iMessage color scheme to keep our attention.... MVP🏆
BAM!
GTAT - GREATEST TEACHER OF ALL TIME
I always love your intro music!
You just simplify everything...great work...love from India❤
Thanks!
This was great!
So for a model to overfit, does it have to be non-linear? I mean, can't a linear model suffer from the overfitting problem?