Comments:
I wish my prof were half as good at explaining concepts.
I'm surprised how easy it is. Thanks :)
GOAT
Bro! You deserve the greatest like of all time!
How are we building the trees? Using the Gini impurity index?
I've been trying to apply gradient boosted regression to a problem I'm working on, and I was wondering what the best place is to start a search for the right hyperparameters (and which parameters are best to modify). I don't really have a good intuition about where to start looking besides the default settings.
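As a rough illustration of one common starting point for that search, here is a minimal sketch using scikit-learn's GradientBoostingRegressor and GridSearchCV; the dataset, grid values, and cross-validation settings below are illustrative assumptions, not recommendations from the video.

```python
# A minimal sketch of a first hyperparameter search for gradient boosted
# regression. The synthetic data and the grid values are placeholders.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)

# The knobs most people tune first: how strongly each tree corrects the
# residuals (learning_rate), how many trees (n_estimators), and tree depth.
param_grid = {
    "learning_rate": [0.01, 0.1, 0.3],
    "n_estimators": [100, 300],
    "max_depth": [2, 3, 4],
}

search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid,
    scoring="neg_mean_squared_error",
    cv=5,
)
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```

A common pattern is to fix a small learning rate, search over depth and number of trees, and only then refine the learning rate.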
Dear Josh, I have a question regarding this video. How do we build a tree to predict the residuals? Do we use SSR to do that? Is everything the same as for normal regression trees?
You are making my dream of becoming a data scientist come true. Thank you so much from the bottom of my heart.
You are like Master Oogway, giving life lessons in simple phrases.
baaam! You are the best.
Thank you for actually explaining it, and not just "you would ask a bunch of doctors".
baaaammmm
Thanks! Helps me a lot.
So it is mostly like we start with no variance and let bias explain things: we begin with a highly biased prediction (the average) and then slowly and carefully add variance with each tree.
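To make that intuition concrete, here is a minimal from-scratch sketch of the loop the video describes: start from the mean (pure bias), then repeatedly fit a small tree to the residuals and add a fraction of its predictions. The constants LEARNING_RATE and N_TREES and the synthetic data are illustrative assumptions.

```python
# Minimal gradient-boosting-style loop for squared-error regression.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)

LEARNING_RATE = 0.1
N_TREES = 100

prediction = np.full_like(y, y.mean())    # step 1: pure bias, no variance
trees = []
for _ in range(N_TREES):
    residuals = y - prediction            # pseudo-residuals for squared error
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)                # each small tree explains a bit more variance
    prediction += LEARNING_RATE * tree.predict(X)
    trees.append(tree)

print("final training MSE:", np.mean((y - prediction) ** 2))
```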
How are the trees constructed here, i.e., how are the root node and decision nodes selected?
That was a really great explanation! Thank you.
'Small Bam' has me dying lol
Hi Josh, how do we decide how the first tree should be built? By Gini index or some other measurement?
I hate this channel to my very core. The amount of discomfort that song at the start makes me feel is out of this world. Tatti khao :D
Thank you so much for the awesome video! But I have a question about how you build the tree. For example, why are (female + less than 1.6 m, male + color blue) grouped together, and (not male + less than 1.6 m, female + not color blue)? Will the combination of the features make a difference?
Thanks for all these amazing videos. Do you plan to do one on LightGBM in the future? It seems to be becoming one of the more popular gradient boosting techniques.
This is a great video, but why would you use "Weight" as a variable name?! It makes it much more confusing than it has to be.
So in every step you get the tree by using decision tree concepts, and this is just like forward stagewise methods?
How will we incorporate women who are taller than 1.6 m using this regression tree model?
Thanks!
TRIPLE BAM!
Thanks.
Is the pseudo-residual also called the gradient?
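For squared-error loss the two names do coincide: the pseudo-residual is defined as the negative gradient of the loss with respect to the current prediction, and a short derivation (assuming the usual 1/2 factor in the loss) shows it reduces to the ordinary residual:

$$
L\bigl(y, F(x)\bigr) = \tfrac{1}{2}\bigl(y - F(x)\bigr)^2,
\qquad
r = -\frac{\partial L}{\partial F(x)} = y - F(x).
$$

For other loss functions the pseudo-residual is still the negative gradient, but it is no longer a simple observed-minus-predicted difference.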
It's not a good idea to use weight as a feature (especially as the response) in an example.
Hi Josh, just a question: instead of taking the average in the first step, can we build one regression tree to predict the y values, after which the subsequent residuals can be predicted just like in your explanation?
Great. Much better than starting with the math.
Perfect!
Hello @josh sir, my name is Sachin Kapoor from India. First I want to say a very, very grateful thank you. You are such a phenomenal teacher. You explain each and every topic from very basic to advanced level. Recently I bought your book. The best book ever.
I have a request: can you please provide me a PDF on Gradient Boosting or XGBoost? It's a humble request. I am ready to pay a nominal amount.
Hey @josh, how can I get the PDF on Gradient Boost and XGBoost?
Thanks for your hard work producing all these incredible tutorials, Josh. I just can't imagine how I'd have learned these concepts without your videos!
Getting me through my degree🥺❤
Taking a course on machine learning from .... University (a very reputable one), and yet I'm here to really understand the concept behind it.
Thanks again, and love from all the way in Beirut, Lebanon.
I found boosting conceptually harder to grasp than bagging and random forests. This video explains each step with nice, clear graphics at just about the right pace! Thanks for the great effort you have put into making this brilliant work.
Hi Josh, great video as usual. Don't you think the learning rate is very small? Could it have done a good job if it was 0.5 or so?
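One way to get a feel for that trade-off is simply to train the same model with a small and a large learning rate and compare held-out error; the sketch below does this with scikit-learn's GradientBoostingRegressor on synthetic data (the dataset, the 0.1 vs 0.5 values, and the fixed number of trees are assumptions for illustration).

```python
# Minimal sketch comparing a small and a large learning rate.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=10, noise=15.0, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

for lr in (0.1, 0.5):
    model = GradientBoostingRegressor(learning_rate=lr, n_estimators=200, random_state=1)
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"learning_rate={lr}: test MSE={mse:.1f}")
```

In general a larger learning rate fits the training data faster but tends to overfit sooner, which is why small rates combined with many trees are the usual default.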
Disappointed... You did not sing "bip boop bipi boop booda booda bip bip boop" while populating the remaining residuals...
A fun fact: you pronounce 'S' in exactly the same way as Ian Somerhalder in The Vampire Diaries. As I have said previously, your videos are awesome! Thank you so much for making the life of so many people easier.
Thanks for another amazing vid. Watching this video alone improves my understanding not only of the model, but also of hyperparameters, which makes me better at ML modelling. Lucky to have you in the community.