Gradient Boost Part 1 (of 4): Regression Main Ideas

StatQuest with Josh Starmer

5 years ago

770,571 views

Comments:

@iurgnail - 03.12.2023 10:20

I wish my prof were half as good at explaining concepts.

@honza8939 - 04.11.2023 10:26

I'm surprised how easy it is. Thanks :)

@seolyeong - 02.11.2023 10:48

GOAT

@hhhhh-pb2ep - 21.10.2023 19:06

Bro! You deserve the greatest like of all time!

@deepshikhameghwal692 - 17.10.2023 09:13

How are we building the trees? Using the Gini impurity index?

@harisserdarevic4913 - 15.10.2023 05:46

I’ve been trying to apply gradient boosting regression to a problem I’m working on, and I was wondering what’s the best place to start when searching for the right hyperparameters (and which parameters are best to modify). I don’t really have good intuition about where to start looking besides the default settings.

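For what it's worth, a minimal sketch of one common starting point for that kind of search, assuming scikit-learn's GradientBoostingRegressor and a made-up dataset; the grid values below are illustrative assumptions, not recommendations from the video:

```python
# Illustrative hyperparameter search for gradient boosting regression.
# Assumes scikit-learn; the grid values are arbitrary starting points.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

param_grid = {
    "n_estimators": [100, 300, 1000],  # number of trees
    "learning_rate": [0.01, 0.1],      # shrinkage applied to each tree's contribution
    "max_depth": [2, 3, 4],            # keep the trees shallow (weak learners)
    "subsample": [0.8, 1.0],           # < 1 gives stochastic gradient boosting
}

search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid,
    scoring="neg_mean_squared_error",
    cv=5,
)
search.fit(X, y)
print(search.best_params_)
```

In practice, n_estimators and learning_rate interact the most: a smaller learning rate usually needs more trees.
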
@user-fi2vi9lo2c - 14.10.2023 10:33

Dear Josh, I have a question regarding this video. How do we build a tree to predict the residuals? Do we use SSR to do that? Is everything done the same way as for normal regression trees?

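A minimal sketch of the usual answer, assuming plain regression trees: splits are chosen by minimizing the sum of squared residuals (SSR) of whatever the tree is predicting (here, the residuals), not by Gini impurity, which is for classification. The helper below, best_split_by_ssr, is purely illustrative:

```python
import numpy as np

def best_split_by_ssr(x, residuals):
    """Pick the threshold on one feature (1-D NumPy array x) that minimizes
    the total SSR of the two resulting leaves (illustration only)."""
    best_thresh, best_ssr = None, np.inf
    for thresh in np.unique(x):
        left = residuals[x <= thresh]
        right = residuals[x > thresh]
        if len(left) == 0 or len(right) == 0:
            continue  # skip splits that leave one side empty
        ssr = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if ssr < best_ssr:
            best_thresh, best_ssr = thresh, ssr
    return best_thresh, best_ssr
```

Repeating this over every feature and recursing gives an ordinary regression tree; in gradient boosting it is simply fit to the current residuals instead of the raw target.
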
@ghexz7 - 11.10.2023 11:07

You are making my dream of becoming a data scientist come true. Thank you so much from the bottom of my heart.

@saurabhchoudhary4572 - 08.10.2023 11:02

You are like Master Oogway, giving life lessons in simple phrases.

@ziadadel2003 - 30.09.2023 00:49

baaam! you are the best

@cherubin7th - 30.09.2023 00:14

Thank you for actually explaining it, and not just "you would ask a bunch of doctors".

@user-kr1qg6oo6n - 07.09.2023 17:24

baaaammmm

@user-kz6xd8zx2z - 03.09.2023 19:06

Thanks! Helps me a lot.

@mirabirhossain1842 - 28.07.2023 09:51

So it's mostly like we don't want any variance at first; we want bias to explain things. So we start with a super-biased prediction (the average) and then carefully, slowly add variance to it with each tree.

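That intuition matches the usual update rule: start from the mean (all bias), then add a small, scaled correction from each tree. A minimal sketch assuming squared-error loss, scikit-learn's DecisionTreeRegressor as the weak learner, and an arbitrary learning rate of 0.1:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=2):
    """Illustrative gradient boosting for regression with squared-error loss."""
    y = np.asarray(y, dtype=float)
    f0 = float(np.mean(y))             # start with a single, highly biased prediction
    prediction = np.full(len(y), f0)
    trees = []
    for _ in range(n_trees):
        residuals = y - prediction     # pseudo-residuals for squared-error loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)         # each new tree predicts the current residuals
        prediction += learning_rate * tree.predict(X)  # take only a small step
        trees.append(tree)
    return f0, trees

def gradient_boost_predict(X, f0, trees, learning_rate=0.1):
    return f0 + learning_rate * sum(tree.predict(X) for tree in trees)
```
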
@aneesarom - 22.07.2023 08:39

How are the trees constructed here? Like, how are the root node and decision nodes selected?

@prathvikgs4406 - 08.07.2023 23:37

That was a really great explanation! Thank you.

@willymccarthy5248 - 22.06.2023 16:43

'Small Bam' has me dying lol

@harryliu1005 - 12.06.2023 11:18

Hi Josh, how do we decide how the first tree should be built? By the Gini index or some other measurement?

@priyanshujaiswal9563 - 23.05.2023 17:08

I hate this channel to my very core. The amount of discomfort that song in the start makes me feel is out of this world. Tatti khao :D

@inllac8832 - 22.05.2023 16:19

Thank you so much for the awesome video! But I have a question about how you build the tree. For example, why are (female + less than 1.6 m, male + color blue) grouped together, and (not male + less than 1.6 m, female + not color blue)? Will the combination of the features make a difference?

@redsapph1re - 29.04.2023 19:04

Thanks for all these amazing videos. Do you plan to do one on LightGBM in the future? It seems to be becoming one of the more popular gradient boosting techniques.

@spevo51 - 17.04.2023 02:09

This is a great video, but why would you use "Weight" as a variable name?! It makes it much more confusing than it has to be

@sampathkodi6052 - 13.04.2023 12:13

So in every step you get the tree by using decision-tree concepts, and this is just like forward stagewise methods?

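For reference, the forward stagewise additive form this corresponds to, with learning rate $\nu$ and $M$ trees (each tree is built with the usual regression-tree machinery, as in the video):

$$
F_M(x) = F_0(x) + \nu \sum_{m=1}^{M} h_m(x)
$$

where $F_0(x)$ is the initial average prediction and each $h_m$ is a regression tree fit to the residuals of $F_{m-1}$; earlier trees are frozen when the next one is added, which is exactly the forward stagewise idea.
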
@martinmaati5127 - 17.03.2023 11:20

How will we incorporate women who are taller than 1.6 m using this regression tree model?

@BogusArtem - 07.02.2023 18:54

Thanks!

@kevinramos9587 - 02.02.2023 08:20

TRIPLE BAM!

@lucaalbertazzi5963 - 20.01.2023 12:07

Thanks.

@adrianrianto1530 - 03.01.2023 18:26

Is the pseudo-residual also called the gradient?

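Yes, up to a sign: the pseudo-residual is the negative gradient of the loss with respect to the current prediction. For the squared-error loss used for regression (written with a factor of 1/2 for convenience):

$$
L\bigl(y_i, F(x_i)\bigr) = \tfrac{1}{2}\bigl(y_i - F(x_i)\bigr)^2,
\qquad
-\frac{\partial L}{\partial F(x_i)} = y_i - F(x_i)
$$

so fitting each tree to the residuals is the same as fitting it to the negative gradient; other loss functions just swap in a different gradient.
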
@tonyzong2880 - 03.01.2023 02:34

It's not a good idea to use weight as a feature (especially as the response) in an example.

@aakashkarmakar7478 - 31.12.2022 09:42

Hi Josh, just a question: instead of taking the average in the first step, can we build one regression tree to predict the y values, after which the subsequent residuals can be predicted just like in your explanation?

@jamalnuman - 21.12.2022 22:30

Great. Much better than starting with the math.

@masoudhashemian5629 - 05.12.2022 16:32

perfect!

@sachinkapoor2424 - 04.12.2022 20:59

Hello @josh sir, my name is Sachin Kapoor from India. First I want to say a very, very grateful thank you. You are such a phenomenal teacher. You explain each and every topic from a very basic to an advanced level. Recently I bought your book. The best book ever.

I have a request: can you please provide me a PDF of gradient boosting or XGBoost? It's a humble request. I am ready to pay a nominal amount.

@sachinkapoor2424 - 04.12.2022 20:29

Hey @josh, how can I get the PDF of gradient boost and XGBoost?

@chongxj9263 - 04.11.2022 18:41

Thanks for your hard work producing all these incredible tutorials, Josh. I just can't imagine how I'd have learned these concepts otherwise without your videos!

@momoh6696 - 04.11.2022 16:50

getting me through my degree🥺❤

@RS-el7iu - 03.11.2022 09:22

Taking a course on machine learning from .... University (a very reputable one), and yet I'm here to really understand the concept behind it.
Thanks again, and love all the way from Beirut, Lebanon.

@yurobert3007 - 22.10.2022 00:24

I found boosting conceptually harder to grasp than bagging and random forests. This video explains each step with nice, clear graphics at just about the right pace! Thanks for the great effort you have put into making this brilliant work.

@mrcharm767 - 19.10.2022 06:58

Hi Josh, great video as usual. Don't you think the learning rate is very small? Could it have done a good job if it was 0.5 or something?

@prantikborthakur2053 - 07.10.2022 13:20

Disappointed...You did not sing "bip boop bipi boop booda booda bip bip boop" while populating the remaining residuals...

@malinkata1984 - 06.10.2022 11:18

A fun fact - you pronounce 'S' in exactly the same way as Ian Somerhalder in The Vampire Diaries. As I have said previously, your videos are awesome! Thank you so much for making the life of so many people easier.

@Josh-di2ig - 15.09.2022 18:50

Thanks for another amazing vid. Watching this video alone improves my understanding not only of the model but also of the hyperparameters, which makes me better at ML modelling. Lucky to have you in the community.
