Tutorial 37: Entropy In Decision Tree Intuition

Krish Naik

4 years ago

278,896 views

Comments:

@user-uy5ls9eq7l - 08.12.2023 19:48

This is one of the best explanations, thank you so much sir.

@beastjaguar7196 - 18.05.2023 19:23

Thanks a lot 🙏😊

@vinayakrao4754 - 04.05.2023 15:53

What do you mean by feature?

@ambresh009 - 13.04.2023 12:44

@krishNaik, I like your videos very much, as they are quick reference guides for quickly understanding something needed for interview prep or for any project.

Just noticed here that you mentioned entropy is a measure of purity. But it is a measure of impurity, which makes more sense: the higher the entropy, the more heterogeneous the variable.

@mehdicharife2335 - 15.03.2023 21:39

You don't explain the intuition though.

@yogendrashinde473 - 17.02.2023 09:02

Dear Krish Naik Sir,
Could you please recheck the calculation? As per my calculation, the entropy for the f2 node, where the split is 3|2, is 0.97 and not 0.78.
Kindly correct me if I am wrong.
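
A quick way to check this figure is to compute the entropy directly. A minimal Python sketch (not from the video; the 3|2 split is the one discussed above):

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

print(entropy([3, 2]))  # 0.9709... -> ~0.97, not 0.78
print(entropy([5, 0]))  # 0.0 -> a pure node has zero entropy
```

Indeed, -(3/5)·log2(3/5) - (2/5)·log2(2/5) ≈ 0.971, so 0.97 is the correct value for a 3|2 split.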

@ABINASHPANDA-be7ug - 20.01.2023 11:34

Hi, there might be a calculation mistake in the entropy part: it's not 0.78. Can you please mention that in a caption in the video or in the description, so that people don't get it wrong in the future? Great video!!

@memeddk - 25.09.2022 17:14

Not helpful.

@MrBank1717 - 11.08.2022 06:44

Awesome video.

@MuhammadAwais-hf7cg - 30.06.2022 05:48

Why is this entropy in bits? Computed normally it is about 0.97; also, how can I convert my entropy into bits?
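
Entropy is already in bits whenever the logarithm is taken base 2; with the natural log the unit is nats, and dividing by ln(2) converts nats to bits. A small sketch, assuming the 3|2 split discussed above:

```python
import math

p = [3/5, 2/5]  # class proportions for a 3|2 split

h_bits = -sum(pi * math.log2(pi) for pi in p)  # base-2 log -> bits
h_nats = -sum(pi * math.log(pi) for pi in p)   # natural log -> nats

print(h_bits)                # ~0.971 bits
print(h_nats / math.log(2))  # nats / ln(2) -> bits; ~0.971 again
```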

@skvali3810 - 22.06.2022 14:34

I have one question: at the root node, is the Gini or entropy high or low?

@lekhnathojha8537 - 14.06.2022 09:12

Your teaching is very easy to understand.

@ashishkumari762 - 26.04.2022 18:53

thank you

@ernestanonde3218 - 03.04.2022 00:39

I SAID I LOVE YOU

@deepalisharma1327 - 15.03.2022 09:31

Can we use the same feature for a multi-level split in the decision tree?
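
For CART-style trees on continuous features, the answer is yes: the same feature can be split again at a deeper level, each time with a different threshold. A sketch using scikit-learn's DecisionTreeClassifier, with toy data assumed purely for illustration:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[1], [2], [3], [4], [5], [6], [7], [8]]  # a single continuous feature
y = [0, 1, 1, 0, 0, 1, 1, 0]                  # labels that force repeated splits

tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
print(export_text(tree, feature_names=["f1"]))  # "f1" appears at several depths
```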

@yamika. - 03.03.2022 14:01

Thank you. We all need teachers like you, god bless you. You're a blessing for us college students who are struggling with offline colleges after the reopening.

@sohammukherjee837 - 17.01.2022 14:00

Hi Krish, can you please explain the process of calculating the probability of a class in a decision tree, and whether we can arrive at that probability from feature importance?
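
In scikit-learn at least, a fitted tree estimates class probabilities as the class fractions inside the leaf a sample lands in; feature importances measure how much each feature reduced impurity and are not class probabilities. A small sketch with assumed toy data:

```python
from sklearn.tree import DecisionTreeClassifier

X = [[0], [1], [2], [3], [4], [5]]
y = [0, 0, 1, 1, 1, 0]

tree = DecisionTreeClassifier(max_depth=1).fit(X, y)  # one split, two leaves
print(tree.predict_proba([[2.5]]))  # class fractions in that leaf: [[0.25 0.75]]
print(tree.feature_importances_)    # impurity-reduction scores, not probabilities
```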

@ankitac4994 - 21.12.2021 13:33

good explanation

@loganwalker454 - 28.10.2021 09:47

Krish, I love you so much, more than my girlfriend; zillions of likes from my side. You always make knotty problems so simple.

@vishaljhaveri7565 - 12.10.2021 16:00

Thank you, Krish sir.

@arunkumars3966 - 18.07.2021 13:28

How is it 0.79 bits when you compute it? Someone please explain.

@louerleseigneur4532 - 16.07.2021 01:52

Thanks Krish

@abdulkayumshaikh5411 - 11.07.2021 11:32

Explained in a great way... Thank you Krish.

@murumathi4307 - 05.07.2021 19:43

Entropy is a thermodynamics concept that measures energy; why is it used in machine learning?

@shwetadalal1549 - 27.05.2021 15:18

Nice explanation. But actually we don't use this formula while modelling; we just set the decision tree's criterion parameter to either entropy or gini. So when does this entropy formula really help?
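
True for day-to-day modelling; the formula mostly helps in understanding what the library computes at each node. In scikit-learn, the chosen criterion fills the fitted tree's per-node impurity array, which you can inspect. A sketch (iris used only as a stand-in dataset):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

tree_entropy = DecisionTreeClassifier(criterion="entropy").fit(X, y)
tree_gini = DecisionTreeClassifier(criterion="gini").fit(X, y)  # the default

print(tree_entropy.tree_.impurity[:3])  # per-node entropy values
print(tree_gini.tree_.impurity[:3])     # per-node Gini values
```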

@shubhamnehete8020 - 16.05.2021 10:13

Sir, here you didn't mention how f3 ends up in the right-side node and f2 in the left-side node. As you said, the attribute with less entropy is selected for the split; this is understood, but why is f2 on the left and f3 on the right?

@yashmehta8886 - 14.05.2021 11:22

Can you mathematically explain how you obtained entropy = 1 for a completely impure split (yes = 3, no = 3)?
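
For a 3|3 split both class probabilities are 1/2, and log2(1/2) = -1, so each term contributes one half (a worked check, not from the video):

```python
# H = -(1/2)*log2(1/2) - (1/2)*log2(1/2)
#   = -(1/2)*(-1)      - (1/2)*(-1)
#   = 1/2 + 1/2 = 1 bit
import math
print(-(0.5 * math.log2(0.5)) - (0.5 * math.log2(0.5)))  # 1.0
```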

@bhavikdudhrejiya4478 - 29.04.2021 11:37

Best channel for Data Science Beginners

@AK-ws2yw - 18.04.2021 14:13

In the entropy formula, what is the significance of log base 2? Why not a simple log with base 10?
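
The base only fixes the unit (base 2 gives bits, the information-theory convention); switching bases rescales every entropy by the same constant, so split comparisons come out identical. A quick sketch:

```python
import math

def entropy(counts, base):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total, base)
                for c in counts if c > 0)

for counts in ([3, 2], [3, 3], [6, 2]):
    h2, h10 = entropy(counts, 2), entropy(counts, 10)
    # the ratio is always the same constant: log2(10) ≈ 3.322
    print(counts, round(h2, 3), round(h10, 3), round(h2 / h10, 3))
```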

@AromonChannel - 02.04.2021 09:42

Definitely subscribed, and I'll tell my fellow programmers to watch and subscribe to your channel; you are the best explainer I've ever seen!

@sandupaegodage8163 - 29.03.2021 15:49

GOOD ONE

@ayberkctis - 26.03.2021 13:47

You clearly explain the mathematics of machine learning algorithms! Thank you for your effort.

@b.f.skinner4383 - 25.03.2021 03:36

Great introduction to the topic, thank you

@ahmarhussain8720 - 09.03.2021 20:29

excellent explanation man, thanks

@spurthishetty6834 - 08.03.2021 18:37

Hi Krish,
Have you explained how a decision tree works? I'm not finding it.

@alastairbowie - 07.03.2021 13:36

Nice explanation. Cheers =]

@sameerkhnl1 - 07.03.2021 09:10

Thank you for a great tutorial. The entropy value is actually 0.97 and not 0.78.

@lemonoji288 - 04.03.2021 14:56

Thank you, this was very helpful!

@amitmodi7882 - 27.02.2021 03:10

Super awesome!

@AbhishekRana-ye9uw - 22.02.2021 10:56

Very helpful sir, thank you, you are the best :)

@abhiramikc6883 - 02.01.2021 17:43

If we have very high-dimensional data, how do we apply a decision tree?
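
Trees cope with many features natively, since each split examines one feature at a time; one common lever is capping how many candidate features are examined per split (random forests push this idea further). A sketch with assumed synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=1000, random_state=0)

# max_features="sqrt": examine ~sqrt(1000) ≈ 32 random features per split
tree = DecisionTreeClassifier(max_features="sqrt", random_state=0).fit(X, y)
print(tree.get_depth(), tree.tree_.node_count)
```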

@shivadumnawar7741 - 27.12.2020 17:52

One of the great teachers in the machine learning field; you are my best teacher in ML. Thank you so much sir for spreading your knowledge.

@anandachatterjee2976 - 13.12.2020 17:04

I tried to make the purchase through the above link, but it's showing as unavailable now. Could you please tell me how to get your book? I really need it. I follow your channel frequently whenever I have trouble understanding any data science concept, and after watching your videos it gets cleared up, so please let me know how to purchase your book.

@vaddadisairahul2956 - 02.12.2020 11:31

In my opinion, calculating entropy is sufficient and we don't require information gain, since for information gain we simply subtract the attribute's weighted entropy from the entropy of the dataset, and the entropy of the dataset is always constant for a particular dataset.
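
That holds at any single node: the parent entropy is the same for every candidate split there, so minimizing the weighted child entropy selects the same split as maximizing information gain; the gain is still a convenient report of how much a split reduces entropy. A sketch with illustrative counts:

```python
import math

def entropy(counts):
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def information_gain(parent, children):
    """IG = parent entropy - weighted average entropy of the children."""
    total = sum(parent)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# A 9|5 parent split into 6|2 and 3|3 children (hypothetical numbers):
print(information_gain([9, 5], [[6, 2], [3, 3]]))  # ~0.048
```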

@subrahmanyamkv8168 - 07.11.2020 09:39

As the entropy of a pure node is zero, I think entropy is a measure of impurity: the lower the entropy, the purer the node.

@nirajchaudhari5974 - 24.10.2020 07:36

Please upload a video for the regression tree as well, and discuss it in a detailed manner.

@lucianoval903 - 23.10.2020 22:25

Your videos are very nice, but you really need to improve the quality of your microphone.
