Comments:
🌲 Decision trees are one of my favorite models because of their high interpretability! However, you can achieve much better performance by ensembling them in a Random Forest (at the expense of interpretability). Which do you prefer: Decision trees or Random Forests? 🤔
I found this video super clear and helpful, thank you so much!
Hi, thanks. Could you do a video on interpreting the `export_text` output in relation to the tree? Appreciated.
Whoa, thank you so much! I've been learning this over the past week. Thank you for the knowledge!
This is simply awesome! I hope you have a million subs soon.
While you can't use this on a RandomForestClassifier model directly, you can use this method with each item of the "estimators_" list of the RF model (code below). Caution: running this on an RF model instantiated with the defaults will output 100 trees with unlimited depth. May take a minute to render!
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import plot_tree
import matplotlib.pyplot as plt

rf = RandomForestClassifier()
rf.fit(X_train, y_train)
for dt in rf.estimators_:
    plt.figure()  # new figure per tree, so the plots don't overwrite each other
    plot_tree(dt, feature_names=features, class_names=classes)
Thanks Kevin
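To make the idea in the comment above runnable end to end, here is a minimal self-contained sketch. The `X_train`, `features`, and `classes` names above come from the video's own data, which isn't shown here, so the iris dataset stands in; capping `n_estimators` and `max_depth` keeps the rendered trees readable:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import plot_tree
import matplotlib.pyplot as plt

iris = load_iris()

# Small, shallow forest so each plotted tree fits on screen.
rf = RandomForestClassifier(n_estimators=3, max_depth=2, random_state=0)
rf.fit(iris.data, iris.target)

for i, dt in enumerate(rf.estimators_):
    plt.figure(figsize=(8, 5))  # one figure per tree
    plot_tree(dt, feature_names=iris.feature_names,
              class_names=list(iris.target_names), filled=True)
    plt.title(f"Tree {i}")
plt.show()
```

With the defaults (100 unpruned trees) the same loop works but produces far more, and far larger, figures.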
Thank you for uploading this video.
I've been struggling to visualize the results of a decision tree; this video helped me do it.
Can you please show the relationship between Gini impurity and entropy?
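On the question above: both measures quantify how mixed a node's class distribution is, and they behave similarly (zero for a pure node, maximal for an even split). A small sketch, not from the video, computing each from a vector of class proportions:

```python
import numpy as np

def gini(p):
    """Gini impurity: 1 - sum(p_i^2)."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def entropy(p):
    """Shannon entropy: -sum(p_i * log2(p_i)), treating 0*log(0) as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability classes to avoid log(0)
    return -np.sum(p * np.log2(p))

# Both peak at an even 50/50 split...
print(gini([0.5, 0.5]))     # 0.5
print(entropy([0.5, 0.5]))  # 1.0
# ...and both are zero for a pure node.
print(gini([1.0, 0.0]))
print(abs(entropy([1.0, 0.0])))
```

In practice Gini is slightly cheaper to compute (no logarithm) and the two usually pick very similar splits, which is why sklearn's default criterion is `"gini"`.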
Could you explain what data you are using to generate this tree? Also, if sex takes on 0 or 1, why is the split at <=0.5? Could you explain? Thanks.
Thanks a lot, Kevin.
Great.
Nice.
If only I could give more than one 👍