Comments:
BOSS BOSS, one of the best pedagogues
GREAT explanation in this video and all the others in the playlist.
Thank you, sir.
Congrats on your lovely tutorial. Is C++ used for deep learning, or is Python the top choice in industry for AI transformation?
Nice series of tutorials. Super easy and time-efficient explanations.
Sir, you explain the easy example, then take a different example in the coding without explaining the process. This number dataset is highly confusing; please use a simpler dataset.
Sir, I have one question. Before starting machine learning, you said to complete your NumPy and pandas tutorials. Should we still watch those videos now that they are 6 years old? Will it be okay to watch them before jumping into the machine learning tutorials?
Big fan of these tutorials.
Very well articulated. I searched the whole web and nobody explained these concepts in such a simple way, without any confusion! Thank you.
I want you to know that you are wonderful. I really enjoy watching your tutorials 💟
Love your tutorials ❤❤❤
Your videos are excellent. Your words and diagrams really help clarify the process. I have recommended your videos to my colleagues. Bravo 👍
Great explanation!
Sir, you are amazing.
Is there any activation function that does not suffer from the vanishing gradient problem?
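One common answer (not given in the thread itself) is that ReLU-style activations largely avoid saturation on the positive side, while the sigmoid's gradient shrinks toward zero at both extremes. A minimal NumPy sketch comparing the two derivatives, with illustrative input values chosen here:

```python
import numpy as np

def sigmoid_grad(x):
    # derivative of sigmoid: s(x) * (1 - s(x)); saturates toward 0 for large |x|
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    # derivative of ReLU: 1 for positive inputs, 0 otherwise (no saturation for x > 0)
    return (x > 0).astype(float)

x = np.array([-10.0, -1.0, 1.0, 10.0])
print(sigmoid_grad(x))  # shrinks toward 0 at both extremes
print(relu_grad(x))     # stays exactly 1 for every positive input
```

ReLU still has a zero gradient for negative inputs (the "dying ReLU" issue), which is what variants like Leaky ReLU address.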
Amazing teacher.
Can we get the Jupyter notebooks for the coding?
Does the shape of Leaky ReLU mean that false negatives would increase, because on the negative side it treats all values the same? E.g., assume age 35 for buying insurance is considered as 0; then the activation for a 20-year-old is the same as for a 34-year-old.
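For what it's worth, collapsing every negative input to the same value is the behavior of plain ReLU; Leaky ReLU keeps a small slope on the negative side, so negative inputs stay distinguishable. A minimal sketch using the comment's insurance-age scenario (the age-minus-35 encoding is an assumption made here purely for illustration):

```python
def relu(x):
    # plain ReLU: every negative input collapses to 0
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: negative inputs keep a small slope alpha, so they remain distinct
    return x if x > 0 else alpha * x

# hypothetical "age minus threshold" inputs from the comment's example
for age in (20, 34):
    z = age - 35  # negative for both ages
    print(age, relu(z), leaky_relu(z))
```

Under plain ReLU both ages map to 0, while Leaky ReLU gives different (small negative) activations for the 20-year-old and the 34-year-old.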
Hats off. I am a PhD student, and I have worked on NLP, ML, and text analytics. In the last semester of my PhD I am turning to deep learning for my postdoc research, and I needed background information on deep learning. In my last project I somehow managed to apply a simple deep learning classifier, but the instinct to understand the theoretical and technical background of deep learning was missing. I read a lot of articles and watched a lot of videos, but man, your videos on deep learning concepts have really satisfied that instinct so far. Hats off to you, bro. Thank you for your vision of education and these helpful tutorials.
Sir, is the softmax activation function only for multi-class classification? And is there any other activation function suited to time series analysis?
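Softmax is indeed typically applied at the output layer of a multi-class classifier, turning raw scores into probabilities that sum to 1. A minimal NumPy sketch (the logit values are arbitrary examples):

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability, then normalize exponentials to sum to 1
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # raw scores for 3 classes (example values)
probs = softmax(logits)
print(probs, probs.sum())  # probabilities that sum to 1
```

The class with the largest raw score also gets the largest probability, which is why the prediction is usually taken as the argmax of the softmax output.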
Good tutorial.
Excellent.
This was extremely important; it cleared all my doubts, and now I think I am able to solve problems myself. Thank you so much, god bless you.
Sir, you give a good conceptual grounding in deep learning. I am a beginner, and one of my friends referred me to your deep learning lectures; since I started them I have learned so much. Keep it up, thank you again, sir.
The learner must know the maths part in order to understand the problem statements, and your lecture helped me a lot. Thanks.
Very nice concept.
Great video! Thank you.
Brother, you are a savior, god bless you.
👏👏👏
Sir, how are your concepts so clear on every topic? Awesome, sir 🙏🙏
Dear sir, a request to make videos on Boltzmann machines.
Sir, what an explanation! It makes deep learning seem so easy to learn. Carry on your winning momentum; I hope you become one of the great teachers in data science 🔥🔥🔥
Awesome 🔥
This is brilliant indeed, Dhaval.
Wow, this channel has a lot of crucial content. The ReLU activation decreased my loss from 0.04 to 0.003, even with half of the training data!
op
Can you also upload the presentation slides to the GitHub link for quick occasional revision? Thank you.
Thank you.
Thank you so much, I have learned a lot.
Very nice videos, good work.
Hello, can you give me the presentation for the course, please? 👏👏👏👏👏👏👏👏👏
Amazing!
Thanks a lot, sir, for such an amazing tutorial.