Tutorial 34- LSTM Recurrent Neural Network In Depth Intuition

Krish Naik

4 years ago

206,259 views

Comments:

Parthiv Shah - 30.08.2023 18:42

Thank you, sir, for such videos. Please arrange them in a playlist or on your website so they are easy to access. Thank you so much.

HM - 27.08.2023 16:51

confusing

Srirama Yeshwanth - 11.08.2023 18:18

Sir, why are we applying the sigmoid function again in the input gate when we have already done so in the forget gate? What is the necessity of calculating i(t) again? Isn't f(t) = i(t)?
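As a quick check on this question (my own sketch, not from the video): the forget and input gates share the sigmoid activation and receive the same input, but each gate has its own independently learned weights, so their outputs generally differ. A minimal NumPy sketch with made-up dimensions and random parameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_hidden, n_input = 4, 3

# Each gate has its own independently learned weights and biases
# (W_f, b_f, W_i, b_i are illustrative names, not from the video).
W_f = rng.standard_normal((n_hidden, n_hidden + n_input))
b_f = rng.standard_normal(n_hidden)
W_i = rng.standard_normal((n_hidden, n_hidden + n_input))
b_i = rng.standard_normal(n_hidden)

h_prev = rng.standard_normal(n_hidden)  # previous hidden state h_{t-1}
x_t = rng.standard_normal(n_input)      # current input x_t
z = np.concatenate([h_prev, x_t])       # concatenated gate input

f_t = sigmoid(W_f @ z + b_f)  # forget gate: how much of C_{t-1} to keep
i_t = sigmoid(W_i @ z + b_i)  # input gate: how much new candidate to admit

# Same activation, same input, different weights -> different outputs.
print(f_t, i_t)
```

So f(t) and i(t) are not the same: one learns what to erase from the old cell state, the other learns how much of the new candidate to let in.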

Moayyad arz - 21.01.2023 14:22

Hi, thanks for your wonderful explanation.
In my opinion, this detailed video is more useful for researchers than for programmers who just want to use LSTM or RNN.

jain vinith - 21.12.2022 14:15

Nice lecture, sir. Please try to solve one numerical example manually for at least one epoch. It would help us understand LSTM in depth. Thank you.

RENGARAJ IT - 14.12.2022 05:28

Excellent sir

Seshan Sesha - 06.12.2022 23:16

Excellent..

Atrey Anagal - 19.11.2022 22:38

Finest explanation of such a difficult topic, hats off!! 🫡

Sagar Parab - 31.10.2022 15:37

You disappointed us

Sergey B - 28.10.2022 13:31

Pasha Tekhnik is in really bad shape: he's become an Indian and taken up neural networks

user add - 26.06.2022 21:59

Good copy of Rohan's research. I see most of the material is from TSAI.

Murugan Veera - 13.05.2022 08:59

from which book you are teaching krish

Indrashis Powali - 31.03.2022 08:44

nice! simple explanations.... much appreciable Sir

Tingu Tech - 16.03.2022 13:21

Sir, I fell in love with your teaching. I was trying to understand NLP for the first time, because I chose it as my final-year research topic, and your videos help me a lot. Love you so much, sir.

Sujatha OnTheWeb - 16.01.2022 14:41

Please go back to your whiteboard. You're amazing with whiteboard and marker!

ThePresistence - 07.01.2022 13:23

Wonderful Explanation!

Deep R. Ode - 28.12.2021 18:11

Sigmoid doesn't inherently convert real values to binary labels (0 or 1); instead it produces real values in the range 0 to 1. The vectors at the output of the gates need NOT be something like [0, 0, 1, 1]; they can be, and most probably will be, something like [0.122, 0.23, 0, 0.983].
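This point is easy to verify numerically. A small sketch (my own illustration, not from the video) feeding arbitrary pre-activations through the sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Gate pre-activations can be any real numbers.
pre_act = np.array([-6.0, -1.5, 0.0, 0.8, 6.0])
gate = sigmoid(pre_act)

# The outputs are not binary: they fall strictly between 0 and 1,
# only saturating toward 0 or 1 for large-magnitude inputs.
print(gate)
```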

Lucky Girl - 21.12.2021 01:23

Thanks sir 🌸🙏

Vishak Arudhra - 10.12.2021 21:04

So is it fair to say that the forget gate decides where the new word fits in the context (hence the "forgetting" in the context), and the input gate decides how the new word changes the context, thereby altering the influence of the new word on the context?

Nipun Dahra - 01.12.2021 23:02

amazing explanation sir..many thanks

jagadeesh mandala - 12.11.2021 09:44

Too much advertisements😒😔

Green Forest Kaziranga - 12.10.2021 10:33

Sir, is it possible to classify images and move them into folders automatically? I'm a data operator at Kaziranga National Park. We have many photos from trap cameras, and manual segregation is very hard. Please help.

vc Jayan - 16.09.2021 14:06

I was really struggling to understand the core concept of LSTM. This really helped me. Thank you very much. Also, the blog is really awesome.

Saikiran Ayyagari - 05.09.2021 12:57

What happens to the negative values of the tanh and sigmoid element-wise product when the information is added to the cell state in LSTM?
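One way to see this (a toy example with made-up numbers, not from the video): the sigmoid gate is always positive, but the tanh candidate can be negative, so the element-wise product can be negative and simply lowers the corresponding cell-state entry when added:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

C_prev = np.array([0.5, -0.2, 1.0])          # previous cell state (made up)
f_t = np.array([0.9, 0.9, 0.9])              # forget gate output (made up)
i_t = sigmoid(np.array([2.0, 2.0, 2.0]))     # input gate, in (0, 1)
c_hat = np.tanh(np.array([-3.0, 0.0, 3.0]))  # candidate values, in (-1, 1)

# A negative candidate times a positive gate stays negative, so the
# update can decrease a cell-state entry; nothing is clipped or discarded.
C_t = f_t * C_prev + i_t * c_hat
print(C_t)
```

In other words, negative products are how the cell state gets pushed down; they are kept, not removed.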

fatma mamdouh - 31.08.2021 00:09

the best explanation as usual,, thank you so much for your effort.

srikant hiremath - 27.08.2021 11:47

Does LSTM accept input of variable size, or is padding required to make all inputs the same size?
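For what it's worth (my understanding, not stated in the video): a single LSTM cell is applied one time step at a time, so any one sequence can have any length, but batching sequences together usually requires padding them to a common length, typically with masking so the padded steps are ignored. A minimal pre-padding sketch in plain NumPy:

```python
import numpy as np

# Token-id sequences of different lengths (illustrative values).
seqs = [[5, 7, 2], [9, 1], [4, 4, 4, 4]]

max_len = max(len(s) for s in seqs)
padded = np.zeros((len(seqs), max_len), dtype=int)
for row, s in enumerate(seqs):
    # Pre-pad: zeros on the left, the real tokens flush right.
    padded[row, max_len - len(s):] = s
print(padded)
```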

Priya M - 12.08.2021 17:45

Thank you, sir! It's great content and I'm almost following your NLP playlist.

Mambo mambo - 01.08.2021 16:02

Me watching other YT videos: Watch then like/dislike/do nothing
Me watching Krish sir's videos: First like then watch

Thank you so much for explaining so many things. I learnt complete practical ML/DL from your videos. A big thumbs up from my side. Definitely, I will share your channel to anyone who would want to dive into ML/DL/DS.

Abhijit Bhandari - 19.07.2021 19:05

A small confusion about C(t-1): how does C(t-1) differ from h(t-1), if both come from the previous time step?
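A toy illustration of the difference (made-up numbers, not from the video): C(t) is the internal, unbounded cell memory, while h(t) is the exposed output, an output-gated tanh squashing of C(t), so the two vectors generally differ:

```python
import numpy as np

C_t = np.array([2.0, -1.5, 0.3])  # cell state: unbounded internal memory
o_t = np.array([0.9, 0.1, 0.5])   # output gate values (made up)

# Hidden state h_t = o_t * tanh(C_t): a gated, squashed view of the
# cell state, so h_t is bounded and generally differs from C_t.
h_t = o_t * np.tanh(C_t)
print(h_t)
```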

Nidhi Chakravarty - 18.07.2021 13:04

Can you please make a video on how to combine two deep learning model which are trained on different dataset

code pathsala - 22.06.2021 17:06

This is the best explanation on LSTM.. really thanks

Reynold Barboza - 20.06.2021 09:08

Is the video on the different types of LSTM skipped?

Prem Chand Kamarapu - 17.06.2021 06:28

I have recently been thinking about Data Science and Machine Learning, and Krishna Naik's videos were very helpful in framing my decision. Thank you, Krishna Naik.

DEV MAHARAJ - 29.05.2021 12:48

Day of Recording of this video is the day when the LOCKDOWN started !!!!!!!!

VARUN PARUCHURI - 25.05.2021 08:53

@krish naik wonderful explanation

louer le seigneur - 22.05.2021 11:51

Thanks Krish

sowmya kavali - 20.05.2021 20:41

Hi Krish,
In LSTM, don't we have backpropagation and weight updates? If yes, how?

Akash Thing - 07.05.2021 17:09

Amazing explanation, you made it very simple and clear

Shahrin Nakkhatra - 01.05.2021 01:23

Hi, I actually don't understand why we need to do the sigmoid part twice: once for the input gate and once for the forget gate. Isn't it doing the same thing?

Ved Deo - 29.03.2021 23:27

hello Krish can you explain Conv-LSTM with one sample data and difference with LSTM and time distributed concept of LSTM?

Rohullah - 24.03.2021 19:45

Mar-24-2021

Iram Arshad - 27.02.2021 17:24

Can you please make a video on GAN as well?

Aqib Fayyaz - 17.02.2021 12:15

Great

kapil bisht - 06.02.2021 16:01

How can we do extractive summarisation with BERT?
