Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM)

Brandon Rohrer

7 years ago

779,712 views

Comments:

Galvin Voltag - 02.11.2023 16:03

Thanks Brandon

Very cool!

Shridhar Yadav - 24.09.2023 11:14

That was an amazing explanation. Very clear with the help of images and symbols.

Ehan Cheung - 08.07.2023 13:29

Subtitles are auto-generated with lots of mistakes.

Joe_Stoney - 28.06.2023 11:53

Thanks a lot for this straight-to-the-point explanation. Really learnt a lot.

Philippe Muller - 12.06.2023 07:08

This is simply amazing and so clear.

Ashita - 23.11.2022 02:15

Watching in 2x speed, even then it makes perfect sense! 🙏

Shaymaa Khalifa - 26.10.2022 10:29

Amazing Explanation!!

goodmusic284 - 18.10.2022 19:46

The best intro to RNNs and LSTMs I have seen!

廖子庭 - 09.10.2022 16:21

I'm a student majoring in data science from Taiwan, and I want to say thank you; I understand LSTMs better after watching your video. Much appreciated, sir!

Vegard Nybakeri - 13.09.2022 15:26

Waffles for dinner? Yuck.

Besho Samir - 21.08.2022 08:17

Hi, I need some help here.
Why do we set the next hidden state equal to the long-term memory only after filtering it? Why isn't the next hidden state simply equal to the long-term memory (Ct) itself?

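For reference on the question above: in the standard LSTM cell, the hidden state is a gated, squashed view of the cell state rather than the cell state itself. A sketch of the usual equations, following the common convention (⊙ is element-wise multiplication, [h_{t-1}, x_t] is concatenation):

```latex
\begin{aligned}
f_t &= \sigma(W_f [h_{t-1}, x_t] + b_f) && \text{forget gate} \\
i_t &= \sigma(W_i [h_{t-1}, x_t] + b_i) && \text{input gate} \\
\tilde{C}_t &= \tanh(W_C [h_{t-1}, x_t] + b_C) && \text{candidate memory} \\
C_t &= f_t \odot C_{t-1} + i_t \odot \tilde{C}_t && \text{long-term memory (cell state)} \\
o_t &= \sigma(W_o [h_{t-1}, x_t] + b_o) && \text{output / selection gate} \\
h_t &= o_t \odot \tanh(C_t) && \text{hidden state passed to the next step}
\end{aligned}
```

One common way to see it: passing C_t through tanh keeps the hidden state bounded even when the cell state grows large, and the output gate lets the cell expose only the part of its memory that is useful at the current step, while the full, unfiltered memory still flows forward through C_t.
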
Arya Parvizi - 10.07.2022 09:16

subscribed! TYSSM

ocean - 28.05.2022 16:13

Exceptionally good, the best I've seen in this subject.

CarlSeconds - 14.05.2022 11:17

I don't usually place any comment like this, but this is extraordinary :) So easy to understand :) Thank you :)

Deepak - 03.05.2022 13:51

Superb! Thanks!

Japsowin Kaur - 23.04.2022 06:38

The best video on LSTM and RNN I've seen. Thank you so much!

RC Link - 21.03.2022 08:53

Are the neural network units in an LSTM trained independently or simultaneously through backpropagation? I am assuming the latter and the functions of each unit are simply learned through their placement. For example, the "selection" NN unit is just named that from its placement and we don't do anything special during the training phase to make it a "selection" unit.

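On the question above: the units are trained simultaneously. Every gate's weight matrix receives gradients from the same loss via backpropagation through time, and the "selection" unit behaves as a selector only because of where its weights sit in the cell's equations, not because of any gate-specific training step. A minimal sketch of that idea, assuming PyTorch (the sizes and the loss below are made up for illustration):

```python
# Minimal sketch (assuming PyTorch): all four gates of an LSTM cell share one
# backward pass; no gate is trained independently or given its own objective.
import torch
import torch.nn as nn

cell = nn.LSTMCell(input_size=8, hidden_size=16)  # weight_ih/weight_hh hold all four gates' weights
x = torch.randn(5, 4, 8)                          # (time steps, batch, features), made-up sizes
h = torch.zeros(4, 16)
c = torch.zeros(4, 16)

for t in range(x.size(0)):                        # unroll the cell through time
    h, c = cell(x[t], (h, c))

loss = h.pow(2).mean()                            # stand-in loss on the final hidden state
loss.backward()                                   # backpropagation through time

# A single backward call fills gradients for every gate's parameters at once;
# the "forget"/"input"/"selection" roles come from where each weight block sits
# in the cell equations, not from any gate-specific training procedure.
print(cell.weight_ih.grad.shape)                  # torch.Size([64, 8]) == (4 * hidden_size, input_size)
```
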
David Porter - 07.02.2022 04:49

you lost me at former and latter

vijaya kumar - 30.01.2022 09:30

Hey, thanks Brandon. Complexity put forth in layman's language; just loved it!!!
