Comments:
Thanks.
I'm in love with his way of teaching!
Anyone here wanting to know the murderer: it's the butler 🤣
Thanks! How can you apply an LSTM to time series?
awesome
Hi Martin, in your example of Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If a question about Martin comes up later, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?
Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up
❤❤
what if Jennifer is they?
Clear and concise explanation.
👍
Can we use an RNN for CTR prediction?
Good lecture! Thank you very much for the explanations.
saved my exam
Great video - do you write on glass and then transpose/flip the video?
Can someone please tell me what screen he wrote on? It is so cool! 🤩
Useful video. Thanks a lot
Fantastic explanation. Please keep making more.
Very useful, plain, and concise 😀
That was great!
Please tell me, how is it controlled what the model needs to remember in a sequence? How is that determined? Is that how the model decides what matters for the investigation?
Great video and nice visual effects!
How are you writing?
martin are you married or nah
Very good teaching, thank you!
Thank you, Martin and team. Great work.
Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus, we cluster it, and the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state, or as an extra input?
Martin, you are a wonderful teacher! Thank you very much for the explanations.
After the first 10 seconds of this video
me: Whoa, now I know the origins of the Knives Out movie 😂
thank you
Really good video, thank you!
Wow, very helpful
Many thanks, your lecture is very helpful. Could you please explain all the LSTM gates (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?
mirror writing 🙃🙃🙃🙃🙃
magnificent
Very useful and helpful. This is a hard topic to understand readily, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!
"always butler" is high bias :D
Are you using a mirror, or can you actually write backwards, sir? 🤐🤐
Sir, can you do a video with an RNN example using numerical values?
Thank you!
Thank you for the lecture
Thanks for the clear explanation.
Really good video. Thank you!
I see LSTM = low volume. This is the 3rd guide with almost muted sound.
So are we going to ignore the fact that he wrote everything backwards on a clear glass wall?
Great advancement in time. Glad to have a better understanding. Thank you folks
Super good lecture
Such an informative lecture, thank you so much
Very helpful lecture. Keep up the good work!
Great lecture, thank you.