Adam Optimization Algorithm (C2W2L08)

DeepLearningAI

6 years ago

231,764 views

Comments:

c w - 01.04.2023 15:50

It would be easier if you just typed instead of handwriting; I can't read it.

Piotr - 21.03.2023 21:19

-1: no explanation of why Adam works better than previous algorithms is provided.

DerAfroJack - 24.09.2022 16:44

Hey there, I know I am late to the party, but I have a pressing question that the rest of the internet has failed to answer so far.
I currently have to work with a model and network I didn't design, and my job is basically to find out what's wrong, so naturally I need to understand the lines of code used.
There was a line I haven't found any example for: optimizer = keras.optimizers.Adam(0.002, 0.5)
I am still studying, so I am not that well versed in Keras or anything AI really, but I want to know whether this second value refers to beta_1 or to some other value I am not noticing.
The documentation has me puzzled so far, so I hope there's someone here who can answer this.
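
For anyone else landing here with the same question: in keras.optimizers.Adam the positional parameters are learning_rate first and beta_1 second, so Adam(0.002, 0.5) sets the learning rate to 0.002 and beta_1 (the momentum decay rate from this video) to 0.5. In older Keras versions the first parameter was named lr rather than learning_rate. A minimal sketch making the call explicit with keyword arguments (the compile line is a hypothetical usage example):

    import keras

    # Equivalent to keras.optimizers.Adam(0.002, 0.5): the first two
    # positional parameters are learning_rate and beta_1; beta_2 and
    # epsilon keep their defaults (0.999 and 1e-7).
    optimizer = keras.optimizers.Adam(learning_rate=0.002, beta_1=0.5)

    # Hypothetical usage with some already-built model:
    # model.compile(optimizer=optimizer, loss="mse")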

Stipe Pavić - 25.04.2022 21:04

This man is a legend!!

ML Lo - 15.04.2022 10:38

The very best and most succinct explanation of ADAM I've ever seen. Things become crystal clear if one watches L06 to L08 in a row.

Sasha Kobrusev - 18.09.2021 22:18

What is t? I do not completely understand.
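
For what it's worth: t is the update-step counter, i.e. the number of mini-batch updates taken so far (t = 1 on the first step, then 2, 3, ...). It appears only in the bias-correction terms, which in LaTeX form read

    v_{dW}^{\text{corrected}} = \frac{v_{dW}}{1 - \beta_1^{t}}, \qquad
    S_{dW}^{\text{corrected}} = \frac{S_{dW}}{1 - \beta_2^{t}}

Since beta_1, beta_2 < 1, beta^t shrinks toward 0 as t grows, so the correction matters mainly during the first steps of training.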

Therealme - 17.08.2021 00:22

First task of significance is for me to figure out how to spell Andrew's last name; then I move on to the algorithm 🤓

Sammya Majumdar - 13.07.2021 14:07

Could anyone give me a list of the notations he mentions in the video, or direct me towards a video that has them explained? My main issue with understanding the concept in this video is the lack of explanation of the notation used.
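
In case it helps, here is the notation used in the video (it follows the course's earlier momentum and RMSprop videos; anything not named on screen is inferred from those):

- alpha: the learning rate.
- beta_1: decay rate of the momentum (first-moment) average; a common default is 0.9.
- beta_2: decay rate of the squared-gradient (second-moment) average; a common default is 0.999.
- epsilon: a small constant (around 10^-8) that prevents division by zero.
- t: the update-step (iteration) counter.
- dW, db: gradients of the cost with respect to the weights W and bias b on the current mini-batch.
- v_dW, v_db: exponentially weighted averages of the gradients (the momentum term).
- S_dW, S_db: exponentially weighted averages of the squared gradients (the RMSprop term).
- The "corrected" versions are these averages divided by (1 - beta^t), the bias correction.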

VeganPhilosopher - 11.05.2021 01:38

Ow my ears

Sahan Mendis - 28.04.2021 08:44

I only understood that his friend has nothing to do with Adam optimization!

ximing wen - 15.04.2021 21:47

What are s and v?
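
For reference, v is the running (exponentially weighted) average of the gradients, i.e. the momentum term, and S is the running average of the squared gradients, i.e. the RMSprop term. A sketch of the update in the video's notation, written for W (b is handled identically):

    v_{dW} = \beta_1 v_{dW} + (1 - \beta_1)\, dW
    S_{dW} = \beta_2 S_{dW} + (1 - \beta_2)\, (dW)^2
    W := W - \alpha \, \frac{v_{dW}^{\text{corrected}}}{\sqrt{S_{dW}^{\text{corrected}}} + \varepsilon}

where the "corrected" terms are the averages divided by (1 - beta^t), the bias correction.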

Sandipan Sarkar - 21.12.2020 04:41

Great explanation. Need to watch again.

Hudson - 26.10.2020 17:52

He explains this much better than the teaching assistant does.

pipilu - 22.07.2020 11:01

This video is closely related to the video "Bias Correction of Exponentially Weighted Averages". Please revisit that video if you feel this is too confusing.

Troglodyte - 17.05.2020 18:24

Eve Optimization Algorithm will come soon!

Prajwol Lamichhane - 15.05.2020 09:29

Roasting at the end! Hahaha

Ayush Sahu - 08.05.2020 13:18

You are my god.

Mostafa Nakhaei - 16.04.2020 23:53

Any time I want to implement ML from scratch, I watch all Andrew's videos from beginning to end! I don't know how to express my appreciation to this great man.

omid taghizadeh - 21.02.2020 14:25

You really don't think that a statement of the problem ADAM solves is relevant when you are introducing ADAM?

Shubham Agrawal - 25.11.2019 09:42

You are so sweet. Thank you Sir, for these awesome videos!

Jere Kabi - 01.07.2019 14:28

This nailed down the Adam paper. Thanks a lot.

Md Ashiqur Rahman - 28.03.2019 21:33

There is roasting at the end.

Krishna Kumar - 17.02.2019 19:16

God of lucid explanation <3

Douglas Kai Cong - 01.02.2019 18:37

I am confused to the maximum level. Can I buy more brain power like I buy more RAM?

E M - 12.12.2018 10:19

Haha showing Adam there was hilarious :>

Ahmad Ali - 10.12.2018 21:38

Please apply a low-pass filter to the audio of this video.

Philson - 20.08.2018 17:17

SGD vs ADADELTA? If I only had those 2 choices.

CubeLang - 10.08.2018 14:45

Yay

Jedi Q - 06.07.2018 23:55

Is that you, beatthebush?

dex lee - 22.04.2018 02:01

Why do this: m1/(1-beta1), m2/(1-beta2)? This operation scales them up 10 times and 1,000 times; what's the reason? Bias correction? What does it mean?
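
On the "zoom" question: the divisor is not the constant (1 - beta) but (1 - beta^t), so the scaling is roughly 10x and 1000x only on the very first step, exactly when v and s are biased toward their zero initialization. A minimal Python sketch, assuming a constant gradient of 1.0, showing that the corrected average is unbiased from step one:

    # Bias-correction demo: with v initialized to 0, the running average
    # v = beta * v + (1 - beta) * g underestimates the true gradient early
    # on; dividing by (1 - beta**t) compensates for the zero start.
    beta = 0.9
    g = 1.0  # pretend every mini-batch gradient is exactly 1.0
    v = 0.0
    for t in range(1, 6):
        v = beta * v + (1 - beta) * g
        v_corrected = v / (1 - beta ** t)
        print(t, round(v, 4), round(v_corrected, 4))
    # t=1 prints v = 0.1 (biased low) but v_corrected = 1.0 (the true value);
    # as t grows, (1 - beta**t) approaches 1 and the correction fades away.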

Abbass H. Gallagher - 02.03.2018 13:30

It would be appreciated if teachers would revisit their videos and replace torturous live digital pen notes with elegant text and diagrams. Chalk on a board is fine, but digital pens are painful to endure.

Igor Aherne - 29.12.2017 15:48

I don't understand why some people are hating. Yes, the prof missed a couple of symbols (once in a lifetime).
The truth of the matter is that without his or Geoffrey's videos to watch, we would be totally fucked ))

ADITYA SINGH - 27.12.2017 17:10

Why do we need the correction to momentum or RMS using the t term?
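
A sketch of the usual argument (it matches the "Bias Correction of Exponentially Weighted Averages" video referenced elsewhere in this thread): because v starts at 0, the early averages are biased toward 0. Unrolling the update with v_0 = 0 gives

    v_t = (1 - \beta) \sum_{i=1}^{t} \beta^{\,t-i} g_i

and since (1 - \beta)(1 + \beta + \dots + \beta^{t-1}) = 1 - \beta^t, gradients with roughly constant expectation give E[v_t] \approx (1 - \beta^t) E[g]. Dividing by (1 - \beta^t) removes exactly that startup bias, and because \beta^t \to 0 the correction becomes a no-op after the first steps.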

Ahmed Elsafy - 17.11.2017 23:41

Shouldn't db in S_db be squared?!
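
Yes: in the standard Adam (and RMSprop) second-moment update the gradient enters squared, element-wise:

    S_{db} = \beta_2 S_{db} + (1 - \beta_2)\,(db)^2

If the slide shows db unsquared there, it is most likely just a dropped exponent on the board.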

zzzzz - 31.10.2017 21:11

What does db mean here? The derivative with respect to the bias?

Infratech software services - 29.08.2017 22:17

How can we use it for facial recognition?
