Relu Leaky Relu and Swish Activation Functions || Lesson 8 || Deep Learning || Learning Monkey ||

Learning Monkey

3 years ago

3,209 views
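The lesson covers the ReLU, Leaky ReLU, and Swish activation functions named in the title. As a minimal reference sketch (the leaky slope `a = 0.01` and Swish's `beta = 1.0` are common defaults assumed here, not values stated on this page):

```python
import math

def relu(z):
    # ReLU: max(0, z) -- zero for negative inputs, identity for positive
    return max(0.0, z)

def leaky_relu(z, a=0.01):
    # Leaky ReLU: identity for z > 0, small positive slope a for z <= 0
    return z if z > 0 else a * z

def swish(z, beta=1.0):
    # Swish: z * sigmoid(beta * z)
    return z / (1.0 + math.exp(-beta * z))
```

For example, `leaky_relu(-2.0)` gives `-0.02`, while `relu(-2.0)` gives `0.0`.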


Comments:

@nekomeyo - 07.01.2023 13:53

Omg, you are awesome. Please continue creating content. It helped me a lot. :*

@piyushpathak7311 - 16.01.2022 10:15

Sir, please upload videos on RNNs, GANs, autoencoders, and time series. Please 🙏 Sir

@mouleshm210 - 22.07.2021 07:35

Hi sir,
I just wanted to know: in the case of Leaky ReLU, when we take the derivative of the function with respect to z for z < 0, shouldn't the derivative be -0.01? If we take the 'a' value to be 0.01, won't there be a negative sign? Since the value of z is already negative, the slope should be negative too, right?

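On the derivative question above: for z < 0, Leaky ReLU is the line f(z) = a·z, so its slope is a itself (a positive 0.01, assuming the common a = 0.01), not -0.01. The function's *value* is negative there, but its rate of change is positive. A quick finite-difference check confirms this:

```python
def leaky_relu(z, a=0.01):
    # Leaky ReLU: z for z > 0, a*z for z <= 0 (a = 0.01 assumed)
    return z if z > 0 else a * z

# Central-difference estimate of the derivative at a negative point.
# For the linear piece a*z, this recovers the slope a, regardless of
# the sign of z itself.
z = -5.0
h = 1e-6
numeric_slope = (leaky_relu(z + h) - leaky_relu(z - h)) / (2 * h)
print(numeric_slope)  # ~0.01, positive
```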
@rohitk4483 - 24.05.2021 14:51

The way of teaching and the content in the video are very powerful.👌😎😎
