Watching Neural Networks Learn


Emergent Garden

9 months ago

1,153,309 views



Comments:

Emergent Garden
Emergent Garden - 25.08.2023 00:20

Some notes:
- A lot of you have pointed out that (tanh(x)+1)/2 == sigmoid(2x). I didn't realize this, so the improvement I was seeing may have been a fluke; I'll have to test it more thoroughly. It is definitely true that UNnormalized tanh outperforms sigmoid.
- There are apparently lots of applications of the Fourier series in real-world neural nets; many have mentioned NeRF and Transformers.
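The identity in the first note is easy to check numerically. A minimal sketch (assuming NumPy) comparing the two activations over a range of inputs:

```python
import numpy as np

# Check the identity (tanh(x) + 1) / 2 == sigmoid(2x) numerically.
x = np.linspace(-5.0, 5.0, 101)
normalized_tanh = (np.tanh(x) + 1) / 2
sigmoid_2x = 1 / (1 + np.exp(-2 * x))

# The two agree to floating-point rounding, so "normalized tanh"
# is just a reparameterized sigmoid.
print(np.max(np.abs(normalized_tanh - sigmoid_2x)))
```

Algebraically: (tanh(x)+1)/2 = e^x / (e^x + e^-x) = 1 / (1 + e^-2x) = sigmoid(2x), which is why any improvement over sigmoid would have to come from the *unnormalized* tanh.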

Lucas Zampar
Lucas Zampar - 18.09.2023 18:56

One of the best videos on neural networks I have ever seen. Great work!

Uncle Code
Uncle Code - 18.09.2023 05:00

Great, this is a really good job! Well done. How did you make these videos of the model's learning iterations? I found it really helpful; I can use it for my students. Thanks.

Apratim Shaw
Apratim Shaw - 18.09.2023 02:53

Beautiful explanation of mathematical concepts. Thanks.

Ariel Castro
Ariel Castro - 17.09.2023 23:53

When I was in college I solved A LOT of these kinds of problems in Calculus and math sup, and I never knew what for, or what they really do in the world. Now you've shown me, thanks bro.

Clip Cast
Clip Cast - 17.09.2023 12:52

Sir, what software do you use to make animations? I'm guessing Manim.

Omri Ben Ami
Omri Ben Ami - 17.09.2023 08:49

As a musician, I can agree more than you think.

Evan Starr
Evan Starr - 16.09.2023 05:13

great video!

Dan Floyd Arnaiz
Dan Floyd Arnaiz - 15.09.2023 22:19

you just earned a subscribe! nice work!

good to think with
good to think with - 15.09.2023 01:40

Beautiful work!

dexterr
dexterr - 14.09.2023 19:51

I wish you good luck and strong whaler brawlers

Muhannad Obeidat
Muhannad Obeidat - 14.09.2023 09:06

Amazing style, please do videos on other topics such as LLMs and transformers

Muhannad Obeidat
Muhannad Obeidat - 14.09.2023 09:05

This video is amazing. The ideas, the animation, the examples, even the voice and narration style. Excellent in every detail.

Khalifa Tube
Khalifa Tube - 12.09.2023 04:04

maybe try PCA to alleviate the curse of dimensionality
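For context, PCA mitigates the curse of dimensionality by projecting inputs onto their top principal directions before training. A minimal sketch using NumPy's SVD (the toy data and the choice of two components are illustrative assumptions):

```python
import numpy as np

# Toy data: 200 samples in 10 dimensions, with most variance
# concentrated in the first two axes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 0] *= 10.0
X[:, 1] *= 5.0

# PCA via SVD of the centered data: rows of Vt are the principal
# directions, ordered by explained variance (singular values S).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:2].T   # keep only the top-2 components
```

The reduced data could then be fed to the network in place of the raw inputs, at the cost of discarding the low-variance directions.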

An Anattan
An Anattan - 12.09.2023 01:41

What software is being used to run these elements?

Thomas Kleppe
Thomas Kleppe - 11.09.2023 23:37

Very interesting. Can you, or anyone on here, give me tips on how to do these amazing manim animations? I've fiddled with it, but it all looks like garbage and I have no idea how to make them as amazing as these ones.

What are the 5 top tips?

Mr
Mr - 11.09.2023 17:43

❤️❤️❤️❤️✨🤐

James Jones
James Jones - 11.09.2023 13:15

Wait until humanity learns about phi-first fractal search for solution sets

rj
rj - 10.09.2023 19:34

THIS IS GREAT!
very detailed description and visualization of the inner workings of the models. Thanks for the effort...

Peter Dejean
Peter Dejean - 10.09.2023 17:31

Your baptist preacher got nothing on this math teacher. He is teaching like he was moved by the Holy Ghost!

hari
hari - 09.09.2023 03:57

you earned a sub for this one captain

cd5050
cd5050 - 09.09.2023 02:51

what a captivating and incredible visual journey. this sort of stuff needs to win awards

T Brx
T Brx - 09.09.2023 00:15

Amazing Video!! May I ask which software you use to create those nice visuals? Keep doing what you do!!

Wissam Kahi
Wissam Kahi - 08.09.2023 20:30

Very very well done.

New-legs Lt. Dan
New-legs Lt. Dan - 08.09.2023 19:11

I don't get the math but HOW PRETTY are the neon sign-looking visual representations!?

Nikola Temp
Nikola Temp - 08.09.2023 14:06

So, are all these approximations good enough that we can say a neural network predicted something correctly? And how can we rely on such results?

Antolin Jiao Alipio
Antolin Jiao Alipio - 08.09.2023 09:44

as always, visuals are the easiest way to learn. thumbs up!

carl johnson
carl johnson - 08.09.2023 07:33

the Mandelbrot... seen it so many times on mushrooms, maybe next time I'll analyze it

Zaur Samadov
Zaur Samadov - 07.09.2023 10:03

dude, I'll use "function" next time I want to swear

hasalinah stevenson
hasalinah stevenson - 07.09.2023 08:51

The tone, the soothing background music, the images: you made something so complicated so easy to digest. Great job. I know you are brilliant!

Beauty Reveal
Beauty Reveal - 06.09.2023 08:53

Wow, such a knowledgeable video. Sir, can you make a short video on Radial Basis Function Neural Networks? ❤

Gert-Jan Bark
Gert-Jan Bark - 05.09.2023 22:50

So for some situations one solution (for example, Fourier) is better than others, as you stated. Wouldn't it make sense to create a two-step process, where step 1 uses a neural net to predict/estimate which solution should be used, and a second-stage neural net then does the actual evaluation of the function? (So function A derives the type of neural net, and function B the actual function.)
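A rough sketch of this two-stage idea, with simple least-squares fits standing in for the two candidate networks (the bases, target, and selection rule are all hypothetical choices for illustration): stage 1 estimates which model family suits the data, stage 2 uses only the winner.

```python
import numpy as np

def fit_predict(basis, x_train, y_train, x_eval):
    """Least-squares fit in a fixed basis, standing in for a trained net."""
    coef, *_ = np.linalg.lstsq(basis(x_train), y_train, rcond=None)
    return basis(x_eval) @ coef

def poly_basis(x):                      # degree-5 polynomial features
    return np.vander(x, 6)

def fourier_basis(x):                   # low-frequency Fourier features
    cols = [np.sin(k * x) for k in (1, 2, 3)] + [np.cos(k * x) for k in (0, 1, 2)]
    return np.column_stack(cols)

x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(3 * x)                       # target function to approximate

# Stage 1 ("function A"): estimate which model family fits best.
errs = {name: np.mean((fit_predict(basis, x, y, x) - y) ** 2)
        for name, basis in [("poly", poly_basis), ("fourier", fourier_basis)]}
best = min(errs, key=errs.get)

# Stage 2 ("function B"): evaluate with the chosen family only.
y_hat = fit_predict(fourier_basis if best == "fourier" else poly_basis, x, y, x)
```

In a real system, stage 1 would be a classifier trained on features of the dataset rather than a direct error comparison, but the control flow is the same.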

MultiNeurons
MultiNeurons - 05.09.2023 17:20

Very good job! My appreciation

Edgar MMXXIII
Edgar MMXXIII - 05.09.2023 13:46

Annoying voice of Thomas

Young Entrepreneurs
Young Entrepreneurs - 05.09.2023 11:41

When a neural network video feels like watching an Oscar-winning documentary

13lacle
13lacle - 05.09.2023 10:55

I don't know if this will work, but from the lower-dimensional examples it looks like minimizing the surface area of the final network output function would help learn the underlying function with a method that would normally overfit. So you can get the benefits of fast convergence but avoid the overfitting. Also, maybe minimizing curvature changes as an alternative to surface area?

You still first minimize the distance between your data points and the function you are approximating (i.e. error). But once the change in that error is small, within some tolerance, then you start lowering the surface area of the output by comparison between updates.
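The surface-area idea can be made concrete as a penalty term: for a network f with 2D input, the area element of its graph is sqrt(1 + |∇f|²), so averaging that over a grid gives a regularizer that is minimized by flat functions. A sketch with finite differences (the model f here is a placeholder stand-in, not the video's network):

```python
import numpy as np

def f(xy):
    """Placeholder for a trained model mapping (x, y) -> scalar."""
    return np.sin(xy[..., 0]) * np.cos(xy[..., 1])

def surface_area_penalty(model, n=50, h=1e-3):
    """Mean surface-area element sqrt(1 + |grad f|^2) over a sample grid."""
    xs = np.linspace(-1.0, 1.0, n)
    grid = np.stack(np.meshgrid(xs, xs), axis=-1)            # (n, n, 2)
    dfdx = (model(grid + [h, 0]) - model(grid - [h, 0])) / (2 * h)
    dfdy = (model(grid + [0, h]) - model(grid - [0, h])) / (2 * h)
    return np.sqrt(1 + dfdx**2 + dfdy**2).mean()

penalty = surface_area_penalty(f)
# Added to the data-fit loss with a small weight, this would discourage
# wrinkly, overfit surfaces; a perfectly flat function scores exactly 1.
```

Switching on the penalty only after the fit error plateaus, as the comment suggests, would just make its loss weight a function of the error's rate of change.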

Liquo
Liquo - 04.09.2023 23:39

For an atheist: if so much computing power is not able to solve just a simple problem, compared with our day-to-day issues, how does our brain, consuming only a few calories of energy, solve much more complex problems within just the blink of an eye? It means some Power has created this complex biological neural network, and that Power is the only one known as The Creator of this universe and all in between; He is the only one, Allah SWT. Nothing can be created automatically; there must be some power behind the creation of even a tiny string, to date the known building block of all matter.

Cody Maverick
Cody Maverick - 04.09.2023 23:18

😂😂😂 not everything is a function

Sapien Space
Sapien Space - 04.09.2023 19:05

Excellent video. It makes me wonder if, aside from Fourier series, the Mandelbrot set itself could be used as a function approximator....

ApesTogetrStrong
ApesTogetrStrong - 04.09.2023 04:40

I'd love to see some of the code used in the video if possible

Starscreame
Starscreame - 04.09.2023 01:37

Great video, it made me feel smaller and dumber the longer I watched.

William Smith
William Smith - 03.09.2023 15:29

What software do you use to get the visualisation of the curve fitting?

Gaurav Kispotta
Gaurav Kispotta - 03.09.2023 13:44

This is the best video I have ever watched for understanding the basics of NNs and their underlying functions.

I77AGIC
I77AGIC - 03.09.2023 03:09

I have had the idea of approximating the mandelbrot set in the back of my mind for so long. That and training a GAN to generate images of the set.

Jason Okoro
Jason Okoro - 02.09.2023 21:50

Functions! Describe! THE WORLD!!! 🎉

刘一新
刘一新 - 02.09.2023 16:27

This content is exceptionally inspiring, especially the introduction of the Taylor series as a layer of a neural network. I was also quite amazed by the Fourier feature layer! I may adopt this approach in my research. Thanks!
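The Fourier feature layer mentioned here follows a standard recipe (popularized around NeRF): project inputs through random frequencies, then take sin and cos of the projections. A minimal sketch; the frequency scale and sizes are illustrative assumptions:

```python
import numpy as np

def fourier_features(x, B):
    """Map (N, d) inputs to (N, 2m) features via a frequency matrix B of shape (d, m)."""
    proj = 2 * np.pi * x @ B
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=-1)

rng = np.random.default_rng(0)
x = rng.uniform(size=(8, 2))              # e.g. 2D pixel coordinates in [0, 1)
B = rng.normal(scale=10.0, size=(2, 64))  # random frequencies; scale tunes detail
feats = fourier_features(x, B)            # shape (8, 128), fed to the MLP
```

The scale of B controls how high-frequency the learned function can be: too small and fine detail is lost, too large and the fit gets noisy.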
