Comments:
Some notes:
- A lot of you have pointed out that (tanh(x)+1)/2 == sigmoid(2x). I didn't realize this, so the improvement I was seeing may have been a fluke, I'll have to test it more thoroughly. It is definitely true that UNnormalized tanh outperforms sigmoid.
- There are apparently lots of applications of the Fourier series in real-world neural nets; many have mentioned NeRF and Transformers.
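The identity in the first note is easy to confirm numerically. Here's a quick standalone check (not code from the video) showing that (tanh(x)+1)/2 really is sigmoid(2x) for any x:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# tanh(x) = (e^x - e^-x) / (e^x + e^-x), so
# (tanh(x) + 1) / 2 = e^x / (e^x + e^-x) = 1 / (1 + e^-2x) = sigmoid(2x)
for x in [-5.0, -1.0, 0.0, 0.5, 3.0]:
    lhs = (math.tanh(x) + 1.0) / 2.0
    rhs = sigmoid(2.0 * x)
    assert abs(lhs - rhs) < 1e-12, (x, lhs, rhs)
print("identity holds")
```

So a normalized tanh is sigmoid with a rescaled input, which explains why any improvement from it alone would be a fluke.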
One of the best videos on neural networks I have ever seen. Great work!
Great, this is a really good job! Well done. How did you make these videos from the model's training iterations? I really found it helpful; I can use it for my students. Thanks
Beautiful explanation of mathematical concepts. Thanks.
When I was in college I solved A LOT of these kinds of problems in Calculus and math sup, and never knew what for or what they really did in the world. Now you've shown me, thanks bro.
Sir, what software do you use to make the animations? I guess Manim
As a musician I can agree more than you think
great video!
you just earned a subscriber! nice work!
Beautiful work!
I wish you good luck and strong whale-hunter brawlers
Amazing style, please do videos on other topics such as LLMs and transformers
This video is amazing. The ideas, the animation, the examples, even the voice and narration style. Excellent in every detail.
maybe try PCA to alleviate the curse of dimensionality
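For readers unfamiliar with the suggestion above: PCA projects high-dimensional inputs onto the directions of greatest variance before feeding them to a network. A minimal NumPy sketch (synthetic data; the dimensions and noise level are illustrative choices, not from the video):

```python
import numpy as np

rng = np.random.default_rng(0)
# 500 points lying near a 2-D plane embedded in 50 dimensions
latent = rng.normal(size=(500, 2))
basis = rng.normal(size=(2, 50))
X = latent @ basis + 0.01 * rng.normal(size=(500, 50))

# PCA via SVD of the centered data
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)

# the first two components capture almost all the variance,
# so the 50-D inputs can be fed to a network as 2-D codes
X_reduced = Xc @ Vt[:2].T
print(explained[:3], X_reduced.shape)
```

When the data truly lives on a low-dimensional manifold, this kind of preprocessing shrinks the input layer dramatically with negligible information loss.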
What software is being used to run these elements?
Very interesting. Can you, or anyone on here, give me tips on how to do these amazing Manim animations? I've fiddled with it, but it all looks like garbage and I have no idea how to make them as amazing as these ones.
What are the 5 top tips?
❤️❤️❤️❤️✨🤐
Wait until humanity learns about phi-first fractal search for solution sets
THIS IS GREAT!
very detailed description and visualization of the inner workings of the models. Thanks for the effort...
Your Baptist preacher has got nothing on this math teacher. He is teaching like he was moved by the Holy Ghost!
you earned a sub for this one captain
what a captivating and incredible visual journey. this sort of stuff needs to win awards
Amazing Video!! May I ask which software you use to create those nice visuals? Keep doing what you do!!
Very very well done.
I don't get the math but HOW PRETTY are the neon sign-looking visual representations!?
Then, are all these approximations good enough that we can say some neural network predicted something correctly? And how can we rely on such results?
as always, visuals are the easiest way to learn. thumbs up!
the Mandelbrot... seen it so many times on mushrooms, maybe next time I'll analyze it
dude, I'll use "function" next time I want to swear
The tone, the soothing background music, the images: you made something so complicated so easy to digest. Great job. I know you are brilliant!
Wow, such a knowledgeable video. Sir, can you make a short video on Radial Basis Function Neural Networks? ❤
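For anyone curious what the requested RBF network looks like: it's a hidden layer of fixed Gaussian bumps followed by a linear output layer. A minimal sketch (the centers, width, and target function are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) + 0.05 * rng.normal(size=x.size)  # noisy 1-D target

# hidden layer: Gaussian bumps at fixed, evenly spaced centers
centers = np.linspace(-3, 3, 15)
width = 0.5
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

# output layer: linear weights, fit in closed form by least squares
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
mse = np.mean((pred - y) ** 2)
print(f"train MSE: {mse:.5f}")
```

Because only the output weights are trained, the fit reduces to one least-squares solve; full RBF networks also learn the centers and widths by gradient descent.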
So for some situations one solution (like, for example, Fourier) is better than others, as you stated. Wouldn't it make sense to create a two-step process, where step 1 would be to use a neural net to predict/estimate which solution should be used, and then a second-stage neural net does the actual evaluation of the function? (So function A derives the type of neural net, and function B is the actual function?)
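The two-step idea above can be sketched end to end. Here a hand-rolled heuristic "router" stands in for the first-stage net (a real version would be a learned classifier), choosing between a Fourier basis and a polynomial basis before the second-stage fit; all function names and thresholds are illustrative:

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)

def design_fourier(x, n=4):
    # truncated Fourier series basis: 1, cos(kx), sin(kx)
    cols = [np.ones_like(x)]
    for k in range(1, n + 1):
        cols += [np.cos(k * x), np.sin(k * x)]
    return np.stack(cols, axis=1)

def design_poly(x, deg=8):
    # monomial basis: 1, x, ..., x^deg
    return np.stack([x**d for d in range(deg + 1)], axis=1)

def route(y):
    # stage 1 "router": a crude heuristic standing in for a learned
    # classifier -- if the endpoints nearly match, assume periodicity
    return "fourier" if abs(y[0] - y[-1]) < 0.1 * np.ptp(y) else "poly"

def fit(x, y):
    # stage 2: least-squares fit in whichever basis stage 1 picked
    A = design_fourier(x) if route(y) == "fourier" else design_poly(x)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

periodic = np.sin(3 * x) + 0.2 * np.cos(x)
smooth = 0.1 * (x - np.pi) ** 3
print(route(periodic), route(smooth))   # expect: fourier poly
```

This is essentially a tiny mixture-of-experts: the router's job is cheap, and each expert only has to be good on its own slice of problems.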
Very good job! My appreciation
Annoying voice of Thomas
When a neural network video feels like watching an Oscar-winning documentary
I don't know if this will work, but from the lower-dimensional examples it looks like minimizing the surface area of the final network output function would help learn the underlying function with a method that would normally overfit. So you can get the benefits of fast convergence but avoid the overfitting. Also, maybe minimize curvature changes as an alternative to surface area?
You still first minimize the distance between your data points and the function you are approximating (i.e. the error). But once the change in that error is small, within some tolerance, you start lowering the surface area of the output by comparing between updates.
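The two-phase scheme above can be tried in 1-D, where "surface area" becomes arc length. This sketch (my own toy setup, not from the video) fits one free value per grid point by gradient descent on the MSE, then switches on an arc-length penalty once the error plateaus:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 64
x = np.linspace(0.0, 1.0, n)
data = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=n)  # noisy samples

y = np.zeros(n)              # model: one free value per grid point
dx = x[1] - x[0]
lam, lr, tol = 0.002, 0.1, 1e-6   # illustrative hyperparameters
prev_err, phase2 = np.inf, False

for step in range(5000):
    err = np.mean((y - data) ** 2)
    grad = 2 * (y - data) / n           # d(MSE)/dy
    if phase2:
        # arc length L = sum_i sqrt(dx^2 + (y_{i+1}-y_i)^2); its gradient
        # pulls each point toward its neighbors, smoothing the curve
        dy = np.diff(y)
        seg = np.sqrt(dx**2 + dy**2)
        g = np.zeros(n)
        g[:-1] -= dy / seg
        g[1:] += dy / seg
        grad = grad + lam * g
    elif abs(prev_err - err) < tol:     # error plateaued: enable penalty
        phase2 = True
    prev_err = err
    y -= lr * grad

def arclen(v):
    return np.sum(np.sqrt(dx**2 + np.diff(v) ** 2))

print(f"arc length: data {arclen(data):.3f} -> fit {arclen(y):.3f}")
```

The fitted curve stays close to the data but ends up with a shorter arc length than the raw noisy samples, which is exactly the smoothing effect the comment is after.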
For an atheist: if so much computing power is unable to solve even a simple problem, compared with our day-to-day issues, how does our brain, consuming only a few calories of energy, solve far more complex problems within just the blink of an eye? It means some Power has created this complex biological neural network, and that Power is the only one known as the Creator of this universe and all in between; He is the only one, Allah SWT. Nothing can be created automatically; there must be some power behind the creation of even a tiny string, the smallest known building block of all matter.
😂😂😂 not everything is a function
Excellent video. It makes me wonder if, aside from Fourier series, the Mandelbrot set itself could be used as a function approximator....
I'd love to see some of the code used in the video if possible
Great video, it made me feel smaller and dumber the longer I watched.
What software do you use to get the visualisation of the curve fitting?
This is the best video I have ever watched for understanding the basics of NNs and their underlying functions.
I have had the idea of approximating the Mandelbrot set in the back of my mind for so long. That and training a GAN to generate images of the set.
Functions! Describe! THE WORLD!!! 🎉
This content is exceptionally inspiring, especially the introduction of the Taylor series as a layer of a neural network. I'm also quite amazed by the Fourier feature layer! I may adopt this approach in my research. Thanks!
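For readers who want to try the Fourier feature layer mentioned above, here is a minimal sketch of the idea: map the input through fixed sin/cos features and fit a linear readout on top (in practice the readout would be a full network; the frequencies and target here are illustrative choices, not the video's):

```python
import numpy as np

def fourier_feature_layer(x, freqs):
    # fixed first "layer": sin/cos of the input at the given frequencies
    proj = 2 * np.pi * x[:, None] * freqs[None, :]
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=1)

x = np.linspace(0, 1, 128, endpoint=False)
y = np.sin(6 * np.pi * x) + 0.5 * np.cos(2 * np.pi * x)  # periodic target

freqs = np.arange(1, 6, dtype=float)       # integer frequencies 1..5
Phi = fourier_feature_layer(x, freqs)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
mse = np.mean((Phi @ w - y) ** 2)
print(f"MSE: {mse:.2e}")
```

Because the target's two frequencies sit inside the feature set, the linear readout recovers it essentially exactly; this is the same trick that lets coordinate networks like NeRF represent high-frequency detail.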