MIT 6.S191 (2023): Convolutional Neural Networks

Alexander Amini

1 year ago

247,787 views

Comments:

Andrew's_Lab - 09.09.2023 08:53

This presentation is really well put together.

Electrical Engineering Education - 30.08.2023 18:18

Alexander Amini has splendid presentation skills

Neeraj Sharma - 18.08.2023 16:28

Hi, after the CNN is applied and the pixels are flattened, can we add a VAE with a GAN to model the same probability distribution as the flattened input array, or some alternative distribution, like CycleGAN's road-to-map translation?

Am I connecting this correctly, or should I watch the videos again?

Thank you for the videos

agustin villagra - 08.08.2023 16:56

Where do I find the labs for practice?

manu tube - 04.08.2023 12:03

Great Lecture. Thank you very much!

Yury Kalinin - 30.07.2023 22:37

Super 👍

Sukhdeep Singh - 09.07.2023 08:21

Love from India. I'm not able to study at MIT, but this series helps me a lot, and I hope it helps lots of other people too. If you could add lab lectures on how to build this practically, it would be a great honor.

Rajab Natshah - 29.06.2023 23:19

Thank you :)

Best News - 05.06.2023 00:40

Thanks sir for this wonderful explanation.

Anonymous User - 04.06.2023 21:34

Are NNs not always fully connected? I just assumed they were from the math, unless a given weight is zero.
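
(A minimal sketch to illustrate the question above, assuming TensorFlow/Keras and a 28x28 grayscale input; this is not code from the course labs. A dense layer connects every input to every unit, while a convolutional layer connects each unit only to a small patch and shares its weights across positions, so a CNN is deliberately not fully connected rather than a fully connected net whose weights happen to be zero.)

# Hedged sketch: comparing connectivity of a dense layer and a
# convolutional layer on the same 28x28x1 input (assumed shape).
import tensorflow as tf

inputs = tf.keras.Input(shape=(28, 28, 1))

# Fully connected: every input pixel connects to every one of the 32 units.
dense = tf.keras.layers.Dense(32)(tf.keras.layers.Flatten()(inputs))

# Convolutional: each unit sees only a 3x3 patch, and the 3x3 weights
# are shared across all spatial positions.
conv = tf.keras.layers.Conv2D(filters=32, kernel_size=3)(inputs)

dense_model = tf.keras.Model(inputs, dense)
conv_model = tf.keras.Model(inputs, conv)

print(dense_model.count_params())  # 28*28*32 + 32 = 25,120 weights
print(conv_model.count_params())   # 3*3*1*32 + 32 = 320 weights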

Axel Rose - 21.05.2023 03:10

This entire series on Deep Learning is a great pleasure to listen to and brainstorm about.

There are limitless possibilities for AI applications, and I'm highly inspired by some of them.

Bruno Wozniak - 16.05.2023 09:17

Wow, it's ridiculous: the further it goes, the better it gets. I love every single minute of this course. A huge thank you!

Salma Taha - 30.04.2023 01:10

Where can I find the TensorFlow labs to practice with?

Emanuel Thiago de Andrade da Silva - 19.04.2023 15:48

Hello, I'm giving a course at my university in Brazil about machine learning, and I would like to ask to use some of your slides and translate your material for the next lesson, which is about CNNs.

Muhammad Altaf - 14.04.2023 21:10

I am in awe. You have delivered these concepts so beautifully that I didn't need to look at other resources. I have recently switched to this field, and you happened to be my biggest motivator to pursue it further. Thank you.

kiran kumar - 12.04.2023 19:57

I get so excited about the use cases and various possibilities of using CNNs. Excellent presentation. A master class in simplifying a complex subject.

Doctor Shadow - 12.04.2023 02:27

Thank you to the author. Did anybody get from this video how all of this works under shifts, rotations, and scaling of the image?

Jobssss - 05.04.2023 07:21

Thank you very much for sharing!

mehdi smaeili - 04.04.2023 14:15

excellent.

Muhammad Abul Hassan - 01.04.2023 18:57

Now I have learned how the whole CNN works. Great explanation.

Ahsen Ali - 01.04.2023 12:37

The best tutorial of CNN on earth.

Raman Raguraman - 30.03.2023 17:16

Thank you Dr

Deepak Singh - 29.03.2023 14:27

Can we get access to the software labs for some hands-on learning? I know the code is available, but is there something else where we can learn from scratch?

Gondwana - 29.03.2023 03:35

Here's what I love about your lectures: You give the intuition and logic behind the architectures and this helps a lot as opposed to the stone tablet thrown down from the heavens approach. Not only is this important for learning but it also stimulates intuition for the next set of innovations!

Hoạ Mi - 26.03.2023 17:15

I was very impressed when I heard that the transformer model was created by a Vietnamese person

Hoạ Mi - 26.03.2023 17:13

I'm self-studying deep learning without going through any school, so I need people who share like you. Thank you very much!

Md. Sabbir Rahman Akash - 26.03.2023 16:54

Thank you for uploading this video ❤

Alexander Skusnov - 26.03.2023 06:37

Don't use a dark theme for the code: many characters are hard to see.

Shaida Muhammad - 26.03.2023 03:42

Hello Alexander,

Please make a dedicated video on "Reinforcement Learning with Human Feedback"

Công Chúa tóc mây - 25.03.2023 15:57

Convolution is a dot product, and weighted sums are dot products. Max pooling is a switching decision, and ReLU is (if you get your head straight) literally a switch with x > 0 as the switching decision. So all a neural network is is a collection of weighted sums / dot products that are connected to and disconnected from each other by switches, each governed by its own switching predicate. You should see that you are free to connect in cheap, fast weighted-sum algorithms like the FFT or WHT, make ReLU a 2-pole switch, and many other things. There is a Walsh-Hadamard transform booklet via archive.
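
(A minimal NumPy sketch of the idea above, not code from the lecture: ReLU acts as a switch gated by the predicate x > 0, and each output of a 1-D convolution is the dot product of the kernel with one patch of the input. The sliding dot product below is the cross-correlation form that CNNs actually compute; the helper names are made up for this example.)

# Hedged sketch: ReLU as a switch, convolution as a sliding dot product.
import numpy as np

def relu(x):
    # The "switch": pass the value through when x > 0, output 0 otherwise.
    return np.where(x > 0, x, 0.0)

def conv1d_valid(signal, kernel):
    # Each output element is the dot product of the kernel with one patch
    # of the signal (a "valid" 1-D sliding dot product).
    k = len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel)
                     for i in range(len(signal) - k + 1)])

x = np.array([1.0, -2.0, 3.0, 0.5, -1.0])
w = np.array([0.5, -1.0, 0.25])
print(relu(conv1d_valid(x, w)))  # [3.25 0.   0.75]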

Salvador Dali - 25.03.2023 02:23

Great lecture, but last week Ava said this year's CV lecture would be about Vision Transformers!

Nestor Hernandez - 24.03.2023 21:50

The videos, slides, and explanations keep getting better.

Aritra Roy - 24.03.2023 20:20

Wow!! Really awesome lecture, Alex sir. Nice explanation with perfect slides.

Akash Kumar Singh - 24.03.2023 18:31

It's unbelievable that you're doing this for free. Thanks a lot, sir. Your explanation is very clear and easy to follow. Thanks again, sir.

Jan Vendelin Hála - 24.03.2023 18:02

All man yone
2 videos blady chicken hahah

Margaux - 24.03.2023 17:57

We are so lucky to be alive at a time when we can attend these types of lectures for free <3

Falguni Das Shuvo - 24.03.2023 17:57

Awesome!

Rashmi Nagpal - 24.03.2023 17:51

Such a brilliant session! I am totally in awe of this course, and I loved the way Dr. Alex dissects the concepts in a simplified way!

Jan Vendelin Hála - 24.03.2023 17:48

I wonder if the two AIs can argue with each other, haha. And there could be a problem if you have 100 AIs. 😂

Abdullahi Abdislaan - 24.03.2023 17:40

Alex, I wanna ask: the last lecture was about sequences, and on the website there's a code lab related to that lecture. Can I walk through it, or are you going to assign it?

Jan Vendelin Hála - 24.03.2023 17:30

The raw data comes from the universe where we live.

Jan Vendelin Hála - 24.03.2023 17:28

The spark is what the drive is, but the drive changes according to the input. It is exhaustible, though; it is the human body that dies. That's your second problem, hahaha.
And now tell me whether the program that runs in a person is created by genes or by that spark. I would describe it as a synopsis.

Jan Vendelin Hála - 24.03.2023 17:25

You try to figure out how to program the human mind, but you can't until you are able to create that spark of consciousness, that divine particle that makes a brain a brain.

nizar nizo - 24.03.2023 17:21

The convolutional neural network is one of my passions, and with MIT it is an art.

Jan Vendelin Hála - 24.03.2023 17:19

If I were to describe to you how I perceive the world, you would turn the brown thing into a textile.
