Comments:
god bless you brother! this is a gem for me!!
Thank you for contributing so much, and sharing your knowledge
So happy I came across this playlist. I've always been interested in deep learning and neural networks ever since high school, but I was always too intimidated to start. Hopefully this will be a jumping-off point toward something bigger for me
Man, this series is going to be put in some kind of futuristic technology museum ❤
Keep it up
So happy you created this.
So are the network diagrams just "mind maps"/visual representations of how the neurons are connected, or are they actually part of the code/engineering of the network? That's the part I'm stuck on. It seems so fancy, but is it part of it?
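In general (not specific to this series), the diagram and the code are two views of the same thing: every arrow in a network diagram corresponds to one entry in a weight matrix, and every node to one bias/activation. A minimal sketch, with all numbers invented for illustration:

```python
import numpy as np

# A "3 -> 2" diagram (3 input nodes, 2 output nodes, an arrow between every
# pair) corresponds directly to one 2x3 weight matrix in code:
W = np.array([[0.2, -0.5, 0.1],    # arrows into output neuron 0
              [0.7,  0.3, -0.4]])  # arrows into output neuron 1
b = np.array([0.1, -0.2])          # one bias per output node

x = np.array([1.0, 2.0, 3.0])      # values sitting on the 3 input nodes
y = W @ x + b                      # values on the 2 output nodes
print(y)
```

So the picture is not just decoration: changing the diagram (adding a node or an arrow) changes the shapes of the arrays in the code.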
I am facing a similar issue!
NOTE THIS BEFORE STARTING... although excellent by every measure, the series abruptly stops unfinished. This is a wonderful guided tour out of Hobbiton, but you will have to hike the rest of the way to Mordor alone. The companion book has 666 pages; this video series carries you through to only page 136.
I found the simplicity of this exercise to be super empowering. This is awesome.
Hi, your neural networks series is great; however, I am curious about the forward prop math equation, especially the (∀) symbol. I have never seen it before. What does that symbol represent in combination with Σ? I looked everywhere and I couldn't find a description of ∀ with j = 1 and n1. Please can someone explain it to me? I would be grateful
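For anyone else stuck on this notation: ∀ is the universal quantifier, read "for all." Combined with Σ, the equation says the weighted sum is computed once for every neuron j from 1 to n1 (the size of the layer). A minimal sketch, with variable names and sizes assumed since only the symbols appear in the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n0, n1 = 4, 3                  # assumed sizes: n0 inputs, n1 neurons in layer 1
x = rng.normal(size=n0)        # input vector
W = rng.normal(size=(n1, n0))  # one row of weights per neuron
b = rng.normal(size=n1)        # one bias per neuron

# "z_j = Σ_i W[j, i] * x[i] + b[j],  ∀ j = 1..n1" means the equation holds
# for every j, i.e. the inner sum is evaluated once per output neuron:
z = np.array([sum(W[j, i] * x[i] for i in range(n0)) + b[j]
              for j in range(n1)])

# Vectorized, the same computation is a single matrix-vector product:
assert np.allclose(z, W @ x + b)
print(z.shape)  # (3,)
```

So the ∀ is not an operation itself; it just tells you the Σ-equation is repeated for each output neuron, which is why the vectorized form collapses it into one matrix product.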
Bro, you're the man...
Thank you for the video.
You always seem to get at the heart of my problem right off the start. Excellent content.
You look like Jesse Pinkman if he studied CS instead of cooking meth
NumPy
Code a neural network, he said; it would be easy, he said
Wow! About a quarter of the way in. Gotta say, excellent job!! First-time viewer, but you have gained a sub!
Super video👍 Thank you so much
Just ordered the hardcover book online, very excited to start my new journey!!
Really like your energy, great tutorial
I found Edward Snowden!
I'm sorry for the ignorance, but can a neural network be trained for any activity, or is it built specifically for a task, say image or text?
7 minutes in, I still don't know how to build a neural network, which is why I clicked on the vid
Easy goal for me by the end: a neural network written in brainf*ck... jkjkjk! Great video so far
What spec is the system you use for deep learning, AI, machine learning, and neural networks? What CPU, GPU, memory, and SSD do you use?
Creator 360 touch
Thanks so much for the clear explanation.
My 6th time watching this series; I just noticed you have an actual flamethrower in the background. Not the best model imo, but still, America 100.
To somebody who is trying to understand the depths of NN, this is a great series. But be aware that as of March 2023, it is an incomplete video series. The book is complete, though.
Have you ever thought to revisit this, or is it still relevant? :-)
Who else is following along in assembly?
Is your book finished or still in draft form? I cannot tell from your website or from your Kickstarter page what status it's in. Thanks
Wow
Bro, you sound and look like Snowden :0)
My bias command won't work in PyCharm
It is very interesting indeed.
# Define a hexagram as a list of six integers (1 for solid lines, 0 for broken lines)
hexagram = [1, 0, 0, 1, 1, 0]

# Convert the hexagram to a binary string
binary_string = ''.join(str(i) for i in hexagram)

# Print the binary string
print(binary_string)

# Minimal stand-in Memory and Cache classes so the sketch below runs;
# the original comment did not define them
class Memory:
    def __init__(self, size):
        self.data = [0] * size

class Cache:
    def __init__(self, size, line_size, memory):
        self.memory = memory

    def read(self, address):
        # Pass-through read; a real cache would check for hits first
        return self.memory.data[address]

class RISC_V_Processor:
    def __init__(self):
        self.registers = [0] * 32
        self.memory = Memory(65536)
        self.cache = Cache(65536, 64, self.memory)
        self.pc = 0

    def fetch(self):
        instruction = self.cache.read(self.pc)
        self.pc += 1
        return instruction

    def decode(self, instruction):
        # Reuses the module-level binary_string defined above
        hexagram = int(binary_string, 2)
        opcode = instruction & 0x3F
        return hexagram, opcode

    def execute(self, hexagram, opcode):
        if hexagram == 1:
            if opcode == 0:
                # load instruction
                pass
            elif opcode == 1:
                # store instruction
                pass
        elif hexagram == 2:
            if opcode == 0:
                # add instruction
                pass
            elif opcode == 1:
                # subtract instruction
                pass
        # ...

    def run(self):
        while True:
            instruction = self.fetch()
            hexagram, opcode = self.decode(instruction)
            self.execute(hexagram, opcode)
the ending was gold
I've been writing up my own neuron cluster class and I need to know something. Can the initial bias for each neuron be randomly generated like the weights? Should it be?
I'm trying to set things up so that even a machine learning algorithm for items with lots of inputs scales automatically; if I don't have to worry about procuring bias datasets along with the inputs, it'd be a lot easier.
Even outside of figuring this out, I'm probably not going about this in the most efficient way, but I'm hoping to learn the "best practices" just by doing. Even still, if I don't deviate from the tutorial, how can I say I made it myself?
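A common convention (general practice, not something specific to this series): biases are learned parameters like the weights, not data you need to procure, so there is no "bias dataset" to worry about. They are typically initialized to zero; small random values also work, since the random weights already break symmetry between neurons. A minimal sketch under those assumptions, with layer sizes invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 4, 3  # assumed layer sizes for illustration

# Weights: small random values so neurons start out different from each other
weights = rng.normal(0.0, 0.1, size=(n_inputs, n_neurons))

# Biases: zeros is the common default; a small rng.normal(...) would also work
biases = np.zeros((1, n_neurons))

x = rng.normal(size=(1, n_inputs))  # one sample with n_inputs features
output = x @ weights + biases
print(output.shape)  # (1, 3)
```

Both initialization schemes are computed from the layer sizes alone, so they scale automatically however many inputs the model has, which addresses the scaling concern directly.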
Thanks boss, you have broadened my knowledge
Wow, very cool
Great series, and I got the ebook once I went through the 9 videos. Just wondering: are there no videos after the first 9, except the animations referenced in the book?
Really a great explanation approach. Dear sir, can I get your email address?
What color scheme is used in this video? Is it Sublime Text?