Comments:
That is awesome
Can you help me? 🤦♀🚨🚨
TypeError: unhashable type: 'numpy.ndarray'
print([fib(n) for n in range(0, 10000)])
Something like that, with the cache enabled, lets Python avoid hitting the maximum recursion depth.
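On the TypeError above: a cached function stores its arguments as dictionary keys, so every argument must be hashable, and numpy arrays aren't. A minimal sketch of the usual workaround, assuming the data can be passed as a tuple instead (the function name here is hypothetical):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(values):
    # values must be hashable (e.g. a tuple), since it becomes a cache key
    return sum(values)

data = [1.0, 2.0, 3.0]        # stand-in for the numpy array from the error
print(total(tuple(data)))     # convert to a tuple before calling the cached function
```

Converting to `tuple(arr)` (or `arr.tobytes()` for large arrays) is a common pattern when you need to cache on array contents.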
Amazing thanks
Thank you, and is there any way you could make a video on Cython, Numba, or any other library that helps with speed-ups?
Caching data basically means you're decreasing the number of calculations you need to do by optimizing the process through which the calculations are done.
But here's what I'm confused about.
Why does caching make it THAT much faster? Here you're decreasing the number of calculations you need to do by half (or is it more than half), so why is it so significantly faster?
Found my answer: the decrease in calculations is significantly more than half.
The naive recursive Fibonacci implementation is super unoptimized.
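To make the thread above concrete: the reduction is far more than half, because the naive recursive Fibonacci makes exponentially many calls, while the cached version computes each value exactly once. A small sketch counting the calls (using lru_cache so it also runs before Python 3.9):

```python
from functools import lru_cache

calls = 0

def fib_naive(n):
    global calls
    calls += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

fib_naive(20)
print(calls)                            # 21891 calls without caching
print(fib_cached(20))                   # 6765
print(fib_cached.cache_info().misses)   # only 21 values actually computed
```

So for n = 20 the cache already cuts ~21,891 calls down to 21, and the gap grows exponentially with n.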
Why is the time.time() function useless for results under one second? After all, code execution happens in microseconds and milliseconds.
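On the timing question: time.time() reads the wall clock, whose resolution can be as coarse as ~16 ms on some platforms, so sub-millisecond runs often show as 0.0. time.perf_counter() is the high-resolution clock intended for benchmarking short intervals; a minimal sketch:

```python
import time

start = time.perf_counter()      # high-resolution timer, good for short intervals
total = sum(range(1_000_000))
elapsed = time.perf_counter() - start

print(f"sum = {total}, took {elapsed:.6f} s")
```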
Does caching have a negative side?
That's nice!!
🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥
Cool.....
Amazing video, just love it.
Cool video!
For me it isn't working. With cache and without cache it always shows the same time, I don't know why.
Me: "How do I do caching in Python... I bet there's a module for this."
NeuralNine: "yeah 'from functools import cache' "
Perfect video, it's rare that you get the exact answer you are looking for with a walk through right away. I love the future!
Awesome tutorial, thanks.
But can we see how the cache function works internally, and can we write a normal function that acts like the built-in cache function?
This cache is similar to the idea of dynamic programming: save part of the result for reuse.
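On the question of writing a normal function that acts like the built-in cache: a minimal hand-rolled memoization decorator can be sketched like this (the real functools.cache also handles keyword arguments and exposes statistics, so treat this only as an illustration):

```python
import functools

def my_cache(func):
    """Minimal memoization: store results in a dict keyed by the arguments."""
    store = {}

    @functools.wraps(func)
    def wrapper(*args):
        if args not in store:      # positional args must be hashable
            store[args] = func(*args)
        return store[args]
    return wrapper

@my_cache
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))   # finishes instantly instead of taking astronomically long
```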
Great
It was very helpful for me.
Hi bro, thanks for the tutorial, but I have a problem with this:
from functools import cache
when I run my code it gives me this error:
ImportError: cannot import name 'cache' from 'functools' (/usr/lib/python3.8/functools.py)
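On the ImportError above: functools.cache was only added in Python 3.9, so on 3.8 it genuinely isn't there. A common fallback (a sketch) is lru_cache with no size limit, which behaves the same way:

```python
try:
    from functools import cache            # Python 3.9+
except ImportError:
    from functools import lru_cache        # Python 3.8 and earlier
    cache = lru_cache(maxsize=None)        # cache is just an unbounded lru_cache

@cache
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))
```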
April first, lol
Thank you so much.
The explanation is so clear!!!
Helpful
This is awesome! Learned a lot about it!
Actually learned so much from this. Decorators finally make sense, and I found a way to speed up my Python programs.
Might've been useful to show how to implement caching like that instead of just importing it.
Where have I seen this video before...? You copied that from someone, right?
Thanks
My Python 3.9 functools doesn't have @cache anymore, I think; I get an error, whereas @lru_cache works.
Do a video about Cython
Nice video
When caching is used in calculations like this, the technique is often called 'memoization'. Also quite handy when writing a prime factor sieve. Love the channel. Long live human 2.0. Go SpaceX.
Loved your intro music!
The video as well. You're awesome and unique as always :)
I just found out that the cache decorator was added in Python 3.9, so those who are on a previous version have to use lru_cache instead.
Where can you learn about this caching from first principles?
Search: Emanuel Swedenborg
When the recommendation is faster than the notification
PyCharm? VS Code, please!
Hello bro, can you explain the pwntools Python library?
I had already tried to import cache from functools in Google Colab but it got an error, do you know why this happens? So instead I import Cache from cache-decorator.
You can increase the recursion limit using the sys module:
import sys
sys.setrecursionlimit(10**12)  # careful: recursing this deep can still crash the interpreter's C stack
Why PyCharm again?
OK, so if I have a PyInstaller .exe onefile, will cache speed that up too?
Awesome as always 👍😀
More of this content, please!!
Actual real-life useful things.
Thank you for the hard work.
But PyCharm is heavy
And now, should I cache every function?
When I'm working with classes and "@property", can I use "@cache" too???
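On the @property question: stacking @cache under @property requires the instance to be hashable and keeps entries alive in a module-level cache; functools ships cached_property (Python 3.8+) for exactly this use case. A minimal sketch with a hypothetical Circle class:

```python
from functools import cached_property

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property      # computed on first access, then stored on the instance
    def area(self):
        print("computing area...")
        return 3.14159 * self.radius ** 2

c = Circle(2)
print(c.area)   # prints "computing area..." then the value
print(c.area)   # second access hits the per-instance cache; no recomputation
```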
When I saw it, I just hoped that it wasn't an April Fools' joke.