Comments:
Hey everybody! I hope you find this video helpful. I'll be releasing the multiprocessing video next week. I am currently packing up my temporary recording station and will be moving into my new place tomorrow, so I should be able to get it recorded, edited, and released next week for sure. We'll be covering how to use multiprocessing to process the high-resolution images we downloaded in this video. Hope you all get some good use out of these topics!
This dude is like the bible for programming tutorials.
But when I tried it out, it looked like threads are switching between each other even if there is no "dead time" in the threads. Are they switching between each other or are they actually happening in parallel (are they actually multiprocessing)?
import threading

def cpu_bound_task(game_id):
    i = 0
    while i < 10**7:
        i += 1
    print(f"Game {game_id} CPU-bound task - iteration {i}")

def main():
    threads = []
    for game_id in range(3):
        thread = threading.Thread(target=cpu_bound_task, args=(game_id,))
        threads.append(thread)
        thread.start()
    for thread in threads:
        thread.join()

if __name__ == "__main__":
    main()
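On this question: in CPython the Global Interpreter Lock (GIL) prevents two threads from executing Python bytecode at the same time, so CPU-bound threads like the ones above are interleaved, not run in parallel. A minimal timing sketch of that effect (exact numbers depend on your machine):

```python
import threading
import time

def cpu_bound_task(n):
    # Pure-Python counting loop; holds the GIL while it runs
    i = 0
    while i < n:
        i += 1

N = 2 * 10**6

# Run two tasks one after the other
start = time.perf_counter()
cpu_bound_task(N)
cpu_bound_task(N)
sequential = time.perf_counter() - start

# Run the same two tasks in two threads
start = time.perf_counter()
threads = [threading.Thread(target=cpu_bound_task, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
threaded = time.perf_counter() - start

# On CPython with the GIL, the threaded run is not ~2x faster;
# it usually takes about as long as (or longer than) the sequential run.
print(f"sequential: {sequential:.2f}s, threaded: {threaded:.2f}s")
```

For CPU-bound work like this, the `multiprocessing` module (covered in the follow-up video) is the tool that actually uses multiple cores.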
Thank you so much, Corey, for this fantastic tutorial on introducing threading in Python to beginners! Your clear explanations and step-by-step demonstrations have really helped me grasp this concept much better. Your tutorials are always a go-to resource for me, and I appreciate how you make even complex topics so accessible. Your contribution to the Python community is invaluable, catering to both newcomers like me and experienced developers. Keep up the fantastic work, Corey!
nice video!
omg.
I was using some of it, but not all of its potential.
Amazing.
still valid even after 3 yrs
Awesome tutorial
I think you meant asynchronously a couple of times
Can we use threads in APIs (created in FastAPI)? Does it cause database deadlocks?
Why do I get None between my thread and "finished" in the thread pool executor?
Thanks a lot!
And for those who the method didn't work for after using with concurrent.futures.ThreadPoolExecutor() as executor:
add max_workers (set to a number greater than or equal to the number in the range function), like this:
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
So threads in Python only use one CPU core of the underlying server?
Amazing video, thank you!!
Thank you
I'd be so screwed without your videos. The BEST explanations anywhere on the internet for learning Python.
You're a great teacher, I keep coming back to your lessons!
Thanks
When I see someone appreciate your video, I don't know why, I just feel so fine💝💝
Amazing! Tried something similar with multiprocess, but will test this one out. Thanks!
Corey I love you
You're awesome
I am blown away
For five years I was looking for a simple explanation that I could use in my code, to pull my DNA from SNPedia... OMG. So simple but very good. Thanks man.
At minute 24 I subscribed and liked it!!! OMG
Thanks for the wonderful video
Thanks man.
I don't know if this comment will be seen, but I'm having trouble running the concurrent.futures method with a for loop
Thank you so much! Highly recommended, you just express it in a simple way
Beautiful!
Can I perform multithreading from a dictionary? I can't seem to find the syntax :(
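One common answer to the dictionary question above: submit one task per key/value pair to a ThreadPoolExecutor and collect the results back into a dict. A minimal sketch, where process_item is a hypothetical work function standing in for whatever you do with each entry:

```python
import concurrent.futures

def process_item(key, value):
    # Hypothetical work function for illustration
    return f"{key}={value * 2}"

data = {"a": 1, "b": 2, "c": 3}

with concurrent.futures.ThreadPoolExecutor() as executor:
    # Submit one task per (key, value) pair, remembering which key each future belongs to
    futures = {executor.submit(process_item, k, v): k for k, v in data.items()}
    results = {futures[f]: f.result() for f in concurrent.futures.as_completed(futures)}

print(results)  # {'a': 'a=2', 'b': 'b=4', 'c': 'c=6'} (completion order may vary)
```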
Completely agree about the basic example tutorials, I often find them useless. Great video, thanks!
What would be a candidate for doing something with JSON, like extracting and storing, with multithreading?
I am curious about a best practice for dividing up a list of REST API calls so that chunks can be sent to different processes via multiprocessing first, then each process can use multithreading to request data and subsequently process the payloads individually. Basically a good way of using both techniques such that you leverage the strengths of multithreading within a multiprocess operation, which I know can be done.
There's just a bit of magic that thread/process pools do where they determine how to allocate workers for you, and I don't know how to use that magic to divide the list of REST calls optimally per process. I guess I could count the number of CPUs available on the machine, but then I kind of like how process pools do that for you and help make your operations safe without potentially maxing out all processors.
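One way to sketch the processes-of-threads idea from the comment above: chunk the URL list by CPU count, hand each chunk to a worker process, and let each process fan out over its chunk with threads (since the waiting in a network request releases the GIL anyway). The URLs and the fetch function here are placeholders, not a real API:

```python
import concurrent.futures
import os
import time

def fetch(url):
    # Placeholder standing in for a real HTTP request (e.g. requests.get(url))
    time.sleep(0.1)
    return f"fetched {url}"

def process_chunk(urls):
    # Inside each worker process, fan out over the chunk with threads
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
        return list(pool.map(fetch, urls))

def chunked(items, n_chunks):
    # Split items into n_chunks roughly equal slices
    size = -(-len(items) // n_chunks)  # ceiling division
    return [items[i:i + size] for i in range(0, len(items), size)]

if __name__ == "__main__":
    urls = [f"https://example.com/page/{i}" for i in range(20)]
    n_procs = min(os.cpu_count() or 1, 4)

    with concurrent.futures.ProcessPoolExecutor(max_workers=n_procs) as executor:
        results = [r
                   for chunk_result in executor.map(process_chunk, chunked(urls, n_procs))
                   for r in chunk_result]

    print(len(results))  # 20
```

Note the `if __name__ == "__main__":` guard, which ProcessPoolExecutor needs so worker processes can import the module safely; the worker functions must also be defined at module level so they can be pickled.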
Thank you for this powerful tutorial.
This video is a life-saver! I have a project due in 3 days that needs me to do a lot of fancy stuff in a multithreaded program, and I had no clue what multithreaded programs are! I'm in a much better position after watching this video. Thank you, Corey!
what a godsend tutorial
Thanks Corey. Why do you pass in the args as a list [1.5]?
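On the args question: threading.Thread accepts any sequence for args, and the sequence is unpacked into the target's positional arguments, so a one-element list [1.5] and a one-element tuple (1.5,) behave the same. A quick sketch using a do_something function like the one in the video:

```python
import threading
import time

def do_something(seconds):
    time.sleep(seconds)
    print(f"Done sleeping {seconds} second(s)")

# args is a sequence of positional arguments for the target;
# [1.5] and (1.5,) are equivalent ways to pass seconds=1.5
t1 = threading.Thread(target=do_something, args=[1.5])
t2 = threading.Thread(target=do_something, args=(1.5,))

t1.start()
t2.start()
t1.join()
t2.join()
```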
Thank you for this. The Thread Pool Executor is really simple to use.
Thanks for the detailed step-by-step explanation. How do we control the number of threads using a ThreadPoolExecutor when processing a list of, say, 40k URLs? For example, if I want to process 10 URLs at a time, is creating a smaller sublist with 10 items from the main 40k list an ideal solution?
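On the question above: you don't need to build sublists yourself. ThreadPoolExecutor(max_workers=10) already caps concurrency at 10, and executor.map feeds the remaining items to workers as they free up. A sketch with a placeholder fetch function and a small list standing in for the 40k URLs:

```python
import concurrent.futures
import time

def fetch(url):
    # Placeholder standing in for a real download (e.g. requests.get(url).content)
    time.sleep(0.05)
    return f"done {url}"

urls = [f"https://example.com/item/{i}" for i in range(100)]  # imagine 40k here

# max_workers=10 means at most 10 URLs are in flight at any moment;
# the executor automatically pulls the next URL as each worker finishes
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    results = list(executor.map(fetch, urls))

print(len(results))  # 100
```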
This is kinda interesting that you can run code in parallel within a for loop, THANKS COREY
Tried Skillshare to take on learning Python. There is a bunch of milquetoast videos with no real substance. These videos by Corey are outstanding. Much appreciation. I'm glad it was a free subscription.
Hey Corey, thanks for this. You are an awesome teacher. I will be watching your other videos too as needed, and Subscribed! :D
You're great. Your video on multithreading helped me to understand the difference between multithreading & multiprocessing. Great explanation!!❤❤