Comments:
Awesome video as usual!
What about compared to threading?
Which is faster, and what's the difference?
Thanks for the video, bro. I looked through several other videos and none explained it as clearly as this one!
Thank you so much for your demonstration! Very clear and helpful. Can I ask why we need the line "if __name__ == '__main__':"? Thank you!
I had a problem with real-time OCR and QR detection. Using a single thread, saving the images and processing the OCR and QR froze the screen for about 0.3 seconds. So I used multiprocessing: two processes, one to capture images and one to process the QR/OCR, with one shared list to put and read numpy arrays (ROIs). It works better. Thank you very much.
Excellent. I'm trying to implement this along with a timeout that will kill processes that run longer than a set time. If anyone knows of a video that shows how to do that, please pass it on.
Awesome explanation!
How do I solve a broken process pool in multiprocessing?
Nice. One comment: you can use double the number of cores as processes, so with 16 cores you can run 32 processes.
Hi! Is there any way to use multiprocessing within a function?
Specifically, I'd like to know if it is possible to define a function sum_square_with_mp(numbers) such that, when I run the code, I can type sum_square_with_mp([1, 3, 5]) and get 1, 14 and 55.
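A minimal sketch of how that could work (sum_square here is my guess at the video's function, summing squares up to n). The one catch is that the worker function must live at module top level so child processes can pickle and import it; the Pool itself can be created inside a function just fine.

```python
from multiprocessing import Pool

def sum_square(n):
    # 1² + 2² + ... + n²  (illustrative version of the video's function)
    return sum(i * i for i in range(1, n + 1))

def sum_square_with_mp(numbers):
    # A Pool created inside a function works; only the worker
    # (sum_square) needs to be defined at module top level.
    with Pool() as p:
        return p.map(sum_square, numbers)

if __name__ == '__main__':
    print(sum_square_with_mp([1, 3, 5]))  # [1, 14, 55]
```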
Does this work on py3 or not?
Thank you!
How can I use Pool to count words in a huge file (genome = DNA) to speed up my results?
For example, if I look for palindromic words in a text and need to build a dict (word: [list of positions]), it takes about 2 minutes, and I have 198000 texts to search. 8)
I was passing arguments like alphabet = 'acgt' and k = an integer giving the length of the words to look for.
I was using partial to pass the text to the function and then pool.apply_async(func, args), but performance is still not good at all. Do you have any suggestions?
os.cpu_count() returns the number of virtual cores. On a 4-core, hyper-threaded machine, it returns 8. However, Python cannot efficiently run two interpreters on a single hyper-threaded core, since the processes have separate memory spaces but the hyper-threaded core halves share a single memory space. To really operate at maximum efficiency, you should separate your code into a pool of 4 processes, each consisting of 2 threads running on a single interpreter. The thread pairs can then run concurrently, taking advantage of overlapping IO and computation, while the 4 cores each do some computationally intensive operation. Too much detail for this level of tutorial, but you might see significant savings for large-memory processes by avoiding the default argument to Pool() and instead doing Pool(os.cpu_count() // 2) or Pool(psutil.cpu_count(logical=False)).
How does the Pool object differ from ProcessPoolExecutor?
It is worth mentioning that it isn't possible to call p.join() before calling p.start(). At least it isn't possible according to the official Python documentation.
multiprocessing.cpu_count() finds the number of cores
I would like to understand: if my VM has only 10 cores and I allocate 200 processes, what will happen? And how does p.map(fun, [1....300]) work if I'm calling the function more times than there are cores and the args outnumber the cores? Thank you
While using the multiprocessing Pool, I am getting a permission-denied error. Can you help me with that?
How do I explicitly limit the number of processes? For example, I want to run only 3 processes at a time.
What should I do if my function needs two arguments? What would be the correct syntax? Should I create a list where each element holds two arguments? result = p.map(function, (arg1, arg2))??
Great example and explanation, bro!
Great example! I faced almost the same situation: I expected to speed up my execution time using multiprocessing, but I got a longer run time! Thanks.
Great example - thanks!
Well explained :)
Fantastic Python class!
Hi, I have code that interacts with network devices, takes some command output from the devices, processes that output to extract useful data, and writes it to a CSV file. Now I want to introduce multiprocessing, but the problem is that the main() function calls the other functions, and those functions also call other local functions as required. Can you please help me out here? I am really confused about how to write the same code with multiprocessing, because this operation runs on more than 100 devices and takes about an hour to complete.
Is it possible to map the function onto the GPU instead of the CPU? Thank you very much for the tutorial.
QUESTION: seeing how easy it is to use Pool, why would one choose to use multiprocessing.Process?
Nice work!!
Finally, a clear tutorial on this! I was able to get it to work thanks to you.
Excellent video, greetings from Germany
Cool video - what is the difference between using 'Process' and 'Pool'?
import multiprocessing as mp
mp.cpu_count()
I like your vim setup. How can I get the style/customization like yours? Thanks.
Thanks for the great content. I'm really learning a lot from these videos. I have a question: is there a smart way to give the Pool class multiple arguments? For example, in this video the function sum_square takes 1 integer as an argument, and to execute the computation in parallel we make a list of integers, numbers, and use p.map(sum_square, numbers). What I see is that the map function takes the function to parallelize and then a list of arguments. But for a function that takes multiple arguments rather than only one, say 2 integers, how do you design the pool multiprocessing?
What is the terminal that you are using?
I was using this code to understand what it does, and if it's converted into an exe on Windows, it'll eat all of your RAM in 1-5 seconds and create over 1000 processes, causing a system crash. (I saw the code create 5000+ at one time.)
You are the best teacher. Greetings from Azerbaijan
Thanks a lot for this rich content; I really learned a lot from it. I have a question: what is the difference between pooling and using the classic Process module as in the previous videos?
Nice work!! Thanks for this series of videos. Greetings from Chile
Once again, a very interesting video! Thank you very much