I did not inspect the deeper cause (such as a pipe being full), but a quick look at https://docs.python.org/2/library/multiprocessing.html indicates that queue.get() should be called before p.join(), and doing so seems to fix the issue. Just replace

    for p in processes:
        p.join()
    results = [queue.get() for p in processes]

with

    results = [queue.get() for p in processes]
    for p in processes:
        p.join()
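To make the reordering concrete, here is a minimal self-contained sketch (the worker function and the squaring task are illustrative assumptions, not from the original question). A child process blocks on writing to the queue's underlying pipe if it fills up, so joining before draining the queue can deadlock; draining first avoids that:

    from multiprocessing import Process, Queue

    def worker(i, queue):
        # Put a (possibly large) result on the shared queue.
        queue.put(i * i)

    if __name__ == '__main__':
        queue = Queue()
        processes = [Process(target=worker, args=(i, queue)) for i in range(4)]
        for p in processes:
            p.start()
        # Drain the queue first: a child blocked on a full pipe never
        # exits, so calling join() before get() can hang forever.
        results = [queue.get() for p in processes]
        for p in processes:
            p.join()
        print(sorted(results))

Note that results arrive in whatever order the children finish, so sort them (or tag each result with its index) if ordering matters.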
Copyright Sage, 2010. Some rights reserved under creative commons license. Content on this site is licensed under a Creative Commons Attribution Share Alike 3.0 license.