After some googling and reading the documentation, I found that this can be done with Python's standard multiprocessing module, as explained below.

First, we need our functions to store their results in a given queue, which can be done via a wrapper:

import multiprocessing

def func_wrapper(queue, func, x):
    queue.put(func(x))  # store the result of func(x) in the shared queue

Then a concurrent run of f1 and f2 on x, returning whichever result arrives first, can be implemented as follows:

def concurrent_run(f1, f2, x):
    q = multiprocessing.Queue()
    p1 = multiprocessing.Process(target=func_wrapper, args=(q, f1, x))
    p2 = multiprocessing.Process(target=func_wrapper, args=(q, f2, x))

    p1.start()
    p2.start()

    # Block until the first of the two processes puts its result on the queue.
    res = q.get()

    # Stop both processes; the slower one is no longer needed.
    p1.terminate()
    p2.terminate()

    return res
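
For illustration, here is a minimal usage sketch under a few assumptions: slow_square and fast_square are made-up example functions defined in the same module as the code above, and the __main__ guard is needed on platforms where worker processes are spawned rather than forked:

import time

def slow_square(x):
    time.sleep(2)
    return x * x

def fast_square(x):
    return x * x

if __name__ == "__main__":
    # fast_square finishes first, so its result is returned and printed
    print(concurrent_run(slow_square, fast_square, 3))  # prints 9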

P.S. Perhaps we could simply return q.get() without explicitly calling the .terminate() methods, letting the corresponding Process objects take care of the clean-up. On the other hand, for unresponsive processes we can use .kill() (available since Python 3.7) instead of .terminate() to force their shutdown.
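
As a rough sketch of that variant (concurrent_run_kill is a hypothetical name, and the timeout value is arbitrary), one could wait briefly for each process to exit and force-kill only those that remain alive:

def concurrent_run_kill(f1, f2, x, timeout=1.0):
    q = multiprocessing.Queue()
    p1 = multiprocessing.Process(target=func_wrapper, args=(q, f1, x))
    p2 = multiprocessing.Process(target=func_wrapper, args=(q, f2, x))

    p1.start()
    p2.start()

    res = q.get()  # first result wins

    for p in (p1, p2):
        p.terminate()    # ask the process to stop
        p.join(timeout)  # give it a moment to exit
        if p.is_alive():
            p.kill()     # force shutdown of an unresponsive process (Python 3.7+)
            p.join()

    return res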