python - multiprocessing pool not working in nested functions

The following code does not execute as expected:

import multiprocessing

lock = multiprocessing.Lock()
def dummy():
    def log_results_l1(results):
        lock.acquire()
        print("Writing results", results)
        lock.release()

    def mp_execute_instance_l1(cmd):
        print(cmd)
        return cmd

    cmds = [x for x in range(10)]

    pool = multiprocessing.Pool(processes=8)

    for c in cmds:
        pool.apply_async(mp_execute_instance_l1, args=(c, ), callback=log_results_l1)

    pool.close()
    pool.join()
    print("done")


dummy()

But it does work if the functions are not nested. What is going on?


1 Reply


multiprocessing.Pool methods such as apply_async and the map family have to pickle both the function and its arguments. Functions are pickled by their qualified name: on unpickling, the worker process must be able to import the module the function was defined in and find the function with a getattr call. A nested function isn't reachable by name from outside the function that defines it, so pickling fails. Moving the functions to module (global) scope fixes this, which is why the code works when they are not nested.
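
Here is a minimal sketch of the fix (function names are illustrative): both the worker and the callback are moved to module scope so they can be pickled by name. Collecting the AsyncResult objects and calling .get() on them is optional, but it re-raises any exception a task hit, including the pickling error that apply_async would otherwise swallow silently.

import multiprocessing

def mp_execute_instance(cmd):
    # Worker: lives at module level so it can be pickled by its qualified name.
    print(cmd)
    return cmd

def log_results(results):
    # Callback: invoked in the parent process after each task finishes.
    print("Writing results", results)

def dummy():
    cmds = list(range(10))
    pool = multiprocessing.Pool(processes=8)
    async_results = [
        pool.apply_async(mp_execute_instance, args=(c,), callback=log_results)
        for c in cmds
    ]
    pool.close()
    pool.join()
    # .get() re-raises any exception stored on the task (e.g. the pickling
    # error you would get with nested functions) instead of failing silently.
    for r in async_results:
        r.get()
    print("done")

if __name__ == "__main__":
    # The guard matters on platforms that spawn worker processes (Windows, macOS).
    dummy()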

