I'm guessing at your intent here, because the original question is unclear. Since os.listdir
doesn't guarantee any ordering, I'm assuming your "two" functions are actually identical and you just need to run the same process on multiple files concurrently.
The easiest way to do this, in my experience, is to spin up a multiprocessing.Pool, submit a task for each file, and then wait. e.g.
import glob
import multiprocessing

def process(file):
    pass  # do stuff to a file

if __name__ == "__main__":
    p = multiprocessing.Pool()
    # `folder` is the directory from your question.
    for f in glob.glob(folder + "*.csv"):
        # Submit a task for each file. The pool runs them on
        # roughly one worker process per available CPU core.
        p.apply_async(process, [f])
    p.close()
    p.join()  # Wait for all child processes to finish.
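If you also need the return values from each task, apply_async hands back AsyncResult objects you can collect with .get(). A minimal sketch, using a stand-in square function in place of real per-file work:

```python
import multiprocessing

def square(n):
    # Stand-in for real per-file work.
    return n * n

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        # Submit all tasks up front, then gather the results.
        results = [pool.apply_async(square, [n]) for n in range(5)]
        values = [r.get() for r in results]  # .get() blocks until done
    print(values)
```

The with-statement closes the pool automatically, and .get() re-raises any exception the worker hit, which makes failures much easier to spot than with fire-and-forget apply_async calls.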