One approach would be to wrap your multiprocessing.Queue with a custom class (just on the producer side, or transparently from the consumer's perspective). With that wrapper you would queue up items destined for the Queue object you're wrapping, and only feed items from the local buffer (a plain Python list) into the multiprocessing.Queue as space becomes available, using exception handling to throttle when the Queue is full.
That's probably the easiest approach since it should have the minimum impact on the rest of your code. The custom class should behave just like a Queue while hiding the underlying multiprocessing.Queue behind your abstraction.
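Here's a minimal sketch of that wrapper idea; the class name BufferedQueue, its flush() method, and the maxsize of 4 are illustrative assumptions, not anything from your code:

import multiprocessing
import queue  # for the queue.Full exception raised by put_nowait()

class BufferedQueue:
    """Wraps a multiprocessing.Queue, buffering items locally when it is full.

    (Hypothetical name/API, shown only to illustrate the wrapper approach.)
    """

    def __init__(self, mp_queue):
        self._queue = mp_queue      # the real multiprocessing.Queue
        self._pending = []          # local overflow buffer (plain list)

    def put(self, item):
        self._pending.append(item)
        self.flush()

    def flush(self):
        # Move as many pending items as possible into the shared queue.
        while self._pending:
            try:
                self._queue.put_nowait(self._pending[0])
            except queue.Full:
                break               # throttle: leave the rest buffered locally
            self._pending.pop(0)

if __name__ == "__main__":
    q = multiprocessing.Queue(maxsize=4)
    buffered = BufferedQueue(q)
    for i in range(10):
        buffered.put(i)             # never blocks, even though maxsize is 4
    print(len(buffered._pending), "items still buffered locally")

The producer keeps calling put() as usual; each call also tries to flush the backlog, so the shared queue fills up again as soon as the consumer makes room.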
(Another approach might be to have your producer use threads: one thread manages the dispatch from a threading Queue to your multiprocessing.Queue, and any other threads just feed the threading Queue.)
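A minimal sketch of that threaded variant, assuming a single dispatcher thread per producer process; the dispatcher() function and the SENTINEL marker are illustrative names, not part of any library:

import multiprocessing
import queue
import threading

SENTINEL = None  # marks the end of input for the dispatcher

def dispatcher(local_q, mp_q):
    """Drain the threading queue into the multiprocessing queue, blocking
    (and therefore throttling) whenever the shared queue is full."""
    while True:
        item = local_q.get()
        if item is SENTINEL:
            break
        mp_q.put(item)  # blocks until space is available in the shared queue

if __name__ == "__main__":
    mp_q = multiprocessing.Queue(maxsize=4)
    local_q = queue.Queue()
    t = threading.Thread(target=dispatcher, args=(local_q, mp_q), daemon=True)
    t.start()

    # Producer threads (here just the main thread) feed the threading Queue
    # and never block on the multiprocessing.Queue directly.
    for i in range(10):
        local_q.put(i)
    local_q.put(SENTINEL)

    # Consume from the shared queue so the dispatcher can make progress.
    for _ in range(10):
        print(mp_q.get())
    t.join()

The blocking mp_q.put() in the dispatcher does the throttling for you, so the producer threads only ever touch the unbounded threading Queue.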