When you call a function in the same thread, it normally does not return until the work completes. For the caller to impose a timeout, the function has to be designed to be interruptible in the first place. There are many ways to achieve this, with varying degrees of complexity and generality.
Probably the simplest way is to pass the time limit to the function and process the work in small chunks. After each chunk, check whether the elapsed time exceeds the timeout and, if so, bail out early.
The following example illustrates this idea, with each chunk taking a random amount of time, so the batch will sometimes complete and sometimes time out:
import datetime
import random
import time

class TimeoutException(Exception):
    pass

def busy_work():
    # Pretend to do something useful
    time.sleep(random.uniform(0.3, 0.6))

def train_loadbatch_from_lists(batch_size, timeout_sec):
    time_start = datetime.datetime.now()
    batch_xs = []
    batch_ys = []
    for i in range(batch_size):
        busy_work()
        batch_xs.append(i)
        batch_ys.append(i)
        # Check the elapsed time after each chunk and bail out early
        time_elapsed = datetime.datetime.now() - time_start
        print('Elapsed:', time_elapsed)
        if time_elapsed > timeout_sec:
            raise TimeoutException()
    return batch_xs, batch_ys

def main():
    timeout_sec = datetime.timedelta(seconds=5)
    batch_size = 10
    try:
        print('Processing batch')
        batch_xs, batch_ys = train_loadbatch_from_lists(batch_size, timeout_sec)
        print('Completed successfully')
        print(batch_xs, batch_ys)
    except TimeoutException:
        print('Timeout after processing N records')

if __name__ == '__main__':
    main()
Another way to achieve this is to run the worker function in a separate thread and use an Event to allow the caller to signal that the worker should terminate early.
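A minimal sketch of that approach (the `worker` function and its chunk sizes are illustrative): the caller starts the worker in a thread, waits up to the timeout with `join()`, and then sets the event so the worker exits at its next chunk boundary.

import threading
import time

def worker(stop_event, results):
    # Process work in small chunks, checking the event between chunks
    for i in range(100):
        if stop_event.is_set():
            return  # the caller asked us to stop early
        time.sleep(0.05)  # pretend one chunk of work takes ~50 ms
        results.append(i)

stop_event = threading.Event()
results = []
t = threading.Thread(target=worker, args=(stop_event, results))
t.start()
t.join(timeout=1.0)    # wait up to 1 second for the worker to finish
if t.is_alive():
    stop_event.set()   # signal the worker to terminate early
    t.join()           # worker exits at its next event check
print('Processed %d chunks before stopping' % len(results))

Note that the worker still has to cooperate by checking the event regularly; an Event cannot forcibly kill a thread that never checks it.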
Some posts (such as the linked one above) suggest using signals, but signals bring complications of their own: in Python, signal handlers only run in the main thread, and signal.alarm is not available on Windows, so this approach is not recommended.