
Python requests, how to limit received size, transfer rate, and/or total time?

My server does external requests and I'd like to limit the damage a failing request can do. I'm looking to cancel the request in these situations:

  • the total time of the request is over a certain limit (even if data is still arriving)
  • the total received size exceeds some limit (I need to cancel prior to accepting more data)
  • the transfer speed drops below some level (though I can live without this one if a total time limit can be provided)

Note that I am not looking for the timeout parameter in requests, as that is only an inactivity timeout. I can't find anything covering a total timeout, or a way to limit the total size. One example shows a maxsize parameter on HTTPAdapter, but that is not documented.
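
To illustrate, the timeout parameter only bounds connection setup and the gap between received bytes (url below is a placeholder); a server that trickles data slowly but steadily never trips it:

# timeout=(connect, read): read is an inactivity timeout between bytes,
# so a slow-but-steady response can run indefinitely.
r = requests.get(url, timeout=(3.05, 27))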

How can I achieve these requirements using requests?


1 Reply


You could set stream=True, read the data in chunks, and abort the request as soon as your time or size limits are exceeded.

As of requests release 2.3.0 the timeout applies to streaming requests too, so all you need to do is allow for a timeout for the initial connection and each iteration step:

import time

import requests

r = requests.get(..., stream=True, timeout=initial_timeout)
r.raise_for_status()

# Content-Length is optional (chunked responses omit it), so guard
# against a missing header before comparing.
content_length = r.headers.get('Content-Length')
if content_length is not None and int(content_length) > your_maximum:
    raise ValueError('response too large')
size = 0
start = time.time()

for chunk in r.iter_content(1024):
    # Wall-clock limit over the whole body, not just per-read inactivity.
    if time.time() - start > receive_timeout:
        raise ValueError('timeout reached')

    # The header can lie or be absent, so count the actual bytes too.
    size += len(chunk)
    if size > your_maximum:
        raise ValueError('response too large')

    # do something with chunk

Adjust the timeouts as needed.
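
If it helps, here is a minimal sketch that rolls the checks above into a single helper and also covers the minimum-transfer-rate case from the question; the name fetch_limited and its parameters are illustrative, not part of requests:

import time

import requests

def fetch_limited(url, max_size, total_timeout, min_rate=None,
                  initial_timeout=10, chunk_size=1024):
    """Fetch url with a size cap, a total-time cap and an optional
    minimum transfer rate in bytes per second."""
    r = requests.get(url, stream=True, timeout=initial_timeout)
    r.raise_for_status()

    size = 0
    start = time.time()
    parts = []

    for chunk in r.iter_content(chunk_size):
        elapsed = time.time() - start
        if elapsed > total_timeout:
            raise ValueError('timeout reached')

        size += len(chunk)
        if size > max_size:
            raise ValueError('response too large')

        # Skip the rate check during the first second to avoid noisy ratios.
        if min_rate is not None and elapsed > 1 and size / elapsed < min_rate:
            raise ValueError('transfer rate too low')

        parts.append(chunk)

    return b''.join(parts)

For example, fetch_limited('https://example.com/big', max_size=10_000_000, total_timeout=30, min_rate=1024) gives up on anything larger than 10 MB, slower than 1 KiB/s, or taking longer than 30 seconds.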

For requests releases before 2.3.0 (the release that introduced this change), you could not time out the r.iter_content() yield; a server that stopped responding in the middle of a chunk would still tie up the connection. You would have to wrap the above code in an additional timeout mechanism to cut off long-running responses early.
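
One possible wrapper is sketched below using signal.alarm; note it is Unix-only, works only in the main thread, and the name fetch_with_hard_timeout is hypothetical, not part of requests:

import signal

import requests

def fetch_with_hard_timeout(url, hard_timeout, **kwargs):
    # Raise when the alarm fires, no matter what requests is doing.
    def _on_alarm(signum, frame):
        raise TimeoutError('hard timeout reached')

    old_handler = signal.signal(signal.SIGALRM, _on_alarm)
    signal.alarm(hard_timeout)  # SIGALRM fires after hard_timeout seconds
    try:
        return requests.get(url, **kwargs)
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)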

