I'm scraping HTML pages and have set up an HTTP client like so:
client := &http.Client{
    Transport: &http.Transport{
        Dial: (&net.Dialer{
            Timeout:   30 * time.Second,
            KeepAlive: 30 * time.Second,
        }).Dial,
        TLSHandshakeTimeout:   10 * time.Second,
        ResponseHeaderTimeout: 10 * time.Second,
    },
}
Now when I make GET requests to multiple URLs, I don't want to get stuck on URLs that deliver a massive amount of data.
response, err := client.Get(page.Url)
checkErr(err)
defer response.Body.Close() // release the connection when done
body, err := ioutil.ReadAll(response.Body)
checkErr(err)
page.Body = string(body)
Is there a way to limit the amount of data (bytes) the GET request accepts from a resource, and to stop reading once that limit is reached?
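For context, here is a minimal sketch of what I imagine, assuming io.LimitReader from the standard library is the right tool for this (it requires importing the io package; the 1 MB cap is an arbitrary placeholder):

const maxBody = 1 << 20 // assumed cap of 1 MB; adjust as needed

response, err := client.Get(page.Url)
checkErr(err)
defer response.Body.Close()

// io.LimitReader wraps response.Body and returns EOF after
// maxBody bytes, so ReadAll never reads more than that.
limited := io.LimitReader(response.Body, maxBody)
body, err := ioutil.ReadAll(limited)
checkErr(err)
page.Body = string(body)

Is this the right direction, or is there a better way to abort oversized responses?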