On an ASP.NET site at my place of work, the following chunk of code handles file downloads (note: Response.TransmitFile is not used here because the contents of the download are being streamed out of a zip file):
private void DownloadFile(Stream stream)
{
    int bytesRead;
    int chunkSize = 1048576; // 1 MB
    byte[] readBuffer = new byte[chunkSize];

    while ((bytesRead = stream.Read(readBuffer, 0, readBuffer.Length)) != 0)
    {
        if (!Response.IsClientConnected)
            break;

        // Copy the bytes just read into a right-sized array for BinaryWrite.
        byte[] chunk = new byte[bytesRead];
        Array.Copy(readBuffer, 0, chunk, 0, bytesRead);

        Response.BinaryWrite(chunk);
        Response.Flush();
    }
    stream.Close();
}
Our users frequently download files that are several hundred MB each, which can chew up server memory pretty fast. My assumption is that this is due to response buffering. Does that make sense?
I've just read about the Buffer property of the Response object. If I set it to false, will that prevent the Response.BinaryWrite() calls from buffering the data in memory? More generally, what is a good way to limit memory usage in this situation? Perhaps I should stream from the zip to a temporary file and then call Response.TransmitFile()?
EDIT: In addition to possible solutions, I'm very interested in an explanation of the memory usage issue in the code above. Why would it consume far more than 1 MB, even though Response.Flush is called on every loop iteration? Is it just the unnecessary heap allocation that occurs on every iteration (and doesn't get GC'd right away), or is something else at work?