
c# - Downloading large files (~150MB) from FTP server hangs

I am trying to download files from an FTP server with this code:

using (System.IO.FileStream fileStream = System.IO.File.OpenWrite(filePath))
{
  byte[] buffer = new byte[4096];

  int bytesRead = responseStream.Read(buffer, 0, 4096);
  while (bytesRead > 0)
  {
    fileStream.Write(buffer, 0, bytesRead);
    bytesRead = responseStream.Read(buffer, 0, 4096);
  }
}

responseStream is created as follows:

System.IO.Stream responseStream = GetFileAsStream(url, username, password, false);

public static System.IO.Stream GetFileAsStream(string ftpUrl, string username, string password, bool usePassive)
{
  System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);
  request.KeepAlive = false;
  request.ReadWriteTimeout = 120000;
  request.Timeout = -1;
  request.UsePassive = usePassive;

  request.Credentials = new System.Net.NetworkCredential(username, password);

  request.Method = System.Net.WebRequestMethods.Ftp.DownloadFile;

  System.Net.FtpWebResponse fileResponse = (System.Net.FtpWebResponse)request.GetResponse();
  System.IO.Stream fileResponseStream = fileResponse.GetResponseStream();

  return fileResponseStream;
}

It works fine with smaller files, but when a file is bigger (e.g. 150 MB) the process hangs: the program never detects that the download has finished and keeps trying to read more bytes.
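One way to see what is actually happening (a rough sketch only; url, username, password, responseStream and fileStream are the same variables as above) is to ask the server for the file size via WebRequestMethods.Ftp.GetFileSize and log how many bytes arrive, which shows whether the whole file has been received before the final Read blocks:

long expectedLength;
System.Net.FtpWebRequest sizeRequest = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(url);
sizeRequest.Credentials = new System.Net.NetworkCredential(username, password);
sizeRequest.Method = System.Net.WebRequestMethods.Ftp.GetFileSize;
using (System.Net.FtpWebResponse sizeResponse = (System.Net.FtpWebResponse)sizeRequest.GetResponse())
{
  expectedLength = sizeResponse.ContentLength;  // size reported by the FTP SIZE command
}

long totalRead = 0;
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
{
  fileStream.Write(buffer, 0, bytesRead);
  totalRead += bytesRead;
  Console.WriteLine(totalRead + " / " + expectedLength + " bytes");  // shows whether all bytes arrived before the hang
}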

I would prefer answers that do not rely on external libraries. Thank you.


1 Reply


I solved my problem by introducing a request timeout which, if reached, makes the program throw a WebException. In that case, the program resumes the download from the place it left off. Here's my code:

This is inside a method that returns true if the file is downloaded and false otherwise:

Digitalez.DirectoryUtil.EnsureDirectoryExists(relativePath);

string filePath = System.IO.Path.Combine(relativePath, fileInfo.Name);
long length = Digitalez.FtpUtil.GetFileLength(fileInfo.FullPath, userName, password, usePassive);
long offset = 0;
int retryCount = 10;
int? readTimeout = 5 * 60 * 1000; // five minutes

// If the file already exists, do not download it.
if (System.IO.File.Exists(filePath))
{
  return false;
}

while (retryCount > 0)
{
  using (System.IO.Stream responseStream = Captator.Eifos.Net.FtpUtil.GetFileAsStream(fileInfo.FullPath, userName, password, usePassive, offset, requestTimeout: readTimeout != null ? readTimeout.Value : System.Threading.Timeout.Infinite))
  {
    using (System.IO.FileStream fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Append))
    {
      byte[] buffer = new byte[4096];
      try
      {
        int bytesRead = responseStream.Read(buffer, 0, buffer.Length);
        while (bytesRead > 0)
        {
          fileStream.Write(buffer, 0, bytesRead);
          bytesRead = responseStream.Read(buffer, 0, buffer.Length);
        }

        // End of stream reached: the whole file has been written.
        return true;
      }
      catch (System.Net.WebException)
      {
        // The read timed out; swallow the exception and retry from the current offset.
      }
    }

    // Resume from however many bytes have been written so far.
    if (System.IO.File.Exists(filePath))
    {
      offset = new System.IO.FileInfo(filePath).Length;
    }
    else
    {
      offset = 0;
    }

    retryCount--;

    // The last attempt already delivered the remaining bytes: the file is complete.
    if (offset == length)
    {
      return true;
    }
  }
}
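After the while loop, when all retries have been used up without completing the file, the method simply ends with:

// All retries exhausted without reaching the reported length: report failure.
return false;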

Digitalez.FtpUtil:

public static System.IO.Stream GetFileAsStream(string ftpUrl, string username, string password, bool usePassive, long offset, int requestTimeout)
{
  System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);

  request.KeepAlive = false;
  request.ReadWriteTimeout = requestTimeout;
  request.Timeout = requestTimeout;
  request.ContentOffset = offset;
  request.UsePassive = usePassive;
  request.UseBinary = true;

  request.Credentials = new System.Net.NetworkCredential(username, password);

  request.Method = System.Net.WebRequestMethods.Ftp.DownloadFile;

  System.Net.FtpWebResponse fileResponse = (System.Net.FtpWebResponse)request.GetResponse();
  System.IO.Stream fileResponseStream = fileResponse.GetResponseStream();

  return fileResponseStream;
}

public static long GetFileLength(string ftpUrl, string username, string password, bool usePassive)
{
  System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);

  request.KeepAlive = false;
  request.UsePassive = usePassive;

  request.Credentials = new System.Net.NetworkCredential(username, password);
  request.Method = System.Net.WebRequestMethods.Ftp.GetFileSize;

  System.Net.FtpWebResponse lengthResponse = (System.Net.FtpWebResponse)request.GetResponse();
  long length = lengthResponse.ContentLength;
  lengthResponse.Close();
  return length;

}
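For illustration, here is a rough usage sketch of the two helpers above (the URL, credentials and local path are just placeholders, and the retry loop shown earlier is omitted). Setting ContentOffset makes FtpWebRequest send an FTP REST command, so the server starts the transfer at that byte and the remaining data can simply be appended to the local file:

string url = "ftp://example.com/big.bin";   // placeholder URL
string localPath = @"C:\temp\big.bin";      // placeholder local path

long remoteLength = Digitalez.FtpUtil.GetFileLength(url, "user", "pass", usePassive: true);
long resumeOffset = System.IO.File.Exists(localPath) ? new System.IO.FileInfo(localPath).Length : 0;

using (System.IO.Stream remote = Digitalez.FtpUtil.GetFileAsStream(url, "user", "pass", usePassive: true, offset: resumeOffset, requestTimeout: 5 * 60 * 1000))
using (System.IO.FileStream local = new System.IO.FileStream(localPath, System.IO.FileMode.Append))
{
  remote.CopyTo(local);                     // appends only the bytes that are still missing
}

bool complete = new System.IO.FileInfo(localPath).Length == remoteLength;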

I haven't tried it against other servers, but this certainly does the trick.

