I have the following code in a Windows Forms (.NET 3.5) application.
It works fine for CSV files with fewer than 10,000 records, but becomes noticeably slower above 30,000 records.
The input CSV file can contain anywhere between 1 and 100,000 records.
Code currently used:
/// <summary>
/// Imports the selected file into the grid view.
/// </summary>
private bool ImportFile()
{
    try
    {
        // Clear the grid view before loading the new file
        accountsDataGridView.Rows.Clear();

        string fName = openFileDialog1.FileName;
        if (System.IO.File.Exists(fName))
        {
            // using ensures the file handle is released even if an exception occurs
            using (System.IO.StreamReader objReader = new System.IO.StreamReader(fName))
            {
                string textLine;
                // ReadLine returns null at end of file, which also handles empty files safely
                while ((textLine = objReader.ReadLine()) != null)
                {
                    if (textLine != "")
                    {
                        string[] splitLine = textLine.Split(',');
                        // Add the row only if at least one of the first two fields has a value
                        if (splitLine[0] != "" || (splitLine.Length > 1 && splitLine[1] != ""))
                        {
                            accountsDataGridView.Rows.Add(splitLine);
                        }
                    }
                }
            }
        }
        return true;
    }
    catch (Exception ex)
    {
        if (ex.Message.Contains("The process cannot access the file"))
        {
            MessageBox.Show("The file you are importing is open.", "Import Account",
                MessageBoxButtons.OK, MessageBoxIcon.Warning);
        }
        else
        {
            MessageBox.Show(ex.Message);
        }
        return false;
    }
}
Sample input file:
18906,Y
18908,Y
18909,Y
18910,Y
18912,N
18913,N
I need some advice on optimizing this code for faster reads and faster display in the grid.
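For reference, here is a rough sketch of one direction I am considering (untested; it assumes the grid's columns are already defined and reuses the accountsDataGridView and openFileDialog1 controls from above): build all the rows in memory first, then add them in a single call with Rows.AddRange so the grid only repaints once instead of once per row.

// Sketch of one possible optimization (untested). Rows are built off-screen and
// added to accountsDataGridView in one batch, so the grid repaints only once.
private bool ImportFileBatched()
{
    string fName = openFileDialog1.FileName;
    if (!System.IO.File.Exists(fName))
    {
        return false;
    }

    var rows = new System.Collections.Generic.List<DataGridViewRow>();

    using (var reader = new System.IO.StreamReader(fName))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            if (line.Length == 0) continue;

            string[] fields = line.Split(',');
            if (fields[0] == "" && (fields.Length < 2 || fields[1] == "")) continue;

            // Clone the grid's row template and fill it without touching the UI yet;
            // this requires the grid's columns to exist already.
            var row = (DataGridViewRow)accountsDataGridView.RowTemplate.Clone();
            row.CreateCells(accountsDataGridView, fields);
            rows.Add(row);
        }
    }

    // Suspend layout while the batch is added, then add every row at once
    accountsDataGridView.SuspendLayout();
    accountsDataGridView.Rows.Clear();
    accountsDataGridView.Rows.AddRange(rows.ToArray());
    accountsDataGridView.ResumeLayout();

    return true;
}

I have also seen suggestions to bind a DataTable to DataGridView.DataSource, or to switch the grid to VirtualMode, instead of adding rows directly. Would either of those scale better for files approaching 100,000 records?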