I am currently developing an application that reads a text file of about 50000 lines. For each line, I need to check if it contains a specific String.
At the moment, I use the conventional System.IO.StreamReader to read my file line by line.
The problem is that the size of the text file changes each time. I ran several performance tests and noticed that as the file size increases, each line takes longer to read.
For example:

- Reading a txt file that contains 5000 lines: 0:40
- Reading a txt file that contains 10000 lines: 2:54
It takes four times longer to read a file twice as large. I can't imagine how long it would take to read a 100000-line file.
Here's my code:
using (StreamReader streamReader = new StreamReader(this.MyPath))
{
    string line;
    // Read until ReadLine returns null (end of file); the original
    // Peek() > 0 condition stops early if a character's value is 0.
    while ((line = streamReader.ReadLine()) != null)
    {
        if (line.Contains(Resources.Constants.SpecificString))
        {
            // Do some action with the string.
        }
    }
}
Is there a way to avoid the situation where a bigger file means more time to read a single line?
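One alternative I have been considering is a sketch along these lines, using File.ReadLines (the standard .NET API that enumerates lines lazily rather than buffering the whole file); this.MyPath and Resources.Constants.SpecificString are the same members as in my code above:

```csharp
// Lazily enumerate lines; only one line is held in memory at a time.
foreach (string line in System.IO.File.ReadLines(this.MyPath))
{
    if (line.Contains(Resources.Constants.SpecificString))
    {
        // Do some action with the string.
    }
}
```

I don't know whether this would change the per-line cost in practice, or whether the slowdown comes from what I do inside the loop.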