When I try to write a very large amount of data (a list with 300,000 rows or more) to a memory stream using CsvHelper, it throws the exception "System.IO.IOException: Stream was too long.".
The data class is rather big, with ~30 properties; consequently each record in the file has ~30 columns.
This is the actual writing code where the exception is thrown (by the way, this code is based on an answer by the author of the CsvHelper library):
using (var memoryStream = new MemoryStream())
{
    using (var streamWriter = new StreamWriter(memoryStream, encoding ?? Encoding.ASCII))
    {
        var csvWriter = new CsvWriter(streamWriter, GetConfiguration(delimiter, mappingClassType, mappingActions));
        csvWriter.WriteRecords(data); // data is IEnumerable<T> and has more than 300k records
        streamWriter.Flush();
        return memoryStream.ToArray();
    }
}
Then I save the resulting byte array to a file:
File.WriteAllBytes(filePath, resultedBytesArray);
Please note that the same code works well when I write 100,000 records to a file (in that case the file is about 1 GB in size). By the way, my goal is to write more than 600,000 records.
This is the relevant part of the stack trace:
Stream was too long.|System.IO.IOException: Stream was too long.
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
   at System.IO.StreamWriter.Write(Char[] buffer, Int32 index, Int32 count)
   at CsvHelper.CsvWriter.NextRecord() in C:\Users\Josh\Projects\CsvHelper\src\CsvHelper\CsvWriter.cs:line 290
   at CsvHelper.CsvWriter.WriteRecords(IEnumerable records) in C:\Users\Josh\Projects\CsvHelper\src\CsvHelper\CsvWriter.cs:line 490
   at FileExport.Csv.CsvDocument.Create[T](IEnumerable`1 data, String delimiter, Encoding encoding, Type mappingClassType, IDictionary`2 mappingActions) in d:\Dev\DrugDevExport\FileExport\Csv\CsvDocument.cs:line 33
As far as I can tell, the basic way to achieve my goal and avoid this issue is to split the list of data into several parts and then concatenate the results, but is there perhaps an obvious and easy solution that doesn't require significant refactoring (like increasing the default stream/buffer size)?
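For reference, the splitting approach I have in mind would look roughly like this. This is an untested sketch, not my actual code: it uses CsvHelper's `CsvWriter(TextWriter, CultureInfo)` constructor instead of my `GetConfiguration` helper, writes each batch straight to the file instead of buffering everything in one `MemoryStream`, and `Enumerable.Chunk` requires .NET 6+ (older frameworks would need a manual batching loop).

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using CsvHelper;

public static class ChunkedCsvExport
{
    // Write the records in batches, flushing each batch's bytes to the target
    // file as we go, so no single in-memory stream ever has to hold the whole
    // multi-gigabyte document.
    public static void Create<T>(IEnumerable<T> data, string filePath, int chunkSize = 100_000)
    {
        using var fileStream = new FileStream(filePath, FileMode.Create);
        using var streamWriter = new StreamWriter(fileStream, Encoding.ASCII);
        using var csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture);

        // Header once, then the records batch by batch.
        csvWriter.WriteHeader<T>();
        csvWriter.NextRecord();

        foreach (var chunk in data.Chunk(chunkSize))
        {
            foreach (var record in chunk)
            {
                csvWriter.WriteRecord(record);
                csvWriter.NextRecord();
            }
            streamWriter.Flush(); // push this batch to disk before starting the next
        }
    }
}
```

This keeps memory usage bounded by the chunk size rather than the total output size, but it changes the method's shape (it can no longer return a `byte[]`), which is exactly the refactoring I was hoping to avoid.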
Also keep in mind that I have already applied two possible solutions in order to prevent an OutOfMemoryException.
Thanks in advance.