I am receiving an error when using
_adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite)
to upload files to an Azure Data Lake Store. The error occurs with files over 30 MB; it works fine with smaller files.
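For context, the call is made from a small wrapper method. A minimal sketch of it, with client initialization and authentication elided (field and method names are as in my code, per the stack trace below):

// _adlsFileSystemClient is an authenticated DataLakeStoreFileSystemManagementClient;
// _adlsAccountName holds the name of the target Data Lake Store account.
public void CreateFileInDataLake(string destFilePath, Stream stream, bool overwrite)
{
    // Works for small files; throws once the stream exceeds roughly 30 MB.
    _adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite);
}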
The error is:
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperations.d__16.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.Create(IFileSystemOperations operations, String accountName, String directFilePath, Stream streamContents, Nullable`1 overwrite, Nullable`1 syncFlag)
at AzureDataFunctions.DataLakeController.CreateFileInDataLake(String destFilePath, Stream stream, Boolean overwrite) in F:GitHubutoDWADF_ProcessAllFilesADF_ProcessAllFilesDataLakeController.cs:line 122
Has anybody else encountered this, or observed similar behaviour? I am currently working around it by splitting my files into 30 MB pieces and uploading each piece separately, as sketched below.
However, this is impractical in the long term, because the original file is 380 MB and could become quite a bit larger. I do not want to have 10-15 dissected files in my data lake; I would like to upload it as a single file.
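The splitting workaround looks roughly like this (a minimal sketch; localPath, ChunkSize, and the ".partN" naming are illustrative placeholders, not my exact production code):

// Requires: using System.IO;
// Split the source file into ~30 MB pieces and upload each piece as a
// separate file via Create. This works, but leaves 10-15 part-files in
// the data lake instead of one.
const int ChunkSize = 30 * 1024 * 1024; // assumed chunk size

using (var source = File.OpenRead(localPath))
{
    var buffer = new byte[ChunkSize];
    int bytesRead;
    int part = 0;
    while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        using (var piece = new MemoryStream(buffer, 0, bytesRead))
        {
            // Each chunk becomes its own file, e.g. "myfile.csv.part0".
            _adlsFileSystemClient.FileSystem.Create(
                _adlsAccountName, destFilePath + ".part" + part++, piece, true);
        }
    }
}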
I am able to upload the exact same file to the Data Lake Store through the portal interface.