
c# - 30Mb limit uploading to Azure DataLake using DataLakeStoreFileSystemManagementClient

I am receiving an error when using

_adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite)

to upload files to a Data Lake Store. The error comes up with files over 30 MB; it works fine with smaller files.

The error is:

at Microsoft.Azure.Management.DataLake.Store.FileSystemOperations.d__16.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.Create(IFileSystemOperations operations, String accountName, String directFilePath, Stream streamContents, Nullable`1 overwrite, Nullable`1 syncFlag)
at AzureDataFunctions.DataLakeController.CreateFileInDataLake(String destFilePath, Stream stream, Boolean overwrite) in F:GitHubutoDWADF_ProcessAllFilesADF_ProcessAllFilesDataLakeController.cs:line 122

Has anybody else encountered this or observed similar behaviour? I am getting around it for now by splitting my files into 30 MB pieces and uploading each piece as its own file (roughly as sketched below).

However, this is impractical in the long term because the original file is 380 MB, and potentially quite a bit larger. I do not want 10-15 file fragments sitting in my Data Lake; I would like to upload it as a single file.
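A rough sketch of that chunked workaround, assuming it lives in the same class as the _adlsFileSystemClient and _adlsAccountName fields shown above; SliceAndUpload and MaxUploadBytes are illustrative names, not part of the original code:

using System.IO;

// Splits localPath into pieces at the 30,000,000-byte size that has been
// observed to work, and uploads each piece as a separate file:
// name.part0, name.part1, ...
private const int MaxUploadBytes = 30 * 1000 * 1000;

private void SliceAndUpload(string localPath, string destFolder)
{
    var buffer = new byte[MaxUploadBytes];
    int part = 0;

    using (var source = File.OpenRead(localPath))
    {
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            var destFilePath = $"{destFolder}/{Path.GetFileName(localPath)}.part{part++}";
            using (var slice = new MemoryStream(buffer, 0, read))
            {
                // Each slice is its own Create call, so no single request exceeds the limit.
                _adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, slice, overwrite: true);
            }
        }
    }
}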

I am able to upload the exact same file to the Data Lake through the portal interface.


1 Reply


It is answered here.

Currently there is a size limit of 30,000,000 bytes. You can work around it by creating an initial file and then appending the remaining data, keeping each stream below the limit.
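A minimal sketch of that create-then-append approach, using the same _adlsFileSystemClient and _adlsAccountName fields from the question; UploadLargeFile and MaxChunkBytes are illustrative names, and it assumes FileSystem.Append accepts the same account/path/stream arguments as Create:

using System.IO;

// Creates the target file from the first chunk, then appends the rest,
// so every individual request stays safely under the 30,000,000-byte limit.
private const int MaxChunkBytes = 25 * 1000 * 1000;

private void UploadLargeFile(string localPath, string destFilePath)
{
    var buffer = new byte[MaxChunkBytes];
    bool firstChunk = true;

    using (var source = File.OpenRead(localPath))
    {
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            using (var chunk = new MemoryStream(buffer, 0, read))
            {
                if (firstChunk)
                {
                    // The first chunk creates (or overwrites) the file.
                    _adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, chunk, overwrite: true);
                    firstChunk = false;
                }
                else
                {
                    // Subsequent chunks are appended to the same file.
                    _adlsFileSystemClient.FileSystem.Append(_adlsAccountName, destFilePath, chunk);
                }
            }
        }
    }
}

This keeps a single file in the Data Lake while no single request exceeds the limit.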

