I am trying to implement writing files to an SFTP server in Node using the SSH2 module. For my use case, the file source is Azure Blob Storage, and the files are relatively large (more than 5 GB), so the idea is to fetch the data from blob storage in chunks and write each chunk to the server. I don't want to download the whole file and then perform the write, because the files are large and I don't want to run into disk space issues at runtime.
I have a working implementation that uses the downloadToBuffer() and write() functions and increments the offset until all the bytes are written, as seen in the code snippet below:
    // 'client' is the Azure BlockBlobClient, 'length' is the chunk size in bytes
    sftp.open('remoteFilePath', 'w', async (openError, handle) => {
      if (openError) throw openError;
      let blobOffset = 0;
      try {
        while (blobOffset < file.size) {
          // the last chunk may be shorter than 'length'
          const chunkSize = Math.min(length, file.size - blobOffset);
          const buffer = await client.downloadToBuffer(blobOffset, chunkSize);
          // write the chunk to the remote file at the same offset
          await new Promise((resolve, reject) => {
            sftp.write(handle, buffer, 0, buffer.length, blobOffset, (writeError) =>
              writeError ? reject(writeError) : resolve()
            );
          });
          blobOffset += chunkSize;
        }
      } catch (e) {
        console.log(e);
      } finally {
        sftp.close(handle, () => {});
      }
    });
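For context, 'sftp', 'client', 'file' and 'length' are created elsewhere; this is roughly the setup (simplified, and the host, container, blob name and chunk size are just placeholders):

    const { Client } = require('ssh2');
    const { BlobServiceClient } = require('@azure/storage-blob');

    const length = 4 * 1024 * 1024; // 4 MB chunk size

    const client = BlobServiceClient
      .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
      .getContainerClient('my-container')
      .getBlockBlobClient('my-large-file.bin');

    const conn = new Client();
    conn.on('ready', () => {
      conn.sftp(async (sftpError, sftp) => {
        if (sftpError) throw sftpError;
        const props = await client.getProperties();
        const file = { size: props.contentLength }; // total blob size in bytes
        // ... the open/write loop from the snippet above runs here
      });
    }).connect({ host: 'sftp.example.com', username: 'user', password: 'secret' });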
This solution works, but it does not feel very efficient for large files. Is there a better way to implement this? Maybe using streams, so I don't have to manage the loop and offsets myself?
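What I have in mind is something along these lines (untested sketch; I'm hoping client.download() from @azure/storage-blob and sftp.createWriteStream() from SSH2 can simply be piped together):

    const { pipeline } = require('stream');

    async function uploadViaStream() {
      // download() with no count streams the whole blob starting at offset 0
      const downloadResponse = await client.download(0);
      const readStream = downloadResponse.readableStreamBody; // Node.js Readable
      const writeStream = sftp.createWriteStream('remoteFilePath');

      pipeline(readStream, writeStream, (err) => {
        if (err) console.log('stream copy failed', err);
        else console.log('stream copy finished');
      });
    }

Would this handle backpressure properly, or is there a recommended pattern for this kind of blob-to-SFTP transfer?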
question from:
https://stackoverflow.com/questions/65885593/write-buffer-data-to-sftp-server-using-node-and-ssh2