I recommend splitting the file into smaller chunks on the frontend. This is how I implemented pause/resume functionality for large file uploads in a Flask application.
Using Sebastian Tschan's jQuery File Upload plugin, you can enable chunking by specifying a maxChunkSize
when initializing the plugin, as in:
$('#file-select').fileupload({
    url: '/uploads/',
    sequentialUploads: true,
    done: function (e, data) {
        console.log("uploaded: " + data.files[0].name);
    },
    maxChunkSize: 1000000  // 1 MB
});
Now the client will send multiple requests when uploading large files, and your server-side code can use the Content-Range
header to stitch the original file back together. For example, the first 1 MB chunk of a 3 MB file arrives with Content-Range: bytes 0-999999/3000000, the second with Content-Range: bytes 1000000-1999999/3000000, and so on. For a Flask application, the view might look something like this:
import os

from flask import Flask, request, jsonify

app = Flask(__name__)

# Upload files
@app.route('/uploads/', methods=['POST'])
def results():
    files = request.files
    # assuming only one file is passed in the request
    key = next(iter(files))
    value = files[key]  # this is a Werkzeug FileStorage object
    filename = value.filename
    if 'Content-Range' in request.headers:
        # extract the starting byte from the Content-Range header string,
        # e.g. "bytes 1000000-1999999/3000000" -> 1000000
        range_str = request.headers['Content-Range']
        start_bytes = int(range_str.split(' ')[1].split('-')[0])
        # write the chunk at the correct offset, creating the file for the first chunk
        # (note: plain append mode ignores seek(), so open for in-place writing instead)
        mode = 'r+b' if os.path.exists(filename) else 'wb'
        with open(filename, mode) as f:
            f.seek(start_bytes)
            f.write(value.stream.read())
    else:
        # this is not a chunked request, so just save the whole file
        value.save(filename)
    # jsonify sets the appropriate application/json mime type on the response
    return jsonify({"name": value.filename,
                    "size": os.path.getsize(filename),
                    "url": 'uploads/' + value.filename,
                    "thumbnail_url": None,
                    "delete_url": None,
                    "delete_type": None})
For your particular application, you will just have to make sure that the correct auth headers are still sent with each request, since every chunk arrives as a separate POST.
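For example (a sketch, assuming a token-based scheme; the plugin forwards standard jQuery ajax options such as beforeSend, and getAuthToken() is a hypothetical helper standing in for wherever your app keeps its credentials):

$('#file-select').fileupload({
    url: '/uploads/',
    sequentialUploads: true,
    maxChunkSize: 1000000,
    beforeSend: function (xhr) {
        // runs once per request, so every chunk carries the header
        xhr.setRequestHeader('Authorization', 'Bearer ' + getAuthToken());
    }
});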
Hope this helps! I was struggling with this problem for a while ;)