I'm making a photo-sharing web app. Users can select photos in a gallery view and press a download button to get the selected photos as a single ZIP file.
The problem is that I don't know an efficient way to build a ZIP file from an array of files, especially large ones. Photos are about 2-3 MB each, or more for higher-res images.
I'm using Firebase Functions (which run on Google Cloud Functions). I already have a serverless function that does what I want, but it's very slow and uses a lot of memory. It downloads all the selected photos from Google Cloud Storage into a temporary folder, zips them with node-archiver, uploads the resulting ZIP back to Google Cloud Storage, and returns a download link for that file. 32 photos take about 23 seconds to complete, and 150 photos take a whopping 97 seconds.
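For reference, here's a simplified sketch of what the function currently does (the bucket paths, the `fileNames` argument, and the callable shape are placeholders rather than my exact code, and error handling is omitted):

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const archiver = require('archiver');
const fs = require('fs');
const os = require('os');
const path = require('path');

admin.initializeApp();

exports.zipSelectedPhotos = functions.https.onCall(async (data) => {
  const bucket = admin.storage().bucket();

  // 1. Download every selected photo into the function's temp directory.
  const localPaths = [];
  for (const name of data.fileNames) {
    const localPath = path.join(os.tmpdir(), path.basename(name));
    await bucket.file(name).download({ destination: localPath });
    localPaths.push(localPath);
  }

  // 2. Write a ZIP into the temp directory with node-archiver.
  const zipPath = path.join(os.tmpdir(), 'photos.zip');
  const output = fs.createWriteStream(zipPath);
  const archive = archiver('zip');
  const written = new Promise((resolve, reject) => {
    output.on('close', resolve);
    archive.on('error', reject);
  });
  archive.pipe(output);
  for (const p of localPaths) {
    archive.file(p, { name: path.basename(p) });
  }
  archive.finalize();
  await written;

  // 3. Upload the finished ZIP and hand back a download link.
  const [zipFile] = await bucket.upload(zipPath, {
    destination: 'zips/photos.zip',
  });
  const [url] = await zipFile.getSignedUrl({
    action: 'read',
    expires: Date.now() + 60 * 60 * 1000, // valid for one hour
  });
  return { url };
});
```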
I tried using buffers instead of writing the files to a temporary directory, but the times were about the same.
I got it a bit faster by deleting each file from the temporary folder as node-archiver appends it to the ZIP. That most likely frees some memory, since the temporary directory in Google Cloud Functions is an in-memory filesystem and counts against the function's memory limit. With that change, 32 photos take about 16 seconds and 150 photos take 81 seconds.
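The tweak looks roughly like this: a fragment meant to slot into the function above, before `archive.finalize()`. I'm using archiver's `entry` event here, which fires once a file has been fully appended to the archive:

```js
// Fragment for the function above: free temp space as each photo lands in
// the ZIP, instead of keeping every download around until the archive is done.
// archiver's 'entry' event fires after a file has been fully appended.
archive.on('entry', (entry) => {
  fs.unlink(path.join(os.tmpdir(), entry.name), (err) => {
    if (err) console.warn(`could not delete ${entry.name}:`, err);
  });
});
```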
Is there a more efficient way to do this? File streams came to mind as I was writing this question, but I'm not familiar with them.
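Something like the following is what I'm imagining (completely untested; the function name, ZIP path, and callable shape are placeholders). The idea is to pipe each photo's read stream into archiver and pipe archiver's output straight into a Cloud Storage write stream, so nothing is ever staged on local disk:

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const archiver = require('archiver');

admin.initializeApp();

exports.zipPhotosStreaming = functions.https.onCall(async (data) => {
  const bucket = admin.storage().bucket();
  const zipFile = bucket.file(`zips/${Date.now()}.zip`);

  // Photos are already compressed, so store them instead of deflating;
  // this should also save a lot of CPU time inside the function.
  const archive = archiver('zip', { store: true });
  const upload = zipFile.createWriteStream({
    metadata: { contentType: 'application/zip' },
  });

  const uploaded = new Promise((resolve, reject) => {
    upload.on('finish', resolve);
    upload.on('error', reject);
    archive.on('error', reject);
  });

  // Archiver pulls from one source stream at a time, so memory usage should
  // stay roughly constant no matter how many photos are selected.
  archive.pipe(upload);
  for (const name of data.fileNames) {
    archive.append(bucket.file(name).createReadStream(), {
      name: name.split('/').pop(),
    });
  }
  await archive.finalize();
  await uploaded;

  const [url] = await zipFile.getSignedUrl({
    action: 'read',
    expires: Date.now() + 60 * 60 * 1000, // link valid for one hour
  });
  return { url };
});
```

Would something like this actually avoid the temp-folder and memory overhead, or is there a reason the files have to be staged locally first?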