php - Create and download large zip from GCE without reaching memory and response limits


We have a problem downloading large .zip files from GCE: we hit either the memory limit (512 MB) or the response size limit (32 MB).

We could increase the memory limit, but we already have a 1 GB .zip and it could be 10 GB next time.

The response size limit cannot be increased.

The goal: download product images archived as .zip.

Images vary from 1 MB to 10+ MB, so a .zip for 100+ products ends up around 1 GB in size. In the future there will be catalogs of 1K products, so the size can reach 10 GB+.

The project is on Laravel 6.

I've already tried various solutions without success:

  • Saving the .zip in GCS and then starting the download. Result: memory limit reached.
  • Adding files to the .zip without saving it and streaming with ZipStream. Result: response size limit triggered.

--

Code using ZipStream:

        // archive options (ZipStream\Option\Archive)
        $options = new ArchiveOptions();
        $options->setSendHttpHeaders(true);
        //$options->setContentType('application/octet-stream');
        $options->setZeroHeader(true);

        $zipName = time().'.zip';

        // create a new zipstream object that writes straight to the response
        $zip = new ZipStream($zipName, $options);

        // turn off deflate so files are stored, not compressed
        $fileOptions = new FileOptions();
        $fileOptions->setMethod(MethodOptions::STORE());

        foreach ($files as $file) {
            // stream each file from storage into the archive
            $zip->addFileFromStream($file['name'], Storage::readStream($file['src']), $fileOptions);
        }

        // finish the zip stream
        $zip->finish();

This keeps memory usage within limits, but I hit the response size (32 MB) error, which I'm not sure how to overcome.
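
What I imagine could overcome that is pointing ZipStream's output at a GCS write stream instead of the HTTP response, so the archive is saved in GCS and never goes through the response at all. Roughly like this sketch (the google/cloud-storage client, its gs:// stream wrapper and the bucket name are assumptions on my side):

        // sketch: write the ZipStream output into a GCS object via the gs:// wrapper
        // instead of sending it as the HTTP response
        $storage = new \Google\Cloud\Storage\StorageClient();
        $storage->registerStreamWrapper();

        $zipName = time().'.zip';
        // 'my-bucket' is a placeholder bucket name
        $gcsStream = fopen('gs://my-bucket/downloads/'.$zipName, 'w');

        $options = new ArchiveOptions();
        $options->setOutputStream($gcsStream); // write to GCS, not to php://output
        $options->setSendHttpHeaders(false);

        $zip = new ZipStream($zipName, $options);

        foreach ($files as $file) {
            $zip->addFileFromStream($file['name'], Storage::readStream($file['src']));
        }

        $zip->finish();
        fclose($gcsStream);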

I've tried saving to GCS, but without any luck; it runs out of memory each time. Even if I could save successfully, it runs out of memory during the download, e.g.:

        // Flysystem driver for the GCS disk
        $fs = Storage::getDriver();

        $mimeType = $fs->getMimetype($filePath);
        $stream = $fs->readStream($filePath);

        return response()->streamDownload(function () use ($stream) {
            // drop output buffering so the response isn't held in memory
            while (ob_get_level() > 0) ob_end_flush();

            fpassthru($stream);
        }, $name, [
            'Content-Type' => $mimeType,
            'Content-disposition' => 'attachment; filename="' . $name . '"',
        ]);
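
In case it clarifies what I mean by streaming the download, a chunked variant of the same callback would look roughly like this (just a sketch; the 1 MB chunk size and the application/zip content type are arbitrary choices of mine):

        return response()->streamDownload(function () use ($stream) {
            // read and emit the file in fixed-size chunks instead of fpassthru()
            while (! feof($stream)) {
                echo fread($stream, 1024 * 1024); // 1 MB per chunk
                flush();
            }

            fclose($stream);
        }, $name, [
            'Content-Type' => 'application/zip',
            'Content-disposition' => 'attachment; filename="' . $name . '"',
        ]);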

The solution could be to save the .zip to GCS under a longer (random) name, make it public for a short period of time (for security reasons), and initiate the download directly. The problem is that I haven't managed to do so without reaching memory limits:

        $zip = new ZipArchive();
        $zipName = time().'.zip';
        $downloadFolderName = 'downloads';
        $zipCreatedDirs = [];

        if (! Storage::exists($downloadFolderName)) {
            Storage::makeDirectory($downloadFolderName);
        }

        $zipPath = tempnam(sys_get_temp_dir(), time());

        // ZipArchive::open() returns an error code (not false) on failure
        if ($zip->open($zipPath, ZipArchive::OVERWRITE) !== true) {
            return response()->json([
                'status'  => 'error',
                'message' => 'Failed to create .zip archive.'
            ]);
        }

        // Add files for download one by one to zip
        // and split into different directories grouped by product
        foreach($files as $file) {
            if (! in_array($file['dirName'], $zipCreatedDirs)) {
                $zip->addEmptyDir($file['dirName']);

                $zipCreatedDirs[] = $file['dirName'];
            }

            // Storage::get() loads the whole file into memory before adding it
            $zip->addFromString($file['dirName'].'/'.$file['name'], Storage::get($file['src']));
        }

        $zip->close();

        Storage::putFileAs($downloadFolderName, new File($zipPath), $zipName, 'private');

        return response()->json([
            'status' => 'success',
            'downloadFiles' => [
                [
                    'src' => route('download', [base64_encode($downloadFolderName.'/'.$zipName), base64_encode($zipName)]),
                    'name' => $zipName
                ]
            ]
        ], 200, [], JSON_NUMERIC_CHECK);
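
For the "initiate download directly" part, what I have in mind is handing the browser a short-lived signed URL to the object in GCS instead of proxying it through PHP, roughly like this sketch (the underlying Google Cloud StorageClient and the bucket name are assumptions; the 15-minute expiry is arbitrary):

        // sketch: return a short-lived signed URL so the download goes
        // straight from GCS to the browser, not through PHP
        $storage = new \Google\Cloud\Storage\StorageClient();
        // 'my-bucket' is a placeholder bucket name
        $object = $storage->bucket('my-bucket')->object($downloadFolderName.'/'.$zipName);

        $signedUrl = $object->signedUrl(new \DateTime('+15 minutes'));

        return response()->json([
            'status' => 'success',
            'downloadFiles' => [
                ['src' => $signedUrl, 'name' => $zipName],
            ],
        ]);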

It seems that pretty much none of this works on GCE, while it does on Apache (local WAMP environment). If I set the memory limit locally to 512 MB (the same as on GCE), the script above lets me download a 1 GB .zip without reaching memory limits.

Maybe I'm doing something wrong, or I don't understand how it should be done. Any other thoughts or ideas? Any help would be appreciated.
