Merging uploaded file chunks in PHP results in corrupt files


I am developing a file uploader section in my app. The client side is Vue.js and the backend is PHP, using Laravel as the framework.

I am slicing up the selected files on the client side using Blob.slice() (I have also tried the FileReader API and Resumable.js, and I am now working on my own implementation). The data is sent over XHR (I have tried Axios and XMLHttpRequest), one request per "slice" or "chunk". I fetch the data at the backend and save each incoming piece as "chunk1", "chunk2", and so on. Upon receiving the last chunk, I merge the chunks using PHP.
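Roughly, the receiving end does something like this (a simplified sketch rather than my exact controller: the action signature and the MergeChunks job name are placeholders, but the payload fields match the client code below):

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

// Simplified sketch of the receiving action. `MergeChunks` is a
// placeholder name for the merge job whose handle() is shown below.
public function store(Request $request, string $batchSessionId)
{
    $dir = $batchSessionId.'/'.$request->input('fileName');

    // persist this slice under its 1-based index: chunk1, chunk2, ...
    Storage::disk('s3-upload-queue')->put(
        $dir.'/chunk'.$request->input('chunkNumber'),
        base64_decode($request->input('chunk'))
    );

    // the last slice triggers the merge
    if ((int) $request->input('chunkNumber') === (int) $request->input('totalChunks')) {
        MergeChunks::dispatch($dir); // placeholder job name
    }

    return response()->json(['ok' => true]);
}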

My problem is that the merged file somehow ends up corrupt: MP4s are not playable or not seekable, EXEs are corrupt (some come out fine, but not all; it is unpredictable), and only some small PDFs survive.

Failed Attempts

  1. Send sliced data as multipart/form-data

     - save the chunk with Storage::put() or Storage::putFileAs()
     - save the chunk with fopen(file, 'wb' or 'ab'), fwrite(), fclose()
     - save the chunk with file_put_contents()

  2. Send sliced data base64-encoded

     - save each chunk as received (base64-encoded), then read every chunk with base64_decode() while writing the data to a new file (sketched right after this list)
     - append all chunks as received (base64-encoded) to one file, then later create a new file by decoding this appended file (this attempt was by far the most successful one, but some files still corrupted, especially EXEs)
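To make that first base64 variant concrete, the merge step looked essentially like this (a condensed sketch; $chunkDir and $totalChunks stand in for the real job state):

// Sketch of attempt 2, variant 1: chunks were stored verbatim (still
// base64-encoded) as chunk1..chunkN and are decoded one by one while merging.
function mergeDecodedChunks(string $chunkDir, int $totalChunks, string $mergedFile): void
{
    $out = fopen($mergedFile, 'wb');

    for ($i = 1; $i <= $totalChunks; $i++) {
        // each chunk file holds the raw base64 string sent by the client
        $encoded = file_get_contents($chunkDir.'/chunk'.$i);
        fwrite($out, base64_decode($encoded));
    }

    fclose($out);
}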

Client side code ...

upload(file, start = 0, request = 0) {
    // 3 MB per slice; 1024 * 1024 * 3 is a multiple of 3 bytes, so the
    // base64 string of every slice except the last carries no '=' padding
    let chunkSize = 1024 * 1024 * 3;
    let end = (start + chunkSize) > file.fileObject.size ? file.fileObject.size : (start + chunkSize);
    let reader = new FileReader();
    let slice = file.fileObject.slice(start, end);

    reader.onload = (e) => {
        let data = {
            fileName: file.fileObject.name,
            chunkNumber: request + 1,
            totalChunks: Math.ceil(file.fileObject.size / chunkSize),
            // readAsDataURL yields "data:<mime>;base64,<payload>";
            // keep only the payload after the comma
            chunk: reader.result.split(',')[1]
        };

        axios({
            url: '/api/admin/batch-sessions/' + this.batchSessionId + '/files',
            method: 'POST',
            data: data,
            headers: {'Content-Type': 'application/json'}
        })
        .then(res => {
            start += chunkSize;
            request++;

            // requests are strictly sequential: the next slice is only
            // sent after the previous one has been acknowledged
            if (start <= file.fileObject.size) {
                this.upload(file, start, request);
            }
        })
        .catch(err => {
            console.log(err.message);
        });
    };

    reader.readAsDataURL(slice);
}

Server side code ...

public function handle()
{
    // chunk files uploaded for this directory
    $chunks = Storage::disk('s3-upload-queue')
        ->files($this->directory);

    // final binary target, plus the single file holding all appended base64 chunks
    $mergedFile = Storage::disk('s3-upload-queue')->path($this->directory.'/'.basename($this->directory));
    $base64File = Storage::disk('s3-upload-queue')->path($this->directory.'/'.basename($this->directory).'.b64');

    $mfs = fopen($mergedFile, 'wb');
    $b64fs = fopen($base64File, 'r');

    // decode the whole appended base64 file in one pass and write out the binary
    fwrite($mfs, base64_decode(fread($b64fs, filesize($base64File))));

    fclose($mfs);
    fclose($b64fs);
}
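(For reference, the same decode can also be streamed through PHP's built-in convert.base64-decode filter instead of one big fread(); this is a sketch of an equivalent, not what I currently run:)

// Sketch: identical decode, but streamed, so the appended base64 file
// is never read into memory at once.
$in  = fopen($base64File, 'rb');
$out = fopen($mergedFile, 'wb');

stream_filter_append($in, 'convert.base64-decode', STREAM_FILTER_READ);
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);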

I admit I do not have in-depth knowledge of the different encodings involved. I was reading about base64 chunking here on Stack Overflow, which is why I slice at 1024 * 1024 * 3 bytes (a multiple of three, so the base64 string of every chunk except the last contains no '=' padding). That is when most files started merging successfully over the base64-encoded transfer, but even then it was unpredictable: some files still came out corrupted. I am trying to understand this properly.
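As far as I understand it, this is the boundary behavior in play (PHP's default, non-strict base64_decode() silently skips '=' padding characters, so padding in the middle of an appended stream shifts everything after it):

$data = 'abcdefghij'; // 10 bytes

// split at a 4-byte boundary: the first chunk's base64 ends in '=='
// padding, and the concatenated strings no longer round-trip
$bad = base64_encode(substr($data, 0, 4)).base64_encode(substr($data, 4));
var_dump(base64_decode($bad) === $data);  // false

// split at a 6-byte boundary (a multiple of 3): no padding mid-stream,
// so the concatenation decodes cleanly
$good = base64_encode(substr($data, 0, 6)).base64_encode(substr($data, 6));
var_dump(base64_decode($good) === $data); // true

Please let me know if more info is needed. Thanks.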
