Introduction: In a previous small project I implemented file uploading and used resumable (breakpoint) uploads for large files to reduce the load on the server. Here is a detailed summary of that development experience.

Contents

  • How it works
  • Implementation
  • Putting it into practice

How it works

Let's first go over how file uploading works.

Ordinary upload

Most websites only need ordinary uploads, mostly for things like user profile pictures or images attached to comments, so let's start with how that works.

  • After the user selects a file, JS checks whether the file size exceeds the limit and whether the format is correct.
  • After the checks pass, the request is submitted with Ajax; the server verifies the file a second time and stores it.
  • The back end returns the file address to the front end, which renders it on the page.
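
Condensed into code, these three steps look roughly like this. This is only a minimal sketch (the helper name simpleUpload is hypothetical; the endpoint and limits follow the full implementation later in the article):

// Minimal sketch of an ordinary upload: validate, send as FormData, render the returned URL
async function simpleUpload (file) {
    if (file.type.indexOf('image') == -1) return alert('The file can only be an image!');
    if (file.size / 1000 > 100) return alert('The image is too large!');
    const formData = new FormData();
    formData.append('file', file);
    const res = await axios.post('http://localhost:3000/upload', formData);
    if (res.data.code == 200) document.querySelector('#links').href = res.data.data.url; // render the address
}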

Large file upload

  • After the user selects a file, JS checks whether the file size exceeds the limit and whether the format is correct.
  • Depending on the file size, use the file.slice method to split the file into chunks;
  • Use SparkMD5 and the FileReader API to generate the file's unique MD5 value;
  • Submit a request with Ajax; the server receives the file's MD5 value and reports what it already has:

    • If a folder named after the MD5 value exists, it returns how many chunks have already been uploaded into it;
    • If it does not exist, it creates a new folder named after the MD5 value and returns an empty list.
  • After receiving the response, the front end makes a judgment based on it:

    • If the number of returned chunks equals the total number of slices, it requests the file merge directly;
    • If the number of returned chunks is less than the total number of slices, it uploads the missing slices and requests the merge after the last slice has been uploaded.
  • When the back end receives the merge request, it merges the files in the folder with the corresponding MD5 value and returns the file address.
  • The front end receives the file address and renders the page data.

Resumable upload (breakpoint continuation)

In case of force majeure, such as a network outage, a server crash, or some other reason, the upload may be interrupted partway through.

The next time the same file is uploaded, the server uses its MD5 value to work out which chunks have already been uploaded and which have not, and sends that information to the client. The client then uploads only the chunks that are missing, and once the last one arrives the server merges the files and returns the address.

In this way the same file is never uploaded twice, which would waste server space; it saves server resources, and it is faster and more efficient than uploading one large file in a single request.
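
To make the resume step concrete: on the server it boils down to listing a folder named after the file's MD5 value. Here is a minimal sketch under the folder layout used later in the article (static/big/<md5>, chunk files named by their index); the helper name uploadedChunks is hypothetical:

const fs = require('fs');
const path = require('path');

// Which chunks of this file have already been uploaded?
function uploadedChunks (bigDir, md5Val) {
    const chunkDir = path.join(bigDir, md5Val);
    if (!fs.existsSync(chunkDir)) {
        fs.mkdirSync(chunkDir);      // first time this file is seen: start an empty folder
        return [];
    }
    return fs.readdirSync(chunkDir); // everything listed here can be skipped on re-upload
}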

Implementation

Next, let's implement each function in code, following the steps analyzed above.

Ordinary file

This section covers uploading normal files, both front-end and back-end.

The front-end part

  • HTML part

Let’s build a little house first

<div class=" Upload "class=" Upload" class=" Upload "class=" Upload" class=" Upload "class=" Upload" class=" Upload "class=" Upload" class=" Upload "class=" Upload" class=" Upload "</ div > Name ="file" id="file" accept="image/*" bb0 </div> > <div class=" <p style="width: 0;" Id ="current"></span> </div div> <div class=" link" href="javascript:void();" Target = "_blank" > file link < / a > < / div > < / form > < / div > < div class = "upload" > < h3 > large file upload < / h3 > < form > < div class = "upload the file - >" </div> > <input type="file" name="file" id="big-file" accept="application/*"> </div> <div Class ="upload-progress" <p style="width: 0; width: 0; Id ="big-current"></span> </div div> <div class="upload-link"> </div> < a id = "big - links" href = "" target =" _blank "> file link < / a > < / div > < / form > < / div >

Two libraries, Axios and SparkMD5, are included via CDN.

<script src="https://cdn.bootcdn.net/ajax/libs/axios/0.21.1/axios.min.js"></script>
<script src="https://cdn.bootcdn.net/ajax/libs/spark-md5/3.0.0/spark-md5.min.js"></script>
  • The CSS part

Come and decorate the house.

body { margin: 0; font-size: 16px; background: #f8f8f8; }
h1, h2, h3, h4, h5, h6, p { margin: 0; }
/* * { outline: 1px solid pink; } */
.upload { box-sizing: border-box; margin: 30px auto; padding: 15px 20px; width: 500px; height: auto; border-radius: 15px; background: #fff; }
.upload h3 { font-size: 20px; line-height: 2; text-align: center; }
.upload .upload-file { position: relative; margin: 30px auto; }
.upload .upload-file label { display: flex; justify-content: center; align-items: center; width: 100%; height: 150px; border: 1px dashed #ccc; }
.upload .upload-file input { position: absolute; top: 0; left: 0; width: 100%; height: 100%; opacity: 0; }
.upload-progress { display: flex; align-items: center; }
.upload-progress p { position: relative; display: inline-block; flex: 1; height: 15px; border-radius: 10px; background: #ccc; overflow: hidden; }
.upload-progress p span { position: absolute; left: 0; top: 0; width: 0; height: 100%; background: linear-gradient(to right bottom, rgb(163, 76, 76), rgb(231, 73, 52)); transition: all .4s; }
.upload-link { margin: 30px auto; }
.upload-link a { text-decoration: none; color: rgb(6, 102, 192); }
@media all and (max-width: 768px) { .upload { width: 300px; } }
  • The JS part

Finally, add an interactive effect.

// Get the elements
const file = document.querySelector('#file');
let current = document.querySelector('#current');
let links = document.querySelector('#links');
let baseUrl = 'http://localhost:3000';

// Listen for file selection
file.addEventListener('change', (e) => {
    console.log(e.target.files);
    let file = e.target.files[0];
    if (file.type.indexOf('image') == -1) {
        return alert('The file can only be an image!');
    }
    if ((file.size / 1000) > 100) {
        return alert('The image cannot be larger than 100KB!');
    }
    links.href = '';
    e.target.value = '';   // reset the input so the same file can be selected again
    upload(file);
}, false);

// Upload the file
async function upload (file) {
    let formData = new FormData();
    formData.append('file', file);
    let data = await axios({
        url: baseUrl + '/upload',
        method: 'post',
        data: formData,
        onUploadProgress: function (progressEvent) {
            current.style.width = Math.round(progressEvent.loaded / progressEvent.total * 100) + '%';
        }
    });
    if (data.data.code == 200) {
        links.href = data.data.data.url;
    } else {
        alert('Upload failed!');
    }
}

The back-end part

Open the demo folder, install the file-processing package formidable (npm install formidable), and then start handling the uploaded files.

Create a new upload folder, and don't forget to require the new file in app.js.

const upload = require('./upload/index');

app.use(express.static(path.join(__dirname, 'public')));
app.use('/file', express.static(path.join(__dirname, 'static')));

app.use('/upload', upload);

Below is the file hierarchy diagram.

-- static
    -- big
    -- doc
    -- temp
-- upload
    -- index.js
    -- util.js
-- app.js
const express = require('express');
const Router = express.Router();
const formidable = require('formidable');
const path = require('path');
const fs = require('fs');

const baseUrl = 'http://localhost:3000/file/doc/';
const dirPath = path.join(__dirname, '../static/');

// Ordinary file upload
Router.post('/', (req, res) => {
    let form = formidable({
        multiples: true,
        uploadDir: dirPath + 'temp/'
    });
    form.parse(req, (err, fields, files) => {
        if (err) {
            return res.json(err);
        }
        let newPath = dirPath + 'doc/' + files.file.name;
        fs.rename(files.file.path, newPath, function (err) {
            if (err) {
                return res.json(err);
            }
            return res.json({
                code: 200,
                msg: 'get_succ',
                data: {
                    url: baseUrl + files.file.name
                }
            });
        });
    });
});

module.exports = Router;

Large file

Resumable large-file upload is really an extension of the ordinary file upload above, so the front-end structure and styles are the same; only the methods differ.

The front-end part

Here we mainly walk through the methods.

  • Get the elements
const bigFile = document.querySelector('#big-file');
let bigCurrent = document.querySelector('#big-current');
let bigLinks = document.querySelector('#big-links');
let fileArr = [];
let md5Val = '';
let ext = '';
  • Check the file
bigFile.addEventListener('change', (e) => {
    let file = e.target.files[0];
    if (file.type.indexOf('application') == -1) {
        return alert('The file format can only be a document!');
    }
    if ((file.size / (1000 * 1000)) > 100) {
        return alert('The file cannot be larger than 100MB!');
    }
    uploadBig(file);
}, false);
  • Slice the file
function sliceFile (file) {
    const files = [];
    const chunkSize = 128 * 1024;   // 128KB per chunk
    for (let i = 0; i < file.size; i += chunkSize) {
        const end = i + chunkSize >= file.size ? file.size : i + chunkSize;
        files.push(file.slice(i, end));
    }
    return files;
}
  • Gets the MD5 value of the file
function md5File (files) {
    return new Promise((resolve) => {
        const spark = new SparkMD5.ArrayBuffer();
        const fileReader = new FileReader();
        let currentIndex = 0;
        fileReader.onload = function (e) {
            spark.append(e.target.result);                         // add this chunk to the hash
            currentIndex++;
            if (currentIndex < files.length) {
                fileReader.readAsArrayBuffer(files[currentIndex]); // read the next chunk
            } else {
                resolve(spark.end());                              // all chunks hashed: return the MD5
            }
        };
        fileReader.readAsArrayBuffer(files[currentIndex]);
    });
}
  • Upload the file slices
async function uploadSlice (chunkIndex = 0) {
    let formData = new FormData();
    formData.append('file', fileArr[chunkIndex]);
    let data = await axios({
        url: `${baseUrl}/upload/big?type=upload&current=${chunkIndex}&md5Val=${md5Val}&total=${fileArr.length}`,
        method: 'post',
        data: formData,
    })

    if (data.data.code == 200) {
        if (chunkIndex < fileArr.length -1 ){
            bigCurrent.style.width = Math.round((chunkIndex+1) / fileArr.length * 100) + '%';
            ++chunkIndex;
            uploadSlice(chunkIndex);
        } else {
            mergeFile();
        }
    }
}
  • Merge files
async function mergeFile () {
    let data = await axios.post(`${baseUrl}/upload/big?type=merge&md5Val=${md5Val}&total=${fileArr.length}&ext=${ext}`);
    if (data.data.code == 200) {
        alert('Upload succeeded!');
        bigCurrent.style.width = '100%';
        bigLinks.href = data.data.data.url;
    } else {
        alert(data.data.data.info);
    }
}
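
The change handler above calls uploadBig, which ties these methods together but is not shown in full in this article. Below is a minimal sketch of what it could look like, assuming the type=check endpoint described in the back-end section and the fileArr, md5Val and ext variables declared earlier:

// A possible uploadBig: slice, hash, ask the server what it already has, then upload or merge
async function uploadBig (file) {
    ext = file.name.split('.').pop();        // keep the extension for the merge request
    fileArr = sliceFile(file);               // 1. slice the file into chunks
    md5Val = await md5File(fileArr);         // 2. compute the file's MD5 value
    // 3. ask the server which chunks already exist for this MD5
    let res = await axios.post(`${baseUrl}/upload/big?type=check&md5Val=${md5Val}&total=${fileArr.length}`);
    if (res.data.code == 200) {
        let uploaded = res.data.data.data.chunk.length;
        if (uploaded >= fileArr.length) {
            mergeFile();                     // every chunk is already there: just merge
        } else {
            uploadSlice(uploaded);           // 4. continue from the first missing chunk
        }
    }
}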

The back-end part

  • Get the parameters

Get the upload parameters and validate them.

let type = req.query.type;
let md5Val = req.query.md5Val;
let total = req.query.total;
let bigDir = dirPath + 'big/';
let typeArr = ['check', 'upload', 'merge'];
if (!type) {
    return res.json({ code: 101, msg: 'get_fail', data: { info: 'The upload type cannot be empty!' } });
}
if (!md5Val) {
    return res.json({ code: 101, msg: 'get_fail', data: { info: 'The file MD5 value cannot be empty!' } });
}
if (!total) {
    return res.json({ code: 101, msg: 'get_fail', data: { info: 'The number of file slices cannot be empty!' } });
}
if (!typeArr.includes(type)) {
    return res.json({ code: 101, msg: 'get_fail', data: { info: 'Wrong upload type!' } });
}
  • Type: check
let filePath = `${bigDir}${md5Val}`;
fs.readdir(filePath, (err, data) => {
    if (err) {
        // The folder does not exist yet: create it and return an empty chunk list
        fs.mkdir(filePath, (err) => {
            if (err) {
                return res.json({ code: 101, msg: 'get_fail', data: { info: 'Failed to create the folder!', err } });
            } else {
                return res.json({
                    code: 200,
                    msg: 'get_succ',
                    data: { info: 'Got success!', data: { type: 'write', chunk: [], total: 0 } }
                });
            }
        });
    } else {
        // The folder exists: return the chunks that have already been uploaded
        return res.json({
            code: 200,
            msg: 'get_succ',
            data: { info: 'Got success!', data: { type: 'read', chunk: data, total: data.length } }
        });
    }
});
  • Type: upload
let current = req.query.current;
if (!current) {
    return res.json({ code: 101, msg: 'get_fail', data: { info: 'The current slice index cannot be empty!' } });
}
let form = formidable({
    multiples: true,
    uploadDir: `${dirPath}big/${md5Val}/`,
});
form.parse(req, (err, fields, files) => {
    if (err) {
        return res.json(err);
    }
    let newPath = `${dirPath}big/${md5Val}/${current}`;
    fs.rename(files.file.path, newPath, function (err) {
        if (err) {
            return res.json(err);
        }
        return res.json({ code: 200, msg: 'get_succ', data: { info: 'Upload success!' } });
    });
});
  • Type: merge
let ext = req.query.ext;
if (!ext) {
    return res.json({
        code: 101,
        msg: 'get_fail',
        data: {
            info: 'The file extension cannot be empty!'
        }
    })
}

let oldPath = `${dirPath}big/${md5Val}`;
let newPath = `${dirPath}doc/${md5Val}.${ext}`;
// mergeFile rejects on failure, so catch the error and keep the result shape consistent
let data = await mergeFile(oldPath, newPath).catch(err => err);
if (data.code == 200) {
    return res.json({
        code: 200,
        msg: 'get_succ',
        data: {
            info: 'File merged successfully!',
            url: `${baseUrl}${md5Val}.${ext}`
        }
    })
} else {
    return res.json({
        code: 101,
        msg: 'get_fail',
        data: {
            info: 'File merge failed!',
            err: data.data.error
        }
    })
}
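
The three branches above are shown separately; how they are wired together is not spelled out, so here is a minimal sketch (an assumption, not the article's exact code) of what the /big route in upload/index.js could look like:

// A possible skeleton for the large-file route; the branch bodies are the snippets above
Router.post('/big', async (req, res) => {
    // ...parameter checks from "Get the parameters" go here...
    let type = req.query.type;
    if (type == 'check') {
        // read or create the folder named after the MD5 value
    } else if (type == 'upload') {
        // save the current slice with formidable
    } else if (type == 'merge') {
        // merge the slices and return the file URL
    }
});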

The fs.createWriteStream and fs.createReadStream methods are used to implement the merge.

  • Merge files
const fs = require('fs');

function mergeFile (filePath, newPath) {
    return new Promise((resolve, reject) => {
        let files = fs.readdirSync(filePath),
        newFile = fs.createWriteStream(newPath);
        let filesArr = arrSort(files).reverse();
        main();
        function main (index = 0) {
            let currentFile = filePath + '/'+filesArr[index];
            let stream = fs.createReadStream(currentFile);
            stream.pipe(newFile, {end: false});
            stream.on('end', function () {
                if (index < filesArr.length - 1) {
                    index++;
                    main(index);
                } else {
                    newFile.end();   // close the write stream once every chunk has been piped
                    resolve({code: 200});
                }
            })
            stream.on('error', function (error) {  
                reject({code: 102, data:{error}})
            })
        }
    })
}
  • File sorting
function arrSort (arr) {
    for (let i = 0; i < arr.length; i++) {
        for (let j = 0; j < arr.length; j++) {
            if (Number(arr[i]) >= Number(arr[j])) {
                let t = arr[i];
                arr[i] = arr[j];
                arr[j] = t;
            }
        }
    }
    return arr;
}
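
A quick note on this design choice: the chunk file names are numeric strings, so the default Array.prototype.sort, which compares lexicographically, would place '10' before '2'; arrSort compares with Number and, combined with reverse(), yields ascending numeric order. For example:

// Lexicographic vs. numeric ordering of chunk names
['2', '10', '0', '1'].sort();               // ['0', '1', '10', '2']  -> '10' lands before '2'
arrSort(['2', '10', '0', '1']).reverse();   // ['0', '1', '2', '10']  -> numeric, ascending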

Putting it into practice

Now that the methods are written, let's test that everything works.

Two files are prepared to test the two functions separately.

  • Ordinary file

This is the ordinary file upload interface:

After uploading successfully:

Back end return content:

Open file address preview:

You can see it succeeded!

  • Large file

This is the large file upload interface:

After uploading successfully:

Here is a slice file being uploaded:

This is the content returned by the merge request after all the slices have been uploaded:

Open file address preview:

Uploading the same file again returns the file address almost immediately:

This is a screenshot of the Node.js directory. You can see that the file slices are preserved and merged correctly.



That's all for file uploading and resumable upload. Of course, the methods above are just one reference; if you have a better approach, please contact me.