This article has been updated; please see: juejin.cn/post/685003…

For now, I’ll keep this one. This is the initial version for your convenience.

I've been freeloading on Juejin for a long time, and figured I should give something back, so that I can keep freeloading 😁

Background

I read about implementing resumable uploads (breakpoint continuation) on Juejin a long time ago, but never built it myself, so I went back and implemented it from start to finish.

Technology stack: Vue + Element UI + localStorage + Web Worker

Message: I am not a giant, but I am lucky to stand on the shoulders of giants.

Please read the following blog posts first:

juejin.cn/post/684490… juejin.cn/post/684490…

Conclusion

In my opinion, the main difficulty of this project lies in mapping the status and progress of each file to the corresponding interface display when handling multiple file uploads. I took a lot of detours here.

The retry mechanism had me stuck for quite a while. I really couldn't figure it out, until I stepped away for a break, and then it came out quickly.

Note: the code below is pseudocode and is not recommended for direct use in a project. The official code, which I still need to clean up, will be published this week. To pick up some followers, I'm publishing the article first, ha ha.

Approach

  • File upload logic: traverse the selected files → create slices → compute the hash → call the file-verification API to check whether the file already exists on the server → if it does, finish (instant upload) → if not, upload the slices → call the merge API → finish
  • Resumable upload logic: there are generally two approaches. One is to store the uploaded-slice records on the server; the advantage is that the user can switch browsers and still resume the transfer. The other is browser-side storage: it has no particular advantage other than being easier to implement, and it has quite a few drawbacks. Unfortunately, this project stores the slice records on the browser side. Ha ha ha
    • Store the indexes of uploaded slices in localStorage, keyed by the file hash. → Before each upload, check whether a slice exists in the cache; if it does, skip it.

For the detailed reasoning you can refer to the two articles above; I won't repeat it here.
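The code in this article calls three localStorage helpers (`getObjArr`, `saveObjArr`, `clearLocalStorage`) without defining them. Here is one possible implementation — a minimal sketch assuming the cache is a JSON-serialized array of slice indexes keyed by the file hash; the in-memory shim is only there so the snippet runs outside a browser:

```javascript
// Hypothetical implementations of the storage helpers used in this article.
// In the browser these would sit directly on window.localStorage; the
// in-memory shim below just lets the sketch run in any environment.
const storage =
  typeof localStorage !== 'undefined'
    ? localStorage
    : (() => {
        const m = new Map();
        return {
          getItem: (k) => (m.has(k) ? m.get(k) : null),
          setItem: (k, v) => m.set(k, String(v)),
          removeItem: (k) => m.delete(k)
        };
      })();

// Read the cached slice indexes for a file hash (null if none).
function getObjArr(key) {
  const raw = storage.getItem(key);
  return raw ? JSON.parse(raw) : null;
}

// Persist the slice-index array for a file hash.
function saveObjArr(key, arr) {
  storage.setItem(key, JSON.stringify(arr));
}

// Drop the cache once the merge succeeds.
function clearLocalStorage(key) {
  storage.removeItem(key);
}
```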

The front-end part

```html
<!-- Simplified template -->
<input type="file" multiple @change="handleFileChange" />
<el-button @click="handleUpload">Upload</el-button>
<el-button @click="handleResume">Resume</el-button>
<el-button @click="handlePause">Pause</el-button>
```

```js
const SIZE = 50 * 1024 * 1024; // slice size
var fileIndex = 0; // index of the file currently being traversed

export default {
  name: 'SimpleUploaderContainer',
  data: () => ({
    container: {
      files: null,
      worker: null,
      hashArr: [], // stores the computed hashes
      hashData: []
    },
    tempFilesArr: [], // stores per-file display info
    uploadMax: 3, // max concurrent requests
    cancels: [] // stores a cancel function for each request
  })
};
```
  • Implement the handleFileChange method. Because the FileList object is read-only, you need to copy its data into a writable list.
```js
handleFileChange(e) {
  const files = e.target.files;
  console.log('handleFileChange -> files', files);
  if (!files) return;
  Object.assign(this.$data, this.$options.data()); // reset all data
  fileIndex = 0; // reset the index
  this.container.files = files;
  // Copy the FileList into an array of writable objects
  for (const key in this.container.files) {
    if (this.container.files.hasOwnProperty(key)) {
      const file = this.container.files[key];
      var obj = { statusStr: 'Uploading', chunkList: [], uploadProgress: 0, hashProgress: 0 };
      for (const k in file) {
        obj[k] = file[k];
      }
      this.tempFilesArr.push(obj);
    }
  }
}
```
  • File upload

    Create slices → compute the hash → check whether the file can be uploaded instantly → upload the slices → store the indexes of uploaded slices. The hash calculation is offloaded to a Web Worker.

```js
async handleUpload() {
  if (!this.container.files) return;
  const filesArr = this.container.files;
  var tempFilesArr = this.tempFilesArr;

  console.log('handleUpload -> filesArr', filesArr);
  for (let i = 0; i < filesArr.length; i++) {
    fileIndex = i;
    const fileChunkList = this.createFileChunk(filesArr[i]);
    // Compute the hash, then check whether the file can be uploaded instantly
    const hash = await this.calculateHash(fileChunkList, filesArr[i].name);
    console.log('handleUpload -> hash', hash);
    const verifyRes = await this.verifyUpload(filesArr[i].name, hash);
    if (!verifyRes.data.presence) {
      this.$message('Uploaded instantly');
      tempFilesArr[i].statusStr = 'Uploaded instantly';
      tempFilesArr[i].uploadProgress = 100;
    } else {
      console.log('Start uploading file ----', filesArr[i].name);
      const getChunkStorage = this.getChunkStorage(hash);
      tempFilesArr[i].fileHash = hash; // file hash
      tempFilesArr[i].chunkList = fileChunkList.map(({ file }, index) => ({
        fileHash: hash,
        fileName: filesArr[i].name,
        index,
        hash: hash + '-' + index,
        chunk: file,
        size: file.size,
        uploaded: getChunkStorage && getChunkStorage.includes(index), // whether this slice is already uploaded
        progress: getChunkStorage && getChunkStorage.includes(index) ? 100 : 0,
        status: getChunkStorage && getChunkStorage.includes(index) ? 'success' : 'wait' // upload status, used for the progress display
      }));
      console.log('handleUpload -> this.chunkData', tempFilesArr[i]);
      await this.uploadChunks(this.tempFilesArr[i]);
    }
  }
},

// Create the file slices
createFileChunk(file, size = SIZE) {
  const fileChunkList = [];
  var count = 0;
  while (count < file.size) {
    fileChunkList.push({
      file: file.slice(count, count + size)
    });
    count += size;
  }
  return fileChunkList;
},

// Save an uploaded slice index
addChunkStorage(name, index) {
  const data = [index];
  const arr = getObjArr(name);
  if (arr) {
    saveObjArr(name, [...arr, ...data]);
  } else {
    saveObjArr(name, data);
  }
},

// Get the uploaded slice indexes
getChunkStorage(name) {
  return getObjArr(name);
},

// Generate the file hash (web worker)
calculateHash(fileChunkList, name) {
  return new Promise((resolve) => {
    this.container.worker = new Worker('./hash/md5.js');
    this.container.worker.postMessage({ fileChunkList });
    this.container.worker.onmessage = (e) => {
      const { percentage, hash } = e.data;
      // Also record each file's hashing progress
      this.tempFilesArr[fileIndex].hashProgress = percentage;
      if (hash) {
        resolve(hash);
      }
    };
  });
}
```
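The slicing loop in `createFileChunk` can be checked in isolation. Below is the same while-loop with a `Buffer` standing in for a `File` (so `subarray` replaces `File.slice`); the `sliceBuffer` name is mine, for illustration only:

```javascript
// Same while-loop as createFileChunk, but over a Buffer so it runs anywhere.
function sliceBuffer(buf, size) {
  const fileChunkList = [];
  let count = 0;
  while (count < buf.length) {
    fileChunkList.push({ file: buf.subarray(count, count + size) });
    count += size;
  }
  return fileChunkList;
}
```

Note the last chunk is simply whatever remains, which both `File.slice` and `subarray` handle by clamping the end offset.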

uploadChunks — a method I puzzled over for a whole day

```js
// Upload the slices to the server
async uploadChunks(data) {
  var chunkData = data.chunkList;
  const requestDataList = chunkData
    .filter(({ uploaded }) => !uploaded)
    .map(({ fileHash, chunk, fileName, index }) => {
      const formData = new FormData();
      formData.append('md5', fileHash);
      formData.append('file', chunk);
      formData.append('fileName', index);
      return { formData, index, fileName };
    });

  try {
    const ret = await this.sendRequest(requestDataList, chunkData);
    console.log('uploadChunks -> chunkData', chunkData);
    console.log('ret', ret);
    data.statusStr = 'Upload successful';
  } catch (error) {
    // The upload failed
    data.statusStr = 'Upload failed, please try again';
    this.$message.error('Upload failed, consider trying again.');
    return;
  }
  // Check whether any slice failed to upload
  const isUpload = chunkData.some((item) => item.uploaded === false);
  console.log('created -> isUpload', isUpload);
  if (isUpload) {
    alert('Some slices failed to upload');
  } else {
    // Perform the merge
    await this.mergeRequest(data);
  }
}
```

sendRequest: concurrent slice upload + retry mechanism

The retry mechanism also follows the blog posts referenced above:

  • When a request errors, push the failed task back onto the queue
  • An array stores the retry count for each slice. For example, [1, 0, 2] means slice 0 has failed once and slice 2 has failed twice. If a slice fails more than 3 times, uploading stops and an error is thrown
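The retryArr bookkeeping described above can be isolated into a tiny helper. This is a sketch of the idea only; the names `makeRetryTracker` and `fail` are mine, not the article's:

```javascript
// Count failures per slice index; report whether a failed slice may be retried.
function makeRetryTracker(limit = 3) {
  const retryArr = []; // e.g. [1, 0, 2]: slice 0 failed once, slice 2 twice
  return {
    // Record a failure for slice `index`; returns true while retries remain.
    fail(index) {
      if (typeof retryArr[index] !== 'number') retryArr[index] = 0;
      retryArr[index]++;
      return retryArr[index] < limit;
    },
    counts: () => retryArr.slice()
  };
}
```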

Concurrency: a for loop starts `uploadMax` handlers, and each handler recursively calls itself inside the function when its request finishes.
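Stripped of the upload details, the loop-plus-recursion pattern looks like the following generic sketch (my own reduction, assuming each task is a function returning a promise):

```javascript
// Start `max` workers; each worker shifts the next task off the queue and
// recursively calls itself when its task settles. Resolves once all finish.
function runPool(tasks, max) {
  const queue = [...tasks];
  let finished = 0;
  const total = queue.length;
  return new Promise((resolve, reject) => {
    if (total === 0) return resolve('done');
    const handler = () => {
      if (finished >= total) return resolve('done');
      const task = queue.shift();
      if (!task) return; // queue drained; in-flight workers will resolve
      task()
        .then(() => {
          finished++;
          handler(); // pick up the next task (or resolve)
        })
        .catch(reject);
    };
    for (let i = 0; i < Math.min(max, total); i++) handler();
  });
}
```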

```js
// Concurrency control + retry
sendRequest(forms, chunkData) {
  console.log('sendRequest -> forms', forms);
  console.log('sendRequest -> chunkData', chunkData);
  var finished = 0;
  const total = forms.length;
  const that = this;
  const retryArr = []; // retry count per slice, e.g. [1, 0, 2]: slice 0 failed once, slice 2 failed twice
  return new Promise((resolve, reject) => {
    const handler = () => {
      console.log('handler -> forms', forms);
      if (forms.length) {
        // Pop a task off the queue
        const formInfo = forms.shift();
        const formData = formInfo.formData;
        const index = formInfo.index;
        instance
          .post('fileChunk', formData, {
            onUploadProgress: that.createProgresshandler(chunkData[index]),
            cancelToken: new CancelToken((c) => this.cancels.push(c)),
            timeout: 0
          })
          .then((res) => {
            console.log('handler -> res', res);
            chunkData[index].uploaded = true;
            chunkData[index].status = 'success';
            // Store the uploaded slice index
            this.addChunkStorage(chunkData[index].fileHash, index);
            finished++;
            handler();
          })
          .catch((e) => {
            console.warn('Error occurred', e);
            console.log('handler -> retryArr', retryArr);
            if (typeof retryArr[index] !== 'number') {
              retryArr[index] = 0;
            }
            // Update the status
            chunkData[index].status = 'warning';
            retryArr[index]++;
            // Give up after 3 retries
            if (retryArr[index] >= 3) {
              console.warn('Retry failed --> handler -> retryArr', retryArr, chunkData[index].hash);
              return reject('Retry failed', retryArr);
            }

            console.log('handler -> retryArr[index]', `${chunkData[index].hash} retry #${retryArr[index]}`);
            console.log(retryArr);
            this.uploadMax++; // release a concurrency slot for the retry
            forms.push(formInfo); // push the failed task back onto the queue
            handler();
          });
      }
      console.log('handler -> total', total);
      console.log('handler -> finished', finished);

      if (finished >= total) {
        resolve('done');
      }
    };
    // Control the concurrency
    for (let i = 0; i < this.uploadMax; i++) {
      handler();
    }
  });
}
```

Progress handling

```js
// Per-slice progress
createProgresshandler(item) {
  return (p) => {
    item.progress = parseInt(String((p.loaded / p.total) * 100));
    this.fileProgress();
  };
},

// Total progress of the current file
fileProgress() {
  // Locate the file currently being uploaded via the global fileIndex variable
  const currentFile = this.tempFilesArr[fileIndex];
  const uploadProgress = currentFile.chunkList
    .map((item) => item.size * item.progress)
    .reduce((acc, cur) => acc + cur);
  const currentFileProgress = parseInt((uploadProgress / currentFile.size).toFixed(2));
  currentFile.uploadProgress = currentFileProgress;
}
```
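The weighted-average math in fileProgress can be illustrated standalone (the `totalProgress` name is mine): each slice contributes `size × progress`, and dividing the sum by the total file size gives the overall percentage.

```javascript
// Overall progress = sum(sliceSize * sliceProgress) / fileSize.
function totalProgress(chunkList, fileSize) {
  const weighted = chunkList
    .map((item) => item.size * item.progress)
    .reduce((acc, cur) => acc + cur, 0);
  return parseInt((weighted / fileSize).toFixed(2));
}
```

For example, a 100-byte file with one 50-byte slice at 100% and another at 50% works out to (50·100 + 50·50) / 100 = 75%.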

mergeRequest: merging the slices

```js
mergeRequest(data) {
  const obj = {
    md5: data.fileHash,
    fileName: data.name,
    fileChunkNum: data.chunkList.length
  };

  instance.post('fileChunk/merge', obj, { timeout: 0 }).then((res) => {
    // Clear the cached slice indexes
    clearLocalStorage(data.fileHash);
    this.$message.success('Upload successful');
  });
}
```

handlePause: pausing the upload

This project uses axios. The key to pausing is cancelling the in-flight requests; axios provides a cancellation mechanism, which we only need to handle briefly.
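The pause pattern can be shown independently of axios: collect one cancel callback per in-flight request and drain the array to cancel them all. A sketch with my own names (`makeCanceller`, `register`, `pauseAll`); in the real code the callbacks come from axios's `CancelToken` executor:

```javascript
// Collect cancel callbacks as requests start; pausing drains and invokes them.
function makeCanceller() {
  const cancels = [];
  return {
    register(onCancel) {
      cancels.push(onCancel);
    },
    pauseAll(reason) {
      while (cancels.length > 0) {
        cancels.pop()(reason);
      }
    },
    pending: () => cancels.length
  };
}
```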

```js
handlePause() {
  // Call the cancel function of every request currently in flight
  while (this.cancels.length > 0) {
    this.cancels.pop()('Cancel request');
  }
}
```