In daily work we often need to process and export data in various formats to meet operational requirements, and Excel is where many people prefer to work with the exported data.

An export usually involves some processing of the data first. In the past, that meant running scripts on a separate server.

With MinCloud (知晓云), you no longer have to maintain a server: you just write the code and leave the rest to cloud functions. In this article we will implement a cloud function that exports a data table to an Excel file, and package and upload the code to MinCloud using webpack and the mincloud CLI.

Technology stack: Node.js, node-xlsx, webpack, and the mincloud CLI.

1. Project construction

Project file structure:

export-excel-file
├── index.js           // bundled output of the build
├── src
│   └── index.js       // cloud function source entry
├── package.json
└── webpack.config.js

The project setup is basically the same as in the cloud function code packaging sample document. Once the project is set up, install the following dependencies (choose either of the two installation methods):

npm install --save node-xlsx mincloud
// or
yarn add node-xlsx mincloud
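For reference, the webpack configuration can stay very small. The sketch below is a minimal assumption based on the layout above (entry in src/, bundled index.js at the project root); the packaging sample document remains authoritative:

// webpack.config.js — a minimal sketch, assuming the project layout above
const path = require('path')

module.exports = {
  entry: './src/index.js',
  target: 'node',                  // the function runs in Node.js, so skip browser shims
  output: {
    path: path.resolve(__dirname), // emit the bundled index.js at the project root
    filename: 'index.js',
    libraryTarget: 'commonjs2',    // preserve module.exports for the cloud runtime
  },
}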

Modify the deploy script as follows:

// package.json
...
"scripts": {
   "build": "webpack --mode production",
   "predeploy": "npm run build",
   "deploy": "mincloud deploy export-excel-file ../"
},
...
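Note that npm runs the predeploy hook automatically before deploy, so a single npm run deploy first builds the bundle with webpack and then pushes it to MinCloud with the mincloud CLI.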

Finally, we will deploy and test with the following two commands:

npm run deploy                      // deploy the cloud function to MinCloud
mincloud invoke export-excel-file   // test the cloud function that has been deployed

2. Export the data table to an Excel file

We need to prepare two tables:

  - order: the order table (custom fields: name, price)
  - export_task: the export task record table (custom field: file_download_link)

MinCloud cloud functions can be invoked synchronously or asynchronously. The maximum timeout for a synchronous call is 5 s, while an asynchronous call may run for up to 300 s.
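For reference, invoking the function from a WeChat mini-program might look like the sketch below; the invokeFunction(name, params, sync) signature is an assumption here, so check the MinCloud SDK docs for the authoritative form:

// Calling the cloud function from a mini-program (signature assumed; verify in the docs).
// Passing false as the third argument requests an asynchronous call, so the
// response returns once the task is queued, not when the export finishes.
wx.BaaS.invokeFunction('export-excel-file', {}, false)
  .then(res => {
    console.log(res)
  })
  .catch(err => {
    console.log(err)
  })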

Assume the order table holds 100,000 records. MinCloud allows at most 1,000 records per query, so the data must be fetched in batches (here, 100 queries). Moreover, the subsequent data processing can push the total run time past 5 seconds, so the cloud function has to be invoked asynchronously. This is where the export_task table comes in: it records and manages export tasks.

The export_task table manages an export task as follows (see the skeleton after this list):

  1. When the cloud function is called, create record A in the export_task table with file_download_link set to null, and keep the id of record A; call it jobId.
  2. Query the order table, generate the Excel file, upload it, and obtain the file download link.
  3. Use jobId to update the record created in step 1, saving the download link into its file_download_link field.
  4. Once the record is updated, the download link can be read from the export_task table.
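Steps 1 and 3 boil down to one record creation and one update with the BaaS table API. A minimal skeleton (the table id is a placeholder; the full listing below wraps these in createExportJobIdRecord and updateExportJobIdRecord):

// Step 1: create record A with an empty file_download_link and return its id (jobId).
function createTask(tableId) {
  let Task = new BaaS.TableObject(tableId)
  let record = Task.create()
  return record.set({ file_download_link: '' }).save().then(res => res.data.id)
}

// Step 3: write the download link back onto record A via jobId.
function finishTask(tableId, jobId, fileLink) {
  let Task = new BaaS.TableObject(tableId)
  let record = Task.getWithoutData(jobId)
  record.set('file_download_link', fileLink)
  return record.update()
}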

With this preparation and analysis done, exporting the Excel file breaks down into four steps:

  1. Fetch the data from the order table
  2. Build an Excel file from the fetched data inside the cloud function environment
  3. Upload the Excel file to MinCloud
  4. Save the file download link into the file_download_link field of the export_task table

The complete code is as follows:

const fs = require('fs')
const xlsx = require('node-xlsx')

const EXPORT_DATA_CATEGORY_ID = '5c711e3119111409cdabe6f2' // file category for the exported files
const TABLE_ID = {
  order: 66666,   // order table id (replace with your own)
  export_task: 0, // export_task table id (replace with your own)
}
// Temporary file path. Cloud functions may only write to /tmp, see
// https://doc.minapp.com/support/technical-notes.html (temporary file storage in cloud functions)
const TMP_FILE_NAME = '/tmp/result.xlsx'
const ROW_NAME = ['name', 'price'] // columns to export
const LIMIT = 1000                 // maximum records per query
const MAX_CONNECT_LIMIT = 5        // maximum concurrent requests (adjust as needed)

let result = []

/**
 * Update the file_download_link field of an export task record
 * @param {*} tableID
 * @param {*} recordId
 * @param {*} fileLink
 */
function updateExportJobIdRecord(tableID, recordId, fileLink) {
  let Schame = new BaaS.TableObject(tableID)
  let schame = Schame.getWithoutData(recordId)
  schame.set('file_download_link', fileLink)
  return schame.update()
}

/**
 * Create a data export task record.
 * file_download_link starts out null; the download link is saved
 * into it once the export task completes.
 * @param {*} tableId
 */
function createExportJobIdRecord(tableId) {
  let Schame = new BaaS.TableObject(tableId)
  let schame = Schame.create()
  return schame.set({ file_download_link: '' }).save()
    .then(res => res.data.id)
}

/**
 * Get the total number of records
 * @param {*} tableId
 */
function getTotalCount(tableId) {
  const Order = new BaaS.TableObject(tableId)
  return Order.count()
    .then(num => {
      console.log('total count:', num)
      return num
    })
    .catch(err => {
      console.log('failed to get total count:', err)
      throw new Error(err)
    })
}

/**
 * Fetch one batch of records
 * @param {*} tableId
 * @param {*} offset
 * @param {*} limit
 */
function getDataByGroup(tableId, offset = 0, limit = LIMIT) {
  let Order = new BaaS.TableObject(tableId)
  return Order.limit(limit).offset(offset).find()
    .then(res => res.data.objects)
    .catch(err => {
      console.log('failed to fetch batch data:', err)
      throw new Error(err)
    })
}

/**
 * Generate the Excel export file
 * @param {*} sourceData
 */
function genExportFile(sourceData = []) {
  const resultArr = []
  const rowArr = sourceData.map(v => ROW_NAME.map(k => v[k]))
  rowArr.unshift(ROW_NAME) // header row
  resultArr[0] = {
    data: rowArr,
    name: 'sheet1', // Excel worksheet name
  }
  const option = { '!cols': [{ wch: 10 }, { wch: 10 }] } // column widths
  const buffer = xlsx.build(resultArr, option)
  // write synchronously so the file is complete before the upload starts
  fs.writeFileSync(TMP_FILE_NAME, buffer)
}

/**
 * Upload the Excel file to MinCloud
 */
function uploadFile() {
  let MyFile = new BaaS.File()
  return MyFile.upload(TMP_FILE_NAME, { category_id: EXPORT_DATA_CATEGORY_ID })
    .catch(err => {
      console.log('file upload failed:', err)
      throw new Error(err)
    })
}

module.exports = async function (event, callback) {
  try {
    const date = new Date().getTime()
    const groupInfoArr = []
    const groupInfoSplitArr = []

    // create the export task record and count the records in parallel
    const [jobId, totalCount] = await Promise.all([
      createExportJobIdRecord(TABLE_ID.export_task),
      getTotalCount(TABLE_ID.order),
    ])

    // work out the offset/limit of every batch
    const groupSize = Math.ceil(totalCount / LIMIT) || 1
    for (let i = 0; i < groupSize; i++) {
      groupInfoArr.push({ offset: i * LIMIT, limit: LIMIT })
    }
    console.log('groupInfoArr:', groupInfoArr)

    // split the batches into chunks of at most MAX_CONNECT_LIMIT concurrent requests
    const length = Math.ceil(groupInfoArr.length / MAX_CONNECT_LIMIT)
    for (let i = 0; i < length; i++) {
      groupInfoSplitArr.push(groupInfoArr.splice(0, MAX_CONNECT_LIMIT))
    }
    console.log('groupInfoSplitArr:', groupInfoSplitArr)
    const date0 = new Date().getTime()
    console.log('grouping time:', date0 - date, 'ms')

    let num = 0
    // fetch one chunk of batches at a time, recursing until all are done
    const getSplitDataList = index => {
      return Promise.all(
        groupInfoSplitArr[index].map(v => getDataByGroup(TABLE_ID.order, v.offset, v.limit))
      ).then(res => {
        ++num
        result.push(...Array.prototype.concat(...res))
        if (num < groupInfoSplitArr.length) {
          return getSplitDataList(num)
        } else {
          return result
        }
      })
    }

    getSplitDataList(num).then(() => {
      const date1 = new Date().getTime()
      console.log('result count:', result.length)
      console.log('fetch time:', date1 - date0, 'ms')

      genExportFile(result)
      const date2 = new Date().getTime()
      console.log('file generation time:', date2 - date1, 'ms')

      uploadFile().then(res => {
        const fileLink = res.data.file_link
        const date3 = new Date().getTime()
        console.log('upload time:', date3 - date2, 'ms')

        updateExportJobIdRecord(TABLE_ID.export_task, jobId, fileLink).then(() => {
          const date4 = new Date().getTime()
          console.log('record update time:', date4 - date3, 'ms')
          console.log('total time:', date4 - date, 'ms')
          callback(null, { message: 'export success' })
        }).catch(err => {
          callback(err)
        })
      }).catch(err => {
        console.log('failed to upload the file:', err)
        throw new Error(err)
      })
    })
  } catch (err) {
    callback(err)
  }
}
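Two notes on the listing. First, getSplitDataList issues at most MAX_CONNECT_LIMIT requests at a time and only recurses into the next chunk after the current one resolves, which caps the number of simultaneous connections to the backend. Second, because the function is invoked asynchronously, the caller does not receive the download link directly; it reads it from the export_task record once file_download_link is filled in. A minimal polling sketch, assuming the table API's get(recordId) method; the interval and retry count are illustrative:

// Poll the export_task record until file_download_link is set.
// Runs wherever the BaaS global is available (e.g. another cloud function).
function waitForDownloadLink(tableId, jobId, retries = 30) {
  const Task = new BaaS.TableObject(tableId)
  return Task.get(jobId).then(res => {
    const link = res.data.file_download_link
    if (link) return link // the export has finished
    if (retries <= 0) throw new Error('export still running; try again later')
    // wait 10 s and check again
    return new Promise(resolve => setTimeout(resolve, 10000))
      .then(() => waitForDownloadLink(tableId, jobId, retries - 1))
  })
}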

3. Deploy and test

As with npm, you need to log in with the mincloud CLI before deploying; refer to its documentation for configuration.

To deploy the cloud function to MinCloud, run:

npm run deploy

The execution results are as follows:

Use the following command to test:

mincloud invoke export-excel-file

The execution results are as follows:

The export_task table record:

The Excel file uploaded to MinCloud:

File contents:

4. Reference documents

MinCloud development documentation: https://doc.minapp.com/
node-xlsx documentation: https://www.npmjs.com/package…

5. Source code

Repository: https://github.com/ifanrx/exp…

6. Bonus

Starting March 8, the first 50 users who sign up for the web-side public beta and pass verification will receive a 100-yuan gift once the service officially launches. 🎉

Sign up here 👉 MinCloud public beta. Besides this good news, Xiao Yun can also reveal that after the public beta ends, 5 users who give constructive feedback will be picked to receive a limited-edition MinCloud commemorative T-shirt!

(。・ω・。)ノ♡


This article was first published on the MinCloud (知晓云) WeChat official account: https://mp.weixin.qq.com/s/g6…

MinCloud is the first domestic backend cloud service focused on mini-program development. With MinCloud, mini-program development is one step faster.