
Background

Our work project depends on external files maintained by another team: they are built with Jenkins and pushed to [Amazon S3](https://aws.amazon.com/s3/), and we have to manually download the files from S3 and copy them into the project. The whole process can be automated.

There was also a more serious problem: the path to the build artifact in S3 looks something like ‘a/b//c/’, where the extra ‘/’ is actually a folder named ‘/’. The S3 Browser on Windows can identify it, but every GUI tool we tried on macOS failed to recognize the directory, probably because ‘/’ is treated as a path separator there. As a result, macOS developers had to download artifacts from a Windows virtual machine, a process that was wasteful and pointless.

Since Amazon provides API access, I figured I could write a script to automate the download.
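To the API, the ‘/’ folder is harmless: object keys are plain strings, and “folders” are just key prefixes. A minimal sketch (assuming the s3 client configured later in this post and a hypothetical bucket name) that lists objects under the problematic prefix:

s3.listObjectsV2({ Bucket: "my-bucket", Prefix: "a/b//c/" }, (err, data) => {
  if (err) throw err;
  // keys come back intact, double slash and all, e.g. "a/b//c/app.tar.gz"
  data.Contents.forEach((obj) => console.log(obj.Key));
});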

Process overview

Without the script: Jenkins build → find the artifact in S3 with a GUI client (on Windows) → download manually → copy into the project

With the script: Jenkins build → artifact name → run the script

The script does everything in one step, eliminating the manual process and sidestepping the ‘/’ bug.

Development

Connecting to AWS

Here we use the aws-sdk package provided by Amazon: create an S3 client and pass accessKeyId and secretAccessKey to connect:

import S3 from "aws-sdk/clients/s3";
const s3 = new S3({ credentials: { accessKeyId, secretAccessKey } });

Downloading the file

The aws-sdk provides interfaces for creating, deleting, updating, and querying buckets and files. Since we know in advance the name of the artifact file built by Jenkins, we can download it by name and location:

const rs = s3
  .getObject({ Bucket: "your bucket name", Key: "file dir + path" })
  .createReadStream();

Bucket is the bucket the file lives in, and Key is the file's path within S3; together they are equivalent to a directory name plus a file name.
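For instance (a hypothetical object, to make the mapping concrete):

// for the object s3://my-bucket/a/b//c/app.tar.gz
// Bucket: "my-bucket"
// Key:    "a/b//c/app.tar.gz"   (the empty segment is the folder named "/")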

This gives us a ReadStream, which can then be written to a local file using Node.js:

const fs = require("fs");
const path = require("path");
const ws = fs.createWriteStream(path.join(__dirname, outputfilename));
rs.pipe(ws);

Unpacking

We decompress the archive with node-tar; install it directly:

npm install tar

There is no need to write the archive to disk first: we can pipe the ReadStream straight into node-tar's extract stream, tar.x (the ‘x’ stands for extract), replacing the write-to-file code above:

- const ws = fs.createWriteStream(path.join(__dirname, outputfilename));
- rs.pipe(ws);
+ rs.pipe(tar.x({ C: path.join(__dirname, outputfilename) }));

The pipe call returns a stream object, and we can listen for its ‘finish’ event to handle the subsequent steps:

const s = rs.pipe(tar.x({ C: path.join(__dirname, outputfilename) }));
s.on('finish', () => {
  // do something ...
});
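Download or extraction can also fail mid-stream; a minimal addition (not part of the original flow) is to listen for ‘error’ on both ends of the pipe:

rs.on('error', (err) => console.error('download failed:', err));
s.on('error', (err) => console.error('extract failed:', err));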

Flattening the directory

The unpacked archive contains subfolders whose files we need at the outermost level, so we flatten the directory.

The fs API comes in synchronous and asynchronous flavors: synchronous function names end in Sync, the default asynchronous functions use the error-first callback style, and fs/promises offers a promise-based API. Use whichever style fits, as illustrated below.
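A quick illustration of the three styles (reading a hypothetical file a.txt):

const fs = require("fs");
const fsp = require("fs/promises");

// 1. synchronous: blocks until the read completes
const a = fs.readFileSync("a.txt", "utf8");

// 2. asynchronous, error-first callback
fs.readFile("a.txt", "utf8", (err, b) => {
  if (err) throw err;
  console.log(b);
});

// 3. promise-based, from fs/promises
fsp.readFile("a.txt", "utf8").then((c) => console.log(c));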

Since our directory has only one level of nesting, we flatten just one layer; for deeper trees the same logic can be applied recursively (see the sketch after this function):

async function flatten(dir) {
  const fileAndDirs = await fsp.readdir(dir);
  const dirs = fileAndDirs.filter((i) =>
    fs.lstatSync(path.join(dir, i)).isDirectory()
  );
  for (const innerDir of dirs) {
    const innerFile = await fsp.readdir(path.join(dir, innerDir));
    // move every file one level up, in parallel
    await Promise.all(
      innerFile
        .filter((item) => fs.lstatSync(path.join(dir, innerDir, item)).isFile())
        .map((item) =>
          fsp.rename(path.join(dir, innerDir, item), path.join(dir, item))
        )
    );
    // remove the now-emptied subfolder
    await fsp.rm(path.join(dir, innerDir), { recursive: true, force: true });
  }
}
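If the tree were deeper, a recursive variant might look like this (a sketch reusing the fs/fsp/path requires above; note it does not handle name collisions):

async function flattenDeep(dir, root = dir) {
  const entries = await fsp.readdir(dir);
  for (const entry of entries) {
    const full = path.join(dir, entry);
    if (fs.lstatSync(full).isDirectory()) {
      await flattenDeep(full, root); // flatten the subtree first
      await fsp.rm(full, { recursive: true, force: true }); // drop the emptied folder
    } else if (dir !== root) {
      await fsp.rename(full, path.join(root, entry)); // move the file to the top
    }
  }
}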

Copying to the target location

Finally, we copy the files into our project directory, calling the copyFile API and using a regular expression to exclude files we don't need:

async function copy(from, to) {
  const files = await fsp.readdir(from);
  await Promise.all(
    files
      .filter((item) => !exclude.test(item))
      .map((item) => fsp.copyFile(path.join(from, item), path.join(to, item)))
  );
}
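For example, exclude might be a pattern like this (a hypothetical choice; adjust it to your artifact layout, and targetDir stands in for your project directory):

// skip source maps and markdown files, for instance
const exclude = /\.map$|\.md$/;
copy(path.join(__dirname, outputfilename), targetDir).catch(console.error);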

The configuration file

The accessKeyId and secretAccessKey differ per user, so they live in a separate configuration file that each user creates locally and the main program reads:

// config.js
module.exports = {
  s3: {
    accessKeyId: 'accessKeyId',
    secretAccessKey: 'secretAccessKey',
  },
};

// main.js
const configPath = path.resolve(__dirname, 'config.js');
if (!fs.existsSync(configPath)) {
  console.error('please create a config file');
  return;
}
const config = require(configPath);

Passing parameters

The name of the file to download changes with each call, so it is passed in directly as a command-line argument.

process.argv is an array: its first element is the path of the Node executable, the second is the path of the script being executed, and custom arguments start at the third, so read from process.argv[2]. For complex command-line needs there are argument-parsing libraries such as commander; since this example takes only one parameter, we read it directly:

const filename = process.argv[2];
if (!filename) {
  console.error('please run script with params');
  return;
}

At this point, a usable command-line tool is complete.
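Tying the pieces together, the whole script might look like this (a sketch under assumptions: the bucket name is an extra field added to config, and the output and target locations are placeholders):

const tar = require("tar");

async function main() {
  const filename = process.argv[2];
  if (!filename) {
    console.error("please run script with params");
    return;
  }
  const outDir = path.join(__dirname, "output"); // hypothetical unpack folder
  fs.mkdirSync(outDir, { recursive: true }); // tar.x needs an existing directory
  const rs = s3
    .getObject({ Bucket: config.s3.bucket, Key: filename }) // bucket: assumed config field
    .createReadStream();
  await new Promise((resolve, reject) =>
    rs.pipe(tar.x({ C: outDir })).on("finish", resolve).on("error", reject)
  );
  await flatten(outDir);
  await copy(outDir, targetDir); // targetDir: your project directory
}

main().catch(console.error);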

Conclusion

Node.js can be used to build backends, but for front-end developers its real value was never writing backends in JS: it is an extremely practical tool. Front-end tooling such as webpack, Rollup, and dev-server is all value created with Node. Thanks to npm's rich ecosystem, Node makes script development fast, which helps solve the toolchain and productivity problems that come up in day-to-day work; consider reaching for it the next time you hit one.