Start with the link: github.com/zencodex/co…

The ZComposer mirror was born in March 2017 and has been running for more than two years. This is not a deeply technical article; I will briefly share some ideas about its development and about solving the problems along the way, hoping to give you a little inspiration. If you feel you have gained something, please click your mouse and give me a star on GitHub. Thank you.

  1. Security. Do not modify the original JSON or zip files, otherwise their hashes change. Recomputing the hashes after modification is technically possible (a third-party mirror did this before), but then the security of the packages can no longer be verified: if a malicious mirror tampers with the data, you have no way to detect it. So in the ZComposer mirror, all packages are kept identical to the official packagist.org copies and can be verified against the official hashes without any changes.

  2. Stability. Errors during crawling and uploading can corrupt the mirror, so it is important to check the integrity of every collected package against its hash value. Sometimes third-party API policies or CDN routes also cause problems, so the biggest difficulty in running a mirror is guaranteeing stability.

  3. Webysther/packagist-mirror forked from hirak/packagist-crawler, but neither of these mirrors handles dist, which is the largest, most valuable part and the one most worth serving through a CDN. ZComposer is open-sourced as a full mirror that includes processing of the dist part. There is also the problem of the roughly 65000-subdirectory upper limit for the dist packages, and the number of packages doubles in a year. The symlink scheme is my original work; as the number of packages keeps growing without bound, other schemes may need to be designed.
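Points 1 and 2 boil down to one loop: download, verify the sha256 against the official metadata, and retry on mismatch. A minimal sketch in Python (the function and variable names are mine for illustration, not the project's actual code):

```python
import hashlib
import time

def fetch_verified(fetch, expected_sha256, attempts=3, delay=0.0):
    """Retry a flaky transfer until the payload's sha256 matches the
    official hash, guarding against truncated or tampered downloads.
    Because the mirror never rewrites JSON/zip files, the official
    hash stays valid for the mirrored copy as well."""
    for _ in range(attempts):
        data = fetch()
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data
        time.sleep(delay)  # real code would back off between attempts
    raise IOError("no intact copy after %d attempts" % attempts)

# Usage: the first attempt is truncated, the second succeeds
good = b"zip bytes from packagist.org"
responses = iter([good[:5], good])
data = fetch_verified(lambda: next(responses),
                      hashlib.sha256(good).hexdigest())
```

Any re-zipped or modified copy would fail this check, which is why the mirror keeps packages byte-for-byte identical to the official ones.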

Installing and deploying the ZComposer mirror

Recommended host configuration:

  • At least 4 GB of memory
  • At least 30 GB of free disk space
$ apt install beanstalkd
$ git clone https://github.com/zencodex/composer-mirror.git
$ cd composer-mirror
$ composer install

Modifying Configuration Parameters

You can modify parameters based on the actual deployment environment. For details, see config.default.php

Run `cp config.default.php config.php`, then modify the following parameters in config.php:

    /**
     * distdir: used to store zip packages
     */
    'distdir' => __DIR__ . '/dist/',

    /**
     * cachedir: points to the actual web directory corresponding to mirrorUrl
     */
    'cachedir' => __DIR__ . '/cache/',

    /**
     * packagistUrl: official upstream source to crawl
     */
    'packagistUrl' => 'https://packagist.org',

    /**
     * mirrorUrl: mirror publishing site, root domain of the packages.json entry
     */
    'mirrorUrl' => 'https://packagist.laravel-china.org',

    /**
     * distUrl: CDN address for the dist zip files
     */
    'distUrl' => 'https://dl.laravel-china.org/',

Supervisor configuration

Edit `/etc/supervisor/supervisord.conf` (e.g. `sudo vim /etc/supervisor/supervisord.conf`) and add the following configuration:

[program:crawler]
command=php ./bin/console app:crawler
directory=/home/zencodex/composer-mirror/
autostart=true
autorestart=true
redirect_stderr=true                             ; redirect stderr to stdout, default false
stdout_logfile_maxbytes=10MB                     ; stdout logfile size, default 50MB
stdout_logfile_backups=5                         ; number of stdout backup log files
stdout_logfile=/tmp/composer_crawler_stdout.log

[program:composer_daemon]
command=php ./bin/console app:daemon
directory=/home/zencodex/composer-mirror/
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile_maxbytes=10MB
stdout_logfile_backups=5
stdout_logfile=/tmp/composer_daemon_stdout.log

Crontab scheduled tasks

# sudo crontab -e
# replace /home/zencodex/composer-mirror based on the location of your environment code
# getcomposer.php fetches the latest composer and uploads it to CDN cloud storage

0 */2 * * * /usr/bin/php /home/zencodex/composer-mirror/bin/console app:clear --expired=json
0 1 * * * /usr/bin/php /home/zencodex/composer-mirror/getcomposer.php

Common commands

# Run the crawl task
$ php ./bin/console app:crawler

# Daemon: multi-process sync of files to Upyun cloud storage
$ php ./bin/console app:daemon

# Remove expired garbage files
$ php ./bin/console app:clear --expired=json

# Scan and verify the sha256 hashes of all JSON and zip files
$ php ./bin/console app:scan

For Developers

  • No database is used; everything is stored in a directory structure
  • Each package's dist zip records the download address of the corresponding GitHub URL; due to limited disk space it is not stored locally but pushed directly to the cloud
  • Expired-file cleanup decides whether a file is stale from its timestamp, so do not manually touch files or perform any other operation that changes their timestamps
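The timestamp rule above can be illustrated with a small sketch (this Python is mine for illustration, not the project's PHP code): a file is treated as stale purely by its mtime, so any stray `touch` would wrongly mark it fresh.

```python
import os
import tempfile
import time

def is_expired(path: str, ttl_seconds: float) -> bool:
    """A cached file is stale once its mtime is older than the TTL.
    Touching the file resets mtime and wrongly marks it fresh, which
    is why the article warns against manual touch operations."""
    return (time.time() - os.path.getmtime(path)) > ttl_seconds

# Usage: a freshly written file is not expired; back-dating its
# mtime makes it expired.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
fresh = is_expired(path, ttl_seconds=3600)   # False: just written
os.utime(path, (0, 0))                       # pretend it is very old
stale = is_expired(path, ttl_seconds=3600)   # True: mtime far past TTL
os.unlink(path)
```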

If you use a different cloud platform, pay attention to the following code, which you will need to implement yourself:

  • ClientHandlerPlugin requires the corresponding Flysystem adapter to expose matching interfaces. At present only zencodex/flysystem-upyun implements ClientHandlerPlugin; other third-party adapters can be implemented by following its example
  • Cloud::refreshRemoteFile, which refreshes CDN-cached files, is only used when refreshing packages.json
  • If you use a cloud platform other than Upyun, replace that refresh code with your own platform's implementation, or reference how ZenCodex\Support\Flysystem\Adapter\UpyunAdapter encapsulates getClientHandler
  • Cloud::prefetchDistFile is similar to refreshRemoteFile: it calls a platform-specific interface that cannot be encapsulated in Flysystem, so it is also handled through getClientHandler
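The getClientHandler pattern above is essentially an escape hatch: the generic adapter covers portable operations, while platform-only features (CDN purge, prefetch) go through the raw vendor client. A rough Python analogy, with all names hypothetical rather than the project's real API:

```python
class CloudAdapter:
    """Generic storage adapter in the spirit of a Flysystem adapter.
    Portable operations go through write(); platform-specific
    features use the raw client from get_client_handler()."""

    def __init__(self, client):
        self._client = client
        self.files = {}

    def write(self, path, data):
        # Portable operation every adapter can express
        self.files[path] = data

    def get_client_handler(self):
        # Escape hatch: expose the vendor SDK client so callers can
        # invoke APIs the generic interface cannot express
        return self._client

class FakeUpyunClient:
    """Stand-in for a vendor SDK with a CDN purge API."""
    def __init__(self):
        self.purged = []
    def purge(self, url):
        self.purged.append(url)

# Usage: write through the adapter, purge through the raw client
adapter = CloudAdapter(FakeUpyunClient())
adapter.write("packages.json", b"{}")
adapter.get_client_handler().purge("https://example-cdn/packages.json")
```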

Note the pitfall of the maximum number of subdirectories

For details, see the code in src/Commands/PatchCommand.php

/*
 |-------------------------------------------------------------------------
 | Most file systems supported by the Linux kernel limit the number of
 | subdirectories in a single directory to about 64000~65000, and the
 | number of packages already exceeds that limit.
 |-------------------------------------------------------------------------
 | There are three solutions; the first two are basically unrealistic,
 | so through experimentation I arrived at the third (symlinks do not
 | count toward the limit):
 | 1. Switch to a file system with no subdirectory limit, such as XFS
 | 2. Modify the relevant code and recompile the Linux kernel
 | 3. Split the large folder: scatter files starting with different
 |    letters into separate directories, and use symlinks inside the
 |    main folder, since symlinks do not count toward the limit
 */
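Solution 3 can be sketched as follows (a sketch under assumed names, not the actual PatchCommand code): the real directory for each package lives under a per-letter shard, and a symlink in the flat root preserves the original path. Symlinked directories do not raise the parent's hard-link count, so the cap is avoided.

```python
import os
import tempfile

def add_package_dir(flat_root: str, shard_root: str, name: str) -> str:
    """Create the real directory under a first-letter shard and expose
    it in the flat root via a symlink, keeping every real directory
    well under the ~65000 subdirectory limit."""
    real = os.path.join(shard_root, name[0].lower(), name)
    os.makedirs(real, exist_ok=True)
    link = os.path.join(flat_root, name)
    if not os.path.lexists(link):
        os.symlink(real, link)
    return link

# Usage with temporary directories standing in for the mirror layout
base = tempfile.mkdtemp()
flat = os.path.join(base, "p")          # flat dir clients see
shards = os.path.join(base, "p-shards") # real sharded storage
os.makedirs(flat, exist_ok=True)
link = add_package_dir(flat, shards, "laravel")
```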

The idea of the ZComposer mirror was proposed early on by @Summer, and it received strong support from @overtrue and friends in the LC community; open-sourcing it was also overtrue's suggestion. Thank you all for your encouragement and support. The 1st Laravel Conf China will be held on August 3-4, 2019 and is a grand gathering of the community's best. Please register via laravelconf.cn

First published at: learnku.com/articles/28…