Introduction to the ELK platform

Logs include system logs, application logs, and security logs. Operations engineers can use them to learn about a server's software and hardware and to track down configuration errors and their causes. Analyzing logs regularly also helps you understand server load, performance, and security so you can take corrective action in time.

Typically, logs are scattered across different devices. If you manage dozens or hundreds of servers and still log on to each machine in turn the traditional way, it quickly becomes cumbersome and inefficient. In most cases, we therefore use centralized log management, such as the open source syslog, to collect logs from all servers.

Once logs are centralized, searching and aggregating them becomes the next problem. Linux commands such as grep, awk, and wc can handle basic retrieval and statistics, but this approach cannot keep up with more demanding querying, sorting, and statistics requirements, or with a large number of machines.
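
For example, counting error lines in a single file is easy enough with these tools (the log path below is just a placeholder):

    # Count how many ERROR lines a hypothetical application log contains
    grep "ERROR" /var/log/app/app.log | wc -l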

The open source real-time log analysis platform ELK is made up of Elasticsearch, Logstash, and Kibana. Official website: https://www.elastic.co/products

  1. Elasticsearch is an open source distributed search engine. Its features include a distributed architecture, zero configuration, automatic discovery, index sharding, index replicas, a RESTful interface, multiple data sources, and automatic search load balancing.
  2. Logstash is a completely open source tool that collects, filters, and stores your logs for future use (e.g., searching).
  3. Kibana is also a free, open source tool that provides a friendly web interface for log analysis on top of Logstash and Elasticsearch, helping you aggregate, analyze, and search important log data.

As shown in the figure, Logstash collects the logs generated by the AppServer and stores them in the Elasticsearch cluster, while Kibana queries the data from the ES cluster, generates charts, and returns them to the browser.

Building the ELK platform

Elasticsearch installation

  1. Click download
  2. Decompress the zip or tar package
  3. Run bin/elasticsearch (on Windows, bin\elasticsearch.bat)
  4. Request http://localhost:9200/ and you should see a response that includes the version
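
As a quick check (assuming Elasticsearch is running locally on the default port 9200), you can query the root endpoint; the exact fields in the response vary by version:

    # Verify Elasticsearch is up; the JSON response includes a "version" object
    curl http://localhost:9200/
    # Look for something like: "version" : { "number" : "x.y.z", ... }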

Logstash installation

  1. Click download
  2. Decompress the zip or tar package
  3. Create a logstash.conf file in the bin directory with the following contents:

         input {
           tcp {
             port => 4567                  # the port written in the project's log configuration file
             type => "logs"
           }
         }
         filter {}
         output {
           stdout { codec => rubydebug }
           elasticsearch {
             hosts => ["localhost:9200"]   # IP and port of Elasticsearch
           }
         }

  4. Run bin/logstash -f logstash.conf
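
Once Logstash is running with the configuration above, a rough way to verify the tcp input (assuming nc/netcat is installed and the port 4567 from the sample config) is to push a test line at it and watch the output on stdout:

    # Send a test message to the tcp input on port 4567 (port taken from the sample config)
    echo "hello from nc" | nc localhost 4567
    # Logstash should print the received event to stdout via the rubydebug codec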

Kibana installation

  1. Click download
  2. Decompress the zip or tar package
  3. Edit the config/kibana.yml file (for example with vi config/kibana.yml) and set elasticsearch.url to the address of your Elasticsearch instance (see the sketch after this list)
  4. Run bin/kibana (bin\kibana.bat on Windows)
  5. Open http://localhost:5601 in your browser
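
For reference, a minimal config/kibana.yml might look like the sketch below; note that the setting is named elasticsearch.url in Kibana 5.x/6.x and elasticsearch.hosts in 7.x, so check the sample file shipped with your version:

    # config/kibana.yml -- minimal sketch; property names vary between Kibana versions
    server.port: 5601
    server.host: "0.0.0.0"                        # optional: listen on all interfaces
    elasticsearch.url: "http://localhost:9200"    # address of your Elasticsearch instance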

Validation

  1. logback.xml configuration (a sketch is given below)
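
A minimal sketch of such a logback.xml, assuming the logstash-logback-encoder dependency and the Logstash tcp input on port 4567 configured earlier, could look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <!-- Ships log events to the Logstash tcp input (port 4567 from the sample config) -->
        <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <destination>localhost:4567</destination>
            <!-- JSON encoder provided by the logstash-logback-encoder library -->
            <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
        </appender>

        <root level="INFO">
            <appender-ref ref="LOGSTASH"/>
        </root>
    </configuration>

If you use a JSON encoder like this, you may also want to add codec => json_lines to the tcp input so that Logstash parses the JSON fields instead of storing the whole line in the message field.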

  2. The test class (a sketch is given below)
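
A minimal sketch of such a test class in Java (the class name and log messages here are made up for illustration) could be:

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Hypothetical test class: writes a few log lines that the logback.xml above
    // ships to Logstash, which indexes them into Elasticsearch for Kibana to query.
    public class LogstashLogTest {

        private static final Logger logger = LoggerFactory.getLogger(LogstashLogTest.class);

        public static void main(String[] args) {
            logger.info("ELK test message: application started");
            logger.warn("ELK test message: something worth a look");
            logger.error("ELK test message: simulated error for Kibana search");
        }
    }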

  3. Logstash log output

  4. Kibana display

If you found this article helpful, you can follow the WeChat public account [colorful color] to show your support.