ELK workflow

  • Multiple independent agents (Shippers) collect data from different sources; a central agent (Indexer) aggregates and analyzes the data; Brokers (implemented with Redis) sit in front of the central agent and act as buffers. ElasticSearch, behind the central agent, stores the data and makes it searchable, and Kibana at the front end provides rich chart-based presentation.
  • The Shipper handles log collection: it runs Logstash and gathers log data from a variety of sources, such as system logs, files, Redis, and message queues.
  • The Broker, implemented with Redis, buffers data between the remote agents and the central agent. It improves both performance and reliability: if the central agent fails to pull data, the data remains in Redis and is not lost.
  • The central agent (Indexer), also Logstash, pulls data from the Broker and performs the relevant analysis and processing (filters).
  • ElasticSearch stores the final data and provides search capabilities.
  • Kibana provides a simple but rich web interface over the data in ElasticSearch, supporting all kinds of queries, statistics, and visualizations. (A minimal Shipper/Indexer sketch follows this list.)
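
A minimal sketch of how a Shipper and the Indexer could be wired together through the Redis broker. The log path, Redis key, and host addresses are assumptions for illustration, and the parameter names follow the Logstash 2.x plugin style:

  # shipper.conf - runs on each remote agent (assumed file path and key)
  input {
    file { path => "/var/log/messages" }
  }
  output {
    redis {
      host      => "192.168.123.2"    # broker address (assumption)
      data_type => "list"
      key       => "logstash"         # Redis list used as the queue (assumption)
    }
  }

  # indexer.conf - runs on the central agent
  input {
    redis {
      host      => "192.168.123.2"
      data_type => "list"
      key       => "logstash"
    }
  }
  filter {
    # analysis/processing goes here (grok, mutate, date, ...)
  }
  output {
    elasticsearch { hosts => ["192.168.123.3:9200"] }
  }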

Machine deployment

Logstash

(Logstash is deployed on the machine with IP address 192.168.123.2.)

The data flow

input|decode|filter|encode|output
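
Inside Logstash, the decode and encode stages are handled by codecs attached to inputs and outputs. A minimal sketch (the input path and codec choices are assumptions for illustration):

  input {
    file {
      path  => "/var/log/messages"   # example input source
      codec => "json"                # decode: parse each incoming line as JSON
    }
  }
  filter {
    # filter stage: grok, mutate, date, ...
  }
  output {
    stdout {
      codec => "rubydebug"           # encode: pretty-print events to the console
    }
  }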

Installation and configuration

1. Install the Java environment
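
Logstash runs on the JVM, so a JDK/JRE has to be present first. For example (the package name depends on the distribution and the Java version you want):

[root@localhost ~]# yum install java-1.8.0-openjdk
[root@localhost ~]# java -version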

2. Download and install the GPG key

[root@localhost ~]# rpm --import packages.elasticsearch…

3. Yum source configuration
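
A typical repository definition, e.g. /etc/yum.repos.d/logstash.repo; the baseurl depends on the Logstash release you want, so treat this as a sketch:

  [logstash-2.1]
  name=Logstash repository for 2.1.x packages
  baseurl=http://packages.elastic.co/logstash/2.1/centos
  gpgcheck=1
  gpgkey=http://packages.elastic.co/GPG-KEY-elasticsearch
  enabled=1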

4. Install Logstash

[root@localhost ~]# yum install logstash

5. Installation directory

6. Edit a simple configuration file
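
A simple configuration for a first test could read events from standard input and print them to standard output (a sketch, not from the original):

  input  { stdin { } }
  output { stdout { codec => "rubydebug" } }

Running it with something like bin/logstash -f <file>.conf and typing a line should print the structured event back to the console.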

Configuration to store logs in ES:
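
To ship the collected events to ElasticSearch, the output block points at the ES node set up in the next section. Parameter names follow the Logstash 2.x elasticsearch output (older versions use host/protocol instead of hosts), and the input source is an assumption:

  input {
    file { path => "/var/log/messages" }   # example source (assumption)
  }
  output {
    elasticsearch {
      hosts => ["192.168.123.3:9200"]      # ES node from the next section
      index => "logstash-%{+YYYY.MM.dd}"   # default daily index pattern
    }
  }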

7. Problems encountered:

ElasticSearch

The data flow

Installation and configuration

If you are installing on a different machine, you need to configure the Java environment as you did in Step 1 of Logstash.

(This article deploys ElasticSearch on a different machine; the following configuration is performed on the machine with IP address 192.168.123.3.)

1. Download and install the GPG key

[root@localhost ~]# rpm --import packages.elastic.co/G…

2. Yum source configuration
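
A typical repository definition, e.g. /etc/yum.repos.d/elasticsearch.repo (again a sketch; the baseurl depends on the ElasticSearch release):

  [elasticsearch-2.x]
  name=Elasticsearch repository for 2.x packages
  baseurl=http://packages.elastic.co/elasticsearch/2.x/centos
  gpgcheck=1
  gpgkey=http://packages.elastic.co/GPG-KEY-elasticsearch
  enabled=1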

3. Install ElasticSearch

[root@localhost ~]# yum install elasticsearch

4. Installation directory

5. Modify limits.conf
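
Typical entries in /etc/security/limits.conf so that the elasticsearch user can lock memory and open enough file descriptors (the exact values are an assumption, not from the original):

  elasticsearch soft memlock unlimited
  elasticsearch hard memlock unlimited
  elasticsearch soft nofile  65536
  elasticsearch hard nofile  65536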

6. Create a directory and authorize it

 [root@localhost ~]# mkdir -p /data/es-data
 [root@localhost ~]# chown -R elasticsearch.elasticsearch /data/es-data/
 

7. Configure elasticsearch.yml
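
A minimal elasticsearch.yml sketch that matches the data directory created above; the cluster/node names and the 0.0.0.0 bind address are assumptions, and some setting names (e.g. bootstrap.mlockall vs. bootstrap.memory_lock) vary by ElasticSearch version:

  cluster.name: my-es-cluster       # assumed cluster name
  node.name: node-1                 # assumed node name
  path.data: /data/es-data          # directory created and authorized in step 6
  bootstrap.mlockall: true          # lock memory; relies on the limits.conf change
  network.host: 0.0.0.0             # listen on all interfaces (assumption)
  http.port: 9200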

8. Start ElasticSearch
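
Depending on the init system, for example:

[root@localhost ~]# systemctl start elasticsearch     # or: service elasticsearch start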

9. Check the startup
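
For example, confirm the service is running and that the HTTP (9200) and transport (9300) ports are listening:

[root@localhost ~]# systemctl status elasticsearch
[root@localhost ~]# netstat -lntp | grep -E '9200|9300'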

10. Access tests

Windows access:
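
From the Linux side, a quick check with curl should return the ElasticSearch banner JSON; from a Windows machine, opening the same address in a browser shows the same response:

[root@localhost ~]# curl http://192.168.123.3:9200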

11. Install the ElasticSearch Head plugin

  • Head
  • Plugin function: mainly used for ES cluster management.
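
For ElasticSearch 2.x, the Head plugin can typically be installed with the bundled plugin script (the install path below is an assumption for an RPM install):

[root@localhost ~]# /usr/share/elasticsearch/bin/plugin install mobz/elasticsearch-head

After installation it is usually reachable in a browser at http://192.168.123.3:9200/_plugin/head/.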




Kibana

Installation and configuration

1. Download and install GPG Key:

rpm --import packages.elastic.co/G…

2. Yum source configuration

3. Install Kibana
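
From the same Elastic yum repositories, for example:

[root@localhost ~]# yum install kibana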

4. Installation directory

5. Modify the configuration file
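
Typical kibana.yml changes so Kibana listens externally and points at the ElasticSearch node above (key names differ slightly between Kibana versions; older 4.x releases use elasticsearch_url):

  server.port: 5601
  server.host: "0.0.0.0"
  elasticsearch.url: "http://192.168.123.3:9200"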

6. Start Kibana
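
As with ElasticSearch, depending on the init system:

[root@localhost ~]# systemctl start kibana     # or: service kibana start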

7. Check ports
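
Kibana listens on port 5601 by default, so for example:

[root@localhost ~]# netstat -lntp | grep 5601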

8. Access

Access the address: http://192.168.123.3:5601