
Preface

Logs are an integral part of any complete system. But where should they go: the console, or a file? Those are simple, straightforward options, but not good ones. With a huge volume of log data, later troubleshooting becomes head-scratching work, and every time you need a log you have to remote into the server, which is neither convenient nor in keeping with the geek spirit. What we need is a log system that supports efficient search and lets us pull up logs anytime, anywhere. The most popular solution in the industry today is ELK (Elasticsearch + Logstash + Kibana).

Setting up ELK

The following installation and deployment of ELK uses Docker containers; if you are not yet familiar with Docker, get acquainted with it first!

Pulling the ELK images

docker pull elasticsearch:7.10.2
docker pull logstash:7.10.2
docker pull kibana:7.10.2

These commands pull the ELK images from Docker Hub. The official Elastic images require an explicit version tag; 7.10.2 is the latest at the time of writing.

Creating and running the ELK containers

1. Create and run the Elasticsearch container

docker run --name elasticsearch \
-p 9200:9200 \
-p 9300:9300 \
-e "discovery.type=single-node" \
-e ES_JAVA_OPTS="-Xms256m -Xmx256m" \
-d elasticsearch:7.10.2

Here `discovery.type=single-node` runs Elasticsearch in single-node mode, and `ES_JAVA_OPTS` sets the initial and maximum JVM heap to 256 MB.

2. Create and run the Kibana container

docker run --name kibana \
-p 5601:5601 \
-e ELASTICSEARCH_HOSTS=http://<Elasticsearch host IP>:9200 \
-d kibana:7.10.2

3. Create and run the Logstash container

Create the configuration file /usr/local/docker/logstash/logstash.conf with the following content:

input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
  }
}
output {
  elasticsearch {
    hosts => ["<Elasticsearch host IP>:9200"]
    action => "index"
    codec => json
    index => "today-log-%{+YYYY.MM.dd}"
  }
}
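The `index` setting rolls logs into one index per day: Logstash substitutes each event's timestamp into `%{+YYYY.MM.dd}`. The plain-Java sketch below only illustrates the resulting naming scheme (the class and method names are mine, not Logstash code):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexName {
    // Mirror Logstash's "today-log-%{+YYYY.MM.dd}" daily index naming
    static String indexFor(LocalDate date) {
        return "today-log-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        // Logs written on 2021-02-14 land in index "today-log-2021.02.14"
        System.out.println(indexFor(LocalDate.of(2021, 2, 14)));
    }
}
```

Daily indices keep each index small, which makes searching recent logs fast and lets old logs be dropped simply by deleting old indices.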
docker run --name logstash \
-p 4560:4560 \
-v /usr/local/docker/logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf \
-d logstash:7.10.2

The -v flag mounts the configuration file created above into the container's pipeline directory.

Now the ELK installation and deployment is complete. Verify that all three containers are running:

docker ps

SpringBoot log output

1. Add the Logstash encoder dependency

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.3</version>
</dependency>

2. Create src/main/resources/logback-spring.xml to ship logs to Logstash:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration>
<configuration>
    <!-- Include the default log configuration -->
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <!-- Use the default console appender -->
    <include resource="org/springframework/boot/logging/logback/console-appender.xml"/>
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination><Logstash host IP>:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <!-- Custom log output format -->
                <pattern>
                    <pattern>
                        {
                          "project": "today",
                          "level": "%level",
                          "pid": "${PID:-}",
                          "thread": "%thread",
                          "class": "%logger",
                          "message": "%message",
                          "stack_trace": "%exception{20}"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
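Under the hood, LogstashTcpSocketAppender simply writes newline-delimited JSON over TCP, so you can smoke-test the Logstash input even before wiring up the app. The sketch below illustrates this under that assumption; the class name, field values, and the default host are mine, not from this setup:

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LogstashSmokeTest {
    // Build a one-line JSON event with the same fields as the logback pattern above
    static String buildEvent(String level, String message) {
        return String.format(
            "{\"project\":\"today\",\"level\":\"%s\",\"thread\":\"main\"," +
            "\"class\":\"LogstashSmokeTest\",\"message\":\"%s\"}",
            level, message);
    }

    public static void main(String[] args) throws Exception {
        // Logstash host is an assumption; pass the real IP as the first argument
        String host = args.length > 0 ? args[0] : "127.0.0.1";
        try (Socket socket = new Socket(host, 4560);
             OutputStream out = socket.getOutputStream()) {
            // One JSON object per line, matching what the TCP input expects
            out.write((buildEvent("INFO", "hello elk") + "\n")
                    .getBytes(StandardCharsets.UTF_8));
        }
    }
}
```

If the event arrives, it should show up in Elasticsearch in the current day's today-log-* index.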

Visit Kibana to view the logs

After the SpringBoot service starts, open Kibana in a browser:

http://<Kibana host IP>:5601

On first use, create an index pattern matching today-log-* (Stack Management → Index Patterns); the logs then show up under Discover.