Deploying an ELK (Elasticsearch/Logstash/Kibana) log collection and analysis system with Docker

Installing ELK (Elasticsearch/Logstash/Kibana)

Configuration files

  • docker-compose.yml
# docker-compose.yml
version: '3'
services:
  elasticsearch:
    image: elasticsearch:7.8.0
    container_name: elk-es
    restart: always
    environment:
      # enable memory lock
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - "TAKE_FILE_OWNERSHIP=true"
      # specify single node start
      - discovery.type=single-node
    ulimits:
      # Remove memory restrictions used to enable memory locking
      memlock:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - ./logs/data:/usr/share/elasticsearch/data
      - ./logs:/usr/share/elasticsearch/logs
      - ./logs/plugins:/usr/share/elasticsearch/plugins
    ports:
      - "9200:9200"
  kibana:
    image: kibana:7.8.0
    container_name: elk-kibana
    restart: always
    depends_on:
      - elasticsearch # Kibana will start after ElasticSearch has started
    environment:
      ELASTICSEARCH_HOSTS: http://elk-es:9200
      I18N_LOCALE: zh-CN
    ports:
      - "5601:5601"
  logstash:
    image: logstash:7.8.0
    container_name: elk-logstash
    restart: always
    depends_on:
      - elasticsearch # Logstash starts after Elasticsearch starts
    environment:
      XPACK_MONITORING_ENABLED: "false"
      pipeline.batch.size: 10
    volumes:
      - ./conf/logstash/logstash-springboot.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "4560:4560" # set port

  • Logstash configuration file logstash-springboot.conf

This configuration has only one input source. To distinguish the logs of multiple systems, configure multiple inputs in the .conf file.

Create a new conf/logstash/logstash-springboot.conf file:

input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "elk-es:9200"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
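As a sketch of the multi-input setup mentioned above (the second port and the type tags here are illustrative assumptions, not from the original deployment), each tcp input can be tagged and routed to its own index:

```conf
input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
    codec => json_lines
    type => "service-a"   # illustrative tag for the first system
  }
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4561          # hypothetical second port for another system
    codec => json_lines
    type => "service-b"
  }
}
output {
  elasticsearch {
    hosts => "elk-es:9200"
    index => "%{type}-%{+YYYY.MM.dd}"   # one index per system
  }
}
```

Each system's Logback appender then points at its own port, and its logs land in a separate index.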

Files ready

  • Once the files are ready, the directory structure should look like this
- conf
    - logstash
        - logstash-springboot.conf
- docker-compose.yml
  • Execute startup command
docker-compose up -d
  • Check whether the startup status is normal
docker ps
CONTAINER ID   IMAGE                 COMMAND                  CREATED       STATUS          PORTS                                                 NAMES
6cff523389dc   logstash:7.7.0        "/usr/local/bin/dock..." 6 hours ago   Up 14 minutes   0.0.0.0:4560->4560/tcp, :::4560->4560/tcp, 9600/tcp   elk_logstash
eac2af4bfa55   kibana:7.7.0          "/usr/local/bin/dumb..." 6 hours ago   Up 6 hours      0.0.0.0:5601->5601/tcp, :::5601->5601/tcp             elk_kibana
6fb7fd998ecf   elasticsearch:7.7.0   "/tini -- /usr/local..." 6 hours ago   Up 2 minutes    0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp   elk_elasticsearch
  • Access port 5601 to see the Kibana UI

Configure the service

  • maven
<!-- Logback pushes logs to Logstash -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.6</version>
</dependency>
  • logback-spring.xml

      
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- The Logstash IP and exposed port that Logback sends logs to -->
        <destination>127.0.0.1:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
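The destination above is hard-coded. Since the file is logback-spring.xml, it can also read the address from Spring configuration via springProperty (the property name logstash.destination below is an assumption for illustration, not part of the original setup):

```xml
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>

    <!-- Assumed property: set logstash.destination in application.yml,
         e.g. logstash.destination: 127.0.0.1:4560 -->
    <springProperty scope="context" name="logstashDestination"
                    source="logstash.destination" defaultValue="127.0.0.1:4560"/>

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>${logstashDestination}</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
```

This keeps the Logstash address out of the XML, so each environment can point at its own Logstash instance.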
  • Write a log output endpoint; after calling it, the logs can be seen in Kibana
@GetMapping("/logs")
public String printLogs() {
    log.info(this.getClass().getSimpleName() + " info : " + LocalDateTime.now().getSecond());
    log.warn(this.getClass().getSimpleName() + " warn : " + LocalDateTime.now().getSecond());
    log.error(this.getClass().getSimpleName() + " error : " + LocalDateTime.now().getSecond());
    return "logs";
}
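The json_lines codec on the Logstash input expects one JSON object per line, which is essentially what LogstashEncoder writes to port 4560. A minimal sketch of that wire format (the field set is abbreviated and illustrative; the real encoder emits more fields such as thread_name and level_value):

```java
import java.time.Instant;

public class JsonLogLineDemo {
    // Build one JSON line roughly matching LogstashEncoder's output shape.
    // This illustrates the wire format only; it is not the real encoder.
    static String format(String level, String logger, String message) {
        return String.format(
            "{\"@timestamp\":\"%s\",\"level\":\"%s\",\"logger_name\":\"%s\",\"message\":\"%s\"}",
            Instant.now(), level, logger, message);
    }

    public static void main(String[] args) {
        // Each such line, terminated by \n, is what arrives on the tcp input.
        System.out.println(format("INFO", "DemoController", "hello elk"));
    }
}
```

Logstash parses each line into an event and forwards it to the Elasticsearch index configured in the output block.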
  • Then go to Kibana -> Discover to configure an index pattern

  • Enter the index pattern name and click Next

  • Create the index pattern

  • Wait for creation to complete

  • Go to Kibana -> Discover again to see the log

  • search