1. Requirements

Use Logstash to collect logs from the system, use grok to parse them, use mutate to convert the types of the parsed fields, delete fields, and rename fields, and finally output the parsed data to Elasticsearch.

2. Implementation steps

1. Write the pipeline file

vim output-es.yml

input {
  file {
    id => "mutate-id"
    path => ["/Users/huan/soft/elastic-stack/logstash/logstash/pipeline.conf/output-es/*.log"]
    start_position => "beginning"
    sincedb_path => "/Users/huan/soft/elastic-stack/logstash/logstash/pipeline.conf/output-es/sincedb.db"
    codec => multiline {
      pattern => "^\[+"
      negate => "true"
      what => "previous"
      charset => "UTF-8"
      auto_flush_interval => 2
    }
  }
}
filter {
  grok {
    match => {
      "message" => "(?m)^\[%{INT:pid}\]%{SPACE}%{TIMESTAMP_ISO8601:createTime}%{SPACE}\[%{DATA:threadName}\]%{SPACE}%{LOGLEVEL:LEVEL}%{SPACE}%{JAVACLASS:javaClass}#(?<methodName>[a-zA-Z_]+):%{INT:linenumber}%{SPACE}-%{GREEDYDATA:msg}"
    }
    remove_field => ["message"]
  }
  mutate {
    convert => { "pid" => "integer" }
    rename => { "msg" => "message" }
  }
  date {
    match => ["createTime", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
    remove_field => ["createTime"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200","http://localhost:9201","http://localhost:9202"]
    user => "springboot_logstash"
    password => "123456"
    index => "springboot-%{+YYYY.MM.dd}"
    template_overwrite => "false"
  }
}

1. elasticsearch output parameters explained:

  1. hosts: the addresses used to access ES; pointing at the master nodes is recommended.
  2. user: the username used to access ES.
  3. password: the password used to access ES.
  4. index: the name of the index in ES.
  5. template: the path to your own ES index template.
  6. template_name: the name of the index template to use in ES.
  7. The ES password above is stored in plain text, which risks a leak; the Logstash keystore can be used to avoid this.

    1. Refer to the link https://www.elastic.co/guide/en/logstash/current/keystore.html
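A minimal sketch of the keystore approach, following the Logstash documentation linked above (the key name `ES_PWD` is an arbitrary choice):

```shell
# Create the keystore once, then add the password under a key of your choosing.
bin/logstash-keystore create
bin/logstash-keystore add ES_PWD    # you are prompted to enter the password

# In the pipeline file, reference the keystore entry instead of plain text:
#   password => "${ES_PWD}"
```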

An exception that may be reported:

{
    "error": {
        "root_cause": [
            {
                "type": "security_exception",
                "reason": "action [indices:data/write/bulk] is unauthorized for user [logstash_system] on indices [], this action is granted by the index privileges [create_doc,create,delete,index,write,all]"
            }
        ],
        "type": "security_exception",
        "reason": "action [indices:data/write/bulk] is unauthorized for user [logstash_system] on indices [], this action is granted by the index privileges [create_doc,create,delete,index,write,all]"
    },
    "status": 403
}

When using the logstash_system user, this error indicates that the user lacks the privileges required for the indices:data/write/bulk action.
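One way to resolve the 403 is to stop reusing the built-in logstash_system user and instead create a dedicated role and user via the Elasticsearch security API. The role and user names below are illustrative, not prescribed by the source:

```shell
# Create a role with write privileges on the springboot-* indices
# (run as a superuser such as elastic; you will be prompted for its password).
curl -u elastic -X POST "http://localhost:9200/_security/role/springboot_writer" \
  -H 'Content-Type: application/json' -d '
{
  "cluster": ["monitor"],
  "indices": [
    {
      "names": ["springboot-*"],
      "privileges": ["create_index", "create_doc", "index", "write"]
    }
  ]
}'

# Create the user referenced by the pipeline and assign it the role.
curl -u elastic -X POST "http://localhost:9200/_security/user/springboot_logstash" \
  -H 'Content-Type: application/json' -d '
{
  "password": "123456",
  "roles": ["springboot_writer"]
}'
```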

2. Prepare test data

[9708] 2021-05-13 11:14:51.873 [http-nio-8080-exec-1] INFO org.springframework.web.servlet.DispatcherServlet#initServletBean:547 - Completed initialization in 1 ms
[9708] 2021-05-13 11:14:51.910 [http-nio-8080-exec-1] ERROR com.huan.study.LogController#showLog:32 - request: [/showLog] exception occurred
java.lang.ArithmeticException: / by zero
	at com.huan.study.LogController.showLog(LogController.java:30)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
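To see which fields the grok expression pulls out of a line like the ones above, here is a rough Python approximation of that pattern. The regex is a hand-written sketch for illustration only, not what grok actually compiles:

```python
import re

# Rough equivalent of the pipeline's grok expression:
# [pid] timestamp [thread] LEVEL class#method:line - msg
LOG_PATTERN = re.compile(
    r"^\[(?P<pid>\d+)\]\s+"
    r"(?P<createTime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\s+"
    r"\[(?P<threadName>[^\]]*)\]\s+"
    r"(?P<LEVEL>[A-Z]+)\s+"
    r"(?P<javaClass>[\w.$]+)#(?P<methodName>[a-zA-Z_]+):(?P<linenumber>\d+)\s+-"
    r"(?P<msg>.*)",
    re.DOTALL,  # stack-trace lines folded in by the multiline codec stay in msg
)

line = ("[9708] 2021-05-13 11:14:51.873 [http-nio-8080-exec-1] INFO "
        "org.springframework.web.servlet.DispatcherServlet#initServletBean:547 "
        "- Completed initialization in 1 ms")

m = LOG_PATTERN.match(line)
print({k: m.group(k) for k in ("pid", "LEVEL", "javaClass", "methodName")})
```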

3. Start Logstash

bin/logstash -f output-es.yml

4. Create an index pattern in Kibana

5. Search the logs
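As a quick check outside Kibana, the new indices can also be inspected and searched directly over the ES REST API (assuming the addresses and credentials from the pipeline above):

```shell
# List the daily indices created by the pipeline.
curl -u springboot_logstash:123456 "http://localhost:9200/_cat/indices/springboot-*?v"

# Search the parsed logs, e.g. for ERROR-level entries.
curl -u springboot_logstash:123456 "http://localhost:9200/springboot-*/_search?q=LEVEL:ERROR&pretty"
```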

3. Reference documents

1. https://www.elastic.co/guide/en/logstash/current/keystore.html

2. https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html