Configure Logstash

input {
   s3 {
     aws_credentials_file => "/app/aws-credentials.yaml"
     region => "us-west-1"
     bucket => "aws-test-qa"
     prefix => "QA_APILOG_2021-04/ip-10-0-1-10/2021-04-14"
     sincedb_path => "/app/logstash/.qa-kong-logs-s3-sincedb"
     type => "qa-kong-logs-s3"
   }
}
filter {
    if [type] == "qa-kong-logs-s3" {
        grok {
            match => { "message" => "%{SYSLOGBASE} %{GREEDYDATA:msg_content}" }
        }
 
        json {
            source => "msg_content"
            remove_field => ["message", "msg_content"]
        }
 
        date {
            match => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
            target => "@timestamp"
            remove_field => [ "timestamp" ]
        }

        if "_grokparsefailure" in [tags] {
            drop { }
        }
    }
}
output {
    if [type] == "qa-kong-logs-s3" {
        elasticsearch {
            hosts => ["localhost:9200"]
            user => "qa"
            password => "xxxxx"
            index => "qa-kong-logs-s3-%{+YYYY.MM.dd}"
        }
        stdout { codec => rubydebug }
    }
}
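The `aws_credentials_file` referenced in the `s3` input is a YAML file. Per the Logstash AWS plugin conventions it uses Ruby-symbol keys; a sketch with placeholder values (not real credentials):

```yaml
# /app/aws-credentials.yaml — placeholder values, replace with your own
:access_key_id: "AKIAXXXXXXXXXXXXXXXX"
:secret_access_key: "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
```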
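To see what the grok stage does, here is a rough shell approximation (the sample log line, host name, and JSON fields are made up): `%{SYSLOGBASE}` consumes the syslog timestamp, host, and program, and `%{GREEDYDATA:msg_content}` captures the remainder, which the `json` filter then parses.

```shell
# Hypothetical syslog line as Kong might emit it (fields are made up)
line='Apr 14 09:26:01 ip-10-0-1-10 kong: {"client_ip":"10.0.1.25","status":200}'

# Rough equivalent of SYSLOGBASE + GREEDYDATA:msg_content — strip the
# syslog prefix; what remains is the JSON payload for the json filter
echo "$line" | sed -E 's/^[A-Z][a-z]{2} +[0-9]+ [0-9:]+ [^ ]+ [^:]+: //'
# → {"client_ip":"10.0.1.25","status":200}
```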

Test the configuration

 sudo /usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/qa-kong-logs-s3.conf

After a few seconds, if there are no syntax errors, it should print Configuration OK. Otherwise, read the error output to find out what is wrong with the Logstash configuration.

Restart Logstash

  sudo systemctl restart logstash
  sudo systemctl enable logstash
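To confirm the pipeline actually started, you can check the service and follow its log (this assumes a package install where the systemd unit is named `logstash`; these commands need a live host, so they are shown here only as a sketch):

```shell
# Check that the service came up cleanly
sudo systemctl status logstash --no-pager

# Follow Logstash's own log for startup errors or S3/Elasticsearch issues
sudo journalctl -u logstash -f
```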

Check the index in Elasticsearch

  curl http://localhost:9200/_cat/indices/qa-kong-logs-s3*
Copy the code
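If the index shows up, you can also pull one sample document to confirm that the grok/json filters produced the expected fields. This is a sketch against a live cluster; the user and password are the placeholders from the `elasticsearch` output block above, so substitute your real credentials:

```shell
# Fetch one parsed document from the qa-kong-logs-s3-* indices
curl -u qa:xxxxx 'http://localhost:9200/qa-kong-logs-s3-*/_search?size=1&pretty'
```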