The rich programming languages available today give programmers a wealth of tools for building applications. Whether it’s an established giant like Java or a newer language like Go, applications need to be monitored after deployment. In this article, you’ll learn how to send Golang logs to the ELK Stack and Logz.io.

You can often gauge the health of your application by looking at its logs. However, log data tends to grow rapidly over time, especially as more applications are deployed and distributed across multiple servers. This is where the Elastic Stack, with its ability to store large amounts of data and search it quickly and easily, comes in handy.

In this article, you’ll learn how to import logs written by a Go application. The Go programming language (also known as Golang) is a relatively new but mature general-purpose language that has gained widespread adoption in the programming community and among major cloud providers.

In the previous article, “Structuring logs with Filebeat,” I showed how to structure logs directly with Filebeat, using a Python application as the example. Today, I’ll walk through the same process with a Go application.

 

Golang Log Overview

You can write logs to files from a Go program in several different ways. One option is the Logrus library, which is very easy to use and has all the features you need to write informative logs and send them to Elasticsearch with minimal effort.

First, get the Logrus package by running the following command in your terminal. Note that in some regions the download may fail because some Go dependencies are hosted on Google’s servers; if that happens, configure a module proxy as shown below:

go get github.com/sirupsen/logrus

To set up the proxy, export the GOPROXY environment variable:

export GOPROXY=https://goproxy.io

Run the command above in your terminal, and then fetch the package again:

go get github.com/sirupsen/logrus

Our Go program is very simple:

main.go

package main

import (
    "os"

    log "github.com/sirupsen/logrus"
)

func main() {
    // Emit JSON, and rename the default keys so they match the field
    // names we want in Elasticsearch.
    log.SetFormatter(&log.JSONFormatter{
        FieldMap: log.FieldMap{
            log.FieldKeyTime: "@timestamp",
            log.FieldKeyMsg:  "message",
        },
    })
    log.SetLevel(log.TraceLevel)

    // Write the logs to out.log. If the file cannot be opened, keep
    // the default output (stderr) instead of dereferencing a nil file.
    file, err := os.OpenFile("out.log", os.O_RDWR|os.O_CREATE|os.O_APPEND, 0666)
    if err == nil {
        log.SetOutput(file)
        defer file.Close()
    }

    fields := log.Fields{"userId": 12}
    log.WithFields(fields).Info("User logged in!")

    fields = log.Fields{"userId": 12}
    log.WithFields(fields).Info("Sent a message!")

    fields = log.Fields{"userId": 12}
    log.WithFields(fields).Info("Failed to get a message!")

    fields = log.Fields{"userId": 12}
    log.WithFields(fields).Info("User logged out!")
}

Above is a very simple Go application. It opens a file for writing and sets it as the output destination of the Logrus logger. Now, whenever you call log.Info(...), the message is written to that file. You can also (optionally) enrich the log data with other relevant information, such as user identifiers, which helps with troubleshooting by providing additional context. The program generates several log entries and saves them to a file called out.log.
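For instance, here is a minimal standalone sketch of what richer context could look like; the requestId and durationMs fields and the simulated error are hypothetical examples, not part of this tutorial’s application:

package main

import (
    "errors"

    log "github.com/sirupsen/logrus"
)

func main() {
    log.SetFormatter(&log.JSONFormatter{})

    // Attach several hypothetical context fields to a single entry.
    log.WithFields(log.Fields{
        "userId":     12,
        "requestId":  "a1b2c3", // hypothetical example value
        "durationMs": 87,       // hypothetical example value
    }).Warn("Slow response detected")

    // WithError records the error message under an "error" field.
    log.WithError(errors.New("connection reset")).Error("Failed to reach backend")
}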

We can run it using the following command:

go run main.go

We can view the files in the current directory:

$ pwd
/Users/liuxg/go/es_logs
$ go run main.go
$ ls
main.go out.log

Above, we can see a file called out.log. It reads as follows:

$ cat out.log
{"@timestamp":"2020-10-16T14:37:44+08:00","level":"info","message":"User logged in!","userId":12}
{"@timestamp":"2020-10-16T14:37:44+08:00","level":"info","message":"Sent a message!","userId":12}
{"@timestamp":"2020-10-16T14:37:44+08:00","level":"info","message":"Failed to get a message!","userId":12}
{"@timestamp":"2020-10-16T14:37:44+08:00","level":"info","message":"User logged out!","userId":12}

From the above, we can see that these are well-structured logs. Each entry is JSON because the JSON formatter was set at the beginning of the program. Whenever possible, formatting logs as JSON makes it easier to send them to Elasticsearch without additional configuration, because JSON properties and values map directly to fields in Elasticsearch. By contrast, for text logs with no obvious structure, you must tell Elasticsearch how to parse the data.
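To make the contrast concrete, here is a minimal sketch that logs the same entry with Logrus’s default text formatter and then with the JSON formatter; only the latter maps directly onto Elasticsearch fields (the example output in the comments is approximate):

package main

import (
    log "github.com/sirupsen/logrus"
)

func main() {
    // Text output, e.g.:
    //   time="2020-10-16T14:37:44+08:00" level=info msg="User logged in!" userId=12
    // Elasticsearch would need a custom parsing rule to split this into fields.
    log.SetFormatter(&log.TextFormatter{})
    log.WithField("userId", 12).Info("User logged in!")

    // JSON output, e.g.:
    //   {"level":"info","msg":"User logged in!","time":"...","userId":12}
    // Each property maps directly to an Elasticsearch field.
    log.SetFormatter(&log.JSONFormatter{})
    log.WithField("userId", 12).Info("User logged in!")
}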

This ease of writing logs to JSON-formatted files (with the option of including additional fields as needed) puts you in a good position to send your logs to Elasticsearch.

Logrus

Like almost all other logging libraries, Logrus allows you to write logs at many different severity levels (info, warning, error, and so on). This is done by calling the appropriate function, such as Info(). You can also configure the minimum level that actually gets written.

For example, after you call log.SetLevel(log.TraceLevel), only logs at trace level or higher are written. Since trace is the lowest level, this call causes all logs to be written, regardless of their level. You could change it to log.InfoLevel to ignore logs at the trace or debug levels.
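A minimal sketch of that behavior:

package main

import (
    log "github.com/sirupsen/logrus"
)

func main() {
    // With the minimum level set to Info, trace and debug entries are dropped.
    log.SetLevel(log.InfoLevel)

    log.Trace("discarded")
    log.Debug("discarded")
    log.Info("written")
    log.Warn("written")
}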

 

Transport Golang Logs to ELK

Writing logs to files has many benefits. The process is fast and robust, and the application does not need to know anything about the type of storage that will ultimately hold the logs. The Elastic Stack provides Beats to help you collect data from a variety of sources (including files) and ship it to Elasticsearch reliably and efficiently. Once the log data is in Elasticsearch, you can analyze it with Kibana.

Log data sent to Elasticsearch needs to be parsed so that Elasticsearch can structure it correctly. Elasticsearch handles JSON data easily; for other formats, you can set up more complex parsing.

We create the following file in Filebeat’s installation directory:

filebeat_json.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /Users/liuxg/go/es_logs/out.log
  json:
    keys_under_root: true
    overwrite_keys: true
    message_key: 'message'

processors:
  - decode_json_fields:
      fields: ['message']
      target: json
 
setup.template.enabled: false
setup.ilm.enabled: false
 
output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "logs_json"
  bulk_max_size: 1000

Please note: you must replace the paths above with your own.
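Optionally, before importing anything, you can ask Filebeat to validate the configuration file; recent Filebeat versions (including 7.x) provide a test subcommand that prints “Config OK” when the file parses cleanly:

$ ./filebeat test config -c ./filebeat_json.yml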

We use the following command to import the data:

$ ls filebeat_json.yml 
filebeat_json.yml

$ ./filebeat -e -c ./filebeat_json.yml

After running the above command, we can check for the newly created logs_json index:

GET _cat/indices

The command above shows:

green  open .apm-custom-link               CO_6a4_ISAGSswWYjZrNjQ 1 0  0    0    208b    208b
yellow open logs_json                      Is-RVM33T920Ffd35I0ZUw 1 1  4    0   5.6kb   5.6kb
green  open .kibana_task_manager_1         PACoKKQ9SC6n7ui8YZe0DQ 1 0  6 1845 239.9kb 239.9kb
green  open .kibana-event-log-7.9.1-000001 hL1iRZ8cRX6qe4of7TPYMQ 1 0  1    0   5.5kb   5.5kb
green  open .apm-agent-configuration       W5BmEwunQ96q6KdABWW3pA 1 0  0    0    208b    208b
green  open .kibana_1                      Rnt_C5o5RquYYgkvBZ3xJQ 1 0 13    2  10.4mb  10.4mb

We can see that the logs_json index has been created.

We can check its contents with the following command:

GET logs_json/_search

The command above shows:

{ "took" : 0, "timed_out" : false, "_shards" : { "total" : 1, "successful" : 1, "skipped" : 0, "failed" : 0 }, "hits" : {" total ": {" value" : 4, "base" : "eq"}, "max_score" : 1.0, "hits" : [{" _index ":" logs_json ", "_type" : "_doc", "_id" : "mePsNXUBuukB9WUDNLdT", "_score" : 1.0, "_source" : {" @ timestamp ": "The 2020-10-16 T06:37:44. 000 z", "userId" : 12, "input" : {" type ":" log "}, "ecs" : {" version ":" 1.5.0} ", "the host" : {" name ":" liuxg} ", "agent" : {" type ":" filebeat ", "version" : "7.9.1", "hostname" : "liuxg", "ephemeral_id" : "2c0b6672-cb9e-4074-bee4-c550622d273a", "id" : "a1e3e46b-ca14-457d-bbfb-97133166b5b9", "name" : "liuxg" }, "level" : "info", "log" : { "offset" : 0, "file" : { "path" : "/Users/liuxg/go/es_logs/out.log" } }, "message" : "User logged in!" } }, { "_index" : "logs_json", "_type" : "_doc", "_id" : "muPsNXUBuukB9WUDNLdT", "_score" : 1.0, "_source" : {" @ timestamp ":" the 2020-10-16 T06:37:44. 000 z ", "ecs" : {" version ":" 1.5.0} ", "the host" : {" name ": "liuxg" }, "log" : { "offset" : 98, "file" : { "path" : "/Users/liuxg/go/es_logs/out.log" } }, "level" : "info", "message" : "Sent a message!", "userId" : 12, "input" : { "type" : "log" }, "agent" : { "hostname" : "liuxg", "ephemeral_id" : "2c0b6672-cb9e-4074-bee4-c550622d273a", "id" : "A1e3e46b-ca14-457d-bbfb-97133166b5b9 ", "name" : "liuxg", "type" :" fileBeat ", "version" : "7.9.1"}}}, {"_index" : "Logs_json _type", "" :" _doc ", "_id" : "m - PsNXUBuukB9WUDNLdT", "_score" : 1.0, "_source" : {" @ timestamp ": "2020-10-16T06:37:44.000z ", "message" : "Failed to get a message!", "input" : {"type" : "log"}, "agent" : {"id" : "A1e3e46b-ca14-457d-bbfb-97133166b5b9 ", "name" : "liuxg", "type" :" fileBeat ", "version" : "7.9.1", "hostname" : "Liuxg ", "ephemeral_id" :" 2C0B6672-CB9E-4074-BEE4-C550622D273A "}, "ECS" : {"version" : "1.5.0"}, "host" : { "name" : "liuxg" }, "log" : { "offset" : 196, "file" : { "path" : "/Users/liuxg/go/es_logs/out.log" } }, "userId" : 12, "level" : "info" } }, { "_index" : "logs_json", "_type" : "_doc", "_id" : "nOPsNXUBuukB9WUDNLdT", "_score" : 1.0, the "_source" : {" @ timestamp ":" the 2020-10-16 T06:37:44. 000 z ", "level" : "info", "message" : "User logged out!", "userId" : 12, "input" : { "type" : "log" }, "agent" : { "type" : "filebeat", "version" : "7.9.1", "hostname" : "liuxg", "ephemeral_id" : "2C0B6672-CB9E-4074-BEE4-C550622D273A ", "id" : "A1e3e46b - ca14-457 - d - b5b9 BBFB - 97133166", "name" : "liuxg"}, "ecs" : {" version ":" 1.5.0} ", "the host" : {" name ": "liuxg" }, "log" : { "file" : { "path" : "/Users/liuxg/go/es_logs/out.log" }, "offset" : 303 } } } ] } }Copy the code

From the above, we can see that we have successfully imported the logs into Elasticsearch.