Tell a story

Those of you who follow my official account will know that I have written several articles on cloud-native application log collection and analysis, most of which focus on a specific component:

  • Super useful TraceId, use it quickly!
  • How to use NLog to output structured logs and analyze them elegantly in Kibana?
  • Why use a Logstash log collector when you can log directly to Elasticsearch?

This article documents a standard, non-invasive log collection scheme for containerized applications:

  1. What kind of logs should be collected?
  2. How do I output structured logs?
  3. How do I collect and analyze logs non-intrusively with EFK?

Customize ASP.NET Core logging; output structured logs to stdout; forward container logs non-intrusively with Fluent Bit; store them in Elasticsearch and analyze them in Kibana.

Customize ASP.NET Core logging

A classic Internet-facing application produces three kinds of logs: request logs, business-processing logs, and database-operation logs. When collecting logs, pay attention to these [specific log scenarios]:

  • APIs provided for third-party callers (disputes may arise)
  • Core business processes (the 996 grind)
  • Database operations (risk of someone dropping the database and running away)
  • HTTP requests made inside the application
  • Logs at the Warn, Error, and Fatal levels (always worth watching)

ASP.NET Core's flexible configuration system and pluggable components make logging easy to configure and manage.

Log Collection Strategy

The logging configuration of an ASP.NET Core application lives in the Logging section of the appsettings.{Environment}.json file. It supports multiple log providers, log filtering, and custom collection levels for specific log categories.

"Logging": { "LogLevel": { "Microsoft": "Warning", "Microsoft.AspNetCore.Hosting.Diagnostics": "Information", / / provided to a third party call API log "Microsoft. Hosting. Lifetime" : "Information", "Microsoft.EntityFrameworkCore.Database.Command": "Information", / / SQL database operation log "System.Net.Http.HttpClient" : "Information", / / record within HTTP requests "Default" : "Warning" // In addition to the above log, log Warning+ level}}Copy the code

The Logging configuration above targets those specific log scenarios and meets the log collection requirements of a classic Internet application.

NLog Provider

Structured logging introduces the [MessageTemplate] to solve the problem that traditional text logs are not machine friendly.
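As a quick illustration (a hypothetical OrderService, not from the original article), this is what logging against a message template looks like with Microsoft.Extensions.Logging:

using Microsoft.Extensions.Logging;

public class OrderService
{
    private readonly ILogger<OrderService> _logger;

    public OrderService(ILogger<OrderService> logger) => _logger = logger;

    public void Complete(int orderId, long elapsedMs)
    {
        // "Order {OrderId} completed in {ElapsedMs} ms" is the message template.
        // OrderId and ElapsedMs are captured as named properties, so a JSON
        // target (like the JsonLayout below) can emit them as separate fields.
        _logger.LogInformation("Order {OrderId} completed in {ElapsedMs} ms",
            orderId, elapsedMs);
    }
}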

① Use the NLog Provider to take over all log output

// Requires: Install-Package NLog.Web.AspNetCore
internal static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureLogging((hostContext, loggerBuilder) =>
        {
            // Remove the default providers so NLog takes over all log output
            loggerBuilder.ClearProviders();
            loggerBuilder.AddNLog("nlog.production.config");
        })
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });

② Use NLog's [JsonLayout] to convert traditional text logs into JSON-formatted logs:

<?xml version="1.0" encoding="utf-8"?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogFile="logs/nlog-internal.log"
      internalLogLevel="Info">
  <targets async="true">
    <target name="console" xsi:type="Console">
      <layout xsi:type="JsonLayout" includeAllProperties="true" excludeProperties="EventId_Id,EventId_Name,EventId">
        <attribute name="time" layout="${date:format=yyyy/MM/dd HH\:mm\:ss.fff zzz}" />
        <attribute name="category" layout="${logger}" />
        <attribute name="log_level" layout="${level:lowerCase=true}" />
        <attribute name="message" layout="${message}" />
        <attribute name="trace_id" layout="${aspnet-TraceIdentifier:ignoreActivityId=true}" />
        <attribute name="user_id" layout="${aspnet-user-identity}" />
        <attribute name="exception" layout="${exception:format=tostring}" />
      </layout>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="console" ruleName="console" />
  </rules>
</nlog>

Noteworthy business log fields:

  • includeAllProperties="true" outputs every property attached to a log entry
  • trace_id=${aspnet-TraceIdentifier:ignoreActivityId=true} records the trace id, which is very useful when debugging
  • user_id=${aspnet-user-identity} records the name of the user who produced the log

The application startup log then looks like this:
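A hand-written sample (the values are illustrative; the shape follows the JsonLayout above):

{ "time": "2021/01/01 10:00:00.000 +08:00", "category": "Microsoft.Hosting.Lifetime", "log_level": "info", "message": "Application started. Press Ctrl+C to shut down." }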

Keep stdout as the output target for all application logs, so that Fluent Bit can collect them non-intrusively!

TODO: build the Docker image.

Fluent Bit collects container logs

Fluent Bit handles the log collection, and it is delightfully lightweight!

To collect container logs, change the container's logging driver to [fluentd]. The fluentd driver listens for forward messages on host port 24224 by default.

A simple docker-compose container example:

Version: "3.7" services: website: image: ${DOCKER_REGISTRY}/eap/website:0.1 ports: - "80/80" environment: - TZ=Asia/Shanghai networks: - webnet logging: driver: fluentd options: # fluentd-address: localhost:24224 tag: eap-website restart: always networks: webnet: external: true name: eap-netCopy the code

The fluentd driver collects logs in the following format:

{ "container_id": "..." , "container_name": "..." , "source": "stdout", "log": "This is log content" }Copy the code

The JSON logs produced by the containerized application end up string-encoded inside the log field. That is embarrassing: the log we deliberately structured does not have its fields extracted!
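To make the problem concrete, a hand-written sample of what arrives at the collector: the structured entry survives only as one escaped string inside the log field:

{ "container_id": "...", "container_name": "...", "source": "stdout", "log": "{\"time\": \"2021/01/01 10:00:00.000 +08:00\", \"log_level\": \"info\", \"message\": \"Application started...\"}" }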

Fluent Bit offers a Decoders plugin that can decode the encoded JSON string:
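For reference, a minimal parsers.conf sketch modeled on the stock docker parser from the Fluent Bit documentation (the Decode_Field_As line applies the json decoder to the log field; treat the exact file as an assumption, not the article's own):

[PARSER]
    Name        docker
    Format      json
    Time_Key    time
    Time_Format %Y-%m-%dT%H:%M:%S.%L
    Time_Keep   On
    # apply the json decoder to the escaped string held in the "log" field
    Decode_Field_As json log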

The complete fluent-bit.conf is as follows:

[SERVICE]
    flush            1
    log_level        info
    daemon           off
    http_server      on             # enable the built-in HTTP server
    http_listen      0.0.0.0
    http_port        2020
    storage.metrics  on
    parsers_file     parsers.conf   # file the parsers are loaded from

[INPUT]
    name             forward
    max_chunk_size   1M
    max_buffer_size  5M

# Parser filter (docker): decode the application's JSON logs
[FILTER]
    name             parser         # plugin name, an enumerated value
    match            eap-website    # which tagged logs to match
    key_name         log            # field to be parsed
    parser           docker         # parse in docker log format (defined in parsers.conf)
    preserve_key     true           # keep the parsed source field
    reserve_data     true           # keep the other original fields

# Parser filter (nginx): this part handles the nginx container logs
[FILTER]
    name             parser
    match            eap-frontend
    parser           nginx
    key_name         log
    preserve_key     true
    reserve_data     true

[OUTPUT]
    name             es
    match            *
    host             es01
    port             9200
    logstash_format  on
    replace_dots     on
    retry_limit      false
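For the forward input above to receive anything, Fluent Bit itself must be listening on port 24224. A minimal docker-compose sketch of running it next to the application (the image tag and mount paths are my assumptions, not from the article):

fluent-bit:
  image: fluent/fluent-bit:1.8
  ports:
    - "24224:24224"   # forward input targeted by the fluentd logging driver
    - "2020:2020"     # the http_server enabled in [SERVICE]
  volumes:
    - ./fluent-bit.conf:/fluent-bit/etc/fluent-bit.conf
    - ./parsers.conf:/fluent-bit/etc/parsers.conf
  networks:
    - webnet
  restart: always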

The output now shows the JSON fields from the log field extracted as top-level fields.

Nice, logs can now be analyzed comfortably in Kibana.

The complete source configuration for EFK container log collection is on GitHub: github.com/zaozaoniao/…

This gives a complete container log collection and analysis solution: ① a log collection strategy for classic Internet applications, ② structured output, ③ non-invasive collection. The GitHub configuration source is provided and is suitable for both study and commercial use.

  • docs.fluentbit.io/manual/pipe…