Log aggregation using the ELK stack

In the previous section, we looked at the centralized logging approach for a distributed, microservices-based application. Its components, such as log streams, log shippers, log storage, and the log dashboard, work together to provide a centralized logging solution for distributed applications, whether they are deployed in a container-based environment or on virtual/physical machines.

Logstash is an open source tool for collecting, parsing, and storing logs for later use. Kibana is a web interface for searching and viewing the logs that Logstash has indexed. Both tools work on top of Elasticsearch, which stores the logs.

Elasticsearch, Logstash, and Kibana, collectively known as the ELK stack, provide an end-to-end logging solution for distributed applications. ELK is one of the most commonly used architectures for custom log management. The following diagram shows the centralized log-monitoring architecture:

As the diagram of the ELK stack tools shows, the multiple microservices (A, B, C, and D) use Log4j/Logback to emit log streams, and Logback appenders write those streams directly to Logstash. Logstash acts as a broker between the log streams and log storage, forwarding log messages to Elasticsearch, which saves the generated logs in the form of text-based indexes. Kibana queries these indexes and serves as the log dashboard, displaying log-analysis reports.
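To make the appender side concrete, the following is a minimal `logback.xml` sketch that ships log events to Logstash over TCP as JSON. It assumes the commonly used `logstash-logback-encoder` library is on the application's classpath; the hostname `elk-server` and port `5000` are placeholders for your own ELK server.

```xml
<!-- logback.xml: ship log events to Logstash over TCP as JSON.
     Assumes the logstash-logback-encoder library is on the classpath.
     The destination host and port are placeholders. -->
<configuration>
  <appender name="LOGSTASH"
            class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>elk-server:5000</destination>
    <!-- Encode each log event as a single JSON object -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```

With this configuration in place, each microservice keeps using the ordinary SLF4J/Logback logging API; only the appender configuration changes, which is what lets the services write log streams to Logstash directly, as in the diagram.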

Let's see the following steps to implement the ELK stack for central custom logging:

Step 1: Install the three components of the centralized logging approach: download and install Elasticsearch, Kibana, and Logstash on a single server, known as the ELK server.
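Once the three components are installed, Logstash needs a pipeline definition to play its broker role. The following is a minimal pipeline sketch, assuming log events arrive as JSON over TCP; the port, the Elasticsearch host, and the index name pattern are placeholders for your environment.

```
# logstash.conf: a minimal pipeline sketch.
# The TCP port, Elasticsearch host, and index name are placeholders.
input {
  tcp {
    port  => 5000
    codec => json_lines   # expect one JSON log event per line
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "microservice-logs-%{+YYYY.MM.dd}"
  }
}
```

Logstash is started with this file (for example, `bin/logstash -f logstash.conf`), after which Kibana can be pointed at the resulting indexes to build the log dashboard.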
