Using ELK Stack for Log Analysis

Tutorial 4 of 5

1. Introduction

This tutorial shows how to set up and use the ELK Stack (Elasticsearch, Logstash, and Kibana) for log analysis.

By the end of this tutorial, you will be able to:
- Understand the components of the ELK Stack
- Install and configure the ELK Stack
- Use the ELK Stack to analyze logs

Prerequisites:
- Basic understanding of Linux command line
- Familiarity with JSON format

2. Step-by-Step Guide

Installation of ELK Stack components

  1. Elasticsearch Installation

  Elasticsearch is a distributed search and analytics engine built on Apache Lucene. In the ELK Stack it stores and indexes the logs. Install Elasticsearch with the following commands:
sudo apt update
sudo apt install apt-transport-https
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update
sudo apt install elasticsearch
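
  The package does not start the service automatically. A typical way to enable and verify it (assuming a systemd-based system and the default port 9200) is:

```shell
# Enable Elasticsearch at boot and start it now (systemd assumed)
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch

# Verify the node is responding on its default HTTP port;
# a healthy node returns a JSON document with cluster and version info
curl -s http://localhost:9200
```

  If curl returns nothing, give the JVM a few seconds to start and check the service status with systemctl status elasticsearch.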
  2. Logstash Installation

  Logstash is a log processing pipeline that ingests data from multiple sources, applies transformations, and ships the results to one or more destinations such as Elasticsearch. Install Logstash with the following commands:
sudo apt update
sudo apt install logstash
  3. Kibana Installation

  Kibana is the visualization layer of the stack; it runs on top of Elasticsearch and lets you explore and chart the indexed data. Install Kibana with the following commands:
sudo apt update
sudo apt install kibana
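
  With all three packages installed, the remaining services can be enabled and started the same way as Elasticsearch (systemd assumed):

```shell
# Enable and start Logstash and Kibana
sudo systemctl enable logstash kibana
sudo systemctl start logstash kibana

# Confirm both services are running
systemctl is-active logstash kibana
```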

3. Code Examples

The following code snippets show how to use the ELK Stack for log analysis.

  1. Configuring Logstash

  Create a configuration file for Logstash:

sudo nano /etc/logstash/conf.d/logstash.conf

  Add the following configuration to the file:
input {
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGBASE}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
  This configuration tells Logstash to read /var/log/syslog from the beginning, parse each line with the SYSLOGBASE grok pattern, and send the parsed events to Elasticsearch on localhost:9200.
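
  To get a feel for what %{SYSLOGBASE} extracts, here is a rough shell sketch that pulls the program name out of a sample syslog line, roughly the way the pattern's program field would (the log line, host name, and user are made up for illustration):

```shell
# A sample syslog-formatted line (hypothetical host, program, and user)
line='Jun 10 12:34:56 myhost sshd[1234]: Accepted password for alice'

# %{SYSLOGBASE} parses the timestamp, logsource, and program[pid] prefix.
# This sed expression mimics the "program" capture: skip the timestamp and
# hostname, then take everything up to the first '[' or ':'
program=$(echo "$line" | sed -E 's/^[A-Z][a-z]{2} +[0-9]+ [0-9:]{8} [^ ]+ ([^:[]+).*/\1/')
echo "$program"   # prints "sshd"
```

  In the real pipeline, grok attaches these captures as structured fields (timestamp, logsource, program, pid) on each event, which is what makes them searchable in Kibana.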

  2. Analyzing Logs in Kibana

  After Logstash and Elasticsearch are running, use Kibana to visualize and analyze the logs:
  • Access Kibana at http://localhost:5601.
  • Create an index pattern matching the Logstash indices (logstash-*).
  • Navigate to the "Discover" tab to view and search the logs.
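
  Before opening Kibana, you can confirm from the command line that Logstash has actually written documents into Elasticsearch (the index names assume the output plugin's default logstash-* naming):

```shell
# List indices; expect entries named like logstash-<date> once events flow
curl -s 'http://localhost:9200/_cat/indices?v'

# Count documents across the Logstash indices
curl -s 'http://localhost:9200/logstash-*/_count?pretty'
```

  If the count stays at zero, check the Logstash logs under /var/log/logstash for pipeline errors.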

4. Summary

In this tutorial, we have covered the following points:
- Introduction to ELK Stack
- How to install and configure Elasticsearch, Logstash, and Kibana
- How to analyze logs using Kibana

To continue learning, you can explore the following resources:
- Elasticsearch Documentation
- Logstash Documentation
- Kibana Documentation

5. Practice Exercises

  1. Write a Logstash configuration to read logs from a different log file (e.g., /var/log/auth.log).

  2. Modify the Logstash filter in the example to parse a different type of log format. (Hint: Look into other grok patterns)

  3. Create a visualization in Kibana based on the analyzed logs.

Remember, the best way to learn is by doing. Keep practicing and exploring different features of the ELK Stack. Good luck!