ELK Stack / Work with ELK Stack from command line

Posted on Feb. 11, 2019
Elasticsearch
Logstash
Kibana
ELK Stack

This tutorial gives a basic understanding of the ELK Stack and how to work with it from the command line.

ELK Stack is the most popular log analysis platform, where E: Elasticsearch, L: Logstash, K: Kibana.

Elasticsearch: A real-time distributed search and analytics engine. It is an Apache Lucene based search engine exposed over REST APIs, implemented in Java. It supports full-text search and is completely document oriented instead of using tables and schemas. It is often used in single-page application projects. You can query and analyze structured as well as unstructured data in any form you want; this makes it helpful for indexing logs in JSON format and making them searchable.

Advantages: scalability, really fast, multilingual, document oriented (JSON), autocompletion and instant search, schema free.
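Because Elasticsearch is document oriented and schema free, a log record is just a JSON document. A minimal sketch of what one looks like (the index name `logs` and the fields shown are illustrative, not from the tutorial):

```shell
# A log record as Elasticsearch stores it: schema-free JSON, no table or schema needed.
doc='{"timestamp": "2019-02-11T10:00:00Z", "level": "ERROR", "message": "disk full"}'

# With a running cluster you would index it over the REST API, for example:
#   curl -XPOST "http://localhost:9200/logs/_doc" -H "Content-Type: application/json" -d "$doc"

# Pretty-print the document to see its structure:
echo "$doc" | python3 -m json.tool
```

Any JSON field can later be searched with full-text queries, without declaring a schema first.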

Installation:

    Step 1: Register the Elastic signing key so that the package is verified after installation

      wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

      sudo apt-get install apt-transport-https (Debian only)

    Step 2: Add the repository to the system

      echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

    Step 3: Update the repositories and install Elasticsearch

      sudo apt-get update

      sudo apt-get install elasticsearch

Elasticsearch is now installed on your system. Now it's time to configure it.

Open the /etc/elasticsearch/elasticsearch.yml file (created automatically during package installation) and uncomment the properties network.host and http.port:

# Set the bind address to a specific IP (IPv4 or IPv6):
network.host: "localhost"
# Set a custom port for HTTP:
http.port: 9200

Start the service with sudo service elasticsearch start. Once it is up, you can verify it responds by running curl http://localhost:9200, which returns a small JSON document with the node name and version.

Logstash: A tool for collecting and monitoring logs from remote machines, processing them, and sending them down the pipeline. These logs can be centralized or decentralized; storing all collected logs in a central place makes them easy to access.

        Logs from remote places --> Collection --> Cleansing --> Convert to the required format (structured/unstructured) --> Analyze --> Obtain results

Data pipeline for Elasticsearch.
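To make the stages above concrete, here is the same collect-cleanse-convert-analyze flow mimicked with plain shell tools on a canned sample (the log file and its format are made up for illustration; Logstash does this at scale with configurable inputs, filters, and outputs):

```shell
# collect: a canned sample of "remote" log lines (illustrative format)
cat <<'EOF' > /tmp/sample.log
2019-02-11 10:00:01 INFO  service started
2019-02-11 10:00:05 ERROR disk full
2019-02-11 10:00:09 ERROR disk full
EOF

# cleanse: keep only error lines
# convert: reshape each line to just its message
# analyze: count occurrences of each distinct message
grep ERROR /tmp/sample.log | awk '{print $4, $5}' | sort | uniq -c
```

The output shows "disk full" occurred twice; Logstash performs the same kind of filtering and reshaping continuously on live streams.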

Why Logstash: issue debugging, security analysis, predictive analysis, Internet of Things debugging, performance analysis.

Installation

    Step 1: Register the Elastic signing key so that the package is verified after installation

      wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

    Step 2: Add the repository to the system

      echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

    Step 3: Update the repositories and install Logstash

      sudo apt-get update

      sudo apt-get install logstash

Logstash is now installed on your system. Now it's time to start the service, which you can do by running sudo service logstash start.
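Logstash is driven by a pipeline configuration with input, filter, and output sections that match the collect-cleanse-ship flow described earlier. A minimal sketch, assuming lines arrive on stdin and go to the local Elasticsearch from the previous section (the file path and the grok pattern are illustrative, not from the tutorial):

```conf
# /etc/logstash/conf.d/simple.conf (illustrative path)
input {
  stdin { }                      # read log lines from standard input
}
filter {
  grok {
    # parse lines like "2019-02-11 10:00:05 ERROR disk full"
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }  # same host/port as in elasticsearch.yml
  stdout { codec => rubydebug }                  # also print parsed events for debugging
}
```

You can also smoke-test a pipeline from the command line without any config file, using the -e flag on the binary the Debian package installs: sudo /usr/share/logstash/bin/logstash -e 'input { stdin { } } output { stdout { } }'.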

Kibana: A data visualization and exploration tool.

It is the front-end interface for the ELK Stack.

Used for log analytics, application monitoring, and operational intelligence.

Installation

    Step 1: Register the Elastic signing key so that the package is verified after installation

      wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

    Step 2: Add the repository to the system

      echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

    Step 3: Update the repositories and install Kibana

      sudo apt-get update

      sudo apt-get install kibana
 
Open /etc/kibana/kibana.yml (for example with vim /etc/kibana/kibana.yml) and set the properties below according to your needs:

# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601

# The Kibana server's name.  This is used for display purposes.
server.name: "harika"

# The URL of the Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"

Kibana is served from port 5601 by default; change server.port if you need a different one. elasticsearch.url must match the network.host and http.port we configured in /etc/elasticsearch/elasticsearch.yml; only then will Kibana be able to reach Elasticsearch.

Kibana service will be started once you run sudo service kibana start

Then open http://localhost:5601/ in the browser (Kibana serves plain HTTP by default).

Important notes: all three services take time to start, so wait at least 45-60 seconds. Do not be alarmed if the browser first shows plain HTML text saying Kibana is not yet ready while it initializes. If it takes too long, check the logs (follow the guidelines below).

 
Guidelines:
  1. To check the logs, run sudo tail -f /var/log/logstash/logstash-plain.log
  2. To check the status of any service, run one of:
  • sudo systemctl status elasticsearch.service
  • sudo systemctl status kibana.service
  • sudo systemctl status logstash.service

All the configuration files are in /etc/*/*.yml

All the log files are in /var/log/*/*.log

*: elasticsearch/logstash/kibana





Author
harika
