
The ELK Stack explained: configuring logging

Author: Deepak Poojari
Published On: Apr 30, 2020

What is ELK Stack?

The ELK Stack (Elasticsearch, Logstash, and Kibana) is the world’s most popular open-source log analysis platform. ELK is quickly overtaking proprietary solutions and becoming the first choice for companies shopping for log analysis and management tools.


  • Elasticsearch: Stores and indexes the transformed data it receives from Logstash.
  • Logstash: Collects logs and event data, parses and transforms it, and sends it on to Elasticsearch.
  • Kibana: A visualization tool that runs alongside Elasticsearch, letting users analyze data and build powerful reports.

Why ELK Stack?

There are many logging tools. But options like Loggly, Sumo Logic, and others are expensive when compared with maintaining an ELK Stack, and they don’t necessarily have more or better features.

Top companies that use the ELK Stack include Netflix, LinkedIn, Stack Overflow and others. This shouldn’t be surprising if we consider all of the critical capabilities and services that this one stack provides:

  1. A central logging system for all microservices, with real-time log analytics and alerting.
  2. Scalable deployment, both vertically and horizontally.
  3. A data visualization tool to capture and display analytics, e.g. new customers acquired in a day, API failures after a new release, and more.


Why is logging so important?

With the growth of microservices and server data, logging is increasingly important. It’s critical for diagnosing and troubleshooting issues for optimal application performance. Plus, many tools make it possible to get critical business metrics and data from logs.

Logging is no longer just for finding issues. It’s also for monitoring your systems.


Configuring and using ELK Stack

If you’re ready to optimize your logging and log analysis processes, here’s what you need to do to get started with ELK Stack.

Requirements

  • Java 8
  • Homebrew 2.X.X

Installation
1. Elasticsearch
brew install elasticsearch && brew services start elasticsearch
2. Logstash
brew install logstash && brew services start logstash
3. Kibana
brew install kibana && brew services start kibana
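To confirm all three services came up, you can probe each one’s HTTP endpoint; this is a quick sanity check, assuming the default ports (9200 for Elasticsearch, 9600 for the Logstash monitoring API, 5601 for Kibana):

```shell
#!/bin/sh
# Probe each service's default HTTP port and report whether it answers.
# Ports assume a default Homebrew install; adjust if you changed them.
check() {
  if curl -s --max-time 2 "$2" > /dev/null; then
    echo "$1: up"
  else
    echo "$1: down"
  fi
}

check Elasticsearch http://localhost:9200
check Logstash      http://localhost:9600
check Kibana        http://localhost:5601/api/status
```

The services can take a few seconds to start after `brew services start`, so re-run the script if something reports down right away.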

[Diagram: System architecture for the logger service]

Configure Kibana to start visualizing logs.

Open the Kibana configuration file and uncomment the server.port and elasticsearch.hosts entries so that Kibana starts listening on port 5601.

sudo vi /usr/local/etc/kibana/kibana.yml
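The two entries to uncomment sit near the top of the file; the values shown below are the stock defaults for a local Kibana 7.x install:

```conf
# kibana.yml: uncomment these defaults so Kibana listens on port 5601
# and points at the local Elasticsearch instance.
server.port: 5601
elasticsearch.hosts: ["http://localhost:9200"]
```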

If you have successfully installed ELK Stack, you should see the Kibana status as green on http://localhost:5601/status.


Start sending data to Elasticsearch using Logstash

Create a Logstash configuration file at /usr/local/Cellar/logstash/7.6.1/libexec/config/syslog.conf.

This is the Logstash configuration file, where you set the file paths for all your logs (service logs, system logs, nginx logs) and choose whether to transform them or send them as-is to Elasticsearch.
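As a sketch, a minimal syslog.conf along these lines should work; the log path and grok pattern below are illustrative and should be adjusted to your system (the input, filter, and output sections are standard Logstash pipeline structure):

```conf
# Minimal Logstash pipeline: read the macOS system log, parse each line
# with a grok pattern, and ship the result to the local Elasticsearch.
input {
  file {
    path => "/var/log/system.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Typical syslog line layout; adjust the pattern to your log format.
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "syslog-demo"
  }
}
```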


Next, verify your configuration file.

/usr/local/Cellar/logstash/7.6.1/bin/logstash --config.test_and_exit -f /usr/local/Cellar/logstash/7.6.1/libexec/config/syslog.conf

To apply the new configuration, restart Logstash.

/usr/local/Cellar/logstash/7.6.1/bin/logstash -f /usr/local/Cellar/logstash/7.6.1/libexec/config/syslog.conf

Viewing and querying logs in Kibana

On the Management tab in Kibana, you’ll see the new index syslog-demo created by Logstash.


When defining the index pattern for this data, use the timestamp field, as it lets you visualize logs in chronological order.


Click on Create index pattern, and you are done: you can now visualize all your log files.

You can search your logs by adding filters or by running direct text-based queries.
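Kibana’s filters and free-text search are backed by the Elasticsearch search API, so you can also query the index directly. This sketch assumes the syslog-demo index from earlier and an Elasticsearch running on the default port:

```shell
#!/bin/sh
# Run a free-text query against the syslog-demo index through the
# Elasticsearch search API; falls back to a message if ES is not up.
ES=http://localhost:9200
if curl -s --max-time 2 "$ES" > /dev/null; then
  # Match "error" anywhere in the documents, newest first, top 5 hits.
  curl -s "$ES/syslog-demo/_search?q=error&sort=@timestamp:desc&size=5"
else
  echo "Elasticsearch is not reachable at $ES"
fi
```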


On top of these logs, you can create various visualizations to monitor critical business metrics, like daily, weekly, or monthly signups, registrations, subscriptions and more on your server.

You can also watch for anomalies on your server.


Next steps

  1. Dockerize ELK Stack.
  2. Use Kafka for stream processing logs.
  3. Use ElastAlert for alert notification on Elasticsearch.


What’s next: Load Testing ELK Stack with 1M Hits.

You can also read more about the ELK Stack on GitHub.
