Using Elasticsearch, Logstash, and Kibana (ELK)

This tutorial builds on four how-tos from DigitalOcean.

The goal is to create a setup to gather logs of multiple servers, and visualize the gathered logs.

(beat) --> logstash --> elasticsearch --> kibana --> user

Installing ELK

The Install tutorial will install the following Elastic products: Elasticsearch, Logstash, and Kibana.

In our setup we've used OpenJDK, the default JDK on Ubuntu 14.04, and not the Oracle JDK referenced in the tutorial; this works fine:


sudo apt-get install openjdk-7-jre
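Before installing the Elastic packages it is worth confirming that a Java runtime is on the PATH, since both Elasticsearch and Logstash require one (the exact version string printed will differ per system):

```shell
# Verify a Java runtime is available; Elasticsearch and
# Logstash will refuse to start without one.
java -version
```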

We've also used Apache instead of Nginx. The reverse proxy configuration for Apache is as follows.

First load mod_proxy and mod_proxy_http:

LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
LoadModule proxy_http_module /usr/lib/apache2/modules/mod_proxy_http.so

(Or use 'a2enmod' on Ubuntu/Debian).
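On Ubuntu/Debian, enabling the modules with a2enmod takes care of the LoadModule lines for you:

```shell
# Enable the proxy modules and restart Apache (Ubuntu/Debian).
sudo a2enmod proxy proxy_http
sudo service apache2 restart
```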

Then reverse proxy to Kibana (which listens on port 5601 by default):

ProxyRequests off
ProxyPreserveHost on
ProxyPass / http://localhost:5601/ nocanon
ProxyPassReverse / http://localhost:5601
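Once Apache has been restarted, a quick way to verify the proxy is to request Kibana through it rather than on port 5601 directly (this assumes Apache is listening on port 80 on the same host):

```shell
# Should return Kibana's response headers via Apache,
# not a 502/503 proxy error.
curl -I http://localhost/
```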

Adding Topbeat

Topbeat provides a distributed 'top' command for all the servers it is installed on.

Instead of printing the metrics on the screen, it periodically ships them to Logstash or Elasticsearch.
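As a sketch of what this looks like in practice, a minimal topbeat.yml along the lines of the tutorial could be the following (the 10-second period is the default and the Logstash hostname is a placeholder for your own setup):

```yaml
input:
  # Sample system-wide statistics every 10 seconds
  period: 10
  # Report per-process statistics for all processes
  procs: [".*"]

output:
  logstash:
    # Placeholder: replace with your Logstash host
    hosts: ["logstash.example.com:5044"]
```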

Adding Logstash Filters

Filters allow for fine-grained control of what is monitored.

For instance, the tutorial describes how to install an Apache filter that looks like this:

filter {
  if [type] == "apache-access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}

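After dropping a filter into /etc/logstash/conf.d/ you can have Logstash validate the configuration before restarting the service (the --configtest flag and the /opt/logstash path apply to the Logstash 1.5/2.x packages used by these tutorials):

```shell
# Validate the pipeline configuration, then restart Logstash.
sudo /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/
sudo service logstash restart
```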
This is then combined with a Filebeat prospector config on the server that is monitored:

  paths:
    - /var/log/apache2/access.log
  document_type: apache-access
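The prospector entry only takes effect once Filebeat also knows where to ship the events; in Filebeat 1.x that is the logstash output section of filebeat.yml (the hostname is a placeholder for your own setup):

```yaml
output:
  logstash:
    # Placeholder: replace with your Logstash host
    hosts: ["logstash.example.com:5044"]
```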

Using Kibana

With Kibana you can visualise your data in interactive dashboards and charts.
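Before anything shows up in Kibana there has to be an index to point it at; you can check which Beats indices Elasticsearch has created via its _cat API (assuming Elasticsearch listens on localhost:9200):

```shell
# List all indices; look for filebeat-* / topbeat-* entries.
curl 'http://localhost:9200/_cat/indices?v'
```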


Combined with Geoip data it is, for instance, possible to plot a map of where visitors are coming from. The Geoip tutorial will tell you how to set this up.
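As a sketch of what that tutorial sets up, a Logstash geoip filter resolves the client IP address that the Apache grok pattern extracted into geographic coordinates (the clientip field is populated by COMBINEDAPACHELOG):

```conf
filter {
  geoip {
    # Field populated by the COMBINEDAPACHELOG grok pattern
    source => "clientip"
  }
}
```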