Building up a logging server – OSS style!

So – I wanted to get Splunk but in my organisation that was never going to happen (you want something that costs MONEY?! Ludicrous!) So we had to come up with a compromise. A colleague of mine went hunting for some open source logging software and found that the combination of Elastic Search, LogStash, Kibana and nxLog worked well. He tested it on his PC, wrote a few lines on how to get it roughly working and then sent it through to me to get it working from a server perspective. (Hi Ken! ^_^)

I’ve just recently finished the base setup (with a little assistance) and it’s getting information from our production, test and development AD DCs and WOAH do they waffle a lot. So after posting on Twitter that I’d got this working (because I was excited I’d got it working…duh!), someone asked me to do a blog post on the setup. So this is that blog post. I’ve written this in a way that anyone else with basic Windows Server knowledge could install this if required – yes, it is dumbed down a bit in some areas, but that’s because I wrote it to be as idiot-proof as possible.

UPDATE (3rd August 2016) – This document has now been updated with details regarding the most recent ELK stack. I have recently used these instructions to do an install of ElasticSearch 2.3.4, LogStash 2.3.4 & Kibana 4.5.3.

Software used:

  • Java JRE
  • ElasticSearch
  • LogStash
  • Kibana
  • nxLog

Other basic setup (specific to the environment we set up):

  • Virtual Server
  • Windows Server 2012
  • 2vCPU
  • 4GB RAM
  • Two HDD – C: & D: (disk space is up to you – the more you give it, the more logs you can stash!)

Log Server Installation Instructions 

Folder creations

  • On C: drive
    • Create a C:\Program Files\Java\jdk[version number] directory
  • On D: drive
    • Create a D:\LogData directory (or whatever you want to call where you dump your logs)
    • Create a D:\ElasticSearch directory
    • Create a D:\Kibana directory
    • Create a D:\LogStash directory

Create a Service Account

  • In your domain:
    • Create a new service account user
  • On the LogServer
    • Add the new user to the ‘Administrators’ group (yes, I know this is ugly and dirty, but it was the quickest and easiest way to get this up and running without having to mess too much with permissions)

Install Java JRE

  • Extract Java JRE to C:\Program Files\Java\jdk[version number]
  • Set up a system environment variable
    • Right-click on ‘My Computer’
    • Select ‘Properties’
    • Click on “Advanced system settings”
    • Select the ‘Advanced’ tab
    • Click on ‘Environment Variables…’
    • Under ‘System variables’ click ‘New…’
    • Enter the following:
      • Variable name: JAVA_HOME
      • Variable value: C:\Program Files\Java\jdk[version number]

Java is now installed and the variables required for the LogServer set.
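As an alternative to clicking through the GUI, the same variable can be set from an elevated command prompt. A quick sketch – the path assumes the jdk[version number] folder created earlier:

```bat
:: Set JAVA_HOME machine-wide (/M) - run this from an elevated command prompt.
:: Replace [version number] with the folder name you actually extracted to.
setx JAVA_HOME "C:\Program Files\Java\jdk[version number]" /M
```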

Install ElasticSearch

  • Extract the downloaded ElasticSearch files to D:\ElasticSearch
  • Edit the D:\ElasticSearch\config\elasticsearch.yml file
    • Set the cluster.name to “[clustername]” (Take note of what you do use – this will be useful if you decide to add in more ElasticSearch servers later)
    • Set the path.data option to D:\LogData
  • Edit the D:\ElasticSearch\bin\service.bat file
    • Under the REM ***** JAVA options ***** line, add in an entry “set ES_MAX_MEM=4g” (or however much memory you want it to use – we gave it access to everything because it is the only service hosted on this server)
  • Set up a firewall rule to allow the ElasticSearch ports
    • Open ‘Windows Firewall with Advanced Security’
    • Select ‘Inbound Rules’
    • Click ‘New Rule…’
    • In the window that appears, select ‘Port’ and click ‘Next’
    • Make sure ‘TCP’ is selected and check ‘Specific local ports:’
      • 9200, 9300
    • Click ‘Next’
    • Select ‘Allow the connection’ and click ‘Next’
    • Select all the profiles you want it to use (we selected all three, as we’ll have logs coming from multiple sources/multiple domains and from our DMZ) and click ‘Next’
    • Name the rule ‘ElasticSearch’ and give it a description (if you so desire)
    • Click ‘Finish’
  • To test that everything has been configured correctly:
    • Open a command prompt (no admin rights needed)
    • cd to D:\ElasticSearch and run
      • bin\elasticsearch.bat
      • If it doesn’t sit there waiting for input, something isn’t configured properly – go back over your configuration to ensure everything has been set correctly
    • Ctrl+C to quit
  • Install the ElasticSearch Windows service
    • Open a command prompt as an administrator
    • cd to D:\ElasticSearch\bin
    • Type: service install
    • The ElasticSearch service is now installed
  • Configure the ElasticSearch service
    • Open ‘Services’
    • Find the ‘ElasticSearch’ service
      • If it’s not present, go back and install the service
    • Right-click and select ‘Properties’
    • On the ‘General’ tab, change ‘Startup type’ to Automatic
    • On the ‘Log On’ tab, change to use the service account you created
    • Click ‘Apply’

ElasticSearch will now be running on the server successfully as a service.
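For reference, the two settings changed in elasticsearch.yml end up looking something like this (the cluster name here is just an example – use whatever you noted down):

```yaml
# D:\ElasticSearch\config\elasticsearch.yml - only the lines we changed
cluster.name: logcluster    # example name; note down whatever you use
path.data: D:\LogData       # where ElasticSearch will stash its indices
```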

Install Kibana

(Unlike the previous version, Kibana 4.* no longer requires IIS to run and instead runs inside its own web server, yay!)

  • Extract the Kibana files to D:\Kibana
  • Edit the Kibana config file:
    • Browse to D:\Kibana\config
    • Right-click on ‘kibana.yml’ and click ‘Edit’
    • In the file that opens, edit the line that starts with ‘elasticsearch.url:’ to be:
      • elasticsearch.url: “http://[FQDN of log server]:9200”
  • Set up a firewall rule to allow the Kibana port
    • Open ‘Windows Firewall with Advanced Security’
    • Select ‘Inbound Rules’
    • Click ‘New Rule…’
    • In the window that appears, select ‘Port’ and click ‘Next’
    • Make sure ‘TCP’ is selected and check ‘Specific local ports:’
      • 5601
    • Click ‘Next’
    • Select ‘Allow the connection’ and click ‘Next’
    • Select all three profiles (Domain, Private & Public) and click ‘Next’
    • Name the rule ‘Kibana’ and give it a description
    • Click ‘Finish’
  • Set up Kibana to run as a service – there are a number of ways to do this – you could install it as a service or use a third-party service manager, but I’ve chosen the easier “run a scheduled task” method:
    • Open Task Scheduler
    • Click ‘Create Task…’
    • On the ‘General’ tab
      • ‘Name’ field: Start Kibana
      • Click ‘Change User or Group…’ and select the service account you created earlier
      • ‘Security Options’, select ‘Run whether user is logged on or not’
    • On the ‘Triggers’ tab
      • Select ‘New…’
      • Beside ‘Begin the task:’ select ‘At startup’ and click ‘OK’
    • On the ‘Actions’ tab
      • Select ‘New…’
      • Beside ‘Action:’ select ‘Start a program’
        • Program/script: D:\Kibana\bin\kibana.bat
        • Start in (optional): D:\Kibana\bin
      • Click ‘OK’
    • On the ‘Settings’ tab
      • ‘Allow task to be run on demand’ is checked
      • ‘Run task as soon as possible after a schedule start is missed’ is checked
      • ‘If the task fails, restart every:’ 5 minutes
      • ‘Attempt to restart up to:’ 3 times
      • ‘If the running task does not end when requested, force it to stop’ is checked
    • Click ‘OK’
    • Put in the password for your service account
    • Either restart your server or right-click the task and select ‘Run’
      • The ‘Last Run Result’ will display 0x41301 while the kibana.bat file is being run – it will stay that way unless there is an error (0x41301 means “the task is currently running”)
  • To test that everything has been configured correctly:
    • Open up a web browser on your PC and browse to: http://[FQDN of log server]:5601 (if you want to change this, you can modify the port number in the kibana.yml file)
    • If you cannot access the website, something isn’t configured correctly (either kibana or elasticsearch) – go back and check your configuration

The Kibana webserver will now be running and can be successfully accessed.
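For reference, the relevant lines of kibana.yml end up looking something like this (the FQDN is a placeholder for your own log server):

```yaml
# D:\Kibana\config\kibana.yml - only the lines of interest
server.port: 5601    # change this if you want Kibana on a different port
elasticsearch.url: "http://logserver.example.com:9200"
```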

Install Logstash

  • Create the LogStash config file (to receive events and forward them to ElasticSearch)
    • See below for the first ‘Update’ regarding our configuration; basic configuration information can be found in Elastic’s documentation
    • Place this file in D:\LogStash
  • Set up a firewall rule to allow the LogStash port
    • Open ‘Windows Firewall with Advanced Security’
    • Select ‘Inbound Rules’
    • Click ‘New Rule…’
    • In the window that appears, select ‘Port’ and click ‘Next’
    • Make sure ‘TCP’ is selected and check ‘Specific local ports:’
      • 3515
    • Click ‘Next’
    • Select ‘Allow the connection’ and click ‘Next’
    • Select all three profiles (Domain, Private & Public) and click ‘Next’
    • Name the rule ‘LogStash’ and give it a description
    • Click ‘Finish’
  • To test
    • Open a command prompt (no admin rights needed)
    • cd to D:\LogStash and run
      • bin\logstash.bat agent -f logstash.conf
      • If it doesn’t sit there waiting for input, something isn’t configured properly – go back over your configuration to ensure everything has been set correctly
    • Ctrl+C to quit
  • Setup the LogStash scheduled task
    • Open Task Scheduler
    • Click ‘Create Task…’
    • On the ‘General’ tab
      • ‘Name’ field: Start LogStash
      • Click ‘Change User or Group…’ and select the service account you created earlier
      • ‘Security Options’, select ‘Run whether user is logged on or not’
    • On the ‘Triggers’ tab
      • Select ‘New…’
      • Beside ‘Begin the task:’ select ‘At startup’ and click ‘OK’
    • On the ‘Actions’ tab
      • Select ‘New…’
      • Beside ‘Action:’ select ‘Start a program’
        • Program/script: D:\LogStash\bin\logstash.bat
        • Add arguments (optional): agent -f logstash.conf
        • Start in (optional): D:\LogStash
    • On the ‘Settings’ tab
      • ‘Allow task to be run on demand’ is checked
      • ‘Run task as soon as possible after a schedule start is missed’ is checked
      • ‘If the task fails, restart every:’ 5 minutes
      • ‘Attempt to restart up to:’ 3 times
      • ‘If the running task does not end when requested, force it to stop’ is checked
    • Click ‘OK’
    • Put in the password for your service account
    • Click on the task and select ‘Run’
      • The ‘Last Run Result’ will display 0x41301 while the logstash.bat file is being run – it will stay that way unless there is an error (0x41301 means “the task is currently running”)

LogStash will now be running on the server successfully without a user needing to be logged in.
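If you’d rather script the task creation than click through the GUI, something like this should produce an equivalent scheduled task (the domain\account name is a placeholder for your service account; /RP * prompts for its password):

```bat
:: Create the LogStash startup task from an elevated command prompt.
:: MYDOMAIN\svc-logserver is a placeholder for the service account created earlier.
schtasks /Create /TN "Start LogStash" /SC ONSTART ^
    /RU MYDOMAIN\svc-logserver /RP * ^
    /TR "D:\LogStash\bin\logstash.bat agent -f D:\LogStash\logstash.conf"
```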

Client Server Installation Instructions

nxLog

  • Run the nxlog.msi
    • Select ‘I accept the terms in the License Agreement’
    • Click ‘Install’
    • If prompted by UAC, click ‘Yes’
    • Uncheck ‘Open README.txt to read important installation notes’
    • Click ‘Finish’
  • After it’s installed
    • Browse to C:\Program Files (x86)\nxlog\conf
    • Make a copy of nxlog.conf
    • Rename the existing nxlog.conf file to nxlog-default.conf
      • If prompted by UAC, click ‘Continue’
    • Make any changes to the nxlog.conf file as required (see ‘Other tips/configuration’ below for changes that have been made in our environment)
  • Edit the config file
    • Change the host setting near the bottom from 127.0.0.1 to the FQDN of the Log Server
  • Start the service
    • Open up Services
    • Find the ‘nxlog’ service
    • Right-click and select ‘Start’

The server will now be sending logs to the Log Server.
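For context, the host setting mentioned above lives in the TCP output section of nxlog.conf. On a setup like ours it ends up looking roughly like this sketch (the FQDN is a placeholder, and the port needs to match whatever your LogStash input listens on):

```
<Extension json>
    Module      xm_json
</Extension>

<Output out>
    Module      om_tcp
    Host        logserver.example.com
    Port        3515
    Exec        to_json();
</Output>
```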

Other tips/configuration

I tried really hard to get LogStash to run as a service…and I failed miserably. If someone knows how to get this working, please enlighten me as my batch file does work, but it’s not quite as clean and lovely as a service.

Sometimes you may need to delete the logs you’ve collected – either because there are too many, because you’ve changed your config and want to collect something else, or because you were testing and want to get rid of the test logs. A really nifty way we found of deleting an index is via PowerShell (all hail PowerShell!):

Invoke-WebRequest -Uri http://[FQDN of log server]:9200/[name of index] -Method DELETE
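So, for example, clearing out a single day’s worth of the default LogStash indices (the server name and index name below are hypothetical – LogStash names indices logstash-YYYY.MM.DD by default) would look like:

```powershell
# Delete one day's index - the server name and date here are only examples.
Invoke-WebRequest -Uri http://logserver.example.com:9200/logstash-2016.08.02 -Method DELETE
```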

In order to not be absolutely FLOODED with events, we also modified the nxLog conf file to only collect what we wanted. You may want to tweak this yourself, depending on what you’re interested in collecting.

Changes we made to the nxlog.conf file:

Query <QueryList>\
    <Query Id="0">\
        <Select Path="Security">*</Select>\
        <Suppress Path="Security">*[System[(EventID=4624 or EventID=4776 or EventID=4634 or EventID=4672 or EventID=4688)]]</Suppress>\
        <Select Path="System">*[System[(EventID=1074 or (EventID &gt;= 6005 and EventID &lt;= 6009) or EventID=6013)]]</Select>\
        <Select Path="Microsoft-Windows-TerminalServices-LocalSessionManager/Operational">*</Select>\
    </Query>\
</QueryList>

This is giving us a few things:

  • Security Log: we’ve excluded a few IDs, purely because they were generating way too much traffic to be useful (the “User has logged on” event, for example, generated over 6 million log entries in 24 hours…) – if you’re going to be logging your security logs into this thing, you need to exclude stuff. Otherwise you’ll just end up filling your disk way too quickly. Just as an example, leaving Security as *, we used 13GB in less than 24 hours – suppressing those 5 event IDs changed that to only 300MB in 24 hours…
  • System Log: we’re only including a few things here – the logs that tell us when the server was shut down/restarted/started.
  • Terminal Services – Local Session Manager: this was picked up by a colleague who included it here. This little log lets us know when people are logging on to the domain controller – in particular, when people are logging on directly to the domain controller via the console. This is bad and we want to strongly discourage it…so we log it.
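To put that Query in context: it sits inside the event log input block of nxlog.conf, which on a setup like ours looks roughly like this trimmed-down sketch (the block name is arbitrary, and the ellipsis stands in for the full query shown above):

```
<Input in>
    Module      im_msvistalog
    Query       <QueryList>\
                    <Query Id="0">\
                        <Select Path="Security">*</Select>\
                        ...
                    </Query>\
                </QueryList>
</Input>
```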

Our default dashboard (I’ve removed any proprietary info from the image so it’s safe to view!) has also been customised a bit (thanks again to Ken! ^_^) to include some of the information most useful to us and to make it look nice and shiny to management. In particular:

  • Pie chart breaking down Event IDs
  • Bar chart showing our most active DCs
  • Pie chart of the accounts that are being locked out the most – this, for me right now, is one of the more interesting charts…
  • Standard bar chart, showing logs over time
  • A sorted column list displaying all events but with limited columns, in particular: EventTime, EventID, SourceName, message, SubjectUserName, TargetUserName – this may not include every bit of information we need for certain events, but it fits for most events.

So yes – that’s our log server. Very exciting. If there are any updates or tweaks, I’ll do an updated post.

UPDATE (23rd June 2014) – I was requested to give information on our logstash.conf file as well as the dashboard we use.

The edited logstash.conf file:

input {
    # Accept messages in on tcp/3515
    # Incoming messages will be in json format, one per line
    # Tag these messages as windows and eventlog so we can filter on them later on
    tcp {
        port => 3515
        codec => json_lines
        tags => ["windows","eventlog"]
    }
}

filter {
    # If it is an eventlog message, change some fields to lower case, and rename some fields so they match logstash's defaults
    if "eventlog" in [tags] {
        mutate {
            lowercase => [ "EventType", "FileName", "Hostname", "Severity", "host" ]
            rename => [ "Hostname", "host" ]
            rename => [ "Message", "message" ]
        }
    }
}

output {
    # Send all the output to the elasticsearch cluster listed below
    elasticsearch {
        host => localhost
        cluster => "YourClusterName"
    }
}
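One caveat if you’re running the newer stack mentioned in the update at the top of this post: in LogStash 2.x the elasticsearch output no longer accepts host or cluster – they’re replaced by a hosts array, so the output section would instead look something like:

```
output {
    # LogStash 2.x syntax - 'host' and 'cluster' are gone, 'hosts' replaces them
    elasticsearch {
        hosts => ["localhost:9200"]
    }
}
```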

I was also asked for a copy of our dashboard.json file, which I’ve uploaded here: Kibana Dashboard (default.json). Due to security restrictions on my blog, I’ve uploaded it as a .txt file. When you’ve downloaded it, just change the .txt to .json and away you go!

22 thoughts on “Building up a logging server – OSS style!”

  1. caskings (@caskings)

    Your post was the final prompt for me to pull my finger out and centralise my logging. I now have a shiny fresh graylog2 install and have begun pointing all my infrastructure to it.

  2. jd

    Maybe I’m blind – but what does your nxlog.conf look like? I had tried with syslog format or GELF but it didn’t work properly.

    1. girlgerms Post author

      Under ‘Other tips/configuration’ of my post is a section on the query we used in nxlog. Everything else was left as default.

    1. girlgerms Post author

      Hi John – I tried for two days or so to get nssm to work with LogStash. They just didn’t want to play nicely. If you’ve got some info on how you did it, I’d be most grateful!

  3. Dean Smith

    I use NSSM for logstash too and it works great (in fact I use it for everything – things like log rotation etc. make it so much nicer than using Windows Services/Scheduled Tasks).
    The issue you might be having is that logstash seems to only really look for a supplied config file in its working directory. I just created a simple batch file called nssm_logstash.bat and put it in my logstash /bin directory:

    @echo off
    cd "C:\apps\logstash\bin"
    logstash.bat agent -f logstash.conf

  4. bloggermouthtiven

    Firstly, great article. Nicest I’ve seen for a Windows install. I do have some questions:
    – I couldn’t find the Kibana\config.js file
    – I’m at a loss as to what the nxlog.conf file should look like (I’m trying to get event logs as my test)

    1. girlgerms Post author

      The config.js file should be directly in the Kibana folder. If it’s not there, I’d suggest downloading again and doing another extraction, because it sounds like it is missing and maybe didn’t extract cleanly.

      How your nxlog.conf file looks depends on what it is you’re trying to get off the box. If all you’re trying to do (as a test) is get all event logs off a standard Windows server, then your ‘Query’ section should look as follows:
      Query <QueryList>\
          <Query Id="0">\
              <Select Path="Application">*</Select>\
              <Select Path="Security">*</Select>\
              <Select Path="System">*</Select>\
          </Query>\
      </QueryList>

      (Just be warned, if your server is very chatty or you have lots of people logging in, your security log is going to be quite large!)

      If you’re planning on setting it up for Domain Controllers, as we have, you’d then want to look at some of the extra logs (such as Directory Service & DFS Replication).

      Hope that helps!

  5. Smon

    I can’t express how much this has helped me… thank you so much.

    I do have one question though. The nxlog.conf says it will connect on port 514. In the logstash.conf you posted, it says it will receive on port 3515. Is that correct?

    1. Chris

      Yes, I had to change 3515 to 514 to get it working in logstash.conf, i.e.

      input {
          tcp {
              port => 514
              codec => json_lines
              tags => ["windows","eventlog"]
          }
      }

      Also in logstash.conf I had to change the syntax in the output section: host is now hosts (host is deprecated), I added the port number (note that quotes and square brackets are required) and had to remove the cluster line. I am using elasticsearch 2.3.5, kibana 4.5.4 and logstash 2.3.4, with nxlog 2.8.1248, i.e.

      output {
          elasticsearch { hosts => ["localhost:9200"] }
      }

      It all appears to be working fine now – 4 products integrated with only the 2 minor changes above. Many thanks to Jess for taking the time to write this guide and keep it updated.

      Chris, UK

      1. girlgerms Post author

        Thank you kindly for those updates, it’s very much appreciated. When I get some time this week, I’ll roll them in to my post!

  6. Nick K

    Clearly I’m late to the game here, but this was very helpful in regards to NXlog and Windows Event logs. I’m using Kibana 4.4.2 and your dashboard won’t import. I’m assuming there was an update that broke it. Just a heads up.

    1. girlgerms Post author

      Thank you kindly for the feedback! I’m waiting on my new ELK box to start receiving logs (different client) and once that’s done I’ll be able to regenerate some of the dashboard stuff 🙂

  7. 0derentis

    Hi girlgerms,

    Came across this post after streaming your talk at Ignite NZ 2016 and figured I could add some value here for anyone else who wants to install the ELK stack as a service in Windows.

    Here’s a sanitized extract from my deployment powershell script to do the service installation and service config for Elasticsearch, Logstash and Kibana (and then install x-pack).

    # ELK stack v5.0.0
    Invoke-Expression -command ("C:\ELK\Elasticsearch\bin\elasticsearch-service.bat install")

    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm install Logstash C:\ELK\Logstash\bin\logstash.bat AppDirectory C:\ELK\Logstash\bin")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Logstash AppParameters Arguments: -f c:\ELK-Stack\logstash\bin\logstash-syslog.conf")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Logstash Description 'Logstash Service'")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Logstash Start SERVICE_DELAYED_START")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Logstash DependOnService elasticsearch-service-x64")
    Invoke-Expression -command ("C:\ELK\logstash\bin\logstash-plugin install logstash-input-beats")

    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm install Kibana C:\ELK\Kibana\bin\kibana.bat AppDirectory C:\ELK\Kibana\bin")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Kibana AppParameters ' '")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Kibana Description 'Kibana Service'")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Kibana Start SERVICE_DELAYED_START")
    Invoke-Expression -command ("C:\ELK\nssm\win64\nssm set Kibana DependOnService elasticsearch-service-x64, Logstash")

    Invoke-Expression -command ("C:\ELK\Elasticsearch\bin\elasticsearch-plugin install x-pack --batch")
    Invoke-Expression -command ("C:\ELK\Kibana\bin\kibana-plugin install x-pack")

  8. Rashed

    (A bit late to the party here!)

    After having my Splunk request smacked down and told “You don’t need this”

    I was despondently browsing Reddit yesterday and saw a post about windows logging and that’s when I saw your comment about ELK, I was intrigued. *cue celestial trumpets*

    I ended up staying up till 2am reading your blog! Fantastic job, you’re a grand example of what a sysadmin should be 🙂

    And here I am now (3 hours of sleep and many cups of coffee later), meticulously following your steps and installing ElasticSearch for the first time.

    But, I got stuck on something since I’m not big on Java: I cannot find where I should put in the memory line in the elasticsearch-service.bat file.

    Mind nudging me in the right direction? 🙂

    Thank you!

  9. Azurem

    Thought I’d add this info while I’m running through this guide (Thanks for this guide btw).
    ElasticSearch now requires Java SE Runtime Environment 8 so the link above may trip a few people up 🙂

    https://www.elastic.co/guide/en/elasticsearch/reference/master/_installation.html#_installation

    People should receive this error (or very similar) if they are using Java 7:

    Exception in thread "main" java.lang.UnsupportedClassVersionError: org/elasticsearch/bootstrap/Elasticsearch : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)

