What is ElasticSearch?
ElasticSearch is a distributed, RESTful search and analytics engine built on top of Apache Lucene. It is commonly used for log analytics, full-text search, and operational intelligence. As the heart of the Elastic Stack, it centrally stores your data so you can discover the expected and uncover the unexpected.
What will you get from this article?
In this article, I'll guide you through installing ElasticSearch on Windows, publishing it under your own domain, and securing it. I'll also introduce the first steps of combining it with Kibana and Logstash, so you can build an ELK stack that analyzes large amounts of log data in near real time.
Let's go!
Install Java Runtime
ElasticSearch requires a Java runtime. To check whether Java is already installed on your Windows machine, open cmd and type:
java -version
If Java is missing or its version is lower than 8.x, go to the Oracle Java website to download and install version 8.x or later.
After installing, create or update the JAVA_HOME system environment variable and point it to the folder where the Java runtime was installed.
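For example, from an elevated cmd prompt (the Java path below is only an illustration; use the folder matching the version you actually installed, and note that setx only affects newly opened cmd windows):
setx JAVA_HOME "C:\Program Files\Java\jre1.8.0_151" /M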
Install & Configure ElasticSearch
Go to
https://www.elastic.co/downloads/elasticsearch and download the zip file. Unpack it into a folder, e.g.
C:\ES
By default, ElasticSearch is configured with a 1 GB heap. For a real environment, that is not enough. To set a new heap size, create a system environment variable named ES_HEAP_SIZE and set its value, e.g. 4g. The right value depends on how much RAM your server has; it should be no more than half of the available RAM (reference).
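For example, on a server with 16 GB of RAM you could set the variable from an elevated cmd prompt (the value 4g is just an illustration; restart cmd and ElasticSearch afterwards so the new value is picked up):
setx ES_HEAP_SIZE "4g" /M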
After setting the new heap size, open C:\ES\config\elasticsearch.yml and set the important parameters, for example:
# Path to directory where to store the data (separate multiple locations by comma).
path.data: F:\data
# Path to log files.
path.logs: F:\logs
# Lock the memory on startup.
bootstrap.memory_lock: true
# Upper limit on the fielddata. Old data will be evicted to make space for the new values if it is over the limit size. Can be set to a percentage of the heap size, or a concrete value like 5gb.
indices.fielddata.cache.size: 40%
# Set the bind address to a specific IP (IPv4 or IPv6).
network.bind_host: ["192.168.2.12", "localhost"]
# For ReadonlyREST
rest.action.multi.allow_explicit_index: false
Adjust these parameter values to match your server. The rest.action.multi.allow_explicit_index parameter is needed for the ReadonlyREST tool, which will be used later to secure queries.
Start ElasticSearch by running bin\elasticsearch.bat (e.g. C:\ES\bin\elasticsearch.bat). To set up ElasticSearch as a Windows service, run bin\elasticsearch-service.bat install, then open the Windows services manager, find the ElasticSearch service, and change its Startup Type to Automatic.
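If you prefer the command line over the services manager, the startup type can also be changed with sc. The service name below is an assumption; use the exact name printed by elasticsearch-service.bat install (or shown in the services manager):
sc config "elasticsearch-service-x64" start= auto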
After starting, open a browser and go to http://localhost:9200/ to check that ElasticSearch is working.
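If curl is available on your machine, you can also check from cmd; a healthy node answers with a small JSON banner roughly like the sketch below (the name, version number, and other values will differ on your install):
curl http://localhost:9200/
{
  "name" : "...",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "..." },
  "tagline" : "You Know, for Search"
}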
Install Logstash
Go to
https://www.elastic.co/downloads/logstash and download the zip file. Unpack it into a folder, e.g.
C:\LS
Create a simple config file (e.g. logstash-simple.conf) in the bin folder (e.g. C:\LS\bin) with content like the following example to run Logstash:
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
Then open cmd, go to the bin folder, and run the command below to start Logstash:
logstash.bat -f logstash-simple.conf
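Once the pipeline has started, type a test line (e.g. hello elk) and press Enter: the stdout output prints the event in rubydebug format and the elasticsearch output indexes it. You can then confirm that a logstash-* index was created (assuming ElasticSearch is still listening on localhost:9200):
curl "http://localhost:9200/_cat/indices?v"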
To run Logstash as a Windows service, you can use the Non-Sucking Service Manager (NSSM) tool. Download the latest NSSM from its download page and unzip it into a folder, e.g. F:\soft\nssm. In cmd, go to F:\soft\nssm\win64 and run the command:
nssm.exe install Logstash
Then fill in the values similarly to the example below and click the Install service button.
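The values below are just an example that assumes Logstash was unpacked to C:\LS and that logstash-simple.conf sits in C:\LS\bin (field labels may differ slightly between NSSM versions):
Path: C:\LS\bin\logstash.bat
Startup directory: C:\LS\bin
Arguments: -f logstash-simple.conf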
After that, you can open the Windows services manager and start the Logstash service.
Install Kibana
Go to
the Kibana download page, then download and unzip Kibana into a folder, e.g.
C:\KB
Open the config\kibana.yml file and configure some important parameters, for example:
# The URL of the Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"
# Time in milliseconds to wait for responses from the back end or Elasticsearch. This value
# must be a positive integer.
elasticsearch.requestTimeout: 30000
Run
bin\kibana.bat in the installation folder (e.g.
C:\KB\bin\kibana.bat) to start Kibana. Check
http://localhost:5601 to see if Kibana works.
If the Kibana start page reports that no data has been found, it just means nothing has been shipped to Kibana yet. To ship some sample data, you can try Winlogbeat, which sends Windows event logs such as application, security, and system events to Logstash. You can also use NSSM to run Kibana as a Windows service, as sketched below.
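This is only a sketch, run from the NSSM folder (F:\soft\nssm\win64 above) and assuming Kibana was unpacked to C:\KB; the service name Kibana is an arbitrary choice:
nssm.exe install Kibana C:\KB\bin\kibana.bat
nssm.exe start Kibana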
Setup nginx to access ElasticSearch via a domain
With the configuration above, you can only access ElasticSearch via localhost or the IP 192.168.2.12. To access ElasticSearch via a domain, we can use nginx to forward requests to localhost (you can read my article on this link for how to install nginx). Of course, you could configure ElasticSearch directly in C:\ES\config\elasticsearch.yml to expose it on a domain or public IP without nginx, but I want to use nginx to allow only read requests from the outside, while write requests will be limited to IP 192.168.2.12. That restriction will be enforced with the ReadonlyREST tool in the next step.
Below is the nginx configuration:
server {
    listen *:80;
    server_name yourdomain.com www.yourdomain.com;

    location / {
        root your_web_site_root_folder;
        index index.html index.htm;
    }

    location /es {
        proxy_pass http://localhost:9200;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        rewrite ^/es/(.*) /$1 break;
    }
}
Now you can access ElasticSearch via yourdomain.com/es.
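A quick check from any machine (yourdomain.com is of course a placeholder for your real domain); the response should be the same JSON banner you saw on http://localhost:9200/:
curl http://yourdomain.com/es/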
Setup ReadonlyREST tool
Go to
https://readonlyrest.com/download.html, select your ElasticSearch version, and download the plugin zip.
Open cmd, cd to the ElasticSearch home folder, and run the command:
bin/elasticsearch-plugin install file:///download-folder/readonlyrest-<version>.zip
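For instance, with the zip saved in C:\ES, the command could look like the following (keep the <version> part matching the file you actually downloaded):
bin\elasticsearch-plugin install file:///C:/ES/readonlyrest-<version>.zip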
In this command, download-folder is the folder where you saved the plugin zip (C:\ES in the example above). After the plugin is installed, create a readonlyrest.yml file in the config folder of ElasticSearch (e.g. C:\ES\config\readonlyrest.yml) and add access rules to it. For example, I want to allow anything from 192.168.2.12 while other hosts can only read, so I add the following rules:
readonlyrest:
    access_control_rules:

    - name: "Rule 1 - Allow anything from 192.168.2.12"
      hosts: ["192.168.2.12"]

    - name: "Rule 2 - Other hosts can only read certain indices"
      actions: ["indices:data/read/*"]
      indices: ["logstash-*"] # aliases are taken into account!
For more configuration snippets that you can add to this file, you can read here.
Finally, your ELK stack is ready for real-world use. Enjoy it :)
Any comments are welcome!
See you next time.