
Security incidents involving Elasticsearch are frequent. A few cases discussed among group members in the first half of 2018:

www.safedog.cn/news.html?i…
www.easyaq.com/news/118440…

1. India: an Elasticsearch cluster was left with no security permissions configured.
2. A wedding website: its Elasticsearch server was exposed to the public network.
3. A group member: port 9200 was mapped directly to the Internet.

Securing your Elasticsearch single node or cluster network must be on the agenda!

How do YOU secure your Elasticsearch cluster?

1. Do not expose Elasticsearch to the Internet

This point must be emphasized. Even in development and testing, there is no reason to expose your cluster on a public IP. Scenarios such as remote joint debugging and external network access do exist at major companies, but please do not run your cluster "naked" (unprotected).

1.1 Firewall: Restrict public ports

Restrict 9200 — the cluster's external HTTP access port

iptables -A INPUT -i eth0 -p tcp --destination-port 9200 -s {PUBLIC-IP-ADDRESS-HERE} -j DROP

Restrict 9300 — the internal cluster communication (transport) port

iptables -A INPUT -i eth0 -p tcp --destination-port 9300 -s {PUBLIC-IP-ADDRESS-HERE} -j DROP

Restrict 5601 — the Kibana access port

iptables -A INPUT -i eth0 -p tcp --destination-port 5601 -s {PUBLIC-IP-ADDRESS-HERE} -j DROP

After that you can relax! Elasticsearch is no longer accessible from the Internet.
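The DROP rules above block traffic from one known address. A stricter allowlist sketch accepts these ports only from the intranet and drops everything else; the interface name eth0 and the 10.0.0.0/8 trusted range are assumptions, so adjust them to your network:

```shell
# Allowlist sketch: permit 9200/9300/5601 only from an assumed trusted
# intranet range (10.0.0.0/8 is a placeholder), then drop everyone else.
for port in 9200 9300 5601; do
  iptables -A INPUT -i eth0 -p tcp --destination-port "$port" -s 10.0.0.0/8 -j ACCEPT
  iptables -A INPUT -i eth0 -p tcp --destination-port "$port" -j DROP
done
```

Because iptables rules are evaluated in order, the ACCEPT for the trusted range must come before the catch-all DROP.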

1.2 Bind only Elasticsearch ports to Intranet private IP addresses

Change the configuration in elasticsearch.yml to bind only to a private IP address or, for a single-node instance, to the loopback interface:

network.bind_host: 127.0.0.1
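For a multi-node cluster, the node must still be reachable by its peers over the private network. A sketch of the relevant elasticsearch.yml settings (the 10.0.0.5 address is a placeholder for this node's intranet address):

```yaml
# elasticsearch.yml binding sketch; addresses are illustrative
network.bind_host: 10.0.0.5      # listen only on the intranet interface
network.publish_host: 10.0.0.5   # address advertised to other cluster nodes
http.port: 9200
```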

1.3 Add a private network between Elasticsearch and the client service

If you need to access Elasticsearch from another computer, connect them over a VPN or any other private network. A quick way to establish a secure tunnel between two machines is through SSH tunnels:

ssh -Nf -L 9200:localhost:9200 user@remote-elasticsearch-server

You can then access Elasticsearch from the client machine through the SSH tunnel:

curl http://localhost:9200/_search

2. Use Nginx for authentication and SSL/TLS

There are several open-source, free solutions that provide Elasticsearch access authentication, but if you want something quick and easy, here is how to do it yourself using Nginx.

2.1 Do it yourself with Nginx

Step 1: Generate a password file

printf "esuser:$(openssl passwd -crypt MySecret)\n" > /etc/nginx/passwords

Step 2: Generate a self-signed SSL certificate if you do not have an official certificate:

sudo mkdir /etc/nginx/ssl
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /etc/nginx/ssl/nginx.key -out /etc/nginx/ssl/nginx.crt

Step 3:

Step 3: Add the proxy configuration, with SSL and basic authentication enabled, to /etc/nginx/nginx.conf (note that the SSL certificate and key files are expected in /etc/nginx/ssl/). For example:

# define proxy upstream to Elasticsearch via loopback interface
http {
  upstream elasticsearch {
    server 127.0.0.1:9200;
  }

  server {
    # enable TLS
    listen 0.0.0.0:443 ssl;
    ssl_certificate /etc/nginx/ssl/nginx.crt;
    ssl_certificate_key /etc/nginx/ssl/nginx.key;
    ssl_protocols TLSv1.2;
    ssl_prefer_server_ciphers on;
    ssl_session_timeout 5m;
    ssl_ciphers "HIGH:!aNULL:!MD5:!3DES";

    # proxy for Elasticsearch
    location / {
      auth_basic "Login";
      auth_basic_user_file /etc/nginx/passwords;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $http_host;
      proxy_set_header X-NginX-Proxy true;
      # use the upstream defined above under the name "elasticsearch"
      proxy_pass http://elasticsearch/;
      proxy_redirect off;

      # answer CORS preflight requests directly
      if ($request_method = OPTIONS) {
        add_header Access-Control-Allow-Origin "*";
        add_header Access-Control-Allow-Methods "GET, POST, PUT, OPTIONS";
        add_header Access-Control-Allow-Headers "Content-Type,Accept,Authorization,X-Requested-With";
        add_header Access-Control-Allow-Credentials "true";
        add_header Content-Length 0;
        add_header Content-Type application/json;
        return 200;
      }
    }
  }
}

Restart Nginx and try to access Elasticsearch through it (with a self-signed certificate, curl needs -k to skip verification):

curl -k -u esuser:MySecret https://localhost/_search

2.2 Elasticsearch security tool: X-Pack

Since version 6.3, X-Pack ships bundled with Elasticsearch, so no additional installation is required. Its security features are paid, however, with a one-month free trial. If your budget allows, it is well worth buying.

2.3 Elasticsearch third-party security plugin

You can install and configure one of Elasticsearch’s several free security plug-ins to enable authentication:

  1. ReadonlyREST for Elasticsearch is available on GitHub. It provides several types of authentication, from basic auth to LDAP, as well as index- and operation-level access control. Git address: github.com/sscarduzio/…

  2. SearchGuard is a free security plug-in for Elasticsearch (some advanced features are paid), including role-based access control and SSL/TLS-encrypted node-to-node communication. It also offers enterprise features for each Elasticsearch cluster, such as LDAP authentication and JSON Web Token authentication.

3. Auditing and alerting

As with any type of system that stores sensitive data, you have to monitor it very closely. This means not only monitoring its various metrics, whose sudden changes can be an early sign of a problem, but also looking at its logs.

Elasticsearch is supported by many monitoring vendors. Logs should be collected and shipped in real time to a log management service, where alerts need to be set up to flag any anomalies or suspicious activity.

Alerting on metrics and logs means that you will find security problems early and can take appropriate action, hopefully preventing further damage.

4. Back up and restore data

Elasticdump is a very convenient tool to back up/restore or re-index data based on Elasticsearch queries. To back up complete indices, the [Elasticsearch snapshot API](https://www.elastic.co/guide/en/elasticsearch/reference/6.5/modules-snapshots.html) is the right tool. The snapshot API provides the ability to create and restore snapshots of entire indices, stored in files or in Amazon S3 buckets.

Let’s look at some examples of Elasticdump and snapshot backup and recovery.

1) Install the elasticdump package using the Node Package Manager (npm).

 npm i elasticdump -g

2) Back up the results of a query to a gzip file.

elasticdump --input='http://username:password@localhost:9200/myindex' --searchBody '{"query" : {"range" :{"timestamp" : {"lte": 1483228800000}}}}' --output=$ --limit=1000 | gzip > /backups/myindex.gz

3) Restore from the gzip file.

zcat /backups/myindex.gz | elasticdump --input=$ --output=http://username:password@localhost:9
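The snapshot side can be sketched with curl. The repository name my_backup, the snapshot name snapshot_1, and the /mnt/backups path below are all placeholders; for a filesystem repository, the location must also be listed under path.repo in elasticsearch.yml.

```shell
# 1) Register a shared-filesystem snapshot repository
curl -XPUT 'http://localhost:9200/_snapshot/my_backup' \
  -H 'Content-Type: application/json' \
  -d '{"type": "fs", "settings": {"location": "/mnt/backups/my_backup"}}'

# 2) Snapshot all indices and wait for completion
curl -XPUT 'http://localhost:9200/_snapshot/my_backup/snapshot_1?wait_for_completion=true'

# 3) Restore the snapshot
curl -XPOST 'http://localhost:9200/_snapshot/my_backup/snapshot_1/_restore'
```

Unlike the elasticdump approach, snapshots are incremental: each new snapshot only stores segments that earlier snapshots in the repository do not already contain.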

5. Use the latest version of Elasticsearch

This is a general best practice, because there were specific bugs in older releases, such as the 5.x line. If you are still using 1.x or 2.x, be sure to disable dynamic scripting. Elasticsearch allows you to evaluate custom expressions using scripts but, as Elastic has documented, using a non-sandboxed scripting language can be a problem. The latest 6.5.x releases add the Spaces feature and further strengthen security and role-based access control.

References: logz.io/blog/securi… sematext.com/blog/elasti…