IR Tales: The Quest for the Holy SIEM: Elastic stack + Sysmon + Osquery

This blog post is the first in a series demonstrating how to install and set up common SIEM platforms. The ultimate goal of the series is to empower the reader to choose their own adventure by selecting the best SIEM for their goals or requirements. Each post will provide Docker-compose v2, Docker-compose for Swarm, Ansible, Vagrant, and manual instructions, so the reader can set up each platform with the deployment method of their choosing. This post covers how to set up the Elastic stack, formerly known as ELK. In addition to setting up the Elastic stack, I will provide instructions to install Sysmon + Winlogbeat on Windows and Osquery + Filebeat on Ubuntu to ship logs to Elastic.

Goals

  • Setup the Elastic stack with Docker
  • Setup the Elastic stack with Ansible
  • Setup the Elastic stack with Vagrant
  • Setup the Elastic stack with manual instructions
  • Test Elastic stack with Python script
  • Ingest Sysmon logs into Elastic with Winlogbeat
  • Ingest Osquery logs into Elastic with Filebeat

Update log

  • July 15th 2021 – Updated Docker and Ansible playbooks from v7.10 to v7.13.2
  • August 30th 2021 – Added Vagrantfile for Elastic
  • October 22nd 2021 – Updated Docker and Ansible playbooks from v7.13.2 to v7.15.1
  • January 11th 2022 – Updated Docker and Ansible playbooks from v7.15.1 to v7.16.2

Background

What is Elastic stack?

The ELK stack is an amazing and powerful collection of three open source projects: Elasticsearch, Logstash, and Kibana. Although each is a separate project, they have been built to work exceptionally well together. The Elastic stack is a complete end-to-end log analysis solution that helps with deep searching, analyzing, and visualizing the logs generated by different machines.

What is Sysmon?

System Monitor (Sysmon) is a Windows system service and device driver that, once installed on a system, remains resident across system reboots to monitor and log system activity to the Windows event log. It provides detailed information about process creations, network connections, and changes to file creation time. By collecting the events it generates using Windows Event Collection or SIEM agents and subsequently analyzing them, you can identify malicious or anomalous activity and understand how intruders and malware operate on your network.

What is Osquery?

Osquery exposes an operating system as a high-performance relational database. This allows you to write SQL-based queries to explore operating system data. With osquery, SQL tables represent abstract concepts such as running processes, loaded kernel modules, open network connections, browser plugins, hardware events or file hashes.
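As a quick illustration of that SQL interface, here is a hypothetical query against the processes table. It assumes `osqueryi` (the interactive shell installed with the osquery package) is on the PATH, and falls back to printing the query if it is not:

```shell
# Hypothetical example: list the five processes using the most resident memory.
QUERY='SELECT name, pid, resident_size FROM processes ORDER BY resident_size DESC LIMIT 5;'
if command -v osqueryi >/dev/null 2>&1; then
  osqueryi --json "$QUERY"
else
  echo "osqueryi not installed; query was: $QUERY"
fi
```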

What is Winlogbeat?

Winlogbeat ships Windows event logs to Elasticsearch or Logstash. You can install it as a Windows service. Winlogbeat reads from one or more event logs using Windows APIs, filters the events based on user-configured criteria, then sends the event data to the configured outputs (Elasticsearch or Logstash). Winlogbeat watches the event logs so that new event data is sent in a timely manner. The read position for each event log is persisted to disk to allow Winlogbeat to resume after restarts.

What is Filebeat?

Filebeat is a lightweight shipper for forwarding and centralizing log data. Installed as an agent on your servers, Filebeat monitors the log files or locations that you specify, collects log events, and forwards them to Logstash for indexing.

Network diagram

Generate OpenSSL private key and public cert

  1. git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
  2. cd ChooseYourSIEMAdventure
  3. vim conf/tls/tls.conf and set:
    1. Set the location information under [dn]
      1. C – Set Country
      2. ST – Set state
      3. L – Set city
      4. O – Enter organization name
      5. emailAddress – Enter a valid e-mail for your org
    2. Replace example.com in all fields with your domain
    3. For alt names list all the valid DNS records for this cert
    4. Save and exit
  4. openssl req -x509 -new -nodes -keyout conf/tls/tls.key -out conf/tls/tls.crt -config conf/tls/tls.conf
    1. Generate TLS private key and public certificate
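As a sketch of what that command does, here is a self-contained version with a minimal inline config. The field values are placeholders for illustration, not the repo's conf/tls/tls.conf:

```shell
# Generate a self-signed key/cert pair from an inline config.
# All [dn] values and the SAN below are placeholders; substitute your own.
workdir=$(mktemp -d)
cat > "$workdir/tls.conf" <<'EOF'
[req]
distinguished_name = dn
x509_extensions = v3_req
prompt = no
[dn]
C = US
ST = State
L = City
O = ExampleOrg
CN = siem.example.com
emailAddress = admin@example.com
[v3_req]
subjectAltName = DNS:siem.example.com
EOF
openssl req -x509 -new -nodes -days 365 \
  -keyout "$workdir/tls.key" -out "$workdir/tls.crt" \
  -config "$workdir/tls.conf"
# Inspect the subject to confirm the [dn] fields took effect
openssl x509 -in "$workdir/tls.crt" -noout -subject
```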

Install/Setup Elastic stack v7.15 with Docker-compose v2.x

WARNING

The Docker-compose v2.x setup is for development use ONLY. The setup contains hard-coded credentials in configs and environment variables. For a more secure deployment please skip to the next section to use Ansible.

WARNING

  1. git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
  2. cd ChooseYourSIEMAdventure
  3. vim .env and set:
    1. ELASTIC_VERSION – OPTIONAL – Set the version of Elastic you want to use
    2. SIEM_PASSWORD – Set the password for the Elastic stack
    3. NGINX_VERSION – OPTIONAL – Set the version of NGINX you want to use
    4. Save and exit
  4. docker-compose -f docker-compose-elastic.yml build
  5. docker-compose -f docker-compose-elastic.yml up

Install/Setup Elastic stack v7.15 with Docker-compose v3.x (Swarm)

WARNING

The Docker-compose v3.x setup is for development use ONLY. The purpose of this setup is to demonstrate how to use the Elastic keystore. It is arguably more secure than the Docker-compose v2.x setup because credentials/secrets are stored in the Elastic keystore rather than in configuration files or environment variables.

WARNING

Create secrets

  1. git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
  2. cd ChooseYourSIEMAdventure
  3. for user in 'elastic' 'kibana_system' 'logstash_system' 'beats_system' 'apm_system' 'remote_monitoring_user' 'logstash_writer'; do pass=$(openssl rand -base64 32 | tr -cd '[:alnum:]'); echo ${pass} | docker secret create elastic-builtin-${user} -; echo ${user} - ${pass}; done
    1. For loop to generate random passwords for each builtin user
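Without a Swarm handy, the password-generation half of that loop can be sketched on its own, writing to files in place of `docker secret create` (an assumption purely for illustration):

```shell
# Sketch of the password-generation portion of the loop above. Passwords are
# written to files instead of Docker secrets so it can run without a Swarm.
secret_dir=$(mktemp -d)
for user in elastic kibana_system logstash_system beats_system apm_system remote_monitoring_user logstash_writer; do
  # 32 random base64 bytes, stripped down to alphanumerics only
  pass=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
  printf '%s' "$pass" > "$secret_dir/elastic-builtin-${user}"
done
ls "$secret_dir"
```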

Docker start stack

  1. docker stack deploy -c docker-compose-swarm-elastic.yml elastic
  2. docker exec -it $(docker ps | grep elastic_elasticsearch | awk '{print $1}') /usr/share/elasticsearch/elasticsearch-entrypoint.sh
    1. Execute script to setup builtin users
  3. docker service logs -f elastic_elasticsearch
    1. Monitor Elasticsearch logs
  4. docker service logs -f elastic_kibana
    1. Monitor Kibana logs
  5. docker service logs -f elastic_logstash
    1. Monitor Logstash logs

Install/Setup Elastic stack v7.15 with Ansible

WARNING

This Ansible playbook will allocate half of the system's memory to Elasticsearch. For example, if a machine has 16 GB of memory, 8 GB will be allocated to Elasticsearch.

WARNING

Init playbook

  1. git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
  2. cd ChooseYourSIEMAdventure
  3. vim hosts.ini and add the IP of the Elastic server under [elastic]
  4. vim group_vars/all.yml and set:
    1. base_domain – Set the domain where the server resides
    2. timezone – OPTIONAL – The default timezone is UTC+0
    3. siem_username – Ignore this setting
    4. siem_password – Ignore this setting
  5. vim group_vars/elastic.yml and set:
    1. hostname – Set the desired hostname for the server
    2. elastic_repo_version – Change the repo version to install the Elastic stack – best to leave as default
    3. elastic_version – Set the version of the Elastic stack to install
    4. Save and exit

Run playbook

  1. ansible-playbook -i hosts.ini deploy_elastic.yml -u <username> -K

Install/Setup Elastic stack with Vagrant

  1. git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
  2. cd ChooseYourSIEMAdventure
  3. VAGRANT_VAGRANTFILE=Vagrantfile-elastic vagrant up

Manual install/Setup of Elastic stack v7.15

WARNING

These manual instructions will allocate half of the system's memory to Elasticsearch. For example, if a machine has 16 GB of memory, 8 GB will be allocated to Elasticsearch.

WARNING
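The half-of-memory arithmetic can be sketched without root by reading /proc/meminfo in place of dmidecode (an assumption for illustration; the steps below use dmidecode):

```shell
# Compute half of total system memory, in MB, for the Elasticsearch heap.
# /proc/meminfo reports MemTotal in kB and does not require root.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
heap_mb=$(( total_kb / 1024 / 2 ))
# These are the values the sed commands below write into jvm.options
echo "-Xms${heap_mb}M"
echo "-Xmx${heap_mb}M"
```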

Init Linux instance

  1. apt update -y && apt upgrade -y && reboot
    1. Update Ubuntu and reboot
  2. timedatectl set-timezone Etc/UTC
    1. Set timezone to UTC +0
  3. apt install wget net-tools git -y
  4. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/tls/tls.conf -O /etc/ssl/tls.conf
    1. Download TLS config
    2. Go to the “Generate OpenSSL private key and public cert” section at the top for more details
  5. openssl req -x509 -new -nodes -keyout /etc/ssl/private/elastic.key -out /etc/ssl/certs/elastic.crt -config /etc/ssl/tls.conf
    1. Generate self-signed public certificate and private key
  6. chmod 440 /etc/ssl/private/elastic.key
  7. chmod 644 /etc/ssl/certs/elastic.crt
    1. Set the proper permissions for the private key and public cert
  8. wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
    1. Add Elastic GPG key
  9. apt-get install apt-transport-https -y
  10. echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
    1. Add Elastic repo

Install/Setup Elasticsearch 7.15

  1. apt-get update && apt-get install elasticsearch=7.15.1 -y
    1. Install Elasticsearch
  2. sed -i "s#-Xms1g#-Xms$(echo $(( $(dmidecode -t 17 | grep 'Size: ' | awk '{print $2}') / 2 ))"M")#g" /etc/elasticsearch/jvm.options
    1. Set the initial heap size (-Xms) for Elasticsearch to half of system memory
  3. sed -i "s#-Xmx1g#-Xmx$(echo $(( $(dmidecode -t 17 | grep 'Size: ' | awk '{print $2}') / 2 ))"M")#g" /etc/elasticsearch/jvm.options
    1. Set the maximum heap size (-Xmx) for Elasticsearch to half of system memory
  4. echo 'xpack.security.enabled: true' >> /etc/elasticsearch/elasticsearch.yml
    1. Enable X-Pack security
  5. echo 'xpack.security.transport.ssl.enabled: true' >> /etc/elasticsearch/elasticsearch.yml
    1. Enable X-Pack security transport SSL
  6. echo 'discovery.type: single-node' >> /etc/elasticsearch/elasticsearch.yml
    1. Set to a single-node mode
  7. systemctl restart elasticsearch
  8. systemctl enable elasticsearch
  9. yes | /usr/share/elasticsearch/bin/elasticsearch-setup-passwords -s auto > /tmp/elasticsearch-setup-passwords.txt
    1. Generate Elasticsearch passwords
  10. cat /tmp/elasticsearch-setup-passwords.txt
  11. curl -u elastic:<Elastic password> http://127.0.0.1:9200

Install/Setup Kibana 7.15

  1. apt install kibana=7.15.1 -y
  2. sed -i 's/#elasticsearch.username:/elasticsearch.username:/g' /etc/kibana/kibana.yml
    1. Set the Kibana username for Elasticsearch
  3. sed -i "s/#elasticsearch.password: \"pass\"/elasticsearch.password: \"$(cat /tmp/elasticsearch-setup-passwords.txt | grep kibana_system | grep PASSWORD | awk '{print $4}')\"/g" /etc/kibana/kibana.yml
    1. Set the Kibana password for Elasticsearch
  4. systemctl restart kibana
  5. systemctl enable kibana
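The grep/awk extraction in step 3 can be exercised against a fabricated sample of the elasticsearch-setup-passwords output (the sample values below are made up):

```shell
# Sketch of the password extraction used above, run against a fake sample of
# /tmp/elasticsearch-setup-passwords.txt. Real output has the same layout.
sample=$(mktemp)
cat > "$sample" <<'EOF'
Changed password for user kibana_system
PASSWORD kibana_system = AbCd1234kibana
Changed password for user elastic
PASSWORD elastic = WxYz5678elastic
EOF
# Field 4 of "PASSWORD kibana_system = <value>" is the password itself
kibana_pass=$(grep kibana_system "$sample" | grep PASSWORD | awk '{print $4}')
echo "$kibana_pass"
```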

Install/Setup NGINX

  1. apt install nginx -y
  2. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/nginx/nginx.conf -O /etc/nginx/nginx.conf
    1. Download nginx.conf
  3. ln -s /etc/ssl/certs/elastic.crt /etc/ssl/certs/nginx.crt
  4. ln -s /etc/ssl/private/elastic.key /etc/ssl/private/nginx.key
    1. Create symlinks to private key and public certificate
  5. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/nginx/kibana.conf -O /etc/nginx/conf.d/kibana.conf
    1. Download kibana.conf
  6. systemctl restart nginx
  7. systemctl enable nginx

Create Logstash user and role for Elastic

  1. curl -X POST http://localhost:9200/_security/role/logstash_writer --user elastic:$(cat /tmp/elasticsearch-setup-passwords.txt | grep 'PASSWORD elastic' | awk '{print $4}') -H 'Content-Type: application/json' -d '{"cluster": ["manage_index_templates", "monitor", "manage_ilm"], "indices": [ {"names": [ "*" ], "privileges": ["write","create","delete","create_index","manage","manage_ilm"]}]}'
    1. Create an Elastic role that allows all users under this role to perform write, create, delete, create_index, manage, and manage_ilm actions on Elastic
  2. logstash_writer_password=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
    1. Generate random password
  3. echo -e "Changed password for user logstash_writer\nPASSWORD logstash_writer = $logstash_writer_password\n" >> /tmp/elasticsearch-setup-passwords.txt
  4. echo $logstash_writer_password
  5. curl --user elastic:$(cat /tmp/elasticsearch-setup-passwords.txt | grep 'PASSWORD elastic' | awk '{print $4}') -X POST http://localhost:9200/_security/user/logstash_writer -H 'Content-Type: application/json' -d "{\"password\": \"$logstash_writer_password\", \"roles\": [ \"logstash_writer\" ], \"full_name\": \"Logstash writer\", \"email\": \"logstash_writer@<domain>\"}"
    1. Create an Elastic user with the role logstash_writer (the request body must be double-quoted so the shell expands $logstash_writer_password)
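One quoting detail worth noting: the JSON body for the user-creation call has to be double-quoted so the shell expands `$logstash_writer_password`; inside single quotes it would stay a literal string. A minimal sketch:

```shell
# Demonstrate the quoting rule for embedding a shell variable in a JSON body.
logstash_writer_password=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
# Double quotes: the variable expands, inner quotes are escaped
body="{\"password\": \"${logstash_writer_password}\", \"roles\": [\"logstash_writer\"]}"
echo "$body"
```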

Install/Setup Logstash 7.15

  1. apt install logstash=1:7.15.1-1 -y
  2. /usr/share/logstash/bin/logstash-plugin install logstash-filter-json_encode
    1. Install plugin
  3. mkdir /etc/logstash/tls
    1. Make directory for TLS private key and public certificate
  4. openssl req -x509 -new -nodes -keyout /etc/logstash/tls/logstash.key -out /etc/logstash/tls/logstash.crt -config /etc/ssl/tls.conf
    1. Generate self-signed public certificate and private key for Logstash
  5. chown logstash:logstash /etc/logstash/tls/logstash.key /etc/logstash/tls/logstash.crt
  6. chmod 644 /etc/logstash/tls/logstash.crt
  7. chmod 600 /etc/logstash/tls/logstash.key
    1. Set the correct permissions for the TLS private key and public certificate
  8. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/elastic/02-input-beats.conf -O /etc/logstash/conf.d/02-input-beats.conf
    1. Download Logstash Beats input config
  9. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/elastic/30-output-elasticsearch.conf -O /etc/logstash/conf.d/30-output-elasticsearch.conf
    1. Download Logstash Elasticsearch output config
  10. sed -i 's/{{ logstash_writer_username }}/logstash_writer/g' /etc/logstash/conf.d/30-output-elasticsearch.conf
    1. Set the Logstash username for Elasticsearch
  11. sed -i "s/{{ logstash_writer_password }}/$logstash_writer_password/g" /etc/logstash/conf.d/30-output-elasticsearch.conf
    1. Set the Logstash password for Elasticsearch
  12. systemctl restart logstash
  13. systemctl enable logstash
  14. tail -f /var/log/logstash/logstash-plain.log

Setup UFW

  1. ufw allow OpenSSH
    1. Allow SSH access
  2. ufw allow 'Nginx HTTP'
    1. Allow HTTP
  3. ufw allow 'Nginx HTTPS'
    1. Allow HTTPS
  4. ufw allow 5044/tcp
    1. Allow Logstash
  5. ufw enable

Log into the Elastic WebGUI

  1. Open a browser to https://<IP addr or FQDN of Elastic>:<port> (Ansible: 443, Docker: 8443) and log in
    1. Username: elastic
    2. Password: for Docker, the value of SIEM_PASSWORD in .env; for Ansible, the password output for the elastic user during setup
    3. Select “Log in”

Test Elastic stack

  1. cd ChooseYourSIEMAdventure/pipeline_testers
  2. virtualenv -p python3 venv
  3. source venv/bin/activate
    1. Activate the virtual environment so the requirements install into it
  4. pip3 install -r requirements.txt
  5. python3 beats_input_test.py --platform elastic --host <IP addr of Elastic> --api_port <Elasticsearch port - default 9200> --ingest_port <Logstash port - default 5044> --siem_username elastic --siem_password <Elasticsearch password>
  6. Log into Elastic
  7. Expand the menu on the left > Stack Management > Index management
  8. Expand the menu on the left > Stack Management > Index patterns
    1. Select “Create index pattern”
    2. Enter python-logstash-* into Define an index pattern
    3. Select “Next step”
    4. Select “@timestamp” for Time field
    5. Select “Create index pattern”
  9. Select “Discover” in the top left
  10. Enter <random message> into search
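The virtual-environment steps above can be sketched with the stdlib `venv` module (an assumption: the post uses the `virtualenv` tool, which behaves the same for this purpose). Activation must come before `pip3 install` so packages land inside the venv rather than system-wide:

```shell
# Create and activate a Python virtual environment with the stdlib venv module.
envdir=$(mktemp -d)/venv
python3 -m venv "$envdir"
# Activation prepends the venv's bin/ to PATH
. "$envdir/bin/activate"
# python now resolves to the venv copy
command -v python
deactivate
```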

Ingest Sysmon logs with Winlogbeat on Windows 10

Install/Setup Sysmon v13 on Windows 10

  1. Log into the Windows VM
  2. Open Powershell as Administrator
  3. cd $ENV:TMP
  4. Invoke-WebRequest -Uri https://download.sysinternals.com/files/Sysmon.zip -OutFile Sysmon.zip
    1. Download Sysmon
  5. Expand-Archive .\Sysmon.zip -DestinationPath .
    1. Unzip Sysmon
  6. Invoke-WebRequest -Uri https://raw.githubusercontent.com/olafhartong/sysmon-modular/master/sysmonconfig.xml -OutFile sysmonconfig.xml
    1. Download Sysmon config
  7. .\Sysmon.exe -accepteula -i .\sysmonconfig.xml
    1. Install Sysmon driver and load Sysmon config
  8. Enter eventvwr into Powershell
  9. Expand Application and Services Logs > Microsoft > Windows > Sysmon

Install/Setup Winlogbeat on Windows 10

  1. cd $ENV:TEMP
  2. Invoke-WebRequest -Uri https://artifacts.elastic.co/downloads/beats/winlogbeat/winlogbeat-7.10.0-windows-x86_64.zip -OutFile winlogbeat-7.10.0-windows-x86_64.zip
    1. Download Winlogbeat
  3. Expand-Archive .\winlogbeat-7.10.0-windows-x86_64.zip -DestinationPath .
    1. Unzip Winlogbeat
  4. mv .\winlogbeat-7.10.0-windows-x86_64 'C:\Program Files\winlogbeat'
    1. Move Winlogbeat to the Program Files directory
  5. cd 'C:\Program Files\winlogbeat\'
    1. Change to the Program Files directory
  6. Invoke-WebRequest -Uri https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/winlogbeat/winlogbeat.yml -OutFile winlogbeat.yml
    1. Download Winlogbeat config
  7. Using your favorite text editor open C:\Program Files\winlogbeat\winlogbeat.yml
    1. Open the document from the command line with Visual Studio Code: code .\winlogbeat.yml
    2. Open the document from the command line with Notepad: notepad.exe .\winlogbeat.yml
  8. Scroll down to output.logstash:
    1. Replace logstash_ip_addr with the IP address or FQDN of Logstash
    2. Replace logstash_port with the port Logstash uses to ingest Beats (default 5044)
  9. powershell -Exec bypass -File .\install-service-winlogbeat.ps1
  10. Set-Service -Name "winlogbeat" -StartupType automatic
  11. Start-Service -Name "winlogbeat"
  12. Get-Service -Name "winlogbeat"

Ingest Osquery logs with Filebeat on Ubuntu 20.04

Install/Setup Osquery v4.6.0 on Ubuntu 20.04

  1. Log into VM with SSH
  2. sudo su
  3. export OSQUERY_KEY=1484120AC4E9F8A1A577AEEE97A80C63C9D8B80B
  4. apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys $OSQUERY_KEY
    1. Add Osquery GPG key for repo
  5. add-apt-repository 'deb [arch=amd64] https://pkg.osquery.io/deb deb main'
    1. Add Osquery repo
  6. apt-get update -y && apt-get install osquery -y
    1. Install Osquery
  7. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/osquery/linux-osquery.conf -O /etc/osquery/osquery.conf
    1. Download Osquery config (a copy of the Palantir config)
  8. sed -i 's#/etc/osquery/packs/ossec-rootkit.conf#/usr/share/osquery/packs/ossec-rootkit.conf#g' /etc/osquery/osquery.conf
    1. Fix the path to the ossec-rootkit pack
  9. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/osquery/linux-osquery.flags -O /etc/osquery/osquery.flags
    1. Download Osquery flags
  10. systemctl restart osqueryd
  11. systemctl enable osqueryd

Install/Setup Filebeat v7.10 on Ubuntu 20.04

  1. wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
    1. Add Elastic repo GPG key
  2. apt-get install apt-transport-https -y
  3. echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
    1. Add Elastic repo
  4. apt-get update && sudo apt-get install filebeat -y
    1. Install Filebeat
  5. wget https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/filebeat/linux-filebeat.yml -O /etc/filebeat/filebeat.yml
    1. Download Filebeat config
  6. sed -i 's/{{ logstash_ip_addr }}/<Logstash IP addr>/g' /etc/filebeat/filebeat.yml
  7. sed -i 's/{{ logstash_port }}/<Logstash Beats port - default 5044>/g' /etc/filebeat/filebeat.yml
    1. Set the IP address and port for Logstash
  8. filebeat modules enable osquery
    1. Enable Osquery module for Filebeat
  9. systemctl restart filebeat
  10. systemctl enable filebeat
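The placeholder substitution in steps 6-7 can be sketched against a fabricated config fragment (the IP below is a documentation address, not a real Logstash host):

```shell
# Substitute the {{ ... }} placeholders in a fake config fragment, the same
# way steps 6-7 rewrite /etc/filebeat/filebeat.yml.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
output.logstash:
  hosts: ["{{ logstash_ip_addr }}:{{ logstash_port }}"]
EOF
sed -i 's/{{ logstash_ip_addr }}/192.0.2.10/g' "$cfg"
sed -i 's/{{ logstash_port }}/5044/g' "$cfg"
cat "$cfg"
```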

Create Elastic index patterns

Create Osquery index pattern

  1. Log into Elastic
  2. Settings > Stack Management > Data > Index Management
  3. Ensure that an Osquery index exists
  4. Settings > Stack Management > Kibana > Index patterns
  5. Select “Create index pattern”
    1.  Define an index pattern
      1. Enter osquery-* for Index pattern name
      2. Select “Next step”
    2. Configure settings
      1. Select @timestamp for Time field
      2. Select “Create index pattern”
  6. Settings > Overview > Discover

Create Sysmon index pattern

  1. Log into Elastic
  2. Settings > Stack Management > Data > Index Management
  3. Ensure that a Sysmon index exists
  4. Settings > Stack Management > Kibana > Index patterns
  5. Select “Create index pattern”
    1. Define an index pattern
      1. Enter sysmon-* for Index pattern name
      2. Select “Next step”
    2.  Configure settings
      1. Select event.created for Time field
      2. Select “Create index pattern”
  6.  Settings > Overview > Discover
  7. Select “sysmon” index

Lessons learned

New skills/knowledge

  • Learned Elastic stack v7.10 and v7.13
  • Learned how to use the Elastic API to create a role and user for Logstash
  • Implemented authentication on Elasticsearch

What I would have done better

  • I would have liked to implement the Elastic keystore for Docker Swarm
