This blog post is the season finale in a series demonstrating how to install and set up common SIEM platforms. The ultimate goal of each post is to empower the reader to choose their own adventure by selecting the SIEM that best fits their goals and requirements. Each post in the series provides Docker-compose v2, Docker-compose for Swarm, Ansible, Vagrant, and manual instructions so the reader can set up each platform with the deployment method of their choosing. In addition to setting up Splunk, I will cover fundamental Splunk concepts such as the Common Information Model (CIM). Lastly, I will provide step-by-step instructions to install Sysmon + Splunk Universal Forwarder on Windows, Osquery + Splunk Universal Forwarder on Ubuntu, and Zeek + Filebeat to ship logs to Splunk.
Goals
- Learn the fundamentals of Splunk
- Set up Splunk with Docker
- Set up Splunk with Ansible
- Set up Splunk with Vagrant
- Set up Splunk with manual instructions
- Test Splunk with a Python script
- Install/Setup of Sysmon on Windows with the Splunk Universal Forwarder
- Install/Setup of Osquery on Ubuntu with the Splunk Universal Forwarder
- Install/Setup of Zeek on Ubuntu with Filebeat
- Install Splunk Add-Ons for Splunk CIM compliance
Update log
- August 30th 2021 – Added Vagrantfile for Splunk
- October 24th 2021 – Updated Docker and Ansible playbooks from Splunk v8.1 to v8.2
- December 19th 2021 – Updated Docker and Ansible playbooks from Splunk v8.2 to v8.2.4
Background
What is Splunk?
Splunk is a software platform for collecting, indexing, searching, and analyzing machine-generated data such as logs, metrics, and events from websites, applications, sensors, and devices. Data shipped to Splunk is indexed and can then be searched, correlated, visualized, and alerted on using the Splunk Search Processing Language (SPL).
What is Osquery?
Osquery exposes an operating system as a high-performance relational database. This allows you to write SQL-based queries to explore operating system data. With Osquery, SQL tables represent abstract concepts such as running processes, loaded kernel modules, open network connections, browser plugins, hardware events, or file hashes.
What is Sysmon?
System Monitor (Sysmon) is a Windows system service and device driver that, once installed on a system, remains resident across system reboots to monitor and log system activity to the Windows event log. It provides detailed information about process creations, network connections, and changes to file creation time. By collecting the events it generates using Windows Event Collection or SIEM agents and subsequently analyzing them, you can identify malicious or anomalous activity and understand how intruders and malware operate on your network.
What is Zeek?
Zeek is a passive, open-source network traffic analyzer. It is primarily a security monitor that inspects all traffic on a link in depth for signs of suspicious activity. More generally, however, Zeek (formerly known as Bro) supports a wide range of traffic analysis tasks even outside of the security domain, including performance measurements and helping with troubleshooting.
What is a Splunk universal forwarder?
The Splunk Universal Forwarder provides reliable, secure data collection from remote sources and forwards that data into Splunk software for indexing and consolidation. Universal forwarders can scale to tens of thousands of remote systems, collecting terabytes of data.
Splunk universal forwarder vs Beats logging clients
One of the biggest differences between the Splunk UF and Beats is that the Splunk server has the ability to control the Splunk UF. For example, with Splunk you could make a change on the server to instruct all Splunk UFs to start collecting SSH logs, and this instruction is pushed down to all the endpoints. With Beats, however, you would have to use a configuration management tool to push this update to all Beats agents and restart them.
The Splunk UF can also be configured to run a script on a machine. For example, you could have a PowerShell script that periodically checks whether Sysmon is installed and logging to Splunk; if not, the script can install Sysmon and configure it to automatically start shipping Sysmon data to Splunk. This is very convenient because it means onboarding new assets can be as easy as installing the Splunk UF.
Lastly, Splunk has a large and diverse ecosystem of applications that are ready to install. For example, let’s say you are switching from Elastic to Splunk and your endpoints are using Sysmon. Just install the Splunk Add-on for Microsoft Sysmon and it will instruct the Splunk UFs how to collect Sysmon data. There is no need to configure the Splunk UFs if an app for the data source already exists. For more information on the differences between the two, see the following blog posts: Let’s Chat About Splunk and ELK, Elasticsearch Best Practice Architecture, and Splunk vs. Elastic Stack (ELK): Making Sense of Machine Log Files.
What and why Logstash?
Logstash is an application built in the modern era: it supports high-availability setups, treats configuration files like code, works very well with JSON, and provides TLS without the need for client certificates. There are multiple methods for high availability, which can be reviewed here. Configuration as code is very important because it provides simple programming concepts such as if statements, regex capabilities, and the ability to extract and transform data. Lastly, if you are converting your infrastructure from an Elastic Stack or Graylog cluster to Splunk, your endpoints were most likely using Beats to ship logs. Also, I personally use Filebeat and Winlogbeat for all my blog posts, so I want all future blog posts using Beats clients to be compatible with this setup.
Splunk fundamentals
Splunk buckets
Understanding how Splunk stores logs is an important concept. As you can see from the illustration above, Splunk has the concept of HOT, WARM, COLD, and FROZEN buckets. These colloquial temperature terms typically correspond to how quickly you can access the data in a bucket. Below is an explanation of how the buckets operate, as stated by Cloudian.
Warm Storage
Both hot and warm buckets are kept in warm storage. This storage is the only place that new data is written. Warm storage is your fastest storage and requires at least 800 input/output operations per second (IOPS) to function properly. If possible, this storage should use non-volatile memory express (NVMe) drives and SSDs to ensure performance. NVMe is designed specifically for SSDs and can provide significantly higher performance than other interfaces.
Cold Storage
Cold data storage is your second actual storage tier and is the lowest searchable tier. Cold storage is useful when you need storage to be immediately available but don’t need high performance; for example, to meet PCI compliance requirements. Cold data storage is more cost-effective than warm since you can use lower-quality hardware. Keep in mind that if you are using one large storage pool for your tiers, cold storage is cold in name only: buckets still roll to cold, but your performance doesn’t change.
Frozen Storage
Your lowest storage tier is frozen storage. It is primarily used for compliance or legal reasons which demand you store data for longer periods of time. By default, Splunk deletes data rather than rolling it to frozen storage. However, you can override this behavior by specifying a frozen storage location. Frozen storage is not searchable, so you can store it at a significant size reduction, typically 15% of the original. This is because Splunk deletes the metadata associated with it.
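To make the bucket lifecycle concrete, retention is controlled per index in indexes.conf. Below is a minimal sketch with hypothetical paths and retention values (not the settings used later in this post) showing how buckets roll between the tiers described above:

```
# Hypothetical sketch of an indexes.conf stanza
[zeek]
homePath   = $SPLUNK_DB/zeek/db        # hot + warm buckets (fastest storage)
coldPath   = $SPLUNK_DB/zeek/colddb    # cold buckets (cheaper storage)
thawedPath = $SPLUNK_DB/zeek/thaweddb  # restored frozen buckets
# Roll buckets to frozen after ~90 days (hypothetical value)
frozenTimePeriodInSecs = 7776000
# Archive frozen buckets instead of deleting them (Splunk's default is to delete)
coldToFrozenDir = /mnt/archive/zeek
```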
What and why Splunk Common Information Model (CIM)?
The Splunk CIM is arguably one of the best features of Splunk. Splunk states: “The Splunk Common Information Model (CIM) is a shared semantic model focused on extracting value from data. The CIM is implemented as an add-on that contains a collection of data models, documentation, and tools that support the consistent, normalized treatment of data for maximum efficiency at search time.”
At first the CIM didn’t make sense to me; I needed to see it in action. Personally, the best way to show that is with Zeek/Bro logs and the Splunk CIM data model for network traffic. The Splunk network traffic CIM model defines the key name for a source IP address as src_ip, but Zeek defines this key name as id.orig_h. When you install the Splunk Add-on for Zeek aka Bro, it includes the mappings to translate the key names in Zeek logs to the key names defined by the Splunk CIM data model for network traffic. So what does this mean? You can perform the following Splunk search: index="zeek" AND sourcetype="bro:conn:json" AND src_ip="x.x.x.x". Splunk takes in this query and knows the source IP address defined in the query exists in the Zeek id.orig_h field. Additionally, notice in the screenshot below that the actual log entry is the original event generated by Zeek.
One of the most tedious aspects of being an incident responder is remembering the key names for each data source. It’s even worse when 9 out of 10 sensors use source_ip but one sensor uses src. In that case, a search for source_ip=<attacker IP address> would miss events and possibly malicious activity. The Splunk CIM uses the key name src_ip for all network logs regardless of the originally generated format.
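The normalization also pays off when searching through the CIM data model itself. Below is a hedged example; the IP address is a placeholder, and it assumes the Network_Traffic data model from the CIM add-on is installed:

```
| tstats count from datamodel=Network_Traffic where All_Traffic.src_ip="10.0.0.5" by All_Traffic.dest_ip
```

Because the query goes through the data model, it returns matching events from every CIM-compliant network source (Zeek, firewalls, etc.) without you having to know each source’s native field names.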
Network diagram
Generate OpenSSL private key and public cert
git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
cd ChooseYourSIEMAdventure
vim conf/tls/tls.conf and set:
- Set the location information under [dn]:
  - C – Set country
  - ST – Set state
  - L – Set city
  - O – Enter organization name
  - emailAddress – Enter a valid e-mail for your org
- Replace example.com in all fields with your domain
- For alt names, list all the valid DNS records for this cert
- Save and exit
openssl req -x509 -new -nodes -keyout conf/tls/tls.key -out conf/tls/tls.crt -config conf/tls/tls.conf
- Generate TLS private key and public certificate
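For reference, the fields above map onto an OpenSSL config along these lines. This is only a sketch with placeholder values; the repo’s conf/tls/tls.conf is the authoritative version:

```
# Sketch of an OpenSSL config like conf/tls/tls.conf (placeholder values)
[req]
default_bits       = 4096
prompt             = no
default_md         = sha256
x509_extensions    = v3_req
distinguished_name = dn

[dn]
C            = US
ST           = Michigan
L            = Detroit
O            = ExampleOrg
emailAddress = security@example.com
CN           = splunk.example.com

[v3_req]
subjectAltName = @alt_names

[alt_names]
DNS.1 = splunk.example.com
DNS.2 = logstash.example.com
```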
Install/Setup Splunk with Docker-compose v2.x
WARNING
The Docker-compose v2.x setup is for development use ONLY. The setup contains hard-coded credentials in configs and environment variables. For a more secure Docker deployment, please skip to the next section and use Docker Swarm, which implements Docker secrets, or Ansible.
WARNING
git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
cd ChooseYourSIEMAdventure
vim .env and set:
- SPLUNK_VERSION – OPTIONAL – Set the version of Splunk you want to use
- SIEM_PASSWORD – Set the password for Splunk
- NGINX_VERSION – OPTIONAL – Set the version of NGINX you want to use
- Save and exit
docker-compose -f docker-compose-splunk.yml build
docker-compose -f docker-compose-splunk.yml up
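For example, a hypothetical .env might look like the following (the version numbers and password are placeholders; pick your own):

```
SPLUNK_VERSION=8.2
SIEM_PASSWORD=ChangeMePlease123
NGINX_VERSION=1.21
```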
Install/Setup Splunk with Docker-compose v3.x (Swarm)
WARNING
The Docker-compose v3.x setup is for development use ONLY. The purpose of this setup is to demonstrate how to use the Splunk default.yml config. The Splunk admin password cannot be changed; it must stay the same. For a more secure deployment, please skip to the next section and use Ansible.
WARNING
Create secrets
git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
cd ChooseYourSIEMAdventure
SPLUNK_ADMIN_PASSWORD=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
- Generate Splunk admin password
echo $SPLUNK_ADMIN_PASSWORD
- Print password to screen and save password in a safe location
SPLUNK_HEC_TOKEN=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
- Generate Splunk HEC token
echo $SPLUNK_HEC_TOKEN
- Print Splunk HEC token to screen and save token in a safe location
docker run -it -e SPLUNK_PASSWORD=${SPLUNK_ADMIN_PASSWORD} -e SPLUNK_HEC_TOKEN=${SPLUNK_HEC_TOKEN} -e SPLUNK_HEC_SSL=false splunk/splunk:8.2 create-defaults > conf/docker/splunk/default.yml
- Generate Splunk default.yml config
echo $SPLUNK_HEC_TOKEN | docker secret create splunk-hec-token -
- Create Docker secret with the Splunk HEC token
cat conf/docker/splunk/default.yml | docker secret create splunk-default-conf -
- Create Docker secret with the contents of default.yml
Start the Docker stack
docker stack deploy -c docker-compose-swarm-splunk.yml splunk
docker service logs -f splunk_splunk
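For context, here is a trimmed, hypothetical excerpt showing how a v3 compose file can wire in the two secrets created above; the actual docker-compose-swarm-splunk.yml in the repo is the authoritative version:

```
# Hypothetical excerpt of docker-compose-swarm-splunk.yml
version: "3.7"
services:
  splunk:
    image: splunk/splunk:8.2
    environment:
      SPLUNK_START_ARGS: --accept-license
      # Point the container at the default.yml delivered as a secret (assumed path)
      SPLUNK_DEFAULTS_URL: /run/secrets/splunk-default-conf
    secrets:
      - splunk-hec-token
      - splunk-default-conf
secrets:
  splunk-hec-token:
    external: true   # created earlier with `docker secret create`
  splunk-default-conf:
    external: true
```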
Install/Setup Splunk with Ansible
Init playbook
vim hosts.ini and add the IP of the Splunk server under [splunk]
vim group_vars/all.yml and set:
- base_domain – Set the domain where the server resides
- timezone – OPTIONAL – The default timezone is UTC+0
- siem_username – Ignore this setting
- siem_password – Set the Splunk admin password
vim group_vars/splunk.yml and set:
- hostname – Set the desired hostname for the server
- splunk_version – Set the desired version of Splunk to use
- splunk_dl_url – Set the URL to download Splunk
- beats_port – OPTIONAL – Set the port to ingest logs using Beats clients
- elastic_version – OPTIONAL – Set the desired version of Logstash to use with Splunk – best to leave as default
- elastic_repo_version – Change the repo version to install Logstash
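A minimal, hypothetical pair of files matching the variables above (the IP, domain, and password are placeholders):

```
# hosts.ini (hypothetical)
[splunk]
192.168.1.50

# group_vars/all.yml (hypothetical values)
base_domain: example.com
timezone: Etc/UTC
siem_username: admin            # ignored for the Splunk deployment
siem_password: ChangeMePlease123
```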
Run playbook
ansible-playbook -i hosts.ini deploy_splunk.yml -u <username> -K
- Enter the sudo password when prompted
Install/Setup Splunk with Vagrant
git clone https://github.com/CptOfEvilMinions/ChooseYourSIEMAdventure
cd ChooseYourSIEMAdventure
VAGRANT_VAGRANTFILE=Vagrantfile-splunk vagrant up
Manual install/Setup of Splunk
Init Linux instance
sudo su
timedatectl set-timezone Etc/UTC
- Set the system timezone to UTC +0
apt update -y && apt upgrade -y && reboot
Install Splunk
wget 'https://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86_64&platform=linux&version=8.1.2&product=splunk&filename=splunk-8.1.2-545206cc9f70-linux-2.6-amd64.deb&wget=true' -O /tmp/splunk-8.1.2-linux-2.6-amd64.deb
- Download Splunk
dpkg -i /tmp/splunk-8.1.2-linux-2.6-amd64.deb
- Install Splunk
/opt/splunk/bin/splunk enable boot-start --accept-license --answer-yes --no-prompt --seed-passwd <Splunk admin password>
- Enable Splunk to start on boot, accept license agreement, enter Splunk admin password
sed -i 's/# server.socket_host = .*/server.socket_host = localhost/g' /opt/splunk/etc/system/default/web.conf
- Instruct Splunk webGUI to listen on localhost
SPLUNK_ZEEK_HEC_TOKEN=$(openssl rand -base64 32 | tr -cd '[:alnum:]')
echo ${SPLUNK_ZEEK_HEC_TOKEN}
- Create a random token for the HEC
mkdir /opt/splunk/etc/apps/splunk_httpinput/local
chown splunk:splunk -R /opt/splunk/etc/apps/splunk_httpinput/local
- Create Splunk HEC local config directory
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/splunk/splunk-hec.conf --output /opt/splunk/etc/apps/splunk_httpinput/local/inputs.conf
- Download config for HEC
mkdir /opt/splunk/etc/apps/search/local
chown splunk:splunk -R /opt/splunk/etc/apps/search/local
- Create Splunk search app local config directory
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/splunk/splunk-hec-zeek.conf --output /opt/splunk/etc/apps/search/local/inputs.conf
- Download Zeek HEC input config
sed -i "s/{{ SPLUNK_ZEEK_HEC_TOKEN }}/${SPLUNK_ZEEK_HEC_TOKEN}/g" /opt/splunk/etc/apps/search/local/inputs.conf
- Set the HEC token in the config – a sketch of these configs appears at the end of this section
systemctl restart splunk
systemctl enable splunk
systemctl status splunk
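For reference, here is a rough sketch of what the two inputs.conf files downloaded above contain. The stanza names and settings are assumptions based on Splunk’s HEC configuration format; the repo’s configs are authoritative:

```
# /opt/splunk/etc/apps/splunk_httpinput/local/inputs.conf (sketch)
[http]
disabled = 0
enableSSL = 0
port = 8088

# /opt/splunk/etc/apps/search/local/inputs.conf (sketch)
[http://zeek-hec]
disabled = 0
token = <SPLUNK_ZEEK_HEC_TOKEN>
index = zeek
```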
Install/Setup NGINX
apt install nginx -y
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/nginx/nginx.conf --output /etc/nginx/nginx.conf
- Download main NGINX config
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/nginx/splunk.conf --output /etc/nginx/conf.d/splunk.conf
- Download NGINX to reverse proxy Splunk
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/tls/tls.conf --output /etc/ssl/splunk_tls.conf
- Download TLS config
- Go to the “Generate OpenSSL private key and public cert” section at the top for more details
openssl req -x509 -new -nodes -keyout /etc/ssl/private/nginx.key -out /etc/ssl/certs/nginx.crt -config /etc/ssl/splunk_tls.conf
- Use OpenSSL config to generate public certificate and private key
systemctl restart nginx
systemctl enable nginx
systemctl status nginx
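The downloaded splunk.conf reverse-proxies TLS traffic to the Splunk web UI, which was bound to localhost earlier. A stripped-down, hypothetical equivalent (the server name is a placeholder):

```
# Hypothetical sketch of /etc/nginx/conf.d/splunk.conf
server {
    listen 443 ssl;
    server_name splunk.example.com;

    ssl_certificate     /etc/ssl/certs/nginx.crt;
    ssl_certificate_key /etc/ssl/private/nginx.key;

    location / {
        # Splunk web listens on localhost:8000 by default
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```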
Setup UFW
ufw allow OpenSSH
- Allow SSH access
ufw allow 'Nginx HTTP'
- Allow HTTP
ufw allow 'Nginx HTTPS'
- Allow HTTPS
ufw allow 5044/tcp
- Allow Logstash
ufw allow 8088/tcp
- Allow access to HEC
ufw allow 8089/tcp
- Allow Splunk API access
ufw allow 9997/tcp
- Allow Splunk agents to report logs to Splunk
ufw enable
Install/Setup Logstash
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
- Add Elastic GPG key
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
- Add Elastic repo
apt-get update && sudo apt-get install logstash -y
- Install Logstash
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/splunk/02-inputs-beat.conf --output /etc/logstash/conf.d/02-inputs-beat.conf
- Download Logstash Beats input config
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/ansible/splunk/30-output-splunk-hec.conf --output /etc/logstash/conf.d/30-output-splunk-hec.conf
- Download Logstash Splunk HEC output config
sed -i "s/{{ SPLUNK_ZEEK_HEC_TOKEN }}/${SPLUNK_ZEEK_HEC_TOKEN}/g" /etc/logstash/conf.d/30-output-splunk-hec.conf
- Set the HEC token in the config
mkdir /etc/logstash/tls
openssl req -x509 -new -nodes -keyout /etc/logstash/tls/logstash.key -out /etc/logstash/tls/logstash.crt -config /etc/ssl/splunk_tls.conf
- Generate self-signed public certificate and private key
chown logstash:logstash /etc/logstash/tls/logstash.key /etc/logstash/tls/logstash.crt
chmod 644 /etc/logstash/tls/logstash.crt
chmod 600 /etc/logstash/tls/logstash.key
- Set the correct permissions for TLS private key and public certificate
systemctl restart logstash
systemctl enable logstash
tail -f /var/log/logstash/logstash-plain.log
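For reference, the Splunk HEC output config downloaded above forwards Beats events to the HEC endpoint. Below is a minimal sketch using Logstash’s http output plugin; the exact payload shaping (HEC event envelope, index, and sourcetype fields) in the repo’s config may differ:

```
# Hypothetical sketch of /etc/logstash/conf.d/30-output-splunk-hec.conf
output {
  http {
    http_method => "post"
    url         => "http://127.0.0.1:8088/services/collector/event"
    headers     => { "Authorization" => "Splunk <SPLUNK_ZEEK_HEC_TOKEN>" }
    format      => "json"
  }
}
```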
Log into the Splunk WebGUI
Open a browser to https://<IP addr of Splunk>:<port> – the Docker setup uses port 8443, the Ansible setup uses port 443
- Enter admin for username
- Enter <Splunk admin password> for password
- Select “Sign in”
Create Splunk index
- Settings > Data > Indexes
- Select “New index” in the top right
- Enter zeek into name
- Select “Save”
- Repeat to create the osquery and sysmon indexes
Install Splunk TA apps for sourcetypes
- Log into Splunk
- Select “+ Find More Apps”
- Search for “Bro”
- Install Splunk Add-on for Zeek aka Bro
- Repeat for the osquery add-on (TA-osquery) and the Splunk Add-on for Microsoft Sysmon
Setup HEC logging input
All the setups above either automatically created the HEC input or provide the configs to create it. This section provides manual instructions for creating a HEC token. Note that with the Docker setups, a user will need to manually select the index and sourcetype for the data.
- Log into Splunk
- Settings > Data > Data inputs > HTTP Event Collector
- Select “Global settings”
- Make sure “Enabled” is selected for All tokens
- UNcheck “Enable SSL”
- Select “Save”
- Select “New token” in the top right
- Select source
  - Enter zeek-hec into name
  - Select “Next”
- Input settings
  - Select source type based on data source
  - Select “Search & Reporting (Searching)” for app context
  - Select an index
  - Select “Review”
- Review
  - Select “Submit”
Enable Splunk universal forwarding logging
- Settings > Data > Forwarding and receiving
- Select “+ Add new” for Configure receiving under Receive data
- Enter 9997 for port
- Select “Save”
Ingest Sysmon logs with Splunk agent on Windows 10
Install/setup of Sysmon on Windows 10
- Log into the Windows VM
- Open Powershell as Administrator
cd $ENV:TMP
$ProgressPreference = 'SilentlyContinue'
- Disable download status bar
Invoke-WebRequest -Uri https://download.sysinternals.com/files/Sysmon.zip -OutFile Sysmon.zip
- Download Sysmon
Expand-Archive .\Sysmon.zip -DestinationPath .
- Unzip Sysmon
Invoke-WebRequest -Uri https://raw.githubusercontent.com/olafhartong/sysmon-modular/master/sysmonconfig.xml -OutFile sysmonconfig.xml
- Download Sysmon config
.\Sysmon.exe -accepteula -i .\sysmonconfig.xml
- Install Sysmon driver and load Sysmon config
- Enter eventvwr into PowerShell
- Expand Applications and Services Logs > Microsoft > Windows > Sysmon > Operational to verify Sysmon events are being logged
Install/setup of Splunk universal forwarder on Windows 10
cd $ENV:TEMP
Invoke-Webrequest -Uri 'https://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86_64&platform=windows&version=8.1.2&product=universalforwarder&filename=splunkforwarder-8.1.2-545206cc9f70-x64-release.msi&wget=true' -OutFile splunkforwarder-8.1.2-x64-release.msi -MaximumRedirection 3
- Download the Splunk universal forwarder
msiexec.exe /i splunkforwarder-8.1.2-x64-release.msi RECEIVING_INDEXER="<Splunk FQDN or IP addr>:9997" AGREETOLICENSE=Yes /quiet
- Quietly install Splunk universal forwarder
Invoke-WebRequest -Uri https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/splunk_agent/sysmon_windows_input.conf -OutFile 'C:\Program Files\SplunkUniversalForwarder\etc\system\local\inputs.conf'
- Download Splunk universal forwarder logging input config
& 'C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe' restart
- Restart the Splunk universal forwarder
- Splunk search: index="sysmon" eventtype="ms-sysmon-network"
Ingest Osquery logs with Splunk agent on Ubuntu 20.04
Install/setup of Osquery on Ubuntu 20.04
- Log into VM with SSH
sudo su
export OSQUERY_KEY=1484120AC4E9F8A1A577AEEE97A80C63C9D8B80B
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys $OSQUERY_KEY
- Add Osquery GPG key for repo
sudo add-apt-repository 'deb [arch=amd64] https://pkg.osquery.io/deb deb main'
- Add Osquery repo
sudo apt-get update -y && sudo apt-get install osquery -y
- Install Osquery
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/osquery/linux-osquery.conf --output /etc/osquery/osquery.conf
- Download Osquery config
- The config is a copy of the Palantir osquery config
sed -i 's#/etc/osquery/packs/ossec-rootkit.conf#/usr/share/osquery/packs/ossec-rootkit.conf#g' /etc/osquery/osquery.conf
- Point the ossec-rootkit pack at the path used by the Ubuntu osquery package
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/osquery/linux-osquery.flags --output /etc/osquery/osquery.flags
- Download Osquery flags
systemctl restart osqueryd
systemctl enable osqueryd
systemctl status osqueryd
Install/setup of Splunk universal forwarder on Ubuntu 20.04
wget -O /tmp/splunkforwarder-8.1.2-linux-2.6-amd64.deb 'https://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86_64&platform=linux&version=8.1.2&product=universalforwarder&filename=splunkforwarder-8.1.2-545206cc9f70-linux-2.6-amd64.deb&wget=true'
- Download Splunk universal forwarder
dpkg -i /tmp/splunkforwarder-8.1.2-linux-2.6-amd64.deb
/opt/splunkforwarder/bin/splunk enable boot-start --accept-license --answer-yes --no-prompt
- Enable Splunk universal forwarder to start on boot and accept license agreement
/opt/splunkforwarder/bin/splunk add forward-server <Splunk FQDN or IP addr>:9997
- Setup where to forward logs
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/splunk_agent/osquery_linux_input.conf > /opt/splunkforwarder/etc/system/local/inputs.conf
- Download Splunk universal forwarder logging input config
/opt/splunkforwarder/bin/splunk restart
- Restart Splunk universal forwarder
- Splunk search: index="osquery" sourcetype="osquery:results"
Ingest Zeek logs with Filebeat on Ubuntu 20.04
Install/setup of Zeek on Ubuntu 20.04
echo 'deb http://download.opensuse.org/repositories/security:/zeek/xUbuntu_20.04/ /' | sudo tee /etc/apt/sources.list.d/security:zeek.list
- Add Zeek repo
curl -fsSL https://download.opensuse.org/repositories/security:zeek/xUbuntu_20.04/Release.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/security_zeek.gpg > /dev/null
- Add Zeek repo GPG key
sudo apt update -y && sudo apt install zeek jq -y
- Install Zeek
/opt/zeek/bin/zkg install corelight/json-streaming-logs
echo -e "\n# Load ZKG packages\n@load packages" >> /opt/zeek/share/zeek/site/local.zeek
- Add entry to load ZKG packages
echo -e "\n# Disable TSV logging\nconst JSONStreaming::disable_default_logs = T;" >> /opt/zeek/share/zeek/site/local.zeek
- Disable TSV logging
echo -e "\n# JSON logging - time before rotating a file\nconst JSONStreaming::rotation_interval = 60mins;" >> /opt/zeek/share/zeek/site/local.zeek
- Set file log rotation to once an hour
sed -i "s/interface=.*/interface=$(ip route list | grep default | awk '{print $5}')/g" /opt/zeek/etc/node.cfg
- Configure Zeek to monitor the default interface
- If your host has multiple interfaces, set it to the interface you want Zeek to monitor
/opt/zeek/bin/zeekctl install
/opt/zeek/bin/zeekctl start
- Start Zeek
/opt/zeek/bin/zeekctl status
head -n 1 /opt/zeek/logs/current/json_streaming_conn.log | jq
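Each line of json_streaming_conn.log is a standalone JSON document. A trimmed record looks roughly like the following (all values here are made up); note the id.orig_h field, which the Splunk CIM add-on later maps to src_ip:

```
{
  "_path": "conn",
  "ts": "2021-12-19T21:35:12.345Z",
  "uid": "CbV7Qr2Zf5NkxBd7Xl",
  "id.orig_h": "192.168.1.20",
  "id.orig_p": 51344,
  "id.resp_h": "93.184.216.34",
  "id.resp_p": 443,
  "proto": "tcp",
  "service": "tls"
}
```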
Manual install/setup of Filebeat on Ubuntu 20.04
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
- Add Elastic GPG key
sudo apt-get install apt-transport-https -y
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
- Add Elastic repo
sudo apt-get update -y && sudo apt-get install filebeat -y
- Install Filebeat
mkdir /etc/filebeat/inputs.d
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/filebeat/linux-filebeat.yml --output /etc/filebeat/filebeat.yml
- Download Filebeat config
curl https://raw.githubusercontent.com/CptOfEvilMinions/ChooseYourSIEMAdventure/main/conf/filebeat/zeek-input.yml --output /etc/filebeat/inputs.d/zeek-input.yml
- Download Zeek log input config
sed -i "s/{{ logstash_ip_addr }}/<Logstash FQDN or IP addr>/g" /etc/filebeat/filebeat.yml
- Set the remote Logstash server
sed -i "s/{{ logstash_port }}/<Logstash port>/g" /etc/filebeat/filebeat.yml
- Set the remote Logstash BEATs port
systemctl restart filebeat
systemctl enable filebeat
- Splunk search: index="zeek" sourcetype="bro:conn:json"
Jumping off point
- The basics of indexer cluster architecture
- Create a custom Splunk search command with Python
- Overview of the Splunk Common Information Model
- Demo of Bro (Zeek) log queries in Splunk
- Search GoSplunk’s Query Repository
- Youtube – Splunk & Machine Learning
- Hurricane labs – Splunk tutorials
- Splunk conference videos
Lessons learned
I am currently reading the book “Cracking the Coding Interview” and it is a great book. One interesting part is its matrix for describing projects you have worked on, which contains the following sections: challenges, mistakes/failures, enjoyed, leadership, conflicts, and what you would do differently. I am going to try to use this model at the end of my blog posts to summarize and reflect on the things I learned. I don’t blog to post things that I know; I blog to learn new things and to share the knowledge from my security research.
New skills/knowledge
- Install Zeek from pre-built package
- Install and setup Splunk universal forwarder
- Learned about the Splunk CIM
- Learned how to install Splunk apps for common logging platforms for CIM compliance
Challenges
- Using non-Splunk tools adds an additional layer of challenges to get things working
What I would have done better
- I would have liked to explore setting up a Splunk cluster but decided it was best to keep it simple for this blog post
References
- Enable a receiver
- HTTP Event Collector Examples
- How to forward events from logstash to Splunk
- Demo of Bro (Zeek) log queries in Splunk
- Configure the HTTP Event Collector to receive metrics data for SAI
- Set up and use HTTP Event Collector with configuration files
- Running Splunkweb on localhost
- Downloading & Installing Osquery
- Github – splunk/TA-osquery
- Splunk Add-On for Microsoft Sysmon
- How to install Splunk Forwarder on Ubuntu
- Install a Windows universal forwarder from the command line
- zeek from security:zeek project
- Installing Zeek
- Overview of the Splunk Common Information Model
- Let’s Chat About Splunk and ELK
- Elasticsearch Best Practice Architecture
- Splunk vs. Elastic Stack (ELK): Making Sense of Machine Log Files