Category Archives: DevOps

A hero’s journey to learning and conquering the Kubernetes tech beast

As the title suggests, learning Kubernetes (k8s) was your typical hero's journey for me. I set out on an adventure to conquer all there is to know about the Kubernetes beast. Alas, my attempts were thwarted by the frustratingly vast chasm of fire that is the Kubernetes lair. Like any good hero, I sharpened my wit and sought a wealth of knowledge from an infinite library until I happened upon a powerful wizard whose guidance led me to finally overcome the beast's lair.

This blog post, like the millions of blog posts that already exist on the internet, will provide instructions on how to set up a MicroK8s cluster; however, I will also cover the additions that make Kubernetes genuinely usable in a homelab environment. Think of this foundational services guide as "Act 2" of learning Kubernetes: it should give you a strong enough footing that you feel empowered to go explore more applications on your own. The additions I found foundational in my homelab are: using your Synology device as persistent storage, creating an ingress controller/load balancer with Traefik + cert-manager, deploying a first web app (CyberChef), and monitoring the cluster with Prometheus + Grafana. I will also recite heroic tales of overcoming the tribulations I encountered along the way. My hope is that this blog post leaves beginners with a bootstrapped cluster and the capabilities they need to continue their own heroic journey.
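
To give a flavor of how little it takes to get the cluster itself standing, here is a minimal bootstrap sketch. It assumes a snap-based Ubuntu host, add-on names can vary slightly by MicroK8s version, and the MetalLB address range is a placeholder you would swap for a free range on your own LAN.

```bash
# Minimal MicroK8s bootstrap sketch -- assumes Ubuntu with snap; adjust to your environment
sudo snap install microk8s --classic
microk8s status --wait-ready

# Enable DNS, Helm, and MetalLB (the address pool below is a placeholder, not a recommendation)
microk8s enable dns helm3 metallb:192.168.1.240-192.168.1.250

# Sanity check that the node and system pods are healthy
microk8s kubectl get nodes
microk8s kubectl get pods -A
```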

Continue reading

Getting started with Autopsy multi-user cluster

The purpose of this blog post is to provide multiple methods for installing/setting up an Autopsy multi-user cluster. The post's deliverables are infrastructure-as-code in the form of an Ansible playbook and a Docker Compose file, plus manual instructions for setting up a cluster. In addition, this blog post will demonstrate how to set up the Autopsy client to connect to the cluster and how to ingest disk images.
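
For context, Autopsy's multi-user mode leans on three backend services: PostgreSQL for the case database, Solr for keyword-search indexing, and ActiveMQ for messaging. As a very rough sketch only (image names, versions, and credentials below are illustrative placeholders; the playbook and compose file in the post wire everything up properly), those services look something like this:

```bash
# Rough sketch of the three Autopsy backend services -- images, versions, and
# credentials are placeholders, not the hardened setup from the post
docker run -d --name autopsy-postgres \
  -e POSTGRES_USER=autopsy -e POSTGRES_PASSWORD=changeme \
  -p 5432:5432 postgres:13

docker run -d --name autopsy-solr -p 8983:8983 solr:8

docker run -d --name autopsy-activemq -p 61616:61616 rmohr/activemq:5.15.9
```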

Continue reading

Connecting to my homelab remotely with Hashicorp Boundary v0.2.0 and Auth0

The purpose of this blog post is to provide multiple methods for installing/setting up Hashicorp Boundary. The post's deliverables are an Ansible playbook, a Docker Compose file for Swarm, and manual instructions for installing Boundary on Ubuntu 20.04. In addition, this blog post will demonstrate how to set up Auth0 OIDC authentication for single sign-on. Lastly, I will end the post by connecting to a remote machine in my homelab via SSH using the Boundary Desktop and CLI clients.
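
As a preview of the end state, the CLI workflow boils down to authenticating against the OIDC auth method and then opening a session to a target. The address, auth method ID, and target ID below are placeholders from my environment.

```bash
# Sketch of the Boundary CLI workflow -- address and IDs are placeholders
export BOUNDARY_ADDR="https://boundary.example.com:9200"

# Kicks off the Auth0 OIDC login flow in the browser
boundary authenticate oidc -auth-method-id amoidc_1234567890

# Open an SSH session to a target in the homelab
boundary connect ssh -target-id ttcp_1234567890
```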

Continue reading

Gitlab CI/CD pipeline with Vault secrets

The purpose of this blog post is to provide instructions on how to set up Gitlab and Vault so that secrets can be used during a CI/CD pipeline build. In addition, I will break down how the JWT authorization process works between Gitlab and Vault. The explanation, step-by-step instructions, and the infra-as-code provided in this post will create the foundation for future blog posts that contain a CI/CD component with Vault secrets.
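
To hint at where the JWT explanation lands: on the Vault side you enable the JWT auth method, point it at Gitlab's JWKS endpoint, and create a role whose bound claims pin which project and branch are allowed to authenticate. The hostnames, role names, policies, and claims below are placeholders for illustration.

```bash
# Vault-side sketch of Gitlab JWT auth -- hostnames, role names, and bound claims are placeholders
vault auth enable jwt

vault write auth/jwt/config \
  jwks_url="https://gitlab.example.com/-/jwks" \
  bound_issuer="gitlab.example.com"

# Only CI jobs from project 22 on the master branch may use this role
vault write auth/jwt/role/myproject-readonly \
  role_type="jwt" \
  policies="myproject-readonly" \
  token_explicit_max_ttl=60 \
  user_claim="user_email" \
  bound_claims='{"project_id":"22","ref":"master","ref_type":"branch"}'
```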

Continue reading

DevOps Tales: Install/Setup Gitlab + Gitlab runners on Docker, Windows, Linux and macOS

Are you tired of manually pushing code to production? Are you always searching through your BASH history to find the commands you used to test your code? Do you wish merging code into production followed a defined process? Well, I have the solution for you! Introducing Gitlab CI/CD pipelines! With Gitlab you can set up Gitlab runners to create a CI/CD pipeline, and a CI/CD pipeline will revolutionize your workflow for pushing code to production.

The purpose of this blog post is to provide instructions on how to set up the necessary components (Gitlab and Gitlab runners) to create a CI/CD pipeline. One of the deliverables from this blog post is Docker Compose files for Swarm and non-Swarm deployments of Gitlab. There are also manual instructions on how to set up Gitlab runners on Ubuntu 20.04, Ubuntu 20.04 with Docker, Windows 10, Windows 10 with Docker, and macOS Big Sur. In addition, a Docker Registry is set up and integrated into the CI/CD pipeline for custom Docker images. The instructions and the infra-as-code provided in this post will create the foundation for future blog posts that contain a CI/CD component.
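
Regardless of platform, registering a runner follows the same basic shape. Below is a hedged sketch for a Docker executor; the URL and registration token are placeholders, and the exact flags can vary slightly by runner version.

```bash
# Sketch of registering a Gitlab runner with the Docker executor -- URL and token are placeholders
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "REGISTRATION_TOKEN" \
  --executor "docker" \
  --docker-image "alpine:latest" \
  --description "docker-runner"
```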

Continue reading

My development server for Vault

During the COVID-19 lockdown, instead of playing video games to consume my free time, I decided to be proactive. I started taking Udemy courses; one of them was on Vault, and ever since I have been incorporating Vault into my blog posts. However, each blog post requires a unique setup, and I prefer to start from a clean slate each time. The turnover of new keys and adding a new root CA to my local cert store became extremely tedious. Below is my Vault development setup, which addresses these issues.
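
For contrast, the simplest possible development server is Vault's built-in dev mode: in-memory storage, automatically unsealed, root token printed at startup. My setup in the post builds on this idea to deal with the TLS and root CA pain, but dev mode shows the baseline. Everything below is throwaway.

```bash
# Vault's built-in dev mode -- throwaway values, never use this for real secrets
vault server -dev -dev-root-token-id="root"

# In another terminal:
export VAULT_ADDR="http://127.0.0.1:8200"
export VAULT_TOKEN="root"
vault status
vault kv put secret/demo password="hunter2"
```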

Continue reading

Getting started with Hashicorp Vault v1.6.1

The purpose of this blog post is to provide multiple methods for installing/setting up Vault. The post's deliverables are an Ansible playbook, Docker Compose files for Swarm and non-Swarm deployments, and manual instructions for installing Vault on Ubuntu 20.04. Additionally, over the past couple of months I have been learning Vault and writing posts that demonstrate different ways to incorporate it. This blog post is a condensed version of the content in those posts and a jumping-off point to them as well.
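
As a taste of what "setup" means once the binary is installed, the first-run workflow is initializing, unsealing, and enabling a secrets engine. The key share/threshold values and paths below are illustrative; the post walks through the Ansible, Docker, and manual paths in detail.

```bash
# First-run sketch after installing Vault -- key shares/threshold and paths are illustrative
vault operator init -key-shares=5 -key-threshold=3

# Unseal with 3 of the 5 unseal keys printed by init (run three times)
vault operator unseal

# Log in with the initial root token from init, then enable a KV v2 secrets engine
vault login
vault secrets enable -path=secret kv-v2
```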

Continue reading

Integrating Vault secrets into Jupyter Notebooks for Incident Response and Threat Hunting

The industry has gravitated towards using Jupyter notebooks to automate incident response and threat hunting. However, one of the biggest barriers for any application/automation is the ability to store the secrets (usernames/passwords, API keys, etc.) needed to access other services. This blog post will demonstrate how to use Vault to store secrets and how to retrieve them from Vault within Jupyter Notebooks to assist in automating your security operations.
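
Whatever client library the notebook uses, the retrieval itself boils down to an authenticated read against Vault's KV HTTP API. A hedged curl equivalent is below; the secret path and field name are placeholders for illustration.

```bash
# What a notebook's secret lookup reduces to: a KV v2 read over Vault's HTTP API.
# The secret path (secret/data/virustotal) and field name (api_key) are placeholders.
export VAULT_ADDR="https://vault.example.com:8200"
export VAULT_TOKEN="s.xxxxxxxx"   # placeholder token

curl --silent --header "X-Vault-Token: ${VAULT_TOKEN}" \
  "${VAULT_ADDR}/v1/secret/data/virustotal" | jq -r '.data.data.api_key'
```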

Continue reading

Creating a Windows 10 64-bit VM on Proxmox with Packer v1.6.3 and Vault

This blog post is going to demonstrate how to implement a new feature added to Packer in version 1.6.3: the ability to mount multiple ISOs on Proxmox VMs. Because Proxmox doesn't support virtual floppy drives, you can't supply an Autounattend.xml file the usual way to automate the installation and initial configuration of Windows. By converting the Autounattend.xml file to an ISO, we can mount both the OS installation ISO and the ISO containing the files needed to automate the install. Lastly, I will use Vault to store the sensitive values Packer requires to create this VM.
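
The "convert the answer file to an ISO" step is small enough to sketch here. I reach for genisoimage, but any ISO 9660 builder works; the paths are illustrative.

```bash
# Pack Autounattend.xml (plus any setup scripts) into a small ISO that Packer can
# attach to the VM as an additional CD-ROM -- paths and tooling are illustrative
mkdir -p answer_files
cp Autounattend.xml answer_files/

genisoimage -J -R -o autounattend.iso answer_files/
```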

Continue reading

Vault: Connecting entities, auth backends, groups, and policies OH MY

While working on my osquery-file-carve-server project I determined my application needed authentication. However, I didn't want to pigeonhole my application into a single platform/service for authentication. After some research, I decided to add support for Vault to my application because it provides the ability for users to authenticate using various methods. However, during my research, I had a hard time understanding how the various Vault components connected to create this functionality.

This blog post will provide an understanding of the Vault components used to implement this functionality. In addition, it will demonstrate the relationship between the various Vault components: authentication backends, entities, groups, and policies. The final result of combining these Vault components is a system that can authenticate a single user using different authentication services.
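
As a rough preview of how the pieces snap together on the CLI: you create an entity, alias it to each auth backend by that backend's mount accessor, and group entities so shared policies attach in one place. The names and the <...> values below are placeholders you would pull from your own Vault output.

```bash
# Sketch of wiring one human identity to multiple auth backends -- names and
# <...> values are placeholders
vault write identity/entity name="jsmith" policies="default"

# Find the mount accessor of an auth backend (userpass here), then alias the entity to it
vault auth list -format=json | jq -r '.["userpass/"].accessor'
vault write identity/entity-alias \
  name="jsmith" \
  canonical_id="<entity_id>" \
  mount_accessor="<userpass_mount_accessor>"

# Group entities so shared policies attach in one place
vault write identity/group name="engineers" \
  policies="eng-read" \
  member_entity_ids="<entity_id>"
```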

Continue reading

My Homelab Docker setup

Just like with my latest post on my logging pipeline, people want to know more about my Docker setup so they can learn from it or replicate it. This blog post is my attempt to share my Docker setup as a framework for newcomers. The hope is that the explanation of the architecture, the design decisions, the working infrastructure-as-code, and the knowledge I have accumulated over the years will be beneficial to the community.

Continue reading

Reducing your alert fatigue with AskJeevesSecBot

In incident response, there is a disconnect between a security alert being generated and a user's confirmation of that alert. For example, generating an alert every time a user runs "curl" on a production system would produce a pile of false positives that can lead to what is called "alert fatigue". But if we extend our incident response capabilities to include the user as part of the triage process, we can reduce the number of alerts. This blog post demonstrates AskJeevesSecBot, an open-source proof of concept (PoC) for integrating Slack and user responses into your security pipeline, specifically during the triage phase of the incident response process. In addition to the PoC, this blog post provides a deep dive into the architecture of the project, design decisions, and lessons learned as an evolving threat detection engineer.

Continue reading