Automating My Plex Server


Posted: Oct 19, 2025

I have been running my Plex server since roughly 2015, and I have always been too lazy to automate the workflow.  It is often joked that developers spend countless hours automating workflows that take only a few minutes to complete by hand.  Ironically, this is probably one of the few cases where I would have benefited from setting up the automation much sooner.  I am mildly ashamed to say that I have wasted countless hours over the past ten years by not automating requests and downloads.

Introduction

Initially, I installed Plex on a Chinese Windows 2-in-1 tablet in order to share my downloaded content for online movie nights. If I remember correctly, it was running Windows 8.  Even from the start, the storage was mounted remotely from Google Drive with rclone.  In those days rclone could not mount on Windows, because WinFSP had not been developed yet.  This meant that we ran a Linux virtual machine with VMware just so that we could take advantage of the unlimited Google Drive storage provided by G Suite.
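For anyone unfamiliar with that trick, a remote mount like this boils down to a single rclone command once the remote is configured. This is just an illustrative sketch; the remote name `gdrive` and the mount point are placeholders, not my actual setup:

```shell
# One-time: walk through configuring a Google Drive remote interactively
rclone config

# Mount the remote (named "gdrive" here) at a local path.
# --vfs-cache-mode writes buffers writes locally first, so apps can
# treat the mount like a normal disk.
rclone mount gdrive: /mnt/media --vfs-cache-mode writes
```

On Windows today, WinFSP makes the same command possible with a drive letter (e.g. `rclone mount gdrive: X:`), which is exactly what wasn't available back then.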

Eventually, the unlimited storage came to an end, but I was able to pool my local drives with DrivePool for Windows.  Luckily, the desktop tower case I had at the time (the Fractal Define R2) could fit something like fourteen 3.5" hard drives as well as two 2.5" drives.  With that realization, I added a PCIe Mini-SAS card so that I could attach an additional 8 hard drives.  As Google wised up and closed its unlimited storage plan, I migrated my content locally.  Unfortunately, I didn't have enough local storage and ended up losing more than half my library at the time.  I knew the day would come, but it was still a sad day.

That brings me to my current setup, which is still running on Windows 10 and is still just a ghetto JBOD (just a bunch of drives).  What makes it great, however, is the recently added power of automation through Docker.

Motivation

So what finally got me off my lazy ass and down this rabbit hole of Sonarr, Radarr, and Overseerr?  It was a combination of several factors.  Earlier this year, I discovered Coolify and finally figured out how Nginx works.  I recently discovered Portainer, and Docker Desktop for Windows finally fixed several known memory leaks.  Unironically, discovering Portainer changed the way I run apps and kicked off my self-hosting journey. Before Portainer, I had a lot of concerns about data loss and recovery when running Docker containers.

Putting it all together

Finally, after figuring out a consistent way to run Docker containers without the risk of losing data, I was ready to start the setup.  In the past, I had tried setting up my homelab on a separate machine with Proxmox, but found it incredibly hard to change my IP.  Since we change ISPs frequently, I figured I would skip Proxmox this time.  What I do miss, however, is how easy it was to remote desktop into a Proxmox environment out of the box.  So naturally, you would think I decided to host everything on a simple distro like Ubuntu Server or plain Ubuntu.  Instead, I went the lazy route and hosted Sonarr and Radarr on my existing Plex server.  In hindsight, it would have made more sense to install the *arr applications on a separate Linux machine and map the existing network drive, but don't fix what isn't broken.

So I finally set up my instance following the TechHutTv guides.  The guides are a fantastic resource and very flexible. In his particular example, he tunnels all his traffic through WireGuard, whereas I do not.  Overall, there were a few hiccups, but once I got it set up, it worked pretty flawlessly. (Hopefully I am not jinxing myself.) I was able to set up:

  • Sonarr - for shows
  • Radarr - for movies
  • Prowlarr - for managing torrent indexers
  • Overseerr - for making requests
  • FlareSolverr - for bypassing Cloudflare challenges
  • qBittorrent - as the torrent client
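For anyone curious what that stack looks like on disk, here is a minimal, hypothetical docker-compose sketch of two of the services, assuming the linuxserver.io images; the host paths and IDs are placeholders, not my actual layout:

```yaml
services:
  sonarr:
    image: lscr.io/linuxserver/sonarr:latest
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
    volumes:
      - ./config/sonarr:/config
      - ./media:/data            # one shared root for library and downloads
    ports:
      - "8989:8989"
    restart: unless-stopped

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
      - WEBUI_PORT=8080
    volumes:
      - ./config/qbittorrent:/config
      - ./media:/data            # identical mapping so paths match across containers
    ports:
      - "8080:8080"
    restart: unless-stopped
```

The other services (Prowlarr, Overseerr, FlareSolverr) follow the same pattern; the important detail is that every container that handles files mounts the same host folder at the same container path.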

The largest issue I ran into was pathing, as I was originally trying to use the Windows qBittorrent client instead of the Docker image.  With that out of the way, all that was left was some tweaking of the quality settings and preferred codecs.  For the most part, I can sit back and let Sonarr and Radarr handle incoming requests with little fuss or intervention.  Of course, I still occasionally check other sources in the event a show or movie is not available, but for now, I am quite pleased that the setup is working after a weekend's worth of effort.
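To illustrate why the Windows client caused trouble (with made-up paths): the native qBittorrent reports Windows save paths, which mean nothing to a Sonarr running inside a Linux container, whereas two containers sharing one volume always agree:

```text
# Windows qBittorrent -> Sonarr in Docker (broken):
#   qBittorrent reports:  D:\Downloads\Show.S01E01.mkv
#   Sonarr looks for:     a Linux path that does not exist in its container
#
# Both in Docker, sharing one volume mapping (works):
#   host ./media/downloads  ->  /data/downloads in BOTH containers
#   qBittorrent reports:  /data/downloads/Show.S01E01.mkv
#   Sonarr sees:          /data/downloads/Show.S01E01.mkv
```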