Back in the day it was nice: apt-get update && apt-get upgrade and you were done.
But today every tool/service has its own way of being installed and updated:
- docker:latest
- docker:v1.2.3
- custom script
- git checkout v1.2.3
- same but with custom migration commands afterwards
- custom commands change from release to release
- expects the update to run as a specific user
- update nginx config
- updates its own default config, and the service depends on those config changes
- expects newer versions of other tools
- etc.
I self-host around 20 services like PieFed, Mastodon, PeerTube, Paperless-ngx, Immich, open-webui, Grafana, etc., and all of them have dependencies that need to be updated too.
And nowadays you can’t really keep running an older version, especially when it’s internet-facing.
So anyway, what are your strategies for staying sane while keeping all your self-hosted services up to date?


I have a shell script that handles all the quirks. I run it every few weeks. It does a btrfs snapshot first so I can roll back in case something goes wrong, and afterwards it updates my Docker and Podman containers to their latest tags.
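The snapshot-then-update flow can be sketched roughly like this; the subvolume path, snapshot directory, and compose-stack layout under /opt/stacks are assumptions, not my actual setup:

```shell
#!/usr/bin/env bash
# Sketch: take a rollback snapshot, then refresh containers.
# SUBVOL, SNAP_DIR and /opt/stacks are placeholder paths (assumptions).
set -euo pipefail
shopt -s nullglob

SNAP_DIR=/.snapshots
SUBVOL=/srv                           # btrfs subvolume holding service data
STAMP=$(date +%Y%m%d-%H%M%S)

# 1. Read-only snapshot so a bad update can be rolled back
if command -v btrfs >/dev/null 2>&1; then
    btrfs subvolume snapshot -r "$SUBVOL" "$SNAP_DIR/pre-update-$STAMP"
fi

# 2. Pull whatever tag each compose file pins (latest or vX.Y.Z) and recreate
if command -v docker >/dev/null 2>&1; then
    for stack in /opt/stacks/*/; do
        (cd "$stack" && docker compose pull && docker compose up -d)
    done
fi

# 3. Podman containers labeled io.containers.autoupdate refresh in one go
if command -v podman >/dev/null 2>&1; then
    podman auto-update
fi
```

The command -v guards just make the sketch degrade gracefully on a box that only runs one of the two container engines.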
For services that aren't containerized I have some automation to fetch the latest version from the internet (for example some Home Assistant add-ons that are just JS files).
For the updates that are harder to script (or just not worth it because they're very infrequent) I have a script that compares the running version with what's published on the project's website and warns me that a manual update is due.
Since most of the projects I host have a GitHub page, it's relatively simple to write reusable code for this.
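A minimal version of that check might look like the following; the needs_update comparison leans on sort -V instead of hand-rolled semver parsing, and the GitHub repo name and the way the running version is recorded are assumptions you'd fill in per service:

```shell
#!/usr/bin/env bash
# Sketch: warn when a running version lags the latest GitHub release.
set -u

# Succeeds (exit 0) when $2 is strictly newer than $1
needs_update() {
    running=$1 latest=$2
    [ "$running" != "$latest" ] &&
        [ "$(printf '%s\n%s\n' "$running" "$latest" | sort -V | tail -n1)" = "$latest" ]
}

# Newest release tag of a GitHub repo, e.g.: latest_tag immich-app/immich
latest_tag() {
    curl -fsSL "https://api.github.com/repos/$1/releases/latest" |
        grep -m1 '"tag_name"' | cut -d'"' -f4
}
```

With a small file mapping each service to its repo and currently running version, a loop over latest_tag plus needs_update is enough to print one warning line per outdated service.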
In general I don’t trust automatic updates: issues are rare, but when they happen they can be annoying to fix. So I prefer to update by hand whenever I have a few free minutes and I know I have direct access to the server in case the connection drops.