CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How do you document your Homelab? · English · 1 · 21 hours ago
i second this
i haven’t gotten around to looking into something like terraform/ansible yet, and currently rely on a series of setup.sh scripts and docker-compose files
i have a single master setup.sh at the root of my homelab which basically just outlines which scripts i need to run, and in what order, to get things back up and running from zero
i only use my README.md for non-scriptable stuff (e.g. external services i rely on, such as cloudflare or vpn providers)
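a rough sketch of what such a master setup.sh might look like — the sub-script names here are illustrative placeholders, not the actual ones:

```shell
#!/usr/bin/env bash
# Hypothetical master setup.sh: runs each restore step in order and reports
# any script that has gone missing, instead of failing silently.
set -u

run_steps() {
    for step in "$@"; do
        if [ -f "$step" ]; then
            echo "==> running $step"
            bash "$step" || { echo "==> FAILED: $step"; return 1; }
        else
            echo "==> missing $step"
        fi
    done
}

# Order matters: firewall before anything listens, docker before compose stacks.
run_steps setup-firewall.sh setup-docker.sh setup-compose-stacks.sh
```

the nice side effect is the script doubles as the documentation of the rebuild order.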
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How do you document your Homelab? · English · 1 · 21 hours ago
i mean, charitably you could say that your code / architecture should be self-documenting, versus having to rely on READMEs / wikis
in effect, if you change the code you are by definition also changing the documentation, since the file names, function names, and hierarchy are clear and unambiguous
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How do you document your Homelab? · English · 1 · 21 hours ago
while security might be compromised if an attacker found your documentation, it could equally be compromised by having zero documentation
the easier it is for you to get things back up and running in the event of data loss / a corrupted hard drive / a new machine / etc, the less likely you are to forget any crucial steps (e.g. setting up iptables or ufw)
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How do you document your Homelab? · English · 2 · 21 hours ago
this is basically what i ended up doing too - glad to see my approach verified somewhat ha ha!
but yeah, in general whenever i make a change or add a new service, i always try to add those steps to some sort of setup.sh / docker-compose file
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Best option for hosting ebooks and audiobooks? · English · 1 · 21 hours ago
supports podcasts too? what tool are you using to download those? and does ABS handle the sorting/metadata the same way it does for audiobooks?
maybe a silly question, but does a tailscale tunnel operate in a similar fashion to a cloudflare tunnel? as in, you can remotely access your internal service over https?
i have nginx proxy manager all set up as well, but haven’t worked out the SSL part yet, so all my internal docker services are still on http
out of interest, how did you set up https with npm?
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How do I use HTTPS on a private LAN without self-signed certs? · English · 1 · 1 month ago
why would you realistically need HTTPS on your local network?
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How and where should I keep backups of system configurations? · English · 2 · 1 month ago
real question though is: do you back up your backup server?
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How to secure Jellyfin hosted over the internet? · English · 0 · 2 months ago
Why would Cloudflare warn me against a service they themselves offer? The email authentication is all managed by them
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Risks of self-hosting a public-facing forum? · English · 0 · 2 months ago
i definitely didn’t have to enter my card details, could be my region though
also, what kind of forum are you running that needs web sockets?
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Optimal Plex Settings for Privacy-Conscious Users · English · 0 · 2 months ago
stopppp
it’s already dead
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • How to secure Jellyfin hosted over the internet? · English · 0 · 2 months ago
So i’ve been trying to set up this exact thing for the past few weeks - tried all manner of Nginx/Tailscale/VPS/Traefik/Wireguard/Authelia combos, but to no avail
I was lost in the maze
However, I realised that it was literally as simple as setting up a Cloudflare Tunnel on the particular local network I wanted exposed (in my case, the Docker network that runs the Jellyfin container) and then linking that domain/ip:port within Cloudflare’s Zero Trust dashboard
Cloudflare then proxies all requests to your public domain/route to your locally hosted service, all without exposing your private IP, all without exposing any ports on your router, and everything is encrypted with HTTPS by default
And you can even set up what looks like pretty robust authentication (2FA, limited to only certain emails, etc) for your tunnel
Not sure what your use case is, but as mine is shared with only me and my partner, this worked like a charm
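for reference, the tunnel side boils down to a small cloudflared config file — this is a hedged sketch assuming a Jellyfin container on its default port; the tunnel id, hostname, and container name are placeholders:

```yaml
# Hypothetical cloudflared config.yml
tunnel: <tunnel-uuid>
credentials-file: /etc/cloudflared/<tunnel-uuid>.json
ingress:
  - hostname: jellyfin.example.com
    service: http://jellyfin:8096   # Jellyfin's default HTTP port on the docker network
  - service: http_status:404        # catch-all rule cloudflared requires
```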
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Risks of self-hosting a public-facing forum? · English · 0 · 2 months ago
Have you ever tried Cloudflare Tunnels? I think this would solve most of those issues
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Risks of self-hosting a public-facing forum? · English · 1 · 2 months ago
just cloudflare tunnel it - i set one up the other day and it works super well, providing external access to a locally hosted service without having to set up your own SSL certs or worry about exposing private ips or ports
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Someone help me understand the sonarr to jellyfin workflow · English · 0 · 2 months ago
literally was going through the exact same thoughts as you a couple of weeks ago, tried so many different configurations but the one i found that worked was actually kinda simple
basically the way i did it was to run a gluetun docker container, pass in environment variables saying i wanted it to use WireGuard, and then pass in my Proton VPN wireguard key (you’ll need a subscription for this)
then once that gluetun container is up and running, you literally just add `network_mode: "service:gluetun"` to any other containers that you want to use this VPN
you can even test it’s working by sending a curl request to an ip-checking site from within the containers connected to gluetun
then also try shutting down that gluetun vpn, and see if your other services (e.g. qbittorrent) still work
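roughly, a hedged compose sketch of that setup, assuming ProtonVPN over WireGuard — the key is a placeholder and the service names are just what i’d call them:

```yaml
# Hypothetical docker-compose.yml fragment
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=protonvpn
      - VPN_TYPE=wireguard
      - WIREGUARD_PRIVATE_KEY=<your-wireguard-key>
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent
    network_mode: "service:gluetun"   # all of qbittorrent's traffic goes through the tunnel
```

a side effect of `network_mode: "service:gluetun"` is that if gluetun goes down, the attached containers lose network entirely — which is exactly the killswitch behaviour you want for torrenting.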
CapitalNumbers@lemm.ee to Selfhosted@lemmy.world • Someone help me understand the sonarr to jellyfin workflow · English · 0 · 2 months ago
I’ve literally just set this all up and it’s working now after some tinkering, so here’s what I found out. Assuming you have correctly configured the sonarr/qbittorrent api keys and credentials:
When you make a TV show request in Sonarr, it will automatically add the torrent to your download client (e.g. qbittorrent)
qbittorrent will then download the file to wherever you specify (e.g. /torrents/completed)
periodically, Sonarr will scan that /torrents/completed folder, and if it finds the tagged TV show, it will either copy or hard link that video file to your specified media folder (e.g. /media/tv-shows)
JellyFin will do the same, periodically scanning your media folders to see if there are any updates
EDIT: also, if you are using docker containers, make sure that Sonarr’s native /downloads folder is pointed at the same external folder qBittorrent is downloading files to
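a sketch of what that looks like in compose volumes — the host paths here are made up; the point is that both containers map the same host folder to the same container path, otherwise Sonarr’s import step can’t find the completed files:

```yaml
# Hypothetical docker-compose.yml fragment
services:
  qbittorrent:
    volumes:
      - /srv/torrents:/downloads   # qBittorrent saves completed files here
  sonarr:
    volumes:
      - /srv/torrents:/downloads   # same host path, same container path
      - /srv/media/tv:/tv          # Sonarr copies or hard-links imports here
```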
Here’s my approach to documentation. It’s about habits as much as it’s about actually writing anything down:
Never set up anything important via naked terminal commands that you will forget you ran
Always wrap important commands in some kind of “setup-xyz.sh” script and then run that script to see if your install worked.
If you need to make a change to your service, update your script so it can be re-run without breaking anything
Get into the habit of this and you are documenting as you go
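A minimal sketch of what “re-runnable without breaking anything” means in practice — the path and config line are made up for illustration:

```shell
#!/usr/bin/env bash
# Hypothetical idempotent setup step: every action checks current state first,
# so running the script twice gives the same result as running it once.
set -eu

setup_config() {
    local dir="$1"
    mkdir -p "$dir"                                # no error if it already exists
    if ! grep -qs '^port=8080$' "$dir/settings.conf"; then
        echo 'port=8080' >> "$dir/settings.conf"   # append only if absent
    fi
}

demo_dir=$(mktemp -d)
setup_config "$demo_dir"
setup_config "$demo_dir"   # second run changes nothing
```

A blind `echo ... >> config` would duplicate the line on every rebuild; the guard is what makes the script safe to treat as living documentation.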