

Would something like Anubis or Iocaine prevent what you’re worried about?
I haven’t used either, but from what I understand they’re both lightweight programs for stopping bot scraping. I think Anubis analyzes incoming web traffic and blocks bots when it detects them, while Iocaine does something similar but also redirects those bots into a maze of garbage data, both to poison the AI models being trained on it and to waste resources on the scraping companies’ end.
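I haven’t looked at how Iocaine actually implements it, but the garbage-maze idea is simple enough to sketch. Here’s a toy Python server (the `/maze/` path, port, and word list are all made up for illustration, not taken from Iocaine or Anubis) where every maze page is random nonsense plus links to more maze pages, so a scraper that follows them just keeps crawling and ingesting junk:

```python
# Toy illustration of the "garbage maze" idea (not Iocaine itself):
# every URL under /maze/ returns nonsense text plus links to more
# maze pages, so a crawler that keeps following links never escapes.
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

# Fake vocabulary used to build nonsense sentences and link text.
WORDS = ["".join(random.choices(string.ascii_lowercase, k=random.randint(3, 9)))
         for _ in range(500)]

def garbage_page() -> bytes:
    """Build one throwaway HTML page of random words and random maze links."""
    paragraph = " ".join(random.choices(WORDS, k=200))
    links = "".join(
        f'<a href="/maze/{random.randint(0, 10**9)}">{random.choice(WORDS)}</a> '
        for _ in range(20)
    )
    return f"<html><body><p>{paragraph}</p>{links}</body></html>".encode()

class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/maze/"):
            body = garbage_page()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), MazeHandler).serve_forever()
```

Point a crawler at `http://127.0.0.1:8080/maze/0` and it will happily follow the fake links forever; the real projects are obviously much smarter about deciding which traffic gets sent down the hole in the first place.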
Obviously what others have said about firewalls, VPNs, and antivirus still applies; maybe also a rootkit hunter and Linux Malware Detect? I’m still new to this though, so you probably know more about all that than I do. Sorry if I’m stating the obvious.
Not sure if this is overkill, but Network Security Toolkit might have some helpful tools as well?
I’ve said the same thing about books, especially since self-publishing and print-on-demand saturated the market with garbage.
I say garbage and not slop because this was before AI even became a thing. I can’t imagine how bad it is now, but fortunately I’ve broken my habit of hoarding books so it’s not really something I notice anymore.