

I also have a WD Black 2TB that must be nearly a decade old now and it’s still going with zero issues. They were definitely doing something right.
Though truth be told, it’s a data drive, not the OS drive, so there’s less r/w going on.
The cheapest option is to buy an Android box with an Amlogic processor and install CoreELEC on it. You can do it for 20 bucks, and then you have a Kodi-oriented Linux distro on your TV.
Though I prefer to just connect my laptop to the TV with a small remote keyboard and have full computer functionality. I’m looking to replace the laptop with a mini PC when the laptop finally breaks down. I would use a normal DE, nothing specially suited for smart TV usage, but you get used to it pretty quickly.
Microsoft already lost the home OS battle when people switched their main devices to smartphones running iOS or Android.
I’m against it for several reasons. It runs unauthorized heavy-duty code on your end. It’s not JS needed to make the site functional, it’s heavy computation run unprompted. If they added a simple “click to run challenge” button it would at least be more polite and less “malware-like”.
On some old devices the challenge lasts over 30 seconds; I can type a captcha in less time than that.
It puts several sites that people (like the article author) tend to browse directly from a terminal behind the requirement to use a full browser.
It’s a delusion. As the article author shows, solving the PoW challenge is not that much of an added cost. The reduction in scraping would be the same with any other novel method; crawlers are just not prepared for it yet. Any prepared crawler would have no issues whatsoever. People are seeing results because of obscurity, not because it really works as advertised. And in fact I believe some sites are starting to get crawled aggressively despite Anubis, as some crawlers are already catching up with this new Anubis trend.
Take into account that the challenge needs to be light enough that a legitimate user can enter the website within a few seconds while running the challenge in a browser engine (very inefficient). A crawler interested in your site could easily put together a solver that mines the PoW with CUDA on a GPU, which would be hundreds if not thousands of times more efficient. So the balance of difficulty (still browsable for users but costly to crawl) is not feasible.
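To make the imbalance concrete, here is a minimal sketch of this kind of proof-of-work loop, assuming a generic “find a nonce so the SHA-256 hash starts with N zero hex digits” challenge (the exact Anubis challenge format differs in details, this is just to illustrate the scaling):

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) starts with
    `difficulty` zero hex digits. A browser grinds through this loop in
    slow JS/WASM, while a dedicated scraper can run the same search
    natively or on a GPU, orders of magnitude faster and cheaper."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# Each extra zero digit multiplies the expected work by 16, so any difficulty
# that stays tolerable for a phone browser is trivial for a GPU farm.
print(solve_pow("example-challenge", 4))
```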
It’s not universally applicable. Imagine if the whole internet were behind PoW challenges. It would be like constant Bitcoin mining, a total waste of resources.
The company behind Anubis seems shadier to me each day. They feed on anti-AI paranoia, they didn’t even answer the article author’s valid criticisms when he emailed them, and they use obvious PR language aimed at convincing and pleasing certain demographics in order to place their product. They are full of slogans but lack substance. I just don’t trust them.
So? You have free will to use another captcha.
What?
You don’t need to use Google’s or Cloudflare’s captcha to have a captcha.
There are open-source implementations of reCaptcha-style challenges. And you can always run a classic captcha based on image recognition.
I installed Debian long ago. I was on 12 and just updated to 13 last week.
Trixie uses a new sources format, though the old format is still compatible, so that was not the issue.
The issue was that the sources were targeting “bookworm” instead of “stable” for updates, so when I ran “apt update” it didn’t find any updates. I just had to change it to target stable, and I took the chance to switch to the new sources format too. But it was hard to catch.
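For reference, the problematic line in /etc/apt/sources.list would have looked something like this (mirror and components are just an example):

```
deb http://deb.debian.org/debian bookworm main
```

and after switching to “stable” and the new deb822 format, the stanza in /etc/apt/sources.list.d/debian.sources looks roughly like:

```
Types: deb
URIs: http://deb.debian.org/debian
Suites: stable stable-updates
Components: main
Signed-By: /usr/share/keyrings/debian-archive-keyring.gpg
```

If I remember right, trixie’s apt can even do the format conversion for you with “apt modernize-sources”.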
The autoremove issue was mostly an issue because my root partition is small; it fills quickly. It was the default size in the Debian installer, I believe. The problem was that I tried to update with it at 100% full and it was a total mess. It took a long time to fix. Nowadays I always run “df” before updating.
They don’t have to do anything except let an unknown program max out their CPU without authorization.
Imagine if Google implemented that. Billions of computers constantly running PoW, what could go wrong?
First, I said reCaptcha types, meaning captchas in the style of reCaptcha. Those could be implemented outside a Google environment. Second, I never said those types were better for privacy. I just said Anubis is bad for privacy. Traditional captchas that work without JavaScript would be the privacy-friendly way.
Third, it’s not a false proposition. Disabling JavaScript can protect your privacy a great deal. A lot of tracking is done through JavaScript.
Last, that’s just the Anubis PR slogan, not the truth. As I said, DDoS mitigation could be implemented in other ways that are more polite and/or environmentally friendly.
Are you astroturfing for Anubis? Because I really cannot understand why something as simple as a landing page with a “run PoW challenge” button would be that bad.
Anubis is worse for privacy, as you have to have JavaScript enabled, and worse for the environment, as the PoW cryptographic challenges are just a waste.
Also, reCaptcha-type captchas are not really that disruptive most of the time.
As I said, the polite thing would just be giving users options. Anubis PoW running automatically just for entering a website is one of the rudest pieces of software I’ve seen lately. They should be more polite and just give the user an option: maybe the user could choose to solve a captcha or run the Anubis PoW, or even just have Anubis run only after a button the user clicks.
I don’t think it’s good practice to run that type of software just for entering a website. If that tendency were to grow, browsers would need to adapt and straight up block that behavior, e.g. only allowing access to some client resources after a user action.
Captcha.
It does everything Anubis does. If a scraper wants to solve it automatically it’s compute-intensive, since they have to run AI inference, but for the user it’s just a little time-consuming.
With captchas you don’t run aggressive, unauthorized software on anyone’s computer.
Solutions did exist. But Anubis is “trendy”, and they are masters of PR within certain circles of people who always want the latest, trendiest thing.
But a good old captcha would achieve the same result as Anubis in a more sustainable way.
Or at least give the user the option of running the challenge or declining and leaving the page, and make clear to the user that their hardware is going to run an intensive task. It feels very aggressive to have a webpage run what is basically a cryptominer on your computer without authorization. And for me, having a catgirl as a mascot does not excuse the rudeness of it.
I used Linux Mint as my default OS for years, and it’s said to be the “easiest distro”. Still, there was a ton of maintenance; every week one thing or another didn’t work properly.
Even on a Debian server I own, which is completely barebones, without even a graphical interface: last week I had to manually fix the sources file because the Trixie update messed it up, and a couple of months ago I had a very bad issue with the root partition filling up with old kernel images because I didn’t run autoremove frequently enough.
So you are not alone. It does feel like owning a boat.
Why the hell don’t you limit the CPU usage of that service?
For any service that could hog resources so badly that it can block the entire system, the normal thing to do is to limit its maximum resource usage. This is trivial to do with containers. I do it constantly for leaky software.
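For example, with plain Docker something like this caps the service (image name and limits here are placeholders, tune them for the actual service):

```
docker run -d --name my-service --cpus="1.5" --memory="2g" some/image:latest
```

If it’s not containerized, systemd can do the same with CPUQuota= and MemoryMax= in the unit file.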
I still think captchas are a better solution.
In order to bypass them they have to run AI inference, which also comes with compute costs. But for legitimate users you don’t run unauthorized intensive tasks on their hardware.
It’s working because it’s not widely used. It’s sort of a “pirate seagull” theory: as long as only a few people use it, it works, because scrapers don’t really expect Anubis, so they don’t implement systems to bypass it.
If it were to become more common, it would be really easy to implement systems that defeat its purpose.
Right now sites are OK because scrapers just send HTTPS requests and expect a full response. Someone who wants to bypass the Anubis protection would need to take into account that they will receive a cryptographic challenge and have to solve it.
The thing is that cryptographic challenges can be heavily optimized. They are designed to run in a very inefficient environment, the browser. But if someone took the challenge and solved it in a better environment, using CUDA or something like that, it would take a fraction of the energy, defeating the purpose of “being so costly that it’s not worth scraping”.
At this point it’s only a matter of time before we start seeing scrapers like that, especially if more and more sites start using Anubis.
Sometimes I think: imagine if a company like Google or Facebook implemented something like Anubis, and suddenly most people’s browsers started constantly solving CPU-intensive cryptographic challenges. People would be outraged by the wasted energy. But somehow when a “cool small company” does it, it’s fine.
I do not think the Anubis system is sustainable for everyone to use; it’s just too wasteful energy-wise.
I use an AIO approach with Jellyfin, but I’m thinking about changing it.
I like how Jellyfin handles music, but the search feature is unusable with so many files.
Each time I search for a movie it searches through thousands of music files and music artists, and Jellyfin’s search feature is bad as it is. I’m waiting for them to fix it, but it doesn’t seem likely.
So maybe taking music out would make that feature usable again.
I’ve always been wary of Kagi, as they have not been very clear about how much of their search is based on their own index and how much is metasearch over various other engines.
Games are still not perfect. Multiple screens can be really finicky if they have different resolutions and refresh rates.
I just set up a normal computer with the specs I wanted, installed Debian and Docker/Podman, and I’m golden.