

Instead of posting a video of aggregated news, why not just post the source links?
Thanks for clarifying. That sounds like a genuine reason to use a synchronizing program like Nextcloud, to share files between devices frequently.
I don’t know much about Syncthing, but I hear a lot of people talking about it; perhaps someone else can shed some light on it. Having used Nextcloud for about a decade, though, I consider it to belong in the hard-to-set-up, high-maintenance tier. I’ve had my moments when an upgrade failed and I resorted to nuking the instance and setting it up anew.
I should also share that I’m currently running a dead “distro”, TrueNAS CORE (based on FreeBSD), which was abandoned by the company. As a result, my Nextcloud is stuck at version 28 and I don’t have the energy to do a manual upgrade.
If you have made up your mind to set up your own Nextcloud instance, my recommendation is to buy a genuine industrial-grade motherboard, put some ECC RAM in it, and use an OS that’s meant for servers (no Ubuntu, Arch, Fedora shit). You should also set up RAID or use ZFS to mirror your hard disks and prevent bit rot. And I definitely do not recommend saving your valuable data on random general-purpose hard disks, or even “like new” secondhand ones. There are hard disks made specifically for NAS use out there.
Or, you know, Nextcloud sells prebuilt Nextcloud hardware.
And do ask for more opinions on !selfhosted@lemmy.world.
How do you plan to transfer your photos to the backup storage? In the picture I can see a camera, and I assume it uses an SD card. If I were you, I would:
rsync
Some storage towers even come with an Ethernet port and a web interface; that’s practically a personal “cloud”.
Nextcloud is resource-heavy, slow, hard to set up, and hard to back up and restore. This is coming from someone who has been using it since it was ownCloud.
The pictures OP posted suggest the distro is Mint. The last time I installed it, I remember that double-clicking an exe file brought up a dialogue asking if I wanted to run it through WINE.
And I’ve heard Figma is a heavy adopter of WebAssembly, which can run faster when it’s not inside a wrapper like Electron. Since it is WebAssembly, one can imagine the so-called “desktop version” is the same WebAssembly with a wrapper around it.
If all of that is true, OP just needs to find a way to load the fonts into the web version.
A piece of software always runs locally. It’s just that, in some cases, software that needs to communicate with a server fails to deliver the function you’d usually expect when it’s offline.
Please do not confuse the one with the other.
And perhaps you can start by spelling out which of the services you use heavily rely on the server side? General questions attract general answers, and IMHO you are better off just searching the internet.
1/10 Do not recommend
Want to learn? Buy a current computer (secondhand to save money) with a blazing fast CPU, shitloads of RAM, and any AMD graphics card. Running into trouble is no fun for beginners: you’ll quickly get depressed and lose interest.
For the learning part, follow any distro’s official installation guide and do it step by step. Learn which part of the system does what, how to set it up, and how to debug it.
And stick to an Ethernet connection until you get comfortable. (Shitty) Wi-Fi chips more often than not have driver issues.
For the old laptop, sell it for parts if you’re not feeling nostalgic.
For the last time, buy a new computer, please.
Back in the day, when embedded devices were running Linux kernel 2.6, the kernel was gzipped and saved to SPI flash, then extracted to RAM and run from there.
Does that sound immutable enough to you?
That design decision wasn’t made for the sake of an immutable system; it was just that flash chips were expensive. Immutability was an accidental achievement.
Actually, we developers dreamed every day of being able to modify the operating system directly, ad hoc, without going through the agonising compile-flash-boot process just to debug a config file.
You see, my point is, when a system is in good hands, it just does not break. End of story.
Maybe next time, before you press Enter after pacman -Syyu (not saying your distro in particular is bad, Arch pals, sorry), think about the risk and the recovery plan. If you are just an end user expecting 100% uptime and rarely contributing (at least reporting bugs), consider switching to a more stable distro (I hear Debian is good), and ask yourself whether you really want an immutable distro, or just a super stable system.
You could have booted the old kernel from GRUB.
I think “atomic” means “a bunch of actions grouped together as one action”, so the system can’t end up in a state where some required actions are missing and it becomes unusable. But that doesn’t, in and of itself, make a system unbreakable: if your system starts out in a malfunctioning state, it still takes a series of actions to fix it, atomic or not.
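A toy illustration of “grouped together as one action”: build the new state off to the side, then switch over with a single rename, so anything reading `current` sees either the old tree or the new one, never a half-updated mix. (This is just the classic symlink-swap trick, not any particular distro’s mechanism.)

```shell
# Prepare two complete "system" trees in a scratch directory.
work=$(mktemp -d) && cd "$work"
mkdir v1 v2
echo "old system" > v1/release
echo "new system" > v2/release

ln -sfn v1 current            # running on the old state
ln -sfn v2 current.tmp        # stage the new state under a temp name
mv -T current.tmp current     # the atomic part: one rename(2) call
cat current/release           # new system
```

If the machine dies before the `mv`, `current` still points at the intact old tree, which is the whole appeal.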
Most Linux distributions do start in a functioning state after installation.
I don’t recall doing any regular updates
You needed to buy a modem to get online
If you stayed offline, you didn’t need upgrades to protect against viruses or hacking. That was the norm in the good old days.
Well, with root login enabled, the SSH server at least needs to verify the key, no? That wastes CPU power, albeit a tiny amount.
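If that bothers you, the usual knob is in sshd_config (these are standard OpenSSH directives; restart sshd after editing):

```
# /etc/ssh/sshd_config
PermitRootLogin no                  # refuse root logins outright, or:
#PermitRootLogin prohibit-password  # allow root, but only with a key
```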
Do you just want to see the text content of an HTML file? - a text editor
Do you want most, if not all, HTML tags to be rendered as pretty graphical shapes?
Do you want the text to have proper fonts?
Styles? You need something to parse CSS files.
What about dynamically generated content like ten smiley faces? You need a JavaScript engine.
Do you also want to see iframes? Then it needs to be capable of sending XHR requests too.
What if it references a piece of WebAssembly?
It’s way more complicated than you anticipated.
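The gap between the first tier and everything after it is enormous. A crude sed substitution covers the “just the text content” tier, and nothing beyond it (it breaks on scripts, comments, and attributes containing `>`, which is exactly why the later tiers need a real parser):

```shell
# Strip anything that looks like a tag, then squeeze the leftover spaces.
html='<html><body><h1>Hi</h1><p>ten smiley faces</p></body></html>'
echo "$html" | sed -e 's/<[^>]*>/ /g' -e 's/  */ /g'
```

That’s tier one done in two regexes; everything from CSS onward is where the engineering actually lives.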
What’s wrong with http://makemkv.com/? I’ve used it for years for lossless DVD/BD backups with no issues.
What you described as a weakness is actually a strength of an open source system. If you compile a binary for a certain system, say Debian 10, and distribute that binary to someone who is also running Debian 10, it is going to work flawlessly, and without overhead, because the target system can supply the dependencies on its own.
The inability to run a binary built for a different system, say Alpine, is no worse than not being able to run a Windows 10 binary on Windows 98. Alpine to Debian is on the same level as 10 to 98: they are practically different systems that just happen to march under the same flag.
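You can see the dependency-supplying half of this on your own machine, assuming a typical glibc distro with `ldd` installed: a dynamically linked binary names the libraries the *target* system must provide, and Alpine’s musl libc simply isn’t the thing named:

```shell
# List the shared libraries /bin/sh expects the target system to supply.
ldd /bin/sh
```

On a glibc system you’ll see entries like libc.so.6; on Alpine the equivalent binary would instead point at musl, which is why the two worlds don’t mix.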
Imagine a contributor to the project. He would have fixed the bug for free and given the work to the public project. But right before he submits the change, he sees an ad from a big tech company: “Hiring. Whoever can fix this bug gets the job and a sweet bonus.” He hesitates, then does the work for the company instead.
Now he is an employee of the company, and he can’t submit the same bug fix to the open source project because it is company property. The company’s product is bug-free, and the open source counterpart remains buggy.
Dude, this is like asking “Which car manufacturer ships new cars with the mirrors folded?” Every driver ought to know it’s only a matter of pressing a button to fold them. Disqualifying all the good manufacturers because they don’t fold the mirrors before shipping sounds stupid.
Same with this topic: it’s only a matter of running one command to create the user. Your options include writing the instructions down on a piece of paper before giving the computer away, or closing the little gap between post-installation and user setup yourself.