

Careful! This is very dangerous, you should instead do
sudo chown -R user:user /*
Where “user” is your username, and then do
chmod -R 770 /*
This will make sure that only your user has all the access!
(Don’t do this)
Yes, but it was a huge corp that literally had its own Linux community within the corp.
Interesting, thanks!
🤷‍♂️
I have heard of situations where companies refuse VoIP numbers for authentication or critical services, and I can't risk my number getting flagged, as it is sadly a critical part of your online identity these days (fucking hate that)
I need it to be my number and to not be flagged as a bot, so commercial VoIP is a no-go
I have wifi, I don’t need data
I tried looking for that; all I found were pricey enterprise-level stuff, small ones that only support 2G (a dying protocol), and ones that rely on a cloud service instead of being self-hosted. Do you know of anything else?
If you have 3-phase power you could reasonably do this. It's not very common, but some people have it, in which case running about 50 RX 9070s plus a strong AC should be possible, I think.
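Rough back-of-envelope math on that claim. All the figures here are assumptions for illustration (roughly 220 W board power per RX 9070, and a European-style residential 3-phase hookup at 3 × 230 V × 32 A), not verified specs:

```python
# Back-of-envelope power budget for ~50 GPUs on a 3-phase supply.
# Figures are assumptions: ~220 W per RX 9070, 3 x 230 V x 32 A supply.

gpus = 50
watts_per_gpu = 220                      # assumed typical board power
gpu_draw = gpus * watts_per_gpu          # total GPU draw in watts

phases, volts, amps = 3, 230, 32
supply_capacity = phases * volts * amps  # crude max supply in watts

headroom = supply_capacity - gpu_draw    # left for CPUs, fans, and the AC
print(gpu_draw, supply_capacity, headroom)
```

So the GPUs alone would pull around 11 kW, which a beefier 3-phase connection could cover with headroom for the rest of the rig and the AC; on a larger breaker (e.g. 3 × 63 A) it gets comfortable.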
An open source IPTV client (TV channels over the internet)
None, I like to type
RAG is basically like telling an LLM “look here for more info before you answer” so it can check out local documents to give an answer that is more relevant to you.
You just search "open web ui rag" and find plenty of explanations and tutorials
Have you tried RAG? I believe LLMs are actually pretty good at searching and compiling content that way.
So in theory you could have it connect to all of your local documents and use it for quick questions. Or maybe connect it to your Signal/WhatsApp/SMS chat history to ask questions about past conversations
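The "look here before you answer" idea can be sketched in a few lines. This is a toy: real setups like Open WebUI use embeddings and a vector store rather than this naive word-overlap score, and all the names and example documents here are made up for illustration:

```python
# Toy RAG sketch: retrieve the most relevant local document by word
# overlap, then stuff it into the prompt before asking the model.

def score(query: str, doc: str) -> int:
    """Count how many query words also appear in the document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str]) -> str:
    """Pick the document that best matches the query."""
    return max(docs, key=lambda doc: score(query, doc))

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user's question with the retrieved context."""
    context = retrieve(query, docs)
    return f"Use this context to answer:\n{context}\n\nQuestion: {query}"

docs = [
    "Backup job runs nightly at 02:00 and syncs to the NAS.",
    "The router admin password was rotated in March.",
]
print(build_prompt("when does the backup run", docs))
```

The augmented prompt then goes to the LLM as usual; the model never needs to have seen your documents during training.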
Yeah, personally I just looked for second-hand high-VRAM GPUs and waited. I got 2 Titan Xps (12 GB VRAM each) for only $180 apiece.
I know there is all of that AI hate, which I'm all for. But running models locally does not benefit the AI companies. If anything, this is the way to make something actually good out of that hot mess.
You could use an LLM with an MCP server for the local filesystem and hope it can do it for you
It is probably actually easier to create on Linux, as it is FOSS, and there are also good projects like eBPF that could simplify it and make it more secure.
What is this weird ad post? F u
About as much as I trust other drivers on the road.
As in I give it the benefit of the doubt but if something seems off I take precautions while monitoring and if it seems dangerous I do my best to avoid it.
In reality it means that I rarely check it but if anything seems off I remove it and if I have the time and energy I further check the actual code.
My general approach is minimalism, so I don’t use that many unknown/small projects to begin with.
Simple answer: Yes!
Not so simple: Yes, but Nvidia hates Linux and their proprietary drivers can cause issues. Generally (especially on stable distros) everything is stable and fine.