I’m starting to have a sneaking suspicion that putting 24GB of VRAM on a card isn’t happening because they don’t want people running AI models locally. The moment you can expect the modern gamer’s computer to have that kind of local computing power is the moment they stop getting to slurp up all of your data.
Why wouldn’t they want that when they’re the ones selling you the hardware?
Do you think they’d make more money selling every individual an AI GPU or selling 1 GPU to OpenAI to serve thousands of users?
It’s because the GTX 10XX gen had plenty of VRAM (yes, the 1070 had 8GB of VRAM in 2016) and was a super good generation that lasted many years. Clearly they want you to change GPUs more often, and that’s why they limit the VRAM.
Honestly, I think it’s because of DLSS. If you could get a $300 card that does 4K DLSS Performance well, why would you need to buy an xx70 (Ti) or xx80 card?
Lossless Scaling (on Steam) has also shown HUGE promise from a 2-GPU standpoint. I’ve seen some impressive results from people piping their Nvidia cards into an Intel GPU (integrated or discrete) and using that dedicated GPU for the upscaling.
I bought a used RTX 3070 with 8GB of VRAM two and a half years ago, and I occasionally get pissed off at my VRAM limits. I don’t know who buys a GPU with 8GB these days…
I bought one new, as the planned 3090 and potential backup 3080 were far too expensive. Worst card I’ve owned.
The 8GB was enough for about 3 months, while I was still enamored with my backlog of older games.
I’ve now switched teams and got a 24GB GPU. My first GPU was also team red and lasted 11 years, until it was incapable of running the games I wanted to play. I hope my current GPU will also last that long.
Uninformed, desperate users, unfortunately 😔