• 0 Posts
  • 28 Comments
Joined 1 year ago
Cake day: March 8th, 2024



  • Well, let me solve that for you right away.

    You need neither of these things. Games and entertainment are not a priority if you’re in a “this current economy” type of situation.

    If you already have one, that’s the right one for the money, probably.

    Was Nintendo Life “misrepresenting the value of a Switch 2 over a Deck”? Myeeeeh, not sure. I’ll say I agree with their premise that Steam Deck fans are “seriously underestimating the Switch 2”, in somewhat petty, immature ways, as demonstrated very well here. Does the Steam Deck “obliterate the Switch 2”? Probably not, no. I’ll tell you for sure in the summer, I suppose. That said, their listicle is brand shilling as much as this post is.

    Are these two things different, with different sets of pros and cons? Yeah, for sure. It’s even a very interesting exercise to look at the weird-ass current handheld landscape, because it’s never been wider, more diverse or more overpopulated. The Switch 2 and the Deck will probably remain the two leading platforms until whatever Sony is considering materializes, but they’re far from alone, from dirt cheap Linux handhelds to ridiculously niche high end laptop-in-a-candybar Windows PCs.

    If you want to have a fun thread about that I’m game, but fanboyism from grown men is a pet peeve of mine, and even if I didn’t find it infuriating I’d find it really boring.

    For the record, between these two? Tied for price, Switch 2 will be a little more powerful and take advantage of specifically catered software from both first and third parties, has better default inputs, a better screen and support for physical games. Current Deck is flexible, hugely backwards compatible, can be upgraded to a decent OLED screen and has fewer built-in upsells.

    And as a bonus round, Windows handhelds scale up to better performance than either, have better compatibility than the Deck and some superior screen and form factor alternatives… but are typically much more expensive and most (but not all) struggle with the Windows interface and lack hardware HDR support.

    We good? Because that’s the long and short of it.




  • Ah, so my first reaction is “what actual indie developer who knows what they’re talking about excludes BG3 from AAA”?

    Turns out not this one, apparently, since the creative use of quotes seems intended to obscure that “AAA schlock” is not from the dev, it’s from the journalist rehashing a quote from an article about a quote from a podcast. Speaking of schlock.

    Anyway, I’m on the fence about the core point. I agree on principle that “dumbed down” doesn’t make things mainstream. I agree that this is a lesson the industry insists on refusing to learn, even after The Sims doubling as architecture software, WoW casual moms playing with a dozen UI mods, Fortnite core players building gothic cathedrals in five seconds and Roblox containing entire gamedev teams made of unpaid children.

    Whatever the mainstream wants, “simple” has nothing to do with it.

    Do I think BG3 means somebody should fund Pillars 3? Yeeeeah, not so sure. BG3 works because it was the literal best time to be making D&D stuff, because it had two extremely beloved brands propping it up, because it’s a sequel to two extremely well received, accessible CRPGs that both did a lot better than Pillars to begin with, because they were both focused on multiplayer and free-form systems instead of straight-up literature. Nuance matters here.

    And then somebody (a lot of somebodies) gave them two hundred million to make it, so it also looks at least as good as anything Bioware ever did during their heyday. That’s probably why BG3 has 140K players on Steam right now and Avowed has 1K and never peaked past 10K.

    There are lessons from BG3 I’d love to see the industry learn. I want them to learn the right ones, though, because if they go ahead and invest another nine digits on the wrong thing then we WILL actually have to wait another 30 years for another game like that.



  • Best you can do is read the actual text they published. It’s just eight pages.

    The actually useful bit of that article is the link to the press release. Oddly, the press release does NOT say what that article says it says.

    The article:

    Europe’s CPC (Consumer Protection Cooperation Network) confirmed that they’re calling for changes in the way games present their currency.

    The press release:

    the CPC Network is presenting today key principles to help the gaming industry comply with the EU consumer protection rules related to in-game virtual currencies. (…) The key principles and the Common Position are based on the existing general rules of EU consumer law directives that apply to digital services and digital content provided to consumers, including video games.

    I don’t know if it’s a problem with reading comprehension, the increasing deprofessionalization of games journalism or what, but the reporting on this is consistently… bad.


  • I was frustrated to see this (incorrect) framing here, and I genuinely didn’t have the energy to get past the clickbait enough to check whether the actual video reports accurately on what has been issued.

    This is not new legislation, this is not a legislative change. This is an administrative body that coordinates existing national consumer protection agencies using already existing legislation that has flagged one example of what they consider to be infringement and issued guidance on what it considers infringement in general.

    There is no legislative force to this. You could argue in court that they are just wrong, and win. No court in the EU is obligated to follow these guidelines; they are just more likely to keep you on the right side of these agencies, as far as I understand their role.

    This is nominally better than the exceedingly crappy reporting that was doing the rounds this week saying that the “EU had banned MTX”, but not by much. The recommendations are very mild and while they’re positive and will block some usual dark patterns if applied they won’t change one bit of how modern games, AAA or not, are monetized.

    I’m also less kind than the people downthread. It takes some serious hypocrisy to make a celebratory video about how the EU is finally going after dark patterns while clickbaiting so hard I can see multiple body parts prolapsing from here.


  • A quick look at US Amazon spits out that the only 24 GB card in stock is a 3090 for 1500 USD. A look at the European storefront shows 2400 EUR for a 4090. Looking at other assorted stores shows a bunch of out of stock notices.

    It’s quite competitive, I’m afraid. Things are very stupid at this point and for obvious reasons seem poised to get even dumber.


  • Yeah, for sure. That I was aware of.

    We were focusing on the Mini instead because… well, if the OP is fretting about going for a big GPU I’m assuming we’re talking user-level costs here. The Mini’s reputation comes from starting at 600 bucks for 16 gigs of fast shared RAM, which is competitive with consumer GPUs as a standalone system. I wanted to correct the record about the 24 gig starter speccing up to 64 because the 64 gig one is still in the 2K range, which is lower than the realistic market prices of 4090s and 5090s, so if my priority was running LLMs there would be some thinking to do about which option makes most sense in the 500-2K price range.

    I am much less aware of larger options and their relative cost to performance because… well, I may not hate LLMs as much as is popular around the Internet, but I’m no roaming cryptobro, either, and I assume neither is anybody else in this conversation.


  • You didn’t, I did. The starting models cap at 24, but you can spec the biggest one up to 64GB. I should have clicked through to the customization page before reporting what was available.

    That is still cheaper than a 5090, so it’s not that clear cut. I think it depends on what you’re trying to set up and how much money you’re willing to burn. Sometimes literally, the Mac will also be more power efficient than a honker of an Nvidia 90 series card.

    Honestly, all I have for recommendations is that I’d rather scale up than down. I mean, unless you also want to play kickass games at insane framerates with path tracing or something. Then go nuts with your big boy GPUs, who cares.

    But for LLM stuff strictly I’d start by repurposing what I have around, hitting a speed limit and then scaling up to maybe something with a lot of shared RAM (including a Mac Mini if you’re into those) and keep rinsing and repeating. I don’t know that I personally am in the market for AI-specific multi-thousand-dollar APUs with a hundred-plus gigs of RAM yet.


  • Thing is, you can trade off speed for quality. For coding support you can settle for Llama 3.2 or a smaller deepseek-r1 and still get most of what you need on a smaller GPU, then scale up to a bigger model that will run slower if you need something cleaner. I’ve had a small laptop with 16 GB of total memory and a 4060 mobile serving as a makeshift home server with an LLM and a few other things and… well, it’s not instant, but I can get the sort of thing you need out of it.
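
    If it helps picture it, here’s roughly what that kind of makeshift setup looks like as a minimal Python sketch against the ollama client. The model name, prompt and helper function are just illustrative assumptions, not a prescription:

    ```python
    # Minimal sketch of a "good enough" local coding helper on modest hardware.
    # Assumes ollama is installed and running, and that a small model such as
    # llama3.2 has already been pulled; model names here are just examples.
    import ollama

    def ask_coding_question(prompt: str, model: str = "llama3.2") -> str:
        # Non-streaming chat call; not instant on a small GPU, but good enough.
        response = ollama.chat(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response["message"]["content"]

    if __name__ == "__main__":
        print(ask_coding_question("Write a Python function that parses an ISO 8601 date."))
    ```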

    Sure, if I’m digging in and want something faster I can run something else in my bigger PC GPU, but a lot of the time I don’t have to.

    Like I said below, though, I’m in the process of trying to move that to an Arc A770 with 16 GB of VRAM that I had just lying around because I saw it on sale for a couple hundred bucks and I needed a temporary GPU replacement for a smaller PC. I’ve tried running LLMs on it before and it’s not… super fast, but it’ll do what you want for 14B models just fine. That’s going to be your sweet spot on home GPUs anyway, anything larger than 16GB and you’re talking 3090, 4090 or 5090, pretty much exclusively.
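
    For a rough sense of why ~14B is the ceiling on a 16 GB card, here’s the napkin math as a quick sketch. The 4-bit weight size is straightforward arithmetic; the overhead figure for KV cache and runtime is just a guess, so treat the output as ballpark:

    ```python
    # Back-of-envelope VRAM estimate: 4-bit quantized weights plus a guessed
    # ~2 GB allowance for KV cache and runtime overhead (an assumption, not
    # a measurement).
    def estimated_vram_gb(params_billion: float, bits_per_weight: float = 4.0,
                          overhead_gb: float = 2.0) -> float:
        weights_gb = params_billion * bits_per_weight / 8  # e.g. 14B * 0.5 bytes ≈ 7 GB
        return weights_gb + overhead_gb

    for size in (3, 8, 14, 32, 70):
        print(f"{size:>3}B @ 4-bit: ~{estimated_vram_gb(size):.1f} GB")
    # ~14B lands around 9 GB, comfortable on a 16 GB card; 32B and up spills over.
    ```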


  • This is… mostly right, but I have to say, macs with 16 gigs of shared memory aren’t all that; you can get many other alternatives with similar memory distributions, although not as fast.

    A bunch of vendors are starting to lean on this by providing small, weaker PCs with a BIG cache of shared RAM. That new Framework desktop with an AMD APU specs up to 128 GB of shared memory, while the mac minis everybody is hyping up for this cap at 24 GB instead.

    I’d strongly recommend starting with a mid-sized GPU on a desktop PC. Intel ships the A770 with 16GB of RAM and the B580 with 12 and they’re both dirt cheap. You can still get a 3060 with 12 GB for similar prices, too. I’m not sure how they benchmark relative to each other on LLM tasks, but I’m sure one can look it up. Cheap as the entry level mac mini is, all of those are cheaper if you already have a PC up and running, and the total amount of dedicated RAM you get is very comparable.
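
    If anyone does want to look it up themselves, the crude way to compare cards is to run the same model and prompt on each one and work out tokens per second. A sketch, assuming a local ollama instance and that its non-streamed responses carry the usual eval_count and eval_duration (nanoseconds) fields:

    ```python
    # Crude tokens/sec comparison: generate with the same model and prompt on
    # each card and divide generated tokens by generation time.
    import ollama

    def tokens_per_second(model: str, prompt: str) -> float:
        response = ollama.generate(model=model, prompt=prompt)
        tokens = response["eval_count"]
        seconds = response["eval_duration"] / 1e9  # eval_duration is in nanoseconds
        return tokens / seconds

    if __name__ == "__main__":
        tps = tokens_per_second("llama3.2", "Explain what a B-tree is in two paragraphs.")
        print(f"~{tps:.1f} tokens/sec on this GPU")
    ```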


  • But they fussed about Call of Duty.

    If I’m annoyed about anything it’s that. Gamers so often use these ostensible consumer protection or political affinity issues as a cudgel for what is ultimately a branding preference. This results in excusing some crappy stuff from people they semi-irrationally like (loot boxes on Steam games are fine! we don’t talk about GenAI on InZOI!) while giving extreme amounts of crap to companies they semi-irrationally dislike, even for relatively positive things they do.

    I’d mind less if the difference was based on size or artistic quality, but dude, InZOI is from Krafton. I don’t know that the PUBG guys are the plucky indies I want to stretch my moral stances to support.


  • Those goalposts are moving at supersonic speeds, man.

    “AI driven NPCs” are just chatbots, and generative AI is generative AI. I thought the issue with GenAI was supposed to be that the data used for training was of dubious legitimacy (which it certainly still is for these models) and that they were cutting real artists, writers and developers out of the workforce (which these by definition are).

    Nobody seemed to be particularly fine with Stable Diffusion when that came out, and that could be run locally. I guess we’ve found the level of convenience at which activism will just deal with it.

    Which, again, is fine. I don’t have a massive hate boner against GenAI, even if I do think it needs specific regulation for both training and usage. But there is ZERO meaningful difference between InZOI using AI generation for textures, dialogue and props and Call of Duty using it to make gun skins. Those are the same picture.


  • Yeah, there were a few attempts in the 00s (including several NSFW ones, for some reason). It’s definitely tough to get right. I see the on-paper appeal of InZOI, in that it seems to be going for the same “we’ll do what Maxis won’t” appeal the original Cities: Skylines had. It’s just that with The Sims you risk finding out there was a good reason for what they weren’t doing, I guess.

    I don’t know what’s going on at Maxis. I don’t know that rolling a whole modern platform, games-as-service approach into Sims 4 retroactively is the right call, regardless of whether it’s due to a lack of capacity to do it or a strategic choice. I am pretty sure that a lot of the stuff in InZOI isn’t doing it for me, though. Those two ideas can be held at once.


  • I see how some of the weirdness in InZOI is in “so bad it’s hilarious” territory.

    I am not an anti-GenAI zealot, myself. I actually think a few of the ways they use it there are perfectly valid and make sense to support user generation… but are almost certainly a moderation nightmare that is about to go extremely off the rails. Others are more powerful than Sims on paper but the UI seems bonkers and borderline unusable.

    I can see the idea of wanting another Sims successor, or both a successor and a competitor, but it’s hard to see the treatment as anything but hypocritical at this point. If anything, I think it shows that there is a reason why there is such a gap between The Sims’ success and how many viable competitors have surfaced. Turns out The Sims is REALLY hard to get right. Even Sim City, which feels more complex at a glance, was much easier to clone or improve.