• Phoenixz@lemmy.ca · 34 points · 6 days ago

    So the editor asked an AI to come up with an image for the title “Gamers desert Intel in droves”, and we got a half-baked pic of a CPU in the desert.

    Am I close?

        • PieMePlenty@lemmy.world · 3 points · 7 days ago

          Honestly, not a big deal if you build PCs to last 6-7 years, since you will be targeting a new RAM generation every time.

          • da_cow (she/her)@feddit.org · 5 points · 6 days ago

            If your CPU becomes the limiting factor at some point, you can simply upgrade to a CPU a few generations newer without having to swap out your motherboard. You can’t really do that with Intel (AFAIK they switch platforms every two CPU generations, so depending on your CPU you may not be able to upgrade at all). That can happen with AMD too, but not as frequently.

      • boonhet@sopuli.xyz · 1 point · 6 days ago

        I mean, the i7s had SMT. You had to pay extra for it, whereas AMD started giving it to you on every SKU except a few low-end ones.

        • jnod4@lemmy.ca · 1 point · 6 days ago

          Is it true that all of them had SMT, but Intel just locked it away on lower-tier processors, and some people managed to activate it despite Intel’s efforts?
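
          For anyone curious whether SMT is actually active on their own machine, here’s a minimal sketch, assuming Linux; the smt_enabled helper is hypothetical, not part of any library:

          ```python
          # Ask the kernel directly via sysfs, falling back to comparing
          # logical CPUs against unique physical cores from /proc/cpuinfo.
          import os

          def smt_enabled() -> bool:
              # Modern kernels expose the SMT state directly.
              try:
                  with open("/sys/devices/system/cpu/smt/active") as f:
                      return f.read().strip() == "1"
              except FileNotFoundError:
                  pass
              # Fallback: more logical CPUs than unique (physical id, core id)
              # pairs implies SMT/Hyper-Threading is enabled.
              cores, block = set(), {}
              with open("/proc/cpuinfo") as f:
                  for line in f:
                      if ":" in line:
                          key, _, value = line.partition(":")
                          block[key.strip()] = value.strip()
                      else:  # a blank line ends one logical CPU's block
                          if "core id" in block:
                              cores.add((block.get("physical id", "0"), block["core id"]))
                          block = {}
              if "core id" in block:  # trailing block without a blank line
                  cores.add((block.get("physical id", "0"), block["core id"]))
              return bool(cores) and (os.cpu_count() or 0) > len(cores)

          print("SMT active:", smt_enabled())
          ```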

  • Lfrith@lemmy.ca · 11 points · 6 days ago

    So happy I chose to go with an AM4 board years ago. I was able to go from a Zen+ CPU to an X3D CPU.

    I remember people saying back then that people usually don’t upgrade their CPUs, so it’s not much of a selling point. But people didn’t upgrade because they couldn’t, thanks to the constant socket changes on the Intel side.

    My fps numbers were very happy after the CPU upgrade, and I didn’t have to get a new board and a new set of RAM.

  • somethingold@lemmy.zip · 23 points · 7 days ago

    Just upgraded from an i7-6600K to a Ryzen 7 7800X3D. Obviously a big upgrade no matter whether I went AMD or Intel, but I’m loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I’ve always had positive feelings towards AMD.

    • NιƙƙιDιɱҽʂ@lemmy.world · 8 points · 6 days ago

      AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II X6, though? Whew, that thing kicked ass.

    • YiddishMcSquidish@lemmy.today · 4 points · 7 days ago

      I played through Mass Effect 3 when it was new on a discount AMD laptop with an iGPU. Granted, it was definitely not on max settings, but it wasn’t with everything turned all the way down either.

  • imetators@lemmy.dbzer0.com · 8 points · 6 days ago

    Intel’s last couple of processor generations were a failure. AMD, on the other hand, has been consistent. Look at all these tiny AMD APUs that can run Cyberpunk 2077 on a 35 W computer that fits in the palm of your hand. Valve is about to drop a nuclear bomb on Nvidia, Intel, and microslop with the Gabecube.

  • SleeplessCityLights@programming.dev · 10 points · 6 days ago

    I have to lower my 12th-gen CPU’s multiplier to stop constant crashing when playing UE games, because everything is overclocked at the factory so they could keep up with AMD’s performance. Fuck Intel.

  • commander@lemmy.world · 10 points · edited · 7 days ago

    I bought into AM5 in its first year, with Zen 4, and I’m pretty confident Zen 7 will still be on AM5; there’s little chance DDR6 will be priced well by the end of the decade. I’m confident I’ll be on AM5 for 10+ years, which is way better than the Intel desktop I had for 10 years, because this time I’ll actually have a great upgrade path for my motherboard.

    AM4 is still relevant, and that socket is getting to almost 10 years old now; it’ll still be a great platform for years to come. Really, if you bought early in the life of the first-gen chips on AM4/AM5, you’re looking at a 15-year platform. Amazing.

    • Flipper@feddit.org · 2 points · 7 days ago

      For this generation we changed nothing. Except for the motherboard. K, thanks, bye.

        • M0oP0o@mander.xyz · 2 points · 6 days ago

          Gotta love all the talk of there being no hardware plateau when we’re talking about 10-year-old platforms being desirable over the current-gen stuff.

          I am going to be using AM4 until there is a reason to upgrade, and honestly? I think I might be on AM4 for another 10 years with how things are going.

  • BoxOfFeet@lemmy.world · 3 points · 6 days ago

    The last Intel I bought new was the Pentium 4 630: 3.0 GHz, with Hyper-Threading. That thing was a fucking space heater. And I loved it. But everything new since then has been AMD.

  • bufalo1973@piefed.social · +2/-2 · 6 days ago

    One can only dream about people fleeing x86-64 and going to ARM or, even better, RISC-V.

    But no, it’s just putting a new collar on the dog; the dog stays the same.

    • Agent_Karyo@piefed.world (OP) · 1 point · edited · 5 days ago

      Why, though? X Elite lags x86 on battery life, performance, and compatibility (and you can’t really run Linux on X Elite).

      I’m not a fan of Intel, AMD, or Nvidia, but what’s the point of moving to ARM just for the sake of moving?

      Unlike most, I’ve actually been running ARM on a home server for almost a decade. For that use case it makes sense, because it’s cheap and well supported.

      • bufalo1973@piefed.social · 1 point · 5 days ago

        It would be better to switch to RISC-V, because it has no patent problems and anyone can build a RISC-V CPU, not just two companies.

        • Agent_Karyo@piefed.world (OP) · 1 point · 5 days ago

          I would be happy to, but it’s currently not an option for desktop/laptop.

          It would be great for an SBC, where the OS and apps are open source and performance is less of an issue.

          ARM has all the same drawbacks as x86, and it’s not a deus ex machina that magically delivers high performance at low power consumption.

          • bufalo1973@piefed.social · 1 point · 5 days ago

            Imagine Europe pushing RISC-V and sharing upgrades with China¹. The flagship chips would reach ARM or even x86-64 performance within a few years.

            ¹ China is already using RISC-V as much as it can.

            • Agent_Karyo@piefed.world (OP) · 2 points · 5 days ago

              I would support that, but it would require European unity and a strategic decision to make a permanent break with the US.

  • 🔰Hurling⚜️Durling🔱@lemmy.world · +6/-6 · 6 days ago

    One thing that might have something to do with people leaving Intel is their relationship with Israel. Not trying to make this political, but it’s something I’ve seen some folks mention before.

    • Echo Dot@feddit.uk · 6 points · 6 days ago

      I’ve always thought it’s a super weird place for them to have a fab, just in general. It’s never been the most politically stable part of the world, and surely you don’t want your several-billion-dollar infrastructure getting blown up, so why put it somewhere that’s more likely?

      • 🔰Hurling⚜️Durling🔱@lemmy.world · 1 point · 5 days ago

        Could be. Like I said, I’m not making it political. My current setup runs a 12th-gen i9 and it works fine. I’m just repeating what I’ve seen others say across Lemmy, Reddit, etc.

  • RamRabbit@lemmy.world · 253 points · 7 days ago

    Yep. Intel sat on their asses for a decade pushing quad-cores you had to pay extra to even overclock.

    Then AMD implemented chiplets and came out with affordable 6-, 8-, 12-, and 16-core desktop processors with unlocked multipliers, SMT built into almost every model, and strong performance, all while not sucking down power like Intel’s chips still do.

    Intel cashed in their lead by not investing in themselves, instead pushing the same tired crap year after year onto consumers.

      • kieron115@startrek.website · 13 points · edited · 7 days ago

        I just read the other day that at least one motherboard manufacturer is bringing back AM4 boards, since DDR4 is getting cheaper than DDR5 even with the “this isn’t even manufactured anymore” price markup. That’s only possible because of how much long-term support AMD gave that socket.

      • billwashere@lemmy.world · 3 points · 6 days ago

        Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year, and the number of CPU options was overwhelming. Are that many different CPUs really necessary?

        • real_squids@sopuli.xyz · 5 points · edited · 6 days ago

          Tbf, AMD is also guilty of that, specifically in the laptop/mobile segment. And the whole AI naming thing is just dumb, although there aren’t that many of those.

      • nokama@lemmy.world · 45 points · 7 days ago

        And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th-gen CPU was borked and had to be kept underclocked.

          • nokama@lemmy.world · 3 points · edited · 5 days ago

            It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.

            This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.

            When I originally got it, I did notice it was getting insanely high scores in benchmarks. Then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Mine started to fail, I think, about a year after I got it.

        • bufalo1973@piefed.social · 1 point · 6 days ago

          In the 486 era (the ’90s) there was an unofficial story about how Intel rated its CPUs: instead of starting slow and accelerating until failure, they started as fast as possible and slowed down until it stopped failing.

        • Captain Aggravated@sh.itjust.works · 37 points · 7 days ago

          I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it’s AM6. What came after the Intel LGA1151? It wasn’t LGA1152.

          • 1Fuji2Taka3Nasubi@piefed.zip · 6 points · 7 days ago

            AMD tried the Intel thing too, though, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn’t put it past them to try it again on AM5.

            • Captain Aggravated@sh.itjust.works · 4 points · 7 days ago

              Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it’s the other way around: upgrading the chip in an old motherboard.

              • 1Fuji2Taka3Nasubi@piefed.zip · 1 point · 6 days ago

                It can happen if the old motherboard fails, which is more likely than the CPU failing.

                There was also talk of not providing firmware updates for old chipsets to support new-gen CPUs, which is relevant to the cases you mentioned.

          • Junkers_Klunker@feddit.dk · 16 points · 7 days ago

            Yea, for the customer it really doesn’t matter how many pins a certain socket has, only whether it’s compatible or not.

      • UnspecificGravity@piefed.social · 17 points · 7 days ago

        As a person who generally buys either mid-tier stuff or flagship products from a couple of years ago, it got pretty fucking ridiculous having to figure out which socket made sense for any given Intel chip. The apparently arbitrary naming convention didn’t help.

        • real_squids@sopuli.xyz · +7/-1 · 7 days ago

          It wasn’t arbitrary; they named them after the number of pins. Which is fine, but kinda confusing for your average consumer.

          • UnspecificGravity@piefed.social · +16/-1 · 7 days ago

            Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn’t really tell you anything, especially when that convention does NOT get applied to the processors that plug into them.

    • Valmond@lemmy.dbzer0.com · 53 points · 7 days ago

      They really segmented that market in the worst possible way: 2 cores or 4 cores only, whether you could use VMs or overclock, and so on. Add windoze eating up an extra 5% every year.

      I remember buying the 2600 (maybe the X) and it was soo fast.

      • halcyoncmdr@lemmy.world · 25 points · 7 days ago

        The 2600K was exceptionally good and stayed relevant well past the normal upgrade timeframe.

        Really, it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and because it was just a 2nd-generation i7.

        • Valmond@lemmy.dbzer0.com · 1 point · 4 days ago

          Yes, that was a beast! I was poor, so I had to wait and got the generation after, the 3770K, and already the segmentation was there: I got the overclocking capability but not the VM stuff…

        • Trainguyrom@reddthat.com · 5 points · 7 days ago

          > Really, it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and because it was just a 2nd-generation i7.

          Past me made the accidentally more financially prudent move of opting for the i7-4790K over the i5-4690K, which ultimately lasted me nearly a decade. At the time the advice was of course “4 cores is all you need, don’t waste the money on an i7”, but those 4 extra threads made all the difference in that PC’s longevity.

    • wccrawford@discuss.online · 37 points · 7 days ago

      All of the exploits against Intel processors didn’t help either. Not only were they a bad look, but the fixes reduced those processors’ speed, making them a considerably worse deal for the money after all.
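
      The hit was most visible in syscall-heavy workloads, since mitigations like KPTI made every user/kernel transition more expensive. A rough sketch for eyeballing it on Linux: run once on a normal boot, once after booting with mitigations=off on the kernel command line, and compare (illustrative only, not a rigorous benchmark):

      ```python
      # Measure raw syscall throughput, the path hit hardest by
      # Meltdown/Spectre mitigations such as KPTI.
      import os
      import time

      N = 1_000_000
      start = time.perf_counter()
      for _ in range(N):
          os.getpid()  # a near-trivial syscall on each iteration
      elapsed = time.perf_counter() - start
      print(f"{N / elapsed:,.0f} getpid() calls/sec")
      ```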

      • MotoAsh@piefed.social · 19 points · 7 days ago

        Meltdown and Spectre? Those applied to AMD CPUs as well, just to a lesser degree (or rather, AMD had its own flavor of similar vulnerabilities). I think they even recently found a similar one in ARM chips…
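
        Which classes affect a given machine, and how they’re mitigated, is visible in the kernel’s own report; a minimal sketch, assuming Linux:

        ```python
        # Print the kernel's per-vulnerability status (meltdown, spectre_v1,
        # spectre_v2, mds, ...). The output differs between Intel, AMD, and
        # ARM systems, which is exactly the comparison being made above.
        from pathlib import Path

        for entry in sorted(Path("/sys/devices/system/cpu/vulnerabilities").iterdir()):
            print(f"{entry.name:28} {entry.read_text().strip()}")
        ```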

          • MotoAsh@piefed.social · 6 points · 7 days ago

            Yea, that definitely sounds like Intel… Though it’s still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types, and isn’t really indicative of a dropped ball (outside of shipping with known vulnerabilities, anyway).

            … The power stuff from the 12th/13th gens or whatever, though… ouch, massive dropped ball.

    • brucethemoose@lemmy.world · 7 points · 7 days ago

      Even the 6-core Phenom IIs from 2010 were great value.

      But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.

  • Voytrekk@sopuli.xyz · 126 points · 7 days ago

    Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.

    • Prove_your_argument@piefed.social · +15/-7 · edited · 7 days ago

      Intel’s DDR4 compatibility held on for a while, though, after AM5 went DDR5-only.

      The only real issue they had, and the one that led to the current dire straits, was the 13th/14th-gen gradual failures from power/heat, which they initially tried to claim didn’t exist. If that hadn’t happened, AMD would still have next to no market share.

      You still find people swearing up and down that Intel is the only way to go, even despite the genuine stagnation of progress on the processor side for a long, long time. A couple of cherry-picked benchmarks where Intel leads by a minuscule amount is all they care about, scheduling/parking issues be damned.

      • msage@programming.dev · 23 points · 7 days ago

        Oh hell naw, the issues with Intel came up much sooner.

        Ever since Ryzen came out, Intel just stagnated.

        • Prove_your_argument@piefed.social · +6/-2 · 7 days ago
          7 days ago

          I don’t disagree that Intel has been shit for a long time, but they were still the go-to recommendation all the way through the 14th gen. It wasn’t until the 5800X3D came along that people started really looking at AMD for gaming… and if you’re not doing a prebuilt, odds are you want the fastest processor, not the most efficient one.

          I had a 5800X because I didn’t want yet another Intel rig after a 4790K. Then I went on to the 5800X3D, and now the 9800X3D. The 5800X was behind Intel, and for me it was just a stopgap anyway, because a 5950X wasn’t purchasable when I was building. It was just good enough.

          As someone who lived through the fTPM firmware issue on AM4… I can confidently state that the TPM freezes were a dealbreaker. If you didn’t use fTPM and had the module disabled, or you updated your firmware once the fix finally shipped, you were fine - but the fTPM bug went unsolved for many, MANY years. It persisted across multiple generations. You could randomly freeze for a few seconds in any game (or any software) at any time… sometimes only once every few hours, sometimes multiple times in the span of a few minutes. That’s not usable by any stretch for gaming or anything important.

          • Mavytan@feddit.nl · +1/-1 · 7 days ago

            This might be true for top-of-the-line builds, but for anything from budget up to just below that, Ryzen has been a good and commonly recommended choice for a long time.

          • Atherel@lemmy.dbzer0.com · 7 points · edited · 7 days ago

            > and if you’re not doing a prebuilt, odds are you want the fastest processor, not the most efficient one.

            Strongly disagree. Prebuilts are mostly overpriced and/or built with cheap components, and in the worst case proprietary connectors.

            I build for the best bang for the buck, and at least in my bubble, so do others.

            • Prove_your_argument@piefed.social · 1 point · 7 days ago

              Somehow I think you misunderstood my meaning.

              Prebuilts come with all kinds of hardware, and unfortunately many users go with those. I offered to do a 265K + 5070 Ti build for my brother’s girlfriend, but he instead spent the same amount on a 265K / 5070 / 32 GB 5200 MHz prebuilt. He does some dev work and she does a tiny amount of creative work, and honestly I think he wanted to make sure her system was inferior to his. A 1-year warranty, and you have to pay to ship the whole system in if there are any issues. He wouldn’t even consider AMD, or going with a custom build like I do for myself and others (I just finished another Intel build over the weekend for a coworker; diehard Intel even after the issues…).

              In the custom-build world I think you find more gamers and people who want the fastest gear they can afford, which is why we see gamers picking up AMD X3D chips today. They aren’t beaten, and they aren’t just the most expensive option.

              AM5 as a platform still has issues with memory training, though it’s largely set it and forget it, unless you reboot after a month or don’t have memory context restore enabled in the BIOS.

              I’m less familiar with the Intel side nowadays, despite literally just doing a build. They seem to win on boot times unless you accept the instability of AMD’s fast-boot memory-check bypass. Getting a government bailout, though, is enough to make me want to avoid them indefinitely for my own gear, so I doubt I’ll get much hands-on time with the current or next gen.

          • msage@programming.dev · 4 points · 7 days ago

            I’ve had AMDs since forever; my first self-built machine was a Phenom II.

            They were always good, but Ryzens were just the best.

            Never used TPM, so I can’t comment on that. And most people never used it.

            But yes, there are so many hardcore Intel diehards that it would almost be funny if it weren’t sad. Like Intel’s legacy of adding wattage and getting nothing in return.

  • grue@lemmy.world · +74/-2 · 7 days ago
    7 days ago

    I’ve been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.

    That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You’re welcome.

    It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it’s the underdog now, LOL.

    • brucethemoose@lemmy.world · 15 points · edited · 7 days ago

      > AMD almost always had the better price/performance

      Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.

      • grue@lemmy.world · 1 point · edited · 5 days ago

        To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.

        (I did later buy a couple of used Opteron 6272s, but that’s different for multiple reasons.)

      • Octagon9561@lemmy.ml · 1 point · 6 days ago

        I’ve got an FX-8350; sure, AMD fell behind during that time, but it was by no means a bad CPU imo. My main PC has a 7800X3D now, but my FX system is still working just fine to this day, especially since upgrading to an SSD and 16 GB of RAM some years ago. It can technically even run Cyberpunk 2077 at console-like frame rates on high settings.

        • brucethemoose@lemmy.world · 1 point · edited · 6 days ago

          I mean… it functioned as a CPU.

          But a Phenom II X6 outperformed it sometimes, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor-count advantage. And power consumption was awful in any form factor.

          Look, I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:

          It was bad.

          Objectively bad, a few software niches aside. Between the cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.

    • Someonelol@lemmy.dbzer0.com · 8 points · 7 days ago

      I’ve been buying AMD since the Phenom II days, starting with the X3 720. You could easily unlock its 4th core for an easy performance boost, and most of the time it’d work without a hassle.

    • LastYearsIrritant@sopuli.xyz · +1/-2 · 7 days ago
      7 days ago

      I decide at every upgrade which one to go with, and try not to stay dedicated to one brand.

      Basically: buy Intel because it was the best last I checked… oh, that was two years ago, so now AMD would have been the right one.

      Next upgrade, won’t make that mistake: buy AMD. Shit… AMD is garbage this gen, shoulda gotten Intel. OK, I’ll know better next upgrade.

      Repeat forever.

      • Omgpwnies@lemmy.world · 7 points · 7 days ago

        TBF, AMD has been pretty rock-solid for CPUs for the last 5-6 years. Intel… not so much.

        My last two computers have been AMD; the last time I built an Intel system was ~2016.