So the editor asked AI to come up with an image for the title “Gamers desert Intel in droves” and so we get a half-baked pic of a CPU in the desert.
Am I close?
Could be worse.
Could have been “gamers dessert Intel in droves”
Now I want to see that one. But, I refuse to use online generative AI.
Looks like bad photoshop more than AI
Intel until they realized that other companies made CPUs, too

They also bring a “dying transistor problem we don’t feel like fixing” to the party, too
And a constantly changing socket so you have to get a new motherboard every time.
Honestly, not a big deal if you build PCs to last 6-7 years, since you will be targeting a new RAM generation every time anyway.
Upgraded from a 1600 to a 5600, same mobo
If your CPU becomes the limiting factor at some point, you can simply upgrade to a CPU a few generations newer without having to swap out your motherboard. You can’t really do that with Intel (AFAIK they switch platforms every 2 CPU generations, so depending on your CPU you may not be able to upgrade at all; that can happen with AMD too, but not as frequently).
With no multi threading
I mean the i7s had SMT. You had to pay extra for SMT, whereas AMD started giving it to you on every SKU except a few low-end ones.
Is it true that all of them had SMT, but Intel just locked it away on lower-tier processors, and some people managed to activate it despite Intel’s efforts?
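Tangent, but if anyone wants to check whether SMT is actually active on their own box: on Linux the kernel exposes it directly in sysfs. A quick sketch (the sysfs path is standard on modern kernels; output obviously varies by machine):

```python
# Report logical CPU count and whether SMT/Hyper-Threading is enabled.
# /sys/devices/system/cpu/smt/active reads "1" when SMT is on, "0" when off.
import os

print("logical CPUs:", os.cpu_count())

smt_path = "/sys/devices/system/cpu/smt/active"
if os.path.exists(smt_path):
    with open(smt_path) as f:
        print("SMT enabled:", f.read().strip() == "1")
else:
    print("SMT status not exposed (non-Linux or older kernel)")
```

On an 8-core chip with SMT on you’d see 16 logical CPUs; with it fused off (or disabled in firmware), logical equals physical.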
So happy I chose to go with AM4 board years ago. Was able to go from Zen+ CPU to X3D CPU.
I remember people saying back then that people usually don’t upgrade their CPU, so it’s not much of a selling point. But people didn’t upgrade because they couldn’t, due to constant socket changes on the Intel side.
My fps numbers were very happy after the CPU upgrade, and I didn’t have to get a new board and new set of ram.
Just upgraded from an i7-6600k to a Ryzen 7 7800X3D. Obviously a big upgrade no matter if I went AMD or Intel, but I’m loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I’ve always had a positive feeling towards AMD.
AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II X6, though? Whew, that thing kicked ass.
The Intel Pentium D era sucked compared to the Athlon 64 X2 from what I remember. I had an Athlon 64 3000+ just before the dual-core era. The Athlon 64 era was great.
Oh yeah I had one of those before my 4790k
I played through mass effect 3 when it was new on a discount AMD laptop with an igpu. Granted it was definitely not on max setting, but it wasn’t with everything turned all the way down either.
Intel’s last couple of processor generations were a failure. AMD, on the other hand, has been consistent. Look at all these tiny AMD APUs that can run C2077 on a 35W computer that fits in the palm of a hand. Valve is about to drop a nuclear bomb on nvidia, intel and microslop with the Gabecube.
The United States government owns 10% of Intel now.
I have to lower my 12th-gen CPU’s multiplier to stop constant crashing when playing UE games, because everything is overclocked at the factory so they could keep up with AMD performance. Fuck Intel.
I bought into AM5 first year with Zen 4. I’m pretty confident Zen 7 will be AM5. There’s got to be little chance for DDR6 to be priced well by the end of the decade. Confident that I’ll be on AM5 for 10+ years but way better than the Intel desktop I had for 10 years because I will actually have a great update path for my motherboard. AM4 is still relevant. That’s getting to almost 10 years now. It’ll still be a great platform for years to come. Really if you bought early in the life of first gen chips on the socket for AM4/AM5, you’re looking at a 15 year platform. Amazing
For this generation we changed nothing. Except for the motherboard. K, thanks, bye.
Gotta love all the talk of no hardware plateau when we are talking about 10 year old hardware formats being desirable over the current gen stuff.
I am going to be using AM4 until there is a reason to upgrade and honestly? I think I might be on AM4 for another 10 years with how things are going.
The last Intel I bought new was the Pentium 4 630. 3.0 GHz, with hyperthreading. That thing was a fucking space heater. And I loved it. But everything new since then has been AMD.
That second number…
:: needing to fertilize a tree intensifies::
One can only dream about people fleeing x86-64 and going ARM or, even better, RISC-V.
But no, it’s just changing the collar on the dog. The dog stays the same.
Why though? X Elite lags x86 on battery life, performance and compatibility (and you can’t really run Linux on X Elite).
I am not a fan of Intel, AMD, Nvidia, but what’s the point of moving to ARM for the sake of moving?
Unlike most, I actually have been running ARM on home server for almost a decade. For that use case it makes sense because it’s cheap and well supported.
It would be better to switch to RISC-V because it has no problems with patents and everyone can build a RISC-V CPU, not only 2 companies.
I would be happy to, but it’s currently not an option for desktop/laptop.
Would be great for an SBC where the OS and apps are open source and performance is less of an issue.
ARM has all the same drawbacks as x86, and it’s not a deus ex machina that delivers high performance at low power consumption by magic.
Imagine Europe pushing RISC-V and sharing upgrades with China¹. Flagship performance could reach ARM or even x86-64 levels within a few years.
¹ China is already using RISC-V as much as they can.
I would support that, but it would require European unity and a strategic decision to make a permanent break with the US.
One thing that may or may not have something to do with people leaving Intel might be related to their relationship with Israel. Not trying to make this political, but it’s something I’ve seen some folks mention before.
I’ve always thought it’s a super weird place for them to have a fab just in general. It’s never been the most politically stable part of the world and surely you don’t want your several billion dollar infrastructure getting blown up, so why would you put it somewhere where that’s more likely?
Imo: Loud minority
Could be. Like I said, not making it political. My current setup is running a 12th-gen i9 and runs fine. I’m just repeating what I’ve seen others say across Lemmy, Reddit, etc.
Last time I got a CPU, I didn’t even consider the Intel alternatives because of that. My old CPU was an i7-4790k.
Yep. Intel sat on their asses for a decade pushing quad cores you had to pay extra to even overclock.
Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel’s chips still do.
Intel cached in their lead by not investing in themselves and instead pushing the same tired crap year after year onto consumers.
cached in their lead
There are so many dimensions to this
Don’t forget the awfully fast socket changes
I just read the other day that at least one motherboard manufacturer is bringing back AM4, since DDR4 is getting cheaper than DDR5 even with the “this isn’t even manufactured anymore” price markup. That’s only possible because of how much long-term support AMD gave that socket.
Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year and the number of CPU options was overwhelming. Is it really necessary to have that many different CPUs?
Tbf AMD is also guilty of that, in the laptop/mobile segment specifically. And the whole AI naming thing is just dumb, albeit there aren’t that many of those

Well this scheme seems much more reasonable and logical to me.
And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th-gen CPU was borked and had to be kept underclocked.
what was the issue?
It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.
This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.
When I originally got it I did notice that it was getting insanely high scores in benchmarks, then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Then mine started to fail I think about a year after I got it.
In the 486 era (the ’90s) there was an unofficial story about the way Intel rated its CPUs: instead of starting slow and accelerating until failure, they started as fast as they could and slowed down until it didn’t fail.
Even within the same socket family (looking at you, LGA1151) you can run into compatibility problems.
I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it’s AM6. What came after the Intel LGA1151? It wasn’t LGA1152.
AMD tried the Intel thing too, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn’t put it past them to try it again on AM5.
Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it’s the other way around, upgrading chip in an old motherboard.
It can happen if the old motherboard failed, which was more likely than the CPU failing.
There was talk of not providing firmware update for old chipsets to support new gen CPU as well, which is relevant to the cases you mentioned.
Yea, for the customer it really doesn’t matter how many pins a certain socket has, only is it compatible or not.
remember Socket 7?
Holy shit, cross-compatibility between manufacturers? We came this close to the almighty above and still ended up where we are today 🤦‍♂️
I remember Slot 2
As a person that generally buys either mid-tier stuff or the flagship products from a couple years ago, it got pretty fucking ridiculous to have to figure out which socket made sense for any given intel chip. The apparently arbitrary naming convention didn’t help.
It wasn’t arbitrary, they named them after the number of pins. Which is fine but kinda confusing for your average consumer
Which is a pretty arbitrary naming convention since the number of pins in a socket doesn’t really tell you anything especially when that naming convention does NOT get applied to the processors that plug into them.
deleted by creator
They really segmented that market in the worst possible way: 2 cores and 4 cores only, the ability to run VMs or overclock gated by SKU, and so on. Add windoze eating up an extra 5% every year.
I remember buying the 2600 (maybe the X) and it was so fast.
The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.
Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.
Yes, that was a beast! I was poor and had to wait, so I got the generation after, the 3770K, and already the segmentation was there: I got overclocking but not the VM stuff…
Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.
Past me made the accidentally more financially prudent move of opting for the i7-4790k over the i5-4690k which ultimately lasted me nearly a decade. At the time the advice was of course “4 cores is all you need, don’t waste the money on an i7” but those 4 extra threads made all the difference in the longevity of that PC
Coincidentally, that’s the exact cpu I use in my server! And it runs pretty damn well.
At this point the only “issue” with it is power usage versus processing capability. Newer chips can do the same with less power.
Yeahhh, iirc it uses slightly less power than my main cpu for significantly less performance
All of the exploits against Intel processors didn’t help either. Not only is it a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.
Meltdown and Spectre? Those also applied to AMD CPUs as well, just to a lesser degree (or rather, they had their own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips…
Only one affected AMD, I forget which. But Intel knew about the vulnerabilities and chose not to fix the hardware ahead of their release.
Yea that definitely sounds like Intel… Though it’s still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types and not really indicative of a dropped ball. (outside of shipping with known vulnerabilities, anyways)
… The power stuff from the 13th/14th gens or whatever, though… ouch, massive dropped ball.
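Side note for anyone curious which of these flaws their own CPU carries: on Linux the kernel reports per-vulnerability mitigation status in sysfs. A small convenience script (not from this thread; output varies by kernel and CPU):

```python
# List the kernel's mitigation status for each known CPU vulnerability
# (meltdown, spectre_v1, spectre_v2, etc.). Each file reads something
# like "Not affected", "Mitigation: ...", or "Vulnerable".
import os

vuln_dir = "/sys/devices/system/cpu/vulnerabilities"
if os.path.isdir(vuln_dir):
    for name in sorted(os.listdir(vuln_dir)):
        with open(os.path.join(vuln_dir, name)) as f:
            print(f"{name}: {f.read().strip()}")
else:
    print("No vulnerability info exposed (non-Linux or pre-2018 kernel)")
```

Handy for seeing whether a given box is eating the performance cost of software mitigations or was fixed in hardware.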
Even the 6-core Phenom IIs from 2010 were great value.
But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.
Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.
DDR4 compatibility held on for a while, though, even after AM5 went DDR5-only.
The only real issue they had which has led to the current dire straits is the 13th/14th gen gradual failures from power/heat which they initially tried to claim didn’t exist. If that didn’t happen AMD would still have next to no market share.
You still find people swearing up and down that intel is the only way to go, even despite the true stagnation of progress on the processor side for a long, long time. A couple of cherry-picked benchmarks where they lead by a minuscule amount is all they care about, scheduling/parking issues be damned.
Oh hell naw, the issues with Intel came up much sooner.
Ever since Ryzen came out, Intel just stagnated.
I don’t disagree that intel has been shit for a long time, but they were still the go-to recommendation all the way through the 14th gen. It wasn’t until the 5800x3d came along that people started really looking at AMD for gaming… and if you’re not doing a prebuilt odds are you wanted the fastest processor, not the one that is most efficient.
I had a 5800x because I didn’t want yet another intel rig after a 4790k. Then I went on to the 5800x3d, before the 9800x3d now. The 5800x was behind intel, and for me it was just a stopgap anyway because a 5950x was not purchasable when I was building. It was just good enough.
As someone who lived through the fTPM firmware issue on AM4… I can confidently state that the tpm freezes were a dealbreaker. If you didn’t use fTPM and had the module disabled, or you updated your firmware after release you were fine - but the ftpm bug was for many, MANY years unsolved. It persisted for multiple generations. You could randomly freeze for a few seconds in any game (or any software) at any time… sometimes only once every few hours, sometimes multiple times in the span of a few minutes. That’s not usable by any stretch for gaming or anything important.
This might be true for the top of the line builds, but for any build from budget to just below that Ryzen has been a good and commonly recommended choice for a long time
and if you’re not doing a prebuilt odds are you wanted the fastest processor, not the one that is most efficient.
Strongly disagree. Prebuilts are mostly overpriced and/or have cheap components, and in the worst case proprietary connectors.
I build for the best bang for the buck, and at least in my bubble so do others.
Somehow I think you misunderstood my meaning.
Prebuilt have all kinds of hardware and unfortunately many users go with those. I offered to do a 265k 5070ti build for my brother’s girlfriend but he instead spent the same amount on a 265k 5070 32gb 5200mhz prebuilt. He does some dev work and she does a tiny amount of creative work and honestly I think he wanted to make sure her system was inferior to his. 1 year warranty and you have to pay to ship the whole system in if there’s any issues. He wouldn’t even consider AMD or going with a custom build like I do for myself and others (just finished another intel build over the weekend for a coworker, diehard intel even after the issues…)
In the custom build world I think you find more gamers and people who want the fastest gear they can afford, which is why we see gamers picking up AMD x3d chips today. They aren’t beaten and aren’t just the most expensive option.
AM5 as a platform still has issues with memory training, though it’s largely set it and forget it until you reboot after a month or don’t have memory context restore enabled in the BIOS.
I’m less familiar with the intel side nowadays despite literally just doing a build. They seem to win on boot times unless you accept the instability of AMD’s fast boot memory check bypass stuff. Getting a government bailout though is enough to make me want to avoid them indefinitely for my own gear so I doubt I’ll get much hands on with the current or next gen.
I’ve had AMDs since forever, my first own build with Phenom II.
They were always good, but Ryzens were just best.
Never used TPM, so can’t comment on that. And most people never used it.
But yes, so many hardcore Intel diehards; it would almost be funny if it weren’t sad. Like Intel’s legacy of adding wattage to get nothing in return.
I’ve been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.
That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You’re welcome.
It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it’s the underdog now, LOL.
AMD almost always had the better price/performance
Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.
To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.
(I did later buy a couple of used Opteron 6272s, but that’s different for multiple reasons.)
I’ve got an FX 8350, sure AMD fell behind during that time but it was by no means a bad CPU imo. Main PC’s got a 7800X3D now but my FX system is still working just fine to this day, especially since upgrading to an SSD and 16GB RAM some years ago. It can technically even run Cyberpunk 2077 with console like frame rates on high settings.
I mean… It functioned as a CPU.
But a Phenom II X6 outperformed it sometimes, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor-count advantage. Power consumption was awful in any form factor.
Look. I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologist for Bulldozer back then, so I don’t want to mince words:
It was bad.
Objectively bad, a few software niches aside. Between cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.
Bulldozer was AMD’s Pentium 4.
Even down to the questionable marketing.
I’ve been buying since the the Phenom II days with the X3 720. One could easily unlock their 4th core for an easy performance boost. Most of the time it’d work without a hassle.
Wish I knew about that trick back then! I shelled out for an X4…
deleted by creator
My first AMD was a 386-40. Had several of their CPUs since. But there were a few years there that it was real tough to pick AMD.
deleted by creator
Love my 3DNow! K6-2, also my starter.
Oh man, I’d forgotten all about 3dnow!
I decide every upgrade which one to go with. Try not to stay dedicated to one.
Basically - Buy Intel cause it’s the best last I checked… Oh, that was two years ago, now AMD should have been the right one.
Next upgrade, won’t make that mistake - buy AMD. Shit… AMD is garbage this gen, shoulda gotten Intel. Ok, I’ll know better next upgrade.
Repeat forever.
TBF, AMD has been pretty rock-solid for CPUs for the last 5-6 years. Intel… not so much.
My last two computers have been AMD, the last time I built an Intel system was ~2016