It is not an upgrade over the 7800X3D.
I’m still on AM4, mainly because the jump is very expensive, essentially a new pc.
I would need a new CPU, motherboard and RAM to fit in my ITX case.
Exactly, and my 5600 is still doing a great job. Give me a good deal and I’ll upgrade, but I don’t have a compelling reason right now to upgrade. Oh, and if I do need more performance, I can look at the AM4 X3D chip, which would be cheaper than getting AM5 and rebuilding my PC.
I’m honestly thinking of building a new AM4 PC. 5700X3D is under 200€ new, cheap mobo, cheap DDR4 RAM and tbh the benchmarks aren’t that far off this new 9xxx series in gaming (which is the only thing I really care about). I’d rather save some money and get a better GPU
We’re all broke and performance improvements have been basically stagnant?
We’re spending our money on fucking groceries… It’s time to optimize, not upscale.
You’d think there would be some value-add in cranking out the older chips faster and at a lower price point, rather than aiming for a marginal improvement in spec that nobody has a use for yet.
I’m considering it, but only just; my 5800X is good enough for most gaming, which is GPU-bound anyway, and I run a dual-Xeon rig for my workstation.
zen 2-4 took care of a lot of the demand, we all have 8-16 cores now, what else could they give us?
They do still seem to be making advances in single-core performance, but whether it matters to most people is a different question. Most people aren’t using software that would benefit that much from these generation-to-generation performance improvements. It’s not going to be anywhere near as noticeable as when we went from 2 or 4 cores to 8, 16, 24, etc.
Single-thread is really hard; we’ve basically saturated our L1 working set size, and adding more doesn’t help much. Trying to extend the vector length just makes physical design harder, and that reduces clock speed. The predictors are pretty good, and Apple finally kicked everyone up the ass to increase OOO width like they should have.
Also, software still kind of sucks. It’s better than it was, but we need to improve it, the bloat is just barely being handled by silicon gains.
Flash was the epochal change. Maybe we get some new form of hybrid storage, but that doesn’t seem likely right now. Apple might do it to cut costs while preserving performance; actually, yeah, I can see them trying to have their cake and eat it too.
Otherwise I don’t know. We need a better way to deal with GPUs; there’s nothing else that can move the needle, except true heterogeneous core clusters, but I haven’t been able to sell that to anyone so far. They all think it’s a great idea that someone else should do.
Also, software still kind of sucks. It’s better than it was, but we need to improve it, the bloat is just barely being handled by silicon gains.
The incentives are all wrong for this, except in FOSS. It’s never going to be a priority for Microsoft because everyone is used to the (lack of) speed of Windows, and “now a bit faster!” isn’t a great marketing line. And it’s not in the interests of hardware companies that need to keep shifting new boxes if the software doesn’t keep bogging each generation down eventually. So we end up stuck with proprietary bloatware everywhere.
“what intel gives, microsoft takes away”
dates from the mid 90s, still relevant.
Let’s be fair, MS was vastly outrunning Intel for a long time; it’s only slowed down recently. And now the problem isn’t single-thread bloat so much as it is an absolute lack of multicore scaling for almost all applications except some games, and even then Windows fights as hard as it possibly can to stop you, like AMD just proved yet again.
Yes, mostly the applications aren’t there, if you need real cpu power (or gpu for that matter), you’re running linux or on the cloud.
But we are reaching a point where the desktop has to either be relegated to the level of embedded terminal (ie ugly tablet, before it’s dropped altogether), or make the leap to genuine compute tool, and I fear we’re going to see the former.
what else could they give us?
AI!!!
^^/s
I have a 5900x and honestly don’t see any need for an upgrade anytime soon.
A new CPU would maybe give me like 10 fps more in games, but a new GPU would do more. And I don’t think the CPU will be a bottleneck in the next few years.
Even beyond that, short of something like Blender, Windows just can’t handle that kind of horsepower; it’s not designed for it, and the UI bogs down fairly fast.
Linux, otoh, I find can eat as much CPU as you throw at it, but often many graphics applications start bogging down the X server for me.
So I have a Windows machine with the best GPU but a passable CPU, and a decent workstation GPU with insane CPU power on Linux.
What is your problem with Windows, though?
Meh, not nearly as configurable as Linux; some things you just can’t change.
NFS beats SMB into a cocked hat.
You start spending more time in a terminal on Linux because you’re not dealing with just your machine; you’re always connecting to other machines and using their resources to do things. Yeah, a terminal on Windows makes a difference, and I ran Cygwin for a while, but it’s still not clean.
Installing software sucks: you’re either downloading installers from random sites or dealing with the little that goes through a store. Not that building from source is much better, but most stuff comes from distro repos now.
Once I got LXC containers, though... actually, once I tried FreeBSD I lost my Windows tolerance. Being able to construct a new effective “OS” with a few keystrokes is incredible: install programs there, even graphical ones, with no trace on your main system. There’s just no answer to that.
Also plasma is an awesome DE.
Ah, ok, I thought you were talking about Windows not being able to run the CPU at full speed. But yes, it’s certainly a different OS with ups and downs.
Well, it can’t run multithreaded jobs at full speed.
Exhibit A: The latest AMD patch for multicore scheduling across NUMA.
My 3700X is still working fine for me.
I am still running an FX-8320 and it’s fast enough for everything that I need it for. It baffles me to see people arguing about the differences between different Ryzen CPUs.
If you’re not running the latest games it really doesn’t matter at all.
Some people use computers for more demanding things. For anyone who just uses the computer for web browsing, email and watching videos, anything but the most feeble machine from the past decade or more will be fine.
Who TF is Ryen?
Everybody has to support the new new underdog Intel.
I did! I bought shares when they tanked.
They’re still tanking
Good job! ᕕ( ᐛ )ᕗ
Waiting for the 9000 X3D. For most people, the 7800X3D is more performant than anything in the 9000 series.
Price drop put the 7900x at bargain bin prices and I bought that instead.
I got another 3-5 years with my 5800X3D
I have another 2-3 years with my 1600.
I’ve got another 5 or 6 years with my FX-8150.
Not alone there. Still humping along on a couple systems with those. Still runs SWBF2 so I’m good.
F
I’ll probably get one, once enough of its vulnerabilities are discovered and post-mitigation benchmarks are released.
And once I have enough money… The chip I have in my rig right now was so expensive, I would need to save up for at least a year. It’s not broken, so the money can be used on other things.
The capitalists are almost at the end of the Hungry Hungry Hippos game played with the world’s money.
Mine is pretty expensive too (at least for me, it is). I just make sure not to fly without a rebuy.
o7
Two words: Microsoft Pluton.
Aaaaint touching that shit.
Oh gosh. Forgot all about that shit. No thanks.
Do AMD not realise that Linux/privacy nerds stuck with them regardless for years? Would they have survived without that loyalty?
Do linux and privacy focused consumers actually make up a large portion of their market share? Linux users still make up a small portion of desktop users, and not even all of those really care much about privacy.
For many years AMD was uncompetitive compared to Intel / Nvidia. Intel had 80% of the market at one point. It probably would have died off if it wasn’t for folk that wanted Linux compatibility. Many run FOSS because of privacy. Linux is a key part of that.
By themselves, no.
But they’re the people friends and family ask for help when deciding to buy a computer. It’s why Intel has slumped. Most people don’t know what a CPU does, so that’s not why they’re picking Intel or AMD - they’re choosing based off recommendations from more knowledgeable people.
And they are early adopters.
Price is probably #1.
Bit of speculation here with no real sources: there was a boom in late 2022 through 2023 when people could finally reliably get parts again. I’m guessing many who wanted to upgrade already did in the past 2 years. Anyone who got a new computer in 2020 onward should be fine for at least a few more years. I think the average replacement cycle is around 7 years.
The market will probably see a surge between 2027-2030 as people begin replacing their “covid era” computers. The market right now is mainly seeing anyone with a pre-covid computer who bought a nice top-of-the-line machine for about 1k. They’re looking at current pricing and choosing to go with today’s mid-low tier, which will outclass their old 201x top-of-the-line computer.
Another factor could be that AAA gaming hasn’t exactly been pumping out hit new titles the last 5 years. People who wanted to play Cyberpunk or Elden Ring had already upgraded by the time Wukong came out.
With fewer new games requiring the latest and greatest, the need to upgrade is going to drop too.
Again all speculation…
I don’t need one right now, and seeing how development has slowed down, I won’t need one in the foreseeable future.