• Jin
    29
    7 months ago

    I’m still on AM4, mainly because the jump is very expensive: essentially a new PC.

    I would need a new CPU, motherboard and RAM to fit in my ITX case.

    • @[email protected]
      4
      edit-2
      7 months ago

      Exactly, and my 5600 is still doing a great job. Give me a good deal and I’ll upgrade, but right now I don’t have a compelling reason to. Oh, and if I do need more performance, I can look at an AM4 X3D chip, which would be cheaper than moving to AM5 and rebuilding my PC.

    • @[email protected]
      6
      7 months ago

      I’m honestly thinking of building a new AM4 PC. The 5700X3D is under 200€ new, motherboards and DDR4 RAM are cheap, and honestly the gaming benchmarks aren’t that far off the new 9xxx series (which is the only thing I really care about). I’d rather save some money and get a better GPU.

    • @[email protected]
      9
      7 months ago

      We’re spending our money on fucking groceries… It’s time to optimize, not upscale.

    • @[email protected]
      2
      7 months ago

      You’d think there would be some value-add in cranking out the older chips faster and at a lower price point, rather than aiming for a marginal improvement in spec that nobody has a use for yet.

  • @[email protected]
    17
    7 months ago

    I’m considering it, but only just. My 5800X is good enough for most gaming, which is GPU-bound anyway, and I run a dual-Xeon rig for my workstation.

    Zen 2 through 4 took care of a lot of the demand; we all have 8-16 cores now. What else could they give us?

    • @[email protected]OP
      5
      7 months ago

      They do still seem to be making advances in single-core performance, but whether it matters to most people is a different question. Most people aren’t using software that would benefit that much from these generation-to-generation performance improvements. It’s not going to be anywhere near as noticeable as when we went from 2 or 4 cores to 8, 16, 24, etc.
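
      As a rough back-of-the-envelope sketch of why the later core-count jumps feel less dramatic (the 20% serial fraction below is an assumed figure, not a measurement of any real workload), Amdahl’s law puts a ceiling on how much extra cores can help once part of the work is inherently serial:

      ```python
      # Amdahl's law: speedup(n) = 1 / (s + (1 - s) / n), where s is the
      # serial fraction of the workload. The 20% figure is illustrative only.
      def speedup(cores: int, serial_fraction: float = 0.2) -> float:
          return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

      for cores in (2, 4, 8, 16, 32):
          print(f"{cores:2d} cores -> {speedup(cores):.2f}x")
      # 2 -> 1.67x, 4 -> 2.50x, 8 -> 3.33x, 16 -> 4.00x, 32 -> 4.44x
      ```

      Under that assumption, going from 2 to 8 cores roughly doubles throughput, while going from 16 to 32 adds only about 11%, which is why the later doublings stopped feeling like upgrades for typical desktop software.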

      • @[email protected]
        5
        7 months ago

        Single-thread is really hard. We’ve basically saturated the L1 working-set size; adding more doesn’t help much. Trying to extend the vector length just makes physical design harder, and that reduces clock speed. The predictors are pretty good, and Apple finally kicked everyone up the ass to increase out-of-order resources like they should have.

        Also, software still kind of sucks. It’s better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

        Flash was the epochal change. Maybe we get some new form of hybrid storage, but that doesn’t seem likely right now. Apple might do it to cut costs while preserving performance; actually, yeah, I can see them trying to have their cake and eat it too.

        Otherwise I don’t know. We need a better way to deal with GPUs; there’s nothing else that can move the needle, except true heterogeneous core clusters, but I haven’t been able to sell that to anyone so far. They all think it’s a great idea that someone else should do.

        • @[email protected]OP
          3
          edit-2
          7 months ago

          Also, software still kind of sucks. It’s better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

          The incentives are all wrong for this, except in FOSS. It’s never going to be a priority for Microsoft, because everyone is used to the (lack of) speed of Windows and “now a bit faster!” isn’t a great marketing line. And it’s not in the interest of hardware companies, which need to keep shifting new boxes, for software to stop bogging each generation down. So we end up stuck with proprietary bloatware everywhere.

            • @[email protected]
              2
              7 months ago

              Let’s be fair, Microsoft was vastly outrunning Intel for a long time; it’s only slowed down recently. And now the problem isn’t single-thread bloat so much as an absolute lack of multicore scaling in almost all applications except some games, and even then Windows fights as hard as it possibly can to stop you, as AMD just proved yet again.

          • @[email protected]
            1
            7 months ago

            Yes, mostly the applications aren’t there. If you need real CPU power (or GPU, for that matter), you’re running Linux or in the cloud.

            But we’re reaching a point where the desktop either gets relegated to the level of an embedded terminal (i.e. an ugly tablet, before it’s dropped altogether) or makes the leap to a genuine compute tool, and I fear we’re going to see the former.

    • LiveLM
      3
      7 months ago

      what else could they give us?

      AI!!!

      /s

    • @[email protected]
      5
      7 months ago

      I have a 5900X and honestly don’t see any need for an upgrade anytime soon.

      A new CPU would maybe give me like 10 fps more in games, but a new GPU would do more. And I don’t think the CPU will be a bottleneck in the next few years.
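
      To put that reasoning in a toy model (the per-frame millisecond numbers below are made up for illustration, not measurements), a frame can only go out once both the CPU and the GPU have finished their share of the work, so the slower of the two sets the frame rate:

      ```python
      # Toy frame-pacing model: frame rate is limited by whichever of the
      # CPU or GPU takes longer per frame. All numbers are illustrative.
      def fps(cpu_ms: float, gpu_ms: float) -> float:
          return 1000.0 / max(cpu_ms, gpu_ms)

      print(fps(cpu_ms=8.0, gpu_ms=12.0))  # ~83 fps, GPU-bound
      print(fps(cpu_ms=6.0, gpu_ms=12.0))  # still ~83 fps: a faster CPU changes nothing here
      print(fps(cpu_ms=8.0, gpu_ms=9.0))   # ~111 fps: cutting GPU time is what moves the needle
      ```

      Until the GPU’s per-frame time drops below the CPU’s, a CPU upgrade barely shows up on screen, which is the intuition behind the “maybe 10 fps” estimate above.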

      • @[email protected]
        4
        7 months ago

        Even beyond that, short of something like Blender, Windows just can’t handle that kind of horsepower; it’s not designed for it, and the UI bogs down fairly fast.

        Linux, on the other hand, I find can eat as much CPU as you throw at it, though many graphics applications start bogging down the X server for me.

        So I have a Windows machine with the best GPU but a passable CPU, and a decent workstation GPU with insane CPU power on Linux.

          • @[email protected]
            1
            7 months ago

            Meh, it’s not nearly as configurable as Linux; some things you just can’t change.

            NFS knocks SMB into a cocked hat.

            You end up spending more time in a terminal on Linux, because you’re not just dealing with your own machine; you’re always connecting to other machines and their resources to get things done. Yeah, a terminal on Windows makes a difference, and I ran Cygwin for a while, but it’s still not clean.

            Installing software on Windows sucks: you either have to download installers or rely on the few things that go through a store. Not that building from source is much better, but most stuff comes from distro repos now.

            Once I got LXC containers, though (actually, once I tried FreeBSD), I lost my Windows tolerance. Being able to construct a new effective “OS” with a few keystrokes is incredible: install programs there, even graphical ones, with no trace on your main system. There’s just no answer to that.
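
            As a minimal sketch of those “few keystrokes” (assumptions: the classic lxc-* userspace tools are installed, the script runs with enough privileges to manage containers, and the container name “sandbox”, the Ubuntu 22.04 image and Firefox are arbitrary example choices), wrapped in Python here only to keep the example self-contained:

            ```python
            # Throwaway LXC container: create it, start it, install a program
            # inside it, then destroy it, leaving no trace on the host.
            import subprocess

            def run(*cmd: str) -> None:
                """Run a host command and raise if it fails."""
                subprocess.run(cmd, check=True)

            run("lxc-create", "-n", "sandbox", "-t", "download",
                "--", "-d", "ubuntu", "-r", "jammy", "-a", "amd64")
            run("lxc-start", "-n", "sandbox")
            run("lxc-attach", "-n", "sandbox", "--", "apt-get", "update")
            run("lxc-attach", "-n", "sandbox", "--",
                "apt-get", "install", "-y", "firefox")  # lives only inside the container
            # ...use it, then remove every trace:
            run("lxc-stop", "-n", "sandbox")
            run("lxc-destroy", "-n", "sandbox")
            ```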

            Also, Plasma is an awesome DE.

            • @[email protected]
              2
              7 months ago

              Ah, OK, I thought you were talking about Windows not being able to run the CPU at full speed. But yes, it’s certainly a different OS, with ups and downs.

              • @[email protected]
                1
                7 months ago

                Well, it can’t run multithreaded jobs at full speed.

                Exhibit A: The latest AMD patch for multicore scheduling across NUMA.

  • @[email protected]
    1
    7 months ago

    I am still running an FX-8320 and it’s fast enough for everything that I need it for. It baffles me to see people arguing about the differences between different Ryzen CPUs.

    • @[email protected]OP
      1
      7 months ago

      Some people use computers for more demanding things. For anyone who just uses the computer for web browsing, email and watching videos, anything but the most feeble machine from the past decade or more will be fine.

  • @[email protected]
    34
    7 months ago

    Waiting for the 9000 X3D. For most people, the 7800X3D is more performant than anything in the 9000 series.

  • @[email protected]
    12
    7 months ago

    I’ll probably get one, once enough of its vulnerabilities are discovered and post-mitigation benchmarks are released.
    And once I have enough money.

    • Hello Hotel
      6
      edit-2
      7 months ago

      Money… the chip I have in my rig right now is so expensive that I would need to save up for at least a year. It’s not broken, so the money can be used on other things.

      The capitalists are almost at the end of the hungry hungry hippos game played with the world’s money.

  • Semperverus
    9
    7 months ago

    Two words: Microsoft Pluton.

    Aaaain’t touching that shit.

    • @[email protected]
      5
      7 months ago

      Oh gosh. Forgot all about that shit. No thanks.

      Do AMD not realise that Linux/privacy nerds stuck with them regardless for years? Would they have survived without that loyalty?

      • @[email protected]
        6
        7 months ago

        Do Linux and privacy-focused consumers actually make up a large portion of their market share? Linux users are still a small portion of desktop users, and not even all of those really care much about privacy.

        • @[email protected]
          3
          7 months ago

          For many years AMD was uncompetitive with Intel and Nvidia; Intel had 80% of the market at one point. AMD probably would have died off if it weren’t for folks who wanted Linux compatibility. Many run FOSS because of privacy, and Linux is a key part of that.

        • @[email protected]
          4
          7 months ago

          By themselves, no.

          But they’re the people friends and family ask for help when deciding to buy a computer. It’s why Intel has slumped. Most people don’t know what a CPU does, so that’s not why they’re picking Intel or AMD - they’re choosing based off recommendations from more knowledgeable people.

  • @[email protected]
    7
    7 months ago

    Price is probably #1.

    Bit of speculation here, with no real sources: there was a boom in late 2022 through 2023 when people could finally reliably get parts again. I’m guessing many who wanted to upgrade already did in the past 2 years. Anyone who got a new computer from 2020 onward should be fine for at least a few more years; I think the average upgrade cycle is around 7 years.

    The market will probably see a surge between 2027 and 2030 as people begin replacing their “covid era” computers. Right now the market is mainly seeing people with pre-covid computers who bought a nice top-of-the-line machine for about 1k. They’re looking at current pricing and choosing today’s low-to-mid tier, which will still outclass their old 201x top-of-the-line computer.

    Another factor could be that AAA gaming hasn’t exactly been pumping out hit new titles in the last 5 years. People who wanted to play Cyberpunk or Elden Ring had already upgraded by the time Wukong came out.

    With fewer new games requiring the latest and greatest, the need to upgrade is going to drop too.

    Again all speculation…

  • @[email protected]
    1
    7 months ago

    I don’t need one right now, and seeing how development has slowed down, I won’t need one in the foreseeable future.