Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • @Dra@lemmy.zip
    link
    fedilink
    English
    24
    edit-2
    1 year ago

    I haven’t paid attention to GPUs since I got my 3080 on release day back during Covid.

    Why has the acceptable level of VRAM suddenly doubled vs. 4 years ago? I don’t struggle to run a single game on max settings at high frame rates @ 1440p, so what’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

    • @AlijahTheMediocre@lemmy.world
      link
      fedilink
      English
      61 year ago

      If only game developers optimized their games…

      The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

    • @Obi@sopuli.xyz
      link
      fedilink
      English
      21 year ago

      Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

    • @Asafum@feddit.nl
      link
      fedilink
      English
      241 year ago

      Lmao

      We have your comment: what am I doing with 20GB of VRAM?

      And one comment down: it’s actually criminal that there’s only 20GB of VRAM.

    • @Eccitaze@yiffit.net
      link
      fedilink
      English
      281 year ago

      An actual technical answer: apparently it’s because, while the PS5 and Xbox Series X are technically regular x86-64 machines, they’re designed so the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a shitload of RAM to the GPU for textures very quickly. But it also means that as the games industry shifts from developing for the PS4/Xbox One X first (both of which have separate pools of memory for CPU and GPU) to the PS5/XSX first, VRAM requirements are spiking, because it’s a lot easier to port to PC if you keep assuming the GPU can hold 10-15 GB of texture data at once than it is to refactor your code to reduce VRAM usage.
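
      To put rough numbers on that, here’s a back-of-envelope sketch of how quickly texture memory adds up at 4K. The material and map counts are illustrative assumptions, not figures from any real game:

      ```python
      # Back-of-envelope texture memory math (illustrative numbers only).
      def texture_bytes(width, height, bits_per_pixel, with_mips=True):
          """Approximate GPU memory for one texture; a full mip chain adds ~33%."""
          base = width * height * bits_per_pixel / 8
          return base * 4 / 3 if with_mips else base

      GIB = 1024 ** 3
      raw_4k = texture_bytes(4096, 4096, 32)  # uncompressed RGBA8 (32 bpp)
      bc7_4k = texture_bytes(4096, 4096, 8)   # BC7 block compression (8 bpp)
      print(f"one 4K RGBA8 texture: {raw_4k / GIB:.3f} GiB")  # ~0.083 GiB
      print(f"one 4K BC7 texture:   {bc7_4k / GIB:.3f} GiB")  # ~0.021 GiB

      # A material usually needs several maps (albedo, normal, roughness, ...).
      materials = 200          # hypothetical count of materials resident at once
      maps_per_material = 3
      total = materials * maps_per_material * bc7_4k
      print(f"{materials} materials x {maps_per_material} maps: {total / GIB:.1f} GiB")  # ~12.5 GiB
      ```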

    • @Hadriscus@lemm.ee
      link
      fedilink
      English
      4
      edit-2
      1 year ago

      Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be footnotes on the gaming front, but they completely changed the game over here.

    • @Blackmist@feddit.uk
      link
      fedilink
      English
      101 year ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

  • @Rakonat@lemmy.world
    link
    fedilink
    English
    951 year ago

    Nvidia is overpricing their cards and limiting stock, acting like there’s still a GPU shortage from all the crypto bros sucking everything up.

    Right now their competitors are beating them at hundreds of dollars below Nvidia’s MSRP, like for like, with the only true advantages Nvidia has being ray tracing and arguably VR.

    It’s possible we’re approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

    TL;DR Nvidia is trying to sell cards at twice their value because of greed.

    • @genie@lemmy.world
      link
      fedilink
      English
      161 year ago

      Couldn’t agree more! Abstracting to the general economic case: those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn’t quite add up, @nvidia :)

      Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility, particularly with the arrival of things like SYCL that help programmers avoid vendor lock-in.
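
      To make that cost/performance point concrete, here’s a minimal sketch; the prices and FPS figures are purely hypothetical placeholders, not benchmark results:

      ```python
      # Hypothetical numbers to illustrate "double-digit % cost for
      # single-digit % performance"; swap in real prices and benchmarks.
      card_a = {"name": "Card A", "price": 600.0, "avg_fps": 100.0}
      card_b = {"name": "Card B", "price": 800.0, "avg_fps": 108.0}

      price_delta = (card_b["price"] - card_a["price"]) / card_a["price"] * 100
      perf_delta = (card_b["avg_fps"] - card_a["avg_fps"]) / card_a["avg_fps"] * 100

      for card in (card_a, card_b):
          print(f'{card["name"]}: ${card["price"] / card["avg_fps"]:.2f} per average frame')

      print(f"price: +{price_delta:.0f}%, performance: +{perf_delta:.0f}%")
      # With these placeholders: +33% price for +8% performance.
      ```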

    • @Evilcoleslaw@lemmy.world
      link
      fedilink
      English
      39
      edit-2
      1 year ago

      They’re beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership between Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding, also Nvidia-only (at first, anyway).

      The raw performance is mostly there for AMD, with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using its software ecosystem to entrench itself despite the insane pricing.
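
      If you want to check which of those Nvidia hardware encoders your own ffmpeg build exposes, here’s a quick sketch (assumes ffmpeg is installed and on your PATH; being listed only means it was compiled in, and actually using it still requires a supported GPU and driver):

      ```python
      import subprocess

      # List the encoders this ffmpeg build was compiled with.
      result = subprocess.run(
          ["ffmpeg", "-hide_banner", "-encoders"],
          capture_output=True, text=True, check=True,
      )

      # av1_nvenc needs a 40-series (Ada) card; the others work on older GeForces.
      for name in ("h264_nvenc", "hevc_nvenc", "av1_nvenc"):
          status = "listed" if name in result.stdout else "not built in"
          print(f"{name}: {status}")
      ```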

        • @Evilcoleslaw@lemmy.world
          link
          fedilink
          English
          2
          edit-2
          1 year ago

          True enough. I was thinking more of the gaming use case. But even beyond AI, for general compute workloads they’re beating the pants off AMD with CUDA as well.

      • mihies
        link
        fedilink
        101 year ago

        And they beat AMD in efficiency! I’m (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

        • @MonkderZweite@feddit.ch
          link
          fedilink
          English
          211 year ago

          Tom’s Hardware did a test; the RX 6800 is the leader there, and the next card, the RTX 3070, is 4.3% worse. Are Nvidia’s newer cards more efficient than AMD’s newer cards?
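
          “Efficiency” in these tests usually means performance per watt. A minimal sketch of that metric with made-up inputs (swap in the average FPS and board power from an actual review):

          ```python
          # Performance-per-watt with placeholder numbers, not measured data.
          cards = {
              "RX 6800":  {"avg_fps": 100.0, "board_power_w": 230.0},
              "RTX 3070": {"avg_fps": 96.0, "board_power_w": 240.0},
          }

          best = max(c["avg_fps"] / c["board_power_w"] for c in cards.values())

          for name, c in cards.items():
              fps_per_watt = c["avg_fps"] / c["board_power_w"]
              print(f"{name}: {fps_per_watt:.3f} FPS/W "
                    f"({fps_per_watt / best * 100:.1f}% of the most efficient card)")
          ```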

          • @pycorax@lemmy.world
            link
            fedilink
            English
            51 year ago

              They seem to be, but honestly this generation hasn’t been very impressive for either team green or team red. I got a 6950 XT last year, and seeing all these new releases has only proven that I made a good investment.

            • @Daveyborn@lemmy.world
              link
              fedilink
              English
              21 year ago

              Nothing compelling enough for me to hop off of a titan Xp yet. (Bought a titan because it was cheaper than a 1070 at the time because of scalpers)

          • @Crashumbc@lemmy.world
            link
            fedilink
            English
            21 year ago

              30 series, maybe.

              On 40-series power usage, Nvidia destroys AMD.

              The 4070 uses WAY less than a 3070… it’s 200W (220W for the Super), which is barely more than my 1070’s 170W.

  • BargsimBoyz
    link
    fedilink
    English
    15
    edit-2
    1 year ago

    What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost-of-living crisis.

    I play every game I want to on high graphics with my old 1070. Unless you’re working on very graphically intensive apps or you’re a PC master race moron, there’s no need for new cards.

    • @n3m37h@sh.itjust.works
      link
      fedilink
      English
      21 year ago

      It was a night-and-day difference going from a 1060 6GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.

    • @chiliedogg@lemmy.world
      link
      fedilink
      English
      41 year ago

      I still game at 1080p and it looks fine. I’m not dropping 2,500 bucks on a 4K monitor and a video card to drive it when I won’t even register the difference during actual gameplay.

  • Binthinkin
    link
    fedilink
    301 year ago

    You all should check prices comparing dual-fan 3070s to 4070s; there’s a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

    Aren’t they taking the 4080 completely off the market too?

    • @TheGrandNagus@lemmy.world
      link
      fedilink
      151 year ago

      Aren’t they taking the 4080 completely off the market too?

      Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.

      Honestly this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke but even that managed to sell ok.

  • @CosmoNova@lemmy.world
    link
    fedilink
    English
    251 year ago

    I mean, yeah, when I’m searching for GPUs I specifically filter out anything with less than 16GB of VRAM. I wouldn’t even consider buying one for that reason alone.

    • @Thorny_Insight@lemm.ee
      link
      fedilink
      English
      51 year ago

      And here I am, thinking that upgrading from two 512MB cards to a GTX 1660 SUPER with 6GB of VRAM is going to be good for another 10 years. What the heck does someone need 16 gigs for?

      • @barsoap@lemm.ee
        link
        fedilink
        English
        11 year ago

        AI. But you’re right, my 4GB 5500 XT is putting up a valiant fight so far, though I kinda dread trying CP77 again after the big patch; it’s under spec now. It was a mistake to buy that thing in the first place. I should’ve gone with 8GB, but I just had to be pigheaded about my old “workstation rule”: don’t spend more on the GPU than on the CPU.

      • @Crashumbc@lemmy.world
        link
        fedilink
        English
        31 year ago

        Unless you’re gaming, that’s fine.

        But if you want to play any newer AAA games (even ones less than 5-8 years old) or go above 1080p, you’ll need better.

      • @pycorax@lemmy.world
        link
        fedilink
        English
        101 year ago

        Future-proofing. GPUs are expensive, and I expect to be able to use mine for at least the next 7 years; even better if it lasts longer than that.

  • @trackcharlie@lemmynsfw.com
    link
    fedilink
    English
    411 year ago

    Less than 20GB of VRAM in 2024?

    The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

    • @BorgDrone@lemmy.one
      link
      fedilink
      English
      121 year ago

      The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

    • lemmyvore
      link
      fedilink
      English
      231 year ago

      I don’t think they care. In fact, I think they’re going to exit the consumer market eventually; it’s just peanuts to them, and the only reason they’re still catering to it is to use it as field testing (and you’re paying them for the privilege, which is quite ironic).

      • @Kyrgizion@lemmy.world
        link
        fedilink
        English
        141 year ago

        This. Corporations are lining up in droves for GPUs to run AI applications. Nvidia doesn’t care about regular consumers because we aren’t even their primary market anymore, just a bonus to be squeezed.

        • @wewbull@feddit.uk
          link
          fedilink
          English
          101 year ago

          If Nvidia pivots completely out of the consumer space, which I can totally see coming, they’re making the company totally dependent on the AI hype train. That’s a fairly precarious position in my eyes. I’ve yet to see an actual application that it solves reliably enough to be more than just a curiosity.

          • @agitatedpotato@lemmy.world
            link
            fedilink
            English
            11 year ago

            Yeah, but if they pump their valuation high enough, they’ll have plenty of time to sell off shares before their decisions start to affect the rest of the people who work there.

          • @willis936@lemmy.world
            link
            fedilink
            English
            6
            edit-2
            1 year ago

            They leaned their strategy pretty hard into mining when that was on the table. They absolutely chase trends and alienate their base; any way to juice near-term profits, they’ll take it. It’s working out for them right now, so surely it will forever.

      • @genie@lemmy.world
        link
        fedilink
        English
        31 year ago

        Right? TPUs make more sense at scale (especially for LLMs & similar). The consumer market is more about hype and being a household name than it is about revenue.

  • @hark@lemmy.world
    link
    fedilink
    English
    141 year ago

    So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs, and I’ll likely keep using it for a few years beyond that out of spite. It cost $340 when I bought it in 2016, and I thought that was somewhat overpriced even then. According to an inflation calculator, that’s about $430 in today’s dollars.
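
    The inflation math itself is simple; a small sketch using the figures from the comment above (the exact CPI factor depends on which index and months you compare, so treat it as approximate):

    ```python
    # Inflation-adjust a 2016 GPU price into today's dollars.
    # ~1.26 is the factor implied by the $340 -> ~$430 figures above; published
    # CPI tables for 2016 onward land in roughly the same ballpark.
    price_2016 = 340.00
    cpi_factor = 1.26  # approximate cumulative inflation since 2016

    print(f"${price_2016:.0f} in 2016 is roughly ${price_2016 * cpi_factor:.0f} today")
    ```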

  • @LOLjoeWTF@lemmy.world
    link
    fedilink
    English
    661 year ago

    My Nvidia 1070 with 8GB of VRAM is still playing all of my games. Not everything gets Ultra, and my monitor isn’t 4K. Forever I am the “value buyer”. It’s hard to put money into something that is only marginally better, though. I thought 16GB would be a no-brainer.

    • @MeatsOfRage@lemmy.world
      link
      fedilink
      English
      711 year ago

      Exactly, people get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax, my dudes, and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

      1080p is fine, medium settings are fine. If the game is good you won’t sweat the details.

      • @Crashumbc@lemmy.world
        link
        fedilink
        English
        21 year ago

        You lost me at 1080p. It’s a basic quality-of-life thing. Even 1440p is a HUGE upgrade, even for regular computer use, never mind gaming.

        I run 4K, but I use/need it more for workspace at work than for gaming.

      • @ABCDE@lemmy.world
        link
        fedilink
        English
        281 year ago

        remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

        The frame rate was shat on at the time, and with good reason; it was unplayable for me. The best times were Halo with 4-16 players in local multiplayer.

      • @Whom@midwest.social
        link
        fedilink
        English
        2
        edit-2
        1 year ago

        I agree that this happens to an extent but Digital Foundry in particular makes a point to take into account performance of the cards most used by regular people and are one of the biggest forces in that space pushing people to not just hit “ultra” and move on as you can see with their optimized settings series and the like, as well as getting the best out of older games as in their retro series. They like games that look good and play smoothly, of course, but I don’t think it’s fair to associate them with that kind of ULTRA MAX OR DIE attitude.

        I think there’s sometimes an overcorrection from the “gameplay over graphics” crowd. I’ve been part of that group before and get it; it’s frustrating when, from your perspective, the industry is ignoring the parts of games that you care about the most. But it’s a strange thing to pick on, because at the end of the day pretty things that feel smooth to play are wonderful! That can be done on a toaster with beautiful pixel art or low-poly 3D models, but it can also be done in dramatically different ways by pushing high-end hardware to its limits. There’s room for both and I adore both. Games are art like anything else, and it’d be strange to tell people who appreciate a beautiful movie, shot on particularly nice film on location in expensive places, that they shouldn’t care just because it’s still a good movie if you watch it on an old laptop with awful web compression, or because an underground mumblecore film from 2003 is also great.

        Graphics aren’t all that matter to me, but if the primary joy someone gets from gaming is seeing ultra-detailed, perfectly rendered scenes the best way they possibly can, good for them. Personally, I like getting good visuals when I can, but my primary concern is always framerate, as particularly in first-person games even 60fps often triggers my motion sickness and forces me to stick to short sessions. Ultimately I see this whole debate as a relic of the past that only made sense when the only games the average person had access to were AAA/AA releases. Low-spec gaming is better than it has ever been, with the indie scene continuing to go strong like it has for the past 15+ years and a backlog of classics that expands every year and runs on just about anything.

        • swayevenly
          link
          fedilink
          English
          71 year ago

          Not to shill for them, but Alex makes it a point to run tests and include optimized settings for non-flagship hardware in every review he does. I’m not sure where your Digital Foundry characterizations are coming from.

          And no, 30fps is not fine…

          • ☂️-
            link
            fedilink
            English
            31 year ago

            I was referring to the OP I was responding to.

      • Ragdoll X
        link
        fedilink
        English
        171 year ago

        As someone who really doesn’t care much for game graphics I feel that a comment I wrote a few months ago also fits here:

        I’ve never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story and still has some good scares despite the 8-bit graphics.

        To me many of these games with retro aesthetics (either because they’re actually retro or the dev decided to go with a retro style) don’t really feel dated, but rather nostalgic and charming in their own special way.

        And many other people also don’t seem to care much about graphics. Minecraft and Roblox are very popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic will pop up on my recommended, and so far I’ve never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic and charming.

        Also I have a potato PC, and it can’t run these modern 8K FPS games anyway, so having these games with simpler graphics that I can actually run is nice. But maybe that’s just me.

        • Flying Squid
          link
          fedilink
          English
          11 year ago

          I kind of feel the same way about TV resolution. I have a 1080p TV and a 720p TV and I’m honestly fine with them. Sure, there’s better quality out there, but I can always go to the movies if I want that. And I have the advantages of TVs without any ‘smart’ bullshit. They can’t even connect to the internet.

          I’m not saying no one else should buy 8k TVs or whatever, if that’s what you want, fine, but there are plenty of people I’ve talked to who feel the same way as me, so I’m glad they haven’t done anything like make us all change to new TVs again like they did when they updated to HD.

    • Sibbo
      link
      fedilink
      English
      251 year ago

      You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

      • @2xar@lemmy.world
        link
        fedilink
        English
        571 year ago

        That is still overpriced, I think, although much less egregious than what Nvidia is doing. The launch MSRP of the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250. A few years prior, the HD 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.

        • @SailorMoss@sh.itjust.works
          link
          fedilink
          English
          51 year ago

          I bought a GTX 780 for $500 MSRP circa 2013. I considered that to be crazy expensive at the time, but I was going all out on that system. Currently I run a GTX 1080 Ti (bought used) with 11GB of VRAM, and they want me to spend $600 for 1 more GB of VRAM? The PS5 has 16GB of shared memory; 16GB should be the entry level of VRAM for a system that’s expected to keep up with this generation of graphics. There’s no reason for Nvidia to do this other than to force users to upgrade sooner.

          The funny part is the market is so fucked that reviewers are lauding this as a decent deal. I think the 1080 Ti will last me until OLED matures and I finally upgrade from a 1080p monitor. According to the Steam survey, most gamers are in a similar boat.

        • @reinar@distress.digital
          link
          fedilink
          English
          4
          edit-2
          1 year ago

          The launch MSRP of the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250.

          There’s much more effort involved in producing a modern GPU now. Either way, if Nvidia were truly greedy, they’d close the gaming GPU business right away and produce only AI accelerators. You could take the same 4070, add $200 worth of GDDR chips to the layout, and sell it for $15k minimum; shit would be on backorder.
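
          Taking the comment’s own numbers at face value, the implied gap per unit looks like this. The ~$600 gaming price is an assumed street price, and none of these are real bill-of-materials figures:

          ```python
          # Back-of-envelope revenue gap using the comment's hypothetical numbers.
          gaming_price = 600.0       # assumed street price of a 4070-class card
          extra_memory_cost = 200.0  # "$200 worth of GDDR chips"
          ai_card_price = 15_000.0   # "sell this for $15k minimum"

          gap = ai_card_price - (gaming_price + extra_memory_cost)
          print(f"extra revenue per card sold as an AI accelerator: ${gap:,.0f}")
          # ~$14,200 more per unit, which is why gaming can look like peanuts.
          ```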

        • @Whom@midwest.social
          link
          fedilink
          English
          51 year ago

          Yeah, I think it’s important not to lose perspective here and let expectations slide just because Nvidia are being more awful right now. Make no mistake, value went out the window a long time ago and AMD are also fucking us, just a little less hard than their main competitor. Even adjusting for inflation, what used to get you the top of the line now gets you last-gen midrange.

      • Pope-King Joe
        link
        fedilink
        English
        51 year ago

        Yeah right? I got my 6700 XT for just over $400USD. It was a great deal.

        • Spaz
          link
          fedilink
          English
          2
          edit-2
          1 year ago

          Just got a brand-new 6800 XT for $350, upgrading from a 970. Screw Nvidia.

      • ඞmir
        link
        fedilink
        English
        21 year ago

        That’s a shit deal when the 4070 is €550

    • @9488fcea02a9@sh.itjust.works
      link
      fedilink
      English
      181 year ago

      The new mining is AI… TSMC is at max capacity. They’re not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.
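
      A rough sketch of that wafer opportunity cost; every input below (dies per wafer, yield, prices) is a made-up placeholder, and it even ignores that AI dies are far larger than gaming dies:

      ```python
      # Illustrative wafer-allocation math; all inputs are hypothetical.
      dies_per_wafer = 60     # usable die sites per wafer (made-up)
      yield_rate = 0.8        # fraction of dies that are sellable (made-up)

      gaming_asp = 600.0      # assumed average selling price of a gaming GPU
      ai_asp = 30_000.0       # the comment's "$30k each" for an AI accelerator

      good_dies = dies_per_wafer * yield_rate
      print(f"one wafer as gaming GPUs:     ~${good_dies * gaming_asp:,.0f}")
      print(f"one wafer as AI accelerators: ~${good_dies * ai_asp:,.0f}")
      # ~50x more revenue per wafer with these placeholders, before margins.
      ```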