Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • @[email protected]
    41 points · 1 year ago

    Less than 20GB of VRAM in 2024?

    The entire 40 series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

    • @[email protected]
      12 points · 1 year ago

      The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

  • @[email protected]
    95 points · 1 year ago

    Nvidia is overpricing their cards and limiting stock, acting like there is still a GPU shortage from all the crypto bros sucking everything up.

    Right now, their competitors are beating them at hundreds of dollars below Nvidia’s MSRP, like for like; the only true advantages Nvidia has are in ray tracing and arguably VR.

    It’s possible we’re approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

    TL;DR Nvidia is trying to sell a card at twice its value because of greed.

    • @[email protected]
      39 points · 1 year ago

      They’re beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding, also Nvidia-only at first.

      The raw performance is mostly there for AMD with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

      • ☂️-
        5 points · 1 year ago

        Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

        I don’t think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

      • mihies
        10 points · 1 year ago

        And they beat AMD in efficiency! I’m (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

        • @[email protected]
          21 points · 1 year ago

          Tom’s Hardware did a test; the RX 6800 is the leader there. The next card, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD’s newer cards?

          • @[email protected]
            5 points · 1 year ago

            They seem to be, but honestly, this generation hasn’t been very impressive for either team green or team red. I got a 6950 XT last year, and seeing all these new releases has only proven that I made a good investment.

            • @[email protected]
              2 points · 1 year ago

              Nothing compelling enough for me to hop off of a Titan Xp yet. (Bought the Titan because, thanks to scalpers, it was cheaper than a 1070 at the time.)

          • @[email protected]
            2 points · 1 year ago

            30 series, maybe.

            For 40 series power usage, Nvidia destroys AMD.

            The 4070 uses WAY less than a 3070… It’s 200W (220W for the Super), which is barely more than my 1070’s 170W.

        • @[email protected]
          2 points · 1 year ago

          True enough. I was thinking more of the gaming use case. But even beyond AI, for general compute workloads they’re beating the pants off AMD with CUDA as well.

    • @[email protected]
      16 points · 1 year ago

      Couldn’t agree more! Abstracting to a general economic case – those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn’t quite add up @nvidia :) (rough arithmetic below)

      Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility. Especially with the entrance of things like SYCL that help programmers avoid vendor lock-in.
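
      To put rough numbers on that double-digit-vs-single-digit point, here is the arithmetic with purely hypothetical prices and performance figures (not benchmarks):

```python
# Hypothetical illustration of the price/performance argument above.
# Neither the prices nor the performance numbers are real benchmarks.

base_price, base_perf = 500.0, 100.0   # baseline card: $500, indexed to 100% performance
new_price, new_perf = 650.0, 108.0     # pricier card: +30% cost for +8% performance

price_increase = (new_price / base_price - 1) * 100   # -> 30.0 (double digit %)
perf_increase = (new_perf / base_perf - 1) * 100      # -> 8.0  (single digit %)

value_old = base_perf / base_price     # performance per dollar
value_new = new_perf / new_price
value_change = (value_new / value_old - 1) * 100      # -> about -16.9%

print(f"+{price_increase:.0f}% price for +{perf_increase:.0f}% performance "
      f"means {value_change:.1f}% performance per dollar")
```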

  • @[email protected]
    128 points · 1 year ago

    GPUs haven’t been reasonably priced since the 1000 series.

    And now there’s no coin mining promising some money back.

    • @[email protected]
      18 points · 1 year ago

      The new mining is AI… TSMC is at max capacity. They’re not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.

    • Sibbo
      25 points · 1 year ago

      You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

      • ඞmir
        2 points · 1 year ago

        That’s a shit deal when the 4070 is €550

      • Pope-King Joe
        5 points · 1 year ago

        Yeah, right? I got my 6700 XT for just over $400 USD. It was a great deal.

        • Spaz
          2 points · 1 year ago

          Just got my brand new 6800 XT for $350, upgrading from a 970. Screw Nvidia.

      • @[email protected]
        57 points · 1 year ago

        That is still overpriced, I think, although much less egregious than what Nvidia is doing. Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250. A few years prior the 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.

        • @[email protected]
          4 points · 1 year ago

          Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250.

          There’s much more effort involved in producing a modern GPU now. Either way, if Nvidia were truly greedy, they’d close the gaming GPU business right away and produce only AI accelerators. You could take the same 4070, add $200 worth of GDDR chips to the layout, and sell it for $15k minimum; it would still be on backorder.

        • @[email protected]
          5 points · 1 year ago

          Yeah, I think it’s important not to lose perspective here and let expectations slide just because Nvidia are being more awful right now. Make no mistake, value went out the window a long time ago and AMD are also fucking us, just a little less hard than their main competitor. Even adjusting for inflation, what used to get you the top of the line now gets you last-gen midrange.

        • @[email protected]
          5 points · 1 year ago

          I bought a GTX 780 for $500 MSRP circa 2013. I considered that to be crazy expensive at the time, but I was going all out on that system. Currently I run a GTX 1080 Ti (bought used) with 11GB of VRAM, and they want me to spend $600 for 1 more GB of VRAM? The PS5 has 16GB of shared memory; 16GB should be the entry level of VRAM for a system that’s expected to keep up with this generation of graphics. There’s no reason for Nvidia to do this other than to force users to upgrade sooner.

          The funny part is the market is so fucked that reviewers are lauding this as a decent deal. I think the 1080 Ti will last me until OLED matures and I finally upgrade from a 1080p monitor. According to the Steam survey, most gamers are in a similar boat.

  • @[email protected]
    139 points · 1 year ago

    Yep, it’s the RAM, but also just a mismatched value proposition.

    I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

    But when you move the x070 series out of the mid-tier price bracket ($250-450, let’s say), you’d better meet a more premium standard. Instead, they’re throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn’t help that it’s at a time when people generally just have less disposable income.

  • @[email protected]
    55 points · 1 year ago

    $600 for a card without 16GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.

    • @[email protected]
      5 points · 1 year ago

      12GB of VRAM is not a bottleneck in any current game at reasonable settings. There is no playable game/settings combination where a 7800 XT’s 16GB offers any advantage. Or do you think a 15fps average is more playable than a 5fps average (because the 4070 Super is RAM-bottlenecked)? Is this indicative of future potential bottlenecks? Maybe, but I wouldn’t be so sure.

      The 4070 Super offers significantly better ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding, and even slightly better rasterization performance than the 7800 XT. Are these things worth giving up for €100 less and 4GB more VRAM? For most people they aren’t.

      AMD’s offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don’t even use AMD GPUs. Hell, LTT even made a series of videos about how they had to “suffer” using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

      PS: I have an AMD 580 card and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series. But my next GPU will almost certainly be an Nvidia one.

      • @[email protected]
        3 points · 1 year ago

        D4 on Linux. Literally the only bottleneck is that it eats 11GB of my 1080 Ti’s VRAM for breakfast and then still wants lunch and dinner. It plays 4K on high with perfect fps otherwise, then starts glitching like crazy once VRAM is exhausted after 10-15 minutes.

        Zero issues on a 20GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.

        • @[email protected]
          1 point · 1 year ago

          It’s not enough though, and the sales are showing it. The 7800 XT is a decent card, but it isn’t an amazing offer, just a good one. For some people, it is a slightly better value-for-money option. But those Nvidia things have value too. So the value proposition isn’t as clear-cut, even though it should be, considering that AMD is behind.

          The Steam stats should tell you what consumers think. And while consumers are not infallible, they are a pretty good indicator. The most popular AMD card is the 580, which is arguably one of the best cards of all time. Except it came out 6 years ago. Did AMD have better marketing back then? No. Did they have the performance crown? Nope. But that didn’t stop the 580 from being an amazing card.

          The 7800 XT could have been the new 580: a mid/high-end card with decent VRAM. Except you could get the 580 for €200, while the 7800 XT costs literally three times as much. When your “good” card is that expensive, customers have higher expectations. It isn’t just about running games well (cheaper cards can do that too); it is about luxury features like ray tracing and upscaling tech.

          Imagine if the 7800 XT were €400. We wouldn’t even be having this conversation. But it isn’t. In fact, in Europe it launched at basically the same price as a 4070. Even today, it is €50-80 cheaper. If Nvidia is scamming us with inferior offers, why aren’t AMD’s offers infinitely better in value? Because AMD is also scamming us, just very slightly less so.

          • @[email protected]
            2 points · 1 year ago

            $100 sure feels much more solid than RTX, which a ton of games don’t even support. There are a bunch of people who just want to play in 4K and couldn’t care less about the features you call luxury.

            That requires more VRAM, and the 7800 XT and XTX deliver it perfectly.

      • @[email protected]
        1 point · 1 year ago

        Is this indicative of future potential bottlenecks? Maybe, but I wouldn’t be so sure.

        This is exactly what I expect. I have seen what happened to my friends with their GTX 970s when 3.5GB of VRAM wasn’t enough anymore. Even though the cards were still rasterizing quickly enough, they weren’t useful for certain games anymore. Therefore I now make sure I go for enough VRAM to extend the useful service life of my cards.

        And I’m not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1GB, then got my friend’s HD 5870, also with 1GB (don’t remember if I used it in CrossFire or just replaced the 5850), then two of my friends each sold me their HD 7850 with 2GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4GB when a game that was important to me at the time couldn’t deal with CrossFire well, then I bought a used RX 580 with 8GB, and finally the RX 6800 with 16GB two years ago.

        At some point I also bought a used GTX 960 because we were doing some CUDA stuff at university, but that was pretty late, when they weren’t current anymore, and it was only used in my Linux server.

  • Altima NEO
    31 points · 1 year ago

    The RAM is so lame. It really needed more.

    Performance exceeding the 3090, but limited by 12 gigs of RAM.

  • @[email protected]
    3 points · 1 year ago

    Is this the one that they nerfed so that they could sell them in China around the US AI laws?

  • @[email protected]
    14 points · 1 year ago

    So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs and I’ll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that’s $430 in today’s dollars.
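
    For anyone who wants to check that inflation figure, the compound-growth arithmetic is simple (this is a rough check, not an official CPI lookup):

```python
# Rough check of the inflation claim above: $340 in 2016 ~ $430 today.
# This is plain compound-growth arithmetic, not an official CPI calculation.

price_2016 = 340.0
price_now = 430.0
years = 8  # 2016 -> 2024

implied_annual_inflation = (price_now / price_2016) ** (1 / years) - 1
print(f"${price_2016:.0f} -> ${price_now:.0f} over {years} years "
      f"implies ~{implied_annual_inflation * 100:.1f}% average inflation per year")
# -> roughly 3% per year, which is in the right ballpark for that period.
```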

  • @[email protected]
    24 points · 1 year ago

    I haven’t paid attention to GPUs since I got my 3080 on release day back during Covid.

    Why has the acceptable level of VRAM suddenly doubled vs 4 years ago? I don’t struggle to run a single game on max settings at high frame rates @ 1440p, so what’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

    • @[email protected]
      4 points · 1 year ago

      Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

    • @[email protected]
      28 points · 1 year ago

      An actual technical answer: apparently, it’s because while the PS5 and Xbox Series X are technically regular x86-64 architecture, they have a design that allows the GPU and CPU to share a single pool of memory with no loss in performance. This makes it easy to allocate a shit load of RAM for the GPU to store textures very quickly. But it also means that as the games industry shifts from developing for the PS4/Xbox One X first (both of which have separate pools of memory for CPU and GPU) to the PS5/XSX first, VRAM requirements are spiking, because it’s a lot easier to port to PC if you just keep the assumption that the GPU can handle storing 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
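
      To see how quickly texture data adds up, here is some back-of-the-envelope math with generic texture sizes (not numbers from any particular game or engine):

```python
# Back-of-the-envelope texture memory math, illustrating why a 10-15 GB
# texture budget is easy to hit. Sizes are generic, not from any real game.

def texture_size_mib(width, height, bytes_per_texel, with_mips=True):
    """Approximate size of one texture in MiB; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    if with_mips:
        base = base * 4 / 3
    return base / (1024 ** 2)

uncompressed = texture_size_mib(4096, 4096, 4)   # RGBA8: ~85 MiB with mips
bc7 = texture_size_mib(4096, 4096, 1)            # BC7-compressed: ~21 MiB with mips

budget_gib = 10
textures_in_budget = budget_gib * 1024 / bc7

print(f"4096x4096 RGBA8 with mips: ~{uncompressed:.0f} MiB")
print(f"4096x4096 BC7 with mips:   ~{bc7:.0f} MiB")
print(f"A {budget_gib} GiB budget holds roughly {textures_in_budget:.0f} such compressed textures")
```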

    • @[email protected]
      2 points · 1 year ago

      Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

    • @[email protected]
      10 points · 1 year ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last-gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU (rough numbers sketched below).

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.
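
      For a sense of scale on the Scrooge-like point above, a sketch assuming a purely hypothetical GDDR6 price per GB (an assumed round number, not a real quote), just to show the order of magnitude:

```python
# Hypothetical bill-of-materials sketch: what 4 extra GB of VRAM might add
# to a ~£579 GPU. The per-GB price is an assumed round number, not a quote.

gpu_price_gbp = 579
assumed_gddr6_price_per_gb_usd = 3.0   # assumption for illustration only

extra_vram_gb = 16 - 12
extra_cost_usd = extra_vram_gb * assumed_gddr6_price_per_gb_usd

print(f"Going from 12GB to 16GB adds ~${extra_cost_usd:.0f} of memory "
      f"(at an assumed ${assumed_gddr6_price_per_gb_usd}/GB) on a £{gpu_price_gbp} card")
```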

    • @[email protected]
      6 points · 1 year ago

      If only game developers optimized their games…

      The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

    • @[email protected]
      24 points · 1 year ago

      Lmao

      We have your comment: “What am I doing with 20GB of VRAM?”

      And one comment down: “It’s actually criminal there is only 20GB of VRAM.”

  • BargsimBoyz
    15 points · 1 year ago

    What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost-of-living crisis.

    I play every game I want to on high graphics with my old 1070. Unless you’re working on very graphically intensive apps or you’re a PC-master-race moron, there’s no need for new cards.

    • @[email protected]
      4 points · 1 year ago

      I still game at 1080p and it looks fine. I’m not dropping 2500 bucks on a 4K monitor and a video card to run it when I won’t even register the difference during actual gameplay.

    • @[email protected]
      2 points · 1 year ago

      It was a night-and-day difference going from a 1060 6GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.

  • Shirasho
    26 points · 1 year ago

    I don’t know about everyone else, but I still play at 1080p. It looks fine to me, and I care more about frames than fidelity. More VRAM isn’t going to help me here, so it is not a factor when looking at video cards. Ignoring the fact that I just bought a 4070, I wouldn’t skip over a 4070 Super just because it has 12GB of RAM.

    This is a card that targets 1440p. It can pull its weight at 4K, but I’m not sure that is justification to slam it for not having the memory for 4K.

    • mozz
      9 points · 1 year ago

      Is it weird that until I read this, I forgot that GPUs can make graphics?

    • atocci
      2 points · 1 year ago

      My monitor is only 1440p, so it’s just what I need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch time by coincidence. I’d been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point and was on the lookout for sales on ones with a 4070. Guess I’ll be building my own instead now.

    • @[email protected]
      2 points · 1 year ago

      I think the only reason you’d really need that kind of grunt is on a 4K TV anyway, and even then you can use DLSS or whatever the other one is to upscale.
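
      For context on why upscaling takes so much load off at 4K, here is a rough sketch using the commonly cited DLSS render-scale factors (treat the exact percentages as approximate):

```python
# Approximate internal render resolutions when upscaling to 4K (3840x2160).
# Scale factors are the commonly cited DLSS mode values; treat them as rough.

target_w, target_h = 3840, 2160
modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

native_pixels = target_w * target_h
for mode, scale in modes.items():
    w, h = round(target_w * scale), round(target_h * scale)
    pct = (w * h) / native_pixels * 100
    print(f"{mode:>17}: renders ~{w}x{h} ({pct:.0f}% of the pixels), then upscales to 4K")
```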

    • Deceptichum
      6 points · 1 year ago

      I’m fine playing at 30fps; I don’t really notice much of a difference. For me, RAM is the biggest influence on a purchase, due to the capabilities it opens up for local AI stuff.

      • iAmTheTot
        18 points · 1 year ago

        If someone says they don’t notice a difference between 60 FPS and 120+ FPS, I think… okay, it is diminishing returns, 60 is pretty good. But if someone says they don’t notice a difference between 30 and 60… you need to get your eyes checked mate.

        • Deceptichum
          13 points · 1 year ago

          I notice a difference; it’s just not enough to make it a big deal for me. It’s like going from 1080p to 1440p: you can see it, but it’s not really an issue being on 1080p.

          • @[email protected]
            6 points · 1 year ago

            It depends on the game. In quick, action-packed stuff you can see the jumping, and in something like a shooter it can be a disadvantage.

            For something like Slay the Spire though, totally fine.

            • @[email protected]
              3 points · 1 year ago

              I’m at the age where if games require such quick reactions that the difference in FPS matters, I’m going to get my ass handed to me by the younguns anyway…

              • @[email protected]
                1 point · 1 year ago

                Well, maybe if you had a 240Hz monitor… ;)

                Totally fair, just worth pointing out that it can/does make a difference in those games, as it can literally mean the difference between firing where someone was rather than where they are, because of how long it takes for you to see the next frame.
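
                The frame-time arithmetic behind that point, with a made-up target speed purely for illustration:

```python
# How stale a frame can be at different frame rates, and how far a moving
# target travels in that time. Target speed is a made-up example value.

target_speed_m_per_s = 5.0   # hypothetical strafing speed, for illustration

for fps in (30, 60, 120, 144, 240):
    frame_time_ms = 1000 / fps
    travel_cm = target_speed_m_per_s * (frame_time_ms / 1000) * 100
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms per frame -> "
          f"a {target_speed_m_per_s} m/s target moves ~{travel_cm:.1f} cm between frames")
```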

    • @[email protected]
      17 points · 1 year ago

      It can pull its weight at 4K, but I’m not sure that is justification to slam it for not having the memory for 4K.

      There are many games that cut it awfully close with 12GB at 1440p; for some it’s actually not enough. And when Nvidia pushes ray tracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.

      Whatever this card costs, 12GB of VRAM is simply not appropriate.

  • Binthinkin
    30 points · 1 year ago

    You all should check prices comparing dual-fan 3070s to 4070s; there’s a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

    Aren’t they taking the 4080 completely off the market too?

    • @[email protected]
      15 points · 1 year ago

      Aren’t they taking the 4080 completely off the market too?

      Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.

      Honestly, this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke, but even that managed to sell OK.