• @[email protected]
    16 points, 7 months ago

    I’m sure these will be great options in 5 years when the dust finally settles on the scalper market and they’re about to roll out RTX 6xxx.

    • @[email protected]
      17 points, edited, 7 months ago

      Scalpers were basically non-existent in the 4xxx series. They’re not some boogeyman that always raises prices; they operate under certain market conditions, conditions which don’t currently exist in the GPU space, and there’s no particular reason to think this generation will be much different from the last.

      Maybe on the initial release, but not for long after.

      • @[email protected]
        1 point, 7 months ago

        Scalpers were basically non existent in the 4xxx series.

        Bull fucking shit. I was trying to buy a 4090 for like a year. Couldn’t find anything even approaching retail. Most were $2.3k+.

      • @[email protected]
        10 points, edited, 7 months ago

        The 4090 basically never went for MSRP until Q4 2024… and now it’s OOS everywhere.

        Nobody scalped the 4080 because its price/perf was shit. It was 75% of the price of a 4090 too… so why not just pay the extra 25% and get the best?

        The 4070 Ti (aka the base 4080) was too pricey to scalp: once you start cranking up its price, why not just pay the scalper fee for a 4090 instead?

        Things below that are not scalp worthy.

        • SaltySalamander
          2 points, 7 months ago

          The 4090 basically never went for MSRP until Q4 2024

          This had nothing to do with scalpers though. Just pure corporate greed.

          • @[email protected]
            1 point, 7 months ago

            I’m not so sure. Companies were definitely buying many up, but they typically stick to business purchasing channels like CDW/Dell/HP etc.

            Consumer boxed cards sold by retailers might have gone to some small businesses/startups and independent developers, but largely they were picked up by scalpers or gamers.

            I work in IT and have never gone to a store to buy a video card unless it was an emergency to get a system functional again. It’s vastly preferable to buy through a VAR, where warranties and support are much more robust than in consumer channels.

  • @[email protected]
    6 points, 7 months ago

    No thanks; I’m good. Still feeling the sting from buying my 4080 Super last spring. Besides, it’s doing just fine for my work and for games.

  • @[email protected]
    9 points, 7 months ago

    The Far Cry benchmark is the most telling. Looks like it’s around a 15% uplift based on that.

    • Subverb
      4 points, 7 months ago

      About two months ago I upgraded from 3090 to 4090. On my 1440p I basically couldn’t tell. I play mostly MMOs and ARPGs.

      • @[email protected]
        2 points, edited, 7 months ago

        Those genres aren’t really known for brutal performance requirements. You have to play the bleeding-edge stuff that adds prototype graphics post-processing in its ultra or optional settings.

        When you compare non-RT performance, the frame delta is tiny. When you compare RT, it’s a lot bigger. I think most RT implementations today are very flawed and largely snake oil so far, but some people are obsessed.

        I will say you can probably undervolt / underclock / power throttle that 4090 and get great frames per watt.
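        Frames per watt is just a ratio, so a modest FPS loss at a much lower power limit can still come out well ahead. A quick sketch in Python; the fps and wattage numbers below are made-up placeholders to illustrate the arithmetic, not measured 4090 results:

        ```python
        # Frames-per-watt sketch. All numbers are hypothetical placeholders,
        # not measured 4090 data.
        def frames_per_watt(fps: float, watts: float) -> float:
            return fps / watts

        stock = frames_per_watt(fps=120, watts=450)    # hypothetical stock settings
        capped = frames_per_watt(fps=110, watts=320)   # hypothetical power-limited run

        print(round(stock, 3))   # 0.267
        print(round(capped, 3))  # 0.344 -- better efficiency despite fewer frames
        ```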

  • @[email protected]
    56 points, 7 months ago

    By rendering only 25% of the frames we made DLSS4 100% faster than DLSS3. Which only renders 50% of the frames! - NVIDIA unironically

    • @[email protected]
      26 points, edited, 7 months ago

      You’re living in the past; rendering 100% of the frames is called Brute Force Rendering, and that’s for losers.

      With only 2k trump coins, our new graphics card can run Cyberpunk 2077, a game from 4 years ago, at 30 fps with RTX ON. But you see, with DLSS and all the other crap magic, we can run at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!

  • @[email protected]
    25 points, 7 months ago

    Okay losers, time for you to spend obscene amounts to do your part in funding the terrible shit company nvidia.

            • @[email protected]
              1 point, 7 months ago

              Sure, if your understanding of the economy is based off a lemonade stand selling cups for a dollar…

              • SaltySalamander
                1 point, 7 months ago

                If I buy a used card from an individual, nVidia doesn’t get a penny. How is that complicated?

                • @[email protected]
                  1 point, 7 months ago

                  Ok…they do. They get increased market share, which is measurable and valuable to shareholders, increasing stock value and increasing company liquidity.

            • @[email protected]
              6 points, 7 months ago

              You still gotta use their shitty NVIDIA experience app (bloated with ads that make NVIDIA money when you open it), and you buying a used NVIDIA card increases demand (and thus prices) on all NVIDIA cards.

              If you are a gamer and not doing AI stuff then buying a non-NVIDIA card is entirely an option.

  • The Hobbyist
    72 points, edited, 7 months ago

    The performance-improvement claims are a bit shady: they compare the old FG technique, which creates only one frame for every legit frame, with the next-gen FG, which can generate up to 3.

    All the Nvidia performance plots I’ve seen mention this at the bottom; it supposedly makes the comparison very favorable to the 5000-series GPUs.
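    The inflation is pure arithmetic: if both cards render at the same raw rate, 4x multi-frame generation still shows double the FPS of 2x frame generation. A small sketch; the 30 fps base render rate is an arbitrary example:

    ```python
    # How the frame-generation multiplier alone changes "FPS", independent
    # of any real GPU speedup. The base render rate is an arbitrary example.
    rendered_fps = 30

    old_fg_fps = rendered_fps * 2   # old FG: 1 generated frame per rendered frame
    new_fg_fps = rendered_fps * 4   # new FG: up to 3 generated frames per rendered frame

    print(old_fg_fps)  # 60
    print(new_fg_fps)  # 120 -- a "2x" bar on the chart with zero extra rendering
    ```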

    Edit:

    • @[email protected]
      29 points, 7 months ago

      Thanks for the heads up.

      I really don’t like that new frame-interpolation tech; I think it’s useful almost only to marketers, not for actual gaming.

      At least I wouldn’t touch it with any competitive game.

      Hopefully we’ll get third-party benchmarks soon, without the bullshit perf numbers from Nvidia.

      • warm
        15 points, 7 months ago

        Yes, fuck all this frame generation and upscaling bs.

        • Lucy :3
          1 point, 7 months ago

          I wouldn’t say fuck upscaling entirely; especially for 4K it can be useful on older cards. FSR made it possible to play Hitman on my 1070. But yeah, if I’m going for 4K I probably want very good graphics too, e.g. in RDR2, and I don’t want any upscaling there. I’m so used to native 4K that I immediately spot if it’s anything else, even in Minecraft.

          And frame generation is only useful in non-competitive games where you already have over 60 FPS; otherwise it will still feel extremely sluggish, in which case it’s not really useful anymore.

          • warm
            2 points, 7 months ago

            The point is, hardware is powerful enough for native 4K, but instead of that power being used properly, games are made quickly and then upscaling technology is slapped on at the end. DLSS has become a crutch and Nvidia are happy to keep pushing it and keeping a reason for you to buy their GPUs every generation, because otherwise we are at diminishing returns already.

            It’s useful for use on older hardware, yes, I have no issue with that, I have issue with it being used on hardware that could otherwise easily run 4K 120FPS+ with standard rasterization and being marketed as a ‘must’.

        • @[email protected]
          13 points, 7 months ago

          From personal experience, I’d say the end result for framegen is hit or miss. In some cases, you get a much smoother framerate without any noticeable downsides, and in others, your frame times are all over the place and it makes the game look choppy. For example, I couldn’t play CP2077 with framegen at all. I had more frames, but in reality it felt like I actually had fewer. With Ark Survival Ascended, I’m not seeing any downside and it basically doubled my framerate.

          Upscaling, I’m generally sold on. If you try to upscale from 1080p to 4K, it’s usually pretty obvious, but you can render at 80% of the resolution and upscale the last 20% and get a pretty big framerate bump while getting better visuals than rendering at 100% with reduced settings.
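          The 80% figure is a per-axis render scale, so the pixel-count saving is larger than it sounds. A quick check in Python (a 4K target is assumed here):

          ```python
          # Pixel count at an 80% render scale: the scale applies to each axis,
          # so the shaded pixel count drops to 0.8^2 = 64% of native.
          native_w, native_h = 3840, 2160   # 4K target (assumed)
          scale = 0.8

          render_w, render_h = int(native_w * scale), int(native_h * scale)
          pixel_ratio = (render_w * render_h) / (native_w * native_h)

          print(render_w, render_h)     # 3072 1728
          print(round(pixel_ratio, 2))  # 0.64 -> about 36% fewer pixels to shade
          ```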

          That said, I would rather have better actual performance than just perceived performance.

        • Dark Arc
          6 points, 7 months ago

          Eh, I’m pretty happy with the upscaling. I did several tests, and for me personally upscaling won out as a happy middle ground: rendering Hunt: Showdown at 2K and upscaling to 4K beat both native 2K (great FPS, no upscaling) and native 4K (no upscaling, but bad FPS).

          • warm
            9 points, 7 months ago

            Legitimate upscaling is fine, but this DLSS/FSR ML upscaling is dogshit and introduces so many artifacts. It has become a crutch for developers, so they don’t build their games properly anymore. Hardware is so strong, and yet games perform worse than they did 10 years ago.

            • Dark Arc
              2 points, edited, 7 months ago

              I mean, this is FSR upscaling that I’m referring to. I did several comparisons and determined that upscaling from 2K -> 4K using FSR looked significantly better than running at 2K.

              Hunt has other ghosting issues but they’re related to CryEngine’s fake ray tracing technology (unrelated to the Nvidia/AMD ray tracing) and they happen without any upscaling applied.

      • @[email protected]
        4 points, edited, 7 months ago

        On the site with the performance graphs, Far Cry and Plague Tale should be more representative if you want to ignore FG. That’s still only two games, with first-party benchmarks, so wait for third-party numbers anyway.

  • @[email protected]
    13 points, 7 months ago

    My last new graphics card was a 1080. I’ve bought second-hand since then and will keep doing that, because these prices are…

      • SaltySalamander
        1 point, 7 months ago

        Still using a 1080 Ti and I definitely have lots of reasons to upgrade, but I’m not willing to spread the cheeks that wide.

  • @[email protected]
    44 points, 7 months ago

    LOL, their demo shows Cyberpunk running at a mere 27fps on the 5090 with DLSS off. Is that supposed to sell me on this product?

    • @[email protected]
      5 points, 7 months ago

      Their whole gaming business model now is encouraging devs to stick in features that have no hope of rendering quickly, in order to sell this new frame-generation rubbish.

          • @[email protected]
            2 points, 7 months ago

            Whether it’s 50% or 200% it’s pointless if the avg FPS can’t even reach the bare minimum of 30.

          • kat
            4 points, edited, 7 months ago

            Not every game has frame gen… and not everybody wants to introduce input lag. So 50% is 100% sketchy marketing. You can keep your 7 frames; Imma wait for the 6090.

            • @[email protected]
              1 point, 7 months ago

              Both figures are without DLSS, FG, whatever. Just native 4k with Path Tracing enabled, that’s why it’s so low.

              The sketchy marketing is comparing a 5070 with a 4090, but that’s not what this is about.

              • kat
                3 points, 7 months ago

                Like I said, I base performance without frame gen. 5090 is not twice as powerful as a 4090, which they advertise, without frame gen.

    • VindictiveJudge
      2 points, 7 months ago

      Unfortunately, that’s the anti-scalper countermeasure. Crippling their crypto-mining potential didn’t impact scalping very much, so they increased the price with the RTX 40 series. The RTX 40s were much easier to find than the RTX 30s were, so here we are for the RTX 50s. They’re already at the edge of what people will pay, so they’re less attractive to scalpers. We’ll probably see an initial wave of scalped 5090s for $3500-$4000, then it will drop off after a few months and the market will mostly have un-scalped ones with fancy coolers for $2200-$2500 from Zotac, MSI, Gigabyte, etc.

      • @[email protected]
        4 points, 7 months ago

        The switch from proof of work to proof of stake in ETH right before the 40 series launch was the primary driver of the increased availability.

      • @[email protected]
        3 points, 7 months ago

        Not really a countermeasure, but the scalping certainly proved that there are a lot of people willing to buy their stuff at high prices.

      • @[email protected]
        3 points, 7 months ago

        The existence of scalpers means demand exceeds supply. Pricing them this high is a countermeasure against scalpers… in that Nvidia wants to make the money that scalpers would have made.

      • SaltySalamander
        1 point, 7 months ago

        No, it’s a direct result of observing the market during those periods and seeing the lemmings beating down doors to pay 600-1000 dollars over MSRP. They realized the market is stupid and will bear the extra cost.

    • @[email protected]
      9 points, 7 months ago

      Nvidia is just doing what every monopoly does, and AMD is playing into it like they did against Intel on CPUs. They’ll keep competing on price/performance for a few years, then drop something that puts them back on top (or at least near it).

  • moonlight
    9 points, 7 months ago

    I think it’s going to be a long time before I upgrade my graphics card with these prices.

  • kingthrillgore
    14 points, 7 months ago

    Two problems, and they are big ones:

    1. The hardware is expensive for a marginal improvement
    2. The games coming out that best leverage features like ray tracing are also expensive, and not good
    • @[email protected]
      4 points, 7 months ago

      Nvidia claims the 5070 will give 4090 performance. That would be a huge generational uplift if true. Of course, we’ll have to wait for independent benchmarks to confirm it.

      The best ray tracing games I’ve seen are applying it to older games, like Quake II or Minecraft.

      • lazynooblet
        10 points, 7 months ago

        I expect they can claim that because, under the hood, DLSS4 gives it more performance when enabled.

        But is that a fair comparison?

        • @[email protected]
          1 point, edited, 7 months ago

          They’ve already said it’s all because of DLSS 4. The 5070 needs the new 4x FG to match the 4090, although I don’t know whether the 4090 figure has the “old” 2x FG enabled; probably not.