Sorry if I’m not the first to bring this up. It seems like a simple enough solution.

    • @[email protected]
      link
      fedilink
      1
      2 years ago

      Not getting enough sales, gotta jack up the price so they make the same amount of money. Seems legit.

  • 520
    link
    fedilink
    26
    2 years ago

    The problem is, the ML people will continue to buy Nvidia. They don’t have a choice.

    • @[email protected]
      link
      fedilink
      18
      2 years ago

      Even a lot of CG professional users are locked into NVIDIA due to software using CUDA. It sucks.

  • @[email protected]
    link
    fedilink
    9
    2 years ago

    Similarly, if Nvidia wanted me to buy their cards, they’d get their drivers sorted out. Plugging in an AMD card just works with literally no setup from me, so Nvidia will get no money from me.

    • @[email protected]
      link
      fedilink
      English
      0
      2 years ago

      Why would datacenters be buying consumer grade cards? Nvidia has the A series cards for enterprise that are basically identical to consumer ones but with features useful for enterprise unlocked.

      • @[email protected]
        link
        fedilink
        2
        2 years ago

        I think you mean their Tesla line of cards? The A (e.g. A100) stands for the generation name (e.g. Ada or Ampere, don’t remember which one got the A), and that same name applies to both the consumer lines (GeForce and Quadro) and the data centre cards.

        The hardware isn’t identical either. I don’t know all the differences, but I know at least that the data centre cards have SXM connectors that greatly increase data throughput.

    • @[email protected]
      link
      fedilink
      English
      10
      2 years ago

      There are also games that don’t render a square mile of a city in photorealistic quality.

      • Valdair
        link
        fedilink
        5
        2 years ago

        Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes affecting how difficult games should be to run in that time are that 1440p and 2160p have become more common, and raytracing. But consoles being content to run at dynamic resolutions and 30fps, combined with tools developed to make raytracing palatable (DLSS), have made developers complacent about their games running like absolute garbage even on mid-spec hardware that should have no trouble with 1080p/60fps.

        Destiny 2 was famously well optimized at launch. I was running an easy 1440p/120fps in pretty much all scenarios maxed out on a 1080 Ti. The more new zones come out, the worse performance seems to be in each, even though I now have a 3090.

        I am loving BG3, but the entire city in act 3 can barely hit 40fps on a 3090, and it is not an especially gorgeous-looking game. The best I can say is that, maxed out, the character and armor models do look quite nice. But a lot of the environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this with a Titan-class card.

        Nvidia and AMD just keep cranking the power on the cards; they’re now 3+ slot behemoths to deal with all the heat, which also means cranking the price. They also seem to think 30fps is acceptable, which it just… is not. Especially not in first-person games.

        • @[email protected]
          link
          fedilink
          English
          2
          2 years ago

          W/r/t Baldur’s Gate 3, I don’t think the bottleneck is the GPU. Act 3 is incredibly ambitious in terms of NPC density, and AI is one of those things that’s still very hard to parallelize.

        • @[email protected]
          link
          fedilink
          1
          edit-2
          2 years ago

          Graphical fidelity has not materially improved since the days of Crysis 1

          I think you may have rose-tinted glasses on this point. The level of detail in environments and the accuracy of shading, especially of dynamic objects, have increased greatly. Material shading has also gotten insanely good compared to what we had then. Just peep the PBR materials on guns in modern FPS games; it’s incredible. Crysis just had normal and specular maps: all black or grey guns that are kinda shiny and normal-mapped. If you went inside a small building or whatever, there was hardly any shading or shadows to make it look right either.

          Crysis was a very clever use of what was available to make it look good, but we can do a hell of a lot better now (without raytracing). At the time, shaders were getting really computationally cheap to implement, so those still look relatively good, but geometry and framebuffer size just did not keep pace at all. Tessellation was the next hotness after that, because it was supposed to compensate for the limited geometry horsepower of contemporary cards by using their extremely powerful shader cores to do some of the heavy lifting. Just look at the rocks in Crysis compared to the foliage and it’s really obvious this was the case. Bad Company 2 is another good example of good shaders with really crushingly limited geometry, though there are clever workarounds there that still make it look pretty good.

          I could see the argument that the juice isn’t worth the squeeze to you, but graphics have very noticeably advanced in that time.

      • metaStatic
        link
        fedilink
        1
        2 years ago

        I’m currently part of the problem, and this is so fucking true. Games have really stopped pushing the envelope because they either have to be cross-platform compatible or they’re not even PC-first.

        3DMark is the only thing I could find to put a dent in my 3060 Ti.

  • @[email protected]
    link
    fedilink
    26
    2 years ago

    Funnily enough, about an hour before reading this post I bought an AMD card, and I’ve been using NVIDIA since the early ’00s.

    For me it’s the good Linux support. I’m tired of dealing with Nvidia’s drivers.

    Will losing me as a customer make a difference to NVIDIA? Nope. Do I feel good about ditching a company that doesn’t treat me well as a consumer? Absolutely!

    • @[email protected]
      link
      fedilink
      5
      2 years ago

      Absolutely indeed! I’ll never buy an Nvidia card because of how anti-customer they are. It started with them locking out PCI passthrough when I was building a gaming Linux machine like ten years ago.

      I wonder if moving people towards the idea of just following the companies that don’t treat them with contempt is an angle that will work. I know Steph Sterling’s video on “consumer” vs “customer” helped crystallize that attitude in me.

      • @[email protected]
        link
        fedilink
        2
        2 years ago

        I’m not familiar with that video but I’m intrigued. I’ll have to check it out.

        I don’t know. I don’t have much faith in people to act against companies in a meaningful way. Amazon and Walmart are good examples. I feel like it’s common knowledge at this point that these companies are harmful but still they thrive.

    • BlinkerFluidOP
      link
      fedilink
      English
      6
      edit-2
      2 years ago

      Suddenly your video card is as mundane and trivial a solved problem as your keyboard or mouse.

      It just works and you never have to even think about it.

      To even consider that a reality as someone who’s used Linux since Ubuntu 8.10… I feel spoiled.

      • Lifted_lowered
        link
        fedilink
        2
        edit-2
        2 years ago

        Those were rough days. I started with Dapper Drake, but there was no way to actually get my trackpad drivers until 8.04. Kudos for sticking with Linux.

        • BlinkerFluidOP
          link
          fedilink
          English
          2
          2 years ago

          I was hooked. It was the first time my PC felt as transparent and lie-free as notebook paper.

          Like, there’s nothing to hide because nothing is hidden. It’s pure, truthful freedom, and that meant more to me than raw usability. I tried to do everything on Linux that I was told I couldn’t do; hell, I ran Team Fortress 2 and Half-Life in Wine way pre-Proton.

          and it sucked, but it was cool tho!

      • @[email protected]
        link
        fedilink
        2
        edit-2
        2 years ago

        Don’t even get me started on linux audio support.

        I recall exactly once back in the day that Ubuntu actually just played audio through a laptop I installed it on and I damn near lost my mind.

        Like 30 minutes ago I installed Mint on a laptop, and literally everything just worked as if I had installed Windows from the backup image. (I’m not sure power states are working 100%, but it’s close enough and probably would be with a third-party driver.)

        • BlinkerFluidOP
          link
          fedilink
          English
          1
          edit-2
          2 years ago

          I used some Ubuntu derivative for recording shitty music my buddy and I made in a trailer: OSS off a Turtle Beach sound card with a hacked-together driver, crammed into a shitty Windows Vista-era desktop.

          I felt like some sort of junk wizard.

          I use Arch these days, Garuda mainly. I’ve done the whole song and dance from Arch to Gentoo. I know the system; now I want to relax and leave the part I suck at, giving myself features, to a catering staff, and the Garuda folks know how to pamper.

          The dragons kinda… yeah, the art’s kinda cringe but damn, this is the definition of fully featured.

          • @[email protected]
            link
            fedilink
            1
            edit-2
            2 years ago

            I was definitely a junk wizard back in the day, as I’ve grown older and have less time and more money I just want stuff that works. I used to build entire (pretty acceptably decent) home theater systems out of $150 worth of stuff off craigslist and yard sales. When you know how it all works you can cobble together some real goofy shit that works.

            It’s about the exact amount of cringe I expect from a non-mainstream Linux distro. But aye, who doesn’t like dragons and eagles? I’ll have to try it out on this old Zenbook.

      • @[email protected]
        link
        fedilink
        5
        2 years ago

        You’ll almost certainly be perfectly fine. AMD cards generally work a lot more smoothly, and the open source drivers mean things can be well supported all the time, which is great.

        On Nvidia, in my experience, it’s occasionally a hassle if you’re using a bleeding-edge kernel (which you won’t be on a “normal” distro), where something changes and breaks the proprietary Nvidia driver. And if Nvidia drops support for your graphics card, you may have issues upgrading to a new kernel because the old driver won’t work on it. But honestly, I wouldn’t let any of this get in the way of running Linux. You have a new card, you’ll probably upgrade before it’s an issue, and while the proprietary driver is something we all get mad about, it mostly works well and there’s a good chance you won’t really notice any issues.

      • @[email protected]
        link
        fedilink
        2
        edit-2
        2 years ago

        Nvidia on Linux is better than ever before, even over the past couple of months there were tremendous improvements. As long as you use X11 you will have a pretty good gaming experience across the board, but Nvidia driver updates are often a headache. With AMD, you don’t even have to think about it, unless Davinci Resolve forces you to, but even then it’s a better situation.

        Anyway, comments like “you’ll be 100% fine” are not really based in reality; occasionally Nvidia will break things. However, if you have the BTRFS filesystem with Timeshift (or even that wretched Snapper) set up, then this is merely a minor inconvenience. (For example, before I moved to Arch, Ubuntu pushed an Nvidia update that broke my system; that happened in June…)

      • @[email protected]
        link
        fedilink
        4
        2 years ago

        I ran my 1060 just fine for a few years. Nvidia has an official but proprietary driver that might not run well on some distros. Personally I haven’t had any issues, though it would be better to stick with Xorg and not Wayland. Wayland support on Nvidia isn’t great from what I’ve heard, but it does work.

        • @[email protected]
          link
          fedilink
          4
          2 years ago

          This. You’re mostly at the mercy of their proprietary drivers. There are issues, like the lagging Wayland support mentioned above. They will generally work, though; I don’t want to dissuade you from trying out Linux.

          There is an open source driver too (nouveau), but it doesn’t perform well.

      • @[email protected]
        link
        fedilink
        2
        2 years ago

        Depends on the distro. Otherwise you’ll have to install the Nvidia drivers yourself, and if memory serves it’s not as smooth a process as on Windows. If you use Pop!_OS you should be golden, as that distro does all the work for you.

  • Big P
    link
    fedilink
    English
    9
    2 years ago

    You know that, I know that. Most people browsing here know that. Anyone you speak to who buys GPUs would probably also agree. Still not gonna change.

  • @[email protected]
    link
    fedilink
    English
    2
    2 years ago

    I was originally eyeballing a 3080 pre-COVID; then everything went to the moon with crypto and you couldn’t even buy a 2080 under 1k. So I stuck with my 1080. Then some shit happened to my motherboard, so I had to upgrade, and that’s when I decided to switch to all AMD. It took me another couple of months before I finally snatched a 6800 XT at AMD direct. It was not cheap ($812 CAD after tax), but it was almost 500 cheaper than the bundles every other vendor tries to gouge you with.

    Quite happy with that and their software. BUT, there were some weird crashes related to screen sleep after a certain driver version, so I’ve been turning the screen off manually. A newer driver from about a month or two ago seems to fix that; I’ll wait a couple more updates before reverting my screen sleep policy. Most modern games run quite well.

    Honestly, the newer 7900 cards are priced too high, so I might either skip the entire generation or wait until the 8900 is out and then get a good 7900 XT for a cheaper price. Basically, I have no intention of buying anything priced higher than 800 pre-tax.

    With cheap RAM and everything else, there is really no reason for GPUs to stay at that price point. Remember the 3080 was priced at $699? Yep, that’s where I got my budget number of <800 CAD.

  • @[email protected]
    link
    fedilink
    6
    edit-2
    2 years ago

    Well, when AMD finally catches up with Nvidia and offers a high-end GPU with frame generation and decent ray tracing, I’ll gladly switch. I’d love nothing more than to have an all-AMD PC.

  • @[email protected]
    link
    fedilink
    9
    2 years ago

    I went with the best AMD card I could get at the time, a 7900 XTX Sapphire Nitro. For gaming it’s already really good: I can use raytracing, although not at the best settings in some games, but in most cases I can just max out the settings.

    My main complaint at the moment is that self-hosting AI is much more annoying than it would be on Nvidia. I usually get everything to work eventually, but it always needs that extra level of technical fuckery.

  • @[email protected]
    link
    fedilink
    37
    2 years ago

    All of Lemmy and all of Reddit could comply with this without it making a difference.

    And the last card I bought was a 1060, a lot of us are already basically doing this.

    You have not successfully unionized gaming hardware customers with this post.

    • BlinkerFluidOP
      link
      fedilink
      English
      7
      edit-2
      2 years ago

      One less sale is victory enough. It’s one more than before the post.

    • @[email protected]
      link
      fedilink
      2
      2 years ago

      Buddy, all of Reddit is hundreds of millions of people each month. If even a small fraction of them build their own PCs, they’d have a massive impact on Nvidia’s sales.

            • @[email protected]
              link
              fedilink
              22 years ago

              I wasn’t able to find something outlining just the sales of the 4000 series cards, but the first graphic of this link at least has their total desktop GPU sales, which comes out to 30.34 million in 2022. Let’s put a fraction of hundreds of millions at 5% of 200 million to be generous to your argument. That’s 10 million. Then let’s say these people upgrade their GPUs once every 3 years, which is shorter than the average person, but the average person also isn’t buying top-of-the-line GPUs. So 3.33 million. 3.33/30.34 is 10.9% of sales.

              So even when we’re looking at their total sales and not specifically just the current gen, assume 200 million reddit users a month when it’s closer to 300, and assume the people willing to shell out thousands of dollars for the best GPU aren’t doing so every single time a new generation comes out, we’re still at 11% of their sales.
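
        For anyone who wants to poke at those assumptions, here is the same back-of-the-envelope math as a small Python sketch (every input is one of the rough figures above, not measured data):

            # Rough estimate: what share of annual desktop GPU sales could
            # Reddit's PC builders represent? All inputs are assumptions.
            monthly_users = 200_000_000    # assumed Reddit monthly users (likely higher)
            builder_share = 0.05           # assumed fraction who build their own PCs
            upgrade_cycle_years = 3        # assumed GPU upgrade cadence
            annual_gpu_sales = 30_340_000  # 2022 desktop GPU sales figure cited above

            buyers_per_year = monthly_users * builder_share / upgrade_cycle_years
            share = buyers_per_year / annual_gpu_sales
            print(f"{buyers_per_year / 1e6:.2f}M buyers/year = {share:.1%} of sales")
            # prints: 3.33M buyers/year = 11.0% of sales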

  • Excel
    link
    fedilink
    English
    4
    2 years ago

    And as soon as they have any competitors, we might consider it.

  • Jordan Lund
    link
    fedilink
    English
    1
    2 years ago

    As a console gamer, I don’t have to worry about it. The Xbox Series X, PS5, and Steam Deck are all AMD-based. The Switch is Nvidia, but honestly, I can’t remember the last time I turned the Switch on.