• Avid Amoeba
    113 points · 1 year ago

    For newer GPUs from the Turing, Ampere, Ada Lovelace, or Hopper architectures, NVIDIA recommends switching to the open-source GPU kernel modules.

    So 20-series onwards.
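
    If you want to check where a given card lands, here's a rough sketch (my own, not NVIDIA tooling; it assumes `modinfo` is available and an `nvidia-smi` new enough to expose the `compute_cap` query field):

    ```python
    # Rough sketch, not official tooling: check whether the loaded nvidia
    # kernel module is the open variant, and whether the GPU is Turing or
    # newer (compute capability >= 7.5).
    import subprocess

    def nvidia_module_license() -> str:
        # The open kernel modules are dual MIT/GPL licensed; the
        # proprietary module reports "NVIDIA" in its modinfo output.
        out = subprocess.run(["modinfo", "nvidia"], capture_output=True,
                             text=True, check=True).stdout
        for line in out.splitlines():
            if line.startswith("license:"):
                return line.split(":", 1)[1].strip()
        return "unknown"

    def compute_capability() -> float:
        # "compute_cap" needs a reasonably recent driver's nvidia-smi.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv,noheader"],
            capture_output=True, text=True, check=True).stdout
        return float(out.strip().splitlines()[0])  # first GPU only

    if __name__ == "__main__":
        lic, cap = nvidia_module_license(), compute_capability()
        print(f"module license: {lic}, compute capability: {cap}")
        if cap < 7.5:
            print("Pre-Turing GPU: stay on the proprietary module.")
        elif "GPL" in lic:
            print("Turing or newer, already on the open kernel modules.")
        else:
            print("Turing or newer: the open kernel modules are recommended.")
    ```

    (The open modules only support Turing and newer in the first place, which is why the cutoff sits where it does.)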

      • Irremarkable
        32 points · edited · 1 year ago

        Maybe it’s just because I’m older and more jaded, but that really feels like the last truly good era for GPUs.

        Those 10 series cards had a ton of staying power, and the 480/580 were such damn good value cards.

          • Norah (pup/it/she)
            4 points · 1 year ago

            I bought a secondhand 1080 a couple of years ago, when the crypto bubble finally burst, and it’s still serving my needs just fine. It handled Baldur’s Gate 3 on release last year, which was the last “new” game I played on it. Seems like it’ll still be good for a few years to come, so yeah.

        • @[email protected]
          33 points · 1 year ago

          It’s more that back then was a better time for price-to-performance. The 3000 and 4000 series cards were basically linear upgrades: the performance went up, but the price went up right along with it.

          It’s an indicator that there haven’t been major innovations in the GPU space, besides perhaps the addition of the AI and ray-tracing stuff, if you want to count those as upgrades.

          • Chaotic Entropy
            13 points · 1 year ago

            It feels like the crypto mining gold rush really changed the way GPU manufacturers view the market.

            • @[email protected]
              7 points · 1 year ago

              I feel like AI has changed the game. Why sell retail when people are paying you billions to run LLMs in the cloud?

        • @[email protected]
          8 points · 1 year ago

          That was mostly because the 20 series was so bad: expensive, not light-years better in performance to justify the price, and ray tracing wasn’t used in any games (until recently).

          The 30 series was supposed to be more of a return to form, then COVID + mining ruined things.

          • @[email protected]
            2 points · 1 year ago

            I got a 2060 Super and I must say I’m very happy. I do 3D stuff, so the ray tracing was plenty useful, and despite it getting a bit old it fares pretty well in most games. The price was okay at the time (€500, still a bit high since it was during the bitcoin mining madness =-=").

      • Avid Amoeba
        3 points · 1 year ago

        I think it works, but the performance might not be ideal. Stay on the proprietary module.

        • Norah (pup/it/she)
          3 points · 1 year ago

          (and probably isn’t allowed to)

          I doubt very much it’s about whether they are allowed to or not. They’re the ones at the top of the hardware supply chain, designing their own chips and having them fabricated. It’s them telling other companies, like Gigabyte and EVGA, what they are or aren’t allowed to do.

            • @[email protected]
              9 points · 1 year ago

              HDMI 2.1 and the HDMI consortium prevented them from releasing code. It wasn’t even proprietary, just based on a licensed implementation, from what I understood.

      • Zoot
        3 points · 1 year ago

        Yep! The pre-built 1660 Super I got years ago is still chugging along amazingly as a streaming device for my Steam Deck.