Sorry if I’m not the first to bring this up. It seems like a simple enough solution.
People did stop buying them. Their consumer GPU shipments are the lowest they’ve been in over a decade.
But consumer habits aren't the reason for the high prices. It's the exploding AI market. Nvidia makes even higher margins on the chips it allocates to machine-learning parts for data centers. As it is, they can't make enough chips to meet the demand for AI.
I’m doing my part! (with an all-AMD setup)
You know that, I know that. Most people browsing here know that. Anyone you speak to who buys GPUs would probably also agree. Still not gonna change.
I was originally eyeing a 3080 pre-COVID, then everything went to the moon with crypto and you couldn't even buy a 2080 under $1k. So I stuck with my 1080. Then some shit happened to my motherboard, I had to upgrade, and that's when I decided to switch to all AMD. It took another couple of months before I finally snagged a 6800 XT at AMD direct. It wasn't cheap ($812 CAD after tax), but it was almost $500 less than any other vendor trying to gouge you with their bundles.
Quite happy with that card and their software. BUT there were some weird crashes related to screen sleep after a certain driver version, so I've been turning the screen off manually. A newer driver from a month or two ago seems to have fixed that; I'll wait a couple more updates before re-enabling my screen sleep policy. Most modern games run quite well.
Honestly the newer 7900 cards are priced too high, so I'll either skip the entire generation or wait until the 8900 is out and pick up a good 7900 XT for cheaper. Basically, I have no intention of buying anything priced higher than $800 pre-tax.
With RAM and everything else being cheap, there's really no reason for GPUs to stay at that price point. Remember when the 3080 was priced at $699? Yep, that's where my budget number of <$800 CAD comes from.
In this particular case, it’s a bit more complicated.
I suspect the majority of 30x0 & 40x0 card sales continue to be for non-gaming or hybrid uses. I suspect that if pure gamers stopped buying them today for months, it wouldn’t make much of a difference to their bottom line.
Until there’s reasonable competition for training AI models at reasonable prices, people are going to continue buying their cards because it’s the most cost-effective thing – even at the outrageous prices.
China is still buying gimped cards, because what other choice do they have?
Exactly. The only other choices here are to buy used with a risk or wait longer to upgrade.
What other company besides AMD makes GPUs, and what other company makes GPUs that are supported by machine learning programs?
AMD supports ML; it's just that a lot of smaller projects are built with CUDA backends and don't have the developers to port them from CUDA to OpenCL or similar.
Some of the major ML libraries that used to be built around CUDA, like TensorFlow, have already produced non-CUDA builds, but that's only because TensorFlow is open source, ubiquitous in the scene, and literally has Google behind it.
ML for more niche uses is basically stuck in a chicken-and-egg situation. People won't use other GPUs for ML because no devs are working on non-CUDA backends, and no one works on non-CUDA backends because the devs end up buying Nvidia, which is exactly what Nvidia wants.
There are a bunch of followers but a lack of leaders to push things toward a more open compute environment.
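For what it's worth, here's a minimal sketch of what that portability looks like from the user's side, assuming TensorFlow since it was mentioned above (the pip package is `tensorflow` for the CUDA build and `tensorflow-rocm` for AMD). The vendor backend is chosen at install time, not in the script, so the same code runs on either:

```python
import tensorflow as tf

# The same user code runs on either the CUDA build or the ROCm build
# (tensorflow-rocm): the backend is baked into the installed wheel, so
# this just lists whatever GPUs that build can see.
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

# Place a trivial op on the first GPU if one exists, otherwise the CPU.
device = "/GPU:0" if gpus else "/CPU:0"
with tf.device(device):
    x = tf.random.normal((1024, 1024))
    y = tf.matmul(x, x)
print("Ran on:", y.device)
```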
Huh, my bad. I was operating off of old information. They've actually already released the SDK and APIs I was referring to.
I jumped to team red this build.
I have been very happy with my 7900XTX.
4K max settings / FPS on every game I’ve thrown at it.
I don't play the latest games, so I guess I could hit a wall if I play the recent AAA releases, but most of the time they simply don't interest me.
No joke, probably Intel. The cards won't hold a candle to a 4090, but they're actually pretty decent for both gaming and ML tasks. AMD definitely needs to speed up the timeline on their new ML API, though.
Problem with Intel cards is that they’re a relatively recent release, and not very popular yet. It’s going to be a while before games optimize for them.
For example, the Arc cards aren't supported for Starfield. They might run it, but not as well as they could if Starfield had optimized for them too. But the cards have only been out a year.
The more people use Arc, the quicker it becomes mainstream and gets optimised for, but Arc is still considered "beta" and slow in people's minds, even though there have been huge improvements and the old benchmarks don't hold any value anymore. Chicken-and-egg problem. :/
Disclaimer: I have an Arc A770 16GB because every other sensible upgrade path would have cost 3x-4x more for the same performance uplift (and I'm not buying an 8GB card in 2023+). But now I'm starting to get really angry at people blaming Intel for "not supporting this new game". All a GPU should have to support is the graphics API, to the letter of the specification; all this day-1 patching and driver hotfixing to make games run decently is BS. Games need to feed the API and GPUs need to process what the API tells them to, nothing more, nothing less. It's a complex issue, and I think Nvidia held the monopoly for too long; everything is optimised for Nvidia at the cost of making it worse for everyone else.
Isn't the entire point of DirectX and OpenGL that they abstract away the GPU-specific details? You write code once and it works on any graphics card that supports the standard. It sounds like games are moving back towards what we had in the old days, with specific code paths per graphics card.
I think the issue started with GPU-architecture-tailored technologies like PhysX or GameWorks, but I'm probably wrong. For example, I have nothing against PhysX itself; it just only runs natively (fast) on Nvidia cores. My issue is when there's a monetary incentive or exclusive partnership between Nvidia and game studios, so that if you want to play a game with all the features, bells, and whistles it was designed with, you also need to buy their overpriced (and, this generation, underperforming) GPUs, because you'd be missing out on features or performance on any other GPU architecture.
If this trend continues everybody will need a €1k+ gpu from nvidia and a €1k+ gpu from AMD and hot-swap between them depending on what game you wish to play.
Apple. Their own processors have both GPUs and AI accelerators. But for some reason, the industry refuses to use them.
My Intel Arc A750 works quite well at 1080p and is perfectly sufficient for me. If people need hyper refresh rates and resolutions and all the bells and whistles, then have fun paying for it. But if you need functional, competent gaming, Arc at US$200 is nice.
Exactly, Nvidia doesn't have real competition. In gaming, sure, but no one is actually competing with CUDA.
AMD has ROCm, which tries to get close. I've been able to get some CUDA applications running on a 6700 XT, although they're noticeably slower than on a comparable Nvidia card. Maybe we'll see more projects adding native ROCm support now that AMD is trying to cater to the enterprise market.
They kinda have that, yes. But it wasn't supported on Windows until this year and, in general, isn't officially supported on consumer graphics cards.
Still hoping it will improve, because AMD ships more VRAM at the same price point, but ROCm feels kinda half-assed when you look at the official support investment from AMD.
I don't own any Nvidia hardware out of principle, but ROCm is nowhere even close to CUDA as far as mindshare goes. At this point I'd rather just have a CUDA->ROCm shim I can use, the same way the DirectX->Vulkan translation works with Proton. Trying to fight for mindshare sucks, so trying to get every dev to support it just feels like a massive uphill battle.
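For anyone curious what that looks like in practice, here's a minimal sketch assuming a ROCm build of PyTorch (consumer cards like the 6700 XT mentioned above aren't on the official support list and often need the `HSA_OVERRIDE_GFX_VERSION=10.3.0` environment variable workaround). ROCm is exposed through the familiar `torch.cuda` namespace, which is why a lot of nominally CUDA-only PyTorch code runs unmodified:

```python
import torch

# On a ROCm build of PyTorch, AMD GPUs show up through the familiar
# torch.cuda namespace (HIP is mapped onto it), so most "CUDA-only"
# PyTorch code runs unmodified on Radeon cards.
if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU:", torch.cuda.get_device_name(0))  # e.g. a Radeon RX 6700 XT
else:
    device = torch.device("cpu")
    print("No GPU backend found, falling back to CPU")

x = torch.randn(2048, 2048, device=device)
y = x @ x  # the matmul runs on whichever backend was selected above
if device.type == "cuda":
    torch.cuda.synchronize()  # wait for the GPU kernel to finish
print("Result checksum:", y.sum().item())
```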
And as soon as they have any competitors we might consider it
I mean, you could also say they’ll stop price gouging when competitors can meet their quality and support level. What’s the alternative?
AMD is a lot better than before both in terms of hardware and software, far better in fact. For people that don’t buy the top of the line card every other year AMD is a real alternative, more so than in a long time.
I love my 6900XTH, killer chip. If you don't expect ray tracing, it's an absolute monster. I bought it because it was what was available on the shelf, but ultimately I feel like it was the best choice for me. I don't think I'd buy another Nvidia card for a while with the shit they've pulled, and I'd previously bought dozens of EVGA Nvidia cards.
I just wish FSR2 could be improved to reduce ghosting. It's already OK, so any improvement would make it very good.
The reason Nvidia has the R&D budget it has is because you buy their cards. AMD is only now matching them on that; they used to have about half the resources.
I mean, I buy my cards second hand because I’m dirt poor, but I still want the best option for my money. It’s a hard sell to convince people to buy inferior products to expand the market.
Except that most of Nvidia's features are borderline gimmicks or won't have a long lifespan. DLSS will eventually die and we'll all use FSR, just like what happened with G-Sync vs FreeSync. Ray tracing is still too taxing on cards from every brand to be worth it. RTX Voice (just like AMD's noise suppression) isn't that useful (I don't even want it). What else do you have that could make an Nvidia card "superior"? The only thing I can think of is that the most recent Nvidia cards are way more energy efficient than AMD's.
The only thing I can think of is that the most recent Nvidia cards are way more energy efficient than AMD's.
I think power draw is so close to even right now, that it’s not worth talking about.
edit: yep, I was wrong.
I just checked, and it's not thaaat close, but yeah, it's not a deal breaker. From what I saw, the 7800 XT consumes about 60 W more than a 4070, so Nvidia's certainly better there.
You're right, I checked some power draw benchmarks and it draws 45-50 W more in regular gaming. It's quite shocking ~
Do you read benchmarks before writing this kind of comment?
Well, when AMD finally catches up with Nvidia and offers a high-end GPU with frame generation and decent ray tracing, I'll gladly switch. I'd love nothing more than to have an all-AMD PC.
Uhh they did. And the card prices are still high, so what now, genius?
Not getting enough sales, gotta jack up the price so they make the same amount of money. Seems legit.
The problem is, the ML people will continue to buy Nvidia. They don’t have a choice.
Even a lot of CG professional users are locked into NVIDIA due to software using CUDA. It sucks.
As a console gamer, I don’t have to worry about it. Xbox Series X, PS5, Steam Deck are all AMD based. The Switch is Nvidia, but honestly, I can’t remember the last time I turned the Switch on.
Funnily enough, I bought an AMD card just about an hour before reading this post. And I've been using Nvidia since the early 00s.
For me it's the good Linux support. I'm tired of dealing with their drivers.
Will losing me as a customer make a difference to NVIDIA? Nope. Do I feel good about ditching a company that doesn’t treat me well as a consumer? Absolutely!
Have a 3060ti, was thinking of moving to Linux. Is there no support from Nvidia?
You'll almost certainly be perfectly fine. AMD cards generally work a lot more smoothly, and the open source drivers mean things can be well supported all the time, which is great.
On Nvidia, in my experience, it’s occasionally a hassle if you’re using a bleeding edge kernel (which you won’t be if you’re on a “normal” distro), where something changes and breaks the proprietary Nvidia driver… And if Nvidia drops support for your graphics card in their driver you may have issues upgrading to a new kernel because the old driver won’t work on the new kernel. But honestly, I wouldn’t let any of this get in the way of running Linux. You have a new card, you’ll probably upgrade before it’s an issue, and the proprietary driver is something we all get mad about, but it mostly works well and there’s a good chance you won’t really notice any issues.
Nvidia on Linux is better than it has ever been; even over the past couple of months there have been tremendous improvements. As long as you use X11 you'll have a pretty good gaming experience across the board, but Nvidia driver updates are often a headache. With AMD, you don't even have to think about it, unless DaVinci Resolve forces you to, but even then it's a better situation.
Anyway, comments like "you'll be 100% fine" aren't really based in reality; occasionally Nvidia will break things. However, if you have the Btrfs filesystem with Timeshift (or even that wretched Snapper) set up, then it's merely a minor inconvenience. (For example, before I moved to Arch, Ubuntu pushed an Nvidia update that broke my system; that happened in June…)
I ran my 1060 just fine for a few years. Nvidia has an official but proprietary driver that might not run well on some distros. Personally I haven't had any issues, though it would be better to stick with Xorg rather than Wayland. Wayland support on Nvidia, I've heard, isn't great, but it does work.
This. You're mostly at the mercy of their proprietary drivers. There are issues, like the lagging Wayland support mentioned above. They'll generally work, though; I don't want to dissuade you from trying out Linux.
There is an open source driver too, but it doesn’t perform well.
Depends on the distro. Otherwise you'll have to install the Nvidia drivers yourself, and if memory serves it's not as smooth a process as on Windows. If you use Pop!_OS you should be golden, as that distro does all the work for you.
Suddenly your video card is as mundane and trivial a solved problem as your keyboard or mouse.
It just works and you never have to even think about it.
To even consider that a reality as someone who’s used Linux since Ubuntu 8.10… I feel spoiled.
Those were rough days. I started with Dapper Drake, but there was no way to actually get my trackpad drivers until 8.04. Kudos for sticking with Linux.
I was hooked. It was the first time my PC felt as transparent and lie-free as notebook paper.
Like, there's nothing to hide because nothing is hidden. It's pure, truthful freedom, and that meant more to me than raw usability. I tried to do everything on Linux that I was told I couldn't do; hell, I ran Team Fortress 2 and Half-Life in Wine way pre-Proton.
and it sucked, but it was cool tho!
Don’t even get me started on linux audio support.
I recall exactly once back in the day that Ubuntu actually just played audio through a laptop I installed it on and I damn near lost my mind.
Like 30 minutes ago I installed Mint on a laptop and literally everything just worked, as if I'd installed Windows from the backup image. (I'm not sure power states are working 100%, but it's close enough and probably would be with a third-party driver.)
I used some Ubuntu derivative for recording the shitty music my buddy and I made in a trailer: OSS off a Turtle Beach sound card with a hacked-together driver, crammed into a shitty Windows Vista-era desktop.
I felt like some sort of junk wizard.
I use Arch these days, Garuda mainly. I've done the whole song and dance from Arch to Gentoo. I know the system; now I want to relax and leave the part I suck at, giving myself features, in the hands of a staff of folks who like to cater, and the Garuda boys know how to pamper.
The dragons kinda… yeah, the art’s kinda cringe but damn, this is the definition of fully featured.
I was definitely a junk wizard back in the day, as I’ve grown older and have less time and more money I just want stuff that works. I used to build entire (pretty acceptably decent) home theater systems out of $150 worth of stuff off craigslist and yard sales. When you know how it all works you can cobble together some real goofy shit that works.
It's about the exact amount of cringe I expect from a non-mainstream Linux distro. But aye, who doesn't like dragons and eagles? I'll have to try it out on this old ZenBook.
Absolutely indeed! I’ll never buy an Nvidia card because of how anti-customer they are. It started with them locking out PCI passthrough when I was building a gaming Linux machine like ten years ago.
I wonder if moving people towards the idea of just following the companies that don’t treat them with contempt is an angle that will work. I know Steph Sterling’s video on “consumer” vs “customer” helped crystallize that attitude in me.
I’m not familiar with that video but I’m intrigued. I’ll have to check it out.
I don’t know. I don’t have much faith in people to act against companies in a meaningful way. Amazon and Walmart are good examples. I feel like it’s common knowledge at this point that these companies are harmful but still they thrive.
Both brands are still doing it, so I’m still not buying.
Sigh… maybe next year.
All of Lemmy and all of Reddit could comply with this without it making a difference.
And the last card I bought was a 1060, a lot of us are already basically doing this.
You have not successfully unionized gaming hardware customers with this post.
Buddy, all of Reddit is hundreds of millions of people each month. If even a small fraction of them build their own PCs, they'd have a massive impact on Nvidia's sales.
Do you think the majority of nvidia’s customers are redditors?
Do you know what a fraction of hundreds of millions means?
Yeah, about 2-4% of total units sold in '22?
I wasn't able to find something outlining just the sales of the 4000-series cards, but the first graphic of this link at least has their total desktop GPU sales, which come out to 30.34 million in 2022. Let's put a "fraction of hundreds of millions" at 5% of 200 million, to be generous to your argument. That's 10 million. Then let's say these people upgrade their GPUs once every 3 years, which is a shorter cycle than the average person's, but the average person also isn't buying top-of-the-line GPUs. That's 3.33 million buyers a year. 3.33/30.34 is 10.9% of sales.
So even when we look at their total sales rather than just the current gen, assume 200 million Reddit users a month when it's closer to 300, and assume the people willing to shell out thousands of dollars for the best GPU aren't upgrading every single time a new generation comes out, we're still at about 11% of their sales.
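Roughly the same back-of-envelope math as a quick script, with the inputs spelled out (every number is an assumption from the comment above, not measured data):

```python
# Back-of-envelope estimate of Reddit PC builders as a share of
# Nvidia's desktop GPU shipments. All inputs are assumptions.
reddit_monthly_users = 200_000_000    # deliberately low; the real figure is closer to 300M
builder_share        = 0.05           # assume 5% of them build their own PCs
upgrade_cycle_years  = 3              # assume one GPU purchase every 3 years

buyers_per_year = reddit_monthly_users * builder_share / upgrade_cycle_years
nvidia_desktop_gpus_2022 = 30_340_000  # total desktop GPU shipments cited above

print(f"{buyers_per_year / nvidia_desktop_gpus_2022:.1%}")  # -> 11.0%
```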
One less sale is victory enough. It’s one more than before the post.