AMD has been a shitshow of a company since their beginning. Don’t believe they wouldn’t be gouging if they could.
I wish AMD offered solid hardware ray tracing… Nvidia has a near-total monopoly on GPU rendering workstations, because there’s simply no competition.
There is no doubt that AMD is a better company than NVIDIA in OSS terms.
But don’t simp for a company; vote with your wallet and always look for the best, most consumer-friendly product.
For now, not gonna lie, AMD is pretty rad, but I hope next-generation Intel GPUs are competitive.
I think AMD is a great competitor and we need more competition to keep the pressure on NVIDIA and AMD alike, BUT HOLY FUCK. I can’t stand AMD’s software/control panel vs NVIDIA’s.
I just switched from Nvidia to AMD and I have the exact opposite feeling lol
Aye. The Nvidia control center was cool when I installed it for my Ti 4600 in 2002 and not much has changed. I’m not particularly fond of the aesthetics of the Radeon software, but it beats the heck out of the semi-useless GeForce Experience. I have to make an account just to see if there’s a driver update available? I can’t even control fan speeds in Windows without third-party software?
They’re both bad but in comparison Nvidia’s offering is garbage.
I am so fucking sick of having to make an account for everything, I swear to God
You don’t need an account for drivers. You can still get those for free off their website just like you could in 1999. You only need an account for their experience app.
I thought the current gen Intel ones are actually pretty decent. Solid budget choice for modern games.
If it can run them… I sold mine because they never actually fixed the drivers. Out of hundreds of games on my PC, it was able to run 3-4. This isn’t from before their updates either; this happened 2 weeks ago. It can’t run DaVinci Resolve despite having good encoders, and it couldn’t even fucking run Valorant. Also, they’re only good in benchmarks; I found that my old 3050 was outperforming it in terms of FPS.
I can’t encode my video with an AMD GPU; that’s why I stay with Nvidia and its NVENC. When AMD offers this kind of use, maybe I’ll change my GPU.
Why can’t you? The encoder has been at parity for years.
Not OC, but per my last experience with it, NVENC was way easier to work with.
You install the NVIDIA drivers, you install CUDA libs (in Fedora that’s separate, at least) and it works.
For AMD, you need to figure out that you need the proprietary driver for AMF (which didn’t have a proper installer for anything that wasn’t Ubuntu the last time I tried it) or be stuck with VAAPI, which unfortunately isn’t as good. After that you usually had to hunt for guides on how to use the encoder in the program you want (OBS used to be a particular nightmare for it; hopefully it’s gotten better with time).
I hope things got and continue to get better, especially since I’m 100% going to get an AMD setup after my laptop eventually dies.
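For anyone trying to sort out which of those encoders their setup actually exposes, here is a minimal probe, assuming ffmpeg is on PATH; the encoder names are ffmpeg’s standard ones, but whether each shows up depends on how your distro built ffmpeg and which drivers are installed:

```python
# Check which hardware H.264 encoders the local ffmpeg build exposes.
# Assumes ffmpeg is on PATH; whether each encoder is listed depends on
# your ffmpeg build options and installed drivers.
import subprocess

HW_ENCODERS = {
    "h264_nvenc": "NVIDIA NVENC",
    "h264_amf": "AMD AMF (proprietary driver stack)",
    "h264_vaapi": "VAAPI (Mesa / open drivers)",
}

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

for name, desc in HW_ENCODERS.items():
    status = "available" if name in out else "missing"
    print(f"{name:12s} ({desc}): {status}")
```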
I think DaVinci Resolve for AMD had a fix by Nobara.
GPU prices being affordable is definitely not a priority of AMD’s. They price everything to be barely competitive with the Nvidia equivalent. 10-15% cheaper for comparable raster performance but far worse RT performance and no DLSS.
Which is odd because back when AMD was in a similar performance deficit on the CPU front (Zen 1, Zen+, and Zen 2), AMD had absolutely no qualms or (public) reservations about pricing their CPUs where they needed to be. They were the value kings on that front, which is exactly what they needed to be at the time. They need that with GPUs and just refuse to go there. They follow Nvidia’s pricing lead.
Not to mention that outside North America, in almost all countries an AMD GPU is often $100 more expensive than its Nvidia counterpart, making it nonsense to buy any AMD card unless you’re just a fanboy.
Say it loud and say it proud: corporations are no one’s friend!
I have zen 2 and the apu is good enough for me, high end shit is always ridiculous.
Corporations are not our friends. 🤷‍♂️
I agree, it’s just strange from a business perspective too. Obviously the people in charge of AMD feel that this is the correct course of action, but they’ve been losing ground for years and years in the GPU space. At least as an outside observer, this approach is not serving them well for GPUs. Pricing more aggressively today will hurt their margins temporarily, but in such a mindshare-dominated market they need to start growing their market share early. They need people to use their shit and realize it’s fine. They did it with CPUs…
something many people overlook is how intertwined nvidia, intel and amd are. not only does the personnel routinely switch between those companies but they also have the same top shareholders. there’s no natural competition between them. it’s like a choreographed lightsaber fight where all of them are swinging but none seem to have any intention of hitting flesh. a show to make sure nobody says the m word.
…mayfabe?
They’re cycling out the old curse words. The Carlin ones are now fine. The new list is:
- Monopoly
- Union
- Rights
- Child labor
Put em all together and we’re getting “murc’d”
100%. Outside of brand loyalty, I just simply don’t see any reason to buy AMD’s higher tier GPUs over Nvidia right now. And that’s coming from a long, long time AMD fan.
Sure, their raster performance is comparable at times, but it almost never actually beats similar tiers from Nvidia. And regardless, DLSS virtually nullifies that, especially since the vast majority of games for the last 4 years or so now support it. So I genuinely don’t understand AMD trying to price similarly to Nvidia. Their high-end cards are inferior in almost every objective metric that matters to the majority of users, yet they still ask $1k for their flagship GPU.
Sorry for the tangent, I just wish AMD would focus on their core demographic of users. They have phenomenal CPUs and middling GPUs, so target your demographics accordingly, i.e. good value budget and mid-tier GPUs. They had that market segment on complete lockdown during the RX 580 era, I wish they’d return to that. Hell, they figured it out with their console APUs. PS5/XSX are crazy good value. Maybe their next generation will shift that way in their PC segment.
If you’re running Linux there’s only one option
It’s especially egregious with high-end GPUs. Anyone paying >$500 for a GPU is someone who wants to enable ray tracing, let alone at $1,000. I don’t get what AMD is thinking at these price points.
FSR being an open feature is great in many ways, but long-term its hardware-agnostic approach is harming AMD. They need hardware-accelerated upscaling like Nvidia and even Intel have. Give it some similar-sounding name (Enhanced FSR or whatever) and make it use the same software hooks so that both versions can run off the same game functions (similar to what Intel did with XeSS).
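Purely as an illustration of that “same hooks, two backends” idea (the names below are made up, not any real DLSS/FSR/XeSS SDK), the integration surface a game sees could stay identical while the backend is picked at runtime:

```python
# Illustrative sketch only: the class and method names are invented, not any
# real upscaler SDK. The point is a single integration surface with a
# hardware-accelerated path and a generic fallback, chosen at runtime.
from abc import ABC, abstractmethod

class Upscaler(ABC):
    @abstractmethod
    def upscale(self, frame, motion_vectors, depth):
        ...

class HardwareUpscaler(Upscaler):
    """Backend that would run on dedicated matrix/AI units (DLSS/XeSS-style)."""
    def upscale(self, frame, motion_vectors, depth):
        return f"upscaled({frame}) via dedicated hardware"

class GenericUpscaler(Upscaler):
    """Fallback that runs on ordinary shader cores (FSR-style)."""
    def upscale(self, frame, motion_vectors, depth):
        return f"upscaled({frame}) via generic compute"

def pick_backend(has_ai_hardware: bool) -> Upscaler:
    # The game calls the same hook either way; only the backend differs.
    return HardwareUpscaler() if has_ai_hardware else GenericUpscaler()

upscaler = pick_backend(has_ai_hardware=False)
print(upscaler.upscale("frame_0", motion_vectors=None, depth=None))
```

That’s roughly the split XeSS already ships: the same API everywhere, with the dedicated-hardware path used when it’s available and a generic fallback when it isn’t.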
AMD still has better Linux support, which is about 90% of the reason I went with them for now.
We’ve come a long, long way, baby.
I’m confused, was there a time when i3 cores were better than i5?
It used to be for a while that the i3 was dual core with hyper-threading, whereas the i5 was quad core with no hyper-threading, and the i7 was quad core with HT.
Oooh I see, thanks
AMD’s EPYC server CPUs would be like 64 Machamp. Mf is huge and requires a hell of a cooler. I see them at the datacenter I work at, and when I opened a server up I thought I was looking at a turbocharged car engine or something.
That’s very true, but perhaps I should have specified this is a very, very old meme (thus why we have come a long way). Probably 10-15 years old? Back when AMD really was struggling with performance issues, before they came back with the Ryzen series. Epyc servers are only like six years old, IIRC.
Intel does the same thing with drivers
Nvidia is only now catching up
On a somewhat related note: would anyone be able to recommend an upgrade/sidegrade option to go AMD over my 3080 Ti? I’ve just been meaning to be done with Nvidia.
The 6950 XT is a good value if you can find it on sale.
I have read many of the comments in the thread, but there is a very basic question I hope someone can help me with: what does the OP even mean?
I know what AMD is and what they do, but “taking W’s”? And “giving them away”?
“W” is a letter often used to represent a “Win” which I assume is what’s meant here since that’s what AMD have been doing.
My problem when buying my last GPU was that AMD’s answer to CUDA, ROCm, was just miles behind and not really supported on their consumer GPUs. From what I see now, that has changed for the better, but it’s still hard to trust when CUDA is so dominant and mature. I don’t want to reward NVIDIA, but I want to use my GPU for some deep learning projects too and don’t really have a choice at the moment.
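For what it’s worth, the ROCm builds of PyTorch reuse the torch.cuda API, so CUDA-targeted code mostly runs unchanged on supported AMD cards. A minimal sanity check, assuming a ROCm (or CUDA) build of PyTorch is installed:

```python
# Minimal device check; assumes a ROCm (or CUDA) build of PyTorch.
# ROCm builds reuse the torch.cuda namespace, so one code path covers both.
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
    print(f"GPU backend: {backend}, device: {torch.cuda.get_device_name(0)}")
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x  # runs on the GPU regardless of vendor
    print(y.shape)
else:
    print("No supported GPU found; falling back to CPU.")
```

On a ROCm build the "cuda" device string is effectively an alias for the HIP backend, which is why existing CUDA-flavoured code tends to run as-is, provided your specific card is on the supported list.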
I’ve become more and more convinced that considerations like yours, which I do not understand since I don’t rely on GPUs professionally, have been the main driver of Nvidia’s market share. It makes sense.
The online gamer talk is that people just buy Nvidia for no good reason; it’s really just internet guys refusing to do any real research because they only want a reason to stroke their own egos. This gamer-heavy slice of the GPU market is a loud minority whose video games don’t rely too heavily on any particular card features for decent performance, or especially compatibility, with what they’re doing. Hence the constant idea that people “buy Nvidia for no good reason except marketing”.
But if AMD cards can’t really handle things like machine learning, then obviously that is a HUGE deficiency. The public probably isn’t certain of its needs when it spends $400 on a graphics card; it just notices that serious users choose Nvidia for some reason. The public buys Nvidia, just in case. Maybe they want to do something they haven’t thought of yet. I guess they’re right. The card also plays games pretty well, if that’s all they ever do.
If you KNOW for certain that you just want to play games, then yeah, the AMD card offers a lot of bang for your buck. People aren’t that certain when they assemble a system, though, or when they buy a pre-built. I would venture that the average shopper at least entertains the idea that they might do some light video editing; the use case feels inevitable for the modern PC owner. So already they’re worrying about maybe some sort of compatibility issue with software they haven’t bought yet. I’ve heard a lot of stories like yours, and so have they. I’ve never heard the reverse. I’ve never heard somebody say they’d like to try Nvidia but they need AMD. Never. So everyone tends to buy Nvidia.
The people dropping the ball are the reviewers, who should be putting a LOT more emphasis on use cases like yours. People are putting a lot of money into labs for exhaustive testing of cooling fans for fuck’s sake, but just running the same old gaming benchmarks like that’s the only thing anyone will ever do with the most expensive component in the modern PC.
I’ve also heard of some software that just does not work without CUDA. Those differences between cards should be tested and the results made public. The hardware journalism scene needs to stop focusing so hard on damned video games and start focusing on all the software where Nvidia vs AMD really does make a difference, maybe it would force AMD to step up its game. At the very least, the gamebros would stop acting like people buy Nvidia cards for no reason except some sort of weird flex.
No, dummy, AMD can’t run a lot of important shit that you don’t care about. There’s more to this than the FPS count on Shadow of the Tomb Raider.
Well the counterpoint is that NVIDIA’s Linux drivers are famously garbage, which also pisses off professionals. From what I see from AMD now with ROCm, it seems like they’ve gone the right way. Maybe they can convince me next time I’m on the lookout for a GPU.
But overall you’re right, yeah. My feeling is that AMD is competitive with NVIDIA on price/performance, but NVIDIA has broader feature support, both in games and in professional use cases. I do feel like AMD has been steadily improving over the past few years though. In the gaming world FSR seems almost as ubiquitous as DLSS (or maybe even more so), and ROCm support seems to have grown rapidly as well. Hopefully they keep going, so I’ll have a choice for my next GPU.
It’s a shame there’s not really an equivalent to CUDA cores on AMD cards; being able to offload rendering to the GPU and get instant feedback is so important when sculpting (without having to fall back to Eevee).
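For reference, recent Blender builds do expose AMD cards to Cycles through HIP. A rough sketch for Blender’s Python console, with property names as in recent Blender versions, so treat it as a starting point rather than gospel:

```python
# Rough sketch for Blender's Python console: point Cycles at the GPU.
# "HIP" covers recent AMD cards, "CUDA"/"OPTIX" cover NVIDIA; what actually
# works depends on your Blender build and installed drivers.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"   # or "CUDA" / "OPTIX" on NVIDIA
prefs.get_devices()                 # refresh the detected device list

for dev in prefs.devices:
    dev.use = True                  # enable every detected device

bpy.context.scene.cycles.device = "GPU"
print([d.name for d in prefs.devices if d.use])
```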
This is a super hefty dose of copium.
AMD has never been a serious competitor. They might have been the choice on a few SKUs through the years but they produce trash compared to Nvidia.
Where do you even come up with this stuff? People who give a single shit about raytracing: buy Nvidia. Anyone else with half a brain…
👌
I like AMD, but they’re still overpriced: nothing compelling in the $200-300 range since the 5600 XT and RX 580, and I keep hearing stories about unoptimized drivers (can’t confirm myself because I’m still on a 5600 XT with mostly older games). They’re the lesser of two evils, but they’re far from chad-doge behavior at this point.
The $200-300 range has really been dead since the 5600 XT and the crypto boom.
Second-hand at the $400 mark, the 6800 XT is insane value.
I got a new RX 6700 XT 2 months ago for 250€ and I must say that I’m very happy with it. It performs well in newer games and has 12 GB of VRAM.
That is what you have to do if you’re behind the competition. Don’t think they’ll keep this up for long if they ever become the industry leader.
Always back the underdoge
I just bought my first Nvidia card since the TNT2. Up to today I always looked for the most FPS for the money.
This time my focus was on energy efficiency, and the AMD cards suck at the moment: the 4070 draws about 200 W, the 6800 about 300 W. AMD really has to fix that.
Regarding DLSS: I activated it in Control, and it looks… off? Edges seem unsharp, not all the time but often; sometimes only for a second, sometimes longer. I believe it’s the only game I have that supports it, but I’m not impressed.
At OP: Brand loyalty is the worst. Neither Nvidia nor AMD likes you. Get the best value for your money.
Btw, Nvidia needed an account to let me use their driver. Holy shit, that’s fucked up!
According to Tom’s Hardware, the RX 6800 is currently the most efficient.
Maybe, but it draws 280 watts instead of 200. That’s just too much, at least for me.
> The 4070 draws about 200 W, the 6800 about 300 W. AMD really has to fix that.
But if you compare cards from the same generation, like the 3070 and 6800, they’re much closer. Nvidia still has the edge, but the 3070 TGP is 220W vs the 6800 at 250W.
If I’m not wrong, the 6800 performs way better than the 3070.
You don’t necessarily need an account to use the Nvidia drivers, just if you want automatic updates through GeForce Experience. Not saying that’s any better, in fact it’s almost as shitty, just wanted to clarify.
I just used a junk email to make an account for the auto updates.
There is a way around the account requirement. I uninstalled GeForce Experience forever ago.
Wait what?? Thank you, I will look into it, I don’t need that crap!!
You can get drivers directly from their site without an account as well.
When you install the drivers there’s a checkbox for GeForce Experience. I think you need to do a “custom installation” or advanced or whatever they call it instead of just clicking the install button they show.
Aaaaaaand… it’s gone! Thanks!!
I’ll never go for Nvidia ever again.
I’ve been a Linux-only user for over twenty years now and Nvidia is the fucking devil. Their drivers range in quality anywhere from “ugh” to “wtf!” and my current Nvidia card (it’s a loan) gives me continuous screen artifacts and KWin (window manager) crashes. AMD drivers just work.