I’m sorry, is $1000 now cheap for a GPU? I remember when an 80 series cost $500 and that felt expensive.
Yeah, they ain’t cheap. AMD just follows Nvidia’s pricing and undercuts them by a few hundred. AMD has zero reason to price their GPUs this high, while I sorta get why Nvidia does it: there’s massive demand for their chips outside the gaming sphere, and those businesses are willing to pay top dollar. I bet most of their production capacity is allocated to data center GPUs.
Genuinely what are you talking about?
The RX 7800 XT is dropping at the beginning of next month at $500, and it’s a beast of a GPU.
Yeah sure, but it’s not comparable to an 80 series. The 7900 XT costs $900 and the XTX costs $1000 at MSRP. That’s a ton of money.
The 4070 costs $600 btw, which is more comparable to the 7800 XT you mentioned.
Yeah, agreed, the high range for AMD is meh. If you’re just looking for the best out there and money is no object, fine with >$1000 GPUs, Nvidia has no competition there. The 7900s are more competitive with the Nvidia 4080s and undercut those on price too; 4080s are $1200. So they should really be looked at as a 4080 alternative, not the 4090, which has no real alternative. AMD offers nothing nearly as expensive as a 4090.
I’m very interested in the 7800 XT, which is a 4070 competitor. If it outperforms the 4070 in most respects like AMD’s numbers suggest, I think it’ll be a great value since it’s $100 cheaper. The 4070 only having 12 GB of VRAM is pretty disappointing for future-proofing too, especially at that price. I’d like to wait on reviews of the 7800 XT first, of course; we only have company-provided numbers so far. I’m also interested in their progress on ray tracing and FSR, where they’ve clearly been behind Nvidia for a number of years. But getting enough fps at the resolution you want should still be priority number one when you’re picking out a card, over something more niche and game-dependent like ray tracing, I think.
Again, AMD does undercut Nvidia, but not by a lot. There’s no reason to price their GPUs so high other than pure greed. $1000 is pretty damn expensive for something that does nothing by itself.
So no, AMD is not cheap
Oh, I didn’t mean to imply they’re not greedy; it’s a company. Nvidia would be far greedier then, unless you value their extra features, with otherwise worse performance, at that much more money. And without competition, Nvidia would have free rein to get even more absurd with their pricing. Some competition is better than none. Maybe Intel GPUs will start getting good and we can really get some competition going to drive prices down.
Yeah, but the same thing can be said about phones. It’s the new norm, and for something that’ll easily last you 4-6 years, it’s a worthwhile investment I feel.
deleted by creator
I just want Nvidia to make something in the $200-300 range with 12-16 GB of VRAM. Their price increases and calling $500+ “mid-range” are absolutely criminal.
deleted by creator
While I’m no Nvidia fanboy, the RTX 4060 has a TDP of just 110 watts, lower even than that 1060 (which had a TDP of 120 W, IIRC).
Your point certainly stands for the higher-end cards, though.
deleted by creator
I still remember when I got my top-of-the-line GeForce 9800 GTX+ for like $230. Even with inflation, that’s $320…
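The inflation math checks out roughly. A quick sketch of the adjustment (the CPI index values here are approximate assumptions for illustration, not official figures):

```python
# Rough CPI adjustment: what $230 in 2008 is worth in 2023 dollars.
# The two index values below are approximate assumptions, not official data.
CPI_2008 = 215.3   # approximate US CPI-U annual average, 2008
CPI_2023 = 304.7   # approximate US CPI-U annual average, 2023

price_2008 = 230.0
price_today = price_2008 * (CPI_2023 / CPI_2008)
print(f"${price_today:.0f}")  # lands in the low $320s, close to the figure above
```

So that $230 flagship really was a different era of pricing.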
That’s what you have to do when you’re behind the competition. I don’t think they’ll keep this up for long if they ever become the industry leader.
Always back the underdoge
*peek
Wait, does this mean the Adrenalin software is finally out for Linux? Can we undervolt/set fan curves now? The interfaceless free driver is so freaking noisy with my GPU.
I checked, and you guys are still celebrating the Mesa code that was contributed ages ago -.- Yes, it works and it’s FOSS. But AMD has been lazy on Linux ever since; we get the bare minimum. They don’t beat Nvidia by much imho.
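For what it’s worth, the upstream amdgpu kernel driver does expose manual fan control through the hwmon sysfs files, no vendor GUI needed. A minimal sketch (the hwmon path is an assumption and varies per system and per boot; this only prints the commands it would run, since actually writing needs root and the right hardware):

```shell
# Manual fan control on amdgpu via hwmon sysfs (path varies per system;
# discover yours under /sys/class/drm/card*/device/hwmon/).
# This sketch only PRINTS the commands; writing to sysfs needs root.
HWMON="/sys/class/drm/card0/device/hwmon/hwmon3"   # assumed path for illustration
TARGET_PCT=60                                      # desired fan speed in percent
PWM=$(( TARGET_PCT * 255 / 100 ))                  # pwm1 takes a 0-255 value

echo "echo 1 > $HWMON/pwm1_enable    # 1 = manual fan control"
echo "echo $PWM > $HWMON/pwm1        # set fan to ~${TARGET_PCT}%"
```

Undervolting similarly goes through `pp_od_clk_voltage` on cards that support it, and tools like CoreCtrl or LACT wrap these same files in a GUI, so it’s not quite the bare minimum, but it’s definitely not Adrenalin either.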
Am I having a stroke, or does that actually say “here’s the our source code”?
Oops
youre crazy
*your crazy
what about my crazy ?
we’ll let you peak 🗻
what if this is just a psychology test and we are expected to not notice and discuss amd or nvidia
Intel does the same thing with drivers
Nvidia is only now catching up
Which License?
Where can I find the source?
As far as I’ve searched, what is free software is the Vulkan implementation that runs on top of the intrinsic GPU drivers (which have DRM and no source code).
The intrinsic GPU drivers in the kernel are still closed source. So basically AMD and NVIDIA are the same: they both have source for some engine implementations, but both kernel drivers are closed source.
https://github.com/GPUOpen-Drivers/
amdgpu is a blob.
Am I missing something?
This is probably about the FSR3 presentation https://youtu.be/zttHxmKFpm4?si=OyZOmoX22MQJDOst
Here is an alternative Piped link(s): https://piped.video/zttHxmKFpm4?si=OyZOmoX22MQJDOst
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source, check me out at GitHub.
AMDGPU is open source: https://github.com/radeonopencompute/rock-kernel-driver/, and it’s also upstreamed into Linux. The firmware is a binary blob, though.
Thank you so much, I couldn’t find it!! Well, then I’m switching to AMD on my next hardware change!
AMD’s had some buggy drivers and misleading graphs, but they’re overall infinitely more consumer-friendly than Nvidia
It’s the lesser of two evils imo. Not saying that AMD is any good, the alternatives are just that bad.
drivers have been solid for years now
Good one!
Can’t speak for Microsoft users (except: abandon all hope), but since kernel 5.4 I’ve been on 2 different Radeons and a Vega. Zero driver installs. Just the latest stable Mesa. If the game worked on ProtonDB, it worked for me.
The triple whammy of semiconductor shortage, pandemic and cryptocunts has really fucked PC gaming for a generation. The price is way out of line with the capabilities compared to a PS5.
I’m still on a 1060 for my PC, and it’s only my G-Sync monitor that saves it. Variable refresh rate really is great for all PC games tbh. You don’t have to frig about with settings as much, because Opening Bare Area runs at 60 fps but the later Hall of a Million Alpha Effects runs at 30. You just let it rip between 40 and 80: no tearing, and fairly even frame pacing. The old “is this game looking as good as it can on my hardware while still playing smoothly?” question goes away, because you just get extra frames instead, and knock the whole thing down one notch when it gets too bad. I’m spending more time playing and less time tweaking, and that can only be a good thing.
I’m just clutching my pre-covid, pre-shortage GTX 1080ti. Hoping it’ll keep powering through a little longer. Honestly, it’s an amazing card. If it ever dies on me or becomes too obsolete, I’ll frame it and hang it on my wall.
I just wish AMD cards were better at ray tracing and “work” loads than Nvidia’s. Otherwise I’d have already splurged on one.
On a somewhat related note: would anyone be able to recommend an upgrade/sidegrade option to go AMD over my 3080 Ti? I’ve just been meaning to be done with Nvidia.
6950xt is a good value if you can find it on sale.
As an AMD fanboy, I approve of this.
And now, on a serious note: I’ve been running Linux daily for almost 20 years, and AMD machines are, in my personal experience, always smoother to install, run, and maintain.
I’ve been Intel w/ Nvidia since 2007 on Linux. Recent trends have me thinking AMD is the way to go for my next one, though. I think I’ve got so used to the rough edges of Nvidia that they stopped bothering me.
As someone who has been ignoring AMD for most of this time, (my last AMD product was something in the Athlon XP line), can I do Intel CPU w/ AMD discrete GPU?
Can I get back to you, say, in three weeks?
I’m about to put together a machine based on an AB350 chipset, with a Ryzen 5 (G series, for graphics from the start), and after that I intend to install a budget RX 580 in it.
If the thing doesn’t ignite or explode, I’ll gladly share the end result.
No rush whatsoever, but I’d be thrilled to hear about your results when you are done.
Yeah, AMD GPUs work great across the board no matter the CPU.
I should have specified “in Linux” more explicitly - same answer? :-)
Yeah, this is what my wife was doing. I’m also doing the reverse: AMD CPU, Nvidia GPU. I considered AMD but went Nvidia mostly for the performance per watt (PPW) of an undervolted 4070. It results in a cool, quiet, low-wattage machine that can handle anything that matters to me, which AMD GPUs still can’t match this gen, even with the upcoming 7800 XT they’re comparing against the 4070. I’d wait for some PPW analysis before making a choice, depending on your needs. There’s way more to the decision than GPU source code or even raw performance, and it’s often overlooked.
Oh, and don’t sleep on AMD. Though I don’t feel like the AM5 platform is fully baked, the Ryzen architecture is rock-solid, and I fully recommend it if your history with Athlon is what’s keeping you away. I actively avoided them for the same reason until a friend convinced me otherwise, and I’m so glad I listened.
Thanks, I’ll take a hard look when it’s time to buy again. I forgot to specify that I was explicitly discussing Linux usage too; assume the same answer?
Can’t speak to that, unfortunately, but I assume there would be no issues. The devices themselves are OS-agnostic; Windows isn’t doing anything special to make them play nice with each other.
Hey there, fellow 20-year desktop Linux fanboy! Exactly the same here.
I am not alone!!! Yes!
Thought I was on [email protected] for a second
I love the “she/they/it”
Not sure how to interpret this, but thanks, I guess
Well, I feel like a he/they/it. I don’t know if you mean it the way I interpret it, but… I was raised in a way and time that nudged me towards masculinity, having been AMAB. But I don’t feel like I fully subscribe to the idea of being a “man”, a “he”. Also, for whatever reason, I do mostly fit under that gender identity, and have always used he/him. So nowadays I’m thinking of myself as more of a “he/they”. Mix in a bit of existential thinking and a general lack of care about my pronouns… I’ve literally thought about putting “he/they/it” as my pronouns.
But I’ve refrained out of concern it would be taken as being dismissive of others’ gender identities, and/or the idea of gender identity altogether, when I only feel that way about my own.
So yeah, I saw your name and identified with it, it brought a little joy to my day, and wanted to share that with you.
Oh cool. I thought you might have meant it sarcastically, or been talking about them being in my display name, or something else.
removed by mod