This is getting out of hand. The other day I saw the system requirements for the game “Hell is Us” and they’re ridiculous. My RX6600 can’t play anything anymore. I’ve downloaded several PS3 ROMs and now I’m playing old games. So much better than this insanity. This is probably what I’m going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don’t want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors from Facebook Marketplace. Got both monitors for $120.
They’re $800 on Amazon used. Great monitors and I love 4k. I also bought an RX6600 AMD GPU for $100 from Facebook. It was almost new. The owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can’t afford these high-end GPUs, but now very few games work on low settings, and I’d get something like 20 FPS max. My friend gave me access to his Steam library and I tried to play Indiana Jones the other day, and it was an “omfg, wtf is this horrible shit” moment.
I’m so sick of this shit!
I don’t regret buying any of these, but man it sucks that the games I want to play barely even work.
So, now I’m emulating and it’s actually pretty awesome. I’ve missed out on so many games in my youth so now I’m just going to catch up on what I’ve missed out on. Everything works in 4k now and I’m getting my full 60FPS and I’m having so much fun.
The reason has nothing to do with higher quality. They just cut the cost of creating baked lighting and pass it on to the customer by having us raytrace, which requires more horsepower that GPU makers failed to deliver.
Not defending anyone, but I don’t think implementing raytracing is a cost-cutting measure.
Also, are there games that force you to enable raytracing (with such a big performance hit)? The ones I have played always allow you to disable it.
I think (one of) the real reasons behind some games being so poorly optimized is the developers using upscaling as a crutch when they shouldn’t.
The newest Doom and Indiana Jones have mandatory raytracing.
And the reason for poor optimization is mostly publishers not giving the devs enough time and resources to optimize or just poor management.
Yup. Some people would downvote you for saying this. People out there defending big corporations.
Obligatory reminder that Titanfall 2 is pretty great, runs great (it’s built on a modified Portal 2 era Source branch), looks great… and with Northstar, people figured out how to mod the client to work with custom servers and mods… and it all works on Linux as well; it literally has its own custom Proton branch.
But uh yeah, the new AAA graphics paradigm is:
Everything is a dynamic light, game devs don’t optimize shit anymore because they’re all being slave-driven by corporate…
…because everything is built for stupidly high graphical realism and fidelity, few AAAs make any novel or engaging, fully fleshed-out actual gameplay…
But hey, nobody has to bake light and shadow maps anymore!
Assuming, of course, your card can support real-time raytracing, which it can’t, so we had to invent intelligent frame upscaling to replace well-optimized AA methods, but that also isn’t enough, so we had to invent (fake) frame gen.
Oh and all the cards that can do this are all sitting around double MSRP, because board partners can do whatever the fuck they want, and retailers don’t even bother to attempt to stop scalpers.
…
The 9060 XT launches this week, and it’s supposed to MSRP at $350 for the 16 gig model.
My guess is we will get maybe 12 hours of prices between $350 and $450 (for the fancier partner models)… and then $500 to $550 will be the new ‘baseline’ price for whatever is left by next week.
If you’re planning on trying to get a 9060 XT, good fucking luck, you’re almost certainly gonna need it.
…
The GPU market is largely doing the same thing that’s happened with cars and housing: everything is a luxury model, few to no viable economy options even get newly mfg’d, then the entire consumer base goes into debt to keep up their lifestyle, then all the debt bubbles pop and consumer spending ability craters… and… maybe then, 6 months to a year after that, the GPU mfg’rs could possibly start releasing actual economy models? Maybe?
…
Either way, a whole lot of AAA studios are either going to keep monetizing harder and harder… or realize that as we enter this 2nd Great Depression… that shit ain’t gonna work for a mass consumer base, people just won’t have the money.
Or I guess Klarna and Afterpay come built in to GTA 6 Online shark card purchases.
Why not?
Damn, you fucking spoke my heart. This is the whole point of my post. I won’t repeat what you said, but thank you for crafting this comment. Nailed it.
people out here really refusing to compress, compile, and optimize their work
That’s my whole point. Why are we faulting ourselves and defending big studios who got greedy and lazy? Games used to look good and play well on shit hardware because developers had no choice. Now you get downvoted and berated if you say that. It’s not a hardware issue, it’s a game industry issue. They are lazy. They don’t give a shit. They release shit half-baked and broken, then “fix” it later.
I’ve been a PC gamer for almost 30 years now. This perpetual march of buying new PC upgrades to play new games is an old song.
You want to play at 4k on an old low end card. I’m sorry but this one is on you.
Not the newest game, but still fairly new: one of my biggest gripes is how much hardware you need to run the newest Ratchet and Clank. I’m lucky my Steam Deck can run it or I’d be screwed.
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
I’m so used to having a toaster for a desktop that I had to look up what FSR is. I’m too college-student-budgeted to touch FSR.
Last I played months ago, I recall it being fine enough when I had my deck docked. FPS was stable and didn’t look too bad. Though, I’m not an expert on anything graphics, so my word doesn’t mean much.
It’s a PS5-only game running on a portable device. Considering the state of a lot of ports (including this one at launch lol), it’s a miracle that it runs this well.
Games should be easier to run, with better modding tools. Simple as that, Morrowind with mods is still a fantastic time and it runs well on a Steam Deck, something we can carry around with us.
Idk man, I bought Sol Cesto yesterday, and I’m pretty sure my toaster could run it
Edit:
RX 6600, two 4k monitors
Bruh. I have a 3080 Ti and barely feel comfortable running my games in 2k. I’m pretty sure the 6600 was made with only 1080p and lower in mind.
I dunno, my 3080 Ti runs 4k at 90+ FPS no problem. Maybe not all games, but most decently modern games.
😂. I know, dude. That’s my whole point. Why do WE have to bear the burden of optimizing the game? Why don’t the developers optimize their games? It’s bonkers that “Hell is Us” listed the 4090 as a minimum requirement to run 4k at 30 fps. I was like, wut?
Because running 4k is extreme. Asking a game to run well at 4k is asking it to push quadruple the pixels for the same processing cost (see the quick arithmetic sketch below). You’re saying you want the vast majority of people who don’t have a 4k setup to have their games downgraded so they’ll run well on your boutique monitor.
It’s a balancing act: they can either make the game look like something from 2010 on all systems just to make sure it runs in 4k on older cards, or they can design it to look good in 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I’ll keep running new games on my older card at 1080 and be perfectly happy with it.
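To put numbers on the “quadruple the pixels” point, here’s a quick back-of-the-envelope sketch in Python (a toy pixels-per-frame comparison; it ignores shading complexity, memory bandwidth, and everything else that makes real rendering costs messier):

```python
# Rough pixel-count comparison: why 4k costs roughly 4x as much as 1080p.
res_1080p = 1920 * 1080   # 2,073,600 pixels per frame
res_4k    = 3840 * 2160   # 8,294,400 pixels per frame

print(res_4k / res_1080p)          # 4.0 -- every 4k frame shades 4x the pixels

# At 60 FPS, that's ~373 million extra pixels shaded every second.
print((res_4k - res_1080p) * 60)   # 373,248,000
```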
That’s why I went back to the roots. I’m now playing older games at 4k 60 fps no problem. I’ll stick with emulators; I’d rather not spend the $700. I’ll still complain about new games not running for me, though. That’s the only thing I can do besides playing older games instead 😂
Or just run newer games at 1080p. Unless you’re unhealthily close to the monitor you probably won’t even see the difference.
If you’re rubbing it on a TV across the room, you probably literally can’t see the difference.
I do run them at 1080p, trust me. Here’s the thing, though: running 1080p on a native 4k screen makes for a horrible-looking picture. It just looks off and very bad. Try it if you can. It’s best when the screen itself is physically 1080p. I think you fat-fingered the “b” in “running”. Came out funny 😂
1080p scales to 4k perfectly unless you have a weird aspect ratio, since it can just treat each square of 4 screen pixels as 1.
What looks bad is trying to run anything between 1080p and 4k, since it’s not a perfect 4:1 relationship.
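To make that 4:1 mapping concrete, here’s a minimal sketch of integer (nearest-neighbour) scaling in Python with numpy; the array is just a stand-in for a rendered frame:

```python
import numpy as np

# Toy "1080p" frame: height x width x RGB.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Integer (nearest-neighbour) 2x upscale: each source pixel becomes an
# exact 2x2 block of identical pixels, so nothing gets blurred or filtered.
frame_4k = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)

print(frame_4k.shape)  # (2160, 3840, 3) -- exactly a 4k frame
```

Anything between 1080p and 4k can’t map source pixels to whole blocks of screen pixels like this, which is why it ends up filtered and soft.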
You’ll want to use Lossless Scaling. It’ll quadruple the pixels without any filtering and make the output not look weird on a 4k display.
Elaborate, please. What res would that be?
(And then you have portable boxes that somehow advertise running games in 4k 60fps for $499 🤣🤣)
It’s not bonkers though. Fill rate (the time it takes to render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can’t rely on simple tricks to optimize the rendering pipeline: there is so much detail on screen that every single pixel can potentially change completely at any given moment, and also be very different from its neighbors (hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on “kind of”).
If you don’t want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor; you’ll have a much easier time running high-end games. Or run your games at a lower res, but that usually looks bad.
I personally experienced this firsthand when I upgraded to 1440p from 1080p a while ago, suddenly none of my games could run at max settings in my native resolution, even though it was perfectly fine before. Also saw the same problem in bigger proportions when I replaced my 1440p monitor with a 4k one at work and we hadn’t received the new GPUs yet.
I’ll just play older games. Everything runs at 4k 60 fps no issue on RPCS3 and I’ve been having a freaking blast. Started with uncharted 1, and man, I’ve missed out on this game. I’m going to stick with older games.
just replay Deus Ex forever lol
Might as well. lol. I’ve started with uncharted 1 and I’ve forgotten how amazing that game is. Running at 4k 60fps no problem. I’m having a great time.
Bro what are you smoking. You can run PS3 games, but can’t play cutting edge PC games?
That’s insanity.
I just upgraded my CPU from a 2700X to a 9800X3D and can finally play PS3 games emulated. My performance in AAA games hasn’t changed, but then again I don’t use AMD GPUs, which are basically productivity cards at this point that happen to run some games as a coincidence. Nor do I try my luck at 4K.
1440p, RTX 3090, path tracing in CP2077: even with DLSS it’s barely playable at 30 FPS.
I don’t smoke. Also, playing PS3 games on RPCS3 has very little to do with my GPU, it’s mostly the CPU, and I have a decent CPU, Ryzen 7 5700G. Also, your comment has some contradictions and doesn’t make much sense, no offense. lol
I’d like to know what those contradictions are, if you’ll indulge me. Sorry if my comment was confusing, I added some paragraph breaks to make it easier to read and fixed a typo.
What I don’t get about your setup is how you have such a powerful CPU, but don’t get an equally powerful GPU?
It seems more than anything your system just isn’t balanced for demands of modern gaming (very GPU centric).
My old PC with a Ryzen 2700X, or hell, even the Ryzen 1600 I originally built it with, but paired with an RTX 3090 instead of an AMD GPU, would handily outdo yours at most gaming tasks; even 4K gaming is just about doable on such a setup.
I understand that GPU prices are shit, but in light of that the best approach is to spend as little as possible on everything else.
What I don’t get about your setup is how you have such a powerful CPU, but don’t get an equally powerful GPU?
That’s the part that cleared everything up for me (it could be me, as English isn’t my first language lol). I apologize. As for my CPU, I got it on sale for $120 and ran it for around 6 months without a dGPU until I got an RX580 for free from a friend. I used that for around half a year, then “upgraded” to the RX6600 I got from Facebook for $100. That’s why I have a weaker GPU and a decent CPU. Hope that clears things up now. 😅
If you find yourself playing old games more and more, [email protected] or [email protected] might just be the communities you need to join if you haven’t already.
[email protected] is another one worth looking at. It’s for people who don’t play games at launch, but wait a few years instead :)
Hell yeah, joined all three. Thank you
play on low?
Try playing Indiana Jones and the Great Circle on low.
is that not possible?
I’ve edited my post for more context. It is not possible in my case. I use all AMD and this game forces DLSS. Setting the game to low at 1080p barely gives me 20 fps and even that is unstable. Going lower res than that makes the game look like absolute dogshit. So I just gave up. I now play everything at 4k 60fps on rpcs3 no problem.
wow, shit game then. but not the rule. still, i think more and more devs assume we have powerful gpus, and looking at the steam hardware survey, we do. cant stop progress.
https://www.google.com/search?q=Indiana+Jones+great+circle+disable+dlss
Seems to be a known, common issue. I see a bunch of pages that could help you out, since it sounds like it’s just this game that you’re having trouble with.
It’s not just insane power requirements. It’s putting graphics bling ahead of playability and fun.
I recently bought Doom 2016 and Mosa Lina. I’ve had more fun with the latter than the former, even if I’ve been a Doom and Doom II player all my life.
for me many old games are just that, old. some still have great modern gameplay and just as good music as back then. if the artstyle is nice then i play them and dont think they are ugly.
some games i really think are fucking ugly, like when the textures are very blurry and low resolution.
and these 2d games or games with sprites (how to call them? like age of empires 1 or 2) dont age for me, its their artstyle, and if you make them 3d then its not nice anymore.
what games i cant play are the ones that are 3d but something makes the picture look all flat and confusing. metal gear solid 3, as an example.
It’s fucking insanity. I got a great deal on 2 4k monitors from Facebook marketplace (I’ve edited the post for more context) and now I can’t play anything. :/
Surely you’re not trying to use a card that targets 1080p gaming for 4k.
That’s all I can afford. And there was no way I’d pass on those monitors. Everything is good when emulating, though. This post was more of a venting/ranting post.
Sounds like you don’t understand how demanding it is for a graphics card to run at 4K. It’s like trying to boot up the same game 4 times at 1080p. You expect your graphics card to be able to handle that?
That’s certainly fair; however, it’s unrealistic to expect a card to perform the same while doing 4x the work, all else being equal.
Have you tried playing at non-native resolutions?
I mean you probably are just going to have to run the games at 1080p instead of 4k until you can afford a better gpu
Some games run like shit even on 1080p. Lmao. But I get ya
You can always just run the monitors at 1080p when playing games. Perfect integer scaling doesn’t look bad if the display isn’t massive.
Don’t worry - once you’ve upgraded everything the games are still the same yellow-markings-mean-climbable, branching-skill-tree, press-Y-to-Parry, level-gated-fetch-quests, enemies-drop-loot-that-is-not-as-good-as-what-you-already-have, collect-all-the-orbs nonsense your computer could already play.
Damn, that’s another thing. All games are the same, but have different environments. That’s why I get bored with them really quick.
4k gaming is a scam. I’ve been using 1440p for 20 years and never had issues running every game that comes out at max settings.
Using 1440p for 20 years doesn’t mean 4k is a scam
i made two statements and you made none.
I don’t believe you
It is a scam if you’re buying 27" monitors like OP. You can only cram so much dpi into a monitor before you get diminishing returns. I’ve been playing at 1440p, 27" for a while, and can barely see the pixels even with my eyes 10 cm from the screen (and I’ve been playing Arma Reforger, so there’s been a lot of squinting at bushes through a high-powered scope lately).
I’ve also used a 4k, 32" screen for a long time at work (in gamedev, so I wasn’t looking at Excel files either… well, actually I also was, but that’s beside the point) and couldn’t really tell the difference from my home setup on that front (though I admit 32" at 1440p doesn’t look great sometimes; I tried that for a while too). Really, the most noticeable things were the HDR and the ludicrous fps I could get from having a top-of-the-line CPU and GPU (and 128 GB of RAM also helped a bit, I guess).
Good point, I didn’t keep OP’s display size in mind when I read the “4k is a scam” comment, which may have been their point.
In my reply, I’m saying 4k isn’t a scam in general.