This is getting out of hand. The other day I saw the requirements for the “Hell is Us” game and they’re ridiculous. My RX6600 can’t play anything anymore. I’ve downloaded several PS3 ROMs and now I’m playing old games. So much better than this insanity. This is probably what I’m going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don’t want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors from Facebook Marketplace. Got both monitors for $120.
They’re $800 on Amazon used. Great monitors and I love 4k. I also bought an RX6600 AMD GPU for $100 from Facebook. It was almost new. The owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can’t afford these high end GPUs, but now very few games work on low settings and I’d get something like 20 FPS max. My friend gave me access to his steam library and I wanted to play Indiana Jones the other day, and it was an “omfg, wtf is this horrible shit” moment.
I’m so sick of this shit!
I don’t regret buying any of these, but man it sucks that the games I want to play barely even work.
So, now I’m emulating and it’s actually pretty awesome. I’ve missed out on so many games in my youth so now I’m just going to catch up on what I’ve missed out on. Everything works in 4k now and I’m getting my full 60FPS and I’m having so much fun.
It’s not just insane power requirements. It’s putting graphics bling ahead of playability and fun.
I recently bought Doom 2016 and Mosa Lina. I’ve had more fun with the latter than the former, even if I’ve been a Doom and Doom II player all my life.
For me, many old games are just that: old. Some still have great modern gameplay and music that’s just as good as back then. If the art style is nice, I play them and don’t think they’re ugly.
Some games I really do think are fucking ugly, like when the textures are very blurry and low resolution.
And those 2D games, or games with sprites (what do you call them? like Age of Empires 1 or 2), for me they don’t age. It’s their art style, and if you remake them in 3D they’re not nice anymore.
What I can’t play are games that are 3D but something makes the picture look flat and confusing. Metal Gear Solid 3, as an example.
It’s fucking insanity. I got a great deal on 2 4k monitors from Facebook Marketplace (I’ve edited the post for more context) and now I can’t play anything. :/
Surely you’re not trying to use a card that targets 1080p gaming for 4k.
That’s all I can afford. And there was no way I’d pass up those monitors. Everything is good when emulating, though. This post was more of a venting/ranting post.
You can always just run the monitors at 1080p when playing games. Perfect integer scaling doesn’t look bad if the display isn’t massive.
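If you’re curious why integer scaling specifically holds up, here’s a quick sketch of the arithmetic (generic resolution numbers, not tied to any particular monitor model):

```python
# Why 1080p integer-scales cleanly on a 4K panel: the scale factor is a whole
# number in both dimensions, so each source pixel maps to an exact 2x2 block.
print(3840 / 1920, 2160 / 1080)   # 2.0 2.0 -> clean integer scaling

# Compare 1440p on the same panel: 1.5x per axis is not a whole number,
# so pixels have to be interpolated and the image looks softer.
print(3840 / 2560, 2160 / 1440)   # 1.5 1.5 -> needs interpolation
```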
Sounds like you don’t understand how demanding it is for a graphics card to run at 4K. It’s like trying to run the same game 4 times at 1080p at once. You expect your graphics card to be able to handle that?
I mean you probably are just going to have to run the games at 1080p instead of 4k until you can afford a better gpu
Some games run like shit even on 1080p. Lmao. But I get ya
That’s certainly fair, however it’s unrealistic to expect a card to perform the same while doing +4x the work, all else being equal.
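To put a rough number on that “+4x the work” figure, here’s the back-of-the-envelope pixel math (ballpark only; real render cost doesn’t scale perfectly linearly with pixel count):

```python
# Pixels shaded per frame at each resolution (approximate workload comparison).
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k    = 3840 * 2160   # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0 -> four times as many pixels per frame at 4k
```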
Have you tried playing at non-native resolutions?
The reason has nothing to do with higher quality. It’s cost cutting: they skip creating (baking) the lighting themselves and pass the cost on to the customer by making our hardware ray trace it, which requires horsepower that GPU makers have failed to deliver.
Not defending anyone, but I don’t think implementing raytracing is a cost-cutting measure.
Also, are there games that force you to enable raytracing (which is a big performance hit)? The ones I have played always allow you to disable it.
I think (one of) the real reasons behind some games being so poorly optimized is the developers using upscaling as a crutch when they shouldn’t.
The newest Doom and Indiana Jones have mandatory raytracing.
And the reason for poor optimization is mostly publishers not giving the devs enough time and resources to optimize or just poor management.
Yup. Some people would downvote you for saying this. People out there defending big corporations.
Have you tried just setting the resolution to 1920x1080 or are you literally trying to run AAA games at 4K on a card that was targeting 1080p when it was released, 4 and a half years ago?
I didn’t mention that in the OP, but Indiana Jones was running like shit at 1080p on low settings. The fucking game forces DLSS. This is where gaming is heading: forced DLSS and forced garbage so we’re forced to buy expensive shit.
That’s just on Indiana Jones and bad optimization. There are still plenty of other, newer games that should run perfectly fine. Of course the big, chunky games that are marketed as “Look at how graphically intense this is! Look at the ray tracing!” are going to run poorly.
Though I will absolutely agree that a lot of studios are throwing optimization out the window when developing new games, just relying on the latest hardware to power through it.
Yup, and the new hardware is now skyrocketing in prices. Nvidia and AMD still make 8GB cards and no new game can run on this shit anymore.
Well yes, ray tracing is required. A 6600 XT is a mid-range card from several generations ago with very early generation RT support. Not even the 7000 series does it super well.
Expecting anything good from a ray-traced game on a 6600 XT is a pipe dream, let alone at 4k. Minimum requirements on Steam usually mean 1080p/30fps (sometimes 720p/30fps WITH DLSS/FSR upscaling).
Yes, GPUs are expensive, but that doesn’t mean you should expect those kinds of titles to run on your hardware.
You haven’t mentioned what CPU you have; running at 4k increases demand there by a considerable amount too, especially with 2x 4k monitors.
If you want a GPU upgrade, the Arc Battlemage (and soon Arc Celestial) GPUs should be affordable. Battlemage is coming down to 300 bucks new right now, so those should handle newer titles better. But from a second-hand perspective, DON’T expect to be able to play RT-required games, let alone without upscaling.
I hate the prices as much as anyone else, but the only option we have is to wait for better second-hand cards over time, stick to indie/AA games (not AAA or “AAAA”), and possibly run games at 1080p instead of 4k.
Best of luck getting everything running!
Remember when oblivion came out? Or crysis? They were so hard to run that they became meme benchmarks lol
And now that “gaming” is incredibly mainstream, there’s a push to be more and more marketable to investors by pushing graphics technology, because that’s what sells.
Graphics too hard to run or not, I just want good games. And all this priority on intense graphical fidelity doth not maketh for many resources for the rest of the game and often shows a priority on PROFIT over all else.
Not buying games, for any reason, including you can’t play them, is probably the best and healthiest thing to happen to gaming since indie gaming…
So, go, play your 15 year old games. Enjoy what’s actually fun. The world will be a better place for it.
God damn, I couldn’t agree more. It’s been a blast playing older games, not gonna lie. I just can’t understand why I need to spend $700 plus on a GPU ONLY to be able to play something.
Because investors realized that gaming is lucrative and leads society. You think it’s a coincidence that Valve created basically the first successful digital distribution platform and now every entertainment medium has followed their model? Do you think microprocessors need to be that good on phones for web browsing? AI compute hardware came from gaming GPU technology; the software was originally made for and tested on games.
But, like music publishers trying to push the newest and latest and greatest music on us, only for you to realize that “oh yeah, 80s music actually kicks ass” and “oh fuck, 50s music WAS neat”, or, if and when you get there, “holy shit, classical is actually amazing and has lasted for literally hundreds of years, whereas current pop music is constantly only like 2 years old”…
…Like music in that way, old games don’t just magically become irrelevant and bad, despite what pop culture may try to tell us. Compatibility may be the biggest issue, but a lot of old games are legitimately better than newer ones (a lot of old games were bad too). It’s all the social and technical evolution of the medium, and once you start being able to look at it that way - once that perspective and vibe catches on a bit more - I think gaming will be healthier.
Don’t spend 700 bucks. The game will still be there. Wait. Play fun things now that you can, and if when it’s economically playable for you it’s still fun, then it’ll still be fun. There are far too many games right now to even seriously consider fomo for anything but the MOST socially important games, and those are few and far between, and usually very easy to run.
You’re doing fine.
Don’t look at how much 5090s cost in like Australia or some other countries.
Absolutely spot on. It needs a personal change, a change in mentality, in the way we think of entertainment, games in this case. We need to stop believing these profit-hungry people and stop chasing after those “shadows” and “highlights” and every single detail in games. When you enjoy a game, you won’t even notice all those “details”. You’re not going to be running around in a game looking at trees and sunlight. I guarantee you that it all becomes a blur in the background and you won’t even care about it.

It’s all a collective anxiety they’ve trained us into. Chasing after those fps numbers and details is what got us hooked on their exorbitant prices and shit performance. It was somewhat of a wakeup call for me, if you wanna call it that. I just got tired of stressing about my hardware, which is totally fine and capable, not playing their shit, poorly optimized games.

Then come those who defend corporations and berate you for “choosing the wrong resolution”. Why? I like 4k. It looks nice. Why do I have to bear the burden of their poorly optimized games? They ARE 100% more than capable of optimizing their games to run at 4k ON MY RX6600, but they don’t want to. They’re lazy. It won’t make them enough money. It won’t squeeze the last penny out of our pockets.

They’re also listening to Nvidia. Of course Nvidia wants us to believe that games are “so good” nowadays that we need their top-of-the-line $3000 GPU to play them at 4k. How dare we peasants play at 4k on a non-$3000 GPU? Blasphemy!!! Fuck ’em. At least there are still folks like you out there who understand this bullshit and don’t bootlick.
Lol you’re cute.
Also, it’s a fascinating novel perspective you’ve presented, in that our caring about fps and small details is a learned obsessive behavior. I’d love to hear more about how you think that works and came to be.
Go back further. It’s been like this ever since gaming on PC has been a thing: Doom, Wing Commander 3, Quake III, …
You’d have to go back to when gaming was dominated by the Amiga and Atari ST to find a time when it wasn’t like this.
I’d say thanks to Indie games, it is actually much easier to have a pleasing gaming life on low specs nowadays.
I don’t do much gaming these days, so my hardware isn’t appropriate and this likely wasn’t going to go anywhere anyways, but yesterday I had a quick thought whether I should buy Blue Prince and play it myself, before I watch someone else play it. And I kid you not, my first thought why I shouldn’t buy it was “Nah, it’s got 3D graphics, it’s probably not going to run well”.
I genuinely have no idea if that even makes sense. The game uses cel-shaded graphics, so you probably could’ve implemented it on the PS1.
Well, except that it needs a fairly high resolution to be able to see some of the puzzle clues. And it’s got soft lighting and whatnot. And the solo dev probably didn’t spend their entire dev time optimizing for hardware as weak as mine. I just thought it was funny that we’ve been doing 3D for 3 decades and somehow it’s still relevant to my judgement of how a game will run. 🫠
100%. And this is my whole point. Why is it OUR problem not the devs problem? Why do I have to buy into expensive shit?
Don’t worry - once you’ve upgraded everything the games are still the same yellow-markings-mean-climbable, branching-skill-tree, press-Y-to-Parry, level-gated-fetch-quests, enemies-drop-loot-that-is-not-as-good-as-what-you-already-have, collect-all-the-orbs nonsense your computer could already play.
Damn, that’s another thing. All games are the same, but have different environments. That’s why I get bored with them really quick.
If you find yourself playing old games more and more, [email protected] or [email protected] might just be the communities you need to join if you haven’t already.
[email protected] is another one worth looking at. It’s for people who don’t play games at launch, but wait a few years instead :)
Hell yeah, joined all three. Thank you
just replay Deus Ex forever lol
Might as well. lol. I started with Uncharted 1 and I’d forgotten how amazing that game is. Running at 4k 60fps no problem. I’m having a great time.
there are fun games that don’t require crazy GPUs/monitors, you can enjoy + support those instead
Not the newest game, but still newer: one of my biggest gripes is how much hardware you need to be able to run the newest Ratchet and Clank. I’m lucky my Steam Deck can run it or I’d be screwed.
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
I’m so used to having a toaster for a desktop I had to look up what FSR is. I’m too college student budgeted to touch FSR.
Last I played months ago, I recall it being fine enough when I had my deck docked. FPS was stable and didn’t look too bad. Though, I’m not an expert on anything graphics, so my word doesn’t mean much.
It’s a PS5-only game running on a portable device. Considering the state of a lot of ports (including this one at launch lol), it’s a miracle that it runs this well.
Okay I’m going to go against the grain, and will probably get downvoted to hell, but this is not new. This is PC gaming. This has always been PC gaming. Hot take - you don’t need 4k@60fps to be able to have fun playing games.
New games require top-of-the-line hardware (or hardware that doesn’t even exist yet) for high-ultra settings. Always have, always will. (Hell, we had an entire meme about ‘can it run Crysis’, a game that could only be played on low-medium settings on even the highest-end machines for a few years.) Game makers want their games to not just work now, but to look great in 5 years too. Unless you have shelled out over a grand this year for the absolute latest GPU, you should not expect any new game to run on great settings.
In fact, I do keep my PC fairly bleeding edge and I can’t drive more than High settings on most games - and that’s okay. Eventually I’ll play them on Ultra, when hardware catches up. It’s fine.
And as for low to mid level hardware I was there too - and that’s just PC gaming friend. I played Borderlands and Left4Dead the year they came out on a very old Radeon card at 640x480 in windowed mode, medium settings, at about 40fps.
Again, this is just what PC gaming is. If you want crisp ultra graphics, you’re gonna have to shell out the ultra payments. Otherwise, fine-tuning low to medium settings and becoming okay with sub-60fps is all fairly normal.
Personally, when I upgrade I find great joy in going back and “rediscovering” some of the older games and playing them on ultra for the first time.
And honestly, most modern games still look great on medium or low if you just put the textures on high. And that usually only affects VRAM usage and not performance.
I usually buy my PCs with an expected lifetime of about 10 years. And I don’t even buy the highest level components. Just enough to get High graphics at whatever is currently the most common resolution. After those ten years they usually can still play the newest releases at low settings while still looking better than ten year old games. You just have to play around with the settings a bit.
My Steam Deck is the only gaming PC I have that actually struggles to make new releases look good. But that was to be expected. And most of them still work if I tolerate abysmally low resolutions at 25 to 30 fps.
I disagree, and I’m too lazy to explain, but in short: it’s not about High/Ultra settings, that’s just a name for the settings. It’s about how the games look, play, etc. versus how they perform. And I don’t remember PC gaming ever being this bad before, even when we got shitty console ports.
You were so lazy it sounds like you agree with the og comment, not disagree. In essence, I think your comment aligns with the one you are replying to.
(Calling you so lazy is a light-hearted joke that ties into my point)
4:30AM 2.6.2025 BC (before coffee) morning me wrote that comment.
I honestly just glanced at and dreamt the comment I was replying to.
Hell is Us requires a 5600 XT to play (going by its minimum requirements on Steam); you have a 6600, so you should be good (obviously not at 4k though).
Also, on the whole 4k thing: the monitors might not have been a mistake, but at this point they’re probably an investment for the future, for when budget-friendly 4k cards are a thing. Until then, you’ll have to rely on upscaling from lower resolutions (if that’s possible with tools built into the game) or just play at a non-native resolution (1080p) for new games.
I can’t believe how poorly most Unity games run these days. We’re talking fairly basic 2D games that are struggling to run well on hardware from this decade. It’s really pathetic.
Yup, this is what I’m fucking saying. I’m so sick of it. I’m so sick of this “you’ll need a $3000 GPU to run a 4k game at 30 fps”. Like what the actual fuck? Who asked for this?
I’ve had to undervolt my GPU/CPU to use less power, but yeah, usage is terrible nowadays.
You want to play at 4k on an old low end card. I’m sorry but this one is on you.