lol. has anyone found ways to optimize starfield for their pc, like reducing stuttering, FPS drops, etc?
https://www.youtube.com/watch?v=uCGD9dT12C0
Get a new game engine, Todd. Bethesda owns id Software. id Tech is right there.
IdTech isn’t built for open world gameplay
Rage??? Wasn’t Rage open world?
Kind of. It was more smaller locations linked together by loading screens, a la Borderlands 2, rather than the seamless worlds Bethesda are usually known for. Although you could definitely argue that this was the approach Bethesda took for Starfield.
Wasn’t the drivable overworld one big map? I honestly can’t remember now, it’s been so long since I played it.
I do remember them harping on about “megatextures”, and what this seemed to mean was that just turning on the spot caused all the textures to have to load back in as they appeared. I dunno if they abandoned that idea or improved it massively, but I don’t remember any other game ever doing that.
IdTech 7 does not use megatextures; the last engine to use them was IdTech 6.
My memory could also just be fuzzy. Might have been more like Oblivion and Skyrim.
As for the megatexture thing, it’s not done anymore because it’s not needed. The reason they had to have textures load back in was that the 360/PS3 only had 512MB of total RAM, and while the 360’s RAM was shared, the PS3 had two 256MB pools for the CPU and GPU respectively. Nowadays even the Xbox One is rocking 8GB.
I thought megatextures were more to avoid the tiled look of many textured landscapes at the time. The idea was that artists could zoom into any point and paint what they needed, without worrying that it would then appear somewhere else on the map.
Looking around, some people seem to think they were replaced by virtual texturing, but I’ve been out of the loop for a long time so haven’t really kept up with what that is. I assume it allows much the same thing, but far more efficiently than one giant texture map. Death Stranding must use something similar, because as you move about you wear down permanent paths in the landscape.
Right I think I got confused. The megatexture is a huuuuge single texture covering the entire map geometry. It has a ridiculous size (at the time of Rage, it was 32,000 by 32,000). It also holds data for which bits should be treated as what type of terrain for footprints etc.
The problem with this approach is it eats a shit ton of RAM, which the 7th gen consoles didn’t have much of. Thus the only high-quality textures that were loaded in were the ones the player could see, and they were unloaded when the player couldn’t.
Megatextures are used in all IdTech games since, but because those games weren’t open world and/or targeted 8th gen consoles and later, which have much more RAM, unloading the textures isn’t necessary.
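If anyone wants to see the streaming idea in miniature, here’s a toy sketch (made-up names and sizes, nothing like actual idTech internals): the giant virtual texture gets split into fixed-size pages, only the pages the current view needs stay resident, and the least-recently-used ones get evicted when the memory budget is full. That’s also why spinning on the spot on a 512MB console made textures visibly stream back in: the pages behind you had already been evicted.

```cpp
#include <cstdint>
#include <cstdio>
#include <functional>
#include <list>
#include <unordered_map>

// Toy model of megatexture-style streaming. All numbers and names here are
// invented for illustration.
struct PageId {
    uint32_t x, y; // page coordinates inside the huge virtual texture
    bool operator==(const PageId& o) const { return x == o.x && y == o.y; }
};
struct PageIdHash {
    size_t operator()(const PageId& p) const {
        return std::hash<uint64_t>()((uint64_t(p.x) << 32) | p.y);
    }
};

class PageCache {
public:
    explicit PageCache(size_t budget) : budget_(budget) {}

    // Called for every page the current view needs this frame.
    void request(PageId id) {
        auto it = lookup_.find(id);
        if (it != lookup_.end()) {
            // Already resident: move to the front of the LRU list.
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        if (lru_.size() == budget_) {
            // Evict the least-recently-used page to stay within RAM budget.
            lookup_.erase(lru_.back());
            lru_.pop_back();
        }
        // In a real engine this would kick off an async disk read; here we
        // just mark the page resident.
        lru_.push_front(id);
        lookup_[id] = lru_.begin();
        std::printf("streamed in page (%u, %u)\n", id.x, id.y);
    }

private:
    size_t budget_;
    std::list<PageId> lru_; // front = most recently used
    std::unordered_map<PageId, std::list<PageId>::iterator, PageIdHash> lookup_;
};

int main() {
    PageCache cache(4); // tiny budget so evictions are easy to see
    for (uint32_t x = 0; x < 6; ++x) cache.request({x, 0}); // pan the camera
    cache.request({0, 0}); // turning back: page 0 was evicted, streams again
}
```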
They did lol, and that’s a really dumb question by a tech illiterate. Optimization isn’t a boolean state, it’s a constant ongoing process that still needs more work.
Also, optimization happens between a minimum and a maximum. If Bethesda decides the game should have a certain minimum visual quality (texture and model resolution, render distance, etc.), that sets the minimum hardware needed to handle it.
What modders have done so far is lower that minimum further (for example, by replacing textures with lower-resolution versions). That’s a good way to make the game run on older hardware, but it’s not an optimization Bethesda would or should do, because that’s not what they envisioned for their game.
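To make that concrete: a lower-res replacement texture is basically a downscale pass over the original asset. Here’s a toy sketch of a 2x box-filter downscale (illustrative only, not actual modding-tool code); halving both dimensions cuts the texture’s memory and bandwidth cost to a quarter, at the cost of visual quality.

```cpp
#include <cstdint>
#include <vector>

// Halve each dimension of an RGBA8 texture with a 2x2 box filter.
std::vector<uint8_t> downscale2x(const std::vector<uint8_t>& src,
                                 int width, int height) {
    const int w = width / 2, h = height / 2;
    std::vector<uint8_t> dst(size_t(w) * h * 4);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            for (int c = 0; c < 4; ++c) { // average the 2x2 block per channel
                int sum = 0;
                for (int dy = 0; dy < 2; ++dy)
                    for (int dx = 0; dx < 2; ++dx)
                        sum += src[((y * 2 + dy) * width + (x * 2 + dx)) * 4 + c];
                dst[(size_t(y) * w + x) * 4 + c] = uint8_t(sum / 4);
            }
        }
    }
    return dst; // 4x less VRAM and bandwidth than the source texture
}

int main() {
    std::vector<uint8_t> tex(4 * 4 * 4, 255); // tiny 4x4 white RGBA texture
    auto smaller = downscale2x(tex, 4, 4);    // now 2x2, a quarter the memory
}
```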
Always irks me when gamers talk about optimization like it was a specific process or just changing a setting
Todd just had to toggle the “make game run good” button, massive oversight to be honest.
/s
So what’s this about modders immediately being able to improve performance with mods within a week after release?
They only meant to say that Bethesda did optimize throughout the development process. You can’t do gamedev without continually optimizing.
That does not mean Bethesda pushed those optimizations particularly far, and you can definitely say that Bethesda should have done more. Point is, if you ask “Why didn’t you optimize?”, Todd Howard will give you a shit-eating-grin response that they did, and technically he is right about that.
It’s usually several small things, like stopping the game from reading a 5GB file over and over again.
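As a made-up sketch of that kind of fix (not actual Starfield code, and the file name is invented): put a cache in front of the loader so the expensive read happens once.

```cpp
#include <fstream>
#include <iterator>
#include <string>
#include <unordered_map>
#include <vector>

// Cache file contents after the first load instead of hitting the disk on
// every request.
class FileCache {
public:
    const std::vector<char>& load(const std::string& path) {
        auto it = cache_.find(path);
        if (it != cache_.end())
            return it->second; // second and later calls: no disk I/O at all

        std::ifstream in(path, std::ios::binary);
        std::vector<char> bytes((std::istreambuf_iterator<char>(in)),
                                std::istreambuf_iterator<char>());
        return cache_.emplace(path, std::move(bytes)).first->second;
    }

private:
    std::unordered_map<std::string, std::vector<char>> cache_;
};

int main() {
    FileCache cache;
    cache.load("assets.dat"); // hypothetical file: read from disk once
    cache.load("assets.dat"); // served from memory, the expensive read is gone
}
```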
That they know they won’t do
My entire comment is about why your response doesn’t make sense. They do optimize, and it’s not a process that’s ever “done”. The question is how optimized it is and whether it runs well on the targeted specs.
Kiss my ass Todd, my 6700xt and Ryzen 5 5600x shouldn’t have to run your game at 1440p native with low settings to get 60fps
Look man, I’m not trying to defend Howard here or imply you’re tech illiterate, or that all your issues will clear up 100%, but have you by chance updated your drivers? Mine were out of date enough that Starfield threw up a warning, which I ignored, and I was not having a good experience on the same hardware as you (dunno your RAM or storage config, but I was running an average NVMe and 32GB of RAM with a 5600X and 6700XT). After I updated, a lot of the issues smoothed out. Not all, but most. I’m at 60fps average with a mix of medium/high at 1080p though. Maybe worth a try?
I’ve checked for new drivers daily since the game came out; the ones I have are from mid-August though, so maybe I’ll double-check again.
I’m on a 5700 XT and the game runs around 60fps at 1080p with everything cranked and no FSR or resolution scaling applied, so I’d say either your drivers are out of date or something else is wrong there imo.
I’ll have to double-check. Nothing immediately stands out as wrong: the game is on an NVMe drive, I’m running 3600MHz CL14 memory, and I just redid the thermal paste on my CPU. With all that being said, most other games I play get 100+fps, including Forza Horizon 5 with its known memory leak issue and Battlefield V, so I don’t think anything is wrong with the system.
honestly it runs fine on my 5700 XT / R5 3600 combo. not max settings, I set it to “high” from memory as the game defaulted to the minimum for me, but I could bump it up no worries. no real frame rate or stuttering issues. I’d love to run it higher but I’m a realist, and a new PC is on the cards over the next year anyway.
I have a 5700 xt as well, paired with a 5600X. The game runs perfectly well. I honestly had more issues with stuttering in BG3.
My 3070 can’t get a consistent 60fps at 1080p high. No stuttering though, just low fps.
i actually turn the frame counter off. i know im not getting 60, but what i am getting is sufficient that it doesnt ruin my fun, and on my older hardware im ok with it. if i looked at the counter i would probably be more disappointed
Me too. I only turn it on to find the settings that give a good balance between visuals and performance, and then turn the fps counter off.
Can we get HDR support please Todd? No?
I installed an optimized textures mod and instantly improved my performance by like… 20 frames, maybe more.
I have an RX 6600 XT that can run Cyberpunk on high no problem. C’mon Bethesda, the game is really fun, but this is embarrassingly bad optimization.
16 times the detail?
My PC runs better games just fine.
I did, just buy a new one bro.
Am I the only one playing this game just fine on a RTX 2070?
They all wanna run it at 4k resolution…
I’m playing with an R7 5700x and a 3060ti. NVMe SSD storage, 32gb RAM. Ultra settings at 1440p on an ultra wide monitor. I don’t understand the fuss, it is completely playable for me. I guess my standards are just lower? I don’t know. It’s seriously fine. I try not to get wrapped up in actual numbers, I’m more concerned about how it looks and feels to my actual eyes. Not the prettiest game I’ve ever seen but it looks alright to me and it feels smooth to me.
“fine” is subjective, I guess.
To me fine means above 60fps at all times.
I have a “UFO rated” computer on UserBenchmark and it’s rarely above 40fps. And the game is not even pretty. Not fine by my standards.

Haha yeah I’m on the same and it’s great! I did have issues when I didn’t realize I’d installed it on my HDD, but once I moved the install to my SSD, no problems at all on the 2070.
runs great on my 2080, just the occasional white lines that look like shooting stars. not actually sure if that’s what they’re supposed to be
Todd says “Toddally normal, is best game, five stars”
With Xbox sales in the gutter you would think they would make an effort to make some money on PC sales, but nope. Looks like the game is a bomb.
I think their strategy is actually the reverse. They’re trying to strengthen Xbox by making this game a quasi-exclusive for it.
I don’t think they’ll gain many new Xbox customers with Starfield, as it’s neither exceptionally good nor does it do anything you can’t find in other games, so it won’t make anyone buy a new console for it.
But since their other big IP this year, Redfall, was a complete dud, they’re probably more worried about losing long-time customers.
This. I think they expected an Xbox sales surge.
AMD might have had a surge in component sales due to Starfield bundles. But I can’t see it selling a lot of Xbox consoles when it’s a game that kind of makes the console look bad with a 30fps cap even on the top-of-the-range variant.
Optimise for which PC though? There’s only so much you can optimise for general PCs.
Nah, there are tons of things you can optimize independent of the hardware. The whole industry runs on smoke and mirrors, because even a 2D game can bring the strongest hardware to its knees if it’s badly coded / unoptimized.
(Yes, I have experience with that. 🙃)
And there’s always more smoke and mirrors you could be integrating to squeeze out more performance.
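For a concrete toy example of how even a 2D game can do that (invented numbers, not from any real game): naive collision detection compares every pair of sprites, which is O(n²), while bucketing sprites into a coarse grid only compares neighbors. Same result, wildly different cost per frame.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

struct Sprite { float x, y; };

int collisionsNaive(const std::vector<Sprite>& s, float r) {
    int hits = 0;
    for (size_t i = 0; i < s.size(); ++i)          // 10,000 sprites means
        for (size_t j = i + 1; j < s.size(); ++j)  // ~50 million checks/frame
            if (std::hypot(s[i].x - s[j].x, s[i].y - s[j].y) < r) ++hits;
    return hits;
}

int collisionsGrid(const std::vector<Sprite>& s, float r) {
    // Bucket sprites into cells of size r; collisions can only happen
    // between sprites in the same or adjacent cells.
    std::unordered_map<int64_t, std::vector<size_t>> grid;
    auto cellKey = [](int cx, int cy) {
        return (int64_t(cx) << 32) | uint32_t(cy);
    };
    for (size_t i = 0; i < s.size(); ++i)
        grid[cellKey(int(std::floor(s[i].x / r)),
                     int(std::floor(s[i].y / r)))].push_back(i);

    int hits = 0;
    for (size_t i = 0; i < s.size(); ++i) {
        int cx = int(std::floor(s[i].x / r)), cy = int(std::floor(s[i].y / r));
        for (int dx = -1; dx <= 1; ++dx)       // only check the 3x3 block of
            for (int dy = -1; dy <= 1; ++dy) { // neighboring cells
                auto it = grid.find(cellKey(cx + dx, cy + dy));
                if (it == grid.end()) continue;
                for (size_t j : it->second)
                    if (j > i && std::hypot(s[i].x - s[j].x,
                                            s[i].y - s[j].y) < r) ++hits;
            }
    }
    return hits;
}

int main() {
    std::vector<Sprite> sprites;
    for (int i = 0; i < 10000; ++i)
        sprites.push_back({float(i % 100), float(i / 100)}); // 100x100 lattice
    // Same answer, wildly different cost per frame:
    std::printf("naive: %d\n", collisionsNaive(sprites, 1.5f));
    std::printf("grid:  %d\n", collisionsGrid(sprites, 1.5f));
}
```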
Check out the Steam hardware survey and target the most used config. That way you can make sure your game is enjoyable for the most players.
This would be insane. The majority of Steam users are running outdated hardware. Devs aren’t going to cut their PC games down just to focus on the majority.
Consoles at this point are literally just outdated hardware (within months of release) that everyone agrees to keep supporting. It really wouldn’t be that different for a dev to use the Steam hardware survey to come up with clear patterns for their target system.
Insane, you say? So it’s much more sane to aim your PC optimization at a config that only the top 5% use, so that 80% of potential users can’t run it?.. interesting definition of insanity you have there. Forsaking 80% of your possible target group, and therefore missing out on a bunch of money, and instead putting out some hot garbage that needs a PC costing as much as a small new car and still looks like absolute shit.
People’s PCs will improve in years to come and as tech advances, games like Starfield will look fantastic and run remarkably well. PC gaming always used to be about pushing the boundaries of what’s possible, not catering to decade old hardware.
Sure, but up until recently new games still looked and ran kinda decent on mid-tier off the shelf hardware
Starfield, just like any new triple-A title (excluding BG3), is just more proof of how incompetent, greedy, and fucked up big game studios have become.
I’m sorry, but I don’t think Starfield looks nearly as good as the performance it demands. And to get acceptable performance on current hardware, you have to crank the quality down so far that it looks shit again.
This isn’t “pushing the boundaries”. This is simply “not understanding what the market wants”.
Not optimized for HIS pc
Works on my PC
– That guy
This argument was a really good one when consoles used to be highly specialized to play games, but new ones are just PCs with a different OS.
Consoles are essentially PCs locked down to gaming but they still have their own APIs and have very few hardware variations. Games can be optimised for the handful of different consoles in ways that just aren’t possible with the thousands of combinations of PC components.
One good example from current gen is the shared RAM between CPU and GPU in the PS5. That doesn’t really exist in the PC world (yet), even in systems with “shared VRAM” (in those PC setups, the GPU just gets a chunk of regular non-VRAM that the CPU will no longer access until the GPU gives it up). In the PS5 it’s implemented as a way to eliminate making copies of data between system RAM and VRAM, which can hypothetically be a boost to efficiency, depending on the workload. Of course it also leads to a cheaper hardware bill of materials, which was probably Sony’s primary impetus.
I think the trend for manufacturers has clearly been away from that sort of thing, though. There used to be very deep, architectural differences between PCs and consoles (anyone remember PS3’s Cell?), and for the most part, those days are over.
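Here’s a toy model of that copy-elimination point, if it helps (invented types, nothing like a real graphics API; it just shows where the extra memcpy lives on a discrete card and why a unified pool can skip it):

```cpp
#include <cstdio>
#include <cstring>
#include <vector>

struct DiscreteGpu {
    std::vector<float> vram; // separate memory pool: data must be copied over
    void upload(const std::vector<float>& systemRam) {
        vram.resize(systemRam.size());
        std::memcpy(vram.data(), systemRam.data(),
                    systemRam.size() * sizeof(float)); // the copy a unified
                                                       // pool avoids
    }
    float readFirst() const { return vram[0]; }
};

struct UnifiedGpu {
    const float* shared = nullptr; // CPU and GPU address the same bytes
    void bind(const std::vector<float>& pool) { shared = pool.data(); }
    float readFirst() const { return shared[0]; } // no upload step needed
};

int main() {
    std::vector<float> vertices(1000000, 1.0f); // CPU-side geometry data

    DiscreteGpu dgpu;
    dgpu.upload(vertices); // discrete card: ~4MB copied before the GPU sees it

    UnifiedGpu ugpu;
    ugpu.bind(vertices);   // unified memory: the GPU just reads the same pool

    std::printf("%f %f\n", dgpu.readFirst(), ugpu.readFirst());
}
```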
I have an old FX-8350 processor, 16GB of RAM, and a 3060 RTX video card. I locked the FPS to 30 through Nvidia Control Panel and it runs pretty well, better than I expected on an old 8 (sorta 4) core processor from 2012. Before I locked it the FPS kept varying between 30 and 60 and that made me feel queasy.
The main thing I don’t like is how faded/hazy a lot of things look; it needs some contrast or sharpening. I installed a mod, but it didn’t go far enough for my taste.
I miss my old FX-8350 :( That’s the cpu I used in my first build. Only issue I ever had with it was Denuvo DRM.
Sounds like you may want to look into ReShade. There are a ton of ReShade presets up on the Starfield Nexus, many of which may make the improvements you’re looking for.
Yep that will be my next step, too busy playing for now lol. Hopefully Bethesda will add some things like brightness/contrast, FOV, post-process level, texture quality, and sharpening.
Sad but true