I think you're imagining modern hardware as just a 4090. "Modern hardware" here means current-generation GPUs/CPUs, and yes, they should be able to run it at max settings. The performance of low-to-mid-range hardware from this generation overlaps with the mid-to-high range of the last generation (and so on), so just talking in years doesn't really translate.
People holding on to their 1080 Tis obviously shouldn't expect max settings, but people with 3080s, 4070s, or 6800 XTs (even 2080 Tis/3070s) should absolutely be able to play on max settings. Especially in games like Starfield that aren't exactly cutting edge; there are plenty of older games that had real work put into performance from the start, and they look and run a lot better than this.
I have an i9-9900K and a 4070 Ti and can play it butter smooth at max settings in 4K at 100% render resolution. The CPU is definitely starting to show its age, but I haven't had any complaints about Starfield's performance.
That said, I can't fucking stand consoles. I get that companies would be stupid not to sell to as many people as possible, but I'm so sick and tired of seeing good games handicapped because they need to run on a rotten-potato Xbox from 10 years ago or whatever…
Like 40-45 fps? I've seen a couple of people say this now, but every outlet I've seen benchmark the game contradicts it. I don't consider 40 fps smooth at all, but I guess consoles even have to suffer through 30 fps in some cases, so a lot of people are okay with it.
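Just to put numbers on why 40 fps doesn't feel smooth to me: frame time is simply 1000 / fps, so dropping from 60 to 40 adds over 8 ms to every single frame. A quick back-of-the-envelope check (this is just illustrative arithmetic, nothing game-specific):

```python
# Frame time in milliseconds for a few common frame rates.
for fps in (30, 40, 45, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:.1f} ms per frame")

# Output:
#  30 fps -> 33.3 ms per frame
#  40 fps -> 25.0 ms per frame
#  45 fps -> 22.2 ms per frame
#  60 fps -> 16.7 ms per frame
# 120 fps ->  8.3 ms per frame
```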
Consoles dictate a lot of triple-A games; that's where the biggest profit is, and it's why PC is an afterthought like it was here.
I actually never pulled up an FPS meter since it's been so smooth I never felt the need to check. I'll see when I get home later what it actually is in Neon or somewhere "busy."
Do you have motion blur on?
I'll never understand why developers add stuff that makes the game look so much worse…
Looking at you, chromatic aberration, motion blur, film grain, vignette…
The first thing I do with a new game is check the graphics settings and nuke all that extra garbage lol
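If anyone wants to script that instead of clicking through menus every time, here's a rough sketch of the idea using Python's configparser. The ini path, section name, and keys below are placeholders I made up for illustration, not Starfield's (or any game's) actual settings, so check your game's real config file before reusing any of this:

```python
from configparser import ConfigParser
from pathlib import Path

# Hypothetical config location and keys, purely for illustration;
# real games name their ini files and settings differently.
ini_path = Path.home() / "Documents" / "My Games" / "SomeGame" / "SomeGameCustom.ini"

# Post-processing effects I always turn off, mapped to made-up keys.
overrides = {
    "bMotionBlurEnabled": "0",
    "bChromaticAberrationEnabled": "0",
    "bFilmGrainEnabled": "0",
    "bVignetteEnabled": "0",
}

config = ConfigParser()
config.optionxform = str  # preserve key capitalization
config.read(ini_path)

if not config.has_section("Display"):
    config.add_section("Display")
for key, value in overrides.items():
    config.set("Display", key, value)

with open(ini_path, "w") as f:
    config.write(f)
```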
Yup, sure, add it, but at least disable it all by default. That said, motion blur does make low fps look better, if you can put up with the blur (I can't); it's used heavily on consoles for exactly that reason.
Modern literally means the most recent release, and games should be pushing those to the limit on max settings. I semi-agree that even the next generation of GPUs should be able to get more out of the game, i.e. design the game for the future.
If you're expecting a 2080 to run a game on max, what limits are we pushing with every new gen? You'd be hampering yourself and leaving a bunch of unused headroom on modern and semi-modern GPUs.
Which, as I explained, would mean the 4000-series/7000-series GPUs and 13th-gen/Zen 4 CPUs, but the worst part from one of those lineups isn't better than the best of the previous generation, so it's not as cut and dried as "modern/old."
Starfield is pushing no limits; that's the point. It's just built like shit, so it runs like it. I could maybe be swayed a bit if it were absolutely groundbreaking, but it isn't. It's Fallout 4 in space with less going on at any one time.