So, another "modern gaming" trend that's getting on my nerves is the hyper-emphasis on upscaling technologies. I game at 1080p. I want to play games at 1080p. I'll lower almost every other setting before I drop below native 1080p. But it seems like no new game actually wants to run at 1080p. I was playing COD the other night while watching The Act Man's new video on Modern Warfare from 2007, and I noticed: wait, why do the graphics look so much better in that 2007 game than in my 2025 game? And I quickly realized, OH WAIT, THE GAME WAS RENDERING BELOW NATIVE RESOLUTION BY DEFAULT, no wonder everything is so blurry. So I switched to FSR at native resolution and it tanked my framerate hard, from around 160 FPS to around 70 with dips. And... ugh, this is an issue I've had with all my new games so far. They all want to render below native or slap TAA on by default, and it just looks SO FRICKING BAD! But then you go back to native res and the game runs like hot garbage. It's ridiculous.
Look, I'm getting to the point where I think a lot of games looked better 10 years ago, because at least they were designed around running at native resolution. And quite frankly, the AA techniques used back then were better. We had FXAA, which introduced some blur but not much, with MSAA being the demanding one: 4x made the image look nice and sharp, and even 2x improved things. And if you couldn't run it, you turned it off. You had jaggies, but at least the image was sharp. Now they basically FORCE AA on you, and it's built right into these upscaling techs; FSR and DLSS both do their anti-aliasing as part of the upscaling. And if you don't want to use that, there's TAA, which looks like absolute crap and makes the image look like it's been horrifically downscaled.
Battlefield 6 looked awful until I swapped from TAA to FSR at native. And then, once again, my frames dropped: the image is sharp now, but I'm running around 60 with dips. And I've gotta watch that 8 GB VRAM buffer, which is crippling even a fricking 6650 XT (which is around 5050-level performance, if not worse these days, given how AMD has kind of let the drivers languish).
Doom: The Dark Ages? Don't even try running that at native. VRAM usage blows past 8 GB, it stutters, it runs at like 45 FPS. So I've gotta scale THAT down. Thankfully it has a sharpening option, which I tweaked until it looked like native again, but I shouldn't need this many workarounds just to get something that looks like native resolution.
In the past, games were just designed for native resolution. Period. Go back and play a 360-era game and it looks sharp and crisp, even if the graphics are dated. Modern games have much better visual quality on paper, but in practice they look like hot garbage because they're rendering at something like 720p and upscaling to 1080p. And devs just treat that like it's normal.
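And I'm not exaggerating about the 720p thing. Here's a quick back-of-the-envelope sketch, assuming the standard FSR 2 / DLSS 2 preset scale factors (individual games can and do tweak these), of what your GPU is actually rendering when the output is supposedly "1080p":

```python
# Rough math: internal render resolution at a 1080p output for the
# usual temporal-upscaler presets. Scale factors below are the
# standard FSR 2 / DLSS 2 per-axis ratios; exact values vary by game.
presets = {
    "Native / off":      1.0,
    "Quality":           1 / 1.5,   # ~67% per axis
    "Balanced":          1 / 1.7,   # ~59% per axis
    "Performance":       1 / 2.0,   # 50% per axis
    "Ultra Performance": 1 / 3.0,   # ~33% per axis
}

out_w, out_h = 1920, 1080
for name, scale in presets.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = 100 * (w * h) / (out_w * out_h)
    print(f"{name:<18} {w}x{h}  (~{share:.0f}% of the output pixels)")
```

Quality mode at 1080p lands at exactly 1280x720, which is why so much of this stuff has that soft, smeary 720p look.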
I've never been into upscaling. When the tech came out I was like "yeah, that's nice," but let's face it, it was meant for people upscaling to 1440p or 4K. It doesn't work as well at lower resolutions, and it was never meant to take an even lower resolution and upscale it to 1080p. But devs have kind of just gotten lazy and stopped optimizing, and now if you want native resolution, you've gotta give up every other setting. It's crazy. I hate it.
To me, this is not progress. Low resolution looks like crap. Upscaling looks like crap. I don't care how much better graphics look in the 2020s; that stuff stopped being a huge deal long ago. I haven't been wowed by a game graphically in years, and part of it is that most of them look like smeary messes in practice because of the low resolution. That, and ray tracing isn't the revolution everyone acts like it is. Quite frankly, much like AI, all these new techs are, as far as I'm concerned, artificially shoved down our throats, and they don't make gaming better, they make it worse.
I'd literally rather have 360-era graphics at native res than 2020s graphics with upscaling. Or at least PS4-era graphics at native; that seems to be the sweet spot. Really. I'm tired of these devs chasing trends and making gaming worse. It's one thing if I'm playing a game on aging hardware and it runs like crap, but when they're still selling cards at that power level for $250 even today, and games don't run right on them unless you upscale, it's a problem.
People will say we need more powerful hardware for these games to run, but in reality, the devs choose their own detail and framerate targets. Games should be made for the hardware people actually have. And when devs keep raising the bar while the tech isn't getting any cheaper, that's a problem.
There's NO reason to make a game that runs below 1080p/60 FPS on PS5-tier hardware. If you do that, you've just decided to make a poorly optimized game. Period, sorry, not sorry. You could just as easily make a slightly worse-looking game that still looks reasonably good. It's not the 2000s anymore; the difference between a truly great-looking game and an okay-looking one isn't that huge anyway, now that the tech has advanced to the point where all games look good.
So yeah, less emphasis on fancy lighting effects almost no one notices, more emphasis on running stuff at actual native resolution at good framerates. Because again, if you ain't running at native, it looks like hot garbage anyway and none of the visual improvements are worth it.