Wednesday, October 22, 2025

Discussing "what's happening with system requirements is getting ridiculous"

So, I saw this video, and I wanted to discuss it. Basically, it talks about something I've talked about on here before, and it's been a pet peeve of mine for a while: system requirements inflation.

Previously, I discussed it in the context of "peak gaming", and how it now takes an order of magnitude more resources just to get noticeably better graphics. I stand by that, but I also want to discuss some other stuff.

One topic being optimization. Developers are horrible at optimizing these days. He's right: if Arkham Knight, a 2015 game, released today, it would require an RTX card for no reason. Developers don't care about being efficient with resources; as hardware improves, they just target, like, 5-year-old hardware and call it a day. It's why a machine that was basically a supercomputer in 2008 is now slow as crap at doing the same things. 4 GB of RAM? Don't make me laugh. Dual or quad core CPU? Too slow. 8800 GT? Half as powerful as modern integrated graphics, which struggle to do basic things. Does stuff need this much power? No. Do developers care? No.

This happens every console generation, by the way. When new consoles come out, they become the development target, and if you get a really powerful console generation like the PS3/360 were back around 2005-2006, or the PS5/Series X were in 2020, requirements are gonna inflate. And it's ALWAYS BS like this.

Like, you notice this more as a PC gamer, because graphics are iterative. Every year, games look better. Back in the 90s and early 2000s, it was easy to see how a few years' difference led to massive improvements, but around the start of the 360/PS3 generation, I noticed the optimization issue. Around 2005-2006, when those new consoles came out, suddenly requirements went up massively, sometimes for no reason. How do I know this? Because of the games at the time. Take Half-Life 2, a late gen 6 game and the first Source engine game. It ran on my PC at the time at 30-60 FPS. Take Half-Life 2: Episode Two. Same engine more or less, same graphics... ran at 15 FPS. Why? Because F U, that's why. Now, it's not a huge deal anymore since, as a 20-year-old game, it'll run on a fricking smartphone these days in theory, but did it kinda piss me off at the time? Hell yes. And it wasn't the only one. Battlefield 2 ran at 30+ FPS. Battlefield 2142, which had the same graphics and came out one year later... 15 FPS. Team Fortress 2 in 2007, which also ran on Source? Ran in the single digits half the time on my PC. I compensated by playing Pyro, but god, it was awfully optimized. Was it justified? No. It was just the Half-Life 2 engine again. Counter-Strike: Source ran at like 30-60 FPS. Why did this run at like 15? Again, because they didn't optimize it.

We also saw this at the end of the PS3/360 generation and the transition to the PS4. Battlefield 3 ran mostly at 60 on my system, maybe with some dips to 45 on the DLC. Battlefield 4 ran at a stuttery 30 FPS and somehow looked WORSE. Like, the game looked worse, and it ran worse. This happens every console launch. You get beautiful, well-optimized late-generation games that look as good as or better than the next generation of games, and then the launch titles on the next console gen require tons more resources while looking the same or worse. Because the devs don't try to optimize. As hardware gets more powerful, developers get lazy and stop optimizing for older hardware, and games look like crap while running like crap.

And the PS5 generation has ALWAYS been this way. It's just terrible. And it's even worse now since we ARE at peak gaming, where we can't even rationally justify the graphics based on looks alone. Like, it takes way more resources for games to look worse. And now, in the 2020s, hardware development is hitting a wall as Moore's law slows down, and the GPU industry is being run monopolistically, where Nvidia wants like $400 just for a 60-class card? Yeah, sure, let's keep doing this. Admittedly, I was wrong about one thing: you don't need a 4080 unless you're running at 4K. I'd say the minimum for a decent quality experience is like... a Ryzen 5 5600X, 16 GB of RAM, and an RX 6600/RTX 3060. Which isn't too terrible, but keep in mind, the current consoles are 5 years old now and they're roughly as powerful as that hardware anyway.

I already dread next gen. That's coming in maybe 2-3 years, and I'm already hearing rumors of like $800 PS6s and $1200 Xboxes. It's INSANE. Do these people think we're made of money? Which is what I do wanna touch on: a big reason this cycle exists is consumerism. It's like what the Hunnicutt books said about consumerism. They don't want people to "beat capitalism." You're SUPPOSED to feel like you're on a never-ending treadmill where you can't get ahead. Because you're not supposed to get ahead. Your lifestyle was designed. You will consume, and you will always buy new products. It's planned obsolescence. Old computers are supposed to be thrown out every 4-8 years. Smartphones and tablets every 2-3. Devs stop optimizing. New hardware is pushed. You buy new hardware to play the newest games. The newest games don't, or shouldn't, require this level of hardware, but if they didn't, maybe people would stop buying new stuff, and that means profits go down. No, they need you to keep consuming, and working, so the rich guys at the top make money. So you work, so you can consume. And then your stuff breaks, so you work more to produce new stuff, which you then consume. It's a never-ending cycle. You're not supposed to rest, you're not supposed to chill. Just keep working and buying, working and buying. Because God forbid people don't spend their whole lives working, and don't have to buy the newest thing just to do what they've always done.

It would be one thing if real progress was being made. Back 20-30 years ago, you could argue that it was. We went from 2D games, to 3D, to varying levels of rapidly improving 3D. But I'm gonna be honest, the 360/PS3 was the last time I really felt magic over a new console, because it was the last true leap that was like, WOW, this is WAY better than the last one. Even then, after becoming a PC gamer, I could see how the leap, from a PC standpoint, was more iterative than a generational jump. Again, good games from the last year or two of gen 6 looked almost as good as early gen 7 games. Late gen 7 games looked just as good as, or sometimes better than, early gen 8 games. And gen 9? Forget it. It's just gen 8 but terribly optimized, with technologies I never asked for, like ray tracing, being rammed down my throat. And that's the thing: as a consumer, did I ever ASK for this stuff? No. These corporate people just decided they wanted to push it on us, created the ecosystem that eventually made it mandatory, and made us have to upgrade, even though in theory a fricking 1060 should STILL be perfectly capable of running modern games. If it was enough for late gen 8, it should be enough for gen 9. But it isn't, because, well, you need ray tracing now. Even if you can't tell the difference most of the time.

And by the way, I ain't gonna say there's NO difference in graphics between now and, say, 2018, but is it enough to justify such a massive jump in requirements? No. But again, everything is designed around newer hardware now, so if you've got the old stuff, you're left behind whether you like it or not.

I'm just saying, back in the day, sometimes you'd see a difference. Back in the 2000s, some PC games would look 5 years ahead of everything else that was out. Think Doom 3, a late gen 6 game. Or Crysis. Crysis was mind-blowing if you had a top-tier PC at the time, and its high requirements were justified. But Crysis was really the last game that wowed me on that scale. Maybe PlanetSide 2 for CPUs. Or like, Battlefield 1 or V, those were amazing-looking at the time and pushed hardware, especially CPUs. But then 2042 had the whole "requires far more resources while looking way worse" curse, and BF6 isn't even pushing boundaries after that disaster.

Oh, and one topic I do wanna touch on while I'm here. FRICKING POKEMON LEGENDS ZA, MAN. That game looks truly HORRIBLE. And people are like, "well, what do you expect, it's for a last-gen handheld." Uh... that handheld is based on an Nvidia Shield tablet and packs about as much power as the PS3/360. And the game looks like it's on, like, a Sega Dreamcast, or PS2, or GameCube. The only platform where that tier of graphics should be acceptable is, like, mobile gaming. Even then, people have posted screenshots of games like Honkai: Star Rail and are like, yeah, this looks WAY better than this $70 Switch title. And I know some are like, but it's "FUN." I mean, I loved Red/Blue back in the day, but come on, have standards. There's no reason a $300 handheld with $70 games should have games that look, and run, that badly. But again, optimization.

If you go back 20 years, this is what people were cooking on the same level of graphics hardware:

 https://www.youtube.com/watch?v=vbkIoxuMI38

 https://www.youtube.com/watch?v=BiNK2k1OtMM

 https://www.youtube.com/watch?v=ww1TjHuBZYo

Those were early PS3/360-tier games. They ran on pretty modest hardware, and they came out before the system requirements boom I mentioned. They ran quite well, they looked amazing for the era, and I was genuinely wowed by them at the time.

Then you look at Pokemon Legends ZA and it's like... WHAT ARE YOU DOING?! YOU'RE CHARGING $70 FOR THIS?!

Heck, just to make a comparison with modern games: while looking up footage of the above games, I found this, which kinda proves my point all along.

 https://www.youtube.com/watch?v=WOQbEBcQ0bo

Somehow, modern games end up being worse in some regards than old games. The details that mattered back then, and that could be done on fairly simple computers a fraction as powerful as a modern smartphone, aren't even in new games, despite those new games requiring, and I'm not kidding, in some cases dozens, if not 100x, the computing power. Seriously, this industry is so screwed.
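Just to put some rough numbers behind that "dozens, if not 100x" claim, here's a quick back-of-the-envelope sketch in Python. The TFLOPS figures are approximate FP32 throughput numbers from public spec sheets, quoted from memory, so treat them as ballpark assumptions rather than exact specs:

# Rough comparison of GPU compute throughput across eras.
# Figures are approximate FP32 TFLOPS from public spec sheets; ballpark only.
gpus = {
    "GeForce 8800 GT (2007)": 0.4,    # roughly 0.3-0.5 depending on how you count
    "GeForce GTX 1060 (2016)": 4.4,
    "GeForce RTX 3060 (2021)": 12.7,
    "GeForce RTX 4090 (2022)": 82.6,
}

baseline = gpus["GeForce 8800 GT (2007)"]
for name, tflops in gpus.items():
    print(f"{name}: ~{tflops} TFLOPS, roughly {tflops / baseline:.0f}x an 8800 GT")

Even with generous rounding, a mid-range card today is on the order of 30x a late-2000s GPU, and the high end is well past 100x, while what's actually on screen hasn't scaled anywhere near that.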

And to be blunt, I dread next gen. Because it looks like it's gonna be $800-1200 for a console that, while more powerful than most gaming PCs people actually own, will probably have games that look only marginally better than what we have now, take like 10 years to properly develop for, and require currently nonexistent specs just to run.

Is it worth it?

Honestly, I think graphics and gaming peaked at gen 8 and everything past that is a mistake.  

EDIT: Dear god, that last link sums up everything. Not only are modern games horribly optimized (because they don't try), but when it comes to the details, games from 10-20 years ago literally do things better. I knew modern gaming sucked, but I didn't really think it was that bad. It is. And games require more resources for THIS?! No wonder I'm not much impressed by modern games.

 
