Friday, October 27, 2023

PC gaming is reaching a crisis (crysis?) point

 So...I've been looking at new PC game releases today, and how they perform, and I really have to ask these developers, ARE YOU GUYS FREAKING INSANE?!

Seriously. I'm really starting to think that PC gaming is reaching a crisis point of affordability. It's absurd. 

Up until last year, I had a GTX 1060 as my main card. It's a pretty common card, one of the most common in the world, actually. It was the best bang for your buck in 2016-2017 and provided insane value. It cost me $270, it played games at high-to-ultra settings at the time, and it hung in there all the way through 2022.

However, I knew that in 2023, last-gen console support would finally get cut off and system requirements would skyrocket.

Honestly, around the same time, GPU prices started coming down. For a while, I couldn't even afford a decent upgrade from my 1060. Nvidia jacked up GPU prices with its 20-series cards, pushing the RTX 2060 to $350, almost as much as the previous generation's 70-class card. And that was before inflation, by the way. The 1660 Super was only like 35% faster than what I already had. Then in 2020 they released the 3060, which was only slightly faster than the 2060, for $330, and the 3060 Ti for $400. If you wanted a $250-300 card from Nvidia, you had to settle for a 3050, only 40-50% faster than my 1060.

Eventually, AMD caved and lowered prices. In retrospect, AMD was always cheaper, offering the RX 5600 XT with GTX 1080/RTX 2060-like performance for around $270, and eventually dropping its overpriced 6000-series cards down to decent prices. Prices came down fast, and it was finally time to upgrade, just in time for that system requirements wall to hit in 2023. So I bought a 6650 XT for $230, which was an AMAZING deal given that the 3060, its Nvidia analogue, still cost like $340, and went about my merry way. I even encouraged others on here to upgrade if they were holding out, saying this was the best time and things weren't gonna get better any time soon.

And a year later, that advice has largely held. While the 6650 XT for $230 has become commonplace, newer alternatives like the 7600 or 4060 aren't much faster (we're talking <10%) and cost $250-300. So the market is holding, and this is the new normal with GPUs.

And it seemed like a great deal. My 6650 XT chewed through all the games out there, and stuff I had been relegated to running on low with my 1060 I could suddenly run on high or ultra. Maybe in a few super demanding games I had to drop to medium to get a smooth frame rate, but it wasn't the end of the world. Quite frankly, if I were more willing to use FSR I could push settings further, but honestly I prefer lower settings with better image quality.

But now we're seeing the wall hit. And man it is NOT pretty. If this is the future of PC gaming, we're screwed.

The early-year games didn't seem bad, like Dead Space and Dead Island 2 and crap like that. But the last month or two, as we approach the end-of-year releases, things have gotten INSANE.

First we had Immortals of Aveum, which seemed like a tech demo, and from what other benchmarkers have shown, it can't even hold 1080p/60 FPS on my 6650 XT. I'd need to use upscaling on LOW just to reach that.

Starfield is just as bad. And everyone was complaining about how janky THAT was. 

And now we've got crap like Alan Wake 2, which REQUIRES a 2060/6600 and is unsupported on the RX 5000 series of cards from just 4 YEARS AGO. I mean, WHAT?! And that requirement was for 1080p/30 FPS with upscaling ON. WHAT?! Granted, it turns out it's not AS bad as predicted, but still almost as bad.

Ark: Survival Ascended seems about as bad. Benchmark numbers aren't out yet, but those who have tried it are reporting that even 4090s struggle with the game. And once again, people with what are considered "low end" cards (apparently a brand new $250 GPU is "low end" these days) are basically getting a 30-45 FPS experience...on low...with upscaling on.

I mean, I'm sorry, but...WHAT?!

I mean, gaming has evolved a lot from the old days. Yes, we used to have games back in the 90s and 2000s that really crushed hardware and required the best just to run things on low. But...it was a different time. Back then, graphics advanced quickly. From around 1995, when 3D started going mainstream, through 2005-ish, graphics got appreciably better by the year. But then things started plateauing after 2005, and due to the prevalence of consoles, requirements slowed down a bit. Things became more about resolution and frame rate. It used to be that 30 FPS was standard. Then people started pushing 60. Resolution mattered. We moved past running stuff at 800x600 or 720p to higher resolutions like 900p and 1080p. These days 1080p/60 FPS is considered almost like a minimum spec. A lot of higher-end cards have been targeting 1440p, 4K, or even high refresh rates. A lot of gamers don't just want 60, they want 120, 144, or even 240+ FPS these days.

And I know that a lot of the super-high-resolution stuff and high-refresh-rate stuff is premium. And ray tracing is premium. And honestly? Because I value just getting my foot in the door, I opt to stay at 1080p/60 FPS. Which is why I could keep my 1060 for so long. I focus on longevity for hardware. I buy every few years, going for a reasonable midrange card (although these days my preferred price range is budget, but let's be honest, that's more because the market is broken than because my expectations are too high). If anything, my expectations are low, because low expectations save me money. And my only real aspiration is 1080p/60 FPS. I don't even need to run games at high or ultra. If I can, that's nice, but that normally just means my GPU will be sufficient for some time to come. It gives me gas in the tank, if you will, for when requirements go up.

And again, seeing the writing on the wall for the 1060, I retired it with dignity and got something over twice as fast for roughly the same price.

But now these new games with these new requirements...I'm sorry, but...what?

Sub-60 FPS? At sub-1080p resolutions? What? Are we regressing? Is this 2007 again, where we were expected to play Crysis at 720p/30 FPS on an 8800 GT?

And it's not even one game like Crysis was. We've had three such games in the past two months. Which means I'm sensing a trend. And it isn't a good one.

Look, if this is the future, PC gaming is reaching an affordability crisis. I just upgraded my GPU a YEAR ago. And now suddenly I'm running some of these new releases as badly as, or worse than, I was running stuff on my 1060 when I retired it. What the hell, man?! What is up with these requirements?

And don't say consoles. Yes, the consoles set the baseline, but these consoles were sold to us on the basis of high frame rates and resolutions being a thing. Who the hell wants to play games on a console at 720p/30 FPS? In 2023? Why would you even DESIGN something like this?

And given how the console specs translate to PC specs (hint: my 6650 XT is a "console-like experience"), this is completely DESTROYING the PC market.

Because we can't even upgrade. What can we upgrade to? This isn't like the 2000s, when Doom 3 did this and Crysis did this, and we could expect GPU price/performance to double literally every 3 years or less. It took 6-7 years just to go from a 1060/580 to a 6650 XT/3060/7600/4060 at the same price point. And GPUs are more expensive than ever. It used to be that $250 was midrange and a $500 card was high end. Sure, more expensive options existed, but those were dual-GPU cards and SLI setups. Now we've got cards as expensive as $1600. We treat $500-600 like the new "midrange". And we treat $200-300 GPU buyers like garbage.
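
Just to put rough numbers on that stagnation, here's a quick back-of-the-envelope sketch in Python. It assumes the ballpark figures from this post (performance per dollar roughly doubling every ~3 years back then, versus roughly 2x over ~6-7 years at the same $250-300 price point now); these are loose estimates, not measured benchmarks.

```python
# Back-of-the-envelope math on GPU price/performance stagnation.
# All numbers are the rough figures quoted in this post, not benchmark data.

def yearly_gain(perf_multiplier: float, years: float) -> float:
    """Implied compound yearly improvement needed to hit perf_multiplier in `years`."""
    return perf_multiplier ** (1 / years) - 1

# 2000s era: price/performance roughly doubling every ~3 years
old_era = yearly_gain(2.0, 3)

# Now: roughly 2x at the same $250-300 price point (1060 -> 6650 XT/4060),
# but it took about 6-7 years to get there
new_era = yearly_gain(2.0, 6.5)

print(f"Then: ~{old_era:.0%} better per year")  # ~26% per year
print(f"Now:  ~{new_era:.0%} better per year")  # ~11% per year
```

Even with those generous assumptions, the yearly improvement at the budget price point has roughly halved, which is exactly why "just upgrade" isn't the answer it used to be.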

And I can tell you, looking at the Steam hardware survey, I have to wonder what drugs these people are on. I mean, until recently, the 1060 and the 1650 were the top cards on Steam. Those are slowly being phased out in favor of the likes of the 3060. So I'm really right there with your typical PC gamer. I used a 1060, upgraded to AMD's 3060 equivalent, and that's where most consumers are.

So tell me, why are these cards aging like a banana I bought when it already had spots all over it?

Normally, a "60"-class card lasts...4-6 years. That's typical. Think about it. A 2007-era 8800 GT lasted to around 2013-ish for gaming. A 2010-era 460 lasted to around 2014-2015. A 2012-era 660 lasted to around 2017-2018. A 1060 lasted to around 2022, when I replaced it. I mean, I know the value of these cards, how long they're supposed to last, and the frame rates they're supposed to provide over their lifespan. Not saying you CAN'T get screwed. For example, a 2008-2009 era 260 only lasted to 2012-2013 because of no DX11 support. A 560 from 2011 lasted only to 2014-2015. A 960 from 2015 lasted to around 2017-2018 (although the 970, which was only $330 at the time, also lasted to around 2022, so 8 years). So...you CAN get screwed. But again, I should have timed my upgrade well: the 6650 XT released as a $400 GPU and dropped in price like a brick after COVID, and its replacement is literally single-digit percent better for roughly the same money (normally the bad buys come at the end of a long stagnation, right before the next generation nearly doubles performance overnight), so this didn't seem like a bad time to upgrade. After all, not only was it needed, but this WAS the big performance jump. Until recently we were talking 1660 Supers for sub-$300, and now suddenly, BOOM, 6600s, 6650 XTs, 7600s, 3060s, and 4060s...with no replacements planned until 2025. I should be set. That's normally how the market works.

But with this system requirements wall, holy crap. I doubled my performance overnight, and then game devs just doubled requirements overnight. And now the 6650 XT is running new games like my 1060 did at the end of its life cycle. And these cards cost up to $300. And...that's STILL the best price/performance tier. I mean, the next level up is a 6700 XT for $330-350, and that's only 25% better. And then a 6800 is $400. And if you notice, I'm not even listing Nvidia cards, because the 3060 Ti and 4060 Ti are overpriced and still only have 8 GB of VRAM, which is a limitation on a card in that price range (my 1060 had 6 GB, the 480 had 8 GB in 2016, and the 6650 XT has 8, and while that seems fine for such a cheap card, if I'm spending 50-100% more money, I want 50-100% more VRAM).
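
Here's the same kind of napkin math for that upgrade ladder. The prices are the street prices I quoted above; the relative performance numbers are my own rough estimates (the 6800 figure especially is a guess, not a benchmark), so treat this as a sketch, not a buying guide.

```python
# Rough "dollars per unit of performance" for the cards mentioned above.
# Performance is expressed relative to an RX 6650 XT = 1.0.

cards = [
    # (name, street price in USD, rough relative performance)
    ("RX 6650 XT", 230, 1.00),
    ("RX 6700 XT", 340, 1.25),  # "only 25% better", per this post
    ("RX 6800",    400, 1.45),  # my rough guess, not a quoted figure
]

for name, price, perf in cards:
    print(f"{name:10s}  ${price:>3}  ->  ~${price / perf:.0f} per 6650 XT's worth of performance")
```

Run that and the cost per unit of performance actually gets worse as you climb ($230 at the bottom versus roughly $270+ above it), which is the whole problem: there's no step up that doesn't punish you for taking it.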

And yeah. It's insane. There's nothing to upgrade to at a decent price. These new games are really just going back to sub-1080p, sub-60 FPS, sometimes as low as 540p/30 FPS, and oh well. If you actually want 1080p/60 on low without upscaling, I guess you need at least a 7800 XT for $500 or a 4070 for $600.

Hell, people with $1600 4090s are complaining that Ark is crushing their cards. They're getting the kind of performance that, in a sane game, you'd get with my card: like 1080p, 30-45 FPS on ultra, maybe 60 on medium. That's...fair-ish.

But you shouldn't need a $600 card just to reach 60 on low settings at 1080p. That's RIDICULOUS.

Again, this isn't an old card. The 6650 XT is still being sold. And its replacement, the 7600, is barely any faster. Nor are the Nvidia equivalents, the 3060 or 4060. So this isn't a bad card or bad hardware. Why are these games running like Crysis did in 2007 on an equivalent card from its era? Again, who would design a game like this in 2023?

Honestly, between this system requirements inflation and the inflation of actual GPU prices...PC gaming is becoming unaffordable. This is INSANE. GPU price/performance is stagnating. Prices are going up. Requirements are going WAY up. Honestly, my 6650 XT should be a serviceable card until around 2026, maybe. I can understand my 1060 getting too old for new games, being relegated to 30 FPS on low with FSR turned all the way down. That's expected. It's ancient. But...honestly, given where hardware is, and where requirements are, this ain't good. No one actually wants to go back to an era where sub-60 FPS and sub-1080p are acceptable. You realize people play at up to 4K, right? And upscaling kinda sucks, right? Just because it exists doesn't mean it should come at the cost of optimizing your games properly. I don't care how good your game looks if it's blurry and stuttery when I play it, ya know? And I know, as a "60"-card buyer, I'm not gonna get ultra in every game. I resigned myself to that fact. I'm used to tinkering with settings. But I don't think that "hey, you should be able to run games at 1080p/60 FPS native on low" is a whole lot to ask here. Maybe have A game or two like Cyberpunk that pushes things, but most games should run fine.

This is getting out of hand. PC gaming is becoming flat-out unaffordable for the masses at this rate. It's not 2005 anymore. Running games at 800x600/30 FPS should be a thing of the past, down there with N64 games running at 18 FPS at 240p with heavy fog on.

I can understand something like a 1070 or 1660 Super being a requirement these days, but when I see stuff like a 2060, 1080, or 5700 XT, and the target frame rate is 30 with DLSS/FSR on...that's just...yikes. Optimize your fricking games. No one wants to game like this in 2023 unless they're on an 8-year-old system.
