Friday, October 27, 2023

Why I don't buy the inflation argument and why I partially blame Nvidia for the system requirements crisis

So, I see a lot of apologia online for why the market is as bad as it is, and honestly, I see most of it as nonsense. The fact is, a lot of PC gamers ultimately have very conservative economic views, being de facto market fundamentalists who believe humans need to contort themselves to this all-powerful market rather than markets being contorted to human needs.

A lot of PC gamers these days (at least the ones who are left) are also quite hung up on STATUS as opposed to accessibility. There seems to be a correlation between those who have the more expensive hardware and those who are complete buttheads when discussing this subject, with 4090 owners just dunking on everyone else for not being as rich as they are. It's like they don't care whether the masses can enjoy PC gaming as long as it works for them. Then again, that's generally the Republican attitude on the economy in a nutshell and why I don't care what most of them think. They literally don't want the masses to have what they do, because it makes them feel less special for being able to afford it.

Anyway, a common argument these guys make is that the market has changed, cheap GPUs are a thing of the past, $300 is now low end, and if you want to game on PC seriously you need to spend $500-600+ to get a good experience, with some people buying GPUs as expensive as $1600.

A lot of these people will also argue that "inflation" means things aren't as cheap as they were in the past, that things aren't like they were in 2010 any more, that a $200 card back then is a $300 card now, and blah blah blah.

Okay, so, let's really process this. A dollar in 2010 is about $1.41 in 2023, with most of that inflation happening in the past two years. So yes, we could say that things are more expensive than they were, if we buy the argument that things should be more expensive.

Still, first of all, the problems with the GPU market are a lot more severe than the rest of the economy per dollar. The entire sub-$200 market is effectively gone these days. You used to be able to get relatively decent GPUs for as low as $100 as recently as 2017; the 1050 provided that experience. It didn't age well due to its lack of VRAM, but if you were to spend $100 today, you would literally get nothing better than a 1050, if you can find one at all. Hell, they want $80-90 for a 1030. A 1060-class card like a 6500 XT or 1650 is still $150, and that ain't even enough to run new games any more.

When we talk about the kinds of experiences you get from, say, Alan Wake 2 or the new Ark game, we're talking 540p/30 FPS on something like a 2060. That's a card that still costs over $200. And you know what? You could do that with an $80 card in 2008. I would know; I did it. I had a Radeon 3650 that could run recent games, even Crysis, at 800x600 at 30 FPS. Hell, I used the integrated graphics in a $400 2011-era laptop with similar GPU horsepower to run stuff like Crysis 2 and BF3 as late as 2011. And I got an actually playable experience.

So let's cut the BS here. The actual low-end GPU market is basically screwed. The $200-300 range is the first rung where you don't get an awful experience, and with games getting more demanding, even that is getting borderline. This was never the case back in the day.

Like...okay, a 460 cost $230 in 2010. That's $324 today. But at the same time, the next rung down, a 5770 on the AMD side, was still almost as capable, cost around $160 then, and would be $226 today. And that would keep you set for about as long as a bigger, beefier card like a 460 or 5850.
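
Just to show the back-of-envelope math here (a rough sketch; the ~1.41 multiplier is the approximate 2010-to-2023 CPI ratio mentioned earlier, and the $160 figure for the 5770 is back-derived from the $226 number):

```python
# Rough inflation adjustment for the 2010-era card prices discussed above.
# The 1.41 multiplier is the approximate 2010 -> 2023 CPI ratio, not an
# official figure; prices are the ballpark numbers from this post.

CPI_2010_TO_2023 = 1.41

cards_2010 = {
    "GTX 460": 230,   # ~2010 price in USD
    "HD 5770": 160,   # ~2010 price in USD
}

for name, price in cards_2010.items():
    adjusted = price * CPI_2010_TO_2023
    print(f"{name}: ${price} in 2010 is about ${adjusted:.0f} in 2023 dollars")

# GTX 460: $230 in 2010 is about $324 in 2023 dollars
# HD 5770: $160 in 2010 is about $226 in 2023 dollars
```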

And keep in mind, the 3060 is currently around $280 and the 4060 is $300. And those are being treated as trash: bare-bones, minimal experiences.

You now need something like $600 to get the experience you used to get with $200. It's getting out of hand. 

And honestly? Beyond that concession, I don't even buy the inflation argument, for a lot of reasons.

First of all, I believe that outside of raw materials, technology should be somewhat inflation-proof. A TV used to cost around $500 back in 1950. Does that mean it should cost $5000 today? NO! If anything, you can buy a $100 720p 32" TV from Walmart and get a much better experience than anything back then.

The first cell phone cost $4000 back in the 80s. Does that mean an iPhone should cost $10k today? No. iPhones are STILL overpriced at $600-1200 or whatever, but at the same time there are decent phones down to $200 and even some cheaper than that. You can buy a $40 Tracfone with smartphone capabilities these days. Hell, I once won $20 at my college orientation and used it to buy a Tracfone.

So...tech is supposed to get cheaper as time goes on. I don't buy that these various tiers of tech should get more expensive, because a tier is basically a price range you buy into, and that shouldn't inflate much over time when the underlying tech gets cheaper and cheaper every year.

Speaking of which, time to turn the blame to where it belongs. I blame Nvidia for the state of the current market. We have a duopoly in the GPU market: you have Nvidia and you have AMD. Sure, Intel is breaking into the market, but their tech is still immature and, IMO, not worth buying yet.

Nvidia has about 85% market share. They typically have superior products to AMD, but AMD's products are of acceptable quality IMO. Still, most people go Nvidia.

Nvidia has always had a habit of gouging the audience for all they've got once they get the upper hand. Back in the late 2000s, they tried this. They got the upper hand with the 8800 GT and it took AMD several generations to come back from that. By 2008-2009 they were releasing the GTX 200 series, basically unchallenged by AMD, so they started charging $400 for the GTX 260 and up to $750 for their flagship.

Of course, this went over poorly, and AMD flooded the market with cheap products. This drove prices down, and by 2010 both sides had good products that competed well against each other. Nvidia was always a little ahead, though, and gained more and more market share over time as their products were more stable, less buggy, and had more features.

And eventually, AMD kinda...imploded, like they did against Intel in the CPU market. They could only sell their flagship 480 and 580 for $200-300 against the GTX 1060, while Nvidia had the upper hand in the higher-end market. Now, of course, the mid-range market was where the action always was; what dominated the Steam hardware surveys were mainstays like the GTX 460, 760, and 1060, with AMD's best-selling models competing against those. The high-end market was always enthusiast territory.

For a while, the top-end single-GPU card dropped to around $500-600, but when AMD started having trouble competing again, Nvidia began experimenting with higher-end models that cost upward of $700 or even $1000 to see how they sold. A small minority of PC gamers always went for them, but most stuck with the mainstream products.

But after the last golden era in 2016-2017, it seemed like progress just...stopped. The 10 series lasted an abnormally long time, and AMD's attempts to compete with the higher-end lineup ended up being failures that most of the public weren't interested in. While budget gamers were always okay with AMD to some extent, high-end gamers wanted nothing to do with those expensive cards when Nvidia had more established and stable products with more features at the same price.

And Nvidia took over two years to even replace the legendary 10 series. When they did, they gave the public a middle finger, in my opinion. It was the perfect opportunity for Nvidia to get greedy and milk the market again. Instead of offering better raster performance, Nvidia started offering more exclusive technologies. After all, AMD could compete with them on performance; what drove people to Nvidia was features and stability. And Nvidia came out with what they saw as the holy grail of PC gaming: ray tracing. Here's the thing: most games use a technique called rasterization to render scenes. It basically approximates lighting and the like rather than simulating actual light transport, because the physics is too expensive to do in real time. But Nvidia built hardware to do it.
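
To make that distinction concrete, here's a toy sketch (nothing like a real renderer; the one-sphere scene is entirely made up for illustration): the core of ray tracing is firing a ray per pixel, solving for what it actually hits, and shading from real geometry instead of a cheap approximation.

```python
import math

# Toy sketch of the ray tracing idea: fire a ray per pixel, solve for what it
# hits, and shade from the actual geometry. One sphere and one light,
# invented purely for illustration -- nothing like a production renderer.

CENTER = (0.0, 0.0, -3.0)          # sphere center
RADIUS = 1.0
LIGHT = (0.577, 0.577, 0.577)      # unit vector pointing toward the light

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_sphere(origin, direction):
    """Distance along the ray to the nearest sphere hit, or None on a miss."""
    oc = tuple(origin[i] - CENTER[i] for i in range(3))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - RADIUS * RADIUS
    disc = b * b - 4.0 * c         # direction is assumed normalized (a == 1)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction):
    """Lambertian brightness at the hit point, from real surface geometry."""
    t = ray_sphere(origin, direction)
    if t is None:
        return 0.0                 # ray missed: background
    hit = tuple(origin[i] + t * direction[i] for i in range(3))
    normal = tuple((hit[i] - CENTER[i]) / RADIUS for i in range(3))
    return max(0.0, dot(normal, LIGHT))

# One ray straight down the middle of the screen:
print(shade((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))  # ~0.577
```

Doing that millions of times per frame, with multiple bounces, is exactly why the performance cost is so brutal.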

Now, this tech kinda sucked. As a PC gamer I never had much interest in it, mainly because the performance costs were insane: we'd be regressing from 1080p/60 FPS down to 30 FPS at lower resolutions just to use it. Which was...enough to play games, but after experiencing higher frame rates and resolutions, who would wanna go back to that? So Nvidia developed AI upscaling, known as DLSS, to help. It was primarily aimed at letting premium gamers upscale 1080p images to 1440p or 4K, with AI filling in the detail, instead of paying the full cost of rendering at native resolution. So basically, Nvidia created super-exclusive tech that crushed even the GPUs built to run it, then worked around that with AI upscaling of lower-resolution images. Neither tech was perfect, but they kinda worked, and high-end gamers bought in.

But there was a catch: this tech was tied to dedicated hardware on Nvidia GPUs, and because that hardware cost money to put on the card, they raised prices. They pushed the RTX 2060 (cards were now branded RTX instead of GTX, to emphasize ray tracing) to a $350 price point rivaling the GTX 1070 before it, and they did that all the way up the stack. Below the 2060 they released cheaper cards without those features that were much weaker, with the GTX 1660 Super replacing the 1060 at its price point while being a measly 35% faster. And that's where the market stayed until 2022.

So basically, all of this happened in 2018, before COVID or inflation. It happened because Nvidia decided to shove expensive new tech down our throats, drive up prices, and stagnate the market, all while moving away from pure rasterization toward ray tracing and AI upscaling.

Now, I never bought Nvidia's marketing pitch. I had a 1060 and planned on using it for only a few years, figuring it would be BTFOed by the next generation, but given the mediocre price/performance uplift, nothing Nvidia had was worth it. Up to that point, GPU performance per dollar had doubled roughly every three years, meaning by 2019 there should have been a $250 card running twice as fast as the 1060. Instead, there was only a 35% faster one.
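
Here's that expectation worked out (a sketch, using the rough "doubling every three years" trend and the 35% figure from above):

```python
# If performance per dollar doubles every ~3 years (the pre-2018 trend),
# this is the uplift a same-priced card "should" deliver after a given gap.
# The 35% "actual" figure is the rough 1660 Super-over-1060 number from above.

def expected_uplift(years, doubling_period=3.0):
    """Compound growth: performance multiplies by 2 every doubling_period."""
    return 2 ** (years / doubling_period)

gap = 2019 - 2016  # GTX 1060 -> its ~$250 successor
print(f"Expected: {expected_uplift(gap):.2f}x the 1060")  # 2.00x
print("Actual:   ~1.35x (GTX 1660 Super)")
```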

And of course, AMD was caught with their pants down. They had no answer. They released the RX 5000 series, but once again couldn't compete with Nvidia at the high end. They offered better raster performance for the money but lacked all of Nvidia's fancy new features, so most people bought Nvidia. It didn't help that the RX 5000 series was also SUPER unstable and had horridly buggy drivers. Sure, I blame Nvidia for greed, but AMD just didn't get their crap together.

And that was the state of the market when COVID hit. Between higher demand for PC parts and the crypto market going through another bubble, the GPU market just...broke. But again, the prices Nvidia pushed before inflation, this new normal of $350 "60" cards and a stagnant market below that point, was a deliberate choice by Nvidia. They explicitly decided to abandon the low-end market to force their RTX and DLSS crap, and that wasn't good. They pushed marketing that "rasterization is old" and "this is the future," and people ate it up.

Now, I don't inherently have anything against either tech. Upscaling is a cool feature that can extend the life of old cards, as we later saw when AMD released an open-source equivalent alongside their 6000 series cards (which could be used on older AMD and Nvidia cards as well). And ray tracing, eh...it's fine as long as it's a cherry on top and not a requirement. I don't wanna go back to barely running crap at 30 FPS.

But...here's the thing. Nvidia wasn't pushing these techs to make PC gaming better for everyone. They locked the tech to their own GPUs and basically forced people into their own ecosystem, with each subsequent series pushing the previous ones into obsolescence. And the whole time, performance for the money was stagnating. What WASN'T stagnating was the cost of their cards, which went up every generation. The top-end 1080 Ti was $700. The 2080 Ti was $1000 (when they bumped every card up an entire price tier). The 3080 Ti was $1200, the 3090 Ti went up to $2000, and now the 4090 is $1600. You can see where this is going.
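
Just to put numbers on that creep (using the launch prices listed above; launch years are approximate, and the $2000 3090 Ti halo card is left out of the main line):

```python
# Generation-over-generation jumps in Nvidia's top-end launch prices,
# using the figures cited above.
flagships = [
    ("GTX 1080 Ti (2017)", 700),
    ("RTX 2080 Ti (2018)", 1000),
    ("RTX 3080 Ti (2021)", 1200),
    ("RTX 4090 (2022)", 1600),
]

for (prev_name, prev_price), (name, price) in zip(flagships, flagships[1:]):
    jump = (price - prev_price) / prev_price
    print(f"{prev_name} -> {name}: +{jump:.0%}")

# +43%, then +20%, then +33% -- far outpacing general inflation each step.
```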

And of course, while the 2000 and 3000 series had this new tech, 1000 series cards didn't, and by the time the 4000 series arrived more recently, Nvidia was experimenting with yet another technology, frame generation, which creates "fake frames" at the expense of latency to push frame rates up.

Basically, price/performance has stagnated ever since, with these Nvidia-exclusive technologies, DLSS upscaling and frame generation, making up the difference.

Anyway, COVID drove prices through the roof, as did the crypto boom during COVID, and GPU prices went up an INSANE amount. And Nvidia made out like a bandit.

And AMD had their 6000 series, with their own alternative to DLSS (FSR) and their own ray tracing, but their solutions weren't as good, so people stuck with Nvidia. And they tried pricing their cards like Nvidia's, which worked during the mining boom, but when demand dropped off in 2022, things...collapsed.

And prices dropped. 

But...AMD cards dropped more than Nvidia ones. Nvidia's stayed near their overpriced MSRPs, while AMD cards dropped to as much as 33% cheaper for the same level of performance. And that's when I bought one. People say AMD's features are inferior, but let's face it, I'm NEVER using ray tracing, and the differences between FSR and DLSS are so minor in practice that I fail to see any reason to pay for the more expensive Nvidia card. The fact is, while DLSS is better, both technologies are primarily used to upscale 1080p or higher images to 4K, and both are kinda bad at, say, 1080p output.

And for the most part, FSR has been a godsend, since it could be used on my old 1060. While it made games blurrier, it also helped extend the life of that card through this period of extreme stagnation.

But now it seems like we're entering an awkward period where FSR and DLSS are becoming a requirement just to run games, with games needing DLSS or FSR just to hit 30 FPS at 1080p. That means upscaling from something like 720p, 540p, or even 320p to output a 1080p image.
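
The pixel math shows how big a regression that is (and, for later, why 4K needs roughly 4x the brute force of 1080p):

```python
# Pixels actually rendered at common internal resolutions, relative to
# native 1080p. Also shows why native 4K needs ~4x the throughput of 1080p.
resolutions = {
    "native 1080p": (1920, 1080),
    "720p internal": (1280, 720),
    "540p internal": (960, 540),
    "native 4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.0%} of native 1080p)")

# native 1080p:  2,073,600 px (100%)
# 720p internal:   921,600 px (44%)
# 540p internal:   518,400 px (25%)
# native 4K:     8,294,400 px (400%)
```

So a "1080p" game upscaled from 540p is really only rendering a quarter of the pixels.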

And that...isn't good.

Last year, I was playing major releases at 1080p/60 with FSR on, on a 1060. Now I'm struggling to hit 60 at all-low settings with FSR on, on a 6650 XT that's over twice as powerful? WHAT?!

Again, when push comes to shove, I blame Nvidia for this conundrum. They decided to stagnate the market, making it cost more and more money to upgrade to a decent level of hardware, and now games are leaning on upscaling and frame generation to push frame rates up to what they should be. Games are rendered at low resolutions, upscaled to higher ones, and frame generation inserts fake, latency-adding frames to make the experience acceptable.

These forms of tech were originally designed so rich gamers could game more easily at 4K, because 4K literally requires 4x the brute force of 1080p (see the pixel math above). But with these features becoming MANDATORY just to play games at good frame rates, who does that help? Not gamers. It helps Nvidia.

Because you're gonna need to constantly upgrade to new, overpriced Nvidia cards to get the newest versions of their exclusive tech. DLSS doesn't work before the 2000 series. Frame gen doesn't work before the 4000 series. And while AMD has its own implementations of these ideas, AMD, as always, is struggling: they're a smaller company without the R&D funds of the top dog to build better tech. And because the better tech requires exclusive hardware, which is what locks people into the Nvidia model, anyone not wishing to buy into Nvidia's exclusive stuff is stuck with more flawed, mediocre adaptations of it that run on a wider array of hardware but are simply worse.

This sucks. I honestly wish gaming had just stuck to providing better rasterization performance. If it had, 4070-level cards would likely be available for $300 by this point, based on the trends prior to 2016. Instead we got maybe half that much growth in price/performance, and honestly, it took a massive crypto boom going bust and an AMD fire sale of GPUs they couldn't sell at a higher price point just to get that much performance for the money.

Meanwhile, the tech keeps getting more expensive, performance stagnates, and new games will increasingly be reliant on these new technologies. 

This is why PC gaming is in a bad spot. It's effectively a market failure, driven by an uncompetitive duopoly turning into a de facto monopoly, with the dominant company pushing more and more exclusive tech to demand more and more money from gamers. And what are you gonna do? Buy AMD? As a budget-conscious consumer who understands what's going on...I did, and I don't regret it. I bought when Nvidia had a 50% price premium on their products versus AMD's equivalents, which is INSANE no matter how you look at it. Hell, at my desired price point, Nvidia had basically abandoned the market entirely, since anything below their $280 (at the time) 3050 was just old stock that didn't even support Nvidia's new tech. And that $280 card performed worse than AMD's $200 card.

The market's broken. It's a market failure, and I primarily blame Nvidia for it. It's not "inflation" as many are saying: inflation rarely applies to technology like this, Nvidia started jacking up prices back in 2018, and when AMD actually competes, prices get driven back down to where they should be. But now, because Nvidia invested so much in fancy tech, game devs are using it as a substitute for properly optimizing their games, and that's making PC gaming a miserable experience.

And that's how we got to the point where people are using $200-300 GPUs to upscale 540p to 1080p and running games at 30-40 FPS. It's not that the market is some force of nature that can't be trifled with. It's that Nvidia is a force of nature that can't be trifled with, and they need to be regulated and potentially broken up to make the market competitive again. Markets are human-made constructs and are regularly engineered to achieve certain outcomes. The GPU market is just broken and, in my opinion, requires some engineering to make it work for the people again.

I don't know what the solutions are. But I think we need more competition, and we need all of this exclusive tech nonsense to stop.
