Thursday, April 16, 2026

The best and worst years to upgrade your PC

Okay, so we've got the tier lists for CPUs and GPUs, let's do it. What years were the optimal years to upgrade?

2006

Eh, I mean, you could get a really good CPU/GPU combo here, but I think you'd be better off waiting for stuff to get cheaper in 2007. The tech was there, but it was more expensive in 2006. 2007 was a far better value. So, B tier I'd say. On a budget you'd probably get a lower clocked Core 2 with like an 8600 GT, which is...okay, but given what was available in 2007...nah.

2007

If you bought a Q6600 (or even an E8400) and paired it with an 8800 GT, you'd be solid through 2013. A 6 year PC was legendary given how fast things moved back then. S tier.

2008

I mean, basically whatever you could buy in 2008 you could buy in 2007. You had higher end Core 2 Quads that were expensive, and you had the GTX 200 series, but honestly, the value isn't really here as much. Still, A tier, because for budget buyers it was pretty solid.

2009

2009 was a weird time. If you bought around now, you'd be buying aging parts or CPUs/GPUs that wouldn't last super long. You'd kinda get the short end of the stick. Phenom IIs, Nehalem CPUs, aging DX10 GPUs...wasn't great. C tier.

2010

CPUs were even more dated; GPUs were better though, with the HD 5000 series and GTX 400 series on the market. Given the GPUs were better and the CPUs would last you until 2014-2015, I'd say B tier. Wasn't great, but not terrible either.

2011

On the CPU side of things, you'd do well with a 2500k or 2600k; GPUs were mid though. Still, you'd be able to stick a much more powerful GPU in an aging build and use it a few more years. So not terrible. B tier.

2012

Between Sandy/Ivy Bridge for CPUs and the GTX 600/HD 7000 series for GPUs, this is probably the golden year. S tier. If you got something good here it would last until 2018ish. If you got something top tier, maybe longer.

2013

Not a bad year. Definitely post peak, but you wouldn't get burned here. A tier.

2014

If you bought a high end computer, like a 4790k with a 970 or 980...great year. If you're a midrange buyer grabbing a 4670k with a 960 or R9 280 or something, not a great year. B tier.

2015

The value proposition is no better than 2014, except you lost another year. If you bought midrange, you'd ESPECIALLY get burned here. A quad core i5 6600k with like a 960 2 GB? OOF. That's like...F tier. But because high end buyers didn't make out too bad, I'll give it a D.

2016

CPU wise, unless you went 6700k, you're getting burned. We're at the end of the Intel stagnation era, and CPU wise you'd get something that ages badly. The GPUs were legendary though. Too bad you didn't wait another 1.5 years. I'd say C tier overall.

2017

The beginning of the year is just more 2016. Kaby Lake changed nothing and Zen was middling. You didn't make out badly if you bought an i7, but otherwise, you got burned. End of the year? Good time. Grab an 8600k or 8700k, pair it with a 1060/580 or a higher end card, and you're golden well into around 2022-2023. S tier year if you timed it properly.

2018

CPU wise, things are no better than late 2017. The market is what the market is. The GPU market starts going to crap as you face eyewatering prices for the Nvidia 2000 series. If you bought a 2060 or higher, you were golden. I'd say an 8700k/2060 or 2070 build is a solid higher end build. An 8600k+1660 Ti or something is kinda meh though. And then you had crypto driving GPU prices nuts for a while. B tier.

2019

If you bought Zen 2, it was a bit better than 2018, but otherwise, same options more or less. Still, given the crapshow about to commence, it was arguably a good year just because the next couple were so bad. Still, you missed the 2017 boat. C tier.

2020

God help you if your computer broke down. CPUs weren't that bad: Comet Lake and Zen 2 were both competitive, and Zen 3 was expensive but solid. But GPUs became unaffordable. I graded the 3000 series on MSRP, but yeah, this is the era where GTX 1050 Tis started costing $400. F tier.

2021

CPU wise interesting things were happening, but once again, GPUs. F tier. Everything was a poor value around this time.

 2022

CPUs weren't quite ripe for the picking yet, but getting there. Zen 3 was going down in price, and Alder Lake was interesting but pretty expensive. GPUs were expensive for most of the year, but prices started crashing around the holidays, especially on the RX 6000 series. Eh...D tier overall, unless you bought around Christmas.

2023

GO GO GO GO! GPU prices drop; you can finally afford GPUs again. CPU prices drop like a brick. You've got a nice beginning of a stagnation era going on. This is the best it's gonna get for a while. This is the optimal time, I think, for a new computer. You buy here, you get cheap stuff, and you can sit on it through RAMmageddon like I currently am. I don't think the peaks here are as good as in previous eras, given the GPU market has fundamentally changed, but this is the best it's gonna get. A tier.

2024

The deals are fundamentally unchanged. It's not bad, but 2023 was peak season. B tier.

2025

CPU wise, nothing interesting has happened in the past few years; we've been in a holding pattern since 2023. GPU wise, the 5000 series and 9000 series offered a decent uplift, but you're still stuck with 8 GB under $300, although a 9060 XT 16 GB or 5060 Ti 16 GB is tempting. Hopefully you locked in before RAMmageddon; things started getting rough in Q4. C tier.

2026

RIP. We're in the midst of RAMmageddon. I guess it ain't AS bad as COVID, but if you need to buy new RAM it really is. Still, I'm gonna say SLIGHTLY better than COVID, but if you gotta pay $400 for RAM and $200 for an SSD, not really. F tier.

So all in all where do things stand?

S tier

2007, 2012, 2017

A tier

2008, 2013, 2023

B tier

2006, 2010, 2011, 2014, 2018, 2024

C tier

2009, 2016, 2019, 2025

D tier

2015, 2022

F tier

2020, 2021, 2026

So, some of these are subjective and can depend on price range. I emphasized midrange buyers here: like, an i5 CPU, a 60 class GPU. Yeah. I did factor higher end buyers into the equation somewhat, and some years definitely varied there. Like 2015 could go either way. Either you got burned bad, or you got a GOAT computer that made it through COVID.

The best times to upgrade were 2007, 2012, 2017 (but only late 2017), and 2023ish. 2023 is A tier because the GPU market never returned to its pre-2018 normal, but still, it's an honorary S tier here simply because hey, you gotta upgrade some time.

On the flip side, the objectively worst times were 2020, 2021, and 2026. COVID and the new RAMmageddon F-ed up the market in ways that just made it beyond terrible. Even ignoring the longevity of components, you just got burned on price. 

Sometimes years were good on CPUs but not GPUs, like 2011. Some were good on GPUs but not CPUs, like 2016. It REALLY depends on what you bought. But yeah. There seems to be a pattern where every 5-6 years, there's a golden year or two in there. If you time it properly, you get the best deals. My luck is mixed.

I got my GPU in 2022 and CPU in 2023, and hit that optimal A, borderline S tier zone.

Previously, I bought a CPU in early 2017 but got the GPU in late 2017. Kinda like a 2016 year, but I compensated with an i7 to soften the blow, knowing the CPU market was kinda F-ed at the moment.

2010 before that. It was okay. The CPU didn't age great; the GPU aged okay though. I got a 580 which eventually got upgraded to a 760 courtesy of EVGA in 2012, so the GPU hit that sweet spot there, but the CPU aged like milk.

Before that, I got a PC in 2005, which I didn't account for above, but that would've been a pretty C tier kind of year. And given I bought at the beginning, so I had 2003/2004 tech, it was a D tier decision, but one driven by necessity.

And yeah. Right now, I'm in the goldilocks zone waiting for the next big jump. Probably won't happen until 2028-2030 when RAMmageddon passes, if we aren't just witnessing the death of affordable computing thanks to AI. Seriously, you can tell just based on what I wrote that AI is screwing things up as bad as COVID and crypto did. I think it'll eventually pass, but we can clearly see that with GPUs we're frogs in a boiling pot here. And yeah...

GPU Tier list

So, I decided to do the same thing I did with CPUs, but with GPUs. Here, I'll focus not just on the high end, but also on midrange and even budget components. I feel like I gotta do that because with CPUs there remains a healthy budget market even today, whereas with GPUs...that's all but dried up. So yeah. I'm basically gonna take the entire product stack into consideration, with emphasis on the $200-300 price range.

Nvidia 8000 series (2006)

Arguably the GOAT. Definitely the GOAT of the 2000s. When I got into PC gaming, everyone was talking about these, because much like Core 2 Duos and Quads on the CPU side, on GPUs, these crushed the 360 and PS3. Again, those consoles were designed as high end PCs at the time, and here Nvidia and Intel just obsoleted them in like a year. They were also quite affordable. The venerated 8800 GT was $250, and that was an expensive card at the time. Normally the high end was $500ish, and here, the 8800 GTX was $600. And the $250 model almost kept up with it. It was insane. Anyway, much cheaper options existed too, like the 8500 GT, the 8600 GT, etc., although none was as solid a value as the 8800 GT, which BTFOed the consoles 1-2 years after their launch. S tier.

Radeon HD 3000 series (2007)

AMD had just acquired ATI not long before this, and yeah...ATI was always the "AMD" of GPUs. Like, the second brand, the budget brand. The cheap brand. I mean, it was okay, but the driver support was wonky and not very long lasting. My first GPU was actually an HD 3650, the AMD equivalent of an 8600 GT or 9500 GT. And yeah, the driver experience was awful. But it breathed new life into my old HP desktop regardless. I can't hate on it too bad. B tier. Drivers only lasted about 5 years though.

Nvidia 9000 series (2008)

The refresh didn't bring much improvement on the high end, just a bunch of rebrands; the real magic was with the more midrange cards like the 9500 GT and 9600 GT, which were very cheap and very potent. Again, golden age of PC gaming right here. Not quite GOATed, but A tier.

Nvidia GTX 200 series (2008)

Basically, the third iteration of the "Tesla" (8000/9000 series) architecture. It was becoming quite dated by this point and lacked DX11 support, which would give it a reduced lifespan. It was very powerful, with the GTX 280 being like 2x the 8800 GT, but it was also VERY expensive. Given AMD initially struggled to compete with it, Nvidia did what Nvidia does and just created tons of new price tiers to offer premium performance without really bringing it to the masses. The GTX 260 was like $400. Ya know? It was crazy. And it really didn't last very long due to its lack of features. Just a bad series all around IMO. D tier.

Radeon HD 4000 series (2008)

Like the GTX 200 series and the HD 3000 series, it didn't last very long. It, too, lacked DX11, and AMD was horrid with driver support. But it was much cheaper than the 200 series and offered a much better value at the time. By this point the 8800 GT's performance level was offered by the HD 4830 for like $130. Again, GPU prices were crazy back then. This is why I dunk on the market now. Back then, you could get good hardware for cheap. If anything, the CPU was more of a pain than the GPU long term. Even the 4870 was only like $300. Even with inflation, it's nothing like today.

The big advantage of AMD here is that it did drive the GTX 200 prices down. Nvidia got super arrogant and AMD smacked them down. Still, I can't say the HD 4000 series had great longevity, so I kinda gotta give it like a B tier.
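To put "even with inflation, it's nothing like today" in concrete terms, here's a quick sketch converting those 2008 sticker prices into roughly 2024 dollars. The CPI figures are approximate US CPI-U annual averages (my assumption, ballpark only), and the card prices are the ones quoted above.

```python
# Rough sketch: 2008 GPU prices in ~2024 dollars.
# CPI values are approximate US CPI-U annual averages (assumed figures,
# not exact) -- treat the outputs as ballpark numbers.
CPI_2008 = 215.3
CPI_2024 = 313.7

def adjust_2008_to_2024(price_2008: float) -> float:
    """Scale a 2008 USD price by the CPI ratio to get ~2024 dollars."""
    return price_2008 * (CPI_2024 / CPI_2008)

# Prices as quoted in the text above
for name, price in [("HD 4830", 130), ("8800 GT", 250), ("HD 4870", 300)]:
    print(f"{name}: ${price} in 2008 ~ ${adjust_2008_to_2024(price):.0f} today")
```

Even the $300 flagship-adjacent 4870 lands somewhere in the mid-$400s in today's money, which is still nowhere near modern high end pricing.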

Radeon HD 5000 series (2009)

This took aim at the GTX 200 series, although I'd say its primary competitor was the GTX 400 series. It targeted the higher end of the 200 series and traded favorably with the 400 series, often at a lower price. It also had decently better driver support, more VRAM, and DX11 support, making it about as reasonably futureproof as a card could be back then. Gotta keep in mind, 4-6 years was normal back then, and this actually was viable until around 2015-2016, whereas the HD 4000 series and GTX 200 series cards were running out of steam by 2013. A tier. Not quite S, but a solid offering from AMD.

Nvidia GTX 400 series (2010)

A bit late, and a reaction to the HD 5000 series, but it did have better driver support long term. AMD pulled the plug on the 5000 series in 2015 when Windows 10 launched, whereas this got drivers until 2018. It didn't matter a ton because both GPUs were fairly deprecated by then, but I do think the Nvidia cards aged just slightly better here. All in all, also A tier. Both brands did a solid job here.

Radeon HD 6000 series (2011)

And here we have the inevitable refreshes. This one did make 5000 series equivalents cheaper, but they had no more longevity. The HD 6000 series was discontinued at around the same time as the HD 5000 series driver wise, and 1 GB of VRAM was the bane of these cards' long term existence. C tier.

Nvidia GTX 500 series (2011)

Similar to the HD 6000 series, this was a refreshed 400 series. It had slightly more VRAM, but still topped out at 1.5 GB, which didn't help it massively long term. Drivers aged a little better. I guess the advantages it had made it a bit better than AMD's offerings, but neither aged particularly well. I considered giving it a B, but I can't justify it; it's more of a C.

Radeon HD 7000 series (2012)

This one was the GOAT. The venerated GCN architecture that AMD used for generations after this. It sported up to 3 GB of VRAM, and the architecture aged quite well due to being used in the PS4, having relatively long driver support for AMD, and being reasonably powerful for its time. Probably one of the best series AMD ever launched. S tier.

Nvidia GTX 600 series (2012)

Also pretty GOATed. The GTX 660 and 660 Ti were basically 580s with more VRAM. The high end seemed to lag a bit though, and the Kepler architecture kinda aged poorly. This is where AMD "fine wine" became a thing: the HD 7000 series aged super gracefully, and then this Nvidia one just...didn't. Nvidia kept making new architectures while AMD iterated on GCN. And while Nvidia's innovations would eventually help it capture the high end, AMD kinda dominated the midrange for a while and kept Nvidia in check price wise. I'll give this an A tier since I feel like the Radeons were actually the winners here.

 Nvidia GTX 700 series (2013)

REFRESHES GALORE! I mean, there's a pattern: you'd have one series that revolutionizes the market and then a refresh which improved things, but not as much. And yeah, the 700 series was kinda mid. It felt like a bit of a stagnation. The higher end was held back by VRAM, and the architecture aged kinda badly. It wasn't bad for the time, but kind of mid. B tier.

AMD R9/R7 200 series (2013)

This felt like a 7000 series refresh, and I don't even recall much about it. Looking it up, I'd say it had more movement than Nvidia. I mean, the R9 280X was on par with the 7970 GHz Edition for $300, so probably a better value than Nvidia at the time. Heck, it actually does look like really good value, looking it up. This should have kicked Nvidia's butt. It had really competitive offerings all the way down the stack. And my gosh, look at those sub $100 offerings. Not saying those were very good, but they had so many. I think the R7 250/250X were the lowest end worth buying IIRC. But yeah, that's 5850/460 performance for like $100. That's INSANE. This is why I always rip the market now. Oh, and the R9 290X looked like it was on par with the 780 Ti...while costing much less.

Yeah, I'm giving this an A tier. A very strong A tier. It actually kinda makes the 700 series refresh look mid.

Nvidia GTX 900 series (2014)

This series was a bit of a mixed bag for me. If you were at the high end, it was amazing; the GTX 970 and above were very solid. The 950 and 960...they were mid. Very, very mid. Like GTX 660 3.0 level mid. The higher end cards aged very well, with some of them being the first 8-10 year GPUs. This is where we started getting real longevity with GPUs: the driver support, VRAM, and general power of the cards was enough to carry you up to the modern era, so around 2023 or so.

Still, I can't give this S tier because...well...those mediocre midrange cards. So A tier.

AMD R9/R7 300 series (2015)

So, these were a bit more mid I think. The high end card was between a 970 and 980. The midrange was competitive with Nvidia, looking it up, but nothing special. AMD kinda fell off here. It wasn't bad, but it wasn't great. It's like HD 7000 series 3.0 at this point. And IIRC these got discontinued driver wise at the same time as the 7000 series, so yeah, they aged kinda badly. C tier.

Nvidia GTX 1000 Series (2016-2017)

These are the GOATs of the 2010s. You had very solid performance gains, very solid pricing, lots of VRAM, solid driver support, and yeah, a reasonably long life. The 1060 was on par with a 980, the 1070 was on par with a 980 Ti, and the 1080 Ti is still on par with the card I use today. This is the last series where we truly had a lot of progress for the money. Some say they never made another series like this because it was just too good and people wouldn't buy more cards. S tier.

AMD RX 400 series (2016)

AMD kinda flopped at the top end, topping out with the $250 RX 480 8 GB. But it had solid value, and AMD was very competitive for what it brought to the table. I can't give it S tier because, again, it topped out at the $300 range, and we can see this is where AMD really ran out of steam and failed to compete. But still, it deserves at least a B for value alone.

AMD RX 500 series (2017)

It's a refresh. Single digit performance gains. Just the 400 series slightly overclocked. I swear they launched it just so they could say the 580 actually beat the 480 by like 2% or something. And despite that, it also had worse driver support over time. Honestly, the magic is gone. C tier.

 AMD RX Vega series (2017)

These were AMD's high end cards intended to compete with the 1070, 1080, etc. They flopped. They were buggy messes, very power inefficient, and hated by the masses. AMD was kinda imploding by this point. And this is kinda how we got the Nvidia monopoly we have today...F tier.

Nvidia GTX 16/RTX 2000 series (2018-2019)

And this is where Nvidia turned evil. Nvidia raised prices like it was the GTX 200 days all over again. The RTX 2060 was priced more like a GTX 1070: it was the lowest end RTX card at $350. Many of the performance gains were blunted by price increases. This is also where we got ray tracing and DLSS, which should be seen as massive innovations, but given what they did to the industry...no. This is where we started witnessing the death of the affordable GPU. Up through this point, we had options all the way from sub $100 up to $700. But here, Nvidia pushed $1k for the 2080 Ti, $700 for the 2080, $500 for the 2070, and $350 for the 2060.

The 16 series were for the poors and featured a 1660 Ti/Super for around $250-270, so...GTX 1070 performance with less VRAM. The 1650 and 1650 Super were around 1060 level for around $150ish. And those were okay. But I'm going to be honest: this was a very mid series. Groundbreaking in some ways, with the higher end cards still relevant to this day, but only if you were rich. D tier.

 AMD RX 5000 series (2019)

AMD offered the somewhat more affordable RX 5000 series cards, which were a much better value for the money on paper, but they lacked those RTX and DLSS style features and a bunch of other things. They were also buggy and broken driver wise, and by this point AMD's driver support had returned to their norm. So these aged like milk. They would've been a welcome addition to the market in 2017, but by 2019, AMD was kinda in their death throes while Nvidia secured its monopoly status in the market...

F tier. Between this and Vega, they were in their GPU Bulldozer moment.

Nvidia RTX 3000 series (2020-2021)

Another relatively unattractive series from Nvidia. Most of the lineup stuck to 8 GB of VRAM, and while the cards at the higher end had plenty of power, they were crippled by the VRAM. The 3060 had 12 GB though, making it a relatively GOATed Nvidia staple. BTW, they did that because they knew the market would never accept 6 GB at this point. So they unironically made probably the most futureproof 60 card since the 1060 here. But it was $330 at launch, and thanks to the COVID pandemic and crypto nonsense, good luck finding one. This era was just hell for GPU pricing.

It's probably the most memorable of the RTX series, but yeah, I just can't help but hate it. B tier, and I'm being generous.

AMD RX 6000 series (2021-2022)

The RX 6000 series started out very unattractive. They tried to arrogantly compete with Nvidia at the same price/performance points, and it didn't work. Their offerings were inferior. Even when on par in raster, they often had less VRAM at the low end, although at the high end they had more. They had ray tracing, but at sub 2000 series performance levels. Their driver support is iffy on these. They've aged...okay, but given AMD was already talking about cutting support, yeah...AMD really needs to stop doing that. It's not acceptable anymore, especially given how long we keep GPUs these days. It's not like 2008 anymore, where things advanced so fast it didn't matter if a 4 year old GPU no longer got drivers. You do that crap in the 2020s and the internet is gonna hate you for it.

Still, I gotta say, this is also arguably the best of AMD's modern offerings IMO. Post COVID, AMD's GPUs crashed in price first, leading to that golden moment for those 2014-2017 era GPU owners to finally upgrade to a modern card. And having a 6650 XT, it's still a solid value, even to this day. I wouldn't recommend it over a 5050, but it's still available. Probably the GOAT of the 2020s so far, but I can't give it S tier. It's more like an A tier.

And to be fair, it had its stinkers too. The 6400 and 6500 XT were cut down jokes of cards. And yeah, the sub $200 market was dead by this point.

Nvidia RTX 4000 series (2023)

At the high end, this is a very solid series. At the low end, it felt more middling. It's weird to see the "low end" now include $300 GPUs, but again, you see what happened to the market here? The low end market just dried up. The 4060 is the bottom dog here, and it was a middling improvement over the likes of the 6650 XT and 3060. It had less VRAM than the 3060. It was $300 when you could buy a 6700 XT for a similar price. It had Nvidia's tech going for it, and AMD wasn't being TOO aggressive on price here, offering similar performance for $250. I mean, it's kinda what the market was until RAMmageddon. Eh...again, I'm not overly impressed. I guess B tier solely because the high end of the market was thriving, but otherwise I'd give this a C.

AMD RX 7000 series (2023)

This series was largely offered alongside the 6000 series, at similar price points. It was newer and had better tech, and AMD isn't talking about totally cutting driver support for it yet, but it also didn't offer much the 6000 series didn't. They were kinda competing with themselves as much as with Nvidia on this one. They offered a better value than Nvidia on price/performance, but they never could compete with the top end. Not as good as the 6000 series. Idk, I kinda feel like this is the RX 500 series all over again, just not topping out at $300. So I'll give it a B, but it feels like an unnecessary refresh.

Nvidia 5000 series (2025)

I'm not gonna include RAMmageddon in this, as it's neither AMD's nor Nvidia's fault; I'm going based on MSRP and market conditions. I kinda felt meh on this. On the one hand, at the 60 level, the 5060 was $300 and finally offered a pretty decent jump over the 3060 and 4060, but it still had 8 GB of VRAM, which is just barely acceptable. It was a nice jump, but it offered poor value, and in order to get more, you needed to spend $430 on a 16 GB 5060 Ti. This is pre-RAMmageddon. Are they insane? Well, they have a monopoly, and what are you gonna do, buy AMD? Like really, they don't care. They offered a $250 5050 which is basically just a 4060, but again, not a huge amount of movement this gen. In raw power a few GPUs had an okay shift in performance, but most didn't, and 8 GB of VRAM is atrocious by this point.

Honestly, C tier...

AMD RX 9000 series (2025)

AMD's equivalent. Sometimes a better deal. The 9070 XT can be a better deal than a 5070 Ti, for instance, at the higher end. At the low end, the 9060 XT seemed to be a cheaper 5060, at $280 pre-RAMmageddon, with the 16 GB version trading favorably with the 5060 Ti for less. So it does offer more value. But again, AMD still has inferior tech, and their future driver support is iffy. There are drawbacks. Still, depending on the value offered, I'd easily buy it over Nvidia right now. Nvidia is the default brand to buy, but if AMD has better value, it has better value. Sadly, much of that has been erased by the RAM shortage, with prices as high as Nvidia's for the 9060 XT, for instance. The 9070 XT is still compelling at the high end though. Still, we're talking like $350-800 for GPUs now. Where's the low end market? As I said, it flat out died after the RTX 2000 series. So...idk...I'll give it C tier too. It isn't that amazing for the money.

Conclusion

Honestly, I feel like this story lines up with everything else I've been saying about the GPU market going to crap since the 2000 series. You can definitely see the inflection point there, and the subsequent death of the low end and even parts of the midrange market, with the new low end offering prices similar to what used to be midrange at best, even upper midrange. It's a joke. It's terrible. I hate the modern market. GPUs used to be so cheap, and so competitive, and all that's been gone since about 2018. To be fair, even in 2015-2017, we started seeing signs of trouble. The root cause? AMD just failed to compete, and then when Nvidia asserted its dominance with new tech, they could charge whatever they wanted and get away with it. Even more so due to crypto and, more recently, AI. There's a reason I'm like "if things continued the way they used to, we should be getting 5090 performance for $300." I'm not kidding; things used to advance that fast. And now they're super slow. Which can mean longevity, but it also means paying a lot more.
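That "5090 performance for $300" line can be sanity checked with some back-of-the-envelope math. Every number here is my own rough assumption, not measured data: perf per dollar doubling every ~2.5 years (roughly the 2006-2016 pace), the $699 1080 Ti as a 2017 baseline, and a 5090 being ~4.5x a 1080 Ti in raster.

```python
# Back-of-the-envelope check on "5090 performance for $300 if the old
# trend had held." All inputs are rough assumptions, not measured data.
DOUBLING_YEARS = 2.5        # assumed perf/$ doubling period, 2006-2016 pace
BASELINE_PRICE = 699        # GTX 1080 Ti MSRP (2017)
YEARS_ELAPSED = 2025 - 2017
SPEEDUP_VS_BASELINE = 4.5   # assumed 5090 vs 1080 Ti raster multiple

# How much perf/$ would have grown if the old trend had continued
trend_multiplier = 2 ** (YEARS_ELAPSED / DOUBLING_YEARS)

# Price at which 5090-class performance would land under that trend
implied_price = BASELINE_PRICE * SPEEDUP_VS_BASELINE / trend_multiplier
print(f"Old-trend price for 5090-class performance: ~${implied_price:.0f}")
```

Under those assumptions it comes out in the low-to-mid $300s, so the claim is at least in the right ballpark, whatever exact multipliers you plug in.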

Anyway, the tiers, here they are:

S Tier

Nvidia 8000 Series (2006-2007), Radeon HD 7000 series (2012), Nvidia 1000 series (2016-2017) 

A Tier

Nvidia 9000 series (2008), Radeon HD 5000 series (2009), Nvidia GTX 400 series (2010), Nvidia GTX 600 series (2012), AMD R9/R7 200 series (2013), Nvidia GTX 900 series (2014), AMD RX 6000 series (2021-2022)

B Tier

Radeon HD 3000 series (2007), Radeon HD 4000 series (2008), Nvidia GTX 700 series (2013), AMD RX 400 series (2016), Nvidia RTX 3000 series (2020-2021), AMD RX 7000 series (2023)

C Tier

Radeon HD 6000 series (2011), Nvidia GTX 500 series (2011), AMD R9/R7 300 series (2015), AMD RX 500 series (2017), Nvidia RTX 4000 series (2023), Nvidia RTX 5000 series (2025), AMD RX 9000 series (2025)

D Tier

Nvidia GTX 200 series (2008), Nvidia GTX 16/RTX 2000 series (2018-2019)

F Tier

AMD RX Vega series (2017), AMD RX 5000 series (2019)

Some tiers are debatable. Arguably, the F and D tiers should be one tier. You can argue the 2000 series wasn't THAT bad and should be C tier, but I hate what it did to the market, and yeah. On the high end, some of the A tier series are borderline S, but I did reserve S for what I thought were the best.

CPU tier list

So...I saw some people on a hardware forum glazing AMD again and acting like the 5800X3D is "the GOAT" of processors, and it's...not. Don't get me wrong, it's a solid processor, but in the grand scheme of things, I don't think it's "the GOAT." If anything, it's a rather average offering all things considered. But that got me wondering: what ARE the best CPUs of all time? We always talk about the "legendary" Q6600s and 2600ks, and I kinda gotta ask, what's the modern equivalent of that? Spoilers: I'd probably say the 7800X3D, for reasons I'll get to later, but let's have a discussion about this.

For reference, I only plan on going back to 2006 or so. My expertise is spotty before that, and 20 years is a reasonable cutoff anyway. I'll primarily be focused on the generations, with limited discussion of individual CPUs, although I might point out a few standouts, particularly in the midrange and high end. I'm also focused exclusively on consumer grade CPUs, so no mobile, HEDT, or whatever. And another emphasis of mine is gaming performance. With that said, let's begin.

Core 2 Series (2006-2008)

So, the Core 2 series was LEGENDARY. As discussed recently, the 360 and PS3 were built like high end PCs at the time. But because tech advanced a lot more quickly back then, those consoles were obsolete a year into their existence. LITERALLY. The 2006-2008 hardware jumps were INSANE. A Core 2 Duo was such a massive leap over what came before it that it obsoleted everything overnight. And a lot of those CPUs were pretty solid through the entire console generation. If you had like an E8400 or especially a Q6600, you were set until the end of the generation, with the dual cores struggling but the quads holding strong. And then they got to jump to the next LEGENDARY CPUs. S tier.

One more thing I wanna mention: they did have a few different lines of these. The weaker 65nm ones (Conroe, and Kentsfield for the quads) came first, and then the 45nm refreshes (Wolfdale and Yorkfield, like the E8400) were a lot stronger. The Q6600 was technically first iteration and super expensive, but then it got cheap as the later ones rolled out. That one in particular deserves a shoutout.

Phenom I (2007-2008)

AMD was caught with their pants down by Core 2, and struggled to develop a compelling alternative. They matched Intel on core count, but the single core performance was markedly worse, and Intel dominated the gaming charts. The fact is, these CPUs never aged well: the single core performance was deficient, and by the time games used all those cores, there were better options on the market. A common trend for AMD in this history lesson. Not as bad as some CPUs on this list, but an underwhelming showing. D tier.

Nehalem (2008-2009)

Nehalem was a huge architectural leap forward for Intel, but not a huge performance leap. This first gen had barely better performance than the flagship Core 2 Quads, although it offered that performance at a lower price. It did establish the Core i series as we know it, but it wasn't really super strong. Kind of moving sideways to move forward. The i5 760 and the i7s aged better than contemporary processors like Core 2 and Phenom, but they still didn't age well in the grand scheme of things. Kinda awkward. Not bad, but not great. B tier.

Phenom II (2009-2010)

So, I owned one of these things. It was a relatively budget series compared to Intel, but it promised similar performance. However, remember what I said about architecture? Yeah...

This was basically an answer to Core 2...after Intel had already moved on to their Core i series. And remember those architectural improvements? Yeah, these CPUs were about as powerful on paper, but being a bit more dated architecturally, they kinda fell on their face at times, and they lacked instruction sets as they aged as well (a lot of older AMD CPUs struggled in this regard; it's why I have such dismal views of them mostly).

Still, not a terrible outing. The quad cores were the optimal choice here. There were 6 core models, but games wouldn't use that many cores reliably until these were horribly outdated. 

C tier. 

Sandy bridge (2011)

The sandy bridge processors were THE GOAT processors of the entire 2010s, I think. The 2500k was a solid CPU through the entire quad core era, with the 2600k arguably being solid until around 2021. Seriously, if you bought high end, you probably had the first 8-10 year processor. It was a solid 30-40% bump over what came before, and the last real big bump until 2017. It was so good it basically ushered in the "intel stagnation era." S tier.

Bulldozer (2011)

While intel was having their best moment, AMD was having their worst. They somehow made a processor with weaker performance than their Phenom series. They had more cores, but again, more cores were kinda useless at this point in time. But yeah, between intel dominating gaming charts and AMD imploding, we got the intel stagnation era. F tier.

Ivy Bridge (2012)

Basically, more sandy bridge. Single digit gains. If you bought one of these you were likely golden, but they didn't offer anything not already offered. I can't hate on them since they dominated the gaming market and you still had solid longevity, but not as goated as sandy bridge. A tier.

Piledriver (2012)

So bad that AMD stopped making CPUs for 5 years. I'm not kidding. I mean, it was their 2nd generation of bulldozer. It still sucked, just not as badly. At least it beat the phenoms by this point. Still, they trailed gaming charts big time. Between these and the phenoms, this is how AMD developed such a bad reputation overall. D tier. 

Haswell (2013)

Sandy bridge 3.0. Not as goated or as long lasting as sandy bridge, although it did have newer instruction sets, which helped. Still...B tier. Kind of a mid outing.

Devil's Canyon (2014)

Haswell refresh....this one is more popular primarily for the 4790k, which had higher clocks and was pretty solid, but yeah...more of the same. I'm probably gonna go A tier for the 4790k alone, as that was a solid 8-10 year processor, but other than that...just more stagnation.

Btw, during this era I was waiting to upgrade my phenom, but I didn't wanna get caught buying the last quad core generation before intel moved on. I kept thinking, gee, this is the 4th quad core generation in a row, clearly the fever has to break at some point, right? Meanwhile, my phenom aged like milk. Huge reason I gave it a C tier. 

Skylake (2015)

The jump to DDR4 made this the biggest jump in years, and the 6700k was a solid 8 year CPU. Still, the normal quad core started showing its age pretty early into this generation's lifespan, so while 2500k, 3570k, and 4650/4670k owners got a reasonable lifespan, 6600k people very much didn't (see: BF5, a 2018 game). So...kind of a middling generation. 5th year of quad cores, set the 14nm standard, but still, it wasn't great. B tier. 

Kaby Lake (2017)

So...remember what I said about not wanting to catch the last generation of quad cores? Yeah...I was STILL holding out on a phenom II by this point. I wanted to wait for AMD Ryzen, but after disappointing gaming reviews, I bought a 7700k. 

This was, perhaps, the WORST generation to upgrade on. The i7s didn't age that badly; the 7700k was still a solid 6 year processor, but a 6 year i7 is pretty pathetic when most i7s up to this point were 8-10 year processors. And if you got anything below an i7...well...let's just say you were already struggling with BF1, and BF5 ran like crap on quad cores. So this is perhaps one of the worst intel generations ever. I won't quite give it an F tier, but a D? Yeah. It aged like milk. 

Zen 1 (2017)

*sigh*, I really chose a bad time to upgrade, didn't I? AMD really overpromised, in typical AMD fashion. I know AMD fanboys glaze ryzen, but let me explain why, to me, it felt like bulldozer's third coming. It was impressive on paper: up to 8 cores with SMT in an era where intel offered only 4. But its cores were weaker, especially for gaming. Architecturally, it was a hot mess. Intel had an interconnect called the "ring bus" which allowed fast, low latency communication between cores; it was why intel had faster gaming performance despite similar on paper specs. AMD had their CPU stretched over multiple tiles connected by what's called "infinity fabric." Between worse IPC, lower clock speeds, and that latency penalty, the 1700 was running almost 35-40% behind a 7700k in single core gaming workloads. This led to very middling gaming performance. AMD fanboys swore it would give people more longevity, but at the high end...no, no it didn't. I have to say though, the 1600x was a compelling product vs the 7600k, and the extra cores did come in handy there. For most of the rest of the lineup, the two series traded blows, with intel favorable in gaming performance. Intel just had better single thread while AMD threw cores at the problem. Still, I gotta give them credit for innovating. C tier. 
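As a quick aside, "35-40% behind" and "35-40% faster" aren't the same claim, and these get mixed up constantly in CPU comparisons. A minimal sketch of the conversion (the scores below are made-up illustrative numbers, not benchmark data):

```python
def percent_behind(baseline: float, slower: float) -> float:
    """How far the slower part trails the baseline, as a percentage."""
    return (1 - slower / baseline) * 100

def percent_faster(baseline: float, slower: float) -> float:
    """How much faster the baseline is relative to the slower part."""
    return (baseline / slower - 1) * 100

# Hypothetical scores: faster chip = 100, slower chip = 62.5 (illustrative only).
print(percent_behind(100, 62.5))  # 37.5 -> the slower chip is ~37% behind
print(percent_faster(100, 62.5))  # 60.0 -> but the faster chip is 60% faster
```

Point being, a chip that's ~37% behind makes the other one look 60% faster once you flip the baseline, which is why the same benchmark gap gets quoted two very different ways.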

Coffee Lake (2017-2018)

One thing Ryzen did do: it forced intel to get off its butt and release more cores. The next few years would represent a massive shift in performance, with the two companies going all out and things shifting rapidly. By 2020, the i7 7700k would be an i3, and if you bought anything other than coffee lake in 2017, you basically got the short end of the stick. 

So...there are 2 iterations of coffee lake. The first bumped the i5 and i7 series up to 6 cores. The refresh added the first i9, an 8 core 16 thread model. It was still on intel's 14nm process, but eh, it was still fast enough to BTFO Ryzen for gaming. The 8600k offered 7700k-like performance for a bit less money, and the 8700k is arguably GOAT status. It's hard to say though. Hardware unboxed recently reviewed all intel processors from 2017 on, and coffee lake was more incremental than I thought. Still, I feel like if I had a 6 core I probably would've held onto it for a few more years, which...would've helped at the time, but then screwed me with rampocalypse and the 2020s' own version of stagnation.

Still, A for effort. A tier feels fair here. 

Zen+ (2018)

So, this was like Zen 1.5, and it was basically the 2000 series. Not gonna lie, the 2000 series is what I hoped the 1000 series would deliver. Still weaker single thread performance than intel, but closer to 20-25% behind rather than 30-40%. The 2700x was a solid 20-25% improvement over the 1700. Still, it was a bit too little too late by this point. It wasn't awful, but intel had asserted itself as the gaming king, so this was fairly middling as a result. A good kaby lake killer, but not a coffee lake killer at all. I'll say C tier. 

Zen 2 (2019)

While intel was stuck on 14nm+++ by this point, AMD was making strides in closing the gap. On paper, Zen 2 beat intel's 14nm parts, but due to the interconnect latency penalty I discussed, it was still middling in games. Still, it closed the gap ENOUGH that I would finally say zen 2 was worth buying. Intel still had better single core, but AMD was offering SMT, and you could argue a 3600x would age better than, say, a 9600k. And in retrospect, I would say it did. The 9600k was closer to the 3300x, a 4 core Zen 2 CPU that had SMT and, being on one tile, lacked the latency penalty. A solid B tier offering here.

Comet lake (2020)

Intel, still stuck on their Skylake-era process but needing to do something, upped the core counts again, this time matching AMD. And they could generally outperform AMD in gaming scenarios. By this point, the 6700k/7700k were i3s, the 8700k was an i5, and the 9900k was an i7. And then there was a 10 core 10900k. The architecture was clearly dated, but intel still had the edge over AMD, and while I can't say this aged super well given what was to come, it didn't age badly either. Considering it's 6 years later, I'd still consider CPUs like this to be viable, albeit dated. B tier. 

Zen 3 (2020-2022)

Okay, so, this IS where the magic really happened for AMD. Zen 3 is where AMD temporarily took the lead. Intel got stuck in 14nm++++ hell and stagnated, while AMD kept innovating and learning from their own flaws. This generation had the big architectural changes that turned AMD into the powerhouse it is today. They upped their CCD size to 8 cores, so they could have 8 core CPUs with low latency, and then they created X3D a few years later, which supercharged their CPUs, completely overcoming AMD's historic gaming flaws and putting them ahead of the pack. 

Still...is it an S tier GOAT? I'd say no. Mainly because their lead was temporary, and while the series has aged gracefully, it was relatively quickly surpassed by future AMD and intel products. Even the 5800X3D had an intel answer at the time, and that level of performance has since become comfortably midrange. It didn't quite have the dominance of, say, the 2600k. 

A tier. 

Rocket Lake (2021)

I'm tempted to make a "challenger lake" joke, but that's too harsh, and more apt for raptor lake anyway. Rocket Lake is intel's "bulldozer" moment. They tried shifting to a new architecture, only for it to be worse than their last one. The series was always iffy, and honestly, it was always a poor value. 

Still, it was a rather short lived mistake. D tier. 

Alder Lake (2021)

Just like Zen 3 made AMD into what they are today, Alder Lake made intel into what they are. Alder lake is a very interesting series. It was compatible with both DDR4 and DDR5 RAM, with performance comparable to Zen 3 on DDR4 and surpassing it on DDR5. The 12900k even matched the 5800X3D, and that CPU was AMD's answer to intel temporarily taking back the gaming crown. Both CPUs offered similar performance, but zen 3 typically undercut intel on price. 

Both CPUs are also still relevant today and available as budget options. Still, at the time, they were expensive. 

In a way, Alder lake was intel's Zen moment, in a good way. It not only featured a massive node jump, modernizing the lineup, but it also introduced ecores, giving intel unparalleled multithreaded performance. Ecores aren't as fast as Pcores, but they're solid for adding multithreaded oomph.

On the downside, these CPUs ran very hot and were very power hungry. Not like they burned themselves out though. That's intel's next F up. 

A tier. Very solid A tier. Arguably, the 12900k is a modern Q6600. I'll actually stand by that. Its multicore oomph is still i5 tier today, and its gaming performance keeps up with more modern mid tier performers. It also got quite a few price drops over time, which made it a compelling budget and midrange option. I bought one of these for this reason. 

Zen 4 (2022-2023)

AMD made the jump to DDR5 here, and that put them back on top. They matched 5800X3D performance with a normal 7700x, and their platform was arguably superior to Alder Lake's. While quite expensive at first, it sported a roughly 10% single thread advantage, putting AMD on top again, although their CPUs had fewer cores than intel's. Over time it got cheaper, as DDR5 became more affordable (until recently, anyway) and the CPU and mobo prices dropped, and now it's a solid option. The 7800X3D is arguably the GOAT: it released in 2023, is now 3 years old, and intel has no real answer to it even now. And AMD's answer to it is...the 9800X3D...yeah...

All in all, if anything deserves an S tier, it's probably this. Still, I struggle to give it one. Even though the 7800X3D is the GOAT of modern gaming processors IMO, and I had an opportunity to buy one, I didn't, because of platform teething issues. DDR5 is kind of an unstable product, and it's especially unstable on AMD's ecosystem. So I bought a 12900k instead, saving myself $100 and a lot of headaches in the process. Still, if I had to say any processor is the "GOAT" of the modern era, it's this. This is the new 2600k, if anything is. The 5800X3D is arguably more like a 2500k: still a solid product, but it's not gonna age as gracefully. By this point, virtually all midrange CPUs get comparable gaming performance. So...A tier I guess. A strong A tier that might get retroactively inducted into S tier. 

Raptor Lake (2022-2023)

Remember that "Challenger Lake" joke I almost made with rocket lake? I'm gonna invoke that now. Remember how I said that Alder Lake was super hot and power hungry? Well, imagine you're intel, AMD releases Zen 4, and while it's not THAT much faster, you REALLY wanna keep that gaming crown, so you try pushing even MORE cores, with MORE power, and MORE clock speed.....and then it all kinda...blows up? Yeah. That's raptor lake. 

On paper, it was a very solid series. Perhaps not GOAT status, since it never had an answer to the 7800X3D, but it still gave AMD a good run for its money. It bumped the core counts again, turning the 13700k into an OCed 12900k, while 12900k performance could now be had from a midrange 13600k. By this point I was upgrading, and I was salivating over the 13600k and the 5800X3D, and while I did consider the 7800X3D when microcenter ran a deal on it...again, RAM teething issues. I ended up going 12900k for the lower price, and that's what raptor lake's big legacy was in a positive way: it set the bar for the modern era, driving the cost of modern CPUs down. You could now get 12900k/5800X3D performance for i5 prices. You had more cores than you knew what to do with, and to me, AMD's offerings kinda looked quaint in comparison. I kinda realize by this point we're in another stagnation era. AMD is STILL offering 6-8 cores for R5s/R7s, while intel offered 14 cores on the 13600k and up to 20 by the time the 14700k came out. And sure, it might be 2 ecores to a Pcore performance wise, but that's still a lot of processing power while matching the gaming performance.

Too bad they fricking fried themselves from the voltage. And had manufacturing defects from oxidation issues. Really. I have to give them F tier just for that. Otherwise, a solid A tier. 

Zen 5 (2024-2026)

The AMD stagnation era has officially begun. For most of the lineup, it was called "Zen 5%" for a reason: you got a 5% uplift. So we're talking intel 2010s era stagnation here. The 9800X3D was the biggest improvement, further cementing their lead over intel. And now they're refreshing it again with the 9850X3D. Stop, stop, they're already dead.

Still, on that front, I do wanna point something out. Most Zen X3D processors are R7s, basically AMD's equivalent to an i7. They're premium products. When sandy bridge dominated, AMD CPUs performed closest to dual core i3s; intel is still viable well into the i5 and i7 range here, especially with price cuts to older alder lake CPUs. Like, the 9700x isn't anything special. Nor is the 9600x. AMD does have 6 core 7600X3Ds at microcenter, and they are a compelling value these days, just note that you're buying a 6 core...in 2026. Idk, I kinda feel like that's a "7700k" move right there. It's fast...until games want more cores, which they inevitably will. Not saying it's worth buying a multicore monster if you're a gamer only, but I do think the 7800X3D/9800X3D are like the modern 8700k/9900k style CPUs, the 7600X3D is like an 8600k, and the intel CPUs are closer to like...idk...2700xs or 3700xs or something. 

As such, Zen 5 is definitely not S tier either. Idk if anything is. I'm tempted to say B tier. It's the 2nd and 3rd iterations of the 7800X3D followed by rather mid offerings in the midrange. I don't think there is a clear S tier here; the 7800X3D might qualify as an S tier CPU, but the zen 4 series overall is more A tier IMO.

Arrow Lake (2024-2026)

So after raptor lake went supernova and blew up intel (I hope that isn't foreshadowing anything given "nova lake" is the next gen of intel parts to come out), intel kind of imploded into a white dwarf of failure, having their own bulldozer moment once again. I don't think their failure is AS bad as bulldozer, as that was so messed up it killed AMD for 5 years and intel enjoyed massive advantages at virtually all price ranges, something AMD has failed to achieve IMO, but yeah...Arrow lake is slower than raptor lake. They regressed to something akin to alder lake, stepping off the gas so they don't blow up their own CPUs any more, and redid their entire architecture, pulling a zen 1 move by shifting toward a tile architecture with lots of latency. So...yeah. Lots of raw CPU power, relatively poor gaming performance. Not awful awful, and I feel like intel can actually compete at the midrange here, but they aren't taking any gaming crowns. 

The refresh with the 250k, and 270k is regaining some momentum, but it's really just "congrats, you're back at the raptor lake levels of performance you achieved 4 years ago." We truly are in the new stagnation era...

Honestly, Arrow Lake gets a D tier. At least it didn't blow up. But still, it's a regression followed by slowly digging itself out of a hole. I don't think this is truly as bad as bulldozer, but intel is clearly struggling. If anything, this is more like intel's zen 1 moment: a flawed reboot with inferior performance, with the refresh already playing the role of their zen+ moment. 

Conclusion

And that's where we are. Later in 2026 or in early 2027, we're likely to see AMD release Zen 6 and intel release Nova lake. Both are supposed to sport higher IPC gains, meaning we finally get a jump in single core performance. Core count boosts are supposed to happen too. AMD will offer 12 cores on a CCD, whereas intel is offering up to 52 cores, although I think their mainstream models will feature a more tame 28. Intel is also supposed to have a bLLC cache or something, which is supposed to counter X3D. So things could heat up and get interesting.

Still, I feel like my 12900k is a fricking Q6600 over here. 16 cores, 24 threads, decent single core, not the fastest, but it's looking to possibly be a legendary processor at this point. It should've been BTFOed by raptor lake and arrow lake, but between raptor lake...imploding...and arrow lake...also imploding in its own way...it's still standing up there. Not as fast as a premium X3D CPU, but for a 5 year old CPU, it's aging quite gracefully. I can't see myself replacing this before 2030 at this rate. 

So yeah, let's post the final results:

S tier

Core 2 (2006-2008), Sandy Bridge (2011)

A tier

Ivy Bridge (2012), Devil's Canyon (2014), Coffee Lake (2017-2018), Zen 3 (2020-2022), Alder Lake (2021), Zen 4 (2022-2023)

B tier

Nehalem (2008-2009), Haswell (2013), Skylake (2015), Zen 2 (2019), Comet Lake (2020), Zen 5 (2024-2026)

C tier

Phenom II (2009-2010), Zen 1 (2017), Zen+ (2018)

D tier

Phenom I (2007-2008), Piledriver (2012), Kaby Lake (2017), Rocket Lake (2021), Arrow Lake (2024-2026)

F tier

Bulldozer (2011), Raptor Lake (2022-2023)

Honestly, some placements can be disputed. Zen 4 arguably belongs in S tier; I just didn't think its midrange offerings were all that special. Coffee lake is possibly S tier, but I don't think it aged as well as I once thought. Raptor lake was A tier in terms of on paper offerings, it just....fried itself. And yeah, the positions are debatable. But honestly....

S tier- GOAT status

A tier- Strong offerings, but not quite GOAT status

B tier- A decent launch that didn't offer anything special, but wasn't bad either.

C tier- Offerings were mediocre, but had some redeeming value

D tier- Bad and clearly lagging behind the competition, but not legendarily so

F tier- The greatest F ups of all time

And yeah. That's my tier list. 

Summarizing some of my thoughts on feminism from the previous article

 So, I understand I write rambly messes sometimes, but there is one point from that article I do wanna highlight more succinctly, and that's basically this.

I feel like a lot of modern feminism and women's liberation is hypocritical in the sense that it changed the social contract for women, but not for men. A lot of this comes from the whole ideological privilege stuff: it's only progress if we change things for underprivileged groups, not privileged ones. This approach is kind of one dimensional and ignores the greater context of the suffering in our society. I mean, sure, one group can have it worse, but that doesn't make it right for ANY group.

The traditionalist social contract is this: women stay at home and pop out and raise kids, men get jobs and make the money. Men having control of finances put women in a compromised position, often unable to leave bad marriages, so they wanted more power and the ability to get jobs and climb the corporate ladder. I'm fine with this btw, I'm sure AF not a traditionalist. I am, if anything, a more consistent progressive. I mean, I'm the UBI guy. I want everyone to have financial independence and not be dependent on ANYONE. Spouses, employers, what have you. 

But...feminism kind of enforces a "rights and empowerment for me but not for thee" mindset. When we talk about the idea of men NOT working, or wanting to stay home and be "house husbands" or "stay at home boyfriends," people think those men are losers. I mean, again, feminism raised women to the status of men on paper, and again, that's totally a good thing, but it kind of did so within the language of jobs and work, which I see as negative. Because in my worldview, wage labor is just a form of soft slavery, and you're forced into abusive relationships with employers instead of spouses. We really gotta get over this idea that hard work makes us financially independent. I mean, it kinda has the same ring to it as "work makes you free"....gee, who else thought that?

And....it's not just this subject where these dynamics are at play. It's everything. I can't stand a lot of pro choice spaces these days because they're full of loud and obnoxious women shouting "my body my choice," but the second we talk about financial abortion for men, the traditionalist sex shaming comes out. All of a sudden these people sound like evangelicals, saying you shouldn't have had sex, and that if you didn't wanna be locked into child support for 18 years, you should've been more responsible. Again, these folks embrace all those christian ethics. Just not when it applies to women. But men? Well, F men. 

Feminists push body positivity movements, which, again, IN MODERATION, I don't have an issue with. I mean, we shouldn't act like being fat is "okay" from a health standpoint, but honestly, people should be free to live as they want. But then when it comes to men, well, they want the ideal partner. Under 6 foot? Bad. A little bit of a beer gut? Bad. Balding? Don't even bother. Not financially successful? Ew, go away loser. And then they complain there aren't any good men out there, ya know, like incels would about women. 

Again, it's like men shouldn't have standards but women should.

Men are supposed to be the ones to approach women, but then we're chastised or called creepy for doing so. This one is a huge pet peeve for me in particular, given my social anxiety and autism. 

And as we know in the autistic communities, despite the vast majority being straight men, our concerns get suppressed in favor of the minority of left wing activists who are women, gay, trans, etc. We're just supposed to stfu; we're shamed for venting our concerns as straight men, and often unwelcome in those communities. If we express anything remotely similar to the "there's no good men out there" mindset, but toward women, even in moderation, we're accused of being incels. 

Again, it's the double standards I don't like.

Men have concerns. Just because we're historically "privileged" in power dynamics doesn't mean those concerns are illegitimate, which, I think, is a HUGE reason this stuff is so massively unpopular with the majority of people (and arguably a reason the orange moron won again). 

Men have issues with employment and financial success, especially in the modern era.

We have problems with things like child support, child custody, etc. We're trusted less in these courts, and the rulings often go against us.

We're expected to be masculine, to not share our feelings, to suck it up.

We're expected to navigate an increasingly dysfunctional maze of confusing mixed signals in order to date. The dynamics have shifted such that everything feels like it's on women's terms, and we're just blasted for doing everything wrong. 

I mean, younger generations don't even wanna date. And the most regressive men are reacting to these changes by going back toward traditionalism, wanting the 1950s again. Which is bad. Again, wanna make it clear: NOT a traditionalist. Not alt right here. You can call me some variation of "alt left," whatever that may mean (left wing but not traditionally left wing?). But yeah, I'm clearly trying to approach this from an egalitarian perspective.

Honestly, MRA stuff isn't the answer. We need an ethic that sees people as people and pushes men and women as EQUALS. Instead, it seems like men get all the traditionalist social obligations and none of the progress. Maybe if progress means making women more like men, feminism has done that, but as a man, I think the entire traditionalist framework is dysfunctional in the first place. I don't wanna save or preserve it for either side. Just as I'm a "liberal feminist," if you wanna call it that, I'm just as progressive and egalitarian for men. And that means rejection of the entire old social contract related to traditionalism, which all comes from christianity btw. Like, the work ethic is christian; the idea of life being split into institutions like marriage and work with social obligations, that's christian. And I just flat out, straight up reject those "christian" institutions, those "christian" norms, and that entire way of life. I'm anti traditionalist. And I just think, for all the progress women have made over the past century, men need to make progress too. That means breaking the work ethic; that means more rights like financial abortion and more equal rulings in custody courts. And that means maybe women have to put up with some uncomfortable advances, at least in moderation (obviously there is SOME level where you're just harassing people, and yeah, I'm not supporting anything that actually puts women in danger). And it means that maybe, if they wanna find someone at all, they gotta adjust to the market a bit more instead of having unrealistic standards and then asking "where have all the good men gone?"

Again, for me, the big problem here is the double standards. I just feel like to accommodate these changes that have been made over the past half century to a century, that the social contract needs to change for men too. That's progress, not a regression. Again, I'm not regressive. I'm just not a traditionalist who wants to go back to the past. I just feel like feminism only did half the job and now men are taking on all of the social obligations and it's not working for us. And it's time to actually make something that works in the twenty-first century. 

Getting to the heart of why consoles are obsolete IMO

 So...the switch 2 discussion kinda touched on something that really explains why I dislike consoles these days. It really struck me when someone asked "well if a handheld has the capabilities of a phone, what makes it so special?"

I mean, the answer is "the software." Because, again, console game exclusivity is what drove most sales in the past and still drives nintendo console sales today, but to give the more "PC master race" argument here....

That's kind of the reality of consoles. Consoles are just PCs. In the olden days, the ecosystems were different. You had boxes that just played games. They made sense in an era where most people didn't own a PC (ie, the 80s and 90s). They also made sense in an era where PCs were very expensive and went obsolete quickly (like the 90s and 2000s). PCs often weren't a good medium for gaming because requirements were high, PCs were expensive, and progress happened so mindblowingly fast that the PC you bought would be obsolete by the time you brought it home. Back then, having an affordable device that could run games for a good 5-7 years seemed like a comparatively good deal. 

But starting in the 360/PS3 generation, the facade slipped. PCs became more affordable, and consoles became more expensive. Games that would be on PC would be on console and vice versa. Consoles started experimenting with online play, often charging money for it, while PC didn't. That's what drove me toward PCs most I think. And when I got into PC, I started learning that you could run console games at higher frame rates and resolutions as well. 

After a while you kinda realize that consoles are just PCs. I wanna repeat that. CONSOLES ARE JUST PCs. They have CPUs, GPUs, RAM, storage, an operating system, etc. And if anything, they're just a walled garden. You got more limited capabilities, and as we shifted toward online, that pushed people toward paying to use that console's online ecosystem. 

And....PCs got the whole "PC master race" thing going because after a while, PC gamers realized they WERE getting the superior experience: playing higher end versions of the same games while paying not much more than what consoles plus their online subscriptions cost.

And with steam, games were CHEAP. 

And heck, with consoles, we kinda knew what the consoles were packing by this point and what the comparable hardware was. For example, the Xbox 360 had like some triple core processor with 512 MB RAM and a Radeon x1950-class GPU. The PS3 had a "cell" processor which was roughly equivalent, with similar RAM and a 7800 GTX-class GPU. 

Given the 8800 GT came out right after this, people could put together a core 2 build (quad core or dual core) with 2-4 GB RAM and something like an 8600 GT or HD 3650 and get similar performance. If one invested in something like an HD 3850 or 8800 GT, they would get comparable performance until the end of the console generation. By the end of that generation, people with core 2 quads, phenom II x4s, or newer i5/i7 CPUs, with 4-8 GB RAM and something like a 460, 560, or 660, would be packing specs that could largely keep up with the next gen for the first half of its generation, while massively surpassing the 360/PS3.

The PS4 and Xbox One wielded basically an 8 core AMD Jaguar CPU at like 1.6 GHz (so like a quad core desktop equivalent), 8 GB RAM, and an HD 7850ish GPU (the Xbox One was closer to a 7790). 

The PS5 is like an underclocked 3700x (so like a 2700x) with 16 GB RAM and an RX 6700 GPU. 

Again, consoles are just PCs. 

Hell if we wanna be honest, the switch 1 was just a modded nvidia shield tablet with more RAM. It had raw specs similar to the PS3/360 generation hardware, maybe a little weaker on the GPU front. The switch 2 uses some custom Nvidia chip with like I think 12 GB RAM and the GPU is the equivalent of like a 1050 ti or something, except newer. 

Again, it's all PCs. Everything is a PC. Hell, your phone is a PC. My razer edge handheld has a modified snapdragon 888+ in it (rebadged as the G3X Gen 1; basically it's an 888 with active cooling), with 6 GB RAM and an Adreno 660. I estimate its power as comparable to an i7 2600k or i5 6600k with a GTX 460 class GPU. So a bit weaker than the steam deck or a switch 2, but still substantial. I remember when that was a beefy computer, back in the early 2010s. It's not THAT great these days, on par with a budget laptop, but yeah.

The steam deck, also a PC. It's like an underclocked Ryzen 3 3100, with 16 GB RAM, and a RDNA2 GPU on par with like a GTX 760/1050, or a RX 560. Not great, but entry level, better than the above. 

And that's the thing. The razer edge is able to do what it does efficiently, in a small form factor, and has a reasonable battery life, like 4-10 hours depending on use case. The steam deck goes dry after 1.5 hours, or 2 in the revision, when hammered. To be fair, the edge can go lower if I use the 144 hz refresh rate, but at 60 I save a lot on battery. To me, that's what's acceptable for a handheld: 3-4 hours minimum is a reasonable expectation, 1-2 is unacceptable. At that point, why even bother making it portable? And again, that's why I rag on the switch 2. It's too powerful for a handheld, to the point that it's overpriced and unwieldy, but it's also an underpowered console, given the chip in real consoles is 4x stronger. 

So...yeah....is a switch 2 a PC? Always has been. For all the talk of consoles being special, they're not. They're just custom PCs with custom OSes that turn them into a walled garden, and then nintendo adds some gimmicks to them like detachable joycons and the handheld/PC hybrid thing. But it's ultimately...just a PC. 

And that's why I, once again, dislike nintendo's mentality. It's very much stuck in the past of needing a specialized PC to run their games...when they could just launch stuff...on PC...or android....and we could play their games there. But no, they wanna stick to the walled garden setup and it rubs me the wrong way. 

Again, if you're a PC gamer, you'll understand that everything is a PC at the end of the day. Smartphones are just tiny pocketable PCs optimized for portability and battery efficiency. Tablets are big smartphones without the phone part, also optimized for portability. Laptops are just desktop PCs made portable. For a while we had surface tablets which were laptop/tablet hybrids. And consoles are just specialized PCs for playing specific games. 

Again, it's all PCs. And when I criticize nintendo, I'm criticizing the fact that 1) they have a walled garden and refuse to join the omni-ecosystem I mentioned earlier today, and 2) they could have chosen anything on a wide spectrum of specs. They could have focused on more portability, or more power. They could have focused more on a budget experience, or a more power user type one. They wanted a machine that does it all, and it technically does, but it does it all poorly. It's too weak to compete with traditional consoles. It's too bulky and overpriced to be a true handheld. It just sucks at everything in a way where I'm like "ew, I don't want that." 

And yeah. To go back to the nintendo fanboy's point about "what separates it from a PC or a phone?" Nothing. It is a PC or phone. It's just a matter of form factor and price. And maybe I do want a PC or phone like experience? I mean, my "home console" is a gaming PC. My handheld is basically an android tablet with a controller attached. And in the future, if anything, I'd like to see a more unified ecosystem between the two. Currently, android is android and windows is windows, but I know some are experimenting with windows emulation on android, and if valve decides to step into that market by making steam work on android....well....that would be great. Because phones are powerful enough these days to run most gen 7 and even early gen 8 games if the software is there for them. And I'd LOVE to play my older PC games on my edge. So gabe, plz consider. I'd love a steam phone or a steam tablet at an acceptable price point (as in, cheaper than a steam deck, or around $200-300). Or alternatively, the ability to turn any old android device into a steam machine. Just saying. 

Do women need to chill on their views toward men?

 So, some are gonna blast me for this, but Asmongold had a video about how young women have grown to hate men, and have dating standards incongruent with the actual dating market. I guess this is the opposite side of the coin from the "are gen Z men okay?" type posts I've made since the 2024 election.

As we know, men have been driven right, and women left. A lot of this polarization has completely destroyed dating mechanics among gen Z. There's a lot of tribalism on both sides, and normally I put young men supporting MAGA on blast, but after watching this, I believe women deserve criticism too. 

A lot of women seem to HATE men. Like, the stereotype of "man-hating" women seems true, at least based on some of the data presented. I can get SOME of that. I mean, if you're dealing with conservative men who want a tradwife, where you're supposed to be barefoot and pregnant all of the time, and who have super regressive views, yeah, exclude those frickers from your dating pool. I get it. But it seems like a lot of women DO have their own "incel" problem, as Asmongold pointed out, or a "femcel" problem, where they just seem to hate on men all day and can never find the right man because their dating standards are ridiculously high. And...again, pot calling the kettle black given I have unrealistic standards myself, but I'm more than willing to admit my own faults on that subject, and I have standards virtually no women seem to meet. But here, it does seem like women do have ridiculous standards.

Like, on politics, I can understand needing some level of worldview and lifestyle compatibility. You don't want someone who sees you as breeding cattle, I get it. But then most women would refuse to date someone for a differing view on, like, gaza or some crap? That's mental. And the fact that this kind of exclusivity is driven primarily by women is a bit insane to me. Because again, some political differences can't be reconciled, I get that. But I feel like modern radical feminism, ie, what the right calls "gender ideology" and what I call "wokeism", is just toxic. It really is. It encourages an insane level of tribalism where ANY deviation from expectations just puts you on these peoples' craplist. To me, purity tests can be justified, but pick your battles. Really. I mean, this isn't all that different from the leftists hating on AOC over israel. Like...let's be blunt, why TF should MOST americans care about other peoples' opinions on israel? It's just bonkers to me. And it's all sides. Like, I had a long time friend from college unfriend me recently because she's a die hard zionist and I've been more critical of israel lately on social media. It's insane to just push people out of your life who you've known for years over this issue. Like, totally mental. But for most of these women, it's the opposite: you bet they're for palestine, and if you have ANY pro israel views, you're gone. And if you're nuanced like me, well, you're especially screwed, because both sides end up dogpiling on you for not toeing their line exactly. And here's the thing. Does this crap even matter? Again, something as basic and central as whether your partner thinks you deserve rights, I can understand how there's a basic disconnect there. But this is just stupid.

An issue that is arguably more within the purview of lifestyle, but that I feel passionate enough about to discuss here, is that of the "stay at home boyfriend" thing. Like, a lot of women JUST don't like that idea. I mean, I can get you don't wanna support someone else, okay. You do you. BUT...I'm gonna be honest. This whole femcel thing of "there's never any good men left"? Yeah...about that. Look. Here's my honest view on this issue, as a "NEET", and as someone who in a relationship would ideally wanna be a "stay at home boyfriend" of some sorts. 

As far as feminism goes, I'm fine with women having rights. I'm fine with women working. I'm fine with people doing what they want. Honestly, I don't get the work thing, because I feel like capitalism is a soft form of slavery. But I can understand, coming from a position of being reliant on men who would often be abusive and not see women as equals, why women would view work as more...liberating. 

However, I also think, in the 21st century, we need to have an honest talk about work and it being basically a form of slavery we don't call slavery. And...I think in terms of dating, we should talk about the whole stay at home boyfriend thing. The economy is not what it used to be for men. And millions of men have been displaced from it, mostly through no fault of their own. It's tough out there. We don't have the opportunities, and what we have to settle for in the form of low wage service jobs sucks. And I know a lot of it has to do with education, women are becoming more educated than men, but for men...it sucks too. I went to college and came out of it unable to do much with my degree. And it was demoralizing. And that's how I became a NEET. I'm a victim of the 2008 financial crisis and the fact that no one was hiring for YEARS after that. I'm also a victim of deindustrialization and the fact that the job market in a lot of areas just sucks these days. Now, you can argue some aspects were my fault. I chose a poor major, and post deconversion from christianity, I lost what one would call a good "work ethic" (because that work ethic was tied to the christian worldview I've come to reject). But that's also why I'm able to criticize the system as it is. Rather than be all wrapped up in this christian work ethic, I have a nice solid dose of "reality" there and see things for what they really are. I left "the cave", and quite frankly, I have different views on the subject. To me, work isn't liberating. It's literally slavery that we don't call slavery. And I seek liberation from this system. 

In a lot of ways, to go back to feminism, I think the tradeoff is weird here. The social contract changed for women. Which, don't get me wrong, as a progressive liberal type, is positive. More options is nice. But it never changed for men. And the world HAS changed for men a lot in the past half century or more. We're still expected to be breadwinners, to have it together, to be everything to women, but then women so changed their side of the contract that it makes it impossible for us to date properly. I've discussed previously that we often can't approach women in a healthy way. Dating sites are a fricking joke. Apparently women get their pick from dozens, if not hundreds, of men, while men get no bites back unless they're super tall and super handsome, and have the right job, blah blah blah. We're expected to be breadwinners in an era where jobs are hard to come by and millions are pushed out of the job market. No one wants to date us because we're "losers" if we don't work. And the women on the news segment that asmongold covered were like "ick" at the thought of dating a guy without a job. And it's just...awful. It's fricking awful for men. 

And I'm not gonna say the answer is to become MRAs or incels, it's not. That's just more toxic gender ideology, just from a masculine, conservative point of view. I don't want to bring back traditionalism, or machismo, or stoicism, or this whole jordan peterson/andrew tate "hit the gym" mentality. That stuff is toxic, and I reject it outright. I also reject a lot of asmongold's conservative takes in HIS video. 

But...I'm going to be blunt. We need a new social contract. It's fine for things to change for women, but men need changes to their own expectations too. And I'd like to see things like, say, a UBI, and less emphasis on work being this end all be all of everything. Dual income households are not an improvement in my view. Needing both partners to work to keep a household afloat is NOT liberation. It's just another form of enslavement. Men needing to be breadwinners keeps us in this matrix of work centric social standards that we should wish to free ourselves from. All of the stuff I talk about, UBI, medicare for all, this stuff should LIBERATE us. Liberate ALL of us from all forms of coercive arrangements. A UBI frees women from bad relationships with men. It does. Finances are often what keeps abused women in bad relationships. A UBI would allow women to walk away more easily. And a UBI would free men from the coercive aspect of jobs and work. And it would ideally free them from the social expectations that women place on men, or at least reduce those expectations somewhat. 

Basically, all of this stuff would free us from crappy social obligations that don't make sense. Make no mistake, women's liberation is a good thing. I'm not a traditionalist. But...one hypocrisy I notice among feminists is that they're all for progress for themselves, while holding men to traditionalist standards. And I think that's at the root of the problems discussed here. Women don't want to be tradwives, but they want tradhusbands. And, quite frankly, we need to just be anti tradition. That's the ultimate destiny of the left IMO. Women had their own liberation from "the patriarchy", but men need liberation from the economic coercion to work, in my view. Again, a UBI would help this. It would make the idea of dating someone without a job a bit more normalized. Because both parties would have an income stream, job or not, and both could contribute somewhat. I can kinda understand why women wouldn't want someone to be dead weight, but again, UBI would solve that. Otherwise, it's just, "well, you gotta sell yourself into slavery to have a relationship," and that just ain't attractive to me. I'm sorry, it's not. Ya know?

Just my view. And in the modern age...again, women are free to have the standards they want, and as someone with unreasonably high standards myself, I have no room to judge, BUT...at some point, women are gonna have to take what they can get, and if that involves someone not in the workforce, or someone who isn't 6 feet, or is balding at the ripe old age of 27, or has a beer gut, well, maybe you should consider it? Otherwise, it looks like gen Z is just gonna be a hyper polarized mess of no one dating anyone, the next generation not being made, and our population pyramid becoming a space needle. 

Oh, and this has been on my mind, but if that happens, we might end up backlashing into traditionalist barbarism sooner or later, as people blame "women's liberation" for it and push a return to "the good old days", since people seem more inclined to wanna return to some idealized version of the past than move toward a future that actually works. Just saying. It's a pattern I've observed. How many human centered capitalists are there, and how many people instead become alt right MAGA hacks? Yeah. Just basic math. My own take? Fix the underlying social issues and change our social standards. But a lot of people would rather revert to the past than take that road less traveled. Just something I fear for the future, ladies. From your friendly progressive male who isn't an obnoxious hyper feminist but obviously hates MAGA and traditionalism too.  

Wednesday, April 15, 2026

Why the switch 2 is the worst of all worlds and why I find it so unattractive

 So...I got into arguing with nintendo fanboys again. Wouldn't recommend it. They're some of the most deluded and tribalistic people I've come across outside of the republican party. But I do want to articulate some points I have regarding the switch 2, and why I find it so unattractive. None of this is new, and my opinions can be derived from previous things I said, both related to the switch 2 and other similar items like the steam deck, but I feel like it's worth discussing. 

Too bulky and expensive for a handheld, too weak for a home console

The biggest critique I have in particular is that the switch 2 occupies that awkward steam deck-like space of being the worst of all possible worlds. The switch 2 isn't just a home console, nor is it just a handheld; it wants to do both, and in doing so, it made design tradeoffs that make it awkward for both propositions. 

First, the handheld side of things. As I said, I think if you're gonna release a handheld, it should be cheap and portable. It's not gonna be the best power wise, but it doesn't have to be. I mean, that's the ever existing tradeoff of portability. Portable consoles have always been 1-2 generations behind the home console or PC equivalent. The game boy and game gear were 8 bit NES/Master System-like systems when the SNES and Genesis were the home consoles. The Game Boy Advance was closest to, say, a genesis 32x in the gamecube era. So 1.5 generations behind. The DS was an N64. The 3DS was like a dreamcast or gamecube. The switch 1 was like an Xbox 360 or PS3. The Switch 2 now is pushing PS4 Pro type specs. But for some reason, people want the thing to compete with the PS5. And that's problematic. 

It's trying to repeat the Steam Deck strategy of cramming PS4 type specs into a handheld form factor. And while, due to releasing several years later, the Switch 2 is stronger than a steam deck, it's still very weak compared to a PS5. A PS5 or similar computer is 4x stronger on GPU alone, ignoring other specs. So trying to run PS5 ports is not gonna be a good time. Quite frankly, people are marveling that it runs cyberpunk on minimum at like 40 FPS, but honestly, is that something we should celebrate? It's a bad experience compared to a home console or a dedicated PC. Sure, it's portable, but at this point I'd rather focus on older games that run well than newer ones that don't. 

And what are the tradeoffs for this? High price, bulk, low battery life. It's the game gear all over again, and the game boy beat the game gear for a reason. The point of a handheld is portability. Just because you CAN cram more power into something doesn't mean you should. And I can't imagine a parent would appreciate an expensive $450 device getting smashed, because kids be kids and kids are rough with their stuff. Nintendo handhelds used to be built like tanks and we STILL managed to break them, because kids are rough with each other and their stuff. But you know, something costing $250 or less is a lot more swallowable than $450 if you gotta replace it. 

A device like the switch 2 is something you never wanna take out of the house and always baby. It's not something your kid is gonna play in the grocery store while you're shopping. They're not gonna play it in a restaurant while waiting for food. For all the talk of portability, I NEVER see kids playing with switches in the wild. It's always phones or tablets. Because android devices are cheap and can play games too. And kids wanna play their fortnite, minecraft, and epstein forbid, roblox (seriously, if your kid plays this keep an eye on them, I'm sure you're familiar with the pedo controversies with that one). Even more so, they're likely to watch pewdiepie or markiplier stream a game or something. Point is, the switch 2 isn't very portable. And I NEVER see anyone using one. It seems to be a "home handheld" due to the cost, not something you take with you literally everywhere, which is the point of these things.

So....the switch 2....bad as a portable console. But as a home console, again, it's still markedly weaker than the 6 year old PS5 and Xbox Series X. In all fairness to nintendo, they've played the "weaker" game for a while now. For a while, it was arguably a price point tradeoff. You could buy a more expensive console that's better, but you'd be paying 1.5-2x as much. And nintendo consoles always have weaker third party support as a result. Because a lot of developers don't wanna bother trying to scale down games to be a generation behind the competition. So you get a mixed bag of inferior ports with obvious compromises, if the game makes it to the platform at all. And many don't. 

But then people say....well....it seems like what you want is a phone. And their argument is that a kid can just play games on a phone, so Nintendo has to innovate and do something different here. But again...maybe what I want out of a handheld IS a phone?

I mean....my main handheld is a razer edge. It's a tablet with a relatively premium chipset in it (although it's aging by this point) and a razer kishi controller attached. It runs most games in the play store, and does a lot of emulation just fine. And it IS more portable. And you know what? Because it is a tablet as well....I can talk to friends on it, surf the internet, watch videos. AND game. 

Honestly, I think the primary reason this thing got a bad rap was because it was released as a premium device. It was $400 at launch, $600 if you wanted 5G, it flopped at that price point, and razer never supported it well post launch. And yeah, I bought it toward the end of its life cycle in 2024, when they were discounting them to $200 just to get rid of them. By that point, $200 was a good price point, competitive with retro handhelds with similar specs at the time. And it's a pretty nice device for $200. I can see why it flopped at $400, as that's a huge asking price, but for $200? It was a steal. 

With that said, what's wrong with a $200-300 phone like gaming handheld? Again, people argue it's not powerful enough or unique enough, but for me, the point of nintendo consoles is the software. You can't play nintendo games on the google play store. You can't play them on steam. You gotta buy them from nintendo. Honestly, why not do the nvidia shield model of releasing nintendo exclusives? I mean, maybe piracy. Nintendo is very protective of their IP, but as gabe newell of valve would say, piracy is a service problem, and maybe nintendo is just out of touch with consumers here. Of course, they're very prideful and don't wanna change, and quite frankly, their Japanese company culture is stuck in the past. Which finally brings me to the other point.

Their business model is stuck in the past

So, here's a basic economic lesson I think nintendo has to learn. And it's a huge reason I struggle to justify buying nintendo consoles. For my whole life, the general economics of video games has been this: you have multiple options, but you can only afford one. Sega genesis or SNES? Playstation or Nintendo 64? Gamecube, PS2, or Xbox? Wii, Xbox 360, or PS3? Etc. And PC kinda became a fourth option over time, one I trended toward after that.

But generally....I could only afford one. And that locks you into that company's ecosystem, with the pros and cons. Nintendo's pros being....good exclusive franchises, you got your mario, your mariokart, pokemon, zelda, etc. But again, due to the weak/underpowered console thing, often poor third party support. You get inferior ports of games, if you get ports at all. And you get locked out of the rest of the ecosystem.

This basic economic tenet is still relevant today. If you have limited money to spend, you wanna get the best bang for your buck. And thankfully, due to the proliferation of PC gaming and the digitization of everything, I feel like the rest of the industry is trending toward an "omni-ecosystem", so to speak. Console exclusivity is becoming a thing of the past. While there's still somewhat of a rivalry between sony and microsoft, I feel like as a PC gamer I get the majority of titles from both. I know Sony seems to be backtracking from this, and xbox is kinda dying because of this. But I think it's been good for consumers. Locking people into locked down ecosystems is generally limiting to consumers. Releasing games on multiple platforms is good for them. And as a PC gamer, I generally go for a decent gaming PC as my "one option" to maximize my access to games, and I use an android handheld as discussed above for portability.

But that's where Nintendo falters big for me. Nintendo is stuck in the past, in an era of console exclusivity and forcing people to buy their specific hardware to run their specific software. And console exclusivity is a good strategy for getting people to invest in an ecosystem, but we have the "omni-ecosystem" now, and they're the only holdouts of the old world. And honestly? Their value ain't great. Historically, you could argue the cost was low enough that some WOULD consider a wii or a switch as a second console. But for me, I always considered that a rather borderline proposition. I mean, I can sustain a PC up to this point, and then spend $200 for an entry level android device for on the go. But then nintendo wants me to spend $300 to buy THEIR console, and with their new one, $450? And for what? Just to buy their games? If I only played nintendo games, that would be a good deal, but remember what I said otherwise. Their consoles are underpowered, their third party support is crap. You get inferior ports when you get ports at all. And often times you DON'T get ports at all. So...nintendo is forcing me to choose between their ecosystem and access to the larger omni-ecosystem of the digital age. And while, I guess, I could swing it if I didn't invest in such a nice PC to play games from that ecosystem, that would inevitably cost me access to other games I care about. Battlefield games are typically rather demanding, for example, and they've been my main multiplayer games since around 2010ish. COD isn't on nintendo either. And while you can play them on older hardware, it's not the best experience. And I'd just fall behind on other AAA games I care about. Starfield doesn't run on weaker devices. Hell, it is barely functional on my PC. It doesn't work for compromised "steam deck" or switch 2 type experiences at all. And that's the engine a hypothetical Elder Scrolls 6 or Fallout 5 would run on. Doom the dark ages and outer worlds 2 are both mediocre games, but I was into them. And they won't run on compromised hardware; they barely run on what I have (well, OW2 arguably has breathing room, but Doom doesn't). 

I mean, this is the bad dilemma nintendo's business model forces on me. Do I really wanna pay extra just to play mariokart, the new mario game, the new pokemon game, etc? Don't get me wrong, I love those franchises, but they're not so essential to my gaming experience that I HAVE to have them.

The fact is, Nintendo tries to throw their weight around and push people into their ecosystem. And for some nintendo die hards, it works. I know some will pay whatever nintendo charges for those games, because they're the franchises they grew up with and they have to have them. But I don't. My tastes have matured over the years. Yeah, I grew up playing pokemon and mario like everyone else (even then, not consistently, I was a sega kid), but over time, Nintendo's franchises just started mattering less to me. Especially when the ecosystem tradeoffs of "nintendo vs everything else" are at play. Sorry, Nintendo, you're just not that important. And for me, they gotta adjust to the market. Which means...less console exclusivity, and joining the omni-ecosystem with everyone else. 

By the way, if I did have to choose of the big three, I'd easily choose microsoft. id Software (Doom/Quake), Bethesda (Fallout/TES), activision (COD), 343 (Halo). Microsoft really does have a powerhouse of games I find essential to my gaming experience these days, and I'd miss them a lot more than I'd miss mario or pokemon, believe me. But...honestly? I'm fine with exclusivity just dying and everyone being in the same general ecosystem. I thought the world was ending when sega discontinued the dreamcast, but now sega has stuff on every other platform. And while their game quality has largely declined over time, I still like having the flexibility. And that's what I ultimately value. The best value for the money. The most games for the least amount of money. When Nintendo wants to charge a premium for hardware and for games, they lose a lot of clout with me. And all that exclusivity stuff the fanboys seem to love for some reason ("but but, if they didn't do that, what would separate them from phones?" Exactly...maybe what I want IS...a gaming phone for a handheld, or a gaming PC for a console....these devices are just PCs....just with custom hardware and OSes, that's ALL THEY ARE, and the sooner we realize that, the sooner we can move beyond consoles and just embrace the omni-ecosystem), I just hate. Because it's not a good value, and it feels antiquated, like the product of a past era of gaming that the digital age has allowed us to grow out of. 

I want to play nintendo games on my phone or tablet. I want to play them on my PC. If nintendo wants to release some sort of specialized gaming oriented device, like a gaming phone, that can do that, well, that's all well and good. But they don't wanna. Because they're stuck in the old world. And they're too prideful to change, and fear losing profits. But...as I see it, they're losing me as a customer by gatekeeping their stuff behind extra paywalls. Because that's what a console is. A paywall. And then they overcharge for their games, never put them on sale, or put them on sale with such a shallow discount I could buy a 6 month old game on steam at a similar price. And yeah. It's just unattractive.

Conclusion

And yeah, that's where I'm at with nintendo. Btw, I take more shots at them lately because, on top of it all, their switch 2 IS very expensive for what it is, and I fear that has second order effects for the other consoles. We're talking $650-1000 for next gen consoles. $650 isn't TOO TOO bad, but given the PS5 currently goes for that, and went UP from $500....the next gen consoles probably will cost close to $1k. So that's why I normally dunk on them.

But I also wanted to discuss the problems I have with their business model beyond that. The fact is, the switch 2 is an awkward value proposition. It's an oversized, overpriced, and overpowered handheld, but on the flip side, it's still underpowered for a home console.

And again, the whole "console" thing ain't doing it for me any more. They're forcing exclusivity in an era where the rest of the ecosystem is more unified. And the financial barriers associated with that always leave me with the bitter pill of either buying nintendo to play the handful of franchises they make that I like, or just...playing everything else on what I'd otherwise buy. And I would literally choose "everything else" over nintendo and their exclusives, especially given their third party experience just flat out sucks. Even on the switch 2 it sucks, due to the console STILL being massively underpowered vs even the 6 year old PS5 or an equivalent gaming PC. Like really, the specs are stuck in like the early to mid 2010s at best. Which...don't get me wrong, can put out decent visuals on paper, BUT...when games are designed for hardware 4-6x better than that...you're just not gonna have a good time on that tier of hardware. And given their software costs as well, and the fact that unlike on steam they never seem to put things on sale, it's like, yeah, why bother?

Idk, I just can't get into nintendo. Maybe at one point I found their value proposition more attractive, but even in retrospect, I often didn't. I often did buy other consoles at the time as well. Sometimes I regret it. Like...for all the talk of crapping on nintendo, honestly? I NEVER liked Sony. I think their exclusive franchises are boring AF, and yeah, I don't even know why everyone loved the PS2. I bought one out of peer pressure and in retrospect wish I had an xbox or gamecube instead. But I digress. The point is, nintendo has been increasingly unattractive to me since we started trending toward a more unified ecosystem in the 360/PS3 days. And yeah, it has been that long. You could at least make an argument for them in the SNES era, the N64 era, the gamecube era, but after that it was like....okay....you got underpowered consoles, poor third party support, and I'm literally cutting myself off from the rest of the gaming ecosystem just to play a handful of franchises I like. I just never found it viable. Get with the digital age, nintendo, it's been long enough. You guys are like the apple of gaming. And not in a good way. More in a "walled garden" kind of way.