So I decided to do an interesting thought experiment about how standards of performance have changed over time, given the discussion on ray tracing as of late and how I basically said people don't wanna do things as we did in the distant past.
I was originally gonna focus on the SNES/Genesis as the baseline, since those were the last 2D consoles, but I found some interesting stats with gen 4 that also apply to gen 3. Virtually all consoles on the Nintendo and Sega side of things ran at 60 Hz at around 240p. To discuss specifics:
Gen 3
NES- 256x240 @ 60 Hz
Sega Master System- 256x192 @ 60 Hz
Gen 4
SNES- 256x240 @ 60 Hz
Sega Genesis- 320x224 @ 60 Hz
Sega CD- 320x224 @ 60 Hz
Sega 32x- 320x240 @ 60 Hz
So, resolution-wise, standards were pretty low, but games could run up to 60 FPS. A lot of 2D games apparently ran at that frame rate (albeit with some slowdown), but if you wanted to run 3D games like Doom or Star Fox, you'd chug along at like 15 FPS on the SNES and 20-30 on the 32X. I know for a fact these systems weren't running 3D games at a full 60 FPS because the lag was kinda terrible in retrospect.
With that said, let's make a transition to 3D.
Gen 5
Sony Playstation- 256x224/320x240 @ 60 Hz (2D games: 60 FPS; 3D games: 30 FPS)
Nintendo 64- 320x240 to 640x480 @ 20-30 FPS
Sega Saturn- 320x224 to 704x480 @ 30-60 FPS
Here, they don't even go by Hz. I guess all the consoles were capable of 60 on paper, but it seems obvious that the standard was 60 FPS for 2D and 20-30 FPS for 3D. A lot of N64 games ran at 20 FPS and that...tracks, in retrospect. It is surprising to see that the N64 and Saturn were capable of 480p but didn't use it, though given the frame rates, I'm not surprised. Looking at a few 3D titles on the Saturn that measured FPS, I'm noticing Quake hovered around 20 and Tomb Raider would have frame drops as low as 10 or so.
So yeah. Gaming was rough back in the day. I will admit, UNLIKE RAY TRACING, the jump to 3D was the most mind-blowing gaming experience of my life. Like, you had to be there, but I remember getting an N64 for the first time and being like HOLY CRAP coming off of the Sega Genesis and 32X. Doom was the most impressive experience I'd had before that point.
But yeah, you were largely lucky to get 30 FPS. It was worth it, don't get me wrong. Again, gaming improved by quantum leaps back then.
Oh, and because PC games were kinda weird back then, I'll discuss those too. A lot of them used horribly low resolutions as well. 320x200 was the low end, 640x480 was typical, and 1024x768 meant you were pretty ballin' back then. And of course games could run up to 60, but it ultimately depended on your hardware. I'd imagine 640x480 @ 30 FPS was the average experience.
Gen 6
Sega Dreamcast- 640x480 @ 60 FPS
Sony Playstation 2- 640x480 @ 30-60 FPS
Nintendo GameCube- 640x480 @ 30-60 FPS
Microsoft Xbox- 640x480 to 1280x720 @ 30-60 FPS
With gen 6, we saw a general bump up to 640x480 as the standard, which made sense given we still operated on CRT TVs and HDTVs weren't prevalent yet. While games could run up to 60 FPS, it seems like outside of the Dreamcast (which had weaker visuals), many games didn't target it and instead aimed for 30 FPS. Even on the PS2, which often seemed to target 60, you'd sometimes have games rendering "half frames" so they effectively ran at 30, and a lot of heavier 3D ports also targeted 30.
Back then, 30 FPS was acceptable because 3D was pretty primitive. I mean, a lot of devs wanted to maximize graphics, and frame rates were considered an afterthought. I'm under the impression most gen 6 games on average ran a bit better than gen 5, but yeah.
As for PC, on Windows XP you'd run games from 800x600 up to 2560x1440 depending on hardware. Of course, 2560x1440 was quite high end, similar to 4K today. A lot of gamers aimed closer to the lower end of that spectrum. 60 FPS was common for lighter titles, with people aiming for 30 in heavier ones.
Also, because the XP era spanned both gen 6 and gen 7, you can probably imagine there was a bit of variation between, say, the early 2000s when gen 6 was current and the late 2000s. A lot of early-2000s games ran at more like 640x480 up to around 1280x1024 in my experience, although I had to stick to the lower end of that if the games ran at all. In the late 2000s, I definitely remember targeting 30 in more demanding games. If you wanted to run Doom 3, or FEAR, or Crysis, you were often settling for 30 on more modest rigs. Still, we started seeing 60 FPS become commonplace in the late 2000s as we went into gen 7, and a huge appeal of PC gaming was that PC gamers were running games at higher frame rates, resolutions, and graphics settings than the consoles could put out. So we did eventually see that standard creeping up by gen 7. As for the consoles...
Gen 7
Microsoft Xbox 360- 720p-1080p @ 30-60 FPS
Sony Playstation 3- 720p-1080p @ 30-60 FPS
Nintendo Wii- 640x480 @ 30-60 FPS
Here we see the shift to HD, with the PS3 and 360 typically running at 720p/30 FPS as the standard. Some games, to my knowledge, actually ran a bit worse, think 576p or so in some cases, and frame rates would often chug under 30 in demanding scenes. Yeah, resolutions improved, but frame rates were still limited at the time.
And of course this is where Nintendo kinda went in the direction of underpowered consoles, where the Wii was more like....a GameCube with motion controls. It also ran at 30-60 FPS depending on the game. A lot of first-party titles were 60, but a lot of games ran at 30 and were lucky to run at all. And of course, the Wii generally had reduced visuals too.
PC gaming, as discussed above, also made the transition to higher resolutions. Although I largely stayed on a CRT for much of this era and ran stuff at a low resolution to maximize frame rates. Again, 30 was still acceptable, especially in heavier titles, but 60 was increasingly commonplace.
Gen 8
Microsoft Xbox One- 720p-1080p @ 30-60 FPS
Sony Playstation 4- 720p-1080p @ 30-60 FPS
Microsoft Xbox One X- 1080p @ 60 FPS to 4k @ 30-60 FPS
Sony Playstation 4 Pro- 1440p-4k @ 30-60 FPS
Nintendo Wii U- 720p-1080p @ 30-60 FPS
Nintendo Switch- 720p-1080p @ 30-60 FPS
So this is where things get relatively modern. 720p is now considered the minimum. Some games on the weaker base consoles would only manage 720p @ 30 FPS, although there were a lot of titles that ran at 30-60 FPS at like 900p and such from what I remember. The second half of the generation really pushed boundaries here. The "pro" versions of each console, released halfway through the generation, really upped the bar, making 1080p/60 FPS the standard, with higher resolutions possible.
PC gaming also raised the bar. I mean, I found 30-60 FPS acceptable in the first half of gen 8 on my dated rig, but 60 became increasingly the norm, with 30 FPS becoming straight-up unacceptable as time went on. This trend started in gen 7, I'd say, when 60 FPS started becoming more commonplace due to the higher level of hardware PC gamers tended to enjoy, but yeah, between the pro models of the consoles and PC hardware far exceeding console baselines by the second half of the generation, it eventually got to the point of "bro, you still game at 30 FPS? what's wrong with you?" Higher-end gamers would push for even higher frame rates and resolutions, but I typically stayed around 1080p/60 to maximize my builds' longevity.
And of course Nintendo stayed at gen 7 levels of fidelity since they started doing their "generation behind but it's cheap" thing.
Gen 9
Sony Playstation 5- 1080p-4k @ 60-120 FPS
Sony Playstation 5 Pro- 1440p-4k @ 60-120 FPS
Microsoft Xbox Series X- 1080p-4k @ 60-120 FPS
Microsoft Xbox Series S- 720p-1440p @ 30-120 FPS
Nintendo Switch 2- 720p-4k @ 30-60 FPS
Much like on PCs, we seem to be getting much higher resolutions and variable frame rates. While 1080p/60 FPS is often considered the baseline, we must not forget that a lot of games want to push boundaries, so some games will actually only run at 30 FPS. FSR is also often used to pad frame rates where you're not getting native resolution in games.
Still, on paper, the bar is rather high. The consoles were largely intended to produce high quality visuals at high frame rates, but some dip#### devs still find it acceptable to foist 30 FPS garbage on us. It's often considered unacceptable when they do.
And this is why ray tracing isn't acceptable even now. In the past few console generations, PCs were often a full generation ahead of the consoles in terms of frame rate and resolution standards. We saw gen 6 console standards implemented during gen 5, we saw gen 7 standards during gen 6, we saw gen 8 standards emerging during gen 7, and we were gaming at gen 9 standards during gen 8. Like, the consoles and the PCs would start at relative parity, but then in the second half of the generation the PC would leapfrog the consoles.
This, I think, is a huge problem with gaming, and why a lot of us PC gamers are complaining. When we see devs not optimize their games, or we see them push RT, it's like DUDE, NO ONE WANTS TO FRICKING GAME AT 30 FPS, WTF IS THIS. 30 FPS gaming hasn't been acceptable since the first half of gen 8. So like 2016ish.
And yeah, back in the past it made more sense to trade frame rate for visuals, like in gen 5 and gen 6, but by gen 7 it started getting a bit old, at least on the PC gaming side of things. It was still acceptable for console ports and more dated PC builds, but I'd say around 2015-2017 is when 30 FPS REALLY went out of style for good. When I upgraded to my i7 7700k and 1060, it was like "yeah no, no one wants to play at 30 FPS, 60 is the new standard." So yeah. I feel like devs are just kinda out of touch at times.
Even worse, hardware is becoming INSANELY expensive. Like, we're stuck at parity with consoles now, and if anything an equivalent PC is becoming more expensive than a console. This is why I look at $800-1000 consoles next gen with dread. How much am I gonna have to spend to get that baseline of performance? And if they're just gonna push 30 FPS, or make you get 60 with FSR or, even worse, frame gen or something, that's gonna be AWFUL. Like an actual REGRESSION.
I also don't think higher levels of visuals make as huge of a difference as they used to. You could argue the FPS regression from the 2D era of the Genesis/SNES to the PS1/N64 era was well worth it at the time, given how revolutionary that was, but ray tracing is just...lighting. And it's more about saving devs time on developing their own lighting systems than producing a visual benefit that's actually worth it to customers.
So yeah. That's the issue of our time, and why we're complaining. We feel like we're now struggling just to stay at the targets we've grown accustomed to, and again, 30 FPS hasn't been common or acceptable in over 10 years now. We live in an era where 1080p/60 FPS feels like it should be the minimum unless you're on an underpowered device like a Switch 2 or Xbox Series S. We don't WANNA game on less than that. And with the cost of hardware skyrocketing, we ESPECIALLY don't wanna see stuff push boundaries. It wasn't a big deal back in the day if a game like Doom 3, or FEAR, or Crysis pushed boundaries. Games typically ran acceptably on 4-year-old hardware (30 FPS at low settings back then), and when you'd upgrade, your new rig would be 4-8x stronger than your old one. Nowadays, it still costs $300 just to get a PS5-equivalent GPU, 6 years after launch. We should be gaming on 5090-tier hardware at that price by now, but we're not. The whole market is just broken.
And yeah, that's why I decided to do this thought experiment. How did we get from like 240p/20 FPS N64 games to what we consider acceptable today? Gradually, with the bar rising every generation. And many of us don't want to regress to lower frame rates and resolutions just to play games with maxed-out visuals. We just don't. It's not worth it. I wish devs would understand this sometimes when we see 30 FPS or sub-1080p resolution targets. That's unacceptable in 2026; go back and fix your fricking games. Because we can no longer rely on hardware enhancements just to power through.
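Just to put some rough numbers behind that jump, here's a quick back-of-envelope sketch in Python of how pixel throughput (resolution times frame rate) grows across the ballpark per-generation targets quoted above. The figures are the round numbers from this post, not measurements of any specific game:

```python
# Back-of-envelope pixel throughput = width * height * frames per second,
# using the ballpark per-generation targets quoted in the post
# (rough round numbers, not measurements of any specific game).
targets = {
    "Gen 5 (N64-ish, 320x240 @ 20)": (320, 240, 20),
    "Gen 6 (640x480 @ 30)":          (640, 480, 30),
    "Gen 7 (1280x720 @ 30)":         (1280, 720, 30),
    "Gen 8 (1920x1080 @ 60)":        (1920, 1080, 60),
    "Gen 9 (3840x2160 @ 60)":        (3840, 2160, 60),
}

baseline = 320 * 240 * 20  # the N64-era case

for name, (w, h, fps) in targets.items():
    throughput = w * h * fps  # pixels pushed per second
    print(f"{name}: {throughput / 1e6:.1f} Mpix/s "
          f"(~{throughput / baseline:.0f}x the N64 case)")
```

Even ignoring everything else that got more expensive per pixel (shading, geometry, and now RT), that's roughly a 300x increase in raw pixels per second between what we tolerated on the N64 and a 4K/60 target today. That's the bar we gradually raised, and the one nobody wants lowered.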