Thursday, January 22, 2026

The point of simplified narratives and their importance for political worldviews

So...as you guys know, I'm trying to write a book about my ideas. And in order to really make the book work, I kind of need to, you know, craft a worldview. So my first topic in the book is really establishing that worldview. I spend the first chapter discussing the ideological divide between myself and, say, the fundamentalist Christians on the right. I spend the second establishing a brief overview of history.

And my overview of history is based on Karl Widerquist's books, which discuss how income inequality, private property, and economic unfreedom came about. And the general gist is this. First, we had hunter-gatherer societies where everyone was equal and free. Then we shifted to farming, where over time, as societies grew, they became more complex, required leaders to run the whole thing smoothly, and those leaders became increasingly authoritarian and divorced from the people they served. This is how we essentially got monarchies, feudalism, and those kinds of authoritarian systems. We had strongmen who had this idea that they had a divine right to rule. They distributed property to their friends and allies. And then everyone else just kinda worked under them, paying tribute and taxes to the monarchy and the nobles. And out of that, we got capitalism, which mostly preserved the privileges of those who were wealthy under the old system, while functionally coercing the serf classes to work under "free market" principles. Land was privatized, people had to go get jobs, and that's how we got the system we have today.

It's a simple narrative. But is it too simple? Perhaps. A friend of mine recommended David Wengrow and David Graeber's book "The Dawn of Everything", which explicitly attempts to debunk these kinds of simplified "grand narratives" of history. It turns out that the narrative Widerquist put forward, which I parrot, tends to oversimplify to some degree (especially my own simplification, since I need to cut it very short to sum it up in just a few paragraphs). My friend said that my narrative was "wrong" according to this book. I took the criticism under advisement and left a note to myself to check into the book later when I came back to this topic, as I was writing on other topics at the time. Well, now the time has come to do that. And while I didn't want to read the book directly, as it's quite long and I really only dwell on the issue for a couple of pages, I did read summaries of it to get the gist of the main arguments, and read what others have said about it. One criticism really stands out to me, and I feel it's very relevant to my own book.

Basically, the guy's criticism is this. The book does point out that history is far more complex in practice, and that there are exceptions to every rule, but that doesn't mean the rule itself is necessarily wrong. If anything, a core weakness of the book (and I've heard this criticism from multiple places) is that while it attempts to tear down the existing narrative, it doesn't replace it with anything. It just points out that complexity exists. For example, we might have the narrative "there are four seasons: summer, fall, winter, and spring." And then some guy comes out of nowhere and is like "well ackshully, nuance exists, it doesn't snow in Arizona, so do they really experience winter? Some locations are always cold, like the poles, some are always hot, like the tropics, sometimes it rains in the summer, sometimes it doesn't snow in the winter, some places have the seasons reversed because they're in the southern hemisphere, some places like Southeast Asia might only have two seasons, a wet one and a dry one, etc."

And in the grand scheme of things, it's like "yeah, you're technically right, but this doesn't really disprove the main thesis. Complexity exists, but that doesn't mean the general theory is wrong or inaccurate, and many anthropologists understand that exceptions to the rule exist while still accepting the general rule." And that's kind of the core weakness of this book as I understand it. Yes, history is complex. It's hard to simplify history into a simple narrative. However, if you need to, you will end up with...well...the narrative that these guys are trying to debunk. That originally we had relatively little inequality and unfreedom, something happened to push society in a more authoritarian direction, and we deal with the consequences of that today as social systems have evolved in ways that protect the privileged classes at the expense of everyone else.

The person above even pointed out that this book could be used as fodder for the right, as the counterexamples could provide talking points for right-wingers to muddy the water, and that this could sabotage left-wing political movements. And thinking about my book, its context, and what I'm trying to accomplish, I tend to agree with that.

What I'm trying to do here is establish a basic worldview. A frame of reference for the world from which we draw our sense of reality. My own worldview is set up in opposition to the conservative Christian worldview, which bases its conception of reality on the Bible and tends to believe things like: the world always operated under the principles of private property and what became capitalism, this was established by God, and deviations from it are immoral. Given that 40% of the US, give or take, believes the world is 6,000 years old or so and that humans were created in their current form, this narrative has sway. And even among moderate Christians, you've still got people with one foot in that worldview and another foot in physical reality as we understand it, always reconciling religious doctrines with the real world.

I say no, we need a narrative to explain how we got to where we are, so we can look at this system properly. So...I attempt to draw history from hunter-gatherer societies through capitalism, but because my book isn't designed to focus on all of the details, and dwelling on the topic for too long would take away from the clarity of the book, I simplify. I point out that, "yeah, there are four seasons: summer, fall, winter, and spring", and I let the pieces fall from there.

Does it evade nuance? Yes. Is that bad? Not necessarily. The more time I spend on a topic, the more detail I can give, but the more detail I give, the more it bogs me down, distracting from the main topics I deviated from to focus on this one. The less detail I give, the more I oversimplify, but if I know I'm oversimplifying, is that really bad?

Well, again, if anything, the oversimplification is a necessity. The comment above was written by a leftist. Leftists are often aware of the same history I draw from, and use that general narrative to critique capitalism as just an evolution of a system that favors the wealthy at the expense of everyone else, and to call for its abolition. I don't quite call for its abolition, but for heavy reform, BUT...I deal with a lot of the same common history that these guys do, and my own analysis is parallel to leftism. I do emphasize different things than they do. They focus on the means of production and alienation; I focus more on the protestant work ethic and economic coercion. They think the solution is a new system. I think the solution is a new New Deal within the existing system. There are ideological differences.

But...again, imagine you try to lay out a narrative and someone comes around complicating everything with a lot of details. And the right is a lot like this. To build on the weather analogy, we see this a lot with climate change. "Oh, it's snowing in Chicago, guess it still gets cold in winter, checkmate librul!" Or on evolution: I was taught creationism in high school, and they focused on how various early transitional forms were found to be hoaxes, they emphasized microevolution over macroevolution (even though they're the same thing on different time scales), etc. And the point is, even if these small nuances exist, do they disprove the greater trends? I would argue no.

And it's the same thing here. I think, as an author, well, if I had to replace this narrative based on this new work, what would I get? And the answer is that I would get...nothing. Just an overcomplicated picture bogged down in details that lacks any form of clarity on problem definition. And it probably misses the point. And that can sabotage political movements, because left-wing political movements need a worldview to compete with the right's worldview. We need a narrative of how we got here. And while I admit we can't teach everything in a condensed format, we can at least teach the basics and general trends.

The problem with this book is that it complicates conventional narratives without replacing them. It just bogs down the reader in unnecessary details. And when you're trying to build a worldview, it paralyzes you. It leads to nihilism. It can unravel political movements, like the one I'm trying to build.

So...to respond to this book, I'll say this. I acknowledge that complexity exists in the real world. It is very difficult to explain everything that has happened in human history in only a few paragraphs or pages. And I don't want to dedicate entire books JUST to this topic. There are other people who have covered it, and I can assure you, rereading "The Prehistory of Private Property" lately, that stuff is addressed somewhat. Similar counterexamples are brought up, but they don't disprove the narrative Widerquist puts forward.

For me, I have to simplify though. I have to boil it down to "there are four seasons" without worrying about whether it snows in Arizona or whether India experiences them the same way the US does. Why? Because the point of this book is to build a political movement with a clear problem definition and solution, and quite frankly, the exact details of the past don't matter, as long as I have a narrative to get me to the more important part, which is capitalism. Because even if the ancient world was complex, well...in the end, capitalism won. It's the worldwide system now. The authoritarians won. They beat out these other systems through never-ending conquest and imposition of their systems on the whole world. And that's kind of the point of discussing things anyway. Ya know? I spend, like, maybe a couple pages discussing the distant past, and then the rest of the chapter I dedicate to discussing the history of capitalism and political movements within it. And that's what defines modern history anyway. All I really need to show is that hey, these social structures we practice today aren't natural. Things weren't always like this, AND WE CAN CHANGE THEM, AND SHOULD CHANGE THEM! Is that so hard? Apparently, to people who want to write 700-page books arguing "well ackshully, some group in the distant past defied the trend", it is, but for the rest of us looking to use the history of the past to educate people on CURRENT realities, it really isn't. And that's the point I'm trying to make here.

Monday, January 19, 2026

How bad can computing get? A look at the lowest end products I could find online

So...that article I wrote yesterday about how basic $300 laptops and $200 tablets can run a lot of modern stuff and are surprisingly usable had me wondering, well...what if I looked for the worst products available? How bad off would you be if you (or more likely, your mom or something) picked out the worst products in, like, Walmart? What would you be limited to playing?

I'm gonna be honest, I don't own any of the products I want to discuss, nor do I want to buy them, but I will ascertain, based on various metrics, including comparisons to products I do own, and benchmarks, how bad off you'll be if you end up with any of these.

 Windows laptop 

So, I decided to try Walmart for this one and see what the cheapest, worst laptops you can get are, and dear god, it's awful. I really forget how variable quality is at the low end sometimes. I normally filter out the junk and home in on the best specs for the money, but there's a lot of crap. You often see laptops next to other laptops 5-10x as strong for only a few dollars more. Obviously getting an i7 for $300 like my dad did isn't normal, I really searched for that deal and had a Micro Center within driving distance, but yeah. You should at least get an i3 or Ryzen 3 system with 8 GB RAM these days, I would say. But on Walmart's site, I find a lot of ewaste for like $150-200 with Celerons and Pentiums and Athlons in them. For reference, those are all low-end processors. Sometimes comically low-end with how bad they are. I mean, I'm not gonna poor-shame, I know how tough things can get, but for the most part there's no reason to buy the vast majority of this junk. It's ewaste. You can often buy something far better for not a lot more money.

Anyway, sifting through the garbage, my goal today is to find the MOST garbage laptop I can. It should be noted I'm limiting myself to Windows and its environment here. After all, the point is: what is the worst gaming machine I can find, and what can you play on it? There are a lot of awful Chromebooks too, with what look like really bad processors, but I wanna focus on Windows first.

Anyway, eventually I found this monstrosity for $126. It has a Celeron N4020 processor, a particularly old one (there were a lot of faster N4500s, N100s, N150s, and N200s), 4 GB RAM, 64 GB storage, and for an IGP, the N4020 has a UHD 600.

So first, let's focus on the processor. It's a dual-core, and it doesn't even have hyperthreading. I haven't seen CPUs that bad since, like, the early 2010s. Then again, anything below i3 level to me is just...terrible. I remember Far Cry 4 wouldn't even launch on a dual-core after a bunch of people were hyping some Pentium back in the day. And a lot of games from 2013 on just...thrash, with horrible 1% lows framerate-wise, if they're run on fewer than 4 cores. To compare it to my old 2011 laptop, it is at least better than that, and as I said, that thing felt like an Xbox 360 in practice, so maybe it could play games from before 2013 well?

As for the 4 GB of RAM, here's the thing. New OSes don't do well with 4 GB RAM. My old laptop from 2011 is virtually unusable on Windows 10, because Windows 10 liked to use 6 GB on my desktop while doing nothing at all. On 4 GB, my old laptop lagged and struggled to open anything. It was a decent experience back on 7, though, but yeah, 4 GB RAM hasn't been good since, again, before 2013 or so. After that, I'd say 8 GB is the bare minimum, and opening Task Manager on my desktop, I'm currently using 8 GB RAM just sitting here with a few game launchers, Discord, and a Firefox window open with 4 tabs (mostly this blog and other links related to what I'm talking about).
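If you want to do the same eyeball-the-RAM check without opening Task Manager, here's a minimal sketch. It's Linux-only (it reads /proc/meminfo, since the Python standard library has no cross-platform memory API); on Windows you'd just use Task Manager as described above.

```python
# Quick RAM check, analogous to glancing at Task Manager's memory tab.
# Linux-only sketch: parses /proc/meminfo, where values are reported in kB.

def read_meminfo(path="/proc/meminfo"):
    """Return /proc/meminfo fields as a dict of {name: value_in_kB}."""
    info = {}
    with open(path) as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key.strip()] = int(rest.split()[0])  # first token is the kB value
    return info

if __name__ == "__main__":
    mem = read_meminfo()
    total_gb = mem["MemTotal"] / (1024 ** 2)
    avail_gb = mem["MemAvailable"] / (1024 ** 2)
    print(f"Total: {total_gb:.1f} GB, available: {avail_gb:.1f} GB, "
          f"in use: {total_gb - avail_gb:.1f} GB")
```

On a 4 GB machine running a modern OS, the "in use" figure sits uncomfortably close to the total even at idle, which is exactly the problem described above.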

GPU-wise, it has the Intel UHD 600, which is probably slower than the 630. Comparing the two, it's like 30% as powerful. Now keep in mind, the 630 was 8800 GT territory and isn't too, too bad. I mean, it's very dated by modern standards, but again, going back to pre-2013 or so, it's sufficient for that. And from the few benchmarks available, it's comparable to my old laptop's chip.

Now, keep in mind, I own a laptop like this. I bought it 15 years ago, when it was a relatively midrange laptop that I got on sale at a good price. It had remarkable abilities for the time. It could even run BF3, which was brand new at the time. I ran it at 800x480 at 30 FPS, but hey, beggars can't be choosers; it was a marvel to see it run at all.

However, I mostly stopped using it after around 2015ish. It got too slow. It couldn't run any new games after 2012-2013 at all well. And after W10 pushed itself on everyone, it became so slow I hated just turning it on. After that, I just got into tablets and mobile devices instead. I kinda gave up on the idea of a budget "gaming laptop" because of how poorly tech like this ages (also why I haven't been keen to get a Steam Deck; it's very comparable to what this laptop was at the time and I expected it to age the same way. It has admittedly aged a bit better, but not a ton better, so...yeah).

But yeah. Would I recommend buying something like this in 2025? No. It's a horrible experience. If it's anything like that 2011 laptop, it'll take 5 minutes to boot, be unresponsive when you click on anything, and honestly, again, you're gonna be stuck with at most PS3/360 ports. Looking it up online, here's a playlist of YT videos with this CPU/GPU combo, and it's...worse than my old 2011 laptop was. Literally, looking at how it handles stuff like BF3, COD4, and stuff like that, I can attest that my AMD laptop from 2011 is faster than this at gaming.

Why would it be worse than the AMD one? If I had to guess, drivers. AMD has made GPUs for decades now, whereas before Intel Arc, all Intel had were IGPs, and despite making excellent CPUs, they often made bad GPUs with bad drivers. I know in 2013 I recommended an Intel HD 4000 laptop to a friend (a comparable experience to this), and he had trouble running some games I could run simply because Intel drivers sucked. Yeah, at this level, I'd highly recommend buying AMD over Intel when possible, just to get a better IGP. Normally with low-end laptops like this, the IGP is the big bottleneck with gaming, as the rest of the specs are at least decent, but Intel IGPs really fall on their face sometimes, which is why I'm kinda biased against them for GPU/IGP-related purchases for gaming, even if they offer excellent products otherwise for the most part.

AMD CPUs have historically been fairly mid (although they're decent now), but again, when buying a laptop like this, it doesn't matter. Sometimes it's better to get a slightly weaker CPU to get good IGP performance. Again, what's gonna bottleneck your system is mostly your GPU.

And RAM...4 GB of RAM is awful, and I know it held me back on my 2011 laptop at times, but yeah.

Just...don't buy products like this. Again, always go for at least an Intel i3 or Ryzen 3 with 8 GB RAM these days if you want something that's not COMPLETELY awful. Again, at budget prices, you've got a lot of options, although it's better to spring for something for like $300 and not end up with ewaste. If I were to buy something cheap from Walmart right now, just looking at the options, spend a bit more and go for something like this or this. Something around the $300 mark that isn't awful. Which would I go for between those two?

Well, Intel-wise, the CPU is a little better. AMD-wise, the GPU is better, though, and keep in mind what I said about drivers. AMD ain't the best at supporting their products long-term, but they tend to be more functional for gaming in practice. Either way, the Intel one is cheaper, so the two are appropriately priced. I'd probably say the AMD one is worth the extra $60 for better and more consistent GPU performance, though.

What would these machines be able to do?

Well, not a ton, see how it falls on its face in these games, but many of them are rather demanding AAA games.  

You're probably better off spending an extra hundred on a Steam Deck if you're serious about budget gaming, and forgoing the budget laptop experience altogether. But if you have to have a laptop, well, this one is gonna destroy the chip from the cheap one. Notice how you're paying around 2-3x the price and getting 6x the performance? Low end be like that. Avoid the low end.
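That "2-3x the price, 6x the performance" arithmetic is really a performance-per-dollar comparison. Here's a toy sketch of it; the benchmark scores and the $126/$300 price points below are hypothetical placeholders just to show the math, not real benchmark data for these machines.

```python
# Toy value-for-money comparison to illustrate the "2-3x the price,
# ~6x the performance" point. Scores are made-up placeholders.

def perf_per_dollar(score, price):
    """Benchmark points per dollar spent."""
    return score / price

cheap = {"name": "ewaste special", "score": 1000, "price": 126}
decent = {"name": "$300-class laptop", "score": 6000, "price": 300}

for laptop in (cheap, decent):
    value = perf_per_dollar(laptop["score"], laptop["price"])
    print(f"{laptop['name']}: {value:.1f} points per dollar")
```

With these placeholder numbers, ~2.4x the price buys ~6x the performance, so the pricier machine delivers roughly 2.5x the performance per dollar. That's the general shape of the low end: the cheapest option is usually the worst value, not just the worst machine.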

Really, I think the best lesson from all of this is "just get a steam deck, bro...".

Either way, the worst of the worst can at least get you roughly Xbox 360-level performance, with a more modern laptop being...not all that terrible. I mean, it's not great, but it's not terrible...

Anyway, this is why I just switched to mobile devices.

Speaking of which, what about Chromebooks?

Chromebooks

ChromeOS seems a lot more limited gaming-wise. It's not designed for gaming, although it's apparently able to do Android gaming. Chromebooks are cheap and affordable, and while probably better than an ewaste laptop for general use, they're probably less capable of gaming. Apparently PC games are available through a compatibility layer, although given the low-end nature of these systems, they probably won't run well.

In terms of specs, at the low end you get the same host of Celeron processors that low-end Windows laptops come with, and I'd expect game compatibility to be worse than in a normal Windows environment. However, I did manage to find a Chromebook with an even worse CPU than in the Windows ones, bringing us to new lows in performance.

So this thing has an N3350 processor, which is this much worse than the N4020 of the Windows one. Graphically, they perform about the same, with the HD 500 in the Chromebook being worse than the HD 600, but not that much worse. Still, you can't afford to lose power at this level.

Now, to be fair, I know very little about Chromebook environments, so on actual gaming performance, I'd prefer to let videos do the talking. However, looking at videos, I can't find this specific spec configuration, with videos about bad Chromebooks still covering better specs than this, and being limited on storage anyway. Still, just to compare it to, say, my Samsung S6 Lite on the Android side, it looks comparable, although CPU-wise it seems worse than the Samsung Exynos chip. Given the CPU was my biggest bottleneck in games like COD Mobile, that doesn't bode well.

It also barely has any storage at all. So...to answer the question of what it would take for me to do browser gaming rather than actual PC gaming? Probably this thing. I don't even think I would do mobile gaming on this. Like, again, I wouldn't even have enough storage for games with its pathetic 16 GB of storage. I haven't used a tablet with 16 GB of storage since my original MeMO Pad 7 from 2014, and that can't run decent mobile games these days. Heck, I haven't used a PC with 16 GB of storage since my original 90s-era desktop, which died in 2002.

So yeah. We're scraping the bottom of the barrel here. 

As far as what a good Chromebook looks like, well, it follows the pricing of Windows. You can even get, like, Intel i3 ones, although at that point I'd just want Windows for gaming anyway. Still, say I settled on a middle ground and went for this: $180 for a CPU that's 2.5x as strong and a GPU that's 4x as strong. I could probably, like, at least do SOMETHING on that. For PC gaming, it can at least handle the basics in a Windows environment. Not amazing, but like...better than the crap we've been talking about.

Apparently, on the Android side, it won't even let this guy download games, so...yeah. Chromebooks seem to be a special level of gamer hell. If you're a parent who wants to get a system where your kid can't play any games and you wanna make them miserable (or just focus on school), get a Chromebook. Apparently the things are useless. Not saying really dedicated people can't find ways to jailbreak the things, but yeah.

With that said, let's focus on android tablets now.

Android tablets

This is the space I retreated to for low-end and portable gaming. I don't do the Steam Deck, as it's kinda expensive and not very portable or good for what it does for the price (although better than laptops). I don't do laptops because they're even worse, and often even less powerful. Chromebooks are hell. But Android? Well, Android has the Google Play store, games meant for cheap and lower-end devices (including 3D games), and yeah, it's the best bang for your buck at the lower end. Even something like a $200 tablet like a Samsung A9+, which I own, is pretty capable of running most stuff I'd wanna run. And if you wanna invest in an actual gaming handheld for $200, you've got a lot of options.

Still, given how powerful my older Samsung S6 Lite is, and how it competes with the above Celerons, I suspect we'll approach new levels of low here today. On the one hand, gaming on Android is like gaming on a curve; you can go a lot further given how undemanding the OS is and how the games are designed for this hardware level. But on the other hand...ewaste exists on Android too.

The problem when I look at cheap Android tablets is that the market is flooded with machines that don't have their specs published honestly. There are a lot of tablets with chips that aren't even named, and RAM amounts that include a page file off of your drive, so you're getting WAY less RAM than you're supposed to. As such, it's hard to properly evaluate them, especially since I don't own them, and I ain't buying ewaste just to make this article.

Still, there are a few for under $60 which have 2 GB RAM and an unnamed quad-core processor, probably an Allwinner model. I would not recommend buying these, but again, I'm trying to find the worst tablets I can.

Looking up Allwinners on Google, I get:

Allwinner quad-core processors are cost-effective System-on-Chips (SoCs) featuring four ARM CPU cores (often Cortex-A7 or A53) and integrated GPUs, designed for entry-level to mid-range devices like Android boxes, tablets, IoT gadgets, and smart displays, offering capabilities like 4K video decoding (H.265/HEVC) and supporting various connectivity options at low price points. Key examples include the A33 (Cortex-A7), H3 (Cortex-A7), A64 (Cortex-A53), and H618 (Cortex-A53), powering budget-friendly hardware with features like 4K video and Android/Linux support. 
Key Characteristics:
  • CPU Cores: Primarily ARM Cortex-A7 (older, more power-efficient) or Cortex-A53 (newer, better performance).
  • GPU: Often integrates Mali-400 MP2 or Mali-G31 MP2 for graphics.
  • Video Support: Hardware decoding for H.265/HEVC (4K @ 30fps) and H.264 (1080p @ 60fps) is common.
  • Target Markets: Entry-level tablets, smart home devices, OTT/Android TV boxes, automotive systems, and single-board computers (SBCs).
  • Cost-Effectiveness: A major selling point, making quad-core performance accessible in budget devices. 

Basically, they're junk. I know the Mali-400 and the Mali-G31 are really outdated chips I saw commonly in budget tablets like...a decade ago. I mean, just comparing it to my Samsung S6 Lite from 2020, we're talking 5% of the GPU power, with it refusing to run a lot of benchmarks at all. Don't buy devices like this, guys. I mean, at this point, you're gonna have a lot more problems than not running games; these devices won't run much of anything at all. They will probably lag doing anything. Don't buy these kinds of devices.

Conclusion

So yeah, I'll be blunt. Don't buy the worst electronics you can. While you can occasionally get good deals at the low end if you know how to shop and know what you're looking for spec-wise, for the most part, the actual bottom end of the market is an endless void of ewaste. You'll get something that not only won't game, it won't even run smoothly out of the box for the most part. It will be a terrible and frustrating experience. You probably won't even be able to run those browser games that I bashed.

Still, if you know what to look for, and are maybe willing to spend a bit more, you should be able to find something decent.

With laptops, I'd spend at least $250-300 for Windows, and aim for at least an i3/R3 with 8 GB RAM. Anything with 4 GB RAM or a Celeron/Pentium/Athlon processor is more or less certified ewaste at this point in my view. If you really know what you're doing and are really patient with sales, you might even score something decent for $300-400, like an old i5/i7/e5/r7. You probably won't see laptops with dedicated GPUs until you hit like $500 on sale, and those go quickly, but yeah.

Honestly, I'd probably just say buy a Steam Deck, but I understand people would rather buy a device for, like, school or office work and other non-gaming tasks too.

With Chromebooks, just...don't. They're not meant for gaming and seem to be designed by sadists (or for parents/school administrators) who don't want you to game on the things at all. While they should run stuff off of Google Play, a lot of the time they won't be compatible with actual games, and they have limited storage. They make ideal work/school machines, though, due to their low cost and locked-down nature (as in, they're made for people who DON'T want you to game on them).

With Android tablets...the sub-$100 market is a black pit of awfulness. Just avoid it. Still, in the $100s, at least, you should be able to score, like, a 4 GB Samsung A9+ or a cheap Lenovo tab based on my recent research, and near $200, you can get the better 8 GB version of the A9+. Still, looking now, it looks like the A11+ is out and is a bit better, but also more expensive ($220-250 for the 6 GB model). Yeah, I'd buy an A9+ on sale or an A11+, honestly. Just avoid the junk of the low-end market.

Assuming you shop smartly and get something that's...not terrible, a world of gaming is open to you. A Samsung tab should run most apps on the Play store, although it might not play the most demanding ones. And again, with PCs, something capable of running games at least through the midpoint of gen 8 should be doable, and if you take my "just get a Steam Deck" advice, pretty much all of gen 8 minus multiplayer games, as well as early/lighter gen 9 titles.

Just...for the love of god, avoid Chromebooks. I'll have to research them more, but again, they're literally made for people who want cheap work machines that are extremely barebones and locked down on gaming. So yeah, I guess at the extreme low end, gaming hell still exists. Which is the point of this. Just...how bad can it get? Now we know. Pretty fricking bad. Shop smartly, people.

Yes, we should embrace a life of hedonism

So...I had a debate with someone who had an interesting take on some subjects up my alley, although I did have one significant difference from the guy: he seemed to value meritocracy for its own sake, and I don't.

I find meritocracy to be rooted in structural functionalism. We need it to motivate people to work. But if the work can be done without meritocracy, then meritocracy loses its meaning. Even more so, praising meritocracy for its own sake leads to the sad state of affairs we have today, where we create jobs for their own sake rather than allowing people to work less as automation makes us ever more efficient. I would argue we should change our social structures somewhat to account for technological advances, while believers in meritocracy just want more jobs for their own sake.

I saw a meme recently about how we live in an era where one robot can do the work of hundreds of people, and then it had some ancient philosopher like "I bet you just sit around eating figs all day and having orgies...right?....right?" And yeah, it was the Anakin/Padme one.

We don't live in that reality. And I have my own ideas on why. It's because we're governed by the morality of protestant ascetics who thought enjoying life was sinful. Eating figs? That's gluttony! Orgies? That's lust! Not working? That's sloth! SIN! SIN! SIN! SIN EVERYWHERE! 

As if we shouldn't be enjoying life. But that's basically the logic of protestantism. Enjoying life is sinful, so we shouldn't do it. We're sinful, we gotta atone to God for our sins, and this involves living a frugal life where we don't enjoy things and we just work all the time.

And that's the ethic that dominates modern life. 

As a humanist, I have to think that somewhere along the way, we really got our priorities wrong. Maybe we should sit around all day eating figs and having orgies. Well...okay, not literally. We have far more diverse diets than figs. Of course, that's why we Americans are so fat. And maybe we shouldn't literally have orgies. I mean, that much unprotected sex with so many partners can cause complications like STD epidemics and unwanted pregnancies, although we do have ways to mitigate SOME consequences of that. And it could destroy romantic relationships. Some think the ubiquity of porn is causing issues with social relationships, for example. I don't advocate for banning it, but I'm not gonna recommend people just goon 24/7 either.

But what else can we do in a post-work world? Well, I just got done talking about the ubiquity of gaming. Star Trek had the holodeck; they even talked about holodeck addiction, something I'd arguably have if we lived in that universe. But you know what? If that's what people wanna do with their lives, that's A-okay in my book. I'm for freedom. And that's the thing: I'm for freedom from forced obligations like work, but also freedom to enjoy life. The pursuit of happiness, as it's called. We should be free to do what makes us happy. And if that involves consumption of food, video games, or certain NSFW materials between consenting partners, go for it, I don't care. As long as you aren't harming anyone, enjoy life. And people like other things too. We live in a consumerist culture. Some like to play sports, watch movies, travel, go to restaurants and amusement parks, and live the good life, and they should be able to. We seem to have nothing against hedonism when it increases someone's wealth. We just have this really nonsensical work ethic to go along with it.

And don't get me wrong, to some degree, consumption increases work demands. And that's what makes consumerism and modern capitalism work. People consume, consumption increases demand, demand creates jobs, blah blah blah. But for me, it's a balance. If you have expensive and labor-intensive tastes, you might have to work harder yourself for more money to enjoy them. But at the same time, if we live in a world where we can supply everyone's basic needs with a fraction of the labor we used to need, AND we also live in a world of cheap and plentiful entertainment, why shouldn't basics be cheap or free? Why shouldn't we give people money? And why shouldn't people with less labor-intensive or demanding tastes be able to live well while working very little?

We live in a world dominated by the protestant work ethic, but in my view, we should embrace a more hedonistic ethic. There's nothing wrong, in my view, with not wanting to work, especially in modern society. I mean, what's the point of all of this progress if we still have to work like a medieval peasant just to survive? All that protestant work ethic crap doesn't have its sociological justification in functionalism, I'd argue; it has its justification in conflict theory. We work to make rich people money while we live frugal lives, always struggling to survive. And that's where we are. We act like creating jobs is such a great thing, when I just see it as subjecting people to work to justify giving them a paycheck, to satisfy an ethic rooted in the logic of a long since bygone age at best. And at worst, we're literally just slaves of rich people. 

So yeah. That's my argument. There's nothing wrong with enjoying life. We SHOULD enjoy life. We should enjoy life more, and we should work less. Sorry, not sorry. F protestantism and its anti-fun ethics.  

Sunday, January 18, 2026

The reverse gilded age of gaming

So...I've been thinking about that last article I wrote about the accessibility of gaming, and I've come to realize gaming is rather dualistic in nature. On the one hand, gaming sucks. Modern gaming is expensive, it's predatory AF, and hardware prices are going up and up and up. On the other hand, gaming is more accessible than ever. Your basic $300 computer has as much graphical power as the gaming PC I got when I graduated college in 2010, and can play virtually any older game up through 2015, plus many newer low-requirement games. A $400 Steam Deck can run most single-player games up through 2022, although it may have issues with multiplayer due to Linux compatibility. A $200 tablet, smartphone, or Android gaming handheld can run entire generations of past games. Everyone has more access to games than I had for most of my life. They just can't run the latest and greatest. And given the state of AAA gaming....are they really missing much if they miss the modern stuff? Arguably not really. 

So we have this dualistic nature of modern gaming. We have widespread access to hardware capable of playing most of the classics and even many more modern lower-end games, and we have a crisis with AAA gaming, affordability, and the rising cost of PC parts. The high end of the market for the latest and greatest is turning hellish, but the past is more accessible than ever before, and people have access to decades of older games on modern hardware.

And this is kinda why, as I contemplate my choices going forward for gaming, I might just opt out of the latest and greatest. If PC gaming and console gaming are becoming so unaffordable, I might just opt out. However, that doesn't mean I'll stop gaming; it just means I'll shift to sticking with older and lower-requirement games. A lot of my reasoning for wanting access to the latest and greatest is not wanting to be left behind. I know what it's like to have a crappy PC that can't run anything. But in the modern era, your typical low-end PC is at least half as powerful as a PS4, and closing in on the base Xbox One in power. The Steam Deck and the like have literal PS4-era power. Even smartphones are arguably at least as powerful as a PS3 or Xbox 360 these days. We are, in a way, living in a golden age. It's just gilded because of all of the crap.

If anything, it's a reverse gilded age. It's a golden age, but all the crap is on the outside, where we look at modern gaming and see nothing but high system requirements, mediocre experiences, and things in decline. But at the same time, the classics are more accessible than ever. The greats go on sale all the time for just a few dollars on Steam, and they can be run on just about any computer. You wanna play PS3/360-era titles? Just buy a $300 craptop and pay anywhere from $2.50-$10 per game, and you've got some of the best games ever made. Play the wealth of F2P games that would wow past me. Play Doom Eternal and Cyberpunk 2077 on a fricking handheld. The possibilities are endless.

Sometimes I focus so much on how bad the current thing is that I forget to talk about the other side of that. Past experiences are more accessible than ever and can be run on just about anything. In a way, it's a reverse gilded age. Crappy on the outside, golden on the inside.  

Why am I talking more about video games than politics these days?

So I've noticed that since the new year, the vast majority of my posts are about gaming and computers, not politics. The answer should be pretty consistent with the vision this blog has always had. This is a passion project, I'm guided by my passions, and I post about things that interest me. 

I could post about the crimes the Trump regime is engaging in, and how the world feels like it's on fire, but ya know, bad things are bad. And I hate writing posts that I feel just state the obvious. Obviously, ICE shooting people in the face is bad. And throwing flashbangs into cars full of children is bad. And blinding people because you shoot them in the face with "nonlethal" rounds is bad. And the regime naziposting on social media is bad. And toying with the idea of suspending elections is bad. And invading Venezuela and just taking it over is bad. And threatening to do the same to Greenland, a territory of Denmark, a NATO ally, is bad. 

Obviously, everything right now in the realm of politics is bad. Obviously, everything that is happening right now has eerie similarities to the third reich. Like literally. From mass deportations to the violation of rights, state violence against citizens, the militarization of police, and invading and threatening to invade countries, it's all bad. America under the Trump administration is bad. We are the bad guys now. And it should go without saying. Hence why I just talked about 6 things in one article. I have little to nothing else to say on these subjects. Yes, it's all bad, yes, it's all terrible. I don't know what else to say on it. Things are bad.

And again, I don't like just saying this is bad, that's bad, that's bad, because it should be basic common sense for anyone who agrees with at least the most basic values that I hold. 

Anyway, Christmas happened, I got new games, and we also got a PC affordability crisis due to AI and that nonsense, so that's been more interesting. Hence why I've been posting on that.  

What would it take for me to ever play browser games seriously ever again?

Okay, so I had an interesting question posed to me by a friend looking to design a game on itch.io, asking what kind of game I'd like to see on there. My response was....nothing, because I'd never wanna play games from there. I consider browser games to be low-quality games, and I haven't played them since college. I could see people with no means to play anything better playing them, but when I think about what's available these days, I literally can't think of a scenario where I'd ever wanna play such games.

Let's talk about this from my perspective:

My history: 35 years of gaming

So I'm in my late 30s, and I've been gaming since around....1991....1992...ish. I was like 4 when I started. I started with the Game Gear and Sega Genesis, and I've been a pretty avid gamer for most of my life. As I see it, there are so many good games I've played over the course of my life that at this point, I just can't see browser gaming as serious gaming. The closest we ever got to that was Quake Live back around 2010, when id Software made Quake 3 Arena a literal browser game people could play online. 

I will admit I did play some browser games in college, like Bloons TD and the like, but quite frankly, the big appeal of those was that I could play them on school computers, which were very locked down. I couldn't install anything on them, and the only alternative to browser games was sticking old games on a flash drive and playing those, which I did. I literally played the likes of Quake 2 and Unreal Tournament 99 on school computers off of a flash drive. All you needed to do was stick the game files on the drive, plug it into the computer, and start the exe. Of course, the number of games that could do this was limited, but even then I found a way to have alternatives to flash games.
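The reason this trick worked is that games of that era were fully self-contained: no installer, no registry entries, just one folder with the .exe and all its data in it, so "installing" was literally a file copy. Here's a minimal sketch of the idea in Python (all folder names below are made-up stand-ins for illustration, not a real game install):

```python
# Old games like Quake 2 and UT99 lived entirely in one folder, so the whole
# "install" was copying that folder onto the flash drive.
# These paths are placeholder stand-ins created just for this demo.
import shutil
from pathlib import Path

home_copy = Path("demo_home/games/quake2")  # stand-in for the home PC's game folder
usb_drive = Path("demo_usb")                # stand-in for the mounted flash drive

home_copy.mkdir(parents=True, exist_ok=True)
(home_copy / "quake2.exe").write_text("placeholder for the real executable")
usb_drive.mkdir(parents=True, exist_ok=True)

# The entire "install": copy the self-contained folder onto the drive...
shutil.copytree(home_copy, usb_drive / "quake2", dirs_exist_ok=True)

# ...then, on the locked-down school PC, you'd just double-click the exe
# straight off the drive. No admin rights or installer needed.
print((usb_drive / "quake2" / "quake2.exe").exists())  # True
```

In practice you'd do the copy by drag-and-drop in Explorer; the point is just that the entire game lives in one portable folder, which is why no install permissions were needed.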

But yeah, flash games were popular at the time because...well...we didn't quite have phones as we currently have them yet; it was the late 2000s, and smartphones were in their infancy and not widespread. Flash games could be run on low-end computers with integrated graphics, and trust me, IGPs were so bad back in the day that many of them couldn't run anything demanding. My home computer could barely run UT99 and Quake 3 at the time, although I eventually fixed it up by adding more RAM and a dedicated GPU. Then I could run stuff up to around 2005 decently. And laptops at the time had horrible Intel integrated graphics that were a good 5 years behind the curve. So yeah, options were limited. You could run like 90s games on that stuff, maybe early 2000s (and I'm not counting emulation), but that's about it. You had options, but you kinda ran out fast. And again, the school computers were so locked down your options were truly limited.

Even then, again, I had my ways. My home computer with its IGP could still run very low-requirement games and had about as much power as, I'd say...a Sega Dreamcast. And I could stick a few games on a 1 GB flash drive and play them at school too. But still, browser games greatly expanded my options.

But hey, it was the 2000s, and if you were on bad hardware back then, that's what you had access to. So...flash games became quite popular. But in the year 2026...why would ANYONE wanna play browser games?

Let's go through the various tech devices I have and discuss them, working backwards. 

Main gaming PC: i9 12900k, 32 GB DDR5, RX 6650 XT

Yeah, I got those 32 GB of DDR5, I rich. Actually, I paid about as much for my entire CPU/RAM/mobo combo as the RAM alone currently costs. Yeah. The joys of upgrading from Microcenter in 2023. But yeah, this can run ANYTHING currently. The most demanding game, it'll run it. You might be running on low with upscaling on in the most demanding of situations (looking at you, Doom: The Dark Ages), but yeah, they'll run, and that's all that matters. And I have so many games to play I'd never consider gaming on anything else. 

Previous PC: i7 7700k, 16 GB DDR4, GTX 1060

So say my PC died and I had to go back to the previous PC. In reality, only a part would die, but still, I would need to go back to some combination of the prior parts, which I still own and keep as backups. What do I lose? Well, I can play anything up to 2022 well enough. 2023 is when "next gen" requirements really went through the roof, so I upgraded my GPU in 2022 and my CPU/RAM in 2023. Going back, my options would be more limited. BF6 might not run well, and neither would COD BO7, but I'd expect them to still run in some form. Anything with ray tracing is a no-go. VRAM could also kill me here. But still, would I be playing browser games? Nah, I'd be flipping back to slightly older PC games, or running some current games with lower settings and frame rates. They would be borderline, but they would run.

No GPU: i9 12900k 32 GB DDR5, UHD 770

Say my GPU died and I had to either go back to the HD 5850 sitting in my attic or use the integrated UHD 770. I'd probably opt for the latter at this point, given the 5850 hasn't had a driver update in over a decade now and has 1 GB of VRAM, while the UHD 770 is still reasonably powerful. What could I run? Apparently quite a lot. Again, it's about as powerful as my old HD 5850. I might have to return to early-to-mid 2010s games, but that still gives me A LOT to play. I might not be playing BF6, but BF4? Maybe. The fact is, I'd still have every game available to me up through 2015 or so. Which is...quite a lot. Why would I ever play browser games?

I'm not even touching emulation, but yeah, it looks like everything up through PS2/GameCube would be playable if you're into that sort of thing too.

That covers around 25 years of my ~35 years of gaming. Sure, you lose the modern stuff, but we've got so many eras at this point that I still have zero reason to ever touch a browser game.  

Dad's laptop: i7 1265u, 8 GB DDR5, Iris XE graphics (but not really)

So I got this one for my dad a few Christmases ago, as he needed to replace his aging laptop, and I got this bad boy for like $300 from Microcenter. So it's a pretty budget laptop. It said it came with Iris Xe graphics, but I didn't read the fine print and found out that it only gets its full graphical abilities with dual-channel RAM, which this does not have. Still, it maintains about 2/3 of the power without it, having 64 EUs available instead of 96. 

I never gamed on it, but I estimate its capabilities as similar to the above, where it could probably fluidly run anything up to 2015 or so, plus maybe some newer low-requirement games. Looking up the Xe graphics, this seems to be what it's capable of. So yeah, it still handles games up to 2015 or so fluidly, giving me no reason to play browser games.

i7 7700k, 16 GB DDR4, UHD 630

What about the integrated graphics on my old desktop? I never used it, given I had the HD 5850 as a backup, which was on par with the newer UHD 770, while this one was half as powerful. It was on par with the venerated 8800 GT, the high-end GPU everyone coveted when I was in college, and to my knowledge it could run anything up through 2013 well enough. Looking online, it still seemed capable enough to run games like Apex Legends, Overwatch, and GTA 5 at playable framerates. And emulation-wise, you still get a pretty decent experience, it seems. So there you go, it can still game at a standard far higher than the baseline that would be needed for me to seriously play browser games. 

A6 3400m, 4 GB DDR3, HD 6520g

I got this laptop back in 2011. It's now 15 years old. It cost around $380 at the time. It had an IGP about as powerful as what was in the 360, and it could run virtually all 360/PS3-era PC ports without issue. It's very dated today. But I literally used it for years as a "gaming laptop," kinda fulfilling a role similar to the one the Steam Deck does now. I can't run anything past 2012ish on it well, but hey, that still gives me a huge backlog of games from the "good old days" of gaming. And yeah, even this was powerful enough for me to do "real gaming" by college standards; if I had this laptop in college, it would've been on par with some of the gaming laptops my friends had at the time. It could even run Crysis 2! That's how dated the "can it run Crysis" meme now is. Any of the modern integrated chips are on par with the high-end chips that ran Crysis back then, and even an old AMD laptop from 2011 could run Crysis. Obviously, we're talking low settings at 1024x600, but it ran. One day, Crysis will be the "Doom" of "can I get this to run on a toaster oven?" and we are heading toward a world of "yes, yes you can." 

So yeah, any modern computing device can run a wealth of games these days. Every modern computing device is a "gaming PC" by like 2009 standards. And that pretty much rules out browser games on any of them for me. 

But let's go further, and look at various mobile devices throughout the years too. 

Razer Edge (Snapdragon G3X Gen 1 (aka, a 888+), 6 GB RAM, Adreno 660)

I picked up this handheld for $200 last year. It tears through almost any mobile game at some settings, barring Warzone Mobile due to its arcane requirements, and it's also an emulation powerhouse. It has access to so many games that it's ridiculous, and I'm never bored. I can play COD Mobile on it, Delta Force (a BF rip-off), PUBG Mobile, Destiny Rising, and yeah, even with mobile games, I've got access to so many games, and not just games but HIGH QUALITY games that put, say, Combat Arms, which I used to play a lot in college with friends, to shame. Truly, we live in a golden age of gaming, at least if we avoid the dumpster fire that is AAA gaming. We have so many options, and again, I'm kinda focusing on what LOW END devices can do, with smartphones and tablets being relatively "low end" by default. 

Samsung Galaxy Tab A9+ (Snapdragon 695, 8 GB RAM, Adreno 619)

I got this one for Christmas, and while nowhere near as powerful as the Razer Edge, it holds up quite well. It can play most of the same games at lower settings, and can even emulate most of the same games reasonably well. I got the 8 GB version for $180 on sale, so yeah, this is what you get at the low-end budget level, and this is my daily driver.

Samsung S6 Lite 2020 (Exynos 9611, 4 GB RAM, Mali G72 MP3)

This could also run COD Mobile fairly well. Same with PUBG. Same with most Google Play Store slop. More limited options for emulation, but it seems to handle up to N64 and Dreamcast fairly well. It kinda fell out of favor for me as a gaming device over time, but still, it was fairly capable.

 Huawei Mediapad M3 (Kirin 950, 4 GB RAM, Mali T880 MP4)

It had similar performance to the S6 Lite. The only reason I bought the S6 Lite was that the M3's battery aged badly, and by the 3-year mark I was getting 2-4 hours of battery life, with the thing shutting off on its own at battery levels as high as 50-60%. But power-wise? It was almost as powerful as the Samsung tablet and equally capable, barring OS limitations. 

Asus Memopad 7 (Some intel chip, 1 GB RAM, intel IGP)

This was my first dip into Android gaming, and it sucked. I wish I'd gotten the Nexus 7 instead. It lagged doing ANYTHING. Still, I could run classic Sonic games, Doom, Quake 1-3, and even Crazy Taxi on the thing, as well as a lot of casual games of the time that were borderline equivalents of browser games. So yeah, that's how far back you kinda need to go for me to wanna play a browser game. Even then, I had options. 

Which brings me back to browser gaming on PC. Let's talk the old, old stuff. 

2010 HP laptop (Intel Celeron 900, 3 GB RAM, GMA 4500)

I got this one for like $200-300 on sale back in the day. It was crap. I could barely run anything on it. But I could still run early-2000s games like Quake 3, Command & Conquer Red Alert 2 and Generals, and even some stuff like Half-Life 2, Portal, and Battlefield 2. Remember, anything can be a gaming PC if you try hard enough. But yeah, this is about the level where I'd wanna play browser games. 

2005 HP desktop Mark II (Athlon XP 3200+, 2 GB DDR1 RAM, HD 3650)

This is my old college-era desktop after upgrading, and it was more capable than the above laptop. It ran all of the above, but also stuff like FEAR, as well as borderline running TF2 and BF2142 at like 15 FPS. Again, anything can be a gaming PC if you try hard enough.

2005 HP desktop Mark I (Athlon XP 3200+, 512 MB RAM, Via/S3G Unichrome IGP)

*gets the cross out* STAY BACK! STAY BACK! 

Yeah. This was HORRIBLE. It couldn't run anything past 2000, really. Quake 3, UT99, and Red Alert 2 were what it topped out at. When I did find a game that ran, I played the crap out of it no matter how bad it was. Like Soldier Front, aka a Counter-Strike rip-off full of hackers and lag. Or Soldat.

Again, anything can be a gaming PC if you try hard enough, but this one really struggled to run anything. Anyway, given my options, browser games used to be some of my only options. I guess this is once again where I would say I would play browser games.

Can modern browser games even run on such ancient hardware?

So yeah, we get it. You need to go back 15 years of hardware evolution and be stuck on some POS system with integrated graphics from the literal 2000s to wanna play browser games. But can modern browser games even run on those systems?

Googling it, I get this:

There are no universal system requirements for itch.io itself, as it's a platform; instead, requirements depend on individual games, but most are indie titles with low specs (e.g., Windows 7+, 4GB RAM, basic GPU). The itch.io client runs on Windows, macOS, and Linux, while game specs vary from simple browser-based HTML5 games to more demanding Unity or Godot engine titles needing DirectX 10+, SSE2 support, and decent GPUs. Always check the specific game's page for its particular requirements. 
For the itch.io Client (Launcher):
  • Operating Systems: Windows, macOS, Linux.
  • Functionality: It's a basic application to download and manage games, requiring minimal resources beyond your browser. 
For Games on itch.io (Examples):
  • Very Basic (HTML5/PICO-8): Can run in a browser or on older hardware.
  • Unity Engine (Common): Windows 7 SP1+, macOS 10.11+, Ubuntu 12.04+, GPU with DX10 (Shader Model 4.0).
  • More Demanding Games:
    • Windows 10/11 (64-bit).
    • Intel Core i3/i5 or equivalent.
    • 4GB - 8GB RAM.
    • GPU with DX10/11/12 support (e.g., Nvidia GTX 600 series+). 
Key Takeaway:
Since itch.io hosts diverse indie games, look at the specific game's page for its minimum and recommended specs (CPU, RAM, Graphics, OS) before downloading, as they differ greatly. 

 ...WTF?

So....basic games, Windows 7, 4 GB RAM...the systems I have used that I WOULD play such games on wouldn't run such games. And then demanding games: i3s and i5s, with 4-8 GB RAM, and a GTX 600 series card. 

That's...ridiculous. I didn't mention it since the UHD 770 discussion covered it, but my original gaming PC out of college had a Phenom II X4 965, 4 GB RAM, and an HD 5850, and that's older than what's mentioned. They want like a literal gaming PC from 2012 just to run crappy browser games.

And to give you an idea of what THOSE can run: well, better than the UHD 770, worse than the 1060. Say an i5 2500K, 8 GB RAM, GTX 660, a pretty standard 2012ish gaming PC. You could run basically anything up through 2017-2018 on that. So yeah, I would NOT wanna touch browser games...on THAT. Even if I went for an i3 2100 with 4 GB RAM and a GTX 650, we're still talking most games up through 2013-2014 or so, with 8 GB RAM extending that to 2015. Basically, you get UHD 770-level performance.

So yeah. I know this is a long rant, but thinking about it, there's no situation where I'd ever wanna run browser games as my primary choice. My Steam library is too large and goes back too far; I've got decades of AAA games to rely on. Idk, I'm just not seeing the market. Who actually uses sites like that? Casuals? Maybe I'm being elitist or out of touch here, I just don't see the point.  

Thursday, January 15, 2026

Is it too much to ask that games use native resolution by default?

So, another "modern gaming" trend that's getting on my nerves is the hyper-emphasis on upscaling technologies. I game at 1080p. I want to play games at 1080p. I tend to lower almost every other setting before I reduce quality from 1080p native. But it seems like no new game wants to actually run at 1080p. I was playing COD the other night while watching The Act Man's new video on Modern Warfare from 2007, and then I noticed: wait, why do the graphics look so much better in that 2007 game than in my 2025 game? And I quickly realized, OH WAIT, THEY SOMEHOW DOWNSCALED THE IMAGE QUALITY BY DEFAULT, no wonder everything is so blurry. So I tried FSR at native resolution, and it tanked my framerate hard. Went from like 160 FPS to like 70 with dips. And...ugh, this is an issue I've had with all my new games so far. They all wanna downscale or use TAA by default, and it just looks SO FRICKING BAD! But then you go back to native res and the game runs like hot garbage. It's ridiculous. 

Look, I'm getting to the point where I think a lot of games looked better like 10 years ago, because at least they were designed around running at native resolution. And quite frankly, the AA techniques used back then were better. I remember we had FXAA, which introduced some blur but not much, with MSAA being the demanding one. 4x made the image look nice and sharp, and even 2x improved things. And if you couldn't run it, you turned it off. You had jaggies, but at least the image was sharp. Now they basically FORCE you to use AA, and it's built right into these upscaling techs. FSR and DLSS both do AA via upscaling. And if you don't wanna use that, there's TAA, which looks like absolute crap and makes the image look horrifically downscaled.
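To put rough numbers on why upscaled 1080p looks so soft: upscalers render internally at a fraction of the output resolution and then reconstruct the image. A quick sketch of the arithmetic (the scale factors below are AMD's published per-axis FSR 2 preset ratios; individual games can and do deviate from them):

```python
def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and upscaler preset."""
    # Per-axis scale factors from AMD's published FSR 2 quality presets.
    scale = {"native": 1.0, "quality": 1.5, "balanced": 1.7, "performance": 2.0}
    factor = scale[preset]
    return round(out_w / factor), round(out_h / factor)

# At 1080p output, "quality" mode actually renders at 720p internally,
# which is why the result looks blurry next to true native.
print(internal_resolution(1920, 1080, "quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "performance"))  # (960, 540)
```

And since the scaling is per-axis, "performance" mode is pushing only a quarter of the pixels of native. At 4K output there's still plenty of detail for the reconstruction to work with; at 1080p there just isn't, which matches my experience.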

Battlefield 6 looked awful until I swapped from TAA to FSR native. And then, once again, my frames dropped. The image is sharp now, but I'm running at around 60 with dips. And I gotta be careful of that 8 GB VRAM buffer, which is crippling even a fricking 6650 XT (which is like 5050-level performance, if not worse these days, given AMD has kinda abandoned its drivers to some degree). 

Doom: The Dark Ages, don't even try running that at native. The VRAM usage is over 8 GB, it stutters, it runs at like 45 FPS. So I gotta scale THAT down. Thankfully, they have a sharpening option, which I tweaked until it looked like native again, but I shouldn't have to do so many kinds of workarounds just to run stuff at native resolution. 

In the past, games were just designed for native resolution. Period. You go back to playing a 360-era game, it looks sharp and crisp, even if the graphics are dated. Modern games' visual quality is, on paper, much better, but then it looks like hot garbage because it's like 720p upscaled to 1080p. And devs just treat that like it's normal.

I've never been into upscaling. When the tech came out, I was like "yeah, that's nice," but let's face it, it was meant for people to upscale to 1440p or 4K. If you use it at lower resolutions, it doesn't work as well; it was never meant for upscaling lower resolutions to 1080p. But devs have kinda just gotten lazy, stopped optimizing games, and now if you want native resolution, you gotta give up on all other settings. It's crazy. I hate it. 

To me, this is not progress. Low resolution looks like crap. Upscaling looks like crap. I don't care how much better graphics look in the 2020s. Again, that stuff stopped being a huge deal long ago. I haven't been wowed by a game graphically in years, and part of it is because most of them look like smeary messes in practice due to low resolution. That, and ray tracing isn't the revolution everyone acts like it is. Quite frankly, much like AI, all these new techs, as far as I'm concerned, are artificially shoved down our throats, and don't make gaming better, but worse.

I'd literally rather have 360-era graphics at native res than 2020s graphics with upscaling. Or at least PS4-era graphics at native; that seems to be the sweet spot. Really. I'm tired of these devs chasing trends and making gaming worse. It's one thing if I'm playing a game on aging hardware and it runs like crap, but when they're still selling cards at that power level for $250 even today, and games don't run right on them unless you upscale, it's a problem.

People will say we need more power for the games to run, but in reality, the devs choose their detail and framerate targets. Games should be made for available hardware. And when devs keep raising the bar while the tech isn't getting any cheaper, that's a problem.

There's NO reason to make a game that runs on PS5-tier hardware at below 1080p/60 FPS. If you do this, you just decided to make a poorly optimized game. Period, sorry, not sorry. You could just as easily make a slightly worse-looking game that still looks reasonably good. It's not the 2000s anymore; tech has advanced to the point where all games look good, so the difference between a truly great-looking game and an okay one isn't that huge anyway. 

So yeah, less emphasis on fancy lighting effects almost no one notices, more emphasis on running stuff at actual native resolution at good framerates. Because again, if you ain't running stuff at native, it looks like hot garbage anyway, and any visual improvements aren't worth it.  

RED ALERT: Trump talks about not having 2026 midterms

 Okay, remember how I said Trump is such an existential threat to democracy that even I felt like I had to vote Biden/Harris in 2024? Yeah....this is why. After January 6th, I knew he would try this crap. I mean, he tried to overthrow election results once, of course he would try to do it again. 

Now, as articles point out, he has no authority to cancel elections, but I'm sure he's gonna try. Emergency powers to declare a curfew on election day, banning mail-in voting, crap like that. Not to mention the rank militarization of police and deploying the national guard on US soil to give blue states a hard time. Dude's a dictator. Or at least he wants to be. 

So yeah. Right now, the plan is to leverage the 2026 midterms to at least win the House. That will allow Democrats to put checks and balances on him, given how right now he has the run of the entire federal government, as the GOP is mostly falling in line behind him. But if he just cancels elections, we're kinda cooked; democracy is kinda cooked.

So yeah, we need to sound the alarm bells over this. We are at a very dangerous point in American history, and idk where this is gonna go. I don't even really wanna go there. And uh, yeah. I'm gonna end it here.  

Wednesday, January 14, 2026

As a consumer, I don't care about AI either

So, I follow a lot of techtubers relevant to my own interests, and many of them were covering CES, aka the "Consumer Electronics Show." However, many of them have noted that there's barely any coverage of actual stuff consumers want this year. The entire thing is a big AI hugbox. Barely anyone really innovated; they just pushed the AI stuff trying to make their stockholders happy. Because everything is a hugbox about AI now. It's our entire economy, just about. It's held up by like 5 companies all shifting money back and forth because everyone thinks everyone wants AI.

But here's the thing: no one really wants AI. At least not "normal" people. Normal people are starting to hate the idea of it. I mean, it's driving up the costs of computing substantially. RAM is quadruple the price it used to be. GPUs have been increasingly expensive for years. Some say we're witnessing the death of the traditional computer, and you'd think at a consumer-oriented electronics show, we'd see consumer-oriented electronics. But no, most of CES is just AI AI AI AI AI! 

It's been like this for a few years now. I remember right after I bought my i9 12900k two years ago, I was reading about how they wanted to shoehorn "NPUs" into CPUs to have onboard AI, and how this was the next big thing. And despite having little interest in it, I was like "oh great, planned obsolescence!" I feared these NPUs would obsolete my hardware, with games in 5 years starting to demand onboard AI chips to do things. Well...as it turns out, NPUs were mostly a bust, and no one actually cares. People just want more powerful CPUs.

And then there's stuff like Microsoft Copilot, which is being shoehorned into everything, and no one cares. Gemini? No one cares. The fact is, for the most part, NO ONE FRICKING CARES!

Like really, these rich people need a reality check. AI is this trendy 2020s-era buzzword, but I know next to no one who genuinely cares about it or who uses it to make their life better. I mean, I admit I use it to bounce ideas off of once in a while, but that's about it. I don't talk to it every day. Quite frankly, the more I learn of its limitations, the less I want to. It just isn't a revolution in a positive way. Most breakthroughs are negative. Like Grok undressing people, including minors, which is a big story this week. Or people having AI write papers for them in college. Or people going insane as AI tends to exacerbate people's mental illnesses. AI kinda reminds me of Guilty Spark from Halo at times with that. Like, yeah, helpful computer thing, until it goes absolutely insane for some reason.

Of course, again, I'm not entirely against the tech. I'm just against how forcefully it's being rammed down our throats, and the negative societal consequences. It isn't making life better for the normal person. If anything it's making things worse. People are losing their jobs to it. Normally, the argument is that when corporations do well so do people, because trickle down economics and "job creators," but when people are literally losing their jobs over this stuff and no one is hiring except for the gestapo (ICE), yeah, again, I can see why people become luddites. I don't agree with the luddite logic, as for me it's just a call for a UBI instead, but let's be honest about who AI is truly benefiting: corporations and professionals. They can replace human workers with it. They can do more with less. It has positive value for industry. But it does F all for normal people.

And you know what? I wish this AI bubble would just pop and this whole obsession with it would die so I can afford actual...consumer electronics again. I don't want 32 GB of DDR5 RAM to cost $300-400. I don't want to be stuck with 8 GB GPUs forever. And I don't wanna rent computers from Jeff Bezos or Nvidia or something and connect to the cloud for gaming.

Like that's the thing. Given how this was originally about consumer electronics, let's discuss what consumers want. We want cheaper computers again. Cheaper game consoles. Affordable computing. We want to be able to build a computer for under $1000 that isn't entry level (hello, gabe cube pricing). We want to be able to buy that stuff for like $500-600. We want GPUs that are $200 and actually play games well. We want the $300 ones to have more than 8 GB of VRAM. We want 32 GB of DDR5 to go back to being like $100. We want cool new tech that can play games better. We want more powerful smartphones for less money. We want the Switch 2 to be like $300, not $450. We want the other consoles to be around $500, or maybe less on sale. We don't want to pay more than $600 for a next-gen console at launch, and we want that cost to come down over time. We want the hardware we paid good money for to have reasonable longevity, not to be obsoleted because tech bros decided to chase trends we don't care about and then ram them down our throats.

I'm not saying AI has to go. Again, I'm not anti-AI, but let's be frank about AI in the 2020s. The tech is in its infancy and it's power-inefficient. I can see why the US government has use for it; the race to invest in AI is the 2020s version of the space race, and it has a lot of important, say, DOD-style applications. Corporations can use it to do things more efficiently; it's not all bad. But let's get rid of this idea of having AI in every PC for now. Maybe it'll be relevant in, say, 10-20 years, but not now. Trying to push it now is like pushing for people to have PCs in, like, 1960 or something. We eventually got there, but PCs didn't become a thing until the 1980s at the earliest, and I'd argue they didn't go mainstream until the 90s or even the 2000s. Imagine trying to shoehorn 1990s tech into 1960...it's the same problem AI has now. Sure, that stuff had DOD-style use cases...like...calculating physics for the literal space race. And it arguably had industrial uses too. But back then most people were like "yeah, normal people will never need this," and now it's in everything. So I'm not gonna say AI will never get to that point for the rest of us, but we're talking a time scale of decades here. It's not there yet, and trying to force it is as disruptive to the economy now as forcing everyone to adopt PCs in the 1960s would've been. The tech just needs more time to cook before it's ready for widespread adoption, and forcing it now is disrupting the entire economy and making a lot of normal people very unhappy.

Even as far as DOD applications go...we need to be careful with this stuff. Properly regulated, it could be useful. However, we currently have a fascist government consolidating power, going around normal protections by working with corporations like Palantir to build databases with data on every American. And that crap is 1984. And who even knows wtf these people are doing with AI behind the scenes. Plotting authoritarian takeovers, predicting and preparing for citizen resistance to said authoritarian takeovers. This AI stuff is being associated with the baddies. It's being used for evil. I mean, there's a lot of grim talk recently about how Palantir was founded to help kill left wingers, which...is rather foreboding. And given how Peter Thiel and Elon Musk seem to be, quite frankly, against democracy itself, yeah, there's a lot of concern there.

Again, this is why a lot of people are just anti-AI in general. I recognize it's not the tech so much as how it's used, but when a bunch of out-of-touch billionaires with fascist sympathies are using it at the expense of normal people, can anyone be surprised when normal people grow to hate it?

But yeah. I'm tired of this AI crap. I kinda wish this bubble would just pop so we can move on with our lives. I know it'll cause a recession, but we all know that's coming anyway. The bubble is already there. It's just a matter of when it pops.

Saturday, January 10, 2026

Battlefield boomers are gonna ruin battlefield 6

 So...Battlefield 6. I got it. I play it all the time. It's my main game. I've barely touched my other Christmas games because BF6 is so good. It's the best Battlefield since BF4. The gameplay is great. I can't say it's a perfect game. Some matches are insanely unbalanced, and the progression is so slow it's like watching grass grow. But all in all, a solid 9/10 game.

But I'm not here to review the game, I'm here to bash the haters. I've noted this phenomenon through the BF5 and 2042 eras, but forums like reddit are overtaken by what I call the "battlefield boomers." These are people so obsessed with the past, hating on the current thing while viewing the old games through rose-colored glasses, that it's ridiculous. And with BF6, their big problem is "OMG THE MAPS ARE TOO SMALL, IT'S NOT REAL BATTLEFIELD, WHAAA!" The reality? Well, I'd say about half the maps ARE tight urban maps with relatively high player density. And I admit, when I participated in some play tests and the beta, it was jarring at first, but all in all, I've gotten used to it, and the game now feels pretty nice. If anything, these maps are well built. The smaller ones aren't too small. They rarely hit Operation Metro or Locker style player densities. They're small-to-medium type maps like we've seen in the good old days, and those maps were often the most popular. Because, and this is gonna shock the battlefield boomers, a lot of people *gasp* LIKE infantry-focused gameplay! They don't like those massive maps where you gotta walk like 300m to the next point only to get farmed by tanks and snipers.

And it's not like larger maps don't exist. They do. There are quite large maps in this game as well. The game has something for everyone. But then these guys are like REEE BATTLEFIELD 3 and 4 were so much bigger and blah blah blah. Like...are we playing the same games? It's like they just selectively look at the maps and view the old games through nostalgia goggles.

And a constant complaint I keep hearing is that we need bigger maps. And I'm gonna be the anti-circlejerk here. No. No we don't. No one actually enjoys playing massive maps but snipers and vehicle drivers. Most people play small-to-medium sized maps. I'm not saying we can't have a handful of larger maps, but to be fair, they're typically less popular.

Anyway, I hope DICE learns to never ever listen to these people. They're morons and they ruin everything they touch. They're a vocal minority that hates on the current thing no matter what it is and acts like the similar thing from 15 years ago was so much better. I admit, I have nothing but love and respect for those 15-year-old games these days. They're the best era of BF games. But this latest one replicated the magic and these guys are STILL complaining. Why? Because, much like the equivalent players in the Halo community, you can't satisfy them. They're perpetually stuck in their own little golden age that only exists in their heads, and they hate on the current thing no matter what. I've seen the Halo community complain when they literally remastered Halo 2 in a new engine with better graphics. I'm serious. It's the same game, and they still complained because it's not the same as the 2004 one. I've seen people complain that Halo 3 on MCC is bad because it's not exactly the same as the 360 version. You can't take these people seriously. They're stuck in the past, and they'll never ever be happy, because their entire worldview is run by nostalgia. It's like the member berries thing from South Park. Everything modern is bad, everything in the past was great, and you can never make them happy.

BF6 is awesome. It's the best Battlefield game in a decade. I'm addicted to playing it. I love it far more than I ever did 5 and 2042. Just fix the fricking progression system to make it less oppressive. That's literally my only major complaint.

Doom: The Dark Ages is everything wrong with modern FPS gaming

 So, people who know me will know I've always been a HUGE Doom fan. I got the first game 30 years ago on the Genesis 32X and have been addicted to it ever since. It's always been simple. Here's a gun, there's a bunch of demons from hell, KILL. RIP AND TEAR UNTIL IT IS DONE!

The OG games were classics and still hold up to this day. Even more so with modded engines like GZDoom.

Doom 3 took the series in a survival horror direction that was kind of divisive, but I didn't DISlike it. I mean, it still felt like Doom. It was slower and more corridor-shooty, but that was the zeitgeist of the times, and 2000s-era FPS games are kinda my thing.

After that, id kinda dropped the franchise. They made Quake 4, Rage, and Wolfenstein games, and eventually came back in 2016. And Doom 2016 was arguably the best FPS of the 2010s campaign-wise. 2016 was the peak of gaming for me, and after that, everything went to crap.

In 2020, they released Doom Eternal. That one was a bit divisive for me. Here's the thing. When I play Doom, I wanna shut my brain off and shoot things. Doom Eternal added too much strategy to the game. Now it's like, use X gun to kill X demon in a specific way. And then you face giant arenas full of varied demon types, and you either switch between guns every 5 seconds to kill X demon with Y gun as intended, or you just blast everything with the super shotgun and hope it works. And I'm...the latter. I don't got time for that crap.

I also disliked the direction the DLC took, where they made the game more punishingly hard, made the levels tedious and long, and doubled down on "use X gun to kill Y demon," not even making it optional.

So...after Eternal, I went into The Dark Ages with a mixed view on the direction of the franchise...and The Dark Ages seemed to double down on the bad stuff. Now there's a shield with a gun, you gotta parry and dodge at the right moments, use the shield at the right moment to block attacks, and despite being like 6 levels in, they're still dropping more and more mechanics and sub-mechanics on me. I'm frustrated with it. This isn't the Doom I grew up with. It's not the Doom that I love. I wouldn't even call it Doom any more. It's more like...some Warhammer game, or something like Vermintide with guns.

Really. And it raises the question: why? Why did the game have to go in this direction? Again, originally it was: here's a gun, there's your enemy, shoot them until you win. But now we need tons of mechanics and sub-mechanics and parrying and all of this complexity and difficulty. Just having an experience of shooting stuff isn't enough. It has to be hard, it has to be complicated.

It feels like every game is like this. It's why I'm nostalgic for the 90s and 2000s, when games were just: shoot things until you win. But those are like "unc slop" type games now or whatever. Zoomers have more sophisticated tastes. They want more complexity. Normal deathmatch is no longer a thing. Now we need extraction shooters. We need tons and tons of recoil for a high "skill ceiling." We need mechanics on top of mechanics in Doom now. Everything has to feel like an RPG. And everything feels complicated. And I hate it. I miss when we could just have simple FPS games. Everything has to be complex, everything has to have a motif, a gimmick, and it isn't fun. Games in the 2020s kinda suck at times. And it's not even just nostalgia for the past in this case, because this is the granddaddy of just about all FPS franchises. And they're messing it up.

Oh well, at least old games still exist and are fun. 

Thursday, January 8, 2026

Discussing how the right lies about everything

 So...this has been a hell of a week, hasn't it? Last week we were ringing in the new year, and it already feels like we've experienced a whole year, or at least 1-2 months, of collective trauma. We've invaded a foreign country and stolen their oil, ICE shot that lady in Minneapolis for no reason, and now two more people in Portland. And the most baffling part about the Minneapolis shooting is the Trump administration's lies. Here's the video; warning, it's kind of graphic, but you don't see blood or anything. Anyway, the Trump administration is trying to frame this like the woman was some radical leftist trying to run over ICE officers with her car. And it's like...WTF. No. That isn't what we just saw.

Like, if I were to be as objective and fair to the ICE personnel as possible, I can KINDA see an argument for why pulling away from cops can be seen as dangerous, but the only danger the ICE officers were in was the danger they put themselves in by being that close to the car. The woman was clearly trying to swerve to avoid hitting anyone, just trying to flee. And while some of more authoritarian persuasions would argue she should have complied, either way, this wasn't a justified use of force. This is what happens when you give untrained morons guns. Because let's face it, ICE people aren't well trained. They accept literally almost anyone and just give these guys weapons.

And honestly, let's not miss the big point: THIS SHOULDN'T HAVE EVEN HAPPENED. Like, okay, even if we grant the ICE officers every possible excuse under the sun. Okay. Well, why are they being deployed to Minnesota? Are they afraid Canadians are gonna come over the border? Why would they even wanna come here? No, that's not the real reason. They're there because 1) they're just deploying people to blue cities/states to make the citizens there miserable for daring to vote against the orange one, and 2) there's a community of Somali Americans that MAGA is making a big deal about, acting like they're turning the place into Mogadishu by virtue of being uncivilized or something. Ya know, more of that "they're eating the cats and dogs" crap, except more like "this is literally Black Hawk Down." So...hassling Somali immigrants for being Somali immigrants, because MAGA are a bunch of racist, white nationalist pieces of crap.

And that's the big thing. Trump/Noem just deployed these people to Minneapolis without the local government's consent (hell, the mayor of Minneapolis, in response to the incident, literally told them to "gtfo"), and this incident happened. And when called on it, well, suddenly the woman was a radical leftist who tried to run over ICE agents intentionally. No, no she wasn't. She was some single mom who dropped her kids off at school or something, got stuck at an ICE checkpoint, and when a masked agent pulled up to her window, shoving a gun in her face and telling her to get out of the car, she freaked and ran.

Again, this incident never should've happened, because ICE should never have been there. Sorry, not sorry. And that's the real moral of the story. This is a policy failure, and it's directly on Trump and Noem's hands. You can argue maybe she shouldn't have run, as running from law enforcement is never a good idea, but that's ignoring the even bigger problem: this deployment was technically illegal and unwanted. But hey, it's okay for Mr. Law and Order over here to do whatever he wants, because it doesn't count as illegal when he does it, because he's trying to seize control of the law enforcement apparatus.

Anyway, can we at least agree that he's lying when he calls her some radical leftist terrorist? I mean, wtf? This administration has no shame when it comes to lying. Just like they called Kilmar Abrego Garcia a gang member or drug trafficker or whatever BS narrative they made up about him. Disgusting. 

The "millennial uncle" is my spirit animal

 *looks sternly at zoomers and gen alpha whenever one of my interests is brought up*

YOU DON'T KNOW CRAP ABOUT THAT, DO YOU? I WAS THERE WHEN POKEMON CAME OUT. I PLAYED GEN 1. I WAS THE FIRST ONE IN MY SCHOOL TO FIND THE MISSINGNO GLITCH. I SPENT 9/11 WATCHING DBZ. 

True story...

 Okay, so...I've been finding these kinds of videos making fun of us millennials, now in our 30s, when we try to relate to younger generations, and I'm just gonna say, yeah...I get it. I own being the quintessential 90s kid. I own being in the age range being made fun of here. And I own sounding like that when trying to talk to younger people.

Heck, these zoomers and alphas have even called the games I grew up with "unc slop." Once again, I own that. And a lot of those games are better than the newer stuff that exists today, sorry, not sorry. 

Compared to the 90s, I thought the 2000s sucked at the time. I mean, 9/11, the stock market crash, and us being in this weird awkward age of digital stuff that never really aged well. I don't glorify that era as much as SOME do, but I do have some nostalgia toward it.

Still, I miss the 2000s. They were a better time than now. Even if Bush-Cheney functionally ruined the decade, Trump is like Bush-Cheney's worst attributes on steroids.

And culturally, it was better. It wasn't woke, or alt-right. It just was. And everyone fricking understood it. We had more productive conversations on the internet. Games didn't suck, and we seemed to get a lot more of them a lot more quickly. And they got cheap quickly too, because tech was ever advancing, and stuff that was 4 years old was considered ancient and could run on your new family computer with integrated graphics. Again, it wasn't perfect, but it was better than now.

And...on the whole divide between us millennials, who have inherited the role gen X used to have as the chill middle-aged generation, and gen X turning into the new conservative boomer stereotypes, well...at least we can relate to you guys. And at least we try. This millennial uncle stereotype is us trying to relate to you, to teach you more about the world and what came before you. We're trying to pass down our culture as the first generation to walk the digital divide in a reasonable way. Because trust me, that's...the biggest cultural divide between generations. The pre-internet people don't get the modern world at all. They don't. They can't relate to us like...AT ALL on things sometimes. They thought our PS2s were "Nintendos," that's how out of touch they are. You hand them a video game controller and they won't know what to do. They might even struggle with simply turning on a computer or a tablet, let alone using it.

Us...we get it. We get gaming. We were the forebears of it. The franchises you know and love, we love too. And we remember the old stuff. You guys play Pokemon Legends ZA; we're still going on about how great gen 1 was. You go on about anime; we tell you about Toonami. You talk about the PS5; we talk about the PS2. We're basically you, just older. And we're just trying to relate to you. Would you rather we do that, or do what our parents did and try to ban video games because they thought they caused violence?

I admit, sometimes we push too hard. We act like the 90s and 2000s were perfect. They weren't. And I know some 90s kids can be quite annoying. Still. Again. You could do worse. You could end up with older generations who can't relate to you at all. Trust me, that's worse.

Anyway, that's what I call aging gracefully. Basically being in my late 30s and still having an inner child that wants to play video games. Maybe our classics are...15-30 years out of date by this point. But they're still good. Kinda like how action movies from the 80s were still good in the 2000s. That's basically what "unc slop" is. It's stuff from 20 years ago that's still cool today. Just like how, back in the 2000s, some of us still liked stuff from the 80s and 90s, especially movies. But I digress. Respect your elders, kids. You don't know crap about that 30-year-old franchise whose latest game you happen to like. We were there when it was created.

Tuesday, January 6, 2026

I don't care about "AI slop" in video games

 So...AI is a complicated topic for me politically, mainly because of the political divisions involved. The right LOVES AI, wants to inject it into everything, destroy all the jobs, not have safety nets, MONEY MONEY MONEY! And the left...HATES AI, and acts like it murdered their dog, because of the likes of Elon Musk and silicon valley pushing it in such an obnoxious way.

And here I am, in the middle: a pro-technology progressive who loves the idea of AI, even if I hate the actual implementation at the hands of all of these corporations and billionaires. I don't care if AI takes jobs, because my whole thing is, we shouldn't be forced to work for a living any more in the first place. We should change our social structures to accommodate people in the 21st century, away from an economy based on work. Quite frankly, I feel like the left are just luddites on this, turning against AI in an absolutist, tribalistic fashion. Meanwhile, the right is going way too far with it, pushing it super aggressively into everything when it just doesn't have that many consumer uses, and now we can't afford computers any more because these corporations are buying up the entire world supply of RAM. And electricity is becoming expensive because they're buying up all the electricity. I've heard some compare AI to the Tree of Might from DBZ, where it just sucks up everything and destroys the earth in the process.

But, again, for me, it isn't even AI as a concept that's bad. It's the implementation. It's pushing too much too fast. If we had tried to do this with computers in the 1950s, we'd have had the same problem. The tech just wasn't advanced enough, and it was super inefficient. If the wealthy had decided to suck up all the world's resources to bring us computers in the 1950s through, say, our TVs and radios, we'd have had the same problem. It's the energy inefficiency that's the problem, not the tech itself. It's the ownership among billionaires that's the problem, not the tech itself. Basically, the problem of AI is a problem of rich people and billionaires, not a problem with the tech itself. And we can hate the rich people and the billionaires, I mean, no argument from me. But hating AI itself is dumb. AI IS the future to some degree. Every decade, we see innovation and new tech: radios, TVs, computers, the internet, and they always revolutionize our societies. AI is just...another step in that. And if we oppose it, we're gonna sound like our boomer parents going on about how great things were back in the 1970s. Yeah, okay mom, let's get you to bed already. Ya know? Seriously, a lot of us are turning into that in our 30s and 40s, and it's dumb.

If AI saves companies money on artists and they put it in games, what do I care, as long as the quality of the final product is good? But it seems like people just categorically hate AI in games. Like we should only use real artists, and they should be paid salaries, and jobs are good, shouldn't we have jobs? NO! We should fricking long for the day AI can replace work, so we can work less. Again, the real problem is our societal model of expecting everyone to work for a living. If people suddenly can't work, that's only bad because our entire economy is designed around forcing people to work. So maybe we shouldn't have to work for a living. Unpopular idea, but that's how I see it. Let them take all the jobs. I don't care. Just pay me for doing nothing instead. Ya know?

Again, AI isn't bad in and of itself. AI is bad because our societal model is bad. We should address those societal issues. If we do that, AI becomes not just not a problem, but a solution to a lot of problems. Again, you just gotta break your mind out of this weird job-centric mindset and suddenly it all makes sense. AI is good; it's not bad. It's billionaires that are bad, and it's our economic model that is bad. Thanks for coming to my TED talk.