

What, you mean you don’t play games and go “Well that looked great! Well worth my time!” like an awful lot of the AAA game industry appears to think gamers do?
Huh.
Seriously though, I’m curious how we ended up in the make-shit-prettier race instead of a make-the-writing-good race, or a make-the-game-actually-fun race, or even a make-more-than-two-dungeons race (looking at you, Starfield).
Especially given that the cost of keeping my GPU current has, for me personally, reached an untenable level: I’m sure as crap not paying $2000 for a new GPU just so we get a few extra frames of hair jiggle or slightly better lighting or whatever.
Right-ish, but I’d say there was actually a simpler problem than the one you laid out.
The immediate and obvious thing that killed OS/2 wasn’t the compatibility layer; it was drivers. IBM didn’t ship drivers for any hardware it didn’t sell, while Windows had (relatively) broad support for just about anything anyone was likely to actually own.
Worse, IBM pushed to kill features that its own hardware didn’t support, so you ended up with a Windows that supported your hardware, had the features you wanted, and ran on cheaper machines, fighting it out with an OS/2 that did none of that.
IBM essentially decided to, well, be IBM, and committed suicide in the market. It didn’t really address a lot of the stupid crap until Warp 3, by which point it was years too late to matter, and Windows 95 came swooping in shortly thereafter. That was the end of any real competition on the desktop OS scene for quite a while.