I can entertain the idea that typical users don't talk about it online because they're not very tech-savvy, so they just assume they're doing something wrong when a website or piece of software malfunctions.
But it's simply not possible for nearly all of these software programmers, web developers and tech enthusiasts to not be talking about it online.
Operating systems, desktop software, phone apps, games, firmware & drivers, major corporate websites, banking websites, gov websites, utilities websites... everything.
Over the past 5+ years it has increasingly become the norm for software to be horribly designed, malfunctioning or outright broken, or tested only on a single device with a single version of a single OS/browser.
Actual humans would not refrain from talking about this programming plague, but bots set to not expose DEI-driven destruction absolutely would.
Yep, most software in, say, 2005 was better programmed, ran better and was more functional than it is today. The same could be said about websites. Sure, a few areas have seen progress, but the basic functionality was much better back then than it is now.
I think part of the problem is the fucked-up development cycle, copyright stuff and the general environment of modern software development. Everything is being rewritten as fast as possible. Everything is being replaced as fast as possible. There's always frantic development and modernization going on while no one actually moves anywhere. No wonder that with each new iteration the quality gets worse and worse. What's the point of doing something well if in half a year or a year all of that shit is going to be replaced anyway? Hell, I'm still using a scientific calculator app from 2003 and none of the current shit can even approach its functionality or quality!