The factories in question are unique in the world because they use an extremely small process node for producing ICs.
However, things are not quite as the MSM paints them on that matter.
Look, only a tiny percentage of ICs really require the top-resolution process: RAM, flash, top processors and high-end GPUs. Not only are they an insignificant part of total IC manufacturing, they are not really needed at all for any regular thing you could think of.
All this hype about 7-5 nm processes is nothing more than hype.
The need for top processors, enormous amounts of memory or superior GPUs is driven purely by inefficient software bloated with modern frameworks and the overall programming paradigm that prevails in the corporate sector.
You don't need them at all for any of your needs. 10-year-old computers/notebooks/phones are perfectly usable today with appropriate software. You will never notice any difference between doing regular tasks on a 10-year-old notebook and a modern one.
So there is no real need for all those 5 nm processors and tons of RAM. There is a need for optimising current software, not for new ICs.
Also, a high-resolution process has significant drawbacks. Smaller elements on the die mean lower reliability and higher sensitivity to all kinds of fluctuations, from thermal to natural radiation. It adds instability and shortens lifetime. For those making a profit from selling the same shit again and again to the same customer it is very profitable, but that's all. The customer doesn't really need any top hardware; he needs hardware that does what he wants.
Moreover, the most demanding industries use a pretty large process: the more reliability and protection from the environment is required, the larger the process node used.
Then why is there all that hype about TSMC and other Taiwanese factories? I think it is partly an attempt to keep the industry of crap running, producing less and less reliable hardware while forcing customers to buy more and more of it by pushing bloatware that eats up all the unnecessary resources of new hardware, and partly an attempt to have hardware suitable for all their "AI" crap with Big Data (which is really bloatware too, and could perfectly well run on much older hardware with a little optimisation).
What will happen in case of trouble in Taiwan? Absolutely nothing. Moreover, there will be a chance that software manufacturers will finally be forced to start optimising their bloatware and make it into something decent. Overall, I think TPTB fucked up with that artificial nanometer race and their attempts to establish a world monopoly. They will not achieve that monopoly and control, either because they will fuck up those high-tech factories (a plan was even voiced to destroy the TSMC factories in case of a China-Taiwan reunion), or because customers will turn to cheaper and more reliable solutions.
So, don't rush to buy a new notebook/smartphone to replace your old one. With high probability, software will "suddenly" appear that runs perfectly fine on your outdated hardware.
"You don't need them at all for any of your needs. 10-year old computers/notebooks/phones are perfectly useable today with apropriate soft. You will never notice any difference between doing regular tasks on 10-year old notebook and modern one."
Unless you're gaming, rather than writing, surfing the internet, and doing spreadsheets, OP is correct. Who cares about Moore's Law at that point?
Gaming is largely financed by Intel, Nvidia, AMD and so on. That thing of "Guys, these are the first engineering samples of our latest video card; your new game should lag and show low FPS on that hardware" is something I personally saw (and participated in, guilty :) ) already in the late 90s. I still have that 3D accelerator card ("GPU" was not a common term at the time) with "Not for sale. Not FCC approved" markings on the PCB as a kind of museum artifact. Today it is not just the occasional donation of fresh hardware samples with some one-time financial grants, but also a constant direct money flow from hardware manufacturers to game developers.
I'm pretty sure that all modern games could be optimized for flawless gameplay on decade-old hardware without any serious effort.
And it looks like the remaining 20% is "changes for the sake of changes".
Every problem has an optimal solution. It's like a hill-shaped curve with a peak, where the solution is good enough to completely solve the problem yet not so complex as to be unreliable. Once that peak is reached, there is almost nothing that can be done to make things better. The only way to change things is to invent a completely new physical principle, technology or whatever, but that is exactly what nobody does. Since the perfect solution, i.e. the top of the curve, has already been reached, any change or innovation inevitably leads to a worse result. And here they need marketing to convince customers that the worse solution is better.
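Just to make the hill-shaped curve concrete, here is a throwaway Python sketch; the benefit/cost functions and every number in it are invented by me, purely illustrative:

import math

# Toy "usefulness" of a solution: benefit grows with complexity but with
# diminishing returns, while the reliability/maintenance penalty grows steadily.
def usefulness(c):
    benefit = 1 - math.exp(-c)   # good enough to solve the problem
    cost = 0.05 * c              # complex enough to become unreliable
    return benefit - cost

# Find the peak of the hill: past this point, extra "innovation" only hurts.
best = max((c / 10 for c in range(0, 100)), key=usefulness)
print("optimal complexity ~", best, "usefulness ~", round(usefulness(best), 3))

Change the cost slope and the peak moves, but the shape stays the same: once you are at the top, any push in either direction makes the result worse.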
Interesting: if we somehow stole internal financial data from corporations, what difference between R&D and marketing budgets would we find? I think today it's orders of magnitude, and not in favor of R&D.
And here is the next obvious question: how long can this scam last? Will there be a point when this situation explodes, followed by a return to normal where R&D matters much more than marketing, or will it eventually get stuck at the bottom, making useless and unreliable things with total customer brainwashing to make them happy with this crap?
And when mass production nearly reached that single-junction limit, they began to produce overcomplicated multi-junction cells with a purely symbolic efficiency advantage but an enormous price tag. In a normal world, nobody would pay 3-10 times more for some 1% of efficiency, especially taking the reliability question into account.
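Quick back-of-envelope in Python, using the figures from the paragraph above; the baseline price, cell area and irradiance are numbers I made up just to show the cost-per-watt arithmetic:

irradiance = 1000                       # W/m^2, standard test condition
area = 1.0                              # m^2 of cell area
eff_single, price_single = 0.22, 100    # ordinary single-junction cell, assumed baseline price
eff_multi,  price_multi  = 0.23, 300    # "+1% efficiency" cell at 3x the price (the low end above)

watts_single = irradiance * area * eff_single   # 220 W
watts_multi  = irradiance * area * eff_multi    # 230 W
print("single-junction:", round(price_single / watts_single, 2), "per W")   # ~0.45
print("multi-junction: ", round(price_multi / watts_multi, 2), "per W")     # ~1.30, i.e. ~2.9x worse

Even at the cheap end of the 3-10x price range you pay almost three times more per watt; at the expensive end it's nearly ten times.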
But here we are. Marketing wins over common sense.
if A = "The need for top processors, enormous amount of memory or superior GPU"
and B = "ineffective software overbloated with modern bloatware frameworks and overall programming paradigm that prevail in corporate sector"
you're asserting A => B
but this is false
if it were true, then corporate PC sales would be increasing in line with such new devices
however, that is not happening
Corporate PC sales have not flatlined but have been experiencing a decline. According to Gartner, sales of personal computers were expected to fall 9.5% in 2022 due to supply chain constraints and geopolitical challenges. Business PC demand was expected to fall 7.2% year-on-year in 2022[1]. In 2023, AMD, a major chip designer, reported that sales remained flat for its data center business, indicating a slowdown in the enterprise sector[2]. The IDC also expected PC monitor sales to decrease over the remainder of 2022 and in 2023[4]. However, despite these declines, PC sales remain above pre-pandemic levels[3].
Citations:
[1] https://www.thenationalnews.com/business/technology/2022/07/05/global-pc-sales-to-drop-95-in-2022-because-of-multiple-challenges/
[2] https://www.datacenterdynamics.com/en/news/amd-revenues-fall-as-pc-cpu-sales-plummet-and-data-center-business-slows/
[3] https://www.tomshardware.com/news/pc-sales-remain-above-pre-pandemic-levels-despite-recent-declines
[4] https://mybroadband.co.za/news/hardware/450152-global-pc-monitor-sales-flatline.html
[5] https://www.bizjournals.com/sanjose/news/2014/07/10/pc-sales-flatline-after-2-year-slump-so-thats.html
I'm asserting A <=> B. It is a circular dependency. There is no real need for either the bloatware or the hardware that can sustain it.
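For clarity, a tiny truth-table sketch of the difference, nothing more:

for A in (False, True):
    for B in (False, True):
        implies = (not A) or B    # one-way: A => B
        iff = (A == B)            # both ways: A <=> B, the circular dependency
        print(f"A={A} B={B}  A=>B={implies}  A<=>B={iff}")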
As for your numbers, you miss the point. A little decline or a little rise in sales says nothing about the huge constant flow of unnecessary hardware replacing perfectly working hardware. These fluctuations are perfectly explained by overall economic health.
If software were good enough not to force customers to upgrade constantly, this flow would be orders of magnitude lower than it is now: only replacing the few units that suddenly break, or adding a few to the existing thousands. The sole fact that corporate PC sales exist at current levels is already proof of artificially created demand. And fluctuations at the few-percent level do not change anything.
A market with artificially created demand can never be saturated. A fair market, by contrast, where demand is obviously limited, quickly saturates and then falls to only supportive (or spare-parts) sales.
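A back-of-envelope sketch of what I mean; the fleet size, lifetime and adoption rate are all made-up numbers, not data:

fleet_target = 1000   # devices a fair market actually needs
lifetime = 15         # years a properly designed device lasts
installed = 0
for year in range(1, 21):
    adoption = min(200, fleet_target - installed)   # market fills up over a few years
    replacements = installed // lifetime            # afterwards, only failed units are replaced
    print(f"year {year:2}: sales {adoption + replacements}")
    installed += adoption

Sales ramp up while the real need is being filled, then collapse to a trickle of replacements. The only way to keep sales at the ramp-up level forever is to manufacture the demand artificially.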
There exist markets where constant demand is inevitable: energy, food, water and other consumables. Computer tech, just like cars, home appliances and so on, is not consumable by default. Humans do not chew PCs and drink cars to live. These things could last centuries, without any need to be replaced, if designed properly.
You can easily find examples of such non-consumable tech in industry. For example, a 70-year-old lathe that has been decently serviced during its lifetime absolutely does not need replacement and is no worse than a lathe you could buy today. Sometimes it is even better than a modern one. There is still a lot of half-century-old (or even older) tech in industry that does its job perfectly. I don't see any reason why the computer tech of the last decade should be any different. Moore's law turned out to be a hoax. There is nothing left to improve in current computer architecture that could noticeably raise its usefulness to the consumer. That was a thing in the 90s, but not now. I estimate the optimum was reached around a decade ago; everything we have seen since then is pure creation of artificial demand out of absolutely nothing.