Ayh was working with an engineer at Princeton who worked on their QC program. Ayh visited the QC dev years ago. Fuck your bot opinion.
So quantum computing is fake because an engineer told you, and what's really happening is really cold circuits. And they are tricking everyone.
Ayh know because ayh have worked on it. It is smaller gates at sustained colder temperatures. That has been the hang-up with Moore's Law of late. You can only go so small before the heat becomes intolerable. It is the reason why newer-gen chips are failing more frequently.
They tell you elaborate stories about QC and they dress up the particle theory model to seem like they have more control over it --> they don't. Qubits are a ridiculous larp on an audience that does not care, and they do it to validate the grants. Yes, they are tricking everyone.
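To put rough numbers on the heat argument: a back-of-the-envelope sketch using the standard CMOS dynamic-power estimate P = a * C * V^2 * f. The node figures here are made-up illustrative assumptions, not measurements of any real process.

```python
# Rough back-of-the-envelope sketch of the dynamic-power side of the heat
# argument above. All numbers are illustrative assumptions, not measured data.

def dynamic_power_w(activity, capacitance_f, voltage_v, freq_hz):
    """Classic CMOS dynamic switching power per gate: P = a * C * V^2 * f."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical "old" node: fewer, larger gates per mm^2.
old_gates_per_mm2 = 10e6
old_per_gate = dynamic_power_w(activity=0.1, capacitance_f=1.0e-15,
                               voltage_v=1.0, freq_hz=3e9)

# Hypothetical "new" node: gates shrink so ~4x more fit in the same area,
# but per-gate capacitance and supply voltage no longer scale down to match.
new_gates_per_mm2 = 40e6
new_per_gate = dynamic_power_w(activity=0.1, capacitance_f=0.7e-15,
                               voltage_v=0.9, freq_hz=3e9)

print(f"old node: {old_per_gate * old_gates_per_mm2:.1f} W/mm^2")
print(f"new node: {new_per_gate * new_gates_per_mm2:.1f} W/mm^2")
# More gates per area with only modest per-gate savings -> higher W/mm^2,
# i.e. the same silicon has to shed more heat from the same spot.
```

The toy point: packing roughly 4x the gates into the same area while per-gate capacitance and voltage barely drop pushes watts per square millimetre up, and that heat has to leave through the same package.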
Quantum computing is not simply sped-up classical computing.
Who said sped up?
More gates is not "faster"
More gates means more data channeled at once
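A toy version of that classical picture, assuming "gates" here means ordinary Boolean logic gates and nothing more: one full adder per bit position, so a wider word (more gates) moves more bits through in a single pass.

```python
# Toy illustration of the classical "more gates = more data at once" picture.
# Assumes "gates" means ordinary Boolean logic gates (AND/OR/XOR), nothing more.

def full_adder(a: int, b: int, carry_in: int):
    """One bit position built from Boolean gates: returns (sum, carry_out)."""
    s = a ^ b ^ carry_in                         # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))   # AND/OR gates
    return s, carry_out

def ripple_add(x_bits, y_bits):
    """Chain one full adder per bit: more adders (gates) = wider words per pass."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):   # least significant bit first
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 4 + 6 with 4-bit words, LSB first: 0b0100 + 0b0110 = 0b1010
print(ripple_add([0, 0, 1, 0], [0, 1, 1, 0]))  # -> [0, 1, 0, 1, 0]
```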
There is a finite limit to how much energy can pass through a mass.
And yes, it is quite literally the same thing. Integrated chips are gates: open, close, count the data, compile. Amazing engineering managed to reduce that to a multi-layered microscopic behemoth (but with massive failure rates, otherwise we would not still have i3, i5, etc.), but temperature is the brick wall. Smaller means more heat and nowhere for it to go, so the circuit fails. Temperature is the only "invention" of QC and it will never be normalized because keeping these units at such low kelvin sustainably is not feasible. At this point QC is a novelty project like any other supercomputer build that inevitably became a laughable hypothesis-model generator that no normal human cares about and no researcher should have credibility for doing.
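For contrast with the gate-counting picture, here is a minimal sketch of what the textbook qubit statevector model actually computes with. This is plain simulation math that runs on any laptop; it illustrates what the term refers to, not a claim about any particular hardware.

```python
# Minimal sketch of the textbook single-qubit statevector model, for contrast
# with the classical gate picture above. Pure Python, no special hardware.
from math import sqrt

def apply(gate, state):
    """Multiply a 2x2 gate matrix into a 2-entry amplitude vector."""
    (a, b), (c, d) = gate
    s0, s1 = state
    return (a * s0 + b * s1, c * s0 + d * s1)

H = ((1 / sqrt(2), 1 / sqrt(2)),
     (1 / sqrt(2), -1 / sqrt(2)))   # Hadamard gate

ket0 = (1 + 0j, 0 + 0j)             # state |0>

once = apply(H, ket0)               # one H: equal amplitudes on |0> and |1>
twice = apply(H, once)              # a second H: amplitudes interfere back to |0>

prob = lambda s: (abs(s[0]) ** 2, abs(s[1]) ** 2)
print(prob(once))    # ~(0.5, 0.5)  -- measuring here is a coin flip
print(prob(twice))   # ~(1.0, 0.0)  -- deterministic again after interference
# A classical random bit run through the same "randomize twice" step would stay
# 50/50; the signed amplitudes are what make this a different model, not speed.
```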