At this point, given the lack of results from quantum computing, it may be mostly BS.
It's all kind of hazy, and the fact that the picture is still so hazy to the public makes me think this is all bullshit, or at the very least a failed endeavor.
First, here is a laundry list of supposed quantum logic gates (about two dozen): https://en.wikipedia.org/wiki/List_of_quantum_logic_gates
Do any of these exotic things actually exist? Companies are not talking about what kinds of logic gates their machines have, if any. Second, if these can be made, how are they superior to the ordinary logic gates in my computer?
At this point I don't even know if these things make sense in theory.
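For reference, in the theory these gates are nothing exotic to write down: each one is just a unitary matrix applied to a vector of amplitudes. Here is a minimal NumPy sketch of two gates from that list (Hadamard and CNOT), using the standard textbook matrices; this is the paper version, not anybody's hardware:

```python
import numpy as np

# Hadamard gate: puts one qubit into an equal superposition of |0> and |1>
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

# CNOT gate: flips the second qubit iff the first qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])    # a qubit in state |0>
print(H @ ket0)                # [0.707 0.707]: equal amplitudes

# Unlike AND or OR, these gates are reversible: applying H twice undoes it
print(H @ (H @ ket0))          # back to [1. 0.]

# Hadamard then CNOT entangles two qubits into a Bell pair
bell = CNOT @ np.kron(H @ ket0, ket0)
print(bell)                    # [0.707 0 0 0.707]
```

So on paper they are well-defined linear algebra; the open question is whether physical devices actually implement these matrices with useful fidelity.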
For example, they allege that in theory you get a benefit from superposition, like so:
A classical system with 2 bits can only represent one of 4 possible states at a time: 00, 01, 10, or 11. Say we evaluate a function f(x1, x2). To test all possible states with a classical system we have to run the function four times: f(00), f(01), f(10), f(11).
In the quantum approach, we put the 2 qubits into superposition and apply a quantum gate encoding f once.
This compares apples to oranges. The classical system produces more information, yielding 4 separate results. In the quantum example, measuring the register collapses the superposition to just one of the outcomes at random, not all four values. So a lot of information is lost.
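To make that concrete, here is a minimal NumPy sketch (the toy f and the phase-oracle convention are my own illustration, not anything a vendor has published): the oracle touches all four basis states in one step, but a measurement returns a single random bit string, not the four values of f.

```python
import numpy as np

rng = np.random.default_rng()

# Uniform superposition over the 4 basis states 00, 01, 10, 11
state = np.full(4, 0.5)

# Toy f: returns 1 only for input 10 (index 2) -- a hypothetical example
f = np.array([0, 0, 1, 0])

# Standard phase oracle: flips the sign of every amplitude where f(x) = 1
U_f = np.diag((-1.0) ** f)
state = U_f @ state                 # "evaluates" f on all inputs at once

# Measurement collapses to ONE basis state, sampled from |amplitude|^2
probs = np.abs(state) ** 2
outcome = rng.choice(4, p=probs)
print(f"measured {outcome:02b}")    # one random 2-bit string; f is not read out
```

The sign flips are in there, but a single measurement cannot see them; extracting anything takes interference tricks on top, and even then you get one global property of f, not four separate values.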
Second, how does one even build the logic to apply the function f to, say, 10 interconnected qubits? Would it require custom hardware for every new complex function you wish to apply? And what if we don't want all possible states, but just a subset of the states on a given number of qubits? Based on the theoretical example this cannot be controlled; you just act on all the states at once.
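At least in simulation, the oracle itself does not need custom hardware per function: any classical f can be compiled mechanically into a diagonal matrix. A sketch (my own, continuing the phase-oracle convention above; the divisible-by-7 f is hypothetical):

```python
import numpy as np

def phase_oracle(f, n_qubits):
    """Build the 2^n x 2^n diagonal phase oracle for a classical f: int -> {0, 1}."""
    dim = 2 ** n_qubits
    return np.diag([(-1.0) ** f(x) for x in range(dim)])

# Hypothetical f on 10 qubits: "is x divisible by 7?"
U = phase_oracle(lambda x: int(x % 7 == 0), 10)
print(U.shape)   # (1024, 1024) -- the matrix doubles in size with each added qubit
```

Note the catch: the matrix is 2^n by 2^n, so writing it down is easy and scaling it is not; real hardware would have to realize f as a sequence of small gates, and how hard that is for an interesting f is exactly my question.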
Let's look at this press release about some supposedly amazing machine: https://www.livescience.com/technology/computing/worlds-1st-fault-tolerant-quantum-computer-coming-2024-10000-qubit-in-2026
"Logical qubits — physical quantum bits, or qubits, connected through quantum entanglement — reduce errors in quantum computers by storing the same data in different places. This diversifies the points of failure when running calculations."
Why would duplicating an atom (essentially) fix errors in your system? If an entangled bit is causing me errors, wouldn't it show the same error whether or not it is entangled (i.e. duplicated) elsewhere? And if it doesn't show the error elsewhere, doesn't that violate what quantum entanglement is supposed to mean?
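For what it's worth, the mechanism they describe sounds closer to a classical repetition code than to duplication (copying a quantum state outright is forbidden by the no-cloning theorem; the entangled encoding is supposed to spread one logical bit across several physical ones). The classical version of the idea is easy to demonstrate; the sketch below is that classical analogy, not their hardware:

```python
import numpy as np

rng = np.random.default_rng()

def encode(bit, copies=3):
    """Repetition code: store the same bit in several places."""
    return np.full(copies, bit)

def noisy(codeword, flip_prob=0.1):
    """Each stored copy independently flips with probability flip_prob."""
    flips = rng.random(codeword.size) < flip_prob
    return codeword ^ flips

def decode(codeword):
    """Majority vote: one flipped copy no longer corrupts the logical bit."""
    return int(codeword.sum() > codeword.size / 2)

trials = 100_000
raw_errors = sum(noisy(encode(1, copies=1))[0] != 1 for _ in range(trials))
enc_errors = sum(decode(noisy(encode(1))) != 1 for _ in range(trials))
print(raw_errors / trials)   # ~0.10  -- unprotected bit
print(enc_errors / trials)   # ~0.028 -- 3p^2 - 2p^3: fewer logical errors
```

The quantum version has to pull this off without reading the copies directly (measurement would destroy the superposition), which is where the entanglement comes in, and which is the part I will believe when they actually demonstrate it at scale.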
Even if there is promise in this technology, it seems like too much, too soon. A lot of fancy theory, but not much to show for it.
Google recently claimed its QC did something that would take a supercomputer practically forever. But what did it actually do? It performed random circuit sampling, which is of no practical use. It got random numbers out of a chaotic system. Wow.
Theoretical physicists are prone to getting lost in interesting bullshit. Some of them probably have a legitimate 160 IQ, but what is the point if it is wasted on fantasy? You can have your theories about all sorts of special kinds of logic gates, but will atoms actually behave the way you theorized? How many assumptions are baked into the cake? Theoreticians don't have to worry about this, but engineers do.
Then once money gets involved, all bets are off.
Einstein was fond of thought experiments. I think that is part of the problem with relativity.
Einstein was even worse. He took the Lorentz transforms, not understanding (or intentionally twisting the point) that they describe distortions in observations when the speed of something approaches the speed at which observation information is acquired, and passed them off as equations describing the reality of the observed object. And then, doing "thought experiments" built on this cheating, he produced a lot of wrong conclusions.
The Lorentz equations are universal for any method of receiving information about an observed object. If it is sound, then the speed of sound takes the place of the speed of light (or the speed of carrier pigeons, if you use pigeons to receive your information). Obviously that does not mean faster-than-sound or faster-than-pigeon motion is impossible just because otherwise the observer would see something strange, say, that non-existent "telegraph to the past" paradox.
So Einstein managed a triple cheat: first, using wrong conclusions about the meaning of the Lorentz math to invent non-existent paradoxes; second, using those paradoxes to "prove" the points of special relativity; and third, building on all of that an impossible model of the world in which the speed of receiving observation information is projected onto real things.
Einstein is far ahead of Turing in scientific dishonesty.
I've heard you make this argument about Lorentz transformations, and perhaps you're right, but the way I learned about them, they were ad hoc fixes to Lorentz's ether theory in response to the Michelson-Morley experiment. They require that the preferred frame be the ECI (Earth-Centered Inertial) frame, and tell us that somehow length can physically contract and time can "dilate" when moving relative to this frame, which presumably carries the ether wind.
I haven't read the original papers from Lorentz, so this is all second hand. But that explanation made sense to me and forms part of my narrative on relativity as I understand it.
Lorentz transformations are universal for any wave/field, not necessarily an electromagnetic one.
Just for fun, I checked pedowikia, both the Russian and the English versions.
The Russian version has a note acknowledging this.
In the English version I didn't find any mention of the fact that almost any wave/field equation (all second-order wave equations, for sure) is Lorentz-covariant, with the speed of wave/field propagation playing the role of c.
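That covariance claim, at least, is easy to check symbolically. A small SymPy sketch (my own, not from either article): substitute the Lorentz-transformed coordinates into the 1-D wave operator with propagation speed c, and the operator comes back unchanged; nothing in the algebra cares whether c is the speed of light, of sound, or of pigeons.

```python
import sympy as sp

v, c = sp.symbols('v c', positive=True)
g = 1 / sp.sqrt(1 - v**2 / c**2)    # the "Lorentz factor", c = any wave speed

# Lorentz substitution: x' = g*(x - v*t), t' = g*(t - v*x/c**2).
# Chain rule: d/dx = A d/dx' + B d/dt',  d/dt = C d/dx' + D d/dt', where:
A, B = g, -g * v / c**2
C, D = -g * v, g

# Coefficients of d^2/dx^2 - (1/c^2) d^2/dt^2 rewritten in primed coordinates:
print(sp.simplify(A**2 - C**2 / c**2))      # 1       (d^2/dx'^2 term)
print(sp.simplify(B**2 - D**2 / c**2))      # -1/c^2  (d^2/dt'^2 term)
print(sp.simplify(2*A*B - 2*C*D / c**2))    # 0       (cross term vanishes)
```

So any second-order wave equation is form-invariant under the transformation built from its own propagation speed, which is the point being made above.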
Meanwhile, in acoustic and sonar measurements and in hydro/aerodynamics, a coefficient equal to the square root of one minus the Mach number squared, sqrt(1 - M^2), called β or the "compressibility factor", is in common use. Since M = v/c_s, with c_s the speed of sound, that is literally the sqrt(1 - v^2/c^2) from the Lorentz transformations, with c_s in place of c.
But somehow acoustic scientists, for unknown reasons, do not insist that nothing can move faster than the speed of sound. :)