

Well, you’re not half wrong: the LUMI supercomputer, currently no. 9, runs Cray’s SUSE Linux Enterprise Server with minimal kernel daemons to reduce OS jitter on the compute nodes.


Add this to your list: https://en.wikipedia.org/wiki/Fock_space
Why not both?


You need to know which basis the sender used, collapse the state, and measure in that same basis. Then you sample a statistical distribution, and the desired information is the average of that distribution. This is well established by the Bell inequality experiments and can definitely be used to gain information.
It is clearly not very efficient, in the sense that many of the transmitted bits are wasted to convey less information. But the advantages of instantaneous and secure communication will be worth it in some use cases.
That is, of course, if engineering issues such as quantum repeaters (a sort of range extender) and high-fidelity storage are properly solved. It has been a few years since I did any quantum information at uni, so I don’t know what the current state of things is.
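The basis-matching point is easy to see in a classical simulation (a toy numpy sketch, not real hardware; the qubit preparation and shot count are made up): measuring in the sender’s basis is deterministic, while measuring in a mismatched basis gives a 50/50 coin flip that carries no information.

```python
import numpy as np

rng = np.random.default_rng(42)

def measure(state, basis, shots, rng):
    """Sample single-qubit measurement outcomes in the Z or X basis."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    if basis == "X":              # rotate into the X basis before measuring
        state = H @ state
    p1 = abs(state[1]) ** 2       # Born rule: probability of outcome 1
    return rng.random(shots) < p1

ket0 = np.array([1.0, 0.0])       # qubit prepared as |0> in the Z basis

same = measure(ket0, "Z", 10_000, rng)    # matching basis: deterministic
wrong = measure(ket0, "X", 10_000, rng)   # mismatched basis: coin flip

print(same.mean())   # -> 0.0
print(wrong.mean())  # ~ 0.5, pure noise
```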


To use the quantum teleportation algorithm you first have to establish an entangled qubit pair, with one photon at the sender and one at the destination. This step takes at least the distance divided by the speed of light. The trick is that you can do this ahead of time, and decide later when, and what information, to encode into the qubit, allowing for “instant” information transfer. Naturally, this requires a very good memory device that preserves the fidelity of the entangled qubits.
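For the curious, the standard teleportation circuit can be sketched as a classical statevector simulation in plain numpy (a toy sketch, not a real quantum SDK; the amplitudes 0.6/0.8 are arbitrary). Note that Bob still needs Alice’s two measurement bits over a classical channel before he can apply his corrections:

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def op(gate, qubit, n=3):
    """Lift a single-qubit gate to an n-qubit operator (qubit 0 = leftmost)."""
    mats = [I] * n
    mats[qubit] = gate
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def cnot(control, target, n=3):
    """CNOT built as a permutation of computational basis states."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1
    return U

rng = np.random.default_rng(0)

# Unknown state |psi> = a|0> + b|1> on qubit 0 (Alice)
a, b = 0.6, 0.8
psi = np.array([a, b])

# Qubits 1 (Alice) and 2 (Bob) share the Bell pair (|00> + |11>)/sqrt(2),
# established ahead of time
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)

# Alice: CNOT(0 -> 1), then H on qubit 0
state = cnot(0, 1) @ state
state = op(H, 0) @ state

# Alice measures qubits 0 and 1 (sample one random outcome)
probs = np.abs(state) ** 2
outcome = rng.choice(8, p=probs)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Project onto the measured branch and renormalise
mask = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1
                 for i in range(8)])
state = np.where(mask, state, 0)
state /= np.linalg.norm(state)

# Bob: classical bits m1, m0 select his correction gates
if m1: state = op(X, 2) @ state
if m0: state = op(Z, 2) @ state

# Bob's qubit now holds |psi>
bob = state[m0 * 4 + m1 * 2 : m0 * 4 + m1 * 2 + 2]
print(np.round(bob, 6))  # -> [0.6 0.8] for any measurement outcome
```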


Not a search engine, but last week I learned of the European Open Websearch project, which is building a new free and open search index. It should already be available to try out. Hopefully we will see some search engines adopt it soon.


Without knowing much about psychology, I would imagine that separating a mindset into a set of orthogonal axes is pretty difficult, and the normal range would probably not follow a normal distribution on each axis. As a result, the N-dimensional volume would not be an N-ball but some complex topological shape, possibly even consisting of multiple disjoint sets. If any of these assumptions hold, then the global point average over the entire space may lie outside many of the “normal” ranges.
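The “average lies outside every normal range” effect is easy to demonstrate numerically. Here is a toy 2-D example with two disjoint clusters (the cluster positions and sizes are invented for illustration): the global mean lands in the empty space between them, far from nearly every actual point.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two disjoint "normal" clusters in 2D, a stand-in for mindset axes
n = 5000
cluster_a = rng.normal(loc=[-3, 0], scale=0.5, size=(n, 2))
cluster_b = rng.normal(loc=[+3, 0], scale=0.5, size=(n, 2))
points = np.vstack([cluster_a, cluster_b])

mean = points.mean(axis=0)                  # global average, near (0, 0)
dists = np.linalg.norm(points - mean, axis=1)

print(np.round(mean, 1))                    # close to [0. 0.]
print((dists > 2).mean())                   # nearly 1.0: almost nobody is near the mean
```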


0.5% of Eluveitie at 1443 minutes; I suppose not too impressive considering 907k monthly listeners. But I’m a varied listener.
Same problem from Denmark