Let’s now talk about the quantum min-entropy, a quantum entropy that is central to the business of generating random numbers. Suppose we have a source of quantum particles P1P2… and that we measure them to produce the binary sequence X = X1X2…. If the particles are prepared with a definite value for some property (say, they are electrons with a well-defined spin in the z-direction) and we measure a complementary property (here, the spin of those electrons in the x-direction), then quantum theory predicts that the outcomes are uniformly distributed and, more importantly, independent of any other measurement.
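This prediction can be checked directly with the Born rule. The sketch below (using NumPy for illustration; the state and basis vectors are the standard textbook ones, not anything specific to this text) prepares a spin-up state along z and computes the probabilities of the two outcomes of an x-direction measurement:

```python
import numpy as np

# A spin prepared "up" along z: the state vector |0> in the z-basis.
psi = np.array([1.0, 0.0])

# The two outcomes of a spin measurement along x correspond to the
# eigenvectors |+> and |-> of the Pauli-X operator.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)

# Born rule: the probability of an outcome is |<outcome|psi>|^2.
p_plus = abs(np.vdot(plus, psi)) ** 2
p_minus = abs(np.vdot(minus, psi)) ** 2

print(p_plus, p_minus)  # both 0.5: the outcome is a perfect coin flip
```

Since both probabilities are exactly 1/2, each measurement behaves as an unbiased coin flip, which is what makes such complementary measurements attractive for randomness generation.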
From a cryptographic point of view, this means that these outcomes are completely unpredictable to any eavesdropper bounded by the laws of quantum mechanics. In real devices, however, noise and imperfections make it impossible to prepare systems with perfectly defined properties or to measure complementary properties perfectly. The conditional quantum min-entropy Hmin(X|E) of X given the eavesdropper’s system E quantifies precisely the number of completely unpredictable bits that can be extracted from those outcomes. But, you may ask yourselves, “how do we know the state of the eavesdropper’s quantum memory E? I am sure she wouldn’t just willingly reveal it”. It turns out that we don’t need to know or assume anything about her system: from a complete description of our particles and measurements we can derive the worst-case state for the adversary. That’s just the beauty of quantum mechanics.
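Computing the full conditional quantity Hmin(X|E) generally requires optimizing over the adversary’s states (a semidefinite program), but the simpler classical, unconditional min-entropy already conveys the idea: Hmin(X) = -log2 max_x p(x), the number of near-uniform bits per symbol that a noisy source guarantees in the worst case. The helper name below is ours, for illustration only:

```python
import numpy as np

def min_entropy(p):
    """Classical min-entropy H_min(X) = -log2(max_x p(x)), in bits.

    This is the unconditional, classical special case; it upper-bounds
    what a device can certify without modeling the eavesdropper.
    """
    p = np.asarray(p, dtype=float)
    if not np.isclose(p.sum(), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -np.log2(p.max())

# An ideal complementary measurement: uniform outcomes, 1 bit each.
print(min_entropy([0.5, 0.5]))   # 1.0

# A noisy device biased towards 0: fewer extractable bits per outcome.
print(min_entropy([0.9, 0.1]))   # ~0.152
```

The second value shows why imperfect devices still have value: a biased source does not give zero randomness, only less of it, and a randomness extractor can then compress many such outcomes into nearly uniform bits.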
Entropy is, undoubtedly, the main concept in quantum information theory. For the interested reader, we recommend the following books on the subject:
- Wilde, M. (2013). Quantum Information Theory. Cambridge: Cambridge University Press.
- Tomamichel, M. (2015). Quantum Information Processing with Finite Resources: Mathematical Foundations (Vol. 5). Springer.
- Renes, J. (2022). Quantum Information Theory: Concepts and Methods. Walter de Gruyter GmbH & Co KG.