In Shannon’s words: “Perfect systems in which the number of cryptograms, the number of messages, and the number of keys are all equal are characterized by the properties that (1) each M is connected to each E by exactly one line, (2) all keys are equally likely.” All keys are equally likely – even keys that don’t “look random”. A simple example will demonstrate why the one-time pad cannot be secure when using truly random keys. We’ll use Schneier’s one-time pad example, from Applied Cryptography: first we encrypt using his choice of a high entropy key, and then using a different key that is possible if we were using a truly random key. Schneier chooses the plaintext ONETIMEPAD and encrypts it using the key TBFRGFARFM, producing the ciphertext IPKLPSFHGQ. But, if (as Schneier writes) “all keys are equally likely”, the one-time pad must be secure for every key. Let’s choose the very first key (in alphabetical order): AAAAAAAAAA. When we encrypt our plaintext of ONETIMEPAD with this key, we end up with a ciphertext of… ONETIMEPAD. While in theory it’s possible that an adversary (knowing we are using a one-time pad) could be fooled, this would only be possible if we live in Mos Eisley (“this is not the plaintext you are looking for”). A less weak-minded adversary would rationally assume that ONETIMEPAD was the plaintext, and that we had sent our message unencrypted.
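The letter arithmetic behind both encryptions can be sketched in a few lines of Python (a sketch, not code from the article). One wrinkle: Schneier’s printed example numbers the alphabet A=1…Z=26, while the all-A identity-key demonstration assumes the more common A=0…Z=25 numbering; an `offset` parameter covers both conventions.

```python
def otp(text, key, offset=0, decrypt=False):
    """Letter-wise one-time pad: c[i] = (p[i] + k[i] + offset) mod 26.

    offset=0 numbers the alphabet A=0..Z=25, so the key 'A' changes
    nothing; offset=1 matches the A=1..Z=26 numbering that Schneier's
    printed example uses.
    """
    sign = -1 if decrypt else 1
    out = []
    for p, k in zip(text, key):
        shift = sign * (ord(k) - ord("A") + offset)
        out.append(chr((ord(p) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

# Schneier's high entropy key reproduces his book's ciphertext...
print(otp("ONETIMEPAD", "TBFRGFARFM", offset=1))  # IPKLPSFHGQ
# ...while the equally likely key AAAAAAAAAA leaves the plaintext unchanged.
print(otp("ONETIMEPAD", "AAAAAAAAAA"))            # ONETIMEPAD
```

Decryption is the same walk in reverse: `otp("IPKLPSFHGQ", "TBFRGFARFM", offset=1, decrypt=True)` recovers ONETIMEPAD.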
![one time pad system](https://d3i71xaburhd42.cloudfront.net/c6feb12b2cdd81fcd1334b7d3b3038e37ebeead3/3-Figure1-1.png)
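Shannon’s “exactly one line” condition can be checked mechanically for a toy system (a hypothetical 3-message, 3-key shift cipher, not an example from the article): with E = (M + K) mod 3, every message M is connected to every cryptogram E by exactly one key.

```python
from itertools import product

# Toy "perfect system": 3 messages, 3 keys, 3 cryptograms, E = (M + K) mod 3.
n = 3
connecting_keys = {}  # (M, E) -> list of keys that map M to E
for m, k in product(range(n), repeat=2):
    e = (m + k) % n
    connecting_keys.setdefault((m, e), []).append(k)

# Condition (1): each M is connected to each E by exactly one line (key).
assert all(len(keys) == 1 for keys in connecting_keys.values())
assert len(connecting_keys) == n * n
print("each of the", n * n, "(M, E) pairs is connected by exactly one key")
```

Because every (M, E) pair is reachable by exactly one key, a ciphertext narrows nothing down: any plaintext remains possible under some key.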
Shannon explicitly differentiates the Vernam cipher from earlier ciphers: “A running key cipher is a Vernam type system where, in place of a random sequence of letters, the key is a meaningful text. Now it is known that running key ciphers can usually be solved uniquely.” However, when Shannon attempts to prove that the Vernam cipher is perfect, rather than simply secure, he uses a definition of the key that is truly random in the probabilistic sense.
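To make the distinction concrete, here is a hypothetical running key encryption (the plaintext and key below are illustrative, not drawn from Shannon or Schneier): it is the same letter-wise addition as a one-time pad, but keyed with meaningful English text.

```python
def running_key_encrypt(plaintext, key_text):
    """Vernam-type letter addition (A=0..Z=25), keyed with meaningful text."""
    return "".join(
        chr((ord(p) - ord("A") + ord(k) - ord("A")) % 26 + ord("A"))
        for p, k in zip(plaintext, key_text)
    )

# The key is taken from an English sentence instead of a random sequence.
# Both plaintext and key carry English letter statistics, which leak into
# the ciphertext -- this is why such ciphers "can usually be solved uniquely".
print(running_key_encrypt("ONETIMEPAD", "THEQUICKBR"))  # HUIJCUGZBU
```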
![one time pad system](https://www.cryptomuseum.com/crypto/img/301278/001/full.jpg)
Vernam’s original 1917 paper proposing an unbreakable stream cipher introduces the concept with this sentence: “If, now, instead of using English words or sentences, we employ a key composed of letters selected absolutely at random, a cipher system is produced which is absolutely unbreakable.” This implies that the set of letter sequences that are valid English is a completely different set from letter sequences chosen completely at random. In fact, the set of letter sequences that are valid English is a subset of all possible letter sequences, not a separate set. If one chooses three letters completely at random, approximately 6% of the results will be English words; 15% of random two-letter sequences are English words. But Vernam wasn’t thinking of random sequences in the probabilistic sense; he was thinking of sequences that have a high level of entropy (in the information theory sense). Bruce Schneier (in Applied Cryptography) describes a high entropy sequence as follows: “It looks random. This means that it passes all the statistical tests of randomness we can find… It is unpredictable… should not be compressible”. A sequence with a high level of entropy never looks like “ABABABABABABABAB”, for example. But because these kinds of sequences are excluded from high entropy keys, such keys are not random in the traditional sense. This concept of randomness exists because these sorts of sequences are very useful for stream ciphers, preventing cryptanalysis based on frequency analysis. The distribution of letters (or bits) in a high entropy sequence is uniform even for relatively short sequences (i.e. it is “locally random”), whereas in an English language passphrase the letter “e” appears far more often than, say, the letter “q”. Computer random number generators produce high entropy sequences, though these kinds of sequences are not appropriate for block ciphers. Shannon understood that a one-time pad would require high entropy keys to be secure.
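The “locally random” point is easy to see by counting letters (the passphrase below is a hypothetical example, not one from the article): an English passphrase has a lopsided letter distribution, while a uniformly random key of the same length favours no letter in expectation.

```python
import random
import string
from collections import Counter

# Hypothetical English passphrase vs. a uniformly random key of equal length.
passphrase = "MYSECRETPASSPHRASEFORENCRYPTINGTHINGS"
rng = random.Random(0)  # seeded so the sketch is reproducible
random_key = "".join(rng.choice(string.ascii_uppercase) for _ in range(len(passphrase)))

eng = Counter(passphrase)
# English text is heavily skewed: "E" is common, "Q" never appears here.
print("passphrase:", eng.most_common(3), "| Q count:", eng["Q"])
print("random key:", Counter(random_key).most_common(3))
```

A frequency-analysis attack exploits exactly this skew, which is why a keystream that merely needs to resist it only has to be locally uniform, not random in the probabilistic sense.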