Suppose you have two pseudorandom bit generators. They’re both fast, but not suitable for cryptographic use. How might you combine them into one generator that is suitable for cryptography?

Coppersmith, Krawczyk, and Mansour [1] proposed a simple but effective approach which they call the **shrinking generator**, also known as **irregular decimation**. The idea is to use one bit stream to sample the other. Suppose the two bit streams are *a*_{i} and *b*_{i}. If *a*_{i} = 1, then output *b*_{i}. Otherwise, throw *b*_{i} away. In NumPy or R notation, this is simply `b[a > 0]`.

## Examples in Python and R

For example, in Python:

```python
>>> import numpy as np
>>> a = np.array([1,0,1,1,0,0,0,1])
>>> b = np.array([1,0,1,0,0,1,1,0])
>>> b[a > 0]
array([1, 1, 0, 0])
```

Here’s the same example in R:

```r
> a = c(1,0,1,1,0,0,0,1)
> b = c(1,0,1,0,0,1,1,0)
> b[a > 0]
[1] 1 1 0 0
```

## Linear Feedback Shift Registers

Coppersmith and colleagues were concerned specifically with linear feedback shift register (**LFSR**) streams. These are efficient sources of pseudorandom bits because they lend themselves to hardware implementation or low-level software implementation. The problem, however, is that linear feedback shift registers are *linear*: they have an algebraic structure that enables simple cryptographic attacks. The shrinking procedure above is *nonlinear* and so less vulnerable to attack.
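For readers who haven't worked with LFSRs, here is a minimal software sketch of a Fibonacci LFSR in Python. The 16-bit register, seed, and taps below are a common textbook example (the polynomial x¹⁶ + x¹⁴ + x¹³ + x¹¹ + 1), not anything specific to the shrinking generator paper.

```python
def lfsr_bits(state, taps, n, width=16):
    """Generate n bits from a Fibonacci LFSR.

    state: nonzero integer seed
    taps:  shift amounts XORed together to form the feedback bit
    width: register size in bits
    """
    out = []
    for _ in range(n):
        out.append(state & 1)          # output the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1     # XOR the tapped bits
        state = (state >> 1) | (fb << (width - 1))
    return out

# Taps (0, 2, 3, 5) correspond to x^16 + x^14 + x^13 + x^11 + 1.
bits = lfsr_bits(0xACE1, (0, 2, 3, 5), 16)
```

Every output bit is an XOR, i.e. a linear function, of the seed bits, which is exactly the structure the shrinking step is meant to break up.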

A potential problem is that the shrinking generator outputs bits at an irregular rate, and a timing attack might reveal something about the sampling sequence *a* unless some sort of buffering conceals this.

## Other stream sources

While the shrinking generator was developed in the context of LFSRs, it seems like it could be used to combine any two PRNGs in hopes that the combination is better than the components. At a minimum, it doesn’t seem it should make things worse [2].

There is a problem of efficiency, however, because on average the shrinking generator consumes 4*n* bits to output *n* bits: both streams advance together, and only the positions where the selector bit is 1, about half of them, survive. For very efficient generators like LFSRs this isn’t a problem, but it could be a problem for other generators.
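The 4*n* figure is easy to check empirically. Here is a quick NumPy sketch (the stream names and seed are illustrative, and the streams are just uniform random bits rather than LFSR output):

```python
import numpy as np

rng = np.random.default_rng(20230101)
N = 100_000
a = rng.integers(0, 2, N)   # selector stream
b = rng.integers(0, 2, N)   # data stream
out = b[a > 0]              # the shrinking step

# 2N bits are consumed (N from each stream) and about N/2 bits are
# produced, so roughly 4 input bits per output bit.
bits_per_output = 2 * N / len(out)
```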

## Self-shrinking generators

A variation on the shrinking generator is the **self-shrinking generator**. The idea is to use half the bits of a stream to sample the other half. Specifically, look at pairs of bits, and if the first bit is a 1, return the second bit. If the first bit is a 0, return nothing.
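In code, the pairing step might look like the following sketch (a straightforward Python rendering, not the hardware-oriented formulation from the literature):

```python
def self_shrink(bits):
    """Self-shrinking generator: scan bits in non-overlapping pairs
    (x, y); output y when x == 1, output nothing when x == 0."""
    out = []
    it = iter(bits)
    for x, y in zip(it, it):   # consecutive non-overlapping pairs
        if x == 1:
            out.append(y)
    return out

# Pairs (1,0), (0,1), (1,1), (0,0): the first and third pairs
# start with 1, so their second bits 0 and 1 are emitted.
self_shrink([1,0, 0,1, 1,1, 0,0])  # -> [0, 1]
```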

## Use in stream ciphers

The eSTREAM competition for cryptographically secure random bit generators included one entry, Decim v2, that uses shrinking generators. The competition had two categories, methods intended for software implementation and methods intended for hardware implementation. Decim was entered in the hardware category. According to the portfolio PDF on the competition site,

> Decim contains a unique component in eSTREAM, that of irregular decimation, and is an interesting addition to the field of stream ciphers.

That sounds like it was the only method to use irregular decimation (i.e. shrinking generators).

The first version of Decim had some flaws, but the document says “no adverse cryptanalytic results are known” for the second version. Decim v2 made it to the second round of the eSTREAM competition but was not chosen as a finalist because

> … the cipher doesn’t seem to deliver such a satisfying performance profile, so while there might be some very interesting elements to the Decim construction, we feel that the current proposal doesn’t compare too well to the other submissions for the hardware profile.

That seems to imply Decim might be competitive with a different implementation or in some context with different trade-offs.


[1] Coppersmith, D., Krawczyk, H., Mansour, Y. The Shrinking Generator. Advances in Cryptology — CRYPTO ’93. Lecture Notes in Computer Science, vol. 773, pp. 22–39. Springer, Berlin.

[2] If *a* and *b* are good sources of random bits, it seems *b* sampled by *a* should be at least as good. In fact, if *a* is poor quality but *b* is good quality, *b* sampled by *a* should still be good. However, the reverse could be a problem. If *b* is biased, say it outputs more 1s than 0s, then if *a* is a quality sample, that sample will be biased in favor of 1s as well.

Using one RNG to sample another does assume independence of the two streams. It may be a moot point, but one could imagine falsely believing two streams to be independent, which would lead to poor results.
