The Soviet license plate game and Kolmogorov complexity

Soviet license plate

Physicist Lev Landau used to play a mental game with Soviet license plates [1]. The plates had the form of two digits, a dash, two more digits, and some letters.

Rules of the game

His game was to apply high school math operators to the numbers on both sides of the dash so that the dash could be replaced by an equal sign. For example, given the license plate 44-74, one solution would be

4! + 4 = 7*4.

Note that we can insert operators, such as !, +, and *, but not more digits.
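A quick sanity check of that example in Python, just confirming both sides come to 28:

    from math import factorial

    # Verify the example solution for plate 44-74: 4! + 4 == 7 * 4
    print(factorial(4) + 4)  # 28
    print(7 * 4)             # 28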

Is there a solution for every possible license plate? That depends on what class of operators you allow.

You could trivialize the game by applying the fractional part operation {x} to both sides since the fractional part of an integer is zero. You could disallow the fractional part operator on the grounds that it’s not clearly a high school math operation, or just disallow it because it makes the game uninteresting.

Universal solution

It turns out there’s a universal solution, starting with the observation that

√(n + 1) = sec arctan √n.

This identity follows from 1 + tan² θ = sec² θ: if θ = arctan √n, then tan θ = √n and so sec θ = √(n + 1). If one side is greater than the other by one, the formula above gives an immediate solution. For example, a solution for the license plate 89-88 would be

√89 = sec arctan √88.

If the difference is greater, the formula can be applied repeatedly. For example, we could apply the formula twice to obtain

√(n + 2) = sec arctan √(n + 1) = sec arctan sec arctan √n

and so a possible solution for 35-37 is

sec arctan sec arctan √35 = √37.
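Here is a rough sketch of how the repeated construction could be automated in Python. The function universal_solution is a made-up name for illustration; it simply applies sec arctan to the square root of the smaller number once per unit of difference:

    def universal_solution(a, b):
        """Build the universal solution for license plate a-b, assuming a <= b."""
        left = f"sqrt({a})"
        for _ in range(b - a):
            left = f"sec(arctan({left}))"
        return f"{left} = sqrt({b})"

    print(universal_solution(35, 37))
    # sec(arctan(sec(arctan(sqrt(35))))) = sqrt(37)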

Kolmogorov complexity

Given that a solution is always possible, we can make the game more interesting by looking for the simplest solution. We have some intuitive idea what this means. With our example of 44-74, the first solution

4! + 4 = 7*4

is simpler than the universal solution

sec arctan sec arctan … √44 = √74

which would require applying secant and arctangent 30 times each.

The Kolmogorov complexity of an object is the length of the shortest computer program to produce the object. We could compute the Kolmogorov complexity of the functions applied to the digits on each side in order to measure how complex a solution is.

To make this precise, we need to specify what our programming language is, and that’s not as easy as it sounds. If we think of mathematics notation as a programming language, do we want to count ! as one character and arctan as 6 characters? That doesn’t seem right. If we wrote “arctan” as “atn” we would use fewer characters without producing a different solution.

Complexity of Python code

To make things more objective, we could look at the length of actual computer programs rather than imagining math notation to be a programming language. Say we choose Python. Then here are a couple functions that compute our two solutions for license plate 44-74.

    from math import sqrt, cos, atan

    def f():
        sec = lambda x: 1/cos(x)
        y = sqrt(44)
        for _ in range(30):
            y = sec(atan(y))
        return y

    def g():
        return sqrt(74)

We could measure the complexity of our functions f and g by counting the number of characters in each one. But there still are difficulties.

What about the import statement? It should count fully toward the length of f, which uses everything imported, but g needs only sqrt and could have used a shorter import statement. More fundamentally, are we cheating by even importing a library?

Furthermore, the two functions above don’t produce exactly the same output due to limited precision. We could imagine that our imported functions are infinitely precise, but then we’re not really using Python but rather an idealized version of Python.

And what about the loop? It introduces new digits, 3 and 0, and so violates the rules of Landau’s game. Should we unroll the loop before calculating complexity?
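Setting those objections aside for a moment, here is a rough sketch of the character-count measure, assuming the functions f and g defined above are in scope. Python’s inspect.getsource returns the source text as written, so this counts whitespace and all:

    import inspect

    def source_length(func):
        # Number of characters in the function's source code
        return len(inspect.getsource(func))

    # Compare the two candidate programs by raw source length.
    # This ignores the import statement and the other caveats above.
    print(source_length(f), source_length(g))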

Thought experiment

Kolmogorov complexity is a very useful concept, but it’s more of a thought experiment than something you can practically compute. We can imagine the shortest program to compute something, but we can seldom know that we’ve actually found such a program. All we can know in practice is upper bounds.

In theory you can enumerate all Turing machines of a given length, or all Python programs of a given length, and find the shortest one that does a given task [2], but the list grows exponentially with length.

However, it is possible to compute the length of particular programs if we deal with some of the complications mentioned above. We could make Landau’s game a two-player game by seeing who could come up with the simpler solution in a fixed amount of time.

Back to Landau

If we allow sine and degree in our set of operators, there’s a universal solution due to B. S. Gorobets. For n ≥ 6, n! is a multiple of 360, and so

sin (n!)° = 0.

And if n is less than 6, its two-digit representation begins with zero, so we can multiply the digits to get zero.
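A quick numerical check of Gorobets’ observation, with the caveat that floating point gives values near zero rather than exactly zero:

    import math

    # For n >= 6, n! is a multiple of 360, so sin(n! degrees) = 0
    for n in range(6, 10):
        degrees = math.factorial(n)
        print(n, degrees % 360, math.sin(math.radians(degrees)))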

If we disallow transcendental functions, we block Gorobets’ trick and we have functions whose length we can objectively measure in a programming language.


[1] Harun Šiljak. Soviet Street Mathematics: Landau’s License Plate Game. Mathematical Intelligencer, https://doi.org/10.1007/s00283-017-9743-9

[2] In addition to having to test an exponentially growing set of programs, there’s the problem of knowing if even one program will complete.

If you run a program and see it complete, then you know it completes! If you don’t see it complete, it might just need more time, or it might never finish. The halting problem theorem says that there’s no program that can tell whether another arbitrary program will terminate. You still might be able to determine whether a particular program terminates, but you might not.

Photo credit: CC BY-SA 3.0 via Wikimedia Commons
Rear automobile license plate of the USSR, 1936 design

The door was unlocked

From the dedication of C. S. Lewis’ A Preface to Paradise Lost:

Apparently the door of the prison was really unlocked all the time; but it was only you who thought of trying the handle.

In context he was saying that Charles Williams had recovered a clear understanding of Milton that had been obfuscated for over a century.

That line made me think of a quote from Alexander Grothendieck (that I haven’t been able to find this morning) to the effect that progress in mathematics has often waited for someone to be bold enough to ask a simple question or introduce a simple concept.

Update: Thanks to Roland Elliot for providing the quote I was missing. Ronald Brown says that Grothendieck said in a letter to him in 1982:

The introduction of the cipher 0 or the group concept was general nonsense too, and mathematics was more or less stagnating for thousands of years because nobody was around to take such childish steps …

Complex for whom?

From Out of the Tar Pit:

… the type of complexity we are discussing in this paper is that which makes large systems hard to understand. It is this that causes us to expend huge resources in creating and maintaining such systems. This type of complexity has nothing to do with complexity theory — the branch of computer science which studies the resources consumed by a machine executing a program. The two are completely unrelated — it is a straightforward matter to write a small program in a few lines which is incredibly simple (in our sense) and yet is of the highest complexity class (in the complexity theory sense).


Simplicity is hard to sell

People say they want simple things, but they don’t. Not always.

Donald Norman argues in Living with Complexity that people really want things that are easy to use, not things that are simple. They want things that are familiar, even if the familiar is complex.

People also want all the features they can handle. If you’re competing with a product that is overwhelmingly complex, then simplicity sells. Otherwise simplicity is perceived as weakness. The more features the better. People may enjoy using simpler products, but they buy complex ones.

It’s hard to remove even vestigial features. If people are in the habit of doing something that is no longer necessary (or never did add much value), you’ll have a lot of work to do to convince them that it’s OK to let go of it.

Related post: Simple versus easy

Simple versus easy

Rich Hickey argued in a recent talk that simplicity is objective but easiness is subjective. Something is simple if it is singular: it does one thing, it is made of one thing, etc. Something is easy if it is close at hand, i.e. familiar.

I think this is a useful distinction, though simplicity is a little harder to pin down than the talk implies. Simplicity is relative and requires context. Rich Hickey’s context is programming languages, and in that context it may be fairly objective to say one construction is simpler than another because it does less.

For example, Hickey says one complication of Lisp is that it uses parentheses for function calls and for grouping. It would be simpler if one symbol did one thing. Mathematica does something like this. Parentheses are for grouping only. Function calls are delimited by square brackets. The square brackets are inconsistent with standard mathematical notation, so they’re not as easy (i.e. familiar), but they are simpler.

Mnemonics often complicate things to make them easier. For example, consider this mnemonic for pi:

How I want a drink, alcoholic of course, after the heavy lectures involving quantum mechanics.

This sentence is easier for most people to remember than 3.14159265358979.  But the sentence is also more complex. A computer can represent the number in 8 bytes but the sentence takes 94 bytes of ASCII, more in Unicode.
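As an aside, both claims are easy to check: the word lengths decode to the digits of pi, and the sentence is 94 bytes in ASCII. A small sketch:

    # Each word's letter count gives one digit of pi
    sentence = ("How I want a drink, alcoholic of course, "
                "after the heavy lectures involving quantum mechanics.")
    digits = "".join(str(sum(c.isalpha() for c in w)) for w in sentence.split())
    print(digits)                         # 314159265358979
    print(len(sentence.encode("ascii")))  # 94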

Sometimes complex is better than simple, better in some context. It’s easier to memorize coherent sentences than numbers. But imagine if we got so excited by this mnemonic that we decided to represent all numbers by sentences. This would be amusing for a little while but would quickly become painful.

Some things are objectively simple but inhuman. Counting seconds since some event (e.g. Unix time) is much simpler than our system of keeping time with days, weeks, months, and years. But our human experience is profoundly influenced by the rotations and revolutions of our planet. Even weeks, which have no astronomical significance, seem to be aligned with human nature. So we keep our complex calendars while our computers count seconds.

I believe Hickey’s main point is that we need to reevaluate what we believe is simple. Maybe what we think is simple is complex but familiar. Maybe something new that is objectively simpler would become even easier once we’re used to it. (In particular, Hickey would like for us to try his programming language.) Once you practice thinking this way, you’ll see that many familiar things could be made simpler.

Related post: A little simplicity goes a long way

How has math changed your view of the world?

Several people have asked me whether studying math changed my view of the world, and if so how.

I see applications of math everywhere. But more fundamentally, studying math has led me to believe that complex problems sometimes have simple solutions.

Simple solutions may be hard or impossible to find. But you’re more likely to find a simple solution if you believe it exists because you’ll keep looking longer.


Third-system effect

The third-system effect describes a simple system rising like a phoenix out of the ashes of a system that collapsed under its own complexity.

A notorious ‘second-system effect’ often afflicts the successors of small experimental prototypes. The urge to add everything that was left out the first time around all too frequently leads to huge and overcomplicated design. Less well known, because less common, is the ‘third-system effect’: sometimes, after the second system has collapsed of its own weight, there is a chance to go back to simplicity and get it right.

From The Art of Unix Programming by Eric S. Raymond. Available online here.

Raymond says that Unix was such a third system. What are other examples of the third-system effect?


Demand for simplicity?

From Donald Norman’s latest book Living with Complexity:

… the so-called demand for simplicity is a myth whose time has passed, if it ever existed.

Make it simple and people won’t buy. Given a choice, they will take the item that does more. Features win over simplicity, even when people realize that features mean more complexity. You do too, I’ll bet. Haven’t you ever compared two products side by side, feature by feature, and preferred the one that did more? …

Would you pay more money for a washing machine with fewer controls? In the abstract, maybe. At the store, probably not.

Donald Norman’s assessment sounds wrong at first. Don’t we all like things to be simple? Not if by “simple” we mean “fewer features.”

A general theme in Living with Complexity is that complexity is inevitable and often desirable, but it can be managed. We say we want things that are simple, but we really want things that are easy to use. The book gives several examples to illustrate how different those two ideas are.

If something is complex but familiar and well designed, it’s easy to use. If something is simple but unfamiliar or poorly designed, it’s hard to use.
