Logic vs C
I recently made a mistake because I interpreted the symbol ^ in the wrong context. I saw an expression of the form a ^ b and thought of the ^ as AND, as in logic, but the author’s intent was XOR, as in C.
How else might someone misinterpret the ^ operator? Here are a couple more possibilities. They’re sufficiently different contexts that they’re more like puns than likely mistakes.
The ^ symbol is also used in mathematics for wedge products of differential forms. Surely that’s completely different from bit twiddling, right? Actually it’s similar.
The wedge product of a differential 1-form with itself is 0, i.e.

x ^ x = 0

for any 1-form x. But that’s also true if you interpret x as a Boolean variable and ^ as XOR.
Another property of wedge products is that products are alternating:
x ^ y = – (y ^ x).
That doesn’t look right for bits, but it is. If you think of 0 and 1 as elements of a binary field, i.e. GF(2), then – means additive inverse. But everything is its own additive inverse in this context, so
x ^ y = y ^ x = – (y ^ x)
for bits with XOR just as for differential forms with exterior product.
Not everything from wedge products carries over to bits, however. For wedge products, any repetition of a term makes the product zero, whereas for XOR only even numbers of repetitions are guaranteed to cancel. For example,
x ^ x ^ x = 0
is always true for differential forms, but is not true for bits if x = 1.
The ^ symbol is also used for exponents: writing x^y in LaTeX produces x raised to the power y.
In the context of bits, exponentiation is the same as ≥. That is, for binary variables x and y, x raised to the power y is 1 if and only if x ≥ y, taking the usual convention that 0 to the power 0 is 1.
You might object that I’m moving back and forth freely between thinking of 0 and 1 as elements of a field and as truth values, but this post is all about being a little sloppy with context.