Linear logic arithmetic

Linear logic has connectives not used in classical logic. The connectives ⊗ and & are conjunctions, ⊕ and ⅋ are disjunctions, and ! and ? are analogous to the modal operators ◻ and ◇ (necessity and possibility).

\begin{table}[h]
\centering
\begin{tabular}{r|c|c}
 & add & mult \\ \hline
conjunction & \& & $\otimes$ \\
disjunction & $\oplus$ & $\parr$ \\
\end{tabular}
\end{table}

Another way to classify the connectives is to say that ⊕ and & are additive, ⊗ and ⅋ are multiplicative, and ! and ? are exponential. Still another classification says that ⊕, ⊗, and ! have positive polarity, while &, ⅋, and ? have negative polarity.

\begin{table}[h]
\centering
\begin{tabular}{r|c|c|c}
 & add & mult & exp \\ \hline
positive & $\oplus$ & $\otimes$ & $!$ \\
negative & \& & $\parr$ & $?$ \\
\end{tabular}
\end{table}

This post will show that these arithmetical names are justified by analogy.

For one thing, multiplication-like connectives distribute over addition-like connectives.
\begin{align*}
A \otimes (B \oplus C) &\equiv (A \otimes B) \oplus (A \otimes C) \\
A \parr (B \,\&\, C) &\equiv (A \parr B) \,\&\, (A \parr C) \\
(A \oplus B) \otimes C &\equiv (A \otimes C) \oplus (B \otimes C) \\
(A \,\&\, B) \parr C &\equiv (A \parr C) \,\&\, (B \parr C)
\end{align*}
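For comparison, these equivalences mirror the distributive law of ordinary arithmetic, with the multiplicative connectives ⊗ and ⅋ playing the role of multiplication and the additive connectives ⊕ and & playing the role of addition:

\begin{align*}
a(b + c) &= ab + ac \\
(a + b)c &= ac + bc
\end{align*}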

The exponential-like operators also behave analogously to the equation

\[ e^{a+b} = e^a e^b \]

with respect to the addition-like and multiplication-like connectives.

\begin{align*}
!(A \,\&\, B) &\equiv (!A) \otimes (!B) \\
?(A \oplus B) &\equiv (?A) \parr (?B)
\end{align*}

In both equations, applying an exponential-like operator to an addition-like combination yields the multiplication-like combination of the exponential-like operator applied to each argument separately.
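To spell out the analogy, here is the dictionary at work in each equivalence, reading ! and ? as exponentials, the additive connectives as addition, and the multiplicative connectives as multiplication:

\begin{align*}
! &\leftrightarrow e^{(\,\cdot\,)}, & \& &\leftrightarrow +, & \otimes &\leftrightarrow \times \\
? &\leftrightarrow e^{(\,\cdot\,)}, & \oplus &\leftrightarrow +, & \parr &\leftrightarrow \times
\end{align*}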

The term “polarity” is justified by the fact that linear negation flips the polarity of connectives.

In the following analogs of De Morgan’s laws, negation turns conjunctions into disjunctions and vice versa, and it reverses the polarity of connectives.

\begin{align*}
(A \otimes B)^\bot &\equiv A^\bot \parr B^\bot \\
(A \oplus B)^\bot &\equiv A^\bot \,\&\, B^\bot \\
(A \parr B)^\bot &\equiv A^\bot \otimes B^\bot \\
(A \,\&\, B)^\bot &\equiv A^\bot \oplus B^\bot
\end{align*}

Negating exponential connectives also flips polarity.

\begin{align*}
(!A)^\bot &\equiv ?(A^\bot) \\
(?A)^\bot &\equiv !(A^\bot)
\end{align*}
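As a worked example (the formula below is an arbitrary illustration), pushing negation through a compound formula turns each positive connective into its negative dual and vice versa:

\begin{align*}
\bigl(!(A \oplus B) \otimes C\bigr)^\bot &\equiv \bigl(!(A \oplus B)\bigr)^\bot \parr C^\bot \\
&\equiv ?\bigl((A \oplus B)^\bot\bigr) \parr C^\bot \\
&\equiv ?(A^\bot \,\&\, B^\bot) \parr C^\bot
\end{align*}

Here the positive connectives !, ⊕, and ⊗ become the negative connectives ?, &, and ⅋.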

These rules are analogous to the rules for negating quantifiers in classical logic.

\begin{align*}
\neg(\forall x, p) &\iff \exists x, \neg p \\
\neg(\exists x, p) &\iff \forall x, \neg p
\end{align*}
