Kronecker sum

I’m working on a project these days where I’ve used four different kinds of matrix product, which made me wonder if there’s another kind of product out there that I could find some use for.

In the process of looking around for other matrix products, I ran across the Kronecker sum. I’ve seen Kronecker products many times, but I’d never heard of Kronecker sums.

The Kronecker sum is defined in terms of the Kronecker product, so if you’re not familiar with the latter, you can find a definition and examples here. Essentially, you multiply each scalar element of the first matrix by the second matrix as a block matrix.

The Kronecker product of an m × n matrix A and a p × q matrix B is an mp × nq matrix K = A ⊗ B. You could think of K as an m × n matrix whose entries are p × q blocks.
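A quick way to get a feel for the sizes is NumPy, whose `kron` function computes Kronecker products. This is just an illustrative sketch; the matrices are made up.

```python
import numpy as np

# Made-up matrices: A is 2 x 3 (m x n), B is 4 x 5 (p x q)
A = np.arange(6).reshape(2, 3)
B = np.arange(20).reshape(4, 5)

K = np.kron(A, B)  # each entry of A scales a full copy of B
print(K.shape)     # (8, 15) = (mp, nq)
```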

So, what is the Kronecker sum? It is defined for two square matrices, an n × n matrix A and an m × m matrix B. The sizes of the two matrices need not match, but the matrices do need to be square.  The Kronecker sum of A and B is

A ⊕ B = A ⊗ Im + In ⊗ B

where Im and In are identity matrices of size m and n respectively.

Does this make sense dimensionally? The left side of the (ordinary) matrix addition is nm × nm, and so is the right side, so the addition makes sense.
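The definition translates directly into NumPy. Here is a sketch; the helper name `kronecker_sum` is mine, not a standard API.

```python
import numpy as np

def kronecker_sum(A, B):
    """Kronecker sum of an n x n matrix A and an m x m matrix B:
    A ⊕ B = A ⊗ I_m + I_n ⊗ B, an nm x nm matrix."""
    n, m = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(m)) + np.kron(np.eye(n), B)

A = np.array([[1, 2], [3, 4]])
B = np.array([[1, 0, 1], [1, 2, 0], [2, 0, 3]])
S = kronecker_sum(A, B)  # a 6 x 6 matrix
```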

However, the Kronecker sum is not commutative, and usually things called “sums” are commutative. Products are not always commutative, but it goes against convention to call a non-commutative operation a sum. Still, the Kronecker sum is kinda like a sum, so it’s not a bad name.

I don’t have any application in mind (yet) for the Kronecker sum, but presumably it was defined for a good reason, and maybe I’ll run across an application, maybe even in the project alluded to at the beginning.

There are several identities involving Kronecker sums, and here’s one I found interesting:

exp( A ) ⊗ exp( B ) = exp( A ⊕ B ).

If you haven’t seen the exponential of a matrix before, basically you stick your matrix into the power series for the exponential function.
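That is, exp(M) = I + M + M²/2! + M³/3! + ⋯, with matrix powers in place of scalar powers. Here is a naive sketch of that series in Python; real implementations (e.g. SciPy's `expm`) use more careful algorithms.

```python
import numpy as np

def expm_series(M, terms=60):
    """Matrix exponential from the truncated power series
    exp(M) = I + M + M^2/2! + ...  Naive; for illustration only."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k  # term is now M^k / k!
        result = result + term
    return result

# For a diagonal matrix this reduces to the elementwise exponential
# of the diagonal entries.
print(expm_series(np.diag([1.0, 2.0])))
```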

Examples

First, let’s define a couple matrices A and B.

\begin{align*} A &= \left( \begin{array}{cc} 1 & 2 \\ 3 & 4 \\ \end{array} \right) \\ B &= \left( \begin{array}{ccc} 1 & 0 & 1 \\ 1 & 2 & 0 \\ 2 & 0 & 3 \\ \end{array} \right) \end{align*}

We can compute the Kronecker sums

S = A ⊕ B

and

T = B ⊕ A

with Mathematica to show they are different.

    A = {{1, 2}, {3, 4}}
    B = {{1, 0, 1}, {1, 2, 0}, {2, 0, 3}}
    S = KroneckerProduct[A, IdentityMatrix[3]] + 
        KroneckerProduct[IdentityMatrix[2], B]
    T = KroneckerProduct[B, IdentityMatrix[2]] + 
        KroneckerProduct[IdentityMatrix[3], A]

This shows

\begin{align*} A \oplus B &= \left( \begin{array}{cccccc} 2 & 0 & 1 & 2 & 0 & 0 \\ 1 & 3 & 0 & 0 & 2 & 0 \\ 2 & 0 & 4 & 0 & 0 & 2 \\ 3 & 0 & 0 & 5 & 0 & 1 \\ 0 & 3 & 0 & 1 & 6 & 0 \\ 0 & 0 & 3 & 2 & 0 & 7 \\ \end{array} \right) \\ B \oplus A &= \left( \begin{array}{cccccc} 2 & 2 & 0 & 0 & 1 & 0 \\ 3 & 5 & 0 & 0 & 0 & 1 \\ 1 & 0 & 3 & 2 & 0 & 0 \\ 0 & 1 & 3 & 6 & 0 & 0 \\ 2 & 0 & 0 & 0 & 4 & 2 \\ 0 & 2 & 0 & 0 & 3 & 7 \\ \end{array} \right) \end{align*}

and so the two matrices are not equal.

We can compute the matrix exponentials of A and B with the Mathematica function MatrixExp to see that

\begin{align*} \exp(A) &= \left( \begin{array}{cc} 51.9691 & 74.7364 \\ 112.105 & 164.074 \\ \end{array} \right) \\ \exp(B) &= \left( \begin{array}{ccc} 9.85692 & 0. & 11.6789 \\ 6.9634 & 7.38906 & 4.71563 \\ 23.3581 & 0. & 33.2150 \\ \end{array} \right) \end{align*}

(I actually used MatrixExp[N[A]] and similarly for B so Mathematica would compute the exponentials numerically rather than symbolically. The latter takes forever and it’s hard to read the result.)

Now we have

\begin{align*} \exp(A) \otimes \exp(B) &= \left( \begin{array}{cccccc} 512.255 & 0. & 606.948 & 736.673 & 0. & 872.852 \\ 361.881 & 384.002 & 245.067 & 520.421 & 552.233 & 352.431 \\ 1213.9 & 0. & 1726.15 & 1745.7 & 0. & 2482.38 \\ 1105.01 & 0. & 1309.28 & 1617.26 & 0. & 1916.22 \\ 780.631 & 828.349 & 528.646 & 1142.51 & 1212.35 & 773.713 \\ 2618.55 & 0. & 3723.56 & 3832.45 & 0. & 5449.71 \\ \end{array} \right) \\ \exp(A \oplus B) &= \left( \begin{array}{cccccc} 512.255 & 0. & 606.948 & 736.673 & 0. & 872.852 \\ 361.881 & 384.002 & 245.067 & 520.421 & 552.233 & 352.431 \\ 1213.9 & 0. & 1726.15 & 1745.7 & 0. & 2482.38 \\ 1105.01 & 0. & 1309.28 & 1617.26 & 0. & 1916.22 \\ 780.631 & 828.349 & 528.646 & 1142.51 & 1212.35 & 773.713 \\ 2618.55 & 0. & 3723.56 & 3832.45 & 0. & 5449.71 \\ \end{array} \right) \end{align*}

and so the two matrices are equal.
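The same verification can be sketched in Python. To keep the sketch self-contained it uses a truncated power series for the matrix exponential (SciPy's `scipy.linalg.expm` would be the standard choice); the series is adequate for matrices this small.

```python
import numpy as np

def expm_series(M, terms=60):
    # Truncated power series for exp(M); fine for these small matrices.
    result, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        result = result + term
    return result

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[1.0, 0.0, 1.0], [1.0, 2.0, 0.0], [2.0, 0.0, 3.0]])

S = np.kron(A, np.eye(3)) + np.kron(np.eye(2), B)  # A ⊕ B
lhs = np.kron(expm_series(A), expm_series(B))      # exp(A) ⊗ exp(B)
print(np.allclose(lhs, expm_series(S)))            # True
```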

Even though the identity

exp( A ) ⊗ exp( B ) = exp( A ⊕ B )

may look symmetrical, it’s not. The matrices on the left do not commute in general. And not only are A ⊕ B and B ⊕ A different in general, their exponentials are also different. For example

\exp(B\oplus A) = \left( \begin{array}{cccccc} 512.255 & 736.673 & 0. & 0. & 606.948 & 872.852 \\ 1105.01 & 1617.26 & 0. & 0. & 1309.28 & 1916.22 \\ 361.881 & 520.421 & 384.002 & 552.233 & 245.067 & 352.431 \\ 780.631 & 1142.51 & 828.349 & 1212.35 & 528.646 & 773.713 \\ 1213.9 & 1745.7 & 0. & 0. & 1726.15 & 2482.38 \\ 2618.55 & 3832.45 & 0. & 0. & 3723.56 & 5449.71 \\ \end{array} \right)


3 thoughts on “Kronecker sum”

  1. Kronecker sums can be useful for studying matrix equations (Riccati, Lyapunov, Sylvester) and matrix formulations of difference operators on tensor product grids. As for Kronecker products, there are nice theorems about the eigenvalues and eigenvectors of Kronecker sums. These applications and properties as well as the non-commutativity are discussed in a post by Nick Higham [1].

    [1] https://nhigham.com/2020/08/25/what-is-the-kronecker-product/

  2. This is almost spooky: I hadn’t heard of the Khatri-Rao product before reading your comment, and this morning I found a use for it.
