Relating Rényi entropy and q-log entropy

I’ve written before about Rényi entropy H_q and most recently q-log entropy S_q, two generalizations of Shannon entropy.

There are simple equations relating Rényi entropy and q-log entropy if we measure both in nats:

\begin{align*}
S_q &= \frac{1}{1-q} \left( \exp\left( (1-q) H_q \right) - 1 \right) \\
H_q &= \frac{1}{1-q} \log\left( (1-q) S_q + 1 \right)
\end{align*}
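As a sanity check, here is a short Python sketch (the distribution and the value of q are arbitrary choices for illustration, and the function names are mine) that computes H_q and S_q directly from their definitions and confirms that each conversion formula recovers the other quantity:

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy H_q in nats; reduces to Shannon entropy at q = 1."""
    if q == 1:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p**q)) / (1 - q)

def qlog_entropy(p, q):
    """q-log entropy S_q in nats: S_q = (sum(p^q) - 1)/(1 - q)."""
    if q == 1:
        return -np.sum(p * np.log(p))
    return (np.sum(p**q) - 1) / (1 - q)

p = np.array([0.5, 0.25, 0.125, 0.125])
q = 2.5

Hq = renyi_entropy(p, q)
Sq = qlog_entropy(p, q)

# Each conversion formula recovers the other entropy.
assert np.isclose((np.exp((1 - q) * Hq) - 1) / (1 - q), Sq)
assert np.isclose(np.log((1 - q) * Sq + 1) / (1 - q), Hq)
```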

I mentioned in the post on q-log entropy that there are two possible ways it could be defined. The equations above apply to the version I called S_q in that post, the one defined using ln_q(1/p), not the alternative defined using −ln_q(p). Recall that these are not equal unless q equals 1. When q does equal 1, Rényi entropy and q-log entropy both reduce to Shannon entropy.
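For q ≠ 1 the two candidate definitions really do give different values. Here is a quick numerical illustration (again a Python sketch; ln_q is the q-logarithm ln_q(x) = (x^(1−q) − 1)/(1−q) from the earlier post, and the helper names are mine):

```python
import numpy as np

def lnq(x, q):
    """q-logarithm: ln_q(x) = (x^(1-q) - 1)/(1-q); reduces to ln(x) at q = 1."""
    if q == 1:
        return np.log(x)
    return (x**(1 - q) - 1) / (1 - q)

p = np.array([0.5, 0.25, 0.125, 0.125])

# The version the conversion formulas apply to, and the alternative.
v1 = lambda q: np.sum(p * lnq(1 / p, q))   # sum of p * ln_q(1/p)
v2 = lambda q: -np.sum(p * lnq(p, q))      # sum of -p * ln_q(p)

print(v1(2.5), v2(2.5))  # different values when q != 1

shannon = -np.sum(p * np.log(p))
assert np.isclose(v1(1), shannon)  # both reduce to Shannon entropy at q = 1
assert np.isclose(v2(1), shannon)
```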

Source: Tom Leinster, Entropy and Diversity: The Axiomatic Approach.