If a coin comes up heads with probability p and tails with probability 1-p, the entropy in the coin flip is
S = –p log₂ p – (1 – p) log₂ (1 – p).
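As a sanity check, here's a minimal Python sketch of this entropy function (the name `binary_entropy` is mine, not standard):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a coin that comes up heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # convention: 0 log 0 = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

A fair coin gives `binary_entropy(0.5) == 1.0`, the maximum.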
It’s common to start with p and compute entropy, but recently I had to go the other way around: given entropy, solve for p. It’s easy to come up with an approximate solution.
Entropy in this case is approximately quadratic:
S ≈ 4p(1 – p),
and solving this quadratic for p gives
p ≈ (1 ± √(1 – S))/2.
This is a good approximation if S is near 0 or 1 but mediocre in the middle. You could solve for p numerically, say with Newton's method, to get more accuracy if needed.
As Sjoerd Visscher pointed out in the comments, the quadratic approximation for entropy is much better if you raise it to the power 3/4. When I added this new approximation to the graph above, the new approximation agreed with the correct value to within the thickness of the plotting line.
To make the approximation error visible, here's a plot of the absolute value of the error of the two approximations, on a log scale.
The error in the new approximation is about an order of magnitude smaller, sometimes more.
The improved approximation for entropy is
S ≈ (4p(1 – p))^(3/4)
and so the new approximation for probability is
p ≈ (1 ± √(1 – S^(4/3)))/2.
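A quick numerical check of both inverse approximations (taking the − branch, i.e. p ≤ 1/2; the function names here are mine):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a coin with heads probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def p_quadratic(S):
    """Original quadratic approximation: p ≈ (1 - √(1 - S))/2."""
    return (1 - math.sqrt(1 - S)) / 2

def p_improved(S):
    """Improved approximation: p ≈ (1 - √(1 - S^(4/3)))/2."""
    return (1 - math.sqrt(1 - S ** (4 / 3))) / 2

# Compare errors at p = 0.3, in the "mediocre middle" region.
S = binary_entropy(0.3)
err_quad = abs(p_quadratic(S) - 0.3)
err_impr = abs(p_improved(S) - 0.3)
```

At p = 0.3 the improved formula's error is roughly an order of magnitude smaller than the quadratic one's, consistent with the plot described above.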