Upper and lower bounds for the normal distribution function

Let Z be a standard normal random variable. These notes present upper and lower bounds for the complementary cumulative distribution function

 \Phi^c(t) = P(Z > t) = \frac{1}{\sqrt{2\pi}} \int_t^\infty e^{-x^2/2}\, dx.

We prove simple bounds first, then state improved bounds without proof.

An upper bound is easy to obtain. For t > 0 we have x/t > 1 for x in (t, ∞), and so

\begin{eqnarray*} \Phi^c(t) &=& \frac{1}{\sqrt{2\pi}} \int_t^\infty e^{-x^2/2}\, dx \\ &<& \frac{1}{\sqrt{2\pi}} \int_t^\infty \frac{x}{t}e^{-x^2/2}\, dx \\ &=& \frac{1}{\sqrt{2\pi}} \frac{1}{t} e^{-t^2/2}. \end{eqnarray*}
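As a quick numerical sanity check of this bound (a sketch using only Python's standard library; it relies on the identity Φ^c(t) = erfc(t/√2)/2, and the function names are mine):

```python
from math import erfc, exp, pi, sqrt

def Phi_c(t):
    """Complementary normal CDF P(Z > t) via the complementary error function."""
    return 0.5 * erfc(t / sqrt(2))

def upper_bound(t):
    """The bound exp(-t^2/2) / (t sqrt(2 pi)), valid for t > 0."""
    return exp(-t**2 / 2) / (t * sqrt(2 * pi))

# The inequality should hold strictly for every positive t.
for t in [0.5, 1.0, 2.0, 4.0]:
    assert Phi_c(t) < upper_bound(t)
```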

We can also show there is a lower bound

\Phi^c(t) > \frac{1}{\sqrt{2\pi}} \frac{t}{t^2 + 1} e^{-t^2/2}.

To prove this lower bound, define

g(t) = \Phi^c(t) - \frac{1}{\sqrt{2\pi}} \frac{t}{t^2 + 1} e^{-t^2/2}.

We will show that g(t) is always positive. Clearly g(0) > 0. From the derivative

g'(t) = -\frac{2}{\sqrt{2\pi}} \frac{e^{-t^2/2}}{(t^2 + 1)^2}

we see that g is strictly decreasing. Since g(t) tends to zero as t goes to infinity, g must always be positive.
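The simplification behind this derivative is easy to miss, so it is worth checking symbolically. Here is a sketch assuming SymPy is available (the variable names are mine):

```python
import sympy as sp

t = sp.symbols('t', positive=True)

# Phi^c(t) = erfc(t/sqrt(2)) / 2
Phi_c = sp.erfc(t / sp.sqrt(2)) / 2

# g(t) as defined in the text
g = Phi_c - t * sp.exp(-t**2 / 2) / (sp.sqrt(2 * sp.pi) * (t**2 + 1))

# The claimed derivative g'(t)
expected = -2 * sp.exp(-t**2 / 2) / (sp.sqrt(2 * sp.pi) * (t**2 + 1)**2)

# The difference should simplify to exactly zero.
assert sp.simplify(sp.diff(g, t) - expected) == 0
```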

Combining the inequalities above, for t > 0 we have

\frac{t}{t^2 +1} < \sqrt{2\pi} e^{t^2/2} \Phi^c(t) < \frac{1}{t}.
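Both sides of this sandwich can be checked numerically, even far into the tail where Φ^c itself underflows ordinary intuition (a sketch; `math.erfc` again supplies Φ^c):

```python
from math import erfc, exp, pi, sqrt

def scaled_tail(t):
    """sqrt(2 pi) * exp(t^2/2) * Phi^c(t), the middle term of the sandwich."""
    return sqrt(2 * pi) * exp(t**2 / 2) * 0.5 * erfc(t / sqrt(2))

# Both inequalities should hold strictly for every positive t.
for t in [0.5, 1.0, 2.0, 5.0, 10.0]:
    assert t / (t**2 + 1) < scaled_tail(t) < 1 / t
```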

Abramowitz and Stegun give bounds on the error function from which we can derive different bounds on the normal distribution. Formula 7.1.13 from Abramowitz and Stegun reads

\frac{1}{x + \sqrt{x^2 + 2}} < e^{x^2} \int_x^\infty e^{-t^2}\, dt \leq \frac{1}{x + \sqrt{x^2 + 4/\pi}}.

Let t = √2x. Since the integral above equals √π Φ^c(t) when x = t/√2, the inequality above yields

\frac{1}{t + \sqrt{t^2 + 4}} < \sqrt{\frac{\pi}{2}} \exp\left(\frac{t^2}{2} \right) \, \Phi^c(t) \leq \frac{1}{t + \sqrt{t^2 + \frac{8}{\pi}}}.

This post includes Python code based on the inequality above, showing that the upper and lower bounds become quite close to each other as t gets large.
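A minimal version of that comparison might look like the following (a sketch; the function names `lower` and `upper` are mine, and both bound the scaled tail √(π/2) exp(t²/2) Φ^c(t)):

```python
from math import pi, sqrt

def lower(t):
    """Lower bound on sqrt(pi/2) * exp(t^2/2) * Phi_c(t)."""
    return 1 / (t + sqrt(t**2 + 4))

def upper(t):
    """Upper bound on sqrt(pi/2) * exp(t^2/2) * Phi_c(t)."""
    return 1 / (t + sqrt(t**2 + 8 / pi))

# The ratio of upper to lower bound shrinks toward 1 as t grows.
for t in [1.0, 2.0, 4.0, 8.0, 16.0]:
    print(t, upper(t) / lower(t))
```

At t = 1 the two bounds differ by roughly 12%, while by t = 50 they agree to better than 0.02%, so for large t either bound serves as an accurate approximation.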