# Upper and lower bounds for the normal distribution function

Let Z be a standard normal random variable. These notes present upper and lower bounds for the complementary cumulative distribution function

$$P(Z > t) = \frac{1}{\sqrt{2\pi}} \int_t^\infty e^{-x^2/2}\, dx.$$

We prove simple bounds first, then state improved bounds without proof.

An upper bound is easy to obtain. Since x/t > 1 for x in (t, ∞), we have

$$P(Z > t) = \frac{1}{\sqrt{2\pi}} \int_t^\infty e^{-x^2/2}\, dx < \frac{1}{\sqrt{2\pi}} \int_t^\infty \frac{x}{t}\, e^{-x^2/2}\, dx = \frac{1}{t\sqrt{2\pi}}\, e^{-t^2/2}.$$

We can also show there is a lower bound

$$P(Z > t) > \frac{t}{1+t^2} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}.$$

To prove this lower bound, define

$$g(t) = P(Z > t) - \frac{t}{1+t^2} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}.$$

We will show that g(t) is always positive. Clearly g(0) > 0. From the derivative

$$g'(t) = -\frac{2}{(1+t^2)^2} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}$$

we see that g is strictly decreasing. Since g(t) tends to zero as t goes to infinity, g must always be positive.
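As a numerical sanity check (my addition, not part of the original argument), one can evaluate g(t) = P(Z > t) − t/(1 + t²) · e^{−t²/2}/√(2π) at a few points using only the standard library and confirm that it is positive and decreasing:

```python
import math

def normal_tail(t):
    """P(Z > t) for standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(t / math.sqrt(2))

def g(t):
    """g(t) = P(Z > t) - (t / (1 + t^2)) * exp(-t^2/2) / sqrt(2*pi)."""
    phi = math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    return normal_tail(t) - t / (1 + t * t) * phi

ts = [0.0, 0.5, 1.0, 2.0, 4.0, 6.0]
values = [g(t) for t in ts]
assert all(v > 0 for v in values)                      # g stays positive
assert all(a > b for a, b in zip(values, values[1:]))  # g is decreasing
print([f"{v:.3e}" for v in values])
```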

Combining the inequalities above, we have, for t > 0,

$$\frac{t}{1+t^2} \cdot \frac{e^{-t^2/2}}{\sqrt{2\pi}} < P(Z > t) < \frac{1}{t} \cdot \frac{e^{-t^2/2}}{\sqrt{2\pi}}.$$
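As a quick illustration (my addition), the simple bounds are easy to check numerically, since the exact tail probability is available through the standard library's `erfc`:

```python
import math

def normal_tail(t):
    """P(Z > t) for standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(t / math.sqrt(2))

def phi(t):
    """Standard normal density."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

for t in [0.5, 1.0, 2.0, 4.0]:
    lower = t / (1 + t * t) * phi(t)  # simple lower bound
    upper = phi(t) / t                # simple upper bound
    assert lower < normal_tail(t) < upper
    print(f"t={t}: {lower:.4e} < {normal_tail(t):.4e} < {upper:.4e}")
```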

Abramowitz and Stegun give bounds on the error function from which we can derive different bounds on the normal distribution. Formula 7.1.13 from Abramowitz and Stegun reads

$$\frac{1}{x + \sqrt{x^2+2}} < e^{x^2} \int_x^\infty e^{-u^2}\, du \le \frac{1}{x + \sqrt{x^2 + 4/\pi}}, \qquad x \ge 0.$$

Let t = √2 x. Since P(Z > t) = \frac{1}{\sqrt{\pi}} \int_{t/\sqrt{2}}^\infty e^{-u^2}\, du, the inequality above yields

$$\frac{2}{t + \sqrt{t^2+4}} \cdot \frac{e^{-t^2/2}}{\sqrt{2\pi}} < P(Z > t) \le \frac{2}{t + \sqrt{t^2 + 8/\pi}} \cdot \frac{e^{-t^2/2}}{\sqrt{2\pi}}.$$

This post includes Python code based on the inequality above and shows that the upper and lower bounds are quite close to each other as t gets large.
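Here is a sketch of what such code might look like (my reconstruction, not necessarily the code the post originally contained), comparing the two bounds derived from Abramowitz & Stegun 7.1.13:

```python
import math

def normal_tail(t):
    """P(Z > t) for standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(t / math.sqrt(2))

def lower_bound(t):
    """Lower bound on P(Z > t) derived from Abramowitz & Stegun 7.1.13."""
    return 2 / (t + math.sqrt(t * t + 4)) * math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def upper_bound(t):
    """Upper bound on P(Z > t) derived from Abramowitz & Stegun 7.1.13."""
    return 2 / (t + math.sqrt(t * t + 8 / math.pi)) * math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

for t in [1.0, 2.0, 4.0, 8.0]:
    lo, hi = lower_bound(t), upper_bound(t)
    assert lo < normal_tail(t) <= hi
    print(f"t={t}: upper/lower ratio = {hi / lo:.6f}")  # ratio approaches 1 as t grows
```

The ratio of the upper to the lower bound shrinks toward 1 as t increases, which is the sense in which the two bounds become quite close for large t.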