Monotonic interpolation

Accuracy isn’t everything. Sometimes when you’re approximating a function you care more about matching its qualitative behavior than matching its numerical values.

One example is interpolating monotonic data. Say your data show that increasing input always increases output. You’d likely want a function that interpolates your data to have the same property, and you might be willing to sacrifice some accuracy to ensure monotonicity. If the interpolating function is not monotone, in a visual application you might get a distracting wiggle in what should be a smoothly bending curve. In software used to control a physical device, it could lead to unexpected behavior, or even disaster, if an increase in input led to a decrease in output, even a small decrease.

Linear interpolation preserves monotonicity: if in your data increasing x leads to increasing y, the same is true of a linear interpolation to the data. But for higher-order interpolation this isn’t the case. A simple example would be the three points (-1, 1), (2, 4), and (3, 9). The unique quadratic polynomial interpolating these three points is x², which is not monotonic on any interval containing 0 in its interior. We’d like to have a smoother and more accurate [1] way to interpolate data while still preserving monotonicity.
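
Here’s a quick way to see this numerically, a small Python sketch with NumPy (not part of the original example):

    import numpy as np

    # Monotone data: y increases with x
    x = np.array([-1.0, 2.0, 3.0])
    y = np.array([ 1.0, 4.0, 9.0])

    # The unique quadratic through the three points
    coeffs = np.polyfit(x, y, deg=2)
    print(coeffs)  # approximately [1, 0, 0], i.e. y = x^2

    # Sample between the first two data points: the interpolant dips
    # down toward 0 near x = 0 even though the data are increasing
    grid = np.linspace(-1, 2, 7)
    print(np.polyval(coeffs, grid))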

It’s unlikely that a polynomial interpolating monotone data will be monotone. I suspect that given a set of monotonic data, it’s more likely that a polynomial spline interpolant will be monotone than a polynomial interpolant because the former is more flexible. Formulating this suspicion into a conjecture that could be settled theoretically, or even tested empirically, would be a non-trivial but interesting exercise.
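
Here’s one rough way the suspicion could be tested empirically: a Monte Carlo sketch with arbitrary choices for the nodes and for the distribution of the increments, so the exact fractions shouldn’t be taken too seriously.

    import numpy as np
    from scipy.interpolate import CubicSpline

    rng = np.random.default_rng(1)
    x = np.arange(8.0)                 # fixed interpolation nodes
    grid = np.linspace(0, 7, 2001)     # fine grid for checking monotonicity

    def is_monotone(vals):
        return np.all(np.diff(vals) >= -1e-12)

    poly_count = spline_count = 0
    trials = 2000
    for _ in range(trials):
        # random strictly increasing data: cumulative sums of positive increments
        y = np.cumsum(rng.uniform(0.0, 1.0, size=x.size))
        p = np.polyfit(x, y, deg=x.size - 1)   # degree-7 interpolating polynomial
        s = CubicSpline(x, y)                  # interpolating cubic spline
        poly_count   += is_monotone(np.polyval(p, grid))
        spline_count += is_monotone(s(grid))

    print("monotone polynomial interpolants:  ", poly_count / trials)
    print("monotone cubic spline interpolants:", spline_count / trials)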

A cubic spline interpolating monotone data is not necessarily monotone. However, there is a way to choose the derivatives at the data points so that a cubic Hermite spline fitting the data will be monotone. Fritsch and Carlson published an algorithm for this in 1980.
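
SciPy’s PchipInterpolator provides a monotone piecewise cubic Hermite interpolant along these lines (a close relative of the Fritsch and Carlson construction). A small comparison against an ordinary cubic spline on step-like monotone data:

    import numpy as np
    from scipy.interpolate import CubicSpline, PchipInterpolator

    # Monotone data with an abrupt step, the kind of data where an
    # unconstrained cubic spline tends to overshoot
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([0.0, 0.0, 0.1, 0.9, 1.0, 1.0])

    grid = np.linspace(0, 5, 501)
    spline = CubicSpline(x, y)         # interpolates, but may wiggle
    pchip  = PchipInterpolator(x, y)   # monotone piecewise cubic Hermite

    print("cubic spline monotone?", bool(np.all(np.diff(spline(grid)) >= 0)))
    print("pchip monotone?       ", bool(np.all(np.diff(pchip(grid)) >= 0)))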


[1] If the function sampled by your data is smooth, then in general a smooth approximation to the function will be more accurate than a piecewise linear approximation.

5 thoughts on “Monotonic interpolation”

  1. In scientific applications I’ve run into the need for isotonic regression before. If I recall correctly it was basically a case where I had a lot of noisy data that formed a calibration curve that I wanted to invert for some reason. So I think I basically wanted to ‘machine learn’ the data given the fact that I knew it was monotonic. I think I came to the conclusion that for a mathematician, adding this type of derivative constraint to the smoothing spline was totally viable, but for a chemist it was a bit beyond my comfort zone. I never did find a python package that I found fully satisfying. It is certainly a topic of interest for me because I am sure it will come up again.
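
    For what it’s worth, the basic monotone-fit case is what scikit-learn’s IsotonicRegression does, though it produces a free-form monotone fit rather than the smooth constrained spline described above. A minimal sketch with made-up noisy data:

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        rng = np.random.default_rng(0)

        # hypothetical noisy calibration curve with a monotone underlying trend
        x = np.linspace(0, 10, 50)
        y = np.log1p(x) + rng.normal(scale=0.1, size=x.size)

        iso = IsotonicRegression(increasing=True)
        y_fit = iso.fit_transform(x, y)      # monotone fit to the noisy data

        print(np.all(np.diff(y_fit) >= 0))   # True: fitted values never decrease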

    This line of thinking also took me down the road of shape-constrained fitting, where I wanted to fit a ‘peak’ with an ill-defined functional form to a set of noisy data. In concept I viewed this as a more complex form of isotonic regression. But again, I never really found a satisfying computational answer, though I was never sure I had a well-enough formed problem either.

    Someone wrote an interesting Matlab package though:
    https://www.mathworks.com/matlabcentral/fileexchange/24443-slm-shape-language-modeling

    BTW, thanks for the blog posts!

  2. There is a literature on more general “shape preserving” interpolation (e.g., preserving monotonicity, convexity, smoothness) that probably started with Schumaker’s 1983 paper using quadratic splines. A more recent paper (which also cites Schumaker’s) is at https://core.ac.uk/download/pdf/82616503.pdf
