# Numerical experiments, Tips, Tricks and Gotchas

## Polynomial Tricks

### Introduction

Polynomials are just about the simplest mathematical functions that exist, requiring only multiplications and additions for their evaluation. Yet they also have the flexibility to represent very general nonlinear relationships [1, 2]. Moreover, the Weierstrass approximation theorem states that every continuous function defined on a closed interval $[a,b]$ can be uniformly approximated as closely as desired by a polynomial function [3]. Polynomials can be evaluated efficiently, without actually raising their arguments to powers, by using Horner's method [4]. However, additional tricks can further increase the efficiency of the polynomial technique.
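Horner's method can be sketched in a few lines of Python (the function name is illustrative):

```python
def horner(coeffs, x):
    """Evaluate a polynomial with coefficients given in decreasing
    order of power, using only N multiplications and N additions."""
    result = 0
    for c in coeffs:
        result = result * x + c
    return result

# 2x^3 - 3x + 5 at x = 2: 2*8 - 6 + 5 = 15
print(horner([2, 0, -3, 5], 2))  # -> 15
```

Note that no power function is ever called: each step folds one more coefficient into a running product.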

### Choosing appropriate polynomials for the problem

A polynomial function is actually a sum of monomials, special polynomials with only one nonzero coefficient: $$f(x)=\sum_{n=0}^{N}a_{n}x^{n}=\sum_{n=0}^{N}a_{n}M_{n}(x)\label{eq:fxm}$$ where $$M_{n}(x)=x^{n}\label{eq:mono}$$ On the other hand, the same polynomial function can be equally well expressed in terms of any $N+1$ linearly independent polynomials of degree at most $N$: $$f(x)=\sum_{n=0}^{N}b_{n}P_{n}(x)\label{eq:fxp}$$ One should try to find a polynomial basis that simplifies the problem. This trick can simplify the derivation and/or the use of a polynomial approximation.

### Example

Let's consider the power sum problem: $$S_{n}(m)=\sum_{k=1}^{n}k^{m}\label{eq:snm1}$$ For the first two values of $m$ the solutions are: \begin{eqnarray*} S_{n}(0) & = & n\\ S_{n}(1) & = & \frac{n(n+1)}{2} \end{eqnarray*} For an arbitrary $m$ we can estimate $S_{n}(m)\approx\intop_{0}^{n}k^{m}dk=\frac{n^{m+1}}{m+1}$. This gives the leading term. Therefore we conclude that $S_{n}(m)$ must be a polynomial in $n$ of degree $m+1$. In particular, $$S_{n}(2)=b_{0}+b_{1}\, n+b_{2}\, n^{2}+b_{3}\, n^{3}\label{eq:sn2}$$ We can find the coefficients by equating (\ref{eq:sn2}) to the known values of the sum for four values of $n$, e.g. for $n=1,2,3,4$.
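The right-hand sides for such a fit are just the sums themselves, computed directly from the definition (a brute-force check, not part of the trick):

```python
def power_sum(n, m):
    """S_n(m) computed directly from the definition."""
    return sum(k**m for k in range(1, n + 1))

# The four known values of S_n(2) for n = 1, 2, 3, 4
print([power_sum(n, 2) for n in range(1, 5)])  # -> [1, 5, 14, 30]
```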

### A small trick

Note that for $m>0$ the summation in (\ref{eq:snm1}) can start with $k=0$: $$S_{n}(m)=\sum_{k=0}^{n}k^{m}\label{eq:snm}$$ Therefore we can use the fact that $S_{0}(m)=0$ and take the smaller values $n=0,1,2,3$.

### The direct solution

The basis polynomials for (\ref{eq:sn2}) are $$\vec{P}(n)\equiv\{P_{0}(n),P_{1}(n),P_{2}(n),P_{3}(n)\}=\{1,n,n^{2},n^{3}\}$$ For $n=0,1,2,3$ we have \begin{eqnarray*} 0 & = & b_{0}\\ 1 & = & b_{0}+b_{1}+b_{2}+b_{3}\\ 5 & = & b_{0}+2b_{1}+4b_{2}+8b_{3}\\ 14 & = & b_{0}+3b_{1}+9b_{2}+27b_{3} \end{eqnarray*} This is a system of linear equations with a dense matrix (in general $(N+1)\times(N+1)$ for a polynomial of degree $N$). It can be solved e.g. by $LU$ decomposition.
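The direct solution can be sketched in Python. Exact rational arithmetic with `fractions.Fraction` and plain Gaussian elimination stand in for an $LU$ solver here; for large systems one would use a linear-algebra library:

```python
from fractions import Fraction

ns = [0, 1, 2, 3]
# Right-hand side: S_n(2) for n = 0..3, computed from the definition
rhs = [sum(k**2 for k in range(1, n + 1)) for n in ns]

# Dense matrix in the monomial basis {1, n, n^2, n^3}
A = [[Fraction(n)**p for p in range(4)] for n in ns]
b = [Fraction(v) for v in rhs]

# Gaussian elimination to upper-triangular form
for i in range(4):
    for j in range(i + 1, 4):
        f = A[j][i] / A[i][i]
        A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
        b[j] -= f * b[i]

# Back substitution
x = [Fraction(0)] * 4
for i in range(3, -1, -1):
    x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, 4))) / A[i][i]

# Coefficients of S_n(2) = b0 + b1 n + b2 n^2 + b3 n^3
print([str(f) for f in x])  # -> ['0', '1/6', '1/2', '1/3']
```

The result agrees with the closed form $S_{n}(2)=n(n+1)(2n+1)/6=n/6+n^{2}/2+n^{3}/3$.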

### Trick 1

Let's choose the following basis polynomials $$\vec{P}(n)=\{1,\, n,\, n(n-1),\, n(n-1)(n-2)\}\label{eq:H}$$ or $$S_{n}(2)=b_{0}+b_{1}\, n+b_{2}\, n(n-1)+b_{3}\, n(n-1)(n-2)\label{eq:sn2-1}$$ Now the matrix is already triangular \begin{eqnarray*} 0 & = & b_{0}\\ 1 & = & b_{0}+b_{1}\\ 5 & = & b_{0}+2b_{1}+2b_{2}\\ 14 & = & b_{0}+3b_{1}+6b_{2}+6b_{3} \end{eqnarray*} and the system can be easily solved \begin{eqnarray*} 0 & = & b_{0}\\ 1 & = & 0+b_{1}\\ 3 & = & 0+0+2b_{2}\\ 2 & = & 0+0+0+6b_{3} \end{eqnarray*} After some algebra we get a well-known result: $$S_{n}(2)=\frac{n(n+1)(2n+1)}{6}\label{eq:sn2-2}$$ Note that expressions like (\ref{eq:sn2-1}) can be efficiently calculated using a Horner-type form: $$S_{n}(2)=b_{0}+n\left\{ b_{1}+(n-1)\left[b_{2}+b_{3}(n-2)\right]\right\} \label{eq:sn2-3}$$
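The triangular system above can be solved by plain forward substitution; a minimal Python sketch with exact rational arithmetic:

```python
from fractions import Fraction

# Triangular matrix from the basis {1, n, n(n-1), n(n-1)(n-2)}
# evaluated at n = 0, 1, 2, 3
A = [[1, 0, 0, 0],
     [1, 1, 0, 0],
     [1, 2, 2, 0],
     [1, 3, 6, 6]]
rhs = [0, 1, 5, 14]  # S_n(2) for n = 0..3

# Forward substitution: each unknown follows from the ones before it
b = []
for i in range(4):
    s = Fraction(rhs[i]) - sum(A[i][j] * b[j] for j in range(i))
    b.append(s / A[i][i])

print([str(v) for v in b])  # -> ['0', '1', '3/2', '1/3']

# Horner-type evaluation: S_n(2) = b0 + n{b1 + (n-1)[b2 + b3(n-2)]}
def s2(n):
    return b[0] + n * (b[1] + (n - 1) * (b[2] + b[3] * (n - 2)))

# Agrees with the closed form n(n+1)(2n+1)/6
assert all(s2(n) == n * (n + 1) * (2 * n + 1) // 6 for n in range(20))
```

No elimination step is needed at all: the triangular structure of the basis does that work for us.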

### Trick 2

Using Lagrange's approach [5] we can get the solution immediately. In this case the basis polynomials are $$\vec{P}(n)=\left\{ \frac{(n-1)(n-2)(n-3)}{(-1)\cdot(-2)\cdot(-3)},\,\frac{n(n-2)(n-3)}{1\cdot(-1)\cdot(-2)},\,\frac{n(n-1)(n-3)}{2\cdot1\cdot(-1)},\,\frac{n(n-1)(n-2)}{3\cdot2\cdot1}\right\} \label{eq:Lp}$$ They satisfy the following relation $$P_{i}(n)=\delta_{n,i}\:,\: n,i=0,1,2,3\label{eq:delta}$$ Therefore $b_{n}=S_{n}(2)\,,\: n=0,1,2,3$ and finally $$S_{n}(2)=0\cdot P_{0}(n)+1\cdot P_{1}(n)+5\cdot P_{2}(n)+14\cdot P_{3}(n)\label{eq:sn2-4}$$ with $P_{i}(n)$ from (\ref{eq:Lp}). The drawback of this "quick" solution is that the polynomials (\ref{eq:Lp}) are computationally more expensive to evaluate.
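Lagrange's construction can be sketched directly (the function names are illustrative); since $S_{n}(2)$ is a cubic and four nodes determine a cubic uniquely, the interpolant is exact for all $n$:

```python
from fractions import Fraction

values = {0: 0, 1: 1, 2: 5, 3: 14}  # S_i(2) at the nodes i = 0..3

def lagrange_basis(i, n, nodes=(0, 1, 2, 3)):
    """P_i(n): equals 1 at node i and 0 at every other node."""
    p = Fraction(1)
    for j in nodes:
        if j != i:
            p *= Fraction(n - j, i - j)
    return p

def s2(n):
    """S_n(2) as the Lagrange combination 0*P0 + 1*P1 + 5*P2 + 14*P3."""
    return sum(values[i] * lagrange_basis(i, n) for i in values)

# Exact well beyond the interpolation nodes
assert all(s2(n) == n * (n + 1) * (2 * n + 1) // 6 for n in range(20))
```

Counting operations in `lagrange_basis` makes the stated drawback concrete: each basis polynomial costs several multiplications and divisions, versus the three multiplications of the Horner-type form (\ref{eq:sn2-3}).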

### Sources of inspiration

The basis set (\ref{eq:Lp}) can be called the polynomial delta-function because of Eq. (\ref{eq:delta}). The Heaviside step function [6] is the analog for (\ref{eq:H}). The Chebyshev polynomials [7, 8, 9] are the polynomial analog of harmonic functions. Analogies with other elementary and special functions can be very fruitful. In my article [10] I discuss several options for calculating smoothing polynomials. In the tutorial [11] the use of orthogonal polynomials is discussed.

### References

1. Smyth, G. K. (2004). Polynomial approximation. In: Encyclopedia of Biostatistics Second Edition, Volume 6, P. Armitage and T. Colton (eds.), Wiley, London, pages 4136-4140.
2. Abramowitz, Milton; Stegun, Irene A., eds. (1972), Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, New York: Dover Publications
3. Wikipedia, Stone-Weierstrass theorem.
4. Wikipedia, Horner's method.
5. Wikipedia, Lagrange polynomial.
6. Wikipedia, Heaviside step function.
7. Wikipedia, Chebyshev polynomials.
8. C. Lanczos, Applied Analysis, Prentice Hall, Inc. Englewood Cliffs, N.J. 1956 (There are also later reprints).
9. R. Hamming, Numerical methods for scientists and engineers, Dover, 1987.
10. N. Shokhirev, Least squares polynomial approximation, www.numericalexpert.com, 2013.
11. N. Shokhirev, The power of orthogonality, www.numericalexpert.com, 2012.