In mathematics, a Taylor series is a representation of a function as an infinite sum of terms that are calculated from the values of the function’s derivatives at a single point.
It is common practice to approximate a function by using a finite number of terms of its Taylor series. Taylor’s theorem gives quantitative estimates on the error in this approximation. Any finite number of initial terms of the Taylor series of a function is called a Taylor polynomial. The Taylor series of a function is the limit of that function’s Taylor polynomials, provided that the limit exists. A function may not be equal to its Taylor series, even if its Taylor series converges at every point. A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function. Source.
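For the curious, the standard formula behind this definition (as given in most calculus texts) is: the Taylor series of a function $f$ about a point $a$ is

```latex
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n
     = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots
```

Truncating the sum after finitely many terms gives a Taylor polynomial, and Taylor's theorem bounds how far that polynomial can stray from $f$.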
I’ve never heard of this, and I don’t completely understand it, but it’s interesting! Above: A Taylor approximation of the function sin(x). Image source.
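To see the approximation in action, here's a minimal Python sketch (not from the original post) that sums the first few nonzero terms of the Taylor series of sin(x) about 0, where the series is sin(x) = x − x³/3! + x⁵/5! − ⋯. Adding terms makes the partial sum hug the true curve over a wider interval, which is exactly what the plot above illustrates.

```python
import math

def sin_taylor(x, n_terms):
    """Approximate sin(x) using the first n_terms nonzero terms of its
    Taylor series about 0: sum over k of (-1)^k * x^(2k+1) / (2k+1)!."""
    total = 0.0
    for k in range(n_terms):
        total += (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
    return total

# Watch the Taylor polynomials converge toward the true value at x = 1.
x = 1.0
for n in (1, 2, 3, 5):
    print(f"{n} term(s): {sin_taylor(x, n):.10f}")
print(f"math.sin:  {math.sin(x):.10f}")
```

With just five terms the approximation already agrees with `math.sin(1.0)` to several decimal places near 0; farther from the expansion point, more terms are needed.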