
Taylor Series

In mathematics, we use a Taylor series to represent a function as an infinite sum of polynomial terms. Although this can be a difficult concept to grasp, the calculations themselves aren't overly complex. So what exactly is a Taylor series? Why is it so important? When do we use it? Let's find out:

The basic definition of a Taylor Series

A Taylor series is a representation of a function as an infinite sum of terms calculated from the values of its derivatives at a single point. It is not limited to exponential or trigonometric functions but can be applied to a wide range of functions.

The general form of the Taylor series of a function f(x) about the point a is:

f(x) = f(a) + f'(a)(x - a) + f''(a)(x - a)^2/2! + f'''(a)(x - a)^3/3! + ...

In this series, f'(a), f''(a), f'''(a), and so on represent the first, second, third, etc., derivatives of the function evaluated at the point a.

Perhaps the most straightforward example of a Taylor series is the expansion of e^x about x = 0:

e^x = 1 + x + x^2/2! + x^3/3! + x^4/4! + ...

As we can see, this expanded sum of infinitely many terms follows the rule for Taylor series, as each term has a larger exponent. Remember that the symbol "!" represents a factorial: the product of a number and every positive integer below it. For example, 4! = 4 × 3 × 2 × 1 = 24.
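To see how quickly these partial sums home in on the true value, here is a minimal Python sketch (the helper name exp_taylor is our own illustration, not part of the article):

```python
import math

def exp_taylor(x, n_terms):
    """Approximate e^x with the first n_terms terms of its Taylor series at 0."""
    return sum(x**n / math.factorial(n) for n in range(n_terms))
```

With just ten terms at x = 1, the partial sum already agrees with math.exp(1) to about six decimal places.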

The Taylor series expansions for sin x and cos x centered around x = 0 are:

sin x = x - x^3/3! + x^5/5! - x^7/7! + ...

cos x = 1 - x^2/2! + x^4/4! - x^6/6! + ...

After seeing a few examples of Taylor series, we notice some similarities. They all involve factorials, and their exponents always increase. We can also see that the concept applies to trigonometric functions like sine and cosine.

The geometric series expansion is as follows:

1/(1 - x) = 1 + x + x^2 + x^3 + ...   (valid for |x| < 1)
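The geometric series is easy to check numerically. The following short sketch (the function name is our own, for illustration) sums the first few powers of x and can be compared against 1/(1 - x):

```python
def geometric_partial_sum(x, n_terms):
    """Partial sum 1 + x + x^2 + ... + x^(n_terms - 1) of the geometric series."""
    return sum(x**n for n in range(n_terms))
```

For x = 0.5, thirty terms land within a billionth of the exact value 1/(1 - 0.5) = 2. For |x| >= 1 the partial sums never settle down, which is why the expansion only holds inside the interval |x| < 1.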

The Taylor series is a powerful tool in calculus, providing a way to represent many functions as infinite series, which is useful in a variety of applications, including approximation and solving differential equations.

When are Taylor Series useful?

Taylor series are useful when dealing with functions that are continuous and, crucially, can be differentiated as many times as we need.

We may recall that when we can graph a function as a single, unbroken curve, it is said to be continuous. This definition includes many functions, but it disqualifies things like hyperbolas because of their asymptotes.

Formally, a function is continuous at a point if the limit of the function as x approaches that point from either side is equal to the value of the function at the point itself.

So what does this all have to do with Taylor Series?

Recall that derivatives help us find the slope of a function's graph at a single point by measuring the change in y over the change in x as a ratio. A differentiable function is one whose derivative exists. In other words, a function is differentiable at a point if the slope of the tangent line approached from the left equals the slope of the tangent line approached from the right. The resulting value is the derivative.

Okay, that might sound a little complicated -- but here's the most important point:

We can find higher-order derivatives by repeating this process, giving us the values f'(a), f''(a), f'''(a), etc. that exist at x = a. Each of these derivatives is evaluated at the point a, and these values form the coefficients for the terms in the Taylor series.

Notice anything familiar about these values? Each one multiplies a successively larger power of (x - a) -- which should remind us of a Taylor series.

In this case, we can create the following Taylor series for f(x) as a power series:

f(x) = f(a) + f'(a)(x - a) + f''(a)(x - a)^2/2! + f'''(a)(x - a)^3/3! + ...

In other words, we have expanded this function into an infinite sum of terms that each have larger exponents.

Note that we can write this in sigma notation as well:

f(x) = Σ from n = 0 to ∞ of [f^(n)(a)/n!] (x - a)^n

But we still haven't really answered the question of when Taylor series are useful.

In basic terms, we use Taylor Series to get approximate values for our functions. Although the first few terms in the Taylor Series may be a little off, we eventually get some extremely accurate approximations. This is especially true with functions like sine or cosine.

The basic rule when finding our Taylor Series is simple:

For each term, we take the n-th derivative f^(n)(a), divide by n!, and then multiply by (x - a)^n.
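The rule above can be turned directly into code. Here is a minimal sketch, assuming we are handed the derivative values [f(a), f'(a), f''(a), ...] up front (the function name taylor_eval is hypothetical, chosen for this example):

```python
import math

def taylor_eval(derivs_at_a, a, x):
    """Evaluate the Taylor polynomial: sum of f^(n)(a)/n! * (x - a)^n,
    where derivs_at_a = [f(a), f'(a), f''(a), ...]."""
    return sum(d / math.factorial(n) * (x - a)**n
               for n, d in enumerate(derivs_at_a))
```

For example, every derivative of e^x at a = 0 equals 1, so passing a list of twelve 1s and x = 1 reproduces e to high accuracy.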

This shows how important derivatives are for our calculations involving Taylor Series.

If a = 0, then we have a special type of Taylor series called a "Maclaurin series." The partial sums of a Taylor series are called Taylor polynomials.

Although the first few terms may be a little off, the approximations become extremely accurate near the point a. The region where the series converges to the function is called the interval of convergence.

Working with Taylor Series

Now that we know how Taylor Series work, it's time to use our knowledge to solve a few problems:

Can we find the Taylor polynomials for the function f(x) = sin x around x = 0?

The constant term is f(0) = sin 0 = 0.

That one was easy! But what about the next values? Evaluating the derivatives at 0 gives f'(0) = cos 0 = 1, f''(0) = -sin 0 = 0, and f'''(0) = -cos 0 = -1, so the first few nonzero Taylor polynomials are:

P_1(x) = x

P_3(x) = x - x^3/3!

P_5(x) = x - x^3/3! + x^5/5!

The pattern arises from the fact that the derivatives of the sine function are cyclic: the first derivative of sine is cosine, the second derivative is negative sine, the third derivative is negative cosine, and the fourth derivative is back to positive sine. This pattern repeats every four steps, which makes the Taylor series for sine and cosine easy to compute.
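This four-step cycle means we never have to differentiate at all: evaluated at 0, the derivatives of sine simply repeat 0, 1, 0, -1. A short sketch using that idea (the names here are our own illustration):

```python
import math

# Derivatives of sin at 0 cycle every four steps: 0, 1, 0, -1, 0, 1, ...
SIN_DERIVS_AT_0 = [0.0, 1.0, 0.0, -1.0]

def sin_taylor(x, n_terms):
    """Maclaurin partial sum for sin x, built from the 4-step derivative cycle."""
    return sum(SIN_DERIVS_AT_0[n % 4] / math.factorial(n) * x**n
               for n in range(n_terms))
```

Twelve terms already approximate sin(π/2) = 1 to better than six decimal places.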

A short history of the Taylor Series

People have been trying to calculate the sums of infinite series for thousands of years. The ancient Greeks initially doubted that an infinite sum could ever produce a finite result, a puzzle captured in Zeno's paradoxes. The concept was explored by other Greek thinkers, including Aristotle and Archimedes, with the latter using a "method of exhaustion" to achieve a finite result.

Centuries later, the Chinese mathematician Liu Hui employed a very similar method. In the 14th century, the Indian mathematician Madhava of Sangamagrama provided early examples of series that would later be recognized as specific instances of Taylor series. Notably, he introduced series expressions for trigonometric functions like sine, cosine, and arctangent. These are sometimes referred to as Madhava series.

In the 17th century, James Gregory independently worked on the concept of infinite series. Isaac Newton made significant strides in this area with his method of fluxions and his work on binomial series. However, the Taylor series gets its name from Brook Taylor, who in 1715 published a general method for constructing what we now recognize as Taylor series for arbitrary functions. Colin Maclaurin, a Scottish mathematician, later made extensive use of the special case centered at zero, which is now known as the Maclaurin series.

Topics related to the Taylor Series

Infinite Geometric Series

Sum of the First n Terms of a Geometric Sequence

Binomial Series

Flashcards covering the Taylor Series

Calculus 2 Flashcards

AP Calculus BC Flashcards

Practice tests covering the Taylor Series

Calculus 2 Diagnostic Tests

AP Calculus BC Diagnostic Tests

Pair your student with a tutor who understands Taylor Series

Taylor Series can be difficult for students to understand, and sometimes they need to hear a number of different explanations before they finally get that "aha moment." Tutors are helpful in this situation because they can try a range of different teaching methods. They can also personalize these methods based on your student's learning style. In addition, your student can ask for clarification at a moment's notice. This isn't always possible in a classroom setting, where students must raise their hands for answers. Speak with our Educational Directors today, and rest assured: Varsity Tutors will match your student with a suitable tutor.
