
A sequence is any function whose domain is $\{k,k+1,k+2,\dots\}$ for some integer k.

Definition of convergence of a sequence: A sequence $\{a_n\}$ is said to converge to L if, for every $\epsilon > 0$, there exists an integer k such that $\vert a_n - L\vert < \epsilon$ for every $n \geq k$. In this case, we write $\lim_{n \rightarrow \infty} a_n = L$.

Any sequence which does not converge is said to diverge.

Definition of divergence of a sequence to $\infty$: A sequence $\{a_n\}$ is said to diverge to $\infty$ if, for every M, there exists an integer k such that $a_n > M$ for all $n \geq k$.

A series is a formal sum of infinitely many terms. If the terms are $a_0,
a_1, a_2, \dots$, then the series is typically denoted $\sum_{n=0}^{\infty} a_n$.

Given a series $\sum_{n=0}^{\infty} a_n$, the n-th partial sum $S_n$ is defined by $S_n = a_0 + a_1 + \cdots + a_n$.

Definition of convergence of a series: A series is said to converge if the sequence $\{S_n\}$ of partial sums converges.

A series $\sum a_n$ is said to converge absolutely if $\sum \vert a_n\vert$ converges.

If a and r are real numbers, the series $\sum_{n=0}^{\infty} ar^n$ is called a geometric series. If $a \neq 0$, then $\sum_{n=0}^{\infty} ar^n$ converges if and only if $-1 < r < 1$. If $-1 < r < 1$, then $\sum_{n=0}^{\infty} ar^n = \frac{a}{1-r}$ (you should be able to prove this).
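As a quick numerical sanity check of the closed form (not part of the original notes; the values a = 3 and r = 0.5 are illustrative choices):

```python
# Compare a large partial sum of the geometric series sum_{n>=0} a*r^n
# against the closed form a/(1-r), for the illustrative choice a=3, r=0.5.
a, r = 3.0, 0.5
partial = sum(a * r**n for n in range(100))  # the 100-term partial sum
closed_form = a / (1 - r)                    # = 6.0
print(partial, closed_form)
```

Since $\vert r\vert < 1$, the tail $ar^{100}/(1-r)$ is astronomically small, so the partial sum agrees with $a/(1-r)$ to machine precision.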

If p is any real number, $\sum_{n=1}^{\infty} \frac{1}{n^p}$ is called a p-series with exponent p. A p-series converges if and only if p > 1. You should know how to prove this (using the integral test).
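The integral test computation behind this fact is short. For $p \neq 1$,

\begin{displaymath}
\int_1^\infty \frac{dx}{x^p} = \lim_{b \rightarrow \infty} \frac{b^{1-p} - 1}{1-p},
\end{displaymath}

which is finite (equal to $\frac{1}{p-1}$) exactly when p > 1; for p = 1, $\int_1^b \frac{dx}{x} = \ln b \rightarrow \infty$. Since $\frac{1}{x^p}$ is positive and decreasing for p > 0, the series and the integral converge or diverge together.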

nth term test for divergence: If $\lim_{n \rightarrow \infty} a_n$ does not exist, or exists but is not equal to zero, then $\sum a_n$ does not converge.
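Beware that the converse is false: $\lim a_n = 0$ does not imply convergence. The harmonic series is the standard counterexample, and a quick computation (an illustrative sketch, not part of the original notes) shows its partial sums drifting upward:

```python
# The terms a_n = 1/n tend to 0, yet the harmonic series diverges:
# the partial sums grow without bound, roughly like ln(n).
H_100 = sum(1.0 / n for n in range(1, 101))
H_10000 = sum(1.0 / n for n in range(1, 10001))
print(H_100, H_10000)  # the sums keep growing, with no limit
```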

Comparison test: Let $\{c_n\}$ be a sequence of nonnegative numbers, and suppose $0 \leq a_n \leq c_n$ for all sufficiently large n. If $\sum c_n$ converges, then $\sum a_n$ converges; equivalently, if $\sum a_n$ diverges, then $\sum c_n$ diverges.

Limit Comparison test: Suppose you have two series $\sum a_n$ and $\sum b_n$ with positive terms, and $\lim_{n \to \infty} \frac{a_n}{b_n} = L$. If $0 < L < \infty$, then either both series converge or both series diverge.

Note: If you are applying the limit comparison test to a series whose terms are a quotient of two polynomials in n, compare with the series whose terms are the highest power of n in the numerator divided by the highest power of n in the denominator.
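For example, for $\sum_{n=1}^{\infty} \frac{2n+1}{n^3+4}$, the note says to compare with $\sum \frac{n}{n^3} = \sum \frac{1}{n^2}$:

\begin{displaymath}
\lim_{n \to \infty} \frac{(2n+1)/(n^3+4)}{1/n^2} = \lim_{n \to \infty} \frac{2n^3+n^2}{n^3+4} = 2.
\end{displaymath}

Since $0 < 2 < \infty$ and $\sum \frac{1}{n^2}$ converges (a p-series with p = 2 > 1), the original series converges.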

Ratio test: Let $\{a_n\}$ be a sequence of positive numbers, and suppose $\lim_{n \rightarrow \infty} \frac{a_{n+1}}{a_n} = \rho$. If $\rho < 1$, then $\sum a_n$ converges; if $\rho > 1$, then $\sum a_n$ diverges.

Note that the ratio test does not give any information if $\lim
\frac{a_{n+1}}{a_n} = 1$. Also note that the ratio test does not give any information about what the series converges to, only whether or not it converges.
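To see the test in action numerically (an illustrative sketch, not part of the original notes), take $a_n = \frac{2^n}{n!}$; the ratio $\frac{a_{n+1}}{a_n} = \frac{2}{n+1}$ tends to 0 < 1, so the series converges:

```python
import math

# Ratio test for a_n = 2^n / n!: the consecutive ratios a_{n+1}/a_n
# equal 2/(n+1) and shrink toward 0, which is < 1, so sum a_n converges.
def a(n):
    return 2.0**n / math.factorial(n)

ratios = [a(n + 1) / a(n) for n in range(1, 20)]
print(ratios[0], ratios[-1])  # 2/2 = 1.0 down to 2/20 = 0.1
```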

Alternating series:

A series $\sum a_n$ is said to be an alternating series if its terms alternate in sign.

Important fact about alternating series: If $\{a_n\}$ is an (eventually) nonincreasing sequence of positive numbers, and if $\lim_{n \rightarrow \infty} a_n = 0$, then the alternating series $\sum_{n=0}^{\infty} (-1)^n a_n$ converges.

Even more interesting than the result itself is the following byproduct of the proof: if $E_n$ denotes the error incurred by estimating the sum of the series by the nth partial sum, then $\vert E_n\vert < \vert a_{n+1}\vert$. In other words, the error incurred by using a partial sum is less than the first omitted term.
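A quick numerical illustration of the error bound (a sketch, not part of the original notes), using the standard fact that the alternating harmonic series $\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}$ converges to $\ln 2$:

```python
import math

# Estimate ln 2 by the N-th partial sum of the alternating harmonic series.
# The alternating-series error bound guarantees |ln 2 - S_N| < 1/(N+1),
# the first omitted term.
N = 1000
S_N = sum((-1)**(n + 1) / n for n in range(1, N + 1))
error = abs(math.log(2) - S_N)
bound = 1.0 / (N + 1)
print(error, bound)  # the actual error sits safely below the bound
```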

Taylor polynomials: Let f be a function which is smooth (i.e. infinitely differentiable) at x=a. Define $P_n$, the Taylor polynomial of degree n for f about x=a, by

\begin{displaymath}
P_n(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3
+ \cdots
+ \frac{f^{(n)}(a)}{n!}(x-a)^n.
\end{displaymath}

Note that $P_n$ has the following property: $P_n^{(k)}(a) = f^{(k)}(a)$ for all $k \leq n$.
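As a concrete illustration (a sketch, not part of the original notes), the degree-6 Taylor polynomial of $\cos x$ about a = 0 is $P_6(x) = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!}$, and it already approximates $\cos x$ very closely near 0:

```python
import math

# Degree-6 Taylor polynomial of cos about a = 0:
#   P_6(x) = 1 - x^2/2! + x^4/4! - x^6/6!
def P6(x):
    return sum((-1)**k * x**(2 * k) / math.factorial(2 * k) for k in range(4))

x = 0.5
err = abs(math.cos(x) - P6(x))
print(err)  # smaller than the next term x^8/8!, about 9.7e-8 at x = 0.5
```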

Taylor series: Let f be a function which is smooth (i.e. infinitely differentiable) at x=a. Define the Taylor series for f about x=a to be the series

\begin{displaymath}
f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \cdots
\end{displaymath}

The interval of convergence of the Taylor series is the set of all values of x for which the Taylor series converges. Note that the interval of convergence always includes a. It turns out that the interval of convergence is an interval centered at a. Thus it must take one of the following forms:

1.
(a-R,a+R] for some R;
2.
[a-R,a+R] for some R;
3.
[a-R,a+R) for some R;
4.
(a-R,a+R) for some R;
5.
$(-\infty,\infty)$ (i.e. all real numbers)

The number R is called the radius of convergence. Further, the convergence is absolute in the interval (a-R,a+R). In other words, if you have a Taylor series (about x=a) with radius of convergence R, the series converges absolutely for all $x \in (a-R,a+R)$, and it diverges for all x outside the interval [a-R,a+R]. At the endpoints, a+R and a-R, it may converge absolutely, it may converge (but not absolutely), or it may diverge.

The ratio test is a useful tool for computing the radius of convergence of a Taylor series.
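For example, consider the series $\sum_{n=1}^{\infty} \frac{x^n}{n}$. Applying the ratio test to the absolute values,

\begin{displaymath}
\lim_{n \to \infty} \left\vert \frac{x^{n+1}/(n+1)}{x^n/n} \right\vert = \vert x\vert \lim_{n \to \infty} \frac{n}{n+1} = \vert x\vert,
\end{displaymath}

so the series converges absolutely for $\vert x\vert < 1$ and diverges for $\vert x\vert > 1$; hence R = 1. At x = 1 the series is the harmonic series, which diverges; at x = -1 it is the alternating harmonic series, which converges (but not absolutely). The interval of convergence is therefore [-1,1).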

Note that Taylor series can be added, subtracted, and differentiated termwise within their intervals of convergence (see section 5.1 of your book).

Applications of Taylor series:




Math 23 Winter 2000
2000-01-20