4.5 Solutions

Convergence

Theorem. Suppose $x_n \to x$ and $y_n \to y$. Then $x_n + y_n \to x + y$.

Proof. Let $\epsilon>0$.
\begin{alignat*}{2}
\text{\textcircled{1}} \quad \exists N_x: n>N_x \implies \lvert x_n - x\rvert &< \tfrac{\epsilon}{2} &\hspace{4em}& x_n \to x \\
\text{\textcircled{2}} \quad \exists N_y: n>N_y \implies \lvert y_n - y\rvert &< \tfrac{\epsilon}{2} && y_n \to y
\end{alignat*}
Thus, for all $n>N=\max(N_x,N_y)$, we have
\begin{alignat*}{2}
\lvert x_n + y_n - (x+y)\rvert &\le \lvert x_n - x\rvert + \lvert y_n - y\rvert &\hspace{6em}& \text{Triangle inequality} \\
&< \epsilon && \text{\textcircled{1}}, \text{\textcircled{2}}
\end{alignat*}
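
To make the construction concrete, here is a small numerical sketch in R (in the spirit of the exercise at the end of this section). The sequences $x_n = 2 + 1/n$ and $y_n = 3 + 2/n$ are made-up examples, and Nx and Ny simply implement \textcircled{1} and \textcircled{2} for these particular sequences.

eps <- 0.01
x_n <- function(n) 2 + 1/n               # x_n -> x = 2
y_n <- function(n) 3 + 2/n               # y_n -> y = 3
Nx <- ceiling(1/(eps/2))                 # n > Nx  =>  |x_n - 2| = 1/n < eps/2
Ny <- ceiling(2/(eps/2))                 # n > Ny  =>  |y_n - 3| = 2/n < eps/2
N <- max(Nx, Ny)
abs(x_n(N + 1) + y_n(N + 1) - 5) < eps   # check the bound just past N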

Theorem. Suppose $x_n \to x$ and $y_n \to y$. Then $x_n y_n \to xy$.

Proof. First, let’s establish an identity:
\begin{align*}
x_n y_n - xy &= x_n y_n - x_n y + x_n y - xy \\
&= x_n (y_n - y) + y(x_n - x) \\
&= (x_n - x + x)(y_n - y) + y(x_n - x) \\
&= (x_n - x)(y_n - y) + x(y_n - y) + y(x_n - x)
\end{align*}
Now, let $\epsilon>0$.
\begin{alignat*}{2}
\text{\textcircled{1}} \quad \exists N_x: n>N_x \implies &\lvert x_n - x\rvert < \min\left(\frac{\sqrt{\epsilon}}{3}, \frac{\epsilon}{3\lvert y\rvert}\right) &\hspace{4em}& x_n \to x \\
\text{\textcircled{2}} \quad \exists N_y: n>N_y \implies &\lvert y_n - y\rvert < \min\left(\frac{\sqrt{\epsilon}}{3}, \frac{\epsilon}{3\lvert x\rvert}\right) && y_n \to y
\end{alignat*}
Thus, for all $n>N=\max(N_x,N_y)$, we have
\begin{alignat*}{2}
\lvert x_n y_n - xy\rvert &= \lvert(x_n - x)(y_n - y) + x(y_n - y) + y(x_n - x)\rvert &\hspace{4em}& \text{Identity above} \\
&\le \lvert x_n-x\rvert\lvert y_n-y\rvert + \lvert x\rvert\lvert y_n-y\rvert + \lvert y\rvert\lvert x_n-x\rvert && \text{Triangle inequality} \\
&< \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} && \text{\textcircled{1}}, \text{\textcircled{2}} \\
&= \epsilon
\end{alignat*}
In the construction of $N$ above, note that we are assuming $x, y \ne 0$. If either is zero, the second argument of the corresponding minimum can simply be omitted, as the matching term in the bound below is zero.
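
Again purely as an illustration (the sequences and their limits below are made up), this sketch implements the minimum bounds in \textcircled{1} and \textcircled{2} and checks the product at the first index past $N$.

eps <- 0.01
x <- 2; y <- 3                           # limits (both nonzero here)
x_n <- function(n) x + 1/n
y_n <- function(n) y + 2/n
bx <- min(sqrt(eps)/3, eps/(3*abs(y)))   # bound needed on |x_n - x|, as in (1)
by <- min(sqrt(eps)/3, eps/(3*abs(x)))   # bound needed on |y_n - y|, as in (2)
Nx <- ceiling(1/bx)                      # |x_n - x| = 1/n < bx for n > Nx
Ny <- ceiling(2/by)                      # |y_n - y| = 2/n < by for n > Ny
N <- max(Nx, Ny)
abs(x_n(N + 1)*y_n(N + 1) - x*y) < eps   # check the bound just past N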

Theorem. Suppose $x_n \to x$, with $x_n \ne 0$ for all $n$ and $x \ne 0$. Then $1/x_n \to 1/x$.

Proof.

First, let us note that $\lvert a-b\rvert < \tfrac{1}{2}\lvert b\rvert \implies \lvert a\rvert > \tfrac{1}{2}\lvert b\rvert$. This is fairly obvious when you think about it; to prove it, we can break the claim up into cases:

  1. If $b > 0$ and $a \ge b$, then $\lvert a\rvert = a \ge b > \tfrac{1}{2}\lvert b\rvert$.
  2. If $b > 0$ and $a < b$, then $\lvert a-b\rvert = b - a < \tfrac{1}{2}b$, so $a > \tfrac{1}{2}b$ and hence $\lvert a\rvert > \tfrac{1}{2}\lvert b\rvert$.

The cases where $b<0$ follow the same reasoning. Now, let $\epsilon>0$.
\begin{alignat*}{3}
\text{\textcircled{1}} &\quad& \exists N_1: n>N_1 \implies \lvert x_n - x\rvert &< \tfrac{1}{2}\lvert x\rvert^2\epsilon &\hspace{8em}& x_n \to x \\
\text{\textcircled{2}} && \exists N_2: n>N_2 \implies \lvert x_n-x\rvert &< \tfrac{1}{2}\lvert x\rvert && x_n \to x \\
\text{\textcircled{3}} && \text{so that } \lvert x_n\rvert &> \tfrac{1}{2}\lvert x\rvert && \text{\textcircled{2}}, \text{see above}
\end{alignat*}
Thus, for all $n>N=\max(N_1, N_2)$, we have
\begin{alignat*}{2}
\left\lvert\frac{1}{x_n} - \frac{1}{x}\right\rvert &= \left\lvert\frac{x-x_n}{x_n x}\right\rvert \\
&\le \frac{2}{\lvert x\rvert^2}\lvert x_n-x\rvert &\hspace{8em}& \text{\textcircled{3}} \\
&< \epsilon && \text{\textcircled{1}}
\end{alignat*}
Note that in this third theorem, the requirement that $x_n \ne 0$ is unnecessary. As we see from \textcircled{3}, if $x_n \to x$ and $x \ne 0$, then there is an $N$ such that $x_n \ne 0$ for all $n>N$.
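
A quick numerical sketch of this construction, with the made-up sequence $x_n = 2 + 1/n$ (so $x = 2$):

eps <- 0.01
x <- 2
x_n <- function(n) x + 1/n              # x_n -> x, and x_n != 0 for all n
N1 <- ceiling(1/(0.5*x^2*eps))          # |x_n - x| = 1/n < |x|^2 * eps / 2 for n > N1
N2 <- ceiling(1/(0.5*abs(x)))           # |x_n - x| = 1/n < |x| / 2 for n > N2
N <- max(N1, N2)
abs(1/x_n(N + 1) - 1/x) < eps           # check the bound just past N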

Continuity

The first two theorems are essentially the same as their sequence counterparts, but the differences are worth paying attention to.

Theorem. Let the functions $f$ and $g$ be continuous at $x_0$. Then $h = f + g$ is continuous at $x_0$.

Proof. Let $\epsilon>0$.
\begin{alignat*}{2}
\text{\textcircled{1}} \quad \exists \delta_f: \lvert x-x_0\rvert < \delta_f &\implies \lvert f(x) - f(x_0)\rvert < \tfrac{\epsilon}{2} &\hspace{6em}& f \text{ continuous at } x_0 \\
\text{\textcircled{2}} \quad \exists \delta_g: \lvert x-x_0\rvert < \delta_g &\implies \lvert g(x) - g(x_0)\rvert < \tfrac{\epsilon}{2} && g \text{ continuous at } x_0
\end{alignat*}
Thus, for all $x: \lvert x-x_0\rvert < \delta=\min(\delta_f, \delta_g)$, we have
\begin{alignat*}{2}
\lvert h(x) - h(x_0)\rvert &= \lvert f(x) + g(x) - f(x_0) - g(x_0)\rvert &\hspace{6em}& \text{Def } h \\
&\le \lvert f(x)-f(x_0)\rvert + \lvert g(x)-g(x_0)\rvert && \text{Triangle inequality} \\
&< \epsilon && \text{\textcircled{1}}, \text{\textcircled{2}}
\end{alignat*}
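
As a sketch, here are arbitrary choices $f(x) = x^2$, $g(x) = \sin(x)$, and $x_0 = 1$; the crude bounds in the comments ($\lvert f(x)-f(1)\rvert \le 3\lvert x-1\rvert$ for $\lvert x-1\rvert < 1$, and $\lvert g(x)-g(1)\rvert \le \lvert x-1\rvert$) are our own additions, not part of the proof.

eps <- 0.01
x0 <- 1
f <- function(x) x^2                     # |f(x) - f(1)| <= 3|x - 1| when |x - 1| < 1
g <- function(x) sin(x)                  # |g(x) - g(1)| <= |x - 1|
delta_f <- min(1, eps/6)                 # gives |f(x) - f(1)| < eps/2, as in (1)
delta_g <- eps/2                         # gives |g(x) - g(1)| < eps/2, as in (2)
delta <- min(delta_f, delta_g)
x <- x0 + 0.9*delta                      # any x with |x - x0| < delta
abs(f(x) + g(x) - f(x0) - g(x0)) < eps   # check the bound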

Theorem. Let the functions $f$ and $g$ be continuous at $x_0$. Then $h = f \cdot g$ is continuous at $x_0$.

Proof. Let $\epsilon>0$.
\begin{alignat*}{2}
\text{\textcircled{1}} \quad \exists \delta_f: \lvert x-x_0\rvert < \delta_f &\implies \lvert f(x) - f(x_0)\rvert < \min\left(\frac{\sqrt{\epsilon}}{3}, \frac{\epsilon}{3\lvert g(x_0)\rvert}\right) &\hspace{3em}& f \text{ continuous at } x_0 \\
\text{\textcircled{2}} \quad \exists \delta_g: \lvert x-x_0\rvert < \delta_g &\implies \lvert g(x) - g(x_0)\rvert < \min\left(\frac{\sqrt{\epsilon}}{3}, \frac{\epsilon}{3\lvert f(x_0)\rvert}\right) && g \text{ continuous at } x_0
\end{alignat*}
Thus, for all $x: \lvert x-x_0\rvert < \delta=\min(\delta_f, \delta_g)$, we have
\begin{alignat*}{2}
\lvert h(x) - h(x_0)\rvert &= \lvert f(x)g(x) - f(x_0)g(x_0)\rvert &\hspace{6em}& \text{Def } h \\
&\le \lvert\{f(x) - f(x_0)\}\{g(x)-g(x_0)\}\rvert \\
&\qquad + \lvert f(x_0)\{g(x)-g(x_0)\}\rvert \\
&\qquad + \lvert g(x_0)\{f(x)-f(x_0)\}\rvert && \text{See earlier proof} \\
&< \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} && \text{\textcircled{1}}, \text{\textcircled{2}} \\
&= \epsilon
\end{alignat*}
As before, if $f(x_0)$ or $g(x_0)$ is zero, the second argument of the corresponding minimum can simply be omitted, as the matching term in the bound is zero.
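
The same kind of sketch works for the product, with the same made-up $f$, $g$, and $x_0$ as above; delta_f and delta_g below just translate the minimum bounds in \textcircled{1} and \textcircled{2} through the crude bounds noted in the comments.

eps <- 0.01
x0 <- 1
f <- function(x) x^2
g <- function(x) sin(x)
bf <- min(sqrt(eps)/3, eps/(3*abs(g(x0))))   # bound needed on |f(x) - f(x0)|
bg <- min(sqrt(eps)/3, eps/(3*abs(f(x0))))   # bound needed on |g(x) - g(x0)|
delta_f <- min(1, bf/3)                      # |f(x) - f(x0)| <= 3|x - x0| < bf
delta_g <- bg                                # |g(x) - g(x0)| <= |x - x0| < bg
delta <- min(delta_f, delta_g)
x <- x0 + 0.9*delta                          # any x with |x - x0| < delta
abs(f(x)*g(x) - f(x0)*g(x0)) < eps           # check the bound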

Theorem. Let the function $f$ be continuous at $x_0$ and the function $g$ be continuous at $f(x_0)$. Then $h(x) = g(f(x))$ is continuous at $x_0$.

Proof. Let $\epsilon>0$.
\begin{alignat*}{2}
\text{\textcircled{1}} \qquad \exists \eta: \lvert y-f(x_0)\rvert < \eta &\implies \lvert g(y) - g(f(x_0))\rvert < \epsilon &\hspace{4em}& g \text{ continuous at } f(x_0) \\
\text{\textcircled{2}} \qquad \exists \delta: \lvert x-x_0\rvert < \delta &\implies \lvert f(x) - f(x_0)\rvert < \eta && f \text{ continuous at } x_0
\end{alignat*}
Thus, for all $x: \lvert x-x_0\rvert < \delta$, we have
\begin{alignat*}{2}
\lvert h(x) - h(x_0)\rvert &= \lvert g(f(x)) - g(f(x_0))\rvert &\hspace{4em}& \text{Def } h \\
&< \epsilon && \text{\textcircled{2}} \implies \text{\textcircled{1}}
\end{alignat*}
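
Finally, a sketch of the $\eta$-then-$\delta$ chain, with made-up choices $f(x) = x^2$ and $g(y) = e^y$ at $x_0 = 1$; the bounds used to pick eta and delta in the comments are again our own crude ones, not part of the proof.

eps <- 0.01
x0 <- 1
f <- function(x) x^2                  # continuous at x0 = 1, with f(x0) = 1
g <- function(y) exp(y)               # continuous at f(x0) = 1
eta <- min(1, eps/exp(2))             # |y - 1| < eta   =>  |g(y) - g(1)| < eps  (1)
delta <- min(1, eta/3)                # |x - 1| < delta =>  |f(x) - f(1)| < eta  (2)
x <- x0 + 0.9*delta                   # any x with |x - x0| < delta
abs(g(f(x)) - g(f(x0))) < eps         # check the bound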

Exercise: Write an R function n(eps) that returns the smallest $N$ for which $n > N \implies \lvert f(x_n)-f(x_0)\rvert < \epsilon$ for $x_n = 2^{1/n}$ and $f(x) = e^x$.

Conceptually, this is a three-part process:

  1. Determine what $x_n$ is converging to. Here, $x_n \to 1$.
  2. Determine the largest value of $\delta$ that satisfies $e^{1+\delta} - e^1 < \epsilon$.
  3. Determine the smallest value of $N$ such that $2^{1/N} - 1 < \delta$.
n <- function(eps) {
  # Step 2: solve exp(1 + delta) - exp(1) = eps for delta
  delta <- log(eps + exp(1)) - 1
  # Step 3: smallest integer N with 2^(1/N) < 1 + delta
  ceiling(1/log2(1 + delta))
}

Let’s test this out and make sure it works:

n(0.01)
## [1] 190
exp(2^(1/189)) - exp(1)  ## 189 not good enough
## [1] 0.01000582
exp(2^(1/190)) - exp(1)  ## 190 within 0.01
## [1] 0.009952969
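
As an extra check (not part of the exercise), we can also find the cutoff by brute force; n_search and its cap n_max below are our own names.

n_search <- function(eps, n_max = 1e6) {
  # |f(x_n) - f(1)| = exp(2^(1/n)) - exp(1), which decreases as n grows
  gap <- exp(2^(1/(1:n_max))) - exp(1)
  min(which(gap < eps))               # first n at which the gap drops below eps
}
n_search(0.01)                        # should agree with n(0.01) above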