Real analysis notes

Set

  • \(\epsilon\)-neighborhood: \(B_\epsilon(x):=(x-\epsilon, x+\epsilon), \epsilon>0.\)
  • Neighborhood: \(M\subseteq\mathbb{R}\) is called a neighborhood of \(x\) if \(\exists \epsilon>0\) such that \(B_\epsilon(x)\subseteq M\).

    (-2, 2) and [-2,2] are not neighborhoods of 2: the first does not contain 2, and the second contains no \(\epsilon\)-neighborhood of 2.

  • Open: \(M\subseteq\mathbb{R}\) is called open if for all \(x\in M, M\) is a neighborhood of \(x\).
  • Closed: \(M\subseteq\mathbb{R}\) is called closed if \(M^c:=\mathbb{R}\setminus M\) is open.
  • Compact: closed and bounded (Heine-Borel Theorem).
| Set | Open | Closed | Compact |
| --- | --- | --- | --- |
| \(\mathbb{R}\) | x | x | |
| \(\emptyset\) | x | x | x |
| \(\{5\}\) | | x | x |
| \([a,b]\) | | x | x |
| \((a,b)\) | x | | |
| \((a,b]\) | | | |
| \((a,\infty)\) | x | | |
| \([a,\infty)\) | | x | |

Sequence

Definition: a map from \(\mathbb{N}\) to \(\mathbb{R}\), \((a_n)_{n\in\mathbb{N}}\).

Convergence

  • Cauchy Sequence: A sequence \((a_n)_{n\in\mathbb{N}}\) is called a Cauchy sequence if \(\forall \epsilon > 0 , \exists N\in \mathbb{N}, \forall n, m \ge N: |a_n - a_m| < \epsilon\).

    The distance between elements eventually becomes arbitrarily small (note: this is stronger than consecutive differences approaching zero).
    Completeness of \(\mathbb{R}\): Cauchy sequence \(\Longleftrightarrow\) convergent sequence (for sequences of real numbers)

  • Dedekind Completeness:
    • \(M\subseteq\mathbb{R}\) is bounded from above \(\Longleftrightarrow \sup M\in\mathbb{R}\) exists.
    • \(M\subseteq\mathbb{R}\) is bounded from below \(\Longleftrightarrow \inf M\in\mathbb{R}\) exists.
  • Accumulation value (accumulation point, cluster point, partial limit): \(a\in\mathbb{R}\) is called an accumulation value of \((a_n)_{n\in\mathbb{N}}\) if there exists a subsequence
    \((a_{n_k})_{k\in\mathbb{N}} \text{ with } \lim_{k\rightarrow\infty} a_{n_k}=a.\)

    \(\Longleftrightarrow \forall \epsilon>0,\) the \(\epsilon\)-neighborhood of \(a\) contains infinitely many sequence members.
    \(\infty\) is an improper accumulation value iff the sequence is not bounded from above.
    \(-\infty\) is an improper accumulation value iff the sequence is not bounded from below.

  • Bolzano-Weierstrass Theorem:
    \((a_n)\text{ is bounded } \Longrightarrow \text{ it has at least one accumulation value (i.e., a convergent subsequence)}.\)
    Can be proved by repeatedly bisecting the range.
  • Closed set: For all convergent sequences with \(a_n\in A\) for all \(n, \lim_{n\rightarrow\infty} a_n \in A\).
  • Compact set: For all sequences with \(a_n\in A\), there is a convergent subsequence \(a_{n_k}\) with \(\lim_{k\rightarrow\infty} a_{n_k} \in A\).

Limit

  • Limit inferior: the smallest (improper) accumulation value
    \(a = \liminf_{n\rightarrow\infty} a_n = \lim_{n\rightarrow\infty} \inf \{a_k|k>n\}\)
  • Limit superior: the largest (improper) accumulation value
    \(a = \limsup_{n\rightarrow\infty} a_n = \lim_{n\rightarrow\infty} \sup \{a_k|k>n\}\)
    \[\limsup a_n, \liminf a_n \in \mathbb{R}\cup\{-\infty, \infty\} \text{ or }[-\infty, \infty]\]
  • Properties:
    1. convergent: \(\limsup a_n = \liminf a_n \neq \pm\infty\)
    2. divergent to \(\infty\): \(\limsup a_n = \liminf a_n = \infty\)
    3. divergent to \(-\infty\): \(\limsup a_n = \liminf a_n = -\infty\)
    4. Two sequences (RHS defined)
      \(\limsup (a_n+b_n)\le \limsup a_n + \limsup b_n\)
      \(\limsup (a_n b_n)\le \limsup a_n \cdot \limsup b_n, (a_n, b_n > 0)\)
      \(\liminf (a_n+b_n)\ge \liminf a_n + \liminf b_n\)
      \(\liminf (a_n b_n)\ge \liminf a_n \cdot \liminf b_n, (a_n, b_n > 0)\)
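
A quick numerical sketch (assuming NumPy is available) of the tail-sup/tail-inf definitions above, using the illustrative sequence \(a_n = (-1)^n(1+1/n)\), whose only accumulation values are \(-1\) and \(1\):

```python
import numpy as np

# Example sequence (an arbitrary illustrative choice):
# a_n = (-1)^n * (1 + 1/n) has accumulation values -1 and 1,
# so liminf a_n = -1 and limsup a_n = 1.
N = 10_000
n = np.arange(1, N + 1)
a = (-1.0) ** n * (1.0 + 1.0 / n)

# Tail suprema / infima: sup{a_k : k >= n} and inf{a_k : k >= n},
# computed as reversed cumulative max / min.
tail_sup = np.maximum.accumulate(a[::-1])[::-1]
tail_inf = np.minimum.accumulate(a[::-1])[::-1]

for i in (10, 100, 1000):
    print(f"n={i:5d}   sup tail = {tail_sup[i-1]:+.6f}   inf tail = {tail_inf[i-1]:+.6f}")
# The sup tails decrease toward 1 (the limsup) and the inf tails increase toward -1 (the liminf).
```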

Series

Definition: sequence of partial sums \(S_n\) with \(S_n = \sum_{k=1}^n a_k\)

Examples:

  1. Geometric series: \(a_n = q^n, \lim_{n\rightarrow \infty} S_n = \frac{1}{1-q}\) (summing from \(n=0\)), convergent when \(|q|<1\)
  2. Harmonic series: \(a_n = 1/n, \lim_{n\rightarrow \infty} S_n = \infty\), divergent
  3. Exponential: \(a_n = \frac{x^n}{n!}, \lim_{n\rightarrow \infty} S_n = e^x\), convergent
  4. p-Series: \(a_n = 1/n^p, \lim_{n\rightarrow \infty} S_n = 1 + \frac{1}{2^p} + \frac{1}{3^p} + … \), convergent when \(p > 1\) (Riemann zeta function \(\zeta(p)\))
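
A short numerical sketch (assuming NumPy) of the partial sums above: the geometric series and the p-series settle quickly, while the harmonic partial sums keep growing like \(\log N\):

```python
import numpy as np

N = 100_000
k = np.arange(1, N + 1)

# Partial sums S_N for the example series above.
geometric = np.sum(0.5 ** np.arange(0, N))     # q = 1/2, converges to 1/(1-q) = 2
harmonic  = np.sum(1.0 / k)                    # diverges (grows like log N)
p_series  = np.sum(1.0 / k**2)                 # p = 2, converges to pi^2/6

print(f"geometric (q=1/2): {geometric:.6f}  (limit 2)")
print(f"harmonic up to N={N}: {harmonic:.3f}  (unbounded, ~ log N + 0.577)")
print(f"p-series (p=2): {p_series:.6f}  (limit pi^2/6 = {np.pi**2/6:.6f})")
```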

Cauchy Criterion: A series is convergent iff \(\forall \epsilon > 0, \exists N\in \mathbb{N}, \forall n\ge m\ge N: |\sum_{k=m}^n a_k| < \epsilon \).

\(S_n\) is convergent \(\Longleftrightarrow (S_n)\) is a Cauchy Sequence
\(S_n\) is convergent \(\Longrightarrow \lim a_n = 0\)

Absolute Convergence: \(\sum_{k=1}^\infty |a_k|\) converges.

Cauchy Product: For two series \(\sum_{k=0}^\infty a_k, \sum_{k=0}^\infty b_k\), the series \(\sum_{k=0}^\infty c_k\) with \(c_k = \sum_{l=0}^k a_l b_{k-l}\) is called the Cauchy product.

Theorem (Mertens): If \(\sum_{k=0}^\infty a_k\) is absolutely convergent and \(\sum_{k=0}^\infty b_k\) is convergent, then the Cauchy product converges and \(\sum_{k=0}^\infty c_k = \left(\sum_{k=0}^\infty a_k\right)\left(\sum_{k=0}^\infty b_k\right)\).
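
A small check of the Cauchy product, with the exponential series as the illustrative choice: \(a_k = x^k/k!\) and \(b_k = y^k/k!\), whose Cauchy product should sum to \(\exp(x+y)\). The values of \(x\), \(y\) and the truncation length are arbitrary:

```python
import math

# Cauchy product sketch: a_k = x^k/k!, b_k = y^k/k! are the exp series.
# Their Cauchy product should sum to exp(x) * exp(y) = exp(x + y).
x, y, K = 1.0, 2.0, 30   # arbitrary illustrative choices

a = [x**k / math.factorial(k) for k in range(K)]
b = [y**k / math.factorial(k) for k in range(K)]

# c_k = sum_{l=0}^{k} a_l * b_{k-l}
c = [sum(a[l] * b[k - l] for l in range(k + 1)) for k in range(K)]

print(sum(c))            # ~ 20.0855
print(math.exp(x + y))   # e^3 = 20.0855...
```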

Testing for Convergence and Divergence

  • Comparison Test: If \(a_n, b_n\) are always positive
    1. If \(a_n \le b_n\) for all \(n\) and \(\sum b_n\) is convergent, then \(\sum a_n\) is convergent.
    2. If \(a_n \ge b_n\) for all \(n\) and \(\sum b_n\) is divergent, then \(\sum a_n\) is divergent.
  • Limit Comparison Test: If \(a_n, b_n\) are always positive and \(\lim_{n\rightarrow\infty} \frac{a_n}{b_n}=c>0\) (and finite), then \(\sum a_n\) and \(\sum b_n\) either both converge or both diverge.
  • Integral Test: If \(f(n)=a_n\) and \(f\) is continuous, positive, and decreasing on \([1,\infty)\)
    • \(\int_1^\infty f(x)dx\) converges, then \(\sum a_n\) converges.
    • \(\int_1^\infty f(x)dx\) diverges, then \(\sum a_n\) diverges.
  • Ratio Test: \(c = \lim_{n\rightarrow \infty} |\frac{a_{n+1}}{a_n}|\)
    • \(c < 1\): \(\sum a_n\) is absolutely convergent.
    • \(c > 1\): \(\sum a_n\) is divergent.
  • Root Test: \(c = \lim_{n\rightarrow \infty} \sqrt[n]{|a_n|}\) (a numerical sketch of the ratio and root tests follows after this list)
    • \(c < 1\): \(\sum a_n\) is absolutely convergent.
    • \(c > 1\): \(\sum a_n\) is divergent.
  • Leibniz’s Test (Alternating Series Test): If \((a_n)\) is monotonically decreasing and converges to 0, then \(\sum (-1)^n a_n\) is convergent.
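
As referenced above, a numerical sketch of the ratio and root tests on the illustrative series \(\sum n^2/2^n\); both limits are \(1/2 < 1\), so the series converges absolutely (NumPy assumed):

```python
import numpy as np

# Ratio and root tests on a_n = n^2 / 2^n (an arbitrary example);
# both limits equal 1/2 < 1, so the series converges absolutely.
n = np.arange(1, 60, dtype=float)
a = n**2 / 2.0**n

ratio = np.abs(a[1:] / a[:-1])     # |a_{n+1} / a_n|  -> 1/2 as n grows
root  = np.abs(a) ** (1.0 / n)     # |a_n|^(1/n)      -> 1/2 (more slowly)

print("last ratio value:", ratio[-1])
print("last root value :", root[-1])
print("partial sum     :", a.sum())   # the full series sums to 6
```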

Functions

  • Definition: function \(f:I\rightarrow \mathbb{R}, I\subseteq\mathbb{R}\)

    domain: \(I\)
    image (range): $f[I] = \{f(x)|x\in I\}$

  • Bounded function: the image $f[I]$ is a bounded set

  • Limits: We write \(\lim_{x\rightarrow x_0} f(x) = c\) if there is \(c\in\mathbb{R}\) such that for all sequences $(x_n)\subseteq I\setminus\{x_0\}$ with $\lim_{n\rightarrow \infty} x_n = x_0$, the sequence $(f(x_n))$ is also convergent with $\lim_{n\rightarrow\infty} f(x_n) = c$.

    Approach from above (right): $\lim_{x\rightarrow x_0^+} f(x) = c, x_n>x_0$
    Approach from below (left): $\lim_{x\rightarrow x_0^-} f(x) = c, x_n<x_0$

Convergence

Sequence of functions: $(f_1,f_2,f_3,…)$, and $f_n:I\rightarrow\mathbb{R}$ for all $n$

For any fixed $\hat{x}\in I$, we get a sequence $(f_1(\hat{x}),f_2(\hat{x}),f_3(\hat{x}),…)$.

  • Point-wise Convergence: \((f_n)\) converges point-wise to \(f\) if \((f_n(\hat{x}))\) converges to \(f(\hat{x})\) for all \(\hat{x}\in I\): \(\forall \hat{x}\in I, \forall \epsilon>0, \exists N\in\mathbb{N}, \forall n\ge N: |f_n(\hat{x}) - f(\hat{x})|<\epsilon\)

  • Uniform Convergence: \(\forall \epsilon>0, \exists N\in\mathbb{N}, \forall n\ge N, \forall \hat{x}\in I: |f_n(\hat{x}) - f(\hat{x})|<\epsilon\)

    • supremum norm (distance between functions): $\|f-g\|_\infty = \sup_{x\in I} |f(x) - g(x)|$
    • Uniform convergence: $\lim_{n\rightarrow \infty}\|f_n-f\|_\infty = 0$
    • Stronger than point-wise convergence: uniform \(\Rightarrow\) point-wise (see the sketch below)
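
The sketch referenced above: \(f_n(x) = x^n\) on \([0,1)\) converges point-wise to the zero function but not uniformly, since \(\sup_{x\in[0,1)} |f_n(x)| = 1\) for every \(n\) (NumPy assumed):

```python
import numpy as np

# f_n(x) = x^n on [0, 1) converges point-wise to the zero function,
# but not uniformly: the true sup over [0,1) of |f_n(x)| is 1 for every n.
x = np.linspace(0.0, 1.0, 1_000_001)[:-1]   # fine grid on [0, 1)

for n in (1, 10, 100, 1000):
    print(f"n={n:5d}  f_n(0.5) = {0.5**n:.2e}   "
          f"grid sup of |f_n| = {(x**n).max():.4f}")   # stays close to 1
```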

Continuity

Definition:

  • Limit Theorem: $f$ is called continuous at $x_0\in I$ if $\lim_{x\rightarrow x_0} f(x) = f(x_0)$ or if $x_0$ is isolated in $I$ (because no sequence in $I\setminus\{x_0\}$ approaches it).
  • $f$ is called continuous on $I$ if $f$ is continuous at $x_0$ for all $x_0\in I$.

    Continuity implies: $\lim_{n\rightarrow\infty} f(x_n) = f(\lim_{n\rightarrow\infty} x_n)$.

  • Examples
    1. Polynomial is continuous
    2. Rational functions: $\frac{p_1(x)}{p_2(x)}$ is continuous (wherever $p_2(x)\ne 0$)
    3. Absolute value: continuous
  • $(\epsilon,\delta)$ Theorem: $f$ is continuous at $x_0$ iff $\forall \epsilon>0, \exists\delta>0, \forall x\in I: |x-x_0|<\delta \Rightarrow|f(x) - f(x_0)|<\epsilon.$

Properties:

  • Addition, multiplication, division: $f, g$ are defined on $I$ and continuous at $x_0\in I$
    • $f+g$, $f\cdot g$, and $f/g$ (if $g(x_0)\ne 0$) are continuous at $x_0\in I$
  • Composition: $f:I\rightarrow \mathbb{R}$, $g:J\rightarrow K$, with $K\subseteq I$
    • $g$ continuous at $x_0\in J$, $f$ continuous at $g(x_0)$
    • then $f\circ g: J\rightarrow \mathbb{R}$, $(f\circ g)(x_0) = f(g(x_0))$, is continuous at $x_0$
  • Compactness: if $f$ is continuous on a compact domain $I\subseteq\mathbb{R}$, then
    • the image $f[I]$ is also compact (bounded and closed)
    • there are $x^+, x^-\in I$ with $f(x^+) = \sup\{f(x) | x\in I\}$ and $f(x^-) = \inf\{f(x) | x\in I\}$.

Uniform Limit Theorem: If a sequence of continuous functions uniformly converges to $f$, then $f$ is also continuous.

Intermediate Value Theorem: If $f$ is continuous on $[a,b]$ and $k$ lies between $f(a)$ and $f(b)$, then there is $c\in[a,b]$ such that $f(c) = k$.
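
The IVT is exactly what makes the bisection method work: a sign change of a continuous \(f\) on \([a,b]\) guarantees a root in between. A minimal sketch (the test function \(x^2-2\) is an arbitrary illustrative choice):

```python
def bisect(f, a, b, tol=1e-10):
    """Find a root of a continuous f on [a, b], assuming f(a) and f(b)
    have opposite signs (so the IVT guarantees a root exists)."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "need a sign change on [a, b]"
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m            # sign change (hence a root) lies in [a, m]
        else:
            a, fa = m, f(m)  # sign change lies in [m, b]
    return (a + b) / 2

print(bisect(lambda x: x * x - 2, 0.0, 2.0))   # ~ 1.41421356 (sqrt(2))
```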

Examples

  1. Exponential function: $\exp(x) = \sum_{k=0}^\infty \frac{x^k}{k!}$

    continuous, strictly monotonically increasing, bijective $\mathbb{R}\rightarrow (0, \infty)$
    $\exp(x+y) = \exp(x)\exp(y)$

  2. Logarithm function: $\log(x)$

    continuous, strictly monotonically increasing, bijective $(0, \infty)\rightarrow \mathbb{R}$
    $\log(xy) = \log(x) + \log(y)$
    inverse of exponential

  3. Polynomials of degree $m$: $f(x) = a_mx^m + a_{m-1}x^{m-1} + … + a_1 x + a_0$

    continuous $\mathbb{R}\rightarrow \mathbb{R}$

  4. Power series: $f(x) = \sum_{k=0}^\infty a_k x^k$ with a radius of convergence $r$ ($f:(-r,r)\rightarrow \mathbb{R}$)

    infinite polynomials

Differentiability

  • Linear (affine) function: $g(x) = a_1x + a_0 = m (x-x_0) + c$
    • $c = g(x_0), m = \frac{g(x) - g(x_0)}{x-x_0}$
  • Linear Approximation:
    • slope at $x_0$: $f'(x_0):= \lim_{x\rightarrow x_0} \frac{f(x) - f(x_0)}{x - x_0} =: \frac{df}{dx}(x_0)$

Definition

$f:I\rightarrow \mathbb{R}$ is differentiable at $x_0$ if

  • there is a function $\Delta_{f, x_0}: I\rightarrow\mathbb{R}$ with $f(x) = f(x_0) + (x-x_0)\Delta_{f, x_0}(x)$ and $\Delta_{f, x_0}$ is continuous at $x_0$.
  • equivalently, $\lim_{x\rightarrow x_0} \frac{f(x) - f(x_0)}{x-x_0}$ exists
  • i.e., the ‘slope’ function $\Delta_{f, x_0}$ can be extended continuously to $x_0$
  • $f: I\rightarrow\mathbb{R}$ is differentiable if $f$ is differentiable at all $x\in I$; $f':I\rightarrow \mathbb{R}$ is called the derivative of $f$ (a difference-quotient sketch follows this list)
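
As mentioned above, a quick numerical sketch of the difference quotient: for small \(h\), \(\frac{f(x_0+h)-f(x_0)}{h}\) approaches \(f'(x_0)\). Here \(f=\exp\) and \(x_0=1\) are arbitrary illustrative choices:

```python
import math

# Difference quotient (f(x0 + h) - f(x0)) / h approaches f'(x0) as h -> 0.
f, x0 = math.exp, 1.0          # exp'(x) = exp(x), so the limit is e ~ 2.71828

for h in (1e-1, 1e-3, 1e-6):
    slope = (f(x0 + h) - f(x0)) / h
    print(f"h = {h:.0e}:  difference quotient = {slope:.8f}")
print("exact derivative exp'(1) =", math.exp(1.0))
```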

Properties

  • differentiability at $x_0 \Rightarrow$ continuity
  • $f,g$ differentiable at $x_0$
    • $f+g$ is differentiable at $x_0$ with $(f+g)' = f'+g'$
    • Product Rule: $f\cdot g$ is differentiable at $x_0$ with $(f\cdot g)'(x_0) = f'(x_0)g(x_0)+g'(x_0)f(x_0)$
    • Chain Rule: composition $(f\circ g)(x) = f(g(x))$ \(\frac{df(g(x))}{dx}\bigg\rvert_{x=x_0} = \frac{df(y)}{dy}\bigg\rvert_{y=g(x_0)}\cdot \frac{dg(x)}{dx}\bigg\rvert_{x=x_0}\)
  • Convergence Theorem: For a sequence of functions $(f_n)$, if
    • $(f_n)$ converges point-wise to $f$
    • $f_n$ is differentiable for all $n$
    • the derivatives converge uniformly to $g$: $\lim_{n\rightarrow \infty}\|f_n'-g\|_\infty = 0$
    • conclusion: $(f_n)$ converges uniformly to $f$ and $f$ is differentiable with $f'=g$.

Derivatives

Power Series: $f(x) = \sum_{k=0}^\infty a_kx^k$ with a radius of convergence $r>0$.

  • $\sum_{k=0}^\infty a_kx^k$ is uniformly convergent on each interval $[-c,c]\subseteq(-r,r)$
  • $\sum_{k=1}^\infty a_kkx^{k-1}$ is uniformly convergent on each interval $[-c,c]\subseteq(-r,r)$
  • $f'(x) = \sum_{k=1}^\infty a_kkx^{k-1}$
  • Example:
    • $\exp(x) = \sum_{k=0}^\infty \frac{x^k}{k!}$, $\exp'(x) = \exp(x)$
    • $\sin(x), \cos(x)$

Inverse Function Theorem: E.g., $\exp(x), \log(x)$

  • $f:I\rightarrow J$ bijective $\Rightarrow g=f^{-1}:J\rightarrow I$ exists
  • $f$ is differentiable at $x_0\in I$ with $f’(x_0)\ne 0$
  • $g = f^{-1}$ is continuous at $y_0 = f(x_0)$
  • conclusion: $g = f^{-1}$ is differentiable at $y_0$ with \(g'(y_0) = \frac{1}{f'(g(y_0))}\)

Local Extrema: $f$ has a local minimum (maximum) at $x_0$ if there is a neighborhood $U\subseteq I$ of $x_0$ with $f(x_0) = \min\{f(x)| x\in U \}$ (respectively $\max$)

  • If $f$ is differentiable at $x_0\in (a,b)$ and $x_0$ is a local extremum, then $f'(x_0) = 0$.
  • Rolle’s Theorem: $f:[a,b]\rightarrow \mathbb{R}$ continuous on $[a,b]$, differentiable on $(a,b)$, and $f(a) = f(b)$. Then, there is $\hat{x}\in(a,b)$ with $f'(\hat{x}) = 0$.

Mean Value Theorem: $f:[a,b]\rightarrow \mathbb{R}$ differentiable. Then there exists $\hat{x}\in(a,b)$ with $f'(\hat{x}) = \frac{f(b) - f(a)}{b - a}$.

  • Extended Mean Value Theorem: $f,g:[a,b]\rightarrow \mathbb{R}$ differentiable and $g'(x)\ne 0$ for all $x\in (a,b)$. Then there exists $\hat{x}\in(a,b)$ with $\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(\hat{x})}{g'(\hat{x})}$.

L’Hospital’s Rule: $f,g$ defined on an open interval $I$

  • $f,g$ differentiable on $I\setminus\{c\}$
  • $\lim_{x\rightarrow c} f(x) = \lim_{x\rightarrow c} g(x) = 0$ or $\pm \infty$ (indeterminate form)
  • $g'(x)\ne 0$ for all $x\ne c$.
  • $\lim_{x\rightarrow c} \frac{f'(x)}{g'(x)}$ exists
  • conclusion: $\lim_{x\rightarrow c} \frac{f(x)}{g(x)}= \lim_{x\rightarrow c} \frac{f'(x)}{g'(x)}$

Higher-order Derivatives: Let $f^{(0)} = f$ and define $f^{(n)}:=(f^{(n-1)})’ = \frac{d^n f}{dx^n}$ for $n>0$.

  • n-times differentiable if $f^{(n)}$ exists.
  • n-times continuously differentiable if $f^{(n)}$ exists and is continuous ($C^n(I)$).
  • $\infty-$times differentiable: $f^{(n)}$ exists for all $n$ ($C^\infty(I)$)
  • $C^\infty(I)\subseteq \dots \subseteq C^2(I)\subseteq C^1(I)\subseteq C(I)$
  • $f'(c) = 0, f''(c)>0 \Rightarrow c$ is a local minimum
  • $f'(c) = 0, f''(c)<0 \Rightarrow c$ is a local maximum

Taylor’s Expansion

  • Linear approximation: $f(x_0+h) = f(x_0) + f'(x_0)h + r(h)h$
  • Quadratic approximation: $f(x_0+h) = f(x_0) + f'(x_0)h + \frac{1}{2}f''(x_0)h^2 + r(h)h^2$
  • Taylor’s Theorem: $f$ is $(n+1)$-times differentiable, $x_0, x_0+h\in I$
    • $f(x_0+h) = \sum_{k=0}^n \frac{f^{(k)}(x_0)}{k!} h^k + R_n(h)$
    • there is $\xi\in (x_0-|h|, x_0+|h|)$ such that $R_n(h) = \frac{f^{(n+1)}(\xi)}{(n+1)!} h^{n+1}$ ($O(h^{n+1})$)
  • Example approximation:
    • $\log(1+x) \approx x - x^2/2 + x^3/3$
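
A small numerical check of the example above: truncating \(\log(1+x)\) after the \(x^3\) term leaves a remainder of order \(h^4\), so the error divided by \(h^4\) should approach a constant:

```python
import math

# Cubic Taylor polynomial of log(1 + x) around 0: x - x^2/2 + x^3/3.
# The remainder R_3(h) should behave like O(h^4).
taylor3 = lambda h: h - h**2 / 2 + h**3 / 3

for h in (0.2, 0.1, 0.05, 0.025):
    err = abs(math.log(1 + h) - taylor3(h))
    print(f"h = {h:<6}  |error| = {err:.3e}   error / h^4 = {err / h**4:.4f}")
# error / h^4 stays roughly constant (close to 1/4), consistent with the O(h^4) remainder.
```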

Integral

Riemann Integral

  • Partition: a partition of $[a,b]$ is a set $\{x_0,x_1,x_2,…,x_n\}$ with $ a = x_0 < x_1<x_2<…<x_n = b$.
  • Step function: $\Phi$ on $[a,b]$ is a step function if it is piecewise constant.
    • there is a partition of $[a,b]$, $\{x_0,x_1,x_2,…,x_n\}$, and there are numbers $C_1,C_2, …, C_n\in \mathbb{R}$ such that $\Phi(x) = C_j$ for $x\in(x_{j-1}, x_j)$.
  • Riemann Integral for a step function: $\int_a^b\Phi(x)dx := \sum_{j=1}^n C_j(x_j - x_{j-1})$
    • well-defined: (the exact partition doesn’t matter)
    • linear: $\int_a^b (c_1\phi(x)+c_2\psi(x))dx = c_1\int_a^b \phi(x)dx + c_2\int_a^b \psi(x)dx$
    • monotonic: $\phi\le \psi\Rightarrow \int_a^b \phi(x)dx \le \int_a^b \psi(x)dx$
  • Riemann integrable: a bounded function $f:[a,b]\rightarrow \mathbb{R}$ with \(\sup\{\int_a^b \phi(x)dx \mid \phi \le f\} = \inf\{\int_a^b \phi(x)dx \mid \phi \ge f\}\), where \(\phi\) ranges over step functions; this common value is \(\int_a^b f(x)dx\) (see the sketch after this list)
  • Examples
    • Dirichlet function $f:\mathbb{R}\rightarrow \{0,1\}$ (indicator of the rationals): not integrable on any $[a,b]$
    • Continuous function on a compact interval $[a,b]$: integrable
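
As referenced above, a minimal sketch of the Riemann (Darboux) construction: approximate \(f\) from below and above by step functions on an equispaced partition; for a continuous \(f\) (here \(f(x)=x^2\) on \([0,1]\), an arbitrary choice) the lower and upper sums squeeze the same value (NumPy assumed):

```python
import numpy as np

# Lower and upper step-function (Darboux) sums for f(x) = x^2 on [0, 1].
# As the partition refines, both approach the integral 1/3.
f = lambda x: x**2

for n in (10, 100, 1000, 10000):
    x = np.linspace(0.0, 1.0, n + 1)          # partition a = x_0 < ... < x_n = b
    widths = np.diff(x)
    # f is increasing on [0,1], so the inf/sup on each piece sit at the endpoints.
    lower = np.sum(f(x[:-1]) * widths)
    upper = np.sum(f(x[1:]) * widths)
    print(f"n={n:6d}  lower={lower:.6f}  upper={upper:.6f}")
print("exact integral = 1/3 =", 1/3)
```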

Fundamental Theorems of Calculus

  • Antiderivative: Let $f$ be a continuous function on $I$. A differentiable function $F$ on $I$ with $F'=f$ is called an antiderivative of $f$.
  • 1st Fundamental Theorem: $F(x):= \int_a^x f(t)dt$ is differentiable and an antiderivative of $f$: $F'=f$.
  • 2nd Fundamental Theorem: $\int_a^b f(t)dt = F(b) - F(a)$ for any antiderivative $F$ of $f$
  • Mean Value Theorem (for integrals): $f,g:[a,b]\rightarrow \mathbb{R}$ continuous, $g\ge 0$.
    • there is $\hat{x}\in [a,b]$ with $\int_a^bf(x)g(x)dx = f(\hat{x})\int_a^b g(x)dx$
    • when $g=1$: $f(\hat{x}) = \frac{1}{b - a}\int_a^b f(x)dx $

Integration Rules

  • Integration by substitution: $f:I\rightarrow \mathbb{R}$ continuous, $\phi: [a,b]\rightarrow I$ continuously differentiable \(\int_a^b f(\phi(t))\phi'(t)dt = \int_{\phi(a)}^{\phi(b)}f(x)dx\)
  • Integration by parts: $f,g: I\rightarrow \mathbb{R}$ continuously differentiable \(\int_a^b f'(x)g(x)dx = f(x)g(x)\bigg\rvert_{x=a}^{x=b} - \int_a^b f(x)g'(x)dx\)
  • Partial fraction decomposition: $f(x) = \frac{p(x)}{q(x)}$ is a rational function with deg(p) < deg(q) = n.
    • $k\le n$ different zeros: $x_1,x_2,…,x_k$ with multiplicities $\alpha_1,\alpha_2,…,\alpha_k$
    • decompose into a sum of terms $\frac{A_{i,j}}{(x-x_i)^j}$, $j = 1,\dots,\alpha_i$

Improper Integrals

  • Comparison Test for Integrals: $f,g: [a,\infty)\rightarrow \mathbb{R}$ with $g(x)\ge 0$
    • If $|f(x)|\le g(x)$ and $\int_a^\infty g(x)dx$ converges $\Rightarrow \int_a^\infty f(x)dx$ converges
    • if $g(x)\le f(x)$ and $\int_a^\infty g(x)dx$ diverges $\Rightarrow \int_a^\infty f(x)dx$ diverges
  • Unbounded functions: $f:[a,b]\setminus\{p\}\rightarrow \mathbb{R}$
    • $\int_a^bf(x)dx = \lim_{\epsilon\rightarrow 0^+}\int_a^{p-\epsilon} f(x)dx+ \lim_{\delta\rightarrow 0^+}\int_{p+\delta}^b f(x)dx$ (the two limits are taken independently)
  • Cauchy Principal Value:
    • $\int_a^bf(x)dx = \lim_{\epsilon\rightarrow 0^+}\left(\int_a^{p-\epsilon} f(x)dx+\int_{p+\epsilon}^b f(x)dx\right)$ (a single symmetric limit; it can exist even when the improper integral above does not)
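
A quick illustration of the difference, using \(f(x)=1/x\) on \([-1,1]\) with the singularity at \(p=0\) (an arbitrary illustrative choice): the symmetric limit (principal value) is \(0\), while letting the two cutoffs shrink at different rates shows that the improper integral itself does not exist:

```python
import math

# Improper integral of f(x) = 1/x on [-1, 1] with singularity at p = 0.
# Antiderivative: log|x|, so each one-sided piece can be evaluated exactly.
def left_piece(eps):            # integral over [-1, -eps]
    return math.log(eps) - math.log(1.0)

def right_piece(delta):         # integral over [delta, 1]
    return math.log(1.0) - math.log(delta)

for eps in (1e-2, 1e-4, 1e-6):
    symmetric  = left_piece(eps) + right_piece(eps)       # same cutoff on both sides
    asymmetric = left_piece(eps) + right_piece(eps**2)    # right cutoff shrinks faster
    print(f"eps={eps:.0e}  symmetric = {symmetric:+.3e}   asymmetric = {asymmetric:+.3f}")

# The symmetric sum is identically 0 (the Cauchy principal value), while the
# asymmetric sum grows like log(1/eps): the two one-sided limits are infinite,
# so the improper integral itself does not exist.
```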