Abstract Linear Algebra notes
(Abstract) Vector Space
Definition
- A field $\mathbb{F}$ (often $\mathbb{R}, \mathbb{C}$)
- A set $V\ne \emptyset$ together with two operations
- vector addition $+: V \times V\rightarrow V$
- scalar multiplication $\cdot: \mathbb{F}\times V\rightarrow V$
- where the following eight rules are satisfied, is called an $\mathbb{F}$-vector space.
- $(V, +)$ is an abelian group:
- $\mathbf{u}+(\mathbf{v}+\mathbf{w}) = (\mathbf{u}+\mathbf{v})+\mathbf{w}$ (associativity of $+$)
- $\mathbf{v}+\mathbf{0} = \mathbf{v}, \mathbf{0}\in V$ (neutral element)
- $\mathbf{v}+(-\mathbf{v}) = \mathbf{0}, -\mathbf{v} \in V$ (inverse elements)
- $\mathbf{v}+\mathbf{w} = \mathbf{w}+\mathbf{v}$ (commutativity of $+$)
- scalar multiplication is compatible with field multiplication
- $\lambda\cdot (\mu\cdot \mathbf{v}) = (\lambda\cdot \mu)\cdot \mathbf{v}$
- $1\cdot \mathbf{v} =\mathbf{v}, 1\in \mathbb{F}$ (multiplicative unit from the field)
- distributive laws
- $\lambda\cdot (\mathbf{v}+\mathbf{w}) = \lambda\cdot \mathbf{v} + \lambda\cdot \mathbf{w}$
- $(\lambda + \mu)\cdot \mathbf{v} = \lambda \cdot \mathbf{v} + \mu \cdot \mathbf{v}$
- Example:
- Function space: a set of functions between two fixed sets.
- $\mathcal{F}(I):=\{f:I\rightarrow \mathbb{R}\}$ defines a real vector space:
- vector addition: $(f+g)(x):= f(x) + g(x)$
- scaling: $(\lambda f)(x):= \lambda\cdot f(x)$
- zero: $f(x) = 0$
- Polynomial space: $\mathcal{P}(\mathbb{R}):=\{p:\mathbb{R}\rightarrow\mathbb{R} \mid p \text{ a polynomial}\}$
- $\mathcal{P}(\mathbb{R})\subset \mathcal{F}(\mathbb{R})$, linear subspace
- Consequences of the axioms: $0\cdot \mathbf{v} = \mathbf{0}$, $(-1)\cdot \mathbf{v} = -\mathbf{v}$
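The function-space example can be sketched directly in code: functions are the vectors, and the operations are defined pointwise. A minimal sketch; all names below are illustrative, not part of the notes.

```python
import math

def add(f, g):
    """Vector addition in F(I): (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(lam, f):
    """Scalar multiplication: (lam * f)(x) = lam * f(x)."""
    return lambda x: lam * f(x)

zero = lambda x: 0.0  # the zero vector of F(I)

# h = 2*sin + cos; at x = 0 this gives 2*sin(0) + cos(0) = 1
h = add(scale(2.0, math.sin), math.cos)
```

Note that `add` and `scale` return new functions, mirroring the fact that $+$ and $\cdot$ map back into $\mathcal{F}(I)$.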
Subspaces
- Linear subspaces: vector space inside another
- If $V$ is an $\mathbb{F}$-vector space, then $U\subseteq V$ is itself an $\mathbb{F}$-vector space and a linear subspace of $V$ if
- $\mathbf{0}\in U$
- $\mathbf{u},\mathbf{v}\in U \Rightarrow \mathbf{u}+\mathbf{v}\in U$
- $\mathbf{u}\in U, \lambda\in \mathbb{F}\Rightarrow \lambda\cdot \mathbf{u}\in U$
- Examples: $\mathcal{P}_2(\mathbb{R})$ polynomials with degree $\le 2$, $\mathcal{P}_2(\mathbb{R})\subseteq \mathcal{P}_4(\mathbb{R})\subseteq \mathcal{F}(\mathbb{R})$
- Linear Combination $\sum_{j=1}^k\alpha_j v_j$: $v_1,v_2,…,v_k\in V$, $\alpha_1,\alpha_2,…,\alpha_k\in\mathbb{F}$
- Span: the set of all possible linear combinations with vectors from $M\subseteq V$
- span$(\emptyset):= \{\mathbf{0}\}$
- Generating set: $M\subseteq V$ is a generating set of a subspace $U\subseteq V$ if span$(M) = U$.
- Linear Independence: a set $M$ is linearly independent if for all $k\in\mathbb{N}$ and distinct $v_1,…,v_k\in M$,
- $\sum_{j=1}^k \alpha_j v_j = 0 \Rightarrow \alpha_1 = \alpha_2 = \dots = \alpha_k = 0$
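For concrete vectors in $\mathbb{F}^n$, linear independence can be checked numerically: the set is independent exactly when the matrix with the vectors as columns has full column rank. A minimal sketch with made-up vectors:

```python
import numpy as np

# Columns are candidate vectors v1, v2, v3 in R^3 (illustrative data);
# here v3 = v1 + v2, so the set is linearly dependent.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 2.0]])

# sum_j alpha_j v_j = 0 has only the trivial solution exactly when
# the rank equals the number of vectors.
independent = np.linalg.matrix_rank(V) == V.shape[1]
```

Here `independent` comes out `False`, since the rank is 2 for three vectors.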
- Basis: A set $M\subseteq V$, or an ordered family $M=(v_1,v_2,…,v_k)$, is called a basis of a subspace $U\subseteq V$ if $M$ is generating and linearly independent.
- Dimension: the number of elements in a basis of $U$ (0,1,2,…,$\infty$)
- Example:
- dim($\mathcal{P}_n(\mathbb{R})$) = $n+1$
- dim($\mathcal{F}(\mathbb{R})$) = $\infty$
- dim($\mathbb{C}^{2\times 3}$) = 6 (complex $2\times 3$ matrices, as a $\mathbb{C}$-vector space)
- Coordinates: given a basis $B = (b_1,b_2,…,b_n)$ of $V$, each $v\in V$ can be uniquely written as $v = \sum_{j=1}^n \alpha_j b_j$ with $\alpha_j\in\mathbb{F}$. The $\alpha_j$ are the coordinates of $v$ with respect to $B$.
- Basis Isomorphism: the linear and bijective map $\Phi_B: V\rightarrow \mathbb{F}^n$ between the abstract vector space $V$ and the concrete vector space $\mathbb{F}^n$, with $\Phi_B(b_j) = e_j$, the canonical unit vectors.
- Examples:
- Subspace of the function space $U=\text{span}(\cos,\sin,\exp)$: $2\cos+7\exp \mapsto (2,0,7)$
- Change of basis:
- $P_2(\mathbb{R})$, $m_0(x) = 1, m_1(x) = x, m_2(x) = x^2$.
- $B = (m_0,m_1,m_2)$: $p=-2m_0+8m_1+3m_2 = \Phi^{-1}_B([-2,8,3])$
- $C = (m_1,m_0,3m_2+8m_1)$: $p=-2m_0+8m_1+3m_2 = \Phi^{-1}_C([0,-2,1])$
- change from $B$ to $C$: $\Phi_C\circ \Phi_B^{-1}: \mathbb{F}^n\rightarrow \mathbb{F}^n$
- Transformation/transition Matrix: $\Phi^{-1}_B(v_B) = \Phi^{-1}_C(v_C)$ in $V$
- $T_{C\leftarrow B} = \Phi_C\circ \Phi_B^{-1}, v_C=T_{C\leftarrow B} v_B$
- $T_{B\leftarrow C} = \Phi_B\circ \Phi_C^{-1}, v_B=T_{B\leftarrow C} v_C$
- $T_{B\leftarrow C} = T_{B\leftarrow E}T_{E\leftarrow C} = T_{B\leftarrow E}T_{C\leftarrow E}^{-1} $
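The $P_2$ change-of-basis example can be verified numerically. The columns of $T_{B\leftarrow C}$ are the $B$-coordinates of the vectors in $C$; the numbers below are exactly those from the example above.

```python
import numpy as np

# B = (m0, m1, m2), C = (m1, m0, 3*m2 + 8*m1).
# Columns of T_{B<-C}: B-coordinates of m1, m0, and 3*m2 + 8*m1.
T_B_from_C = np.array([[0.0, 1.0, 0.0],
                       [1.0, 0.0, 8.0],
                       [0.0, 0.0, 3.0]])

T_C_from_B = np.linalg.inv(T_B_from_C)  # T_{C<-B} = T_{B<-C}^{-1}

v_B = np.array([-2.0, 8.0, 3.0])        # p = -2*m0 + 8*m1 + 3*m2
v_C = T_C_from_B @ v_B                  # expected: (0, -2, 1)
```

This reproduces $\Phi_C(p) = (0,-2,1)$ from the example.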
Inner Product Space (vector space + inner product)
- Inner Product: $\langle\cdot,\cdot\rangle: V\times V\rightarrow \mathbb{F}$. $\langle u,v\rangle_{standard} = \sum_i \overline{u_i}v_i$
- Positive definite: $\langle x,x\rangle\ge 0 (\in\mathbb{R})$ for all $x\in V$. $\langle x,x\rangle=0\Leftrightarrow x=\mathbf{0}$
- Linear in the second argument:
- $\langle x,y+z\rangle = \langle x,y\rangle +\langle x,z\rangle$
- $\langle x,\lambda\cdot y\rangle = \lambda \langle x,y\rangle$
- conjugate symmetric: $\langle x,y\rangle = \overline{\langle y, x\rangle}$
- Hermitian (self-adjoint) Matrix: $A\in\mathbb{F}^{n\times n}, A^*=A, (a_{ij} = \overline{a_{ji}})$
- $\Leftrightarrow \langle y,Ax\rangle_{standard} = \langle Ay,x\rangle_{standard}$
- diagonal values are real
- all eigenvalues are real
- determinant is real
- sum, inverse of Hermitian is Hermitian
- Positive Definite: A Hermitian $A$ with $\langle x,Ax\rangle > 0$ for all $x\in \mathbb{F}^n\setminus\{\mathbf{0}\}$.
- $\langle y,Ax\rangle_{standard}$ defines an inner product in $\mathbb{F}^n$
- The following are equivalent for a Hermitian $A$:
- All eigenvalues of $A$ are positive
- After Gaussian elimination using only row-addition operations (no scaling, no row exchanges), all pivots in the row echelon form are positive
- All leading principal minors of $A$ are positive
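These criteria can be checked numerically on a small made-up Hermitian matrix. The Cholesky test at the end is an additional practical criterion (it succeeds exactly for positive definite matrices), not one of the bullets above.

```python
import numpy as np

# Illustrative Hermitian (here real symmetric) matrix, chosen to be
# positive definite: eigenvalues are 1 and 3.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# (i) all eigenvalues positive
eigs_positive = bool(np.all(np.linalg.eigvalsh(A) > 0))

# (ii) all leading principal minors positive
minors_positive = all(np.linalg.det(A[:k, :k]) > 0
                      for k in range(1, A.shape[0] + 1))

# (iii) Cholesky factorization succeeds iff A is positive definite
try:
    np.linalg.cholesky(A)
    cholesky_ok = True
except np.linalg.LinAlgError:
    cholesky_ok = False
```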
- Gram (Gramian) Matrix: for a set of vectors $(v_i)$, $G_{ij} = \langle v_i,v_j \rangle$; for the standard inner product, $G = A^*A$ where the columns of $A$ are the $v_i$
- Hermitian, positive-semidefinite
- Can be used to calculate the projection
- Inner product norm (length)
- An inner product defines a norm $\lVert x\rVert:= \sqrt{\langle x,x\rangle}$
- Cauchy-Schwarz Inequality:
- $|\langle y,x\rangle|\le \lVert x\rVert\lVert y\rVert$
- $|\langle y,x\rangle|= \lVert x\rVert\lVert y\rVert \Leftrightarrow x,y$ linearly dependent.
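A quick numerical check of the inequality and its equality case, using random complex vectors chosen only for illustration (`np.vdot` conjugates its first argument, matching the convention $\langle y,x\rangle = \sum_i \overline{y_i}x_i$):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4) + 1j * rng.normal(size=4)
y = rng.normal(size=4) + 1j * rng.normal(size=4)

# |<y, x>| <= ||x|| * ||y||
lhs = abs(np.vdot(y, x))
rhs = np.linalg.norm(x) * np.linalg.norm(y)

# Equality holds for linearly dependent vectors, e.g. y = 2x:
lhs_dep = abs(np.vdot(2 * x, x))
rhs_dep = np.linalg.norm(x) * np.linalg.norm(2 * x)
```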
Orthogonality
- Definition: $x\perp y$ if $\langle x,y\rangle = 0$
- E.g., $p_1(x) = x$, $p_2(x)=x^2$ in $P([-1,1], \mathbb{F})$ are orthogonal with respect to $\langle f,g\rangle = \int_{-1}^1 \overline{f(x)}g(x)\,dx$
- Orthogonal complement of $M\subseteq V, M\ne \emptyset: M^\perp := \{x\in V | \langle x,m\rangle = 0, \forall m\in M\}$
- $U\cap U^\perp = \{\mathbf{0}\}$ for subspace $U\subseteq V$
- Orthogonal projection: $U=\text{span}(b_1,…,b_k)\subseteq V$ a k-dimensional subspace. There exists a unique decomposition of $x=p+n, x\in V$ with $p\in U, n\in U^\perp$
- $p$: orthogonal projection of $x$ onto $U$
- $n$: normal component of $x$ with respect to $U$
- Approximation Formula: For $x\in V$ and a k-dimensional subspace $U\subseteq V$, dist$(x,U):= \inf\{\lVert x- u\rVert \mid u\in U\} = \lVert x- p\rVert$
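For a concrete subspace of $\mathbb{F}^n$, the projection $p$ and the distance can be computed via the normal equations with the Gram matrix. A sketch with made-up data:

```python
import numpy as np

# U = span(b1, b2) in R^3; columns of B are the b_i (illustrative data).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])

# Solve G c = B^T x with Gram matrix G = B^T B, then p = B c is the
# orthogonal projection of x onto U, and n = x - p lies in U^perp.
G = B.T @ B
c = np.linalg.solve(G, B.T @ x)
p = B @ c
n = x - p

dist = np.linalg.norm(n)  # dist(x, U) = ||x - p||
```

Here $p = (1,2,0)$, $n = (0,0,3)$, and dist$(x,U) = 3$.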
- Orthonormal Basis: $B=\{b_1,b_2,…,b_n\}$, $U\subseteq V$ an $n$-dimensional subspace
- orthogonal system (OS) if $\langle b_i,b_j\rangle = 0, \forall i\ne j$
- orthonormal system (ONS) if $\langle b_i,b_j\rangle = \delta_{ij}$
- orthogonal basis (OB) if it’s an OS and a basis of $U$
- orthonormal basis (ONB) if it’s an ONS and a basis of $U$
- Gram matrix of ONB: $G(B) = \mathbb{I}^{n\times n}$
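An ONB can be produced from a linearly independent list by classical Gram-Schmidt (a standard construction, sketched here with illustrative vectors); the Gram matrix of the result is then the identity, as stated above.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a linearly
    independent list of vectors."""
    onb = []
    for v in vectors:
        # subtract the projections onto the already-built ONS
        w = v - sum(np.vdot(b, v) * b for b in onb)
        onb.append(w / np.linalg.norm(w))
    return onb

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
onb = gram_schmidt(vs)

# Gram matrix of an ONB: G_ij = <b_i, b_j> = delta_ij
G = np.array([[np.vdot(bi, bj) for bj in onb] for bi in onb])
```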