From caa723146c6e478767760cb766701fa2aa173e89 Mon Sep 17 00:00:00 2001
From: JJ
Date: Mon, 15 Apr 2024 21:20:32 -0700
Subject: meow

---
 mathematics/linear-algebra.md | 185 ++++++++++++++++++++++++++++++++++++++++--
 1 file changed, 179 insertions(+), 6 deletions(-)

(limited to 'mathematics/linear-algebra.md')

diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md
index ce24abd..95e0138 100644
--- a/mathematics/linear-algebra.md
+++ b/mathematics/linear-algebra.md
@@ -241,7 +241,7 @@ Theorem: For $V$ and $W$ of equal (and finite) dimension: $T$ is injective iff i
 Proof
-We have that $T$ is injective iff $N(T) = \\{0\\}$ and thus iff $nullity(T) = 0$. So $T$ is injective iff $rank(T) = dim(W)$ and $dim(R(T)) = dim(W)$ from the Rank-Nullity Theorem. $dim(R(T)) = dim(W)$ is equivalent to $R(T) = W$, which is the definition of surjectivity.
+We have that $T$ is injective iff $N(T) = \\{0\\}$ and thus iff $nullity(T) = 0$. By the Rank-Nullity Theorem, this holds iff $rank(T) = dim(V) = dim(W)$, that is, iff $dim(R(T)) = dim(W)$. As $R(T)$ is a subspace of $W$, $dim(R(T)) = dim(W)$ is equivalent to $R(T) = W$, which is the definition of surjectivity.
 So $T$ is injective iff it is surjective. ∎
@@ -282,8 +282,11 @@ Theorem: Let $T : V → W$ and $U : W → Z$ be linear. Then their composition $
 Let $x,y ∈ V$ and $c ∈ F$. Then:
 $$UT(cx + y)$$
+
 $$= U(T(cx + y)) = U(cT(x) + T(y))$$
+
 $$= cU(T(x)) + U(T(y)) = c(UT)(x) + UT(y)$$
+
 
 Theorem: Let $T, U_1, U_2 ∈ \mathcal{L}(V)$, and $a ∈ F$. Then:
@@ -460,7 +463,7 @@ Theorem: For any linear transformation $T : V → W$, the mapping $T^t : W^* →
 ...
 
-## Homogeneous Linear Differential Equations
+
 
 ## Systems of Linear Equations
@@ -470,10 +473,6 @@ This section is mostly review.
 
 ## Determinants
 
-Let $A \begin{bmatrix} a & b \\\\ c & d \end{bmatrix}$.
-
-We define the **determinant** of any 2 × 2 matrix $A$ to be the scalar $ad - bc$, and denote it $det(A)$ or $|A|$.
-
 ...
 
 Let $Ax = b$ be the matrix form of a system of $n$ linear equations in $n$ unknowns (where $x = (x_1, x_2, ..., x_n)^t$).
@@ -487,3 +486,177 @@ Cramer's Rule: If $det(A) ≠ 0$, then the system $Ax = b$ has a *unique* soluti
 ...
 
 -->
+
+---
+
+## Determinants, summarized
+
+Determinants are important for future sections. We state facts here without proof.
+
+Let $A$ be a matrix containing entries from a field $F$.
+
+The **determinant** of an $n × n$ (square) matrix $A$ is a scalar in $F$, and is denoted $|A|$. The determinant is calculated as follows:
+- For a $1 × 1$ matrix $A = [a]$, $|A| = a$
+- For a $2 × 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, $|A| = ad - bc$
+- For an $n × n$ matrix $A = \begin{bmatrix} a_{1,1} & a_{1,2} & ... & a_{1,n} \\ a_{2,1} & a_{2,2} & ... & a_{2,n} \\ ⋮ & ⋮ & ⋱ & ⋮ \\ a_{n,1} & a_{n,2} & ... & a_{n,n} \end{bmatrix}$, $|A| = Σ^n_{i=1} (-1)^{i+j} A_{i,j} |\tilde{A}_{i,j}|$ for any fixed column $j$.
+
+The last one deserves some additional exposition. It is *cofactor expansion* along the $j$-th column: $\tilde{A}_{i,j}$ denotes the $(n-1) × (n-1)$ matrix obtained from $A$ by deleting row $i$ and column $j$, so the determinant is defined recursively in terms of determinants of smaller matrices, with alternating signs. Expanding along any column $j$ (or, symmetrically, along any row $i$) yields the same value. For example, expanding along the first column:
+
+$$\begin{vmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{vmatrix} = 1\begin{vmatrix} 5 & 6 \\ 8 & 10 \end{vmatrix} - 4\begin{vmatrix} 2 & 3 \\ 8 & 10 \end{vmatrix} + 7\begin{vmatrix} 2 & 3 \\ 5 & 6 \end{vmatrix} = 1(2) - 4(-4) + 7(-3) = -3$$
+
+(A code sketch of this recursion appears at the end of this section.)
+
+The determinant has a number of nice properties that make it of fair interest.
+1. $|B| = -|A|$ if $B$ is a matrix obtained by exchanging any two rows or columns of $A$
+2. $|B| = k|A|$ if $B$ is a matrix obtained by multiplying a single row or column of $A$ by some scalar $k$
+3. $|B| = |A|$ if $B$ is a matrix obtained by adding a multiple of a column or row to a *different* column or row
+4. $|A| = 1$ if $A$ is the identity matrix
+5. $|A| = 0$ if either the rows or columns of $A$ are not linearly independent
+6. $|AB| = |A||B|$ if $A$ and $B$ are both $n × n$ matrices
+7. $|A| ≠ 0$ iff $A$ is invertible
+8. $|A| = |A^t|$
+9. $|A| = |B|$ if $A$ and $B$ are *similar*
+
+Thus, we can say that the determinant *characterizes* square matrices (and thus linear operators), somewhat. It is a scalar value with a deep relation to the core identity of the matrix, and it changes predictably as the matrix changes.
+
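+Below is a minimal sketch of the recursive cofactor expansion just described, in Python (assuming numpy is available; the function name `det_cofactor` is our own), checked against numpy's built-in determinant:
+
+```python
+import numpy as np
+
+def det_cofactor(A):
+    """Determinant via cofactor expansion along column 0, recursively."""
+    n = A.shape[0]
+    if n == 1:
+        return A[0, 0]
+    total = 0.0
+    for i in range(n):
+        # minor: A with row i and column 0 deleted
+        minor = np.delete(np.delete(A, i, axis=0), 0, axis=1)
+        total += (-1) ** i * A[i, 0] * det_cofactor(minor)
+    return total
+
+A = np.array([[1.0, 2.0, 3.0],
+              [4.0, 5.0, 6.0],
+              [7.0, 8.0, 10.0]])
+print(det_cofactor(A))    # -3.0, matching the worked example above
+print(np.linalg.det(A))   # ~-3.0 (numpy uses LU factorization instead)
+```
+
+(The recursion is $O(n!)$ and is meant for exposition; practical implementations reduce to triangular form instead, using properties 1 through 3 above.)
+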
+## Eigenvalues and Eigenvectors
+
+- Let $V$ be a finite-dimensional vector space over a field $F$.
+- Let $T: V → V$ be a linear operator on $V$.
+- Let $β$ be an ordered basis for $V$.
+- Let $A$ be in $M_{n×n}(F)$ (a square $n×n$ matrix with entries in $F$).
+
+$T$ is **diagonalizable** if there exists an ordered basis $β$ for $V$ such that $[T]_β$ is a *diagonal matrix*.
+$A$ is **diagonalizable** if $L_A$ is diagonalizable.
+
+A nonzero vector $v ∈ V$ is an **eigenvector** of $T$ if $∃λ ∈ F$: $T(v) = λv$.
+The corresponding scalar $λ$ is the **eigenvalue** corresponding to the eigenvector $v$.
+A nonzero vector $v ∈ F^n$ is an **eigenvector** of $A$ if $v$ is an eigenvector of $L_A$ (that is, $∃λ ∈ F$: $Av = λv$).
+The corresponding scalar $λ$ is the **eigenvalue** of $A$ corresponding to the eigenvector $v$.
+
+The terms *characteristic vector* and *proper vector* are also used in place of *eigenvector*.
+The terms *characteristic value* and *proper value* are also used in place of *eigenvalue*.
+
+Theorem: $T: V → V$ is diagonalizable if and only if $V$ has an ordered basis $β$ consisting of eigenvectors of $T$.
+
+Proof
+...
+
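+A quick numeric check of the definitions above, as a sketch (assuming numpy; the matrix is an arbitrary example of ours):
+
+```python
+import numpy as np
+
+# A has eigenvalues 3 and 1, with eigenvectors (1, 1) and (1, -1)
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+v = np.array([1.0, 1.0])
+
+print(A @ v)                       # [3. 3.]
+print(np.allclose(A @ v, 3 * v))   # True: Av = 3v, so v is an eigenvector
+```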
+
+Corollary: If $T$ is diagonalizable, and $β = \{v_1, v_2, ..., v_n\}$ is an ordered basis of eigenvectors of $T$, and $D = [T]_β$, then $D$ is a diagonal matrix with $D_{i,i}$ being the eigenvalue corresponding to $v_i$ for each $i ≤ n$.
+
+Proof
+...
+
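+In matrix terms, the corollary says: if the columns of $P$ form an eigenbasis, then $D = P^{-1}AP$ is diagonal, with the eigenvalues appearing on the diagonal in the corresponding order. A sketch (assuming numpy; same example matrix as above):
+
+```python
+import numpy as np
+
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+evals, P = np.linalg.eig(A)     # columns of P are eigenvectors of A
+
+D = np.linalg.inv(P) @ A @ P    # A expressed in the eigenbasis
+print(np.round(D, 10))          # [[3. 0.] [0. 1.]] (up to eigenvalue order)
+print(evals)                    # [3. 1.]
+```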
+
+To *diagonalize* a matrix (or a linear operator) is to find a basis of eigenvectors and the corresponding eigenvalues.
+
+Theorem: A scalar $λ$ is an eigenvalue of $A$ if and only if $|A - λI_n| = 0$; that is, the eigenvalues of a matrix are exactly the zeros of its characteristic polynomial (defined below).
+
+Proof
+A scalar $λ$ is an eigenvalue of $A$ iff $∃v ≠ 0 ∈ F^n$: $Av = λv$, that is, $(A - λI_n)(v) = 0$.
+Such a nonzero $v$ exists iff $N(A - λI_n) ≠ \{0\}$, i.e. iff $A - λI_n$ is not invertible, which holds iff $|A - λI_n| = 0$ (property 7 above). ∎
+
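+Numerically (a sketch, assuming numpy, with the running example matrix): $|A - λI_n|$ vanishes exactly at the eigenvalues.
+
+```python
+import numpy as np
+
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+
+# det(A - λI) is 0 at the eigenvalues λ = 3 and λ = 1, nonzero elsewhere
+for lam in [3.0, 1.0, 2.0]:
+    print(lam, np.linalg.det(A - lam * np.eye(2)))   # 0.0, 0.0, then -1.0
+```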
+
+The **characteristic polynomial** of $A$ is the polynomial $f(t) = |A - tI_n|$.
+The **characteristic polynomial** of $T$ is the characteristic polynomial of $[T]_β$, often denoted $f(t) = |T - tI|$. (This is well-defined: matrices representing $T$ in different ordered bases are similar, and similar matrices share a characteristic polynomial, as $Q^{-1}AQ - tI_n = Q^{-1}(A - tI_n)Q$.)
+
+Theorem: The characteristic polynomial of $A$ is a polynomial of degree $n$ with leading coefficient $(-1)^n$.
+
+Proof
+...
+
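+For instance (a sketch, assuming numpy): `np.poly` returns the coefficients of the *monic* polynomial $|tI_n - A|$, which is $(-1)^n$ times the characteristic polynomial as defined here, so the two coincide for even $n$.
+
+```python
+import numpy as np
+
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+
+# |tI - A| = t^2 - 4t + 3 = (t - 3)(t - 1): degree n = 2 and monic
+print(np.poly(A))             # [ 1. -4.  3.]
+print(np.roots(np.poly(A)))   # [3. 1.], the eigenvalues, as its zeros
+```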
+
+Corollary: $A$ has at most $n$ distinct eigenvalues.
+
+Proof
+...
+
+
+Theorem: A vector $v ∈ V$ is an eigenvector of $T$ corresponding to an eigenvalue $λ$ iff $v ≠ 0 ∈ N(T - λI)$.
+
+Proof
+...
+
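+Computationally, this means the eigenvectors for $λ$ are found as the nonzero vectors of the null space of $A - λI$. A sketch (assuming numpy, with the running example), using the SVD to get a null-space basis:
+
+```python
+import numpy as np
+
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+lam = 3.0
+
+# rows of Vt with (near-)zero singular values span N(A - λI)
+_, s, Vt = np.linalg.svd(A - lam * np.eye(2))
+print(Vt[s < 1e-10])   # one basis vector, proportional to (1, 1)
+```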
+
+
+## Diagonalizability
+
+- Let $V$ be a finite-dimensional vector space over a field $F$.
+- Let $T: V → V$ be a linear operator on $V$.
+- Let $A$ be in $M_{n×n}(F)$ (a square $n×n$ matrix with entries in $F$).
+- Let $λ$ be an eigenvalue of $T$.
+- Let $λ_1, λ_2, ..., λ_k$ be distinct eigenvalues of $T$.
+
+Theorem: Let $λ_1, λ_2, ..., λ_k$ be distinct eigenvalues of $T$. If $v_1, v_2, ..., v_k$ are eigenvectors of $T$ such that $λ_i$ corresponds to $v_i$ (for all $i ≤ k$), then $\{v_1, v_2, ..., v_k\}$ is linearly independent. In fewer words, eigenvectors with distinct eigenvalues are all linearly independent from one another.
+
+Proof
+...
+
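+Concretely (a sketch, assuming numpy, with the running example): stacking eigenvectors for distinct eigenvalues as columns yields a full-rank matrix.
+
+```python
+import numpy as np
+
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+evals, P = np.linalg.eig(A)       # distinct eigenvalues 3 and 1
+
+print(np.linalg.matrix_rank(P))   # 2: the two eigenvectors are independent
+```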
+
+
+Corollary: If $T$ has $n$ distinct eigenvalues, where $n = dim(V)$, then $T$ is diagonalizable.
+
+Proof
+...
+
+
+A polynomial $f(t) ∈ P(F)$ **splits over** $F$ if there are scalars $c, a_1, a_2, ..., a_n ∈ F$: $f(t) = c(a_1 - t)(a_2 - t)...(a_n - t)$.
+
+Theorem: The characteristic polynomial of any diagonalizable linear operator splits.
+
+Proof
+...
+
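+The contrapositive is often useful: if the characteristic polynomial of $T$ does not split over $F$, then $T$ is not diagonalizable over $F$. A sketch with an example of ours (assuming numpy): the 90° rotation of $ℝ^2$ has characteristic polynomial $t^2 + 1$, which does not split over $ℝ$.
+
+```python
+import numpy as np
+
+# |tI - R| = t^2 + 1 has no real roots: it splits over C as (t - i)(t + i)
+# but not over R, so R is diagonalizable over C but not over R
+R = np.array([[0.0, -1.0],
+              [1.0,  0.0]])
+print(np.poly(R))             # [1. 0. 1.], i.e. t^2 + 1
+print(np.linalg.eigvals(R))   # [0.+1.j 0.-1.j], no real eigenvalues
+```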
+
+Let $λ$ be an eigenvalue of a linear operator or matrix with characteristic polynomial $f(t)$. The **(algebraic) multiplicity** of $λ$ is the largest positive integer $k$ for which $(t-λ)^k$ is a factor of $f(t)$.
+
+Let $λ$ be an eigenvalue of $T$. The set $E_λ = \{x ∈ V: T(x) = λx\} = N(T - λI_V)$ is called the **eigenspace** of $T$ with respect to the eigenvalue $λ$. Similarly, the **eigenspace** of $A$ is the eigenspace of $L_A$.
+
+Theorem: If $λ$ has multiplicity $m$, then $1 ≤ dim(E_λ) ≤ m$.
+
+Proof
+...
+
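+The upper bound can be strict. A sketch with an example of ours (assuming numpy):
+
+```python
+import numpy as np
+
+# |A - tI| = (2 - t)^2, so λ = 2 has algebraic multiplicity m = 2;
+# but A - 2I = [[0, 1], [0, 0]] has rank 1, so dim(E_2) = 2 - 1 = 1.
+# Here 1 ≤ dim(E_2) = 1 < m = 2 (and so A is not diagonalizable)
+A = np.array([[2.0, 1.0],
+              [0.0, 2.0]])
+print(2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2)))   # 1
+```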
+
+Lemma: Let $S_i$ be a finite linearly independent subset of the eigenspace $E_{λ_i}$ (for all $i ≤ k$). Then $S = S_1 ∪ S_2 ∪ ... ∪ S_k$ is a linearly independent subset of $V$.
+
+Proof
+...
+
+
+Theorem: If the characteristic polynomial of $T$ splits, then $T$ is diagonalizable iff the multiplicity of $λ_i$ is equal to $dim(E_{λ_i})$ (for all $i ≤ k$).
+
+Proof
+...
+
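+This yields a concrete diagonalizability test: compare each eigenvalue's algebraic multiplicity against the dimension of its eigenspace. A sketch (assuming numpy; `is_diagonalizable` is a helper of ours, and its rounding of floating-point eigenvalues is a crude heuristic):
+
+```python
+import numpy as np
+
+def is_diagonalizable(A, tol=1e-8):
+    """Check multiplicity(λ) == dim(E_λ) for each eigenvalue λ of A (over C)."""
+    n = A.shape[0]
+    evals = np.linalg.eigvals(A)
+    for lam in np.unique(np.round(evals, 8)):
+        alg = int(np.sum(np.abs(evals - lam) < tol))          # multiplicity of λ
+        geo = n - np.linalg.matrix_rank(A - lam * np.eye(n))  # dim N(A - λI)
+        if alg != geo:
+            return False
+    return True
+
+print(is_diagonalizable(np.array([[2.0, 0.0], [1.0, 3.0]])))  # True
+print(is_diagonalizable(np.array([[2.0, 1.0], [0.0, 2.0]])))  # False
+```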
+
+Corollary: If the characteristic polynomial of $T$ splits, $T$ is diagonalizable, and $β_i$ is an ordered basis for $E_{λ_i}$ (for all $i ≤ k$), then $β = β_1 ∪ β_2 ∪ ... ∪ β_k$ is an ordered basis for $V$ consisting of eigenvectors of $T$.
+
+Proof
+...
+
+
+## Direct Sums
+
+- Let $V$ be a finite-dimensional vector space over a field $F$.
+- Let $T: V → V$ be a linear operator on $V$.
+- Let $W_1, W_2, ..., W_k$ be subspaces of $V$.
+
+The **sum** of some subspaces $W_i$ (for $1 ≤ i ≤ k$) is the set $\{v_1 + v_2 + ... + v_k : v_i ∈ W_i \}$, denoted $W_1 + W_2 + ... + W_k$ or $Σ^k_{i=1} W_i$.
+
+The subspaces $W_i$ (for $1 ≤ i ≤ k$) form a **direct sum** of $V$, denoted $W_1 ⊕ W_2 ⊕ ... ⊕ W_k$, if $V = Σ^k_{i=1} W_i$ and $W_j ∩ Σ_{i≠j} W_i = \{0\}$ for all $j ≤ k$.
+
+Theorem: The following conditions are equivalent:
+1. $V = W_1 ⊕ W_2 ⊕ ... ⊕ W_k$.
+2. $V = Σ^k_{i=1} W_i$ and, whenever $v_1 + v_2 + ... + v_k = 0$ with each $v_i ∈ W_i$, then $v_i = 0$ for all $i ≤ k$.
+3. Every vector $v ∈ V$ can be uniquely written as $v = v_1 + v_2 + ... + v_k$ where $v_i ∈ W_i$.
+4. If $γ_i$ is an ordered basis for $W_i$ (for $1 ≤ i ≤ k$), then $γ_1 ∪ γ_2 ∪ ... ∪ γ_k$ is an ordered basis for $V$.
+5. There exists an ordered basis $γ_i$ for $W_i$ for every $1 ≤ i ≤ k$ such that $γ_1 ∪ γ_2 ∪ ... ∪ γ_k$ is an ordered basis for $V$.
+
+Proof
+...
+
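+Condition 4 gives an easy computational check. A sketch with an example of ours (assuming numpy): collect bases of the $W_i$ as the columns of a single matrix; their union is a basis for $V$ iff that matrix is invertible.
+
+```python
+import numpy as np
+
+# W1 = span{(1,0,0), (0,1,0)} and W2 = span{(1,1,1)}: is R^3 = W1 ⊕ W2?
+gamma1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # basis of W1, as columns
+gamma2 = np.array([[1.0], [1.0], [1.0]])                 # basis of W2, as a column
+B = np.hstack([gamma1, gamma2])
+print(np.linalg.det(B))   # 1.0 ≠ 0: the union is a basis, so the sum is direct
+```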
+
+Theorem: $T: V → V$ is diagonalizable if and only if $V$ is the direct sum of the eigenspaces of $T$.
+
+Proof
+...
+
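+Equivalently, $T$ is diagonalizable iff the dimensions of its eigenspaces sum to $dim(V)$. A final sketch with the running example (assuming numpy):
+
+```python
+import numpy as np
+
+# A is diagonalizable with eigenvalues 3 and 1, and
+# dim(E_3) + dim(E_1) = 1 + 1 = 2 = dim(R^2), so R^2 = E_3 ⊕ E_1
+A = np.array([[2.0, 1.0],
+              [1.0, 2.0]])
+dims = [2 - np.linalg.matrix_rank(A - lam * np.eye(2)) for lam in (3.0, 1.0)]
+print(dims, sum(dims))   # [1, 1] 2
+```
+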
-- cgit v1.2.3-70-g09d2