| author | JJ | 2024-04-16 04:34:05 +0000 |
|---|---|---|
| committer | JJ | 2024-04-16 04:34:05 +0000 |
| commit | da6055e7c0044d454bc5cdf7983aa33fe981fc95 (patch) | |
| tree | ca7be2ba1be77bf21bc476e7b8793adef71e602d | |
| parent | caa723146c6e478767760cb766701fa2aa173e89 (diff) | |
meow
-rw-r--r-- | mathematics/linear-algebra.md | 46 |
1 file changed, 25 insertions, 21 deletions
diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md
index 95e0138..f86b1e4 100644
--- a/mathematics/linear-algebra.md
+++ b/mathematics/linear-algebra.md
@@ -495,23 +495,23 @@ Determinants are important for future sections. We state facts here without proo
 Let $A$ be a matrix containing entries from a field $F$.
-The **determinant** of an $n × n$ (square) matrix $A$ is a scalar in $F$, and denoted $|A|$. The determinant is calculated as follows:
-- For a $1 × 1$ matrix $A = [a]$, $|A| = a$
-- For a $2 × 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, $|A| = ac - bd$
-- For an $n × n$ matrix $A = \begin{bmatrix} a_{1,1} & a_{1,2} & ... & a_{1,n} \\ a_{2,1} & a_{2,2} & ... & a_{2,n} \\ ⋮ & ⋮ & ⋱ & ⋮ \\ a_{n,1} & a_{n,2} & ... & a_{n,n} \end{bmatrix}$, $|A| = Σ^n_{i=1} (-1)^{i+j} A_{i,j} |A^~_{i,j}|$.
+The **determinant** of an $n×n$ (square) matrix $A$ is a scalar in $F$, and denoted $ |A| $. The determinant is calculated as follows:
+- For a $1 × 1$ matrix $A = [a]$, $ |A| = a$
+- For a $2 × 2$ matrix $A = \begin{bmatrix} a & b \\\\ c & d \end{bmatrix}$, $|A| = ad - bc$
+- For an $n × n$ matrix $A = \begin{bmatrix} a_{1,1} & a_{1,2} & ... & a_{1,n} \\\\ a_{2,1} & a_{2,2} & ... & a_{2,n} \\\\ ⋮ & ⋮ & ⋱ & ⋮ \\\\ a_{n,1} & a_{n,2} & ... & a_{n,n} \end{bmatrix}$, $|A| = Σ^n_{i=1} (-1)^{i+j} A_{i,j} |A^~_{i,j}|$.
 The last one deserves some additional exposition: todo
 The determinant has a number of nice properties that make it of fair interest.
-1. $|B| = -|A|$ if $B$ is a matrix obtained by exchanging any two rows or columns of $A$
-2. $|B| = k|A|$ if $B$ is a matrix obtained by multiplying $A$ by some scalar $k$
-3. $|B| = |A|$ if $B$ is a matrix obtained by adding a multiple of a column or row to a *different* column or row
-4. $|A| = 1$ if $A$ is the identity matrix
-5. $|A| = 0$ if either the rows or columns of $A$ are not linearly independent
-6. $|AB| = |A||B|$ if $A$ and $B$ are both $n × n$ matrices
-7. $|A| ≠ 0$ iff $A$ is invertible
-8. $|A| = |A^t|$
-9. $|A| = |B|$ if $A$ and $B$ are *similar*
+1. $|B| = -|A| $ if $B$ is a matrix obtained by exchanging any two rows or columns of $A$
+2. $|B| = k|A| $ if $B$ is a matrix obtained by multiplying $A$ by some scalar $k$
+3. $|B| = |A| $ if $B$ is a matrix obtained by adding a multiple of a column or row to a *different* column or row
+4. $|A| = 1 $ if $A$ is the identity matrix
+5. $|A| = 0 $ if either the rows or columns of $A$ are not linearly independent
+6. $|AB| = |A||B| $ if $A$ and $B$ are both $n × n$ matrices
+7. $|A| ≠ 0 $ iff $A$ is invertible
+8. $|A| = |A^t| $
+9. $|A| = |B| $ if $A$ and $B$ are *similar*
 Thus, we can say that the determinant *characterizes* square matrices (and thus linear operations), somewhat. It is a scalar value with a deep relation to the core identity of the matrix, and changes regularly as the matrix changes.
@@ -539,7 +539,7 @@ Theorem: $T: V → V$ is diagonalizable if and only if $V$ has an ordered basis
 ...
 </details>
-Corollary: If $T$ is diagonalizable, and $β = \{v_1, v_2, ..., v_n\}$ is an ordered basis of eigenvectors of $T$, and $D = [T]_β$, then $D$ is a diagonal matrix with $D_{i,i}$ being the eigenvalue corresponding to $v_n$ for any $i ≤ n$.
+Corollary: If $T$ is diagonalizable, and $β = \\{v_1, v_2, ..., v_n\\}$ is an ordered basis of eigenvectors of $T$, and $D = [T]_β$, then $D$ is a diagonal matrix with $D_{i,i}$ being the eigenvalue corresponding to $v_i$ for any $i ≤ n$.
 <details markdown="block">
 <summary>Proof</summary>
 ...
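To make the cofactor formula above concrete, here is a minimal worked example (assuming $A^~_{i,j}$ denotes the submatrix of $A$ with row $i$ and column $j$ deleted), expanding down the first column ($j = 1$) and evaluating each $2 × 2$ determinant with the $ad - bc$ rule:

$$\begin{vmatrix} 1 & 2 & 0 \\ 0 & 3 & 1 \\ 2 & 0 & 1 \end{vmatrix} = (-1)^{1+1}(1)\begin{vmatrix} 3 & 1 \\ 0 & 1 \end{vmatrix} + (-1)^{2+1}(0)\begin{vmatrix} 2 & 0 \\ 0 & 1 \end{vmatrix} + (-1)^{3+1}(2)\begin{vmatrix} 2 & 0 \\ 3 & 1 \end{vmatrix} = 3 - 0 + 2(2) = 7$$

Expanding down any other column (or across any row, with the roles of $i$ and $j$ swapped) yields the same value, which is what makes the determinant well defined.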
@@ -554,8 +554,8 @@ A scalar $λ$ is an eigenvalue of $A$ iff $∃v ≠ 0 ∈ F^n$: $Av = λv$, that
 ... todo
 </details>
-The **characteristic polynomial** of $A$ is the polynomial $f(t) = |A - tI_n|$.
-The **characteristic polynomial** of $T$ is the characteristic polynomial of $[T]_β$, often denoted $f(t) = |T - tI|$.
+The **characteristic polynomial** of $A$ is the polynomial $f(t) = |A - tI_n| $.
+The **characteristic polynomial** of $T$ is the characteristic polynomial of $[T]_β$, often denoted $f(t) = |T - tI| $.
 Theorem: The characteristic polynomial of $A$ is a polynomial of degree $n$ with leading coefficient $(-1)^n$.
 <details markdown="block">
@@ -569,7 +569,7 @@ Corollary: $A$ has at most $n$ distinct eigenvalues.
 ...
 </details>
-Theorem: A vector $v ∈ V$ is an eigenvector of $T$ corresponding to an eigenvalue $λ$ iff $v ≠ 0 ∈ N(T - λI)$.
+Theorem: A vector $v ∈ V$ is an eigenvector of $T$ corresponding to an eigenvalue $λ$ iff $v ∈ N(T - λI)$ and $v ≠ 0$.
 <details markdown="block">
 <summary>Proof</summary>
 ...
@@ -585,7 +585,7 @@ Theorem: A vector $v ∈ V$ is an eigenvector of $T$ corresponding to an eigenva
 - Let $λ$ be an eigenvalue of $T$.
 - Let $λ_1, λ_2, ..., λ_k$ be distinct eigenvalues of $T$.
-Theorem: Let $λ_1, λ_2, ..., λ_k$ be distinct eigenvalues of $T$. If $v_1, v_2, ..., v_k$ are eigenvectors of $T$ such that $λ_i$ corresponds to $v_i$ (for all $i ≤ k$), then $\{v_1, v_2, ..., v_k\}$ is linearly independent. In fewer words, eigenvectors with distinct eigenvalues are all linearly independent from one another.
+Theorem: Let $λ_1, λ_2, ..., λ_k$ be distinct eigenvalues of $T$. If $v_1, v_2, ..., v_k$ are eigenvectors of $T$ such that $λ_i$ corresponds to $v_i$ (for all $i ≤ k$), then $\\{v_1, v_2, ..., v_k\\}$ is linearly independent. In fewer words, eigenvectors with distinct eigenvalues are all linearly independent from one another.
 <details markdown="block">
 <summary>Proof</summary>
 ...
@@ -608,7 +608,7 @@ Theorem: The characteristic polynomial of any diagonalizable linear operator spl
 Let $λ$ be an eigenvalue of a linear operator or matrix with characteristic polynomial $f(t)$. The **(algebraic) multiplicity** of $λ$ is the largest positive integer $k$ for which $(t-λ)^k$ is a factor of $f(t)$.
-Let $λ$ be an eigenvalue of $T$. The set $E_λ = \{x ∈ V: T(x) = λx\} = N(T - λI_V)$ is called the **eigenspace** of $T$ with respect to the eigenvalue $λ$. Similarly, the **eigenspace** of $A$ is the eigenspace of $L_A$.
+Let $λ$ be an eigenvalue of $T$. The set $E_λ = \\{x ∈ V: T(x) = λx\\} = N(T - λI_V)$ is called the **eigenspace** of $T$ with respect to the eigenvalue $λ$. Similarly, the **eigenspace** of $A$ is the eigenspace of $L_A$.
 Theorem: If $T$ has multiplicity $m$, $1 ≤ dim(E_λ) ≤ m$.
 <details markdown="block">
@@ -634,15 +634,19 @@ Corollary: If the characteristic polynomial of $T$ splits, $T$ is diagonalizable
 ...
 </details>
+$T$ is diagonalizable iff both of the following conditions hold:
+1. The characteristic polynomial of $T$ splits.
+2. For each eigenvalue $λ$ of $T$, the multiplicity of $λ$ equals $n - rank(T-λI)$, where $n = dim(V)$.
+
 ## Direct Sums
 - Let $V$ be a finite-dimensional vector space over a field $F$.
 - Let $T: V → V$ be a linear operator on $V$.
 - Let $W_1, W_2, ..., W_k$ be subspaces of $V$.
-The **sum** of some subspaces $W_i$ (for $1 ≤ i ≤ k$) is the set $\{v_1 + v_2 + ... + v_k : v_i ∈ W_i \}$, denoted $W_1 + W_2 + ... + W_k$ or $Σ^k_{i=1} W_i$.
+The **sum** of some subspaces $W_i$ (for $1 ≤ i ≤ k$) is the set $\\{v_1 + v_2 + ... + v_k : v_i ∈ W_i \\}$, denoted $W_1 + W_2 + ... + W_k$ or $Σ^k_{i=1} W_i$.
-The subspaces $W_i$ (for $1 ≤ i ≤ k$) form a **direct sum** of $V$, denoted $W_1 ⊕ W_2 ⊕ ... ⊕ W_k$, if $V = Σ^k_{i=1} W_i$ and $W_j ∩ Σ_{i≠j} W_i = \{0\}$ for all $j ≤ k$.
+The subspaces $W_i$ (for $1 ≤ i ≤ k$) form a **direct sum** of $V$, denoted $W_1 ⊕ W_2 ⊕ ... ⊕ W_k$, if $V = Σ^k_{i=1} W_i$ and $W_j ∩ Σ_{i≠j} W_i = \\{0\\}$ for all $j ≤ k$.
 Theorem: The following conditions are equivalent:
 1. $V = W_1 ⊕ W_2 ⊕ ... ⊕ W_k$.
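As a minimal worked example of the diagonalizability test above (using the $n - rank(T - λI)$ criterion exactly as stated): take $A = \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix}$ over $ℝ$. Its characteristic polynomial $f(t) = |A - tI_2| = (2 - t)^2$ splits, and $λ = 2$ has multiplicity $2$, but $n - rank(A - 2I) = 2 - rank\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} = 2 - 1 = 1 ≠ 2$, so $A$ is not diagonalizable. By contrast, $B = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ has $f(t) = t^2 - 1$, which splits with eigenvalues $±1$, each of multiplicity $1 = 2 - rank(B - λI)$, so $B$ is diagonalizable.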