author    JJ    2024-04-16 04:40:35 +0000
committer    JJ    2024-04-16 04:40:35 +0000
commit4f5b2b4b272020353811fc23fc7e94e6dc038e29 (patch)
tree9b5d0a5e75d97fb0d1f5575d7b455e8f817dadd4
parentda6055e7c0044d454bc5cdf7983aa33fe981fc95 (diff)
meow
-rw-r--r--    mathematics/linear-algebra.md    48
1 file changed, 24 insertions(+), 24 deletions(-)
diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md
index f86b1e4..3d58b39 100644
--- a/mathematics/linear-algebra.md
+++ b/mathematics/linear-algebra.md
@@ -51,14 +51,14 @@ Our definition of our vector space leads us to some facts:
<details markdown="block">
<summary>Examples</summary>
-Let $S = \\{(a_1, a_2) | a_1, a_2 ∈ ℝ\\}$.
+Let $S = \\{(a_1, a_2) : a_1, a_2 ∈ ℝ\\}$.
For $(a_1, a_2), (b_1, b_2) ∈ S$ and $c ∈ ℝ$, we define:
- $(a_1, a_2) + (b_1, b_2) = (a_1 + b_1, a_2 - b_2)$
- $c(a_1, a_2) = (ca_1, ca_2)$.
This fails commutativity! It is thus not a vector space.
-Let $S = \\{(a_1, a_2) | a_1, a_2 ∈ ℝ\\}$. We define:
+Let $S = \\{(a_1, a_2) : a_1, a_2 ∈ ℝ\\}$. We define:
- $(a_1, a_2) + (b_1, b_2) = (a_1 + b_1, a_2 + b_2)$
- $c(a_1, a_2) = (ca_1, 0)$
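A quick numeric sketch (a sanity check with Python tuples, not part of the notes) makes the failure of commutativity in the first example concrete:

```python
# Addition as defined in the first example: (a1, a2) + (b1, b2) = (a1 + b1, a2 - b2)
def add(u, v):
    return (u[0] + v[0], u[1] - v[1])

u, v = (1.0, 2.0), (3.0, 5.0)
print(add(u, v))  # (4.0, -3.0)
print(add(v, u))  # (4.0, 3.0) -- the order matters, so commutativity fails
```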
@@ -116,10 +116,10 @@ As long as $a^2 + b^2 ≠ 0$, the inverse of $z = a + bi$ is given by $z^{-1} =
Let $z = a + bi$, where $a, b ∈ ℝ$. The **absolute value** of $z$ is defined as the real number $\sqrt{a^2 + b^2}$.
This absolute value satisfies a number of useful properties. For $z, w ∈ ℂ$:
-- $z \bar{z} = a^2 + b^2 = |z|^2$, where $|z| = \sqrt{a^2 + b^2}$
-- $\frac{z}{w} = \frac{|z|}{|w|}$ where $w ≠ 0$
-- $|z + w| ≤ |z| + |w|$
-- $|z| - |w| ≤ |z + w|$
+- $z \bar{z} = a^2 + b^2 = \|z\|^2$, where $\|z\| = \sqrt{a^2 + b^2}$
+- $\|\frac{z}{w}\| = \frac{\|z\|}{\|w\|}$ where $w ≠ 0$
+- $\|z + w\| ≤ \|z\| + \|w\|$
+- $\|z\| - \|w\| ≤ \|z + w\|$
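These properties are easy to spot-check numerically with Python's built-in complex type (an illustrative check with arbitrary values, not part of the notes):

```python
z, w = 3 + 4j, 1 - 2j  # |z| = 5

# z * conj(z) = a^2 + b^2 = |z|^2
assert abs(z * z.conjugate() - abs(z) ** 2) < 1e-9

# |z/w| = |z|/|w| for w != 0
assert abs(abs(z / w) - abs(z) / abs(w)) < 1e-9

# triangle inequality and its lower-bound companion
assert abs(z + w) <= abs(z) + abs(w)
assert abs(z) - abs(w) <= abs(z + w)
```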
**The Fundamental Theorem of Algebra**:
Consider the polynomial $p(z) = a_n z^n + a_{n-1}z^{n-1} + ... + a_1 z + a_0$ where the coefficients $a_i$ are complex numbers.
@@ -457,7 +457,7 @@ Theorem: Let $β = \\{ x_1, x_2, ..., x_n \\}$. Let $f_i(1 ≤ i ≤ n)$ be the
...
-Theorem: For any linear transformation $T : V → W$, the mapping $T^t : W^* → V^*$ defined by $T^t(g) = gT$ for all $g ∈ W^*$ is a linear transformation, and $[T^t]_{γ^*}^{β^*} = ([T]_β^γ)^t$
+Theorem: For any linear transformation $T : V → W$, the mapping $T^t : W^\* → V^\*$ defined by $T^t(g) = gT$ for all $g ∈ W^\*$ is a linear transformation, and $[T^t]_{γ^\*}^{β^\*} = ([T]_β^γ)^t$
<details markdown="block">
<summary>Proof</summary>
...
@@ -495,23 +495,23 @@ Determinants are important for future sections. We state facts here without proo
Let $A$ be a matrix containing entries from a field $F$.
-The **determinant** of an $n×n$ (square) matrix $A$ is a scalar in $F$, and denoted $ |A| $. The determinant is calculated as follows:
-- For a $1 × 1$ matrix $A = [a]$, $ |A| = a$
-- For a $2 × 2$ matrix $A = \begin{bmatrix} a & b \\\\ c & d \end{bmatrix}$, $|A| = ac - bd$
-- For an $n × n$ matrix $A = \begin{bmatrix} a_{1,1} & a_{1,2} & ... & a_{1,n} \\\\ a_{2,1} & a_{2,2} & ... & a_{2,n} \\\\ ⋮ & ⋮ & ⋱ & ⋮ \\\\ a_{n,1} & a_{n,2} & ... & a_{n,n} \end{bmatrix}$, $|A| = Σ^n_{i=1} (-1)^{i+j} A_{i,j} |A^~_{i,j}|$.
+The **determinant** of an $n×n$ (square) matrix $A$ is a scalar in $F$, and denoted $\|A\|$. The determinant is calculated as follows:
+- For a $1 × 1$ matrix $A = [a]$, $\|A\| = a$
+- For a $2 × 2$ matrix $A = \begin{bmatrix} a & b \\\\ c & d \end{bmatrix}$, $\|A\| = ad - bc$
+- For an $n × n$ matrix $A = \begin{bmatrix} a_{1,1} & a_{1,2} & ... & a_{1,n} \\\\ a_{2,1} & a_{2,2} & ... & a_{2,n} \\\\ ⋮ & ⋮ & ⋱ & ⋮ \\\\ a_{n,1} & a_{n,2} & ... & a_{n,n} \end{bmatrix}$, $\|A\| = Σ^n_{j=1} (-1)^{1+j} A_{1,j} \|\tilde{A}_{1,j}\|$, where $\tilde{A}_{1,j}$ is the $(n-1) × (n-1)$ matrix obtained by deleting row $1$ and column $j$ of $A$.
The last one deserves some additional exposition: todo
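The recursive case can be sketched as a pure-Python cofactor expansion along the first row (illustrative only; production code would use an LU-based routine instead):

```python
def det(A):
    """Determinant of a square matrix, by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # minor: the submatrix with row 0 and column j deleted
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))  # -2
```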
The determinant has a number of nice properties that make it of fair interest.
-1. $|B| = -|A| $ if $B$ is a matrix obtained by exchanging any two rows or columns of $A$
-2. $|B| = k|A| $ if $B$ is a matrix obtained by multiplying $A$ by some scalar $k$
-3. $|B| = |A| $ if $B$ is a matrix obtained by adding a multiple of a column or row to a *different* column or row
-4. $|A| = 1 $ if $A$ is the identity matrix
-5. $|A| = 0 $ if either the rows or columns of $A$ are not linearly independent
-6. $|AB| = |A||B| $ if $A$ and $B$ are both $n × n$ matrices
-7. $|A| ≠ 0 $ iff $A$ is invertible
-8. $|A| = |A^t| $
-9. $|A| = |B| $ if $A$ and $B$ are *similar*
+1. $\|B\| = -\|A\|$ if $B$ is a matrix obtained by exchanging any two rows or columns of $A$
+2. $\|B\| = k\|A\|$ if $B$ is a matrix obtained by multiplying $A$ by some scalar $k$
+3. $\|B\| = \|A\|$ if $B$ is a matrix obtained by adding a multiple of a column or row to a *different* column or row
+4. $\|A\| = 1 $ if $A$ is the identity matrix
+5. $\|A\| = 0 $ if either the rows or columns of $A$ are not linearly independent
+6. $\|AB\| = \|A\|\|B\|$ if $A$ and $B$ are both $n×n$ matrices
+7. $\|A\| ≠ 0 $ iff $A$ is invertible
+8. $\|A\| = \|A^t\|$
+9. $\|A\| = \|B\|$ if $A$ and $B$ are *similar*
Thus, we can say that the determinant somewhat *characterizes* square matrices (and thus linear operators): it is a scalar value with a deep relation to the core identity of the matrix, and it changes predictably as the matrix changes.
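A couple of these properties (multiplicativity and invariance under transpose) can be sanity-checked for $2 × 2$ matrices in a few lines of Python (arbitrary example values, not part of the notes):

```python
def det2(A):
    # determinant of a 2x2 matrix [[a, b], [c, d]] is ad - bc
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def matmul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose2(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]

assert det2(matmul2(A, B)) == det2(A) * det2(B)  # property 6: |AB| = |A||B|
assert det2(A) == det2(transpose2(A))            # property 8: |A| = |A^t|
```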
@@ -539,7 +539,7 @@ Theorem: $T: V → V$ is diagonalizable if and only if $V$ has an ordered basis
...
</details>
-Corollary: If $T$ is diagonalizable, and $β = \\{v_1, v_2, ..., v_n\\}$ is an ordered basis of eigenvectors of $T$, and $D = [T]_β$, then $D$ is a diagonal matrix with $D_{i,i}$ being the eigenvalue corresponding to $v_n$ for any $i ≤ n$.
+Corollary: If $T$ is diagonalizable, and $β = \\{v_1, v_2, ..., v_n\\}$ is an ordered basis of eigenvectors of $T$, and $D = \[T\]_β$, then $D$ is a diagonal matrix with $D_{i,i}$ being the eigenvalue corresponding to $v_i$ for each $i ≤ n$.
<details markdown="block">
<summary>Proof</summary>
...
@@ -547,15 +547,15 @@ Corollary: If $T$ is diagonalizable, and $β = \\{v_1, v_2, ..., v_n\\}$ is an o
To *diagonalize* a matrix (or a linear operator) is to find a basis of eigenvectors and the corresponding eigenvalues.
-Theorem: A scalar $λ$ is an eigenvalue of $A$ if and only if $|A - λI_n| = 0$, that is, the eigenvalues of a matrix are exactly the zeros of its characteristic polynomial.
+Theorem: A scalar $λ$ is an eigenvalue of $A$ if and only if $\|A - λI_n\| = 0$, that is, the eigenvalues of a matrix are exactly the zeros of its characteristic polynomial.
<details markdown="block">
<summary>Proof</summary>
A scalar $λ$ is an eigenvalue of $A$ iff $∃v ≠ 0 ∈ F^n$: $Av = λv$, that is, $(A - λI_n)(v) = 0$.
... todo
</details>
-The **characteristic polynomial** of $A$ is the polynomial $f(t) = |A - tI_n| $.
-The **characteristic polynomial** of $T$ is the characteristic polynomial of $[T]_β$, often denoted $f(t) = |T - tI| $.
+The **characteristic polynomial** of $A$ is the polynomial $f(t) = \|A - tI_n\|$.
+The **characteristic polynomial** of $T$ is the characteristic polynomial of $[T]_β$, often denoted $f(t) = \|T - tI\|$.
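For a $2 × 2$ matrix the characteristic polynomial works out to $f(t) = t^2 - (a_{1,1} + a_{2,2})t + (a_{1,1}a_{2,2} - a_{1,2}a_{2,1})$, and its roots are the eigenvalues. A quick Python check with an arbitrary symmetric matrix (not part of the notes):

```python
import math

A = [[2, 1], [1, 2]]  # example matrix; eigenvalues should be 1 and 3

# characteristic polynomial of a 2x2 matrix: f(t) = t^2 - tr(A)*t + det(A)
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# its roots, via the quadratic formula
disc = math.sqrt(tr * tr - 4 * det)
eigs = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigs)  # [1.0, 3.0]

# each eigenvalue t makes A - t*I singular, i.e. |A - t*I| = 0
for t in eigs:
    d = (A[0][0] - t) * (A[1][1] - t) - A[0][1] * A[1][0]
    assert abs(d) < 1e-9
```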
Theorem: The characteristic polynomial of $A$ is a polynomial of degree $n$ with leading coefficient $(-1)^n$.
<details markdown="block">