-rw-r--r--  mathematics/linear-algebra.md | 118
1 file changed, 105 insertions(+), 13 deletions(-)
diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md
index 2645354..ce24abd 100644
--- a/mathematics/linear-algebra.md
+++ b/mathematics/linear-algebra.md
@@ -190,16 +190,20 @@ Theorem: The kernel $N(T)$ and image $R(T)$ are subspaces of $V$ and $W$, respec
<summary>Proof</summary>
We shall denote the zero vector of $V$ and $W$ as $0_v$ and $0_w$, respectively.
-Let $x,y ∈ N(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_v ∈ N(T)$. Then $T(cx + y) = cT(x) + T(y) = 0_w + 0_w = 0_w$, as $x$ and $y$ are in the null space. Hence any linear combination of $x$ and $y$ in the null space is in the null space. So as $N(T) ⊆ V$ by definition, $N(T)$ is a subspace of $V$.
+Let $x,y ∈ N(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_v ∈ N(T)$. Then $T(cx + y) = cT(x) + T(y) = 0_w + 0_w = 0_w$, as $x$ and $y$ are in the null space. Hence any linear combination of $x$ and $y$ in the null space is in the null space. So as $N(T) ⊆ V$ by definition, $N(T)$ is a subspace of $V$. ∎
+
+Let $x,y ∈ R(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_w ∈ R(T)$. Then there exist $v,w ∈ V$ such that $T(v) = x$ and $T(w) = y$. So $T(v + cw) = T(v) + cT(w) = x + cy$. Hence any linear combination of $x$ and $y$ is in the image. So as $R(T) ⊆ W$ by definition, $R(T)$ is a subspace of $W$. ∎
-Let $x,y ∈ R(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_w ∈ R(T)$.
-...
</details>
Theorem: If $β = \\{ v_1, v_2, ... v_n \\}$ is a basis for $V$, then $R(T) = span(\\{ T(v_1), T(v_2), ..., T(v_n) \\})$.
<details markdown="block">
<summary>Proof</summary>
-...
+
+Clearly, $T(v_i) ∈ R(T)$ for all $i$. As $R(T)$ is a subspace of $W$ by the previous theorem, $R(T)$ *contains* $span(\\{ T(v_1), T(v_2), ..., T(v_n) \\}) = span(T(β))$.
+
+Now consider an arbitrary $w ∈ R(T)$. By definition, there exists a $v ∈ V$ such that $w = T(v)$. As $β$ is a basis for $V$, we can consider $v$ as a linear combination of some basis vectors in $β$. As $T$ is linear, it thus must be the case that $T(v) ∈ span(T(β))$. So $R(T) = span(T(β))$. ∎
+
</details>
For a finite-dimensional $N(T)$ and $R(T)$: the **nullity** and **rank** of $T$ are the dimensions of $N(T)$ and $R(T)$, respectively.
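For a concrete feel, these dimensions are easy to compute numerically (a hypothetical numpy sketch; the matrix `A` is an arbitrary choice, not from the text):

```python
import numpy as np

# A hypothetical T : R^4 -> R^3, represented by a matrix whose third
# row is the sum of the first two, so its rank is 2.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])

rank = np.linalg.matrix_rank(A)   # dim R(T)
nullity = A.shape[1] - rank       # dim N(T)
print(rank, nullity)              # prints: 2 2
```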
@@ -207,27 +211,47 @@ For a finite-dimensional $N(T)$ and $R(T)$: the **nullity** and **rank** of $T$
**Rank-Nullity Theorem**: If $V$ is *finite-dimensional*, then $dim(V) = nullity(T) + rank(T)$.
<details markdown="block">
<summary>Proof</summary>
+
+Let $dim(V) = n$, let $nullity(T) = k$, and let $β_N = \\{ v_1, v_2, ..., v_k \\}$ be a basis for $N(T)$. As $N(T) ⊆ V$, we may extend $β_N$ to a basis $β = \\{ v_1, v_2, ..., v_n \\}$ for $V$.
+
+We assert that $S = \\{ T(v_{k+1}), T(v_{k+2}), ..., T(v_n) \\}$ is a basis for $R(T)$. We must prove this.
+
...
+
+So $S$ is linearly independent. As $S$ also spans $R(T)$, it is a basis for $R(T)$ of $n-k$ vectors, and so $rank(T) = n - k = dim(V) - nullity(T)$. ∎
+
</details>
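The proof's construction can be traced on a small example (hypothetical matrix and basis choices, sketched with numpy):

```python
import numpy as np

# A hypothetical T : R^3 -> R^2 with N(T) spanned by v1 = (1, 1, -1).
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])

# Basis for N(T), extended to a basis beta of V = R^3 with e1, e2.
v1 = np.array([1., 1., -1.])
beta = np.column_stack([v1, [1., 0., 0.], [0., 1., 0.]])
assert np.linalg.matrix_rank(beta) == 3   # beta is indeed a basis of R^3

# Images of the extension vectors form a basis S for R(T):
S = A @ beta[:, 1:]                       # columns T(v2), T(v3)
assert np.linalg.matrix_rank(S) == np.linalg.matrix_rank(A) == 2
```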
Recall that a *function* definitionally maps *each* element of its domain to *exactly* one element of its codomain.
-A function is **injective** (or **one-to-one**) iff each element of its domain maps to a *distinct* element of its codomain.
-A function is **surjective** (or **onto**) iff each element of the codomain is mapped to by *at least* one element in the domain.
-A function is **bijective** iff it is surjective and injective. Necessarily, a bijective function is invertible, which will be formally stated & proven later.
+- A function is **injective** (or **one-to-one**) iff each element of its domain maps to a *distinct* element of its codomain, that is, $f(x) = f(y) → x = y$.
+- A function is **surjective** (or **onto**) iff each element of the codomain is mapped to by *at least* one element in the domain; for our linear $T : V → W$, this is $R(T) = W$.
+- A function is **bijective** iff it is surjective and injective. Necessarily, a bijective function is invertible, which will be formally stated & proven later.
Theorem: $T$ is injective iff $N(T) = \\{0\\}$.
<details markdown="block">
<summary>Proof</summary>
-...
+
+Suppose that $T$ is injective. Consider some $x ∈ N(T)$. Then $T(x) = 0 = T(0)$. As $T$ is injective, $x = 0$, and so $N(T) = \\{0\\}$. ∎
+
+Now suppose that $N(T) = \\{0\\}$. Consider some $T(x) = T(y)$. Then $T(x) - T(y) = T(x-y) = 0$. So $x-y ∈ N(T) = \\{0\\}$, and so $x-y = 0$ and $x = y$. So $T$ is injective. ∎
+
</details>
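Numerically, a nontrivial kernel immediately produces a failure of injectivity (hypothetical example matrix):

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.]])          # rank 1, so N(T) is nontrivial
x = np.array([2., -1.])          # a nonzero kernel vector: A @ x = 0
assert np.allclose(A @ x, 0)

# T is not injective: two distinct inputs share an image.
u = np.array([1., 1.])
assert np.allclose(A @ u, A @ (u + x))
```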
Theorem: For $V$ and $W$ of equal (and finite) dimension: $T$ is injective iff it is surjective.
<details markdown="block">
<summary>Proof</summary>
+
+We have that $T$ is injective iff $N(T) = \\{0\\}$, and thus iff $nullity(T) = 0$. By the Rank-Nullity Theorem, this holds iff $rank(T) = dim(V) = dim(W)$, i.e. iff $dim(R(T)) = dim(W)$. As $R(T)$ is a subspace of $W$, $dim(R(T)) = dim(W)$ is equivalent to $R(T) = W$, which is the definition of surjectivity. ∎
+
+</details>
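For square matrices (equal domain and codomain dimension), the equivalence can be checked directly (hypothetical example matrices):

```python
import numpy as np

for A in (np.array([[2., 1.], [1., 1.]]),    # full rank
          np.array([[1., 2.], [2., 4.]])):   # rank 1
    r = np.linalg.matrix_rank(A)
    injective = (A.shape[1] - r == 0)        # nullity(T) = 0
    surjective = (r == A.shape[0])           # rank(T) = dim W
    assert injective == surjective           # holds in both cases
```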
+
+Theorem: Suppose that $V$ has a finite basis $\\{ v_1, v_2, ..., v_n \\}$. For any vectors $w_1, w_2, ..., w_n$ in $W$, there exists *exactly* one linear transformation $T : V → W$ such that $T(v_i) = w_i$ for $i = 1, 2, ..., n$.
+<details markdown="block">
+<summary>Proof</summary>
...
</details>
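In matrix terms, the unique $T$ is recoverable from its prescribed values: if $B$ has the $v_i$ as columns and $W$ the $w_i$, then $[T] = W B^{-1}$ (a hypothetical small example):

```python
import numpy as np

# Hypothetical basis v1, v2 of R^2 and prescribed images w1, w2 in R^3.
B = np.array([[1., 1.],
              [0., 1.]])           # columns v1, v2
W = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])           # columns w1, w2

T = W @ np.linalg.inv(B)           # the unique matrix with T v_i = w_i
assert np.allclose(T @ B[:, 0], W[:, 0])
assert np.allclose(T @ B[:, 1], W[:, 1])
```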
-Theorem: Suppose that $V$ is finite-dimensional with a basis $\\{ v_1, v_2, ..., v_n \\}$. For any vectors $w_1, w_2, ... w_n$ in $W$, there exists *exactly* one linear transformation such that $T(v_i) = w_i$ for $i = 1, 2, ..., n$.
+Theorem: Suppose that $V$ has a finite basis $\\{ v_1, v_2, ..., v_n \\}$. If $U, T : V → W$ are linear and $U(v_i) = T(v_i)$ for all $i$, then $U = T$.
<details markdown="block">
<summary>Proof</summary>
...
@@ -295,7 +319,13 @@ Theorem: Consider a linear function $T: V → W$.
Theorem: If $T$ is linear and invertible, $T^{-1}$ is linear and invertible.
<details markdown="block">
<summary>Proof</summary>
-...
+
+Let $y_1, y_2 ∈ W$ and $c ∈ F$. As $T$ is bijective, there exist unique vectors $x_1$ and $x_2$ such that $T(x_1) = y_1$ and $T(x_2) = y_2$. So $x_1 = T^{-1}(y_1)$ and $x_2 = T^{-1}(y_2)$. So:
+
+$$T^{-1}(y_1 + cy_2) = T^{-1}[T(x_1) + cT(x_2)] = T^{-1}[T(x_1 + cx_2)] = x_1 + cx_2 = T^{-1}(y_1) + cT^{-1}(y_2)$$
+
+Thus $T^{-1}$ is linear. It is invertible by definition, as its inverse is $T$ itself. ∎
+
</details>
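The displayed identity can be spot-checked with an invertible matrix (hypothetical values for $y_1$, $y_2$, $c$):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])
Ainv = np.linalg.inv(A)

y1, y2, c = np.array([1., 2.]), np.array([3., -1.]), 5.0
# T^{-1}(y1 + c y2) = T^{-1}(y1) + c T^{-1}(y2):
assert np.allclose(Ainv @ (y1 + c * y2), Ainv @ y1 + c * (Ainv @ y2))
# and T^{-1} is itself invertible, with inverse T:
assert np.allclose(Ainv @ A, np.eye(2))
```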
$V$ is **isomorphic** to $W$ if there exists an *invertible* linear $T : V → W$ (an **isomorphism**).
@@ -303,14 +333,20 @@ $V$ is **isomorphic** to $W$ if there exists an *invertible* linear $T : V → W
Lemma: For finite-dimensional $V$ and $W$: If $T: V → W$ is invertible, then $dim(V) = dim(W)$.
<details markdown="block">
<summary>Proof</summary>
-...
+
+As $T$ is invertible, it is bijective. So $nullity(T) = 0$ and $rank(T) = dim(R(T)) = dim(W)$. So by the Rank-Nullity Theorem, $dim(V) = dim(W)$. ∎
+
</details>
Theorem: $V$ is isomorphic to $W$ iff $dim(V) = dim(W)$. Subsequently:
- $V$ is isomorphic to $F^n$ iff $dim(V) = n$.
<details markdown="block">
<summary>Proof</summary>
-...
+
+Suppose that $V$ is isomorphic to $W$ and that $T : V → W$ is an isomorphism. As $T$ is an isomorphism, it is invertible, and so by our earlier lemma $dim(V) = dim(W)$.
+
+Now suppose that $dim(V) = dim(W)$. Let $β = \\{v_1, v_2, ..., v_n\\}$ and $γ = \\{w_1, w_2, ..., w_n\\}$ be bases for $V$ and $W$, respectively. By an earlier theorem, there exists a linear $T : V → W$ such that $T(v_i) = w_i$ for all $i$. Thus $R(T) = span(T(β)) = span(γ) = W$, and $T$ is surjective. As $V$ and $W$ are of equal (and finite) dimension, $T$ is also injective. So $T$ is bijective, and so $T$ is an isomorphism. ∎
+
</details>
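As an illustration, $M_{2×2}(\mathbb{R})$ and $\mathbb{R}^4$ both have dimension 4, so they are isomorphic; "flatten" is one explicit (hypothetical) choice of isomorphism:

```python
import numpy as np

def phi(M):        # M_{2x2}(R) -> R^4
    return M.reshape(4)

def phi_inv(v):    # R^4 -> M_{2x2}(R)
    return v.reshape(2, 2)

M = np.array([[1., 2.], [3., 4.]])
N = np.array([[0., 1.], [1., 0.]])
assert np.allclose(phi_inv(phi(M)), M)              # invertible
assert np.allclose(phi(M + 2 * N), phi(M) + 2 * phi(N))  # linear
```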
## Linear Transformations as Matrices
@@ -370,17 +406,71 @@ Theorem: For any $V$ with ordered basis $β$, $ϕ_β$ is an isomorphism.
## The Change of Coordinate Matrix
+- Let $V$ be a finite-dimensional vector space (over a field $F$).
+- Let $β$ and $β'$ be two ordered bases for $V$.
+- Let $T$ be a linear operator on $V$.
+
+Theorem: Let $Q = [I_V]_{β'}^β$. Then $Q$ is invertible, and for any $v ∈ V$, $[v]_β = Q[v]_{β'}$.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
+
+We call such a matrix $Q = [I_V]_{β'}^β$ a **change of coordinate matrix** and say that $Q$ **changes β'-coordinates into β-coordinates**.
+
+Let $Q = [I_V]_{β'}^β$.
+
+Theorem: $Q^{-1}$ changes β-coordinates into β'-coordinates.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
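A small sketch of both theorems, taking β to be the standard basis of $\mathbb{R}^2$ and a hypothetical β' (the columns of $Q$ are the β-coordinates of the β' vectors):

```python
import numpy as np

# beta = standard basis of R^2; beta' = {(1, 1), (1, -1)}.
Q = np.array([[1., 1.],
              [1., -1.]])          # [I]_{beta'}^{beta}

v_bp = np.array([2., 3.])          # [v]_{beta'}
v_b = Q @ v_bp                     # [v]_beta = Q [v]_{beta'}
assert np.allclose(v_b, [5., -1.])  # 2*(1,1) + 3*(1,-1) = (5,-1)

# Q^{-1} changes beta-coordinates back into beta'-coordinates:
assert np.allclose(np.linalg.inv(Q) @ v_b, v_bp)
```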
+
+Theorem: $[T]_{β'} = Q^{-1} [T]_β Q$
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
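The similarity $[T]_{β'} = Q^{-1} [T]_β Q$ can be computed for a hypothetical operator; similar matrices share basis-independent invariants such as trace and determinant:

```python
import numpy as np

Q = np.array([[1., 1.],
              [1., -1.]])          # a hypothetical change of coordinate matrix
T_b = np.array([[2., 0.],
                [0., 3.]])         # [T]_beta for a hypothetical operator T

T_bp = np.linalg.inv(Q) @ T_b @ Q  # [T]_{beta'} = Q^{-1} [T]_beta Q
assert np.isclose(np.trace(T_bp), np.trace(T_b))          # trace 5
assert np.isclose(np.linalg.det(T_bp), np.linalg.det(T_b))  # det 6
```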
+
## Dual Spaces
+The **dual space** of a vector space $V$ is the vector space $\mathcal{L}(V,F)$ and is denoted by $V^*$.
+
+- Let $V, W$ be finite-dimensional vector spaces (over a field $F$).
+- Let $T, U : V → W$ be linear transformations from $V$ to $W$.
+- Let $β$ and $γ$ be ordered bases of $V$ and $W$, respectively.
+
+Theorem: For a finite-dimensional $V$: $dim(V^*) = dim(\mathcal{L}(V,F)) = dim(V) ∙ dim(F) = dim(V)$.
+
+Corollary: $V$ and $V^*$ are isomorphic.
+<details markdown="block">
+<summary>Proof</summary>
+$dim(V) = dim(V^*)$ and so they are isomorphic.
+</details>
+
+The **dual basis** of a basis $β = \\{ x_1, x_2, ..., x_n \\}$ (of a vector space $V$) is the ordered basis $β^* = \\{ f_1, f_2, ..., f_n \\}$ of $V^*$ whose $i$th element is the $i$th coordinate function wrt. $β$, satisfying $f_i(x_j) = δ_{ij}$ for $1 ≤ i, j ≤ n$. Recall that $δ_{ij} = 1$ if $i=j$, and $0$ otherwise.
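Concretely, if the basis vectors $x_j$ are the columns of a matrix $B$, the coordinate functions $f_i$ are the rows of $B^{-1}$, since $f_i(x_j) = δ_{ij}$ is exactly $B^{-1}B = I$ (a hypothetical basis of $\mathbb{R}^2$):

```python
import numpy as np

# Hypothetical basis x1 = (1, 1), x2 = (0, 1) of R^2, as columns of B.
B = np.array([[1., 0.],
              [1., 1.]])

# Rows of B^{-1} are the dual basis functionals: f_i(x) = (B^{-1} x)_i.
dual = np.linalg.inv(B)
assert np.allclose(dual @ B, np.eye(2))   # f_i(x_j) = delta_ij
```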
+
+Theorem: Let $β = \\{ x_1, x_2, ..., x_n \\}$. Let $f_i (1 ≤ i ≤ n)$ be the $i$th coordinate function wrt. $β$, and let $β^* = \\{ f_1, f_2, ..., f_n \\}$. Then $β^*$ is an ordered basis for $V^*$, and for any $f ∈ V^*$, $f = \sum_{i=1}^n f(x_i) f_i$.
+
+...
+
+Theorem: For any linear transformation $T : V → W$, the mapping $T^t : W^* → V^*$ defined by $T^t(g) = gT$ for all $g ∈ W^*$ is a linear transformation, and $[T^t]_{γ^*}^{β^*} = ([T]_β^γ)^t$.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
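In coordinates, $T^t$ acts on a functional's row of coefficients by right-multiplication, which is the same as left-multiplication by $[T]^t$ (hypothetical $T$ and $g$, standard bases):

```python
import numpy as np

# [T]_beta^gamma for a hypothetical T : R^2 -> R^3 (standard bases):
T = np.array([[1., 2.],
              [0., 1.],
              [3., 0.]])

# A functional g in W* is a row of coefficients; T^t(g) = g o T.
g = np.array([1., -1., 2.])        # coordinates of g wrt. gamma*
gT = g @ T                         # coordinates of T^t(g) wrt. beta*
assert np.allclose(gT, T.T @ g)    # so the matrix of T^t is [T]^t
```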
+
## Homogeneous Linear Differential Equations
## Systems of Linear Equations
This section is mostly review.
+<!--
+
## Determinants
-Let $A \begin{bmatrix} a & b \\ c & d \end{bmatrix}$.
+Let $A = \begin{bmatrix} a & b \\\\ c & d \end{bmatrix}$.
We define the **determinant** of any 2 × 2 matrix $A$ to be the scalar $ad - bc$, and denote it $det(A)$ or $|A|$.
@@ -395,3 +485,5 @@ Cramer's Rule: If $det(A) ≠ 0$, then the system $Ax = b$ has a *unique* soluti
</details>
...
+
+-->