author     JJ  2024-03-15 01:00:35 +0000
committer  JJ  2024-03-15 01:00:35 +0000
commit     5640e35821ec63f7c1af402101f527ce27992ee7 (patch)
tree       7db96197df9f462e10fe4a50710a18b0e22dc67a
parent     6baad4c5d183c19068f4205f5f95d3ac4fa159cb (diff)
meow
-rw-r--r--  mathematics/linear-algebra.md  132
1 file changed, 79 insertions, 53 deletions
diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md
index 4e1162b..7b9cc27 100644
--- a/mathematics/linear-algebra.md
+++ b/mathematics/linear-algebra.md
@@ -90,8 +90,8 @@ Example: Let $S = \\{(a_1, a_2) | a_1, a_2 ∈ ℝ\\}$. We define:
- fails zero!
A subset $W$ of a vector space $V$ over a field 𝔽 is called a **subspace** of $V$ if $W$ is a *vector space* over 𝔽 with the operations of addition and scalar multiplication from $V$.
--
--
+- .
+- .
A subset of $V$ is a **subspace** of V iff:
- the subset is non-empty
@@ -155,11 +155,12 @@ Subsequently:
Let $T: V → W$ be a linear transformation.
-The **kernel** (or null space) $N(T)$ of $T$ is the set of all vectors in $V$ such that $T(x) = 0$: $N(T) = \\{ x ∈ V : T(x) = 0 \\}$.
-The **image** (or range) $R(T)$ of $T$ is the subset of $W$ consisting of all images (under $T$) of elements of $V$: $R(T) = \\{ T(x) : x ∈ V \\}$
+The **kernel** (or null space) $N(T)$ of $T$ is the set of all vectors in $V$ such that $T(x) = 0$: that is, $N(T) = \\{ x ∈ V : T(x) = 0 \\}$.
+
+The **image** (or range) $R(T)$ of $T$ is the subset of $W$ consisting of all images (under $T$) of elements of $V$: that is, $R(T) = \\{ T(x) : x ∈ V \\}$.
Theorem: The kernel $N(T)$ and image $R(T)$ are subspaces of $V$ and $W$, respectively.
-<details>
+<details markdown="block">
<summary>Proof</summary>
We shall denote the zero vector of $V$ and $W$ as $0_v$ and $0_w$, respectively.
@@ -170,15 +171,15 @@ Let $x,y ∈ R(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_w ∈ R(T)$.
</details>
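
As an illustrative aside (not from the original notes), a small sympy sketch computing the kernel and image of the induced map $L_A(x) = Ax$; the matrix chosen is an arbitrary assumption:

```python
# A minimal sketch (assumed example): kernel and image of the linear map
# L_A(x) = Ax, computed exactly with sympy.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])        # rank 1, so the kernel is 2-dimensional

kernel_basis = A.nullspace()   # basis vectors of N(L_A)
image_basis  = A.columnspace() # basis vectors of R(L_A)

print(kernel_basis)            # two basis vectors spanning the kernel
print(image_basis)             # one basis vector spanning the image
```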
Theorem: If $β = \\{ v_1, v_2, ... v_n \\}$ is a basis for $V$, then $R(T) = span(\\{ T(v_1), T(v_2), ..., T(v_n) \\})$.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
-If $N(T)$ and $R(T)$ are finite-dimensional, then the **nullity** and **rank** of T are the dimensions of $N(T)$ and $R(T)$, respectively.
+For finite-dimensional $N(T)$ and $R(T)$: the **nullity** and **rank** of $T$ are the dimensions of $N(T)$ and $R(T)$, respectively.
-Rank-Nullity Theorem: If $V$ is *finite-dimensional*, then $dim(V) = nullity(T) + rank(T)$.
-<details>
+**Rank-Nullity Theorem**: If $V$ is *finite-dimensional*, then $dim(V) = nullity(T) + rank(T)$.
+<details markdown="block">
<summary>Proof</summary>
...
</details>
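
A hypothetical numeric check of the theorem, with an arbitrary sample matrix standing in for $T$:

```python
# Sketch (assumed example): verify dim(V) = nullity(T) + rank(T) for the map
# L_A : F^4 -> F^3 induced by a sample matrix A.
from sympy import Matrix

A = Matrix([[1, 0, 2, -1],
            [3, 1, 0,  4],
            [4, 1, 2,  3]])    # third row = first + second, so rank 2

rank    = A.rank()
nullity = len(A.nullspace())

assert rank + nullity == A.cols   # A.cols = 4 = dim(V)
```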
@@ -189,19 +190,19 @@ A function is **surjective** (or **onto**) iff each element of the codomain is m
A function is **bijective** iff it is surjective and injective. Necessarily, a bijective function is invertible, which will be formally stated & proven later.
Theorem: $T$ is injective iff $N(T) = \\{0\\}$.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
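
A sketch of this criterion on two assumed sample matrices, one injective and one not:

```python
# Sketch (assumed example): T is injective iff its kernel is trivial.
from sympy import Matrix

A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])           # injective map F^2 -> F^3

B = Matrix([[1, 2],
            [2, 4]])           # not injective: columns are dependent

print(A.nullspace())  # []  -> N(T) = {0}, so injective
print(B.nullspace())  # one nonzero basis vector -> not injective
```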
Theorem: For $V$ and $W$ of equal (and finite) dimension: $T$ is injective iff it is surjective.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
Theorem: Suppose that $V$ is finite-dimensional with a basis $\\{ v_1, v_2, ..., v_n \\}$. For any vectors $w_1, w_2, ... w_n$ in $W$, there exists *exactly* one linear transformation such that $T(v_i) = w_i$ for $i = 1, 2, ..., n$.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
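
A sketch under the assumption that $β$ is the standard basis of $F^2$: the unique $T$ with $T(e_i) = w_i$ is the matrix whose $i$-th column is $w_i$.

```python
# Sketch (assumed example): a linear map is determined by its values on a basis.
from sympy import Matrix

w1 = Matrix([1, 2, 3])
w2 = Matrix([0, 1, 1])

T = Matrix.hstack(w1, w2)      # 3x2 matrix with columns w1, w2

e1, e2 = Matrix([1, 0]), Matrix([0, 1])
assert T * e1 == w1 and T * e2 == w2
```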
@@ -210,8 +211,14 @@ Theorem: Suppose that $V$ is finite-dimensional with a basis $\\{ v_1, v_2, ...,
Let $V$, $W$, and $Z$ be vector spaces.
+Theorem: Let $T, U : V → W$ be linear. Then their sum and scalar products are linear.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
+
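An illustrative check (assumed sample matrices) that sums and scalar multiples of linear maps act pointwise, matching entrywise sums and scalar multiples of their matrices:

```python
# Sketch (assumed example): (T + U)(x) = T(x) + U(x) and (aT)(x) = a T(x).
from sympy import Matrix

T = Matrix([[1, 0],
            [2, 1]])
U = Matrix([[0, 3],
            [1, 1]])
x = Matrix([1, 2])

assert (T + U) * x == T * x + U * x
assert (5 * T) * x == 5 * (T * x)
```
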
Theorem: The set of all linear transformations (via our definitions of addition and scalar multiplication above) $V → W$ forms a vector space over $F$. We denote this as $\mathcal{L}(V, W)$. If $V = W$, we write $\mathcal{L}(V)$.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
@@ -219,7 +226,7 @@ Theorem: The set of all linear transformations (via our definitions of addition
Let $T, U : V → W$ be arbitrary functions. We define **addition** $T + U : V → W$ as $∀x ∈ V : (T + U)(x) = T(x) + U(x)$, and **scalar multiplication** $aT : V → W$ as $∀x ∈ V : (aT)(x) = aT(x)$ for all $a ∈ F$.
Theorem: Let $T : V → W$ and $U : W → Z$ be linear. Then their composition $UT : V → Z$ is linear.
-<details>
+<details markdown="block">
<summary>Proof</summary>
Let $x,y ∈ V$ and $c ∈ F$. Then:
@@ -233,15 +240,54 @@ Theorem: Let $T, U_1, U_2 ∈ \mathcal{L}(V)$. Then:
- $T(U_1 U_2) = (TU_1) U_2$
- $TI = IT = T$
- $∀a ∈ F : a(U_1 U_2) = (aU_1) U_2 = U_1 (aU_2)$
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
<!-- A more general result holds for linear transformations with domains unequal to their codomains, exercise 7 -->
</details>
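
An illustrative check of the identities above, with assumed sample matrices standing in for elements of $\mathcal{L}(F^2)$:

```python
# Sketch (assumed example): distributivity and associativity of composition.
from sympy import Matrix, eye

T  = Matrix([[1, 2], [0, 1]])
U1 = Matrix([[0, 1], [1, 0]])
U2 = Matrix([[2, 0], [0, 3]])
I  = eye(2)

assert T * (U1 + U2) == T * U1 + T * U2
assert T * (U1 * U2) == (T * U1) * U2
assert T * I == T and I * T == T
```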
+## Invertibility and Isomorphism
+
+- Let $V$ and $W$ be vector spaces.
+- Let $T: V → W$ be a linear transformation.
+- Let $I_V: V → V$ and $I_W: W → W$ denote the identity transformations within $V$ and $W$, respectively.
+
+A function $U: W → V$ is an **inverse** of $T$ if $TU = I_W$ and $UT = I_V$.<br>
+If $T$ has an inverse, then $T$ is **invertible**.
+
+Theorem: Consider a linear function $T: V → W$.
+- If $T$ is invertible, it has a *unique* inverse $T^{-1}$.
+- If $T$ is invertible, $T^{-1}$ is invertible with the inverse $T$.
+- A function is invertible if and only if it is bijective.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
+
+Theorem: If $T$ is linear and invertible, $T^{-1}$ is linear and invertible.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
+
+$V$ is **isomorphic** to $W$ if there exists an *invertible* linear transformation $T : V → W$ (an **isomorphism**).
+
+Lemma: For finite-dimensional $V$ and $W$: If $T: V → W$ is invertible, then $dim(V) = dim(W)$.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
+
+Theorem: $V$ is isomorphic to $W$ iff $dim(V) = dim(W)$.
+- Subsequently, $V$ is isomorphic to $F^n$ iff $dim(V) = n$.
+<details markdown="block">
+<summary>Proof</summary>
+...
+</details>
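+
For instance, $M_{2×2}(F)$ and $F^4$ both have dimension $4$ and are therefore isomorphic by the theorem above; a sketch of one such isomorphism (the entry-flattening map, an illustrative choice):

```python
# Sketch (assumed example): flattening a 2x2 matrix into its entries is an
# isomorphism M_{2x2}(F) -> F^4.
from sympy import Matrix

def phi(M):                    # M_{2x2} -> F^4
    return Matrix([M[0, 0], M[0, 1], M[1, 0], M[1, 1]])

M = Matrix([[1, 2],
            [3, 4]])
print(phi(M).T)                # the coordinate vector (1, 2, 3, 4)
```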
+
## Linear Transformations as Matrices
-- Let $V, W$ be finite-dimensional vector spaces.
+- Let $V, W$ be finite-dimensional vector spaces (over a field $F$).
- Let $T, U : V → W$ be linear transformations from $V$ to $W$.
- Let $β$ and $γ$ be ordered bases of $V$ and $W$, respectively.
- Let $a ∈ F$ be a scalar.
@@ -255,61 +301,41 @@ Let $a_1, a_2, ... a_n$ be the unique scalars such that $x = Σ_{i=1}^n a_i u_i$
The $m × n$ matrix $A$ defined by $A_{ij} = a_{ij}$ is called the **matrix representation of $T$ in the ordered bases $β$ and $γ$**, and denoted as $A = [T]_β^γ$. If $V = W$ and $β = γ$, we write $A = [T]_β$.
Theorem: $[T + U]_β^γ = [T]_β^γ + [U]_β^γ$ and $[aT]_β^γ = a[T]_β^γ$.
-<details>
+<details markdown="block">
<summary>Proof</summary>
...
</details>
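
An illustrative computation (not from the notes) of $[T]_β^γ$ for the derivative map $d/dx : P_2(F) → P_1(F)$ with $β = \\{1, x, x^2\\}$ and $γ = \\{1, x\\}$; the helper `coords` is a hypothetical name:

```python
# Sketch (assumed example): column j of [T]_beta^gamma holds the
# gamma-coordinates of T(beta_j).
from sympy import Matrix, Poly, diff, symbols

x = symbols('x')
beta = [1, x, x**2]
gamma_size = 2                 # gamma = {1, x}

def coords(p):                 # gamma-coordinates of a polynomial of degree < 2
    c = Poly(p, x).all_coeffs()[::-1]          # constant term first
    return Matrix(c + [0] * (gamma_size - len(c)))

T_matrix = Matrix.hstack(*[coords(diff(b, x)) for b in beta])
print(T_matrix)                # Matrix([[0, 1, 0], [0, 0, 2]])
```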
---
-## Invertibility and Isomorphism
-
-Let $V$ and $W$ be vector spaces.
-Let $T: U → V$ be a linear transformation.
-Let $I_V: V → V$ and $I_W: W → W$ denote the identity transformations within $V$ and $W$, respectively.
-
-A function $U: W → V$ is an **inverse** of $T$ if $TU = I_W$ and $UT = I_V$. If $T$ has an inverse, then $T$ is **invertible**.
+Let $A$ be an $n × n$ matrix. Then $A$ is **invertible** iff there exists an $n × n$ matrix $B$ such that $AB = BA = I$.
-Theorem: Consider a linear function $T: V → W$.
-- If $T$ is invertible, it has a *unique* inverse $T^{-1}$.
-- If $T$ is invertible, $T^{-1}$ is invertible with the inverse $T$.
-- A function is invertible if and only iff it is bijective.
-<details>
+Theorem: If $A$ is invertible, the matrix $B$ is unique, and denoted $A^{-1}$.
+<details markdown="block">
<summary>Proof</summary>
-...
+Suppose there existed another inverse matrix $C$. Then $C = CI = C(AB) = (CA)B = IB = B$.
</details>
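
A quick check with an assumed sample matrix that the computed inverse satisfies $AB = BA = I$:

```python
# Sketch (assumed example): an invertible matrix and its unique inverse.
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])
B = A.inv()

assert A * B == eye(2) and B * A == eye(2)
print(B)                       # Matrix([[1, -1], [-1, 2]])
```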
-Theorem: If $T$ is linear and invertible, $T^{-1}$ is linear and invertible.
-<details>
+Theorem: For finite-dimensional $V$ and $W$: $T$ is invertible iff $[T]_β^γ$ is invertible, and $[T^{-1}]_γ^β = ([T]_β^γ)^{-1}$. Subsequently:
+- Let $U : V → V$ be linear. $U$ is invertible iff $[U]_β$ is invertible, and $[U^{-1}]_β = ([U]_β)^{-1}$.
+- Let $A$ be an $n × n$ matrix. $A$ is invertible iff $L_A$ is invertible, and $(L_A)^{-1} = L_{A^{-1}}$.
+<details markdown="block">
<summary>Proof</summary>
...
</details>
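
A sketch (with an arbitrary assumed matrix) of the last point, $(L_A)^{-1} = L_{A^{-1}}$:

```python
# Sketch (assumed example): inverting the matrix inverts the induced map.
from sympy import Matrix

A = Matrix([[1, 2],
            [0, 1]])
x = Matrix([3, 5])

y = A * x                      # apply L_A
assert A.inv() * y == x        # applying L_{A^{-1}} recovers x
```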
-Let $A$ be a $n × n$ matrix. Then $A$ is **invertible** iff there exists an $n × n$ matrix $B$ such that $AB = BA = I$.
-
-Theorem: If $A$ is invertible, the matrix $B$ is unique, and denoted $A^{-1}$.
-<details>
+Theorem: Let $V$ and $W$ be of finite dimensions $n$ and $m$ with ordered finite bases $β$ and $γ$, respectively. The function $Φ : \mathcal{L}(V,W) → M_{m×n}(F)$ where $Φ(T) = [T]_β^γ$ for all $T ∈ \mathcal{L}(V,W)$ is an isomorphism.
+<details markdown="block">
<summary>Proof</summary>
-Suppose there existed another inverse matrix $C$. Then $C = CI = C(AB) = (CA)B = IB = B$.
+...
</details>
+That is, the set (really a *vector space*) of all linear transformations between two vector spaces $V$ and $W$ is itself isomorphic to the vector space of all $m × n$ matrices.
+- Subsequently, for $V$ and $W$ of finite dimensions $n$ and $m$, $\mathcal{L}(V,W)$ is of finite dimension $mn$.
-$V$ is **isomorphic** to $W$ if there exists an *invertible* linear transformation $T : V → W$ (an **isomorphism**).
+Let $β$ be an ordered basis for an $n$-dimensional $V$. The **standard representation** of $V$ with respect to $β$ is the function $ϕ_β : V → F^n$ where $ϕ_β(x) = [x]_β$ for all $x ∈ V$.
-Lemma: For finite-dimensional $V$ and $W$: If $T: V → W$ is invertible, then $dim(V) = dim(W)$.
-<details>
+Theorem: For any $V$ with ordered basis $β$, $ϕ_β$ is an isomorphism.
+<details markdown="block">
<summary>Proof</summary>
...
</details>
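
A sketch of $ϕ_β$ for an assumed non-standard basis $β = \\{(1, 1), (1, -1)\\}$ of $F^2$:

```python
# Sketch (assumed example): phi_beta sends x to its coordinate vector [x]_beta.
from sympy import Matrix

beta = Matrix([[1,  1],
               [1, -1]])       # basis vectors as columns
x = Matrix([3, 1])

coords = beta.inv() * x        # [x]_beta, i.e. beta * coords == x
assert beta * coords == x
print(coords.T)                # the coordinate vector (2, 1)
```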
-
-Theorem: For finite-dimensional $V$ and $W$: $T$ is invertible iff $[T]_β^γ$ is invertible, and $[T^{-1}]_γ^β = ([T]_β^γ)^{-1}$.
-
-...
-
-
-Let $V$ and $W$ be vector spaces. We say that $V$ is **isomorphic** to $W$ if there exists an *invertible* linear transformation $T: V → W$.
-Such a transformation is called an **isomorphism** from $V$ onto $W$.
-- "is isomorphic to" is an equivalence relation
-
-## Matrices
-
-An $n × n$ matrix $A$ is **invertible** if there exists an $n × n$ matrix $B$ such that $AB = BA = I_n$.