From 5640e35821ec63f7c1af402101f527ce27992ee7 Mon Sep 17 00:00:00 2001 From: JJ Date: Thu, 14 Mar 2024 18:00:35 -0700 Subject: meow --- mathematics/linear-algebra.md | 132 +++++++++++++++++++++++++----------------- 1 file changed, 79 insertions(+), 53 deletions(-) diff --git a/mathematics/linear-algebra.md b/mathematics/linear-algebra.md index 4e1162b..7b9cc27 100644 --- a/mathematics/linear-algebra.md +++ b/mathematics/linear-algebra.md @@ -90,8 +90,8 @@ Example: Let $S = \\{(a_1, a_2) | a_1, a_2 ∈ ℝ\\}$. We define: - fails zero! A subset $W$ of a vector space $V$ over a field 𝔽 is called a **subspace** of $V$ if $W$ is a *vector space* over 𝔽 with the operations of addition and scalar multiplication from $V$. -- -- +- . +- . A subset of $V$ is a **subspace** of V iff: - the subset is non-empty @@ -155,11 +155,12 @@ Subsequently: Let $T: V → W$ be a linear transformation. -The **kernel** (or null space) $N(T)$ of $T$ is the set of all vectors in $V$ such that $T(x) = 0$: $N(T) = \\{ x ∈ V : T(x) = 0 \\}$. -The **image** (or range) $R(T)$ of $T$ is the subset of $W$ consisting of all images (under $T$) of elements of $V$: $R(T) = \\{ T(x) : x ∈ V \\}$ +The **kernel** (or null space) $N(T)$ of $T$ is the set of all vectors in $V$ such that $T(x) = 0$: that is, $N(T) = \\{ x ∈ V : T(x) = 0 \\}$. + +The **image** (or range) $R(T)$ of $T$ is the subset of $W$ consisting of all images (under $T$) of elements of $V$: that is, $R(T) = \\{ T(x) : x ∈ V \\}$. Theorem: The kernel $N(T)$ and image $R(T)$ are subspaces of $V$ and $W$, respectively. -
+
Proof We shall denote the zero vector of $V$ and $W$ as $0_v$ and $0_w$, respectively. @@ -170,15 +171,15 @@ Let $x,y ∈ R(T)$ and $c ∈ F$. As $T(0_v) = 0_w$, $0_w ∈ R(T)$.
Theorem: If $β = \\{ v_1, v_2, ... v_n \\}$ is a basis for $V$, then $R(T) = span(\\{ T(v_1), T(v_2), ..., T(v_n) \\})$. -
+
Proof ...
-If $N(T)$ and $R(T)$ are finite-dimensional, then the **nullity** and **rank** of T are the dimensions of $N(T)$ and $R(T)$, respectively.
+For finite-dimensional $N(T)$ and $R(T)$: the **nullity** and **rank** of $T$ are the dimensions of $N(T)$ and $R(T)$, respectively.

-Rank-Nullity Theorem: If $V$ is *finite-dimensional*, then $dim(V) = nullity(T) + rank(T)$.
-
+**Rank-Nullity Theorem**: If $V$ is *finite-dimensional*, then $dim(V) = nullity(T) + rank(T)$. +
Proof ...
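As a quick numerical sanity check of the theorem (the projection map below is an illustrative example of my own, not one from the text), both quantities can be computed with NumPy:

```python
import numpy as np

# Rank-nullity check for the projection T(a1, a2, a3) = (a1, a2, 0) on V = R^3,
# represented as a matrix in the standard basis.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

rank = np.linalg.matrix_rank(A)   # dim R(T)
nullity = A.shape[1] - rank       # dim N(T), by rank-nullity
print(rank, nullity)              # 2 1
```

Here $dim(V) = 3 = 1 + 2 = nullity(T) + rank(T)$, as the theorem requires.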
@@ -189,19 +190,19 @@ A function is **surjective** (or **onto**) iff each element of the codomain is m A function is **bijective** iff it is surjective and injective. Necessarily, a bijective function is invertible, which will be formally stated & proven later. Theorem: $T$ is injective iff $N(T) = \\{0\\}$. -
+
Proof ...
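In coordinates this gives a computable injectivity test (both matrices below are illustrative choices of my own): a matrix map is injective exactly when its nullity is zero.

```python
import numpy as np

# T is injective iff N(T) = {0}: check via nullity = (number of columns) - rank.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # trivial kernel: injective
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first: nontrivial kernel

nullity_A = A.shape[1] - np.linalg.matrix_rank(A)
nullity_B = B.shape[1] - np.linalg.matrix_rank(B)
# B is not injective: it sends both (0, 0) and (2, -1) to the zero vector.
print(nullity_A, nullity_B, B @ np.array([2.0, -1.0]))
```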
Theorem: For $V$ and $W$ of equal (and finite) dimension: $T$ is injective iff it is surjective. -
+
Proof ...
Theorem: Suppose that $V$ is finite-dimensional with a basis $\\{ v_1, v_2, ..., v_n \\}$. For any vectors $w_1, w_2, ... w_n$ in $W$, there exists *exactly* one linear transformation such that $T(v_i) = w_i$ for $i = 1, 2, ..., n$. -
+
Proof ...
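Concretely (with target vectors chosen purely for illustration), the unique $T$ is obtained by stacking the prescribed images as columns; linearity then determines $T$ on every other vector:

```python
import numpy as np

# The unique linear T: R^2 -> R^3 with T(e1) = w1, T(e2) = w2 is the matrix
# whose columns are w1 and w2 (standard bases assumed).
w1 = np.array([1.0, 0.0, 2.0])
w2 = np.array([0.0, 1.0, 3.0])
T = np.column_stack([w1, w2])

# Linearity forces the value everywhere: T(a*e1 + b*e2) = a*w1 + b*w2.
x = np.array([2.0, -5.0])
print(T @ x)   # equals 2*w1 - 5*w2
```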
@@ -210,8 +211,14 @@ Theorem: Suppose that $V$ is finite-dimensional with a basis $\\{ v_1, v_2, ..., Let $V$, $W$, and $Z$ be vector spaces. +Theorem: Let $T, U : V → W$ be linear. Then their sum and scalar products are linear. +
+Proof +... +
+ Theorem: The set of all linear transformations (via our definitions of addition and scalar multiplication above) $V → W$ forms a vector space over $F$. We denote this as $\mathcal{L}(V, W)$. If $V = W$, we write $\mathcal{L}(V)$. -
+
Proof ...
@@ -219,7 +226,7 @@ Theorem: The set of all linear transformations (via our definitions of addition Let $T, U : V → W$ be arbitrary functions. We define **addition** $T + U : V → W$ as $∀x ∈ V : (T + U)(x) = T(x) + U(x)$, and **scalar multiplication** $aT : V → W$ as $∀x ∈ V : (aT)(x) = aT(x)$ for all $a ∈ F$. Theorem: Let $T : V → W$ and $U : W → Z$ be linear. Then their composition $UT : V → Z$ is linear. -
+
Proof Let $x,y ∈ V$ and $c ∈ F$. Then: @@ -233,15 +240,54 @@ Theorem: Let $T, U_1, U_2 ∈ \mathcal{L}(V)$. Then: - $T(U_1 U_2) = (TU_1) U_2$ - $TI = IT = T$ - $∀a ∈ F : a(U_1 U_2) = (aU_1) U_2 = U_1 (aU_2)$ -
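With matrices standing in for the maps (the two matrices are arbitrary illustrative choices), the composite $UT$ is the product of the two matrices, and the identity $(UT)(cx + y) = c(UT)(x) + (UT)(y)$ from the proof can be checked directly:

```python
import numpy as np

# Composition of linear maps: UT corresponds to the matrix product U @ T.
T = np.array([[1.0, 2.0],
              [0.0, 1.0]])
U = np.array([[1.0, 0.0],
              [3.0, 1.0]])
UT = U @ T

x, y, c = np.array([1.0, 2.0]), np.array([-1.0, 4.0]), 3.0
lhs = UT @ (c * x + y)        # (UT)(cx + y)
rhs = c * (UT @ x) + UT @ y   # c(UT)(x) + (UT)(y)
print(np.allclose(lhs, rhs))  # True
```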
+
Proof ...
+## Invertibility and Isomorphism + +- Let $V$ and $W$ be vector spaces. +- Let $T: V → W$ be a linear transformation. +- Let $I_V: V → V$ and $I_W: W → W$ denote the identity transformations within $V$ and $W$, respectively. + +A function $U: W → V$ is an **inverse** of $T$ if $TU = I_W$ and $UT = I_V$.
+If $T$ has an inverse, then $T$ is **invertible**.
+
+Theorem: Consider a linear function $T: V → W$.
+- If $T$ is invertible, it has a *unique* inverse $T^{-1}$.
+- If $T$ is invertible, $T^{-1}$ is invertible with inverse $T$.
+- A function is invertible if and only if it is bijective.
+
+Proof +... +
+ +Theorem: If $T$ is linear and invertible, $T^{-1}$ is linear and invertible. +
+Proof +... +
+ +$V$ is **isomorphic** to $W$ if there exists an *invertible* linear $T : V → W$ (an **isomorphism**). + +Lemma: For finite-dimensional $V$ and $W$: If $T: V → W$ is invertible, then $dim(V) = dim(W)$. +
+Proof +... +
+ +Theorem: $V$ is isomorphic to $W$ iff $dim(V) = dim(W)$. +- Subsequently, $V$ is isomorphic to $F^n$ iff $dim(V) = n$. +
+Proof +... +
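A standard illustration (my example, not one from the text): the space $P_2(ℝ)$ of polynomials of degree at most $2$ is isomorphic to $ℝ^3$, since both have dimension $3$:

```latex
% The coordinate map sending a polynomial to its coefficient vector
\varphi : P_2(\mathbb{R}) \to \mathbb{R}^3, \qquad
\varphi(a + bx + cx^2) = (a, b, c)
% \varphi is linear and invertible, and indeed
% \dim P_2(\mathbb{R}) = \dim \mathbb{R}^3 = 3
```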
+ ## Linear Transformations as Matrices -- Let $V, W$ be finite-dimensional vector spaces. +- Let $V, W$ be finite-dimensional vector spaces (over a field $F$). - Let $T, U : V → W$ be linear transformations from $V$ to $W$. - Let $β$ and $γ$ be ordered bases of $V$ and $W$, respectively. - Let $a ∈ F$ be a scalar. @@ -255,61 +301,41 @@ Let $a_1, a_2, ... a_n$ be the unique scalars such that $x = Σ_{i=1}^n a_i u_i$ The $m × n$ matrix $A$ defined by $A_{ij} = a_{ij}$ is called the **matrix representation of $T$ in the ordered bases $β$ and $γ$**, and denoted as $A = [T]_β^γ$. If $V = W$ and $β = γ$, we write $A = [T]_β$. Theorem: $[T + U]_β^γ = [T]_β^γ + [U]_β^γ$ and $[aT]_β^γ = a[T]_β^γ$. -
+
Proof ...
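The construction of $[T]_β^γ$ can be carried out mechanically: column $j$ holds the $γ$-coordinates of $T$ applied to the $j$-th vector of $β$. The map below is an illustrative choice, with $β$ and $γ$ both taken as the standard basis of $ℝ^2$:

```python
import numpy as np

# Build [T]_β^γ column by column for the example map T(a1, a2) = (a1 + 3*a2, 2*a2).
def T(v):
    return np.array([v[0] + 3 * v[1], 2 * v[1]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = np.column_stack([T(e1), T(e2)])   # columns are the images of the basis vectors

x = np.array([2.0, -1.0])
print(M)            # [[1. 3.] [0. 2.]]
print(M @ x, T(x))  # the matrix reproduces T in coordinates
```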
---

-## Invertibility and Isomorphism
-
-Let $V$ and $W$ be vector spaces.
-Let $T: U → V$ be a linear transformation.
-Let $I_V: V → V$ and $I_W: W → W$ denote the identity transformations within $V$ and $W$, respectively.
-
-A function $U: W → V$ is an **inverse** of $T$ if $TU = I_W$ and $UT = I_V$. If $T$ has an inverse, then $T$ is **invertible**.
+Let $A$ be an $n × n$ matrix. Then $A$ is **invertible** iff there exists an $n × n$ matrix $B$ such that $AB = BA = I$.

-Theorem: Consider a linear function $T: V → W$.
-- If $T$ is invertible, it has a *unique* inverse $T^{-1}$.
-- If $T$ is invertible, $T^{-1}$ is invertible with the inverse $T$.
-- A function is invertible if and only iff it is bijective.
-
+Theorem: If $A$ is invertible, the matrix $B$ is unique, and is denoted $A^{-1}$.
+
Proof -... +Suppose there existed another inverse matrix $C$. Then $C = CI = C(AB) = (CA)B = IB = B$.
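A quick numerical illustration of uniqueness (example matrix of my own choosing): `numpy.linalg.inv` returns the single matrix $B = A^{-1}$ satisfying both products:

```python
import numpy as np

# For an invertible A, the inverse is the unique B with AB = BA = I.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)

I = np.eye(2)
print(np.allclose(A @ B, I), np.allclose(B @ A, I))   # True True
```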
-Theorem: If $T$ is linear and invertible, $T^{-1}$ is linear and invertible. -
+Theorem: For finite-dimensional $V$ and $W$: $T$ is invertible iff $[T]_β^γ$ is invertible, and $[T^{-1}]_γ^β = ([T]_β^γ)^{-1}$. Subsequently:
+- Let $U : V → V$ be linear. $U$ is invertible iff $[U]_β$ is invertible, and $[U^{-1}]_β = ([U]_β)^{-1}$.
+- Let $A$ be an $n × n$ matrix. $A$ is invertible iff $L_A$ is invertible, and $(L_A)^{-1} = L_{A^{-1}}$.
+
Proof ...
-Let $A$ be a $n × n$ matrix. Then $A$ is **invertible** iff there exists an $n × n$ matrix $B$ such that $AB = BA = I$. - -Theorem: If $A$ is invertible, the matrix $B$ is unique, and denoted $A^{-1}$. -
+Theorem: Let $V$ and $W$ be of finite dimensions $n$ and $m$ with ordered bases $β$ and $γ$, respectively. The function $Φ : \mathcal{L}(V,W) → M_{m×n}(F)$ where $Φ(T) = [T]_β^γ$ for all $T ∈ \mathcal{L}(V,W)$ is an isomorphism.
+
Proof -Suppose there existed another inverse matrix $C$. Then $C = CI = C(AB) = (CA)B = IB = B$. +...
+That is, the set (really, a *vector space*) of all linear transformations between two vector spaces $V$ and $W$ is itself isomorphic to the vector space of all $m × n$ matrices.
+- Subsequently, for $V$ and $W$ of finite dimensions $n$ and $m$, $\mathcal{L}(V,W)$ is of finite dimension $mn$.

-$V$ is **isomorphic** to $W$ if there exists an *invertible* linear $T : V → W$ (an **isomorphism**).
+Let $β$ be an ordered basis for an $n$-dimensional $V$. The **standard representation** of $V$ with respect to $β$ is the function $ϕ_β : V → F^n$ where $ϕ_β(x) = [x]_β$ for all $x ∈ V$.

-Lemma: For finite-dimensional $V$ and $W$: If $T: V → W$ is invertible, then $dim(V) = dim(W)$.
-
+Theorem: For any $V$ with ordered basis $β$, $ϕ_β$ is an isomorphism.
+
Proof ...
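Computing $[x]_β$ amounts to solving a linear system (the basis below is an illustrative choice): with the basis vectors as the columns of a matrix $B$, the coordinates satisfy $B \, [x]_β = x$:

```python
import numpy as np

# Standard representation in R^2 for the ordered basis β = {(1, 1), (1, -1)}.
B = np.array([[1.0, 1.0],
              [1.0, -1.0]])   # basis vectors as columns
x = np.array([3.0, 1.0])

coords = np.linalg.solve(B, x)   # phi_β(x) = [x]_β
print(coords)                    # [2. 1.]: x = 2*(1,1) + 1*(1,-1)
```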
- -Theorem: For finite-dimensional $V$ and $W$: $T$ is invertible iff $[T]_β^γ$ is invertible, and $[T^{-1}]_γ^β = ([T]_β^γ)^{-1}$. - -... - - -Let $V$ and $W$ be vector spaces. We say that $V$ is **isomorphic** to $W$ if there exists an *invertible* linear transformation $T: V → W$. -Such a transformation is called an **isomorphism** from $V$ onto $W$. -- "is isomorphic to" is an equivalence relation - -## Matrices - -An $n × n$ matrix $A$ is **invertible** if there exists an $n × n$ matrix $B$ such that $AB = BA = I_n$. -- cgit v1.2.3-70-g09d2