Matrix Operations: Be proficient in matrix operations like matrix multiplication, addition, and scalar multiplication. Study the properties of matrices, including transposition, inverse, and rank.
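As a quick illustration, here is a minimal NumPy sketch of these operations on two small, arbitrarily chosen 2×2 matrices:

```python
import numpy as np

# Two illustrative 2x2 matrices.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

C_sum = A + B                       # matrix addition
C_scaled = 2.0 * A                  # scalar multiplication
C_prod = A @ B                      # matrix multiplication (not commutative in general)

A_T = A.T                           # transpose
A_inv = np.linalg.inv(A)            # inverse (exists here since det(A) = -2 != 0)
rank_A = np.linalg.matrix_rank(A)   # rank
```

Note that `A @ B` and `B @ A` generally differ, and `np.linalg.inv` raises an error for singular matrices, so checking the rank (or determinant) first is good practice.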
Vector Spaces: Familiarize yourself with the properties and axioms of vector spaces. Learn about the subspace concept, linear independence, and spanning sets.
A vector space (also known as a linear space) is a collection of objects called vectors, which can be added together and multiplied ("scaled") by numbers, called scalars in this context. Scalars are often taken to be real numbers, but one can also consider vector spaces with scalar multiplication by complex numbers, rational numbers, or generally any field.
The operations of vector addition and scalar multiplication must satisfy certain requirements, or axioms. These are, for any vectors u, v, and w in vector space V, and any scalars a and b:
Associativity of addition: u + (v + w) = (u + v) + w
Commutativity of addition: u + v = v + u
Identity element of addition: There exists an element 0 in V, such that v + 0 = v for all v in V.
Inverse elements of addition: For every v in V, there exists an element -v in V such that v + (-v) = 0
Compatibility of scalar multiplication with field multiplication: a * (bv) = (ab)v
Identity element of scalar multiplication: 1v = v for all v in V.
Distributivity of scalar multiplication with respect to vector addition: a * (u + v) = au + av
Distributivity of scalar multiplication with respect to scalar addition: (a + b)v = av + bv
For example, the set of all two-dimensional vectors forms a vector space, as does the set of all three-dimensional vectors, the set of all real numbers, the set of all polynomials, and so on.
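The axioms above can be spot-checked numerically for a concrete space such as R³. This is only an illustration with a few arbitrary vectors, not a proof:

```python
import numpy as np

# Arbitrary vectors and scalars in R^3 for an illustrative axiom check.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
a, b = 2.0, -3.0
zero = np.zeros(3)

checks = [
    np.allclose(u + (v + w), (u + v) + w),   # associativity of addition
    np.allclose(u + v, v + u),               # commutativity of addition
    np.allclose(v + zero, v),                # identity element of addition
    np.allclose(v + (-v), zero),             # inverse elements of addition
    np.allclose(a * (b * v), (a * b) * v),   # compatibility of scalar multiplication
    np.allclose(1.0 * v, v),                 # identity element of scalar multiplication
    np.allclose(a * (u + v), a*u + a*v),     # distributivity over vector addition
    np.allclose((a + b) * v, a*v + b*v),     # distributivity over scalar addition
]
```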
Inner Product Spaces: Understand inner product spaces, inner products, and their properties. Study norms, orthogonality, and orthogonal bases.
An inner product space is a vector space equipped with an additional structure called an inner product. This structure lets you measure lengths of vectors and angles between them, bringing geometric notions into abstract vector spaces.
Formally, an inner product on a real vector space V is a function that associates each pair of vectors u, v in V with a real number, denoted as ⟨u, v⟩, and satisfies the following properties for all u, v, w in V and all scalars c:
Symmetry: ⟨v, u⟩ = ⟨u, v⟩ (on a complex vector space this becomes conjugate symmetry: ⟨v, u⟩ is the complex conjugate of ⟨u, v⟩)
Linearity in the first argument: ⟨u+v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩ and ⟨cu, v⟩ = c ⟨u, v⟩
Positive-definiteness: ⟨v, v⟩ ≥ 0 and ⟨v, v⟩ = 0 if and only if v = 0
The inner product of two vectors is a number that provides information about the angle between the vectors and the lengths of the vectors. The most common inner product is the dot product, which on real vectors is defined as ⟨u, v⟩ = u₁v₁ + u₂v₂ + ... + uₙvₙ, where u = (u₁, u₂, ..., uₙ) and v = (v₁, v₂, ..., vₙ).
The norm of a vector is a measure of its length (or magnitude) in the vector space. In an inner product space, the norm of a vector v is defined as the square root of the inner product of the vector with itself, ||v|| = sqrt(⟨v, v⟩).
Two vectors are orthogonal (or perpendicular) if their inner product is zero. In terms of geometry, orthogonal vectors are at a right angle to each other.
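These three definitions are easy to see in NumPy. The two vectors below are chosen so that they happen to be orthogonal:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([-4.0, 3.0])

dot_uv = np.dot(u, v)             # dot product: 3*(-4) + 4*3 = 0, so u and v are orthogonal
norm_u = np.sqrt(np.dot(u, u))    # norm from the inner product: sqrt(9 + 16) = 5
```

`np.linalg.norm(u)` computes the same quantity directly; the explicit square root of the self-inner-product is shown here to match the definition.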
An orthogonal basis for an inner product space V is a basis such that every pair of different basis vectors is orthogonal. If in addition all basis vectors have norm 1, the basis is called orthonormal.
Orthogonal and orthonormal bases have very nice properties. For example, to find the coordinates of a vector v with respect to an orthonormal basis, you just need to compute the inner product of v with each basis vector.
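The coordinate property can be sketched with a concrete orthonormal basis of R², here the standard basis rotated by 45 degrees (an arbitrary illustrative choice):

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([2.0, 3.0])

# Coordinates with respect to an orthonormal basis are just inner products.
c1 = np.dot(v, e1)
c2 = np.dot(v, e2)

v_reconstructed = c1 * e1 + c2 * e2   # v = c1*e1 + c2*e2
```

No linear system needs to be solved: that is the computational payoff of orthonormality.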
The Gram-Schmidt Process is a method for converting any basis of a finite-dimensional inner product space into an orthogonal basis (and, by normalizing each resulting vector, an orthonormal one). This is very useful for simplifying computations in linear algebra.
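A minimal sketch of classical Gram-Schmidt, assuming the input vectors are linearly independent (otherwise a near-zero vector would be produced and normalization would fail):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: turn a list of linearly independent
    vectors into an orthonormal basis of their span."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for q in basis:
            w -= np.dot(w, q) * q          # subtract the projection onto each earlier direction
        basis.append(w / np.linalg.norm(w))  # normalize (assumes linear independence)
    return basis

q1, q2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
```

In floating-point practice, the modified Gram-Schmidt variant (or a QR factorization such as `np.linalg.qr`) is preferred for numerical stability; the classical version above is the textbook form.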
These concepts are fundamental in many fields of mathematics and its applications, including machine learning, data science, physics, engineering, computer graphics, and many others.
Hilbert Spaces: Get acquainted with Hilbert spaces, which are complete inner product spaces. Learn about the convergence of sequences and series in Hilbert spaces.
Operator Norms: Study operator norms and their properties, such as submultiplicativity. Understand bounded linear operators and their relationship to continuity.
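Submultiplicativity (||AB|| ≤ ||A|| ||B||) can be checked numerically for the operator 2-norm, which equals the largest singular value. The matrices below are arbitrary illustrative choices:

```python
import numpy as np

A = np.array([[3.0, 0.0], [0.0, 2.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])

def opnorm(M):
    # Operator 2-norm = largest singular value of M.
    return np.linalg.norm(M, 2)

lhs = opnorm(A @ B)            # ||AB||
rhs = opnorm(A) * opnorm(B)    # ||A|| * ||B||  (submultiplicativity: lhs <= rhs)
```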
Adjoint of a Linear Transformation: Learn about the adjoint of a linear transformation (represented by the transpose of the matrix in the real case, and the conjugate transpose, or Hermitian conjugate, in the complex case) and its properties.
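The defining property of the adjoint, ⟨Au, v⟩ = ⟨u, Aᵀv⟩ in the real case, can be verified on arbitrary sample vectors:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])          # a map from R^2 to R^3
u = np.array([1.0, -1.0])           # in R^2
v = np.array([2.0, 0.0, 1.0])       # in R^3

# Defining property of the adjoint (real case): <A u, v> = <u, A^T v>
lhs = np.dot(A @ u, v)
rhs = np.dot(u, A.T @ v)
```

Note the two inner products live in different spaces: the left side in R³, the right side in R², yet the numbers agree.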
Fundamental Subspaces: Explore the four fundamental subspaces of a linear transformation: the range (column space), the nullspace, the range of the adjoint (row space), and the nullspace of the adjoint (left nullspace).
Pseudoinverses: Understand the concept of pseudoinverses and their applications in solving least-squares problems and finding approximate solutions to systems of linear equations.
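A small sketch of a pseudoinverse solving a least-squares problem: the overdetermined system below (three equations, two unknowns, made-up data) has no exact solution, and the pseudoinverse picks the x minimizing ||Ax − b||:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

A_pinv = np.linalg.pinv(A)   # Moore-Penrose pseudoinverse
x = A_pinv @ b               # least-squares solution: minimizes ||A x - b||
residual = A @ x - b         # generally nonzero, but of minimal norm
```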
Rank-Nullity Theorem: Study the Rank-Nullity Theorem and its implications in understanding the dimensions of the fundamental subspaces of a matrix.
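The theorem (rank + nullity = number of columns) can be checked on a concrete rank-deficient matrix, using the SVD to extract a nullspace basis:

```python
import numpy as np

# A 3x4 matrix whose third row is the sum of the first two, so the rank is 2.
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [1.0, 1.0, 3.0, 4.0]])

n = A.shape[1]                           # dimension of the domain (number of columns)

U, s, Vh = np.linalg.svd(A)
rank = int(np.count_nonzero(s > 1e-10))  # rank = number of nonzero singular values
null_basis = Vh[rank:]                   # remaining rows of Vh span the nullspace
nullity = null_basis.shape[0]            # dimension of the nullspace
# Rank-Nullity: rank + nullity == n
```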
Singular Value Decomposition (SVD): Learn about SVD, a powerful technique to decompose a matrix into orthogonal and diagonal components, used in various applications, including data compression and dimensionality reduction.
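A minimal NumPy sketch of the SVD, including the rank-1 truncation that underlies the compression and dimensionality-reduction applications mentioned above:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

# Thin SVD: A = U * diag(s) * Vh, with orthonormal columns in U and rows in Vh.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
A_rebuilt = U @ np.diag(s) @ Vh

# Keeping only the largest singular value gives the best rank-1 approximation
# of A in the 2-norm (the idea behind SVD-based compression).
A_rank1 = s[0] * np.outer(U[:, 0], Vh[0, :])
```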
Least Squares Approximation: Explore the concept of least squares approximation and its applications in solving over-determined systems of linear equations.
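As a closing sketch, fitting a line y = c + m·x to three made-up data points is a classic overdetermined system, solved in one call with NumPy's least-squares routine:

```python
import numpy as np

# Three data points (illustrative values), to be fit by y = c + m*x.
x_data = np.array([0.0, 1.0, 2.0])
y_data = np.array([1.0, 2.0, 4.0])

# Design matrix: one column of ones (intercept c), one column of x (slope m).
A = np.column_stack([np.ones_like(x_data), x_data])

coeffs, residuals, rank, sv = np.linalg.lstsq(A, y_data, rcond=None)
c, m = coeffs   # intercept and slope minimizing sum of squared errors
```

Internally this is closely related to the pseudoinverse: `np.linalg.pinv(A) @ y_data` yields the same coefficients.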