Vector Spaces: A Comprehensive College-Level Study Guide

Vector Spaces:

A vector space is a collection of objects, called vectors, that can be added together and multiplied by scalars in a well-defined way. The scalars are typically taken from a field, such as the real numbers or the complex numbers, and the operations of vector addition and scalar multiplication must satisfy a set of axioms known as the vector space axioms.

Properties of Vector Spaces:

  1. Closure under vector addition and scalar multiplication: The sum of two vectors in a vector space is again a vector in the space, and so is any scalar multiple of a vector in the space.

  2. Commutativity of vector addition: The order in which two vectors are added does not affect the result.

  3. Associativity of vector addition: The grouping of vectors in a sum does not affect the result.

  4. Existence of an additive identity: There exists a zero vector in the vector space such that adding it to any vector leaves the vector unchanged.

  5. Existence of additive inverses: For each vector in the vector space, there exists an additive inverse such that the sum of the vector and its inverse is the zero vector.

  6. Compatibility of scalar multiplication with field multiplication: Scalar multiplication must be compatible with multiplication in the underlying field, meaning that (cd)v = c(dv) for any scalars c and d and any vector v, and 1v = v for the multiplicative identity 1.

  7. Distributivity of scalar multiplication: Scalar multiplication distributes over both vector addition and scalar addition, meaning that c(u + v) = cu + cv and (c + d)v = cv + dv for any scalars c and d and any vectors u and v. (A numerical spot-check of these axioms appears after this list.)
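
The axioms above can be spot-checked numerically for concrete vectors. A minimal sketch in Python with NumPy, using arbitrarily chosen vectors and scalars in R^3 (an illustration of the axioms, not a proof):

    import numpy as np

    # Arbitrary vectors in R^3 and scalars from R, chosen only for illustration
    u = np.array([1.0, -2.0, 3.0])
    v = np.array([0.5, 4.0, -1.0])
    w = np.array([-2.0, 1.0, 0.5])
    c, d = 2.0, -3.0
    zero = np.zeros(3)

    assert np.allclose(u + v, v + u)                  # commutativity of vector addition
    assert np.allclose((u + v) + w, u + (v + w))      # associativity of vector addition
    assert np.allclose(u + zero, u)                   # additive identity
    assert np.allclose(u + (-u), zero)                # additive inverses
    assert np.allclose((c * d) * u, c * (d * u))      # compatibility with field multiplication
    assert np.allclose(1.0 * u, u)                    # multiplicative identity acts trivially
    assert np.allclose(c * (u + v), c * u + c * v)    # distributivity over vector addition
    assert np.allclose((c + d) * u, c * u + d * u)    # distributivity over scalar addition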

Subspaces:

A subspace of a vector space is a subset of the space that is itself a vector space under the same operations of vector addition and scalar multiplication. A subset is a subspace exactly when it is non-empty (equivalently, contains the zero vector), closed under vector addition, and closed under scalar multiplication.
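
As a concrete illustration, consider the plane W = {(x, y, z) in R^3 : x + y + z = 0} (a hypothetical example, not taken from the text above). The sketch below checks the three subspace conditions numerically for particular elements of W; the general algebraic argument runs the same way.

    import numpy as np

    def in_W(v, tol=1e-12):
        # Membership test for the example subset W = {(x, y, z) : x + y + z = 0}
        return abs(v.sum()) < tol

    u = np.array([1.0, 2.0, -3.0])   # 1 + 2 - 3 = 0, so u lies in W
    v = np.array([-4.0, 1.0, 3.0])   # -4 + 1 + 3 = 0, so v lies in W

    assert in_W(np.zeros(3))   # non-empty: W contains the zero vector
    assert in_W(u + v)         # closed under vector addition
    assert in_W(2.5 * u)       # closed under scalar multiplication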

Basis and Dimension:

A basis for a vector space is a linearly independent set of vectors whose span is the entire space, so that every vector in the space can be written in exactly one way as a linear combination of the basis vectors. The dimension of a vector space is the number of vectors in a basis for the space; every basis of a given space contains the same number of vectors.
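
For a finite-dimensional example, the coordinates of a vector with respect to a basis can be found by solving a linear system whose columns are the basis vectors. A minimal sketch, assuming the arbitrarily chosen basis B of R^3 below:

    import numpy as np

    # Columns of B form a basis of R^3 (an arbitrary example)
    B = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])
    v = np.array([2.0, 3.0, 5.0])

    coords = np.linalg.solve(B, v)       # coordinates of v with respect to the basis
    assert np.allclose(B @ coords, v)    # v equals the corresponding linear combination

    dim = np.linalg.matrix_rank(B)       # 3 = number of basis vectors = dimension of R^3
    print(coords, dim)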

Linear Independence and Span:

A set of vectors is linearly independent if no vector in the set can be represented as a linear combination of the others. The span of a set of vectors is the set of all linear combinations of the vectors. A basis for a vector space must be linearly independent and its span must equal the entire vector space.
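
A practical test for finitely many vectors in R^n (a sketch; the vectors below are made-up examples) is to stack them as columns of a matrix: the vectors are linearly independent exactly when the matrix rank equals the number of vectors, and in either case the rank is the dimension of their span.

    import numpy as np

    independent = np.column_stack([[1.0, 0.0, 2.0],
                                   [0.0, 1.0, 1.0]])
    dependent = np.column_stack([[1.0, 0.0, 2.0],
                                 [2.0, 0.0, 4.0]])   # second column = 2 * first column

    # Rank equals the number of columns iff the columns are linearly independent;
    # in both cases the rank is the dimension of the span of the columns.
    assert np.linalg.matrix_rank(independent) == independent.shape[1]
    assert np.linalg.matrix_rank(dependent) < dependent.shape[1]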

Linear Transformations:

A linear transformation is a function that maps one vector space to another while preserving the operations of vector addition and scalar multiplication. Concretely, a function T between vector spaces is a linear transformation if it satisfies the following two properties:

  1. Additivity: T(u + v) = T(u) + T(v) for all vectors u and v.

  2. Homogeneity: T(cu) = cT(u) for all scalars c and vectors u.
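
In the finite-dimensional case, every m x n matrix A defines a linear transformation T(x) = Ax from R^n to R^m. A minimal sketch, with an arbitrarily chosen matrix and vectors, that checks both properties numerically for one set of inputs:

    import numpy as np

    A = np.array([[2.0, 0.0, 1.0],
                  [1.0, -1.0, 3.0]])   # defines T : R^3 -> R^2 (arbitrary example)

    def T(x):
        return A @ x

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([-1.0, 0.5, 4.0])
    c = 2.5

    assert np.allclose(T(u + v), T(u) + T(v))   # additivity
    assert np.allclose(T(c * u), c * T(u))      # homogeneity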

Eigenvectors and Eigenvalues:

An eigenvector of a linear transformation T is a non-zero vector v that is mapped to a scalar multiple of itself by the transformation, that is, T(v) = λv. The scalar λ is called the eigenvalue associated with the eigenvector. The eigenvalues and eigenvectors of a linear transformation play a key role in understanding the properties and behavior of the transformation.
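
When the transformation is represented by a square matrix, its eigenvalues and eigenvectors can be computed numerically. A minimal sketch, assuming the arbitrarily chosen matrix A below represents the transformation:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])   # matrix of the transformation T(x) = A @ x (arbitrary example)

    eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the eigenvectors

    for i, lam in enumerate(eigenvalues):
        v = eigenvectors[:, i]
        assert np.allclose(A @ v, lam * v)   # each eigenvector maps to a scalar multiple of itself

    print(eigenvalues)   # for this A, the eigenvalues are 3 and 1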
