Lemma For a set of vectors ` u_1, u_2,\dots , u_n` in vector space `V`, if `v \in sp \{ u_1, u_2,\dots u_n \} `, then
`` sp \{ u_1, u_2,\dots u_n, v \} = sp \{ u_1, u_2,\dots u_n \} ``
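The lemma can be checked numerically: appending a vector that already lies in the span leaves the column space, and hence the rank, unchanged. A minimal sketch with NumPy, using illustrative vectors in `R^3` (not from the notes):

```python
import numpy as np

# Illustrative vectors: u1, u2 span the xy-plane in R^3.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v  = np.array([2.0, 3.0, 0.0])   # v = 2*u1 + 3*u2, so v is in sp{u1, u2}

A = np.column_stack([u1, u2])       # spanning set as matrix columns
B = np.column_stack([u1, u2, v])    # same set with v appended

# Since v is in the span, the rank (dimension of the column space)
# does not grow, so sp{u1, u2, v} = sp{u1, u2}.
print(np.linalg.matrix_rank(A))  # 2
print(np.linalg.matrix_rank(B))  # 2
```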
This leads to the concept of a basis, a spanning set containing the fewest possible elements.
Definition of a Basis
Let `u_1, u_2,\dots u_n` be vectors in a vector space `V`. We say that `\{ u_1, u_2,\dots u_n \}` is a basis for `V` iff `\{ u_1, u_2,\dots , u_n \}` is a spanning set for `V` and is linearly independent.
Definition of Linear Independence
Let `V` be a vector space over a field `F`. Let `u_1, u_2,\dots u_n` be vectors in `V`, and let `\alpha _1, \alpha _2,\dots \alpha _n` be scalars in `F`. We say that `\{ u_1, u_2,\dots , u_n \}` is linearly independent iff ``\alpha _1 u_1+\alpha _2u_2+\cdots + \alpha _nu_n = 0`` only when ``\alpha _1 = \alpha _2 = \dots = \alpha _n =0.``
- Conversely, a set is linearly dependent iff the equation ``\alpha _1 u_1+\alpha _2u_2+\cdots + \alpha _nu_n = 0`` has a nontrivial solution, i.e. one in which not all of the scalars `\alpha _i` are zero.
Intuitively, linearly independent vectors represent different directions in a vector space, such as the x-axis and the y-axis, whose only common element is `0`.
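The definition translates directly into a rank test: the vectors are independent exactly when the matrix with them as columns has rank equal to the number of vectors, so the only solution of the homogeneous system is the trivial one. A sketch with NumPy, using example vectors of my own choosing:

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff a1*u1 + ... + an*un = 0 forces all ai = 0,
    i.e. the column matrix has full column rank."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

indep = is_linearly_independent([np.array([1., 0.]), np.array([0., 1.])])
dep   = is_linearly_independent([np.array([1., 2.]), np.array([2., 4.])])
print(indep)  # True: the standard basis directions of R^2
print(dep)    # False: the second vector is 2 times the first
```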
Theorem A set of vectors `\{ u_1, u_2,\dots u_n \}`, where `n \geq 2,` is linearly dependent iff at least one of the vectors can be written as a linear combination of the remaining `n-1` vectors.
Lemma If a set of vectors `\{ u_1, u_2,\dots u_n \}` is linearly dependent and `u_1 \neq 0`, then there exists an integer `k`, with `2 \leq k \leq n`, such that `u_k` is a linear combination of ` u_1, u_2,\dots u_{k-1}`.
Theorem A set of vectors `\{ u_1, u_2,\dots u_n \}` is linearly independent iff each vector in `sp \{ u_1, u_2,\dots u_n \}` can be written uniquely as a linear combination of `u_1, u_2,\dots u_n`.
Theorem A set of vectors `\{ u_1, u_2,\dots u_n \}` is a basis for `V` iff each vector `v \in V` can be written uniquely as a linear combination of `u_1, u_2,\dots u_n`.
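Uniqueness of the representation means that, for a basis of `R^n` arranged as the columns of a square matrix `B`, the coordinate vector of any `v` is the unique solution of `B \alpha = v`. A sketch with an assumed basis of `R^2`:

```python
import numpy as np

# An assumed (non-standard) basis for R^2.
u1 = np.array([1.0,  1.0])
u2 = np.array([1.0, -1.0])
B = np.column_stack([u1, u2])

v = np.array([3.0, 1.0])
# B is square and invertible because {u1, u2} is a basis, so the
# coordinates alpha are the unique solution of B @ alpha = v.
alpha = np.linalg.solve(B, v)
print(alpha)       # [2. 1.], i.e. v = 2*u1 + 1*u2
print(B @ alpha)   # recovers v
```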
Dimensions of a Vector Space
Definition of Finite-Dimensionality
Let `V` be a vector space over a field `F`. We say that `V` is finite-dimensional if
- `V` has a basis or
- `V` is a trivial vector space `\{ 0 \}`.
If `V` is not finite-dimensional, then it is called infinite-dimensional.
Corollary Every nontrivial subspace of a finite-dimensional vector space has a basis.
Theorem Let `V` be a nontrivial vector space over a field `F`, and suppose `\{ u_1, u_2,\dots , u_m \}` spans `V`. Then a subset of the spanning set is a basis for `V`.
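The theorem's subset can be found greedily: walk through the spanning set and keep each vector that raises the rank; the survivors are independent with the same span. A sketch of this procedure (my own illustrative implementation, not from the notes):

```python
import numpy as np

def basis_from_spanning_set(vectors):
    """Keep each vector that increases the rank; the kept vectors
    form an independent subset with the same span, i.e. a basis
    for the spanned space."""
    basis, rank = [], 0
    for v in vectors:
        r = np.linalg.matrix_rank(np.column_stack(basis + [v]))
        if r > rank:          # v adds a new direction; keep it
            basis.append(v)
            rank = r
    return basis

# (1,0) and (2,0) are dependent, so only one of them survives.
span_set = [np.array([1., 0.]), np.array([2., 0.]), np.array([0., 1.])]
sub = basis_from_spanning_set(span_set)
print(len(sub))  # 2: the subset is a basis for R^2
```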
Theorem Let `V` be a finite-dimensional vector space over a field `F`, and suppose `\{ u_1, u_2,\dots , u_k \}` is linearly independent. If the linearly independent set of vectors does not span `V`, then there exist vectors ` u_{k+1}, u_{k+2},\dots , u_n ` such that `` \{ u_1, u_2,\dots , u_k, u_{k+1}, \dots , u_n \} `` is a basis for `V`.
Theorem Let `V` be a finite-dimensional vector space over a field `F`, and let `\{ u_1, u_2,\dots u_m \}` be a basis for `V`. If `v_1, v_2,\dots v_n` are any `n` vectors in `V`, with `n > m`, then the set ` \{ v_1, v_2,\dots v_n \}` is linearly dependent.
Corollary Let `\{ u_1, u_2,\dots u_m \}` and `\{ v_1, v_2,\dots v_n \}` be two bases for `V`. Then `n = m`.
Definition of Dimensions of a Vector Space
Let `V` be a finite-dimensional vector space.
- If `V` is a trivial vector space, then we say that the dimension of `V` is zero.
- Otherwise, the dimension of `V` is the number of vectors in any basis for `V` (well defined by the preceding corollary).
Example The dimension of `P_n` is `n+1` since ` \{ 1, x, x^2, \dots , x^n \} ` is a basis.
Theorem Let `V` be an `n`-dimensional vector space over a field `F`, and let ` u_1, u_2,\dots u_n` be vectors in `V`.
- If `\{ u_1, u_2,\dots u_n \}` spans `V`, then it is linearly independent and hence is a basis for `V`.
- If `\{ u_1, u_2,\dots u_n \}` is linearly independent, then it spans `V` and hence is a basis for `V`.
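For `n` vectors in `R^n` this theorem says spanning and independence coincide, and both are equivalent to the square matrix of column vectors being invertible. A quick check with assumed vectors in `R^3`:

```python
import numpy as np

# Three assumed vectors in R^3 (n = 3).
vs = [np.array([1., 0., 0.]),
      np.array([1., 1., 0.]),
      np.array([1., 1., 1.])]
M = np.column_stack(vs)

# Full rank <=> independent <=> spanning <=> basis (square case).
full_rank = np.linalg.matrix_rank(M) == 3
nonzero_det = abs(np.linalg.det(M)) > 1e-12
print(full_rank)    # True: independent, hence also spanning
print(nonzero_det)  # True: equivalently, nonzero determinant
```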
In short, a basis is a minimal spanning set for a vector space.