In linear algebra, the column space (also called the range or image) of a matrix A is the span (set of all possible linear combinations) of its column vectors.
The dimension of the column space is called the rank of the matrix and, for an m × n matrix, is at most min(m, n).[2] This article considers matrices of real numbers.
If A is an m × n matrix with column vectors v1, ..., vn, then the set of all possible linear combinations of v1, ..., vn is called the column space of A.
That is, the column space of A is the span of the vectors v1, ..., vn.
For example, if A is the 3 × 2 matrix with columns v1 = (1, 0, 2) and v2 = (0, 1, 0), then the column space of A consists of all vectors of the form c1v1 + c2v2. In this case, the column space is precisely the set of vectors (x, y, z) ∈ R3 satisfying the equation z = 2x (using Cartesian coordinates, this set is a plane through the origin in three-dimensional space).
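A minimal sketch of this example, assuming NumPy and the columns v1 = (1, 0, 2) and v2 = (0, 1, 0) given above: every linear combination c1v1 + c2v2 has the form (c1, c2, 2c1), so it satisfies z = 2x.

    import numpy as np

    # Columns of the example matrix A.
    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, 0.0])

    # Sample a few linear combinations c1*v1 + c2*v2 and check that each
    # lies on the plane z = 2x, i.e. belongs to the column space of A.
    rng = np.random.default_rng(0)
    for _ in range(5):
        c1, c2 = rng.normal(size=2)
        w = c1 * v1 + c2 * v2
        assert np.isclose(w[2], 2 * w[0])   # z = 2x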
Fortunately, elementary row operations do not affect the dependence relations between the column vectors.
This makes it possible to use row reduction to find a basis for the column space.
For example, consider the matrix

    A = [ 1  3  1  4 ]
        [ 2  7  3  9 ]
        [ 1  5  3  1 ]
        [ 1  2  0  8 ]

The columns of this matrix span the column space, but they may not be linearly independent, in which case some subset of them will form a basis. To find this basis, we reduce A to reduced row echelon form:

    [ 1  0  -2  0 ]
    [ 0  1   1  0 ]
    [ 0  0   0  1 ]
    [ 0  0   0  0 ]

At this point, it is clear that the first, second, and fourth columns are linearly independent, while the third column is a linear combination of the first two (specifically, v3 = −2v1 + v2). The first, second, and fourth columns of the original matrix therefore form a basis of the column space.
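A short sketch of this computation, assuming SymPy and the example matrix above: rref() returns the reduced row echelon form together with the indices of the pivot columns, and the corresponding columns of the original matrix form a basis of the column space.

    from sympy import Matrix

    A = Matrix([
        [1, 3, 1, 4],
        [2, 7, 3, 9],
        [1, 5, 3, 1],
        [1, 2, 0, 8],
    ])

    # Reduced row echelon form and the indices of the pivot columns.
    R, pivot_cols = A.rref()
    print(pivot_cols)                       # (0, 1, 3): columns 1, 2 and 4

    # The pivot columns of the original matrix are a basis of the column space.
    basis = [A.col(j) for j in pivot_cols]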
Because the pivot columns can already be identified at that stage, it is not necessary to reduce all the way to reduced row echelon form: reducing only to row echelon form is enough to determine which columns are linearly independent.
To find the basis in a practical setting (e.g., for large matrices), the singular-value decomposition is typically used.
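A sketch of that numerical route, assuming NumPy and reusing the example matrix: the left singular vectors whose singular values exceed a tolerance give an orthonormal basis of the column space, and their number is the numerical rank (the tolerance below is a common heuristic, not something prescribed by the text).

    import numpy as np

    A = np.array([
        [1, 3, 1, 4],
        [2, 7, 3, 9],
        [1, 5, 3, 1],
        [1, 2, 0, 8],
    ], dtype=float)

    # Singular value decomposition A = U @ diag(s) @ Vt.
    U, s, Vt = np.linalg.svd(A)

    # Singular values above the tolerance determine the numerical rank.
    tol = max(A.shape) * np.finfo(float).eps * s.max()
    rank = int(np.sum(s > tol))             # 3 for this example

    # The first `rank` left singular vectors form an orthonormal basis
    # of the column space of A.
    column_space_basis = U[:, :rank]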
The dimension of the column space is called the rank of the matrix.
The rank is equal to the number of pivots in the reduced row echelon form, and is the maximum number of linearly independent columns that can be chosen from the matrix.
The nullity of a matrix is the dimension of the null space, and is equal to the number of columns in the reduced row echelon form that do not have pivots.
[7] The rank and nullity of a matrix A with n columns are related by the equation rank(A) + nullity(A) = n. This is known as the rank–nullity theorem.
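A quick check of the theorem, assuming SymPy and the example matrix from the column space discussion:

    from sympy import Matrix

    A = Matrix([
        [1, 3, 1, 4],
        [2, 7, 3, 9],
        [1, 5, 3, 1],
        [1, 2, 0, 8],
    ])

    rank = A.rank()                 # number of pivots (3 here)
    nullity = len(A.nullspace())    # number of non-pivot columns (1 here)

    # rank(A) + nullity(A) = n, the number of columns of A.
    assert rank + nullity == A.cols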
The left null space of A is the set of all vectors x such that xTA = 0T.
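A small sketch, assuming SymPy and the same example matrix: since xTA = 0T exactly when ATx = 0, the left null space of A is the null space of AT.

    from sympy import Matrix

    A = Matrix([
        [1, 3, 1, 4],
        [2, 7, 3, 9],
        [1, 5, 3, 1],
        [1, 2, 0, 8],
    ])

    # x^T A = 0^T is equivalent to A^T x = 0, so compute the null space of A^T.
    left_null_basis = A.T.nullspace()
    for x in left_null_basis:
        assert all(entry == 0 for entry in x.T * A)   # x^T A = 0^T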
Similarly, the column space (sometimes disambiguated as right column space) can be defined for matrices over a ring K as the set of all sums v1c1 + ... + vncn for scalars c1, ..., cn in K, with the vector m-space replaced by a "right free module"; this changes the order of scalar multiplication, so that each vector vk is multiplied on the right by the scalar ck, written in the unusual vector–scalar order.
If A is an m × n matrix with row vectors r1, ..., rm, then the set of all possible linear combinations of r1, ..., rm is called the row space of A.
That is, the row space of A is the span of the vectors r1, ..., rm.
For example, if A is the matrix with rows r1 = (1, 0, 2) and r2 = (0, 1, 0), then the row space consists of all vectors of the form c1r1 + c2r2. In this case, the row space is precisely the set of vectors (x, y, z) ∈ K3 satisfying the equation z = 2x (using Cartesian coordinates, this set is a plane through the origin in three-dimensional space).
For a matrix that represents a homogeneous system of linear equations, the row space consists of all linear equations that follow from those in the system.
For example, consider the matrix

    A = [ 1  3  2 ]
        [ 2  7  4 ]
        [ 1  5  2 ]

The rows of this matrix span the row space, but they may not be linearly independent, in which case the rows will not be a basis. To find a basis, we reduce A to row echelon form:

    [ 1  3  2 ]
    [ 0  1  0 ]
    [ 0  0  0 ]

Once the matrix is in echelon form, the nonzero rows are a basis for the row space; in this case, the basis is {(1, 3, 2), (0, 1, 0)}.[9] This algorithm can be used in general to find a basis for the span of a set of vectors: write the vectors as the rows of a matrix and reduce that matrix to echelon form.
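A sketch of the algorithm, assuming SymPy and using the rows of the example matrix above as the input vectors: stack the vectors as rows, row-reduce, and keep the nonzero rows.

    from sympy import Matrix

    # Vectors whose span we want a basis for (the rows of the example above).
    vectors = [(1, 3, 2), (2, 7, 4), (1, 5, 2)]

    # Stack the vectors as rows and row-reduce.
    M = Matrix(vectors)
    R, _ = M.rref()

    # The nonzero rows of the (reduced) echelon form are a basis for the span.
    basis = [R.row(i) for i in range(R.rows)
             if any(entry != 0 for entry in R.row(i))]
    print(basis)                    # two vectors: the span is 2-dimensional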
It is sometimes convenient to find a basis for the row space from among the rows of the original matrix instead (for example, this result is useful in giving an elementary proof that the determinantal rank of a matrix is equal to its rank).
Because the row space of A is equal to the column space of AT, such a basis can be found by applying the column-space method above to AT. Using the example matrix A above, find AT and reduce it to row echelon form:

    AT = [ 1  2  1 ]        [ 1  2  1 ]
         [ 3  7  5 ]   →    [ 0  1  2 ]
         [ 2  4  2 ]        [ 0  0  0 ]

The pivots indicate that the first two columns of AT form a basis of the column space of AT. Therefore, the first two rows of A (before any row reduction) form a basis of the row space of A.
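A sketch of this indirect method, assuming SymPy and the same 3 × 3 example matrix: the pivot columns of AT pick out rows of the original A.

    from sympy import Matrix

    A = Matrix([
        [1, 3, 2],
        [2, 7, 4],
        [1, 5, 2],
    ])

    # Row-reduce the transpose; its pivot columns correspond to rows of A.
    _, pivot_cols = A.T.rref()
    print(pivot_cols)                        # (0, 1): the first two rows

    # Those rows of the original, unreduced matrix are a basis of the row space.
    row_space_basis = [A.row(i) for i in pivot_cols]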
The dimension of the row space is called the rank of the matrix.
[9] The rank of a matrix is also equal to the dimension of the column space.
The dimension of the null space is called the nullity of the matrix, and is related to the rank by the equation rank(A) + nullity(A) = n, where n is the number of columns of the matrix A.
The null space of matrix A is the set of all vectors x for which Ax = 0.
The kernel of a linear transformation is analogous to the null space of a matrix.
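A minimal sketch, assuming SymPy and the 4 × 4 example matrix from earlier: nullspace() returns a basis of the set of solutions of Ax = 0.

    from sympy import Matrix

    A = Matrix([
        [1, 3, 1, 4],
        [2, 7, 3, 9],
        [1, 5, 3, 1],
        [1, 2, 0, 8],
    ])

    # A basis for the null space: all vectors x with A*x = 0.
    null_basis = A.nullspace()
    for x in null_basis:
        assert all(entry == 0 for entry in A * x)   # A x = 0

    print(len(null_basis))          # nullity of A (1 here)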