Linear independence

A vector space can be of finite dimension or infinite dimension depending on the maximum number of linearly independent vectors it contains.[1]

The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space is linearly dependent are central to determining the dimension of a vector space.

A sequence of vectors v1, v2, ..., vk from a vector space V is said to be linearly dependent if there exist scalars a1, a2, ..., ak, not all zero, such that

    a1 v1 + a2 v2 + ... + ak vk = 0,

where 0 denotes the zero vector. Otherwise, the sequence is said to be linearly independent.

In other words, a sequence of vectors is linearly independent if the only representation of the zero vector as a linear combination of its vectors is the trivial representation, in which all the scalars ai are zero.

Even more concisely, a sequence of vectors is linearly independent if and only if the zero vector can be represented as a linear combination of its vectors in a unique way.[2]

This allows defining linear independence for a finite set of vectors: A finite set of vectors is linearly independent if the sequence obtained by ordering them is linearly independent.

An infinite set of vectors is linearly independent if every nonempty finite subset is linearly independent.

Conversely, an infinite set of vectors is linearly dependent if it contains a finite subset that is linearly dependent, or equivalently, if some vector in the set is a linear combination of other vectors in the set.

For example, the vector space of all polynomials in x over the reals has the (infinite) subset {1, x, x^2, ...} as a basis.

Consider a person describing the location of a certain place who says, "It is 3 miles north and 4 miles east of here." This is sufficient information to describe the location, because the geographic coordinate system may be considered as a 2-dimensional vector space (ignoring altitude and the curvature of the Earth's surface).

The person might add, "The place is 5 miles northeast of here." This last statement is true, but it is not necessary to find the place: the "5 miles northeast" direction is a linear combination of the north and east directions, so adding it makes the set of vectors linearly dependent.

Also note that if altitude is not ignored, it becomes necessary to add a third vector to the linearly independent set.

In general, n linearly independent vectors are required to describe all locations in n-dimensional space.

Consider the three vectors v1 = (1, 1), v2 = (-3, 2) and v3 = (2, 4) in R^2; then the condition for linear dependence seeks a set of non-zero scalars a1, a2, a3 such that

    a1 (1, 1) + a2 (-3, 2) + a3 (2, 4) = (0, 0),

or

    [ 1  -3  2 ] [a1]   [0]
    [ 1   2  4 ] [a2] = [0]
                 [a3]

Row reduce this matrix equation by subtracting the first row from the second to obtain

    [ 1  -3  2 ] [a1]   [0]
    [ 0   5  2 ] [a2] = [0]
                 [a3]

Continue the row reduction by (i) dividing the second row by 5, and then (ii) multiplying it by 3 and adding it to the first row, that is

    [ 1  0  16/5 ] [a1]   [0]
    [ 0  1   2/5 ] [a2] = [0]
                   [a3]

Rearranging this equation allows us to obtain

    a1 = -(16/5) a3  and  a2 = -(2/5) a3,

which shows that non-zero ai exist such that a1 v1 + a2 v2 + a3 v3 = 0; for example, taking a3 = -5 gives 16 v1 + 2 v2 - 5 v3 = 0, that is, v3 = (16/5) v1 + (2/5) v2. The three vectors are therefore linearly dependent.
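This row-reduction example (with v1 = (1, 1), v2 = (-3, 2), v3 = (2, 4)) can be checked numerically; a minimal sketch using numpy, where rank less than the number of columns signals dependence:

```python
import numpy as np

# Columns are v1 = (1, 1), v2 = (-3, 2), v3 = (2, 4).
A = np.array([[1, -3, 2],
              [1,  2, 4]])

# Rank 2 with 3 columns means the columns are linearly dependent.
print(np.linalg.matrix_rank(A))  # 2

# The dependence found by row reduction: v3 = (16/5) v1 + (2/5) v2.
v1, v2, v3 = A.T
print(np.allclose(16/5 * v1 + 2/5 * v2, v3))  # True
```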

Now consider the linear dependence of the two vectors v1 = (1, 1) and v2 = (-3, 2) alone, and check

    a1 (1, 1) + a2 (-3, 2) = (0, 0),

or

    [ 1  -3 ] [a1]   [0]
    [ 1   2 ] [a2] = [0]

The same row reduction presented above yields

    [ 1  0 ] [a1]   [0]
    [ 0  1 ] [a2] = [0]

This shows that a1 = a2 = 0, so the vectors v1 = (1, 1) and v2 = (-3, 2) are linearly independent.

To determine whether the three vectors in R^4

    v1 = (1, 4, 2, -3),  v2 = (7, 10, -4, -1),  v3 = (-2, 1, 5, -4)

are linearly dependent, form the matrix equation

    [  1   7  -2 ] [a1]   [0]
    [  4  10   1 ] [a2] = [0]
    [  2  -4   5 ] [a3]   [0]
    [ -3  -1  -4 ]        [0]

Row reduce this equation to obtain

    [ 1  0   3/2 ] [a1]   [0]
    [ 0  1  -1/2 ] [a2] = [0]
    [ 0  0    0  ] [a3]   [0]
    [ 0  0    0  ]        [0]

Rearrange to solve for the ai and obtain

    a1 = -(3/2) a3  and  a2 = (1/2) a3,

where a3 can be chosen arbitrarily. Taking a3 = 2 gives the non-zero solution a1 = -3, a2 = 1, a3 = 2, so -3 v1 + v2 + 2 v3 = 0 and hence v3 = (3/2) v1 - (1/2) v2. The three vectors are therefore linearly dependent.
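For the R^4 example (v1 = (1, 4, 2, -3), v2 = (7, 10, -4, -1), v3 = (-2, 1, 5, -4)), the dependence coefficients are exactly the null space of the coefficient matrix; a sketch using sympy:

```python
from sympy import Matrix

# Columns are v1, v2, v3 from the R^4 example.
A = Matrix([[ 1,  7, -2],
            [ 4, 10,  1],
            [ 2, -4,  5],
            [-3, -1, -4]])

# Null-space vectors are the dependence coefficients (a1, a2, a3).
dependence = A.nullspace()[0]
print(dependence.T)    # proportional to (-3/2, 1/2, 1)
print(A * dependence)  # the zero vector in R^4
```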

An alternative method relies on the fact that n vectors in R^n are linearly independent if and only if the determinant of the matrix formed by taking the vectors as its columns is non-zero.

In this case, the matrix formed by the vectors v1 = (1, 1) and v2 = (-3, 2) is

    A = [ 1  -3 ]
        [ 1   2 ]

We may write a linear combination of the columns as

    AΛ = [ 1  -3 ] [λ1]
         [ 1   2 ] [λ2]

We are interested in whether AΛ = 0 for some nonzero vector Λ. This happens if and only if det A = 0; here

    det A = (1)(2) - (-3)(1) = 5 ≠ 0,

so the vectors (1, 1) and (-3, 2) are linearly independent.

When there are more coordinates than vectors (m vectors in R^n with m < n), the matrix formed from the vectors is not square and its determinant is not defined. Nevertheless, one can still determine whether the m vectors are linearly dependent by testing whether the determinant of the m × m matrix formed from m of the n rows is zero, for all possible lists of m rows. (If m > n, the vectors are necessarily linearly dependent.)

This fact is valuable for theory; in practical calculations more efficient methods are available.
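The 2 × 2 determinant check above can be sketched numerically with numpy; the matrix has columns v1 = (1, 1) and v2 = (-3, 2):

```python
import numpy as np

# Columns are v1 = (1, 1) and v2 = (-3, 2).
A = np.array([[1, -3],
              [1,  2]])

# det A = (1)(2) - (-3)(1) = 5; non-zero, so the columns are
# linearly independent.
print(np.linalg.det(A))
```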

Let V be the vector space of all differentiable functions of a real variable t. Then the functions e^t and e^(2t) in V are linearly independent.

Suppose a and b are two real numbers such that

    a e^t + b e^(2t) = 0

for all values of t. We need to show that a = 0 and b = 0. Take the first derivative of the above equation:

    a e^t + 2b e^(2t) = 0

for all values of t. To eliminate a, subtract the first equation from the second, giving

    b e^(2t) = 0

for all values of t. Since e^(2t) is never zero, it follows that b = 0. The first equation then reduces to a e^t = 0, and hence a = 0 as well.
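For the functions e^t and e^(2t), a standard computational check (different from the derivative argument above) is the Wronskian determinant: if it does not vanish, the functions are linearly independent. A sketch using sympy:

```python
from sympy import exp, simplify, symbols, wronskian

t = symbols('t')

# Wronskian of e^t and e^(2t): det [[e^t, e^(2t)], [e^t, 2 e^(2t)]].
# A non-vanishing Wronskian implies linear independence
# (the converse does not hold in general).
W = simplify(wronskian([exp(t), exp(2 * t)], t))
print(W)  # exp(3*t), which is never zero
```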

A linear dependence among vectors v1, ..., vn is a tuple (a1, ..., an) of scalars such that a1 v1 + ... + an vn = 0. The linear dependencies among v1, ..., vn form a vector space.

If the vectors are expressed by their coordinates, then the linear dependencies are the solutions of a homogeneous system of linear equations, with the coordinates of the vectors as coefficients.

A basis of the vector space of linear dependencies can therefore be computed by Gaussian elimination.
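Concretely, such a basis can be read off from the null space of the matrix whose columns are the coordinate vectors; a small sketch using sympy (the four vectors here are hypothetical, chosen for illustration):

```python
from sympy import Matrix

# Hypothetical vectors in R^2, written as the columns of a matrix:
# v1 = (1, 0), v2 = (0, 1), v3 = (1, 1), v4 = (2, 3).
A = Matrix([[1, 0, 1, 2],
            [0, 1, 1, 3]])

# Each null-space basis vector (a1, a2, a3, a4) is a linear dependence:
# a1*v1 + a2*v2 + a3*v3 + a4*v4 = 0. Internally this is Gaussian
# elimination on the homogeneous system A a = 0.
for dep in A.nullspace():
    print(dep.T)
```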

A set of vectors is said to be affinely dependent if at least one of the vectors in the set can be defined as an affine combination of the others.

Any affine combination is a linear combination; therefore every affinely dependent set is linearly dependent.
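A standard equivalent criterion (not stated above) is that points are affinely dependent exactly when the vectors obtained by appending a trailing 1 to each point are linearly dependent. A minimal sketch, with `affinely_dependent` a hypothetical helper name:

```python
import numpy as np

def affinely_dependent(points):
    """Points are affinely dependent iff the points augmented with a
    trailing 1 are linearly dependent as vectors (standard criterion)."""
    V = np.array([list(p) + [1] for p in points]).T  # augmented columns
    return np.linalg.matrix_rank(V) < len(points)

# Three collinear points in the plane are affinely dependent:
print(affinely_dependent([(0, 0), (1, 1), (2, 2)]))  # True
# Three points in general position are affinely independent:
print(affinely_dependent([(0, 0), (1, 0), (0, 1)]))  # False
```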

[Figure: Linearly independent vectors in R^3.]
[Figure: Linearly dependent vectors in a plane in R^3.]