# Basis of a vector space

A basis of a vector space is a set of vectors of the space that (i) are linearly independent and (ii) span the vector space.

We have learnt that a given set of vectors spans or generates a certain vector space when every vector in the space can be expressed as a linear combination of the given set. Clearly, more than one set can span a given vector space; moreover, the spanning sets may contain different numbers of vectors.

Example |

The set {(1, 0), (0, 1)} spans the vector space of all two-dimensional real coordinate vectors; the set {(1, 2), (2, 1), (3, 3)} spans the same vector space. Every vector in the space can be expressed as a linear combination of either set, so each is a spanning set, even though the two sets differ in both content and number of elements.
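To make the example concrete, here is a small Python sketch (the helper `solve2` and the sample vector (5, 7) are illustrative choices, not from the text) that expresses one and the same vector in each of the two spanning sets:

```python
from fractions import Fraction

def solve2(u, v, target):
    """Express target as a*u + b*v for 2-D vectors u, v (Cramer's rule).
    Assumes u and v are linearly independent (det != 0)."""
    det = u[0] * v[1] - u[1] * v[0]
    a = Fraction(target[0] * v[1] - target[1] * v[0], det)
    b = Fraction(u[0] * target[1] - u[1] * target[0], det)
    return a, b

# (5, 7) in terms of the spanning set {(1, 0), (0, 1)} ...
a, b = solve2((1, 0), (0, 1), (5, 7))
print(a, b)  # 5 7
# ... and in terms of the first two vectors of {(1, 2), (2, 1), (3, 3)}
a, b = solve2((1, 2), (2, 1), (5, 7))
print(a, b)  # 3 1
```

Either way the vector is reproduced, which is exactly what "spanning" requires.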

There is, however, a distinction between two kinds of spanning sets: for a given vector space, some spanning sets are linearly independent and some are linearly dependent. The linearly independent spanning sets are very important in the study of linear algebra; such a set is called a basis of the vector space.

Example 01 |

The set {(1, 0, 0), (0, 1, 0), (0, 0, 1)} is a basis for the vector space of all coordinate vectors with three real components over the field of real numbers because the set is linearly independent and the set spans the space.

Example 02 |

The set {(0, 1), (1, 0), (1, 1)} is not a basis of the vector space of two dimensional real coordinate vectors over the field of real numbers, because the set is not linearly independent. The set does, however, span the space.
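The dependence in this example can be exhibited explicitly. A short Python check (the helper `lin_comb` is an illustrative name) confirms that a nontrivial combination of the three vectors gives the zero vector:

```python
def lin_comb(coeffs, vecs):
    """Return the linear combination sum(c * v) computed componentwise."""
    n = len(vecs[0])
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vecs)) for i in range(n))

# 1*(0, 1) + 1*(1, 0) + (-1)*(1, 1) = (0, 0): a nontrivial relation,
# so {(0, 1), (1, 0), (1, 1)} is linearly dependent.
print(lin_comb((1, 1, -1), ((0, 1), (1, 0), (1, 1))))  # (0, 0)
```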

Theorem |

**If {β_{1}, β_{2}, …, β_{n}} is a basis of a vector space V over a field F, then each vector in V can be expressed uniquely as a linear combination of β_{1}, β_{2}, …, β_{n}.**

**Proof:**

Since {β_{1}, β_{2}, …, β_{n}} spans the vector space V, every vector β ∈ V can be expressed as a linear combination of β_{1}, β_{2}, …, β_{n}.

Let us suppose that

β = a_{1}β_{1} + a_{2}β_{2} + … + a_{n}β_{n} and β = b_{1}β_{1} + b_{2}β_{2} + … + b_{n}β_{n}

where the a_{i}’s and b_{i}’s ∈ F, are two such expressions of β. The theorem will be proved if we can show that a_{i} = b_{i}, i = 1, 2, …, n.

We have β = a_{1}β_{1} + a_{2}β_{2} + … + a_{n}β_{n} = b_{1}β_{1} + b_{2}β_{2} + … + b_{n}β_{n}

⇒ (a_{1} – b_{1}) β_{1} + (a_{2} – b_{2}) β_{2} + … + (a_{n} – b_{n}) β_{n} = θ

Since β_{1}, β_{2}, …, β_{n} are linearly independent, it follows that a_{1} – b_{1} = 0, a_{2} – b_{2} = 0, …, a_{n} – b_{n} = 0

Therefore a_{i} = b_{i} , i = 1, 2, …, n.

Hence the theorem is proved.
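The theorem can be illustrated numerically. A brute-force Python sketch (the basis {(1, 2), (2, 1)} of R² and the target vector (4, 5) are illustrative choices) searches integer coefficient pairs in a small range and finds exactly one that reproduces β:

```python
# Uniqueness check for the basis {(1, 2), (2, 1)} of R^2:
# among all integer coefficient pairs in [-10, 10], exactly one
# reproduces beta = (4, 5), as the theorem predicts.
basis = ((1, 2), (2, 1))
beta = (4, 5)
hits = [(a, b)
        for a in range(-10, 11)
        for b in range(-10, 11)
        if (a * basis[0][0] + b * basis[1][0],
            a * basis[0][1] + b * basis[1][1]) == beta]
print(hits)  # [(2, 1)]
```

The search is only over integers, but the theorem guarantees uniqueness over all of F.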

Theorem (Replacement Theorem) |

**If {α_{1}, α_{2}, …, α_{n}} is a basis of a vector space V over a field F and a non-zero vector β of V is expressed as β = c_{1} α_{1} + c_{2} α_{2} + … + c_{n} α_{n}, c_{i} ∈ F, then if c_{j} ≠ 0, {α_{1}, α_{2}, …, α_{j-1}, β, α_{j+1}, …, α_{n}} is a new basis of V. [That is, β can replace α_{j} in the basis.]**

**Proof:**

β = c_{1} α_{1} + c_{2} α_{2} + … + c_{j-1} α_{j-1} + c_{j} α_{j} + c_{j+1} α_{j+1} + … + c_{n} α_{n}

⇒ c_{j} α_{j }= β – c_{1} α_{1} – c_{2} α_{2} – … – c_{j-1} α_{j-1} – c_{j+1} α_{j+1} – … – c_{n} α_{n} ……….(1)

Since c_{j} ≠ 0, c_{j}^{-1} exists in F.

Therefore multiplying both sides of (1) by c_{j}^{-1} we get

α_{j }= c_{j}^{-1} β – c_{j}^{-1} c_{1} α_{1} – c_{j}^{-1} c_{2} α_{2} – … – c_{j}^{-1} c_{j-1} α_{j-1} – c_{j}^{-1} c_{j+1} α_{j+1} – … – c_{j}^{-1} c_{n} α_{n}

⇒ α_{j }= p_{1} α_{1} + p_{2} α_{2} + … + p_{j-1} α_{j-1} + c_{j}^{-1} β + p_{j+1} α_{j+1} + … + p_{n} α_{n}

where p_{i} = – c_{j}^{-1} c_{i} for i ≠ j, and the coefficient of β is p_{j} = c_{j}^{-1}.

We first prove that {α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } is linearly independent.

Let us suppose that

d_{1} α_{1} + d_{2} α_{2} + … + d_{j-1} α_{j-1} + d_{j} β + d_{j+1} α_{j+1} + … + d_{n} α_{n} = θ

⇒ d_{1} α_{1} + d_{2} α_{2} + … + d_{j-1} α_{j-1} + d_{j} (c_{1} α_{1} + c_{2} α_{2} + … + c_{j-1} α_{j-1} + c_{j} α_{j} + c_{j+1} α_{j+1} + … + c_{n} α_{n}) + d_{j+1} α_{j+1} + … + d_{n} α_{n} = θ

⇒ (d_{1} + d_{j} c_{1}) α_{1} + (d_{2} + d_{j} c_{2}) α_{2} + … +(d_{j-1} + d_{j} c_{j-1}) α_{j-1} + d_{j}c_{j}α_{j} + (d_{j+1} + d_{j} c_{j+1}) α_{j+1} + … +(d_{n} + d_{j} c_{n}) α_{n} =θ

Since {α_{1}, α_{2}, …, α_{ n}} is linearly independent, we have

d_{1} + d_{j} c_{1} = 0, d_{2} + d_{j} c_{2} = 0,…, d_{j-1} + d_{j} c_{j-1} = 0, d_{j}c_{j} = 0, d_{j+1} + d_{j} c_{j+1} = 0, …, d_{n} + d_{j} c_{n} = 0

Since d_{j}c_{j} = 0, but c_{j} ≠ 0 we have d_{j} = 0, hence it follows that d_{1} = d_{2} = … = d_{n} = 0

This proves that, {α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } is linearly independent set of vectors.

We now prove that L{α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } = V

Let δ be any arbitrary vector in V. Since {α_{1}, α_{2}, …, α_{n}} is a basis of V, there exist k_{1}, k_{2}, …, k_{n} ∈ F such that

δ = k_{1} α_{1} + k_{2} α_{2} + … + k_{n} α_{n}

⇒ δ = k_{1} α_{1} + k_{2} α_{2} + … + k_{j-1} α_{j-1} + k_{j} (p_{1} α_{1} + p_{2} α_{2} + … + p_{j-1} α_{j-1} + p_{j} β + p_{j+1} α_{j+1} + … + p_{n} α_{n}) + k_{j+1} α_{j+1} + … + k_{n} α_{n}

⇒ δ = s_{1} α_{1} + s_{2} α_{2} + … + s_{j-1} α_{j-1} + s_{j} β + s_{j+1} α_{j+1} + … + s_{n} α_{n}

where s_{i} = k_{i} + k_{j}p_{i} for i ≠ j, and s_{j} = k_{j}p_{j}.

Therefore δ ∈ L{α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} }

i.e., V ⊆ L{α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } ……….(2)

But L{α_{1}, α_{2}, …, α_{j-1}, β, α_{j+1}, …, α_{n}} is the smallest subspace containing the set {α_{1}, α_{2}, …, α_{j-1}, β, α_{j+1}, …, α_{n}}, and V is a subspace containing that set.

Therefore, L{α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } ⊆ V ………(3)

Combining (2) and (3)

V = L{α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} }

Hence {α_{1}, α_{ 2}, …, α_{ j-1}, β, α_{j+1}, …, α_{ n} } is a basis of V.
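A concrete instance of the theorem can be checked in Python (the specific vectors and the helper `det3` are illustrative choices): start from the standard basis of R³, take a β whose coefficient c₂ is non-zero, replace α₂ by β, and verify that the resulting set is still independent (non-zero determinant):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a tuple of rows."""
    a, b, c = m
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)
beta = (2, 3, 0)           # beta = 2*e1 + 3*e2, so c_2 = 3 != 0
replaced = (e1, beta, e3)  # replace e2 (the j = 2 slot) by beta
print(det3(replaced))      # 3 -- non-zero, so the new set is again a basis
```

Had we tried to replace e₃ instead (where c₃ = 0), the determinant would have been 0, which is why the theorem requires c_j ≠ 0.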

Example 03 |

**Prove that the set S = {(1, 0, 1), (0, 1, 1), (1, 1, 0)} is a basis of R^{3}.**

**Solution:**

Let α_{1 }= (1, 0, 1), α_{2 }= (0, 1, 1), α_{3} = (1, 1, 0)

We have to show that α_{1}, α_{2}, α_{3} are linearly independent and that L(S) = R^{3}.

To show that α_{1}, α_{2}, α_{3} are linearly independent, we suppose that there exist c_{1}, c_{2}, c_{3} ∈ R such that

c_{1} α_{1} + c_{2} α_{2} + c_{3} α_{3} = θ

⇒ c_{1} (1, 0, 1) + c_{2} (0, 1, 1) + c_{3} (1, 1, 0) = (0, 0, 0)

⇒ c_{1} + c_{3} = 0, c_{2} + c_{3} = 0, c_{1} + c_{2} = 0

⇒ c_{1} = c_{2} = c_{3} = 0

Hence α_{1}, α_{2}, α_{3} are linearly independent.

To show that L(S) = R^{3}, let ξ = (a, b, c) be any vector in R^{3}. ξ will belong to L(S) if we can find r_{1}, r_{2}, r_{3} ∈ R such that

ξ = r_{1} α_{1} + r_{2} α_{2} + r_{3} α_{3}.

This requires, (a, b, c) = r_{1} (1, 0, 1) + r_{2} (0, 1, 1) + r_{3} (1, 1, 0)

⇒ r_{1} + r_{3} = a, r_{2} + r_{3} = b, r_{1} + r_{2} = c …..(1)

The system of equations (1) has a unique solution, since its coefficient determinant is non-zero:

| 1 0 1 |
| 0 1 1 |
| 1 1 0 | = –2 ≠ 0

Thus we can find r_{1}, r_{2}, r_{3} from (1) such that ξ = r_{1} α_{1} + r_{2} α_{2} + r_{3} α_{3}

Hence ξ ∈ L(S)

Thus R^{3} ⊆ L(S). Since L(S) is the smallest subspace of R^{3} containing S, we also have L(S) ⊆ R^{3}; hence L(S) = R^{3}.

Hence the given set forms a basis of R^{3}.
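The computation in this example can be verified with a short Python sketch (the closed form below comes from eliminating variables in (a, b, c) = r₁(1, 0, 1) + r₂(0, 1, 1) + r₃(1, 1, 0); the sample vector (2, 3, 1) is an illustrative choice):

```python
from fractions import Fraction

def coords_in_S(delta):
    """Solve (a, b, c) = r1*(1,0,1) + r2*(0,1,1) + r3*(1,1,0).

    Adding and subtracting the three component equations
    r1 + r3 = a, r2 + r3 = b, r1 + r2 = c gives the closed form below.
    """
    a, b, c = delta
    r1 = Fraction(a - b + c, 2)
    r2 = Fraction(-a + b + c, 2)
    r3 = Fraction(a + b - c, 2)
    return r1, r2, r3

r1, r2, r3 = coords_in_S((2, 3, 1))
print(r1, r2, r3)  # 0 1 2  -- i.e. (2, 3, 1) = 0*(1,0,1) + 1*(0,1,1) + 2*(1,1,0)
```

That a solution exists for every (a, b, c) is exactly the statement L(S) = R³.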
