In this post, we introduce the fundamental concept of the basis for vector spaces. A basis for a real vector space is a linearly independent subset of the vector space which also spans it. More precisely, by definition, a subset \(B\) of a real vector space \(V\) is said to be a basis if each vector in \(V\) is a linear combination of the vectors in \(B\) (i.e., \(B\) spans \(V\)) and \(B\) is linearly independent.

**Exercise**. If \(B\) is a basis for \(V\), then each vector of \(V\) has a unique representation as a linear combination of the vectors in \(B.\)

**Solution**. Since \(B\) spans \(V\), each vector has at least one representation as a linear combination of the vectors in \(B.\) Now, let \(u\) be a vector with two such representations: $$ u = r_1 b_1 + \cdots + r_n b_n$$ and $$ u = s_1 b_1 + \cdots + s_n b_n,$$ where the \(b_i\) are elements of the basis \(B\), and the \(r_i\) and \(s_i\) are real numbers. Subtracting gives $$(r_1 - s_1) b_1 + \cdots + (r_n - s_n) b_n = \vec{0}.$$ Since the elements of \(B\) are linearly independent, we have \(r_i - s_i = 0\) for each \(i\). This implies that \(r_i = s_i\) for each \(i\), showing that the representation is unique.
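Numerically, finding the coordinates of a vector relative to a basis amounts to solving a linear system, and the uniqueness above corresponds to that system having exactly one solution. Here is a small NumPy sketch (the basis and vector chosen here are illustrative, not from the exercise):

```python
import numpy as np

# Columns of B are three basis vectors of R^3 (they are linearly
# independent, so B is invertible and B @ r = u has a unique solution r).
B = np.column_stack([(1.0, 0.0, 1.0),
                     (1.0, 1.0, 0.0),
                     (0.0, 1.0, 1.0)])
u = np.array([2.0, 3.0, 1.0])

# The unique coefficient vector r with u = r_1 b_1 + r_2 b_2 + r_3 b_3.
r = np.linalg.solve(B, u)
print(r)  # the unique coordinates of u in this basis
```

Because the columns of \(B\) form a basis, `np.linalg.solve` succeeds and returns the one and only coefficient vector; for a linearly dependent set it would raise an error instead.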

One of the most fundamental results in linear algebra states that a finite basis of a vector space is a linearly independent set of maximum size. In other words, if $$B = \{v_1, \dots, v_n\}$$ is a basis for a vector space \(V\), then any subset of \(V\) with more than \(n\) elements is linearly dependent. Based on this, we have the following:

### Invariance of the size of basis for vector spaces

**Result**. If a real vector space \(V\) has a basis with \(n\) elements, then each basis of the real vector space \(V\) has \(n\) elements.

**Proof**. Let \(B\) be a basis for \(V\) with \(n\) elements and \(B'\) be a basis for \(V\) with \(m\) elements. Since \(B'\) is linearly independent and \(B\) is a basis with \(n\) elements, \(B'\) cannot have more than \(n\) elements; therefore, \(m \leq n.\) On the other hand, since \(B\) is linearly independent and \(B'\) is a basis with \(m\) elements, \(B\) cannot have more than \(m\) elements, so \(n \leq m.\) Hence, \(m = n,\) as required.

By definition, if \(V\) is a real vector space with a basis of \(n\) elements, then \(V\) is said to have dimension \(n\), or to be an \(n\)-dimensional vector space. In this case, we write $$\dim(V) = n.$$

#### Examples of finite dimensional real vector spaces

- The set of all complex numbers \(\mathbb C\) is a real vector space of dimension 2, with basis \(\{1, i\}\).
- The dimension of the real vector space \(\mathbb{R}^n\) is \(n\) for each positive integer \(n.\)
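These dimension counts can be checked numerically: \(n\) linearly independent vectors in \(\mathbb{R}^n\) assemble into a matrix of full rank, while any \(n+1\) vectors must be dependent. A small NumPy sketch (the choice \(n = 4\) is illustrative):

```python
import numpy as np

# The standard basis of R^4: the rows of the identity matrix.
# Full rank (= 4) certifies linear independence, so dim(R^4) = 4.
E = np.eye(4)
print(np.linalg.matrix_rank(E))  # 4

# Any 5 vectors in R^4 are linearly dependent: a 4x5 matrix
# can have rank at most 4, never 5.
A = np.random.default_rng(0).standard_normal((4, 5))
print(np.linalg.matrix_rank(A))  # at most 4
```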

**Exercise**. Let \(V\) be a real inner product space of dimension \(n\). Prove that if \(O\) is a set of nonzero, pairwise orthogonal vectors in \(V\), then \(O\) has at most \(n\) elements.

**Exercise**. Let \(P_n\) be the set of all one-variable real polynomials of degree at most \(n\). Prove that $$\dim(P_n) = n+1.$$
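One way to see that the monomials \(1, x, \dots, x^n\) are linearly independent (so they form a basis of \(P_n\) of size \(n+1\)) is to evaluate them at \(n+1\) distinct points: the resulting Vandermonde matrix is invertible, so the only polynomial of degree at most \(n\) vanishing at all those points is the zero polynomial. A NumPy sketch of this check (with \(n = 3\) as an illustrative choice):

```python
import numpy as np

# If c_0 + c_1 x + ... + c_n x^n vanishes at n+1 distinct points,
# the Vandermonde system V c = 0 forces c = 0, since V is invertible
# for distinct nodes. Full rank certifies linear independence.
n = 3
points = np.arange(n + 1, dtype=float)    # n+1 distinct evaluation points
V = np.vander(points, N=n + 1)            # (n+1) x (n+1) Vandermonde matrix
print(np.linalg.matrix_rank(V))           # n + 1, i.e. full rank
```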

#### Gram-Schmidt process converts a finite basis into an orthogonal basis

By definition, an orthogonal (orthonormal) basis is a basis \(B\) that is also an orthogonal (orthonormal) set. The Gram-Schmidt process is an algorithm that converts a finite basis into an orthogonal basis. More precisely:

**Gram-Schmidt process**. Let \(\{u_1, \dots, u_p\}\) be a basis for a nonzero subspace \(W\) of a real inner product space \(V\). Set $$ v_1 = u_1,$$ $$v_2 = u_2 - \frac{u_2 \cdot v_1}{v_1 \cdot v_1} v_1,$$ $$v_3 = u_3 - \left(\frac{u_3 \cdot v_1}{v_1 \cdot v_1} v_1 + \frac{u_3 \cdot v_{2}}{v_{2} \cdot v_{2}} v_{2}\right),$$ and finally, $$v_p = u_p - \left(\frac{u_p \cdot v_1}{v_1 \cdot v_1} v_1 + \dots + \frac{u_p \cdot v_{p-1}}{v_{p-1} \cdot v_{p-1}} v_{p-1}\right).$$ Then, \(\{v_1, \dots, v_p\}\) is an orthogonal basis for \(W\). Furthermore, if we divide each \(v_i\) by its norm \(\Vert v_i\Vert\), we obtain an orthonormal basis for \(W\).

For the proof, recall that if \(y\) and \(u \neq 0\) are vectors, then the orthogonal projection of \(y\) onto \(u\) is $$\hat{y} = \left(\frac{y \cdot u}{u \cdot u}\right) u$$ and \(z = y - \hat{y}\) is orthogonal to \(u\). Now, mathematical induction on the number of elements in the basis of \(W\) will do the job.
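The formulas above translate directly into code: each new vector is the current basis vector minus its projections onto the vectors already produced. Here is a minimal NumPy sketch of the process, using the standard dot product on \(\mathbb{R}^n\) (the function name and the example vectors are our own):

```python
import numpy as np

def gram_schmidt(U):
    """Orthogonalize the rows of U (assumed linearly independent) by the
    Gram-Schmidt process, returning the rows v_1, ..., v_p of the theorem."""
    V = []
    for u in U:
        # Subtract from u its orthogonal projection onto each previous v_i.
        v = u - sum((u @ v_i) / (v_i @ v_i) * v_i for v_i in V)
        V.append(v)
    return np.array(V)

# Illustrative usage: two vectors in R^3.
U = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
V = gram_schmidt(U)
print(V[0] @ V[1])  # essentially 0: the rows of V are orthogonal
```

Dividing each row of `V` by its norm (`V / np.linalg.norm(V, axis=1, keepdims=True)`) then yields an orthonormal basis, as in the last sentence of the theorem.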

**Exercise**. Find an orthogonal basis for a subspace of \(\mathbb{R}^5\) spanned by the following vectors: $$u_1 = (2,-3,4,-1,6)$$ and $$ u_2 = (4,5,3,-7,1).$$

**Solution**. Note that \(u_1 \cdot u_2 = 18\). So, \(u_1\) and \(u_2\) are not perpendicular to each other. Set $$ v_1 = u_1,$$ $$v_2 = u_2 - \frac{u_2 \cdot v_1}{v_1 \cdot v_1} v_1 = u_2 - \frac{18}{66} v_1 = \left(\frac{38}{11}, \frac{64}{11}, \frac{21}{11}, -\frac{74}{11}, -\frac{7}{11}\right).$$ The reader may check that \(v_1\) and \(v_2\) are orthogonal to each other and span the same subspace as \(u_1\) and \(u_2\).
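The arithmetic in this solution is easy to verify numerically; the following NumPy check reproduces the computation above:

```python
import numpy as np

# The two spanning vectors from the exercise.
u1 = np.array([2.0, -3.0, 4.0, -1.0, 6.0])
u2 = np.array([4.0, 5.0, 3.0, -7.0, 1.0])

# One Gram-Schmidt step: v2 = u2 - (u2.v1 / v1.v1) v1 with u2.v1 = 18,
# v1.v1 = 66, so v2 = u2 - (3/11) u1.
v1 = u1
v2 = u2 - (u2 @ v1) / (v1 @ v1) * v1

print(v2 * 11)   # the numerators: 38, 64, 21, -74, -7
print(v1 @ v2)   # essentially 0, confirming orthogonality
```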