First, let us define orthogonal subsets. Let \(O\) be a subset of an inner product real vector space \(V\). By definition, \(O\) is an orthogonal subset of \(V\) if it does not contain the zero vector and any pair of distinct elements of \(O\) are perpendicular to each other; in other words, if \(u\) and \(v\) are distinct elements of \(O\), i.e., \(u \neq v\), then their inner product is zero, i.e., \(u \cdot v = 0.\)

An equivalent definition of orthogonal subsets is the following:

Let \(I\) be an index set and $$ O = \{o_i : i \in I\}$$ a subset of an inner product real vector space \(V\). Then \(O\) is orthogonal if and only if \(o_i \cdot o_j = 0\) whenever \(i \neq j\), and \(o_i \cdot o_i \neq 0\) for every index \(i\) in \(I\).
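The equivalent definition translates directly into a finite-set check. Here is a small NumPy sketch; the helper name `is_orthogonal` and the tolerance are illustrative choices, not part of the text:

```python
import numpy as np

def is_orthogonal(vectors, tol=1e-12):
    """Check the two conditions of the equivalent definition:
    o_i . o_j = 0 for i != j, and o_i . o_i != 0 for every i."""
    for idx, u in enumerate(vectors):
        if abs(np.dot(u, u)) <= tol:        # rules out the zero vector
            return False
        for v in vectors[idx + 1:]:
            if abs(np.dot(u, v)) > tol:     # a distinct pair fails perpendicularity
                return False
    return True

# {(1,0,0), (0,2,0), (0,0,3)} is orthogonal in R^3 ...
print(is_orthogonal([np.array([1., 0, 0]), np.array([0, 2., 0]), np.array([0, 0, 3.])]))
# ... but adding a vector that is not perpendicular to the others breaks it.
print(is_orthogonal([np.array([1., 0, 0]), np.array([1., 1, 0])]))
```

Note that the first condition alone would admit the zero vector, which is why the definition excludes it separately.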

**Remark**. A subset \(O = \{o_i : i\in I\}\) of an inner product vector space is, by definition, orthonormal if \(o_i \cdot o_j = \delta_{ij}\), where \(\delta\) is the Kronecker delta. It is clear that if \(O\) is an orthogonal subset and we divide each element of \(O\) by its norm, then the new subset will be orthonormal. The set of tangent, normal, and binormal unit vectors discussed in multi-variable calculus (differential geometry) is an example of an orthonormal subset of \(\mathbb{R}^3\).
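The normalization step in the remark can be sketched as follows; the orthogonal set below is an arbitrary example chosen for illustration:

```python
import numpy as np

# An orthogonal (but not orthonormal) subset of R^3.
orthogonal = [np.array([3., 4., 0.]), np.array([-4., 3., 0.]), np.array([0., 0., 2.])]

# Divide each element by its norm to obtain an orthonormal set.
orthonormal = [v / np.linalg.norm(v) for v in orthogonal]

# Verify o_i . o_j = delta_ij for the new set.
for a, u in enumerate(orthonormal):
    for b, v in enumerate(orthonormal):
        expected = 1.0 if a == b else 0.0
        assert abs(np.dot(u, v) - expected) < 1e-12
print([np.round(v, 3) for v in orthonormal])
```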

#### Orthogonal subsets are linearly independent

We prove the following result on orthogonal subsets of inner product real vector spaces:

**Result**. Let \(O\) be an orthogonal subset of an inner product real vector space \(V\). Then, \(O\) is linearly independent.

For the proof, assume that $$\{o_1, \dots, o_p\} \subseteq O$$ and that there are real numbers \(r_1, \dots, r_p\) such that $$r_1 o_1 + \dots + r_p o_p = 0.$$ If we take the inner product of both sides of the latter equation with \(o_1\), we obtain \(r_1 (o_1 \cdot o_1) = 0\), since \(o_i \cdot o_1 = 0\) whenever \(i \neq 1\). On the other hand, since \(o_1\) is nonzero, the real number \(o_1 \cdot o_1\) is nonzero, and so \(r_1 = 0\). Similarly, we can prove that \(r_i = 0\) for the other indices \(i\). This means that every finite subset of \(O\) is linearly independent, i.e., \(O\) is linearly independent.
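The dot-product trick in the proof also recovers the coefficients of any linear combination: dotting with \(o_k\) kills every term except \(r_k (o_k \cdot o_k)\), so \(r_k = (w \cdot o_k)/(o_k \cdot o_k)\). A small NumPy sketch, with an arbitrary orthogonal set and coefficients chosen for illustration:

```python
import numpy as np

# An orthogonal (not orthonormal) set in R^3.
o1, o2, o3 = np.array([1., 1., 0.]), np.array([1., -1., 0.]), np.array([0., 0., 5.])

# Form the combination w = 2 o1 - 3 o2 + 0.5 o3 ...
w = 2 * o1 - 3 * o2 + 0.5 * o3

# ... and recover each coefficient exactly as in the proof:
# dotting with o_k leaves only r_k (o_k . o_k).
coeffs = [np.dot(w, o) / np.dot(o, o) for o in (o1, o2, o3)]
print(coeffs)  # [2.0, -3.0, 0.5]
```

In particular, if \(w = 0\), every recovered coefficient is forced to be zero, which is exactly the linear-independence argument.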

#### Examples of infinite orthogonal subsets

Let \(F\) be the set of all trigonometric functions of the form \(\sin nt\) and \(\cos mt\), where \(m\) and \(n\) are positive integers. It is clear that \(F\) is a subset of the space of all continuous real-valued functions of one variable defined on the interval \([-\pi,\pi]\), i.e., $$F \subseteq C[-\pi,\pi].$$ The inner product on \(C[-\pi,\pi]\) is defined as follows: $$f \cdot g = \int_{-\pi}^{\pi} f(t)g(t)\,dt.$$ (For more, see the advanced example explained in the post on vector orthogonal projection.)

From trigonometry, we know that $$\cos mt \cos nt = (1/2) (\cos(n-m)t + \cos(n+m)t),$$ $$\sin mt \sin nt = (1/2) (\cos(n-m)t - \cos(n+m)t),$$ and $$\cos mt \sin nt = (1/2) (\sin(n+m)t + \sin(n-m)t).$$ Using these identities and the integration techniques for trigonometric functions from calculus, we obtain $$\int_{-\pi}^{\pi} \cos mt \cos nt \,dt = \pi \delta_{mn},$$ $$\int_{-\pi}^{\pi} \sin mt \sin nt \,dt = \pi \delta_{mn},$$ and finally, $$\int_{-\pi}^{\pi} \cos mt \sin nt \,dt = 0,$$ where \(\delta\) is the Kronecker delta. This shows that \(F\) is orthogonal.
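These integrals can be checked numerically. The sketch below approximates the inner product on \(C[-\pi,\pi]\) with a trapezoidal sum (the helper `inner` and the grid size are illustrative choices):

```python
import numpy as np

def inner(f, g, n=200_000):
    """Trapezoidal approximation of f . g = integral of f(t) g(t) over [-pi, pi]."""
    t = np.linspace(-np.pi, np.pi, n + 1)
    y = f(t) * g(t)
    dt = t[1] - t[0]
    return dt * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

# Distinct frequencies: the inner product vanishes.
print(inner(lambda t: np.cos(2 * t), lambda t: np.cos(3 * t)))  # ~ 0
# Equal frequencies: the inner product is pi, matching pi * delta_mn.
print(inner(lambda t: np.cos(2 * t), lambda t: np.cos(2 * t)))  # ~ pi
# Mixed sine/cosine: always zero.
print(inner(lambda t: np.sin(4 * t), lambda t: np.cos(4 * t)))  # ~ 0
```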

**Exercise**. Convert \(F\) introduced above into an orthonormal subset of \(C[-\pi,\pi]\).

**Example**. Let \(i\), \(j\), and \(k\) be the standard basis vectors of \(\mathbb{R}^3\). Show that the following subset is linearly independent but not orthogonal: $$\{i, i+j, i+j+k\}.$$
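Both claims in the example can be verified numerically; a minimal NumPy sketch:

```python
import numpy as np

# Standard basis vectors of R^3.
i, j, k = np.eye(3)
S = np.array([i, i + j, i + j + k])

# Linearly independent: the matrix with these rows has nonzero determinant.
print(np.linalg.det(S))      # nonzero, so the set is linearly independent

# Not orthogonal: for instance, i . (i + j) = 1, not 0.
print(np.dot(i, i + j))
```

This also illustrates that the converse of the Result above fails: linear independence does not imply orthogonality.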

**Important remark**. Orthonormal subsets are used to define the concept of orthogonal (orthonormal) matrices.