Orthogonal matrices have applications in many fields of science and engineering, including data science and machine learning. For example, they are a key concept in numerical linear algebra, notably in the QR decomposition. In this post, we define orthogonal matrices and investigate their basic properties. An orthogonal matrix is a real matrix whose columns form an orthonormal subset.

#### Definition of orthogonal matrices and some examples

By definition, an \(m \times n\) real matrix is orthogonal if its columns form an orthonormal subset of the real vector space \(\mathbb{R}^m\). It is for this reason that some references in statistics refer to orthogonal matrices as “orthonormal matrices”.

**Example**. The following matrix is an example of a \(5 \times 2\) orthogonal matrix: $$\begin{pmatrix} 2/\sqrt{66} & 4/\sqrt{103} \\ -3/\sqrt{66} & 5/\sqrt{103} \\ 4/\sqrt{66} & 3/\sqrt{103} \\ 1/\sqrt{66} & 7/\sqrt{103} \\ 6/\sqrt{66} & -2/\sqrt{103} \end{pmatrix} $$ because the two columns are perpendicular to each other in the five-dimensional real vector space \(\mathbb{R}^5\) (their dot product is \(8 - 15 + 12 + 7 - 12 = 0\)) and each column has norm 1.

**Example**. The following square matrix is orthogonal: $$ \begin{pmatrix} \cos \theta & 0 & -\sin \theta \\ 0 & 1 & 0 \\ \sin \theta & 0 & \cos \theta\end{pmatrix} $$ because its columns form an orthonormal subset of \(\mathbb{R}^3\): they are mutually perpendicular and each has norm 1.
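As a quick numerical sanity check of this example, the criterion can be verified with NumPy (a sketch assuming NumPy is available; the value of \(\theta\) is arbitrary and chosen only for illustration):

```python
import numpy as np

# Rotation about the y-axis by an arbitrary angle theta;
# its columns are orthonormal for every value of theta.
theta = 0.7
R = np.array([
    [np.cos(theta), 0.0, -np.sin(theta)],
    [0.0,           1.0,  0.0],
    [np.sin(theta), 0.0,  np.cos(theta)],
])

# R^T R equals the 3x3 identity, up to floating-point error.
print(np.allclose(R.T @ R, np.eye(3)))  # True
```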

**Exercise**. Show that the following real matrices are orthogonal matrices: $$ A = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} $$ $$ B = \frac{1}{3} \begin{pmatrix} 2 & -2 & 1 \\ 1 & 2 & 2 \\ 2 & 1 & -2 \end{pmatrix} $$
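A numerical check of the exercise (not a proof, but a useful sanity test) is straightforward, assuming NumPy is available:

```python
import numpy as np

# The two exercise matrices.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
B = np.array([[2.0, -2.0,  1.0],
              [1.0,  2.0,  2.0],
              [2.0,  1.0, -2.0]]) / 3.0

# A real matrix M is orthogonal exactly when M^T M is the identity.
for M in (A, B):
    print(np.allclose(M.T @ M, np.eye(M.shape[1])))  # True
```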

#### Equivalent definition of orthogonal matrices

The proof of the following result is based on the definition of orthonormal subsets and is left to the reader:

**Result**. An \(m \times n\) real matrix \(A\) is an orthogonal matrix if and only if \(A^T A = I\), where \(I\) is the \(n \times n\) identity matrix.
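This criterion translates directly into code. Below is a minimal sketch (the helper name `is_orthogonal` and the tolerance are my own choices, not standard API); the test matrix comes from a QR decomposition, which — as mentioned in the introduction — always produces a factor with orthonormal columns:

```python
import numpy as np

def is_orthogonal(M, tol=1e-12):
    """Check the criterion M^T M = I up to a numerical tolerance."""
    M = np.asarray(M, dtype=float)
    return np.allclose(M.T @ M, np.eye(M.shape[1]), atol=tol)

# A 5x2 matrix with orthonormal columns, obtained from a QR decomposition.
Q, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(5, 2)))
print(is_orthogonal(Q))  # True
```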

By the above result, a real square matrix \(A\) is orthogonal if and only if \(A^T A = I\). This implies that $$A^T (A^T)^T = (A^T A)^T = I^T = I.$$ Recall that for square matrices, if \(AB = I\), then \(BA = I\) as well. From this, we obtain that \((A^T)^T A^T = I\), i.e., \(A^T\) is orthogonal, and so it has orthonormal columns. But the columns of the transpose of a matrix are the rows of the original matrix. So we have proved the following result:

**Result**. An \(n \times n\) real (square) matrix \(A\) is orthogonal if and only if the row vectors of \(A\) form an orthonormal subset of the \(n\)-dimensional vector space \(\mathbb{R}^n\).

**Important notice**. The most important property of an orthogonal matrix \(A\) is that its inverse is its transpose. In other words, a square matrix \(A\) is orthogonal if and only if $$A^{-1} = A^T.$$
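Both the inverse-equals-transpose property and the orthonormality of the rows can be checked numerically; here is a sketch assuming NumPy, using the \(3 \times 3\) matrix from the earlier exercise:

```python
import numpy as np

# An orthogonal 3x3 matrix (from the exercise above).
B = np.array([[2.0, -2.0,  1.0],
              [1.0,  2.0,  2.0],
              [2.0,  1.0, -2.0]]) / 3.0

# For a square orthogonal matrix, the inverse equals the transpose.
print(np.allclose(np.linalg.inv(B), B.T))  # True

# The rows are orthonormal too: B B^T = I as well as B^T B = I.
print(np.allclose(B @ B.T, np.eye(3)))  # True
```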

**Exercise**. Show that a diagonal real matrix \(D = (d_{ij})\) is orthogonal if and only if \(d_{ii} = \pm 1\) for each \(i\).
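A numerical illustration of this exercise (the specific diagonal entries are arbitrary examples, assuming NumPy):

```python
import numpy as np

# A diagonal matrix with entries +1 or -1 is orthogonal.
D = np.diag([1.0, -1.0, 1.0, -1.0])
print(np.allclose(D.T @ D, np.eye(4)))  # True

# Any diagonal entry with absolute value != 1 breaks the criterion.
D_bad = np.diag([1.0, 2.0, 1.0])
print(np.allclose(D_bad.T @ D_bad, np.eye(3)))  # False
```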

#### Basic properties of orthogonal matrices

The basic properties of orthogonal matrices include the following facts:

- **Orthogonality of the identity matrix**. The identity matrix is orthogonal.
- **Invertibility**. Every orthogonal matrix is invertible, and the inverse of an orthogonal matrix is orthogonal.
- **Closure under multiplication**. The product of orthogonal matrices is orthogonal.

**Exercise**. Prove the above facts about orthogonal square matrices.
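Before proving these facts, it can be instructive to observe them numerically. The sketch below (assuming NumPy; the helper `random_orthogonal` is my own construction, using the QR decomposition to produce a random orthogonal factor) checks all three properties:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_orthogonal(n):
    # The QR decomposition of a random square matrix
    # yields an orthogonal factor Q.
    Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
    return Q

P = random_orthogonal(4)
Q = random_orthogonal(4)
I = np.eye(4)

print(np.allclose(I.T @ I, I))                                # identity is orthogonal
print(np.allclose(np.linalg.inv(P).T @ np.linalg.inv(P), I))  # inverse is orthogonal
print(np.allclose((P @ Q).T @ (P @ Q), I))                    # product is orthogonal
```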

**Technical comment**. The set of all \(n \times n\) orthogonal matrices, denoted by \(O(n)\), equipped with matrix multiplication is a group (a subgroup of the group of invertible matrices).

#### Orthogonal matrices preserve lengths and orthogonality

Some of the most interesting properties of orthogonal matrices are as follows:

**Result**. Let \(A\) be an \(m \times n\) orthogonal matrix and \(u\) and \(v\) be \(n\)-dimensional real vectors. Then, the following statements hold:

- \(\Vert Au \Vert = \Vert u \Vert\) (preservation of length).
- \((Au) \cdot (Av) = u \cdot v\) (preservation of the dot product; \(A\) is the matrix of an orthogonal linear map).
- \((Au) \cdot (Av) = 0\) if and only if \(u \cdot v = 0\) (preservation of orthogonality).
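The statements above can be checked numerically; here is a minimal sketch assuming NumPy, with an orthogonal \(5 \times 2\) matrix obtained from a QR decomposition and two arbitrary random vectors:

```python
import numpy as np

rng = np.random.default_rng(2)

# An orthogonal 5x2 matrix (orthonormal columns) and two random 2-vectors.
A, _ = np.linalg.qr(rng.normal(size=(5, 2)))
u = rng.normal(size=2)
v = rng.normal(size=2)

# Length is preserved: ||Au|| = ||u||.
print(np.isclose(np.linalg.norm(A @ u), np.linalg.norm(u)))  # True

# The dot product is preserved: (Au).(Av) = u.v.
print(np.isclose((A @ u) @ (A @ v), u @ v))  # True
```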

Conversely, the matrices of orthogonal linear maps on finite-dimensional real inner product spaces, with respect to orthonormal bases, are orthogonal matrices.