Orthogonal matrices
The concept of orthogonality arises frequently in linear algebra. It is essentially a generalization of perpendicularity: rather than applying only to a pair of vectors in two dimensions, it extends to any number of dimensions and to whole sets of vectors.
But to get an understanding, let's start with two column vectors $x, y \in \mathbb{R}^n$. If they are orthogonal, then the following holds:

$$x^Ty = 0$$
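As a quick numerical illustration (a minimal sketch; the vector values below are just made-up examples), we can verify orthogonality by checking that the inner product is zero:

```python
import numpy as np

# Two illustrative column vectors in R^3
x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 5.0])

# Their inner product is zero, so x and y are orthogonal
print(np.dot(x, y))  # 0.0
```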
Orthogonal matrices are a special kind of matrix where the columns are pairwise orthonormal. What this means is that we have a matrix with the following property:
$$Q^TQ = QQ^T = I$$
Then, we can deduce that $Q^{-1} = Q^T$ (that is, the transpose of Q is also the inverse of Q).
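We can check this property directly in NumPy. The sketch below uses a permutation matrix, which is a simple example of an orthogonal matrix since each of its columns is a distinct standard basis vector:

```python
import numpy as np

# A permutation matrix: its columns are pairwise orthonormal
Q = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

# Q^T Q = I, so the transpose of Q is also its inverse
print(np.allclose(Q.T @ Q, np.eye(3)))      # True
print(np.allclose(Q.T, np.linalg.inv(Q)))   # True
```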
As with other types of matrices, orthogonal matrices have some special properties.
Firstly, they preserve inner products, so that the following applies:

$$(Qx)^T(Qy) = x^TQ^TQy = x^Ty$$
This brings us to the second property: orthogonal matrices also preserve 2-norms, which we can see as follows:
$$\|Qx\|_2 = \sqrt{(Qx)^T(Qx)} = \sqrt{x^TQ^TQx} = \sqrt{x^Tx} = \|x\|_2$$
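The following sketch demonstrates both properties numerically, using a 2D rotation matrix as the orthogonal matrix Q and two arbitrary illustrative vectors:

```python
import numpy as np

# A 2D rotation matrix is orthogonal; the angle and vectors are illustrative
theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = np.array([3.0, -1.0])
y = np.array([0.5, 2.0])

# Inner products are preserved: (Qx)^T (Qy) == x^T y
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                    # True

# 2-norms are preserved: ||Qx||_2 == ||x||_2
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))    # True
```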
When multiplying a vector by an orthogonal matrix, you can think of it as a length-preserving transformation: the vector may be rotated (or reflected) about the origin, but it is never stretched or shrunk.
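As a concrete instance of this (a worked example, not from the original text), the familiar 2D rotation matrix is orthogonal:

$$R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad R(\theta)^TR(\theta) = \begin{pmatrix} \cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta \end{pmatrix} = I$$

Multiplying a vector by $R(\theta)$ rotates it by the angle $\theta$ about the origin while leaving its length unchanged.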
The most well-known orthogonal matrix is a special matrix we have dealt with a few times already: the identity matrix I. Since its columns are unit vectors pointing along the coordinate axes, they are generally referred to as the standard basis.