Orthogonal matrices

A square matrix $A$ is orthogonal whenever its inverse is the same as its transpose: $A^{-1} = A^T$.
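As a first concrete check, here is a minimal numerical sketch (an illustration, not part of the original text) using NumPy: it builds a $2 \times 2$ rotation matrix, a standard example of an orthogonal matrix, and confirms that its inverse equals its transpose.

```python
import numpy as np

# A 2x2 rotation matrix -- a standard example of an orthogonal matrix.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# For an orthogonal matrix, the inverse equals the transpose.
print(np.allclose(np.linalg.inv(A), A.T))  # True
```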

At first glance, orthogonal matrices don't seem to have any connection with the idea of orthogonality. However, there is in fact a very strong connection.

Suppose $A$ is an $n \times n$ matrix.

a) If $A$ is orthogonal, then $A^T$ is also orthogonal.

b) $A$ is orthogonal if and only if its columns are orthonormal with respect to the dot product on $\mathbb{R}^n$.

c) $A$ is orthogonal if and only if its rows are orthonormal with respect to the dot product on $\mathbb{R}^n$.

Proof. a) If $A$ is orthogonal, then $(A^T)^{-1} = (A^{-1})^T = (A^T)^T$, so $A^T$ is orthogonal.

b) Suppose $A$ has columns $a_1, a_2, \ldots, a_n$; then $A^T$ has rows $a_1^T, a_2^T, \ldots, a_n^T$, and the entry in row $i$ and column $j$ of $A^T A$ is $a_i^T a_j$, which is just the dot product $a_i \cdot a_j$. Now $A$ is orthogonal if and only if $A^T A = A^{-1} A = I$, which is true if and only if $a_i^T a_j = 0$ whenever $i \neq j$ and $a_i^T a_i = 1$ for every $i$. In other words, $A$ is orthogonal if and only if its columns are orthonormal vectors.

c) $A$ is orthogonal if and only if $A^T$ is orthogonal (apply part a) to $A$ and to $A^T$, using $(A^T)^T = A$), which by part b) is true if and only if $A^T$ has orthonormal columns, i.e. if and only if $A$ has orthonormal rows.
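Parts b) and c) are easy to see numerically. The following sketch (again an illustration, not from the original text) builds a random orthogonal matrix via NumPy's QR factorization and checks that both its columns and its rows are orthonormal.

```python
import numpy as np

rng = np.random.default_rng(0)
# The QR factorization of a random square matrix yields an orthogonal factor Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Columns orthonormal: Q^T Q = I.  Rows orthonormal: Q Q^T = I.
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
print(np.allclose(Q @ Q.T, np.eye(4)))  # True
```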

 

Now suppose we have a particular square matrix: the matrix of a linear operator.

Suppose $T: \mathbb{R}^n \to \mathbb{R}^n$ is a linear operator with matrix $A$. If $A$ is orthogonal, then $T$ preserves dot products: for any vectors $u$ and $v$ in $\mathbb{R}^n$, $T(u) \cdot T(v) = u \cdot v$.

Proof. For any vector $x$ in $\mathbb{R}^n$, $T(x) = Ax$. Then

$$\begin{aligned}
T(u) \cdot T(v) &= Au \cdot Av \\
&= (Au)^T (Av) \\
&= u^T A^T A v \\
&= u^T I v \\
&= u^T v \\
&= u \cdot v.
\end{aligned}$$

Since norms and angles between vectors in $\mathbb{R}^n$ are defined in terms of dot products, this means that linear operators with orthogonal matrices (called isometries of $\mathbb{R}^n$) don't distort the lengths of vectors or the angles between them. It's not difficult to show that the only isometries of $\mathbb{R}^2$ are the rotations and reflections.
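To close, here is one last sketch (an assumed illustration, not from the original) that rotates two vectors in $\mathbb{R}^2$ and confirms that their lengths and the angle between them are unchanged.

```python
import numpy as np

theta = np.pi / 5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a rotation: an isometry of R^2

u = np.array([3.0, 4.0])
v = np.array([-1.0, 2.0])

def angle(x, y):
    # Angle between x and y, defined via the dot product.
    return np.arccos(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Lengths and angles are preserved by the rotation.
print(np.allclose(np.linalg.norm(R @ u), np.linalg.norm(u)))  # True
print(np.allclose(angle(R @ u, R @ v), angle(u, v)))          # True
```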