At first look, orthogonal matrices don't seem to have any connection with the idea of orthogonality. However, there is actually a very strong connection.
Suppose A is an n × n matrix; recall that A is called orthogonal if A^-1 = A^T. Then:
a) A is orthogonal if and only if A^T is orthogonal.
b) A is orthogonal if and only if the columns of A are orthonormal.
c) A is orthogonal if and only if the rows of A are orthonormal.
Proof. a) If A is orthogonal, then (A^T)^-1 = (A^-1)^T = (A^T)^T, so A^T is orthogonal. Conversely, if A^T is orthogonal, then so is (A^T)^T = A.
b) Suppose A has columns a_1, a_2, ..., a_n; then A^T has rows a_1^T, a_2^T, ..., a_n^T, and the entry in row i and column j of A^T A is a_i^T a_j. Now A is orthogonal if and only if A^T A = A^-1 A = I, which holds if and only if a_i^T a_j = 0 whenever i ≠ j and a_i^T a_i = 1 for every i. In other words, A is orthogonal if and only if its columns are orthonormal vectors.
c) A is orthogonal if and only if A^T is orthogonal, which holds if and only if A^T has orthonormal columns, that is, if and only if A has orthonormal rows.
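The column and row characterizations can be checked numerically. Here is a small sketch using NumPy and a 2 × 2 rotation matrix (the angle π/3 is an arbitrary illustrative choice, not from the text):

```python
import numpy as np

theta = np.pi / 3
# A rotation matrix: a standard example of an orthogonal matrix
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# b) A is orthogonal iff A^T A = I, i.e. its columns are orthonormal
assert np.allclose(A.T @ A, np.eye(2))

# Entry-by-entry: a_i^T a_j = 0 when i != j, and a_i^T a_i = 1
for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert np.isclose(A[:, i] @ A[:, j], expected)

# c) A A^T = I as well, so the rows of A are also orthonormal
assert np.allclose(A @ A.T, np.eye(2))
```

Note that `np.allclose` is used rather than exact equality, since floating-point rounding makes the off-diagonal entries tiny but nonzero.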
Now consider a particular kind of square matrix: the matrix of a linear operator on R^n.
Suppose T : R^n → R^n is a linear operator whose matrix A is orthogonal. Then T preserves dot products: T(x) · T(y) = x · y for all vectors x and y in R^n.
Proof. For any vectors x and y in R^n, T(x) = Ax and T(y) = Ay. Then

T(x) · T(y) = (Ax) · (Ay) = (Ax)^T (Ay) = x^T A^T A y = x^T I y = x^T y = x · y.
Since norms and angles between vectors in R^n are defined in terms of dot products, this means that linear operators with orthogonal matrices (called isometries of R^n) don't distort the lengths of vectors or the angles between them. It is not difficult to show that the only isometries of R^2 are rotations and reflections.
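The isometry property can also be sketched numerically. The rotation angle and the vectors below are arbitrary illustrative choices:

```python
import numpy as np

theta = np.pi / 4  # arbitrary angle
# An orthogonal matrix (a rotation of R^2)
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
Tx, Ty = A @ x, A @ y

# Dot products are preserved: T(x) . T(y) = x . y
assert np.isclose(Tx @ Ty, x @ y)

# Hence norms are preserved ...
assert np.isclose(np.linalg.norm(Tx), np.linalg.norm(x))

# ... and so are angles between vectors
def angle(u, v):
    """Angle between u and v, computed from the dot product."""
    return np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

assert np.isclose(angle(Tx, Ty), angle(x, y))
```

Since norms and angles are defined entirely in terms of dot products, the second and third checks follow automatically from the first, which is exactly the point of the theorem.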