A unitary matrix is a square matrix whose inverse equals its conjugate transpose. In notation, a matrix U is unitary if U^H U = I, where U^H is the conjugate transpose (flip the rows and columns, then take the complex conjugate of every entry) and I is the identity matrix. This single property gives unitary matrices a powerful geometric meaning: they preserve lengths and angles when they transform vectors.
The Defining Property
For most square matrices, finding the inverse requires significant computation. Unitary matrices sidestep this entirely: if U is unitary, its inverse is simply its conjugate transpose. Multiplying the conjugate transpose by the original matrix gives the identity matrix.
When all the entries of the matrix are real numbers (no imaginary parts), the conjugate transpose is just the regular transpose. In that case, the unitary matrix is called an orthogonal matrix. Orthogonal matrices are the real-number special case of unitary matrices, and they satisfy U^T U = I.
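A quick NumPy sketch makes the defining property concrete (the 2×2 matrix here is just an illustrative example of a unitary matrix, not a canonical one):

```python
import numpy as np

# An illustrative 2x2 complex unitary matrix.
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)

# The conjugate transpose: transpose, then conjugate every entry.
U_H = U.conj().T

# U^H U equals the identity matrix (up to floating-point error).
print(np.allclose(U_H @ U, np.eye(2)))      # True

# It agrees with the inverse computed the expensive way.
print(np.allclose(U_H, np.linalg.inv(U)))   # True
```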
What Unitary Matrices Do Geometrically
The most intuitive way to think about a unitary matrix is that it preserves the length of any vector it acts on. If you multiply a vector by a unitary matrix, the output vector has exactly the same length as the input. This makes unitary transformations a type of isometry, a transformation that doesn’t stretch, shrink, or distort.
Length preservation is actually just the beginning. A unitary matrix also preserves the inner product (a generalization of the dot product) between any two vectors. That means angles between vectors stay the same, and vectors that were perpendicular before the transformation remain perpendicular afterward. Geometrically, unitary transformations include rotations and reflections, but never scaling or shearing.
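Both preservation properties can be checked numerically. In this sketch the matrix and the test vectors are arbitrary examples; `np.vdot` computes the complex inner product (conjugating its first argument):

```python
import numpy as np

# An illustrative unitary matrix and two arbitrary complex vectors.
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)
v = np.array([3.0 + 1j, -2.0])
w = np.array([0.5, 4.0 - 2j])

# Lengths are preserved:
print(np.isclose(np.linalg.norm(U @ v), np.linalg.norm(v)))  # True

# The inner product (hence angles and orthogonality) is preserved:
print(np.isclose(np.vdot(U @ v, U @ w), np.vdot(v, w)))      # True
```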
Eigenvalues Sit on the Unit Circle
Every unitary matrix can be diagonalized, meaning it can be broken down into its eigenvalues and eigenvectors in a clean way. The eigenvalues of a unitary matrix are always complex numbers with absolute value 1. You can write each eigenvalue as e^(iα) for some angle α, which places it exactly on the unit circle in the complex plane.
This connects directly to the length-preserving property. Since each eigenvalue has magnitude 1, multiplying an eigenvector by it changes the vector’s phase (its direction in the complex plane) without changing its magnitude. The eigenvectors corresponding to different eigenvalues are always orthogonal to each other, and together they form a complete orthonormal basis for the space. This is the spectral theorem for unitary matrices.
Because the determinant of a matrix equals the product of its eigenvalues, the determinant of a unitary matrix also has absolute value 1. A special unitary matrix is one whose determinant is exactly 1 (not just magnitude 1).
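These eigenvalue and determinant facts are easy to verify numerically. The sketch below uses a 2×2 rotation matrix (a real unitary matrix) as its example:

```python
import numpy as np

# A 2x2 rotation matrix: real, so unitary = orthogonal.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(U)

# Every eigenvalue lies on the unit circle: |lambda| = 1.
print(np.allclose(np.abs(eigvals), 1.0))       # True

# The determinant has absolute value 1...
print(np.isclose(abs(np.linalg.det(U)), 1.0))  # True

# ...and this matrix is special unitary: det U = +1 exactly.
print(np.isclose(np.linalg.det(U), 1.0))       # True
```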
Common Examples
The simplest unitary matrix is the identity matrix itself. Beyond that, rotation matrices in two or three dimensions are unitary (and orthogonal, since their entries are real). A 2×2 rotation matrix that turns vectors by angle θ has eigenvalues e^(iθ) and e^(−iθ), both on the unit circle.
The Fourier matrix is one of the most important unitary matrices in applied mathematics. For an N×N Fourier matrix, the entry in row j, column k is a normalized complex exponential: e^(−2πijk/N)/√N. The 1/√N normalization is what makes the matrix unitary rather than just having orthogonal rows. This matrix is the foundation of the Discrete Fourier Transform (DFT), and because it’s unitary, Parseval’s theorem holds: the total energy of a signal is the same whether you measure it in the time domain or the frequency domain.
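A sketch of the Fourier matrix in NumPy, checking both unitarity and Parseval’s theorem (the helper name `fourier_matrix` and the size 8 are arbitrary choices; note that NumPy’s own `np.fft.fft` is unnormalized, so it differs from this matrix by the √N factor):

```python
import numpy as np

def fourier_matrix(N):
    """N x N unitary DFT matrix: entry (j, k) is exp(-2i*pi*j*k/N) / sqrt(N)."""
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(-2j * np.pi * j * k / N) / np.sqrt(N)

F = fourier_matrix(8)

# Unitarity: F^H F = I.
print(np.allclose(F.conj().T @ F, np.eye(8)))              # True

# Parseval: a signal's energy equals its spectrum's energy.
x = np.random.default_rng(0).standard_normal(8)
print(np.isclose(np.linalg.norm(F @ x), np.linalg.norm(x)))  # True

# Relation to NumPy's unnormalized FFT: F @ x = fft(x) / sqrt(N).
print(np.allclose(F @ x, np.fft.fft(x) / np.sqrt(8)))      # True
```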
Hadamard matrices are another well-known family. A real Hadamard matrix has entries of only +1 and −1, and when you divide it by the square root of its size, it becomes unitary. These show up frequently in signal processing, error-correcting codes, and quantum computing.
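The same check works for Hadamard matrices. This sketch uses the standard Sylvester construction to grow a 4×4 example from the 2×2 one:

```python
import numpy as np

# The 2x2 Hadamard matrix: entries are only +1 and -1.
H2 = np.array([[1,  1],
               [1, -1]])

# Sylvester construction: Kronecker product doubles the size,
# keeping all entries at +1 or -1.
H4 = np.kron(H2, H2)

# Dividing by sqrt(size) makes each one unitary (orthogonal, being real).
for H in (H2, H4):
    n = H.shape[0]
    Q = H / np.sqrt(n)
    print(np.allclose(Q.T @ Q, np.eye(n)))  # True, True
```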
Role in Matrix Decompositions
Unitary matrices are structural building blocks in several of the most widely used matrix decompositions. In the singular value decomposition (SVD), any matrix A can be written as A = UΣV^T, where U and V are orthogonal matrices (unitary, with V^H in place of V^T, in the complex case) and Σ is a diagonal matrix of singular values. The columns of U are eigenvectors of AA^T, and the columns of V are eigenvectors of A^T A. The orthogonal matrices handle the rotation, while Σ handles the scaling.
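A minimal sketch of the SVD factors and their orthogonality, on an arbitrary random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

# Reduced SVD: A = U diag(s) V^T with orthonormal columns in U and V.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(np.allclose(U.T @ U, np.eye(3)))       # columns of U are orthonormal
print(np.allclose(Vt @ Vt.T, np.eye(3)))     # rows of V^T are orthonormal
print(np.allclose(U @ np.diag(s) @ Vt, A))   # the factors reproduce A
```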
In QR decomposition, a matrix A is factored into Q times R, where Q is a unitary matrix and R is upper triangular. This decomposition is central to numerical algorithms for solving systems of equations and computing eigenvalues. The reason unitary matrices are preferred in these decompositions is precisely their length-preserving property: they don’t amplify numerical errors the way other matrices can.
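A sketch of QR in action, including why the unitarity of Q matters for solving systems: since Q⁻¹ = Q^T, the problem Ax = b reduces to the triangular system Rx = Q^T b (the random matrix here is just an example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(4)))   # Q is orthogonal
print(np.allclose(np.tril(R, -1), 0))    # R is upper triangular
print(np.allclose(Q @ R, A))             # the factors reproduce A

# Solve A x = b: no inversion of Q needed, just its transpose.
x = np.linalg.solve(R, Q.T @ b)          # triangular system R x = Q^T b
print(np.allclose(A @ x, b))             # True
```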
Why Quantum Mechanics Requires Unitarity
Unitary matrices play a fundamental role in quantum mechanics, and this is where many people encounter them for the first time outside a math course. In quantum theory, the state of a system is described by a vector in a complex vector space (a Hilbert space), and the length of that vector represents total probability, which must always equal 1.
A central postulate of quantum mechanics states that the evolution of a closed quantum system is described by a unitary transformation. If a system is in state |ψ⟩ at one time, it will be in state U|ψ⟩ at a later time, where U is a unitary operator. Because unitary matrices preserve vector lengths, the total probability stays at 1. No probability is created or destroyed, which is exactly the physical requirement.
In quantum computing, every logic gate that operates on qubits is represented by a unitary matrix. A single-qubit gate is a 2×2 unitary matrix; a two-qubit gate is 4×4, and so on. The constraint that gates must be unitary is not a design choice but a consequence of quantum physics. It also means that every quantum computation is reversible: since every unitary matrix has an inverse (its conjugate transpose), you can always run the computation backward.
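Probability conservation and reversibility can both be seen with the standard single-qubit Hadamard gate (states are plain NumPy vectors in this sketch):

```python
import numpy as np

# The Hadamard gate, a standard single-qubit unitary.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

psi = np.array([1.0, 0.0])   # the qubit state |0>

# Evolution preserves total probability (the vector stays length 1).
phi = H @ psi
print(np.isclose(np.linalg.norm(phi), 1.0))      # True

# Reversibility: the conjugate transpose undoes the gate.
print(np.allclose(H.conj().T @ phi, psi))        # True

# A two-qubit gate is 4x4: e.g., H applied to each qubit
# via the Kronecker product, still unitary.
HH = np.kron(H, H)
print(np.allclose(HH.conj().T @ HH, np.eye(4)))  # True
```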
Quick Reference: Key Properties
- Inverse: The inverse of a unitary matrix is its conjugate transpose, making inversion computationally trivial.
- Eigenvalues: All eigenvalues lie on the unit circle in the complex plane (absolute value 1).
- Determinant: The absolute value of the determinant is always 1.
- Length preservation: Multiplying any vector by a unitary matrix leaves the vector’s length unchanged.
- Inner product preservation: Angles and orthogonality between vectors are maintained.
- Diagonalizability: Every unitary matrix can be diagonalized by another unitary matrix, with eigenvectors forming an orthogonal basis.
- Closure: The product of two unitary matrices is also unitary, and the inverse of a unitary matrix is unitary.

