This article lists some important classes of matrices used in mathematics, science and engineering. A matrix (plural matrices, or less commonly matrixes) is a rectangular array of numbers called entries. Matrices have a long history of both study and application, leading to diverse ways of classifying matrices. A first group is matrices satisfying concrete conditions on the entries, including constant matrices. Important examples include the identity matrix given by

$$I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}$$

and the zero matrix of dimension m × n. For example:

$$0_{2 \times 3} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$

[Figure: Several important classes of matrices are subsets of each other.]
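To make these two examples concrete, here is a minimal sketch (assuming NumPy; it is not part of the original article) that builds an identity matrix and a zero matrix and checks their defining behaviour:

```python
import numpy as np

n = 3
I = np.eye(n)             # identity matrix I_n: ones on the main diagonal, zeros elsewhere
Z = np.zeros((2, 3))      # zero matrix of dimension 2 x 3

A = np.arange(9, dtype=float).reshape(3, 3)
assert np.allclose(I @ A, A) and np.allclose(A @ I, A)  # I_n acts as a multiplicative identity
assert np.allclose(A + np.zeros((3, 3)), A)             # the zero matrix acts as an additive identity
print(Z)
```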

Matrices can be classified further according to their eigenvalues, or by imposing conditions on the product of the matrix with other matrices. Finally, many domains, both within mathematics and in other sciences including physics and chemistry, have particular matrices that are applied chiefly in these areas.

Constant matrices

The list below comprises matrices whose elements are constant for any given dimension (size) of matrix. The matrix entries will be denoted aij. The list uses the Kronecker delta δij for two integers i and j, which is 1 if i = j and 0 otherwise. A short construction sketch in code follows the list.

  • Commutation matrix — the matrix of the linear map that maps a matrix to its transpose. See Vectorization.
  • Duplication matrix — the matrix of the linear map mapping the vector of the distinct entries of a symmetric matrix to the vector of all entries of the matrix. See Vectorization.
  • Elimination matrix — the matrix of the linear map mapping the vector of the entries of a matrix to the vector of a part of the entries (for example, the vector of the entries that are not below the main diagonal). See Vectorization.
  • Exchange matrix — the binary matrix with ones on the anti-diagonal and zeroes everywhere else; aij = δn+1−i,j. A permutation matrix.
  • Hilbert matrix — aij = 1 / (i + j − 1). A Hankel matrix.
  • Identity matrix — a square diagonal matrix with all entries on the main diagonal equal to 1 and the rest 0; aij = δij.
  • Lehmer matrix — aij = min(i, j) / max(i, j). A positive symmetric matrix.
  • Matrix of ones — a matrix with all entries equal to one; aij = 1.
  • Pascal matrix — a matrix containing the entries of Pascal's triangle.
  • Pauli matrices — a set of three 2 × 2 complex Hermitian and unitary matrices. Together with the 2 × 2 identity matrix I2, they form an orthogonal basis for the 2 × 2 complex Hermitian matrices.
  • Redheffer matrix — encodes a Dirichlet convolution: the matrix entries are given by the divisor function, and the entries of the inverse are given by the Möbius function; aij = 1 if i divides j or if j = 1, and aij = 0 otherwise. A (0, 1)-matrix.
  • Shift matrix — a matrix with ones on the superdiagonal or subdiagonal and zeroes elsewhere; aij = δi+1,j or aij = δi−1,j. Multiplication by it shifts matrix elements by one position.
  • Zero matrix — a matrix with all entries equal to zero; aij = 0.
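As a concrete illustration of the entrywise formulas above, the following sketch (assuming NumPy; the helper names kronecker_delta, exchange, hilbert and shift are mine, not standard library functions) builds a few of these constant matrices directly from their symbolic descriptions:

```python
import numpy as np

def kronecker_delta(i, j):
    return 1.0 if i == j else 0.0

def exchange(n):
    # a_ij = delta_{n+1-i, j}: ones on the anti-diagonal (1-based indices)
    return np.array([[kronecker_delta(n + 1 - i, j) for j in range(1, n + 1)]
                     for i in range(1, n + 1)])

def hilbert(n):
    # a_ij = 1 / (i + j - 1)
    return np.array([[1.0 / (i + j - 1) for j in range(1, n + 1)]
                     for i in range(1, n + 1)])

def shift(n):
    # ones on the superdiagonal: a_ij = delta_{i+1, j}
    return np.array([[kronecker_delta(i + 1, j) for j in range(1, n + 1)]
                     for i in range(1, n + 1)])

print(exchange(3))                            # a permutation matrix with ones on the anti-diagonal
print(hilbert(3))                             # a Hankel matrix
print(shift(4) @ np.array([1., 2., 3., 4.]))  # shifts the entries by one position: [2. 3. 4. 0.]
```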

Specific patterns for entries

The following lists matrices whose entries are subject to certain conditions. Many of them apply to square matrices only, that is, matrices with the same number of rows and columns. The main diagonal of a square matrix is the diagonal joining the upper left corner and the lower right one, or equivalently the entries ai,i. The other diagonal is called the anti-diagonal (or counter-diagonal). A sketch that checks a few of these patterns numerically follows the list.

  • (0,1)-matrix — a matrix with all elements either 0 or 1. Synonym for binary matrix or logical matrix.
  • Alternant matrix — a matrix in which successive columns have a particular function applied to their entries.
  • Alternating sign matrix — a square matrix with entries 0, +1 and −1 such that the sum of each row and column is 1 and the nonzero entries in each row and column alternate in sign.
  • Anti-diagonal matrix — a square matrix with all entries off the anti-diagonal equal to zero.
  • Anti-Hermitian matrix — synonym for skew-Hermitian matrix.
  • Anti-symmetric matrix — synonym for skew-symmetric matrix.
  • Arrowhead matrix — a square matrix containing zeros in all entries except for the first row, first column, and main diagonal.
  • Band matrix — a square matrix whose non-zero entries are confined to a diagonal band.
  • Bidiagonal matrix — a matrix with elements only on the main diagonal and either the superdiagonal or subdiagonal. Sometimes defined differently; see article.
  • Binary matrix — a matrix whose entries are all either 0 or 1. Synonym for (0,1)-matrix or logical matrix.[1]
  • Bisymmetric matrix — a square matrix that is symmetric with respect to both its main diagonal and its main cross-diagonal.
  • Block-diagonal matrix — a block matrix with entries only on the diagonal.
  • Block matrix — a matrix partitioned into sub-matrices called blocks.
  • Block tridiagonal matrix — a block matrix which is essentially a tridiagonal matrix but with submatrices in place of scalar elements.
  • Boolean matrix — a matrix whose entries are taken from a Boolean algebra.
  • Cauchy matrix — a matrix whose elements are of the form 1/(xi + yj) for injective sequences (xi), (yj) (i.e., sequences taking every value only once).
  • Centrosymmetric matrix — a matrix symmetric about its center; i.e., aij = an−i+1,n−j+1.
  • Circulant matrix — a matrix where each row is a circular shift of its predecessor.
  • Conference matrix — a square matrix with zero diagonal and +1 and −1 off the diagonal, such that CTC is a multiple of the identity matrix.
  • Complex Hadamard matrix — a matrix whose rows and columns are mutually orthogonal and whose entries are unimodular.
  • Compound matrix — a matrix whose entries are generated by the determinants of all minors of a matrix.
  • Copositive matrix — a square matrix A with real coefficients such that xTAx is nonnegative for every nonnegative vector x.
  • Diagonally dominant matrix — a matrix whose entries satisfy |aii| ≥ Σj≠i |aij| in every row i.
  • Diagonal matrix — a square matrix with all entries outside the main diagonal equal to zero.
  • Discrete Fourier-transform matrix — multiplying it by a vector gives the discrete Fourier transform (DFT) of the vector.
  • Elementary matrix — a square matrix derived by applying an elementary row operation to the identity matrix.
  • Equivalent matrix — a matrix that can be derived from another matrix through a sequence of elementary row or column operations.
  • Frobenius matrix — a square matrix in the form of an identity matrix but with arbitrary entries in one column below the main diagonal.
  • GCD matrix — the matrix whose (i, j) entry is the greatest common divisor, aij = gcd(i, j).
  • Generalized permutation matrix — a square matrix with precisely one nonzero element in each row and column.
  • Hadamard matrix — a square matrix with entries +1 and −1 whose rows are mutually orthogonal.
  • Hankel matrix — a matrix with constant skew-diagonals; also an upside-down Toeplitz matrix. A square Hankel matrix is symmetric.
  • Hermitian matrix — a square matrix which is equal to its conjugate transpose, A = A*.
  • Hessenberg matrix — an "almost" triangular matrix; for example, an upper Hessenberg matrix has zero entries below the first subdiagonal.
  • Hollow matrix — a square matrix whose main diagonal comprises only zero elements.
  • Integer matrix — a matrix whose entries are all integers.
  • Logical matrix — a matrix with all entries either 0 or 1. Synonym for (0,1)-matrix, binary matrix or Boolean matrix. Can be used to represent a k-adic relation.
  • Markov matrix — a matrix of non-negative real numbers such that the entries in each row sum to 1.
  • Metzler matrix — a matrix whose off-diagonal entries are non-negative.
  • Monomial matrix — a square matrix with exactly one non-zero entry in each row and column. Synonym for generalized permutation matrix.
  • Moore matrix — a matrix in which a row consists of a, a^q, a^(q²), and so on, with each row using a different variable.
  • Nonnegative matrix — a matrix with all nonnegative entries.
  • Null-symmetric matrix — a square matrix whose null space (or kernel) equals that of its transpose, N(A) = N(AT), or ker(A) = ker(AT). Synonym for kernel-symmetric matrix. Examples include (but are not limited to) symmetric, skew-symmetric, and normal matrices.
  • Null-Hermitian matrix — a square matrix whose null space (or kernel) equals that of its conjugate transpose, N(A) = N(A*), or ker(A) = ker(A*). Synonym for kernel-Hermitian matrix. Examples include (but are not limited to) Hermitian, skew-Hermitian, and normal matrices.
  • Partitioned matrix — a matrix partitioned into sub-matrices, or equivalently, a matrix whose entries are themselves matrices rather than scalars. Synonym for block matrix.
  • Parisi matrix — a block-hierarchical matrix: it consists of growing blocks placed along the diagonal, each block itself a Parisi matrix of smaller size. In the theory of spin glasses it is also known as a replica matrix.
  • Pentadiagonal matrix — a matrix whose only nonzero entries lie on the main diagonal and the two diagonals just above and below it.
  • Permutation matrix — a matrix representation of a permutation: a square matrix with exactly one 1 in each row and column, and all other entries 0.
  • Persymmetric matrix — a matrix that is symmetric about its northeast–southwest diagonal, i.e., aij = an−j+1,n−i+1.
  • Polynomial matrix — a matrix whose entries are polynomials.
  • Positive matrix — a matrix with all positive entries.
  • Quaternionic matrix — a matrix whose entries are quaternions.
  • Random matrix — a matrix whose entries are random variables.
  • Sign matrix — a matrix whose entries are either +1, 0, or −1.
  • Signature matrix — a diagonal matrix whose diagonal elements are either +1 or −1.
  • Single-entry matrix — a matrix where a single element is one and the rest of the elements are zero.
  • Skew-Hermitian matrix — a square matrix which is equal to the negative of its conjugate transpose, A* = −A.
  • Skew-symmetric matrix — a matrix which is equal to the negative of its transpose, AT = −A.
  • Skyline matrix — a rearrangement of the entries of a banded matrix which requires less space.
  • Sparse matrix — a matrix with relatively few non-zero elements. Sparse matrix algorithms can handle huge sparse matrices that are impractical for dense matrix algorithms.
  • Symmetric matrix — a square matrix which is equal to its transpose, A = AT (aij = aji).
  • Toeplitz matrix — a matrix with constant diagonals.
  • Totally positive matrix — a matrix in which the determinants of all square submatrices are positive.
  • Triangular matrix — a matrix with all entries above the main diagonal equal to zero (lower triangular) or with all entries below the main diagonal equal to zero (upper triangular).
  • Tridiagonal matrix — a matrix whose only nonzero entries lie on the main diagonal and the diagonals just above and below it.
  • X–Y–Z matrix — a generalization to three dimensions of the concept of a two-dimensional array.
  • Vandermonde matrix — a matrix in which a row consists of 1, a, a², a³, and so on, with each row using a different variable.
  • Walsh matrix — a square matrix, with dimensions a power of 2, whose entries are +1 or −1 and whose distinct rows (and columns) have pairwise dot product zero.
  • Z-matrix — a matrix with all off-diagonal entries less than or equal to zero.
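The sketch below (NumPy assumed; the predicate and helper names are mine, chosen only for illustration) checks a few of the entry patterns above numerically and builds a Toeplitz matrix from its constant diagonals:

```python
import numpy as np

def is_symmetric(A):
    return np.allclose(A, A.T)                 # A = AT

def is_skew_symmetric(A):
    return np.allclose(A.T, -A)                # AT = -A

def is_diagonally_dominant(A):
    d = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - d        # row sums of off-diagonal magnitudes
    return bool(np.all(d >= off))

def toeplitz(first_col, first_row):
    # constant diagonals: the (i, j) entry depends only on i - j
    n, m = len(first_col), len(first_row)
    return np.array([[first_col[i - j] if i >= j else first_row[j - i]
                      for j in range(m)] for i in range(n)])

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, -1.0],
              [0.0, -1.0, 5.0]])
print(is_symmetric(A), is_diagonally_dominant(A))   # True True
print(toeplitz([1, 2, 3], [1, 9, 8]))               # each diagonal is constant
```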

Matrices satisfying some equations

A number of matrix-related notions are about properties of products or inverses of the given matrix. The matrix product of an m-by-n matrix A and an n-by-k matrix B is the m-by-k matrix C whose entries are given by

$$c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj} = \sum_{\ell=1}^{n} a_{i\ell}\, b_{\ell j}.$$ [2]

This matrix product is denoted AB. Unlike the product of numbers, matrix products are not commutative, that is to say AB need not be equal to BA.[2] A number of notions are concerned with the failure of this commutativity. An inverse of a square matrix A is a matrix B (necessarily of the same dimension as A) such that AB = I; equivalently, BA = I. An inverse need not exist. If it exists, B is uniquely determined, and is also called the inverse of A, denoted A⁻¹.
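As an illustration, the following sketch (assuming NumPy; matmul_explicit is an illustrative name, not a library function) implements the entrywise product formula directly, confirms that it agrees with the built-in product, and shows that AB and BA can differ:

```python
import numpy as np

def matmul_explicit(A, B):
    # c_ij = sum over p of a_ip * b_pj, exactly as in the formula above
    m, n = A.shape
    n2, k = B.shape
    assert n == n2, "inner dimensions must match"
    C = np.zeros((m, k))
    for i in range(m):
        for j in range(k):
            C[i, j] = sum(A[i, p] * B[p, j] for p in range(n))
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

assert np.allclose(matmul_explicit(A, B), A @ B)  # matches NumPy's product
print(np.allclose(A @ B, B @ A))                  # False: the product is not commutative
print(np.linalg.inv(A) @ A)                       # the inverse satisfies B A = I
```

In practice the built-in A @ B should be used; the explicit loops are there only to mirror the formula.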

  • Circular matrix or coninvolutory matrix — a matrix whose inverse is equal to its entrywise complex conjugate. Compare with unitary matrices.
  • Congruent matrix — two matrices A and B are congruent if there exists an invertible matrix P such that PT A P = B. Compare with similar matrices.
  • EP matrix or range-Hermitian matrix — a square matrix that commutes with its Moore–Penrose inverse: AA⁺ = A⁺A.
  • Idempotent matrix or projection matrix — a matrix with the property A² = AA = A. The name projection matrix comes from the observation that projecting a point onto a subspace (a plane or a line) several times gives the same result as projecting it once.
  • Invertible matrix — a square matrix having a multiplicative inverse, that is, a matrix B such that AB = BA = I. Invertible matrices form the general linear group.
  • Involutory matrix — a square matrix which is its own inverse, i.e., AA = I. Signature matrices and Householder matrices (also known as reflection matrices, which reflect a point about a plane or line) have this property.
  • Isometric matrix — a matrix that preserves distances, i.e., a matrix that satisfies A*A = I, where A* denotes the conjugate transpose of A.
  • Nilpotent matrix — a square matrix satisfying A^q = 0 for some positive integer q. Equivalently, the only eigenvalue of A is 0.
  • Normal matrix — a square matrix that commutes with its conjugate transpose: AA* = A*A. Normal matrices are exactly the matrices to which the spectral theorem applies.
  • Orthogonal matrix — a matrix whose inverse is equal to its transpose, A⁻¹ = AT. Orthogonal matrices form the orthogonal group (see the sketch after this list).
  • Orthonormal matrix — a matrix whose columns are orthonormal vectors.
  • Partially isometric matrix — a matrix that is an isometry on the orthogonal complement of its kernel; equivalently, a matrix that satisfies AA*A = A, or equivalently a matrix whose singular values are either 0 or 1.
  • Singular matrix — a square matrix that is not invertible.
  • Unimodular matrix — an invertible matrix with integer entries (an integer matrix). Its determinant is necessarily +1 or −1.
  • Unipotent matrix — a square matrix with all eigenvalues equal to 1. Equivalently, A − I is nilpotent. See also unipotent group.
  • Unitary matrix — a square matrix whose inverse is equal to its conjugate transpose, A⁻¹ = A*.
  • Totally unimodular matrix — a matrix for which every non-singular square submatrix is unimodular. This has implications in the linear programming relaxation of an integer program.
  • Weighing matrix — a square matrix whose entries are in {0, +1, −1}, such that AAT = wI for some positive integer w.
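The following sketch (NumPy assumed; the predicate names are mine) verifies a few of these product- and inverse-based definitions on small examples:

```python
import numpy as np

def is_orthogonal(A):
    return np.allclose(A.T @ A, np.eye(A.shape[0]))        # inverse equals transpose

def is_involutory(A):
    return np.allclose(A @ A, np.eye(A.shape[0]))          # A A = I

def is_idempotent(A):
    return np.allclose(A @ A, A)                           # A² = A

def is_normal(A):
    return np.allclose(A @ A.conj().T, A.conj().T @ A)     # A A* = A* A

theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])            # a rotation matrix
S = np.diag([1.0, -1.0])                                   # a signature matrix
P = np.array([[1.0, 0.0], [0.0, 0.0]])                     # projection onto the x-axis

print(is_orthogonal(R), is_involutory(S), is_idempotent(P), is_normal(R))  # True True True True
```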

Matrices with conditions on eigenvalues or eigenvectors

  • Convergent matrix — a square matrix whose successive powers approach the zero matrix. Its eigenvalues have magnitude less than one (see the numerical sketch after this list).
  • Defective matrix — a square matrix that does not have a complete basis of eigenvectors, and is thus not diagonalizable.
  • Derogatory matrix — an n × n square matrix whose minimal polynomial has degree less than n. Equivalently, at least one of its eigenvalues has at least two Jordan blocks.[3]
  • Diagonalizable matrix — a square matrix similar to a diagonal matrix. It has an eigenbasis, that is, a complete set of linearly independent eigenvectors.
  • Hurwitz matrix — a matrix whose eigenvalues have strictly negative real part. A stable system of differential equations may be represented by a Hurwitz matrix.
  • M-matrix — a Z-matrix with eigenvalues whose real parts are nonnegative.
  • Positive-definite matrix — a Hermitian matrix with every eigenvalue positive.
  • Stability matrix — synonym for Hurwitz matrix.
  • Stieltjes matrix — a real symmetric positive definite matrix with nonpositive off-diagonal entries. A special case of an M-matrix.
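A minimal numerical sketch (assuming NumPy) of two of these conditions: a convergent matrix has spectral radius below one, and a positive-definite Hermitian matrix has only positive eigenvalues:

```python
import numpy as np

def spectral_radius(A):
    return max(abs(np.linalg.eigvals(A)))

def is_positive_definite(H):
    # assumes H is Hermitian; checks that every eigenvalue is positive
    return bool(np.all(np.linalg.eigvalsh(H) > 0))

A = np.array([[0.5, 0.2], [0.0, 0.4]])
print(spectral_radius(A) < 1)                                      # True: A is convergent
print(np.allclose(np.linalg.matrix_power(A, 50), 0, atol=1e-12))   # its powers approach the zero matrix

H = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(is_positive_definite(H))                                     # True: eigenvalues are 1 and 3
```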

Matrices generated by specific data

  • Adjugate matrix — the transpose of the cofactor matrix. The inverse of an invertible matrix is its adjugate divided by its determinant.
  • Augmented matrix — a matrix whose rows are concatenations of the rows of two smaller matrices. Used for performing the same row operations on two matrices.
  • Bézout matrix — a square matrix whose determinant is the resultant of two polynomials. See also Sylvester matrix.
  • Carleman matrix — the infinite matrix of the Taylor coefficients of an analytic function and its integer powers. The composition of two functions can be expressed as the product of their Carleman matrices.
  • Cartan matrix — a matrix associated with either a finite-dimensional associative algebra or a semisimple Lie algebra.
  • Cofactor matrix — formed by the cofactors of a square matrix, that is, the signed minors of the matrix. The transpose of the adjugate matrix.
  • Companion matrix — a matrix having the coefficients of a polynomial as its last column, and having that polynomial as its characteristic polynomial.
  • Coxeter matrix — a matrix which describes the relations between the involutions that generate a Coxeter group.
  • Distance matrix — the square matrix formed by the pairwise distances of a set of points. The Euclidean distance matrix is a special case.
  • Euclidean distance matrix — a matrix that describes the pairwise distances between points in Euclidean space. See also distance matrix.
  • Fundamental matrix — the matrix formed from the fundamental solutions of a system of linear differential equations.
  • Generator matrix — in coding theory, a matrix whose rows span a linear code.
  • Gramian matrix — the symmetric matrix of the pairwise inner products of a set of vectors in an inner product space.
  • Hessian matrix — the square matrix of second partial derivatives of a function of several variables.
  • Householder matrix — the matrix of a reflection with respect to a hyperplane passing through the origin (see the sketch after this list).
  • Jacobian matrix — the matrix of the partial derivatives of a function of several variables.
  • Moment matrix — a matrix used in statistics and sum-of-squares optimization.
  • Payoff matrix — a matrix in game theory and economics that represents the payoffs in a normal-form game where players move simultaneously.
  • Pick matrix — a matrix that occurs in the study of analytical interpolation problems.
  • Rotation matrix — a matrix representing a rotation.
  • Seifert matrix — a matrix in knot theory, used primarily for the algebraic analysis of topological properties of knots and links. See Alexander polynomial.
  • Shear matrix — the matrix of a shear transformation.
  • Similarity matrix — a matrix of scores which express the similarity between two data points. Used in sequence alignment.
  • Sylvester matrix — a square matrix whose entries come from the coefficients of two polynomials. The Sylvester matrix is nonsingular if and only if the two polynomials are coprime to each other.
  • Symplectic matrix — the real matrix of a symplectic transformation.
  • Transformation matrix — the matrix of a linear transformation or a geometric transformation.
  • Wedderburn matrix — a matrix of the form [...], used for rank reduction and biconjugate decompositions. See analysis of matrix decompositions.
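As an illustration of matrices generated from data, the sketch below (NumPy assumed; householder and gram are my own helper names) builds a Householder matrix from a vector and a Gramian matrix from a set of column vectors:

```python
import numpy as np

def householder(v):
    # H = I - 2 v v^T / (v^T v): reflection across the hyperplane orthogonal to v
    v = np.asarray(v, dtype=float).reshape(-1, 1)
    n = v.shape[0]
    return np.eye(n) - 2.0 * (v @ v.T) / (v.T @ v)

def gram(X):
    # columns of X are the vectors; the (i, j) entry is the inner product of x_i and x_j
    return X.T @ X

v = np.array([1.0, 1.0, 0.0])
H = householder(v)
print(np.allclose(H @ H, np.eye(3)))     # True: a Householder matrix is involutory
print(np.allclose(H.T @ H, np.eye(3)))   # True: it is also orthogonal

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
print(gram(X))                           # a symmetric positive semidefinite Gramian matrix
```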

Matrices used in statistics

The following matrices find their main application in statistics and probability theory.

  • Bernoulli matrix — a square matrix with entries +1, −1, with equal probability of each.
  • Centering matrix — a matrix which, when multiplied with a vector, has the same effect as subtracting the mean of the components of the vector from every component.
  • Correlation matrix — a symmetric n×n matrix, formed by the pairwise correlation coefficients of several random variables.
  • Covariance matrix — a symmetric n×n matrix, formed by the pairwise covariances of several random variables. Sometimes called a dispersion matrix. (A construction sketch follows this list.)
  • Dispersion matrix — another name for a covariance matrix.
  • Doubly stochastic matrix — a non-negative matrix such that each row and each column sums to 1 (thus the matrix is both left stochastic and right stochastic)
  • Fisher information matrix — a matrix representing the variance of the partial derivative, with respect to a parameter, of the log of the likelihood function of a random variable.
  • Hat matrix — a square matrix used in statistics to relate fitted values to observed values.
  • Orthostochastic matrix — doubly stochastic matrix whose entries are the squares of the absolute values of the entries of some orthogonal matrix
  • Precision matrix — a symmetric n×n matrix, formed by inverting the covariance matrix. Also called the information matrix.
  • Stochastic matrix — a non-negative matrix describing a stochastic process. The sum of entries of any row is one.
  • Transition matrix — a matrix representing the probabilities of conditions changing from one state to another in a Markov chain
  • Unistochastic matrix — a doubly stochastic matrix whose entries are the squares of the absolute values of the entries of some unitary matrix
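A short sketch (assuming NumPy; the data are randomly generated purely for illustration) showing how covariance, correlation and stochastic matrices arise:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))          # 100 observations of 3 random variables

cov = np.cov(data, rowvar=False)          # 3x3 symmetric covariance (dispersion) matrix
corr = np.corrcoef(data, rowvar=False)    # 3x3 correlation matrix, ones on the diagonal
print(np.allclose(cov, cov.T), np.allclose(np.diag(corr), 1.0))   # True True

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])            # a (right) stochastic transition matrix
print(np.allclose(P.sum(axis=1), 1.0))     # True: each row sums to 1
```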

Matrices used in graph theory

The following matrices find their main application in graph and network theory.

  • Adjacency matrix — a square matrix representing a graph, with aij non-zero if vertex i and vertex j are adjacent.
  • Biadjacency matrix — a special class of adjacency matrix that describes adjacency in bipartite graphs.
  • Degree matrix — a diagonal matrix defining the degree of each vertex in a graph.
  • Edmonds matrix — a square matrix of a bipartite graph.
  • Incidence matrix — a matrix representing a relationship between two classes of objects (usually vertices and edges in the context of graph theory).
  • Laplacian matrix — a matrix equal to the degree matrix minus the adjacency matrix for a graph, used to find the number of spanning trees in the graph (see the sketch after this list).
  • Seidel adjacency matrix — a matrix similar to the usual adjacency matrix but with −1 for adjacency; +1 for nonadjacency; 0 on the diagonal.
  • Skew-adjacency matrix — an adjacency matrix in which each non-zero aij is +1 or −1, according as the direction i → j matches or opposes that of an initially specified orientation.
  • Tutte matrix — a generalization of the Edmonds matrix for a balanced bipartite graph.
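The sketch below (NumPy assumed; the small example graph is invented for illustration) builds the adjacency, degree and Laplacian matrices of a 4-cycle and uses the Laplacian to count its spanning trees via the matrix-tree theorem:

```python
import numpy as np

# Undirected 4-cycle on vertices 0-1-2-3-0
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)   # adjacency matrix

D = np.diag(A.sum(axis=1))                  # degree matrix
L = D - A                                   # Laplacian matrix

# Matrix-tree theorem: any cofactor of L equals the number of spanning trees
trees = np.linalg.det(L[1:, 1:])
print(round(trees))                         # 4 spanning trees in a 4-cycle
```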

Matrices used in science and engineering

Specific matrices

See also

Notes

References
