{{More footnotes|date=November 2009}}
 
In [[linear algebra]], a '''symmetric matrix''' is a [[square matrix]] that is equal to its [[transpose]]. Formally, matrix ''A'' is symmetric if
 
:<math>A = A^{\top}.</math>
 
Because equal matrices have equal dimensions, only square matrices can be symmetric.
 
The entries of a symmetric matrix are symmetric with respect to the [[main diagonal]]. So if the entries are written as ''A'' = (''a''<sub>''ij''</sub>), then ''a''<sub>''ij''</sub> = ''a''<sub>''ji''</sub> for all indices ''i'' and ''j''.
 
The following 3×3 matrix is symmetric:
 
:<math>\begin{bmatrix}
1 & 7 & 3\\
7 & 4 & -5\\
3 & -5 & 6\end{bmatrix}.</math>
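
As a quick numerical check of the definition (a minimal sketch using NumPy; the matrix is the example above), the matrix equals its own transpose and its entries mirror across the main diagonal:

<syntaxhighlight lang="python">
import numpy as np

# The 3×3 example above: entries are mirrored across the main diagonal.
A = np.array([[1,  7,  3],
              [7,  4, -5],
              [3, -5,  6]])

print(np.array_equal(A, A.T))   # True: A is symmetric
print(A[0, 2], A[2, 0])         # 3 3  (a_ij == a_ji)
</syntaxhighlight>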
 
Every square [[diagonal matrix]] is symmetric, since all off-diagonal entries are zero. Similarly, each diagonal element of a [[skew-symmetric matrix]] must be zero, since each is its own negative.
 
In linear algebra, a [[real number|real]] symmetric matrix represents a [[self-adjoint operator]] over a [[real number|real]] [[inner product space]]. The corresponding object for a [[complex number|complex]] inner product space is a [[Hermitian matrix]] with complex-valued entries, which is equal to its [[conjugate transpose]]. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries.  Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
 
== Properties ==
 
The finite-dimensional [[spectral theorem]] says that any symmetric matrix whose entries are [[real number|real]] can be [[diagonal matrix|diagonalized]] by an [[orthogonal matrix]]. More explicitly: For every symmetric real matrix ''A'' there exists a real orthogonal matrix ''Q'' such that ''D'' = ''Q''<sup>T</sup>''AQ'' is a [[diagonal matrix]]. Every symmetric matrix is thus, [[up to]] choice of an [[orthonormal basis]], a diagonal matrix.
 
Another way to phrase the spectral theorem is that a real ''n''×''n'' matrix ''A'' is symmetric if and only if there is an orthonormal basis of <math>\mathbb{R}^n</math> consisting of eigenvectors for ''A''.
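
The spectral theorem can be illustrated numerically (a sketch using NumPy's symmetric eigensolver <code>numpy.linalg.eigh</code>; the example matrix is the one from the introduction): the columns of ''Q'' form an orthonormal eigenvector basis, and ''Q''<sup>T</sup>''AQ'' is diagonal.

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1.,  7.,  3.],
              [7.,  4., -5.],
              [3., -5.,  6.]])              # real symmetric

eigenvalues, Q = np.linalg.eigh(A)           # columns of Q: orthonormal eigenvectors
D = Q.T @ A @ Q                              # conjugation by the orthogonal Q

print(np.allclose(Q.T @ Q, np.eye(3)))       # True: Q is orthogonal
print(np.allclose(D, np.diag(eigenvalues)))  # True: D = Q^T A Q is diagonal
</syntaxhighlight>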
 
Every real symmetric matrix is [[Hermitian matrix|Hermitian]], and therefore all its [[eigenvalues]] are real. (In fact, the eigenvalues are the entries in the diagonal matrix ''D'' (above), and therefore ''D'' is uniquely determined by ''A'' up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
 
A complex symmetric matrix can be diagonalized using a unitary matrix: thus if ''A'' is a complex symmetric matrix, there is a unitary matrix ''U'' such that
''UAU''<sup>t</sup> is a diagonal matrix. This result is referred to as the '''Autonne–Takagi factorization'''. It was originally proved by Leon Autonne (1915) and [[Teiji Takagi]] (1925) and rediscovered with different proofs by several other mathematicians.<ref>{{harvnb|Horn|Johnson|2013|page=278}}</ref><ref>See:
*{{citation|first=L.|last= Autonne|title= Sur les matrices hypohermitiennes et sur les matrices unitaires|journal= Ann. Univ. Lyon|volume= 38|year=1915|pages= 1–77}}
*{{citation|first=T.|last= Takagi|title= On an algebraic problem related to an analytic theorem of Carathéodory and Fejér and on an allied theorem of Landau|journal= Japan. J. Math.|volume= 1 |year=1925|pages= 83–93}}
*{{citation|title=Symplectic Geometry|first=Carl Ludwig|last= Siegel|journal= American Journal of Mathematics|volume= 65|year=1943|pages=1–86|url= http://www.jstor.org/stable/2371774}}, Lemma 1, page 12
*{{citation|first=L.-K.|last= Hua|title= On the theory of automorphic functions of a matrix variable I–geometric basis|journal= Amer. J. Math.|volume= 66 |year=1944|pages= 470–488}}
*{{citation|first=I.|last= Schur|title= Ein Satz über quadratische formen mit komplexen koeffizienten|journal=Amer. J. Math.|volume=67|year=1945|pages=472–480}}
*{{citation|first1=R.|last1= Benedetti|first2=P.|last2= Cragnolini|title=On simultaneous diagonalization of one Hermitian and one symmetric form|journal= Linear Algebra Appl. |volume=57 |year=1984| pages=215–226}}
</ref> In fact the matrix ''B'' = ''A''*''A'' is Hermitian and non-negative, so there is a unitary matrix ''V'' such that ''V''*''BV'' is diagonal with non-negative real entries. Thus ''C'' = ''V''<sup>''t''</sup>''AV'' is complex symmetric with ''C''*''C'' real. Writing ''C'' = ''X'' + ''iY'' with ''X'' and ''Y'' real symmetric matrices,  ''C''*''C'' = ''X''<sup>2</sup> − ''Y''<sup>2</sup> + i
(''XY'' − ''YX''). Thus ''XY'' = ''YX''. Since ''X'' and ''Y'' commute, there is a real orthogonal matrix ''W'' such that ''WXW''<sup>''t''</sup> and ''WYW''<sup>''t''</sup> are diagonal. Setting ''U'' = ''WV''<sup>''t''</sup>, the matrix ''UAU''<sup>''t''</sup> is diagonal. Post-multiplying ''U'' by a diagonal matrix the diagonal entries can be taken to be non-negative. Since their squares are the eigenvalues of ''A''*''A'', they coincide with the [[singular value]]s of ''A''.
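
The construction in this proof can be traced numerically in the generic case (a hedged sketch using NumPy; it assumes the real part ''X'' has distinct eigenvalues, so that the orthogonal matrix diagonalizing ''X'' also diagonalizes ''Y''):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.T) / 2                  # complex symmetric: A == A.T (no conjugation)

B = A.conj().T @ A                 # Hermitian and positive semidefinite
_, V = np.linalg.eigh(B)           # unitary V with V* B V diagonal
C = V.T @ A @ V                    # complex symmetric, with C* C real
X, Y = C.real, C.imag              # real symmetric and commuting
_, W = np.linalg.eigh(X)           # if X has distinct eigenvalues, W^T also diagonalizes Y
U = W.T @ V.T                      # the unitary U = W V^T from the proof
D = U @ A @ U.T                    # should be diagonal up to rounding

sv = np.linalg.svd(A, compute_uv=False)
print(np.allclose(D, np.diag(np.diag(D))))                    # True: off-diagonal ~ 0
print(np.allclose(np.sort(np.abs(np.diag(D))), np.sort(sv)))  # True: |diagonal| = singular values
</syntaxhighlight>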
 
The sum and difference of two symmetric matrices are again symmetric, but this is not always true for the [[matrix multiplication|product]]: given symmetric matrices ''A'' and ''B'', the product ''AB'' is symmetric if and only if ''A'' and ''B'' [[commutativity|commute]], i.e., if ''AB'' = ''BA''. So for any positive integer ''n'', ''A''<sup>''n''</sup> is symmetric if ''A'' is symmetric. If ''A'' and ''B'' are ''n''×''n'' real symmetric matrices that commute, then there exists a basis of <math>\mathbb{R}^n</math> such that every element of the basis is an [[eigenvector]] for both ''A'' and ''B''.
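
A small numerical illustration of the product rule (a sketch with NumPy; the example matrices are arbitrary):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1., 2.], [2., 3.]])
B = np.array([[0., 1.], [1., 0.]])        # both symmetric, but AB != BA

print(np.allclose(A @ B, B @ A))          # False: A and B do not commute
print(np.allclose(A @ B, (A @ B).T))      # False: AB is not symmetric

C = A @ A                                 # A commutes with its own powers
print(np.allclose(A @ C, (A @ C).T))      # True: A^3 is symmetric
</syntaxhighlight>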
 
If ''A''<sup>&minus;1</sup> exists, it is symmetric if and only if ''A'' is symmetric.
 
Let Mat<sub>''n''</sub> denote the space of {{nowrap|1=''n'' &times; ''n''}} matrices. A symmetric ''n''&nbsp;&times;&nbsp;''n'' matrix is determined by ''n''(''n''&nbsp;+&nbsp;1)/2 scalars (the number of entries on or above the [[main diagonal]]). Similarly, a [[skew-symmetric matrix]] is determined by ''n''(''n''&nbsp;&minus;&nbsp;1)/2 scalars (the number of entries above the main diagonal). If Sym<sub>''n''</sub> denotes the space of {{nowrap|1=''n'' &times; ''n''}} symmetric matrices and Skew<sub>''n''</sub> the space of {{nowrap|1=''n'' &times; ''n''}} skew-symmetric matrices, then {{nowrap|1=Mat<sub>''n''</sub> = Sym<sub>''n''</sub> + Skew<sub>''n''</sub>}} and {{nowrap|1=Sym<sub>''n''</sub> &cap; Skew<sub>''n''</sub> = {0}}}, i.e.
:<math> \mbox{Mat}_n = \mbox{Sym}_n \oplus \mbox{Skew}_n , </math>
where ⊕ denotes the [[Direct sum of modules|direct sum]]. Let {{nowrap|1=X &isin; Mat<sub>''n''</sub>}}; then
:<math> X = \frac{1}{2}(X + X^{\top}) + \frac{1}{2}(X - X^{\top}) . </math>
Notice that {{nowrap|1=½(''X'' + ''X''<sup>T</sup>) &isin; Sym<sub>''n''</sub>}} and {{nowrap|1=½(''X'' &minus; ''X''<sup>T</sup>) &isin; Skew<sub>''n''</sub>.}} This is true for every [[square matrix]] ''X'' with entries from any [[field (mathematics)|field]] whose [[characteristic (algebra)|characteristic]] is different from 2.
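
This decomposition can be computed directly (a minimal sketch in NumPy; ''X'' is an arbitrary example):

<syntaxhighlight lang="python">
import numpy as np

X = np.array([[1., 2., 0.],
              [5., 3., 7.],
              [4., 6., 9.]])

sym  = (X + X.T) / 2              # symmetric part, in Sym_n
skew = (X - X.T) / 2              # skew-symmetric part, in Skew_n

print(np.allclose(sym, sym.T))    # True
print(np.allclose(skew, -skew.T)) # True
print(np.allclose(sym + skew, X)) # True: X = sym + skew
</syntaxhighlight>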
 
Any matrix [[Matrix congruence|congruent]] to a symmetric matrix is again symmetric: if ''X'' is a symmetric matrix then so is ''AXA''<sup>T</sup> for any matrix ''A''.
 
<!--If A is a skew-symmetric matrix, then ''iA'' (where ''i'' is an [[imaginary unit]]) is symmetric.-->
Denote by <math>\langle \cdot,\cdot \rangle</math> the standard [[inner product]] on '''R'''<sup>''n''</sup>. The real ''n''-by-''n'' matrix ''A'' is symmetric if and only if
 
:<math>\langle Ax,y \rangle = \langle x, Ay\rangle \quad \forall x,y\in\Bbb{R}^n.</math>
 
Since this definition is independent of the choice of [[basis (linear algebra)|basis]], symmetry is a property that depends only on the [[linear operator]] A and a choice of [[inner product]]. This characterization of symmetry is useful, for example, in [[differential geometry]], for each [[tangent space]] to a [[manifold]] may be endowed with an inner product, giving rise to what is called a [[Riemannian manifold]]. Another area where this formulation is used is in [[Hilbert space]]s.
 
Every real symmetric matrix is a [[normal matrix]]; a complex symmetric matrix need not be normal.
 
== Decomposition ==
Using the [[Jordan normal form]], one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.<ref>{{cite journal | first=A. J.|last= Bosch | title=The factorization of a square matrix into two symmetric matrices | journal=[[American Mathematical Monthly]] | year=1986 | volume=93 | pages=462–464 | doi=10.2307/2323471 | issue=6 | jstor=2323471}}</ref>
 
Every real [[non-singular matrix]] can be uniquely factored as the product of an [[orthogonal matrix]] and a symmetric [[positive definite matrix]], which is called a [[polar decomposition]]. Singular matrices can also be factored, but not uniquely.
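
One standard way to obtain the polar decomposition is from the [[singular value decomposition]] (a sketch in NumPy, not the only possible method): writing ''A'' = ''UΣV''<sup>T</sup>, the orthogonal factor is ''UV''<sup>T</sup> and the symmetric positive-definite factor is ''VΣV''<sup>T</sup>.

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[3., 1.],
              [1., 2.]]) @ np.array([[0., -1.],
                                     [1.,  0.]])   # an arbitrary non-singular matrix

U, s, Vt = np.linalg.svd(A)
R = U @ Vt                              # orthogonal factor
P = Vt.T @ np.diag(s) @ Vt              # symmetric positive-definite factor

print(np.allclose(R @ P, A))            # True: A = R P
print(np.allclose(R.T @ R, np.eye(2)))  # True: R is orthogonal
print(np.allclose(P, P.T), np.all(np.linalg.eigvalsh(P) > 0))  # True True
</syntaxhighlight>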
 
[[Cholesky decomposition]] states that every real positive-definite symmetric matrix ''A'' is a product of a lower-triangular matrix ''L'' and its transpose, <math>A=L L^T</math>.
If the matrix is symmetric indefinite, it may still be decomposed as <math> P A P^T = L D L^T</math> where <math>P</math> is
a permutation matrix (arising from the need to [[Pivot element|pivot]]), <math>L</math> a lower unit triangular matrix, and <math>D</math>
a direct sum of symmetric 1×1 and 2×2 blocks.<ref>{{cite book | author=G.H. Golub, C.F. van Loan. | title=Matrix Computations | publisher=The Johns Hopkins University Press, Baltimore, London | year=1996}}</ref>
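
Both factorizations can be sketched numerically (using NumPy's <code>cholesky</code> for the positive-definite case; the indefinite case assumes SciPy's <code>scipy.linalg.ldl</code> is available):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import ldl   # assumed available (SciPy >= 1.1)

# Positive-definite case: Cholesky factor L with A = L L^T.
A = np.array([[4., 2.],
              [2., 3.]])
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))        # True

# Symmetric indefinite case: pivoted block LDL^T factorization.
B = np.array([[0., 1.],
              [1., 0.]])               # indefinite, has no Cholesky factor
lu, d, perm = ldl(B)
print(np.allclose(lu @ d @ lu.T, B))   # True: B = L D L^T (pivoting absorbed into lu)
</syntaxhighlight>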
 
Every complex symmetric matrix ''A'' can be diagonalized by unitary congruence,
:<math>A = Q \Lambda Q^{\top} , </math>
where ''Q'' is a [[unitary matrix]] (this is the Autonne–Takagi factorization above). If ''A'' is real, this is the [[Eigendecomposition of a matrix#Symmetric_matrices|eigendecomposition]] of ''A'': ''Q'' is an [[orthogonal matrix]] whose columns are [[eigenvectors]] of ''A'', and ''Λ'' is real and diagonal (having the [[eigenvalues]] of ''A'' on the diagonal). To see orthogonality, suppose <math>x</math> and <math>y</math> are eigenvectors corresponding to distinct eigenvalues <math>\lambda_1</math>, <math>\lambda_2</math>. Then
:<math>\lambda_1 \langle x,y \rangle = \langle Ax, y \rangle = \langle x, Ay \rangle = \lambda_2 \langle x, y \rangle</math>
so that if <math>\langle x, y \rangle \neq 0</math> then <math>\lambda_1 = \lambda_2</math>, a contradiction; hence <math>\langle x, y \rangle = 0</math>.
 
== Hessian ==
 
Symmetric real ''n''-by-''n'' matrices appear as the [[Hessian matrix|Hessian]] of twice continuously differentiable functions of ''n'' real variables.
 
Every [[quadratic form]] ''q'' on '''R'''<sup>''n''</sup> can be uniquely written in the form ''q''('''x''') = '''x'''<sup>T</sup>''A'''''x''' with a symmetric ''n''-by-''n'' matrix ''A''. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of '''R'''<sup>''n''</sup>, "looks like"
:<math>q(x_1,\ldots,x_n)=\sum_{i=1}^n \lambda_i x_i^2</math>
with real numbers λ<sub>''i''</sub>. This considerably simplifies the study of quadratic forms, as well as the study of the level sets {'''x''' : ''q''('''x''') = 1} which are generalizations of [[conic section]]s.
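
Concretely, the orthogonal change of variables '''y''' = ''Q''<sup>T</sup>'''x''', with ''Q'' from the spectral theorem, turns ''q'' into a weighted sum of squares (a sketch in NumPy; the quadratic form is an arbitrary example):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])                 # symmetric matrix of q(x) = x^T A x
lam, Q = np.linalg.eigh(A)               # eigenvalues lam, orthonormal eigenvectors Q

x = np.array([0.3, -1.2])                # an arbitrary test point
y = Q.T @ x                              # coordinates in the eigenvector basis

q_direct = x @ A @ x
q_diagonal = np.sum(lam * y**2)          # sum_i lambda_i y_i^2
print(np.isclose(q_direct, q_diagonal))  # True
</syntaxhighlight>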
 
This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of [[Taylor's theorem]].
 
== Symmetrizable matrix ==
An ''n''-by-''n'' matrix ''A'' is said to be '''symmetrizable''' if there exist an invertible [[diagonal matrix]] ''D'' and symmetric matrix ''S'' such that {{nowrap|1=''A'' = ''DS''.}}
The transpose of a symmetrizable matrix is symmetrizable, since {{nowrap|1=(''DS'')<sup>T</sup> = ''SD'' = ''D''<sup>&minus;1</sup>(''DSD'')}} and ''DSD'' is symmetric. A matrix {{nowrap|1=''A'' = (''a''<sub>''ij''</sub>)}} is symmetrizable if and only if the following conditions are met (a small numerical sketch follows the list):
# <math>a_{ij} = 0</math> implies <math>a_{ji}=0</math> for all <math>1 \le i \le j \le n.</math>
# <math>a_{i_1i_2} a_{i_2i_3}\dots a_{i_ki_1} = a_{i_2i_1} a_{i_3i_2}\dots a_{i_1i_k}</math> for any finite sequence <math>(i_1, i_2, \dots, i_k).</math>
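
Under the extra assumption that the first row and column of ''A'' contain no zeros, the scaling matrix is forced (up to a scalar) by {{nowrap|1=''d''<sub>1</sub> = 1}} and {{nowrap|1=''d<sub>j</sub>'' = ''a''<sub>''j''1</sub>/''a''<sub>1''j''</sub>}}, which gives a simple numerical test (a hedged sketch in NumPy; the helper <code>is_symmetrizable</code> is hypothetical):

<syntaxhighlight lang="python">
import numpy as np

def is_symmetrizable(A, tol=1e-10):
    """Hypothetical check: assumes the first row and column of A have no zeros."""
    n = A.shape[0]
    d = np.ones(n)
    d[1:] = A[1:, 0] / A[0, 1:]      # forced by requiring s_1j == s_j1 with d_1 = 1
    S = A / d[:, None]               # S = D^{-1} A
    return np.all(np.abs(S - S.T) < tol)

# D S with D = diag(1, 2, 3) and S symmetric is symmetrizable by construction.
S = np.array([[1., 4., 2.],
              [4., 3., 5.],
              [2., 5., 6.]])
D = np.diag([1., 2., 3.])
print(is_symmetrizable(D @ S))       # True
print(is_symmetrizable(np.array([[1., 2., 3.],
                                 [4., 5., 6.],
                                 [7., 8., 10.]])))   # False (generic matrix)
</syntaxhighlight>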
 
== See also ==
Other types of [[symmetry]] or pattern in square matrices have special names; see for example:
 
* [[Antimetric matrix]]
* [[Centrosymmetric matrix]]
* [[Circulant matrix]]
* [[Covariance matrix]]
* [[Coxeter matrix]]
* [[Hankel matrix]]
* [[Hilbert matrix]]
* [[Persymmetric matrix]]
* [[Skew-symmetric matrix]]
* [[Toeplitz matrix]]
 
See also [[symmetry in mathematics]].
 
== Notes ==
{{Reflist}}
==References==
*{{citation|last=Horn|first= Roger A.|last2= Johnson|first2= Charles R.|title= Matrix analysis|edition=2nd| publisher=Cambridge University Press|year= 2013|id= ISBN 978-0-521-54823-6}}
 
== External links ==
* {{springer|title=Symmetric matrix|id=p/s091680}}
* [http://farside.ph.utexas.edu/teaching/336k/Newton/node66.html A brief introduction and proof of eigenvalue properties of the real symmetric matrix]
 
[[Category:Matrices]]
