In mathematics, '''Sylvester's criterion''' is a [[necessary and sufficient condition|necessary and sufficient]] criterion to determine whether a [[Hermitian matrix]] is [[positive-definite matrix|positive-definite]]. It is named after [[James Joseph Sylvester]].

Sylvester's criterion states that a Hermitian matrix ''M'' is positive-definite if and only if all the following matrices have a positive [[determinant]]:
* the upper left 1-by-1 corner of <math>M</math>,
* the upper left 2-by-2 corner of <math>M</math>,
* the upper left 3-by-3 corner of <math>M</math>,
* ...
* <math>M</math> itself.

In other words, all of the leading [[principal minor]]s must be positive.
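
For example, the criterion can be checked directly on a small symmetric matrix (a worked example chosen for illustration, not taken from the cited sources). For

: <math>M = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix},</math>

the leading principal minors are <math>\Delta_1 = 2 > 0</math> and <math>\Delta_2 = 2 \cdot 2 - 1 \cdot 1 = 3 > 0</math>, so ''M'' is positive-definite (its eigenvalues are 1 and 3). By contrast, swapping the diagonal and off-diagonal entries gives <math>\Delta_1 = 1 > 0</math> but <math>\Delta_2 = 1 - 4 = -3 < 0</math>, and the resulting matrix is indefinite (its eigenvalues are 3 and −1).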

== Proof ==
The proof is given only for a nonsingular [[Hermitian matrix]] with coefficients in <math>\mathbb{R}</math>, that is, for a [[nonsingular]] real-symmetric matrix.

'''Positive definite or semidefinite matrix:''' A symmetric matrix <math>A</math> whose eigenvalues are positive (''λ'' > 0) is called [[positive-definite matrix|positive definite]]; when the eigenvalues are merely nonnegative (''λ'' ≥ 0), <math>A</math> is said to be [[positive-semidefinite matrix|positive semidefinite]].

'''Theorem I:''' A real-symmetric matrix <math>A</math> has nonnegative eigenvalues if and only if <math>A</math> can be factored as <math>A = B^TB</math>, and all eigenvalues are positive if and only if <math>B</math> is nonsingular.<ref name="ref1"/>

{| cellspacing="0" cellpadding="1"
|-
|valign="top"| '''Proof:''' ||
'''Forward implication:''' If ''A'' ∈ '''R'''<sup>''n''×''n''</sup> is symmetric, then, by the [[spectral theorem]], there is an orthogonal matrix ''P'' such that ''A = PDP<sup>T</sup>'', where ''D'' = diag(''λ''<sub>1</sub>, ''λ''<sub>2</sub>, . . . , ''λ''<sub>''n''</sub>) is a real diagonal matrix whose entries are the eigenvalues of ''A'', and the columns of ''P'' are corresponding eigenvectors of ''A''. If ''λ''<sub>''i''</sub> ≥ 0 for each ''i'', then ''D''<sup>1/2</sup> exists, so ''A = PDP<sup>T</sup> = PD<sup>1/2</sup>D<sup>1/2</sup>P<sup>T</sup> = B<sup>T</sup>B'' for ''B = D<sup>1/2</sup>P<sup>T</sup>'', and ''λ''<sub>''i''</sub> > 0 for each ''i'' if and only if ''B'' is nonsingular.

'''Reverse implication:''' Conversely, if ''A'' can be factored as ''A = B<sup>T</sup>B'', then all eigenvalues of ''A'' are nonnegative, because for any eigenpair (''λ'', ''x''):

: <math>\lambda = \frac{x^T A x}{x^T x} = \frac{x^T B^T B x}{x^T x} = \frac{\|Bx\|^2}{\|x\|^2} \geq 0.</math>
|}
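
As a concrete illustration of '''Theorem I''' (an example chosen for illustration, not taken from the cited references), take the nonsingular matrix

: <math>B = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad B^T B = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}.</math>

The product ''B<sup>T</sup>B'' is symmetric with trace 3 and determinant 1, so its eigenvalues <math>(3 \pm \sqrt{5})/2</math> are both positive, as the theorem predicts for nonsingular ''B''.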

'''Theorem II (The Cholesky decomposition):''' The symmetric matrix ''A'' possesses positive pivots if and only if ''A'' can be uniquely factored as ''A = R<sup>T</sup>R'', where ''R'' is an upper-triangular matrix with positive diagonal entries. This is known as the [[Cholesky decomposition]] of ''A'', and ''R'' is called the Cholesky factor of ''A''.<ref name="ref2"/>

{| cellspacing="0" cellpadding="1"
|-
|valign="top"| '''Proof:''' ||
'''Forward implication:''' If ''A'' possesses positive pivots (and therefore an ''LU'' factorization ''A'' = ''LU''′), then it has an ''LDU'' factorization ''A = LDU = LDL<sup>T</sup>'' in which ''D'' = diag(''u''<sub>11</sub>, ''u''<sub>22</sub>, . . . , ''u''<sub>''nn''</sub>) is the diagonal matrix containing the pivots ''u''<sub>''ii''</sub> > 0:

: <math>A = LU' = \begin{bmatrix}
1 & 0 & \cdots & 0\\
l_{21} & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
l_{n1} & l_{n2} & \cdots & 1 \end{bmatrix}
\begin{bmatrix}
u_{11} & u_{12} & \cdots & u_{1n}\\
0 & u_{22} & \cdots & u_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & u_{nn} \end{bmatrix}
= \begin{bmatrix}
1 & 0 & \cdots & 0\\
l_{21} & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
l_{n1} & l_{n2} & \cdots & 1 \end{bmatrix}
\begin{bmatrix}
u_{11} & 0 & \cdots & 0\\
0 & u_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & u_{nn} \end{bmatrix}
\begin{bmatrix}
1 & u_{12}/u_{11} & \cdots & u_{1n}/u_{11}\\
0 & 1 & \cdots & u_{2n}/u_{22} \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 1 \end{bmatrix} = LDU.</math>

By a uniqueness property of the ''LDU'' decomposition, the symmetry of ''A'' yields ''U = L<sup>T</sup>'', and consequently ''A = LDU = LDL<sup>T</sup>''. Setting ''R = D<sup>1/2</sup>L<sup>T</sup>'', where ''D''<sup>1/2</sup> = diag(<math>\scriptstyle\sqrt{u_{11}},\ \sqrt{u_{22}},\ \ldots,\ \sqrt{u_{nn}}</math>), yields the desired factorization, because ''A = LD<sup>1/2</sup>D<sup>1/2</sup>L<sup>T</sup> = R<sup>T</sup>R'', and ''R'' is upper triangular with positive diagonal entries.

'''Reverse implication:''' Conversely, if ''A = RR<sup>T</sup>'', where ''R'' is lower triangular with a positive diagonal (equivalently, ''A = R′<sup>T</sup>R′'' for the upper-triangular factor ''R′ = R<sup>T</sup>''), then factoring the diagonal entries out of ''R'' gives:

: <math>R = LD = \begin{bmatrix}
1 & 0 & \cdots & 0\\
r_{21}/r_{11} & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
r_{n1}/r_{11} & r_{n2}/r_{22} & \cdots & 1 \end{bmatrix}
\begin{bmatrix}
r_{11} & 0 & \cdots & 0\\
0 & r_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & r_{nn} \end{bmatrix}.</math>

Here ''L'' is lower triangular with a unit diagonal and ''D'' is the diagonal matrix whose diagonal entries are the ''r''<sub>''ii''</sub>'s. Consequently, ''A = RR<sup>T</sup> = LD<sup>2</sup>L<sup>T</sup>'' is the ''LDU'' factorization of ''A'', and thus the pivots must be positive because they are the diagonal entries of ''D''<sup>2</sup>.
|}
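
To make '''Theorem II''' concrete, here is a small worked Cholesky factorization (an illustrative example, not from the cited sources). The matrix

: <math>A = \begin{bmatrix} 4 & 2 \\ 2 & 2 \end{bmatrix}</math>

has ''LU'' pivots 4 and 1 (both positive), so ''D'' = diag(4, 1), and the Cholesky factor is

: <math>R = D^{1/2}L^T = \begin{bmatrix} 2 & 1 \\ 0 & 1 \end{bmatrix}, \qquad R^T R = \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 2 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 4 & 2 \\ 2 & 2 \end{bmatrix} = A.</math>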

'''Theorem III:''' Let ''A''<sub>''k''</sub> be the ''k'' × ''k'' leading principal submatrix of ''A''<sub>''n''×''n''</sub>. If ''A'' has an ''LU'' factorization ''A = LU'', then det(''A''<sub>''k''</sub>) = ''u''<sub>11</sub>''u''<sub>22</sub> · · · ''u''<sub>''kk''</sub>, so the ''k''-th pivot is ''u''<sub>11</sub> = det(''A''<sub>1</sub>) = ''a''<sub>11</sub> for ''k'' = 1 and ''u''<sub>''kk''</sub> = det(''A''<sub>''k''</sub>)/det(''A''<sub>''k''−1</sub>) for ''k'' = 2, 3, . . . , ''n''.<ref name="ref3"/>
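
For instance, for the matrix ''A'' in the Cholesky example above (again purely illustrative), det(''A''<sub>1</sub>) = 4 and det(''A''<sub>2</sub>) = 4 · 2 − 2 · 2 = 4, so the pivots are ''u''<sub>11</sub> = 4 and ''u''<sub>22</sub> = det(''A''<sub>2</sub>)/det(''A''<sub>1</sub>) = 1, matching the ''LU'' factorization computed there.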

Combining '''Theorem II''' with '''Theorem III''' yields:

'''Statement I:''' If the symmetric matrix ''A'' can be factored as ''A = R<sup>T</sup>R'', where ''R'' is an upper-triangular matrix with positive diagonal entries, then all the pivots of ''A'' are positive (by '''Theorem II'''), and therefore all the leading principal minors of ''A'' are positive (by '''Theorem III''').

'''Statement II:''' If the nonsingular symmetric matrix ''A'' can be factored as <math>A=B^TB</math>, then the [[QR decomposition]] (closely related to the [[Gram-Schmidt process]]) of ''B'' (''B = QR'') yields <math>A=B^TB=R^TQ^TQR=R^TR</math>, where ''Q'' is an [[orthogonal matrix]] and ''R'' is an upper [[triangular matrix]] which, since ''B'' is nonsingular, can be chosen to have positive diagonal entries.

Note that '''Statement II''' requires the symmetric matrix ''A'' to be nonsingular.

Combining '''Theorem I''' with '''Statement I''' and '''Statement II''' yields:

'''Statement III:''' If the real-symmetric matrix ''A'' is positive definite, then ''A'' possesses a factorization of the form ''A = B<sup>T</sup>B'', where ''B'' is nonsingular ('''Theorem I'''). By '''Statement II''', ''A'' then possesses a factorization of the form ''A = R<sup>T</sup>R'', where ''R'' is an upper-triangular matrix with positive diagonal entries. Therefore, by '''Statement I''', all the leading principal minors of ''A'' are positive.
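
To illustrate '''Statement II''' (an illustrative computation, not from the cited sources), take the nonsingular matrix ''B'' with rows (0, 1) and (1, 1). Its QR decomposition is

: <math>B = \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = QR,</math>

so <math>A = B^TB = R^TQ^TQR = R^TR = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}</math>, recovering the Cholesky-type factorization directly from ''Q''<sup>T</sup>''Q'' = ''I''.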

In other words, '''Statement III''' states:

'''Sylvester's Criterion:''' The real-symmetric matrix ''A'' is positive definite if and only if all the leading principal minors of ''A'' are positive.
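
As a final worked check (illustrative only), the tridiagonal matrix

: <math>A = \begin{bmatrix} 2 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 2 \end{bmatrix}</math>

has leading principal minors det(''A''<sub>1</sub>) = 2, det(''A''<sub>2</sub>) = 3, and det(''A''<sub>3</sub>) = 4, all positive, so the criterion certifies that ''A'' is positive definite without computing its eigenvalues (which are <math>2</math> and <math>2 \pm \sqrt{2}</math>, indeed all positive).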

The converse direction follows by running the same chain backwards: if all leading principal minors are positive, then all pivots are positive ('''Theorem III'''), so ''A'' has a Cholesky factorization ''A = R<sup>T</sup>R'' with nonsingular ''R'' ('''Theorem II'''), and hence all eigenvalues of ''A'' are positive ('''Theorem I'''). Both directions therefore hold because each of the theorems above is an equivalence.

==Notes==
{{reflist|refs=
<ref name="ref1">Carl D. Meyer, ''Matrix Analysis and Applied Linear Algebra''. See chapter 7.6 Positive Definite Matrices, page 558.</ref>
<ref name="ref2">Carl D. Meyer, ''Matrix Analysis and Applied Linear Algebra''. See chapter 3.10 The LU Factorization, Example 3.10.7, page 154.</ref>
<ref name="ref3">Carl D. Meyer, ''Matrix Analysis and Applied Linear Algebra''. See chapter 6.1 Determinants, Exercise 6.1.16, page 474.</ref>
}}

== References ==
* {{Citation | last1=Gilbert | first1=George T. | title=Positive definite matrices and Sylvester's criterion | jstor=2324036 | year=1991 | journal=[[American Mathematical Monthly|The American Mathematical Monthly]] | issn=0002-9890 | volume=98 | issue=1 | pages=44–46 | doi=10.2307/2324036 | publisher=Mathematical Association of America}}.
* {{Citation | last1=Horn | first1=Roger A. | last2=Johnson | first2=Charles R. | title=Matrix Analysis | publisher=[[Cambridge University Press]] | isbn=978-0-521-38632-6 | year=1985}}. See Theorem 7.2.5.
* {{Citation | last1=Meyer | first1=Carl D. | title=Matrix Analysis and Applied Linear Algebra | publisher=[[Society for Industrial and Applied Mathematics|SIAM]] | isbn=0-89871-454-0}}.

[[Category:Articles containing proofs]]
[[Category:Matrix theory]]

[[fr:Matrice définie positive#Critère de Sylvester]]