Min-max theorem
In linear algebra and functional analysis, the min-max theorem, or variational theorem, or Courant–Fischer–Weyl min-max principle, is a result that gives a variational characterization of eigenvalues of compact Hermitian operators on Hilbert spaces. It can be viewed as the starting point of many results of a similar nature.
This article first discusses the finite-dimensional case and its applications before considering compact operators on infinite-dimensional Hilbert spaces. We will see that for compact operators, the proof of the main theorem uses essentially the same idea as the finite-dimensional argument.
In the case that the operator is non-Hermitian, the theorem provides an equivalent characterization of the associated singular values. The min-max theorem can be extended to self-adjoint operators that are bounded below.
Matrices
Let A be an n × n Hermitian matrix. As with many other variational results on eigenvalues, one considers the Rayleigh–Ritz quotient RA : C^n \ {0} → R defined by

    RA(x) = (Ax, x) / (x, x),

where (⋅, ⋅) denotes the Euclidean inner product on C^n. Clearly, the Rayleigh quotient of an eigenvector is its associated eigenvalue. Equivalently, the Rayleigh–Ritz quotient can be replaced by

    f(x) = (Ax, x),  ‖x‖ = 1.
For Hermitian matrices A, the range of the continuous function RA(x), or f(x), is a compact subset [a, b] of the real line. The maximum b and the minimum a are the largest and smallest eigenvalue of A, respectively. The min-max theorem is a refinement of this fact.
Min-max theorem
Let A be an n × n Hermitian matrix with eigenvalues λ1 ≤ ... ≤ λk ≤ ... ≤ λn. Then

    λk = min { max { RA(x) : x ∈ U, x ≠ 0 } : dim(U) = k }

and

    λk = max { min { RA(x) : x ∈ U, x ≠ 0 } : dim(U) = n − k + 1 };

in particular,

    λ1 ≤ RA(x) ≤ λn  for all x ∈ C^n \ {0},

and these bounds are attained when x is an eigenvector of the appropriate eigenvalues.
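The characterization of λk as a minimum over k-dimensional subspaces can be checked numerically. The sketch below (NumPy is an assumption; any Hermitian eigensolver would do) samples random k-dimensional subspaces, verifies that the maximum of the Rayleigh quotient over each is at least λk, and confirms that the minimum is attained on the span of the first k eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3

# A random real symmetric (hence Hermitian) matrix.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in increasing order

# For x = Qy with Q an orthonormal basis of U, R_A(x) = R_{Q^T A Q}(y),
# so the max of the Rayleigh quotient over U is the top eigenvalue of Q^T A Q.
for _ in range(200):
    B = rng.standard_normal((n, k))    # columns span a random subspace U
    Q, _ = np.linalg.qr(B)             # orthonormal basis of U
    max_R = np.linalg.eigvalsh(Q.T @ A @ Q).max()
    assert max_R >= eigvals[k - 1] - 1e-10   # max over U is >= lambda_k

# The minimum over all k-dimensional U is attained at span{u_1, ..., u_k}.
Q = eigvecs[:, :k]
assert np.isclose(np.linalg.eigvalsh(Q.T @ A @ Q).max(), eigvals[k - 1])
```

Reducing the subspace maximization to a small k × k eigenproblem is what makes the check cheap; it is the same observation used in Rayleigh–Ritz approximation methods.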
Also, the simpler formulation for the maximal eigenvalue λn is given by

    λn = max { RA(x) : x ≠ 0 }.

Similarly, the minimal eigenvalue λ1 is given by

    λ1 = min { RA(x) : x ≠ 0 }.
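These two extreme cases are easy to verify directly: every Rayleigh quotient lies between the smallest and largest eigenvalue, and both bounds are attained at the corresponding eigenvectors. A minimal check (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
lam, w = np.linalg.eigh(A)             # eigenvalues in increasing order

def rayleigh(x):
    return (x @ A @ x) / (x @ x)

# Every Rayleigh quotient lies in [lambda_1, lambda_n] ...
samples = [rayleigh(rng.standard_normal(n)) for _ in range(1000)]
assert lam[0] - 1e-12 <= min(samples) and max(samples) <= lam[-1] + 1e-12

# ... and the bounds are attained at the extreme eigenvectors.
assert np.isclose(rayleigh(w[:, 0]), lam[0])
assert np.isclose(rayleigh(w[:, -1]), lam[-1])
```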
Proof
Since the matrix A is Hermitian it is diagonalizable, and we can choose an orthonormal basis of eigenvectors {u1, ..., un}, that is, ui is an eigenvector for the eigenvalue λi and such that (ui, ui) = 1 and (ui, uj) = 0 for all i ≠ j.
If U is a subspace of dimension k, then its intersection with the subspace span{uk, ..., un} is nonzero, for if it were zero, then the dimension of the span of the two subspaces would be k + (n − k + 1) = n + 1, which is impossible. Hence there exists a vector v ≠ 0 in this intersection that we can write as

    v = Σ_{i=k}^{n} αi ui,

and whose Rayleigh quotient is

    RA(v) = ( Σ_{i=k}^{n} λi |αi|² ) / ( Σ_{i=k}^{n} |αi|² ) ≥ λk

(as λi ≥ λk for all i = k, ..., n), and hence

    max { RA(x) : x ∈ U, x ≠ 0 } ≥ λk.

Since this is true for all U, we can conclude that

    min { max { RA(x) : x ∈ U, x ≠ 0 } : dim(U) = k } ≥ λk.
This is one inequality. To establish the other inequality, choose the specific k-dimensional space V = span{u1, ..., uk}, for which

    max { RA(x) : x ∈ V, x ≠ 0 } ≤ λk

because λk is the largest eigenvalue in V. Therefore, also

    min { max { RA(x) : x ∈ U, x ≠ 0 } : dim(U) = k } ≤ λk.
To get the other formula, consider the Hermitian matrix A′ = −A, whose eigenvalues in increasing order are λ′k = −λ_{n−k+1}. Applying the result just proved,

    −λ_{n−k+1} = λ′k = min { max { R_{A′}(x) : x ∈ U, x ≠ 0 } : dim(U) = k } = −max { min { RA(x) : x ∈ U, x ≠ 0 } : dim(U) = k }.

The result follows on replacing k with n − k + 1.
Counterexample in the non-Hermitian case
Let N be the nilpotent matrix

    N = [ 0  1 ]
        [ 0  0 ].
Define the Rayleigh quotient RN(x) exactly as above in the Hermitian case. Then it is easy to see that the only eigenvalue of N is zero, while the maximum value of the Rayleigh quotient is 1/2. That is, the maximum value of the Rayleigh quotient is larger than the maximum eigenvalue.
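For this 2 × 2 example the claim can be verified by hand: for real x, (Nx, x) = x1·x2, so RN(x) = x1x2/(x1² + x2²), which is maximized at x1 = x2 with value 1/2. A short numerical confirmation (assuming NumPy):

```python
import numpy as np

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])           # nilpotent: its only eigenvalue is 0

def rayleigh(x):
    return (x @ N @ x) / (x @ x)

# The only eigenvalue is 0, yet the Rayleigh quotient reaches 1/2 at x = (1, 1).
assert np.allclose(np.linalg.eigvals(N), 0)
assert np.isclose(rayleigh(np.array([1.0, 1.0])), 0.5)

# Random search never exceeds 1/2.
rng = np.random.default_rng(2)
vals = [rayleigh(rng.standard_normal(2)) for _ in range(1000)]
assert max(vals) <= 0.5 + 1e-12
```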
Applications
Min-max principle for singular values
The singular values {σk} of a square matrix M are the square roots of the eigenvalues of M*M (equivalently MM*). An immediate consequence of the first equality in the min-max theorem is

    σk↑ = min { max { (x, M*Mx)^{1/2} : x ∈ S, ‖x‖ = 1 } : dim(S) = k } = min { max { ‖Mx‖ : x ∈ S, ‖x‖ = 1 } : dim(S) = k }.

Similarly,

    σk↑ = max { min { ‖Mx‖ : x ∈ S, ‖x‖ = 1 } : dim(S) = n − k + 1 }.

Here σk↑ denotes the kth entry in the increasing sequence of σ's, so that σ1↑ ≤ σ2↑ ≤ ⋯.
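The relation between singular values and the eigenvalues of M*M can be checked against a library SVD. A minimal sketch (assuming NumPy; the real case with M^T M is used for simplicity):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))

# Singular values are the square roots of the eigenvalues of M^T M
# (M*M in the complex case); compare with NumPy's SVD.
sigma_from_eigs = np.sqrt(np.linalg.eigvalsh(M.T @ M))      # increasing order
sigma_from_svd = np.sort(np.linalg.svd(M, compute_uv=False))
assert np.allclose(sigma_from_eigs, sigma_from_svd)

# For any unit vector x, ||Mx|| is bounded by the largest singular value,
# consistent with the max over S = C^n in the first formula.
x = rng.standard_normal(4)
x /= np.linalg.norm(x)
assert np.linalg.norm(M @ x) <= sigma_from_svd[-1] + 1e-12
```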
Cauchy interlacing theorem
Let A be a symmetric n × n matrix. The m × m matrix B, where m ≤ n, is called a compression of A if there exists an orthogonal projection P onto a subspace of dimension m such that PAP* = B. The Cauchy interlacing theorem states:
- Theorem. If the eigenvalues of A are α1 ≤ ... ≤ αn, and those of B are β1 ≤ ... ≤ βj ≤ ... ≤ βm, then for all j ≤ m,

      αj ≤ βj ≤ α_{n−m+j}.
This can be proven using the min-max principle. Let βj have corresponding eigenvector bj, and let Sj be the j-dimensional subspace Sj = span{b1, ..., bj}; then

    βj = max { (Bx, x) : x ∈ Sj, ‖x‖ = 1 } = max { (PAP*x, x) : x ∈ Sj, ‖x‖ = 1 } ≥ αj.

According to the first part of min-max, αj ≤ βj. On the other hand, if we define S_{m−j+1} = span{bj, ..., bm}, then

    βj = min { (Bx, x) : x ∈ S_{m−j+1}, ‖x‖ = 1 } = min { (AP*x, P*x) : x ∈ S_{m−j+1}, ‖x‖ = 1 } ≤ α_{n−m+j},

where the last inequality is given by the second part of min-max.
When n − m = 1, we have αj ≤ βj ≤ α_{j+1}, hence the name interlacing theorem.
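Taking P to be the projection onto the first m coordinates makes B the leading m × m principal submatrix of A, which gives an easy numerical check of the interlacing inequalities (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 6, 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Compression by the projection onto the first m coordinates:
# B = PAP* is the leading m x m principal submatrix of A.
B = A[:m, :m]

alpha = np.linalg.eigvalsh(A)   # alpha_1 <= ... <= alpha_n
beta = np.linalg.eigvalsh(B)    # beta_1 <= ... <= beta_m

# Interlacing: alpha_j <= beta_j <= alpha_{n-m+j} for all j <= m
# (0-based indices below, so the theorem's j is j+1).
for j in range(m):
    assert alpha[j] - 1e-12 <= beta[j] <= alpha[n - m + j] + 1e-12
```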
Compact operators
Let A be a compact, Hermitian operator on a Hilbert space H. Recall that the spectrum of such an operator (the set of eigenvalues) is a set of real numbers whose only possible cluster point is zero. It is thus convenient to list the positive eigenvalues of A as

    ⋯ ≤ λk ≤ ⋯ ≤ λ1,

where entries are repeated with multiplicity, as in the matrix case. (To emphasize that the sequence is decreasing, we may write λk = λk↓.) When H is infinite-dimensional, the above sequence of eigenvalues is necessarily infinite. We now apply the same reasoning as in the matrix case. Letting Sk ⊂ H be a k-dimensional subspace, we can obtain the following theorem.
- Theorem (Min-Max). Let A be a compact, self-adjoint operator on a Hilbert space H, whose positive eigenvalues are listed in decreasing order ⋯ ≤ λk ≤ ⋯ ≤ λ1. Then:

      max_{Sk} min { (Ax, x) : x ∈ Sk, ‖x‖ = 1 } = λk↓ = min_{S_{k−1}} max { (Ax, x) : x ⊥ S_{k−1}, ‖x‖ = 1 }.

A similar pair of equalities holds for negative eigenvalues.
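A standard example of a compact self-adjoint operator is the diagonal operator A ek = ek/k on ℓ², whose positive eigenvalues 1, 1/2, 1/3, ... accumulate only at zero. The sketch below (a finite truncation, assuming NumPy) checks that λk↓ = 1/k and that the max-min value is attained on Sk = span{e1, ..., ek}:

```python
import numpy as np

# Finite truncation of the compact diagonal operator A e_k = e_k / k on l^2.
n, k = 8, 3
A = np.diag(1.0 / np.arange(1, n + 1))

lam_desc = np.sort(np.linalg.eigvalsh(A))[::-1]   # eigenvalues, decreasing
assert np.isclose(lam_desc[k - 1], 1.0 / k)       # lambda_k (decreasing) = 1/k

# Max-min formula: on S_k = span{e_1, ..., e_k}, the minimum of (Ax, x) over
# unit vectors is the smallest eigenvalue of the leading k x k block, i.e. 1/k.
Q = np.eye(n)[:, :k]
min_on_Sk = np.linalg.eigvalsh(Q.T @ A @ Q).min()
assert np.isclose(min_on_Sk, 1.0 / k)
```

The truncation only illustrates the mechanism; the theorem itself concerns the full infinite-dimensional operator, where compactness is what guarantees the extrema are attained.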
Proof
Let S′ be the closure of the linear span span{uk, u_{k+1}, ...}. The subspace S′ has codimension k − 1. By the same dimension-count argument as in the matrix case, S′ ∩ Sk is nonzero. So there exists x ∈ S′ ∩ Sk with ‖x‖ = 1. Since it is an element of S′, such an x necessarily satisfies

    (Ax, x) ≤ λk.

Therefore, for all Sk,

    inf { (Ax, x) : x ∈ Sk, ‖x‖ = 1 } ≤ λk.
But A is compact, therefore the function f(x) = (Ax, x) is weakly continuous. Furthermore, any bounded set in H is weakly compact. This lets us replace the infimum by a minimum:

    min { (Ax, x) : x ∈ Sk, ‖x‖ = 1 } ≤ λk.

So

    sup_{Sk} min { (Ax, x) : x ∈ Sk, ‖x‖ = 1 } ≤ λk.

Because equality is achieved when Sk = span{u1, ..., uk},

    max_{Sk} min { (Ax, x) : x ∈ Sk, ‖x‖ = 1 } = λk.

This is the first part of the min-max theorem for compact self-adjoint operators.
Analogously, consider now a (k − 1)-dimensional subspace S_{k−1}, whose orthogonal complement is denoted by S_{k−1}⊥. If S′ = span{u1, ..., uk}, then S′ ∩ S_{k−1}⊥ ≠ {0}, so there exists x ∈ S_{k−1}⊥ with ‖x‖ = 1 and

    (Ax, x) ≥ λk.

So

    max { (Ax, x) : x ∈ S_{k−1}⊥, ‖x‖ = 1 } ≥ λk,

where the compactness of A was applied to replace the supremum by a maximum. This implies, taking the infimum over the collection of (k − 1)-dimensional subspaces,

    inf_{S_{k−1}} max { (Ax, x) : x ⊥ S_{k−1}, ‖x‖ = 1 } ≥ λk.

Choose S_{k−1} = span{u1, ..., u_{k−1}} and we deduce

    min_{S_{k−1}} max { (Ax, x) : x ⊥ S_{k−1}, ‖x‖ = 1 } = λk.
Self-adjoint operators
The min-max theorem also applies to (possibly unbounded) self-adjoint operators.[1][2] Recall the essential spectrum is the spectrum without isolated eigenvalues of finite multiplicity. Sometimes we have some eigenvalues below the essential spectrum, and we would like to approximate the eigenvalues and eigenfunctions.
- Theorem (Min-Max). Let A be self-adjoint, and let E1 ≤ E2 ≤ E3 ≤ ⋯ be the eigenvalues of A below the essential spectrum. Then

      En = min_{ψ1, ..., ψn} max { (ψ, Aψ) : ψ ∈ span(ψ1, ..., ψn), ‖ψ‖ = 1 }.
If we only have N eigenvalues and hence run out of eigenvalues, then we let En := inf σ_ess(A) (the bottom of the essential spectrum) for n > N, and the above statement holds after replacing min-max with inf-sup.
- Theorem (Max-Min). Let A be self-adjoint, and let E1 ≤ E2 ≤ E3 ≤ ⋯ be the eigenvalues of A below the essential spectrum. Then

      En = max_{ψ1, ..., ψ_{n−1}} min { (ψ, Aψ) : ψ ⊥ ψ1, ..., ψ_{n−1}, ‖ψ‖ = 1 }.
If we only have N eigenvalues and hence run out of eigenvalues, then we let En := inf σ_ess(A) (the bottom of the essential spectrum) for n > N, and the above statement holds after replacing max-min with sup-inf.
The proofs[1][2] use the following results about self-adjoint operators:
- Theorem. Let A be self-adjoint. Then A ≥ E if and only if σ(A) ⊆ [E, ∞).[1]: 77
- Theorem. If A is self-adjoint, then

      inf σ(A) = inf { (ψ, Aψ) : ψ ∈ D(A), ‖ψ‖ = 1 }

  and

      sup σ(A) = sup { (ψ, Aψ) : ψ ∈ D(A), ‖ψ‖ = 1 }.[1]: 77
See also
- Courant minimax principle
- Max–min inequality
References
- ^ a b c d G. Teschl, Mathematical Methods in Quantum Mechanics (GSM 99) https://www.mat.univie.ac.at/~gerald/ftp/volume-schroe/schroe.pdf
- ^ a b Lieb; Loss (2001). Analysis. GSM. Vol. 14 (2nd ed.). Providence: American Mathematical Society. ISBN 0-8218-2783-9.
- M. Reed and B. Simon, Methods of Modern Mathematical Physics 4: Analysis of Operators, Academic Press, 1978.
Source: https://en.wikipedia.org/wiki/Min-max_theorem