Eigenvalue Problem - Theory And Applications


Dr. Mohammad Munir Department of Mathematics Government Postgraduate College, Abbottabad

April 2, 2015


Outline

1. Introduction to Eigenvalue Problem
2. Properties of Eigenvalues
   - General Theorems
   - Relation With Similar Matrices
3. Diagonalization of Matrices
   - Some Diagonalizable Matrices
   - How to diagonalize a matrix
   - Applications of Diagonalization
4. Application to Vector Differential Equation
5. Fields of Applications of Eigenvalues
6. Eigenfunctions
7. Singular Values and SVD
8. References
9. Appendix

Introduction to Eigenvalue Problem

The prefix "eigen" comes from the German language, meaning "own", "unique to" or "peculiar to". An eigenvector (or characteristic vector) of a square matrix is a vector that points in a direction which is invariant under the associated linear transformation. Mathematically, if v is a nonzero n × 1 vector, then it is an eigenvector of an n × n matrix A if

Av = λv,    (1)

where λ is a scalar known as the eigenvalue associated with the eigenvector v. Problem (1) is called the eigenvalue problem.


Let A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} and v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}. Equation (1) gives

\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} = \lambda \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix},    (2)

which is equivalent to the homogeneous system

\begin{pmatrix} a_{11}-\lambda & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22}-\lambda & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn}-\lambda \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix},

or


(A − λI)v = 0.    (3)

A − λI is called the characteristic matrix of A. We are interested in the nontrivial solutions of (3). Since a homogeneous system with a non-singular coefficient matrix has only the trivial solution (by Cramer's rule), system (3) has nontrivial solutions if and only if

det(A − λI) = 0.    (4)

The left-hand side of equation (4) is known as the characteristic polynomial and is given by

φ(λ) = det(A − λI).    (5)

The polynomial (5) has degree n, equal to the order of the square matrix A. Therefore equation (4) has n roots (counted with multiplicity), and hence A has n eigenvalues.


When the degree of the polynomial (5) is at most 4, the eigenvalues can be found by an analytical formula, by factorization, or by a combination of both; numerical methods can also be applied. When the degree of the polynomial (5) is 5 or higher, the Abel–Ruffini theorem (see Appendix) tells us that there is no explicit general formula for the roots, and numerical methods are then indispensable.

Example: Consider the transformation matrix A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}. The characteristic polynomial is φ(λ) = |A − λI| = λ² − 4λ + 3, so the eigenvalues are λ = 1 and λ = 3. The eigenvectors (solutions of (3)) are v = \begin{pmatrix} 1 \\ -1 \end{pmatrix} and w = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.
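The example can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the matrix and the expected eigenpairs are the ones worked out above.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                      # approximately [3. 1.]

# Verify Av = λv for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)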


Properties of Eigenvalues

General Theorems

We give the important properties of eigenvalues and eigenvectors in terms of theorems.

Theorem. If λ1, λ2, ..., λn are distinct eigenvalues of A corresponding to the eigenvectors v1, v2, ..., vn, then these vectors are linearly independent.

We have the following theorem about the derivatives of the characteristic polynomial.

Theorem. The kth derivative of φ(λ) = |λI − A| with respect to λ is k! times the sum of the principal minors of order n − k of the characteristic matrix when k < n, is n! when k = n, and is 0 when k > n.
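As a quick numerical illustration of the first theorem, here is a small sketch (assuming NumPy; the triangular test matrix is chosen only for convenience): a matrix with distinct eigenvalues yields a full-rank matrix of eigenvectors, i.e., the eigenvectors are linearly independent.

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])        # upper triangular: eigenvalues 2, 3, 5 are distinct

eigenvalues, V = np.linalg.eig(A)      # columns of V are the eigenvectors
print(np.linalg.matrix_rank(V))        # 3, so the eigenvectors are linearly independent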


The multiplicity (see Appendix) of a root of the characteristic polynomial also tells us about the dimension of the eigenspace generated by that eigenvalue.

Theorem. If λi is an eigenvalue of an n × n matrix A of multiplicity r, then the rank of λiI − A is not less than n − r, and the dimension of the associated eigenspace is not greater than r.

Example: Consider the transformation matrix

A = \begin{pmatrix} 2 & 2 & 1 \\ 1 & 3 & 1 \\ 1 & 2 & 2 \end{pmatrix}.

The characteristic polynomial is φ(λ) = |λI − A| = (λ − 5)(λ − 1)². The eigenspace generated by the simple root λ = 5 has dimension 1, and the eigenspace generated by the double root λ = 1 has dimension 2.

Theorem. The eigenvalues of the matrix A and its transpose A′ are the same.
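The eigenspace dimensions in the example above can be checked from the rank of λI − A. A minimal sketch, assuming NumPy:

import numpy as np

A = np.array([[2.0, 2.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 2.0, 2.0]])
n = A.shape[0]

for lam in (5.0, 1.0):
    # dimension of the eigenspace = n - rank(λI - A)
    dim = n - np.linalg.matrix_rank(lam * np.eye(n) - A)
    print(lam, dim)                    # prints 5.0 1 and 1.0 2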


Theorem. If λ1, λ2, ..., λn are the eigenvalues of A and k is any scalar, then kλ1, kλ2, ..., kλn are the eigenvalues of kA.

Theorem. If λ1, λ2, ..., λn are the eigenvalues of A and k is any scalar, then λ1 − k, λ2 − k, ..., λn − k are the eigenvalues of A − kI.

Theorem. If α is an eigenvalue of a non-singular matrix A, then |A|/α is an eigenvalue of adj(A).
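A short sketch (assuming NumPy) illustrating the first two theorems on a small matrix whose eigenvalues are 1 and 3:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # eigenvalues 1 and 3
k = 4.0

print(np.sort(np.linalg.eigvals(k * A)))               # [ 4. 12.] = k·{1, 3}
print(np.sort(np.linalg.eigvals(A - k * np.eye(2))))   # [-3. -1.] = {1, 3} - k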


Relation With Similar Matrices

Similar matrices also have a close association with the eigenvalue problem.

Definition. Two square matrices A and B are said to be similar if there exists a non-singular matrix P such that B = PAP⁻¹.

Theorem. If an n × n matrix A has n linearly independent eigenvectors, then it is similar to a diagonal matrix.

Theorem. The diagonal matrix D = diag(λ1, λ2, ..., λn) has eigenvalues λ1, λ2, ..., λn, with the standard unit vectors e1, e2, ..., en as the corresponding eigenvectors.

Theorem. Similar matrices have the same eigenvalues.
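The last theorem is easy to check numerically. A minimal sketch, assuming NumPy; the matrix P below is an arbitrary non-singular matrix chosen for illustration:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])                       # any non-singular matrix

B = P @ A @ np.linalg.inv(P)                     # B is similar to A
print(np.sort(np.linalg.eigvals(A)))             # approximately [1. 3.]
print(np.sort(np.linalg.eigvals(B)))             # approximately [1. 3.]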


Theorem. The eigenvalues of a triangular (upper or lower) matrix are its diagonal elements.

Theorem. The eigenvalues of a real symmetric matrix are all real; a non-symmetric real matrix may have complex eigenvalues.


Diagonalization of Matrices

Most of the applications of the eigenvalue problem involve transforming matrices to diagonal form.

Definition. A square matrix A is called diagonalizable if it is similar to a diagonal matrix, i.e., if there exists an invertible matrix P such that P⁻¹AP is a diagonal matrix. The matrix P is called the modal matrix of A.

Definition. A square matrix which is not diagonalizable is called defective.

We have the following sufficient (but not necessary) condition for diagonalizability.

Theorem. An n × n matrix A is diagonalizable over the field F if it has n distinct eigenvalues in F.

Some Diagonalizable Matrices

1. Real symmetric matrices, A′ = A, are diagonalizable by orthogonal matrices, Q′ = Q⁻¹; i.e., given a real symmetric matrix A, QᵀAQ is diagonal for some orthogonal matrix Q.

2. More generally, a matrix is diagonalizable by a unitary matrix U, U* = U⁻¹, if and only if it is normal, i.e., A*A = AA*. Normal matrices include diagonal, real symmetric, skew-symmetric, orthogonal, covariance, Hermitian and skew-Hermitian matrices.
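A minimal sketch of item 1 (assuming NumPy; the symmetric matrix is the 2 × 2 example used earlier): a real symmetric matrix is diagonalized by an orthogonal matrix of eigenvectors.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # real symmetric

eigenvalues, Q = np.linalg.eigh(A)     # eigh is intended for symmetric/Hermitian matrices
print(np.allclose(Q.T @ Q, np.eye(2))) # True: Q is orthogonal
print(np.round(Q.T @ A @ Q, 10))       # diag(1, 3)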




How to diagonalize a matrix



Consider the matrix

A = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 3 & 0 \\ 2 & -4 & 2 \end{pmatrix}.

This matrix has eigenvalues λ1 = 3, λ2 = 2, λ3 = 1. Since A is a 3 × 3 matrix with 3 distinct eigenvalues, it is diagonalizable. The eigenvectors of A are

v1 = \begin{pmatrix} -1 \\ -1 \\ 2 \end{pmatrix}, v2 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}, v3 = \begin{pmatrix} -1 \\ 0 \\ 2 \end{pmatrix}.

Now let P be the matrix whose columns are these eigenvectors,

P = \begin{pmatrix} -1 & 0 & -1 \\ -1 & 0 & 0 \\ 2 & 1 & 2 \end{pmatrix}.

After calculating P⁻¹, we have

P⁻¹AP = \begin{pmatrix} 0 & -1 & 0 \\ 2 & 0 & 1 \\ -1 & 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 2 & 0 \\ 0 & 3 & 0 \\ 2 & -4 & 2 \end{pmatrix} \begin{pmatrix} -1 & 0 & -1 \\ -1 & 0 & 0 \\ 2 & 1 & 2 \end{pmatrix} = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}.

Note that the eigenvalues λk appear on the diagonal of the resulting matrix, in the same order as the corresponding eigenvector columns of P.
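A minimal sketch (assuming NumPy) verifying the diagonalization worked out above:

import numpy as np

A = np.array([[1.0,  2.0, 0.0],
              [0.0,  3.0, 0.0],
              [2.0, -4.0, 2.0]])

P = np.array([[-1.0, 0.0, -1.0],       # columns are the eigenvectors v1, v2, v3
              [-1.0, 0.0,  0.0],
              [ 2.0, 1.0,  2.0]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))                 # diag(3, 2, 1)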


Applications of Diagonalization

Diagonalization can be used to compute the powers of a matrix A efficiently, provided the matrix is diagonalizable. Suppose we have found that P⁻¹AP = D is a diagonal matrix, so that A = PDP⁻¹. Then, as the matrix product is associative,

A^k = (PDP⁻¹)^k = (PDP⁻¹)(PDP⁻¹) ··· (PDP⁻¹) = PD(P⁻¹P)D(P⁻¹P) ··· (P⁻¹P)DP⁻¹ = PD^kP⁻¹,

and D^k is obtained simply by raising each diagonal entry of D to the kth power.
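A small sketch (assuming NumPy) computing A^5 through the diagonalization above and comparing it against direct matrix exponentiation:

import numpy as np

A = np.array([[1.0,  2.0, 0.0],
              [0.0,  3.0, 0.0],
              [2.0, -4.0, 2.0]])
P = np.array([[-1.0, 0.0, -1.0],
              [-1.0, 0.0,  0.0],
              [ 2.0, 1.0,  2.0]])
D = np.diag([3.0, 2.0, 1.0])

A5 = P @ D**5 @ np.linalg.inv(P)       # element-wise power works for a diagonal matrix
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))   # True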


Application to Vector Differential Equation

Consider the matrix ordinary differential equation

x′(t) = Ax(t).

In the case that A has n distinct eigenvalues, the solution has the following form:

x(t) = c1 e^{λ1 t} u1 + c2 e^{λ2 t} u2 + ··· + cn e^{λn t} un,

where λ1, λ2, ..., λn are the eigenvalues of A, u1, u2, ..., un are the respective eigenvectors of A, and c1, c2, ..., cn are constants determined by the initial conditions.

Example: Consider the coupled differential equations

dx/dt = 3x − 4y,
dy/dt = 4x − 7y,

with x(0) = y(0) = 1.


We can write this system as

d/dt \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 & -4 \\ 4 & -7 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}.

Its characteristic equation is

det \begin{pmatrix} 3-\lambda & -4 \\ 4 & -7-\lambda \end{pmatrix} = 0, i.e., λ² + 4λ − 5 = 0.

Its roots λ1 = 1 and λ2 = −5 are the eigenvalues of A, and the corresponding eigenvectors are

v1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}, v2 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}.

The general solution

\begin{pmatrix} x \\ y \end{pmatrix} = c1 e^{λ1 t} v1 + c2 e^{λ2 t} v2


becomes

\begin{pmatrix} x \\ y \end{pmatrix} = c1 e^{t} \begin{pmatrix} 2 \\ 1 \end{pmatrix} + c2 e^{-5t} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} c1 e^{t} \\ c2 e^{-5t} \end{pmatrix},

which gives

x = 2c1 e^{t} + c2 e^{-5t},
y = c1 e^{t} + 2c2 e^{-5t}.

Applying the initial conditions,

1 = 2c1 + c2,
1 = c1 + 2c2.


Solving, c1 = 1/3 and c2 = 1/3. Therefore our final solution is

x = (2/3) e^{t} + (1/3) e^{-5t},
y = (1/3) e^{t} + (2/3) e^{-5t}.

This method can be applied to higher-dimensional systems and to higher-order differential equations.
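The closed-form solution can be checked against the matrix-exponential solution x(t) = e^{At} x(0). A minimal sketch, assuming NumPy and SciPy are available:

import numpy as np
from scipy.linalg import expm

A = np.array([[3.0, -4.0],
              [4.0, -7.0]])
x0 = np.array([1.0, 1.0])

t = 0.7                                        # an arbitrary test time
numeric = expm(A * t) @ x0
closed_form = np.array([2/3 * np.exp(t) + 1/3 * np.exp(-5*t),
                        1/3 * np.exp(t) + 2/3 * np.exp(-5*t)])
print(np.allclose(numeric, closed_form))       # True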


Fields of Applications of Eigenvalues

1. Schrödinger Wave Equation: The Schrödinger wave equation in quantum mechanics is an eigenvalue equation, HψE = EψE, where H, the Hamiltonian, is a second-order differential operator and ψE, the wavefunction, is one of its eigenfunctions corresponding to the eigenvalue E, interpreted as its energy.

2. Sturm–Liouville Problem: Find the function f(x) satisfying the differential equation

f″(x) = −λf(x)

together with given boundary conditions; the admissible values of λ are the eigenvalues of the problem.

3. Molecular Orbitals: In quantum mechanics, molecular orbitals can be defined by the eigenvectors of the Fock operator. The corresponding eigenvalues are interpreted as ionization potentials.

4. Principal Components Analysis: In principal components analysis, the principal components are the eigenvectors of the covariance matrix

cov(X, Y) = E[(X − E[X])(Y − E[Y])ᵀ],

ranked by the size of their eigenvalues (the variances along those directions).
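A minimal PCA sketch via the eigen-decomposition of a sample covariance matrix (assuming NumPy; the data matrix X below is random, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # 200 samples, 3 features (illustrative data)

C = np.cov(X, rowvar=False)              # 3x3 sample covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(C)

order = np.argsort(eigenvalues)[::-1]    # sort by decreasing variance
components = eigenvectors[:, order]      # columns are the principal components
explained_variance = eigenvalues[order]
print(explained_variance)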

5. Vibration Analysis: In the vibration analysis of mechanical structures, the eigenvalues are the natural frequencies of the vibration, and the eigenvectors are the shapes of these vibrational modes.

6. Eigenfaces: In image processing, the eigenvectors of the covariance matrix associated with a large set of normalized pictures of faces are called eigenfaces; eigenfaces provide a means of applying data compression to faces for identification purposes.

7. Biology: The basic reproduction number R0 is the largest eigenvalue of the next generation matrix, which describes how many people in the population will become infected after a generation time tG has passed.

The signs of eigenvalues also have their importance, as in the following two applications.

8. Optimization Theory: In optimization, the signs of the eigenvalues of the Hessian matrix are used to classify stationary points into maxima, minima and saddle points (a small sketch follows this list).

9. Stability Analysis: The signs of the (real parts of the) eigenvalues of the Jacobian of a dynamical system at a steady state are used to identify whether the system is stable or unstable there.
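A minimal sketch of item 8 (assuming NumPy): classifying the stationary point at the origin of the illustrative function f(x, y) = x² − y² from the eigenvalue signs of its Hessian.

import numpy as np

H = np.array([[2.0,  0.0],               # Hessian of f(x, y) = x^2 - y^2 at (0, 0)
              [0.0, -2.0]])

eigenvalues = np.linalg.eigvalsh(H)       # symmetric Hessian -> real eigenvalues
if np.all(eigenvalues > 0):
    print("local minimum")
elif np.all(eigenvalues < 0):
    print("local maximum")
else:
    print("saddle point")                 # mixed signs: prints this here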


Eigenfunctions

Eigenfunctions are the analogue of eigenvectors for eigenvalue problems on infinite-dimensional function spaces.

Definition. If L is a linear operator on a function space, then f is an eigenfunction for L and λ is the associated eigenvalue whenever Lf = λf.

For example, fk(x) = e^{kx} is an eigenfunction of the differential operator L = d²/dx² for any value of k, with corresponding eigenvalue λ = k². We can also take fk(x) to be sin(kx) or cos(kx), in which case the eigenvalue is λ = −k². When a boundary condition is applied to such a system (e.g., f(0) = 0), only certain values of k, and hence only certain eigenvalues, remain admissible.
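A small symbolic sketch (assuming SymPy) confirming the eigenvalues of L = d²/dx² quoted above:

import sympy as sp

x, k = sp.symbols('x k')

f_exp = sp.exp(k * x)
f_sin = sp.sin(k * x)

print(sp.simplify(sp.diff(f_exp, x, 2) / f_exp))   # k**2
print(sp.simplify(sp.diff(f_sin, x, 2) / f_sin))   # -k**2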


Singular Values and SVD

The Singular Value Decomposition (SVD) of an m × n real or complex matrix M is a factorization of the form

M = UΣV*,

where U is an m × m real or complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V is an n × n real or complex unitary matrix (V* denotes the conjugate transpose of V, or simply the transpose of V if V is real). The diagonal entries of Σ are known as the singular values of M. The m columns of U and the n columns of V are called the left-singular vectors and right-singular vectors of M, respectively. The singular value decomposition and the eigendecomposition are closely related: the singular values of M are the square roots of the non-negative eigenvalues of M*M, and the columns of V are the corresponding eigenvectors.
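A minimal sketch (assuming NumPy; the 2 × 3 matrix M is an arbitrary example) relating the SVD to the eigendecomposition described above:

import numpy as np

M = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])            # an arbitrary 2x3 example matrix

U, S, Vt = np.linalg.svd(M)                  # M = U diag(S) V^T
eigenvalues = np.linalg.eigvalsh(M.T @ M)    # eigenvalues of M^T M (ascending order)

print(S)                                     # singular values, descending
print(np.sqrt(eigenvalues[::-1][:len(S)]))   # matches S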


References

Gilbert Strang, Linear Algebra and Its Applications, 4th Edition.
Seymour Lipschutz, Linear Algebra, Schaum's Outline Series.
Ayres Jr., Matrices, Schaum's Outline Series.
Online help, Google search.


Appendix

Theorem (Abel–Ruffini). The Abel–Ruffini theorem (also known as Abel's impossibility theorem) states that there is no general algebraic solution in explicit form for polynomial equations of degree five or higher with arbitrary coefficients.

Definition (Multiplicity). Let F be a field and let p(x) be a polynomial in one variable with coefficients in F. An element a ∈ F is called a root of multiplicity k of p(x) if there is a polynomial s(x) such that s(a) ≠ 0 and p(x) = (x − a)^k s(x). If k = 1, then a is called a simple root; if k ≥ 2, then a is called a multiple root.
