Obtain eigs from matrix and partially known eigenvector

The issue is that I have an input square matrix P. Some diagonal elements of P are very large imaginary numbers, and these correspond to the zero positions in the eigenvectors. I therefore remove those rows and columns from P, since no other elements are related to the predefined value (0). Then I try to solve for the eigenvalues and eigenvectors of P. However, the eigenvectors of the original P and the reduced P are different (V2 ~= V). Why does this happen? I am confused, and I would appreciate any suggestions.
Thank you for your help, Jennifer.
load inputmatrixP.mat        % loads the square matrix P
N=length(P);
ind=10:51;                   % rows/columns with the large imaginary diagonal entries
P2=P;
P2(ind,:)=[];                % delete those rows ...
P2(:,ind)=[];                % ... and columns
[V,D]=eigs(P,20);            % 20 largest-magnitude eigenpairs of the full matrix
[Vtmp,D2]=eigs(P2,20);       % same for the reduced matrix
jj=[1:9,52:N];               % positions kept in P2
V2=zeros(N,20);
V2(jj,:)=Vtmp;               % pad the reduced eigenvectors back to size N

 Accepted Answer

Hi Jennifer,
You are right that matrix P here is a block-diagonal matrix with three blocks:
[A 0 0;
 0 B 0;
 0 0 C]
And for such a matrix, the eigenvalues of the whole matrix are the union of the eigenvalues of matrices A, B and C. The eigenvectors of the whole matrix can also be computed from the eigenvectors of the submatrices, like you do in the code above.
Two issues to keep in mind:
1) The eigs call returns the 20 eigenvalues with the largest absolute value. These are likely the ones in block B, since they have a very large imaginary part. That would mean that D and D2 are not the same, and you have to call eigs with a sigma or option that returns the eigenvalues in A and C instead.
2) Another thing to keep in mind is that eigenvectors aren't uniquely defined: each eigenvector can be multiplied by any complex number of absolute value 1. The easiest way to check a matrix of eigenvectors is to compute norm(P*V - V*D) and see whether the pairs satisfy their definition to sufficient accuracy.
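Both points can be illustrated with a small sketch (the blocks A, B, C below are made-up stand-ins, not your actual data):

```matlab
% Hypothetical block-diagonal matrix: the middle block has huge
% imaginary eigenvalues, like the middle block of P.
A = [2 1; 0 3];
B = 1e6i*speye(2);
C = [4 0; 1 5];
P = blkdiag(A, B, C);

% The default 'largestabs' picks the eigenvalues of B. To get the
% eigenvalues in A and C instead, target the other end of the spectrum:
[V, D] = eigs(P, 4, 'smallestabs');
diag(D)        % the union of eig(A) and eig(C)

% Check the eigenpairs against their definition rather than comparing
% eigenvector entries, which are only defined up to a scalar:
residual = norm(P*V - V*D)   % should be close to machine precision
```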

3 Comments

Hi Christine,
You understood what I meant; thank you for the detailed clarification. But I still have one question.
I apologize for my carelessness. I did try the code below, which indicates that both sets of eigenvalues are the same, since they come from A and C, and the errors are low enough for both sets of eigenvectors. However, the eigenvectors of P still look quite different from the eigenvectors of P2 (Vtmp).
P=[A 0 0;
0 B 0;
0 0 C], and VA and VC are the eigenvectors of the A and C matrices. I believe that the eigenvectors of P should have the form [VA 0;
0 VC]. However, the eigenvectors of P have values [x x x;
1e-20 1e-20 1e-20;
x x x] in the positions of VA and VC. Although we know that eigenvectors can be recombined through linear combination, the eigenvectors of P shouldn't have non-zero elements in the positions of both VA and VC. How can this be explained?
Thanks again,
Jennifer
[V,D]=eigs(P,10,-2);          % 10 eigenvalues of P closest to sigma = -2
[Vtmp,D2]=eigs(P2,10,-2);     % same for the reduced matrix
error1=norm(P*V-V*D);         % residuals of the eigenpair definition
error2=norm(P2*Vtmp-Vtmp*D2);
Hi Jennifer,
The reason is likely that the matrices A and C both have the same eigenvalues. Two eigenvectors with different eigenvalues can't be recombined to form an eigenvector for either eigenvalue. But if, as above, v1 and v2 belong to the same eigenvalue, any linear combination of v1 and v2 is just as valid a result for EIGS to return.
Here's an example:
n = 4;
Adifferent = diag(repelem([-2 10 -2.2], [n 4*n n]));
Arepeat = diag(repelem([-2 10 -2], [n 4*n n]));
[Ud, Dd] = eigs(Adifferent, 2*n, -3);
[Ur, Dr] = eigs(Arepeat, 2*n, -3);
diag(Dd)'
ans = 1×8
-2.2000 -2.2000 -2.2000 -2.2000 -2.0000 -2.0000 -2.0000 -2.0000
diag(Dr)'
ans = 1×8
-2.0000 -2.0000 -2.0000 -2.0000 -2.0000 -2.0000 -2.0000 -2.0000
tiledlayout(1, 2)
nexttile
spy(abs(Ud) > 1e-8)
nexttile
spy(abs(Ur) > 1e-8)
So getting separate blocks of eigenvectors for a matrix whose blocks share eigenvalues is a valid solution, but not the unique solution - eigs will simply converge to some solution; it doesn't take the block structure into account.
One thing that might contribute to the confusion here is that the eig function for dense matrices will notice a block diagonal structure and is likely to return eigenvectors in blocks for that case (although there is no guarantee of this). The eigs function is more generic, taking just a function handle that applies the matrix to a vector, which is the reason it acts differently.
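As a sketch of that last point (the matrix here is made up): eigs can work from nothing but a matrix-vector product, so any block structure is invisible to it.

```matlab
% Hypothetical block-diagonal matrix with an eigenvalue shared across blocks.
P = blkdiag(gallery('tridiag', 50), 2*speye(50));
n = size(P, 1);

% eigs only needs a routine that applies P to a vector:
Afun = @(x) P*x;
[V, D] = eigs(Afun, n, 6, 'largestabs');

% The eigenpairs satisfy their definition even if the eigenvectors
% mix contributions from both blocks:
norm(P*V - V*D)
```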
A last source of confusion is that when eigs is called asking for a relatively large proportion of the eigenvalues of the matrix, or on a very small matrix, it falls back to calling eig, since that will be more efficient in those cases. You can call eigs with option Display=true to check on this:
eigs(speye(1000), 500, 'largestabs', Display=true);
=== Simple eigenvalue problem A*x = lambda*x ===
Computing 500 eigenvalues of type 'largestabs'.
Parameters passed to Krylov-Schur method:
Maximum number of iterations: 300
Tolerance: 1e-14
Subspace Dimension: 1000
Compute EIGS by calling EIG, because subspace dimension is equal to problem size.
Hi Christine,
Awesome! Perfect explanations to clear my confusion. Thanks a lot.
Regards,
Jiali


More Answers (1)

Hey @Jiali
The issue you're experiencing is likely due to the removal of diagonal elements from the matrix P. When you remove the values in the diagonal, you are essentially modifying the matrix P by deleting certain rows and columns. This modification can affect the eigenvectors and eigenvalues of the matrix.
Eigenvectors are determined by the relationships between the elements of the matrix. When you remove diagonal elements, you are altering these relationships and, consequently, the eigenvectors can change. Even though the deleted positions correspond to zero values, other elements in the matrix may still have an influence on the eigenvectors.
To address this issue, you can consider a different approach: instead of removing the diagonal elements, set them to a small non-zero value, such as a small imaginary or real number. By doing so, you preserve the structure of the matrix while minimizing the impact on the eigenvectors.

Release

R2015a

Asked:

on 5 Jul 2023

Commented:

on 7 Jul 2023
