I’ve put Jacobi-Davidson to the test on the simple Poisson matrix from the previous post: it has −1, 2, −1 on its diagonals and is symmetric positive definite. Its eigenvalues are all real and located in the interval (0, 4).
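For reference, this matrix and its spectrum can be reproduced in a few lines (a sketch in modern Julia syntax; the size n = 100 is an arbitrary choice for illustration, not necessarily what the experiments used):

```julia
using LinearAlgebra, SparseArrays

# 1D Poisson matrix: tridiagonal with -1, 2, -1 on the diagonals.
n = 100
A = spdiagm(-1 => fill(-1.0, n - 1), 0 => fill(2.0, n), 1 => fill(-1.0, n - 1))

# Its eigenvalues are known in closed form: 2 - 2cos(kπ / (n + 1)) for k = 1, …, n,
# so they are all real and lie strictly inside (0, 4).
λ = [2 - 2cos(k * π / (n + 1)) for k = 1:n]
```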
Using the exact solver
When solving the correction equation with the “exact” solver, we should be able to see quadratic convergence. I picked 10 and 15 as the minimum and maximum dimension of the search space. The target is τ = 3, and we’re after the 10 eigenvalues closest to it. The API for this is currently just
Q, R, ritz_hist, conv_hist, residuals = harmonic_ritz_test(
    A,
    exact_solver(),
    pairs = 10,
    min_dimension = 10,
    max_dimension = 15,
    max_iter = 200,
    ɛ = 1e-8,
    τ = 3.0 + 0im
)
A sanity check is to see whether the partial Schur decomposition is correct:
> norm(A * Q - Q * R)
7.323237382364171e-9
> norm(Q' * Q - eye(10))
3.844239740211255e-9
That seems reasonable for our prescribed tolerance. Next we look at the convergence history.
Green: harmonic Ritz values. Yellow: converged Ritz values. Black: actual eigenvalues. Horizontal blue line: target.
A couple of things stand out in the convergence plot: after 15 iterations the search space becomes too large, and at the restart clearly only the best approximate Ritz values are retained. Another thing is that as soon as the first Ritz value converges, the others converge very quickly as well. This is the main reason why we want a search space in the first place.
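The restart that happens when the search space hits its maximum dimension can be sketched as follows. This toy version keeps only the Ritz vectors whose Ritz values lie closest to the target; for simplicity it uses ordinary Ritz pairs of a symmetric projection, whereas the actual code works with harmonic Ritz pairs, and the names here are illustrative rather than the package’s internals:

```julia
using LinearAlgebra

# Shrink the search space: V has orthonormal columns spanning the search space,
# H = V' * A * V is the projected matrix, τ is the target.
function restart(V, H, τ, min_dim)
    F = eigen(H)                           # Ritz values θ and primitive Ritz vectors s
    order = sortperm(abs.(F.values .- τ))  # Ritz values closest to the target first
    S = F.vectors[:, order[1:min_dim]]
    V * S                                  # smaller basis spanning the best Ritz vectors
end
```

Since S has orthonormal columns here, the restarted basis is again orthonormal, so no reorthogonalization is needed at this step.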
Using a couple steps of GMRES
Using the exact solver for the correction equation is basically cheating; Jacobi-Davidson should be seen as the method that targets interior eigenvalues without solving systems exactly at each iteration.
What we can do instead is use just 7 steps of GMRES without preconditioning, and see what happens then.
Q, R, ritz_hist, conv_hist, residuals = harmonic_ritz_test(
    A,
    gmres_solver(iterations = 7),
    pairs = 10,
    min_dimension = 10,
    max_dimension = 15,
    max_iter = 500,
    ɛ = 1e-8,
    τ = 3.0 + 0im
)
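For context: those GMRES steps are applied to the Jacobi-Davidson correction equation, the projected system (I − uu′)(A − θI)(I − uu′)t = −r with t ⟂ u, where u is the current Ritz vector, θ its Ritz value, and r the residual. A minimal sketch of applying that projected operator to a vector (an illustrative helper, not the package’s API):

```julia
using LinearAlgebra

# Apply the correction-equation operator to v: project out u, apply the
# shifted matrix, and project again so the result stays orthogonal to u.
function correction_op(A, u, θ, v)
    w = v - u * (u' * v)    # (I - uu')v
    w = A * w - θ * w       # (A - θI)w
    w - u * (u' * w)        # (I - uu')w
end
```

Running a handful of GMRES iterations with this operator and right-hand side −r yields the kind of inexact solve used above.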
Again we do the sanity check:
> norm(Q' * Q - eye(10))
5.995057809828708e-8
> norm(A * Q - Q * R)
1.3111792852021023e-8
And we inspect the convergence plot
Green: harmonic Ritz values. Yellow: converged Ritz values. Black: actual eigenvalues. Horizontal blue line: target.
Now we need a whole lot more iterations to let a Ritz value converge, but the convergence looks quite monotonic and is not hindered by restarts, which is great. Also, it’s clear that once the first Ritz value has converged, the others follow quickly as well.
The convergence could probably be improved by adding a preconditioner, but this is something I have yet to implement.
Loss of orthogonality
It seems that during the iterations a single pass of modified Gram-Schmidt is not always enough to guarantee orthogonality of the search space basis and the Schur vectors Q. A solution is to do Gram-Schmidt twice, or to use iterative refinement. This seems rather costly, but maybe it’s necessary.
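The “twice is enough” strategy can be sketched with a DGKS-style criterion: redo the orthogonalization pass whenever the first pass cancelled away a large part of the vector. This is an illustrative helper assuming V has orthonormal columns, not the package’s actual code:

```julia
using LinearAlgebra

# Orthogonalize w against the columns of V, repeating the pass once if the
# first pass removed a large fraction of w's norm (a sign of cancellation).
function reorthogonalize!(w, V; η = 1 / √2)
    before = norm(w)
    w .-= V * (V' * w)                            # first Gram-Schmidt pass
    norm(w) < η * before && (w .-= V * (V' * w))  # second pass when needed
    w
end
```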