Sleijpen and van der Vorst introduced the Jacobi--Davidson (JD) method [1] to find eigenvalues in the interior of the spectrum of a large, sparse real or complex matrix. For generalized eigenvalue problems, Sleijpen et al. proposed the simplified inexact JD (SIJD) method [2]. The simplification consists in removing the subspace acceleration, and the inexactness is due to the fact that "the correction equation is solved approximately."
In this paper, Zhao offers a convergence analysis of SIJD for nonlinear (polynomial) eigenvalue problems with simple eigenpairs. The analysis measures convergence in terms of the norm of the residual vector, and the author shows that the difference between two consecutive eigenvalue approximations is bounded by that same norm. Several examples are solved to demonstrate the rate of convergence and the impact of how accurately the correction equation is solved on the convergence of SIJD. The proof of the convergence rate is important because it tells users beforehand what conditions a problem must satisfy to guarantee convergence.
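To give a flavor of the method under review, the following is a minimal sketch of a simplified JD iteration (no subspace acceleration) applied to a small quadratic eigenvalue problem T(λ)x = (λ²M + λC + K)x = 0. The test matrices, starting values, and the least-squares stand-in for the correction solve are all illustrative assumptions, not the author's implementation; in SIJD proper the projected correction equation would be solved only approximately, e.g. by a few steps of an iterative solver.

```python
import numpy as np

# Hypothetical test problem (illustration only): a small quadratic
# eigenvalue problem T(lam) = lam^2 M + lam C + K with diagonal K.
n = 4
M = np.eye(n)
C = 0.1 * np.eye(n)
K = np.diag([1.0, 2.0, 3.0, 4.0])

T  = lambda lam: lam**2 * M + lam * C + K    # T(lambda)
dT = lambda lam: 2 * lam * M + C             # T'(lambda)

def sijd_step(theta, u):
    """One step of a simplified JD iteration (no subspace acceleration)."""
    r = T(theta) @ u                         # residual T(theta) u
    p = dT(theta) @ u
    # Projected correction equation:
    #   (I - p u*/(u* p)) T(theta) (I - u u*) t = -r,   t orthogonal to u.
    Pl = np.eye(n, dtype=complex) - np.outer(p, u.conj()) / (u.conj() @ p)
    Pr = np.eye(n, dtype=complex) - np.outer(u, u.conj())
    A = Pl @ T(theta) @ Pr
    # SIJD would solve this only approximately; least squares stands in here.
    t, *_ = np.linalg.lstsq(A, -r, rcond=None)
    t = Pr @ t                               # enforce orthogonality to u
    u_new = u + t
    u_new = u_new / np.linalg.norm(u_new)
    # Rayleigh-functional update: root of u* T(lam) u = 0 nearest theta.
    coeffs = [u_new.conj() @ M @ u_new,
              u_new.conj() @ C @ u_new,
              u_new.conj() @ K @ u_new]
    roots = np.roots(coeffs)
    theta = roots[np.argmin(np.abs(roots - theta))]
    return theta, u_new

# Start near the eigenpair associated with the first diagonal entry of K.
u = np.zeros(n, dtype=complex)
u[0], u[1] = 1.0, 0.1
u /= np.linalg.norm(u)
theta = -0.05 + 1.0j                         # rough eigenvalue guess

for _ in range(8):
    theta, u = sijd_step(theta, u)

print("residual norm:", np.linalg.norm(T(theta) @ u))
```

With the correction equation solved accurately, the residual norm drops rapidly from one step to the next, which is the behavior the paper's residual-based bounds quantify; replacing the least-squares solve by a cruder approximation illustrates the dependence on the correction solve that the numerical examples explore.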