RIKEN Center for Computational Science (R-CCS)


The 150th Seminar, Part 2

Date and Time: Friday, October 19, 2018, 14:30 - 15:00
Venue: Lecture Hall, 6th floor, R-CCS

・Title: Evaluation of the Chebyshev Smoother and the MSDO-CG Enlarged Krylov Subspace Method
・Speaker: Ryo Yoda (Kogakuin University)
※ Both the talk and the slides will be in English.

Abstract:

The first topic is the Chebyshev smoother used in the multigrid method. The Chebyshev smoother is derived from the Chebyshev polynomials. Unlike the Jacobi and Gauss-Seidel methods, it is a nonstationary method: its coefficients change from one sweep to the next. It offers parallelism as high as the Jacobi method while converging as fast as the Gauss-Seidel method. We used the Jacobi, Gauss-Seidel, and Chebyshev iterations as smoothers in the multigrid method, and compared the iteration counts of the multigrid-preconditioned conjugate gradient method as well as the improvement of the eigenvalue distribution.

The second topic is the enlarged Krylov subspace method. This method splits the solution vector and the set of search directions into spatially distinct sets, which enlarges the number of degrees of freedom of the search-direction space compared with the standard Krylov subspace method. However, the Multiple Search Direction Conjugate Gradient method (MSD-CG), one of the conventional enlarged Krylov subspace CG methods, does not enforce A-orthogonality among the search direction vectors. The Multiple Search Direction Conjugate Gradient method with Orthogonalization (MSDO-CG) adds A-orthogonalization steps to guarantee faster convergence. We implemented MSDO-CG and evaluated its convergence behavior.
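The Chebyshev iteration mentioned in the first topic can be sketched as follows. This is a minimal illustration, not the speaker's implementation: it assumes the matrix is symmetric positive definite with known eigenvalue bounds, and it applies the iteration over the full spectrum so it acts as a solver (inside multigrid, one would normally bound only the upper part of the spectrum so that the sweep damps high-frequency error).

```python
import numpy as np

def chebyshev_smoother(A, b, x, lam_min, lam_max, iters):
    """Chebyshev iteration for SPD A with eigenvalues in [lam_min, lam_max].

    Unlike Jacobi/Gauss-Seidel, the coefficients change every sweep
    (a nonstationary method), yet each sweep needs only a matrix-vector
    product, so it parallelizes as well as Jacobi.
    """
    theta = 0.5 * (lam_max + lam_min)   # center of the eigenvalue interval
    delta = 0.5 * (lam_max - lam_min)   # half-width of the interval
    sigma = theta / delta
    rho = 1.0 / sigma
    r = b - A @ x                       # initial residual
    d = r / theta                       # first correction
    for _ in range(iters):
        x = x + d
        r = r - A @ d                   # keep the residual up to date
        rho_prev = rho
        rho = 1.0 / (2.0 * sigma - rho_prev)
        # three-term Chebyshev recurrence for the next correction
        d = rho * rho_prev * d + (2.0 * rho / delta) * r
    return x
```

The recurrence requires no inner products, which is another reason it is attractive as a parallel smoother.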
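The enlarged-subspace splitting and the A-orthogonalization that distinguishes MSDO-CG from MSD-CG can be sketched as follows. This is a simplified illustration under assumed details (contiguous equal-sized subdomains, full A-reorthogonalization against all earlier blocks, no handling of rank-deficient direction blocks), not the published algorithm or the speaker's implementation.

```python
import numpy as np

def split_residual(r, t):
    """Splitting operator of enlarged Krylov methods: column i keeps the
    entries of r on subdomain i and is zero elsewhere, so one residual
    yields t spatially distinct search directions instead of one."""
    n = r.shape[0]
    bounds = np.linspace(0, n, t + 1).astype(int)  # contiguous subdomains (an assumption)
    P = np.zeros((n, t))
    for i in range(t):
        P[bounds[i]:bounds[i + 1], i] = r[bounds[i]:bounds[i + 1]]
    return P

def msdo_cg_sketch(A, b, t, tol=1e-10, maxit=50):
    """Simplified MSDO-CG-style solver for SPD A.

    Each iteration builds t directions from the split residual and
    A-orthogonalizes them against all previous blocks -- the step that
    MSD-CG omits and MSDO-CG adds to restore A-orthogonality.
    """
    n = b.shape[0]
    x = np.zeros(n)
    r = b.copy()
    blocks = []  # previously built, A-orthonormal direction blocks
    for _ in range(maxit):
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        P = split_residual(r, t)
        for Q in blocks:                 # A-orthogonalize vs. earlier directions
            P -= Q @ (Q.T @ (A @ P))
        # A-orthonormalize the block: with M = P^T A P = L L^T,
        # (P L^-T)^T A (P L^-T) = I (assumes P has full column rank)
        L = np.linalg.cholesky(P.T @ A @ P)
        P = np.linalg.solve(L, P.T).T    # i.e. P @ inv(L).T
        alpha = P.T @ r                  # step that minimizes the A-norm error
        x += P @ alpha
        r -= A @ (P @ alpha)
        blocks.append(P)
    return x
```

Because each iteration adds t A-orthonormal directions rather than one, the sketch converges in correspondingly fewer iterations than plain CG, at the cost of more work and storage per iteration.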