
Default preconditioner type for finite elasticity #127

Open
beauof opened this issue Mar 12, 2018 · 6 comments

@beauof
Contributor

beauof commented Mar 12, 2018

The default iterative preconditioner type for finite elasticity (incompressible Mooney-Rivlin law) seems to be: CMFE_SOLVER_ITERATIVE_JACOBI_PRECONDITIONER

This, however, causes the default PETSc line-search solver to diverge with the following error message:

Nonlinear solve monitor:
Iteration number = 0
Function Norm = 19.727618690913857
Newton solver information:
Number of function evaluations = 1
Number of Jacobian evaluations = 0
>>ERROR: 1: Nonlinear line search solver did not converge. PETSc diverged linear solve.
>> SOLVER_NEWTON_LINESEARCH_SOLVE
>> SOLVER_NEWTON_SOLVE
>> SOLVER_NONLINEAR_SOLVE
>> SOLVER_SOLVE
>> Problem_SolverEquationsStaticNonlinearSolve
>> PROBLEM_SOLVER_EQUATIONS_SOLVE
>> PROBLEM_SOLVER_SOLVE
>> PROBLEM_CONTROL_LOOP_SOLVE
>> PROBLEM_SOLVE
>> cmfe_Problem_SolveObj

In fact, in solver_routines.f90, subroutine Solver_LinearIterativePreconditionerTypeSet, we have the case:

CASE(SOLVER_ITERATIVE_JACOBI_PRECONDITIONER)
  CALL FlagError("Iterative Jacobi preconditioning is not implemented for a PETSc library.",ERR,ERROR,*999)

But this case is never reached, even if the user specifies CMFE_SOLVER_ITERATIVE_JACOBI_PRECONDITIONER as the preconditioner type, because of an IF statement directly above:

IF(ITERATIVE_PRECONDITIONER_TYPE/=SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE) THEN
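
The surrounding structure in Solver_LinearIterativePreconditionerTypeSet is roughly the following (an abbreviated sketch for illustration, not a verbatim copy of solver_routines.f90):

IF(ITERATIVE_PRECONDITIONER_TYPE/=SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE) THEN
  !Only entered when the requested type differs from the stored type; since Jacobi
  !is already the stored default, requesting Jacobi never reaches the SELECT CASE.
  SELECT CASE(ITERATIVE_PRECONDITIONER_TYPE)
  CASE(SOLVER_ITERATIVE_JACOBI_PRECONDITIONER)
    CALL FlagError("Iterative Jacobi preconditioning is not implemented for a PETSc library.",ERR,ERROR,*999)
  CASE(SOLVER_ITERATIVE_BLOCK_JACOBI_PRECONDITIONER)
    !The other cases simply store the requested type, e.g.:
    SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE=SOLVER_ITERATIVE_BLOCK_JACOBI_PRECONDITIONER
  !...further preconditioner cases...
  END SELECT
END IF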

The divergence issues can be resolved for this problem class/type with, e.g., CMFE_SOLVER_ITERATIVE_BLOCK_JACOBI_PRECONDITIONER or CMFE_SOLVER_ITERATIVE_INCOMPLETE_LU_PRECONDITIONER.
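
For anyone looking for a quick user-side workaround, the override can be set during problem setup roughly as follows (a sketch assuming the standard cmfe Fortran bindings and that the Newton solver is the first solver in the control loop; variable names are illustrative):

  TYPE(cmfe_SolverType) :: NonlinearSolver,LinearSolver
  ...
  CALL cmfe_Solver_Initialise(NonlinearSolver,Err)
  CALL cmfe_Solver_Initialise(LinearSolver,Err)
  CALL cmfe_Problem_SolverGet(Problem,CMFE_CONTROL_LOOP_NODE,1,NonlinearSolver,Err)
  !Get the linear solver underneath the Newton line-search solver.
  CALL cmfe_Solver_NewtonLinearSolverGet(NonlinearSolver,LinearSolver,Err)
  !Override the default Jacobi preconditioner, e.g. with block Jacobi (or incomplete LU).
  CALL cmfe_Solver_LinearIterativePreconditionerTypeSet(LinearSolver, &
    & CMFE_SOLVER_ITERATIVE_BLOCK_JACOBI_PRECONDITIONER,Err)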

I would suggest:

  1. Change the default preconditioner type for this problem class/type.
  2. Remove the IF statement, as it does not prevent developers from setting incompatible default values.

Any comments @chrispbradley @PrasadBabarendaGamage ?

@chrispbradley
Member

The line
IF(ITERATIVE_PRECONDITIONER_TYPE/=SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE) THEN
means that the preconditioner will not be changed if you specify the same preconditioner as the current one? It should not prevent you from changing the preconditioner?

I've no idea why the FlagError says iterative Jacobi is not implemented for PETSc, as the code does set the PETSc Jacobi preconditioner if you have this preconditioner type (which is the default).

Having the preconditioner default to Jacobi seems sensible to me, as this is the simplest and most widely applicable? The preconditioner is set by the solver, not the problem, so the question is what the most sensible preconditioner for a GMRES solver is.

The FlagError should be removed and replaced with the line

                  SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE=SOLVER_ITERATIVE_JACOBI_PRECONDITIONER
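
The Jacobi case would then read roughly as follows (a sketch of the proposed change, not tested code):

                  CASE(SOLVER_ITERATIVE_JACOBI_PRECONDITIONER)
                    !Store the requested type; the PETSc Jacobi preconditioner is then set from this value.
                    SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE=SOLVER_ITERATIVE_JACOBI_PRECONDITIONER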

@beauof
Contributor Author

beauof commented Mar 12, 2018

Yes, exactly. The line

IF(ITERATIVE_PRECONDITIONER_TYPE/=SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE) THEN

only prevents you from setting the same value again. But of course, you could in theory reach the case (and thus the FlagError) by first setting a non-default value and then calling the respective routine a second time to set the value back to its default.

GMRES is the default solver; I have double-checked this. So it seems that the Jacobi preconditioner causes the issue with PETSc 3.6.1, and we can avoid it by setting, e.g., BLOCK_JACOBI or INCOMPLETE_LU as the preconditioner.

@chrispbradley
Member

Hi Andreas,
yes, that is why I said to replace the FlagError line with
SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE=SOLVER_ITERATIVE_JACOBI_PRECONDITIONER

rather than get rid of the IF statement. This will fix the problems.

Do you think this is a PETSc 3.6.1 problem? We are well overdue to upgrade PETSc and the other dependencies.

@beauof
Contributor Author

beauof commented Mar 12, 2018

Will do so!

Re PETSc 3.6.1 - I just added the version number for detail. I don't necessarily think that it's a PETSc issue, although it's possible that the problem disappears once PETSc is upgraded.
Nonetheless, I think it would be better to track down the issue with the current set of dependencies to avoid other complications.

I'll have a look and let you know how it goes!

@beauof
Contributor Author

beauof commented Mar 15, 2018

Setting the preconditioner to BLOCK_JACOBI is only a temporary fix.

When one increases the number of elements beyond ~128 (e.g. a generated mesh with 8x4x4 elements) for a uniaxial test (incompressible Neo-Hookean, quadratic-linear Taylor-Hood) with a stretch of 1.2, the solver converges for a few iterations and then just heads off in the wrong direction. For example, see example-0201-u here.

There must be something else going on that we are not aware of.

This clearly indicates that we also need to test larger numbers of elements in the functional test framework.

@chrispbradley
Member

Jacobi preconditioners are the most basic and might not affect the matrix spectrum that much. Also, incompressible mechanics problems form saddle-point systems that are notorious for this. Maybe try another preconditioner? Or a direct method, as the problem size is not that big?
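
If going down the direct route, switching the linear solver on the user side might look roughly like this (again only a sketch, assuming the standard cmfe bindings and the LinearSolver handle from the earlier snippet):

  !Use a direct LU solve instead of GMRES plus a preconditioner for this problem size.
  CALL cmfe_Solver_LinearTypeSet(LinearSolver,CMFE_SOLVER_LINEAR_DIRECT_SOLVE_TYPE,Err)
  CALL cmfe_Solver_LinearDirectTypeSet(LinearSolver,CMFE_SOLVER_DIRECT_LU,Err)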
