Default preconditioner type for finite elasticity #127
Comments
I've no idea why the FlagError says iterative Jacobi is not implemented for PETSc, as the code sets the PETSc Jacobi preconditioner if you have this preconditioner type (which is the default). Having the preconditioner default to Jacobi seems sensible to me, as this is the simplest and most widely applicable? The preconditioner is set by the solver, not the problem, so the question is what the most sensible preconditioner for a GMRES solver is. The FlagError should be removed and replaced with the corresponding assignment line.
Yes, exactly. The IF line only prevents you from setting the same value again. But of course, you could in theory set a non-default value first and then reach the CASE (and thus the FlagError) by calling the respective routine a second time to set the value back to its default. GMRES is the default solver; I have double-checked this. So it seems that the JACOBI preconditioner causes the issue with PETSc 3.6.1, and we can avoid the issue by setting, e.g., BLOCK_JACOBI or INCOMPLETE_LU as the preconditioner.
Hi Andreas, rather than get rid of the IF statement, replace the FlagError. This will fix the problem. Do you think this is a PETSc 3.6.1 problem? We are well overdue to upgrade PETSc and the other dependencies.
Will do so! Re PETSc 3.6.1 - I just added the version number for detail. I don't necessarily think that it's a PETSc issue, although it's possible that the problem disappears once PETSc is upgraded. I'll have a look and let you know how it goes!
Setting the preconditioner to BLOCK_JACOBI is only a temporary fix. When one increases the number of elements beyond ~128 (e.g. a generated mesh with 8x4x4 elements) for a uniaxial test (incompressible Neo-Hooke, quadratic-linear Taylor-Hood) with a stretch of 1.2, the solver converges for a few iterations, then just heads off in the wrong direction. For example, see example-0201-u here. There must be something else going on that we are not aware of. This clearly indicates that we also need to test larger numbers of elements in the functional test framework.
Jacobi preconditioners are the most basic and might not affect the matrix spectrum that much. Also, incompressible mechanics problems form saddle-point problems, which are notorious for this. Maybe try another preconditioner? Or a direct method, as the problem size is not that big?
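To illustrate why point Jacobi struggles here, a minimal sketch (with hypothetical numbers, not from any actual OpenCMISS assembly) of the saddle-point structure that incompressible mixed formulations produce: the pressure-pressure block is zero, so the matrix diagonal contains zeros and pointwise Jacobi scaling is undefined on those rows.

```python
# Hypothetical saddle-point system K = [[A, B^T], [B, 0]], as arises in
# mixed displacement/pressure formulations of incompressible elasticity.
A = [[4.0, 1.0],
     [1.0, 3.0]]   # displacement-displacement block (hypothetical values)
B = [[1.0, 2.0]]   # incompressibility constraint block (hypothetical values)

n, m = len(A), len(B)
K = [[0.0] * (n + m) for _ in range(n + m)]
for i in range(n):
    for j in range(n):
        K[i][j] = A[i][j]
for i in range(m):
    for j in range(n):
        K[n + i][j] = B[i][j]       # constraint block B
        K[j][n + i] = B[i][j]       # its transpose B^T
# The lower-right m x m (pressure-pressure) block stays zero.

diag = [K[i][i] for i in range(n + m)]
print(diag)   # -> [4.0, 3.0, 0.0]
# Point Jacobi preconditioning divides row i by diag[i]; the zero entry on
# the pressure row makes the Jacobi preconditioner singular.
```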
The default iterative preconditioner type for finite elasticity (incompressible Mooney-Rivlin law) seems to be: CMFE_SOLVER_ITERATIVE_JACOBI_PRECONDITIONER
This, however, causes the PETSc default line-search solver to diverge with the following error message:
Nonlinear solve monitor:
Iteration number = 0
Function Norm = 19.727618690913857
Newton solver information:
Number of function evaluations = 1
Number of Jacobian evaluations = 0
>>ERROR: 1: Nonlinear line search solver did not converge. PETSc diverged linear solve.
>> SOLVER_NEWTON_LINESEARCH_SOLVE
>> SOLVER_NEWTON_SOLVE
>> SOLVER_NONLINEAR_SOLVE
>> SOLVER_SOLVE
>> Problem_SolverEquationsStaticNonlinearSolve
>> PROBLEM_SOLVER_EQUATIONS_SOLVE
>> PROBLEM_SOLVER_SOLVE
>> PROBLEM_CONTROL_LOOP_SOLVE
>> PROBLEM_SOLVE
>> cmfe_Problem_SolveObj
In fact, in solver_routines.f90, subroutine Solver_LinearIterativePreconditionerTypeSet, we have the case:
CASE(SOLVER_ITERATIVE_JACOBI_PRECONDITIONER)
CALL FlagError("Iterative Jacobi preconditioning is not implemented for a PETSc library.",ERR,ERROR,*999)
But this case is never reached even if the user specifies CMFE_SOLVER_ITERATIVE_JACOBI_PRECONDITIONER as the preconditioner type because of an if statement directly above:
IF(ITERATIVE_PRECONDITIONER_TYPE/=SOLVER%LINEAR_SOLVER%ITERATIVE_SOLVER%ITERATIVE_PRECONDITIONER_TYPE) THEN
The divergence issues can be resolved for this problem class/type with, e.g., CMFE_SOLVER_ITERATIVE_BLOCK_JACOBI_PRECONDITIONER or CMFE_SOLVER_ITERATIVE_INCOMPLETE_LU_PRECONDITIONER.
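A small sketch (hypothetical numbers) of why factorization-based approaches such as incomplete LU or a direct solve cope with the zero pressure diagonal where point Jacobi cannot: elimination with pivoting moves a nonzero entry into the pivot position, so the factorization goes through even though the original diagonal contains a zero.

```python
# Hypothetical 3x3 saddle-point matrix with a zero (pressure) diagonal entry.
K = [[4.0, 1.0, 1.0],
     [1.0, 3.0, 2.0],
     [1.0, 2.0, 0.0]]
b = [1.0, 0.0, 1.0]

def solve_lu_pivot(M, rhs):
    """Gaussian elimination with partial pivoting on dense nested lists."""
    n = len(M)
    M = [row[:] for row in M]
    rhs = rhs[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))  # choose pivot row
        M[k], M[p] = M[p], M[k]
        rhs[k], rhs[p] = rhs[p], rhs[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n):
                M[i][j] -= f * M[k][j]
            rhs[i] -= f * rhs[k]
    x = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (rhs[i] - s) / M[i][i]
    return x

x = solve_lu_pivot(K, b)
residual = max(abs(sum(K[i][j] * x[j] for j in range(3)) - b[i])
               for i in range(3))
print(residual < 1e-12)   # -> True
```

Note this is only an analogy: PETSc's BLOCK_JACOBI applies a local factorization per process block, which is why it sidesteps the zero diagonal the same way this pivoted elimination does.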
I would suggest:
Any comments @chrispbradley @PrasadBabarendaGamage ?