parallel. Many thanks for all of your help.
Post by Barry Smith
Looks like the bug of using gmres on the coarse mesh is still there
in the latest patch release.
If you switch to PETSc master
(http://www.mcs.anl.gov/petsc/developers/index.html) it will not use gmres.
Barry
Post by Gil Forsyth
Hi Mark,
I've lost it, too. I was bisecting to find the change that started
returning the indefinite PC error to our code that had previously worked --
but this was using KSPSetNullSpace.
Post by Gil Forsyth
Increasing the number of steps was in an effort to see if it impacted
the error or not, partially based on this thread from PETSc-users (
http://lists.mcs.anl.gov/pipermail/petsc-users/2014-November/023653.html)
with one of the previous authors of our code.
Post by Gil Forsyth
Updating to PETSc master briefly eliminated the error in the poisson
solver, but only in serial; it still failed with an
indefinite PC error in parallel.
Post by Gil Forsyth
I'll confess that I'm not sure what to bisect between, as we don't
have a "good" version after the switch from KSPSetNullSpace ->
MatSetNullSpace. That's what prompted the initial bisection search in and
around the 3.5.4 commit range. I'm going to take another crack at that
today in a more automated fashion, since I expect I inserted some human
error somewhere along the way.
Post by Gil Forsyth
Compiled against PETSc v3.6.2, it shows again that the coarse grid
solver is using GMRES even when using -mg_coarse_ksp_type preonly. Logs
are attached.
Post by Gil Forsyth
#+BEGIN_SRC
$PETSC_ARCH/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_mg_coarse_ksp_type preonly -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 0 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info
#+END_SRC
Post by Gil Forsyth
I've lost this thread a bit, but you seem to be bisecting to find
where a problem started and you are noticing the gmres coarse grid solver.
We fixed a bug where PETSc was resetting the coarse grid solver to GMRES
when it should not, so older versions have this behavior. The current
version no longer resets the coarse grid solver type; I believe the fix
has been in place for all of v3.6, though it may have missed v3.6.1.
GAMG sets the coarse grid solver type to preonly, but you can override it.
Let me know if I'm missing something here.
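(If the coarse solver ever needs to be forced from code rather than from the
command line, a minimal sketch, assuming the PETSc 3.6-era two-argument
PetscOptionsSetValue and the poisson_ prefix used in this thread, might be:)
#+BEGIN_SRC
/* Sketch only: inject the prefixed option before KSPSetFromOptions so the
   GAMG coarse-level KSP ends up as preonly. */
ierr = PetscOptionsSetValue("-poisson_mg_coarse_ksp_type", "preonly"); CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);  /* ksp2 as set up elsewhere in this thread */
#+END_SRC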
Post by Gil Forsyth
I also see that you are setting -pc_gamg_agg_nsmooths 1,2,3,4. This
is the number of smoothing steps of the prolongation operator and you
should not use more than 1. In fact for CFD, you should use no smoothing
(0), probably.
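For what it's worth, the same choice can presumably also be made from code
rather than the command line; a minimal sketch, assuming the PCGAMGSetNSmooths
interface and the poisson_ KSP (ksp2) from this thread:
#+BEGIN_SRC
/* Sketch only: request unsmoothed aggregation (0 prolongator smoothing
   steps) programmatically, equivalent in intent to -pc_gamg_agg_nsmooths 0. */
PC pc;
ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);
ierr = PCGAMGSetNSmooths(pc, 0); CHKERRQ(ierr);
#+END_SRC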
Post by Gil Forsyth
Mark
Hi Mark,
I just noticed that in the previous commit 7743f89, it's also using
GMRES in the multigrid solve but doesn't complain until the 2nd timestep,
so my bisection criterion is off, since I was giving commits a PASS if they
made it through the one timestep without complaining about the indefinite
PC. I think I'm still close to the problem commit, but it's probably a
little bit before 25a145a. Apologies for the goose chase.
Post by Gil Forsyth
Thanks,
Gil Forsyth
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info >
kspview_3.5.4.log
#+END_SRC
Post by Gil Forsyth
and then against 25a145a with the same inputs. I notice that the
poisson multigrid solve in 25a145a is using GMRES again while 3.5.4 is
using preonly.
Post by Gil Forsyth
Logs from both runs are attached.
Can you please send a good log also, with the ksp_view.
Mark
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
Post by Gil Forsyth
#+BEGIN_SRC
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
#+END_SRC
Post by Gil Forsyth
Thanks again,
Gil Forsyth
Ah, got it. I'll check out the master branch and see if the behavior
persists.
Post by Gil Forsyth
Many thanks,
Gil
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show
that the behavior seems to have changed between versions. The only
difference in our code between 3.5.4 and 3.6.1 is the change from
KSPSetNullSpace to MatSetNullSpace.
Post by Gil Forsyth
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
Post by Gil Forsyth
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Post by Gil Forsyth
Thanks,
Matt
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm not
sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
Post by Gil Forsyth
#+BEGIN_SRC
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
#+END_SRC
Post by Gil Forsyth
Post by Gil Forsyth
and shows that the coarse grid solver is of type: preonly.
Running the newer version that uses MatSetNullSpace in its stead and
adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
Post by Gil Forsyth
#+BEGIN_SRC
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
#+END_SRC
Post by Gil Forsyth
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG: CG requires a fixed,
symmetric positive-definite preconditioner, and an inner GMRES sweep makes
the preconditioner change from one application to the next. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to
set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2 would
not be created yet.
Post by Gil Forsyth
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output of
-ksp_view so we
Post by Gil Forsyth
Post by Gil Forsyth
know what we are dealing with. It's also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in the
reverse order that I wrote them out.
Post by Gil Forsyth
Post by Gil Forsyth
I've attached the output of
#+BEGIN_SRC
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
#+END_SRC
Post by Gil Forsyth
Post by Gil Forsyth
And then set up the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);         /* options use the poisson_ prefix */
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE); CHKERRQ(ierr);
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE); CHKERRQ(ierr);   /* keep the same PC across solves */
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
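(As a rough illustration of that reuse, not the actual petibm code; updateRHS,
rhs, phi and nsteps are hypothetical names:)
#+BEGIN_SRC
/* Illustrative sketch: the KSP and its preconditioner are built once and
   reused every timestep; only the right-hand side changes between solves. */
PetscInt step;
for (step = 0; step < nsteps; ++step) {
  ierr = updateRHS(rhs, step); CHKERRQ(ierr);      /* hypothetical helper filling the new rhs */
  ierr = KSPSolve(ksp2, rhs, phi); CHKERRQ(ierr);  /* QTBNQ unchanged, PC reused */
}
#+END_SRC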
Post by Gil Forsyth
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and then returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
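For completeness, the full 3.6-style null space setup as we understand it
looks like the following sketch (same variables as above; the constant null
space is now attached to the matrix rather than to the KSP):
#+BEGIN_SRC
/* Sketch of the 3.6-style setup: attach the constant null space to QTBNQ. */
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);   /* replaces the 3.5.x KSPSetNullSpace(ksp2, nsp) */
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
#+END_SRC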
Post by Gil Forsyth
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview_arch-3264318.log>