Discussion: [petsc-users] Moving from KSPSetNullSpace to MatSetNullSpace
Gil Forsyth
2015-09-29 15:28:41 UTC
Hi all,

I've been having some trouble with what should be a relatively simple
update of an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1.

I'm getting indefinite PC errors for a simple lid-driven cavity test
problem, 32x32 at Re 100.

Under PETSc 3.5.4 using KSPSetNullSpace we used the following to set the
null space. This is for a 2D Poisson system with no immersed boundary and
so the null space is the constant vector.

MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
CHKERRQ(ierr);
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);


And then set up the KSP with

ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE); CHKERRQ(ierr);
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE); CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);


The matrix QTBNQ does not change; only the RHS of the system is updated.
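
Roughly, the solve inside the time loop looks like the sketch below (rhs2 and
phi are placeholder names, not the exact variables in our code):

KSPConvergedReason reason;
/* each timestep: rebuild only the right-hand side, then reuse ksp2 and its PC */
ierr = KSPSolve(ksp2, rhs2, phi); CHKERRQ(ierr);
ierr = KSPGetConvergedReason(ksp2, &reason); CHKERRQ(ierr);
if (reason < 0) { /* e.g. -8 is KSP_DIVERGED_INDEFINITE_PC */ }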

We run this with `-pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1`
and everything seems to work as expected.

Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to

ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);

and the same code diverges after 1 timestep and returns -8,
KSP_DIVERGED_INDEFINITE_PC.

This is weird, especially because if we change nsmooths to 2, it runs for
264 timesteps and then returns the same error. But we have explicitly
called KSPSetReusePreconditioner, so it should be using the same PC, right?

Change nsmooths to 3 and it again diverges after 1 timestep.

Change nsmooths to 4 and it runs to completion.

It seems like either gamg's behavior has changed, or KSPSetNullSpace
was doing something implicitly that we now need to do explicitly in
addition to MatSetNullSpace?
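
For reference, here is a sketch of the only difference between the two
versions of our code (everything else, including the KSP setup above, is
unchanged):

MatNullSpace nsp;

/* PETSc 3.5.4: null space attached to the KSP */
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);

/* PETSc 3.6.1: null space attached to the operator instead */
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);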

Thanks,
Gil Forsyth
Matthew Knepley
2015-09-29 15:42:29 UTC
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to set the
null space. This is for a 2D Poisson system with no immersed boundary and
so the null space is the constant vector.
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
CHKERRQ(ierr);
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2 would not be
created yet.
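
(I.e., something like the sketch below, with the null-space calls made only
after the KSP exists; names as in your snippet:)

ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
/* ... rest of the KSP setup ... */
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);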

For questions about solvers, we HAVE to see the complete output of
-ksp_view so we know what we are dealing with. It's also nice to have
-ksp_monitor_true_residual and -ksp_converged_reason.

Matt
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gil Forsyth
2015-09-29 15:53:54 UTC
Post by Matthew Knepley
Clearly this has to happen in the reverse order, since ksp2 would not be
created yet.
For questions about solvers, we HAVE to see the complete output of
-ksp_view so we know what we are dealing with. It's also nice to have
-ksp_monitor_true_residual and -ksp_converged_reason.
Matt
Yes -- sorry, those are both in inline files and are called in the reverse
order that I wrote them out.

I've attached the output of

$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview.log
Barry Smith
2015-09-29 16:37:42 UTC
This can't work. You can't use a GMRES inside a CG. Try changing to -poisson_mg_coarse_ksp_type preonly.

KSP Object:(poisson_) 1 MPI processes
type: cg

KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
Gil Forsyth
2015-09-29 17:00:59 UTC
Hi Barry,

We aren't explicitly setting GMRES anywhere in the code and I'm not sure
why it's being used. Running our 3.5.4 code using KSPSetNullSpace works
with:

$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.5.4

and shows that the coarse grid solver is of type: preonly.

Running the newer version that uses MatSetNullSpace in its stead, with
-poisson_mg_coarse_ksp_type preonly added,

$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1

still shows

KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test


Both logs are attached.
Barry Smith
2015-09-29 17:04:02 UTC
Update your PETSc
Gil Forsyth
2015-09-29 17:08:26 UTC
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that the
behavior seems to have changed between versions. The only difference in
our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to
MatSetNullSpace.
Matthew Knepley
2015-09-29 17:10:49 UTC
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this. I think what Barry means is, "you should
get the behavior you expect using the master branch from PETSc development".

Thanks,

Matt
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gil Forsyth
2015-09-29 17:12:44 UTC
Ah, got it. I'll check out the master branch and see if the behavior
persists.
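
(Roughly what I plan to do, sketched below; the PETSC_ARCH name is just a
placeholder and our usual configure options are elided:)

cd $PETSC_DIR
git fetch origin
git checkout master
./configure PETSC_ARCH=arch-master-opt    # plus our usual configure options
make PETSC_DIR=$PETSC_DIR PETSC_ARCH=arch-master-opt all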

Many thanks,
Gil
Gil Forsyth
2015-09-30 20:11:14 UTC
Using the PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. The
same case runs to completion when I don't use GAMG. A log is attached for
the following run.

$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2 $PETIBM_DIR/petibm-git/bin/petibm2d
-directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason


Thanks again,
Gil Forsyth
Barry Smith
2015-09-30 20:24:34 UTC
Did the exact same thing run in parallel without the indefinite problem?

Run that failure with -info and send all the output.

You could use bisection to find out exactly what change in the library breaks your example.

Barry
Gil Forsyth
2015-09-30 20:36:03 UTC
The exact same thing ran in serial without the indefinite problem, but it
does crop up in all parallel runs.

I've attached the failure run log and I'll start bisecting against 3.5.4 to
try to track down the change.
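
(Sketch of the plan, assuming the usual PETSc release tags:)

cd $PETSC_DIR
git bisect start
git bisect bad master       # fails with the indefinite PC in parallel
git bisect good v3.5.4      # known-good release
# at each step: rebuild PETSc, rerun the parallel lid-driven cavity case,
# then mark the commit with `git bisect good` or `git bisect bad`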

Thanks!
Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
Post by Gil Forsyth
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<mpi_n2_indefinite_pc.log>
Gil Forsyth
2015-10-01 19:10:59 UTC
Permalink
I think I found it. I initially thought the problem was in the shift from
KSPSetNullSpace to MatSetNullSpace, but in retrospect that doesn't make much
sense, since the two ostensibly offer the same functionality. I discovered
that our code returns the same KSP_DIVERGED_INDEFINITE_PC error at commit
e8f7834, which is the last commit before KSPSetNullSpace was removed.
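For reference, a minimal sketch of the two variants as they appear in our
code (ierr, ksp2, and QTBNQ as in the snippets quoted below); only one of the
two Set calls is used, depending on the PETSc version:

MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);

/* 3.5.4 path: the null space is attached to the solver (ksp2 must already exist) */
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);

/* 3.6.x path: the null space is attached to the operator and picked up at solve time */
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);

ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);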

I've attached the bisection log between e8f7834 and 9fbf19a (v3.5.4) and
the problem seems to have been introduced in 25a145a ("fixed gamg coarse
grid to be general").

Thanks,
Gil Forsyth
Post by Gil Forsyth
The exact same thing ran in serial without the indefinite problem, but it
does crop up in all parallel runs.
I've attached the failure run log and I'll start bisecting against 3.5.4
to try to track down the change.
Thanks!
Gil Forsyth
Post by Barry Smith
Did the exact same thing run in parallel without the indefinite problem?
Run that failure with -info and send all the output
You could use bisection to find out exactly what change in the library
breaks your example.
Barry
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
Post by Gil Forsyth
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Post by Gil Forsyth
Thanks again,
Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Post by Gil Forsyth
Many thanks,
Gil
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that
the behavior seems to have changed between versions. The only difference
in our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to
MatSetNullSpace.
Post by Gil Forsyth
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
Post by Gil Forsyth
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Post by Gil Forsyth
Thanks,
Matt
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm not
sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.5.4
Post by Gil Forsyth
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead and
adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
Post by Gil Forsyth
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try changing
to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
Post by Gil Forsyth
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to
set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL,
&nsp); CHKERRQ(ierr);
Post by Gil Forsyth
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2 would
not be created yet.
Post by Gil Forsyth
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output of
-ksp_view so we
Post by Gil Forsyth
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in the
reverse order that I wrote them out.
Post by Gil Forsyth
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE); CHKERRQ(ierr);
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE); CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<mpi_n2_indefinite_pc.log>
Mark Adams
2015-10-01 19:52:30 UTC
Permalink
Can you please send a good log also, with the ksp_view.
Mark
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2 $PETIBM_DIR/petibm-git/bin/petibm2d
-directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Post by Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Many thanks,
Gil
Post by Matthew Knepley
Post by Gil Forsyth
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that
the behavior seems to have changed between versions. The only difference
in our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to
MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Thanks,
Matt
Post by Gil Forsyth
Post by Barry Smith
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm not
sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead and
adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to
set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL,
&nsp); CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2 would
not be created yet.
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output of
-ksp_view so we
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in the
reverse order that I wrote them out.
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gil Forsyth
2015-10-01 20:21:24 UTC
Permalink
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info >
kspview_3.5.4.log
#+END_SRC

and then against 25a145a with the same inputs. I notice that the Poisson
multigrid solve in 25a145a is again using GMRES on the coarse grid, while
3.5.4 is using preonly.

Logs from both runs are attached.
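To double-check which coarse solver actually ends up being used, I'm also
thinking of querying it directly from the code. A rough sketch, assuming
PCMGGetCoarseSolve applies to the GAMG PC once the solver has been set up
(ierr and ksp2 as in our setup code):

PC      pc;
KSP     coarse;
KSPType ctype;

ierr = KSPSetUp(ksp2); CHKERRQ(ierr);                  /* make sure the GAMG hierarchy exists */
ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
ierr = PCMGGetCoarseSolve(pc, &coarse); CHKERRQ(ierr);
ierr = KSPGetType(coarse, &ctype); CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "coarse grid KSP type: %s\n", ctype); CHKERRQ(ierr);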
Post by Mark Adams
Can you please send a good log also, with the ksp_view.
Mark
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Post by Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Many thanks,
Gil
Post by Matthew Knepley
Post by Gil Forsyth
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show
that the behavior seems to have changed between versions. The only
difference in our code between 3.5.4 and 3.6.1 is the change from
KSPSetNullSpace to MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Thanks,
Matt
Post by Gil Forsyth
Post by Barry Smith
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm not
sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead
and adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to
set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0,
NULL, &nsp); CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2 would
not be created yet.
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output
of -ksp_view so we
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in
the reverse order that I wrote them out.
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gil Forsyth
2015-10-01 20:27:23 UTC
Permalink
Hi Mark,

I just noticed that the previous commit, 7743f89, also uses GMRES in the
multigrid solve but doesn't complain until the 2nd timestep, so my bisection
criterion is off: I was giving commits a PASS if they made it through one
timestep without complaining about the indefinite PC. I think I'm still close
to the problem commit, but it's probably a little before 25a145a. Apologies
for the goose chase.

Thanks,
Gil Forsyth
Post by Gil Forsyth
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info >
kspview_3.5.4.log
#+END_SRC
and then against 25a145a with the same inputs. I notice that the poisson
multigrid solve in 25a145a is using GMRES again while 3.5.4 is using
preonly.
Logs from both runs are attached.
Post by Mark Adams
Can you please send a good log also, with the ksp_view.
Mark
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Post by Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Many thanks,
Gil
Post by Matthew Knepley
Post by Gil Forsyth
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show
that the behavior seems to have changed between versions. The only
difference in our code between 3.5.4 and 3.6.1 is the change from
KSPSetNullSpace to MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Thanks,
Matt
Post by Gil Forsyth
Post by Barry Smith
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm
not sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead
and adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following to
set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0,
NULL, &nsp); CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2
would not be created yet.
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output
of -ksp_view so we
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in
the reverse order that I wrote them out.
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2, it
runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Mark Adams
2015-10-04 12:57:22 UTC
Permalink
I've lost this thread a bit, but you seem to be bisecting to find where a
problem started, and you are noticing the gmres coarse grid solver. We fixed
a bug where PETSc was resetting the coarse grid solver to GMRES when it
should not. Older versions have this bug; the current version has the fix of
not resetting the coarse grid solver type. I think the fix has been in place
for all of v3.6, but it might have missed v3.6.1. GAMG sets the coarse grid
solver type to preonly, but you can override it. Let me know if I'm missing
something here.

I also see that you are setting -pc_gamg_agg_nsmooths to 1, 2, 3, and 4. This
is the number of smoothing steps applied to the prolongation operator, and
you should not use more than 1. In fact, for CFD you should probably use no
smoothing (0).
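If you would rather fix that in the code than on the command line, a sketch
of what I mean, assuming you set it on the Poisson KSP (ksp2 in your
snippets) before the options are applied; PCGAMGSetNSmooths is the C-side
counterpart of -pc_gamg_agg_nsmooths:

PC pc;
ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);           /* same as -pc_type gamg */
ierr = PCGAMGSetNSmooths(pc, 0); CHKERRQ(ierr);        /* no prolongation smoothing */
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);         /* command-line options can still override this */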

Mark
Post by Gil Forsyth
Hi Mark,
I just noticed that in the previous commit 7743f89, it's also using GMRES
in the multigrid solve but doesn't complain until the 2nd timestep, so my
bisection criteria is off, since I was giving commits a PASS if they made
it through the one timestep without complaining about the indefinite PC. I
think I'm still close to the problem commit, but it's probably a little bit
before 25a145a. Apologies for the goose chase.
Thanks,
Gil Forsyth
Post by Gil Forsyth
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info >
kspview_3.5.4.log
#+END_SRC
and then against 25a145a with the same inputs. I notice that the poisson
multigrid solve in 25a145a is using GMRES again while 3.5.4 is using
preonly.
Logs from both runs are attached.
Post by Mark Adams
Can you please send a good log also, with the ksp_view.
Mark
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Post by Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Many thanks,
Gil
Post by Matthew Knepley
Post by Gil Forsyth
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show
that the behavior seems to have changed between versions. The only
difference in our code between 3.5.4 and 3.6.1 is the change from
KSPSetNullSpace to MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they
had unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Thanks,
Matt
Post by Gil Forsyth
Post by Barry Smith
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm
not sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead
and adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000
Post by Gil Forsyth
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven cavity
test problem, 32x32 at Re 100
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following
to set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0,
NULL, &nsp); CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2
would not be created yet.
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete output
of -ksp_view so we
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in
the reverse order that I wrote them out.
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory .
-poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths
1 -poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system is
updated.
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2,
it runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin
their experiments is infinitely more interesting than any results to which
their experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gil Forsyth
2015-10-05 14:06:03 UTC
Permalink
Hi Mark,

I've lost it, too. I was bisecting to find the change that first caused the
indefinite PC error in our code, which had previously worked -- but that was
with KSPSetNullSpace.
Increasing the number of smoothing steps was an effort to see whether it
affected the error, based in part on this petsc-users thread (
http://lists.mcs.anl.gov/pipermail/petsc-users/2014-November/023653.html)
with one of the previous authors of our code.

Updating to PETSc master briefly eliminated the error in the Poisson solver,
though only in serial; in parallel it still failed with an indefinite PC
error.

I'll confess that I'm not sure what to bisect between, as we don't have a
"good" version after the switch from KSPSetNullSpace -> MatSetNullSpace.
That's what prompted the initial bisection search in and around the 3.5.4
commit range. I'm going to take another crack at that today in a more
automated fashion, since I expect I introduced some human error somewhere
along the way.

Compiled against PETSc v3.6.2, the code again shows the coarse grid solver
using GMRES, even with -mg_coarse_ksp_type preonly (passed via the poisson_
prefix, as in the command below). Logs are attached.

$PETSC_ARCH/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_mg_coarse_ksp_type preonly -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 0 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info
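As a stopgap I may also try pinning the coarse solver from the code once the
preconditioner has been built. A rough sketch of the idea, assuming
PCMGGetCoarseSolve works on the GAMG PC after KSPSetUp (ierr and ksp2 as in
our setup code):

PC  pc;
KSP coarse;

ierr = KSPSetUp(ksp2); CHKERRQ(ierr);                  /* build the GAMG hierarchy first */
ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
ierr = PCMGGetCoarseSolve(pc, &coarse); CHKERRQ(ierr);
ierr = KSPSetType(coarse, KSPPREONLY); CHKERRQ(ierr);  /* replace the gmres coarse solve */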
Post by Mark Adams
I've lost this thread a bit, but you seem to be bisecting to find where a
problem started and you are noticing the gmres coarse grid solver. We
fixed a bug where PETSc was resetting the coarse grid solver to GMRES when
it should not. So older versions have this, but the current version, and I
think this has been in place for all of v3.6, but it might have missed
v3.6.1, have the fix of not resetting the coarse grid solver type. GAMG
sets the coarse grid solver type to preonly, but you can override it. Let
me know if I'm missing something here.
I also see that you are setting -pc_gamg_agg_nsmooths 1,2,3,4. This is
the number of smoothing steps of the prolongation operator and you should
not use more than 1. In fact for CFD, you should use no smoothing (0),
probably.
Mark
Post by Gil Forsyth
Hi Mark,
I just noticed that in the previous commit 7743f89, it's also using GMRES
in the multigrid solve but doesn't complain until the 2nd timestep, so my
bisection criteria is off, since I was giving commits a PASS if they made
it through the one timestep without complaining about the indefinite PC. I
think I'm still close to the problem commit, but it's probably a little bit
before 25a145a. Apologies for the goose chase.
Thanks,
Gil Forsyth
Post by Gil Forsyth
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100
-poisson_pc_type gamg -poisson_pc_gamg_type agg
-poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info >
kspview_3.5.4.log
#+END_SRC
and then against 25a145a with the same inputs. I notice that the
poisson multigrid solve in 25a145a is using GMRES again while 3.5.4 is
using preonly.
Logs from both runs are attached.
Post by Mark Adams
Can you please send a good log also, with the ksp_view.
Mark
Post by Gil Forsyth
Using PETSc master branch solved the problem in serial, but I'm still
seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This
runs to completion when I don't use GAMG. Log is attached for the
following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2
$PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg
-poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Post by Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior
persists.
Many thanks,
Gil
Post by Matthew Knepley
Post by Gil Forsyth
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show
that the behavior seems to have changed between versions. The only
difference in our code between 3.5.4 and 3.6.1 is the change from
KSPSetNullSpace to MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they
had unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect
using the master branch from PETSc development"
Thanks,
Matt
Post by Gil Forsyth
Post by Barry Smith
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm
not sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
Post by Gil Forsyth
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview3.5.4
Post by Gil Forsyth
and shows that the coarse grid solver is of type:preonly
running the newer version that uses MatSetNullSpace in its stead
and adding in -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type
gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1
-poisson_mg_coarse_ksp_type preonly -poisson_ksp_view
-poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason >
kspview3.6.1
Post by Gil Forsyth
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000
Post by Gil Forsyth
left preconditioning
using NONE norm type for convergence test
both logs are attached.
On Tue, Sep 29, 2015 at 12:37 PM, Barry Smith <
This can't work. You can't use a GMRES inside a CG. Try
changing to -poisson_mg_coarse_ksp_type preonly
Post by Gil Forsyth
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement
Post by Gil Forsyth
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <
On Tue, Sep 29, 2015 at 10:28 AM, Gil Forsyth <
Hi all,
I've been having some trouble with what should be a relatively
simple update to an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1
Post by Gil Forsyth
I'm getting indefinite PC errors for a simple lid-driven
cavity test problem, 32x32 at Re 100
Post by Gil Forsyth
Under PETSc 3.5.4 using KSPSetNullSpace we used the following
to set the null space. This is for a 2D Poisson system with no immersed
boundary and so the null space is the constant vector.
Post by Gil Forsyth
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0,
NULL, &nsp); CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
Clearly this has to happen in the reverse order, since ksp2
would not be created yet.
Post by Gil Forsyth
For questions about solvers, we HAVE to see the complete
output of -ksp_view so we
Post by Gil Forsyth
know what we are dealing with. Its also nice to have
-ksp_monitor_true_residual -ksp_converged_reason
Post by Gil Forsyth
Matt
Yes -- sorry, those are both in inline files and are called in
the reverse order that I wrote them out.
Post by Gil Forsyth
I've attached the output of
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory .
-poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths
1 -poisson_ksp_view -poisson_ksp_monitor_true_residual
-poisson_ksp_converged_reason > kspview.log
Post by Gil Forsyth
And then setup the KSP with
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE);
CHKERRQ(ierr);
Post by Gil Forsyth
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
The matrix QTBNQ does not change, only the rhs of the system
is updated.
Post by Gil Forsyth
We run this with `-pc_type gamg -pc_gamg_type agg
-pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
Post by Gil Forsyth
Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
and the same code diverges after 1 timestep and returns a -8
KSP_DIVERGED_INDEFINITE_PC
Post by Gil Forsyth
This is weird, especially because if we change nsmooths to 2,
it runs for 264 timesteps and the returns the same error. But we have
explicitly set KSPSetReusePreconditioner so it should be using the same PC,
right?
Post by Gil Forsyth
Change nsmooths to 3 and it again diverges after 1 timestep.
Change nsmooths to 4 and it runs to completion.
It seems like either gamg's behavior has changed, or that
KSPSetNullSpace was doing something implicitly that we now need to do
explicitly in addition to MatSetNullSpace?
Post by Gil Forsyth
Thanks,
Gil Forsyth
--
What most experimenters take for granted before they begin
their experiments is infinitely more interesting than any results to which
their experiments lead.
Post by Gil Forsyth
-- Norbert Wiener
<kspview.log>
<kspview3.5.4><kspview3.6.1>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Barry Smith
2015-10-05 18:51:48 UTC
Permalink
Looks like the bug of using gmres on the coarse mesh is still there in the latest patch release.

If you switch to PETSc master (http://www.mcs.anl.gov/petsc/developers/index.html), it will not use gmres.

Barry
Post by Gil Forsyth
Hi Mark,
I've lost it, too. I was bisecting to find the change that started returning the indefinite PC error to our code that has previously worked -- but this was using KSPSetNullSpace.
Increasing the number of steps was in an effort to see if it impacted the error or not, partially based on this thread from PETSC-users (http://lists.mcs.anl.gov/pipermail/petsc-users/2014-November/023653.html) with one of the previous authors of our code.
Updating to PETSc master briefly eliminated the error in the poisson solver, although this was only the case in serial, it still failed with an indefinite PC error in parallel.
I'll confess that I'm not sure what to bisect between, as we don't have a "good" version after the switch from KSPSetNullSpace -> MatSetNullSpace. That's what prompted the initial bisection search in and around the 3.5.4 commit range. I'm going to take another crack at that today in a more automated fashion, since I expect I inserted some human error somewhere along the way.
Compiled against petsc v3.6.2, it shows again that the coarse grid solver is using GMRES even when using -mg_coarse_ksp_type preonly. Logs are attached.
$PETSC_ARCH/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100 -poisson_mg_coarse_ksp_type preonly -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 0 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info
I've lost this thread a bit, but you seem to be bisecting to find where a problem started, and you are noticing the gmres coarse grid solver. We fixed a bug where PETSc was resetting the coarse grid solver to GMRES when it should not. Older versions have this bug; the current version has the fix of not resetting the coarse grid solver type (I think this has been in place for all of v3.6, but it might have missed v3.6.1). GAMG sets the coarse grid solver type to preonly, but you can override it. Let me know if I'm missing something here.
I also see that you are setting -pc_gamg_agg_nsmooths 1,2,3,4. This is the number of smoothing steps of the prolongation operator, and you should not use more than 1. In fact, for CFD you should probably use no smoothing (0).
Mark
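
As an aside, the -pc_gamg_agg_nsmooths setting can also be pinned down in the source rather than on the command line. The lines below are only a sketch, assuming the Poisson KSP handle is named ksp2 as in the snippets earlier in the thread, and placed before KSPSetFromOptions so command-line options can still override them.

/* assumes #include <petscksp.h>, a PetscErrorCode ierr, and an existing KSP ksp2 */
PC pc;
ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);          /* equivalent to -poisson_pc_type gamg */
ierr = PCGAMGSetType(pc, PCGAMGAGG); CHKERRQ(ierr);   /* equivalent to -poisson_pc_gamg_type agg */
ierr = PCGAMGSetNSmooths(pc, 0); CHKERRQ(ierr);       /* unsmoothed aggregation, as suggested above for CFD */
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);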
Hi Mark,
I just noticed that in the previous commit 7743f89, it's also using GMRES in the multigrid solve but doesn't complain until the 2nd timestep, so my bisection criterion is off, since I was giving commits a PASS if they made it through one timestep without complaining about the indefinite PC. I think I'm still close to the problem commit, but it's probably a little bit before 25a145a. Apologies for the goose chase.
Thanks,
Gil Forsyth
I ran for one timestep against 3.5.4 with
#+BEGIN_SRC
petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100 -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info > kspview_3.5.4.log
#+END_SRC
and then against 25a145a with the same inputs. I notice that the poisson multigrid solve in 25a145a is using GMRES again while 3.5.4 is using preonly.
Logs from both runs are attached.
Can you please send a good log also, with the ksp_view.
Mark
Using PETSc master branch solved the problem in serial, but I'm still seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This runs to completion when I don't use GAMG. Log is attached for the following run.
$PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2 $PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
Thanks again,
Gil Forsyth
Ah, got it. I'll checkout the master branch and see if the behavior persists.
Many thanks,
Gil
PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that the behavior seems to have changed between versions. The only difference in our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to MatSetNullSpace.
Mark made some GAMG changes which were later reversed because they had unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect using the master branch from PETSc development"
Thanks,
Matt
Update your PETSc
Post by Gil Forsyth
Hi Barry,
We aren't explicitly setting GMRES anywhere in the code and I'm not sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
$PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.5.4
and it shows that the coarse grid solver is of type: preonly.
Running the newer version that uses MatSetNullSpace in its stead, and adding in -poisson_mg_coarse_ksp_type preonly,
$PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_mg_coarse_ksp_type preonly -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.6.1
still shows
KSP Object:(poisson_) 1 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using PRECONDITIONED norm type for convergence test
PC Object:(poisson_) 1 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
GAMG specific options
Threshold for dropping small values from graph 0
AGG specific options
Symmetric graph false
Coarse grid solver -- level -------------------------------
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
both logs are attached.
This can't work. You can't use a GMRES inside a CG. Try changing to -poisson_mg_coarse_ksp_type preonly
KSP Object:(poisson_) 1 MPI processes
type: cg
KSP Object: (poisson_mg_coarse_) 1 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=1, initial guess is zero
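
As an aside, the reason a GMRES sweep cannot sit inside CG is that CG needs its preconditioner to act as a fixed symmetric positive-definite operator, and a single truncated GMRES iteration is neither linear nor symmetric, which is one way to end up with an indefinite-PC failure. If the command-line override is not being honored, the same option can be seeded from the application code through the options database. This is only a sketch, assuming the two-argument PetscOptionsSetValue of the 3.6.x series, the "poisson_" prefix, and a KSP handle named ksp2:

/* mirrors -poisson_mg_coarse_ksp_type preonly; the coarse-level KSP created by
   GAMG reads this entry (via the poisson_mg_coarse_ prefix) during setup */
ierr = PetscOptionsSetValue("-poisson_mg_coarse_ksp_type", "preonly"); CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);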
<kspview_arch-3264318.log>
Gil Forsyth
2015-10-05 19:53:02 UTC
Permalink
Hi Barry and Mark,

Everything is now working as expected on PETSc master, both in serial and parallel. Many thanks for all of your help.

Gil Forsyth
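
For readers landing on this thread later, the change the whole discussion revolves around is attaching the constant null space to the operator instead of to the solver. The lines below are only an illustrative sketch using the QTBNQ, ksp2, and nsp names from the thread, not the literal PetIBM source:

/* assumes #include <petscksp.h>, a PetscErrorCode ierr, an assembled Mat QTBNQ,
   and a KSP ksp2; this is the PETSc 3.6-and-later style */
MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);   /* replaces the 3.5-era KSPSetNullSpace(ksp2, nsp) */
ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);     /* the matrix holds its own reference */
ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);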
Mark Adams
2015-10-05 20:16:41 UTC
Permalink
Good, sorry for the confusion, I thought this stuff was squared away in v3.6 and certainly in v3.6.2. (It's a little disconcerting that it failed. It would be interesting to see if master fails if you set gmres for the coarse grid solver, i.e., was gmres really the problem?)

Mark
Gil Forsyth
2015-10-05 21:50:50 UTC
Permalink
Hi Mark,

I've attached the log of

petibm-master/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100 -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 0 -poisson_ksp_view -poisson_mg_coarse_ksp_type gmres > gmrescoarse.log

and it does fail straight away when setting gmres as the coarse grid solver. If I give it nsmooths=1, the smoothing saves it for about 15 timesteps but then it still errors out with the indefinite PC error. If there's anything else you'd like me to try out, just let me know.

Thanks,
Gil Forsyth
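
A closing aside: the indefinite-PC failure reported throughout the thread corresponds to the KSPConvergedReason value KSP_DIVERGED_INDEFINITE_PC, and the check can be made explicit after each solve. A minimal sketch, with rhs and phi as hypothetical names for whatever right-hand-side and solution vectors the application uses:

/* assumes #include <petscksp.h>, a PetscErrorCode ierr, a configured KSP ksp2,
   and Vecs rhs and phi (hypothetical names for this example) */
KSPConvergedReason reason;
ierr = KSPSolve(ksp2, rhs, phi); CHKERRQ(ierr);
ierr = KSPGetConvergedReason(ksp2, &reason); CHKERRQ(ierr);
if (reason == KSP_DIVERGED_INDEFINITE_PC) {
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Poisson solve failed with an indefinite preconditioner\n"); CHKERRQ(ierr);
}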