Discussion:
[petsc-users] ILU preconditioner hangs with some zero elements on the diagonal
Gary Rebt
2015-10-27 14:09:49 UTC
Dear petsc-users,
 
While using the FEniCS package to solve a simple Stokes flow problem, I have run into problems with PETSc preconditioners. In particular, I would like to use ILU (the serial version) along with GMRES to solve my linear system, but the solver just hangs indefinitely in MatLUFactorNumeric_SeqAIJ_Inode without producing any output. CPU usage sits at 100%, yet even for a tiny system (59x59 for the minimal test case) the solver has not pushed through after 30 minutes.
 
PETSc version is 3.6 and the matrix for the minimal test case is as follows:
http://pastebin.com/t3fvdkaS
 
It contains zero diagonal entries and has a condition number of around 1e3, but it is definitely non-singular. Direct solvers manage to solve the system, as does GMRES without a preconditioner (although after many iterations for a 59x59 system).
 
Playing with the available options at http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html did not seem to solve the issue (even after activating diagonal_fill and/or nonzeros_along_diagonal), although sometimes error 71 is returned, which stands for zero pivot detected. Are there other options that I have not considered? The default ILU factorization in MATLAB returns satisfactory results without errors, so surely it must be possible with PETSc?
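(For reference, a sketch of the runtime options these presumably correspond to, on top of -ksp_type gmres -pc_type ilu:

  -pc_factor_diagonal_fill
  -pc_factor_nonzeros_along_diagonal

Adding -ksp_converged_reason and -ksp_monitor_true_residual shows what the solver is actually doing rather than just whether it returns.)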
 
As for the choice of ILU, I agree it might be suboptimal in this setting but I do need it for benchmarking purposes.
 
Best regards,
 
Gary

Matthew Knepley
2015-10-27 14:10:04 UTC
On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <***@gmx.ch> wrote:

> Dear petsc-users,
>
> While using the FEniCS package to solve a simple Stokes' flow problem, I
> have run into problems with PETSc preconditioners. In particular, I would
> like to use ILU (no parallel version) along with GMRES to solve my linear
> system but the solver just hangs indefinitely
> at MatLUFactorNumeric_SeqAIJ_Inode without outputting anything. CPU usage
> is at 100% but even for a tiny system (59x59 for minimal test case), the
> solver does not seem to manage to push through it after 30 mins.
>
> PETSc version is 3.6 and the matrix for the minimal test case is as
> follows :
> http://pastebin.com/t3fvdkaS
>

Hanging is a bug. We will check it out.


> It contains zero diagonal entries, has a condition number of around 1e3
> but is definitely non-singular. Direct solvers manage to solve the system
> as well as GMRES without preconditioner (although after many iterations for
> a 59x59 system).
>

This will never work. Direct solvers work because they pivot away the
zeros, but ILU is defined by having no pivoting.
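(A tiny illustration: for the 2x2 matrix with rows [0 1] and [1 0], LU with partial pivoting simply swaps the rows and succeeds, while ILU, which keeps the rows in place, hits the zero pivot in the (1,1) position immediately.)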

Thanks,

Matt





--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Matthew Knepley
2015-10-27 14:13:10 UTC
On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <***@gmail.com> wrote:

> Hanging is a bug. We will check it out.
>

I do not have any way to read in this ASCII. Can you output a binary version with

-mat_view binary:mat.bin
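For reference, a minimal C sketch of the same thing done from code, assuming A is the assembled Mat (error checking omitted):

  PetscViewer viewer;
  /* write A in PETSc binary format so it can be reloaded later with MatLoad() */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat.bin", FILE_MODE_WRITE, &viewer);
  MatView(A, viewer);
  PetscViewerDestroy(&viewer);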

Thanks,

Matt





--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Gary Rebt
2015-10-27 14:46:17 UTC
Thanks. Here's the binary version.

Best,

Gary
 

Hong
2015-10-27 16:13:39 UTC
Gary :
I tested your mat.bin using
petsc/src/ksp/ksp/examples/tutorials/ex10.c
./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view
...
Mat Object: 1 MPI processes
  type: seqaij
  rows=588, cols=588
  total: nonzeros=11274, allocated nonzeros=11274
  total number of mallocs used during MatSetValues calls =0
    using I-node routines: found 291 nodes, limit used is 5
Number of iterations = 0
Residual norm 24.2487

It does not converge, nor does it hang.
As you said, the matrix is non-singular; LU gives a solution:
./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -pc_type lu
  0 KSP preconditioned resid norm 3.298891225772e+03 true resid norm 2.424871130596e+01 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 1.918157196467e-12 true resid norm 5.039404549028e-13 ||r(i)||/||b|| 2.078215409241e-14
Number of iterations = 1
Residual norm < 1.e-12

Is this the same matrix as you mentioned?

Hong
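(For completeness, ex10 reads the file back with MatLoad(); a minimal sketch of that loading step, assuming mat.bin was written as above and with error checking omitted:

  Mat A;
  PetscViewer viewer;
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat.bin", FILE_MODE_READ, &viewer);
  MatCreate(PETSC_COMM_WORLD, &A);
  /* fills A with the matrix stored by -mat_view binary:mat.bin */
  MatLoad(A, viewer);
  PetscViewerDestroy(&viewer);
)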


Matthew Knepley
2015-10-27 16:44:28 UTC
On Tue, Oct 27, 2015 at 11:13 AM, Hong <***@mcs.anl.gov> wrote:

> Gary :
> I tested your mat.bin using
> petsc/src/ksp/ksp/examples/tutorials/ex10.c
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view
> [...]
> It does not converge, nor does it hang.
> As you said, the matrix is non-singular; LU gives a solution.
>
> Is this the same matrix as you mentioned?
>

Hong, could you run ILU on it as well?

Thanks,

Matt




--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Hong
2015-10-27 17:36:33 UTC
Matt:

> On Tue, Oct 27, 2015 at 11:13 AM, Hong <***@mcs.anl.gov> wrote:
>
>> It does not converge, nor does it hang.
>
This is the default GMRES/ILU.
Hong


Matthew Knepley
2015-10-27 17:38:04 UTC
On Tue, Oct 27, 2015 at 12:36 PM, Hong <***@mcs.anl.gov> wrote:

> This is the default GMRES/ILU.
>

Thanks,

Matt




--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
Hong
2015-10-27 17:40:28 UTC
Here is the reason why it does not converge:
./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view -ksp_converged_reason
Linear solve did not converge due to DIVERGED_NANORINF iterations 0

KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: ilu
    ILU: out-of-place factorization
...

Hong

Barry Smith
2015-10-27 18:50:26 UTC
> On Oct 27, 2015, at 12:40 PM, Hong <***@mcs.anl.gov> wrote:
>
> Here is the reason why it does not converge:
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view -ksp_converged_reason
> Linear solve did not converge due to DIVERGED_NANORINF iterations 0

This means it found a zero pivot either in the factorization or in the first attempt to do a triangular solve. You can try

-pc_factor_nonzeros_along_diagonal

and/or

-pc_factor_shift_type nonzero

to generate a usable LU factorization but these are ad hoc fixes.
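For reference, -pc_factor_nonzeros_along_diagonal corresponds to PCFactorReorderForNonzeroDiagonal() and -pc_factor_shift_type to PCFactorSetShiftType(). A minimal C sketch of setting them from code, assuming ksp is the KSP in use (error checking omitted):

  PC pc;
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCILU);
  /* reorder rows/columns to move nonzeros onto the diagonal (tolerance 1e-10) */
  PCFactorReorderForNonzeroDiagonal(pc, 1.e-10);
  /* and/or shift zero (or small) pivots encountered during the factorization */
  PCFactorSetShiftType(pc, MAT_SHIFT_NONZERO);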



Barry

Hong
2015-10-27 19:08:19 UTC
I did a quick test:
1) ilu with any shift type does not converge
2) gmres with bjacobi (4 blocks)/ilu sub-solves + shift_type nonzero does not converge
3) bjacobi with lu sub-solves + shift_type nonzero converges:
mpiexec -n 4 ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor -sub_pc_type lu -sub_pc_factor_shift_type nonzero
...
0 KSP Residual norm 3.284826093129e+03
1 KSP Residual norm 2.802972716423e+03
2 KSP Residual norm 2.039112137210e+03
...
24 KSP Residual norm 2.666350543810e-02
Number of iterations = 24
Residual norm 0.0179698
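(A sketch of an equivalent single-process run, assuming one wants to mimic the same 4-block decomposition on one rank; -pc_bjacobi_blocks sets the number of blocks explicitly:

./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor -pc_type bjacobi -pc_bjacobi_blocks 4 -sub_pc_type lu -sub_pc_factor_shift_type nonzero
)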

Hong
