Discussion:
[petsc-users] DM question.
Dave May
2015-10-21 20:07:22 UTC
Hey Mike,



On 21 October 2015 at 18:01, Afanasiev Michael wrote:
Hey Dave,
So I’ve got a couple of days where there’s nothing pressing to work on… I
was thinking of ripping out the parallel routines (ugly) in my wave
propagation code and replacing them with Petsc DM routines. I can read in
my exodusii mesh with DMPLEX, and partition it with chaco, which gives me a
nicely partitioned DM. This takes me like 5 lines of code. That’s amazing.
But here I’m stuck, and am having a whale of a time with the
documentation. All I *think* I need is a way to modify the exodus-created
DM, and add to it the degrees of freedom that are introduced by my
quadrature rule. This would be really neat. I can just treat each
sub-domain as its own mesh, with its own global numbering. Then whenever
necessary I can scatter stuff to the *real* global degrees of freedom
with something like VecLocToGlob. Most of the things I like about the code
could stay the same (element-wise, matrix-free nature), just these parallel
broadcasts would be infinitely nicer.
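
A hedged sketch of the “5 lines” described above (PETSc 3.5/3.6-era calls;
"mesh.exo" is a placeholder filename, and depending on the version the
partitioner, e.g. chaco, is either an argument to DMPlexDistribute or
chosen via options):

DM dm, dmDist = NULL;
DMPlexCreateExodusFromFile(PETSC_COMM_WORLD, "mesh.exo", PETSC_TRUE, &dm);
DMPlexDistribute(dm, 0, NULL, &dmDist);       /* overlap 0, no migration SF */
if (dmDist) { DMDestroy(&dm); dm = dmDist; }  /* keep this rank's partition */
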
First off - I don't use DMPLEX.
But I just can’t figure out how to set this up. The main problem really
boils down to: what’s the best way to add my quadrature points to an
already-created DM, which was constructed with an exodus file? I guess I
could do this after the file is read, but before the partitioning. In this
case though, what’s stopping the partitioner from cutting an element in
half? It seems like it would be a lot cleaner to do this post-partitioning.
Presumably what is read from exodus is just the vertices of the hexes, and
what you want to do is define the function space (given by your GLL
locations) on top of the element geometry read in. Is that what you are asking
about?
Any hints here?
Actually I have no experience with this object.
I would just send an email to
petsc-***@mcs.anl.gov
asking for help.

The developer of DMPLEX (Matt Knepley) will definitely answer within 1
day.

Cheers,
Dave
Best,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich
Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
Matthew Knepley
2015-10-22 00:32:35 UTC
Post by Dave May
First off - I don't use DMPLEX.
Dave is refreshingly candid about his shortcomings ;)
Post by Dave May
Presumably what is read from exodus is just the vertices of the hexes, and
what you want to do is define the function space (given by your GLL
locations) on top of the element geometry read in. Is that what you are
asking about?
So Dave is right. We read in topology and geometry from ExodusII. Then you
define a function space on top. How exactly are you discretizing? In order
to create vectors, do local-to-global, etc., PETSc really only needs to
know the amount of data associated with each mesh piece. You can define
this with PetscSection. If you give me an idea of what you want, I can
help you write the code easily, I think.

Thanks,

Matt
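
For anyone coming to PetscSection cold, a minimal sketch of the idea Matt
describes, sizes per mesh point and nothing else, could look like the
following (illustrative only, not code from the thread; dm is the DMPlex
read from exodus, and the single dof per point is a placeholder):

PetscSection s;
PetscInt     pStart, pEnd, p;
PetscSectionCreate(PETSC_COMM_WORLD, &s);
DMPlexGetChart(dm, &pStart, &pEnd);    /* all mesh points: cells, edges, vertices */
PetscSectionSetChart(s, pStart, pEnd);
for (p = pStart; p < pEnd; ++p) PetscSectionSetDof(s, p, 1); /* placeholder: 1 dof/point */
PetscSectionSetUp(s);
DMSetDefaultSection(dm, s);            /* DM can now build local/global vectors */
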
Afanasiev Michael
2015-10-23 08:01:03 UTC
Hi Matthew,

So I’m discretizing via a tensor product of Lagrange polynomials co-located at the Gauss-Lobatto-Legendre (GLL) points. The polynomial order might range from 4 to 10 or something like that. The current problem is solved on 2D hexes.

I had found DMPlexCreateSection, and followed plex/ex1 to get things set up. You can see my attempt below. Basically I’ve got 4 fields (displacement, velocity, acceleration, and strain) over each element. Here I’ve tried to set up a tensor product of 4th-order (5 GLL points) Lagrange polynomials (line 11). This seemed to somewhat achieve what I wanted: I created a global vector and wrote it to a VTK file with VecView, and the numbering seemed to make sense. You can also see my attempt at defining a boundary condition (it looks like DMPlexCreateFromExodus labels side sets as “Face Sets”; that seems to have worked).

Does this seem to be headed in the right direction?

Cheers,
Mike.

DM
mesh::createSection(const DM &dm)
{

01 // displacement, velocity, acceleration, strain
02 IS bcPointIs[1];
03 PetscInt numBc = 1;
04 PetscInt bcField[1];
05 PetscInt numFields = 4;
06 PetscInt dim; DMGetDimension(dm, &dim);
07 PetscInt numComp[numFields];
08 for (auto i=0; i<numFields; i++) {numComp[i] = dim;}
09 PetscInt numDof[numFields*(dim+1)];
10 for (auto i=0; i<numFields*(dim+1); i++) {numDof[i] = 0;}
11 for (auto i=0; i<numFields; i++) {numDof[i*(dim+1)+dim] = 5;}
12 bcField[0] = 0;
13 PetscSection section;
14 DMPlexGetStratumIS(dm, "Face Sets", 1, &bcPointIs[0]);
15 DMPlexCreateSection(dm, dim, numFields, numComp, numDof, numBc, bcField,
16 NULL, bcPointIs, NULL, &section);
17 ISDestroy(&bcPointIs[0]);
18 PetscSectionSetFieldName(section, 0, "u");
19 PetscSectionSetFieldName(section, 1, "v");
20 PetscSectionSetFieldName(section, 2, "a");
21 PetscSectionSetFieldName(section, 3, "e");
22 DMSetDefaultSection(dm, section);
23 return dm;
}
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch

Matthew Knepley
2015-10-23 11:14:18 UTC
On Fri, Oct 23, 2015 at 3:01 AM, Afanasiev Michael wrote:
Post by Afanasiev Michael
So I’m discretizing via a tensor product of Lagrange polynomials
co-located at the Gauss-Lobatto-Legendre (GLL) points.
Does this seem to be headed in the right direction?
Yes, however I have some questions. Starting out, I think the GLL points
include the endpoints, so that means for polynomial degree k that numDof[]
should really have 4*1 dofs on each vertex, 4*(k-1) dofs on each edge, and
4*(k-1)^2 dofs on each quad. It looks like in your listing you have
numDof[] for a 1D mesh with a DG element.
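
For concreteness, a sketch of the numDof[] layout Matt describes, using
numDof, numComp, dim, and numFields as in the listing above and assuming
2D quads (dim = 2) with the entry index being field*(dim+1) + depth
(depth 0 = vertices, 1 = edges, 2 = cells):

PetscInt k = 4;  /* polynomial degree; 4 is just an example */
for (PetscInt f = 0; f < numFields; f++) {
  numDof[f*(dim+1) + 0] = numComp[f];                /* 1 GLL node per vertex       */
  numDof[f*(dim+1) + 1] = numComp[f] * (k-1);        /* k-1 interior edge nodes     */
  numDof[f*(dim+1) + 2] = numComp[f] * (k-1)*(k-1);  /* (k-1)^2 interior cell nodes */
}
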

The "Face Sets" is the right label to use for boundary conditions. This
will eliminate those variables
from the global system, but they will be present in the local spaces.

With elements like these, it is common (I think) to eliminate the cell
unknown explicitly, since the
system is dense and not connected to other cells. For this, you would
retain the vertex and edge
unknowns, but not the cell unknowns. I have not tried to do this myself, so
I do not know if there
are any pitfalls.

You can see an example of a similar implementation specifically for the
kind of spectral elements
you are considering here: https://github.com/jedbrown/dohp. It would
probably be useful to understand
what is done there as you implement.

Thanks,

Matt
Afanasiev Michael
2015-10-26 11:45:01 UTC
Hi Matthew,

Post by Matthew Knepley
No. There are (k-1)^2 unknowns in the interior of the cell, so that we have
4 + 4 * (k-1) + (k-1)^2 = (k+1)^2
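(Expanding, as a check: 4 + 4(k-1) + (k-1)^2 = 4 + 4k - 4 + k^2 - 2k + 1 = k^2 + 2k + 1 = (k+1)^2, i.e. the vertex, edge-interior, and cell-interior unknowns together make up the full (k+1)^2 tensor grid of GLL nodes on a quad.)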

Right, makes sense. With some insight from Dave, I’m getting what I expect for a simple 4-cell mesh distributed amongst 4 processors (length of 91 for the global vector, length of 25 for the local vectors, for 4th order polynomial and the GLL basis). Now I imagine what’s left is to figure out how these vectors are globally numbered.

I usually do this by having a consistent way of numbering the dofs on the reference element (i.e., given a (u, v) reference space, loop through v, then u), and then getting the local->global map by finding coincident points in a kd-tree. My question is: in which order are the local vectors now defined? If I knew this, I could create my local->global map as before. From the way the section was created, I guess it might be something like: loop vertices, edges, and then interior?
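
For concreteness, a purely illustrative sketch of the coincident-point
matching just described, with a brute-force search standing in for the
kd-tree (names hypothetical):

#include <cmath>
#include <vector>
struct Pt { double x, y; };
// loc: element-local GLL point coordinates; glob: global dof coordinates.
// Returns map[i] = index of the global point coincident with loc[i].
std::vector<int> buildLocToGlob(const std::vector<Pt> &loc,
                                const std::vector<Pt> &glob,
                                double tol = 1e-10) {
  std::vector<int> map(loc.size(), -1);
  for (std::size_t i = 0; i < loc.size(); ++i)
    for (std::size_t j = 0; j < glob.size(); ++j)
      if (std::hypot(loc[i].x - glob[j].x, loc[i].y - glob[j].y) < tol) {
        map[i] = static_cast<int>(j);
        break;
      }
  return map;
}
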

Thanks for all your help,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch

On Oct 23, 2015, at 5:11 PM, Matthew Knepley <***@gmail.com> wrote:

On Fri, Oct 23, 2015 at 9:03 AM, Afanasiev Michael <***@erdw.ethz.ch> wrote:
Hi Matthew,

Yes, however I have some questions. Starting out, I think the GLL points include the endpoints, so
that means for polynomial degree k that numDof[] should really have 4*1 dofs on each vertex,
4*(k-1) dofs on each edge, and 4*(k-1)^2 on each quad. It looks like below you have numDof[] for
a 1D mesh with DG element.

For the GLL basis we have (k+1) points in each dimension, including the endpoints. So for example a 2D element with 4th-order polynomials will have 25 locally numbered points per element. I should also mention that the velocity, displacement, and acceleration fields are vectors, with d=dim components at each integration point, whereas strain has (2^dim)-1 components. In what you’ve written above, does this then become (sum over the 4 fields):

sum(compPerField * dofPerField)           -> vertex
sum(compPerField * dofPerField * (k+1))   -> edge
sum(compPerField * dofPerField * (k+1)^2) -> quad

No. There are (k-1)^2 unknowns in the interior of the cell, so that we have

4 + 4 * (k-1) + (k-1)^2 = (k+1)^2

With elements like these, it is common (I think) to eliminate the cell unknown explicitly, since the
system is dense and not connected to other cells. For this, you would retain the vertex and edge
unknowns, but not the cell unknowns. I have not tried to do this myself, so I do not know if there
are any pitfalls.

I believe I don’t worry about this. Everything is solved in a matrix-free sense, on each element. The relevant physical quantities are calculated on each element, and then summed into the global degrees of freedom. The summed global dof are then brought back to the element level, where updates to the acceleration, velocity, and displacement are calculated via a Newmark time step. So no global system of equations is ever formed.

You accumulate into the global dofs, so you would need more storage if you do not do static condensation. It is just good to know this,
but you do not have to do it.

This being said, all the routines to a) integrate the element matrices, b) calculate the GLL point locations, c) construct the local->global dof mapping, and d) do the element-wise matrix-free time stepping are already written. My problem is really that I do my mesh decomposition by hand (poorly), and I’d like to transfer this over to PETSc. Of course if I do this, I might as well use DM's LocalToGlobal vector routines to sum my field vectors across mesh partitions.

Yes.

Perhaps a better question to ask would be in the form of a workflow:

1) Read in exodus mesh and partition (done)
2) Set up local and global degrees of freedom on each mesh partition (done)

Here you just have to set up PetscSections that mirror your local and global layout. Then the LocalToGlobal and its inverse will work.

Matt

3) Integrate element matrices (done)

for i = 1, nTimeSteps
3) Step fields forward on each partition (done)
4) Sum to global degrees of freedom on each partition (done)
5) Sum to global degrees of freedom across partitions (????)
6) Retrieve summed global degrees of freedom across partitions (????)
7) Continue

So really what I hope Petsc will help with is just steps 5 and 6. I guess this involves, given a partitioned DMPlex object, which has been partitioned according to the geometry and topology defined in an exodus file, adding the degrees of freedom to each partition-local vector (via a DMPlexSection?). Then, ensuring that the dof added along the partition boundaries sum together properly when a LocalToGlobal vector operation is defined.

Does this make sense? If so, is this something that DMPlex is designed for?
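
In PETSc terms, steps 5 and 6 then reduce to something like the following
sketch (an assumption-laden illustration, not code from the thread; dm is
assumed to carry the section set up earlier):

Vec loc, glob;
DMCreateLocalVector(dm, &loc);    /* partition-local dofs, ghosts included */
DMCreateGlobalVector(dm, &glob);  /* globally unique dofs */
/* step 5: sum partition contributions into the global dofs */
DMLocalToGlobalBegin(dm, loc, ADD_VALUES, glob);
DMLocalToGlobalEnd(dm, loc, ADD_VALUES, glob);
/* step 6: retrieve the summed values back on each partition */
DMGlobalToLocalBegin(dm, glob, INSERT_VALUES, loc);
DMGlobalToLocalEnd(dm, glob, INSERT_VALUES, loc);
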
Matthew Knepley
2015-10-26 11:52:00 UTC
On Mon, Oct 26, 2015 at 6:45 AM, Afanasiev Michael wrote:
Post by Afanasiev Michael
My question is: in which order are the local vectors now defined? If I
knew this, I could create my local->global map as before. From the way the
section was created, I guess it might be something like: loop vertices,
edges, and then interior?
The numbering is cell unknowns, vertex unknowns, face unknowns, and then
edge unknowns. This is, of course, arbitrary and therefore
you can change the order using


http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/PetscSectionSetPermutation.html#PetscSectionSetPermutation

which renumbers all the mesh points in the PetscSection defining the space.
You should be able to reproduce your old ordering using
this.
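
A hedged sketch of using that routine (the contents of perm[] are
hypothetical, and presumably the permutation needs to be in place before
the section is set up; see the linked man page for authoritative usage):

PetscInt  pStart, pEnd, n;
PetscInt *perm;
IS        permIS;
PetscSectionGetChart(section, &pStart, &pEnd);
n = pEnd - pStart;
PetscMalloc1(n, &perm);
/* ... fill perm[] with the desired new order of the mesh points ... */
ISCreateGeneral(PETSC_COMM_SELF, n, perm, PETSC_OWN_POINTER, &permIS);
PetscSectionSetPermutation(section, permIS);
ISDestroy(&permIS);
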

Thanks,

Matt
Afanasiev Michael
2015-10-26 17:15:13 UTC
Hi Matt,

Following a chat with Dave, I’ve constructed a minimal working example of the problem here: https://github.com/michael-afanasiev/salvus2. The relevant stuff is in example/main.c; there’s a little note in there detailing what we’d like to accomplish. It should compile fine after a quick modification of the CMakeLists to tell the makefile where your PETSc is. This is all detailed in the Readme, and there’s a small test exodus file in there as well. A hint would be great.

Best,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch

Afanasiev Michael
2015-10-26 11:44:59 UTC
Permalink
Hi Matthew,

No. There are (k-1)^2 unknowns in the interior of the cell, so that we have

4 + 4 * (k-1) + (k-1)^2 = (k+1)^2

Right, makes sense. With some insight from Dave, I’m getting what I expect for a simple 4-cell mesh distributed amongst 4 processors (length of 91 for the global vector, length of 25 for the local vectors, for 4th order polynomial and the GLL basis). Now I imagine what’s left is to figure out how these vectors are globally numbered.

I usually do this by having a consistent way of numbering the dofs on the reference element (i.e. given a (u, v)-space loop through v, then u), and then getting the local->global map via finding coincident points in a kd-tree. My question is: in which order are the local vectors now defined? If I knew this, I could create my local->global map as before. From the way the section was created, I guess it might be something like loop vertices, edges, and then interior?

Thanks for all your help,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut fÃŒr Geophysik
ETH ZÃŒrich

Sonneggstrasse 5, NO H 39.2
CH 8092 ZÃŒrich
***@erdw.ethz.ch<mailto:***@erdw.ethz.ch>

On Oct 23, 2015, at 5:11 PM, Matthew Knepley <***@gmail.com<mailto:***@gmail.com>> wrote:

On Fri, Oct 23, 2015 at 9:03 AM, Afanasiev Michael <***@erdw.ethz.ch<mailto:***@erdw.ethz.ch>> wrote:
Hi Matthew,

Yes, however I have some questions. Starting out, I think the GLL points include the endpoints, so
that means for polynomial degree k that numDof[] should really have 4*1 dofs on each vertex,
4*(k-1) dofs on each edge, and 4*(k-1)^2 on each quad. It looks like below you have numDof[] for
a 1D mesh with DG element.

For the GLL basis we have (k+1) points in each dimension, including the endpoints. So for example a 2D element with 4-order polynomials will have 25 locally numbered points per element. I should also mention that the velocity, displacement, and acceleration fields are vectors, with d=dim components at each integration point, whereas strain has (2^dim)-1 components. In what you’ve written above, does this then become (sum over the 4 fields):

(sum(compPerField*dofPerField)) -> vertex
(sum((compPerField*dofPerField)*(k+1)) -> edge
(sum((compPerField*dofPerField)*(k+1)^2) -> quad

No. There are (k-1)^2 unknowns in the interior of the cell, so that we have

4 + 4 * (k-1) + (k-1)^2 = (k+1)^2

With elements like these, it is common (I think) to eliminate the cell unknown explicitly, since the
system is dense and not connected to other cells. For this, you would retain the vertex and edge
unknowns, but not the cell unknowns. I have not tried to do this myself, so I do not know if there
are any pitfalls.

I believe I don’t worry about this. Everything is solved in a matrix-free sense, on each element. The relevant physical quantities are calculated on each element, and then summed into the global degrees of freedom. The summed global dof are then brought back to the element level, where updates to the acceleration, velocity, and displacement are calculated via a Newmark time step. So no global system of equations is ever formed.

You accumulate into the global dofs, so you would need more storage if you do not do static condensation. It is just good to know this,
but you do not have to do it.

This being said, all the routines to a) integrate the element matrices, b) calculate the GLL point locations, c) construct the local->global dof mapping, and d) do the element-wise matrix-free time stepping are already written. My problem is really that I do my mesh decomposition by hand (poorly), and I’d like to transfer this over to Petsc. Of course if I do this, I might as well use DM's LocalToGlobal vector routines to sum my field vectors across mesh partitions.

Yes.

Perhaps a better question to ask would be in the form of a workflow:

1) Read in exodus mesh and partition (done)
2) Set up local and global degrees of freedom on each mesh partition (done)

Here you just have to set up PetscSections that mirror your local and global layouts. Then the LocalToGlobal and its inverse will work.
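For the two (????) steps in the workflow below, the pattern is roughly the following sketch (vector names are illustrative; it assumes the section is already attached to dm):

Vec locF, globF;
DMCreateLocalVector(dm, &locF);
DMCreateGlobalVector(dm, &globF);
/* ...assemble element-wise contributions into locF... */
DMLocalToGlobalBegin(dm, locF, ADD_VALUES, globF); /* sum duplicated partition-boundary dofs */
DMLocalToGlobalEnd(dm, locF, ADD_VALUES, globF);
DMGlobalToLocalBegin(dm, globF, INSERT_VALUES, locF); /* bring the summed values back */
DMGlobalToLocalEnd(dm, globF, INSERT_VALUES, locF);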

Matt

3) Integrate element matrices (done)

for i = 1, nTimeSteps
4) Step fields forward on each partition (done)
5) Sum to global degrees of freedom on each partition (done)
6) Sum to global degrees of freedom across partitions (????)
7) Retrieve summed global degrees of freedom across partitions (????)
8) Continue

So really what I hope Petsc will help with is just steps 6 and 7. I guess this involves, given a DMPlex object that has been partitioned according to the geometry and topology defined in an exodus file, adding the degrees of freedom to each partition-local vector (via a PetscSection built with DMPlexCreateSection?). Then I need to ensure that the dofs added along the partition boundaries sum together properly when a LocalToGlobal vector operation is applied.

Does this make sense? If so, is this something that DMPlex is designed for?
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch

On Oct 23, 2015, at 1:14 PM, Matthew Knepley <***@gmail.com> wrote:

On Fri, Oct 23, 2015 at 3:01 AM, Afanasiev Michael <***@erdw.ethz.ch> wrote:
Hi Matthew,

So I’m discretizing via a tensor product of Lagrange polynomials co-located at the Gauss-Lobatto-Legendre (GLL) points. The polynomial order might range from 4 to 10 or something like that. The current problem is solved on 2D hexes.

I had found DMPlexCreateSection, and followed plex/ex1 to get things set up. You can see my attempt below. Basically I’ve got 4 fields (displacement, velocity, acceleration, and strain) over each element. Here I’ve tried to set up a tensor product of 4th-order (5 GLL points) Lagrange polynomials (line 11). This seemed to achieve roughly what I wanted: I created a global vector and wrote it to a vtk file with VecView, and the numbering seemed to make sense. You can also see my attempt at defining a boundary condition (it looks like DMPlexCreateFromExodus labels side sets as “Face Sets”; this seems to have worked).

Does this seem to be headed in the right direction?

Yes, however I have some questions. Starting out, I think the GLL points include the endpoints, so
that means for polynomial degree k that numDof[] should really have 4*1 dofs on each vertex,
4*(k-1) dofs on each edge, and 4*(k-1)^2 on each quad. It looks like below you have numDof[] for
a 1D mesh with a DG element.

The "Face Sets" is the right label to use for boundary conditions. This will eliminate those variables
from the global system, but they will be present in the local spaces.

With elements like these, it is common (I think) to eliminate the cell unknown explicitly, since the
system is dense and not connected to other cells. For this, you would retain the vertex and edge
unknowns, but not the cell unknowns. I have not tried to do this myself, so I do not know if there
are any pitfalls.

You can see an example of a similar implementation specifically for the kind of spectral elements
you are considering here: https://github.com/jedbrown/dohp. It would probably be useful to understand
what is done there as you implement.

Thanks,

Matt

Cheers,
Mike.

DM
mesh::createSection(const DM &dm)
{

01 // displacement, velocity, acceleration, strain
02 IS bcPointIs[1];
03 PetscInt numBc = 1;
04 PetscInt bcField[1];
05 PetscInt numFields = 4;
06 PetscInt dim; DMGetDimension(dm, &dim);
07 PetscInt numComp[numFields];
08 for (auto i=0; i<numFields; i++) {numComp[i] = dim;}
09 PetscInt numDof[numFields*(dim+1)]; // dofs per (field, depth) pair
10 for (auto i=0; i<numFields*(dim+1); i++) {numDof[i] = 0;}
11 for (auto i=0; i<numFields; i++) {numDof[i*(dim+1)+dim] = 5;} // 5 dofs per cell only; vertices and edges get none
12 bcField[0] = 0; // boundary condition applies to field 0 (displacement)
13 PetscSection section;
14 DMPlexGetStratumIS(dm, "Face Sets", 1, &bcPointIs[0]); // points in side set 1
15 DMPlexCreateSection(dm, dim, numFields, numComp, numDof, numBc, bcField,
16 NULL, bcPointIs, NULL, &section);
17 ISDestroy(&bcPointIs[0]);
18 PetscSectionSetFieldName(section, 0, "u");
19 PetscSectionSetFieldName(section, 1, "v");
20 PetscSectionSetFieldName(section, 2, "a");
21 PetscSectionSetFieldName(section, 3, "e");
22 DMSetDefaultSection(dm, section);
23 return dm;
}
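For what it’s worth, a quick way to sanity-check a section like this is to view it and size a global vector (a sketch, illustrative only):

PetscSection s;
Vec          u;
PetscInt     n;
DMGetDefaultSection(dm, &s);
PetscSectionView(s, PETSC_VIEWER_STDOUT_WORLD); /* dofs and offsets per mesh point */
DMCreateGlobalVector(dm, &u);
VecGetSize(u, &n);
PetscPrintf(PETSC_COMM_WORLD, "global dofs: %D\n", n);
VecDestroy(&u);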
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch

On Oct 22, 2015, at 2:32 AM, Matthew Knepley <***@gmail.com> wrote:

On Wed, Oct 21, 2015 at 3:07 PM, Dave May <***@erdw.ethz.ch> wrote:
Hey Mike,



On 21 October 2015 at 18:01, Afanasiev Michael <***@erdw.ethz.ch> wrote:
Hey Dave,

So I’ve got a couple of days where there’s nothing pressing to work on… was thinking of ripping out the parallel routines (ugly) in my wave propagation code and replacing them with Petsc DM routines. I can read in my exodusii mesh with DMPLEX, and partition it with chaco, which gives me a nicely partitioned DM. This takes me like 5 lines of code. That’s amazing.

But here I’m stuck, and am having a whale of a time with the documentation. All I think I need is a way to modify the exodus-created DM, and add to it the degrees of freedom that are introduced by my quadrature rule. This would be really neat. I can just treat each sub-domain as its own mesh, with its own global numbering. Then whenever necessary I can scatter stuff to the real global degrees of freedom with something like VecLocToGlob. Most of the things I like about the code could stay the same (element-wise, matrix-free nature), just these parallel broadcasts would be infinitely nicer.


First off - I don't use DMPLEX.

Dave is refreshingly candid about his shortcomings ;)


But I just can’t figure out how to set this up. The main problem really boils down to: what’s the best way to add my quadrature points to an already-created DM, which was constructed with an exodus file? I guess I could do this after the file is read, but before the partitioning. In this case though, what’s stopping the partitioner from cutting an element in half? It seems like it would be a lot cleaner to do this post-partitioning.


Presumably what is read from exodus is just the vertices of the hexes, and what you want to do is define the function space (given by your GLL locations) on top of the element geometry read in. Is that what you are asking about?

So Dave is right. We read in topology and geometry from ExodusII. Then you define a function space on top. How
exactly are you discretizing? In order to create vectors, do local-to-global, etc., Petsc really only needs to know the
amount of data associated with each mesh piece. You can define this with PetscSection. If you give me an idea of
what you want, I can help you write the code easily, I think.
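In its most basic form that looks something like the following sketch, where dofPerPoint() is a hypothetical helper returning however many dofs your discretization puts on mesh point p:

PetscSection s;
PetscInt     pStart, pEnd, p;
PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s);
DMPlexGetChart(dm, &pStart, &pEnd); /* all mesh points: cells, edges, vertices */
PetscSectionSetChart(s, pStart, pEnd);
for (p = pStart; p < pEnd; ++p) {
  PetscSectionSetDof(s, p, dofPerPoint(p)); /* dofPerPoint: hypothetical */
}
PetscSectionSetUp(s);
DMSetDefaultSection(dm, s);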

Thanks,

Matt

Any hints here?

Actually I have no experience with this object.
I would just send an email to
petsc-***@mcs.anl.gov
asking for help.

The developer of DMPLEX (Matt Knepley) will definitely answer within 1 day.

Cheers,
Dave


Best,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
***@erdw.ethz.ch
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener