.. _ch_dmbase:

DM Basics
----------

The previous chapters have focused on the core numerical solvers in PETSc. However, numerical solvers see little use unless they can be connected
efficiently (in both human and machine time) to the mathematical models and discretizations, including the grids (or meshes),
on which people wish to build their simulations. PETSc therefore provides a set of abstractions, represented by the ``DM`` object, that form a powerful, comprehensive
mechanism for translating the problem specification of a model and its discretization into the language and API of the solvers.
``DM`` is an orphan initialism (or orphan acronym): the letters no longer stand for anything, and never did.

Some of the model
classes ``DM`` currently supports are PDEs on structured and staggered grids with finite difference methods (``DMDA`` and ``DMSTAG`` -- :any:`ch_stag`),
PDEs on unstructured
grids with finite element and finite volume methods (``DMPLEX`` -- :any:`ch_unstructured`), PDEs on quadtree and octree grids (``DMFOREST``), models on
networks (graphs), such
as power grids or river networks (``DMNETWORK`` -- :any:`ch_network`), and particle-in-cell simulations (``DMSWARM``).

In previous chapters, we have demonstrated some simple usage of ``DM`` to provide the input for the solvers. In this chapter, and those that follow,
we will dive deep into the capabilities of ``DM``.


It is possible to create a ``DM`` with

.. code-block::

   DM dm;
   DMCreate(MPI_Comm comm, DM *dm);
   DMSetType(DM dm, DMType type);

but more commonly, a ``DM`` is created with a type-specific constructor; the construction process for each type of ``DM`` is discussed
in the sections on each ``DMType``. This chapter focuses
on commonalities between all the ``DM`` so we assume the ``DM`` already exists and we wish to work with it.
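As an illustration, here is a minimal sketch of creating a ``DM`` with a type-specific constructor, using ``DMDACreate2d``; the grid dimensions and stencil choices below are arbitrary values chosen for this example.

.. code-block::

   #include <petscdmda.h>

   int main(int argc, char **argv)
   {
     DM dm;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
     /* 2d structured grid: 8 x 8 points, 1 degree of freedom per point,
        star stencil of width 1, parallel layout decided by PETSc */
     PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                            DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
                            1, 1, NULL, NULL, &dm));
     PetscCall(DMSetFromOptions(dm));
     PetscCall(DMSetUp(dm));
     PetscCall(DMDestroy(&dm));
     PetscCall(PetscFinalize());
     return 0;
   }

Calling ``DMSetFromOptions()`` before ``DMSetUp()`` lets the grid size and other parameters be overridden from the command line, e.g. ``-da_grid_x 32``.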

As discussed earlier, a ``DM`` can construct vectors and matrices appropriate for a model and discretization and provide the mapping between the
global and local vector representations.

.. code-block::

   DMCreateLocalVector(DM dm, Vec *l);
   DMCreateGlobalVector(DM dm, Vec *g);
   DMGlobalToLocal(DM dm, Vec g, InsertMode mode, Vec l);
   DMLocalToGlobal(DM dm, Vec l, InsertMode mode, Vec g);
   DMCreateMatrix(DM dm, Mat *m);

The matrices produced may support ``MatSetValuesLocal()``, allowing one to set entries using the local numbering on each MPI rank. For ``DMDA`` one can also
use ``MatSetValuesStencil()``, and for ``DMSTAG``, ``DMStagMatSetValuesStencil()``.
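A minimal sketch of the global/local workflow, here using a one-dimensional ``DMDA`` of 16 points as an arbitrary example:

.. code-block::

   #include <petscdmda.h>

   int main(int argc, char **argv)
   {
     DM  dm;
     Vec g, l;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
     /* 1d grid of 16 points, 1 dof per point, stencil width 1 */
     PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 16, 1, 1, NULL, &dm));
     PetscCall(DMSetUp(dm));
     PetscCall(DMCreateGlobalVector(dm, &g));
     PetscCall(DMCreateLocalVector(dm, &l));
     PetscCall(VecSet(g, 1.0));
     /* scatter global values, including ghost points, into the local vector */
     PetscCall(DMGlobalToLocal(dm, g, INSERT_VALUES, l));
     /* ... compute on the local (ghosted) representation ... */
     /* accumulate local contributions back into the global vector */
     PetscCall(DMLocalToGlobal(dm, l, ADD_VALUES, g));
     PetscCall(VecDestroy(&g));
     PetscCall(VecDestroy(&l));
     PetscCall(DMDestroy(&dm));
     PetscCall(PetscFinalize());
     return 0;
   }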


A given ``DM`` can be refined for certain ``DMType``\s with ``DMRefine()`` or coarsened with ``DMCoarsen()``.
Mappings between ``DM``\s may be obtained with routines such as ``DMCreateInterpolation()``, ``DMCreateRestriction()`` and ``DMCreateInjection()``.

One attaches a ``DM`` to a PETSc solver object (``KSP``, ``SNES``, ``TS``, or ``Tao``) with

.. code-block::

   KSPSetDM(KSP ksp,DM dm);
   SNESSetDM(SNES snes,DM dm);
   TSSetDM(TS ts,DM dm);

Once the ``DM`` is attached, the solver can use it to create and process much of the data it needs to set up and carry out its solve.
For example, simply providing a ``DM`` to ``PCMG`` can allow it to create all the data structures needed to run geometric multigrid on your problem.
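A sketch of this pattern with ``KSP``: once the ``DM`` is attached, the solver creates the matrix and work vectors itself, and the user supplies callbacks that assemble the system. The callback names ``ComputeRHS`` and ``ComputeMatrix`` are our own; the identity operator assembled below is a placeholder standing in for a real discretization.

.. code-block::

   #include <petscdmda.h>
   #include <petscksp.h>

   static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
   {
     PetscFunctionBeginUser;
     PetscCall(VecSet(b, 1.0)); /* placeholder right-hand side */
     PetscFunctionReturn(PETSC_SUCCESS);
   }

   static PetscErrorCode ComputeMatrix(KSP ksp, Mat A, Mat P, void *ctx)
   {
     DM       dm;
     PetscInt i, xs, xm;

     PetscFunctionBeginUser;
     PetscCall(KSPGetDM(ksp, &dm));
     PetscCall(DMDAGetCorners(dm, &xs, NULL, NULL, &xm, NULL, NULL));
     for (i = xs; i < xs + xm; i++) {
       MatStencil  row = {.i = i};
       PetscScalar v   = 1.0; /* placeholder: identity operator */
       PetscCall(MatSetValuesStencil(P, 1, &row, 1, &row, &v, INSERT_VALUES));
     }
     PetscCall(MatAssemblyBegin(P, MAT_FINAL_ASSEMBLY));
     PetscCall(MatAssemblyEnd(P, MAT_FINAL_ASSEMBLY));
     PetscFunctionReturn(PETSC_SUCCESS);
   }

   int main(int argc, char **argv)
   {
     DM  dm;
     KSP ksp;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
     PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 16, 1, 1, NULL, &dm));
     PetscCall(DMSetFromOptions(dm));
     PetscCall(DMSetUp(dm));
     PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
     PetscCall(KSPSetDM(ksp, dm));
     PetscCall(KSPSetComputeRHS(ksp, ComputeRHS, NULL));
     PetscCall(KSPSetComputeOperators(ksp, ComputeMatrix, NULL));
     PetscCall(KSPSetFromOptions(ksp));
     /* with NULL arguments, KSP creates the solution and RHS vectors from the DM */
     PetscCall(KSPSolve(ksp, NULL, NULL));
     PetscCall(KSPDestroy(&ksp));
     PetscCall(DMDestroy(&dm));
     PetscCall(PetscFinalize());
     return 0;
   }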

`SNES Tutorial ex19 <PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html>`__ demonstrates how this may be done with ``DMDA``.