File: ex2.py


'''
Ex2 from PETSc example files implemented for PETSc4py.
https://petsc.org/release/src/ksp/ksp/tutorials/ex2.c.html
By: Miguel Arriaga


Solves a linear system in parallel with KSP.
Input parameters include:
    -view_exact_sol   : write exact solution vector to stdout
    -m <mesh_x>       : number of mesh points in x-direction
    -n <mesh_y>       : number of mesh points in y-direction


Vec            x,b,u;    # approx solution, RHS, exact solution
Mat            A;        # linear system matrix
KSP            ksp;      # linear solver context
PetscReal      norm;     # norm of solution error
'''
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

comm = PETSc.COMM_WORLD
size = comm.getSize()
rank = comm.getRank()

OptDB = PETSc.Options()
m  = OptDB.getInt('m', 8)
n  = OptDB.getInt('n', 7)

''' - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
        Compute the matrix and right-hand-side vector that define
        the linear system, Ax = b.
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - '''
'''
    Create parallel matrix, specifying only its global dimensions.
    When using MatCreate(), the matrix format can be specified at
    runtime. Also, the parallel partitioning of the matrix is
    determined by PETSc at runtime.

    Performance tuning note:  For problems of substantial size,
    preallocation of matrix memory is crucial for attaining good
    performance. See the matrix chapter of the users manual for details.
'''

A = PETSc.Mat().create(comm=comm)
A.setSizes((m*n,m*n))
A.setFromOptions()
A.setPreallocationNNZ((5,5))
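
'''
    Aside (a sketch, not executed): instead of the uniform (5, 5) bound
    above, per-row counts can be supplied once the ownership range is
    known, e.g.
        A.setPreallocationNNZ((d_nnz, o_nnz))
    where d_nnz and o_nnz (names introduced here for illustration) are
    integer arrays of length Iend - Istart giving the nonzeros per local
    row in the diagonal and off-diagonal blocks. The uniform bound of 5
    already matches the five-point stencil used below.
'''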

'''
    Currently, all PETSc parallel matrix formats are partitioned by
    contiguous chunks of rows across the processors.  Determine which
    rows of the matrix are locally owned.
'''
Istart, Iend = A.getOwnershipRange()
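
'''
    Optional illustration (the -view_ownership flag is an addition in this
    port, not part of the original example): report each rank's contiguous
    block of locally owned rows.
'''
if OptDB.getBool('view_ownership', False):
    PETSc.Sys.syncPrint('  rank {} owns rows [{}, {})'.format(rank, Istart, Iend), comm=comm)
    PETSc.Sys.syncFlush(comm=comm)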

'''
    Set matrix elements for the 2-D, five-point stencil in parallel.
    - Each processor needs to insert only elements that it owns
    locally (but any non-local elements will be sent to the
    appropriate processor during matrix assembly).
    - Always specify global rows and columns of matrix entries.

    Note: this uses the less common "natural" ordering that numbers all
    the unknowns for x = h first, then all for x = 2h, etc.; hence you see
    J = Ii +- n instead of J = Ii +- m as you might expect. The more
    standard ordering would first do all variables for y = h, then y = 2h, etc.
'''
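
'''
    For example, with m = 2, n = 3 this ordering maps global row Ii to
    grid point (i, j) = (Ii // n, Ii % n):
        Ii:    0      1      2      3      4      5
        (i,j): (0,0)  (0,1)  (0,2)  (1,0)  (1,1)  (1,2)
    so the neighbor in the x-direction is n entries away and the neighbor
    in the y-direction is 1 entry away.
'''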

for Ii in range(Istart, Iend):
    v = -1.0
    # Recover the 2-D grid indices from the global row index: Ii = i*n + j
    i, j = divmod(Ii, n)

    if i > 0:
        J = Ii - n
        A.setValues(Ii, J, v, addv=True)
    if i < m - 1:
        J = Ii + n
        A.setValues(Ii, J, v, addv=True)
    if j > 0:
        J = Ii - 1
        A.setValues(Ii, J, v, addv=True)
    if j < n - 1:
        J = Ii + 1
        A.setValues(Ii, J, v, addv=True)

    v = 4.0
    A.setValues(Ii, Ii, v, addv=True)

'''
    Assemble matrix, using the 2-step process:
    MatAssemblyBegin(), MatAssemblyEnd()
    Computations can be done while messages are in transition
    by placing code between these two statements.
'''

A.assemblyBegin(A.AssemblyType.FINAL)
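# (Independent local computation could be placed here, between
#  assemblyBegin() and assemblyEnd(), to overlap work with communication.)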
A.assemblyEnd(A.AssemblyType.FINAL)
''' A is symmetric. Set symmetric flag to enable ICC/Cholesky preconditioner '''

A.setOption(A.Option.SYMMETRIC,True)

'''
    Create parallel vectors.
    - We form one vector from scratch and then duplicate it as needed.
    - When using VecCreate(), VecSetSizes() and VecSetFromOptions() as in
    this example, we specify only the vector's global dimension; the
    parallel partitioning is determined at runtime.
    - When solving a linear system, the vectors and matrices MUST
    be partitioned accordingly.  PETSc automatically generates
    appropriately partitioned matrices and vectors when MatCreate()
    and VecCreate() are used with the same communicator.
    - The user can alternatively specify the local vector and matrix
    dimensions when more sophisticated partitioning is needed
    (replacing the PETSC_DECIDE argument in the VecSetSizes() statement
    below); see the sketch after the vectors are created.
'''

u = PETSc.Vec().create(comm=comm)
u.setSizes(m*n)
u.setFromOptions()

b = u.duplicate()
x = b.duplicate()
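
'''
    A sketch (not executed here): to control the partitioning explicitly,
    pass a (local, global) pair instead of a single global size, e.g.
        u.setSizes((local_size, m*n))
    where local_size (a name introduced for illustration) is chosen per
    rank, must sum to m*n across the communicator, and must match the
    matrix's row distribution.
'''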

'''
    Set the exact solution; then compute the right-hand-side vector.
    By default we use an exact solution with all elements equal to 1.0.
'''
u.set(1.0)
A.mult(u, b)  # b = A*u, matching MatMult(A,u,b) in the C example
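# petsc4py also accepts b = A(u) or b = A*u, but those forms allocate a
# new vector and discard the one duplicated above.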

'''
    View the exact solution vector if desired
'''
flg = OptDB.getBool('view_exact_sol', False)
if flg:
    u.view()

''' - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
            Create the linear solver and set various options
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - '''
ksp = PETSc.KSP().create(comm=comm)

'''
    Set operators. Here the matrix that defines the linear system
    also serves as the preconditioning matrix.
'''
ksp.setOperators(A,A)
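# A separate preconditioning matrix P (hypothetical; this example builds
# none) could be supplied instead: ksp.setOperators(A, P)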

'''
    Set linear solver defaults for this problem (optional).
    - By extracting the KSP and PC contexts from the KSP context,
    we can then directly call any KSP and PC routines to set
    various options.
    - The following statement is optional; all of these
    parameters could alternatively be specified at runtime via
    KSPSetFromOptions().  All of these defaults can be
    overridden at runtime, as indicated below.
'''
rtol = 1.e-2 / ((m+1)*(n+1))
ksp.setTolerances(rtol=rtol, atol=1.e-50)

'''
Set runtime options, e.g.,
    -ksp_type <type> -pc_type <type> -ksp_monitor -ksp_rtol <rtol>
These options will override those specified above as long as
KSPSetFromOptions() is called _after_ any other customization
routines.
'''
ksp.setFromOptions()
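
'''
    Usage sketch (assuming an MPI launcher and petsc4py are available):
        mpiexec -n 4 python ex2.py -m 32 -n 32 -ksp_type cg -pc_type jacobi -ksp_monitor
    Since the matrix was flagged SYMMETRIC above, symmetric solvers such
    as CG are valid choices here.
'''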

''' - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                    Solve the linear system
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - '''

ksp.solve(b,x)

''' - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                    Check the solution and clean up
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - '''
x = x - u  # form the error vector; equivalently in place: x.axpy(-1.0, u)
norm = x.norm(PETSc.NormType.NORM_2)
its = ksp.getIterationNumber()

'''
    Print convergence information.  PETSc.Sys.Print() (the petsc4py
    counterpart of PetscPrintf()) produces a single print statement from
    all processes that share a communicator.  An alternative is
    PetscFPrintf(), which prints to a file.
'''
if norm > rtol*10:
    PETSc.Sys.Print("Norm of error {}, Iterations {}".format(norm,its),comm=comm)
else:
    if size==1:
        PETSc.Sys.Print("- Serial OK",comm=comm)
    else:
        PETSc.Sys.Print("- Parallel OK",comm=comm)