File: fedriver.py

Package: paraview 5.13.2+dfsg-3
"""
A simple example of a Python simulation code working with ParaView Catalyst V2.
It depends on numpy and mpi4py being available. The environment
variables need to be set up properly to find Catalyst when running directly
from Python. For Linux
and Mac machines they should be:
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:<ParaView build dir>/lib
export PYTHONPATH=<ParaView build dir>/lib:<ParaView build dir>/lib/site-packages

Alternatively, pvbatch or pvpython can be used which will automatically set up
system paths for using ParaView Catalyst.

The location of the Catalyst Python wrapped libraries still need to be specified though:
export PYTHONPATH=<Catalyst build dir>/lib64/python<version>/site-packages:$PYTHONPATH

When running, Catalyst scripts must be added in on the command line. For example:
</path/to/pvpython> fedriver.py cpscript.py
mpirun -np 4 </path/to/pvbatch> --sym fedriver.py cpscript.py
"""
import numpy
from mpi4py import MPI

# mpi4py initializes MPI on import; each process learns its rank here so the
# data structures can be partitioned across processes.
comm = MPI.COMM_WORLD
rank = comm.Get_rank()

import fedatastructures

# Build a uniform grid (points per axis, spacing per axis) and the field data
# that lives on it.
grid = fedatastructures.GridClass([30, 32, 34], [2.1, 2.2, 2.3])
attributes = fedatastructures.AttributesClass(grid)
doCoprocessing = True

if doCoprocessing:
    # Only import and initialize Catalyst when coprocessing is enabled.
    import catalyst_adaptor
    catalyst_adaptor.initialize()


for i in range(100):
    # Advance the simulation one step, then hand the updated data to Catalyst.
    attributes.Update(i)
    if doCoprocessing:
        # The loop index is passed as both the simulation time and the
        # timestep index.
        catalyst_adaptor.coprocess(i, i, grid, attributes)

if doCoprocessing:
    catalyst_adaptor.finalize()
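
The driver above relies on a companion `fedatastructures` module that is not shown on this page. As a rough orientation only, here is a minimal, hypothetical sketch of the shape such a module could take — `GridClass` describing a uniform grid from per-axis point counts and spacings, and `AttributesClass` recomputing a per-point field each time `Update()` is called. The class and attribute names below (other than the two constructors used by the driver) are assumptions, not the actual ParaView example source.

```python
# Hypothetical sketch of a fedatastructures-style module; field and method
# names beyond GridClass/AttributesClass/Update are illustrative assumptions.
import numpy


class GridClass:
    def __init__(self, pointDimensions, spacing):
        # Number of points along x, y, z and the uniform spacing per axis.
        self.PointDimensions = numpy.array(pointDimensions)
        self.Spacing = numpy.array(spacing)

    def GetNumberOfPoints(self):
        # Total point count of the structured grid.
        return int(numpy.prod(self.PointDimensions))


class AttributesClass:
    def __init__(self, grid):
        self.Grid = grid
        # One scalar value per grid point.
        self.Pressure = numpy.zeros(grid.GetNumberOfPoints())

    def Update(self, time):
        # Refill the field with a time-dependent value so each
        # coprocessing step sees fresh data.
        self.Pressure[:] = float(time)


if __name__ == "__main__":
    # Mirrors the construction in fedriver.py above.
    grid = GridClass([30, 32, 34], [2.1, 2.2, 2.3])
    attributes = AttributesClass(grid)
    attributes.Update(5)
    print(grid.GetNumberOfPoints(), attributes.Pressure[0])
```

In the real example the grid is also partitioned across MPI ranks (which is why `fedriver.py` queries `comm.Get_rank()`); this sketch omits that for brevity.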