File: testing.md

# Testing Strategy

This document outlines how `io4dolfinx` is tested, covering both local development
testing and the Continuous Integration (CI) process on GitHub Actions.

## Coverage Reports

The latest coverage reports for `main`, tested against the `stable` and `nightly` versions of DOLFINx, are available at the following links:
- [Coverage report for stable](https://scientificcomputing.github.io/io4dolfinx/code-coverage-report-stable/)
- [Coverage report for nightly](https://scientificcomputing.github.io/io4dolfinx/code-coverage-report-nightly/)

## Local Testing

The library uses `pytest` for testing. To execute the tests locally, you first
need to install the library and its dependencies.

### Installation for Testing

Install the library with the optional `test` dependencies to ensure you have
packages such as `pytest`, `coverage`, and `ipyparallel`:

```bash
python3 -m pip install ".[test]"
```

### Running Tests

To execute all tests in the repository, run:
```bash
python3 -m pytest .
```
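
Standard `pytest` selection options also apply here; for example, to run only tests matching a keyword, or to stop at the first failure (the keyword `checkpoint` is just an illustrative choice, not a required marker):

```shell
# Run only tests whose names match a keyword, with verbose output
python3 -m pytest . -k "checkpoint" -v

# Stop at the first failure and show local variables in tracebacks
python3 -m pytest . -x --showlocals
```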

### Generating Test Data

Some tests require specific datasets to verify compatibility with older software
versions.

#### Testing against data from legacy dolfin

Some tests verify that data created with the legacy version of DOLFIN can be
read. To create this dataset, start a Docker container with legacy DOLFIN, for
instance:

```bash
docker run -ti -v $(pwd):/root/shared -w /root/shared --rm ghcr.io/scientificcomputing/fenics:2024-02-19
```

Then, inside this container, call:
```bash
python3 ./tests/create_legacy_data.py --output-dir=legacy
```

#### Testing against data from older versions of adios4dolfinx
Some tests check the capability to read data generated by `adios4dolfinx<0.7.2`.
To generate data for these tests, use the following commands:

```bash
docker run -ti -v $(pwd):/root/shared -w /root/shared --rm ghcr.io/fenics/dolfinx/dolfinx:v0.7.3
```

Then, inside the container, call:

```bash
python3 -m pip install adios4dolfinx==0.7.1
python3 ./tests/create_legacy_checkpoint.py --output-dir=legacy_checkpoint
```

---

## Continuous Integration (GitHub Actions)

The repository relies on several GitHub Actions workflows to ensure code quality
and compatibility across different environments.

### 1. Main Test Suite (`test_package.yml`)

This is the primary workflow, triggered on pushes to `main`, pull requests, and
scheduled nightly runs. It runs on `ubuntu-24.04` using the official DOLFINx
Docker container.

**Workflow Steps:**
1.  **Linting & Formatting:** Checks code style using `ruff` and type consistency
    with `mypy`.
2.  **Data Generation:**
    * Creates legacy DOLFIN data using the `create_legacy_data.yml` workflow.
    * Creates legacy `adios4dolfinx` checkpoints using `create_legacy_checkpoint.yml`.
3.  **Test Execution:**
    * Installs the package with MPI-enabled `h5py`.
    * Runs the standard test suite with `coverage`.
    * Runs parallel tests using `mpirun -n 4 ... mpi4py -m pytest`.
4.  **Reporting:** Combines coverage reports and uploads them as artifacts.
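
The coverage-related steps above can be approximated locally with a sketch like the following. This is not the exact CI configuration; the use of `--parallel-mode` and the process count of 4 are assumptions based on the workflow description:

```shell
# Serial test run under coverage; --parallel-mode gives each run
# its own .coverage.* data file so results can be merged later
python3 -m coverage run --parallel-mode -m pytest .

# Parallel run: each MPI rank writes a separate coverage data file
mpirun -n 4 python3 -m coverage run --parallel-mode -m pytest .

# Merge the per-run data files and render an HTML report
python3 -m coverage combine
python3 -m coverage html
```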

### 2. Compatibility Testing

To ensure broad support, specific workflows test against different configurations:

* **MPI & ADIOS2 Versions (`test_package_openmpi.yml`):**
    * Tests against both `openmpi` and `mpich` implementations using the
      `ghcr.io/fenics/test-env` containers.
    * Verifies compatibility with different ADIOS2 versions (e.g., `v2.10.2`, `v2.11.0`).

* **Operating System (`test_redhat.yml`):**
    * Runs the full test suite inside a RedHat-based container
      (`docker.io/fenicsproject/test-env:current-redhat`) to guarantee functionality
      on non-Debian systems.
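
To reproduce the RedHat run locally, one could start the same container interactively, mirroring the Docker commands shown earlier in this document (the mount point and working directory are just one convention, not a CI requirement):

```shell
# Start an interactive shell in the RedHat-based test container,
# mounting the current checkout at /root/shared
docker run -ti -v $(pwd):/root/shared -w /root/shared --rm \
    docker.io/fenicsproject/test-env:current-redhat
```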

### 3. Documentation (`build_docs.yml`)

Ensures the documentation builds correctly with `jupyter-book` on every push and
pull request, preventing documentation regressions.
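
A local documentation build would look roughly like the following. The `docs` source directory is an assumption, not confirmed by this document; adjust the path to wherever the `jupyter-book` configuration lives:

```shell
# Install the documentation tool and build the book;
# output lands in <source-dir>/_build/html by default
python3 -m pip install jupyter-book
jupyter-book build docs
```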