# Sharing Intermediates

If you want to compute multiple similar contractions with common terms, you can embed them in an [`opt_einsum.shared_intermediates`](../api_reference.md#opt_einsumshared_intermediates) context. Computations of subexpressions within this context will be memoized, and the cached intermediates will be garbage collected when the context exits.

For example, suppose we want to compute marginals at each point in a factor chain:

```python
import numpy as np
from opt_einsum import contract, shared_intermediates

inputs = 'ab,bc,cd,de,ef'
factors = [np.random.rand(1000, 1000) for _ in range(5)]

%%timeit
marginals = {output: contract('{}->{}'.format(inputs, output), *factors)
             for output in 'abcdef'}
#> 1 loop, best of 3: 5.82 s per loop
```

To share this computation, we can perform all contractions in a shared context:

```python
%%timeit
with shared_intermediates():
    marginals = {output: contract('{}->{}'.format(inputs, output), *factors)
                 for output in 'abcdef'}
#> 1 loop, best of 3: 1.55 s per loop
```

If it is difficult to fit your code into a context, you can instead save the sharing cache for later reuse.

```python
with shared_intermediates() as cache:  # create a cache
    pass
marginals = {}
for output in 'abcdef':
    with shared_intermediates(cache):  # reuse a common cache
        marginals[output] = contract('{}->{}'.format(inputs, output), *factors)
del cache  # garbage collect intermediates
```

Note that sharing contexts can be nested, so it is safe to use [`opt_einsum.shared_intermediates`](../api_reference.md#opt_einsumshared_intermediates) in library code without leaking intermediates into user caches.

!!! note
    By default a cache is thread-safe; to share intermediates between threads, explicitly pass the same cache to each thread.
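For instance, sharing one cache across threads might look like the following sketch (the `worker` helper is illustrative):

```python
import threading

import numpy as np
from opt_einsum import contract, shared_intermediates

inputs = 'ab,bc,cd'
factors = [np.random.rand(50, 50) for _ in range(3)]

with shared_intermediates() as cache:  # create an empty cache up front
    pass

results = {}

def worker(output):
    # Explicitly pass the same cache so all threads reuse each other's
    # intermediates rather than each building a private cache.
    with shared_intermediates(cache):
        results[output] = contract('{}->{}'.format(inputs, output), *factors)

threads = [threading.Thread(target=worker, args=(o,)) for o in 'ad']
for t in threads:
    t.start()
for t in threads:
    t.join()
```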