File: warnings.txt

Testing warnings
================

.. currentmodule:: testfixtures

Testfixtures provides tools that make it easy to write assertions about code that may emit warnings.

The :class:`ShouldWarn` context manager
---------------------------------------

This context manager allows you to assert that particular warnings are
emitted in a block of code, for example:

>>> from warnings import warn
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn(UserWarning('you should fix that')):
...     warn('you should fix that')

If an issued warning doesn't match the expected one,
:class:`ShouldWarn` will raise an :class:`AssertionError`,
causing the test in which it occurs to fail:

>>> from warnings import warn
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn(UserWarning('you should fix that')):
...     warn("sorry dave, I can't let you do that")
Traceback (most recent call last):
...
AssertionError:...
<SequenceComparison(ordered=True, partial=False)(failed)>
same:
[]
<BLANKLINE>
expected:
[
<C:....UserWarning(failed)>
attributes differ:
'args': ('you should fix that',) (Comparison) != ("sorry dave, I can't let you do that",) (actual)
</C:....UserWarning>]
<BLANKLINE>
actual:
[UserWarning("sorry dave, I can't let you do that"...)]
</SequenceComparison(ordered=True, partial=False)> (expected) != [UserWarning("sorry dave, I can't let you do that"...)] (actual)
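
As the output above shows, both the class of each expected warning and its
``args`` are compared, so warnings of other categories can be matched in the
same way; a short sketch:

>>> from warnings import warn
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn(DeprecationWarning('this will go away')):
...     warn('this will go away', DeprecationWarning)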

You can check for multiple warnings issued in a particular piece of code:

>>> from warnings import warn
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn(UserWarning('you should fix that'),
...                 UserWarning('and that too')):
...     warn('you should fix that')
...     warn('and that too')

If you don't care about the order of issued warnings, you can use ``order_matters=False``:

>>> from warnings import warn
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn(UserWarning('you should fix that'),
...                 UserWarning('and that too'),
...                 order_matters=False):
...     warn('and that too')
...     warn('you should fix that')

If you want to inspect more details of the warnings issued, you can capture
them into a list as follows:

>>> from warnings import warn_explicit
>>> from testfixtures import ShouldWarn
>>> with ShouldWarn() as captured:
...     warn_explicit(message='foo', category=DeprecationWarning,
...                   filename='bar.py', lineno=42)
>>> len(captured)
1
>>> captured[0].message
DeprecationWarning('foo'...)
>>> captured[0].lineno
42
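
The captured items appear to be the standard :class:`warnings.WarningMessage`
records produced by :func:`warnings.catch_warnings`; assuming that interface,
other attributes such as the category and filename are available too:

>>> captured[0].category
<class 'DeprecationWarning'>
>>> captured[0].filename
'bar.py'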

The :class:`ShouldNotWarn` context manager
------------------------------------------

If you do not expect any warnings to be issued by a piece of code, you can use
the :class:`ShouldNotWarn` context manager. If any warnings are issued in the
context it manages, it will raise an :class:`AssertionError` to indicate this:

>>> from warnings import warn
>>> from testfixtures import ShouldNotWarn
>>> with ShouldNotWarn():
...     warn("woah dude")
Traceback (most recent call last):
...
AssertionError:...
<SequenceComparison(ordered=True, partial=False)(failed)>
same:
[]
<BLANKLINE>
expected:
[]
<BLANKLINE>
actual:
[UserWarning('woah dude'...)]
</SequenceComparison(ordered=True, partial=False)> (expected) != [UserWarning('woah dude'...)] (actual)
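
Conversely, when no warnings are issued in the block it manages,
:class:`ShouldNotWarn` exits silently:

>>> from testfixtures import ShouldNotWarn
>>> with ShouldNotWarn():
...     x = 1 + 1  # code under test that issues no warnings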