@sequential
Feature: Runner should support a --dry-run option

    As a tester
    I want to check if behave tests are syntactically correct
    And all step definitions exist
    Before I actually run the tests (by executing steps).

    . SPECIFICATION: Dry-run mode
    .   * Undefined steps are detected
    .   * Marks steps as "untested" or "undefined"
    .   * Marks scenarios as "untested"
    .   * Marks features as "untested"
    .   * Causes no failed scenarios or features
    .   * Causes a failed test-run when undefined steps are found.
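
    # NOTE: Sketch of a manual check (assuming behave is installed and the
    # files from the setup scenario below exist in the working directory):
    #
    #       behave --dry-run features/alice.feature
    #
    # Dry-run parses the features and matches steps against the registered
    # step definitions without executing them; per the specification above,
    # the test-run fails only when undefined steps are found.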

    @setup
    Scenario: Feature Setup
        Given a new working directory
        And a file named "features/steps/steps.py" with:
            """
            from behave import step

            @step('a step passes')
            def step_passes(context):
                pass

            @step('a step fails')
            def step_fails(context):
                assert False, "XFAIL"
            """
        And a file named "features/alice.feature" with:
            """
            Feature: Alice

              @selected
              Scenario: A1
                Given a step passes
                When a step passes
                Then a step passes

              @other_selected
              Scenario: A2
                Given a step passes
                When a step fails
                Then a step passes

              @selected
              Scenario: A3
                Given a step passes

              @selected
              Scenario: A4
                Given a step fails
            """
        And a file named "features/bob.feature" with:
            """
            Feature: Bob

              Scenario: B1
                Given a step passes
                When a step passes
                Then a step passes

              Scenario: B2
                Given a step passes
                When a step fails
                Then a step passes
            """
        And a file named "features/undefined_steps.feature" with:
            """
            Feature: Undefined Steps

              @selected
              Scenario: U1
                Given a step passes
                When a step is undefined
                Then a step fails

              @other_selected
              Scenario: U2 fails
                Given a step is undefined
                When a step passes
                And a step fails
                Then a step is undefined
            """

    Scenario: Dry-run one feature should mark feature/scenarios/steps as untested
        When I run "behave -f plain --dry-run features/alice.feature"
        Then it should pass with:
            """
            0 features passed, 0 failed, 0 skipped, 1 untested
            0 scenarios passed, 0 failed, 0 skipped, 4 untested
            0 steps passed, 0 failed, 0 skipped, 0 undefined, 8 untested
            """
        And the command output should contain:
            """
            Scenario: A1
            """
        And the command output should contain:
            """
            Scenario: A2
            """
        And the command output should contain:
            """
            Scenario: A3
            """
        And the command output should contain:
            """
            Scenario: A4
            """
        And note that "all scenarios of this feature are contained"

    Scenario: Dry-run one feature with tags should mark skipped scenarios/steps as skipped
        When I run "behave -f plain --dry-run --tags=@selected --no-skipped features/alice.feature"
        Then it should pass with:
            """
            0 features passed, 0 failed, 0 skipped, 1 untested
            0 scenarios passed, 0 failed, 1 skipped, 3 untested
            0 steps passed, 0 failed, 3 skipped, 0 undefined, 5 untested
            """
        And the command output should contain:
            """
            Scenario: A1
            """
        And the command output should contain:
            """
            Scenario: A3
            """
        And the command output should contain:
            """
            Scenario: A4
            """
        But the command output should not contain:
            """
            Scenario: A2
            """
        And note that "only tagged scenarios of this feature are contained (3 of 4)"

    Scenario: Dry-run two features
        When I run "behave --dry-run features/alice.feature features/bob.feature"
        Then it should pass with:
            """
            0 features passed, 0 failed, 0 skipped, 2 untested
            0 scenarios passed, 0 failed, 0 skipped, 6 untested
            0 steps passed, 0 failed, 0 skipped, 0 undefined, 14 untested
            """

    Scenario: Dry-run one feature with undefined steps
        When I run "behave --dry-run features/undefined_steps.feature"
        Then it should fail with:
            """
            0 features passed, 0 failed, 0 skipped, 1 untested
            0 scenarios passed, 0 failed, 0 skipped, 2 untested
            0 steps passed, 0 failed, 0 skipped, 3 undefined, 4 untested
            """

    Scenario: Dry-run two features, one with undefined steps
        When I run "behave --dry-run features/alice.feature features/undefined_steps.feature"
        Then it should fail with:
            """
            0 features passed, 0 failed, 0 skipped, 2 untested
            0 scenarios passed, 0 failed, 0 skipped, 6 untested
            0 steps passed, 0 failed, 0 skipped, 3 undefined, 12 untested
            """

    Scenario: Dry-run two features, one with undefined steps and use tags
        When I run "behave --dry-run --tags=@selected features/alice.feature features/undefined_steps.feature"
        Then it should fail with:
            """
            0 features passed, 0 failed, 0 skipped, 2 untested
            0 scenarios passed, 0 failed, 2 skipped, 4 untested
            0 steps passed, 0 failed, 7 skipped, 1 undefined, 7 untested
            """

    Scenario: Dry-run two features, one with undefined steps and use other tags
        When I run "behave --dry-run --tags=@other_selected features/alice.feature features/undefined_steps.feature"
        Then it should fail with:
            """
            0 features passed, 0 failed, 0 skipped, 2 untested
            0 scenarios passed, 0 failed, 4 skipped, 2 untested
            0 steps passed, 0 failed, 8 skipped, 2 undefined, 5 untested
            """