File: README.md

# EventHub Performance Tests

To run the performance tests, the `azure-devtools` package must be installed. It is installed as part of the `dev_requirements`.
Start by creating a new virtual environment for your perf tests. This will need to be a Python 3 environment, preferably >=3.7.
Note that tests for the T1 and T2 SDKs cannot be run from the same environment and will need to be set up separately.
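
For example, a fresh environment could be created and activated like this before installing anything (the environment name `env` and the working directory are illustrative; on Linux/macOS use `source env/bin/activate` instead of the Windows activation script):

```cmd
~/azure-eventhub> python -m venv env
~/azure-eventhub> env\Scripts\activate
(env) ~/azure-eventhub>
```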

### Setup for test resources

These tests will run against a pre-configured Event Hub. The following environment variables will need to be set for the tests to access the live resources:
```
AZURE_EVENTHUB_CONNECTION_STRING=<the connection string of an Event Hub>
AZURE_EVENTHUB_NAME=<the path of the specific Event Hub to connect to>
```
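
How these are set depends on your shell; for example, in a Windows `cmd` session (the values below are placeholders; use `export` instead of `set` in bash):

```cmd
(env) ~/azure-eventhub> set AZURE_EVENTHUB_CONNECTION_STRING=<connection string>
(env) ~/azure-eventhub> set AZURE_EVENTHUB_NAME=<event hub name>
```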

### Setup for T2 perf test runs

```cmd
(env) ~/azure-eventhub> pip install -r dev_requirements.txt
(env) ~/azure-eventhub> pip install .
```

### Setup for T1 perf test runs

```cmd
(env) ~/azure-eventhub> pip install -r dev_requirements.txt
(env) ~/azure-eventhub> pip install -r tests/perfstress_tests/T1_legacy_tests/t1_test_requirements.txt
```

## Test commands

When `azure-devtools` is installed, you will have access to the `perfstress` command line tool, which will scan the current module for runnable perf tests. Only a specific test can be run at a time (i.e. there is no "run all" feature).

```cmd
(env) ~/azure-eventhub> cd tests
(env) ~/azure-eventhub/tests> perfstress
```
Running the `perfstress` command with no arguments will list the perf tests it finds.

### Common perf command line options
These options are available for all perf tests:
- `--duration=10` Number of seconds to run as many operations (the "run" function) as possible. Default is 10.
- `--iterations=1` Number of test iterations to run. Default is 1.
- `--parallel=1` Number of tests to run in parallel. Default is 1.
- `--warm-up=5` Number of seconds to spend warming up the connection before measuring begins. Default is 5.
- `--sync` Whether to run the tests synchronously or asynchronously. Default is False (async).
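
As an illustration, a synchronous run with a longer warm-up and duration might look like this (the test name is one of those listed under "T2 Tests" below; the values are arbitrary):

```cmd
(env) ~/azure-eventhub/tests> perfstress SendEventBatchTest --sync --warm-up=10 --duration=30
```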

### Common Event Hub command line options
These options are available for all Event Hub perf tests:
- `--event-size=100` Number of bytes each event contains. Default is 100.
- `--num-events` Number of events to send/receive as part of a single run.

#### Receive command line options
The receiving tests have these additional command line options:
- `--max-wait-time=0` The max time to wait for the specified number of events to be received. Default is 0 (indefinitely).
- `--preload=10000` The number of events to preload into the event hub before the receiving tests start. Default is 10000 events.
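
For example, a receive test that preloads more events and caps the wait time could be invoked like this (the flag values are illustrative):

```cmd
(env) ~/azure-eventhub/tests> perfstress ReceiveEventBatchTest --num-events=100 --preload=50000 --max-wait-time=30
```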

### T2 Tests
The tests currently written for the T2 SDK:
- `SendEventBatchTest` Sends `num-events` in a batch per run.
- `ReceiveEventTest` Receives `num-events` using the `receive` method. Receive command options apply. 
- `ReceiveEventBatchTest` Receives `num-events` using the `receive_batch` method. Receive command options apply.

### T1 Tests
The tests currently written for the T1 SDK:
- `LegacySendEventTest` Sends a single event per run.
- `LegacySendEventBatchTest` Sends `num-events` in a batch per run.
- `LegacyReceiveEventBatchTest` Receives `num-events` using the `receive` method. Receive command options apply.
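
For example, from the T1 environment a legacy batch send test might be run like this (assuming the legacy tests are discovered by `perfstress` in the same way; the values are illustrative):

```cmd
(env) ~/azure-eventhub/tests> perfstress LegacySendEventBatchTest --num-events=100 --event-size=1024
```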

## Example command
```cmd
(env) ~/azure-eventhub/tests> perfstress ReceiveEventBatchTest --parallel=2 --event-size=1024 --num-events=100 --duration=100
```