File: notes.txt

This stress test program is for debugging threading issues with the ISAPI
module.

There are two ways to use it:

1: Test any PHP script file on multiple threads
2: Run the PHP test scripts bundled with the source code



GLOBAL SETTINGS
===============

If you need to set special environment variables in addition to your
regular environment, create a file that contains them, one setting per line:

MY_ENV_VAR=XXXXXXXX

This can be used to simulate ISAPI environment variables if need be.
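
For example, a file simulating a few server-style variables might contain
entries like these (the variable names below are only illustrative, not a
required set):

SERVER_NAME=localhost
HTTP_USER_AGENT=stresstest
CONTENT_TYPE=text/html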

By default, the stress test uses 10 threads.  To change this, edit the define
NUM_THREADS in stresstest.cpp.
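
For reference, the relevant define in stresstest.cpp looks roughly like the
following (the exact value and surrounding code may differ between versions):

#define NUM_THREADS 10    /* number of worker threads spawned by the test */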



1: Test any PHP script file on multiple threads
===============================================

Create a file that contains a list of PHP script files, one per line.  If
you need to provide input, place the GET data, or query string, after the
filename, separated by a space.  File contents would look like:

e:\inetpub\pages\index.php
e:\inetpub\pages\info.php
e:\inetpub\pages\test.php a=1&b=2

Run: stresstest L files.txt
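
As an illustration of the line format, a minimal sketch of how each entry
could be split into a script path and an optional query string is shown
below (this is just an example parser, not the actual stresstest code):

// Illustrative only: splits one files.txt line at the first space into
// the script path and the optional query string that follows it.
#include <iostream>
#include <string>

int main()
{
    std::string line = "e:\\inetpub\\pages\\test.php a=1&b=2";

    std::string::size_type space = line.find(' ');
    std::string path  = (space == std::string::npos) ? line : line.substr(0, space);
    std::string query = (space == std::string::npos) ? ""   : line.substr(space + 1);

    std::cout << "script: " << path  << std::endl;
    std::cout << "query:  " << query << std::endl;
    return 0;
}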



2: Run the PHP test scripts bundled with the source code
========================================================

Supply the path to the parent of the "tests" directory.  Expect a couple of
long pauses while some of the larger tests run.

Run: stresstest T c:\php5-source



TODO:

* Make more options configurable: number of threads, iterations, etc.
* Improve stdout output to make it more useful
* Implement support for SKIPIF
* Improve speed of CompareFile function (too slow on big files).