File: notes.txt

This stress test program is for debugging threading issues with the ISAPI
module.

2 ways to use it:

1: test any php script file on multiple threads
2: run the php test scripts bundled with the source code



GLOBAL SETTINGS
===============

If you need to set special environment variables, in addition to your
regular environment, create a file that contains them, one setting per line:

MY_ENV_VAR=XXXXXXXX

This can be used to simulate ISAPI environment variables if need be.
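
For example, a settings file simulating a few server variables might look
like this (these names are common CGI/ISAPI server variables, shown only as
an illustration):

SERVER_NAME=localhost
SERVER_PORT=80
REMOTE_ADDR=127.0.0.1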

By default, the stress test uses 10 threads.  To change this, edit the
NUM_THREADS define in stresstest.cpp and rebuild.
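
The relevant line in stresstest.cpp looks roughly like this (the exact
value and surrounding code may differ in your source tree):

#define NUM_THREADS 10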



1: Test any php script file on multiple threads
===============================================

Create a file that contains a list of php script files, one per line.  If
you need to provide input, place the GET data, or Query String, after the
filename.  File contents would look like:

e:\inetpub\pages\index.php
e:\inetpub\pages\info.php
e:\inetpub\pages\test.php a=1&b=2

Run: stresstest L files.txt



2: Run the php test scripts bundled with the source code
========================================================

Supply the path to the parent of the "tests" directory (expect a couple of
long pauses for some of the larger tests).

Run: stresstest T c:\php5-source



TODO:

* Make more options configurable: number of threads, iterations, etc.
* Improve stdout output to make it more useful
* Implement support for SKIPIF
* Improve speed of CompareFile function (too slow on big files).
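
On the last point, the slowdown on big files likely comes from comparing
whole files at once; reading and comparing in fixed-size chunks lets the
function bail out at the first differing block.  A minimal sketch of that
approach (the function name FastCompareFiles and the buffer size are
illustrative, not taken from stresstest.cpp):

```cpp
#include <cstdio>
#include <cstring>

// Compare two files in 8 KB chunks; returns true only if both files
// open successfully and have identical contents.  Stops reading at the
// first chunk that differs, so mismatched big files exit early.
static bool FastCompareFiles(const char *a, const char *b)
{
    FILE *fa = fopen(a, "rb");
    FILE *fb = fopen(b, "rb");
    bool same = false;
    if (fa && fb) {
        char bufa[8192], bufb[8192];
        size_t na, nb;
        same = true;
        do {
            na = fread(bufa, 1, sizeof bufa, fa);
            nb = fread(bufb, 1, sizeof bufb, fb);
            // Different lengths read, or differing bytes: not the same file.
            if (na != nb || memcmp(bufa, bufb, na) != 0) {
                same = false;
                break;
            }
        } while (na == sizeof bufa);  // short read means end of file
    }
    if (fa) fclose(fa);
    if (fb) fclose(fb);
    return same;
}
```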