Mailing List Archive

Stress testing of beta.wikipedia.com
Here is another data point regarding the performance of beta.wikipedia.com

I'm now running
7 stressbots (page readers)
2 postits (page writers)
concurrently accessing beta.wikipedia.com via a 512K DSL connection.
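A "stressbot" page reader can be sketched in a few lines of Python. The URL and the timing logic here are my guesses at what such a bot does, not the actual test harness:

```python
import threading
import time
import urllib.request

def run_bots(fetch, n_bots, requests_each):
    """Run n_bots concurrent page readers, each calling fetch()
    requests_each times; return the average seconds per request."""
    timings = []
    lock = threading.Lock()

    def bot():
        for _ in range(requests_each):
            start = time.monotonic()
            fetch()
            with lock:
                timings.append(time.monotonic() - start)

    threads = [threading.Thread(target=bot) for _ in range(n_bots)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(timings) / len(timings)

def fetch_page(url="http://beta.wikipedia.com/"):
    # hypothetical reader: fetch one page and discard the body
    urllib.request.urlopen(url).read()
```

Something like run_bots(fetch_page, 7, 50) would then approximate the seven-reader run.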

I'm getting the following statistics:
average page read time 2.9 seconds
average page write time 4.9 seconds

corresponding to an
average page read rate of 7/2.9 = 2.4 pages/sec
average page write rate of 2 / 4.9 = 0.41 pages/sec

making a total sustained transaction rate of around 2.8 hits/sec, which
works out to about 240,000 hits/day, or over 7 million hits a month.
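A quick check of the arithmetic, using the measured averages quoted above:

```python
readers, writers = 7, 2
read_time, write_time = 2.9, 4.9       # measured seconds per page

read_rate = readers / read_time        # ~2.41 pages/sec
write_rate = writers / write_time      # ~0.41 pages/sec
total = read_rate + write_rate         # ~2.82 hits/sec

per_day = total * 86400                # ~244,000 hits/day
per_month = per_day * 30               # ~7.3 million hits/month
```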

However, my inbound traffic is around 61 kbytes/sec doing this, so my
512k DSL link is currently the bottleneck, not the server.
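That is consistent with the link being saturated: 512 kbit/s is 64 kbytes/s, so 61 kbytes/s is about 95% utilisation. As a sanity check:

```python
link_kbits = 512                   # nominal DSL downstream rate
capacity = link_kbits / 8          # 64 kbytes/sec
observed = 61                      # measured inbound kbytes/sec
utilisation = observed / capacity  # ~0.95, so the link is the bottleneck
```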

Dropping the concurrency to
3 stressbots
1 postit

gives:
average page read time 1.9 seconds
average page write time 3.1 seconds

corresponding to an
average page read rate of 3/1.9 = 1.58 pages/sec
average page write rate of 1 / 3.1 = 0.32 pages/sec

total transaction rate: 1.9 hits/sec

for an inbound traffic rate of about 26 kbytes/s, where my DSL link is
no longer the bottleneck, but the system is under less load.

To really stress test the server, we will need several clients to run at
once on several different links. I'm going to stop the test now.

It would be useful for testing if we could have a page that gave current
Linux operating system stats, perhaps as a sysop-only page?
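As a sketch of what such a page might report: on Linux the load averages can be read straight from /proc/loadavg (the stats-page plumbing itself is hypothetical):

```python
def parse_loadavg(text):
    # First three fields of /proc/loadavg are the
    # 1-, 5- and 15-minute load averages
    one, five, fifteen = (float(x) for x in text.split()[:3])
    return one, five, fifteen

def server_stats():
    # What a sysop-only stats page might show (Linux only)
    with open("/proc/loadavg") as f:
        one, five, fifteen = parse_loadavg(f.read())
    return "load average: %.2f %.2f %.2f" % (one, five, fifteen)
```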

Neil
Re: Stress testing of beta.wikipedia.com
I've noticed that beta.wikipedia.com does not have a robots.txt file.
Yes, I know that most recent robots will read the robots meta tag in the
page, but I'm willing to bet that some of the dimmer or older ones don't.

Should we have one?
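If the aim is simply to keep crawlers away while the beta is being hammered, the bluntest possible robots.txt would do (this is a suggestion, not what the site currently serves):

```
User-agent: *
Disallow: /
```

Compliant robots would then stay off the site entirely until we loosen it.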

Also, there's no favicon.ico. I enclose a file Walone2.ico which should
work if renamed to favicon.ico and placed at
http://beta.wikipedia.com/favicon.ico

Neil