PerformanceMeters: Test Harness for XQuery

Michael Blakeley
Last updated 2010-08-25

PerformanceMeters is a Java-based, XML-driven tool for testing XQuery with MarkLogic Server, whether for QA-style functional testing or for performance measurement.

If you need a development or QA test tool, why not use JUnit or XmlUnit? Both of those tools require programming; PerformanceMeters is configured entirely via properties and XML, so no programming is needed.

If you need a performance test tool, why not use JMeter, LoadRunner, or another general-purpose performance tool? For some users, it's enough to note that LoadRunner costs money, while PerformanceMeters is free. Cost aside, out of the box LoadRunner is only useful for HTTP testing: testing XDBC servers (using XCC or XDBC) would require development work. That said, LoadRunner is still the right answer for some applications, since it supports web browser simulation and much more sophisticated test scenarios than PerformanceMeters does.

Setting up PerformanceMeters

PerformanceMeters is a Java program, and requires a Java 5 runtime environment. If you don't have Java 5, download and install it.

Next, we'll need PerformanceMeters itself: simply download performance-meters.jar.

Since PerformanceMeters connects to a MarkLogic Server, you'll need the MarkLogic XCC libraries, too. You can download the XCC Java zip archive here. After you download the zip archive, unpack it and find the jar files. You can put xcc.jar anywhere on your disk, but for this tutorial I'll assume that both jar files (performance-meters.jar and xcc.jar) are in the current directory.

Running PerformanceMeters

Now that we have a Java environment and all the libraries we need, let's try it out with the simplest possible invocation.

$ java -cp performance-meters.jar:xcc.jar \
    com.marklogic.performance.PerformanceMeters

Java command lines can get pretty ugly. I'll keep using them for this tutorial, but you might want to put all of that into a shell script (or a batch file, on Windows). Remember, I put all my jar files in the current directory. You might have used another location: if so, just change the classpath in your command-line to match.
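For example, a minimal wrapper script along the lines suggested above. The name pm.sh is my own choice, and the jar locations assume the current directory, as in this tutorial:

```shell
# Write a small wrapper script so we don't have to retype the classpath.
# Any -Dkey=value properties passed to the script are forwarded to java.
cat > pm.sh <<'EOF'
#!/bin/sh
CP=performance-meters.jar:xcc.jar
exec java -cp "$CP" "$@" com.marklogic.performance.PerformanceMeters
EOF
chmod +x pm.sh
```

After that, a run looks like: ./pm.sh -DinputPath=../src/tests/simple.xml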

OK, so we copied that command-line and pasted it into a shell. What happened?

$ java -cp performance-meters.jar:xcc.jar \
    com.marklogic.performance.PerformanceMeters
Exception in thread "main" java.lang.NoClassDefFoundError: com/marklogic/xcc/Request
        at com.marklogic.performance.PerformanceMeters.main(

Oh, that doesn't look good. Hmm... looks like we forgot to put xcc.jar into the current directory. We'll do that, and try again...

$ java -cp performance-meters.jar:xcc.jar \
    com.marklogic.performance.PerformanceMeters
Tue Jul 11 10:44:50 PDT 2006: com.marklogic.performance.PerformanceMeters starting, version 2006-07-11.2
Tue Jul 11 10:44:50 PDT 2006: configuration: -Dhost=localhost -Dport=8003 -Duser=admin -Dpassword=admin -DinputPath=null -DoutputPath= -DnumThreads=1 -Dshared=false -DreportTime=true -DrecordResults=false -DtestType=XCC
missing required configuration parameter: inputPath
        at com.marklogic.performance.XMLFileTestList.initialize(
        at com.marklogic.performance.PerformanceMeters.initializeTests(
        at com.marklogic.performance.PerformanceMeters.main(

Now we're getting somewhere: PerformanceMeters started up, but exited immediately because it didn't see a required part of its configuration. Let's learn more about that configuration.

Configuring PerformanceMeters

PerformanceMeters is configured via properties. In Java, properties are simply name-value pairs, such as SIZE=1 or PATH=/lib/java. You can find a list of all the properties that PerformanceMeters understands in the README.

PerformanceMeters already gave us some clues about the properties we need to set, though. The configuration line in the output above shows the defaults for a handful of important properties.

According to the error message, we have to set inputPath to proceed. It should be set to the filesystem path to an XML file. That XML file will describe the tests that we want to run: we can start with an example from the PerformanceMeters source code.

The following test configuration XML comes from simple.xml:
<h:script xmlns:h=""> <h:test> <h:name>simple test 1</h:name> <h:comment-expected-result>hello world</h:comment-expected-result> <h:query>"hello world"</h:query> </h:test> <h:test> <h:name>simple test 2</h:name> <h:comment-expected-result>2</h:comment-expected-result> <h:query>1 + 1</h:query> </h:test> <h:test> <h:name>simple test 3</h:name> <h:comment-expected-result>0</h:comment-expected-result> <h:query>xdmp:estimate(doc())</h:query> </h:test> </h:script>

This is fairly easy to read: the entire document describes a script, or test scenario, which is made of individual tests. Each test has a name and a query: the name can be any text, while the query must be a valid XQuery expression. When PerformanceMeters sees this script, it will run each test individually.

How will each test run? That's where the host, port, user, and password come in. If those properties aren't set, PerformanceMeters will use the default values: that is, it will try to submit each test to an XDBC server on localhost:8003. If there is no XDBC server on localhost:8003, or if the default user and password aren't valid, you'll see another error.

$ java -cp performance-meters.jar:xcc.jar \
    -DinputPath=../src/tests/simple.xml \
    com.marklogic.performance.PerformanceMeters
Tue Jul 11 10:49:18 PDT 2006: com.marklogic.performance.PerformanceMeters starting, version 2006-07-11.2
Tue Jul 11 10:49:18 PDT 2006: creating 1 threads...
Tue Jul 11 10:49:18 PDT 2006: starting...
2006-07-11 10:49:18.767 WARNING [10] (AbstractRequestController.runRequest): Cannot obtain connection: Connection refused
Error running query: "hello world"
com.marklogic.xcc.exceptions.ServerConnectionException: Connection refused [Session: user=admin, cb={default} [ContentSource: user=admin, cb={none} [provider: address=localhost/, pool=0/64]]]

Sure enough, I don't have an XDBC server on port 8003. But I do have one on port 9000: let's tell PerformanceMeters to use it. If you don't have an XDBC server, you'll need to configure one: it can be on any port, as long as you set the property to match.
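The property to change is port. For reference, here is a sketch of the full command line this run implies; I build it in a variable and echo it, so the sketch stands alone without a live server (paste the command itself to actually run it; 9000 is my port, yours may differ):

```shell
# The previous invocation, with -Dport overriding the default of 8003.
# Echoed rather than executed, since it needs a live MarkLogic server.
CMD="java -cp performance-meters.jar:xcc.jar \
  -DinputPath=../src/tests/simple.xml \
  -Dport=9000 \
  com.marklogic.performance.PerformanceMeters"
echo "$CMD"
```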

Tue Jul 11 10:50:18 PDT 2006: com.marklogic.performance.PerformanceMeters starting, version 2006-07-11.2
Tue Jul 11 10:50:18 PDT 2006: creating 1 threads...
Tue Jul 11 10:50:18 PDT 2006: starting...
Tue Jul 11 10:50:18 PDT 2006: Reporting results...
Tue Jul 11 10:50:18 PDT 2006: Writing results to com.marklogic.performance.PerformanceMeters-1152640218939.xml
Completed 3 tests in 227 milliseconds, with 0 errors.
Response times (min/max/avg): 9/108/42 ms
Bytes (sent/received): 38/13 B
Tests per second: 13.22
Average throughput: 224.00 B/s

Hey, it worked! We can see that all three tests ran successfully, and there were no errors. PerformanceMeters also wrote the results to an XML file in the current directory (we could change the location using the outputPath property).

$ cat com.marklogic.performance.PerformanceMeters-1152640218939.xml
<h:results xmlns:h="">
  <h:number-of-tests>3</h:number-of-tests>
  <h:number-of-errors>0</h:number-of-errors>
  <h:number-of-threads>1</h:number-of-threads>
  <h:minimum-ms>42</h:minimum-ms>
  <h:maximum-ms>74</h:maximum-ms>
  <h:average-ms>63</h:average-ms>
  <h:total-ms>189</h:total-ms>
  <h:test-duration>262</h:test-duration>
  <h:total-bytes-sent>38</h:total-bytes-sent>
  <h:total-bytes-received>13</h:total-bytes-received>
  <h:tests-per-second>11.450381679389313</h:tests-per-second>
  <h:bytes-per-second>194.0</h:bytes-per-second>
  <h:result>
    <h:name>simple test 1</h:name>
    <h:comment-expected-result>hello world</h:comment-expected-result>
    <h:result-text/>
    <h:got-error>false</h:got-error>
    <h:start-millis>1151438357863</h:start-millis>
    <h:end-millis>1151438357937</h:end-millis>
    <h:bytes-sent>13</h:bytes-sent>
    <h:bytes-received>11</h:bytes-received>
    <h:thread>0</h:thread>
  </h:result>
  <h:result>
    <h:name>simple test 2</h:name>
    <h:comment-expected-result>2</h:comment-expected-result>
    <h:result-text/>
    <h:got-error>false</h:got-error>
    <h:start-millis>1151438357941</h:start-millis>
    <h:end-millis>1151438357983</h:end-millis>
    <h:bytes-sent>5</h:bytes-sent>
    <h:bytes-received>1</h:bytes-received>
    <h:thread>0</h:thread>
  </h:result>
  <h:result>
    <h:name>simple test 3</h:name>
    <h:comment-expected-result>0</h:comment-expected-result>
    <h:result-text/>
    <h:got-error>false</h:got-error>
    <h:start-millis>1151438357987</h:start-millis>
    <h:end-millis>1151438358060</h:end-millis>
    <h:bytes-sent>20</h:bytes-sent>
    <h:bytes-received>1</h:bytes-received>
    <h:thread>0</h:thread>
  </h:result>
</h:results>

Here we see much the same information as above: a summary of the number of tests, the number of errors, and some overall performance metrics. In addition, we can see a result element for every test that was run, showing its timing information and other useful data.

Now that we have a basic working test scenario, let's look at other ways to customize it.

After a while, you'll get tired of typing all these properties. This is an easy problem to solve: instead of using the -Dkey=value technique, just put your properties into a file. This also makes it very easy to re-run the same test scenarios.
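For instance, the scenario we have built up so far could live in a file like this. The file name simple.properties is my own choice; the property names are the same ones we have been passing with -D:

```shell
# Capture the tutorial's settings in a Java properties file (key=value pairs).
cat > simple.properties <<'EOF'
host=localhost
port=9000
user=admin
password=admin
inputPath=../src/tests/simple.xml
numThreads=1
EOF
# Then run with the file as the command-line argument:
#   java -cp performance-meters.jar:xcc.jar \
#     com.marklogic.performance.PerformanceMeters simple.properties
```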

$ java -cp performance-meters.jar:xcc.jar \
    com.marklogic.performance.PerformanceMeters \
    tests/

Customizing the Output

The results above were in XML. This is great if you want XML, but if you plan to import the results into a spreadsheet, you might prefer comma-separated values (CSV). PerformanceMeters supports arbitrary output plug-ins via the Reporter interface, but you don't have to write one yourself: just set reporter=CSVReporter and run the test again.

$ cat com.marklogic.performance.PerformanceMeters-1151439192585.csv
number-of-tests,number-of-errors,number-of-threads,minimum-ms,maximum-ms,average-ms,total-ms,test-duration,total-bytes-sent,total-bytes-received,tests-per-second,bytes-per-second
3,0,1,41,78,63,189,251,38,13,11.952191235059761,203.0
name,comment-expected-result,result-text,got-error,start-millis,end-millis,bytes-sent,bytes-received
"simple test 1","hello world",,false,1151439192388,1151439192458,13,11
"simple test 2",2,,false,1151439192462,1151439192503,5,1
"simple test 3",0,,false,1151439192506,1151439192584,20,1

This time, the output file name ends with ".csv", and we can see that the output is suitable for OpenOffice Calc, Microsoft Excel, or other consumers of CSV files.

What if we want to record the actual results of every test? That's easy: just set recordResults=true in your test properties.

By default, PerformanceMeters will only report errors if the server refused the connection, or threw some exception. We can check every expected-result element in our test scenario by setting checkResults=true (for this to work, you must also set recordResults=true).

In many test situations, it's useful to report more timing information than just minimum, maximum, and average. Performance requirements are often expressed in terms of 95th-percentile response times, or 98th-percentile response times. PerformanceMeters supports this via an optional property: set reportPercentileDuration=95 or reportPercentileDuration=98 in your test properties.
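To make the statistic concrete, here is a rough sketch of how a 95th-percentile figure falls out of a list of response times. The numbers are invented, and the floor-based rank is my simplification; PerformanceMeters computes the percentile itself:

```shell
# Simple 95th-percentile estimate (rank = floor of N x 0.95) over a
# made-up list of per-test response times, in milliseconds.
times="9 12 15 20 22 30 41 55 79 170"
p95=$(printf '%s\n' $times | sort -n |
  awk '{ a[NR] = $1 } END { i = int(NR * 0.95); if (i < 1) i = 1; print a[i] }')
echo "95th percentile: $p95 ms"
```

In other words: sort the samples, then read off the value near the 95% mark, so one slow outlier (the 170 here) doesn't dominate the report the way the maximum does.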

Completed 3 tests in 254 milliseconds, with 0 errors.
Response times (min/max/avg): 41/79/63 ms
Response time (95th percentile): 79

Of course, the 95th percentile of three test iterations isn't very interesting. Let's look at more configuration options, this time aimed at customizing test execution.

Customizing Test Execution

By default, PerformanceMeters runs every test in the scenario, and runs it only once. We can change this behavior to run a timed test, instead: setting testTime=60 will run a 60-second test. Now our 95th-percentile response times are more interesting:

Response times (min/max/avg): 1/170/4 ms
Response time (95th percentile): 10

We can also tell PerformanceMeters to shuffle the tests, instead of running them sequentially, by setting the isRandomTest=true property. When running random tests, it's sometimes useful to repeat the same sequence of pseudo-random numbers. You can arrange for this to happen by setting randomSeed to the same value every time.

What if you want to test multi-threaded performance? That's easy, too: set the numThreads property to the number of threads you'd like to run.

You can also use PerformanceMeters to test with the older XDBC API, to test HTTP servers, or to perform simple document fetches. To do this, set the testType property.

Helpful Hints

When writing your test queries, remember that you are writing XQuery expressions as XML text. This is usually fine, but sometimes you'll need to escape certain expressions. For example, suppose we want to change our "hello world" test, above, to output a p element with the "hello world" message. Our first attempt doesn't work:

<h:test>
  <h:name>simple test 1</h:name>
  <h:comment-expected-result><p>hello world</p></h:comment-expected-result>
  <h:query><p>hello world</p></h:query>
</h:test>

When we run this test, PerformanceMeters returns an error:

Error running query:
com.marklogic.xcc.exceptions.XQueryException: XDMP-UNEXPECTED: Unexpected token syntax error, unexpected $end

It looks like we need some escaping. The query needs to look like flat text to PerformanceMeters. We could escape every < and > in the p elements, but it's much cleaner to use a CDATA section instead.

<h:test>
  <h:name>simple test 1</h:name>
  <h:comment-expected-result><![CDATA[<p>hello world</p>]]></h:comment-expected-result>
  <h:query><![CDATA[<p>hello world</p>]]></h:query>
</h:test>

Now our test runs successfully.


PerformanceMeters provides a rich framework for automated testing of XQuery, whether for QA or performance measurement. We hope it finds a place in your toolbox.