...
- First, generate a delimited file for import into Excel.
I use Perl to parse the XML logs and generate a delimited file for import into Excel. The heart of the Perl script is the regular expression below; each line in the JMeter logfile is matched against this expression:
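The Perl source itself is not reproduced in this excerpt. As a rough sketch of the same idea (shown in Python purely for illustration, with a hypothetical sample line), each log line is matched for attribute="value" pairs and re-emitted as a delimited row:

```python
import re

# Hypothetical JTL sample line (attribute names follow the JMeter JTL format)
line = '<sample t="315" lb="queryBalance" ts="1202866549124" s="true"/>'

# Capture every attribute="value" pair in one pass
pairs = dict(re.findall(r'\s([^\s]*?)="(.*?)"', line))

# Emit a tab-delimited row suitable for pasting into Excel
excel_row = '\t'.join([pairs['ts'], pairs['lb'], pairs['t'], pairs['s']])
print(excel_row)
```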
...
- Rounding Java timestamps to nearest minute
- Collect timestamps grouped by minute
- Convert Java timestamp to YYYY-MM-dd etc.
- Print Throughput for a minute increment
- Print Average response time for a minute increment
- Do all of the above in an efficient single pass through awk (this was the hardest bit!)
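The per-minute aggregation steps above can be sketched in Python (the awk script itself does this in a single pass; the sample data and field layout here are illustrative):

```python
import datetime
from collections import defaultdict

# (java_timestamp_ms, elapsed_ms) pairs - illustrative data, not real results
samples = [
    (1202866549124, 315),
    (1202866551124, 250),
    (1202866609124, 400),
]

# Collect elapsed times grouped by minute
buckets = defaultdict(list)
for ts_ms, elapsed in samples:
    # Truncate the Java millisecond timestamp to the start of its minute
    minute = (ts_ms // 1000) // 60 * 60
    buckets[minute].append(elapsed)

for minute in sorted(buckets):
    times = buckets[minute]
    # Convert the Unix-second bucket key to YYYY-MM-dd HH:MM
    label = datetime.datetime.fromtimestamp(
        minute, tz=datetime.timezone.utc).strftime('%Y-%m-%d %H:%M')
    throughput = len(times)                # samples per minute
    avg = sum(times) / float(len(times))   # average response time (ms)
    print('%s\t%d\t%.1f' % (label, throughput, avg))
```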
Script: jtlmin.sh.txt
An example session, using jtlmin.sh to process a JTL file. The file produced, queryBalance.jtl.OUT (tab-delimited), can now be used to produce a throughput graph. Response times can also be plotted on a secondary axis, as in the diagram above. These graphs were very good at showing when the integration layer was slow to respond and when throughput deviated from the original JMeter plan.
...
- jtlsummary.sh jtl_csv_result_files...
Extracting JTL files to CSV with Python (JMeter 2.3.x)
...
The script only works with JTL files from JMeter 2.3.x (JTL version 2.1). Please see http://jakarta.apache.org/jmeter/usermanual/listeners.html#xmlformat2.1 for details.
Usage is:
- program.py <JTL input file> <CSV output file> "<regular expression>"
{{{Code Block language py
#!/usr/bin/python
"""
Description : Split JTL file into a comma delimited CSV
by          : Oliver Erlewein (c)2008
Date        : 04.02.2008
Lang        : Python 2.4+

JMeter JTL field contents:

Attribute   Content
by          Bytes
de          Data encoding
dt          Data type
ec          Error count (0 or 1, unless multiple samples are aggregated)
hn          Hostname where the sample was generated
lb          Label
lt          Latency = time to initial response (milliseconds) - not all samplers support this
na          Number of active threads for all thread groups
ng          Number of active threads in this group
rc          Response Code (e.g. 200)
rm          Response Message (e.g. OK)
s           Success flag (true/false)
sc          Sample count (1, unless multiple samples are aggregated)
t           Elapsed time (milliseconds)
tn          Thread Name
ts          timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
"""

import sys
import re
import datetime
import time

startTime = time.time()
cnt = 0
cnt2 = 0
failCnt = 0
reCompile = re.compile("\s([^\s]*?)=\"(.*?)\"")
delimiterCharacterOut = ","

def writeCSVLine(line):
    x = reCompile.findall(line)
    a = dict((row[0], row[1]) for row in x)
    try:
        a['ts'] = str(int(int(a['ts'])/1000))
        x = str(datetime.datetime.fromtimestamp(float(a['ts'])))[0:19]
        b = a['ts'] + ",\"" + x + "\"," + a['t'] + "," + a['lt'] + ",\"" + a['s'] + "\",\"" + a['lb'] + "\"," + a['rc'] + ",\"" + a['rm'] + "\",\"" + a['tn'] + "\",\"" + a['dt'] + "\"," + a['by'] + "\n"
    except:
        return -1
    o.write(b)
    return 1

print "Splitting JTL file"

try:
    runArgv = sys.argv              # Save the command line
    jtlInfile = str(sys.argv[1])    # Name of JTL input file
    cvsOutfile = str(sys.argv[2])   # Name of CSV output file
    reFilter = str(sys.argv[3])     # Filter the labels (lb) for the filter
except:
    print "Error: Input format: <input file> <output file> <Filter by regular expression>"
    raise

try:
    f = open(jtlInfile, "r")
    o = open(cvsOutfile, "w")
except:
    raise

print "Filtering on regular expression : " + reFilter
cmpFilter = re.compile(reFilter)

for line in f:
    try:
        if cmpFilter.search(line):
            returnVal = writeCSVLine(line)
            if returnVal < 0:
                failCnt += 1
            else:
                cnt2 += 1
    except:
        print 'Error in line : ', cnt, line
        raise
    cnt += 1

endTime = time.time()
print "Time taken                   : ", str(endTime - startTime)
print "Lines processed              : ", cnt
print "Lines that passed the filter : ", cnt2
print "Lines skipped (error?)       : ", failCnt

f.close()
o.close()
}}}
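The per-line flow of the script (filter on the third argument, parse the attributes, convert the timestamp, emit a row) can be condensed as follows, shown here in Python 3 syntax; the sample lines and the 'Login' filter are illustrative, not from the original:

```python
import datetime
import re

ATTR_RE = re.compile(r'\s([^\s]*?)="(.*?)"')

def to_csv_row(line):
    """Parse one <httpSample> line into the script's CSV column order."""
    a = dict(ATTR_RE.findall(line))
    ts = int(a['ts']) // 1000                                  # ms -> s
    stamp = str(datetime.datetime.fromtimestamp(ts))[0:19]     # local time, like the original
    return ','.join([str(ts), '"%s"' % stamp, a['t'], a['lt'],
                     '"%s"' % a['s'], '"%s"' % a['lb'], a['rc'],
                     '"%s"' % a['rm'], '"%s"' % a['tn'],
                     '"%s"' % a['dt'], a['by']])

# Hypothetical JTL sample lines; the filter plays the role of sys.argv[3]
lines = [
    '<httpSample t="315" lt="212" ts="1202866549124" s="true" lb="Login" rc="200" rm="OK" tn="TG1-1" dt="text" by="120"/>',
    '<httpSample t="87" lt="60" ts="1202866550124" s="true" lb="GIF" rc="200" rm="OK" tn="TG1-1" dt="bin" by="3050"/>',
]
flt = re.compile('Login')
rows = [to_csv_row(l) for l in lines if flt.search(l)]
```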
...
Generating charts with Perl
...
- insert an 'Aggregate Report' listener into your test plan (top level).
- enter a filename to save the results to.
- invoke 'Configure' and check the boxes as shown.
- Change the names of your requests. For instance, in my example test plan all requests for images were named 'GIF' and all requests for JavaScript files were named 'JS'. Special points of interest also get an appropriate name (e.g. 'Login', 'Address Search', ...). If you do not want particular requests to be accumulated, name them starting with 'garbage' (e.g. garbage_gif, garbage_js, ...). This step is necessary because the script collects data under labels of the same name; it ensures the chart shows the response times for all images, all 'Login' requests, or all 'Address Search' requests (except those starting with 'garbage').
The script
The script jmetergraph.pl requires Chart 2.4.1 to be installed (http://search.cpan.org/~chartgrp/Chart-2.4.1/).
...