The Tamarin performance tests can be used to measure the performance impact of changes made to Tamarin. To run the performance tests on the Android shell, see 'Testing the Android Shell' below. Running the performance tests requires the following steps:
- Set the AVM environment variable to the path of the avmshell executable.
- Set the BUILTINABC environment variable to the path of builtin.abc (in the tamarin-redux/generated directory).
- Set the SHELLABC environment variable to the path of shell_toplevel.abc (in the tamarin-redux/generated directory).
- Set the ASC environment variable to the path of the asc.jar compiler. The latest asc.jar may be downloaded from ftp://ftp.mozilla.org/pub/js/tamarin...latest/asc.jar, or the source for asc.jar may be downloaded and rebuilt from http://opensource.adobe.com/wiki/dis...exsdk/Flex+SDK.
- (Optional) You may compare the performance of the avmshell against another avmshell; for example, download or build the latest checked-in avmshell to compare against a local change. To compare one build against another, set the AVM2 environment variable to the other avmshell or use the "-S --avm2" option. Results from the two shells are shown side by side along with the percent speedup. (A sample environment setup is sketched below.)
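For reference, a typical environment setup might look like the following. The paths are illustrative only; substitute the locations of your own checkout, build output, and asc.jar.

$ export AVM=~/tamarin-redux/objdir/shell/avmshell        # path to your avmshell build (example path)
$ export BUILTINABC=~/tamarin-redux/generated/builtin.abc
$ export SHELLABC=~/tamarin-redux/generated/shell_toplevel.abc
$ export ASC=~/bin/asc.jar                                # wherever you downloaded or built asc.jar
$ export AVM2=~/tamarin-baseline/objdir/shell/avmshell    # optional: a second shell to compare against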
$ cd tamarin-redux/test/performance
$ python runtests.py
Executing tests at 2008-07-22 13:56:54.820920
avm: c:/dev/tamarin-tracing/bld/shell/avmshell.exe

test                                     avm
sunspider/access-binary-trees.as        82.0
sunspider/access-fannkuch.as           152.0
sunspider/access-nbody.as              173.0
sunspider/access-nsieve.as              65.0
sunspider/bitops-3bit-bits-in-byte.as   13.0
sunspider/bitops-bits-in-byte.as        36.0

$ export AVM2=c:/dev/tamarin-tracing2/bld/shell/avmshell.exe
$ python ./runtests.py
Executing tests at 2008-07-22 14:03:51.957381
avm: c:/dev/tamarin-tracing/bld/shell/avmshell.exe
avm2: c:/dev/tamarin-tracing2/bld/shell/avmshell.exe

test                                     avm   avm2    %sp
sunspider/access-binary-trees.as        82.0   80.0    2.5
sunspider/access-fannkuch.as           153.0  155.0   -1.3
sunspider/access-nbody.as              176.0  178.0   -1.1
sunspider/access-nsieve.as              65.0   68.0   -4.4
sunspider/bitops-3bit-bits-in-byte.as   12.0   13.0   -7.7
sunspider/bitops-bits-in-byte.as        36.0   36.0    0.0
By default the sunspider and sunspider-as3 (optimized for AS3) test suites are run. Other suites may be selected with ./runtests.py -c [misc,scimark,jsbench]. For better accuracy, tests may be run multiple times: specifying ./runtests.py -i 10 will run each test 10 times and use the fastest result. If more than 2 iterations are specified, a 95% confidence rating is also calculated; it expresses, as a percentage of the mean, the range within which 95% of the results fell.
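For example, the following invocations illustrate the options above:

$ ./runtests.py -c scimark          # run the scimark suite instead of the default sunspider suites
$ ./runtests.py -i 10 sunspider     # run each sunspider test 10 times and report the fastest result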
Performance Test Options
runtests.py accepts a variety of options; here is the help text, followed by explanations for some of them:
$ ./runtests.py -h
usage: runtests.py [options] [tests]

 -v --verbose       enable additional output
 -E --avm           avmplus command to use
 -a --asc           compiler to use
 -g --globalabc     DEPRECATED but still works - use builtin.abc (used to be location of global.abc)
 -b --builtinabc    location of builtin.abc
 -s --shellabc      location of shell_toplevel.abc
 -x --exclude       comma separated list of directories to skip
 -h --help          display help and exit
 -t --notime        do not generate timestamps (cleaner diffs)
 -f --forcerebuild  force rebuild all test files
 -c --config        sets the config string [default OS-tvm]
 -q --quiet         display minimum output during testrun
 -l --log           also log all output to given logfile
    --summaryonly   only display final summary
    --rebuildtests  rebuild the tests only - do not run against VM
    --showtimes     shows the time for each test
    --ascargs       args to pass to asc on rebuild of test files
    --vmargs        args to pass to vm
    --timeout       max time to run all tests
    --testtimeout   max time to let a test run, in sec (default -1 = never timeout)
    --html          also create an html output file
    --notimecheck   do not recompile .abc if timestamp is older than .as
    --java          location of java executable (default=java)
    --javaargs      arguments to pass to java
    --random        run tests in random order
    --seed          explicitly specify random seed for --random
 -S --avm2          second avmplus command to use
    --avmname       nickname for avm to use as column header
    --avm2name      nickname for avm2 to use as column header
    --detail        display results in 'old-style' format
    --raw           output all raw test values
 -i --iterations    number of times to repeat test
 -l --log           logs results to a file
 -k --socketlog     logs results to a socket server
 -r --runtime       name of the runtime VM used, including switch info eg. TTVMi (tamarin-tracing interp)
 -m --memory        logs the high water memory mark
    --aotsdk        location of the AOT sdk used to compile tests to standalone executables
    --aotout        where the resulting binaries should be put (defaults to the location of the as file)
    --aotargs       any extra arguments to pass to compile.py
    --vmversion     specify vmversion e.g. 502, use this if cannot be calculated from executable
    --vm2version    specify version of avm2
    --vmargs2       args to pass to avm2, if not specified --vmargs will be used
    --nooptimize    do not optimize files when compiling
    --perfm         parse the perfm results from avm
    --csv=          also output to csv file, filename required
    --csvappend     append to csv file instead of overwriting
    --score         compute and print geometric mean of scores
    --index=        index file to use (must end with .py)
    --saveindex=    save results to given index file name
Option Details
- indexing:
There are two command line flags that control indexing: --index and --saveindex; both require an index filename to be specified after the flag.
"--index index.py" will load index.py and run all of the tests in that index and report results normalized to the index values. A subset of the tests in the index can be run by passing what tests you want to run after specifying the index file.
"--saveindex indexname.py" will run the specified tests and save the values to the give index file. Specifying both --index and --saveindex will merge the values specified in --index to the new saved indexfile.
- cumulative score:
The --score option displays a cumulative score (a geometric mean) for every metric. It is most useful with an index file (or the v8 tests), since running with raw time values will skew the results towards longer running tests.
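For instance, combining an index with a cumulative score (again, baseline.py is an illustrative file name):

$ ./runtests.py --index baseline.py --score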
- csv output:
"--csv filename" will save run results to a csv file that can be analyzed using excel pivottables. Use --csvappend if you want to append the results to the csv file
Support Files
testconfig.txt
This file contains instructions for how the harness should handle performance tests.
metricinfo.py
This file contains a list of metrics that the harness will look for and record. See the file header for details.
Testing the Android Shell
The instructions above apply to running tests on an Android device as well, but with a few differences. You should set the environment variables as described above, except for AVM, which should be:
$ export AVM=$TAMARIN_BUILD_TOP/platform/android/android_shell.sh
where $TAMARIN_BUILD_TOP is your main Tamarin repo folder. Note that AVM needs to point to android_shell.sh, which is a proxy for running the tests on the phone. The /android-public/android-vars.sh script has commented-out examples of the environment settings for running tests on the Android avm shell. To recap the other environment settings (a sample setup follows the list):
- Set the BUILTINABC environment variable to the path of generated/builtin.abc.
- You may build the .abc files by downloading the latest asc.jar. The latest asc.jar can be downloaded from ftp://ftp.mozilla.org/pub/js/tamarin...latest/asc.jar, or the source for asc.jar may be downloaded and rebuilt from http://opensource.adobe.com/wiki/dis...exsdk/Flex+SDK.
- Set the ASC environment variable, as described above.
- Set the SHELLABC environment variable to the path of generated/shell_toplevel.abc.
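For example, assuming the shell and .abc files have been built under $TAMARIN_BUILD_TOP (the paths below are illustrative, not required locations):

$ export TAMARIN_BUILD_TOP=~/tamarin-redux
$ export AVM=$TAMARIN_BUILD_TOP/platform/android/android_shell.sh
$ export BUILTINABC=$TAMARIN_BUILD_TOP/generated/builtin.abc
$ export SHELLABC=$TAMARIN_BUILD_TOP/generated/shell_toplevel.abc
$ export ASC=~/bin/asc.jar        # wherever you downloaded or built asc.jar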
adb
'adb' stands for Android Debug Bridge. adb is used to access your Android device and can run both direct and shell-type commands. If you've correctly edited and run the android-vars.sh script mentioned on the Tamarin Build Documentation page, the adb executable from the public SDK/NDK should already be on your PATH; if not, it lives at /android-public/android-sdk-mac_86/platform-tools.
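If adb is not already on your PATH, you can add it manually. The SDK location below is the path mentioned above; adjust it for your own install:

$ export PATH=$PATH:/android-public/android-sdk-mac_86/platform-tools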
adb supports ssh-like commands to an Android device and automatically connects to a phone attached over USB. Running adb shell "ls /" is a good test of whether you are connected. The command:
$ adb devices
lists the device ID of each attached phone. If this fails, check that a development OS is installed and that USB Debugging is enabled under Settings->Applications->Development.
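As noted above, listing the root of the device file system is another quick sanity check (the exact listing will vary by device):

$ adb shell "ls /"        # prints the root directories of the device if the connection is working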
Testsuite Setup
Example commands:
Root the phone
$ adb root
Copy avmshell to /data/app. The shell may be named differently by the build, but on the phone the executable must be named 'avmshell'.
$ adb push avmshell /data/app/avmshell
$ adb shell chmod 777 /data/app/avmshell
Copy android_runner.sh to /data/app on the phone, if it isn't already there.
$ adb push tamarin-redux/platform/android/android_runner.sh /data/app/android_runner.sh
$ adb shell chmod 777 /data/app/android_runner.sh
Test it out with a simple .abc file, or with no arguments to print usage (a successful run should return EXITCODE=0):
$ tamarin-redux/platform/android/android_shell.sh hello.abc
hello
EXITCODE=0
Running Performance Tests
To run the performance tests on Android you do not need to pass --androidthreads or --threads=1 as you would with the acceptance tests; in fact, doing so will result in a usage error. Just type the following commands after making sure you've set the environment variables correctly.
$ cd tamarin-redux/test/performance
$ ./runtests.py
Note on using an emulator: the emulator appears to support hardware floating point, but running avmshell -Darm_vfp gives a BusError. The emulator is also unreliable, frequently losing its adb connection. An emulator is fine for testing a few abc files, but for consistent results a real device is better.