Chromium Code Reviews

Side by Side Diff: gm/tests/run.sh

Issue 15014011: GM: include filename extension (.png) of each output file in JSON summary (Closed) Base URL: http://skia.googlecode.com/svn/trunk/
Patch Set: Created 7 years, 7 months ago
1 #!/bin/bash
2
3 # Self-tests for gm, based on tools/tests/run.sh
4 #
5 # These tests are run by the Skia_PerCommit_House_Keeping bot at every commit,
6 # so make sure that they still pass when you make changes to gm!
7 #
8 # To generate new baselines when gm behavior changes, run gm/tests/rebaseline.sh
9 #
10 # TODO: because this is written as a shell script (instead of, say, Python)
11 # it only runs on Linux and Mac.
12 # See https://code.google.com/p/skia/issues/detail?id=677
13 # ('make tools/tests/run.sh work cross-platform')
14 # Ideally, these tests should pass on all development platforms...
15 # otherwise, how can developers be expected to test them before committing a
16 # change?
17
18 # cd into .../trunk so all the paths will work
19 cd $(dirname $0)/../..
20
21 # TODO(epoger): make it look in Release and/or Debug
22 GM_BINARY=out/Debug/gm
23
24 # If WRITE_IMAGE_FILES is nonzero, then the self-test will pass --writePath
25 # and --mismatchPath arguments to GM. Currently, for various reasons, we
26 # cannot run these arguments on the production buildbots, so this should
27 # only be set to nonzero for local testing.
28 - WRITE_IMAGE_FILES=0
28 + WRITE_IMAGE_FILES=1
epoger 2013/05/08 19:39:48 First patchset turns on WRITE_IMAGE_FILES in the g
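The comment at lines 24-27 says that when WRITE_IMAGE_FILES is nonzero, the self-test passes --writePath and --mismatchPath arguments to GM. A minimal sketch of that gating, assuming a hypothetical `build_flags` helper (the real script assembles its GM arguments elsewhere; only the variable and flag names come from the script above):

```shell
#!/bin/bash
# Hypothetical sketch: append --writePath/--mismatchPath only when
# WRITE_IMAGE_FILES is nonzero. build_flags is an invented helper,
# not a function from the actual run.sh.
WRITE_IMAGE_FILES=1

build_flags() {
  local dir="$1"
  local flags=""
  if [ "$WRITE_IMAGE_FILES" != "0" ]; then
    flags="--writePath $dir/writePath --mismatchPath $dir/mismatchPath"
  fi
  echo "$flags"
}

build_flags /tmp/gm-out
# prints: --writePath /tmp/gm-out/writePath --mismatchPath /tmp/gm-out/mismatchPath
```

With WRITE_IMAGE_FILES=0 the helper emits nothing, so the buildbots never see the image-writing flags.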
29
30 OUTPUT_ACTUAL_SUBDIR=output-actual
31 OUTPUT_EXPECTED_SUBDIR=output-expected
32 CONFIGS="--config 8888 565"
33
34 ENCOUNTERED_ANY_ERRORS=0
35
36 # Compare contents of all files within directories $1 and $2,
37 # EXCEPT for any dotfiles.
38 # If there are any differences, a description is written to stdout and
(...skipping 155 matching lines...)
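The comment at lines 36-37 describes comparing two directories while ignoring dotfiles. A sketch of one way to do that, assuming `diff`'s recursive mode and its `--exclude` basename pattern (the `compare_directories` name is hypothetical; the script's actual comparison function is in the elided lines):

```shell
#!/bin/bash
# Hypothetical sketch: recursively diff two directories, skipping any
# file whose basename starts with a dot. diff exits nonzero if the
# directories differ, so the function's status doubles as the verdict.
compare_directories() {
  diff --recursive --exclude='.*' "$1" "$2"
}

A=$(mktemp -d)
B=$(mktemp -d)
echo same > "$A/file.txt"
echo same > "$B/file.txt"
echo hidden > "$A/.dotfile"   # present only in A, but excluded from the diff

if compare_directories "$A" "$B"; then
  echo "directories match"
fi
# prints: directories match
```

Because `--exclude` matches basenames at every level of the recursion, dotfiles are ignored in subdirectories too.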
194
195 # Ignore some error types (including ExpectationsMismatch)
196 gm_test "--ignoreErrorTypes ExpectationsMismatch NoGpuContext --verbose --hierarchy --match selftest1 $CONFIGS -r $GM_INPUTS/json/different-pixels.json" "$GM_OUTPUTS/ignore-expectations-mismatch"
197
198 if [ $ENCOUNTERED_ANY_ERRORS == 0 ]; then
199   echo "All tests passed."
200   exit 0
201 else
202   exit 1
203 fi
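The closing block at lines 198-203 shows the script's accumulate-then-report pattern: each test records failure in ENCOUNTERED_ANY_ERRORS rather than aborting, so every self-test runs before the final verdict. A sketch of that pattern (`run_case` is an invented stand-in for the script's gm_test machinery):

```shell
#!/bin/bash
# Sketch of the accumulate-then-report pattern: failures set a flag
# instead of exiting immediately, so all test cases get a chance to run.
ENCOUNTERED_ANY_ERRORS=0

run_case() {
  # run_case is a hypothetical helper; "$@" stands in for one gm_test call
  "$@" || ENCOUNTERED_ANY_ERRORS=1
}

run_case true
run_case false   # fails, but the script keeps going
run_case true

if [ $ENCOUNTERED_ANY_ERRORS == 0 ]; then
  echo "All tests passed."
else
  echo "Some tests failed."
fi
# prints: Some tests failed.
```

The exit-0/exit-1 split at the end of the real script is what lets the Skia_PerCommit_House_Keeping bot turn red on any failure.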