Chromium Code Reviews

Created: 7 years, 8 months ago by sullivan
Modified: 7 years, 8 months ago
Reviewers: tonyg
CC: chromium-reviews, chrome-speed-team+watch_google.com, rcrabb
Base URL: svn://svn.chromium.org/chrome/trunk/src
Visibility: Public.
Description

Add trace-info.json, which contains information about perf traces. Also update test-info.json to use markdown for links.
BUG=
Committed: https://src.chromium.org/viewvc/chrome?view=rev&revision=195284
Patch Set 1
Total comments: 25
Patch Set 2: Addressed review comments
Messages
Total messages: 7 (0 generated)
lgtm

Some nits to fix before landing.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json
File tools/perf/trace-info.json (right):

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:252: "description": ""
The peak Virtual Memory Size (address space allocated) usage achieved by the browser process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:255: "description": ""
The peak Virtual Memory Size (address space allocated) usage achieved by the renderer process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:279: "description": ""
The peak Resident Set Size (physically resident memory) usage achieved by the browser process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:282: "description": ""
The peak Resident Set Size (physically resident memory) usage achieved by the renderer process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:285: "description": ""
The peak Working Set Size usage achieved by the browser process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:288: "description": ""
The peak Working Set Size usage achieved by the renderer process.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:293: "other_byte_b": {
other_byte_* are gone now.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:318: "description": "Number of IO write operations by the renderer process"
s/write/read/

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:333: "description": ""
Total external memory fragmentation after each GC in percent.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:357: "description": ""
The total size of committed memory used by V8 after each GC in KB.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:381: "description": ""
The total size of live memory used by V8 after each GC in KB.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:406: "description": ""
Average frames per second as measured by the platform's SurfaceFlinger.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:415: "description": ""
I'd ping nduca for the missing scrolling_benchmark descriptions. No need to hold up this patch though.
Thanks for the comments! Updated and submitting.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json
File tools/perf/trace-info.json (right):

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:252: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Virtual Memory Size (address space allocated) usage achieved by the
> browser process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:255: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Virtual Memory Size (address space allocated) usage achieved by the
> renderer process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:279: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Resident Set Size (physically resident memory) usage achieved by the
> browser process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:282: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Resident Set Size (physically resident memory) usage achieved by the
> renderer process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:285: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Working Set Size usage achieved by the browser process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:288: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The peak Working Set Size usage achieved by the renderer process.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:293: "other_byte_b": {
On 2013/04/19 18:45:18, tonyg wrote:
> other_byte_* are gone now.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:318: "description": "Number of IO write operations by the renderer process"
On 2013/04/19 18:45:18, tonyg wrote:
> s/write/read/
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:333: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> Total external memory fragmentation after each GC in percent.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:357: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The total size of committed memory used by V8 after each GC in KB.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:381: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> The total size of live memory used by V8 after each GC in KB.
Done.

https://codereview.chromium.org/14172018/diff/1/tools/perf/trace-info.json#ne...
tools/perf/trace-info.json:406: "description": ""
On 2013/04/19 18:45:18, tonyg wrote:
> Average frames per second as measured by the platform's SurfaceFlinger.
Done.
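The nits above all amount to filling in empty `description` fields. The schema of trace-info.json is not reproduced in this review, but the quoted lines suggest each entry maps a trace name to an object with a `description` string. A minimal sketch, assuming that shape (the trace names below are hypothetical, since the review only shows line numbers, not keys; the descriptions are quoted from the comments above):

```python
import json

# Sketch of filled-in trace-info.json entries. The key names
# ("vm_peak_b", "processes_io_readop_r") are made up for illustration;
# only the name -> {"description": ...} shape is implied by the review.
trace_info = {
    "vm_peak_b": {
        "description": "The peak Virtual Memory Size (address space "
                       "allocated) usage achieved by the browser process."
    },
    "processes_io_readop_r": {
        "description": "Number of IO read operations by the renderer process"
    },
}

# Descriptions are plain strings, so the file round-trips through JSON cleanly.
serialized = json.dumps(trace_info, indent=2, sort_keys=True)
parsed = json.loads(serialized)
```

Keeping the descriptions as flat strings (rather than markup) is what lets the dashboard render them directly next to a regressing trace.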
CQ is trying da patch. Follow status at https://chromium-status.appspot.com/cq/sullivan@chromium.org/14172018/5001
Message was sent while issue was closed.
Committed patchset #2 manually as r195284 (presubmit successful).
Message was sent while issue was closed.
I'm a bit confused about two things with this:

1. How do we ensure that checkins to perf/ update this file?
2. How does an entry in this file relate to the python file?

In general, is there a way we can unify this more closely with the benchmark system so there's less "these two files have to be kept in sync with one another"?
Message was sent while issue was closed.
On 2013/04/22 22:36:56, nduca wrote:
> I'm a bit confused about two things with this:
> 1. how do we ensure that checkins to perf/ update this file?

I think in general it's pretty difficult to ensure that people update documentation when they make changes. Test suites which don't have a description in test-info.json are hidden from the menus, so that is an incentive to add documentation to tests (we don't hide traces, though). One nice thing about the separate file is that it's easier to update the documentation in one pass, which is what seems to happen most often for high-level documentation like these files contain.

> 2. how does an entry in this file relate to the python file?

An entry in test-info.json is the name of a test suite (the "test" part of "system/test/graph/trace"), for example "sunspider" or "scrolling_benchmark". An entry in trace-info.json is the name of a trace (for example, "V8.MemoryHeapSampleTotalCommitted"). The files contain high-level descriptions of the test suites and of the individual traces. The dashboard will show these descriptions so that it's easy to understand what a test is when it regresses. For test suites, test-info.json also lists relevant source code files, and in the future it will list command lines to run the tests.

I'm not really sure what you mean by "the python file". Do you mean page_cycler.py, robohornetpro.py, etc.? The json files contain high-level information about the test suites we are running. For a lot of the page cycler tests, the python file doesn't really have any sense of what data is in the test suite or why we picked that data. For example, page_cycler.py doesn't know that our new page cyclers will be broken up into fewer languages per suite than the old intl tests, so that we don't over-exercise the font cache.

> In general, is there a way we can unify this more closely with the benchmark
> system so there's less "these two files have to be kept in sync with one
> another?"
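The "system/test/graph/trace" naming above can be made concrete with a small sketch. The four-level path format follows the description in this message; the helper and the "linux-release" system name are purely illustrative, not actual dashboard code:

```python
def split_perf_path(path):
    """Split a dashboard-style "system/test/graph/trace" path into its
    components. The four-level format is taken from the description above;
    real dashboard paths may differ, and this helper is illustrative only."""
    system, test, graph, trace = path.split("/")
    return {"system": system, "test": test, "graph": graph, "trace": trace}

# "linux-release" is a made-up system name for illustration.
parts = split_perf_path("linux-release/sunspider/Total/Total")
# test-info.json would be keyed on the "test" component ("sunspider");
# trace-info.json would be keyed on the "trace" component.
```

This is why the two json files stay separate: they document different levels of the same path.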
The usual approach to doing this is to use the RESULT lines the test spits out: the code on the buildbots processes those lines, parses out things like which tests are "important" and what the units are, and sends them up with the test results. We didn't think this worked as well for high-level documentation, for a few reasons:

* Sending up a full description of the test suite and each trace with every single run creates a lot of extra data to upload and process, especially when you consider that tests like moz have over 200 traces. I couldn't think of a good way to have the buildbot "know" it doesn't need to send the documentation on a given run.
* It seems like parsing buildbot RESULT lines for free-form text like descriptions could get ugly.
* For page cyclers, where the code is the same but the test data is different, where does the description come from?

There's also the possibility of having some special format for comments in the tests with the description. But then we would need to make up a special format and parse it correctly, and also have some kind of glue code that tells the dashboard where to find the test source code to parse for the comments. So we used an approach like UMA histograms, where the high-level descriptions are all in one file.

That said, these files are very new, so if you have a better idea about how to communicate high-level descriptions of the test suites and traces to the dashboard, we could definitely switch. It sounds like I am missing a simpler approach.
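To see why free-form descriptions fit RESULT lines poorly, here is a rough sketch of parsing one. The `RESULT graph: trace= value units` shape is inferred from the discussion above; the regex and the example line are assumptions for illustration, not the buildbots' actual parser:

```python
import re

# Rough sketch of parsing a buildbot-style line of the form
#   RESULT graph: trace= value units
# The format and regex here are assumptions based on the discussion above,
# not the real buildbot parsing code.
RESULT_RE = re.compile(
    r"^RESULT (?P<graph>[^:]+): (?P<trace>[^=]+)= "
    r"(?P<value>[\d.]+) (?P<units>\S+)$")

def parse_result_line(line):
    m = RESULT_RE.match(line)
    if not m:
        return None
    d = m.groupdict()
    d["value"] = float(d["value"])
    return d

# A made-up example line for illustration.
r = parse_result_line("RESULT times: t= 123.4 ms")
```

A single numeric value fits this one-line format cleanly; a multi-sentence prose description would not, which is exactly the second bullet's concern and part of why the descriptions live in a separate json file instead.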
