Chromium Code Reviews

Issue 2711623002: Add a TBMv2 webrtc_rendering_metric. (Closed)
Created: 3 years, 10 months ago by ehmaldonado_chromium
Modified: 3 years, 9 months ago
CC: catapult-reviews_chromium.org, tracing-review_chromium.org, jasontiller
Target Ref: refs/heads/master
Project: catapult
Visibility: Public

Description

Add a TBMv2 webrtc_rendering_metric.
This CL migrates the TBMv1 version of this metric [1], [2] to TBMv2.
The design doc for the TBMv2 version of the metric is available at [3].
[1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry/web_perf/metrics/webrtc_rendering_timeline.py
[2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry/web_perf/metrics/webrtc_rendering_stats.py
[3] https://docs.google.com/document/d/1TX0NnT_xxvnRNDkw6P2kxunXyK_I8OChkLb4jehXR9I/edit
BUG=chromium:632295
Review-Url: https://codereview.chromium.org/2711623002
Committed: https://chromium.googlesource.com/external/github.com/catapult-project/catapult/+/2ae07fc28bb3b248baa71647d5cfcefd67e57a84
   Patch Set 1 #
      Total comments: 52
      
     Patch Set 2 : Address some comments. #
      Total comments: 30
      
     Patch Set 3 : Add runLengthEncoding and addressed some comments. #
     Patch Set 4 : More comments. #
     Patch Set 5 : Updated comments. #
      Total comments: 8
      
     Patch Set 6 : Addressed comments. Started writing tests. #
      Total comments: 5
      
     Patch Set 7 : Tests. #
     Patch Set 8 : Add more tests. #
     Patch Set 9 : Addressed comments. #
      Total comments: 15
      
     Patch Set 10 : Addressed some comments. #
      Total comments: 2
      
     Patch Set 11 : New test for driftTime #
      Total comments: 4
      
     Patch Set 12 : Improve the tests. #
      Total comments: 2
      
     Patch Set 13 : Addressed comments. #
     Patch Set 14 : Added histograms for raw data. #
      Total comments: 6
      
     Patch Set 15 : Fixed percentages. #
      Total comments: 2
      
     Patch Set 16 : Use tr.v.Histogram instead of Map for frameDistribution. #
      Total comments: 8
      
     Patch Set 17 : Addressed comments. Report frozen frames count. Fixed small bug. #
      Total comments: 1
      
     Patch Set 18 : Customize summary options. #
      Total comments: 1
      
     
 Messages
    Total messages: 92 (18 generated)
     
 ehmaldonado@chromium.org changed reviewers: + eakuefner@chromium.org, kjellander@chromium.org 
 
 benjhayden@chromium.org changed reviewers: + benjhayden@chromium.org 
 Here are a few style nits and suggestions. Thanks! https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:1: <!DOCTYPE html> Can you write a test like the other metrics? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:18: var DISPLAY_HERTZ = 60.0; Please use const for these constants. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:43: var mandatory = [ Should this be a const up with the other consts? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:46: for (var parameter in mandatory) { It looks like this doesn't do what you think it does. Python has for..in loops, but JS has both for..in and for..of, and for..in doesn't do what you think it does. Please use for..of. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:61: var percentage_biggerIsBetter = Please use const and move these up with the other consts. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:71: var histogram = new tr.v.Histogram(name, unit); You can set histogram.description Also, you might want to call customizeSummaryOptions() to disable the statistics that you don't need. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... 
tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:77: var histogram = new tr.v.Histogram(name, unit); Please use let instead of var throughout this file. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:324: Please delete these blank lines. Files should end in a newline but not a blank line. 
 To answer your other questions: > I don't know how to translate the logging statements [3], JS would say console.log(), but we try not to spam the console. Is it ok to leave them out? > the none_reason [4] You can either fail to produce a Histogram so it will simply be missing from results.html and the dashboard. Or you could produce an empty Histogram (without calling addSample()) so it will read as "empty" on results.html and be missing from the dashboard. You could add a Diagnostic to an empty Histogram, so the information will at least be recorded. You could add an invalid value to a Histogram like 0 or -1 so that the Histogram isn't empty and it will show up on the dashboard, but you'll have to just know that the value is invalid. I might recommend the first option (failing to produce a Histogram) for now and revisit it if you feel like you need to. Does that sound ok? > or the description [5]. You can set histogram.description. 
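Ben's for..in point in the message above is easy to demonstrate with a minimal standalone sketch (the argument names here are illustrative, not the metric's real ones):

```javascript
// for..in iterates over an array's *keys* (the indices, as strings),
// while for..of iterates over its *values*.
const mandatory = ['Ideal Render Instant', 'Actual Render Begin'];

const viaForIn = [];
for (const parameter in mandatory) {
  viaForIn.push(parameter);  // pushes '0', '1' -- the indices
}

const viaForOf = [];
for (const parameter of mandatory) {
  viaForOf.push(parameter);  // pushes the argument names themselves
}
```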
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:54: function webRtcRenderingTimelineMetric(values, model) { nit: I don't think Timeline should be part of the name. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:79: histogram.addSample(scalarValue); As Ben notes, this should also be a for..of loop. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:106: // In this paragraph I will be using regex notation. What is intended by the 1. These, and the similar such comments below, should be docstrings: /** ... **/ 2. Did you write this, or copy it from the TBMv1 metric? I think it's a little too colloquial generally, and normally you would use third person/passive/"royal we" instead of "I". https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:194: var penalty = { const? 
Description was changed; the old and new texts are identical:

==========
Add a TBMv2 webrtc_rendering_timeline metric.
This is a first attempt to port the webrtc_rendering_timeline metric to TBMv2. This [1] is the TBMv1 metric, which uses the stats computed here [2]. I don't know how to translate the logging statements [3], the none_reason [4] or the description [5]. I don't know if they are necessary.
[1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry...
[2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry...
[3] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry...
[4] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry...
[5] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry...
BUG=chromium:632295
==========
 nednguyen@google.com changed reviewers: + nednguyen@google.com 
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:137: // and p=Card(k in C). Tim: you may want to take a look at these math 
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:1: <!DOCTYPE html> On 2017/02/21 21:08:34, benjhayden wrote: > Can you write a test like the other metrics? +1 I assume https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... will be useful. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:31: function isMediaPlayerMSEvent(event) { /** Verify that the event is a WebMediaPlayerMS::UpdateCurrentFrame event. */ https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:106: // In this paragraph I will be using regex notation. What is intended by the On 2017/02/21 22:26:34, eakuefner wrote: > 1. These, and the similar such comments below, should be docstrings: /** ... **/ > 2. Did you write this, or copy it from the TBMv1 metric? I think it's a little > too colloquial generally, and normally you would use third person/passive/"royal > we" instead of "I". I agree first person sounds too colloquial. The text is copy-pasted from https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... so let's just update it a little. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:264: // Get the smoothness stats from the normalized drift time. Convert to JSDoc and document driftTime and the return dict. https://google.github.io/styleguide/javascriptguide.xml?showone=Comments#Comm... Same for the other functions. 
 tdresser@chromium.org changed reviewers: + tdresser@chromium.org 
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:109: // something like [2 3] which means possibly an observed frame persistence Can you give an example of IDEAL_RENDER_INSTANT's that would result in this cadence? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:127: cadence.shift(); This code feels harder to read than necessary, partially because we're pushing stuff to the cadence list that's never read. What about something like: // add empty list check. let cadence = [] let oldIdealRender = events.shift() let framePersistence = 1 for (let event of events) { if (event[IRI] == oIR) { fP++; } else { cadence.push(fP) oIR = event[IRI] } } https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:136: // a distribution B::= let C be the cadence, B[k]=p with k in Unique(C) I'm not sure what the ::= means here. I'm also having a hard time interpreting this comment. I'd use an annotation like: (Double check the jsdoc syntax here) @returns Array.<Number> where a[k] is the number of times we had a cadence of k. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:137: // and p=Card(k in C). What is Card? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:142: frameDistribution[ticks] = ticksSoFar + 1; I'd find this a bit easier to read as: frameDistribution[ticks] = frameDistribution[ticks] + 1 || 1 instead of the above two lines. 
https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:170: // 5 frozen frames will not be counted as frozen. Can we add jsdoc here, including an @returns annotation? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:173: for (var ticks in frameDistribution) { Is there any reason to do this in two loops, instead of one? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:182: occurences: frameDistribution[frozenFrameVsync], occurrences https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:246: VSYNC_DURATION * cadence[index])); I haven't actually parsed this yet, but are we using "discrepancy" in the formal sense here? https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... 
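Tim's cadence sketch above can be tightened into a standalone version that does not mutate the input array and that closes out the final run; `getCadence` and the `idealRenderInstant` field name here are stand-ins for the actual metric code:

```javascript
// Compute the cadence: the run lengths of consecutive events sharing
// the same ideal render instant. E.g. instants [1,1,5,5,5] -> [2,3].
function getCadence(events) {
  const cadence = [];
  if (events.length === 0) return cadence;
  let oldIdealRender = events[0].idealRenderInstant;
  let framePersistence = 1;
  for (const event of events.slice(1)) {
    if (event.idealRenderInstant === oldIdealRender) {
      framePersistence++;
    } else {
      cadence.push(framePersistence);
      oldIdealRender = event.idealRenderInstant;
      framePersistence = 1;
    }
  }
  cadence.push(framePersistence);  // close out the final run
  return cadence;
}
```

Using `slice(1)` instead of `shift()` is what avoids modifying `events`, which is the concern raised in the reply below.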
Is there a way to test getCadence in webrtc_rendering_timeline.html from webrtc_rendering_timeline_test.html? Or should I only test the generated Histograms? 
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:1: <!DOCTYPE html> On 2017/02/22 07:50:42, kjellander_chromium wrote: > On 2017/02/21 21:08:34, benjhayden wrote: > > Can you write a test like the other metrics? > > +1 > I assume > https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... > will be useful. On it. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:18: var DISPLAY_HERTZ = 60.0; On 2017/02/21 21:08:34, benjhayden wrote: > Please use const for these constants. Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:31: function isMediaPlayerMSEvent(event) { On 2017/02/22 07:50:42, kjellander_chromium wrote: > /** Verify that the event is a WebMediaPlayerMS::UpdateCurrentFrame event. */ Acknowledged. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:43: var mandatory = [ On 2017/02/21 21:08:34, benjhayden wrote: > Should this be a const up with the other consts? Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:46: for (var parameter in mandatory) { On 2017/02/21 21:08:34, benjhayden wrote: > It looks like this doesn't do what you think it does. > Python has for..in loops, but JS has both for..in and for..of, and for..in > doesn't do what you think it does. > Please use for..of. Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... 
tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:54: function webRtcRenderingTimelineMetric(values, model) { On 2017/02/21 22:26:34, eakuefner wrote: > nit: I don't think Timeline should be part of the name. Ack. Will do it in the next patch :) https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:61: var percentage_biggerIsBetter = On 2017/02/21 21:08:34, benjhayden wrote: > Please use const and move these up with the other consts. Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:77: var histogram = new tr.v.Histogram(name, unit); On 2017/02/21 21:08:34, benjhayden wrote: > Please use let instead of var throughout this file. Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:79: histogram.addSample(scalarValue); On 2017/02/21 22:26:34, eakuefner wrote: > As Ben notes, this should also be a for..of loop. Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:106: // In this paragraph I will be using regex notation. What is intended by the On 2017/02/22 07:50:42, kjellander_chromium wrote: > On 2017/02/21 22:26:34, eakuefner wrote: > > 1. These, and the similar such comments below, should be docstrings: /** ... > **/ > > 2. Did you write this, or copy it from the TBMv1 metric? I think it's a little > > too colloquial generally, and normally you would use third > person/passive/"royal > > we" instead of "I". > > I agree first person sounds too colloquial. The text is copy-pasted from > https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... > so let's just update it a little. Done. 
https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:109: // something like [2 3] which means possibly an observed frame persistence On 2017/02/22 15:48:40, tdresser wrote: > Can you give an example of IDEAL_RENDER_INSTANT's that would result in this > cadence? Not really. I'm not familiar with the code or the terminology. This is a syntactic rewrite of the Python version of the metric at https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:127: cadence.shift(); On 2017/02/22 15:48:40, tdresser wrote: > This code feels harder to read than necessary, partially because we're pushing > stuff to the cadence list that's never read. > > What about something like: > > // add empty list check. > > let cadence = [] > let oldIdealRender = events.shift() > let framePersistence = 1 > > for (let event of events) { > if (event[IRI] == oIR) { > fP++; > } else { > cadence.push(fP) > oIR = event[IRI] > } > } > I'd rather not modify events. I'll figure out a simpler way to do it. :) https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:136: // a distribution B::= let C be the cadence, B[k]=p with k in Unique(C) On 2017/02/22 15:48:39, tdresser wrote: > I'm not sure what the ::= means here. > > I'm also having a hard time interpreting this comment. > > I'd use an annotation like: > > (Double check the jsdoc syntax here) > @returns Array.<Number> where a[k] is the number of times we had a cadence of k. Acknowledged. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:137: // and p=Card(k in C). On 2017/02/22 15:48:39, tdresser wrote: > What is Card? 
I take it means cardinality, or count the number of times k appears in C. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:142: frameDistribution[ticks] = ticksSoFar + 1; On 2017/02/22 15:48:40, tdresser wrote: > I'd find this a bit easier to read as: > frameDistribution[ticks] = frameDistribution[ticks] + 1 || 1 > > instead of the above two lines. Done. Should I add parentheses? (frameDistribution[ticks] + 1) || 1 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:170: // 5 frozen frames will not be counted as frozen. On 2017/02/22 15:48:40, tdresser wrote: > Can we add jsdoc here, including an @returns annotation? > Done? https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:173: for (var ticks in frameDistribution) { On 2017/02/22 15:48:40, tdresser wrote: > Is there any reason to do this in two loops, instead of one? Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:182: occurences: frameDistribution[frozenFrameVsync], On 2017/02/22 15:48:40, tdresser wrote: > occurrences Acknowledged. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:194: var penalty = { On 2017/02/21 22:26:34, eakuefner wrote: > const? Done. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:246: VSYNC_DURATION * cadence[index])); On 2017/02/22 15:48:39, tdresser wrote: > I haven't actually parsed this yet, but are we using "discrepancy" in the formal > sense here? 
> > https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... Hummmm, I don't know. Will take a look. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:264: // Get the smoothness stats from the normalized drift time. On 2017/02/22 07:50:42, kjellander_chromium wrote: > Convert to JSDoc and document driftTime and the return dict. > https://google.github.io/styleguide/javascriptguide.xml?showone=Comments#Comm... > Same for the other functions. Acknowledged. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:324: On 2017/02/21 21:08:34, benjhayden wrote: > Please delete these blank lines. Files should end in a newline but not a blank > line. Done. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline_test.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline_test.html:1: <!DOCTYPE html> Please ignore for now. 
 https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:109: // something like [2 3] which means possibly an observed frame persistence On 2017/02/22 16:45:44, ehmaldonado_chromium wrote: > On 2017/02/22 15:48:40, tdresser wrote: > > Can you give an example of IDEAL_RENDER_INSTANT's that would result in this > > cadence? > > Not really. > I'm not familiar with the code or the terminology. This is a syntactic rewrite > of the Python version of the metric at > https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [1,1,5,5,5] -> [2,3], for example. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:127: cadence.shift(); On 2017/02/22 16:45:45, ehmaldonado_chromium wrote: > On 2017/02/22 15:48:40, tdresser wrote: > > This code feels harder to read than necessary, partially because we're pushing > > stuff to the cadence list that's never read. > > > > What about something like: > > > > // add empty list check. > > > > let cadence = [] > > let oldIdealRender = events.shift() > > let framePersistence = 1 > > > > for (let event of events) { > > if (event[IRI] == oIR) { > > fP++; > > } else { > > cadence.push(fP) > > oIR = event[IRI] > > } > > } > > > > I'd rather not modify events. I'll figure out a simpler way to do it. :) Whoops, good point. Thanks. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:137: // and p=Card(k in C). On 2017/02/22 16:45:45, ehmaldonado_chromium wrote: > On 2017/02/22 15:48:39, tdresser wrote: > > What is Card? > > I take it means cardinality, or count the number of times k appears in C. 
Ah, of course. I think the comment here is significantly more dense than the code. A plain English description which isn't trying so hard to be mathematically precise would likely be more useful. https://codereview.chromium.org/2711623002/diff/1/tracing/tracing/metrics/web... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:142: frameDistribution[ticks] = ticksSoFar + 1; On 2017/02/22 16:45:44, ehmaldonado_chromium wrote: > On 2017/02/22 15:48:40, tdresser wrote: > > I'd find this a bit easier to read as: > > frameDistribution[ticks] = frameDistribution[ticks] + 1 || 1 > > > > instead of the above two lines. > > Done. > Should I add parentheses? > (frameDistribution[ticks] + 1) || 1 Parens SGTM. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:23: // Severity factor. This comment doesn't really add anything. What's the severity factor? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:156: * @param {Array} cadence - See getCadence above. State what this is an array of (and below). https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:157: * @returns {Object} frameDistribution - The source to output distribution. Isn't frameDistribution an array? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:192: * For simplicity we count as freezing the frames that appear at least five Isn't it 6 times in a row? 
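On the parentheses question above: since `+` binds tighter than `||`, the parens are purely cosmetic; both spellings behave identically. A quick sketch:

```javascript
const frameDistribution = {};
const ticks = 3;
// undefined + 1 is NaN, which is falsy, so the expression falls back to 1.
frameDistribution[ticks] = frameDistribution[ticks] + 1 || 1;    // now 1
// With an existing count, the left operand (1 + 1 = 2) is truthy and wins.
frameDistribution[ticks] = (frameDistribution[ticks] + 1) || 1;  // now 2
```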
 I sent an email about an idea for refactoring getCadence to use a new function in base/ to do standard run-length encoding. We can discuss it there or here if you want. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:75: model, isValidEvent, eventStream, getTimeStats.bind(null, values)); Please don't use null in tracing. I think that an arrow function would be clearer here than bind(): (streamName, events) => getTimeStats(values, streamName, events) 
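The bind-vs-arrow suggestion above can be sketched as follows; `getTimeStats` here is a simplified stand-in, not the real implementation:

```javascript
// Both forms pre-supply the first argument; the arrow function avoids
// passing null as a 'this' binding, which tracing style discourages.
function getTimeStats(values, streamName, events) {
  return `${streamName}: ${events.length} events -> ${values}`;
}

const values = 'histograms';
const viaBind = getTimeStats.bind(null, values);
const viaArrow = (streamName, events) => getTimeStats(values, streamName, events);
```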
 https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:19: const VSYNC_DURATION = 1e6 / DISPLAY_HERTZ; Can you rename this VSYNC_DURATION_US to clarify the units? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:22: const FROZEN_THRESHOLD = 6; A better name for this might be "FROZEN_FRAME_VSYNC_COUNT"? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:26: const WEB_MEDIA_PLAYER_MS_EVENT = 'WebMediaPlayerMS::UpdateCurrentFrame'; "MS" usually means milliseconds in tracing, and this is an event title, not an event. Can you rename this to WEB_MEDIA_PLAYER_UPDATE_TITLE? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:27: const IDEAL_RENDER_INSTANT = 'Ideal Render Instant'; Please add "_NAME" to these constants and add comments describing how they are event args field names. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:30: const SERIAL = 'Serial'; Can you rename this to "STREAM_ID_NAME"? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:32: const MANDATORY = [ Maybe "REQUIRED_EVENT_ARGS_NAMES"? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:51: function eventStream(event) { This can also be an arrow function in its single caller. 
https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:55: /** If docstrings don't use jsdoc tags like @param and @returns, then the open comment tag should use 1 asterisk, not 2. This function doesn't look like it requires jsdoc tags -- the existing description looks good. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:65: for (let parameter of mandatory) { Did you mean "MANDATORY"? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:73: function webRtcRenderingTimelineMetric(values, model) { This |values| is a HistogramSet fka ValueSet. Please replace "values" with "histograms" throughout this file and the test. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:80: function addScalarValue(scalarValue, values, name, unit) { This looks like a special case of addListOfScalarValues. Can callers use that instead? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:86: function addListOfScalarValues(scalarValues, values, name, unit) { ListOfScalarValues isn't a thing in TBMv2. Can you name this "addHistogram"? https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:157: * @returns {Object} frameDistribution - The source to output distribution. On 2017/02/22 at 20:08:18, tdresser wrote: > Isn't frameDistribution an array? It looks like a map from ticks (frame sequence length) to count (number of sequences with that length). 
If that's right, it should use an ES6 Map: let frameDistribution = new Map(); for (let ticks of cadence) { frameDistribution.set(1 + (frameDistribution.get(ticks) || 0)); } return frameDistribution; Then when you iterate over it, you have to destruct the key-value pairs: for (let [ticks, count] of frameDistribution) { ... } Note that you cannot use bracket notation for Maps, you have to use get()/set() methods. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:180: for (let ticks of frameDistribution) { Ah, for..of doesn't work for objects. See my response to Tim's comment about getSourceToOutputDistribution. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:218: * As mentioned earlier, we count for frozen anything above 6 vsync display Please delete "As mentioned earlier". 
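Note that the Map-based sketch in the message above passes only one argument to `set()`; a corrected, self-contained version (names illustrative) would be:

```javascript
// Map each run length ("ticks") to the number of runs with that length.
function getFrameDistribution(cadence) {
  const frameDistribution = new Map();
  for (const ticks of cadence) {
    // set() needs both the key and the value.
    frameDistribution.set(ticks, 1 + (frameDistribution.get(ticks) || 0));
  }
  return frameDistribution;
}

// Iteration then destructures the [key, value] pairs:
// for (const [ticks, count] of frameDistribution) { ... }
```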
 https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:1: <!DOCTYPE html> Apologies if somebody else has already mentioned it, but the filename should match the metric function name including "metric": webRtcRenderingTimelineMetric -> web_rtc_rendering_timeline_metric. (Is it 1 word or two? "webrtc" or "web rtc"->"webRtc"?) https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline_test.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline_test.html:17: const WEB_MEDIA_PLAYER_MS_EVENT = 'WebMediaPlayerMS::UpdateCurrentFrame'; You can export these constants from the metric. 
PTAL. I added runLengthEncoding https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html (right): https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:1: <!DOCTYPE html> On 2017/02/22 21:38:53, benjhayden wrote: > Apologies if somebody else has already mentioned it, but the filename should > match the metric function name including "metric": webRtcRenderingTimelineMetric > -> web_rtc_rendering_timeline_metric. > (Is it 1 word or two? "webrtc" or "web rtc"->"webRtc"?) Renamed to webrtcRendering -> webrtc_rendering https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:19: const VSYNC_DURATION = 1e6 / DISPLAY_HERTZ; On 2017/02/22 21:35:03, benjhayden wrote: > Can you rename this VSYNC_DURATION_US to clarify the units? Done. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:22: const FROZEN_THRESHOLD = 6; On 2017/02/22 21:35:03, benjhayden wrote: > A better name for this might be "FROZEN_FRAME_VSYNC_COUNT"? Done. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:23: // Severity factor. On 2017/02/22 20:08:18, tdresser wrote: > This comment doesn't really add anything. What's the severity factor? Removed it. It's how much worse a badly-out-of-sync frame is than a merely out-of-sync frame when calculating the smoothness score. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:51: function eventStream(event) { On 2017/02/22 21:35:03, benjhayden wrote: > This can also be an arrow function in its single caller. Done. 
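The renamed constants discussed above could combine roughly like this; a minimal sketch, where DISPLAY_HERTZ = 60 and the "more than 6 vsync intervals means frozen" rule are assumptions based on the review comments, not the exact code in the CL:

```javascript
const DISPLAY_HERTZ = 60;                       // assumed display refresh rate
const VSYNC_DURATION_US = 1e6 / DISPLAY_HERTZ;  // one vsync interval, in microseconds
const FROZEN_FRAME_VSYNC_COUNT = 6;             // vsync intervals before a frame counts as frozen

// Hypothetical helper: a frame is considered frozen once it stays on
// screen longer than FROZEN_FRAME_VSYNC_COUNT vsync intervals.
function isFrozen(frameDurationUs) {
  return frameDurationUs > FROZEN_FRAME_VSYNC_COUNT * VSYNC_DURATION_US;
}
```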
https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:55: /** On 2017/02/22 21:35:03, benjhayden wrote: > If docstrings don't use jsdoc tags like @param and @returns, then the open > comment tag should use 1 asterisk, not 2. > This function doesn't look like it requires jsdoc tags -- the existing > description looks good. Done. https://codereview.chromium.org/2711623002/diff/20001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_timeline.html:65: for (let parameter of mandatory) { On 2017/02/22 21:35:03, benjhayden wrote: > Did you mean "MANDATORY"? Done. 
 https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering.html:24: const SEVERITY = 3; Sorry, can you add a comment on what SEVERITY is used for? 
 This CL lgtm, but please work with Ben and Tim to land it. 
 https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering.html:76: tr.metrics.MetricRegistry.register(webrtcRendering); Metric function names should end with "Metric". Metric files should end with "_metric.html". 
 Do you know how to run the _test.html tests? 
 On 2017/02/24 at 03:42:18, ehmaldonado wrote: > Do you know how to run the _test.html tests? https://github.com/catapult-project/catapult/blob/master/tracing/README.md#in... 
Can someone who knows jsdoc well confirm that my proposal in utils.html is reasonable? eslint doesn't choke on @template, and it's used by closure compiler. https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/base/ut... File tracing/tracing/base/utils.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/base/ut... tracing/tracing/base/utils.html:166: * @returns {Array} Clarify the return type here, and move the overall description to the @returns annotation. Get someone more familiar with jsdoc to make sure this is correct, but I think something like: /** * @param {Array.<T>} ary * @returns {Array.<{value: T, count: number}>} The run length encoding of the array as an array of {value, count} objects. * @template T */ https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/base/ut... File tracing/tracing/base/utils_test.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/base/ut... tracing/tracing/base/utils_test.html:68: }]); I think this would be easier to read as: var encoded = tr.b.runLengthEncoding([1, 1, 2, 2, 2, 3, 4, 4]); assert.deepEqual(encoded.map(x => x.value), [1, 2, 3, 4]); assert.deepEqual(encoded.map(x => x.count), [2, 3, 1, 2]); https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering.html:116: * @returns {Map} frameDistribution - The source to output distribution. Can we add more detailed type annotations throughout? https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering_test.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering_test.html:261: // IGNORE ? 
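For reference, a run-length encoder of the kind being discussed could be sketched as follows; this is an illustration of the interface, not necessarily the actual tr.b.runLengthEncoding implementation:

```javascript
/**
 * @param {!Array.<T>} ary
 * @returns {!Array.<{value: T, count: number}>} The run length encoding of
 *     the array as an array of {value, count} objects.
 * @template T
 */
function runLengthEncoding(ary) {
  const encoded = [];
  for (const value of ary) {
    const last = encoded[encoded.length - 1];
    if (last !== undefined && last.value === value) {
      last.count += 1;  // extend the current run
    } else {
      encoded.push({value, count: 1});  // start a new run
    }
  }
  return encoded;
}
```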
 https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering.html:235: * from the cadence. Can this comment describe what these statistics mean? 
 https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... File tracing/tracing/metrics/webrtc/webrtc_rendering.html (right): https://codereview.chromium.org/2711623002/diff/80001/tracing/tracing/metrics... tracing/tracing/metrics/webrtc/webrtc_rendering.html:76: tr.metrics.MetricRegistry.register(webrtcRendering); On 2017/02/23 21:59:16, benjhayden wrote: > Metric function names should end with "Metric". > Metric files should end with "_metric.html". Done. https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); When I try to test this, I get this error. Am I doing something wrong? Error: Couldn't create an importer for the provided eventData. at Import.createImporter_ (http://localhost:8003/tracing/importer/import.html:379:15) at Import.createImports (http://localhost:8003/tracing/importer/import.html:133:31) at Task.run (http://localhost:8003/tracing/base/task.html:71:21) at Function.Task.RunSynchronously (http://localhost:8003/tracing/base/task.html:152:25) at Import.importTraces (http://localhost:8003/tracing/importer/import.html:73:17) at Function.TestUtils.newModelWithEvents (http://localhost:8003/tracing/core/test_utils.html:327:7) at runWebrtcRenderingMetric (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) at TestCase.testFn_ (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) at TestCase.run (http://localhost:8003/tracing/base/unittest/test_case.html:79:19) at TestRunner.<anonymous> (http://localhost:8003/tracing/base/unittest/test_runner.html:218:52) 
 https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); On 2017/02/24 at 19:09:48, ehmaldonado_chromium wrote: > When I try to test this, I get this error. Am I doing something wrong? > > Error: Couldn't create an importer for the provided eventData. > at Import.createImporter_ (http://localhost:8003/tracing/importer/import.html:379:15) > at Import.createImports (http://localhost:8003/tracing/importer/import.html:133:31) > at Task.run (http://localhost:8003/tracing/base/task.html:71:21) > at Function.Task.RunSynchronously (http://localhost:8003/tracing/base/task.html:152:25) > at Import.importTraces (http://localhost:8003/tracing/importer/import.html:73:17) > at Function.TestUtils.newModelWithEvents (http://localhost:8003/tracing/core/test_utils.html:327:7) > at runWebrtcRenderingMetric (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) > at TestCase.testFn_ (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) > at TestCase.run (http://localhost:8003/tracing/base/unittest/test_case.html:79:19) > at TestRunner.<anonymous> (http://localhost:8003/tracing/base/unittest/test_runner.html:218:52) This works for me: let model = tr.c.TestUtils.newModelWithEvents([JSON.stringify(FAKE_EVENTS)]); Then it says ReferenceError: eventStream is not defined at Object.webrtcRenderingMetric (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric.html:73:9) at runWebrtcRenderingMetric (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) at TestCase.testFn_ (http://localhost:8003/tracing/metrics/webrtc/webrtc_rendering_metric_test.htm...) 
at TestCase.run (http://localhost:8003/tracing/base/unittest/test_case.html:79:19) at TestRunner.<anonymous> (http://localhost:8003/tracing/base/unittest/test_runner.html:218:52) at TestRunner.runCurrentTestCase_ (http://localhost:8003/tracing/base/unittest/test_runner.html:216:14) at TestRunner.runOneTestCase_ (http://localhost:8003/tracing/base/unittest/test_runner.html:169:12) 
 https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); After fixing those errors, it doesn't seem to be processing any events, and driftTime below is undefined. If I do something like let model = { getDescendantEvents: function() { return FAKE_EVENTS; }, }; then it does work. 
 Can you upload a patch where eventStream is fixed so I can help with the test? https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); On 2017/02/24 at 23:06:17, ehmaldonado_chromium wrote: > After fixing those errors, it doesn't seem to be processing any events, and driftTime below is undefined. > > If I do something like > let model = { getDescendantEvents: function() { return FAKE_EVENTS; }, }; > then it does work. It would be best if the test used a real Model in case groupAndProcessEvents is ever changed to use a different API. 
 Patchset #7 (id:120001) has been deleted 
 This should pass the tests https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); On 2017/02/24 23:44:40, benjhayden wrote: > On 2017/02/24 at 23:06:17, ehmaldonado_chromium wrote: > > After fixing those errors, it doesn't seem to be processing any events, and > driftTime below is undefined. > > > > If I do something like > > let model = { getDescendantEvents: function() { return FAKE_EVENTS; }, }; > > then it does work. > > It would be best if the test used a real Model in case groupAndProcessEvents is > ever changed to use a different API. Yes, of course. This is just for testing while I get it to work. 
 Tim, can you take another look, please? 
 On 2017/02/24 at 23:54:33, ehmaldonado wrote: > This should pass the tests > > https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... > File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): > > https://codereview.chromium.org/2711623002/diff/100001/tracing/tracing/metric... > tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:51: let model = tr.c.TestUtils.newModelWithEvents([FAKE_EVENTS]); > On 2017/02/24 23:44:40, benjhayden wrote: > > On 2017/02/24 at 23:06:17, ehmaldonado_chromium wrote: > > > After fixing those errors, it doesn't seem to be processing any events, and > > driftTime below is undefined. > > > > > > If I do something like > > > let model = { getDescendantEvents: function() { return FAKE_EVENTS; }, }; > > > then it does work. > > > > It would be best if the test used a real Model in case groupAndProcessEvents is > > ever changed to use a different API. > > Yes, of course. This is just for testing while I get it to work. This seems to work. function eventFromPair(pair) { return { title: 'WebMediaPlayerMS::UpdateCurrentFrame', start: pair[1], duration: 1, args: { 'Ideal Render Instant': pair[0], 'Actual Render Begin': pair[1], 'Actual Render End': 0, 'Serial': 0, } }; } function newModel() { function customizeModelCallback(model) { const rendererProcess = model.getOrCreateProcess(1); const mainThread = rendererProcess.getOrCreateThread(2); for (const event of FAKE_EVENTS) { mainThread.sliceGroup.pushSlice(tr.c.TestUtils.newSliceEx(event)); } }; return tr.c.TestUtils.newModelWithEvents([], {customizeModelCallback}); } 
 Doesn't look like feedback in #25, #26 has been addressed. 
 On 2017/02/27 14:45:19, tdresser wrote: > Doesn't look like feedback in #25, #26 has been addressed. I think I addressed them now. PTAL :) 
 Ben, any thoughts on how to review these scoring formulas? Is it worth sanity checking them? I'm not really clear on the role of scoring formulas here. https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:142: function addFpsFromCadence(values, frameDistribution) { It looks like we're using Cadence and FrameDistribution as synonyms here. Can we pick one and stick with it? https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:158: * five times in a row counted from 'Ideal Render Instant' perspective. This wording is confusing, and I don't understand the justification here. I would phrase this as: "In a series of repeated frames of length > 5, all frames after the first are considered frozen." Or something like that. Why do we exclude cases where the number of repeated frames is <= 5? https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:161: * We mitigate this by saying anything unde 5 frozen frames will not be unde -> under https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:39: [1663780586073, 1663780596280], Is there any reason to use hard to parse numbers here? Can we use numbers which are easier to visually parse? https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... 
tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:70: assert.strictEqual(hist.sum, 400697); Verifying that these are correct is really hard. I need to jump around between the hard to parse timestamps at the top of the file, and the numbers here. Can we use a smaller number of events, where the math works out cleanly, and then just assert on the exact values in the histogram, instead of the summary statistics? https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:82: assert.strictEqual(hist.sum, 0); We should have a test where this isn't 0. https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:96: assert.strictEqual(hist.sum, 60); We should have a test where this isn't 60. https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:110: assert.strictEqual(hist.sum, 100); We should have a test where this isn't 100. 
 Patchset #10 (id:200001) has been deleted 
 Patchset #10 (id:220001) has been deleted 
 On 2017/02/25 05:05:18, benjhayden wrote:
> This seems to work.
> 
>   function eventFromPair(pair) {
>     return {
>       title: 'WebMediaPlayerMS::UpdateCurrentFrame',
>       start: pair[1],
>       duration: 1,
>       args: {
>         'Ideal Render Instant': pair[0],
>         'Actual Render Begin': pair[1],
>         'Actual Render End': 0,
>         'Serial': 0,
>       }
>     };
>   }
> 
>   function newModel() {
>     function customizeModelCallback(model) {
>       const rendererProcess = model.getOrCreateProcess(1);
>       const mainThread = rendererProcess.getOrCreateThread(2);
>       for (const event of FAKE_EVENTS) {
>         mainThread.sliceGroup.pushSlice(tr.c.TestUtils.newSliceEx(event));
>       }
>     };
>     return tr.c.TestUtils.newModelWithEvents([], {customizeModelCallback});
>   }
>
Thanks! :)
 https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:142: function addFpsFromCadence(values, frameDistribution) { They shouldn't be synonyms. I don't know why it was named "FromCadence" in the python code. It should be "FromFrameDistribution". https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:158: * five times in a row counted from 'Ideal Render Instant' perspective. I don't know, there's no mention of it in the design doc. I guess it's to make the metric more stable. https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:161: * We mitigate this by saying anything unde 5 frozen frames will not be On 2017/03/01 14:38:55, tdresser wrote: > unde -> under Done. https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:39: [1663780586073, 1663780596280], On 2017/03/01 14:38:55, tdresser wrote: > Is there any reason to use hard to parse numbers here? > Can we use numbers which are easier to visually parse? I'll work on it. :) https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:70: assert.strictEqual(hist.sum, 400697); On 2017/03/01 14:38:55, tdresser wrote: > Verifying that these are correct is really hard. I need to jump around between > the hard to parse timestamps at the top of the file, and the numbers here. 
> > Can we use a smaller number of events, where the math works out cleanly, and > then just assert on the exact values in the histogram, instead of the summary > statistics? I don't think we have access to the exact values in the histogram. I think we only get access to a random sample of them. 
 On 2017/03/01 at 14:38:55, tdresser wrote: > Ben, any thoughts on how to review these scoring formulas? Is it worth sanity checking them? > > I'm not really clear on the role of scoring formulas here. > It looks like smoothnessScore is computing a normalized weighted average, which might be exactly what is needed, or it might be an unnecessary combination of information that could be reported directly. It looks like getFreezingScore is computing an integral with a very special weighting function. There could have been a lot of thought and experimentation put into both the weighting function and the integral formula, but I can't tell without a design doc, and I didn't see any links in the python version. I think we can translate the scoring functions to JS in this CL so that we can go ahead and deprecate the TBMv1 metric, and then re-visit the value of the scoring functions. WDYT of that plan? It would be easier to examine the value of the scoring functions if the metric reports both scores and raw values. Can you go ahead and make the metric report the raw values framesOutOfSyncOnlyOnce, framesSeverelyOutOfSync, and each ticks.count of cadence in Histograms in addition to the scores? Annie is working on a dashboard to monitor benchmark/metric quality. If you monitor and alert on both the scores and the raw values, then we'll be able to see how often and when they both/neither/either catch regressions or alert spuriously. If it looks like scoring might be a useful tool, but these scoring functions could be improved, then I might recommend using a statistical system like I developed for RAIL and Lighthouse: http://go/normalized-metrics http://go/grading-metrics-on-the-curve I'd be happy to talk more about scoring. :-) 
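A "normalized weighted average" of the shape described could be sketched like this; the function name, the normalization, and the use of SEVERITY as the weight are assumptions drawn from the review discussion, not the exact formula in the CL:

```javascript
// Illustrative sketch only: a smoothness score as a normalized weighted
// penalty, where severely-out-of-sync frames are weighted SEVERITY times
// more heavily than merely out-of-sync frames.
const SEVERITY = 3;  // assumed severity factor, per the review comments

function smoothnessScore(frameCount, framesOutOfSync, framesSeverelyOutOfSync) {
  const penalty = framesOutOfSync + SEVERITY * framesSeverelyOutOfSync;
  // Normalize so 1 means perfectly smooth and 0 means every frame was
  // severely out of sync.
  return Math.max(0, 1 - penalty / (SEVERITY * frameCount));
}
```

As the comment above notes, whether this combination of information is more useful than reporting the raw counts directly is exactly the open question; reporting both makes it possible to compare.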
 https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:52: // These numbers don't mean anything. We just want to make sure we can Tim, what do you think of this way to test driftTime? 
 Patchset #12 (id:210007) has been deleted 
 Patchset #13 (id:310001) has been deleted 
 Patchset #12 (id:290001) has been deleted 
 On 2017/03/01 21:36:06, benjhayden wrote: > It looks like getFreezingScore is computing an integral with a very special > weighting function. There could have been a lot of thought and experimentation > put into both the weighting function and the integral formula, but I can't tell > without a design doc, and I didn't see any links in the python version. Design doc: https://docs.google.com/document/d/1TX0NnT_xxvnRNDkw6P2kxunXyK_I8OChkLb4jehXR... > I think we can translate the scoring functions to JS in this CL so that we can > go ahead and deprecate the TBMv1 metric, and then re-visit the value of the > scoring functions. > WDYT of that plan? SGTM > It would be easier to examine the value of the scoring functions if the metric > reports both scores and raw values. > Can you go ahead and make the metric report the raw values > framesOutOfSyncOnlyOnce, framesSeverelyOutOfSync, and each ticks.count of > cadence in Histograms in addition to the scores? Ok, I'll do that in another patch. > I'd be happy to talk more about scoring. :-) Please do :) 
 I'll make the metric report the raw values in another patch. Can you take another look at the CL as it is, please? 
 A few final nits then lgtm but please wait for tdresser. https://codereview.chromium.org/2711623002/diff/240001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/240001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:79: function addHistogram(scalarValues, values, name, unit) { "Scalar" refers to a number plus a unit. Can you replace "scalarValues" with "samples"? https://codereview.chromium.org/2711623002/diff/240001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:138: * @param {tr.v.HistogramSet} values Please replace all "values" with "histograms". https://github.com/catapult-project/catapult/issues/3068 https://codereview.chromium.org/2711623002/diff/330001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/330001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:248: return { You can use shorthand literals here and in getSmoothnessStats: return {driftTime, renderingLengthError}; https://codereview.chromium.org/2711623002/diff/330001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/330001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:138: let values = new tr.v.HistogramSet(); Please replace "values" with "histograms". 
 Still lgtm pending those final nits, but I was just wondering if you might be able to run this metric over a trace to produce a results.html file containing real numbers and upload it to x20? 
 https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:158: * five times in a row counted from 'Ideal Render Instant' perspective. On 2017/03/01 20:52:38, ehmaldonado_chromium wrote: > I don't know, there's no mention of it in the design doc. I guess it's to make > the metric more stable. Woah, there's a design doc? Toss a link into the CL description! Since we have no idea why this behavior is the way it is, can you run the benchmark locally with this threshold, and without the threshold, with a large enough number of repeats that we can see how this impacts variability? https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/180001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:70: assert.strictEqual(hist.sum, 400697); On 2017/03/01 20:52:38, ehmaldonado_chromium wrote: > On 2017/03/01 14:38:55, tdresser wrote: > > Verifying that these are correct is really hard. I need to jump around between > > the hard to parse timestamps at the top of the file, and the numbers here. > > > > Can we use a smaller number of events, where the math works out cleanly, and > > then just assert on the exact values in the histogram, instead of the summary > > statistics? > > I don't think we have access to the exact values in the histogram. I think we > only get access to a random sample of them. You're right, nevermind. I think fewer samples would be fine though. 
 https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:52: // These numbers don't mean anything. We just want to make sure we can On 2017/03/01 21:59:05, ehmaldonado_chromium wrote: > Tim, what do you think of this way to test driftTime? This LGTM, thanks. https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:66: } It's a bit confusing to keep track of what's ideal, and what's actual. What about something like: let idealRenderInstant = 0 let actualRenderInstant = 0; for (const driftTime of fakeDriftTimes) { idealRenderInstant += 1; actualRenderInstant += driftTime; fakeEvents.push([idealRenderInstant, actualRenderInstant]); } 
 Description was changed from ========== Add a TBMv2 webrtc_rendering_timeline metric. This is a first attempt to port the webrtc_rendering_timeline metric to TBMv2. This [1] is the TBMv1 metric, which uses the stats computed here [2]. I don't know how to translate the logging statements [3], the none_reason [4] or the description [5]. I don't know if they are necessary. [1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [3] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [4] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [5] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... BUG=chromium:632295 ========== to ========== Add a TBMv2 webrtc_rendering_metric. This CL migrates the TBMv1 version of this metric [1], [2] to TBMv2. The design doc for the TBMv2 version of the metric is available at [3]. [1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [3] https://docs.google.com/document/d/1TX0NnT_xxvnRNDkw6P2kxunXyK_I8OChkLb4jehXR... BUG=chromium:632295 ========== 
 Sorry, what is x20? https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/260001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:66: } On 2017/03/02 13:19:01, tdresser wrote: > It's a bit confusing to keep track of what's ideal, and what's actual. What > about something like: > > let idealRenderInstant = 0 > let actualRenderInstant = 0; > for (const driftTime of fakeDriftTimes) { > idealRenderInstant += 1; > actualRenderInstant += driftTime; > fakeEvents.push([idealRenderInstant, actualRenderInstant]); > } Can you take a look at the latest patch, please? :) 
 1 nit, raw metrics lgtm, thanks! https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:308: framesOutOfSync, framesSeverelyOutOfSync, percentBadlyOutOfSync, Please put these on separate lines and add a trailing comma after the last so they match the dictionaries returned to tr.exportTo(). 
 Code seems reasonable at this point, but I'd also like to see some results from this. Uploading results to Google drive and making them world readable would be awesome. https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:54: } What about just: let pairs = []; for (let i = 0; i < driftTimes.length; ++i) { pairs.push([i, i + driftTimes[i]]); } https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:103: * [ticks, count] entry says that therer are 'count' frames that are displayed therer -> there 
 Here are some results. I couldn't figure out how to make them world readable: https://drive.google.com/a/google.com/file/d/0B5uD3WXkeAVGVDBOODUxX1lRVDA/vie... In the dev server, I'm seeing things like min = 9,911.504%, so I take it that percentages should go from 0 to 1 instead of 0 to 100. It also makes me wonder whether the driftTimes are correctly scaled. I'll run more traces tomorrow. ------------------------------------- Somewhat unrelated: I'm having trouble with the dev server. It doesn't always work; I have to load the trace a couple of times, run another metric first, and then run the webrtcRendering metric... Can you reproduce it? 
 https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:308: framesOutOfSync, framesSeverelyOutOfSync, percentBadlyOutOfSync, On 2017/03/02 20:30:05, benjhayden wrote: > Please put these on separate lines and add a trailing comma after the last so > they match the dictionaries returned to tr.exportTo(). Done. https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:54: } On 2017/03/02 21:15:54, tdresser wrote: > What about just: > > let pairs = []; > for (let i = 0; i < driftTimes.length; ++i) { > pairs.push([i, i + driftTimes[i]]); > } Done. 'i' must start with something other than 0, though :) https://codereview.chromium.org/2711623002/diff/370001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:103: * [ticks, count] entry says that therer are 'count' frames that are displayed On 2017/03/02 21:15:53, tdresser wrote: > therer -> there Done. 
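For readers following the test-fixture discussion above: a minimal plain-JavaScript sketch of building [idealRenderInstant, actualRenderInstant] pairs from drift times, with the index starting at 1 rather than 0 as noted. The function name and the sample driftTimes values are made up for illustration; the actual test file may differ.

```javascript
// Build [idealRenderInstant, actualRenderInstant] pairs from drift times.
// A drift time is how far the actual render instant lags the ideal one.
function eventPairsFromDriftTimes(driftTimes) {
  const pairs = [];
  // Start at 1: an idealRenderInstant of 0 would look like a missing
  // timestamp, which is why 'i' must start with something other than 0.
  for (let i = 1; i <= driftTimes.length; ++i) {
    pairs.push([i, i + driftTimes[i - 1]]);
  }
  return pairs;
}

const fakeDriftTimes = [0, 2, 1, 0, 3];
console.log(eventPairsFromDriftTimes(fakeDriftTimes));
```

Each emitted pair is simply [ideal, ideal + drift], which keeps the ideal and actual instants easy to tell apart when reading the test.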
 To make something world readable, share it from your chromium account. Could you share the results.html file from some benchmark runs, with a few repeats? 
 On 2017/03/03 at 13:49:12, tdresser wrote: > To make something world readable, share it from your chromium account. > > Could you share the results.html file from some benchmarks runs, with a few repeats? Here's a results.html for a single trace that I was helping ehmaldonado debug. It LGTM. http://www/~benjhayden/lemur.html Tim: was there a particular reason you wanted to see results for more traces? 
 https://codereview.chromium.org/2711623002/diff/390001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/390001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:90: 'WebRTCRendering_frame_distribution_keys', count_smallerIsBetter); You can let Histogram do the binning instead of doing it manually in getSourceToOutputDistribution(). const frameHist = new tr.v.Histogram('WebRTCRendering_frame_distribution', count_smallerIsBetter, tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); for (const ticks of cadence) { frameHist.addSample(ticks.count); } histograms.addHistogram(frameHist); Then you can use frameHist instead of frameDistribution like this: for (let ticks = 1; ticks < frameHist.allBins.length; ++ticks) { const count = frameHist.allBins[ticks].count; ... } 
 Patchset #16 (id:410001) has been deleted 
 https://codereview.chromium.org/2711623002/diff/390001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/390001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:90: 'WebRTCRendering_frame_distribution_keys', count_smallerIsBetter); On 2017/03/03 23:47:56, benjhayden wrote: > You can let Histogram do the binning instead of doing it manually in > getSourceToOutputDistribution(). > > const frameHist = new tr.v.Histogram('WebRTCRendering_frame_distribution', > count_smallerIsBetter, tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); > for (const ticks of cadence) { > frameHist.addSample(ticks.count); > } > histograms.addHistogram(frameHist); > > Then you can use frameHist instead of frameDistribution like this: > for (let ticks = 1; ticks < frameHist.allBins.length; ++ticks) { > const count = frameHist.allBins[ticks].count; > ... > } Done. Looks better :) https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:144: let frameHist = new tr.v.Histogram('', tr.b.Unit.byName.unitlessNumber); Is this a good way to test it? 
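For readers unfamiliar with the Histogram API used above: a rough plain-JavaScript sketch of the binning that HistogramBinBoundaries.createLinear(1, 50, 49) implies — 49 unit-wide central bins covering [1, 50), plus an underflow and an overflow bin. This is only an approximation for illustration; the real binning lives inside catapult's tr.v.Histogram.

```javascript
// Approximate the binning of HistogramBinBoundaries.createLinear(1, 50, 49).
// counts[0] is the underflow bin (< 1), counts[1..49] are the unit-wide
// central bins, and counts[50] is the overflow bin (>= 50).
function binCounts(samples) {
  const counts = new Array(51).fill(0);
  for (const s of samples) {
    if (s < 1) counts[0]++;
    else if (s >= 50) counts[50]++;
    else counts[1 + Math.floor(s - 1)]++;
  }
  return counts;
}
```

The overflow bin is why frames persisting more than 50 vsyncs all look alike to the metric, which is the limitation discussed later in this review.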
 Tim: https://ehmaldonado.users.x20web.corp.google.com/www/no_crawl/results.html Some results using the webrtc.webrtc_smoothness benchmark. 
 Tim: https://drive.google.com/open?id=0B64HUqgWWN_ZYlJmaElOcnRkd00 Some results using the webrtc.webrtc_smoothness benchmark. 
 What do you think? :) 
 A few more nits, still lgtm. https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:114: * as 'source to output' distribution. Can you pick either "frame distribution" or "source to output distribution" and make the entire file use the same terminology? It's great to mention aliases in comments, but using multiple aliases in code makes it more difficult to read. https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:128: tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); Can you add a comment about how this bin boundary configuration is required for the logic in addFpsFromFrameDistribution? https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:3: Copyright 2016 The Chromium Authors. All rights reserved. 2017 
 WebRTCRendering_frames_out_of_sync and WebRTCRendering_frames_badly_out_of_sync look like they have extremely high variability. Is there a reason we aren't reporting how many frames we considered "frozen"? 
 On 2017/03/06 20:28:19, tdresser wrote: > WebRTCRendering_frames_out_of_sync and WebRTCRendering_frames_badly_out_of_sync > look like they have extremely high variability. > > Is there a reason we aren't reporting how many frames we considered "frozen"? Added them to the histogram. 
 https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:114: * as 'source to output' distribution. On 2017/03/06 19:10:13, benjhayden wrote: > Can you pick either "frame distribution" or "source to output distribution" and > make the entire file use the same terminology? It's great to mention aliases in > comments, but using multiple aliases in code makes it more difficult to read. Done. https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:128: tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); On 2017/03/06 19:10:13, benjhayden wrote: > Can you add a comment about how this bin boundary configuration is required for > the logic in addFpsFromFrameDistribution? Will do. The upper limit is somewhat arbitrary, isn't it? https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:3: Copyright 2016 The Chromium Authors. All rights reserved. On 2017/03/06 19:10:13, benjhayden wrote: > 2017 Done. https://codereview.chromium.org/2711623002/diff/450001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/450001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:201: freezingScore += count * frozenPenaltyWeight(ticks - 1); This should be ticks - 1. See https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... 
 On 2017/03/06 21:21:05, ehmaldonado_chromium wrote: > https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... > File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): > > https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... > tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:114: * as 'source to > output' distribution. > On 2017/03/06 19:10:13, benjhayden wrote: > > Can you pick either "frame distribution" or "source to output distribution" > and > > make the entire file use the same terminology? It's great to mention aliases > in > > comments, but using multiple aliases in code makes it more difficult to read. > > Done. > > https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... > tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:128: > tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); > On 2017/03/06 19:10:13, benjhayden wrote: > > Can you add a comment about how this bin boundary configuration is required > for > > the logic in addFpsFromFrameDistribution? > > Will do. The upper limit is somewhat arbitrary, isn't it? > > https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... > File tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html (right): > > https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... > tracing/tracing/metrics/webrtc/webrtc_rendering_metric_test.html:3: Copyright > 2016 The Chromium Authors. All rights reserved. > On 2017/03/06 19:10:13, benjhayden wrote: > > 2017 > > Done. > > https://codereview.chromium.org/2711623002/diff/450001/tracing/tracing/metric... > File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): > > https://codereview.chromium.org/2711623002/diff/450001/tracing/tracing/metric... 
> tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:201: freezingScore > += count * frozenPenaltyWeight(ticks - 1); > This should be ticks - 1. > See > https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... Can you re-record results.html now that we include the number of frozen frames? Any thoughts on why WebRTCRendering_frames_out_of_sync and WebRTCRendering_frames_badly_out_of_sync are so variable? Are these communicating meaningful signal? 
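To make the `ticks - 1` point above concrete, here is a hedged sketch of the freezing-score accumulation. The penalty values below are invented for illustration — they are not the actual frozenPenaltyWeight table from the metric — but the structure shows why the weight takes `ticks - 1`: a frame displayed for N vsyncs means N - 1 frames' worth of freeze.

```javascript
// Hypothetical penalty table: the real frozenPenaltyWeight grows with the
// number of frozen frames; these exact values are made up for illustration.
function frozenPenaltyWeight(frozenFrames) {
  if (frozenFrames < 2) return 0;   // one repeat is normal cadence jitter
  if (frozenFrames < 5) return frozenFrames;
  return 8 * frozenFrames;          // long freezes are penalized heavily
}

// frameDistribution: array of [ticks, count] pairs, where a pair says
// 'count' frames were each displayed for 'ticks' vsyncs.
function freezingScore(frameDistribution) {
  let score = 0;
  for (const [ticks, count] of frameDistribution) {
    // N vsyncs of display => N - 1 frozen frames, hence ticks - 1.
    score += count * frozenPenaltyWeight(ticks - 1);
  }
  return score;
}
```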
 https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/430001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:128: tr.v.HistogramBinBoundaries.createLinear(1, 50, 49)); On 2017/03/06 at 21:21:04, ehmaldonado_chromium wrote: > On 2017/03/06 19:10:13, benjhayden wrote: > > Can you add a comment about how this bin boundary configuration is required for > > the logic in addFpsFromFrameDistribution? > > Will do. The upper limit is somewhat arbitrary, isn't it? I usually recommend about 30-50 bins because the histogram bar chart requires bins to be at least 5px tall, so histograms with too many bins won't fit on a screen, and will be difficult to read and compare. Also, too much resolution can defeat the purpose of using histograms. I recommended 50 in this case because that's the upper bound of what I usually recommend, but more might be necessary in this case. If a frame persists for more than 50 vsyncs, then it will be counted in the Histogram's overflow bin, so it will be treated as if it were only 50 vsyncs long. This could hide regressions from 50 to more vsyncs, or improvements from more down to 50 vsyncs. So if you're worried about regressions/improvements up to 100 vsyncs, then you could use that many bins here, but it might not look great. If you're worried about regressions/improvements beyond 100 vsyncs, then I'll ask for a specific reason why. If there's a good reason to worry about extremely long frames, then you might e.g. need to manually count them like you were doing before I helpfully suggested using Histogram, or we can add a Generic diagnostic containing extremely long frames that addFpsFromFrameDistribution can use instead of the overflow bin. 
 Ben: What is a Diagnostic? I don't think a frame staying more than 50 vsyncs is a frequent occurrence, or that there are regressions from 50 to more vsyncs, or improvements from more than 50 down to 50 vsyncs. So I'd say we leave it as it is, and monitor the data that we have to see if expanding the range is necessary. Tim: I ran the results again: https://drive.google.com/open?id=0B64HUqgWWN_ZYlJmaElOcnRkd00 From the chrome perf dashboard data, they don't look so variable, and they seem to provide a signal. https://chromeperf.appspot.com/report?sid=c841c790a050e6f0ff76087a9acc457c85c... 
 On 2017/03/07 14:48:34, ehmaldonado_chromium wrote: > Ben: > What is a Diagnostic? > I don't think a frame staying more than 50 vsyncs is a frequent occurrence, or > that there are regressions from 50 to more vsyncs, or improvements from more > than 50 down to 50 vsyncs. > So I'd say we leave it as it is, and monitor the data that we have to see if > expanding the range is necessary. > > Tim: > I ran the results again: > https://drive.google.com/open?id=0B64HUqgWWN_ZYlJmaElOcnRkd00 > From the chrome perf dashboard data, they don't look so variable, and they seem > to provide a signal. > https://chromeperf.appspot.com/report?sid=c841c790a050e6f0ff76087a9acc457c85c... The "score" variables don't really give enough context to know how noisy a signal is. I can easily convert a noisy signal between 0 and 1 into a non-noisy-looking signal between 0 and 100 by adding 99 to it. Ben may have some thoughts on this. For the number of frozen frames, the overall standard deviation is about equal to the mean. (8.5ish) For frames out of sync, the overall mean is ~38, with a std of ~22. For frames badly out of sync, the overall standard deviation is about equal to the mean. (1.5ish) These appear pretty noisy, but what really matters is what magnitude of regressions we're hoping to catch with this metric. Can you give an example of a regression we've caught with the V1 metric? 
 I just learned it is unmonitored, so we haven't caught a regression with it. Also, I'm not sure they would've caught regressions that the other metrics would miss. However, the Chrome Perf Dashboard link I sent shows that they do change noticeably when a regression happens. 
 On 2017/03/07 at 14:48:34, ehmaldonado wrote: > Ben: > What is a Diagnostic? Please read https://github.com/catapult-project/catapult/blob/master/docs/how-to-write-me... > I don't think a frame staying more than 50 vsyncs is a frequent occurrence, or that there are regressions from 50 to more vsyncs, or improvements from more than 50 down to 50 vsyncs. > So I'd say we leave it as it is, and monitor the data that we have to see if expanding the range is necessary. Excellent! 
 On 2017/03/07 17:00:17, benjhayden wrote: > On 2017/03/07 at 14:48:34, ehmaldonado wrote: > > Ben: > > What is a Diagnostic? > > Please read > https://github.com/catapult-project/catapult/blob/master/docs/how-to-write-me... > > > > I don't think a frame staying more than 50 vsyncs is a frequent occurrence, or > that there are regressions from 50 to more vsyncs, or improvements from more > than 50 down to 50 vsyncs. > > So I'd say we leave it as it is, and monitor the data that we have to see if > expanding the range is necessary. > > Excellent! Ah, was the link you sent pointing to a real regression / improvement? Can you point me to the bug? 
 On 2017/03/07 17:17:48, tdresser wrote: > On 2017/03/07 17:00:17, benjhayden wrote: > > On 2017/03/07 at 14:48:34, ehmaldonado wrote: > > > Ben: > > > What is a Diagnostic? > > > > Please read > > > https://github.com/catapult-project/catapult/blob/master/docs/how-to-write-me... > > > > > > > I don't think a frame staying more than 50 vsyncs is a frequent occurrence, > or > > that there are regressions from 50 to more vsyncs, or improvements from more > > than 50 down to 50 vsyncs. > > > So I'd say we leave it as it is, and monitor the data that we have to see if > > expanding the range is necessary. > > > > Excellent! > > Ah, was the link you sent pointing to a real regression / improvement? Can you > point me to the bug? I clicked on the revision that showed the change, then at "See all performance changes at 413100" and that got me to the bug. https://chromeperf.appspot.com/group_report?bug_id=639751 https://bugs.chromium.org/p/chromium/issues/detail?id=639751 
 On 2017/03/07 19:02:10, ehmaldonado_chromium wrote: > On 2017/03/07 17:17:48, tdresser wrote: > > On 2017/03/07 17:00:17, benjhayden wrote: > > > On 2017/03/07 at 14:48:34, ehmaldonado wrote: > > > > Ben: > > > > What is a Diagnostic? > > > > > > Please read > > > > > > https://github.com/catapult-project/catapult/blob/master/docs/how-to-write-me... > > > > > > > > > > I don't think a frame staying more than 50 vsyncs is a frequent > occurrence, > > or > > > that there are regressions from 50 to more vsyncs, or improvements from more > > > than 50 down to 50 vsyncs. > > > > So I'd say we leave it as it is, and monitor the data that we have to see > if > > > expanding the range is necessary. > > > > > > Excellent! > > > > Ah, was the link you sent pointing to a real regression / improvement? Can you > > point me to the bug? > > I clicked on the revision that showed the change, then at "See all performance > changes at 413100" and that got me to the bug. > https://chromeperf.appspot.com/group_report?bug_id=639751 > https://bugs.chromium.org/p/chromium/issues/detail?id=639751 So is this good to go? 
 Sorry for the delay here, we're still figuring out what the bar is for TBMv2 metrics (if there is one). What is this metric going to be used for? The existing metric isn't monitored. Are you planning on having this metric monitored? Based on the strength of the signal from some of these metrics, I don't think we'll be able to triage regressions as part of the existing perf sheriff rotation. 
 On 2017/03/08 13:22:30, tdresser wrote: > Sorry for the delay here, we're still figuring out what the bar is for TBMv2 > metrics (if there is one). > > What is this metric going to be used for? The existing metric isn't monitored. > Are you planning on having this metric monitored? Based on the strength of the > signal from some of these metrics, I don't think we'll be able to triage > regressions as part of the existing perf sheriff rotation. Yes, we're planning to monitor them on our perf sheriff rotation. From the v1 version of the metric, the data for fps and freezing score seems pretty useful. We'd see whether the signal from the rest is useful; otherwise we could delete them. They were supposed to help detect regressions when a new rendering algorithm for WebRTC was introduced. I don't know whether they were useful before, since they were not monitored. 
 Patchset #18 (id:470001) has been deleted 
 I added patch to customize the SummaryOptions. Please take another look. 
 LGTM to land, but make sure Ned signs off too. 
 lgtm 
 Feel free to address my concern in a different CL https://codereview.chromium.org/2711623002/diff/490001/tracing/tracing/metric... File tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html (right): https://codereview.chromium.org/2711623002/diff/490001/tracing/tracing/metric... tracing/tracing/metrics/webrtc/webrtc_rendering_metric.html:71: tr.metrics.v8.utils.groupAndProcessEvents(model, Actually, why is this metric reusing a v8 metric utility? Can you please factor this to a more generic place? 
 The CQ bit was checked by ehmaldonado@chromium.org 
 The patchset sent to the CQ was uploaded after l-g-t-m from eakuefner@chromium.org, benjhayden@chromium.org Link to the patchset: https://codereview.chromium.org/2711623002/#ps490001 (title: "Customize summary options.") 
 CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.chromium.or... 
 CQ is committing da patch.
Bot data: {"patchset_id": 490001, "attempt_start_ts": 1489089888148870,
"parent_rev": "ab666350db97f4d0e681e597c139b487186c2cd0", "commit_rev":
"2ae07fc28bb3b248baa71647d5cfcefd67e57a84"}
 
            
              
                Message was sent while issue was closed.
              
            
             Description was changed from ========== Add a TBMv2 webrtc_rendering_metric. This CL migrates the TBMv1 version of this metric [1], [2] to TBMv2. The design doc for the TBMv2 version of the metric is available at [3]. [1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [3] https://docs.google.com/document/d/1TX0NnT_xxvnRNDkw6P2kxunXyK_I8OChkLb4jehXR... BUG=chromium:632295 ========== to ========== Add a TBMv2 webrtc_rendering_metric. This CL migrates the TBMv1 version of this metric [1], [2] to TBMv2. The design doc for the TBMv2 version of the metric is available at [3]. [1] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [2] https://cs.chromium.org/chromium/src/third_party/catapult/telemetry/telemetry... [3] https://docs.google.com/document/d/1TX0NnT_xxvnRNDkw6P2kxunXyK_I8OChkLb4jehXR... BUG=chromium:632295 Review-Url: https://codereview.chromium.org/2711623002 Committed: https://chromium.googlesource.com/external/github.com/catapult-project/catapu... ========== 
 
            
              
                Message was sent while issue was closed.
              
            
 Committed patchset #18 (id:490001) as https://chromium.googlesource.com/external/github.com/catapult-project/catapu...
