Chromium Code Reviews
Created: 4 years, 10 months ago by David Yen
Modified: 4 years, 10 months ago
Reviewers: Ken Russell
CC: chromium-reviews, Ian Ewell
Base URL: https://chromium.googlesource.com/chromium/src.git@master
Target Ref: refs/pending/heads/master
Project: chromium
Visibility: Public
Description
Fix GPUTiming offset so it is stable within significant digits.

The timer offset between GPU and CPU time was previously resolved
to microsecond accuracy. This was causing flakiness when comparing
timestamps.

The issue is that we cannot assume these instructions run with
microsecond accuracy, so the offset will change from query to
query. Instead, the offset should only be considered up to a
sensible accuracy for these instructions (such as milliseconds),
and should only be updated if it has changed by more than
1 millisecond.

The reason we compare with the previous offset, rather than
truncating to an integer boundary, is so we catch drifts from the
previous calculation. For example, 0.9 vs. 1.1 would truncate to
0 and 1. Note that even if we were rounding we would still have
this problem, i.e. 0.45 vs. 0.55 would round to 0 and 1.
BUG=582676
Committed: https://crrev.com/ea6d840e4d96f6df739f55a89ccf69daa3569a77
Cr-Commit-Position: refs/heads/master@{#372843}
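The update rule described above can be sketched in plain C++. This is a hypothetical illustration, not the code from the patch: the function name, the plain `int64_t` microsecond representation, and the constant are assumptions made to keep the sketch self-contained.

```cpp
#include <cstdint>

// Only replace the cached CPU/GPU offset when a new measurement drifts by
// at least 1 ms. Truncating (or rounding) each measurement independently
// flaps at integer boundaries, e.g. 0.9 ms vs. 1.1 ms become 0 and 1.
constexpr int64_t kMicrosecondsPerMillisecond = 1000;

int64_t UpdateOffset(int64_t cached_offset_us, int64_t measured_offset_us) {
  const int64_t drift_us = measured_offset_us - cached_offset_us;
  if (drift_us >= kMicrosecondsPerMillisecond ||
      drift_us <= -kMicrosecondsPerMillisecond) {
    return measured_offset_us;  // Real drift: accept the new offset.
  }
  return cached_offset_us;  // Sub-millisecond jitter: keep the old offset.
}
```

With this rule, two measurements of 900 us and 1100 us keep the same cached offset even though they straddle a millisecond boundary.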
Patch Set 1 #
Total comments: 4
Patch Set 2: Use base::TimeDelta #
Patch Set 3: Increased fake CPU time in unittest to be greater than 1 ms #
Messages
Total messages: 18 (7 generated)
dyen@chromium.org changed reviewers: + kbr@chromium.org
kbr@, please review this subtle bug I found. I tried to explain the issue in detail in the CL description. Let me know if anything is unclear.
Could you please file a bug and describe exactly what kind of timestamp comparisons were being done? I am wondering whether the calling code should be what's adjusted, rather than quantizing the offset between CPU and GPU time. Please also reference the bug from the CL description. I don't like stray CLs that change deep behavior subtly.

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc
File ui/gl/gpu_timing.cc (right):

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc#newcode344
ui/gl/gpu_timing.cc:344: glGetInteger64v(GL_TIMESTAMP, &gl_now);
I've recently learned that this query returns 0 on Mac OS X. Can you confirm? How does that affect the correctness of this code?

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc#newcode354
ui/gl/gpu_timing.cc:354: base::Time::kMicrosecondsPerMillisecond;
Is there any way to use base::Time to do this rounding and these delta computations instead of essentially duplicating the code?
I've filed an issue and linked it above. I put a more detailed explanation in the issue tracker as well.

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc
File ui/gl/gpu_timing.cc (right):

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc#newcode344
ui/gl/gpu_timing.cc:344: glGetInteger64v(GL_TIMESTAMP, &gl_now);
On 2016/01/29 22:54:31, Ken Russell wrote:
> I've recently learned that this query returns 0 on Mac OS X. Can you confirm?
> How does that affect the correctness of this code?

This would only apply to the core profile (the compatibility profile doesn't support ARB_timer_query, only EXT_timer_query). We would have to put in a workaround for this if that is the case, or simply disable ARB_timer_query. The only advantage ARB_timer_query gives us over EXT_timer_query is timestamps, so if this gives us 0 there is really no point in using it.

https://codereview.chromium.org/1647333002/diff/1/ui/gl/gpu_timing.cc#newcode354
ui/gl/gpu_timing.cc:354: base::Time::kMicrosecondsPerMillisecond;
On 2016/01/29 22:54:31, Ken Russell wrote:
> Is there any way to use base::Time to do this rounding and these delta
> computations instead of essentially duplicating the code?

Done. This is much better now, thank you.
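For reference, the kind of millisecond rounding under discussion can be expressed with standard duration types rather than hand-written division. This sketch deliberately uses std::chrono instead of Chromium's base::TimeDelta so it is self-contained; the function name is illustrative, not from the patch.

```cpp
#include <chrono>
#include <cstdint>

// Round a microsecond offset to the nearest whole millisecond using the
// standard <chrono> facilities (C++17) rather than manual integer math.
int64_t RoundOffsetToMilliseconds(int64_t offset_us) {
  using std::chrono::microseconds;
  using std::chrono::milliseconds;
  return std::chrono::round<milliseconds>(microseconds(offset_us)).count();
}
```

Note that rounding alone still flaps at half-millisecond boundaries (0.45 ms vs. 0.55 ms round to 0 and 1), which is exactly why the patch also compares against the previously cached offset.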
Thanks, this looks good now but could you reply to my question on the bug too?
Thanks for answering the question in https://code.google.com/p/chromium/issues/detail?id=582676#c3 . The code looks good but some related unittests are failing. Please ping once they're fixed.
On 2016/01/30 16:24:17, Ken Russell wrote:
> Thanks for answering the question in
> https://code.google.com/p/chromium/issues/detail?id=582676#c3 . The code looks
> good but some related unittests are failing. Please ping once they're fixed.

It looks like one of the unit tests was using a fake CPU time of less than 1 ms; an easy fix by adding a 0. PTAL.
On 2016/02/01 06:27:52, David Yen wrote:
> On 2016/01/30 16:24:17, Ken Russell wrote:
> > Thanks for answering the question in
> > https://code.google.com/p/chromium/issues/detail?id=582676#c3 . The code looks
> > good but some related unittests are failing. Please ping once they're fixed.
>
> Looks to be one of the unit tests was testing using a fake CPU time less than
> 1ms, easy fix by adding a 0. PTAL.

ping
lgtm
The CQ bit was checked by dyen@chromium.org
CQ is trying the patch.
Follow status at https://chromium-cq-status.appspot.com/patch-status/1647333002/40001
View timeline at https://chromium-cq-status.appspot.com/patch-timeline/1647333002/40001
Message was sent while issue was closed.
Committed patchset #3 (id:40001)
Message was sent while issue was closed.
Patchset 3 (id:??) landed as https://crrev.com/ea6d840e4d96f6df739f55a89ccf69daa3569a77
Cr-Commit-Position: refs/heads/master@{#372843}