Created: 5 years, 11 months ago by Daniel Bratell
Modified: 5 years, 10 months ago
Reviewers: dshwang, Justin Novosad
CC: blink-reviews
Base URL: https://chromium.googlesource.com/chromium/blink.git@master
Target Ref: refs/heads/master
Project: blink
Visibility: Public
Description

Give a canvas/video test more time.
We have seen the yuv-video-on-accelerated-canvas.html test time out,
and since it is not obviously fast, this may just be a case of a bot
that is too slow. Trying to give the test more time in the hope that
this will solve the problem.
BUG=450699
Committed: https://src.chromium.org/viewvc/blink?view=rev&revision=189018
Patch Set 1
Patch Set 2: It has timeouts on all platforms.
Created: 5 years, 11 months ago
Messages
Total messages: 12 (2 generated)
bratell@opera.com changed reviewers: + junov@chromium.org
junov, adding a test that is flaky in win_blink_rel to the SlowTests file just in case the reason it timeouts is that it is slow. What do you think?
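For readers unfamiliar with the SlowTests file: it uses the same syntax as Blink's TestExpectations, one line per test. The entry for this change would presumably look something like the sketch below (the directory prefix is an assumption, not taken from the patch; only the file name and bug number come from this review):

```
# Hypothetical SlowTests entry (directory assumed):
crbug.com/450699 fast/canvas/yuv-video-on-accelerated-canvas.html [ Slow ]
```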
On 2015/01/26 12:30:17, Daniel Bratell wrote:
> junov, adding a test that is flaky in win_blink_rel to the SlowTests file just
> in case the reason it timeouts is that it is slow. What do you think?

sounds reasonable to me. lgtm
On 2015/01/26 14:09:39, junov wrote:
> On 2015/01/26 12:30:17, Daniel Bratell wrote:
> > junov, adding a test that is flaky in win_blink_rel to the SlowTests file just
> > in case the reason it timeouts is that it is slow. What do you think?
>
> sounds reasonable to me. lgtm

wait... could you make that line platform-specific?
On 2015/01/26 14:11:01, junov wrote:
> On 2015/01/26 14:09:39, junov wrote:
> > On 2015/01/26 12:30:17, Daniel Bratell wrote:
> > > junov, adding a test that is flaky in win_blink_rel to the SlowTests file just
> > > in case the reason it timeouts is that it is slow. What do you think?
> >
> > sounds reasonable to me. lgtm
>
> wait... could you make that line platform-specific?

I did so in the first patch set, but then I noticed that timeouts happened on all (many?) platforms. This is not Windows specific.
On 2015/01/26 14:24:59, Daniel Bratell wrote:
> On 2015/01/26 14:11:01, junov wrote:
> > wait... could you make that line platform-specific?
>
> I did so in the first patch set, but then I noticed that timeouts happened on
> all (many?) platforms. This is not Windows specific.

thx for handling this. lgtm
the test calls drawImage(video) many time on both ganesh canvas and software canvas, so obviously slow. marking slow test for all platform looks reasonable. at least, win and android should be marked.
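As a rough illustration of the pattern dshwang describes (this is not the actual test source; the element setup and resource path are assumptions), a test like this typically draws each decoded video frame onto a 2D canvas in a loop, which is expensive on both the accelerated (Ganesh) and software paths:

```javascript
// Hypothetical sketch of the drawImage(video)-in-a-loop pattern;
// browser-only code, not the real yuv-video-on-accelerated-canvas.html.
const video = document.createElement('video');
video.src = 'resources/yuv-video.webm';  // assumed resource path

const canvas = document.createElement('canvas');
canvas.width = 640;
canvas.height = 480;
const ctx = canvas.getContext('2d');

video.addEventListener('playing', function drawLoop() {
  // Each call converts/uploads the current YUV frame, so doing this
  // once per frame, per canvas type, adds up quickly on a slow bot.
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  if (!video.ended)
    requestAnimationFrame(drawLoop);
});
video.play();
```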
On 2015/01/26 17:28:58, dshwang wrote:
> [...]
> thx for handling this. lgtm
> the test calls drawImage(video) many time on both ganesh canvas and software
> canvas, so obviously slow. marking slow test for all platform looks reasonable.
> at least, win and android should be marked.

It has failures (timeouts) all the time on all of Mac, Win and Linux (on all the different versions of each) according to the flakiness dashboard. I can't see anything for Android though, so it's possible it works there. Maybe Android has massively higher default timeouts.

Note that it's only the virtual/gpu test that has timeouts. The software code does much better.
On 2015/01/26 17:36:02, Daniel Bratell wrote:
> [...]
> It has failures (timeouts) all the time on all of Mac, Win and Linux (on all
> different versions of all of them) according to the flakiness dashboard. I can't
> see anything Android though so it's possible it works there. Maybe Android has
> massively higher default timeouts.

In that case, there is a moderate probability that this is not a slow test. It could be a race condition (or another non-deterministic factor) that is causing the test to hang, possibly a deadlock. We want to avoid marking tests that hang as slow, because that will not make the bots any greener, and it will increase bot cycle time.

My suggestion would be to try this patch but keep the underlying issue open until we have enough data to confirm that we do in fact have a slow test. If the flakiness goes away (or is greatly reduced), then we were right to mark the test as slow. If the test continues to be as flaky as before, that would suggest that we are dealing with a hang, in which case we would need to revert this patch. OK?

lgtm
> If the test continues to be as flaky as before, that
> would suggest that we are dealing with a hang, in which case we would need to
> revert this patch. OK?
>
> lgtm

I think that is the plan we need to follow. If the tests are just a bit slow, this will make things better, but if it doesn't help, we should revert it and figure out what else it could be.
The CQ bit was checked by bratell@opera.com
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/patch-status/872383002/20001
Message was sent while issue was closed.
Committed patchset #2 (id:20001) as https://src.chromium.org/viewvc/blink?view=rev&revision=189018