Chromium Code Reviews

Created: 7 years, 2 months ago by Siggi Cherem (dart-lang)
Modified: 7 years, 2 months ago
CC: reviews_dartlang.org
Visibility: Public.
Description: Switch to copying files synchronously (addresses the too-many-open-files bug)
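The fix described above replaces parallel async copies with synchronous ones. A minimal sketch of what that approach looks like (the helper name and the use of dart:io sync APIs are illustrative, not the actual patch):

```dart
import 'dart:io';

// Copies a file synchronously. Each copy fully completes before the next one
// starts, so only a couple of file descriptors are ever open at once --
// safely under the default 256-descriptor limit on Mac OS.
void copyFileSync(String inpath, String outpath) {
  var bytes = new File(inpath).readAsBytesSync();
  new File(outpath).writeAsBytesSync(bytes);
}
```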
Patch Set 1 #
Total comments: 3
Messages
Total messages: 9 (0 generated)
lgtm
See note. I think I found the problem...

https://codereview.chromium.org/26071003/diff/1/pkg/polymer/lib/src/build/run...
File pkg/polymer/lib/src/build/runner.dart (left):

pkg/polymer/lib/src/build/runner.dart:280: futures.add(_copyFile(inpath, outpath));
Could this be the problem? We're adding a bunch of running futures to a list.

Try using Future.forEach here. I bet this would solve the problem without getting rid of the async code.
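For illustration, a sketch of the difference between the two patterns being discussed (assuming a `_copyFile` that returns a Future; this is not the actual runner.dart code):

```dart
import 'dart:async';
import 'dart:io';

// Hypothetical async copy helper: streams one file into another.
Future _copyFile(String inpath, String outpath) =>
    new File(inpath).openRead().pipe(new File(outpath).openWrite());

// The pattern at line 280: every copy starts immediately, so copying N
// files can hold roughly 2 * N file descriptors open at the same time.
Future copyAllParallel(List<List<String>> pairs) =>
    Future.wait(pairs.map((p) => _copyFile(p[0], p[1])).toList());

// Future.forEach chains the operations instead: each copy starts only after
// the previous future completes, so one pair of files is open at a time.
Future copyAllSerial(List<List<String>> pairs) =>
    Future.forEach(pairs, (p) => _copyFile(p[0], p[1]));
```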
https://codereview.chromium.org/26071003/diff/1/pkg/polymer/lib/src/build/run...
File pkg/polymer/lib/src/build/runner.dart (left):

pkg/polymer/lib/src/build/runner.dart:280: futures.add(_copyFile(inpath, outpath));
On 2013/10/05 04:23:12, kevmoo wrote:
> Could this be the problem? We're adding a bunch of running futures to a list.
>
> Try using Future.forEach here. I bet this would solve the problem without
> getting rid of the async code.

Sure, but then you're running them all serially. How is that better than just doing it synchronously?

The point of async is to take advantage of parallelism at the OS level, by creating essentially a tree of operations that can resume as soon as their data is ready. If your futures form one big chain, you might as well have done the work synchronously.

I suppose you might have other code running in other isolates. Or, similarly, we could try to parallelize our futures while keeping them under the 256-file limit. Either way, though, it still risks going over the limit.

I do think we should add a TODO and file a bug. This is a big trap in dart:io on Mac: by trying to do the "right thing" and parallelize your futures, you end up shooting yourself in the foot :|
https://codereview.chromium.org/26071003/diff/1/pkg/polymer/lib/src/build/run...
File pkg/polymer/lib/src/build/runner.dart (left):

pkg/polymer/lib/src/build/runner.dart:280: futures.add(_copyFile(inpath, outpath));
On 2013/10/05 05:23:26, John Messerly wrote:
> Sure, but then you're running them all serially. How is that better than just
> doing synchronous code?
>
> The point of async is to take advantage of parallelism at the OS level, by
> creating essentially a tree of operations that can resume as soon as their
> data is ready. If your futures are one big chain, you might as well have done
> it sync.

With async primitives you have the ability to easily do things in parallel. However, that does not mean you don't have to take care of resource management. In principle the code above is fine, but somewhere there should be resource control. That could be in the _copyFile function, which could do throttling so that it is only actively copying a limited number of files simultaneously. Of course that is only local resource management of the file copying, but it's important.
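A sketch of the throttling Søren describes, using `package:pool` (the package choice and the limit of 32 are assumptions for illustration; any counting-semaphore mechanism would do):

```dart
import 'dart:async';
import 'dart:io';
import 'package:pool/pool.dart';

// At most 32 copies are in flight at once; additional requests wait for a
// free slot. This keeps the process under the OS file-descriptor limit even
// when callers fire off hundreds of _copyFile futures at the same time.
final Pool _copyPool = new Pool(32);

Future _copyFile(String inpath, String outpath) => _copyPool.withResource(
    () => new File(inpath).openRead().pipe(new File(outpath).openWrite()));
```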
On 2013/10/07 06:40:26, Søren Gjesse wrote:
> With async primitives you have the ability to easily do things in parallel.
> However that does not mean that you don't have to take care of resource
> management. In principle the code above is fine, but somewhere there should
> be resource control. That could be in the _copyFile function which could do
> throttling so that it is only actively copying a limited number of files
> simultaneously.

FYI: this does not fix the issue for me. barback is opening many files in the background, which I think is causing the problem.

I tried adding hooks at the point of file write in polymer build and none of them were ever hit -- the crash happens before any polymer code is executed.
> FYI: this does not fix the issue for me. barback is opening many files in the
> background, which I think is causing the problem.
>
> I tried adding hooks at the point of file write in polymer build and none of
> them were ever hit -- the crash happens before any polymer code is executed.

I take that back. The crash happens in _FileAsset.read, which is in barback, but it's called via Polymer's runner.

Here's an alternate fix: https://codereview.chromium.org/26268002/
I've confirmed it addresses the issue.
On 2013/10/07 15:19:24, kevmoo wrote:
> I take that back. The crash happens in _FileAsset.read, which is in barback,
> but it's called via Polymer's runner.
>
> Here's an alt fix. https://codereview.chromium.org/26268002/
> I've confirmed this addresses the issue.

I had a feeling that was going to be the case. I think the problem in barback is quite tricky. Currently we are the ones pushing it over the limit, but barback might hit that limit internally if we had too many files that could be processed in parallel by a transformer.
On 2013/10/07 16:29:18, Siggi Cherem (dart-lang) wrote:
> I had a feeling that was going to be the case. I think the problem in barback
> is quite tricky. Currently we are the ones pushing over the limit, but
> barback might hit that limit internally if we had too many files that could
> be processed in parallel by a transformer.

Closing this patch, as Kevin submitted the fix (which also covers emitting assets).
