Created: 6 years, 4 months ago by anatoly techtonik
Modified: 6 years, 4 months ago
Reviewers: iannucci
CC: chromium-reviews
Base URL: https://chromium.googlesource.com/infra/infra.git@master
Project: infra
Visibility: Public
Description: Fix virtualenv activation on Windows.
On Windows the paths are different:
http://virtualenv.readthedocs.org/en/latest/virtualenv.html#windows-notes
BUG=
R=iannucci@chromium.org
Committed: https://chromium.googlesource.com/infra/infra/+/6fce83cc5f6f05af22eb972141b08c64005b8023
Patch Set 1
Total comments: 2
Patch Set 2: Address review comment

Messages
Total messages: 13 (0 generated)
I suspect there will also be some issues running build_deps.py. I was meaning to use the depot_tools win toolchain that chromium uses in order to get a good compiler environment in build_deps.py, but haven't had time to work on it. I think the toolchain hashes are now available directly in depot_tools, so it should make that work a bit easier, if you want to take a stab at it. If you get it working, I'll run build_deps to actually generate the windows wheels.

Alternately / additionally, it would be great for bootstrap.py to be able to fall back to local compilation in the event that the wheels aren't available in the wheelhouse google storage bucket.

https://codereview.chromium.org/479583002/diff/1/bootstrap/bootstrap.py
File bootstrap/bootstrap.py (right):
https://codereview.chromium.org/479583002/diff/1/bootstrap/bootstrap.py#newco...
bootstrap/bootstrap.py:151: activate_this = os.path.join(env, 'Scripts', 'activate_this.py')
thanks, I was meaning to do this but never got around to it. could we do:

bin_dir = 'Scripts' if virtualenv.is_win else 'bin'
...

?
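The cross-platform selection suggested above can be sketched roughly like this (a standalone approximation: the review itself uses `virtualenv.is_win`, which is approximated here with `sys.platform`; the helper name is hypothetical):

```python
import os
import sys


def activate_this_path(env):
    """Locate virtualenv's activate_this.py: virtualenvs keep their
    scripts in 'Scripts' on Windows and in 'bin' elsewhere."""
    bin_dir = 'Scripts' if sys.platform.startswith('win') else 'bin'
    return os.path.join(env, bin_dir, 'activate_this.py')


# The bootstrap code would then exec this file to activate the
# environment in-process, e.g. (Python 2 style, as in the review era):
# execfile(activate_this_path(env), dict(__file__=activate_this_path(env)))
```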
There is an issue with wheels. I filed an issue at https://code.google.com/p/chromium/issues/detail?id=404159

I am still investigating how the toolchain stuff works, but I'm not sure I'll have time to even finish learning the code (already procrastinating too much on volunteer projects). I see coverage is already in depot_tools/third_party, but I have no idea what needs to be replaced, and where, for infra to reuse it.

The whole wheels thing is new to me / a mystery.
https://codereview.chromium.org/479583002/diff/1/bootstrap/bootstrap.py
File bootstrap/bootstrap.py (right):
https://codereview.chromium.org/479583002/diff/1/bootstrap/bootstrap.py#newco...
bootstrap/bootstrap.py:151: activate_this = os.path.join(env, 'Scripts', 'activate_this.py')
On 2014/08/15 17:38:40, iannucci wrote:
> thanks, I was meaning to do this but never got around to it.
>
> could we do:
>
> bin_dir = 'Scripts' if virtualenv.is_win else 'bin'
> ...
>
> ?
Done.
On 2014/08/15 19:36:05, anatoly techtonik wrote:
> There is an issue with wheels. I filed an issue at
> https://code.google.com/p/chromium/issues/detail?id=404159
> I am still investigating how the toolchain stuff works, but
> not sure I'll have time to even finish learning the code
> (already procrastinating too much on volunteer projects).
> I see coverage is already in depot_tools/third_party, but no idea
> what and where needs to be replaced for infra to reuse it.
>
> The whole wheels thing is new to me / a mystery.

So, we want all of the other repos to look like this one in the future. The problem was that a lot of the python packages used by these tools were informally known (aka 'just install them on the system!'). We wanted a way to pin the version of all the python dependencies, and also support the ability to pin compiled packages (especially on windows, where the toolchain environment is a bit sketchy). I never got to the windows part, but in principle it should be supportable now.

The particular bit with coverage is that the binary version is /significantly/ faster than the pure-python tracer, and we use it extensively in infra :).
The CQ bit was checked by iannucci@chromium.org
lgtm
CQ is trying da patch. Follow status at https://chromium-status.appspot.com/cq/techtonik@gmail.com/479583002/20001
Message was sent while issue was closed.
Committed patchset #2 (20001) as 6fce83cc5f6f05af22eb972141b08c64005b8023
On Fri, Aug 15, 2014 at 10:43 PM, <iannucci@chromium.org> wrote:
>
> So, we want all of the other repos to look like this one in the future.

Like infra/ or like depot_tools/?

> We wanted a way to pin the version of all the python dependencies, and
> also support the ability to pin compiled packages (especially on windows
> where the toolchain environment is a bit sketchy). I never got to the
> windows part, but in principle it should be supportable now.

I did dependency pinning with "hash size" at:
https://bitbucket.org/techtonik/locally/src/18d5d58838c596a5b4c5e439f253052a7...
This way it should be fairly well protected from modifications, and it doesn't matter where the files come from. For example, a coverage binary is available from https://pypi.python.org/pypi/coverage, but because it is not built by Chromium and not uploaded to the private repo, it cannot be used.

So, I'd switch to a model with "hash size" as the id of a release, plus multiple ids that are specific to a file repository (the filename can change, and we may miss the opportunity to reuse Debian archives, for example); much of the code for hash size calculation can be reused already. Maybe the Google search engine can even be used to find files automatically by their hash size, like it was in setuptools times before.

To unsubscribe from this group and stop receiving emails from it, send an email to chromium-reviews+unsubscribe@chromium.org.
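The "hash size" identifier described above could be computed along these lines (a minimal sketch, not the code from the linked repository; the function name and the choice of sha1 are assumptions):

```python
import hashlib
import os


def hash_size_id(path, algo='sha1'):
    """Identify a file by its digest plus its byte length, so the id is
    stable regardless of the filename or which mirror the file came from."""
    h = hashlib.new(algo)
    with open(path, 'rb') as f:
        # Read in chunks so large wheels don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return '%s %d' % (h.hexdigest(), os.path.getsize(path))
```

A pinned release would then be recorded as a digest-plus-size string, and a candidate file downloaded from any host could be checked against it before use.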
Message was sent while issue was closed.
On 2014/08/16 05:23:31, anatoly techtonik wrote:
> Like infra/ or like depot_tools/?
>
> I did dependency pinning with "hash size" at:
> https://bitbucket.org/techtonik/locally/src/18d5d58838c596a5b4c5e439f253052a7...
> This way it should be fairly well protected from modifications, and it
> doesn't matter where the files come from. For example, a coverage binary
> is available from https://pypi.python.org/pypi/coverage, but because it is
> not built by Chromium and not uploaded to the private repo, it cannot be
> used.
>
> So, I'd switch to a model with "hash size" as the id of a release, plus
> multiple ids that are specific to a file repository (the filename can
> change, and we may miss the opportunity to reuse Debian archives, for
> example); much of the code for hash size calculation can be reused
> already. Maybe the Google search engine can even be used to find files
> automatically by their hash size, like it was in setuptools times before.

Hm... I'm not sure I understand what you mean. The binaries (built wheel files) in question are derived from fixed sources (by commit hash), but are differentiated by platform. We did this in order to have control over the source that's being used to produce the wheel files, as well as control over the actual wheel files themselves. It's especially important that we build the wheels ourselves because the pip build process is not deterministic (e.g. an input hash of XXX and the same configuration choices will yield multiple wheel files). Especially in the case of linux and mac, there are no precompiled binaries on PyPI.
On Sat, Aug 16, 2014 at 9:33 AM, <iannucci@chromium.org> wrote:
>
> Hm... I'm not sure I understand what you mean. The binaries (built wheel
> files) in question are derived from fixed sources (by commit hash), but
> are differentiated by platform.

Do you mean you host sources for `coverage`? Or that you get `coverage` from its repository to build?

> We did this in order to have control over the source that's being used to
> produce the wheel files, as well as control over the actual wheel files
> themselves. It's especially important that we build the wheels ourselves
> because the pip build process is not deterministic (e.g. an input hash of
> XXX and the same configuration choices will yield multiple wheel files).

Is it because the C compiler inserts timestamps into binaries? In that case, is it possible to normalize these?

> Especially in the case of linux and mac, there are no precompiled binaries
> on PyPI.

But if there are binaries, why not use them?
Message was sent while issue was closed.
On 2014/08/16 06:42:20, anatoly techtonik wrote:
> Do you mean you host sources for `coverage`? Or that you get `coverage`
> from its repository to build?

Yes, we have mirrors of coverage (as well as all the other python dependencies for infra), which we know will be reliable/available/correct.

> Is it because the C compiler inserts timestamps into binaries? In that
> case, is it possible to normalize these?

Yes, and also the timestamps in the zipfile format, __FILE__ macros in C source code, etc. etc.

> But if there are binaries, why not use them?

Because we can't reproduce them (see above), we can't verify them, which means we don't really know what's in them. If it were possible to deterministically know `sha1_for_wheel(source_sha, platform) == XYZ`, then we could verify that once, offline, and then trust e.g. the binaries hosted on PyPI (or anywhere else). But since we have no way to reproduce the build hosted there, we prefer to build them ourselves.

Additionally, we are hosting the wheels on google storage, which we know to be a reliable host for static files worldwide, and we can rely on it for our infrastructure. Relying on PyPI being up and available isn't an option for our infrastructure.

That said, there's no reason you shouldn't be able to provide your own wheels, if you like. I haven't implemented the functionality, but it would make sense to me for build_deps to have a local mode where it would build the pinned versions using e.g. your compiler.
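One of the nondeterminism sources mentioned above, timestamps inside the zipfile container (a wheel is a zip), can in principle be normalized after the build. A rough sketch of that idea (a hypothetical post-processing step, not part of build_deps; it does not address compiler-inserted timestamps or __FILE__ macros inside compiled extensions):

```python
import zipfile


def normalize_zip_timestamps(src, dst, stamp=(1980, 1, 1, 0, 0, 0)):
    """Rewrite a zip archive with every member's timestamp forced to a
    fixed value, so two builds of identical content produce
    byte-identical archives."""
    with zipfile.ZipFile(src, 'r') as zin:
        with zipfile.ZipFile(dst, 'w', zipfile.ZIP_DEFLATED) as zout:
            # Sort members by name as well: member ordering is another
            # source of archive-level nondeterminism.
            for info in sorted(zin.infolist(), key=lambda i: i.filename):
                data = zin.read(info.filename)
                info.date_time = stamp
                zout.writestr(info, data)
```

With this in place, the `sha1_for_wheel(source_sha, platform)` check discussed above would hold for the zip-level differences, which is a prerequisite for verifying third-party wheels instead of rebuilding them.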