- Jan 29, 2025
-
-
Pierre Augier authored
Now that setuptools-scm is used, it is better to have a static MANIFEST.in file. From https://setuptools-scm.readthedocs.io:
> Additionally setuptools-scm provides setuptools with a list of files that are managed by the SCM (i.e. it automatically adds all the SCM-managed files to the sdist).
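A hedged sketch of the setuptools-scm integration at work; it must run inside an SCM checkout with setuptools-scm installed, and the file-finder behaviour quoted above happens automatically during sdist builds rather than through this call:

```python
# Hedged sketch: confirm setuptools-scm can read the checkout's SCM metadata.
# The SCM-managed file list mentioned above is contributed automatically via a
# setuptools file-finder entry point during sdist builds, so those files no
# longer need to be enumerated by hand in MANIFEST.in.
from setuptools_scm import get_version

print(get_version())  # e.g. "7.0.3.dev12+h1a2b3c4d", derived from SCM tags
```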
-
- Dec 13, 2024
-
-
Matt Harbison authored
VS 2017 was getting old, and 2022 is the latest supported compiler[1]. The exact bootstrapper here is the newly released LTSC 17.12.3, which looks to be supported until July 2026[2]. In making this change, I'm switching to the exported config settings file, because it's easier to open the bootstrapper, select the things needed, and then export to a file than to figure out all of the component names from the website. Somehow, I missed the x86/x64 compiler when adapting the previous command line args. Also, the file nicely cleans up a very long command line. [1] https://wiki.python.org/moin/WindowsCompilers#Microsoft_Visual_C.2B-.2B-_14.x_with_Visual_Studio_2022_.28x86.2C_x64.2C_ARM.2C_ARM64.29 [2] https://learn.microsoft.com/en-us/visualstudio/releases/2022/release-history#fixed-version-bootstrappers
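A hedged sketch of installing from an exported config with the bootstrapper's documented unattended flags; the config file path is a placeholder, not the file added here:

```python
# Hedged sketch: install the components listed in an exported .vsconfig file
# using the VS bootstrapper's documented unattended flags.  The config path
# below is a placeholder, not the real file name.
import subprocess

subprocess.run(
    [
        "vs_BuildTools.exe",
        "--config", r"contrib\packaging\example.vsconfig",  # assumed path
        "--passive",    # show progress, require no interaction
        "--wait",       # block until the install finishes
        "--norestart",
    ],
    check=True,
)
```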
-
- Nov 14, 2024
-
-
Matt Harbison authored
This is duplicated from the current CI config, to be able to build releases consistently outside of CI. I don't like the duplication, but I'm not worried about things changing too often, so I'm not bothering with PowerShell or some form that would allow execution by the CI runner. We should consider putting the config in `pyproject.toml`, where things like which Python versions to support can be centrally controlled for all platforms. The output directory is different from CI here, but that's fine because this is intended to run on a system that is *not* hosting the CI setup, and `dist/` is more standard. I dropped the `win32` part of the output because that implies the 32-bit Intel architecture. Apparently, arm64 builds are supported back to Python 3.9, but support is still experimental (with py3.13)[1]. The CI system starts arm64 support with Python 3.11, because that's the first version for which an arm64 Python installer was available on Windows. This doesn't second-guess that decision. The required `msgfmt.exe` was installed manually[2], as it isn't currently handled by the dependency installation script. Otherwise, this was successfully used with an activated venv based on Python 3.12.5, and only `cibuildwheel==2.21.3` installed. [1] https://cibuildwheel.pypa.io/en/stable/#what-does-it-do [2] https://github.com/mlocati/gettext-iconv-windows/releases/download/v0.22.5a-v1.17-r3/gettext0.22.5a-iconv1.17-shared-64.exe
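A hedged sketch of the kind of invocation this enables, expressed through standard cibuildwheel environment variables; the Python selectors and architecture list are assumptions, not the script's actual settings:

```python
# Hedged sketch: drive cibuildwheel outside CI and drop wheels into dist/.
# CIBW_BUILD and CIBW_ARCHS_WINDOWS are standard cibuildwheel settings; the
# values below are illustrative only.
import os
import subprocess

env = dict(os.environ)
env["CIBW_BUILD"] = "cp39-* cp310-* cp311-* cp312-*"  # assumed version list
env["CIBW_ARCHS_WINDOWS"] = "AMD64"                    # arm64 support is still experimental

subprocess.run(["cibuildwheel", "--output-dir", "dist"], check=True, env=env)
```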
-
- Apr 05, 2023
-
-
Raphaël Gomès authored
I am fully aware of the irony.
-
- Aug 30, 2022
-
-
Raphaël Gomès authored
On top of fixing fsmonitor, it moves one more "old API" use to the new one. This needs very verbose code to save a few function calls that are very expensive in Python.
-
- Feb 20, 2022
-
-
Gregory Szorc authored
This commit started by deleting references to py2exe (which is only used on Python 2). After pulling the thread, quite a lot of code was orphaned and was deleted. Differential Revision: https://phab.mercurial-scm.org/D12265
-
- Jan 18, 2022
-
-
Pierre-Yves David authored
This seems like a better location for it. Differential Revision: https://phab.mercurial-scm.org/D12036
-
- Dec 01, 2020
-
-
Augie Fackler authored
This means that naive contributors who just run `black` on a source file will get reasonable behavior as long as they have a recent black. Yay! This was previously D9834 but was rolled back due to test failures. nbjoerg thinks it's time to try again, so let's give it a shot. Differential Revision: https://phab.mercurial-scm.org/D10185
-
Augie Fackler authored
This will tell pip et al to call our setup.py for the majority of packaging concerns, but also gives us a place to put standard config stuff like black. This was previously D9833, but was rolled back due to test breakage. nbjoerg thinks that breakage is now resolved, so we're trying again. Differential Revision: https://phab.mercurial-scm.org/D10184
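A hedged sketch of the shape such a `pyproject.toml` takes, covering the two concerns these commits mention (delegating builds to setup.py via setuptools, and housing the black config); the exact keys and values are assumptions, not the committed file:

```python
# Hedged sketch: a minimal pyproject.toml of the shape described above, parsed
# here with the standard library (Python 3.11+) only to show its structure.
import tomllib

example = """
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[tool.black]
line-length = 80
skip-string-normalization = true
"""

config = tomllib.loads(example)
print(config["build-system"]["build-backend"])  # setuptools.build_meta
print(config["tool"]["black"]["line-length"])   # 80
```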
-
- Feb 02, 2021
-
-
Pierre-Yves David authored
This changeset is part of a series that has been breaking continuous integration on Python 2 for about a week. As no concrete solution has been found so far, the safest course seems to be to back it out until we can figure out the details. Differential Revision: https://phab.mercurial-scm.org/D9948
-
Pierre-Yves David authored
This changeset is part of a series that has been breaking continuous integration on Python 2 for about a week. As no concrete solution has been found so far, the safest course seems to be to back it out until we can figure out the details. Differential Revision: https://phab.mercurial-scm.org/D9947
-
- Dec 01, 2020
-
-
Augie Fackler authored
This means that naive contributors who just run `black` on a source file will get reasonable behavior as long as they have a recent black. Yay! Differential Revision: https://phab.mercurial-scm.org/D9834
-
Augie Fackler authored
This will tell pip et al to call our setup.py for the majority of packaging concerns, but also gives us a place to put standard config stuff like black. Differential Revision: https://phab.mercurial-scm.org/D9833
-
- Dec 14, 2020
-
-
Victor Stinner authored
* Replace "Py_TYPE(obj) = type;" with "Py_SET_TYPE(obj, type);" * Add pythoncapi_compat.h header file to get Py_SET_TYPE() on Python 2.7-3.8. Header file added to mercurial/ and contrib/python-zstandard/zstd/common/. In Python 3.10, Py_TYPE(obj) must not longer be used as an l-value. pythoncapi_compat.h comes from: https://github.com/pythoncapi/pythoncapi_compat Differential Revision: https://phab.mercurial-scm.org/D9825
-
- Nov 22, 2020
-
-
Matt Harbison authored
This is already used elsewhere in the test to access the current hg repo, and avoids an error about the unknown `revlog-compression-zstd` when C extensions aren't built. The only other such error is in test-check-interfaces.py, but I don't see a way to avoid it other than to create an empty scratch repo. Differential Revision: https://phab.mercurial-scm.org/D9364
-
- Oct 01, 2020
-
-
Martin von Zweigbergk authored
`hg fix` runs the formatters from the repo root so it doesn't pick up the `rustfmt.toml` configs we had in each of the `hg-core`, `hg-cpython`, and `rhg` packages, which resulted in warnings about `async fn` not existing in Rust 2015. This patch moves the `rustfmt.toml` file to the root so `hg fix` will use it. By putting the `rustfmt.toml` file in a higher-level directory, it also applies to the `chg` and `hgcli` packages. That makes `test-check-rust-format.t` fail, so this patch also applies the new formatting rules to those packages. Differential Revision: https://phab.mercurial-scm.org/D9142
-
- Apr 24, 2020
-
-
Gregory Szorc authored
We want to start distributing Mercurial on Python 3 on Windows. PyOxidizer will be our vehicle for achieving that. This commit implements basic support for producing Inno installers using PyOxidizer. While it is an eventual goal of PyOxidizer to produce installers, those features aren't yet implemented. So our strategy for producing Mercurial installers is similar to what we've been doing with py2exe: invoke a build system to produce files, then stage those files into a directory so they can be turned into an installer.

We had to make significant alterations to the pyoxidizer.bzl config file to get it to produce the files that we desire for a Windows install. This meant differentiating the build targets so we can target Windows specifically. We've added a new module to hgpackaging to deal with interacting with PyOxidizer. It is similar to py2exe: we invoke a build process, then copy files to a staging directory. Ideally these extra files would be defined in pyoxidizer.bzl. But I don't think it is worth doing at this time, as PyOxidizer's config files are lacking some features to make this turnkey. The rest of the change is introducing a variant of the Inno installer code that invokes PyOxidizer instead of py2exe.

Comparing the Python 2.7 based Inno installers with this one, the following changes were observed:
* No lib/*.{pyd, dll} files
* No Microsoft.VC90.CRT.manifest
* No msvc{m,p,r}90.dll files
* python27.dll replaced with python37.dll
* Add vcruntime140.dll file

The disappearance of the .pyd and .dll files is acceptable, as PyOxidizer has embedded these in hg.exe and loads them from memory. The disappearance of the *90* files is acceptable because those provide the Visual C++ 9 runtime, as required by Python 2.7. Similarly, the appearance of vcruntime140.dll is a requirement of Python 3.7. Differential Revision: https://phab.mercurial-scm.org/D8473
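A hedged sketch of the build-then-stage strategy described above; the paths and the bare `pyoxidizer build` invocation are illustrative, not the actual hgpackaging code:

```python
# Hedged sketch of the overall strategy: run an external build, then stage its
# output so an installer tool (Inno here) can pick it up.  Paths and the
# pyoxidizer invocation details are illustrative assumptions.
import shutil
import subprocess
from pathlib import Path

def build_and_stage(source_dir: Path, staging_dir: Path) -> None:
    # Invoke the build system (PyOxidizer) in the source checkout.
    subprocess.run(["pyoxidizer", "build"], cwd=source_dir, check=True)

    # Copy the built files into a clean staging directory for the installer.
    built = source_dir / "build" / "apps"  # assumed output location
    if staging_dir.exists():
        shutil.rmtree(staging_dir)
    shutil.copytree(built, staging_dir)
```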
-
- Dec 06, 2019
-
-
Augie Fackler authored
This is taken from the (improved since we started fuzzing) guide on ideal integrations. Rather than have our own wonky targets for building outside the fuzzer universe, we have a driver program we carry along and use when we're not using LibFuzzer. This will let us jettison a fair amount of goo. contrib/fuzz/standalone_fuzz_target_runner.cc is https://github.com/google/oss-fuzz/ file projects/example/my-api-repo/standalone from git revision c4579d9358a73ea5dbcc99cb985de1f2bf76dcf7, reformatted with our clang-format settings and a no-check-code comment added. It allows running a single test input through a fuzzer, rather than performing ongoing fuzzing as libfuzzer would. contrib/fuzz/FuzzedDataProvider.h is https://github.com/llvm/llvm-project/ file /compiler-rt/include/fuzzer/FuzzedDataProvider.h from git revision a44ef027ebca1598892ea9b104d6189aeb3bc2f0, reformatted with our clang-format settings and a no-check-code comment added. We can discard this if we instead want to add an hghave check for a new enough llvm that includes FuzzedDataProvider.h in the fuzzer headers. Differential Revision: https://phab.mercurial-scm.org/D7564
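A conceptual analogue (in Python, purely illustrative; the real runner is C++) of what the standalone runner does: run each supplied input file once through the fuzz entry point instead of generating inputs in a loop:

```python
# Conceptual analogue of the standalone runner: feed each input file exactly
# once through the fuzz entry point, rather than fuzzing continuously as
# libFuzzer would.  Names are illustrative only.
import sys

def test_one_input(data: bytes) -> None:
    # Stand-in for the target's LLVMFuzzerTestOneInput-style entry point.
    pass

for path in sys.argv[1:]:
    with open(path, "rb") as f:
        test_one_input(f.read())
    print(f"Executed {path}")
```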
-
- Nov 16, 2019
-
-
Gregory Szorc authored
We shouldn't generally be using Windows line endings in files under version control. I've accidentally committed a few files with Windows line endings recently. So let's add a test for this. Differential Revision: https://phab.mercurial-scm.org/D7448
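A hedged sketch of the sort of check such a test performs; the actual test's file-list source and reporting differ:

```python
# Hedged sketch: flag files that contain Windows (CRLF) line endings.  The
# files are taken from the command line here; the real test gets its list of
# tracked files differently.
import sys
from pathlib import Path

bad = [p for p in map(Path, sys.argv[1:]) if b"\r\n" in p.read_bytes()]
for path in bad:
    print(f"{path}: contains CRLF line endings")
sys.exit(1 if bad else 0)
```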
-
- Oct 24, 2019
-
-
Gregory Szorc authored
Consolidating functionality for invoking code in the hgpackaging package through a single CLI entry point will make things simpler when we add more complexity to that package. For example, it will allow us to run things out of a virtualenv with third party packages. This commit consolidates functionality from the Inno and WiX build.py scripts into a new packaging.py script. That script simply creates a virtualenv and runs the CLI functionality in it. The new virtualenv is populated with jinja2 because I felt it easier to incorporate requirements file processing in this commit and we will soon use jinja2 in an upcoming commit. The unified CLI functionality will also make it easier to script other packaging workflows going forward. e.g. RPM, Debian, and macOS packaging. Differential Revision: https://phab.mercurial-scm.org/D7156
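A hedged sketch of the virtualenv-then-reinvoke pattern described above; the requirements file name and the CLI module path are assumptions, not the real script's values:

```python
# Hedged sketch: create a throwaway virtualenv, install the packaging
# requirements into it, then re-invoke the CLI with that interpreter.
# File and module names are assumptions.
import subprocess
import sys
import venv
from pathlib import Path

env_dir = Path("build/venv-packaging")
venv.EnvBuilder(with_pip=True).create(env_dir)

python = env_dir / ("Scripts/python.exe" if sys.platform == "win32" else "bin/python")
subprocess.run([python, "-m", "pip", "install", "-r", "requirements.txt"], check=True)
subprocess.run([python, "-m", "hgpackaging.cli", *sys.argv[1:]], check=True)
```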
-
- Oct 29, 2019
-
-
Pierre-Yves David authored
Now that the official black has all we want, we can drop this.
-
- Oct 14, 2019
-
-
Augie Fackler authored
Black won't read this automatically (you'll have to specify --config), but having a pyproject.toml *at all* puts pip in PEP 517/518 mode which breaks us for obscure reasons I don't understand. Rather than waste a ton of time fighting with pip, let's just do this. Differential Revision: https://phab.mercurial-scm.org/D7087
-
- Oct 06, 2019
-
-
Augie Fackler authored
This is black with https://github.com/psf/black/pull/826 applied as of today. The current git hash of black master is d9e71a75ccfefa3d9156a64c03313a0d4ad981e5, and the hash of my commit is dc1add6e94e212eff37bb3619e1422fb3c6d8dc8. In order to use this, you need to install `black` (from github master) and `typed-ast` using pip, preferably into python3, and then you can run `grey.py` with that Python and you'll have my patched version of black, which is how we've been formatting the codebase. Once my PR is merged, I'll follow up by removing this fork and updating instructions in the example config. # no-check-commit bad style Differential Revision: https://phab.mercurial-scm.org/D7002
-
Augie Fackler authored
Differential Revision: https://phab.mercurial-scm.org/D6993
-
- Oct 05, 2019
-
-
Gregory Szorc authored
The CI code for running the Try Server requires more thorough review. Let's add just the client-side bits for submitting to Try so others can start using it. Differential Revision: https://phab.mercurial-scm.org/D6983
-
- Sep 06, 2019
-
-
Gregory Szorc authored
The new command and associated functionality can be used to automate the publishing of Windows release artifacts. It supports uploading wheels to PyPI (using twine) and copying the artifacts to mercurial-scm.org and updating the latest.dat file to advertise them via the website. I ran `automation.py publish-windows-artifacts 5.1.1` and it appeared to "just work." But the real test will be to do this on the next release... Differential Revision: https://phab.mercurial-scm.org/D6786
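A hedged sketch of the PyPI half of that flow; the real automation drives twine differently and also handles the mercurial-scm.org copy and latest.dat update, which aren't shown:

```python
# Hedged sketch: upload the built Windows wheels to PyPI with twine.
# The dist/ glob is illustrative.
import glob
import subprocess

wheels = glob.glob("dist/*.whl")
subprocess.run(["twine", "upload", *wheels], check=True)
```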
-
- Apr 27, 2019
-
-
Gregory Szorc authored
Building on top of our Windows automation support, this commit implements support for performing automated tasks on remote Linux machines. Specifically, we implement support for running tests on ephemeral EC2 instances. This seems to be a worthwhile place to start, as building packages on Linux is more or less a solved problem because we already have facilities for building in Docker containers, which provide "good enough" reproducibility guarantees.

The new `run-tests-linux` command works similarly to `run-tests-windows`: it ensures an AMI with hg dependencies is available, provisions a temporary EC2 instance with this AMI, pushes local changes to that instance via SSH, then invokes `run-tests.py`. Using this new command, I am able to run the entire test harness substantially faster than I am on my local machine, courtesy of access to high-core-count EC2 instances:

wall: 16:20 ./run-tests.py -l (i7-6700K)
wall: 14:00 automation.py run-tests-linux --ec2-instance c5.2xlarge
wall: 8:30 automation.py run-tests-linux --ec2-instance m5.4xlarge
wall: 8:04 automation.py run-tests-linux --ec2-instance c5.4xlarge
wall: 4:30 automation.py run-tests-linux --ec2-instance c5.9xlarge
wall: 3:57 automation.py run-tests-linux --ec2-instance m5.12xlarge
wall: 3:05 automation.py run-tests-linux --ec2-instance m5.24xlarge
wall: 3:02 automation.py run-tests-linux --ec2-instance c5.18xlarge

~3 minute wall time to run pretty much the entire test harness is not too bad! The AMIs install multiple versions of Python, and the run-tests-linux command specifies which one to use:

automation.py run-tests-linux --python system3
automation.py run-tests-linux --python 3.5
automation.py run-tests-linux --python pypy2.7

By default, the system Python 2.7 is used. Using this functionality, I was able to identify some unexpected test failures on PyPy!

Included in the feature is support for running with alternate filesystems. You can simply pass --filesystem to the command to specify the type of filesystem to run tests on. When the ephemeral instance is started, a new filesystem will be created and tests will run from it:

wall: 4:30 automation.py run-tests-linux --ec2-instance c5.9xlarge
wall: 4:20 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem xfs
wall: 4:24 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem tmpfs
wall: 4:26 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem ext4

We also support multiple Linux distributions:

$ automation.py run-tests-linux --distro debian9
total time: 298.1s; setup: 60.7s; tests: 237.5s; setup overhead: 20.4%
$ automation.py run-tests-linux --distro ubuntu18.04
total time: 286.1s; setup: 61.3s; tests: 224.7s; setup overhead: 21.4%
$ automation.py run-tests-linux --distro ubuntu18.10
total time: 278.5s; setup: 58.2s; tests: 220.3s; setup overhead: 20.9%
$ automation.py run-tests-linux --distro ubuntu19.04
total time: 265.8s; setup: 42.5s; tests: 223.3s; setup overhead: 16.0%

Debian and Ubuntu are supported because those are what I use and am most familiar with. It should be easy enough to add support for other distros. Unlike the Windows AMIs, Linux EC2 instances bill per second, so the cost of instantiating an ephemeral instance isn't as severe. That being said, there is some overhead, as it takes several dozen seconds for the instance to boot, push local changes, and build Mercurial. During this time, the instance is largely CPU idle and wasting money. Even with this inefficiency, running tests is relatively cheap: $0.15-$0.25 per full test run. A machine running tests as efficiently as these EC2 instances would cost, say, $6,000, so you can run the test harness more than 20,000 times for the cost of an equivalent machine. Running tests in EC2 is almost certainly cheaper than buying a beefy machine for developers to use :) # no-check-commit because foo_bar function names Differential Revision: https://phab.mercurial-scm.org/D6319
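The back-of-the-envelope arithmetic behind that claim, using the figures quoted above:

```python
# Quick check of the ">20,000 runs" claim, taking the upper cost bound quoted
# in the message.
machine_cost = 6000.0       # dollars, the hypothetical beefy workstation
cost_per_run = 0.25         # dollars, upper end of the quoted per-run cost
print(machine_cost / cost_per_run)   # 24000.0 -> more than 20,000 full runs
```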
-
- Mar 15, 2019
-
-
Gregory Szorc authored
Sometimes you don't have access to a machine in order to do something. For example, you may not have access to a Windows machine required to build Windows binaries or run tests on that platform. This commit introduces a pile of code intended to help "automate" common tasks, like building release artifacts.

In its current form, the automation code provides functionality for performing tasks on Windows EC2 instances. The hgautomation.aws module provides functionality for integrating with AWS. It manages EC2 resources such as IAM roles, EC2 security groups, AMIs, and instances. The hgautomation.windows module provides a higher-level interface for performing tasks on remote Windows machines. The hgautomation.cli module provides a command-line interface to these higher-level primitives.

I attempted to structure Windows remote machine interaction around Windows Remoting / PowerShell. This is kinda/sorta like SSH + shell, but for Windows. In theory, most of the functionality is cloud provider agnostic, as we should be able to use any established WinRM connection to interact with a remote. In reality, we're tightly coupled to AWS at the moment because I didn't want to prematurely add abstractions for a 2nd cloud provider. (1 was hard enough to implement.)

In the aws module is code for creating an image with a fully functional Mercurial development environment. It contains VC9, VC2017, msys, and other dependencies. The image is fully capable of building all the existing Mercurial release artifacts and running tests. There are a few things that don't work. For example, running Windows tests with Python 3. But building the Windows release artifacts does work. And that was an impetus for this work. (Although we don't yet support code signing.)

Getting this functionality to work was extremely time consuming. It took hours debugging permissions failures and other wonky behavior due to PowerShell Remoting. (The permissions model for PowerShell is crazy and you brush up against all kinds of issues because of the user/privileges of the user running the PowerShell and the permissions of the PowerShell session itself.)

The functionality around AWS resource management could use some improving. In theory we support shared tenancy via resource name prefixing. In reality, we don't offer a way to configure this. Speaking of AWS resource management, I thought about using a tool like Terraform to manage resources. But at our scale, writing a few dozen lines of code to manage resources seemed acceptable. Maybe we should reconsider this if things grow out of control. Time will tell.

Currently, emphasis is placed on Windows. But I only started there because it was likely to be the most difficult to implement. It should be relatively trivial to automate tasks on remote Linux machines. In fact, I have a ~1 year old script to run tests on a remote EC2 instance. I will likely be porting that to this new "framework" in the near future. # no-check-commit because foo_bar functions Differential Revision: https://phab.mercurial-scm.org/D6142
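A hedged sketch of the kind of EC2 provisioning the hgautomation.aws module is described as handling; the AMI ID, instance type, and tag are placeholders, and the real module does considerably more (IAM roles, security groups, AMI builds):

```python
# Hedged sketch: provision an EC2 instance with boto3 and wait for it to come
# up.  The AMI ID, instance type, and name tag are placeholders, not values
# from hgautomation.aws.
import boto3

ec2 = boto3.resource("ec2", region_name="us-west-2")
(instance,) = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI with hg dependencies
    InstanceType="t3.large",          # placeholder instance type
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[
        {"ResourceType": "instance",
         "Tags": [{"Key": "Name", "Value": "hg-automation-example"}]},
    ],
)
instance.wait_until_running()
instance.reload()
print(instance.id, instance.public_dns_name)
```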
-
- Mar 08, 2019
-
-
Gregory Szorc authored
Like we did for Inno Setup, we want to make it easier to produce WiX installers. This commit does that. We introduce a new hgpackaging.wix module for performing all the high-level tasks required to produce WiX installers. This required miscellaneous enhancements to existing code in hgpackaging, including support for signing binaries. A new build.py script for calling into the module APIs has been created. It behaves very similarly to the Inno Setup build.py script.

Unlike Inno Setup, we didn't have code in the repo previously to generate WiX installers. It appears that all existing automation for building WiX installers lives in the https://bitbucket.org/tortoisehg/thg-winbuild repository - most notably in its setup.py file. My strategy for inventing the code in this commit was to step through the code in that repo's setup.py and observe what it was doing. Despite the length of setup.py in that repository, the actual number of steps required to produce a WiX installer is quite low. It consists of a basic py2exe build plus invocations of candle.exe and light.exe to produce the MSI.

One rabbit hole that gave me fits was locating the Visual Studio 9 C Runtime merge modules. These merge modules are only present on your system if you have a full Visual Studio 2008 installation. Fortunately, I have a copy of Visual Studio 2008 and was able to install all the required updates. I then uploaded these merge modules to a personal repository on GitHub. That is where the added code references them from. We probably don't need to ship the merge modules. But that is for another day.

The installs from the MSIs produced with the new automation differ from the last official MSI in the following ways:
* Our HTML manual pages have UNIX line endings instead of Windows.
* We ship modules in the mercurial.pure package. It appears the upstream packaging code is not including this package due to omission (they supply an explicit list of packages that has drifted out of sync with our setup.py).
* We do not ship various distutils.* modules. This is because virtualenvs have a custom distutils/__init__.py that automagically imports distutils from its original location and py2exe gets confused by this. We don't use distutils in core Mercurial and don't provide a usable python.exe, so this omission should be acceptable.
* The version of the enum package is different and we ship an enum.pyc instead of an enum/__init__.py.
* The version of the docutils package is different and we ship a different set of files.
* The version of Sphinx is drastically newer and we ship a number of files the old version did not. (I'm not sure why we ship Sphinx - I think it is a side-effect of the way the THG code was installing dependencies.)
* We ship the idna package (a dependency of requests, which is a dependency of newer versions of Sphinx).
* The version of imagesize is different and we ship an imagesize.pyc instead of an imagesize/__init__.pyc.
* The version of the jinja2 package is different and the set of files differs.
* We ship the packaging package, which is a dependency for Sphinx.
* The version of the pygments package is different and the set of files differs.
* We ship the requests package, which is a dependency for Sphinx.
* We ship the snowballstemmer package, which is a dependency for Sphinx.
* We ship the urllib3 package, which is a dependency for requests, which is a dependency for Sphinx.
* We ship a newer version of the futures package, which includes a handful of extra modules that match Python 3 module names.

# no-check-commit because foo_bar naming Differential Revision: https://phab.mercurial-scm.org/D6097
-
- Mar 07, 2019
-
-
Gregory Szorc authored
py2exe builds are shared between Inno Setup and WIX. We'll want the logic for performing py2exe builds to be reusable across the code for both installers. This commit extracts the py2exe-specific functionality into its own module. There's definitely room to customize things further. This will be done in future commits, as necessary. (I'm not even sure what customizations WIX will require yet. Presumably a lot.) Differential Revision: https://phab.mercurial-scm.org/D6091
-
Gregory Szorc authored
Aspects of building the Inno Setup and WIX installers are shared. It will make sense for them to share code. Plus, having code in a reusable library (as opposed to a standalone script) is just a better approach. This commit moves the core logic to build the Inno Setup installer into the hgpackaging package. inno/build.py is now a simple frontend script that calls into a module to do the bulk of the work. As part of this change, I also found a typo in build() where it was referencing "iscc" instead of "iscc_exe." Because "iscc" was in the global scope via the only caller, things just happened to work before. That's another benefit of always using functions and not putting global code for __main__ in the same file as library code. Differential Revision: https://phab.mercurial-scm.org/D6087
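A hedged illustration of the pitfall mentioned above, with invented names; a function that references an undefined name can still appear to work when its only caller happens to leak that name into module scope:

```python
# Hedged illustration of the "iscc" vs "iscc_exe" bug class.  Names are
# invented for the example.
def build(iscc_exe):
    # Bug: meant to use the parameter `iscc_exe`, but refers to `iscc`,
    # which happens to exist as a module-level global.
    print("running", iscc)

iscc = "ISCC.exe"  # module-level name defined by the sole caller's scope
build(r"C:\Program Files\Inno Setup\ISCC.exe")  # prints "running ISCC.exe"
```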
-
Gregory Szorc authored
As we will introduce more code to support packaging, it will be useful to have download code in its own module. Differential Revision: https://phab.mercurial-scm.org/D6084
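A hedged sketch of what such a download helper typically looks like (fetch, verify a known SHA-256, then write); the function name and signature are assumptions, not the module's actual API:

```python
# Hedged sketch: download a URL and verify the payload against a known SHA-256
# digest before writing it to disk.
import hashlib
import urllib.request
from pathlib import Path

def download_to_path(url: str, dest: Path, sha256: str) -> None:
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    digest = hashlib.sha256(data).hexdigest()
    if digest != sha256:
        raise ValueError(f"hash mismatch for {url}: got {digest}")
    dest.write_bytes(data)
```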
-
Gregory Szorc authored
Previously, contrib/packaging behaved as a root to a package directory and we had a "packagingutil" module. As I work more on packaging code, we'll want to have more code shared between different packaging tools. I think it makes sense to have a single package containing multiple modules than multiple top-level modules. This commit establishes an "hgpackaging" package by moving the existing packagingutil code to it. Differential Revision: https://phab.mercurial-scm.org/D6083
-
- Mar 04, 2019
-
-
Gregory Szorc authored
The official Inno installer build process is poorly documented. And attempting to reproduce behavior of the installer uploaded to www.mercurial-scm.org has revealed a number of unexpected behaviors. This commit attempts to improve the state of reproducibility of the Inno installer by introducing a Python script to largely automate the building of the installer.

The new script (which must be run from an environment with the Visual C++ environment configured) takes care of producing an Inno installer. When run from a fresh Mercurial source checkout with all the proper system dependencies (the VC++ toolchain, Windows 10 SDK, and Inno tools) installed, it "just works." The script takes care of downloading all the Python dependencies in a secure manner and manages the build environment for you. You don't need any additional config files: just launch the script, pointing it at an existing Python and ISCC binary, and it takes care of the rest.

The produced installer creates a Mercurial installation with a handful of differences from the existing 4.9 installers (produced by someone else):
* add_path.exe is missing (this was removed a few changesets ago)
* The set of api-ms-win-core-* DLLs is different (I suspect this is due to me using a different UCRT / Windows version).
* kernelbase.dll and msasn1.dll are missing.
* There are a different set of .pyc files for dulwich, keyring, and pygments due to us using the latest versions of each.
* We include Tcl/Tk DLLs and .pyc files (I'm not sure why these are missing from the existing installers).
* We include the urllib3 and win32ctypes packages (which are dependencies of dulwich and pywin32, respectively). I'm not sure why these aren't present in the existing installers.
* We include a different set of files for the distutils package. I'm not sure why. But it should be harmless.
* We include the docutils package (it is getting picked up as a dependency somehow). I think this is fine.
* We include a copy of argparse.pyc. I'm not sure why this was missing from existing installers.
* We don't have a copy of sqlite3/dump.pyc. I'm not sure why. The SQLite C extension code only imports this module when conn.iterdump() is called. It should be safe to omit.
* We include files in the email.test and test packages. The set of files is small and their presence should be harmless.

The new script and support code is written in Python 3 because it is brand new and independent code and I don't believe new Python projects should be using Python 2 in 2019 if they have a choice about it. The readme.txt file has been renamed to readme.rst and overhauled to reflect the existence of build.py. Differential Revision: https://phab.mercurial-scm.org/D6066
-
- Feb 04, 2019
-
-
Gregory Szorc authored
To avoid a SyntaxWarning on Python 3.8 due to invalid \ escape. Differential Revision: https://phab.mercurial-scm.org/D5837
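A minimal illustration of the class of fix (not the actual file touched here): Python 3.8 warns about unrecognized backslash escapes in ordinary string literals, and raw strings avoid that:

```python
# Hedged illustration: regex patterns written as raw strings do not trigger
# the Python 3.8 SyntaxWarning about invalid \ escapes.  The pattern below is
# an example, not the one from the commit.
import re

pattern = re.compile(r'\d+\.\d+')  # raw string: no SyntaxWarning
print(bool(pattern.match("4.9")))  # True
```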
-
- Oct 12, 2018
-
-
Augie Fackler authored
Differential Revision: https://phab.mercurial-scm.org/D4984
-
- Aug 10, 2018
-
-
Augie Fackler authored
It now passes on Python 3. # skip-blame b prefix Differential Revision: https://phab.mercurial-scm.org/D4250
-
- Apr 20, 2018
-
-
Yuya Nishihara authored
-
- Feb 26, 2018
-
-
Augie Fackler authored
Eight and a half years ago, as my starter bug on code.google.com, I investigated a mysterious "broken pipe" error from seemingly random clients[0]. That investigation revealed a tragic story: the Python standard library's httplib was (and remains) barely functional. During large POSTs, if a server responds early with an error (even a permission denied error!) the client only notices that the server closed the connection and everything breaks. Such server behavior is implicitly legal under RFC 2616 (the latest HTTP RFC as of when I was last working on this), and my understanding is that later RFCs have made it explicitly legal to respond early with any status code outside the 2xx range.

I embarked, probably foolishly, on a journey to write a new http library with better overall behavior. The http library appears to work well in most cases, but it can get confused in the presence of proxies, and it depends on select(2), which limits its utility if a lot of file descriptors are open. I haven't touched the http library in almost two years, and in the interim the Python community has discovered a better way[1] of writing network code. In theory some day urllib3 will have its own home-grown http library built on h11[2], or we could do that. Either way, it's time to declare our current confusingly-named "http2" client logic dead and move on.

I do hope to revisit this some day: it's still garbage that we can't even respond with a 401 or 403 without reading the entire POST body from the client, but the goalposts on writing a new http client library have moved substantially. We're almost certainly better off just switching to requests and eventually picking up their http fixes than trying to live with something that realistically only we'll ever use. Another approach would be to write an adapter so that Mercurial can use pycurl if it's installed. Neither of those approaches seems like it should be investigated prior to a release of Mercurial that works on Python 3: that's where the mindshare is going to be for any improvements to the state of the http client art.

0: http://web.archive.org/web/20130501031801/http://code.google.com/p/support/issues/detail?id=2716
1: http://sans-io.readthedocs.io/
2: https://github.com/njsmith/h11

Differential Revision: https://phab.mercurial-scm.org/D2444
-
- Nov 30, 2017
-
-
Yuya Nishihara authored
-