
Please allow "file:" for setup.cfg install_requires #1951

Closed
nedbat opened this issue Dec 28, 2019 · 46 comments · Fixed by #3253

@nedbat

nedbat commented Dec 28, 2019

The new declarative setup.cfg syntax is fabulous. It's wonderful to have two-line setup.py files!

I tried to use "install_requires = file: requirements/base.in" and found that "file:" wasn't supported there. Our pip-compile workflow works really well, and it would be great to be able to start with base.in as our source of truth, and easily use it in our setup.cfg.

As a design point, I'm not sure it makes sense to special-case which fields can use which kinds of info-gatherers. Why can't I use "file:" for any field? I can understand if the implementation requires it, but the more the developer can choose how to structure their own world, the better.
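
For concreteness, this is the kind of thing I'd like to be able to write (a sketch; the path just reflects our layout):

# setup.cfg
[options]
install_requires = file: requirements/base.in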

@pganssle
Member

There are two things I'd like to address here:

I tried to use "install_requires = file: requirements/base.in" and found that "file:" wasn't supported there. Our pip-compile workflow works really well, and it would be great to be able to start with base.in as our source of truth, and easily use it in our setup.cfg.

My intuition is that I think that supporting file: in this context would mainly serve to encourage people to use their (version-locked) requirements.txt for their install_requires, which I consider to be an anti-pattern, since it is almost certainly the case that this should go the other direction. If there were a linter for build configurations, I'd want "imports a requirements.txt" to be in there. If 99% of people are going to use this feature to enable an anti-pattern, and 1% have a legitimate use like @nedbat's, I am -1 on it. For me, setup.py exists to enable "I know what I'm doing" use cases like this. (Though #1805 throws a bit of a monkey wrench in this, since static requirements in setup.py would actually be treated differently from requirements read in from a file).

Regardless of how this turns out in setuptools, it would be really nice if pip-compile could be used with arbitrary projects rather than just requirements.txt files, since that would make it a lot easier for install_requires to be your source of truth from which a lock file is generated.

As a design point, I'm not sure it makes sense to special-case which fields can use which kinds of info-gatherers. Why can't I use "file:" for any field? I can understand if the implementation requires it, but the more the developer can choose how to structure their own world, the better.

This is probably not a big deal for file: itself, but I am mildly worried about the idea of inventing a whole generic DSL for our "declarative" build files. The more complexity we add, the harder those files will be to actually parse or statically analyze, and eventually we'll get back into the situation we're in now, where we desperately want to move to more constrained build specs.

I would say I am -0 on both of these propositions. I'm instinctively conservative here and don't see a super compelling use case, though I will admit that for people like @nedbat who have many different contextual dependencies, it makes sense to provide them some ability to refactor and avoid having a huge monolithic configuration file.

One possible compromise here could be supporting file:, but throwing an error if == is specified in the included file. Surely this would be frustrating for people who want to use their locked requirements.txt file, but if the main reason we're not supporting file: at all is to frustrate people who want to reuse their locked requirements.txt, this might be a lower-impact way to do so that still addresses the "anti-pattern" portion of the issue.
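
To make the compromise concrete, here is a rough sketch of such a check (illustrative only, not actual setuptools code; it leans on the packaging library):

from packaging.requirements import Requirement

def reject_pins(requirement_lines):
    # requirement_lines: requirement strings read from the included file
    for line in requirement_lines:
        req = Requirement(line)
        if any(spec.operator in ("==", "===") for spec in req.specifier):
            raise ValueError(f"pinned requirement not allowed here: {line}")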

@layday
Member

layday commented Jan 6, 2020

My intuition is that I think that supporting file: in this context would mainly serve to encourage people to use their (version-locked) requirements.txt for their install_requires, which I consider to be an anti-pattern ...

This is something you should expect to see more of now, with pipx simplifying the installation of Python CLIs from PyPI. It is an anti-pattern in the context of libraries, but the reality is that setuptools and PyPI are also used for application delivery.

@pganssle
Member

I realized that this is a duplicate of #1074, which was closed as wontfix.

@jaraco Do you think anything has changed since #1074?

@nedbat
Author

nedbat commented Jan 15, 2020

Just because a feature can be mis-used does not mean it should be forbidden. I have a reason to use this, and I know what I am doing. Is it better that I hack together something with sed and make?

@jaraco
Member

jaraco commented Jan 15, 2020

Do you think anything has changed since #1074?

In my mind, things have changed as there is a more compelling use-case presented above.

I'd be open to a PR that (a) adds support for file: in this field and (b) documents officially the caveat that packagers should resist the temptation to simply reference requirements.txt especially if that file contains pinned requirements.

reality is setuptools [is] used for application delivery

I would recommend that if pinning is done, it should be done outside of setuptools, using tools like pip-compile, and that the underlying package (even if a final deployment of a local application) should specify its abstract requirements and not some set of concrete requirements; but I won't go so far as to disallow that usage.

I don't think the pipx example is a good one, as applications like tox and twine shouldn't have pinned requirements for the same reason that libraries shouldn't have pinned requirements--the end user may have different demands than the application had at the time it was last released (including honoring environmental variation).

@pganssle
Member

Just because a feature can be mis-used does not mean it should be forbidden.

I agree that that's true, but it really depends on baseline rates. If we add this and for 99.9% of people it is mis-used and it's still possible to accomplish it without this feature, I think we're doing people a disservice by giving them a feature that's so easy to misuse.

I think that's the case here except that encouraging people to use setup.py for their "escape hatch" in this scenario may impact our ability to provide guarantees about deterministic dependencies.

It seems like the safest way to do this would be to allow setup.cfg to parse a file in a format that pip doesn't understand (so there's no danger of using it with a requirements.txt), and try and get pip-compile changed to allow for compiling . and .[extras] (and possibly "just the dependencies of .").

@layday
Member

layday commented Jan 15, 2020

It seems like the safest way to do this would be to allow setup.cfg to parse a file in a format that pip doesn't understand (so there's no danger of using it with a requirements.txt), and try and get pip-compile changed to allow for compiling . and .[extras] (and possibly "just the dependencies of .").

I don't think pip-compile users would see the need to specify their dependencies in a separate file in a format exclusive to setup.cfg if pip-compile were able to extract dependencies from pip requirement specifiers. Would pinned dependencies be forbidden in this format or are you banking on obscurity?

@pganssle
Member

I don't think pip-compile users would see the need to specify their dependencies in a separate file in a format exclusive to setup.cfg if pip-compile were able to extract dependencies from pip requirement specifiers.

Not sure what you mean by "extract dependencies from pip requirement specifiers", but if you look at Ned's original use case I could definitely imagine someone with that many separate extras dependencies not wanting to combine those into one huge setup.cfg. I suppose the two feature requests are separable.

Would pinned dependencies be forbidden in this format or are you banking on obscurity?

I think neither of these is the case. The problem to solve here is that a lot of people don't understand the difference between requirements.txt and install_requires and want to "single source" their install metadata by having setuptools pull data from their requirements.txt. If we make it easy to do that, people will just do it, not realizing that it's the wrong thing to do. We want the right thing to do to be easy and the wrong thing to do to be hard (we don't need to make it impossible).

As long as it's not easy to output the setup.cfg format with pip freeze or pip-compile, and it's not easy to pip install the setup.cfg format other than through the normal mechanism of pip install ., then the design objective is satisfied. Sure, someone could write a converter between the two things, but hopefully when people google, "How do I convert between requirements.txt and setuptools requirements file?" they will find some documentation saying, "You shouldn't, that's why they have two different formats."

As an aside, this reminds me of the Susan B. Anthony Dollar, which was a US$1 coin that looks almost identical to a quarter ($0.25). People complained because it was very easy to confuse the two and give someone $4 when you meant to give them $1. It was such a similar shape and size that you could even put the $1 coins in vending machines designed for $0.25 coins. My reasoning for using a different format is similar to the reason for making $1 coins a different color and size than $0.25 coins: install_requires and requirements.txt serve different purposes and we don't want to confuse people by making them similar enough that a lot of tooling almost (but not always) "works".

This is just an idea, though. I'm willing to be persuaded otherwise.

@layday
Member

layday commented Jan 16, 2020

Not sure what you mean by "extract dependencies from pip requirement specifiers" ...

I mean these: . and .[extras]. There is an open issue on the pip-tools tracker to support build system-agnostic requirements, meaning 'locked' requirements can be compiled from any pip-installable package.
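
For illustration, the kind of invocation that feature request asks for (hypothetical; pip-compile does not accept these arguments today):

$ pip-compile --extra tests --output-file requirements.txt .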

... but if you look at Ned's original use case I could definitely imagine someone with that many separate extras dependencies not wanting to combine those into one huge setup.cfg. I suppose the two feature requests are separable.

If no other program is able to interoperate with the new file - the only benefit there being to reduce the size of setup.cfg - I think it'd fall kind of flat. readme and version take a file: value because single sourcing the readme and version number is desirable.

The problem to solve here is that a lot of people don't understand the difference between requirements.txt and install_requires ...

I think a lot of people don't appreciate that these are different formats. That you are able to pipe a requirements.txt into install_requires and have it work (for the most part) appears to be an accident of history. But do a lot of people who work with setuptools (package authors) not understand that pinning their dependencies is poor practice? I'm not so sure.

@merwok
Contributor

merwok commented Jan 23, 2020

Another idea: pip-compile can read source requirements from setup.py, so it could be extended to support setup.cfg too.

@zed

zed commented May 22, 2020

Note: requirements.in and requirements.txt are substantially different: the *.in file doesn't pin all versions as a rule; otherwise there would be no need to pip-compile it into requirements.txt.

I understand the logic of avoiding requirements.txt in setup.cfg, but what is the argument against: install_requires = file: requirements.in?
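
To illustrate the difference, a hypothetical requirements.in only constrains what it must:

# requirements.in -- abstract, mostly unpinned
requests >= 2.20
click

Running pip-compile requirements.in then produces a requirements.txt in which every transitive dependency is pinned to an exact version.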

@ThiefMaster

CI+caching is another use case where this would be useful: I'd like to cache my venv folder (using e.g. the hash of requirements.txt as a cache key), but right now this means I have to duplicate the requirements in setup.cfg and requirements.txt for this to work.
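
For example, with GitHub Actions the caching step looks roughly like this (a sketch using actions/cache; adapt to your CI system):

- name: Cache virtualenv
  uses: actions/cache@v3
  with:
    path: .venv
    key: venv-${{ runner.os }}-${{ hashFiles('requirements.txt') }}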

@s0undt3ch

I too have the use case where I want to reuse my .in requirements for packaging while I also maintain a locked .txt file for CI purposes.

At the bare minimum, it would be awesome if setuptools provided an entry point so that a third-party library could read requirements files and set the various *_requires attributes on the distribution instance, similar to how setuptools_scm sets the version attribute. In this case, setuptools would not be "responsible" for the misuse; that third-party library would.

Would you be open to such an entry point?
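
Roughly, such a plugin could hook the existing setuptools.finalize_distribution_options entry point (the same group setuptools_scm uses for the version); the module and file names below are hypothetical:

# in the plugin's own package metadata:
# [options.entry_points]
# setuptools.finalize_distribution_options =
#     requirements_files = my_plugin:finalize

def finalize(dist):
    # read a requirements file and set install_requires on the Distribution
    with open("requirements/base.in") as f:
        dist.install_requires = [
            line.strip()
            for line in f
            if line.strip() and not line.startswith(("#", "-"))
        ]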

@s0undt3ch

So, for those interested, I managed to hack together something that might also help you: https://pypi.org/project/setuptools-declarative-requirements

@ghost

ghost commented Nov 5, 2020

Another idea: pip-compile can read source requirements from setup.py, so it could be extended to support setup.cfg too.

Some more info on this: if you have a setup.py that gets its install_requires from setup.cfg, this seems to already work (did a quick test with pip-compile 5.3.1), and there seems to already be an issue to support setup.cfg directly (jazzband/pip-tools#1047).

@jaraco
Member

jaraco commented Nov 5, 2020

Another idea: pip-compile can read source requirements from setup.py, so it could be extended to support setup.cfg too.

I'd take this a step further and argue that pip-compile (and other projects that need to inspect the requirements for the source of a project) should use pep517.meta.load to extract/parse metadata (and requirements). Such an approach would not only support setup.py and setup.cfg files, but would also support extras and builders other than setuptools.

Example:

tempora master $ pip-run -q pep517 -- -c "from pep517 import meta; print(meta.load('.').metadata.get_all('Requires-Dist'))"
['pytz', 'jaraco.functools (>=1.20)', "sphinx ; extra == 'docs'", "jaraco.packaging (>=3.2) ; extra == 'docs'", "rst.linker (>=1.9) ; extra == 'docs'", "pytest (!=3.7.3,>=3.5) ; extra == 'testing'", "pytest-checkdocs (>=1.2.3) ; extra == 'testing'", "pytest-flake8 ; extra == 'testing'", "pytest-cov ; extra == 'testing'", "jaraco.test (>=3.2.0) ; extra == 'testing'", "backports.unittest-mock ; extra == 'testing'", "freezegun ; extra == 'testing'", "pytest-freezegun ; extra == 'testing'", 'pytest-black (>=0.3.7) ; (platform_python_implementation != "PyPy") and extra == \'testing\'', 'pytest-mypy ; (platform_python_implementation != "PyPy") and extra == \'testing\'']

@nbraud

nbraud commented Jan 6, 2021

CI+caching is another use case where this would be useful: I'd like to cache my venv folder (using e.g. the hash of requirements.txt as a cache key), but right now this means I have to duplicate the requirements in setup.cfg and requirements.txt for this to work.

Same thing here: preinstalling dependencies in a container image (that's rebuilt when dependencies change) can yield <10s CI times (incl. testing on all supported versions of CPython, linting, etc.) on most of my projects, but doing that in a sensible way requires having the list of dependencies in a separate file.

Quick CI times can do a lot for project health and developer experience:

  • waiting on CI means contributors are more tempted to stuff somewhat-unrelated changes in the same PR;
  • waiting also interrupts the “flow”, forcing the developer to either stop coding (forcing them out of the productive headspace if they reached it) or juggle more concurrent PRs (higher cognitive load, risk of merge conflicts, ...)

I know it's perfectly feasible with setup.py, but given that this is pretty much always the one thing that still requires writing that file, file: support in the *_requires options would let many projects get rid of setup.py altogether.
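
The container side is straightforward once the dependency list lives in its own file; a sketch of such a test image:

# Dockerfile -- rebuilt only when requirements.txt changes
FROM python:3.9-slim
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt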

@nbraud

nbraud commented Jan 6, 2021

PS: I just added to bork (a dev/release automation tool for Python projects) the ability to dump the list of dependencies, using pep517.meta as per @jaraco's comment. This is far from a complete solution, but at least it will be a stopgap for Bork users.

@dbowring

For the pip-compile use case, it's now possible to compile from setup.cfg. For example:

# setup.cfg
[metadata]
name = my-package
# ...
[options]
install_requires =
    requests>2,<3

[options.extras_require]
dev =
    mypy
    types-requests
$ pip-compile --extra dev --output-file requirements-dev.txt setup.cfg

Resulting in:

#
# This file is autogenerated by pip-compile with python 3.9
# To update, run:
#
#    pip-compile --extra=dev --output-file=requirements-dev.txt setup.cfg
#

certifi==2021.5.30
    # via requests
chardet==4.0.0
    # via requests
idna==2.10
    # via requests
mypy==0.910
    # via my-package (setup.cfg)
mypy-extensions==0.4.3
    # via mypy
requests==2.25.1
    # via my-package (setup.cfg)
toml==0.10.2
    # via mypy
types-requests==2.25.0
    # via my-package (setup.cfg)
typing-extensions==3.10.0.0
    # via mypy
urllib3==1.26.6
    # via requests

@ssbarnea

I was looking for this feature for years, but since pip-compile got support for reading setup.cfg, I am no longer sure how critical this is. One of the deep PITAs remaining is that the dependabot team failed to add support for reading setup.cfg, ignoring what is in fact the recommended way to record dependencies for Python projects. Also, I was not able to find a tool that can properly update setup.cfg with new requirements; maybe someone watching this ticket knows of one.

If support for file: is added, it should also be added for extras, or we would end up with requirements in multiple places in multiple formats.

My use case for pip-compile is a bit different from the original one: I am using it to build a constraints.txt file which locks everything, including the test dependencies, so we can be sure that two runs on CI/CD or locally use the same libraries, making testing a bit more predictable. Those constraints do not really end up as effective package dependencies, as those are more relaxed ranges manually updated inside the setup.cfg files.
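
Concretely, that workflow looks something like this (the extra name is illustrative):

$ pip-compile --extra test --output-file constraints.txt setup.cfg
$ pip install -e '.[test]' -c constraints.txt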

@adamcunnington

adamcunnington commented Aug 7, 2021

I don't understand why this is still being debated :|.

Let the developer provide install_requires = file: requirements.txt. What's the big deal?
Unfortunately, setuptools-declarative-requirements is "broken" as of some recent changes in the ecosystem (not sure of the exact cause), so that option is now less attractive as it only works with absolute paths.

@henryiii
Contributor

henryiii commented Nov 3, 2022

FYI, there's an easy footgun with this feature: if you forget to include the requirements source in the SDist, you'll get broken SDists. Technically, I think you could fall back on the egg-info if (and only if) this happened, but I don't know if it's worth it.

pypa/build#530

Also, this is the requirements it is including, to see how this is being used in the wild. :D
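
Until the inclusion happens automatically, the workaround is to ship the referenced file explicitly (path illustrative):

# MANIFEST.in
include requirements/base.in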

@jaraco
Member

jaraco commented Nov 3, 2022

Probably Setuptools' file inference should always include any files it used to build metadata (i.e. any file: reference).

I note that this issue would not have occurred in a system where the file is stored in source control and a SCM-based file finder was employed (e.g. setuptools_scm).

@abravalheri
Contributor

abravalheri commented Nov 4, 2022

FYI, there's an easy footgun with this feature

That is very true.

Since it was first proposed, there was a general understanding that this feature is "not for everyone", and that the developer "needs to know what they are doing". So this kind of problem does not really come as a surprise to me.

There are a few open FR tickets related to this (#2821, #3570, maybe more) with the help wanted label, so that users who care about this extra functionality can contribute. (I do think that the "use setuptools-scm or an equivalent plugin" advice pointed out by Jason is a good temporary solution while the automatic inclusion is not implemented.)

Meanwhile we can also add a "warning" box in the documentation.

@layday
Member

layday commented Nov 4, 2022

Well, I don't think that the developer needs to know what they are doing to use file:; in fact, project configuration is almost entirely imitative.

@pganssle
Member

pganssle commented Nov 4, 2022

Well, I don't think that the developer needs to know what they are doing to use file:; in fact, project configuration is almost entirely imitative.

The whole reason I was opposed to this feature is that it is enabling a bad workflow that many people naively think would be a good thing to do (including a requirements.txt). The reason it was added was to make it more convenient for some people who want to do a different thing that most people don't want to do (include a requirements.in file). You absolutely need to know what you are doing to use this feature correctly.

Of course, the much better solution to this would be for pip-compile to support compiling from PEP 621 metadata or from PEP 517 metadata, since that works no matter what your backend. When / if that happens, the only people using this feature will be people who don't know that it's not needed and people doing the wrong thing. 🙁

@spacemanspiff2007

spacemanspiff2007 commented Feb 15, 2023

@pganssle
I stumbled across this issue a couple of times and still don't understand how a pinned version in setup.cfg is an anti-pattern.

Currently I distribute multiple applications through pypi.
I always have two requirements files:

  • requirements.txt
  • requirements_tests.txt

requirements.txt:

dependency_a == 2.1
dependency_b >= 3.1, < 3.4

requirements_tests.txt:

-r requirements.txt
dependency_test_1 ==1.0
dependency_test_2 >=2.0, < 2.1

Some versions are properly pinned and some dependencies use semantic versioning so I can use a version range in the requirements files.

During CI and local testing I install requirements_tests.txt, and in my setup.py I just read the contents of requirements.txt and add it as install_requires.
That way I can ensure that the versions I distribute are always the versions I used for testing, so the distributed application always works.
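
The setup.py side of this is just a few lines; a sketch of the pattern (the comment filtering is illustrative):

from pathlib import Path
from setuptools import setup

setup(
    # ... name, version, and other metadata ...
    install_requires=[
        line
        for line in Path("requirements.txt").read_text().splitlines()
        if line and not line.startswith("#")
    ],
)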

Since I am not distributing, e.g., a low-level library, I thought it would be good practice to pin the dependencies and ensure the application installation is reproducible.
Could you elaborate on why this is considered an anti-pattern?

@sinoroc

sinoroc commented Feb 15, 2023

@spacemanspiff2007 From my point of view:

For applications it is less of an issue than for libraries. I do not think anyone is saying the opposite. But...

Even for applications it can end up being counter-productive to hard-pin the dependency versions directly in the package metadata. If I want to install one of your applications X years from now, long after you have stopped maintaining this application, while the dependencies of this application have received later releases with critical bug and security fixes... How hard is it going to be for me to install the application with the fixed dependencies if their versions are hard-pinned in the package metadata of the application? How many hoops will I have to jump through?

My recommendation is:

  • In the package metadata (libraries and applications), exclude the dependency version ranges that you know for a fact are incompatible at the time you are writing the package metadata. That's it. (A short sketch follows this list.)

    • You cannot predict the future, so avoid upper bounds. Even in SemVer, incrementing the major version number does not equal a breaking change: it is perfectly acceptable to increment major without making a breaking change. There is a widespread falsehood at play there.
  • For an application, you should provide and document a safe and recommended set of pinned dependencies that the user can opt to use (maybe in the form of a requirements.txt file). Recommended being the key word here: you should let the users in charge of doing the installation decide what is best for them, their system, their environment, their own specific requirements that you know nothing about.
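
A short sketch of the first point (package names hypothetical):

# setup.cfg -- exclude only versions known to be incompatible today,
# with no speculative upper bound
[options]
install_requires =
    dependency_a >= 2.1
    dependency_b != 3.3.0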

And finally, there is also not much debate about the fact that wheels (and sdists and whatever else we can find on PyPI right now) are okay-ish but not great means of distributing applications (for libraries there is not much to complain about). If you are serious about distributing applications, you should think about moving towards additional tooling.

@dolfinus

dolfinus commented Feb 15, 2023

@spacemanspiff2007 Instead of >= 3.1; < 3.4 you should use >= 3.1, < 3.4

@spacemanspiff2007

@spacemanspiff2007 Instead of >= 3.1; < 3.4 you should use >= 3.1, < 3.4

Thanks for the hint. I just typed it together quickly; of course I'm using , and the correct syntax 😉


Even for applications it can end up being counter-productive to hard-pin the dependency versions directly in the package metadata. If I want to install one of your applications X years from now, long after you have stopped maintaining this application, while the dependencies of this application have received later releases with critical bug and security fixes... How hard is it going to be for me to install the application with the fixed dependencies if their versions are hard-pinned in the package metadata of the application? How many hoops will I have to jump through?

You make the assumption that during the whole time there were no breaking changes in the application dependencies and that my application will still start and work as expected.
In reality it's more likely that at least one of the libraries had a breaking change and my application will crash with errors.
Now you have to try out all the different versions of the dependencies until the application works again.

I guess it comes down to what the actual goal is:
Do I want to distribute a working application, or an application which might not work but has up-to-date dependencies? From a user perspective it's definitely the first; a developer would probably also be okay with the second.

  • Recommended being the key word here: you should let the users in charge of doing the installation decide what is best for them, their system, their environment, their own specific requirements that you know nothing about.

But install_requires is already more or less a recommendation, so it does what you are suggesting?

pip install application
pip install 'dependency_of_app_I_know_better == 6.0'

And finally, there is also not much debate about the fact that wheels (and sdist and whatever we can find on PyPI right now) are okay-ish but not great means of distribution for applications (for libraries there is not much to complain about). If you are serious about distributing applications, you should think about moving towards additional tooling.

I absolutely agree.
Currently I always write instructions on how to create a venv, install the application from pip, and then start it as a service. Do you have any hints and/or tips for further tooling I could look at?

@sinoroc

sinoroc commented Feb 15, 2023

@spacemanspiff2007

The one and only important argument I am trying to make, and I believe you already understood it, is that install_requires is not meant for pinned dependencies or indirect dependencies.

So now if you ask where you should list all pinned dependencies for a good end-user application packaging and distribution flow: sadly, there is no really good answer (but for sure the answer is not install_requires). This lack of a good answer is one of the main reasons why the current Python packaging ecosystem is not well fitted for distributing applications (and why this seemingly simple and justified feature request turned out more contentious than one could expect).

An okay-ish recommendation I can make is to provide and document a file in the requirements.txt format containing the list of all direct and indirect dependencies, with pinned versions, that have been tested against (note that requirements would be a bad name for this file; recommended-safe-and-tested-dependencies.txt would be a better name). This file could be copy-pasted into the documentation (in the README?) or distributed in the distribution packages (sdist and wheel), for example.

Do you have any hints and/or tips for further tooling I could look at?

Personally I feel like we will not have great solutions until we have a great (standardized) lock file format (which is being worked on, so keep an eye out for that but do not hold your breath).

I can also recommend looking at zipapp-based solutions (pex and shiv, for example). Then of course you have the whole group of Briefcase, PyInstaller, cx-Freeze, py2exe, and so on.

@spacemanspiff2007 Although I feel like everything has already been said here and elsewhere, if you feel like continuing the discussion on this, I suggest to continue in a different location. I can suggest these 2 places, where you can ping/mention me (@sinoroc):

csris added a commit to csris/helm that referenced this issue May 6, 2023
Installing with `pip install -e .` fails due to setup.py being
deprecated after setuptools adopted PEP-517 [1]. Migrate to building
using a setup.cfg file.

This was mostly a 1-to-1 migration except for setting
`install_requires`. Setuptools recommends not reading `requirements.txt`
as the value for `install_requires` [4].

See also:
- A Practical Guide to Setuptools and Pyproject.toml [2]
- Configuring setuptools using setup.cfg files [3]

[1] https://peps.python.org/pep-0517/
[2] https://godatadriven.com/blog/a-practical-guide-to-setuptools-and-pyproject-toml/
[3] https://setuptools.pypa.io/en/latest/userguide/declarative_config.html
[4] pypa/setuptools#1951
@Nekmo

Nekmo commented Jul 6, 2023

I understand that requirements.txt is not recommended for the install_requires option. But requirements.in is my install_requires list. Without this feature, we are duplicating the dependencies in two files.

We need this feature to prevent future issues and inconsistencies.

@ThiefMaster

This feature has already been there for a long time...

@henryiii
Contributor

henryiii commented Jul 6, 2023

Since 62.6.0, actually (though it's slightly better in 66.1.0, since from that version on the referenced files get auto-included in the SDist, too).

@Nekmo

Nekmo commented Jul 6, 2023

Right, I was using an older version of setuptools. I read issue #3793 and thought it had been rejected. Thanks!
