Plan for metadata proposal #230

Open
henryiii opened this issue Mar 17, 2023 · 54 comments

Comments

@henryiii
Collaborator

henryiii commented Mar 17, 2023

Need

In the core metadata specification originally set out in PEP 621 there is the possibility of marking fields as "dynamic", allowing their values to be determined at build time rather than statically included in pyproject.toml. There are several popular packages which make use of this system, most notably setuptools_scm, which dynamically calculates a version string based on various properties from a project's source control system, but also e.g. hatch-fancy-pypi-readme, which builds a readme out of user-defined fragments (like the latest version's CHANGELOG). Most backends, including setuptools, PDM-backend, hatchling, and flit-core, also have built-in support for providing dynamic metadata from sources like reading files.

With the recent profusion of build-backends in the wake of PEPs 517 and 518, it is much more difficult for a user to keep using these kinds of tools across their different projects because of the lack of a common interface. Each tool has been written to work with a particular backend, and can only be used with other backends by adding some kind of adapter layer. For example, setuptools_scm has already been wrapped into a hatchling plugin (hatch-vcs), and into scikit-build-core. Poetry also has a custom VCS versioning plugin (poetry-dynamic-versioning), and PDM has a built-in tool for it. However, these adapter layers are inconvenient to maintain (often being dependent on internal functions, for example), confusing to use, and result in a lot of duplication of both code and documentation.

We are proposing a unified interface that would allow metadata providing tools to implement a single function that build backends can call, and a standard format in which to return their metadata. Once a backend chooses to adopt this proposed mechanism, it will gain support for all plugins implementing it.

We are also proposing a modification to the project specification that has been requested by backend and plugin authors to loosen the requirements slightly on mixing dynamic and static metadata, enabling metadata plugins to be more easily adopted for some use cases.

Proposal

Implementing a metadata provider

Our suggestion is that metadata providers include a module (which could be the top level of the package, but need not be) which provides a function dynamic_metadata(fields, settings=None). The first argument is the list of fields requested of the plugin, and the second is the extra settings passed to the plugin configuration, possibly empty. This function will run in the same directory that build_wheel() runs in, the project root (to allow for finding other relevant files/folders like .git).

The function should return a dictionary matching the pyproject.toml structure, but only containing the metadata keys that have been requested. dynamic, of course, is not permitted in the result. Updating the pyproject_dict with this return value (and removing the corresponding keys from the original dynamic entry) should result in a valid pyproject_dict. The backend should only update the keys that were requested by the user. A backend is allowed (and recommended) to combine identical calls for multiple keys - for example, if a user sets "readme" and "license" with the same provider and arguments, the backend may call the plugin just once and use both the readme and license fields from the result.
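
For illustration, a minimal sketch of how a backend might fold a provider's result back into the project table (the helper name and error handling here are ours, not part of the proposal):

def apply_dynamic_metadata(pyproject_dict, requested, result):
    # 'dynamic' may never appear in a provider result.
    if "dynamic" in result:
        raise RuntimeError("'dynamic' is not permitted in a provider result")
    unexpected = set(result) - set(requested)
    if unexpected:
        raise RuntimeError(f"Provider returned unrequested fields: {unexpected}")
    project = pyproject_dict["project"]
    for field in requested:
        project[field] = result[field]
        # dynamic as a table, per this proposal; drop the now-static key.
        del project["dynamic"][field]
    return pyproject_dict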

An optional hook1, get_requires_for_dynamic_metadata, allows providers to determine their requirements dynamically (depending, for example, on what is already available on the path, or on requirements unique to this plugin).

Here's an example implementation:

from __future__ import annotations

import tomllib  # use the tomli backport on Python < 3.11
from collections.abc import Mapping, Sequence
from pathlib import Path
from typing import Any


def dynamic_metadata(
    fields: Sequence[str],
    settings: Mapping[str, Any] | None = None,
) -> dict[str, dict[str, str | None]]:
    if settings:
        raise RuntimeError("Inline settings are not supported by this plugin")
    if list(fields) != ["readme"]:
        raise RuntimeError("This plugin only supports dynamic 'readme'")

    # Imported lazily, so get_requires_for_dynamic_metadata can run first
    from hatch_fancy_pypi_readme._builder import build_text
    from hatch_fancy_pypi_readme._config import load_and_validate_config

    with Path("pyproject.toml").open("rb") as f:
        pyproject_dict = tomllib.load(f)

    config = load_and_validate_config(
        pyproject_dict["tool"]["hatch"]["metadata"]["hooks"]["fancy-pypi-readme"]
    )

    return {
        "readme": {
            "content-type": config.content_type,
            "text": build_text(config.fragments, config.substitutions),
        }
    }


def get_requires_for_dynamic_metadata(
    settings: Mapping[str, Any] | None = None,
) -> list[str]:
    return ["hatch-fancy-pypi-readme"]

Using a metadata provider

For maximum flexibility, we propose specifying a 1:1 mapping between the dynamic metadata fields and the providers (specifically the module implementing the interface) which will supply them.

The existing dynamic specification will be expanded to support a table as well:

[project.dynamic]
version = {provider = "plugin.submodule"}                   # Plugin
readme = {provider = "local_module", path = "scripts/meta"} # Local plugin
classifiers = {provider = "plugin.submodule", max="3.11"}   # Plugin with options
requires-python = {min = "3.8"}                             # Build-backend specific
dependencies = {}                                           # Identical to dynamic = ["dependencies"]
optional-dependencies = "some_plugin"                       # Shortcut for provider =

If project.dynamic is a table, a new provider="..." key will pull from a matching plugin with the hook outlined above. If path="..." is present as well, then the module is a local plugin at the provided local path (just like PEP 517's local backend path). All other keys are passed through to the hook; it is suggested that a hook validate against unrecognized keys. If no keys are present, the backend should fall back on the same behavior a plain dynamic list entry has today.
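
As a sketch, resolution in a backend could look something like this (the helper name is illustrative, not part of the proposal):

import importlib
import sys

def load_provider(entry):
    # Normalize the shorthand: "pkg.mod" is treated as {provider = "pkg.mod"}.
    if isinstance(entry, str):
        entry = {"provider": entry}
    # A local plugin, analogous to PEP 517's backend-path.
    if "path" in entry:
        sys.path.insert(0, entry["path"])
    # Everything except provider= and path= is passed through as settings.
    settings = {k: v for k, v in entry.items() if k not in ("provider", "path")}
    return importlib.import_module(entry["provider"]), settings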

Many backends already have some dynamic metadata handling. If keys are present without provider=, then the behavior is backend defined. It is highly recommended that a backend produce an error if keys that it doesn't expect are present when provider= is not given. Setuptools could simplify its current tool.setuptools.dynamic support with this approach, taking advantage of the ability to pass custom options through the field:

# Current
[project]
dynamic = ["version", "dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
version = {attr="mymod.__version__"}
dependencies = {file="requirements.in"}
optional-dependencies.dev = {file="dev-requirements.in"}
optional-dependencies.test = {file="test-requirements.in"}


# After
[project.dynamic]
version = {attr="mymod.__version__"}
dependencies = {file="requirements.in"}
optional-dependencies.dev = {file="dev-requirements.in"}
optional-dependencies.test = {file="test-requirements.in"}
# provider = "setuptools.dynamic.version", etc. could be set but would be verbose

Another idea is a hypothetical regex based version discovery, which could look something like this if it was integrated into the backend:

[project.dynamic]
version = {location="src/package/version.txt", regex='Version\s*([\d.]+)'}

Or like this if it was a plugin:

[project.dynamic.version]
provider = "regex.searcher.version"
location = "src/package/version.txt"
regex = 'Version\s*([\d.]+)'
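
A hedged sketch of what such a hypothetical plugin could look like (module name, settings keys, and error handling are all illustrative):

import re
from pathlib import Path

def dynamic_metadata(fields, settings=None):
    settings = settings or {}
    if list(fields) != ["version"]:
        raise RuntimeError("This plugin only supports dynamic 'version'")
    text = Path(settings["location"]).read_text(encoding="utf-8")
    match = re.search(settings["regex"], text)
    if match is None:
        raise RuntimeError("Version pattern not found")
    return {"version": match.group(1)}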

Using project.dynamic as a table keeps the specification succinct without adding extra fields, it avoids duplication, and it is handled by third party libraries that inspect the pyproject.toml exactly the same way (at least if they are written in Python). The downside is that it changes the existing specification, probably mostly breaking validation - however, this is most often done by the backend; a backend must already opt in to this proposal, so that is an acceptable change. pip and cibuildwheel, two non-backend tools that read pyproject.toml, are unaffected by this change.

To keep the most common use case simple2, passing a string is equivalent to passing the provider; version = "..." is treated like version = { provider = "..." }. This makes the backend implementation a bit more complex, but provides a simpler user experience for the most common expected usage. This is similar to how keys like project.readme = and project.license = are treated today.

Supporting metadata providers:

An implementation of this proposal already exists for the scikit-build-core backend and uses only standard library functions. Implementations could be left up to individual build backends to provide, but if the proposal were adopted they would probably coalesce into a single common implementation. pyproject-metadata could hold such a helper implementation.

Proposed changes in the semantics of project.dynamic

PEP 621 explicitly forbids a field to be "partially" specified in a static way (i.e. by associating a value to project.<field> in pyproject.toml) and later listed in dynamic.

This complicates the mechanism for dynamically defining fields with complex/compound data structures, such as keywords, classifiers, and optional-dependencies, and requires backends to implement "workarounds". Examples of practices that were impacted by this restriction include:

  • whey's re-implementation of classifiers in a tool subtable
  • the removal of the classifiers augmentation feature in pdm-backend
  • setuptools' restrictions on dynamic optional-dependencies

In this PEP, we propose to lift this restriction and change the semantics associated with project.dynamic in the following manner:

  • When a metadata field is simultaneously assigned a value and included in project.dynamic, tools should assume that its value is partially defined. The given static value corresponds to a subset of the value expected after the build process is complete. Backends and dynamic providers are allowed to augment the metadata field during the build process.

The fields that are arrays or tables with arbitrary entries are urls, authors, maintainers, keywords, classifiers, dependencies, scripts, entry-points, gui-scripts, and optional-dependencies.
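
As a sketch of these merge semantics (illustrative only, not normative text):

def augment_list_field(static, provided):
    # The static value is a base: providers may append, never remove or modify.
    return static + [item for item in provided if item not in static]

def augment_table_field(static, provided):
    # Statically defined keys must not be overwritten by a provider.
    clash = set(static) & set(provided)
    if clash:
        raise RuntimeError(f"Cannot modify statically defined keys: {clash}")
    return {**static, **provided}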

Examples & ideas:

Current PEP 621 backends & dynamic metadata

| Backend | Dynamic? | Config? | Plugins? |
| --- | --- | --- | --- |
| setuptools | | | |
| hatchling | | | |
| flit-core | | | |
| pdm-backend | | | |
| scikit-build-core 3 | | | |
| meson-python | | | |
| maturin | | | |
| enscons | | | |
| whey | | | |
| trampolim | | | |

"Dynamic" indicates the tool supports at least one dynamic config option. "Config" indicates the tool has some tool-specific way to configure this option. "Plugins" refers to having a custom plugin ecosystem for these tools. Poetry has not yet adopted PEP 621, so is not listed above, but it does have dynamic metadata with custom configuration and plugins. This proposal will still help tools not using PEP 621, as they can still use the plugin API, just with custom configuration (but they are already using custom configuration for everything else, so that's fine).

Rejected ideas

Notes on extra file generation

Some metadata plugins generate extra files (like a static version file). No special requirements are made on such plugins or backends handling them in this proposal; this is in line with PEP 517's focus on metadata and its lack of specification for file handling.

Config-settings

The config-settings dict could be passed to the plugin, but because there's no standard configuration design for config-settings, a plugin can't generally handle a specific config-settings item and be sure that no backend will also try to read it or reject it. There was also a design worry about adding this in setuptools, so it was removed (it is still present in the reference implementation, though).

Passing the pyproject.toml as a dict

This would add a little bit of complexity to the signature of the plugin, but it would avoid reparsing the pyproject.toml for plugins that need to read it. It would also avoid an extra dependency on tomli for older Python versions. Custom inline settings alleviated the need for almost every plugin to read the pyproject.toml, so this was removed to keep backend implementations & signatures simpler.

New section

Instead of changing the dynamic metadata field to accept a table, instead there could be a new section:

dynamic = ["version"]

[dynamic-metadata]
version = {provider = "plugin_package.submodule"}

This is the current state of the reference implementation, using [tool.scikit-build.metadata] instead of [dynamic-metadata]. In this version, listing an item in dynamic-metadata should be treated as implicitly listing it in dynamic, though listing it in both places can be done (primarily for backward compatibility).

dynamic vs. dynamic-metadata could be confusing, as they do the same thing, and it actually makes parsing this harder for third-party tools, as now both project.dynamic and dynamic-metadata have to be combined to see what fields could be dynamic. The fact that dict keys and lists are handled the same way in Python provides a nice method to avoid this complication.

Alternative proposal: new array section

A completely different approach to specification could be taken using a new section and an array syntax4:

dynamic = ["version"]

[[dynamic-metadata]]
provider = "plugin_package.submodule"
path = "src"
provides = ["version"]

This has the benefit of not repeating the plugin if you are pulling multiple metadata items from it, and indicates that this is only going to be called once. It also has the benefit of allowing empty dynamic plugins, which has an interesting non-metadata use case, but is probably out of scope for the proposal. The main downside is that it's harder for third-party projects to parse for the dynamic values, as they have to loop over dynamic-metadata and join all provides lists to see what is dynamic. It's also a lot more verbose, especially for the built-in plugin use case for tools like setuptools. (The current version of this suggestion listed above is much better than the original version we proposed, though!) This also would allow multiple plugins to provide the same metadata field, for better (maybe this could be used to allow combining lists or tables from multiple plugins) or worse (this has to be defined and properly handled).

This version could enable a couple of possible additions that were not possible in the current proposal. However, most users would not need these, and some of them are a bit out of scope - the current version is simpler for pyproject.toml authors and would address 95% of the plugin use cases.

Multiple plugins per field

The current proposal requires a metadata field to be computed by one plugin; there's no way to use multiple plugins for a single field (like classifiers). This is expected to be rare in practice, and can easily be worked around in the current proposal form by adding a local plugin that itself calls the plugins it wants to combine following the standard API proposed, as in the sketch below. "Merging" the metadata then would be arbitrary, since it's implemented by this local plugin, rather than having to be pre-defined here.
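
A hypothetical local plugin combining two providers for classifiers (plugin names are invented; the merge policy is whatever this local plugin decides):

def dynamic_metadata(fields, settings=None):
    # Both assumed to implement the same proposed interface.
    import plugin_a
    import plugin_b

    a = plugin_a.dynamic_metadata(fields, settings)
    b = plugin_b.dynamic_metadata(fields, settings)
    # This local plugin chooses a simple sorted union; any policy is possible.
    return {"classifiers": sorted(set(a["classifiers"]) | set(b["classifiers"]))}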

Empty plugins (for side effects)

A closely related but separate need could be solved by this paradigm as well, with some modifications. Several build tools (like cmake, ninja, patchelf, and swig) are actually system CLI tools that have optional pre-compiled binaries in the PyPI ecosystem. When compiling on systems that do not support binary wheels (a very common reason to compile!), such as WebAssembly, Android, FreeBSD, or ClearLinux, it is invalid to add these as dependencies. However, if the system versions of these dependencies are of a sufficient version, there's no need to add them either. A PEP 517 backend has the ability to declare dynamic dependencies, so this can be (and currently is) handled by tools like scikit-build-core and meson-python in this way. However, it might also be useful to allow this logic to be delegated to a metadata provider; this would potentially allow greater sharing of core functionality in this area.

For example, if you specified "auto_cmake" as a provider, it could provide get_requires_for_dynamic_metadata_wheel to supply this functionality to any backend, as sketched below. This will likely best be covered by the "extensionlib" idea, rather than plugins, so this is not worth trying to address unless this array-based syntax becomes the proposed syntax - then it would be worth evaluating to see if it's worth trying to include.
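
A hedged sketch of such a requirement hook (using the simpler hook name from the main proposal; a real implementation would also check the version of the system tool):

import shutil

def get_requires_for_dynamic_metadata(settings=None):
    # Only require the cmake wheel if no cmake is already on PATH.
    if shutil.which("cmake") is None:
        return ["cmake>=3.26"]
    return []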

Footnotes

  1. Most plugins will likely not need to implement this hook, so it could be removed. But it is symmetric with PEP 517, fairly simple to implement, and "wrapper" plugins, like the first two example plugins, need it. It is expected that backends that want to provide similar wrapper plugins will find this useful to implement.

  2. This also could be removed from the proposal if needed.

  3. In development, based on a version of this proposal.

  4. Note that, unlike the proposed syntax, this probably should not repurpose project.dynamic, since this would be much more likely to break existing parsing of this field by static tooling. (Static tooling often may not parse this field anyway, since it's easier to check for a missing field - you only need to check dynamic today if you care about "missing" versus "specified elsewhere".)

@henryiii
Collaborator Author

henryiii commented Mar 17, 2023

@bennyrowland - I've added you so you should be able to edit the proposal above. What do you think? Let's get a few others involved once it looks good.

@jcfr
Contributor

jcfr commented Mar 17, 2023

From @JeanChristopheMorinPerso

Will we be able to support local plugins?

@bennyrowland
Collaborator

bennyrowland commented Mar 19, 2023

@henryiii I think it sounds great. I have made a few tweaks, essentially just framing a few things differently, and added them below so it is easier to compare with your version above; feel free to combine them as you see fit.

Done, original edits here

Need

In the core metadata specification originally set out in PEP 621 there is the possibility of marking fields as "dynamic", allowing their values to be determined at build time rather than statically included in pyproject.toml. There are several popular packages which make use of this system, most notably setuptools_scm, which dynamically calculates a version string based on various properties from a project's source control system, but also e.g. hatch-fancy-pypi-readme, which builds a readme out of user-defined fragments (like the latest version's CHANGELOG).

With the recent profusion of build-backends in the wake of PEPs 517 and 518, it is much more difficult for a user to keep using these kinds of tools across their different projects because of the lack of a common interface. Each tool has been written to work with a particular backend, and can only be used with other backends by adding some kind of adapter layer. For example, setuptools_scm has already been wrapped into a hatchling plugin (hatch-vcs), and into scikit-build-core. However, these adapter layers are inconvenient to maintain (often being dependent on internal functions, for example), confusing to use, and result in a lot of duplication of both code and documentation.

We are proposing a unified interface that would allow metadata providing tools to implement a single function that build backends can call, and a standard format in which to return their metadata.

Proposal

Implementing a metadata provider

Our suggestion is that metadata providers include a module (which could be the top level of the package, but need not be) which provides a function dynamic_metadata(pyproject_dict, config_settings=None). The first argument is the parsed pyproject.toml contents, the second is the config_settings passed to build_wheel(). This function will run in the same directory that build_wheel() runs in, the project root (to allow for finding other relevant files/folders like .git).

The function should return a dictionary matching the pyproject.toml structure, but only containing the metadata keys that the provider knows about. dynamic, of course, is not permitted in the result. Updating the pyproject_dict with this return value (and removing the corresponding keys from the original dynamic entry) should result in a valid pyproject_dict. In practice things are not so simple: a metadata provider could provide values for multiple keys, but the user may only wish to use a subset of them, for example.

An optional hook or family of hooks, get_requires_for_dynamic_metadata[_wheel,_sdist,_editable], could also be included in the provider specification. This would allow providers to determine their requirements dynamically (depending on what is already available on the path; see Alternate Ideas below for an example).

Using a metadata provider

For maximum flexibility, we propose specifying a 1:1 mapping between the dynamic metadata fields and the providers (specifically the module implementing the interface) which will supply them.

Option 1:

This could be added to the existing dynamic metadata specification.

dynamic = ["version:provider_pkg.submodule"]

Here the existing entries in dynamic are optionally extended with the path to the provider module, in which case a colon is used to separate the key from the path. If no extra path is provided, the backend should fall back on the current behaviour of expecting the value to be provided via some other mechanism.

The advantage is that this keeps the specification succinct without adding extra fields, the downside is that it changes the existing specification and requires further parsing of the dynamic entries to extract the components. If a single provider is listed under multiple entries, the provider dynamic_metadata() function should be called once and the results reused across all entries.

Option 2:

Add a new section:

dynamic = ["version"]

[dynamic-metadata]
version = "plugin_package.submodule"

This is the current state of the reference implementation, using [tool.scikit-build.metadata] instead of [dynamic-metadata].

Option 2b: reversed order

dynamic = ["version"]

[dynamic-metadata]
"plugin_package.submodule" = ["version"]

This has the benefit of not repeating the plugin if you are pulling multiple metadata items from it, and indicates that this is only going to be called once. It also has the benefit of allowing empty dynamic plugins, which would support the extra idea listed below.

Supporting metadata providers:

An implementation of this proposal already exists for the scikit-build-core backend and uses only standard library functions. Implementations could be left up to individual build backends to provide, but if the proposal were adopted they would probably coalesce into a single common implementation, either within the standard library or as a separate package.

Examples & ideas:

Alternate ideas:


A second closely related need might be solved by this paradigm as well. Several build tools (like cmake, ninja, and patchelf) are actually system CLI tools that have optional pre-compiled binaries in the PyPI ecosystem. When compiling on systems that do not support binary wheels (a very common reason to compile!), such as WebAssembly, Android, FreeBSD, or ClearLinux, it is invalid to add these as dependencies. However, if the system versions of these dependencies are of a sufficient version, there's no need to add them either. A PEP 517 backend has the ability to declare dynamic dependencies, so this can be (and currently is) handled by tools like scikit-build-core and meson-python in this way. However, it might also be useful to allow this logic to be delegated to a metadata provider; this would potentially allow greater sharing of core functionality in this area.

For example, if you specified "auto_cmake" as a provider, it could provide get_requires_for_dynamic_metadata_wheel to supply this functionality to any backend.

This may be best covered by the "extensionlib" idea, rather than plugins, so this doesn't need to be added unless there's a good alternate use case.

@henryiii
Collaborator Author

Okay, I've edited the version above. We need to pick a method for local plugins (and add it to the reference implementation).

@henryiii
Collaborator Author

Just noticed swig also has a binary distro on pypi. That's another package where only including it if it's not already installed would be useful (and one that's not directly tied to the build system!). Still likely a bit of a misuse of dynamic-metadata to try to push it into here, though.

@abravalheri

Hi @henryiii, I was wondering, in your proposal who is responsible for implementing the hook?

For example, dynamic_metadata could be targeting both backend and frontend (but I imagine it currently targets backends due to the last phrase in the Needs section), while get_requires_for_build_dynamic_metadata seems to be targeting the frontend...

A question that I have is whether there is any expectation that setuptools would support this new spec. At first glance it might be complicated to support, given setuptools' architecture (there is a very hard separation between the backend hooks and the other setuptools building blocks; they run in different processes and it is not straightforward to exchange values between them).

@henryiii
Collaborator Author

henryiii commented Mar 20, 2023

The hook is implemented by plugins (setuptools_scm, for example). But the handling of the hook would be implemented by backends. Frontends would not be affected. It would be up to the backend to implement get_requires_for_build_dynamic_metadata inside PEP 517's get_requires_for_* (similar - actually identical - to the current handling inside the reference implementation here). That is an optional piece - we could probably remove it from the proposal if it's hard to support, as I think most plugins would not need it, and could just rely on their normal dependencies.

Ideally, I'd hope all backends, including setuptools, would eventually support this. But, just like PEP 660 (and other packaging PEPs), it doesn't "need" to be supported immediately, it's opt-in by the backends.

The dynamic_metadata hook might be able to run entirely within the "other building blocks" process, away from the PEP 517 backend hooks. I guess config-settings might be tricky. get_requires_for_build_dynamic_metadata does need to be within the PEP 517 hooks, but it's pretty simple to support standalone - you just need to process the pyproject.toml, find the listed plugins, see if they have this hook, and run it if they do. So it might not be quite as bad as it sounds even for setuptools. I'd like to see how hard the other backends think it might be to support.
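
That forwarding could look something like this sketch (hook and key names per the proposal above; assumes the table form of project.dynamic):

import importlib
import tomllib  # tomli on Python < 3.11

def get_requires_for_build_wheel(config_settings=None):
    with open("pyproject.toml", "rb") as f:
        pyproject = tomllib.load(f)
    requires = []
    for field, entry in pyproject["project"].get("dynamic", {}).items():
        if isinstance(entry, str):
            entry = {"provider": entry}
        if "provider" not in entry:
            continue  # backend-defined handling, no plugin involved
        module = importlib.import_module(entry["provider"])
        hook = getattr(module, "get_requires_for_dynamic_metadata", None)
        if hook is not None:
            settings = {k: v for k, v in entry.items() if k not in ("provider", "path")}
            requires.extend(hook(settings))
    return requires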

@abravalheri

abravalheri commented Mar 20, 2023

The hook is implemented by plugins (setuptools_scm, for example). But the handling of the hook would be implemented by backends.

Sorry, I forgot some words in my comment, this is what I mean 😝. Thank you for the clarification.

I guess config-settings might be tricky.

Yeah, that is the complicated part. Would it be possible to make it optional for the backend to pass the second parameter?

Ideally, I'd hope all backends, including setuptools, would eventually support this. But, just like PEP 660 (and other packaging PEPs), it doesn't "need" to be supported immediately, it's opt-in by the backends.

If that is the idea, would you be interested in taking an approach similar to PEP 621 and "averaging out" the needs of the diverse backends? For example, I would very much be interested in sharing the [dynamic-metadata] "TOML namespace" with this proposal...

Currently setuptools makes use of tool.setuptools.dynamic, and having to use project.dynamic, dynamic-metadata, and tool.setuptools.dynamic in conjunction seems too much...

If there is an interest, I think something like-ish the following would be ideal:

[dynamic-metadata]
version = {provider = "plugin_package.submodule"}

... especially if extra items in the inline table were allowed, and (in the absence of the provider key) the backend itself was free to fill in the value... (I could contribute to the text if the proponents are on board).

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

I was actually already making that change, since I wanted a path= key. How about the following changes on top of 2a:

  • Keys in dynamic-metadata are implicitly part of dynamic. Tools that parse the pyproject.toml should also include dynamic-metadata. They can be explicitly listed in both if an author chooses, probably for backward compatibility.
  • Providing a provider = "..." will use this interface
  • path = "..." provides a path for a local plugin
  • Any remaining keys are passed through to the hook
  • A backend is recommended (but not required) to merge identical calls
  • If provider is not provided, the implementation is up to the backend
  • Maybe config-settings should be removed. Since build backends get config-settings and there are no requirements placed on them, I don't think it can be reliably used for anything between all build backends.

That said, I think the main thing you need is actually the arbitrary inline metadata, not the ability to skip provider=, as you could have a provider="setuptools.dynamic.version" or similar.

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

Actually, how much validation is done on project.dynamic outside of the build backend? One option would be to simply allow a table for project.dynamic, with [project.dynamic] version={} being treated identically to project.dynamic = ["version"]. This has the nice side effect that the code to analyze a pyproject.toml doesn't change in Python - iterating over the keys of a dict is identical to iterating over a list. And it has the nice feature of not having to merge a project.dynamic list with the dynamic-metadata keys to see what fields are dynamic, or of having two places to set nearly the same thing.

Honestly, I think this is an easier change than Option 1.
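
A quick way to see that equivalence (purely illustrative):

# Iteration and membership behave identically for the list and table forms.
for proj_dynamic in (["version"], {"version": {"provider": "plugin"}}):
    assert "version" in proj_dynamic
    assert list(proj_dynamic) == ["version"]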

@abravalheri

  • Keys in dynamic-metadata are implicitly part of dynamic. Tools that parse the pyproject.toml should also include dynamic-metadata. They can be explicitly listed in both if an author chooses, probably for backward compatibility.
  • Providing a provider = "..." will use this interface
  • path = "..." provides a path for a local plugin
  • Any remaining keys are passed through to the hook
  • A backend is recommended (but not required) to merge identical calls
  • If provider is not provided, the implementation is up to the backend
  • Maybe config-settings should be removed. Since build backends get config-settings and there are no requirements placed on them, I don't think it can be reliably used for anything between all build backends.

This looks good! Thank you @henryiii.

That said, I think the main thing you need is actually the arbitrary inline metadata, not the ability to skip provider=, as you could have a provider="setuptools.dynamic.version" or similar.

That is true, but the following does look redundant:

[dynamic-metadata]
version = {provider="setuptools.dynamic.version", attr="mymod.__version__"}
dependencies = {provider="setuptools.dynamic.dependencies", file="requirements.in"}
optional-dependencies.dev = {provider="setuptools.dynamic.dependencies", file="dev-requirements.in"}
optional-dependencies.test = {provider="setuptools.dynamic.dependencies", file="test-requirements.in"}

@abravalheri

One option would be to simply allow a table for project.dynamic, with [project.dynamic] version={} being treated identically to project.dynamic = ["version"]. This has the nice side effect that the code to analyze a pyproject.toml doesn't change in Python - iterating over the keys of a dict is identical to iterating over a list. And it has the nice feature of not having to merge a project.dynamic list with the dynamic-metadata keys to see what fields are dynamic, or of having two places to set nearly the same thing.

I do like this idea.

@abravalheri

abravalheri commented Mar 21, 2023

I was thinking about get_requires_for_build_dynamic_metadata.
Is there a use case for it that is not covered by [build-system] requires, and/or adding an extra to the dependency specified there?

I mean, let's think about a scenario where you have a metadata provider that needs a specific package to be installed so it can execute. Isn't it enough to add this package to the dependency list of the metadata provider (considering, for example, all the available markers and an optional explicit extra)?

@henryiii
Collaborator Author

For plugin metadata, possibly not. There's a clear use case for building - ninja, cmake, patchelf, and swig are all examples of things that you can't just "add" to the build-backend.requires, because they are only required if there isn't a copy already available, and the binary wheels are not always available. If you are building on Android, WebAssembly, FreeBSD, ClearLinux, or similar, having these present breaks the build, even if you already installed the CLI tool you need.

However, I'm not aware of a case where this is a problem with metadata. Maybe you could add a Python implementation of Git if git wasn't already on the system? But then not sure how you could have gotten the repo you want to read in the first place. :)

That's very much an optional part of the proposal we could drop (scikit-build-core will likely keep it for quite a while, since it is providing plugin wrappers, and plugin wrappers is a use case for being able to declare a dependency only if the plugin is used).

@bennyrowland
Collaborator

I very much like the idea of converting project.dynamic to a table; it feels like the obvious structure for specifying where to get the value from. However, I worry that it might be considered a step too far from a backwards compatibility point of view. I think it should definitely have a place in the proposal; we can see what other people think about the idea.

I might propose that we retain the originally specified str option as shorthand for {provider = "..."}. Many people will probably only be getting a dynamic version using setuptools_scm/vcs_versioning without further config, so allowing just the module string as a parameter keeps things nice and succinct for that most common case. Inline config is a nice addition; I had previously assumed that would all be handled by a provider's own [tool.*] section, but @abravalheri gives an excellent example with dependencies where you might want to run the same provider with different arguments.

That is true, but the following does look redundant:

Actually, to me that reads quite clearly and I think is better than trying to reason about what the backend will do if you don't specify the provider (explicit better than implicit etc.).

The discussion on get_requires_for_build_dynamic_metadata() is interesting. I suspect it will only be relevant to a small number of potential providers (mainly around linking to other build systems like cmake and meson) but it doesn't seem like a major pain to include it in the specification, most providers can ignore it, and adding it to backends shouldn't be too hard (I would have thought), given that there is already a similar mechanism for backends themselves.

The "feature creep" of thinking about how this could be extended to e.g. allow backends to outsource their own get_requires_for_build process to plugins suggests that there might be a wider scope to think about for other types of "backend plugins" beyond the metadata question. However, at the moment I suspect that those kind of features are much more likely to be more backend specific, compared to the project metadata which is very universal. They are also probably more of a problem for backend developers, rather than users?

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

I'm not as strongly in favor of the shortcut as I thought I'd be, actually.

[project.dynamic]
license = "mylicenseapp"

To me, that looks a little like you are setting the value to "mylicenseapp", rather than running mylicenseapp, unless you pick up on the dynamic in the header, while:

[project.dynamic]
license = {provider = "mylicenseapp"}

is clearly not just setting the value to a string.

Do we have an idea of what would break if project.dynamic was a table? I know backends validate this, but that's completely fine - backends have to adopt the proposal anyway, and users opt in. But does pip break, for example? scikit-build-core uses pyproject-metadata, making it a little harder to quickly check this, but flit-core probably could be used to test it.

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

I tested this with flit-core. By disabling one line (the check to make sure that "dynamic" is a list), it builds - both flit and pip are happy with it, because the keys behave like a list.

diff --git a/flit_core/flit_core/config.py b/flit_core/flit_core/config.py
index 1292956..4e6fbf9 100644
--- a/flit_core/flit_core/config.py
+++ b/flit_core/flit_core/config.py
@@ -609,7 +609,7 @@ def read_pep621_metadata(proj, path) -> LoadedConfig:
         lc.reqs_by_extra['.none'] = reqs_noextra

     if 'dynamic' in proj:
-        _check_list_of_str(proj, 'dynamic')
+        # _check_list_of_str(proj, 'dynamic')
         dynamic = set(proj['dynamic'])
         unrec_dynamic = dynamic - {'version', 'description'}
         if unrec_dynamic:
diff --git a/flit_core/pyproject.toml b/flit_core/pyproject.toml
index affb6d0..5553271 100644
--- a/flit_core/pyproject.toml
+++ b/flit_core/pyproject.toml
@@ -17,7 +17,9 @@ classifiers = [
     "License :: OSI Approved :: BSD License",
     "Topic :: Software Development :: Libraries :: Python Modules",
 ]
-dynamic = ["version"]
+
+[project.dynamic]
+version = {}

 [project.urls]
 Documentation = "https://flit.pypa.io"

@ofek
Contributor

ofek commented Mar 21, 2023

This would fail with Hatchling because types are strictly enforced per the PEP and I'm assuming setuptools will also break

@henryiii
Collaborator Author

That's expected (the same is true with flit-core - and scikit-build-core/meson-python, too, since pyproject-metadata strictly validates) - the build backend has to be updated to support this proposal. But that's fine - any update to the pyproject spec needs to be supported by backends one by one; if your chosen backend doesn't support this proposal yet, then just don't use it there yet! :) The problem is other consumers of pyproject.toml - pip, for example; the pyproject.toml author doesn't have control over those. They generally don't do as much validation, since they are not there so much to help authors write pyproject.toml; they are just trying to read existing ones.

@ofek
Contributor

ofek commented Mar 21, 2023

I don't know much about the code of pip but my assumption is that it's not doing anything here actually but rather using the backend to generate the metadata which it then reads

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

The rule is "you can statically infer x from pyproject.toml if it exists and unless x is listed in dynamic" - and that rule remains the same in Python: "in dynamic" is the same regardless of whether it's a dict or a list. (And, technically, since you can't do both, I'm not sure if anyone really checks the dynamic list - if a field is present, it's inferable, generally.)

I would assume so, but Pip at least looks at pyproject.toml, so it was worth checking. Old (18 and older) versions of pip are broken by using x.y TOML syntax, due to use of pytoml. 🤦

GitHub's dependency graph is one of the main users I'm aware of that's reading pyproject.toml for static inference; we could verify that they wouldn't be broken by the change and/or could be updated to support the change.

cibuildwheel also does look at this, but I know that it wouldn't be broken by this - it doesn't bother to check the dynamic table, but just checks for the presence of project.requires-python.

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

Is it really necessary to pass the parsed pyproject.toml dict in? The tool should be able to find it (using the same procedure as the PEP 517 build hook calling it), and reparsing it is not that expensive, I'd think. Some tools will probably have their own configuration options; it doesn't have to be forced into the pyproject.toml. It could tempt tools to try to modify the dict inline, which is not how it's supposed to work.

(I've been continuing to edit the initial proposal at the top; remember to check for updates.)

@bennyrowland
Collaborator

That is something I thought about initially, and eventually decided to pass the pyproject_dict both to save the cost of reparsing (agreed, probably not terribly expensive) but also to avoid any possibility of the provider failing to parse the file correctly. Again, if the only line required in the provider is tomllib.load(open("pyproject.toml", "rb")) to exactly reproduce the dict, then it is probably quite hard for someone to mess that up.

On the other hand, fairly unlikely that any provider is going to spend too long trying to modify the dict instead of returning the results correctly, so I don’t have a strong opinion on this either way.

@henryiii
Collaborator Author

I think the existing plugins are either parsing the file already or it's just a simple load (backends should not be passing a modified pyproject dict), so it's pretty easy - the only thing is it does require tomli on older Pythons, but a plugin depending on that shouldn't be an issue (you could even add it to get_requires_for_dynamic_metadata!). Keeping the interface simple is nice too - and some plugins might want to support <plugin>.toml, etc. I don't have a good feel for which is better.

@ofek
Contributor

ofek commented Mar 21, 2023

I think the existing plugins are either parsing the file already

Hatchling/Hatch view that as very wasteful, which is why it passes the data to plugins; that also avoids the conditional use of TOML libraries for everyone.

@henryiii
Collaborator Author

Well, a TOML library requirement (for reading, anyway) will simply go away in a few years, and is already a very lightweight dependency. My general stance is any conditional backport library should be seen as a non-dependency. Okay, though, you've convinced me to keep it in, and only pull it if people complain about it.

@eli-schwartz

My general stance is any conditional backport library should be seen as a non-dependency.

tomllib is "fun" because it's not just a conditional dependency, it's also a try/except import block. It's trivial, it's just mind-numbing.

@henryiii
Collaborator Author

henryiii commented Mar 21, 2023

It's an if sys.version_info block, that is. Backports should always be based on sys.version_info, not try/except. Then readers know why it's there and when to get rid of it - in fact, pyupgrade/Ruff will even remove them for you when the time is right that way. MyPy understands it too. But yes, it's five total lines instead of one.
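
For reference, the block in question (the standard backport pattern, assuming the tomli backport):

import sys

if sys.version_info >= (3, 11):
    import tomllib
else:
    import tomli as tomllib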

@abravalheri

abravalheri commented Mar 22, 2023

Actually, to me that reads quite clearly and I think is better than trying to reason about what the backend will do if you don't specify the provider (explicit better than implicit etc.)

Hi @bennyrowland, I understand that it is more explicit, but from the standpoint of the UX of the developer writing the file, I think it is mostly boilerplate ...

If we think about the current trends in pyproject.toml metadata/build configuration, I argue that the community has moved past the "implicit vs. explicit" debate in favour of convenience (achieved via a "convention over configuration" approach).

For example, when selecting which modules/packages to include in the distribution and where to find them, most of the backends will follow a convention (and the user will only need to use an explicit config if they want to diverge from the convention).


I believe that more often than not, backends are equipped with some dynamic metadata capabilities. For example, PDM can derive at least version and classifiers; flit can derive version and description; setuptools can derive version, dependencies and concatenate files for the readme; whey can derive classifiers, dependencies and requires-python, etc...

Considering that the main audience of the project table and an eventual dynamic-metadata/project.dynamic table is the backend, a good "convention over configuration" approach would be to assume that if provider or path are not given, the backend itself is expected to provide the dynamic value.

One thing we can do in the spec is to add something along the lines of: "If a provider is not specified and the backend does not know how to obtain the dynamic value of the specified field, the backend SHOULD raise an exception and halt the build process".

@henryiii
Collaborator Author

henryiii commented Mar 22, 2023

I've updated above; asking a backend to error if provider= is not present and it sees unexpected keys is a good idea, and I'd already done that for plugins.

The current implementation was designed to allow a rather expensive check (cmake metadata, for example) to only require one run. The one downside is that the check doesn't know exactly what values were requested of it. Though, come to think of it, it can get this from the pyproject.toml itself, so I guess it's actually available if necessary. I guess that's okay. (Thinking about a regex-based plugin that could provide multiple fields, similar to hatchling's integrated version one.)

@abravalheri

Thank you very much @henryiii.

What do you think about the following signature?

def dynamic_metadata(
    fields: list[str],
    pyproject_dict: Mapping[str, Any],
    settings: Mapping[str, Any] | None = None,
) -> dict[str, str | dict | list | None]:
   ...

I changed the return type a little bit to accommodate things like dependencies/optional-dependencies, which also deal with lists and dicts of lists.

I am also proposing that we pass a list of fields that the plugin is supposed to provide (since the text recommends to combine identical calls for multiple keys). This way the plugin does not need to inspect the pyproject_dict to find out that information1.

Footnotes

  1. Regarding https://github.com/scikit-build/scikit-build-core/issues/230#issuecomment-1478531282, I believe that by passing fields and settings we are effectively alleviating the need for pyproject_dict and/or the tools parsing pyproject.toml themselves.

@bennyrowland
Collaborator

@abravalheri what is your proposed content for the settings parameter - is it the inline parameters? I think we would certainly continue to need the pyproject_dict content for some providers: hatch-fancy-pypi-readme for example requires far too much config to include inline.

Passing the list of fields requested of the provider is only relevant to providers which a) provide more than one field and b) can provide at least one field substantially less expensively than another (otherwise you might as well provide all of them). Might it therefore make more sense for such a provider to provide specific settings values that can be used to stop it from doing an expensive check e.g. version = {provider="...", get="VERSION_ONLY"}. At the end of the day, the backend is not going to enforce the provider only providing the fields it specified.

@henryiii
Collaborator Author

henryiii commented Mar 22, 2023

settings is all the options passed in the inline table except for provider= and path=.

I'm thinking of a regex based plugin:

version = {provider = "myregexplugin", regex = 'version = (.*)'}

How would it know what fields to fill in? Would it just return a dict with all known fields set? Actually, there may not be that many:

class RetDict(TypedDict, total=False):
    version: str
    description: str
    requires-python: str
    
    readme: str | dict[str, str]
    
    license: dict[str, str]
    urls: dict[str, str]

    authors: list[dict[str, str]]
    maintainers: list[dict[str, str]]
    
    keywords: list[str]
    classifiers: list[str]
    dependencies: list[str]
    
    scripts: dict[str, dict[str, str]]
    entry-points: dict[str, dict[str, str]]
    gui-scripts: dict[str, dict[str, str]]
    
    optional-dependencies: dict[str, list[str]]

(Note: actually implementing this would require the TypedDict("RetDict", {...}, total=False) form due to the dashes in some entries)

So only three - version, description, and requires-python. The others would already require something different to produce the different return type. So maybe that's fine. Certainly it's not expensive to list the same thing three times.
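
For completeness, the functional form the note above refers to would look something like this (truncated to the three string-valued fields):

from typing import TypedDict

RetDict = TypedDict(
    "RetDict",
    {"version": str, "description": str, "requires-python": str},
    total=False,
)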

@bennyrowland
Collaborator

Ok, I see what you mean. I had imagined that a provider would know in advance what fields it knew how to provide and is doing a specific job, but you could in principle have a provider which is more generic and could be used to provide multiple pieces of metadata simply by calling it with different parameters. Not sure how likely that would actually be in practice though? It would probably be easier to solve with an extra custom parameter indicating which field to return than by returning the same value for all relevant fields and letting the backend choose.

@abravalheri

abravalheri commented Mar 22, 2023

Hi @bennyrowland, please find my comments below, complementing the answer given by Henry.

what is your proposed content for the settings parameter - is it the inline parameters?

Yes, that is my assumption.

You could in principle have a provider which is more generic and could be used to provide multiple pieces of metadata simply by calling it with different parameters. Not sure how likely that would actually be in practice though?

Even if we don't assume a generic plugin, we can already cluster fields that can be satisfied by the same specific plugin:

  • author and maintainer
  • description, readme, license1
  • scripts, gui-scripts, entry-points
  • dependencies, optional-dependencies2

So passing fields helps the plugin implementation avoid looping through the keys (or a specific subset) in dynamic in pyproject_dict and comparing the given value of provider.

It would probably be easier to solve with an extra custom parameter indicating which field to return than by returning the same value for all relevant fields and letting the backend choose.

Considering that the backend already will have to loop through project.dynamic/dynamic-metadata, I believe that passing the iteration key3 as an argument to the dynamic_metadata function is easier than having an extra custom parameter.

The reason why I suggested fields: list[str] instead of field: str is because the backend is recommended to combine all identical calls to the same plugin.

So for example:

[project]
name = "my_proj"
author = {name = "John Doe"}

[project.dynamic]
version = {provider = "setuptools_scm"}
maintainer = {provider = "scan_codeowners.github"}
readme = {provider = "fancy_pypi_readme"}
license = {provider = "expand_spdx", value = "MIT"}
description = {provider = "my_proj.docstring_wrapper", path = "src"}
optional-dependencies.all = {combine = ["dev", "tests"]}

Would be translated in the following calls:

setuptools_scm.dynamic_metadata(["version"], {...}, {})
scan_codeowners.github.dynamic_metadata(["maintainer"], {...}, {})
fancy_pypi_readme.dynamic_metadata(["readme"], {...}, {})
expand_spdx.dynamic_metadata(["license"], {...}, {"value": "MIT"})
my_proj.docstring_wrapper.dynamic_metadata(["description"], {...}, {})
backend_specific_dynamic_handler.dynamic_metadata(["optional-dependencies.all"], {...}, {"combine": ["dev", "tests"]})

Footnotes

  1. Even if PEP 639 goes ahead, the plugin can be used to do text replacements/additions/removals in the license description.

  2. optional-dependencies is a tricky one, because each group can be fulfilled by the plugin.

  3. As in for key, value in pyproject_dict["project"].get("dynamic", {}).items()

@abravalheri

abravalheri commented Mar 22, 2023

I think we would certainly continue to need the pyproject_dict content for some providers

I agree that some edge cases may require the entire pyproject_dict, but I am assuming that this need will be alleviated by settings and fields.

hatch-fancy-pypi-readme for example requires far too much config to include inline.

From the point of view of TOML, this is valid:

[project.dynamic.readme]
provider = "fancy_pypi_readme"
content-type = "text/markdown"
# ... other settings

Not saying it needs to be this way, but it is definitely a possibility, and it will become more viable with newer versions of the TOML spec, https://github.com/toml-lang/toml/pull/904.

@henryiii
Collaborator Author

henryiii commented Mar 22, 2023

Wow, that's fantastic, so glad that happened - it's a great user quality-of-life improvement to be able to add a newline or a trailing comma in an inline table, like already exists for arrays. Though the main problem here is that the fancy pypi readme makes heavy use of tables in arrays. Though tables in arrays will be easier - so maybe.

I'd say the "suggested" way for new plugins is to support inline config, and reading tool.whatever is optional, so I'm still in favor of removing the pyproject dict from the interface. It's kind of a weird thing to pass, and it's not something that's got a precedent in PEP 517, and it's easy enough to parse if a tool wants to support it. settings is the most important one.

Does PEP 621 support mixing dynamic and static items in a table? That is, can you make optional-dependencies.<item> dynamic while the others are static? (This would be very useful, but it is important to know if it's supported.) Currently, PEP 621 doesn't support semi-dynamic fields (fields listed but still statically defined too), though I don't know if tables are special. (IMO, lists would also be useful - you could statically define some trove classifiers, then let the rest be filled in dynamically. Or you could dynamically add some requirements to the existing list based on the platform for a platform-specific wheel, etc. But currently that's not supported.)

@ofek
Contributor

ofek commented Mar 22, 2023

Does PEP 621 support mixing dynamic and static items in a table?

No

@henryiii
Collaborator Author

henryiii commented Mar 22, 2023

That's what I thought. Would there be interest in updating the interaction between dynamic and static in this proposal, or would that be out of scope?

Here's my proposed modification if there was interest:

If a field that can take a list or a table with arbitrary keys is listed in dynamic but also defined statically, the static definition should be seen as a base, but tools statically analyzing this field should not assume that the value is comprehensive if it is listed in dynamic. Build backends and plugins are allowed to add items to the list or keys to the table; however, they should not modify the values provided statically. Tools can add urls, authors, maintainers, keywords, classifiers, dependencies, scripts, entry-points, gui-scripts, or optional-dependencies, but they cannot remove or modify existing items.

And if there's not, I don't think it's such a big deal; a tool could still provide all of optional-dependencies entirely from its own config.

@ofek
Contributor

ofek commented Mar 22, 2023

That seems way too complex. I think this discussion should be on Discourse, btw, so more people can see it.

@henryiii
Collaborator Author

Discourse is the next step, yes, just trying to get a good idea of what to propose before expanding the audience. This is the place to propose and shoot down wild ideas, then the final version will be posted to discuss.python.org. That's why I've been modifying the proposal above, that's what gets copied into discourse. :)

@abravalheri

abravalheri commented Mar 22, 2023

Does PEP 621 support mixing dynamic and static items in a table?

Not yet, I am afraid.
I was waiting for us to achieve a high-level agreement on the proposal, but it was in the back of my mind that optional-dependencies is an edge case that still needs to be discussed.

So far optional-dependencies is all or nothing, which is very inconvenient... The same problem happens for classifiers and keywords, which forced pdm-backend to make some inconvenient changes.

Since the idea is to write a PEP, we could propose a change in this behaviour1, something like:

Proposed changes in the semantics of `project.dynamic`
------------------------------------------------------
:pep:`PEP 621 <621#dynamic>` explicitly forbids a field to be "partially" specified in a static way
(i.e. by associating a value to `project.<field>` in `pyproject.toml`)
and later listed in `dynamic`.

This complicates the mechanism for dynamically defining fields with complex/compound data structures,
such as `keywords`, `classifiers` and `optional-dependencies`, and requires backends to implement
"workarounds". Examples of practices that were impacted by this restriction include:

- `whey`'s re-implementation of [`classifiers`](https://whey.readthedocs.io/en/latest/configuration.html#tconf-tool.whey.base-classifiers) in a `tool` subtable
- the [removal](https://github.com/pdm-project/pdm/pull/759) of the `classifiers` augmentation feature in `pdm-backend`.
- [setuptools restrictions](https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html#dynamic-metadata) on dynamic `optional-dependencies`

In this PEP, we propose to lift this restriction and change the semantics associated with `project.dynamic` in the following manner:

- When a metadata field is simultaneously assigned a value and included in `project.dynamic`,
tools should assume that its value is partially defined. The given static value corresponds
to a subset of the value expected after the build process is complete.
Backends and dynamic providers are allowed to augment the metadata field during the build process.

Footnotes

  1. Not sure how well received it would be, but personally I don't think it is a big deal. Is there any way we can test the waters?

@abravalheri

Sorry, it took me too long to finish writing and you guys were already discussing the same concepts 😝.

@henryiii
Collaborator Author

The initial water test is here. Second water test is in a discourse thread. So yes, I think testing the waters is fine. We can add things to the "rejected ideas" if they get rejected, after all. :)

@abravalheri

abravalheri commented Mar 22, 2023

Perfect!

My personal opinion is that lifting the restrictions on "pre-filling" dynamic is long overdue...
The raison d'être of the [project] section is to communicate between developer and backend, so I think we should make this as smooth as possible, in a way that the experience differs as little as possible when switching backends.

The dynamic field serves a secondary optimization purpose: tools can know whether they need to run the prepare_metadata_for_* hooks or can just read the value directly from pyproject.toml. Since the field would still be listed in dynamic, I don't think that lifting the restriction would prevent tools from performing this optimization.
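For illustration, that frontend optimization might look something like this (a minimal sketch, not a specified API):

import tomllib

def needs_metadata_hook(wanted_fields):
    # If none of the fields the frontend wants are listed in project.dynamic,
    # it can read them statically and skip prepare_metadata_for_build_wheel.
    with open("pyproject.toml", "rb") as f:
        project = tomllib.load(f).get("project", {})
    dynamic = set(project.get("dynamic", []))
    return any(field in dynamic for field in wanted_fields)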

@bennyrowland
Collaborator

I am happy to drop explicit passing of pyproject.toml, but it should probably make it into the "considered alternate ideas" list of the proposal.

The extra complexity with modifying fields is that people will want to run multiple providers on the same field - I am thinking mainly of classifiers, but also dependencies - and then there are issues of precedence etc. In addition, for optional-dependencies you might well want different providers to provide different option sets, so the field you pass to the provider would be a nested field, with the first part naming the field (optional-dependencies) and the second being an arbitrary string under which to store that provider's dependency list; see the sketch below.
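A minimal sketch of that shape, with a toy provider standing in for a real one (the dotted-field convention here is hypothetical):

def dynamic_metadata(fields, settings=None):
    # Toy provider: fill whichever nested extras are requested with a fixed list.
    out = {}
    for field in fields:
        name, _, extra = field.partition(".")  # e.g. "optional-dependencies.docs"
        out.setdefault(name, {})[extra] = ["sphinx>=7"]
    return out

print(dynamic_metadata(["optional-dependencies.docs"]))
# -> {'optional-dependencies': {'docs': ['sphinx>=7']}}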

@henryiii
Collaborator Author

henryiii commented Mar 22, 2023

I'm not sure I'd go with listing optional-dependencies.all in the dynamic list (edit: I thought I saw this) - with this proposal, you'd only have optional-dependencies (which is backwards compatible with the current status quo), and you'd have something like:

[project.optional-dependencies]
test = ["pytest"]
dev = ["ipython"]

[project.dynamic]
optional-dependencies = {all = {combine = ["dev", "test"]}}
# Same:
optional-dependencies.all = {combine = ["dev", "test"]}

With that syntax you can't add an extra actually named provider or path, by the way. Anyway, there is a feature (or limitation) here - you can't have multiple providers on a single key. The "inverted" syntax above is the only one that supports multiple providers for a single field (a different provider per extra).

If you really needed multiple providers per field, I think you could very easily write a little local plugin that combines other plugins yourself, just like how you can wrap PEP 517 backends with a local backend today. But in general, I think one plugin per field is an acceptable limitation.
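Such a local combining plugin could be quite small - an illustrative sketch, with a made-up settings layout ("providers" listing the modules to delegate to):

import importlib

def dynamic_metadata(fields, settings=None):
    # Fan the requested fields out to several real providers and merge the
    # results: lists are concatenated, scalars are last-provider-wins.
    settings = settings or {}
    merged = {}
    for name in settings.get("providers", []):
        provider = importlib.import_module(name)
        partial = provider.dynamic_metadata(fields, settings.get(name))
        for field, value in partial.items():
            if isinstance(value, list):
                merged.setdefault(field, []).extend(value)
            else:
                merged[field] = value
    return merged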

@henryiii
Collaborator Author

The main thing I'm thinking here is that this could be two PEPs - one to allow mixed static and dynamic fields, and one to add plugins. They are related, but I don't think either requires the other, and it might be harder to get one more complex PEP accepted than two smaller ones. Or is it better to keep this a single PEP proposal?

@henryiii
Collaborator Author

henryiii commented Mar 27, 2023

Okay, I've integrated the metadata update and put everything "else" in "rejected ideas". I've really reworked the "reversed order" rejected idea (it's much nicer and more powerful now!), and elevated it a bit in heading emphasis rather than treating it as equivalent to the other rejected ideas. It would enable both multiple plugins per metadata slot and the "add requirements only" idea, and it fits the "one call per plugin" idea better. But it's more complex, more effort to handle without duplication, and needs a separate section - and all those things are pretty minor additions; 95% of users would be happier with the current proposal, I think.

If no one has further comment, ideas, or fixes, I'll make a discuss.python.org topic with this proposal (the issue description) next.

@abravalheri

Hi @henryiii thank you very much for working on this.

I noticed the following items that might need some minor corrections:

  • [tool.setuptools.dynamic] to be used instead of [tool.setuptools.metadata]
  • In the text, the signature dynamic_metadata(pyproject_dict, settings=None) is mentioned (without a fields argument). In the example implementation fields is added.

@henryiii
Collaborator Author

Thanks! I noticed the signature issues and was just updating that. I think we decided on removing pyproject_dict when we added settings, but it wasn't updated everywhere. I think I got it now. I wouldn't have caught the tool.setuptools.dynamic one, though!

@henryiii
Collaborator Author

henryiii commented Mar 28, 2023

So, pinging project maintainers not already in this discussion: @dholth, @takluyver, @frostming, @messense, @FFY00 and @domdfcoding (I'll also ping the Poetry Discord).

For plugin authors, @RonnyPfannschmidt & @hynek.

If you have any opinions on the proposal at https://discuss.python.org/t/pep-for-dynamic-metdata-plugins/25237 (discussion is happening there now, not here), that would be great. There are a few polls you can vote on, too. At the very least, this is hopefully a useful heads-up that this is being discussed!

@henryiii
Collaborator Author

The PyCon Packaging Summit was useful; I'll be working on version 2 of the proposal. It will be a bit slow for a while, since I'm traveling a lot in the near future (and recent past!).
