Release v1.2.0 #3934
Conversation
Only upload coverage for Py3.5
Replace the outdated Travis build status badge with the GitHub Actions one.
Revert to the continuously updated v1 tag for the action. Fail CI if the coverage upload fails, since we can now upload coverage for all PRs. Remove the secret token: it is now possible to upload coverage reports to Codecov without a token (from public GitHub repositories only). Upload coverage only if:
- the Python version is 3.5; AND
- it is a push to, or a PR against, the repository 'aiidateam/aiida-core'.
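In a GitHub Actions workflow, the upload step described above might look roughly like this (a sketch only; the step name and file path are illustrative, and the exact inputs of the codecov action may differ):

```yaml
# Sketch of the coverage-upload step described above; details are illustrative.
- name: Upload coverage report
  if: matrix.python-version == 3.5 && github.repository == 'aiidateam/aiida-core'
  uses: codecov/codecov-action@v1  # continuously updated v1 tag
  with:
    file: ./coverage.xml
    fail_ci_if_error: true  # fail CI if the upload fails
```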
This is an updated and more robust version of the checkout action. See https://github.com/actions/checkout for more information. Originally, v2 was attempted, but this clashes with being able to upload coverage, hence `master` is used.
The recent update to Click (7.1) breaks our tests and this CI workflow, because the help output formatting was slightly changed. Until we have resolved other issues, we should pin the Click version to 7.0.
Pin the Click version to 7.0.
Revise the dependency management workflow in accordance with AEP 002.
* Use the .github/CODEOWNERS file to trigger reviews from the dependency-manager team.
* All jobs related to testing the installation of aiida-core with different tools (pip, conda) on different Python versions and backends are moved into the dedicated 'install-tests' workflow.
* The test jobs run as part of the continuous integration on every push are executed against pinned environments (e.g. 'requirements/requirement-py-37.txt').
* The 'install-tests' workflow is triggered for the 'master', 'develop', and release branches ('release/*'), as well as branches related to dependency management (prefixed with 'dm/'). In addition, the workflow is triggered nightly to ensure that changes within the Python ecosystem that break our toolchain, and possibly even tests, are automatically detected within 24 hours.
* A new 'update-requirements' workflow is run on release branches and automatically creates a pull request with revised versions of the requirements files.
* The issues caused by pymatgen's use of setup_requires, previously addressed by installing numpy prior to aiida by default, are now addressed by specifically checking the setuptools version for Python 3.5 only. We fail the installation process with a descriptive message to the user in case the installed setuptools version is insufficient.
* All dependency-management related utility functions, such as generating and validating non-authoritative dependency specification files (e.g. 'environment.yml'), are concentrated in the 'util/dependency_management.py' scripting file.
Used the incorrect 'on.push.branch' key instead of the correct 'on.push.branches' triggers.
And not on forks.
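For reference, the corrected trigger block uses the plural key (a sketch; the branch list shown is illustrative, taken from the branches mentioned above):

```yaml
on:
  push:
    # 'branches' (plural) is the correct key; 'branch' is silently ignored
    branches:
      - master
      - develop
      - 'release/*'
```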
Update requirements workflow triggers
* Implement the dependency_management 'check-requirements' command, to check whether the environments frozen in the 'requirements/*.txt' files match the dependency specification of 'setup.json'.
* Implement the 'check-requirements' GitHub Actions job.
* Checkout the specified head_ref in the update-requirements workflow.
* Execute the 'update-requirements' workflow only upon repository_dispatch with type 'update-requirements-command'.
* Apply suggestions from code review

Co-Authored-By: Leopold Talirz <[email protected]>
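In outline, a 'check-requirements' style consistency check parses both files and verifies that every abstract dependency from 'setup.json' is satisfied by a pin in the 'requirements/*.txt' files. The following is a simplified sketch using the `packaging` library (the function name and file handling are illustrative; the real implementation lives in the dependency_management script):

```python
# Sketch of a 'check-requirements' style consistency check (illustrative only).
from packaging.requirements import Requirement
from packaging.utils import canonicalize_name


def check_requirements(abstract_deps, pinned_lines):
    """Return a list of inconsistencies; an empty list means all is consistent."""
    # Map canonical package name -> pinned version, e.g. 'click==7.0' -> '7.0'.
    pinned = {}
    for line in pinned_lines:
        requirement = Requirement(line)
        pinned[canonicalize_name(requirement.name)] = next(iter(requirement.specifier)).version

    problems = []
    for spec in abstract_deps:
        requirement = Requirement(spec)
        name = canonicalize_name(requirement.name)
        if name not in pinned:
            problems.append('{}: not pinned'.format(requirement.name))
        elif not requirement.specifier.contains(pinned[name]):
            problems.append('{}: pin {} violates {}'.format(
                requirement.name, pinned[name], requirement.specifier))
    return problems
```

For example, `check_requirements(['click~=7.0'], ['click==7.0'])` reports no problems, whereas a pin outside the abstract specifier, or a missing pin, is flagged.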
* Auto-generate all dependent requirements files on commit.
* Add packaging to the 'dev_precommit' extra requirements; required by the dependency_management script.
(#3765) This is the case either when the entry point cannot be found at all or when multiple entry points are registered with the same name.
This is a very useful flag that used to be there at some point.
* Apply naming convention of "plugin" vs "plugin package" consistently across the whole documentation * Update plugin design guidelines with suggestions on how to think about which information to store where and further improvements
This requirements extra does not exist, and has not existed for a very long time.
(#3824) Often one would like to define an exit code for a process where the general gist of the message is known, but the exact form may be slightly situation dependent. Creating individual exit codes for each slight variation is cumbersome, so we introduce a way to parametrize the exit message. We implement a `format` method that formats the message string with the provided keyword arguments and, importantly, returns a new exit code instance with the newly formatted message:

```python
exit_code_template = ExitCode(450, '{parameter} is invalid.')
exit_code_concrete = exit_code_template.format(parameter='some_specific_key')
```

The `exit_code_concrete` is now identical to `exit_code_template` except for the message, whose parameters have been replaced. To implement this new method, we had to turn `ExitCode` from a named tuple into a class, but to keep backwards compatibility we subclass from a namedtuple, which guarantees that the new class behaves exactly like a tuple.
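The namedtuple-subclass pattern described above can be sketched as follows (a simplified illustration only; the real `ExitCode` class in aiida-core has additional fields):

```python
from collections import namedtuple


class ExitCode(namedtuple('ExitCode', ['status', 'message'])):
    """Exit code that still behaves like a tuple but supports templated messages."""

    def format(self, **kwargs):
        """Return a new ExitCode whose message template has been filled in."""
        return ExitCode(self.status, self.message.format(**kwargs))


exit_code_template = ExitCode(450, '{parameter} is invalid.')
exit_code_concrete = exit_code_template.format(parameter='some_specific_key')
```

Because the class subclasses a namedtuple, `exit_code_concrete` still compares equal to the plain tuple `(450, 'some_specific_key is invalid.')`, preserving backwards compatibility, while the template itself is left unchanged.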
(#3873) Without conversion to string, any non-string type will cause the function to raise an `AttributeError`, since it won't have the method `replace`.
The `get_attr` function defined here seems to be used nowhere in the codebase.
Add intersphinx mapping to the Python standard library documentation and to that of various direct dependency libraries:
* `click`
* `flask`
* `flask_restful`
* `kiwipy`
* `plumpy`

Adding this documentation interconnection allows references in our docs like :py:exc:`ValueError` to link directly to the relevant external documentation. This also allows us to cut down the "nitpick exceptions" quite dramatically.

Co-Authored-By: Sebastiaan Huber <[email protected]>
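In Sphinx, such a mapping is configured in `conf.py` roughly as follows (a sketch; the URLs shown are the standard documentation locations, but treat the exact entries as illustrative rather than the ones used in aiida-core):

```python
# conf.py fragment: with the intersphinx extension enabled, a reference such
# as :py:exc:`ValueError` resolves against these external inventories.
intersphinx_mapping = {
    'python': ('https://docs.python.org/3', None),
    'click': ('https://click.palletsprojects.com/', None),
    'flask': ('https://flask.palletsprojects.com/', None),
}
```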
The current `run_api` interface for running the AiiDA REST API had an unnecessarily large number of parameters, making it complicated to use in WSGI scripts, e.g.:

```python
import os

import aiida.restapi
from aiida.restapi import api
from aiida.restapi.run_api import run_api

CONFIG_DIR = os.path.join(os.path.split(os.path.abspath(aiida.restapi.__file__))[0], 'common')

(app, api) = run_api(
    api.App, api.AiidaApi,
    hostname="localhost",
    port=5000,
    config=CONFIG_DIR,
    debug=False,
    wsgi_profile=False,
    hookup=False,
    catch_internal_server=False,
)
```

While all but the first two parameters are keyword arguments, the code would actually crash if they were not provided. In reality, there is no reason to have to specify *any* parameters whatsoever; one should simply be able to call `run_api()`. This commit accomplishes this by defining appropriate default values.
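The fix amounts to giving every parameter a usable default, along these lines (a sketch only; the parameter names mirror the call above, but the actual default values chosen in aiida-core may differ, and the body here is a stand-in):

```python
# Sketch: with a default for every parameter, a bare `run_api()` call works.
# Parameter names mirror the example above; actual defaults may differ.
def run_api(flask_app=None, flask_api=None, hostname='127.0.0.1', port=5000,
            config=None, debug=False, wsgi_profile=False,
            hookup=True, catch_internal_server=True):
    # Stand-in for the real setup logic: just echo the effective settings.
    return {'hostname': hostname, 'port': port, 'debug': debug}


settings = run_api()  # no arguments required any more
```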
Bumps [bleach](https://github.com/mozilla/bleach) from 3.1.1 to 3.1.4. - [Release notes](https://github.com/mozilla/bleach/releases) - [Changelog](https://github.com/mozilla/bleach/blob/master/CHANGES) - [Commits](mozilla/bleach@v3.1.1...v3.1.4) Signed-off-by: dependabot[bot] <[email protected]> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
The `--group-type` option in `verdi group list` and `verdi group path ls` has been deprecated and superseded by `-T/--type-string`. The recently added support for subclassing the `Group` ORM class has made it interesting to be able to query for specific subclasses based on the type strings of groups. Since these can be hierarchical, it is useful to support the inclusion of the SQL wildcards `%` and `_` to perform `LIKE` matching instead of exact matching. Note that this new filtering functionality is only available for `verdi group list` and not for `verdi group path ls`, as the `GroupPath` utility does not support subclassing yet.
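The SQL wildcard semantics referred to above, where `%` matches any sequence of characters and `_` matches exactly one, can be illustrated with a small standalone helper (a hedged sketch; this is not the code path verdi uses, which delegates the matching to the database):

```python
import re


def sql_like_match(pattern, value):
    """Emulate SQL LIKE: '%' matches any run of characters, '_' exactly one."""
    regex = ''.join(
        '.*' if char == '%' else '.' if char == '_' else re.escape(char)
        for char in pattern
    )
    return re.fullmatch(regex, value) is not None
```

For instance, a type-string pattern like `core.%` would match `core.auto` but not `user`.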
This will allow options and arguments that are supposed to match a `Group` instance to narrow the scope of subclasses to be matched. For example, an option or argument with the following type:

```python
GroupParamType(sub_classes=('aiida.groups:core.auto',))
```

will only match group instances that are a subclass of `AutoGroup`. Any other `Group` subclasses will not be matched.
This will enable auto-completion for existing `Code` and `Group` instances by their label for commands that have options or arguments with the corresponding parameter type.
@ltalirz Even though your recipe for
Hm yeah... for some reason, there are just some rather strange python builds on conda-forge right now
However, the new repo immediately got 2 automated PRs for new builds... let's see whether merging them changes the situation...
thanks a lot @sphuber !
Just some very minor suggestions for the changelog
Also, the test-install should work now - installing the conda package works on my machine, at least.
Codecov Report
```
@@           Coverage Diff           @@
##           master    #3934   +/-   ##
=======================================
  Coverage   78.23%   78.23%
=======================================
  Files         461      461
  Lines       34075    34075
=======================================
  Hits        26659    26659
  Misses       7416     7416
```
Continue to review full report at Codecov.
Three failed jobs, three random/transient reasons 🤦
Hm... can we use this snapshot to open an issue on pylint?
I am afraid that it won't be much help to them though. If we cannot provide a MWE of the bug, it will be pretty much impossible for them to debug it. Can always open an issue of course.
@ltalirz I am putting this on hold, because I need to put in a quick fix still, I'm afraid.
Doing so would actually break backwards compatibility. Code that creates groups with explicit custom type strings would no longer be able to query for them, as the type string was silently converted to `core`. Even though accepting the passed `type_string` will cause warnings when loading them from the database, that is preferable to breaking existing code.
@ltalirz I have merged a fix that was blocking this. The tests all passed, including the conda install, even though GitHub incorrectly seems to still be waiting on it to finish. I think we can just temporarily remove the required tag from this step to be able to merge this.
Sounds good to me, feel free to proceed!
thanks, I would like to, but I would first need an approval 😄
Sorry! ;-)
As is the new norm, this release PR comes with a single new commit (the release commit, which is put on top of the current state of `develop`) that has not yet been reviewed elsewhere. @ltalirz please make sure that it is ok before approving, thanks!