diff --git a/.gitignore b/.gitignore index b59d4ed3..ce0ad021 100644 --- a/.gitignore +++ b/.gitignore @@ -788,3 +788,7 @@ examples/ex1-ase/ /paper/jats/ /doc/_build/ /doc/api/ +/paper/paper.tex +/paper/paper.out +/paper/paper.log +/paper/TT-JOSS-arxiv-submission/ diff --git a/doc/advanced_socket.md b/doc/advanced_socket.md index d441ee23..0e6eb227 100644 --- a/doc/advanced_socket.md +++ b/doc/advanced_socket.md @@ -10,14 +10,15 @@ overhead typically associated with file I/O during calculation restarts. This feature is particularly beneficial for tasks involving repetitive operations like structural optimization and saddle point searches, where traditional file-based communication can become a -bottleneck. The underlying software architecture is shown in [the following figure](#fig-3-sparc-electronic-calculations-with-socket-communication-across-hybrid-computing-platforms): +bottleneck. The underlying software architecture is shown in +[Fig. 1](#scheme-sparc-socket): - +```{figure} img/scheme_socket_hetero.png +:alt: scheme-sparc-socket +:name: scheme-sparc-socket -![scheme-sparc-socket](img/scheme_socket_hetero.png) - - -**TODO** change doc source +Fig. 1. SPARC electronic calculations with socket communication across hybrid computing platforms. +``` **Requirements**: the SPARC binary must be manually compiled from the source code with [socket @@ -35,19 +36,30 @@ The communication protocol implemented in SPARC and SPARC-X-API adheres to the [i-PI protocol](https://github.com/i-pi/i-pi) standard. Specifically, we implement the original i-PI protocol within the SPARC C-source code, while the python SPARC-X-API uses a -backward-compatible protocol based on i-PI. The dual-mode design is aimed for both low-level and -high-level interfacing of the DFT codes, providing the following features as shown in [Fig. 4](#fig-4-overview-of-the-sparc-protocol-as-an-extension-to-the-standard-i-pi-protocol): +backward-compatible protocol based on i-PI. 
The dual-mode design is
+aimed at both low-level and high-level interfacing of the DFT codes,
+providing the following features as shown in [Fig. 2](#scheme-sparc-protocol):
+
+(scheme-sparc-protocol)=
+```{figure} img/scheme_sparc_protocol.png
+:alt: scheme-sparc-protocol
+
+Fig. 2. Overview of the SPARC protocol as an extension to the standard i-PI protocol.
+```
-### Fig. 4. Overview of the SPARC protocol as an extension to the standard i-PI protocol.
-![scheme-sparc-protocol](img/scheme_sparc_protocol.png)
+Depending on the scenario, the socket communication layer can be accessed
+via the following approaches as shown in
+[Fig. 3](#scheme-sparc-modes):
-Based on the scenarios, the socket communication layer can be accessed via the following approaches as shown in [Fig. 5](#fig-5-different-ways-of-using-sparcs-socket-mode):
+(scheme-sparc-modes)=
+```{figure} img/scheme-SPARC-socket-modes.png
+:alt: scheme-sparc-modes
-### Fig. 5. Different ways of using SPARC's socket mode.
-![scheme-sparc-modes](img/scheme-SPARC-socket-modes.png)
+Fig. 3. Different ways of using SPARC's socket mode.
+```
-1. **SPARC binary only** ([Fig. 5](#fig-5-different-ways-of-using-sparcs-socket-mode) **a**)
+1. **SPARC binary only** ([Fig. 3](#scheme-sparc-modes) **a**)
   The SPARC binary with socket support can be readily coupled with any i-PI-compatible socket server, such as `ase.calculators.socketio.SocketIOCalculator`, for example
   to be run on a single computer system.
-2. **Local-only Mode** ([Fig. 5](#fig-5-different-ways-of-using-sparcs-socket-mode) **b**)
+2. **Local-only Mode** ([Fig. 3](#scheme-sparc-modes) **b**)
   Ideal for standalone calculations, this mode simulates a conventional calculator while benefiting from socket-based efficiency.
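All of these modes exchange data through the i-PI wire protocol, in which each command is a fixed-width, space-padded 12-byte ASCII header (e.g. `STATUS`, `POSDATA`, `GETFORCE`), optionally followed by binary payloads. The sketch below illustrates only this framing convention; it is not the actual SPARC or SPARC-X-API implementation, and the payload byte order is assumed little-endian here:

```python
import struct

HDRLEN = 12  # i-PI headers are fixed-width ASCII fields


def pack_header(command: str) -> bytes:
    """Pad a command such as STATUS or GETFORCE to the fixed header width."""
    if len(command) > HDRLEN:
        raise ValueError(f"command exceeds {HDRLEN} bytes: {command!r}")
    return command.encode("ascii").ljust(HDRLEN)


def unpack_header(raw: bytes) -> str:
    """Recover the command name from a received header."""
    return raw[:HDRLEN].decode("ascii").strip()


def pack_positions(coords):
    """Serialize a flat list of Cartesian coordinates as float64 values,
    the kind of payload that follows a POSDATA header."""
    return struct.pack(f"<{len(coords)}d", *coords)
```

Whichever mode is chosen, the only difference is which end of the TCP (or UNIX-domain) socket initiates these messages; the framing itself is identical.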
@@ -79,7 +91,7 @@ Based on the scenarios, the socket communication layer can be accessed via the f
   ```
   For most users we recommend using this mode when performing a calculation on a single HPC node.
-3. **Client (Relay) Mode** ([Fig. 5](#fig-5-different-ways-of-using-sparcs-socket-mode) **c**)
+3. **Client (Relay) Mode** ([Fig. 3](#scheme-sparc-modes) **c**)
   In this mode, the `sparc.SPARC` calculator serves as a passive
   client which listens to a remote i-PI-compatible server. When
@@ -107,7 +119,7 @@ Based on the scenarios, the socket communication layer can be accessed via the f
   automatically determine if it is necessary to restart the SPARC
   subprocess.
-4. **Server Mode** ([Fig. 5](#fig-5-different-ways-of-using-sparcs-socket-mode) **d**)
+4. **Server Mode** ([Fig. 3](#scheme-sparc-modes) **d**)
   Paired with the client mode in (3), SPARC-X-API can be run as a
   socket server, isolated from the node that performs the
@@ -132,7 +144,7 @@ Based on the scenarios, the socket communication layer can be accessed via the f
 ## (In-progress) Controlling SPARC routines from socket interface
-As shown in [Fig. 4](#fig-4-overview-of-the-sparc-protocol-as-an-extension-to-the-standard-i-pi-protocol),
+As shown in [Fig. 2](#scheme-sparc-protocol),
 the SPARC socket protocol design allows bidirectional control of
 internal SPARC routines. Local- or server-mode `sparc.SPARC`
 calculators can communicate with the SPARC binary via functions like
diff --git a/doc/contribute.md b/doc/contribute.md
index 73e24f10..a77ece43 100644
--- a/doc/contribute.md
+++ b/doc/contribute.md
@@ -100,6 +100,7 @@ files and lines, as shown in the following screenshot:
 :align: center
 ```
+(doc-edit)=
 ### Editing documentation
 Source files for documentation are placed under `doc/` directory,
diff --git a/doc/maintainers.md b/doc/maintainers.md
index a5717895..118ac048 100644
--- a/doc/maintainers.md
+++ b/doc/maintainers.md
@@ -5,145 +5,12 @@ developers.
For general guidelines regarding how to contribute to this project please refer to the [how to contribute](contribute.md) page. Most of the tasks in this documentation require `maintain` or `admin` -role for the repository. - -## Github Settings -### Remote Branches - -There are multiple branches required for the CI/CD workflow in -SPARC-X-API. Push / pull request to these branches should only be made by automatic github actions. - -- [`badges`](https://github.com/SPARC-X/SPARC-X-API/tree/badges): - branch for maintaining the svg badges (package version, CI status, - etc.) - - A list of svg badges can be found under `badges/` directory of this - branch. - - See **TODO** for how to add / modify badges to be shown in - the README. - -- [`gh_pages`](https://github.com/SPARC-X/SPARC-X-API/tree/gh_pages): - branch to publish the documentation site. - - -### Github Pages - -To allow pushed to the `gh_pages` branches to be automatically -deployed to the document pages, go to the [pages -setting](https://github.com/SPARC-X/SPARC-X-API/settings/pages) and -set the "Source" to "Deploy from a branch", as well as "Branch" to -"gh_pages", as shown in the UI screenshot below: - -![Github Pages Settings](img/screenshots/github_pages_setting.png) - - -### Secrets - -Environment secrets (such as PyPI access key) can be configured in the -[secrets -setting](https://github.com/SPARC-X/SPARC-X-API/settings/secrets/actions) -panel. Please check the [github -documentation](https://docs.github.com/en/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions) -for more details. - - -## Managing CI/CD Pipelines - -CI/CD pipelines in the SPARC-X-API repo are managed by Github -workflows, consisting of multiple "Actions". Workflow configuration -YAML files are placed under `.github/workflows/` and the workflow -status can be checked at the [Actions -page](https://github.com/SPARC-X/SPARC-X-API/actions). 
Please take a -look at the official [documentation for -actions](https://docs.github.com/en/actions) to get familiar with the syntax. - -All workflows in the SPARC-X-API are designed to be able to run from -manual dispatch (with the `workflow_dispatch` enabled in the YAML -files) for debug purposes, from the "Run workflow" drop panel in the -[actions -page](https://github.com/SPARC-X/SPARC-X-API/actions/workflows), as -shown in the screenshot below: - -![Github Actions Manual Dispatch](img/screenshots/github_action_dispatch.png) - -- [Unit-test - workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/unit_test.yml) - includes several steps to run unit and coverage test. - - - The steps `Create badges` and ` Manually add git badges` defines how - the status badges in `README.md` are created and pushed to the - `badges` branch. - - When adding unit test examples involving real SPARC calculations, - do not use more than 4 MPI cores (may subject to changes) due to - the [resource limitation](https://docs.github.com/en/actions/using-github-hosted-runners/using-github-hosted-runners/about-github-hosted-runners) - of hosted runners. - -- [Publish doc pages - workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/publish_doc_pages.yml) - uses Sphinx to convert `doc/` to doc html files. - - The rendered - changes will only be pushed to the `gh_pages` branch with direct - commit on the master branch or after one PR is merged. - -- [Update JSON schema - workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/update_api.yml) - updates the JSON schema file after a new release in SPARC C/C++ - source code. - - The workflow is run both nightly and after normal push. Change the - behavior as needed. - -- [Publish PyPI - workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/publish-pypi.yml) - package the source as `sparc-x-api` and publish on PyPI. Only - activates on new releases. 
- -## Deploy on conda-forge - -### Managing SPARC-X-API Python Package on conda-forge - -SPARC-X-API is packaged as -[`sparc-x-api`](https://anaconda.org/conda-forge/sparc-x-api) in the -conda-forge channel. The source code (feedstock) for the package is -managed at -[`sparc-x-api-feedstock`](https://github.com/conda-forge/sparc-x-api-feedstock). -Please note that this repository is under the conda-forge -organization. If you wish to become a maintainer, please ping -[@alchem0x2a](https://github.com/alchem0x2A). - -The feedstock is set to track new releases in SPARC-X-API, and usually -no major maintenance is required. The bot system will create a PR for -a version bump, see [one -example](https://github.com/conda-forge/sparc-x-api-feedstock/pull/2) -for the maintainers to modify and merge. Please also ensure: - -- Only the `recipe/meta.yaml` needs to be changed. -- Follow the conda-forge's own [recipe standard](https://conda-forge.org/docs/maintainer/guidelines/) -- Do not directly use the [`.conda/meta.yaml`](https://github.com/SPARC-X/SPARC-X-API/blob/master/.conda/meta.yaml) for conda-forge (it is designed for local packaging test) -- Bump the `build.number` if you need to change the recipe YAML on the same SPARC-X-API release. - -### Managing SPARC C/C++ Package on conda-forge - -The conda-forge [`sparc-x`](https://anaconda.org/conda-forge/sparc-x) -package is managed by the -[`sparc-x-feedstock`](https://github.com/conda-forge/sparc-x-feedstock). The -maintenance rules are similar to `sparc-x-api-feedstock`. There are -several issues for future releases: -- Allow fetching dated releases of SPARC C/C++ code -- Add more build variants for MKL and MPICH - -## Deploy on PyPI - -SPARC-X-API is deployed on PyPI under the name -[`sparc-x-api`](https://pypi.org/project/sparc-x-api/). Please contact -the current maintainer [@alchem0x2a](mailto:alchem0x2a@gmail.com) if -you wish to become a co-contributor. 
-
-Publishing on PyPI does not require setting an API token in the CI
-workflow. Instead, it uses the [OIDC
-protocol](https://docs.pypi.org/trusted-publishers/) for a trusted
-publisher. The current settings on PyPI are like follows:
-
-![pypi-setting](img/screenshots/pypi_publisher_setup.png)
+role for the repository. Please check the following topics for details:
+
+```{toctree}
+:maxdepth: 2
+:caption: Topics
+maintainers/github.md
+maintainers/sparc-x-api.md
+maintainers/sparc-c-c++.md
+```
diff --git a/doc/maintainers/github.md b/doc/maintainers/github.md
new file mode 100644
index 00000000..cf40e318
--- /dev/null
+++ b/doc/maintainers/github.md
@@ -0,0 +1,90 @@
+## Github Settings
+### Remote Branches
+
+There are multiple branches required for the CI/CD workflow in
+SPARC-X-API. Pushes / pull requests to these branches should only be made by automated GitHub Actions.
+
+- [`badges`](https://github.com/SPARC-X/SPARC-X-API/tree/badges):
+  branch for maintaining the SVG badges (package version, CI status,
+  etc.)
+
+  A list of SVG badges can be found under the `badges/` directory of this
+  branch. See [the development guide](#doc-edit) for how to add /
+  modify badges to be shown in the README.
+
+- [`gh_pages`](https://github.com/SPARC-X/SPARC-X-API/tree/gh_pages):
+  branch to publish the documentation site.
+
+
+### Github Pages
+
+To allow pushes to the `gh_pages` branch to be automatically
+deployed to the documentation pages, go to the [pages
+setting](https://github.com/SPARC-X/SPARC-X-API/settings/pages) and
+set "Source" to "Deploy from a branch" and "Branch" to
+"gh_pages", as shown in the UI screenshot below:
+
+![Github Pages Settings](../img/screenshots/github_pages_setting.png)
+
+
+### Secrets
+
+Environment secrets (such as a PyPI access key) can be configured in the
+[secrets
+setting](https://github.com/SPARC-X/SPARC-X-API/settings/secrets/actions)
+panel.
Please check the [GitHub
+documentation](https://docs.github.com/en/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions)
+for more details.
+
+(cicd-sparc-x-api)=
+## Managing CI/CD Pipelines
+
+CI/CD pipelines in the SPARC-X-API repo are managed by GitHub
+workflows, consisting of multiple "Actions". Workflow configuration
+YAML files are placed under `.github/workflows/` and the workflow
+status can be checked at the [Actions
+page](https://github.com/SPARC-X/SPARC-X-API/actions). Please take a
+look at the official [documentation for
+actions](https://docs.github.com/en/actions) to get familiar with the syntax.
+
+All workflows in SPARC-X-API are designed to run from
+manual dispatch (with `workflow_dispatch` enabled in the YAML
+files) for debugging purposes, using the "Run workflow" drop-down panel in the
+[actions
+page](https://github.com/SPARC-X/SPARC-X-API/actions/workflows), as
+shown in the screenshot below:
+
+![Github Actions Manual Dispatch](../img/screenshots/github_action_dispatch.png)
+
+- [Unit-test
+  workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/unit_test.yml)
+  includes several steps to run unit and coverage tests.
+
+  - The steps `Create badges` and `Manually add git badges` define how
+    the status badges in `README.md` are created and pushed to the
+    `badges` branch.
+  - When adding unit test examples involving real SPARC calculations,
+    do not use more than 4 MPI cores (may be subject to change) due to
+    the [resource limitation](https://docs.github.com/en/actions/using-github-hosted-runners/using-github-hosted-runners/about-github-hosted-runners)
+    of hosted runners.
+
+- [Publish doc pages
+  workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/publish_doc_pages.yml)
+  uses Sphinx to convert `doc/` to HTML files.
+
+  The rendered
+  pages are only pushed to the `gh_pages` branch on a direct
+  commit to the master branch or after a PR is merged.
+
+- [Update JSON schema
+  workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/update_api.yml)
+  updates the JSON schema file after a new release of the SPARC C/C++
+  source code.
+
+  The workflow runs both nightly and after each push. Change the
+  behavior as needed.
+
+- [Publish PyPI
+  workflow](https://github.com/SPARC-X/SPARC-X-API/blob/master/.github/workflows/publish-pypi.yml)
+  packages the source as `sparc-x-api` and publishes it on PyPI. It only
+  activates on new releases.
diff --git a/doc/maintainers/sparc-c-c++.md b/doc/maintainers/sparc-c-c++.md
new file mode 100644
index 00000000..dd73741a
--- /dev/null
+++ b/doc/maintainers/sparc-c-c++.md
@@ -0,0 +1,166 @@
+# Maintaining SPARC C/C++ Code
+
+The documentation for the SPARC C/C++ code is complete on its
+own. This maintainers' guide covers topics such as test cases,
+CI/CD management, and package releases.
+
+## Installation of SPARC C/C++ code
+
+Please refer to SPARC's
+[README](https://github.com/SPARC-X/SPARC?tab=readme-ov-file#2-installation)
+for more details.
+
+## Test cases
+
+To run the test cases, please refer to the [main
+README](https://github.com/SPARC-X/SPARC?tab=readme-ov-file#4-execution)
+and [tests'
+README](https://github.com/SPARC-X/SPARC/blob/master/tests/README.md).
+
+The test script `SPARC_testing_script.py` was designed to parse SPARC
+outputs without using SPARC-X-API, and can serve as a
+reference script for testing SPARC-X-API.
+
+### Socket interface test cases
+
+Currently, test cases for socket communication implemented in C/C++
+code are placed under
+[`tests/Socket/`](https://github.com/SPARC-X/SPARC/tree/master/tests/Socket),
+and are not covered by the `SPARC_testing_script.py` script, as most
+of them depend on the socket calculator class in ASE.
+
+```{note}
+As a rule of thumb, the test cases should not rely on SPARC-X-API's
+`sparc.socketio` module (or the `sparc.SPARC` calculator with `use_socket=True`),
+as the design of the Python API may change in the future.
+```
+
+These tests can be run in an environment with ASE and SPARC-X-API (for
+file I/O alone) installed. Assuming you have compiled `sparc` with socket support into `/lib/sparc`, a minimal setup could look like:
+```{code} bash
+pip install "ase>=3.23"
+# Use minimal SPARC-X-API version
+pip install git+https://github.com/SPARC-X/SPARC-X-API.git@v1.0.5
+cd tests/Socket
+# Setup pseudopotential and SPARC command
+export ASE_SPARC_COMMAND="mpirun -n 16 ../lib/sparc"
+export SPARC_PSP_PATH="../../psps"
+# Run all the tests
+bash run_all.sh
+```
+
+## CI/CD workflows
+
+Currently there are two CI/CD workflows that run when commits / PRs are submitted to the
+public SPARC C/C++ repo:
+
+1) Building the SPARC binary and running unit tests [link](https://github.com/SPARC-X/SPARC/blob/master/.github/workflows/build-test.yml)
+2) LaTeX documentation and parameter validation [link](https://github.com/SPARC-X/SPARC/blob/master/.github/workflows/update-doc-pdf.yml)
+
+Like all CI/CD workflows [in SPARC-X-API](#cicd-sparc-x-api), the
+workflows contain the `workflow_dispatch` keyword, allowing them to
+be manually executed / re-run from a personal fork. Please check the
+sections below for more details.
+
+### Unit test workflow
+
+This workflow contains two parts:
+1) Make sure the SPARC version date (defined in `initialization.c`) is
+   synchronized with that in the `ChangeLog`.
(Checked by [`test-outdated-package.py`](https://github.com/SPARC-X/SPARC/blob/master/.github/workflows/test-outdated-package.py))
+2) Make sure the SPARC code compiles and runs on simple test cases
+
+The compilation of the SPARC C/C++ code uses the `conda-build` recipe under [`.conda/meta.yaml`](https://github.com/SPARC-X/SPARC/blob/master/.conda/meta.yaml), and
+only the `quick_run` test cases are checked during this step:
+```{code} bash
+python SPARC_testing_script.py quick_run
+```
+
+
+### Documentation and parameters validation workflow
+
+This is a quick test to ensure that:
+
+1. The LaTeX documentation under `doc/.LaTeX` and its subdirectories can be
+compiled correctly
+2. All the parameter keywords from `.inpt` files
+under the `tests` directory have been documented. (Checked by [`test-missing-parameters.py`](https://github.com/SPARC-X/SPARC/blob/master/.github/workflows/test-missing-parameters.py))
+
+The LaTeX files are compiled using the lightweight [`tectonic`
+engine](https://tectonic-typesetting.github.io/en-US/).
+
+
+## Maintaining the conda-forge release
+
+Compiled binaries of the SPARC C/C++ code are released in the
+conda-forge channel under the name
+[`sparc-x`](https://anaconda.org/conda-forge/sparc-x). The release is
+managed by
+[`sparc-x-feedstock`](https://github.com/conda-forge/sparc-x-feedstock). Please
+note that this repository is under the conda-forge organization. If
+you wish to become a maintainer, please ping
+[@alchem0x2a](https://github.com/alchem0x2A).
+
+The feedstock is set to track [new
+releases](https://github.com/SPARC-X/SPARC/releases) in SPARC-X, and in most cases the changes should only be added to [`recipe/meta.yaml`](https://github.com/conda-forge/sparc-x-feedstock/blob/main/recipe/meta.yaml).
If the future release mechanism of SPARC changes, it should be
+reflected in both the release tag and the following lines in `recipe/meta.yaml`:
+```{code} yaml
+source:
+  url: https://github.com/SPARC-X/{{ package_name }}/archive/refs/tags/v{{ version }}.zip
+  sha256: "full sha256 checksum of the above zip"
+```
+
+```{note}
+conda-forge does not allow direct downloads from a GitHub commit. If a minor update (e.g. nightly builds) is to be distributed by conda-forge, please create a release in the SPARC main repo first.
+```
+
+Once a new release is created in the SPARC main repo, the automated bot
+will send a pull request with the updated recipe (or you can manually
+create the PR yourself). After confirming that all the checks have
+passed, you can merge the pull request to include the compiled
+binaries in the conda-forge channel.
+
+### Revised builds
+
+If by any chance you find the newly released SPARC binaries in the
+conda-forge channel contain errors or missing parts (e.g. not working
+properly on one platform, missing a shared library, errors from
+MPI/LAPACK dependencies, etc.) due to a bug in `recipe/meta.yaml`,
+please update the recipe and only change the build number. For example,
+if the current recipe contains the following build information:
+```{code} yaml
+build:
+  number: 0
+```
+
+After implementing the necessary changes in the recipe, please bump the build number to 1:
+```{code} yaml
+build:
+  number: 1
+```
+
+This allows conda-forge to distinguish the two builds without affecting the version number, as the actual package is named like `-/sparc-x-2.0.0-_.conda`.
+
+```{note}
+If the error stems from the C/C++ code itself, you should update the release in the SPARC GitHub repo instead.
+``` + +### Cross-platform compilation + +Settings for the cross-platform compilation are defined in the +[`conda-forge.yml`](https://github.com/conda-forge/sparc-x-feedstock/blob/main/conda-forge.yml), +which can be safely edited by yourself (the feedstock maintainer), +although it is usually not required. + +Currently the `sparc-x` package is compiled on linux platform alone +with `x86_64` and `aarch64` variants. To change the settings including +host platforms, compilation toolchains (e.g. use native compilation vs +cross-compilation), please refer to the [conda-forge +manual](https://conda-forge.org/docs/maintainer/conda_forge_yml/). + +### Future development +There are +several issues for future releases: +- Allow fetching dated releases of SPARC C/C++ code +- Add more build variants for MKL and MPICH diff --git a/doc/maintainers/sparc-x-api.md b/doc/maintainers/sparc-x-api.md new file mode 100644 index 00000000..d3e059ed --- /dev/null +++ b/doc/maintainers/sparc-x-api.md @@ -0,0 +1,36 @@ +# SPARC-X-API package releases +## On conda-forge + +SPARC-X-API is packaged as +[`sparc-x-api`](https://anaconda.org/conda-forge/sparc-x-api) in the +conda-forge channel. The source code (feedstock) for the package is +managed at +[`sparc-x-api-feedstock`](https://github.com/conda-forge/sparc-x-api-feedstock). +Please note that this repository is under the conda-forge +organization. If you wish to become a maintainer, please ping +[@alchem0x2a](https://github.com/alchem0x2A). + +The feedstock is set to track new releases in SPARC-X-API, and usually +no major maintenance is required. The bot system will create a PR for +a version bump, see [one +example](https://github.com/conda-forge/sparc-x-api-feedstock/pull/2) +for the maintainers to modify and merge. Please also ensure: + +- Only the `recipe/meta.yaml` needs to be changed. 
- Follow conda-forge's own [recipe standard](https://conda-forge.org/docs/maintainer/guidelines/)
+- Do not directly use the [`.conda/meta.yaml`](https://github.com/SPARC-X/SPARC-X-API/blob/master/.conda/meta.yaml) for conda-forge (it is designed for local packaging tests)
+- Bump the `build.number` if you need to change the recipe YAML for the same SPARC-X-API release.
+
+## Deploy on PyPI
+
+SPARC-X-API is deployed on PyPI under the name
+[`sparc-x-api`](https://pypi.org/project/sparc-x-api/). Please contact
+the current maintainer [@alchem0x2a](mailto:alchem0x2a@gmail.com) if
+you wish to become a co-contributor.
+
+Publishing on PyPI does not require setting an API token in the CI
+workflow. Instead, it uses the [OIDC
+protocol](https://docs.pypi.org/trusted-publishers/) for a trusted
+publisher. The current settings on PyPI are as follows:
+
+![pypi-setting](../img/screenshots/pypi_publisher_setup.png)
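For reference, a trusted-publisher setup needs no stored secret: the workflow only grants the publishing job the `id-token: write` permission and calls the official PyPA publishing action. Below is a minimal sketch of such a job (workflow, job, and step names are illustrative only; the authoritative configuration is the repository's `publish-pypi.yml` workflow):

```{code} yaml
# Illustrative OIDC (trusted publisher) release job; names are examples only.
name: Publish to PyPI
on:
  release:
    types: [published]

jobs:
  pypi-publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write  # enables the OIDC token exchange; no API token required
    steps:
      - uses: actions/checkout@v4
      - name: Build sdist and wheel
        run: pipx run build
      - name: Publish via trusted publishing
        uses: pypa/gh-action-pypi-publish@release/v1
```

With this setup, rotating or storing PyPI credentials in the repository secrets is unnecessary; PyPI verifies the workflow's identity directly against the trusted-publisher settings shown in the screenshot above.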