
Commit

merge in main
callumrollo committed Nov 29, 2024
2 parents d1dbe13 + 83fb120 commit 14047e2
Showing 82 changed files with 954 additions and 478 deletions.
113 changes: 113 additions & 0 deletions .github/workflows/publish-to-pypi.yml
@@ -0,0 +1,113 @@
name: Publish Python 🐍 distribution 📦 to PyPI

on: push

jobs:
  build:
    name: Build distribution 📦
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - name: Install pypa/build
        run: python3 -m pip install build --user
      - name: Build a binary wheel and a source tarball
        run: python3 -m build
      - name: Store the distribution packages
        uses: actions/upload-artifact@v4
        with:
          name: python-package-distributions
          path: dist/

  publish-to-pypi:
    name: >-
      Publish Python 🐍 distribution 📦 to PyPI
    if: startsWith(github.ref, 'refs/tags/')  # only publish to PyPI on tag pushes
    needs:
      - build
    runs-on: ubuntu-latest
    environment:
      name: pypi
      url: https://pypi.org/p/pyglider
    permissions:
      id-token: write  # IMPORTANT: mandatory for trusted publishing

    steps:
      - name: Download all the dists
        uses: actions/download-artifact@v4
        with:
          name: python-package-distributions
          path: dist/
      - name: Publish distribution 📦 to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1

  github-release:
    name: >-
      Sign the Python 🐍 distribution 📦 with Sigstore
      and upload them to GitHub Release
    needs:
      - publish-to-pypi
    runs-on: ubuntu-latest

    permissions:
      contents: write  # IMPORTANT: mandatory for making GitHub Releases
      id-token: write  # IMPORTANT: mandatory for sigstore

    steps:
      - name: Download all the dists
        uses: actions/download-artifact@v4
        with:
          name: python-package-distributions
          path: dist/
      - name: Sign the dists with Sigstore
        uses: sigstore/[email protected]
        with:
          inputs: >-
            ./dist/*.tar.gz
            ./dist/*.whl
      - name: Create GitHub Release
        env:
          GITHUB_TOKEN: ${{ github.token }}
        run: >-
          gh release create
          '${{ github.ref_name }}'
          --repo '${{ github.repository }}'
          --notes ""
      - name: Upload artifact signatures to GitHub Release
        env:
          GITHUB_TOKEN: ${{ github.token }}
        # Upload to GitHub Release using the `gh` CLI.
        # `dist/` contains the built packages, and the
        # sigstore-produced signatures and certificates.
        run: >-
          gh release upload
          '${{ github.ref_name }}' dist/**
          --repo '${{ github.repository }}'

  publish-to-testpypi:
    name: Publish Python 🐍 distribution 📦 to TestPyPI
    needs:
      - build
    runs-on: ubuntu-latest

    environment:
      name: testpypi
      url: https://test.pypi.org/p/pyglider

    permissions:
      id-token: write  # IMPORTANT: mandatory for trusted publishing

    steps:
      - name: Download all the dists
        uses: actions/download-artifact@v4
        with:
          name: python-package-distributions
          path: dist/
      - name: Publish distribution 📦 to TestPyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/
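The workflow builds the sdist and wheel with `python -m build` and only publishes to PyPI on tag pushes. A quick way to sanity-check the build step locally before pushing a tag is sketched below; this is illustrative only — the helper name is invented, and it assumes the `build` package (`pip install build`) is available in the current environment.

```python
# Local sketch (not in the repo): mirror the CI build job by invoking
# `python -m build`, then list the artifacts written to dist/.
import pathlib
import subprocess
import sys


def build_distributions(project_dir: str = ".") -> list:
    """Build an sdist and a wheel into dist/, like the workflow's build job."""
    subprocess.run([sys.executable, "-m", "build", project_dir], check=True)
    return sorted((pathlib.Path(project_dir) / "dist").glob("*"))


if __name__ == "__main__":
    for artifact in build_distributions():
        print(artifact.name)  # e.g. the .tar.gz source tarball and the .whl wheel
```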
37 changes: 14 additions & 23 deletions .github/workflows/tests.yml
@@ -7,42 +7,33 @@ jobs:
     strategy:
       matrix:
         os: ["ubuntu-latest"]
-        python-version: ["3.9", "3.10"]
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
     steps:
-      - uses: actions/checkout@v2
-      - name: Cache conda
-        uses: actions/cache@v2
-        env:
-          # Increase this value to reset cache if etc/example-environment.yml has not changed
-          CACHE_NUMBER: 0
+      - uses: actions/checkout@v4
+      - name: mamba setup environment
+        uses: mamba-org/[email protected]
         with:
-          path: ~/conda_pkgs_dir
-          key:
-            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-${{hashFiles('environment.yml') }}
-      - uses: conda-incubator/setup-miniconda@v2
-        with:
-          activate-environment: pyglider-test
+          environment-name: test-env
           environment-file: tests/environment.yml
-          python-version: ${{ matrix.python-version }}
-          channel-priority: strict
-          use-only-tar-bz2: true # IMPORTANT: This needs to be set for caching to work properly!
+          create-args: >-
+            python=${{ matrix.python-version }}
       - name: Conda info
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
         run: conda info; conda list
       - name: install pyglider source
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
         run: which pip; pip install -e .
       - name: Process seaexplorer
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
        run: which python; cd tests/example-data/example-seaexplorer; make clean-all; python process_deploymentRealTime.py
       - name: Process slocum
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
         run: which python; cd tests/example-data/example-slocum; make clean-all; python process_deploymentRealTime.py
       - name: Process seaexplorer-legato-flntu-arod-ad2cp
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
         run: which python; cd tests/example-data/example-seaexplorer-legato-flntu-arod-ad2cp; make clean-all; python process_deploymentRealTime.py
       - name: Run tests
-        shell: bash -l {0}
+        shell: micromamba-shell {0}
         run: which python; pytest --cov --cov-report xml
       - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v3
+        uses: codecov/codecov-action@v4
50 changes: 40 additions & 10 deletions .readthedocs.yaml
@@ -1,12 +1,42 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
 version: 2
-python:
-  version: 3.8
-  install:
-    - method: pip
-      path: .
-      extra_requirements:
-        - docs
-
-# Build documentation in the docs/ directory with Sphinx
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.12"
+    # You can also specify other tool versions:
+    # nodejs: "20"
+    # rust: "1.70"
+    # golang: "1.20"
+
+# Build documentation in the "docs/" directory with Sphinx
 sphinx:
-  configuration: docs/conf.py
+  configuration: docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  # fail_on_warning: true
+
+python:
+  install:
+    - requirements: docs-requirements.txt
+    # Install our python package before building the docs
+    - method: pip
+      path: .
+
+# Optionally build your docs in additional formats such as PDF and ePub
+# formats:
+#   - pdf
+#   - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+# python:
+#   install:
+#     - requirements: docs/requirements.txt
18 changes: 6 additions & 12 deletions docs/getting-started-slocum.md
Original file line number Diff line number Diff line change
@@ -2,19 +2,12 @@

 ## Gather data

-Slocum gliders have 4 types of files. For telemetry data there is `*.sbd` files for sensor data, and `*.tbd` for the glider's attitude and position data. These are called `*.dbd` and `*.ebd` respectively, when retrieved from the gliders' payload post deployment. These need to be made available in a single directory for `pyglider` to process.
+Slocum gliders have four types of files. For telemetry data there are `*.tbd` files for the sensor data and `*.sbd` files for the glider's attitude and position data. These are called `*.ebd` and `*.dbd` respectively when retrieved from the glider's payload after deployment. Modern gliders also produce compressed versions of these, e.g. `*.tcd` and `*.scd`, which *pyglider* should be able to parse. All of these data files need to be made available in a _single_ directory for *pyglider* to process. Note that on the glider they are often separated into `science/logs` and `flight/logs`.

-Slocums also have a sensor cache file `*.cac`, all of which have randomized names. These are needed by the processing, and are usually stored in a separate cache directory.
+Slocum gliders also have sensor cache files (`*.cac`), which have randomized names. These are needed by the processing and are usually stored in a separate cache directory.

-You can download and expand example data using `.get_example_data`:
-
-```python
-import pyglider.example_data as pexamp
-
-pexamp.get_example_data('./')
-```
-
-which will add a local directory `example-data` to your current directory.
+You can download example data from <https://cproof.uvic.ca/pyglider-example-data/pyglider-example-data.zip>; unpacking the archive adds a local directory `example-data` to your current directory (a download sketch follows this hunk).

 ## Make a deployment configuration file
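The added paragraph above replaces the removed `get_example_data` helper with a plain download link. As a rough illustration only — not code from the repository — the snippet below fetches and unpacks that archive with the standard library; the URL is the one given in the docs, while the function name and destination argument are made up for the example.

```python
# Hypothetical helper (not part of pyglider): download and unpack the
# example-data archive referenced in docs/getting-started-slocum.md.
import io
import urllib.request
import zipfile

EXAMPLE_DATA_URL = (
    "https://cproof.uvic.ca/pyglider-example-data/pyglider-example-data.zip"
)


def fetch_example_data(dest: str = ".") -> None:
    """Download the example-data zip and extract it into *dest*."""
    with urllib.request.urlopen(EXAMPLE_DATA_URL) as response:
        archive = zipfile.ZipFile(io.BytesIO(response.read()))
    archive.extractall(dest)  # per the docs, this yields an example-data/ directory


if __name__ == "__main__":
    fetch_example_data()
```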

@@ -33,8 +26,9 @@ The example script is relatively straight forward if there is no intermediate processing

 Data comes from an input directory and is translated into a single CF-compliant netCDF timeseries file using the package [dbdreader](https://dbdreader.readthedocs.io/en/latest/). Finally, individual profiles are saved, as is a 2-D, 1-m grid in time-depth (a sketch of this flow follows this file's diff).

-```{note} There is a version that does not require `dbdreader` to do the initial conversion from the Dinkum format to netCDF. However it is quite slow, particularly for full-resolution datasets, and less robust. We suggest using the `slocum.raw_to_timeseries`.
-```
+:::{note}
+There is a version that does not require `dbdreader` to do the initial conversion from the Dinkum format to netCDF. However it is quite slow, particularly for full-resolution datasets, and less robust. We suggest using `slocum.raw_to_timeseries`.
+:::

 Between these steps the user may want to add screening steps or adjustments to the calibrations. PyGlider does not provide those steps, but is designed so that they are easy to add.

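The paragraph in the hunk above describes the processing flow: raw Dinkum files become a single CF-compliant netCDF timeseries via `dbdreader`, and then individual profiles and a 1-m time-depth grid are written out. The sketch below is modelled loosely on the example `process_deploymentRealTime.py` scripts exercised by the CI workflow above; the directory names are placeholders, and the exact function names and keyword arguments are assumptions to verify against the pyglider API documentation.

```python
# Rough sketch of a Slocum processing driver (directory names are
# placeholders; function names/arguments are assumptions to verify
# against the pyglider documentation).
import pyglider.ncprocess as ncprocess
import pyglider.slocum as slocum

rawdir = 'realtime_raw/'            # *.sbd/*.tbd (or *.dbd/*.ebd) files in one directory
cachedir = 'cac/'                   # sensor cache files (*.cac)
deploymentyaml = 'deployment.yml'   # deployment configuration file
tsdir = 'L0-timeseries/'
profiledir = 'L0-profiles/'
griddir = 'L0-gridfiles/'

# raw Dinkum files -> single CF-compliant netCDF timeseries (uses dbdreader)
outname = slocum.binary_to_timeseries(
    rawdir, cachedir, tsdir, deploymentyaml, search='*.[s|t]bd'
)

# timeseries -> one netCDF file per profile
ncprocess.extract_timeseries_profiles(outname, profiledir, deploymentyaml)

# timeseries -> 2-D grid in time-depth (1 m bins)
ncprocess.make_gridfiles(outname, griddir, deploymentyaml, dz=1)
```

As the docs note, any screening or calibration adjustments would slot in between these calls.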
2 changes: 1 addition & 1 deletion environment.yml
@@ -9,9 +9,9 @@ dependencies:
   - dask
   - netcdf4
   - gsw
+  - polars>=1.1
   - scipy
   - bitstring
   - pooch
   - pip:
     - dbdreader
-    - polars
2 changes: 1 addition & 1 deletion pyglider/_version.py
@@ -1 +1 @@
-__version__ = '0.0.4'
+__version__ = '0.0.7'