chore: [pre-commit.ci] pre-commit autoupdate #2343

Merged (3 commits, Oct 3, 2023)
8 changes: 4 additions & 4 deletions .pre-commit-config.yaml
@@ -27,13 +27,13 @@ repos:
        exclude: ^validation/|\.dtd$|\.xml$

  - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: "v0.0.287"
+    rev: "v0.0.292"
    hooks:
      - id: ruff
        args: ["--fix", "--show-fixes"]

  - repo: https://github.com/psf/black-pre-commit-mirror
-    rev: 23.7.0
+    rev: 23.9.1
    hooks:
      - id: black-jupyter

@@ -62,11 +62,11 @@ repos:
    rev: 1.7.0
    hooks:
      - id: nbqa-ruff
-        additional_dependencies: [ruff==0.0.287]
+        additional_dependencies: [ruff==0.0.292]
        args: ["--extend-ignore=F821,F401,F841,F811"]

  - repo: https://github.com/codespell-project/codespell
-    rev: v2.2.5
+    rev: v2.2.6
    hooks:
      - id: codespell
        files: ^.*\.(py|md|rst)$
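The `rev:` pins bumped above are exactly what `pre-commit autoupdate` rewrites. As a stdlib-only sketch (the `snippet` string and the regex parsing are illustrative, not part of the repo), the pinned revisions in such a config can be pulled out like this:

```python
import re

# Illustrative excerpt mirroring the updated pins above (assumption:
# this inline string, not the real config file, is being parsed).
snippet = """\
- repo: https://github.com/astral-sh/ruff-pre-commit
  rev: "v0.0.292"
- repo: https://github.com/codespell-project/codespell
  rev: v2.2.6
"""

# Pair each repo URL's trailing path component with its pinned rev,
# stripping the optional quotes around the rev value.
repos = re.findall(r"repo:\s*(\S+)", snippet)
revs = [r.strip('"') for r in re.findall(r"rev:\s*(\S+)", snippet)]
pins = {url.rsplit("/", 1)[-1]: rev for url, rev in zip(repos, revs)}
print(pins)  # {'ruff-pre-commit': 'v0.0.292', 'codespell': 'v2.2.6'}
```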
2 changes: 1 addition & 1 deletion docs/faq.rst
@@ -141,7 +141,7 @@ How did ``pyhf`` get started?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

In 2017 Lukas Heinrich was discussing with colleague Holger Schulz how it would be convenient
-to share and produce statistical results from LHC experiements if they were able to be
+to share and produce statistical results from LHC experiments if they were able to be
created with tools that didn't require the large ``C++`` dependencies and tooling expertise as
:math:`\HiFa{}`.
Around the same time that Lukas began thinking on these ideas, Matthew Feickert was working on
2 changes: 1 addition & 1 deletion docs/governance/ROADMAP.rst
@@ -7,7 +7,7 @@ This is the pyhf 2019 into 2020 Roadmap (Issue
Overview and Goals
------------------

-We will follow loosely Seibert’s `Heirarchy of
+We will follow loosely Seibert’s `Hierarchy of
Needs <https://twitter.com/FRoscheck/status/1159158552298229763>`__

|Seibert Hierarchy of Needs SciPy 2019| (`Stan
2 changes: 1 addition & 1 deletion src/pyhf/contrib/viz/brazil.py
@@ -112,7 +112,7 @@ def plot_brazil_band(test_pois, cls_obs, cls_exp, test_size, ax, **kwargs):
        ax (:obj:`matplotlib.axes.Axes`): The matplotlib axis object to plot on.

    Returns:
-        :obj:`tuple`: The :obj:`matplotlib.aritst` objects drawn.
+        :obj:`tuple`: The :obj:`matplotlib.artist` objects drawn.
    """
    line_color = kwargs.pop("color", "black")
    (cls_obs_line,) = ax.plot(
4 changes: 2 additions & 2 deletions src/pyhf/tensor/jax_backend.py
@@ -19,7 +19,7 @@
        self.rate = rate

    def sample(self, sample_shape):
        # TODO: Support other dtypes
        return jnp.asarray(
            osp_stats.poisson(self.rate).rvs(size=sample_shape + self.rate.shape),
            dtype=jnp.float64,

(CodeFactor notice on src/pyhf/tensor/jax_backend.py line 22: unresolved comment '# TODO: Support other dtypes' (C100))

@@ -89,8 +89,8 @@

        Args:
            tensor_in (:obj:`tensor`): The input tensor object
-            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be cliped to
-            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be cliped to
+            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be clipped to
+            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be clipped to

        Returns:
            JAX ndarray: A clipped `tensor`
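The `sample` method shown above draws Poisson variates with SciPy and re-wraps them as a float64 JAX array. A minimal sketch of the same shape arithmetic, with NumPy's generator standing in for `osp_stats.poisson(...).rvs` and for `jnp`, so it runs without JAX or SciPy installed:

```python
import numpy as np

rate = np.asarray([1.0, 5.0])  # per-bin Poisson rates
sample_shape = (3,)            # leading batch of independent draws

# size = sample_shape + rate.shape, matching the backend's sample()
samples = np.random.default_rng(0).poisson(
    rate, size=sample_shape + rate.shape
).astype(np.float64)

print(samples.shape)  # (3, 2)
```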
4 changes: 2 additions & 2 deletions src/pyhf/tensor/numpy_backend.py
@@ -98,8 +98,8 @@ def clip(

        Args:
            tensor_in (:obj:`tensor`): The input tensor object
-            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be cliped to
-            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be cliped to
+            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be clipped to
+            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be clipped to

        Returns:
            NumPy ndarray: A clipped `tensor`
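The docstring wording fixed across these backends describes standard clip semantics: values below `min_value` are raised to it, values above `max_value` are lowered to it, and passing `None` for either bound leaves that side open. A sketch of those semantics with `np.clip` directly (not the backend method itself):

```python
import numpy as np

t = np.asarray([-2.0, -1.0, 0.0, 1.0, 2.0])

print(np.clip(t, -1, 1))    # both bounds applied
print(np.clip(t, None, 0))  # None lower bound: only the top is clipped
```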
4 changes: 2 additions & 2 deletions src/pyhf/tensor/pytorch_backend.py
@@ -49,8 +49,8 @@ def clip(self, tensor_in, min_value, max_value):

        Args:
            tensor_in (:obj:`tensor`): The input tensor object
-            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be cliped to
-            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be cliped to
+            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be clipped to
+            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be clipped to

        Returns:
            PyTorch tensor: A clipped `tensor`
4 changes: 2 additions & 2 deletions src/pyhf/tensor/tensorflow_backend.py
@@ -46,8 +46,8 @@ def clip(self, tensor_in, min_value, max_value):

        Args:
            tensor_in (:obj:`tensor`): The input tensor object
-            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be cliped to
-            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be cliped to
+            min_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The minimum value to be clipped to
+            max_value (:obj:`scalar` or :obj:`tensor` or :obj:`None`): The maximum value to be clipped to

        Returns:
            TensorFlow Tensor: A clipped `tensor`