Merge pull request #663 from SeldonIO/patch/v0.10.4
Merge patch release v0.10.4 into master
ascillitoe authored Oct 21, 2022
2 parents 9809975 + ce23672 commit 47a0134
Showing 12 changed files with 79 additions and 71 deletions.
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -1,6 +1,6 @@
# Change Log

## v0.10.4dev
## v0.11.0dev
[Full Changelog](https://github.com/SeldonIO/alibi-detect/compare/v0.10.3...master)

### Added
@@ -17,6 +17,14 @@ See the [documentation](https://docs.seldon.io/projects/alibi-detect/en/latest/c
- UTF-8 decoding is enforced when `README.md` is opened by `setup.py`. This is to prevent pip install errors on systems with `PYTHONIOENCODING` set to use other encoders ([#605](https://github.com/SeldonIO/alibi-detect/pull/605)).
- Skip specific save/load tests that require downloading remote artefacts if the relevant URI(s) is/are down ([#607](https://github.com/SeldonIO/alibi-detect/pull/607)).

## v0.10.4
## [v0.10.4](https://github.com/SeldonIO/alibi-detect/tree/v0.10.4) (2022-10-21)
[Full Changelog](https://github.com/SeldonIO/alibi-detect/compare/v0.10.3...v0.10.4)

### Fixed
- Fixed an incorrect default value for the `alternative` kwarg in the `FETDrift` detector ([#661](https://github.com/SeldonIO/alibi-detect/pull/661)).
- Fixed an issue with `ClassifierDrift` returning incorrect prediction probabilities when `train_size` given ([#662](https://github.com/SeldonIO/alibi-detect/pull/662)).

## [v0.10.3](https://github.com/SeldonIO/alibi-detect/tree/v0.10.3) (2022-08-17)
[Full Changelog](https://github.com/SeldonIO/alibi-detect/compare/v0.10.2...v0.10.3)

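The `ClassifierDrift` fix ([#662](https://github.com/SeldonIO/alibi-detect/pull/662)) concerns runs where `train_size` is used instead of `n_folds`, so only a hold-out fraction of the data receives classifier predictions. As context, a minimal usage sketch; the data, model choice and parameter values are illustrative and not taken from this commit:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from alibi_detect.cd import ClassifierDrift

# Illustrative data only: 500 reference and 500 (shifted) test instances.
x_ref = np.random.randn(500, 10)
x_test = np.random.randn(500, 10) + 0.5

# With train_size=.75 the classifier is trained on 75% of the combined
# reference + test data and scored on the held-out 25%; v0.10.4 fixes the
# prediction probabilities returned in exactly this setting.
cd = ClassifierDrift(x_ref, RandomForestClassifier(), backend='sklearn',
                     p_val=.05, train_size=.75)
preds = cd.predict(x_test)
print(preds['data']['is_drift'], preds['data']['p_val'])
```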
4 changes: 2 additions & 2 deletions CITATION.cff
@@ -19,6 +19,6 @@ authors:
- family-names: "Athorne"
given-names: "Alex"
title: "Alibi Detect: Algorithms for outlier, adversarial and drift detection"
version: 0.10.3
date-released: 2022-08-17
version: 0.10.4
date-released: 2022-10-21
url: "https://github.com/SeldonIO/alibi-detect"
4 changes: 2 additions & 2 deletions README.md
@@ -407,8 +407,8 @@ BibTeX entry:
title = {Alibi Detect: Algorithms for outlier, adversarial and drift detection},
author = {Van Looveren, Arnaud and Klaise, Janis and Vacanti, Giovanni and Cobb, Oliver and Scillitoe, Ashley and Samoilescu, Robert and Athorne, Alex},
url = {https://github.com/SeldonIO/alibi-detect},
version = {0.10.3},
date = {2022-08-17},
version = {0.10.4},
date = {2022-10-21},
year = {2019}
}
```
15 changes: 7 additions & 8 deletions alibi_detect/cd/base.py
@@ -153,14 +153,13 @@ def preprocess(self, x: Union[np.ndarray, list]) -> Tuple[Union[np.ndarray, list
else:
return self.x_ref, x # type: ignore[return-value]

def get_splits(self,
x_ref: Union[np.ndarray, list],
x: Union[np.ndarray, list],
return_splits: bool = True
) -> Union[
Tuple[Union[np.ndarray, list], np.ndarray],
Tuple[Union[np.ndarray, list], np.ndarray, Optional[List[Tuple[np.ndarray, np.ndarray]]]]
]:
def get_splits(
self,
x_ref: Union[np.ndarray, list],
x: Union[np.ndarray, list],
return_splits: bool = True
) -> Union[Tuple[Union[np.ndarray, list], np.ndarray],
Tuple[Union[np.ndarray, list], np.ndarray, Optional[List[Tuple[np.ndarray, np.ndarray]]]]]:
"""
Split reference and test data in train and test folds used by the classifier.
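The `get_splits` change above only reflows the signature; behaviour is unchanged. For readers unfamiliar with the method, it labels reference instances 0 and test instances 1, then produces either stratified k-fold splits (when `n_folds` is set) or a single train/hold-out split (when `train_size` is set). A standalone sketch of that idea using scikit-learn, assuming array inputs; this is not the library's exact implementation:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def toy_splits(x_ref, x, n_folds=None, train_size=None, seed=0):
    """Build (train, test) index splits over the combined data, with
    reference instances labelled 0 and test instances labelled 1."""
    x_all = np.concatenate([x_ref, x], axis=0)
    y_all = np.concatenate([np.zeros(len(x_ref)), np.ones(len(x))]).astype(int)
    if n_folds is not None:
        # k-fold: every instance eventually gets an out-of-fold prediction
        skf = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
        splits = list(skf.split(x_all, y_all))
    else:
        # single split: only the hold-out fraction gets predictions
        idx = np.random.default_rng(seed).permutation(len(y_all))
        n_tr = int(train_size * len(y_all))
        splits = [(idx[:n_tr], idx[n_tr:])]
    return x_all, y_all, splits

x_all, y_all, splits = toy_splits(np.random.randn(100, 5), np.random.randn(80, 5), n_folds=5)
```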
2 changes: 1 addition & 1 deletion alibi_detect/cd/fet.py
@@ -14,7 +14,7 @@ def __init__(
update_x_ref: Optional[Dict[str, int]] = None,
preprocess_fn: Optional[Callable] = None,
correction: str = 'bonferroni',
alternative: str = 'decrease',
alternative: str = 'greater',
n_features: Optional[int] = None,
input_shape: Optional[tuple] = None,
data_type: Optional[str] = None
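The one-line change above replaces the previous `'decrease'` default for `alternative`, which is not one of the documented options, with `'greater'`. `FETDrift` is built around Fisher's exact test, so a standalone scipy illustration of what the `alternative` argument controls may help; the contingency table below is made up for the example:

```python
import numpy as np
from scipy.stats import fisher_exact

# 2x2 contingency table for one binary feature:
# rows are [test window, reference window], columns are [count of 1s, count of 0s].
table = np.array([[30, 70],   # test window: 30% ones
                  [10, 90]])  # reference window: 10% ones

# alternative='greater' asks whether the odds of observing a 1 have
# increased relative to the reference data (the new FETDrift default).
odds_ratio, p_val = fisher_exact(table, alternative='greater')
print(f'odds ratio = {odds_ratio:.2f}, p-value = {p_val:.4f}')
```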
3 changes: 2 additions & 1 deletion alibi_detect/cd/pytorch/classifier.py
@@ -174,7 +174,6 @@ def score(self, x: Union[np.ndarray, list]) -> Tuple[float, float, np.ndarray, n
and the out-of-fold classifier model prediction probabilities on the reference and test data
"""
x_ref, x = self.preprocess(x)
n_ref, n_cur = len(x_ref), len(x)
x, y, splits = self.get_splits(x_ref, x) # type: ignore

# iterate over folds: train a new model for each fold and make out-of-fold (oof) predictions
@@ -200,6 +199,8 @@ probs_oof = softmax(preds_oof, axis=-1) if self.preds_type == 'logits' else preds_oof
probs_oof = softmax(preds_oof, axis=-1) if self.preds_type == 'logits' else preds_oof
idx_oof = np.concatenate(idx_oof_list, axis=0)
y_oof = y[idx_oof]
n_cur = y_oof.sum()
n_ref = len(y_oof) - n_cur
p_val, dist = self.test_probs(y_oof, probs_oof, n_ref, n_cur)
probs_sort = probs_oof[np.argsort(idx_oof)]
return p_val, dist, probs_sort[:n_ref, 1], probs_sort[n_ref:, 1]
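The two added lines compute `n_cur` and `n_ref` from the out-of-fold labels rather than from `len(x_ref)` and `len(x)`. When `train_size` is used, only the hold-out portion of the data receives out-of-fold predictions, so the original counts no longer match `y_oof`. A small numpy illustration with made-up sizes:

```python
import numpy as np

# 100 reference instances (label 0) and 100 test instances (label 1),
# but a 75/25 train/hold-out split leaves oof predictions for only 50 of them.
y = np.concatenate([np.zeros(100, dtype=int), np.ones(100, dtype=int)])
idx_oof = np.random.default_rng(0).permutation(200)[:50]
y_oof = y[idx_oof]

n_cur = y_oof.sum()         # test instances that actually received oof predictions
n_ref = len(y_oof) - n_cur  # reference instances that actually received oof predictions
print(n_ref, n_cur)         # sums to 50, not to the original 200
```

The same change is applied to the sklearn and tensorflow classifier backends below.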
6 changes: 2 additions & 4 deletions alibi_detect/cd/sklearn/classifier.py
@@ -252,29 +252,27 @@ def score(self, x: Union[np.ndarray, list]) -> Tuple[float, float, np.ndarray, n

def _score(self, x: Union[np.ndarray, list]) -> Tuple[float, float, np.ndarray, np.ndarray]:
x_ref, x = self.preprocess(x)
n_ref, n_cur = len(x_ref), len(x)
x, y, splits = self.get_splits(x_ref, x, return_splits=True) # type: ignore

# iterate over folds: train a new model for each fold and make out-of-fold (oof) predictions
probs_oof_list, idx_oof_list = [], []
for idx_tr, idx_te in splits:
y_tr = y[idx_tr]

if isinstance(x, np.ndarray):
x_tr, x_te = x[idx_tr], x[idx_te]
elif isinstance(x, list):
x_tr, x_te = [x[_] for _ in idx_tr], [x[_] for _ in idx_te]
else:
raise TypeError(f'x needs to be of type np.ndarray or list and not {type(x)}.')

self.model.fit(x_tr, y_tr)
probs = self.model.aux_predict_proba(x_te)
probs_oof_list.append(probs)
idx_oof_list.append(idx_te)

probs_oof = np.concatenate(probs_oof_list, axis=0)
idx_oof = np.concatenate(idx_oof_list, axis=0)
y_oof = y[idx_oof]
n_cur = y_oof.sum()
n_ref = len(y_oof) - n_cur
p_val, dist = self.test_probs(y_oof, probs_oof, n_ref, n_cur)
probs_sort = probs_oof[np.argsort(idx_oof)]
return p_val, dist, probs_sort[:n_ref, 1], probs_sort[n_ref:, 1]
3 changes: 2 additions & 1 deletion alibi_detect/cd/tensorflow/classifier.py
@@ -161,7 +161,6 @@ def score(self, x: np.ndarray) -> Tuple[float, float, np.ndarray, np.ndarray]:
and the out-of-fold classifier model prediction probabilities on the reference and test data
"""
x_ref, x = self.preprocess(x) # type: ignore[assignment]
n_ref, n_cur = len(x_ref), len(x)
x, y, splits = self.get_splits(x_ref, x) # type: ignore

# iterate over folds: train a new model for each fold and make out-of-fold (oof) predictions
@@ -187,6 +186,8 @@ def score(self, x: np.ndarray) -> Tuple[float, float, np.ndarray, np.ndarray]:
probs_oof = softmax(preds_oof, axis=-1) if self.preds_type == 'logits' else preds_oof
idx_oof = np.concatenate(idx_oof_list, axis=0)
y_oof = y[idx_oof]
n_cur = y_oof.sum()
n_ref = len(y_oof) - n_cur
p_val, dist = self.test_probs(y_oof, probs_oof, n_ref, n_cur)
probs_sort = probs_oof[np.argsort(idx_oof)]
return p_val, dist, probs_sort[:n_ref, 1], probs_sort[n_ref:, 1]
3 changes: 2 additions & 1 deletion alibi_detect/version.py
@@ -2,4 +2,5 @@
# 1) we don't load dependencies by storing it in __init__.py
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module module
__version__ = "0.10.4dev"

__version__ = "0.11.0dev"
4 changes: 2 additions & 2 deletions doc/source/cd/methods/fetdrift.ipynb
@@ -39,7 +39,7 @@
"\n",
"The null hypothesis is $H_0: \\widehat{OR}=1$. In other words, the proportion of 1's to 0's is unchanged between the test and reference distributions, such that the odds of 1's vs 0's is independent of whether the data is drawn from the reference or test distribution. The offline FET detector can perform one-sided or two-sided tests, with the alternative hypothesis set by the `alternative` keyword argument:\n",
"\n",
"- If `alternative='greater'`, the alternative hypothesis is $H_a: \\widehat{OR}>1$ i.e. proportion of 1's versus 0's has increased compared to reference distribution.\n",
"- If `alternative='greater'`, the alternative hypothesis is $H_a: \\widehat{OR}>1$ i.e. proportion of 1's versus 0's has increased compared to the reference distribution.\n",
"- If `alternative='less'`, the alternative hypothesis is $H_a: \\widehat{OR}<1$ i.e. the proportion of 1's versus 0's has decreased compared to the reference distribution.\n",
"- If `alternative='two-sided'`, the alternative hypothesis is $H_a: \\widehat{OR} \\ne 1$ i.e. the proportion of 1's versus 0's has changed compared to the reference distribution.\n",
"\n",
@@ -56,7 +56,7 @@
"\n",
"Arguments:\n",
"\n",
"* `x_ref`: Data used as reference distribution.\n",
"* `x_ref`: Data used as reference distribution. Note this should be the raw data, for example `np.array([0, 0, 1, 0, 0, 0])`, not the 2x2 contingency table.\n",
"\n",
"Keyword arguments:\n",
"\n",
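Following the documented arguments above, a minimal instantiation sketch; the binomial data and parameter values are illustrative only:

```python
import numpy as np
from alibi_detect.cd import FETDrift

# Raw binary reference data (not a 2x2 contingency table), as noted above:
# 1000 instances of 3 Bernoulli(0.1) features.
x_ref = np.random.binomial(1, 0.1, size=(1000, 3))

# alternative='greater' (the default as of v0.10.4) flags drift when the
# proportion of 1s increases relative to the reference data.
cd = FETDrift(x_ref, p_val=.05, alternative='greater')

# Test window with a higher proportion of 1s.
x_test = np.random.binomial(1, 0.3, size=(1000, 3))
preds = cd.predict(x_test)
print(preds['data']['is_drift'])
```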
60 changes: 30 additions & 30 deletions licenses/license.txt
@@ -639,7 +639,7 @@ the file ChangeLog history information documenting your changes. Please read
the FAQ for more information on the distribution of modified source versions.

PyWavelets
1.4.0
1.4.1
MIT License
Copyright (c) 2006-2012 Filip Wasilewski <http://en.ig.ma/>
Copyright (c) 2012-2020 The PyWavelets Developers <https://github.com/PyWavelets/pywt>
@@ -689,7 +689,7 @@ SOFTWARE.


alibi-detect
0.10.4.dev0
0.10.3
Apache Software License
Apache License
Version 2.0, January 2004
@@ -921,7 +921,7 @@ SOFTWARE.


certifi
2022.9.14
2022.9.24
Mozilla Public License 2.0 (MPL 2.0)
This package contains a modified version of ca-bundle.crt:

@@ -1111,7 +1111,7 @@ For more information, please refer to <http://unlicense.org>


fonttools
4.37.2
4.37.4
MIT License
MIT License

@@ -1137,7 +1137,7 @@ SOFTWARE.


fsspec
2022.8.2
2022.10.0
BSD License
BSD 3-Clause License

@@ -1171,7 +1171,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


huggingface-hub
0.9.1
0.10.1
Apache Software License
Apache License
Version 2.0, January 2004
@@ -1438,7 +1438,7 @@ THE SOFTWARE.


imageio
2.21.3
2.22.2
BSD License
Copyright (c) 2014-2022, imageio developers
All rights reserved.
@@ -1467,7 +1467,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


importlib-metadata
4.12.0
5.0.0
Apache Software License

Apache License
@@ -1840,7 +1840,7 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


matplotlib
3.6.0
3.6.1
Python Software Foundation License
License agreement for matplotlib versions 1.3.0 and later
=========================================================
@@ -1943,7 +1943,7 @@ Licensee agrees to be bound by the terms and conditions of this License
Agreement.

networkx
2.8.6
2.8.7
BSD License
NetworkX is distributed with the 3-clause BSD license.

@@ -5379,14 +5379,14 @@ under the terms of *both* these licenses.


pandas
1.4.4
1.5.1
BSD License
BSD 3-Clause License

Copyright (c) 2008-2011, AQR Capital Management, LLC, Lambda Foundry, Inc. and PyData Development Team
All rights reserved.

Copyright (c) 2011-2021, Open source contributors.
Copyright (c) 2011-2022, Open source contributors.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
@@ -5555,7 +5555,7 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
The above BSD License Applies to all code, even that also covered by Apache 2.0.

pytz
2022.2.1
2022.5
MIT License
Copyright (c) 2003-2019 Stuart Bishop <[email protected]>

@@ -6092,7 +6092,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


scipy
1.9.1
1.9.3
BSD License
Copyright (c) 2001-2002 Enthought, Inc. 2003-2022, SciPy Developers.
All rights reserved.
@@ -6127,7 +6127,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

----

This binary distribution of Scipy also bundles the following software:
This binary distribution of SciPy also bundles the following software:


Name: OpenBLAS
@@ -7058,7 +7058,7 @@ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

tifffile
2022.8.12
2022.10.10
BSD License
BSD 3-Clause License

@@ -7093,7 +7093,7 @@ POSSIBILITY OF SUCH DAMAGE.


tokenizers
0.12.1
0.13.1
Apache Software License
UNKNOWN

@@ -7216,7 +7216,7 @@ CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


transformers
4.22.0
4.23.1
Apache Software License
Copyright 2018- The Hugging Face team. All rights reserved.

@@ -7424,7 +7424,7 @@ Copyright 2018- The Hugging Face team. All rights reserved.


typing-extensions
4.3.0
4.4.0
Python Software Foundation License
A. HISTORY OF THE SOFTWARE
==========================
@@ -7441,12 +7441,11 @@ software.

In May 2000, Guido and the Python core development team moved to
BeOpen.com to form the BeOpen PythonLabs team. In October of the same
year, the PythonLabs team moved to Digital Creations (now Zope
Corporation, see http://www.zope.com). In 2001, the Python Software
Foundation (PSF, see http://www.python.org/psf/) was formed, a
non-profit organization created specifically to own Python-related
Intellectual Property. Zope Corporation is a sponsoring member of
the PSF.
year, the PythonLabs team moved to Digital Creations, which became
Zope Corporation. In 2001, the Python Software Foundation (PSF, see
https://www.python.org/psf/) was formed, a non-profit organization
created specifically to own Python-related Intellectual Property.
Zope Corporation was a sponsoring member of the PSF.

All Python releases are Open Source (see http://www.opensource.org for
the Open Source Definition). Historically, most, but not all, Python
@@ -7502,8 +7501,9 @@ analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are
retained in Python alone or in any derivative version prepared by Licensee.
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
@@ -7608,9 +7608,9 @@ version prepared by Licensee. Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement. This Agreement together with
Python 1.6.1 may be located on the Internet using the following
Python 1.6.1 may be located on the internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013. This
Agreement may also be obtained from a proxy server on the Internet
Agreement may also be obtained from a proxy server on the internet
using the following URL: http://hdl.handle.net/1895.22/1013".

3. In the event Licensee prepares a derivative work that is based on
@@ -7709,7 +7709,7 @@ SOFTWARE.


zipp
3.8.1
3.9.0
MIT License
Copyright Jason R. Coombs
