pyspelling #188

Merged: 5 commits, Mar 7, 2023
Changes from 2 commits
43 changes: 43 additions & 0 deletions .pyspelling.yml
@@ -0,0 +1,43 @@
matrix:
- name: Python Source
  aspell:
    lang: en
    d: en_GB
  dictionary:
    wordlists:
    - wordlist-custom.txt
Contributor Author: please suggest a better place/name for putting the wordlist :)

Collaborator: I'm fine with it being in root.

  pipeline:
  - pyspelling.filters.python:
      strings: true
  - pyspelling.filters.context:
      context_visible_first: true
      # escapes: \\[\\`~]
      delimiters:
      # ignore code examples
      - open: '>>>'
        close: '$'
      # ignore .. math:: blocks
      - open: '(?s)^(?P<open> *)\.\. math::[\n]*'
        close: '^[ ]*$'
      # ignore :math:`` inline
      - open: ':math:(?P<open>`+)'
        close: '(?P=open)'
      # Ignore multiline content between fences (fences can have 3 or more back ticks)
      # ```
      # content
      # ```
      - open: '(?s)^(?P<open> *`{3,})$'
        close: '^(?P=open)$'
      # Ignore text between inline back ticks
      - open: '(?P<open>`+)'
        close: '(?P=open)'
      - open: 'http://'
        close: '[: \n]'
      - open: 'https://'
        close: '[: \n]'
      - open: 'pragma:'
        close: '$'
      - open: '__all__ = \['
        close: ']'
st-- marked this conversation as resolved.
  sources:
  - gpjax/**/[a-z]*.py
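
For local use: assuming `pyspelling` is installed (e.g. `pip install pyspelling`) together with `aspell` and its `en_GB` dictionary, running `pyspelling` from the repository root should pick up `.pyspelling.yml` by default, and `pyspelling -n 'Python Source'` should select just this matrix entry.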
2 changes: 1 addition & 1 deletion gpjax/gaussian_distribution.py
@@ -241,7 +241,7 @@ def _kl_divergence(
# Mahalanobis term, (μp - μq)ᵀ Σp⁻¹ (μp - μq) = tr [(μp - μq)ᵀ [LpLpᵀ]⁻¹ (μp - μq)] = (fr[Lp⁻¹(μp - μq)])²
mahalanobis = _frobeinius_norm_squared(
sqrt_p.solve(diff)
-) # TODO: Need to improve this. Perhaps add a Mahalanobis method to LinearOperators.
+) # TODO: Need to improve this. Perhaps add a Mahalanobis method to ``LinearOperator``s.

# KL[q(x)||p(x)] = [ [(μp - μq)ᵀ Σp⁻¹ (μp - μq)] - n - log|Σq| + log|Σp| + tr[Σp⁻¹ Σq] ] / 2
return (mahalanobis - n_dim - sigma_q.log_det() + sigma_p.log_det() + trace) / 2.0
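
For context on the TODO above, here is a minimal dense-matrix sketch of the Mahalanobis term, assuming plain `jax.numpy` arrays rather than GPJax's `LinearOperator` abstraction (names are illustrative):

```python
import jax.numpy as jnp
from jax.scipy.linalg import solve_triangular

def mahalanobis_squared(mu_q, mu_p, sigma_p):
    """(μp - μq)ᵀ Σp⁻¹ (μp - μq), computed as ||Lp⁻¹ (μp - μq)||²."""
    Lp = jnp.linalg.cholesky(sigma_p)                  # Σp = Lp Lpᵀ, Lp lower triangular
    z = solve_triangular(Lp, mu_p - mu_q, lower=True)  # z = Lp⁻¹ (μp - μq)
    return jnp.sum(z**2)                               # squared norm of z
```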
10 changes: 5 additions & 5 deletions gpjax/gps.py
@@ -199,7 +199,7 @@ def predict(
) -> Callable[[Float[Array, "N D"]], GaussianDistribution]:
"""Compute the predictive prior distribution for a given set of
parameters. The output of this function is a function that computes
-a distrx distribution for a given set of inputs.
+a Distrax distribution for a given set of inputs.

In the following example, we compute the predictive prior distribution
and then evaluate it on the interval :math:`[0, 1]`:
@@ -444,8 +444,8 @@ def predict(test_inputs: Float[Array, "N D"]) -> GaussianDistribution:
test_inputs (Float[Array, "N D"]): A Jax array of test inputs.

Returns:
-GaussianDistribution: A ``GaussianDistribution``
-object that represents the predictive distribution.
+A ``GaussianDistribution`` object that represents the
+predictive distribution.
"""

# Unpack test inputs
@@ -723,8 +723,8 @@ def marginal_log_likelihood(

Unlike the marginal_log_likelihood function of the ConjugatePosterior
object, the marginal_log_likelihood function of the
-NonConjugatePosterior object does not provide an exact marginal
-log-likelihood function. Instead, the NonConjugatePosterior object
+``NonConjugatePosterior`` object does not provide an exact marginal
+log-likelihood function. Instead, the ``NonConjugatePosterior`` object
represents the posterior distributions as a function of the model's
hyperparameters and the latent function. Markov chain Monte Carlo,
variational inference, or Laplace approximations can then be used to
6 changes: 3 additions & 3 deletions gpjax/kernels.py
@@ -266,7 +266,7 @@ def cross_covariance(


##########################################
-# Abtract classes
+# Abstract classes
##########################################
class AbstractKernel(PyTree):
"""
@@ -1040,8 +1040,8 @@ def __call__(

Args:
params (Dict): Parameter set for which the kernel should be evaluated on.
-x (Float[Array, "1 D"]): Index of the ith vertex.
-y (Float[Array, "1 D"]): Index of the jth vertex.
+x (Float[Array, "1 D"]): Index of the i'th vertex.
+y (Float[Array, "1 D"]): Index of the j'th vertex.

Returns:
Float[Array, "1"]: The value of :math:`k(v_i, v_j)`.
4 changes: 2 additions & 2 deletions gpjax/natural_gradients.py
@@ -124,7 +124,7 @@ def _expectation_elbo(
def _rename_expectation_to_natural(params: Dict) -> Dict:
"""
This function renames the gradient components (that have expectation
-parameterisation keys) to match the natural parameterisation pytree.
+parameterisation keys) to match the natural parameterisation PyTree.

Args:
params (Dict): A dictionary of variational Gaussian parameters
@@ -145,7 +145,7 @@ def _rename_expectation_to_natural(params: Dict) -> Dict:
def _rename_natural_to_expectation(params: Dict) -> Dict:
"""
This function renames the gradient components (that have natural
-parameterisation keys) to match the expectation parameterisation pytree.
+parameterisation keys) to match the expectation parameterisation PyTree.

Args:
params (Dict): A dictionary of variational Gaussian parameters
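
A toy sketch of the renaming these two functions perform, with illustrative key names and a hypothetical helper (the actual GPJax implementation differs):

```python
from typing import Dict

def _rename_keys(params: Dict, mapping: Dict[str, str]) -> Dict:
    """Recursively rename dictionary keys according to `mapping`."""
    return {
        mapping.get(key, key): _rename_keys(value, mapping) if isinstance(value, dict) else value
        for key, value in params.items()
    }

# e.g. relabelling natural-parameterisation gradients as expectation ones
# (key names here are assumptions for illustration):
# _rename_keys(grads, {"natural_vector": "expectation_vector",
#                      "natural_matrix": "expectation_matrix"})
```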
2 changes: 1 addition & 1 deletion gpjax/parameters.py
@@ -192,7 +192,7 @@ def recursive_bijectors(ps, bs) -> Tuple[Dict, Dict]:
else:
bijector = Identity
warnings.warn(
f"Parameter {key} has no transform. Defaulting to identity transfom."
f"Parameter {key} has no transform. Defaulting to identity transform."
)
bs[key] = bijector
return bs
2 changes: 1 addition & 1 deletion gpjax/variational_families.py
@@ -575,7 +575,7 @@ class ExpectationVariationalGaussian(AbstractVariationalGaussian):
The variational family is q(f(·)) = ∫ p(f(·)|u) q(u) du, where u = f(z) are the function values at the inducing inputs z
and the distribution over the inducing inputs is q(u) = N(μ, S). Expressing the variational distribution, in the form of the
exponential family, q(u) = exp(θᵀ T(u) - a(θ)), gives rise to the natural parameterisation θ = (θ₁, θ₂) = (S⁻¹μ, -S⁻¹/2) and
-sufficient stastics T(u) = [u, uuᵀ]. The expectation parameters are given by η = ∫ T(u) q(u) du. This gives a parameterisation,
+sufficient statistics T(u) = [u, uuᵀ]. The expectation parameters are given by η = ∫ T(u) q(u) du. This gives a parameterisation,
η = (η₁, η₂) = (μ, S + μμᵀ) to perform model inference over.
"""

2 changes: 1 addition & 1 deletion gpjax/variational_inference.py
@@ -40,7 +40,7 @@


class AbstractVariationalInference(PyTree):
"""A base class for inference and training of variational families against an extact posterior"""
"""A base class for inference and training of variational families against an exact posterior"""

def __init__(
self,
174 changes: 174 additions & 0 deletions wordlist-custom.txt
@@ -0,0 +1,174 @@
TODO

GPJax
GPFlow
GPFlow's
GPG
JAX
Jax
JaxKern
JaxUtils
NumPyro
Distrax
PyTree

jax
optax
tqdm
args
kwargs
config
concat
jaxutils
init
natgrads
gauss
hermite
inv

ARD
bijection
bijectors
variational
jitter
probit
dataset
datapoints
hyperparameter
hyperparameters
param
params
parameterisation
parameterisations
trainables
trainability
unconstrain
unexpanded
unparsable
untagged
branchless
elementwise
pointwise
precompute
dimensionality
lengthscale
stateful
reimplement
subclasses
optimisers
subsample
getter

i'th
j'th

bool
boolean
iters
num
optim
loc
softplus

sd
nd

str

elbo
pdf
diag
cov
cholesky
sqrt

Kxx
Kxz
Kzx
KxzKzz
KxzLz
Ktt
Ktx
Ktz
Kxt
Kzt
Kzz
Lz
Lp
LpLp
Lq
LqLp
LqLq
Lx
wx
mz
uu

df
du
dx

MxD
NxD
NxM
NxN


Diggle
Frobenius
Hensman
MacKay
Ribeiro
Salimbeni
Titsias
et
al

Mahalanobis
Matérn
Schur
Geostatistics

Matern
RBF
AbstractKernel
AbstractKernelComputation
AbstractLikelihood
AbstractMeanFunction
AbstractPosterior
AbstractPrior
AbstractVariationalFamily
AbstractVariationalGaussian
AbstractVariationalInference
CollapsedVariationalGaussian
CollapsedVI
CombinationKernel
ConjugatePosterior
ConstantDiagonalKernelComputation
CovarianceOperator
CPython
DenseKernelComputation
DiagonalKernelComputation
ExpectationVariationalGaussian
GaussianDistribution
GradientTransformation
GraphKernel
InferenceState
KeyArray
LinearOperator
MeanFunction
MultivariateNormalTri
NaturalVariationalGaussian
NonConjugate
NonConjugatePosterior
Optax
ParameterState
PoweredExponential
PRNG
PRNGKey
ProductKernel
RationalQuadratic
StochasticVI
SumKernel
VariationalGaussian
WhitenedVariationalGaussian