Speedup logcdf tests #4734
I am getting a strange issue with some of the logcdf methods, which seems to be behind the failing tests. Setting the values to constants gives the correct result:

```python
with pm.Model() as model:
    value = pm.NegativeBinomial('value', mu=0.9, alpha=0.1)

logcdf = logpt(model['value'], cdf=True)
fn1 = model.fastfn(logcdf)
print(fn1({'value': 1.0}))  # -0.14408081390569186
```

But using shared variables does not:

```python
with pm.Model() as model:
    mu = aesara.shared(np.asarray(0.9))
    alpha = aesara.shared(np.asarray(0.1))
    value = pm.NegativeBinomial('value', mu=mu, alpha=alpha)

logcdf = logpt(model['value'], cdf=True)
fn2 = model.fastfn(logcdf)
mu.set_value(0.9)
alpha.set_value(0.1)
print(fn2({'value': 1.0}))  # -0.08777554398474162
```
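For reference, both printed results are exactly the log of an incomplete-beta evaluation (the two values that show up in the next comment), so the discrepancy is upstream of the logcdf expression itself. A quick check with plain NumPy:

```python
import numpy as np

# Both logcdf results are logs of regularized incomplete-beta values:
print(np.log(0.8658177758494668))  # -0.14408081... (constant-input case)
print(np.log(0.91596645))          # -0.08777554... (shared-variable case)
```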
Found the culprit: it's the incomplete_beta function:

```python
incomplete_beta(0.1, 2.0, 0.1).eval()  # array(0.86581778)

alpha = aesara.shared(0.1, 'alpha')
beta = aesara.shared(2.0, 'beta')
incomplete_beta(alpha, beta, 0.1).eval()  # array(0.91596645)

import scipy.special
scipy.special.betainc(0.1, 2.0, 0.1)  # 0.8658177758494668
```

All failing tests (except for the HyperGeometric, which is a different issue) rely on incomplete_beta. Maybe it's time I finish #4519
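The constant-input result can be reproduced entirely in SciPy. Assuming PyMC's mu/alpha parametrization maps to scipy's `nbinom` with `n=alpha` and `p=alpha/(alpha+mu)` (an assumption for this sketch), the logcdf is the log of this same betainc call:

```python
import numpy as np
from scipy import special, stats

mu, alpha, value = 0.9, 0.1, 1.0
p = alpha / (alpha + mu)  # success probability in scipy's parametrization

# The negative binomial CDF is the regularized incomplete beta function
logcdf_beta = np.log(special.betainc(alpha, np.floor(value) + 1.0, p))
logcdf_ref = stats.nbinom.logcdf(value, alpha, p)

print(logcdf_beta)  # -0.14408081... (matches the constant-input result above)
print(logcdf_ref)
```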
Also found some issues with the recent initval changes. This snippet often fails:

```python
with pm.Model() as model:
    mu = pm.Normal('mu', initval=100)
    alpha = pm.HalfNormal('alpha', initval=100, transform=None)
    value = pm.NegativeBinomial('value', mu=mu, alpha=alpha)

model.initial_values
# {mu: array(100., dtype=float32),
#  alpha: array(100., dtype=float32),
#  value: array(1)}
```

When it doesn't fail, the initval is still far from the expected ~100:

```python
pm.NegativeBinomial.dist(mu=100, alpha=100).eval()
# array(98)
```
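As a sanity check on the expected scale, scipy's equivalent parametrization (again assuming `n=alpha`, `p=alpha/(alpha+mu)`) confirms the distribution is centered around 100:

```python
from scipy import stats

mu, alpha = 100.0, 100.0
p = alpha / (alpha + mu)  # 0.5
dist = stats.nbinom(alpha, p)
print(dist.mean())  # 100.0 -- so initial values near 100, not 1, are expected
```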
Closing in favor of #4736
The logcdf tests were running very slowly because the logcdf function was constantly being rebuilt.
Also fixed a couple of failing tests on float32, which were difficult to identify before because setting `n_samples=-1` took ages to run.
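The speedup boils down to a standard pattern: compile once, evaluate at many parameter points. A minimal sketch of that pattern (the geometric logcdf here is a hypothetical stand-in, not the PR's actual code):

```python
import math
from functools import lru_cache

builds = {"count": 0}

@lru_cache(maxsize=None)
def build_logcdf(dist_name):
    """Stand-in for the expensive step of compiling a logcdf graph."""
    builds["count"] += 1
    # Hypothetical example: geometric logcdf, parameters passed at call time,
    # so the same compiled function covers every parameter combination.
    return lambda value, p: math.log1p(-((1.0 - p) ** (math.floor(value) + 1)))

fn = build_logcdf("geometric")
for p in (0.1, 0.5, 0.9):  # many parameter points...
    fn(3, p)
print(builds["count"])  # ...but only 1 build
```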