Function to optimize prior under constraints #5231
Follow up PR (not to be sadistic with this one): we should add a code example in the docstrings
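For reference, the kind of docstring example discussed could look roughly like the sketch below. It deliberately uses plain scipy rather than this PR's actual API (whose exact signature is not assumed here): it searches for Gamma parameters that place about 95% of the probability mass between two bounds, which is the sort of constrained-prior optimization this PR implements. All function names and starting values are illustrative.

```python
# Hypothetical sketch of "optimize a prior under constraints" using plain
# scipy, NOT the PR's API: find Gamma(alpha, beta) parameters putting
# ~95% of the mass in [0.1, 0.4].
from scipy import optimize, stats

def mass_in_interval(params, lower=0.1, upper=0.4):
    alpha, beta = params
    if alpha <= 0 or beta <= 0:
        return 0.0  # invalid parameters carry no mass; steers the optimizer away
    dist = stats.gamma(a=alpha, scale=1.0 / beta)
    return dist.cdf(upper) - dist.cdf(lower)

def loss(params, target_mass=0.95):
    # squared deviation from the requested probability mass
    return (mass_in_interval(params) - target_mass) ** 2

opt = optimize.minimize(loss, x0=[1.0, 10.0], method="Nelder-Mead")
alpha_opt, beta_opt = opt.x
```

The problem is underdetermined (two parameters, one constraint), so the optimizer settles on one of many valid parameter pairs; the real function presumably disambiguates via the user-supplied initial guess.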
I'm not preaching for my choir here, but I actually should add that here. Don't merge in the meantime. Will ping when done
Done @ricardoV94 -- we can merge once tests pass
I think the exception we want here is the one found in `aesara.gradient.NullTypeGradError`, as well as `NotImplementedError`.
The exception that's thrown is a `TypeError` by the Aesara `grad` method (line 501) in `aesara/gradient.py`: I added that error in the except clause and tests pass locally. `aesara.gradient.NullTypeGradError` and `NotImplementedError` don't seem to be raised, but I kept them in case they come up in other situations we may have forgotten.
We shouldn't catch that TypeError. That means we produced a wrong input to `aesara.grad`.
The other two exceptions are the ones that (should) appear when a grad is not implemented for an Op.
That means we should make these two exceptions appear then, shouldn't we? Because they are not raised right now -- only the TypeError is raised
(here is to my first GH comment of the year 🥂 )
Those two exceptions mean there is no grad implemented for some Op in the cdf, which can very well happen and is a good reason to silently default to the scipy approximation. The TypeError, on the other hand, should not be caught, as that means we did something wrong.
In fact there was a point during this PR when it was always silently defaulting to the scipy approximation because we were passing two values to grad and suppressing the TypeError.
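The pattern being discussed can be sketched without any Aesara dependency. In this self-contained illustration, `NullTypeGradError` is a stand-in for `aesara.gradient.NullTypeGradError` and `grad_fn` stands in for `aesara.grad`; the names are illustrative, not the PR's actual code. The point is the shape of the except clause: catch only the "no gradient implemented" exceptions and let a plain `TypeError` propagate, since that signals malformed inputs rather than a missing Op gradient.

```python
# Self-contained sketch of the exception-handling pattern discussed above.
class NullTypeGradError(Exception):
    """Stand-in for aesara.gradient.NullTypeGradError: no gradient defined."""

def symbolic_grad_or_fallback(grad_fn, expr, wrt, numeric_fallback):
    """Prefer the symbolic gradient; fall back to a numerical approximation
    only when the gradient is genuinely not implemented for some Op.

    A plain TypeError is deliberately NOT caught: it would mean we passed
    malformed inputs to grad, which is a bug we want surfaced.
    """
    try:
        return grad_fn(expr, wrt)
    except (NullTypeGradError, NotImplementedError):
        return numeric_fallback(expr, wrt)
```

Suppressing `TypeError` here would reproduce the earlier bug in this PR, where malformed calls to grad silently forced the scipy path every time.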
That seems to be the case. Locally it is passing in float64/float32 and 1e-5 precision, so we don't need to have the separate test just for the Poisson anymore.
Ha ha damn, I just pushed without that change. Tests are indeed passing locally. Gonna refactor the tests and push again
Hot damn, tests are passing locally 🔥 Pushed!
Why does the symbolic gradient help so much with numerical errors?
Because the logcdf uses `gammaincc`, whose gradient is notoriously tricky. We somewhat recently added a numerically stable(r) implementation to Aesara
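For context on why `gammaincc` shows up here at all: the Poisson CDF is exactly the regularized upper incomplete gamma function, P(X ≤ k) = Q(k + 1, μ), which is what scipy's `gammaincc` computes — so the logcdf's gradient inherits its numerical behavior. A quick scipy check of that identity:

```python
# The Poisson CDF equals the regularized upper incomplete gamma function:
# P(X <= k) = Q(k + 1, mu), i.e. gammaincc(k + 1, mu) in scipy.
from scipy import stats
from scipy.special import gammaincc

k, mu = 3, 2.5
cdf_direct = stats.poisson(mu).cdf(k)
cdf_via_gamma = gammaincc(k + 1, mu)
```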
aesara-devs/aesara#513