
hyperparameters optimization #2

Closed

Conversation

esavary

@esavary esavary commented May 23, 2024

  • Integrate a custom kernel based on a spherical covariance function.
  • Implement stochastic optimization with early stopping for hyperparameter tuning.

This is a draft to plug the hyperparameter optimization into the GaussianProcessModel (for now only with maximum likelihood estimation). I think some of the functions might be redundant with your code, so feel free to remove them from mine. I reshaped the data array for the hyperparameter optimization, but depending on the shape of the data we feed into the fit method of the Gaussian processes, this could cause problems. Do you know what the shape of y_train will be?
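
For reference, here is a minimal sketch of what a stochastic hyperparameter search with early stopping on the log marginal likelihood could look like; the kernel, data, and search parameters are placeholders, not the actual PR code:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Placeholder data; in the PR this would be the gradient directions and DWI signal.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(99, 3)), rng.random(99)

# Fit once with optimizer=None so we can drive the hyperparameter search ourselves.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), optimizer=None)
gpr.fit(X, y)

best_theta = gpr.kernel_.theta                       # log-transformed hyperparameters
best_lml = gpr.log_marginal_likelihood(best_theta)   # maximum likelihood objective

patience, stall = 20, 0
for _ in range(500):
    # Stochastic step: random perturbation of the (log) hyperparameters.
    candidate = best_theta + rng.normal(scale=0.1, size=best_theta.shape)
    lml = gpr.log_marginal_likelihood(candidate)
    if lml > best_lml:
        best_theta, best_lml, stall = candidate, lml, 0
    else:
        stall += 1
    if stall >= patience:   # early stopping: no improvement for `patience` steps
        break
```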

@esavary changed the title from "Gaussian processes" to "hyperparameters optimization" on May 23, 2024
@esavary esavary marked this pull request as draft May 24, 2024 09:33
@esavary
Author

esavary commented May 24, 2024

I'm not entirely sure anymore if my understanding of theta is correct, so I've converted the PR to a draft to work on it further. Don't hesitate to share any input you have on this.

Implement Gaussian Process.
@jhlegarreta
Owner

So I force-pushed to the branch after having rebased, and thus there are some conflicts to be solved. Sorry for this.

First, and just in case, make a copy of your work in a separate folder. Then, assuming upstream is the nipreps/eddymotion repository and origin is yours (i.e. esavary/eddymotion):

git checkout main                             # switch to your local main branch
git fetch upstream                            # get the latest changes from nipreps/eddymotion
git merge upstream/main                       # bring your main up to date
git push origin main                          # update your fork
git checkout gaussian-process                 # switch back to the feature branch
git rebase upstream/ImplementGaussianProcess  # replay your commits on top of the rebased branch

Then solve the conflicts. If you run into trouble, we could do a Zoom call so that I can help you sort this out.

Other comments:

  • Very nice subclassing of sklearn.gaussian_process.kernels.Kernel: the GP requires a Kernel instance, so 👌 . Looks good, but we'll see as we make progress whether further changes are required.
  • It's probably a good idea to add your tests so that we can see how the classes (e.g. SphericalCovarianceKernel) are called and whether they give the expected result on some simple data.

Do you know what the shape of y_train will be?

X needs to be (n_samples, n_features) and y (n_samples,). If I did not get things totally wrong, in our case X are the bvecs, so shape (:, 3), and y are the DWI data values, so shape (:,). The estimation in each voxel is independent as far as I understand. So in the simplest case, assuming a single shell and, e.g., 100 total gradient directions, if we want to predict one of them, we would leave it out and include the rest for training, i.e. X (99, 3) and y (99,), and we would give X (1, 3) to the prediction method. Doing this for every voxel, we would get one prediction per voxel per gradient direction, which would eventually give us our 4D volume.

See https://github.com/nipreps/eddymotion/pull/188/files#diff-b7713b8731a4d1f895063e58f31f8ab54f7bb6ca0e2b1712d85683874d94a6e6R137
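
To make the shapes concrete, here is a minimal sketch of the leave-one-out setup described above, using synthetic data and a stock sklearn kernel as a stand-in for the custom one:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Synthetic single-shell example: 100 unit gradient directions (bvecs) and,
# for a single voxel, one signal value per direction.
rng = np.random.default_rng(0)
bvecs = rng.normal(size=(100, 3))
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
signal = rng.random(100)

# Leave one direction out: train on the remaining 99, predict the held-out one.
test_idx = 0
train = np.ones(100, dtype=bool)
train[test_idx] = False

X_train, y_train = bvecs[train], signal[train]  # shapes (99, 3) and (99,)
X_test = bvecs[[test_idx]]                      # shape (1, 3)

# RBF is only a placeholder for the spherical covariance kernel.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
gpr.fit(X_train, y_train)
y_pred = gpr.predict(X_test)                    # one predicted value for this voxel/direction
```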

Also, I will split my PR so that the parts that can be finalised on their own (e.g. the shell identification, angle computation, etc.) can be worked on and merged independently.

@esavary
Author

esavary commented May 27, 2024

Thanks a lot for your input! The input dimensions are now clearer to me. It's a good idea to add tests specifically for the kernel; I'll work on adding them to the PR.

@jhlegarreta
Owner

After our discussion, I believe that the kernel part would probably fit better into a separate kernels.py module. That way we would reduce conflicts in utils.py and keep things cleaner.
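
For illustration, a rough sketch of what such a kernels.py module could look like, assuming a spherical covariance of the form C(θ) = σ² (1 − 3θ/2a + θ³/2a³) for θ ≤ a and following the sklearn custom-kernel interface; the parameter names, bounds, and details are assumptions, not the actual implementation:

```python
# kernels.py -- illustrative sketch only; the actual class may differ.
import numpy as np
from sklearn.gaussian_process.kernels import Hyperparameter, Kernel


class SphericalCovarianceKernel(Kernel):
    """sklearn-compatible kernel based on a spherical covariance over angles."""

    def __init__(self, a=1.0, sigma_sq=1.0):
        self.a = a                # angle beyond which the covariance vanishes
        self.sigma_sq = sigma_sq  # signal variance

    @property
    def hyperparameter_a(self):
        return Hyperparameter("a", "numeric", (1e-2, np.pi))

    @property
    def hyperparameter_sigma_sq(self):
        return Hyperparameter("sigma_sq", "numeric", (1e-3, 1e3))

    def __call__(self, X, Y=None, eval_gradient=False):
        Y = X if Y is None else Y
        # Angle between gradient directions (rows assumed to be unit-norm bvecs).
        angles = np.arccos(np.clip(X @ Y.T, -1.0, 1.0))
        K = np.where(
            angles <= self.a,
            self.sigma_sq * (1 - 1.5 * angles / self.a + 0.5 * (angles / self.a) ** 3),
            0.0,
        )
        if eval_gradient:
            # Analytic gradients omitted in this sketch (use optimizer=None or a
            # gradient-free optimizer when fitting).
            return K, np.zeros((*K.shape, self.theta.shape[0]))
        return K

    def diag(self, X):
        return np.full(X.shape[0], self.sigma_sq)

    def is_stationary(self):
        return True
```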

Implement Gaussian Process.
@jhlegarreta
Owner

I force-pushed again (beb0839). I added a notebook that takes the test code snippet and adds a simple visualization of the signal vs. the prediction. In the future the loop should disappear and the notebook become more readable. It might be helpful when we check visually how good the prediction is. Rebasing on the tip of the branch will require fixing the conflicts; apologies again, Elodie.
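
Something along these lines (hypothetical arrays, not the notebook's actual code) can be used to compare the signal and the prediction for one voxel:

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical arrays: measured signal and GP prediction for one voxel,
# one value per gradient direction.
rng = np.random.default_rng(0)
signal = rng.random(100)
prediction = signal + rng.normal(scale=0.05, size=100)

plt.plot(signal, label="signal")
plt.plot(prediction, "--", label="GP prediction")
plt.xlabel("gradient direction index")
plt.ylabel("DWI value")
plt.legend()
plt.show()
```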

Once the utils PRs are merged, there should be fewer conflicts. Refactoring the kernel construction into its own module will also help. If the kernels can be tested without relying on the GP iterative process, maybe they can be put into a separate PR soon and merged.

@oesteban

What is the overlap of this and nipreps#202, nipreps#203 & nipreps#204? Are the contents of this PR all included within those PRs to the upstream repo?

@esavary
Author

esavary commented Jun 13, 2024

What is the overlap of this and nipreps#202, nipreps#203 & nipreps#204? Are the contents of this PR all included within those PRs to the upstream repo?

Everything is included in nipreps#202, nipreps#203 & nipreps#204. I'm closing this pull request.

@esavary esavary closed this Jun 13, 2024