
Parallel evaluation (Cross-Session, Within-Session) #364

Merged: 22 commits into NeuroTechX:develop from parallel-cross-session on May 31, 2023

Conversation

bruAristimunha (Collaborator):

Still 100% undecided; this is a draft. Does it change anything in the function to make it parallel? I don't know.

bruAristimunha (Collaborator, Author):

I also changed the WithinSession evaluation...

bruAristimunha (Collaborator, Author):

I changed my mind; I want to merge this pull request. It is super helpful for increasing speed, but maybe we need to change the parameter names. Could you take a look, @sylvchev and @carraraig?
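For context, here is a minimal sketch of how the new parallel evaluation could be called, assuming the new keyword keeps the n_jobs_evaluation name discussed below. The dataset, paradigm and pipeline names (BNCI2014001, LeftRightImagery, CSP+LDA) are standard MOABB/MNE/scikit-learn pieces used purely for illustration, and the inner/outer split between n_jobs and n_jobs_evaluation reflects my reading of this PR rather than the final API:

```python
# Sketch only: assumes the new keyword is exposed as ``n_jobs_evaluation``
# (the name discussed in this PR); dataset, paradigm and pipeline are illustrative.
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA
from sklearn.pipeline import make_pipeline

from moabb.datasets import BNCI2014001
from moabb.evaluations import WithinSessionEvaluation
from moabb.paradigms import LeftRightImagery

paradigm = LeftRightImagery()
dataset = BNCI2014001()
pipelines = {"CSP+LDA": make_pipeline(CSP(n_components=8), LDA())}

evaluation = WithinSessionEvaluation(
    paradigm=paradigm,
    datasets=[dataset],
    overwrite=True,        # re-run instead of loading cached results
    n_jobs=1,              # pre-existing parameter: inner scikit-learn parallelism
    n_jobs_evaluation=4,   # added by this PR: parallelism over the evaluation loop
)
results = evaluation.process(pipelines)  # pandas DataFrame of per-session scores
print(results[["dataset", "subject", "session", "pipeline", "score"]].head())
```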

bruAristimunha marked this pull request as ready for review on April 28, 2023 at 15:20
bruAristimunha changed the title from "Parallel cross session" to "Parallel evalution (Cross-Session, Within-Session and Cross-Subject)" on May 16, 2023
bruAristimunha changed the title from "Parallel evalution (Cross-Session, Within-Session and Cross-Subject)" to "Parallel evalution (Cross-Session, Within-Session)" on May 16, 2023
bruAristimunha (Collaborator, Author):

Hi @sylvchev,

Can you please review this code?

bruAristimunha changed the title from "Parallel evalution (Cross-Session, Within-Session)" to "Parallel evaluation (Cross-Session, Within-Session)" on May 16, 2023
sylvchev (Member) left a comment:


This seems correct, but the diff is difficult to read due to the indentation change.
Did you test this parallel version to ensure that the results are the same when n_jobs and n_jobs_evaluation are > 1?
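One way to answer this question (not taken from the PR itself) is to run the same evaluation sequentially and in parallel and compare the result tables. The sketch below reuses the paradigm/dataset/pipelines objects from the earlier snippet and assumes process() returns a pandas DataFrame with MOABB's usual dataset/subject/session/pipeline/score columns:

```python
# Hypothetical check, not part of this PR: compare sequential vs. parallel runs.
def run(n_jobs_evaluation):
    evaluation = WithinSessionEvaluation(
        paradigm=paradigm,
        datasets=[dataset],
        overwrite=True,                      # force a fresh run, ignore cached results
        n_jobs_evaluation=n_jobs_evaluation,
    )
    return evaluation.process(pipelines)

key = ["dataset", "subject", "session", "pipeline"]
sequential = run(1).sort_values(key).reset_index(drop=True)
parallel = run(4).sort_values(key).reset_index(drop=True)

# Parallel execution may reorder rows, so compare after sorting on a stable key.
assert (sequential["score"] - parallel["score"]).abs().max() < 1e-10
```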

bruAristimunha (Collaborator, Author):

Yes @sylvchev. Tested and working, same results!

In the extreme case, with more than 100 jobs in parallel, we end up losing some jobs. It might be worth adding a warning about possible stability issues, but joblib already does that.
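For readers unfamiliar with the mechanism: the heavy lifting is delegated to joblib, roughly along the lines of the simplified pattern below. This is not the PR's actual code; evaluate_subject and run_evaluation are hypothetical helpers standing in for the real per-subject fit/score logic, and subject_list is assumed to be the dataset's list of subjects:

```python
# Hypothetical, simplified version of the parallel pattern -- not the PR's actual code.
from joblib import Parallel, delayed


def evaluate_subject(dataset, subject, pipelines):
    # The real helper would fit and score every pipeline for one subject and
    # return a list of result records; stubbed here to keep the sketch short.
    return [{"subject": subject, "pipeline": name, "score": None} for name in pipelines]


def run_evaluation(dataset, pipelines, n_jobs_evaluation=4):
    # joblib manages the worker pool and already emits its own warnings when
    # jobs are lost or workers crash, so no extra stability warning is needed here.
    per_subject = Parallel(n_jobs=n_jobs_evaluation)(
        delayed(evaluate_subject)(dataset, subject, pipelines)
        for subject in dataset.subject_list
    )
    return [record for records in per_subject for record in records]
```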

By the way, do you think I should also convert the cross-subject evaluation?

sylvchev (Member):

> By the way, do you think I should also convert the cross-subject evaluation?

Cross-subject evaluations are very memory aggressive, and parallelization could require a huge amount of memory, but it would be nice to have a congruent API.

I'll give a pragmatic answer: add n_jobs_evaluation to the cross-subject evaluation if you can; otherwise, we can do it later.

bruAristimunha (Collaborator, Author):

I tried; it's better to leave it for later.

sylvchev (Member) left a comment:


It is a go! Let's open an issue as a reminder to add parallel evaluation and n_jobs_evaluation to CrossSubjectEvaluation.

sylvchev merged commit 79f342b into NeuroTechX:develop on May 31, 2023
bruAristimunha deleted the parallel-cross-session branch on June 8, 2024