Update workflow file to install datashuttle. #394
Merged
This PR changes the GitHub workflow to install datashuttle as recommended in the docs. #393 shows that this install path was not sufficiently tested before.
Currently it's not ideal: the full datashuttle install is performed from conda, then the datashuttle package itself is uninstalled so that the local version (which contains the tests) can be installed in its place. It would be nicer in many respects to have this as a separate job. The reason I did it this way is that I could find no (simple) way to share the strategy matrix across jobs. A separate job would also require all the conda activation setup, so it would basically mean copy-pasting the whole job to change one line. However, maybe the isolation between jobs would be worth it...
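A minimal sketch of the install-then-swap sequence described above (the step names, the conda-forge channel, and the editable-install flag are assumptions for illustration, not the actual workflow contents):

```yaml
# Hypothetical workflow excerpt; names and flags are illustrative only.
jobs:
  test:
    steps:
      # Install datashuttle and all its dependencies the way the docs
      # recommend (assumed here to be via conda-forge).
      - name: Install datashuttle from conda
        run: conda install -y -c conda-forge datashuttle
      # Remove only the datashuttle package itself, keeping the
      # conda-installed dependencies, then install the local checkout
      # so the tests run against the code in this PR.
      - name: Swap in the local version
        run: |
          pip uninstall -y datashuttle
          pip install -e .
      - name: Run tests
        run: pytest
```

The trade-off discussed above is that folding these steps into the existing job avoids duplicating the strategy matrix and conda activation setup, at the cost of a slightly convoluted install sequence.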
This change does leave the dependency install from `pip` untested. That could also be covered by a different job, but I'm not sure it is worth it, given the duplication issue mentioned above and the need to spin up many more runners. If we test the recommended install method, that should be enough?