Store remote dataset credentials separately #6646
Conversation
Cool stuff, works really well :) I added a few refactoring comments, but nothing major.
private def parseAnyCredential(r: CredentialsRow): Fox[AnyCredential] =
  r.`type` match {
    case "HTTP_Basic_Auth" =>
This could possibly use the CredentialType enum values (with toString where needed) instead of bare, unnamed string literals here.
[error] case CredentialType.HTTP_Basic_Auth.toString => parseAsHttpBasicAuthCredential(r)
[error] ^
[error] /home/felix/scm/webknossos/app/models/binary/credential/CredentialDAO.scala:68:41: stable identifier required, but CredentialType.S3_Access_Key.toString found.
[error] case CredentialType.S3_Access_Key.toString => parseAsS3AccessKeyCredential(r)
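The error above is expected: Scala pattern matching only allows *stable identifiers* (vals, objects) as match patterns, not method calls like `.toString`. A minimal sketch of the issue and a common workaround using pattern guards (the `CredentialType` definition and case names here are simplified assumptions, not the actual webknossos enum):

```scala
// Hypothetical stand-in for the real CredentialType enum.
object CredentialType extends Enumeration {
  val HTTP_Basic_Auth, S3_Access_Key = Value
}

def parseCredentialKind(typ: String): String =
  typ match {
    // case CredentialType.HTTP_Basic_Auth.toString => ...
    //   does not compile: "stable identifier required"
    // A guard compares against the enum's string form instead:
    case t if t == CredentialType.HTTP_Basic_Auth.toString => "httpBasicAuth"
    case t if t == CredentialType.S3_Access_Key.toString   => "s3AccessKey"
    case other                                             => s"unknown: $other"
  }
```

An alternative is to precompute stable `val`s (e.g. `private val HttpBasicAuthName = CredentialType.HTTP_Basic_Auth.toString`) and match on those with backticks, which keeps the match exhaustive-looking while satisfying the stable-identifier rule.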
...sos-datastore/app/com/scalableminds/webknossos/datastore/storage/FileSystemCredentials.scala
LGTM :) Please have a look at the merge conflicts, including evolution numbering.
Note that refresh-schema will delete the stored credentials for local datasets, so streaming test datasets become unusable afterwards. I’m not sure how to manage this for our local development; it may not be a serious problem. The old format still works, right? If so, we could manually edit the JSON for a local dataset to include the credentials again, keeping it usable for testing across schema refreshes.
URL of deployed dev instance (used for testing):
Steps to test:
TODO
Issues:
(Please delete unneeded items, merge only when none are left open)