Improve error messages for starting jobs on other orgs' datasets #8181

Merged · 7 commits · Nov 18, 2024
Changes from 5 commits
1 change: 1 addition & 0 deletions CHANGELOG.unreleased.md
@@ -27,6 +27,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Migrated nightly screenshot tests from CircleCI to GitHub actions. [#8134](https://github.com/scalableminds/webknossos/pull/8134)
- Migrated nightly screenshot tests for wk.org from CircleCI to GitHub actions. [#8135](https://github.com/scalableminds/webknossos/pull/8135)
- Thumbnails for datasets now use the selected mapping from the view configuration if available. [#8157](https://github.com/scalableminds/webknossos/pull/8157)
- Improved error messages for starting jobs on datasets from other organizations. [#8181](https://github.com/scalableminds/webknossos/pull/8181)

### Fixed
- Fixed a bug during dataset upload in case the configured `datastore.baseFolder` is an absolute path. [#8098](https://github.com/scalableminds/webknossos/pull/8098) [#8103](https://github.com/scalableminds/webknossos/pull/8103)
13 changes: 11 additions & 2 deletions app/controllers/AiModelController.scala
@@ -1,5 +1,6 @@
package controllers

import com.scalableminds.util.accesscontext.GlobalAccessContext
import com.scalableminds.util.geometry.{BoundingBox, Vec3Int}
import com.scalableminds.util.tools.{Fox, FoxImplicits}
import models.aimodels.{AiInference, AiInferenceDAO, AiInferenceService, AiModel, AiModelDAO, AiModelService}
@@ -18,6 +19,7 @@ import scala.concurrent.ExecutionContext
import com.scalableminds.util.time.Instant
import models.aimodels.AiModelCategory.AiModelCategory
import models.organization.OrganizationDAO
import play.api.i18n.Messages

case class TrainingAnnotationSpecification(annotationId: ObjectId,
colorLayerName: String,
Expand All @@ -41,6 +43,7 @@ object RunTrainingParameters {
case class RunInferenceParameters(annotationId: Option[ObjectId],
aiModelId: ObjectId,
datasetName: String,
organizationId: String,
💡 Codebase verification

Issue: organizationId parameter not passed in runInferenceJob calls

  • organizationId is not included in any runInferenceJob function calls in the frontend code, potentially leading to missing organization validation.
🔗 Analysis chain

LGTM: Added organizationId parameter for explicit organization validation

The addition of the organizationId parameter enables proper organization validation before running inference jobs. This aligns with the PR objective of improving error handling for cross-organization access attempts.

Let's verify that the frontend is updated to pass this new parameter:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Search for RunInferenceParameters usage in frontend code
rg -t ts -t js "RunInferenceParameters|runInference.*organizationId"
```

Length of output: 246


Script:

```shell
#!/bin/bash
# Search for runInferenceJob calls with organizationId parameter
rg -t ts -t js 'runInferenceJob\s*\([^)]*organizationId'
```

Length of output: 58


Script:

```shell
#!/bin/bash
# List all runInferenceJob calls with their arguments for manual inspection
rg -t ts -t js 'runInferenceJob\s*\(([^)]*)\)' -A 2
```

Length of output: 336

colorLayerName: String,
boundingBox: String,
newDatasetName: String,
@@ -135,6 +138,7 @@ class AiModelController @Inject()(
firstAnnotationId <- trainingAnnotations.headOption.map(_.annotationId).toFox
annotation <- annotationDAO.findOne(firstAnnotationId)
dataset <- datasetDAO.findOne(annotation._dataset)
_ <- bool2Fox(request.identity._organization == dataset._organization) ?~> "job.trainModel.notAllowed.organization" ~> FORBIDDEN
dataStore <- dataStoreDAO.findOneByName(dataset._dataStore) ?~> "dataStore.notFound"
_ <- Fox
.serialCombined(request.body.trainingAnnotations.map(_.annotationId))(annotationDAO.findOne) ?~> "annotation.notFound"
@@ -172,8 +176,13 @@ class AiModelController @Inject()(
sil.SecuredAction.async(validateJson[RunInferenceParameters]) { implicit request =>
for {
_ <- userService.assertIsSuperUser(request.identity)
organization <- organizationDAO.findOne(request.identity._organization)
dataset <- datasetDAO.findOneByNameAndOrganization(request.body.datasetName, organization._id)
organization <- organizationDAO.findOne(request.body.organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
request.body.organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.runInference.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(request.body.datasetName, organization._id) ?~> Messages(
"dataset.notFound",
request.body.datasetName)
Comment on lines +183 to +185
⚠️ Potential issue

Add explicit organization validation for inference jobs

While the code validates that the dataset belongs to the organization during lookup, it should include an explicit organization check, similar to runTraining, for consistency. This aligns with the PR objective of standardizing organization access checks.

```diff
         dataset <- datasetDAO.findOneByNameAndOrganization(request.body.datasetName, organization._id) ?~>
           Messages("dataset.notFound", request.body.datasetName)
+        _ <- bool2Fox(request.identity._organization == dataset._organization) ?~> "job.inferWithModel.notAllowed.organization" ~> FORBIDDEN
         dataStore <- dataStoreDAO.findOneByName(dataset._dataStore) ?~> "dataStore.notFound"
```
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
```diff
         dataset <- datasetDAO.findOneByNameAndOrganization(request.body.datasetName, organization._id) ?~> Messages(
           "dataset.notFound",
           request.body.datasetName)
+        _ <- bool2Fox(request.identity._organization == dataset._organization) ?~> "job.inferWithModel.notAllowed.organization" ~> FORBIDDEN
         dataStore <- dataStoreDAO.findOneByName(dataset._dataStore) ?~> "dataStore.notFound"
```
_ <- aiModelDAO.findOne(request.body.aiModelId) ?~> "aiModel.notFound"
_ <- datasetService.assertValidDatasetName(request.body.newDatasetName)
24 changes: 17 additions & 7 deletions app/controllers/JobController.scala
@@ -124,7 +124,7 @@ class JobController @Inject()(
voxelSizeFactor <- Vec3Double.fromUriLiteral(scale).toFox
voxelSizeUnit <- Fox.runOptional(unit)(u => LengthUnit.fromString(u).toFox)
voxelSize = VoxelSize.fromFactorAndUnitWithDefault(voxelSizeFactor, voxelSizeUnit)
_ <- bool2Fox(request.identity._organization == organization._id) ~> FORBIDDEN
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.convertToWkw.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
@@ -230,7 +230,9 @@ class JobController @Inject()(
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
organization <- organizationDAO.findOne(organizationId) ?~> Messages("organization.notFound", organizationId)
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.inferNeurons.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
@@ -261,7 +263,9 @@ class JobController @Inject()(
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
organization <- organizationDAO.findOne(organizationId) ?~> Messages("organization.notFound", organizationId)
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.inferMitochondria.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
@@ -293,7 +297,9 @@ class JobController @Inject()(
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
organization <- organizationDAO.findOne(organizationId) ?~> Messages("organization.notFound", organizationId)
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.alignSections.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
@@ -412,7 +418,9 @@ class JobController @Inject()(
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
organization <- organizationDAO.findOne(organizationId) ?~> Messages("organization.notFound", organizationId)
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.findLargestSegmentId.notAllowed.organization" ~> FORBIDDEN
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
@@ -434,9 +442,11 @@ class JobController @Inject()(
sil.SecuredAction.async(validateJson[AnimationJobOptions]) { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
organization <- organizationDAO.findOne(organizationId) ?~> Messages("organization.notFound", organizationId)
userOrganization <- organizationDAO.findOne(request.identity._organization)
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId)
_ <- bool2Fox(request.identity._organization == organization._id) ?~> "job.renderAnimation.notAllowed.organization" ~> FORBIDDEN
userOrganization <- organizationDAO.findOne(request.identity._organization)
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
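Across these JobController endpoints the diff introduces the same three-step pattern: resolve the organization via GlobalAccessContext, verify that the requester belongs to it (failing with a job-specific message key and FORBIDDEN), and only then look up the dataset. A possible follow-up, not part of this PR, would be to centralize the first two steps in a controller-private helper. The sketch below is illustrative only: the helper name and signature are invented, organization ids are assumed to be Strings (matching the organizationId fields above), and the Fox combinators (?~> attaches an error message, ~> attaches an HTTP status) plus the injected organizationDAO are assumed to behave as in the surrounding diff.

```scala
// Hypothetical helper inside JobController (illustration only, not part of this PR).
// Resolves the target organization and rejects the request early when the caller
// belongs to a different organization, reusing the message keys added in conf/messages.
private def findOrganizationForJobRequest(organizationId: String,
                                          requesterOrganizationId: String,
                                          notAllowedMessageKey: String)(
    implicit ec: ExecutionContext,
    mp: MessagesProvider): Fox[Organization] =
  for {
    organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
      "organization.notFound",
      organizationId)
    // Cross-organization requests get the job-specific "notAllowed.organization" error.
    _ <- bool2Fox(requesterOrganizationId == organization._id) ?~> notAllowedMessageKey ~> FORBIDDEN
  } yield organization
```

Each action would then start with, for example, findOrganizationForJobRequest(organizationId, request.identity._organization, "job.inferNeurons.notAllowed.organization") and keep its dataset lookup unchanged.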
16 changes: 9 additions & 7 deletions conf/messages
@@ -138,7 +138,7 @@ dataLayer.wrongMag=DataLayer “{0}” does not have mag “{1}”
dataLayer.invalidMag=Supplied “{0}” is not a valid mag format. Please use “x-y-z”

zarr.invalidChunkCoordinates=Invalid chunk coordinates. Expected dot separated coordinates like c.<additional_axes.>x.y.z
zarr.invalidFirstChunkCoord="First Channel must be 0"
zarr.invalidFirstChunkCoord=First Channel must be 0
zarr.chunkNotFound=Could not find the requested chunk
zarr.notEnoughCoordinates=Invalid number of chunk coordinates. Expected to get at least 3 dimensions and channel 0.
zarr.invalidAdditionalCoordinates=Invalid additional coordinates for this data layer.
@@ -150,8 +150,8 @@ nml.file.noFile=No file or file with empty content uploaded
nml.file.differentDatasets=Cannot upload tracings that belong to different datasets at once
nml.parameters.notFound=No parameters section found
nml.element.invalid=Invalid {0}
nml.comment.node.invalid=Invalid node id of comment {0}"
nml.node.id.invalid=Invalid {0} node id {1}"
nml.comment.node.invalid=Invalid node id of comment {0}
nml.node.id.invalid=Invalid {0} node id {1}
nml.node.attribute.invalid=Invalid node {0} for node id {1}
nml.edge.invalid=Invalid edge {0} {1}
nml.tree.id.invalid=Invalid tree id {0}
@@ -321,6 +321,7 @@ job.invalidBoundingBox = The selected bounding box could not be parsed, must be
job.invalidMag = The selected mag could not be parsed, must be x-y-z
job.volumeExceeded = The volume of the selected bounding box is too large.
job.edgeLengthExceeded = An edge length of the selected bounding box is too large.
job.convertToWkw.notAllowed.organization = Converting to WKW is only allowed for datasets of your own organization.
job.inferNuclei.notAllowed.organization = Currently nuclei inferral is only allowed for datasets of your own organization.
job.inferNeurons.notAllowed.organization = Currently neuron inferral is only allowed for datasets of your own organization.
job.meshFile.notAllowed.organization = Calculating mesh files is only allowed for datasets of your own organization.
@@ -329,10 +330,11 @@ job.globalizeFloodfill.notAllowed.organization = Globalizing floodfills is only
job.materializeVolumeAnnotation.notAllowed.organization = Materializing volume annotations is only allowed for datasets of your own organization.
job.noWorkerForDatastoreAndJob = No webknossos-worker supporting the requested job is available for the selected datastore.
job.emailNotifactionsDisabled = Email notifications are not enabled for this job type.
job.renderAnimation.notAllowed.organization = "Rendering animations is only allowed for datasets of your own organization."
job.alignSections.notAllowed.organization = "Aligning sections is only allowed for datasets of your own organization."
job.alignSections.notAllowed.onlySuperUsers = "For now, aligning sections is only allowed for super users."
job.additionalCoordinates.invalid = "The passed additional coordinates are invalid."
job.renderAnimation.notAllowed.organization = Rendering animations is only allowed for datasets of your own organization.
job.alignSections.notAllowed.organization = Aligning sections is only allowed for datasets of your own organization.
job.additionalCoordinates.invalid = The passed additional coordinates are invalid.
job.trainModel.notAllowed.organization = Training AI models is only allowed for datasets of your own organization.
job.runInference.notAllowed.organization = Running inference is only allowed for datasets of your own organization.

voxelytics.disabled = Voxelytics workflow reporting and logging are not enabled for this WEBKNOSSOS instance.
voxelytics.runNotFound = Workflow runs not found
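For context on how these entries are consumed: the controllers above either attach a key directly to a failing check (?~> "job.runInference.notAllowed.organization") or resolve a parameterized key with Play's Messages API, which fills the {0}, {1} placeholders positionally. A minimal sketch, assuming Play's standard i18n behaviour and using a key from the hunk above; the function name and argument values are made up for illustration:

```scala
import play.api.i18n.{Messages, MessagesProvider}

// Resolves the entry  nml.node.id.invalid=Invalid {0} node id {1}  positionally.
// Assumes an implicit MessagesProvider is in scope, as in the controllers above.
def invalidNodeIdMessage(what: String, nodeId: Int)(implicit mp: MessagesProvider): String =
  Messages("nml.node.id.invalid", what, nodeId) // e.g. "Invalid source node id 7"
```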
1 change: 1 addition & 0 deletions frontend/javascripts/admin/api/jobs.ts
@@ -348,6 +348,7 @@ type RunInferenceParameters = {
annotationId?: string;
aiModelId: string;
datasetName: string;
organizationId: string;
colorLayerName: string;
boundingBox: Vector6;
newDatasetName: string;
@@ -829,6 +829,7 @@ function CustomAiModelInferenceForm() {
aiModelId: form.getFieldValue("aiModel"),
workflowYaml: useCustomWorkflow ? form.getFieldValue("workflowYaml") : undefined,
datasetName: dataset.name,
organizationId: dataset.owningOrganization,
colorLayerName: colorLayer.name,
boundingBox,
newDatasetName: newDatasetName,