Merge branch 'master' of github.com:scalableminds/webknossos into pricing

* 'master' of github.com:scalableminds/webknossos:
  Fix parsing failure during import of ngff zarr datasets with translation transforms (#6621)
  Fix rerender after each layer name keypress in dataset import view (#6628)
  Create a layer for each channel of NGFF-Zarr datasets (#6609)
  Fix screenshot tests (#6623)
  Fix importing a dataset from disk (#6615)
  Allow deleting annotation layers (#6593)
  Rephrased Error Messages (#6616)
hotzenklotz committed Nov 11, 2022
2 parents c657eb0 + 35c3673 commit cb094d4
Showing 45 changed files with 377 additions and 285 deletions.
8 changes: 7 additions & 1 deletion CHANGELOG.unreleased.md
@@ -15,16 +15,22 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- The largest segment id for a segmentation layer can be computed automatically from the dataset settings page. [#6415](https://github.com/scalableminds/webknossos/pull/6415)
- Button for switching organizations for Voxelytics workflows. [#6572](https://github.com/scalableminds/webknossos/pull/6572)
- Added ability to shuffle / set colors for a whole tree group. [#6586](https://github.com/scalableminds/webknossos/pull/6586)
- Annotation layers can now be removed. [#6593](https://github.com/scalableminds/webknossos/pull/6593)
- When adding remote Zarr datasets with multiple channels, channels are converted into layers. [#6609](https://github.com/scalableminds/webknossos/pull/6609)

### Changed
- The log viewer in the Voxelytics workflow reporting now uses a virtualized list. [#6579](https://github.com/scalableminds/webknossos/pull/6579)
- Node positions are always handled as integers. The server has always persisted them as integers, but in earlier versions the session in which a node was created handled its position as floating point. [#6589](https://github.com/scalableminds/webknossos/pull/6589)
- When merging annotations, bounding boxes are no longer duplicated. [#6576](https://github.com/scalableminds/webknossos/pull/6576)
- Jobs can no longer be started on datastores without workers. [#6595](https://github.com/scalableminds/webknossos/pull/6595)
- When downloading volume annotations with volume data skipped, the NML volume tag is now included anyway (but without a location attribute in this case). [#6566](https://github.com/scalableminds/webknossos/pull/6566)
- Re-phrased some backend (error) messages to improve clarity and provide helpful hints. [#6616](https://github.com/scalableminds/webknossos/pull/6616)
- Redesigned the organization page to include more information on organization users, storage, and the webKnossos plan, and to provide opportunities to upgrade. [#6602](https://github.com/scalableminds/webknossos/pull/6602)

### Fixed
- Fixed importing a dataset from disk. [#6615](https://github.com/scalableminds/webknossos/pull/6615)
- Fixed a bug in the dataset import view, where the layer name text field would lose focus after each key press. [#6628](https://github.com/scalableminds/webknossos/pull/6628)
- Fixed importing NGFF Zarr datasets with non-scale transforms. [#6621](https://github.com/scalableminds/webknossos/pull/6621)

### Removed

35 changes: 28 additions & 7 deletions app/controllers/AnnotationController.scala
@@ -5,33 +5,29 @@ import com.mohiva.play.silhouette.api.Silhouette
import com.scalableminds.util.accesscontext.{DBAccessContext, GlobalAccessContext}
import com.scalableminds.util.geometry.BoundingBox
import com.scalableminds.util.tools.{Fox, FoxImplicits}
import com.scalableminds.webknossos.datastore.models.annotation.AnnotationLayerType.AnnotationLayerType
import com.scalableminds.webknossos.datastore.models.annotation.{AnnotationLayer, AnnotationLayerType}
import com.scalableminds.webknossos.tracingstore.tracings.volume.ResolutionRestrictions
import com.scalableminds.webknossos.tracingstore.tracings.{TracingIds, TracingType}
import io.swagger.annotations._

import javax.inject.Inject
import models.analytics.{AnalyticsService, CreateAnnotationEvent, OpenAnnotationEvent}
import com.scalableminds.webknossos.datastore.models.annotation.AnnotationLayerType.AnnotationLayerType
import models.annotation.AnnotationState.Cancelled
import models.annotation._
import models.binary.{DataSetDAO, DataSetService}
import models.organization.OrganizationDAO
import models.project.ProjectDAO
import models.task.TaskDAO
import models.team.{TeamDAO, TeamService}
import models.user.time._
import models.user.{User, UserDAO, UserService}
import oxalis.mail.{MailchimpClient, MailchimpTag}
import oxalis.security.{URLSharing, WkEnv}
import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.json._
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
import utils.{ObjectId, WkConf}

import javax.inject.Inject
import models.analytics.{AnalyticsService, CreateAnnotationEvent, OpenAnnotationEvent}
import models.organization.OrganizationDAO
import oxalis.mail.{MailchimpClient, MailchimpTag}

import scala.concurrent.ExecutionContext
import scala.concurrent.duration._

@@ -225,6 +221,31 @@ class AnnotationController @Inject()(
} yield result
}

@ApiOperation(hidden = true, value = "")
def deleteAnnotationLayer(typ: String, id: String, layerName: String): Action[AnyContent] =
sil.SecuredAction.async { implicit request =>
for {
_ <- bool2Fox(AnnotationType.Explorational.toString == typ) ?~> "annotation.deleteLayer.explorationalsOnly"
annotation <- provider.provideAnnotation(typ, id, request.identity)
_ <- bool2Fox(annotation._user == request.identity._id) ?~> "notAllowed" ~> FORBIDDEN
_ <- annotation.annotationLayers.find(annotationLayer => annotationLayer.name == layerName) ?~> Messages(
"annotation.layer.notFound",
layerName)
_ <- bool2Fox(annotation.annotationLayers.length != 1) ?~> "annotation.deleteLayer.onlyLayer"
_ = logger.info(s"Deleting annotation layer $layerName for annotation $id")
_ <- annotationService.deleteAnnotationLayer(annotation, layerName)
} yield Ok
}
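
The guard chain in `deleteAnnotationLayer` can be sketched standalone with plain `Either` (a simplified stand-in for the `Fox`-based chain above; the case classes below are illustrative models, not the real webknossos types):

```scala
// Simplified model of the checks deleteAnnotationLayer performs, in order.
case class Layer(name: String)
case class Annotation(typ: String, owner: String, layers: List[Layer])

def checkDeleteLayer(annotation: Annotation, requestingUser: String, layerName: String): Either[String, Unit] =
  for {
    // Only explorational annotations may lose layers.
    _ <- Either.cond(annotation.typ == "Explorational", (), "annotation.deleteLayer.explorationalsOnly")
    // Only the annotation's owner may delete a layer.
    _ <- Either.cond(annotation.owner == requestingUser, (), "notAllowed")
    // The named layer must exist.
    _ <- annotation.layers.find(_.name == layerName).toRight("annotation.layer.notFound").map(_ => ())
    // The last remaining layer cannot be deleted.
    _ <- Either.cond(annotation.layers.length != 1, (), "annotation.deleteLayer.onlyLayer")
  } yield ()
```

Note that, as in the controller, the ownership check runs before the layer lookup, so a non-owner gets "notAllowed" even for a layer name that does not exist.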

@ApiOperation(hidden = true, value = "")
def deleteAnnotationLayerWithoutType(id: String, layerName: String): Action[AnyContent] =
sil.SecuredAction.async { implicit request =>
for {
annotation <- provider.provideAnnotation(id, request.identity) ~> NOT_FOUND
result <- deleteAnnotationLayer(annotation.typ.toString, id, layerName)(request)
} yield result
}

@ApiOperation(hidden = true, value = "")
def createExplorational(organizationName: String, dataSetName: String): Action[List[AnnotationLayerParameters]] =
sil.SecuredAction.async(validateJson[List[AnnotationLayerParameters]]) { implicit request =>
2 changes: 1 addition & 1 deletion app/controllers/DataSetController.scala
@@ -440,7 +440,7 @@ Expects:
for {
organization <- organizationDAO.findOneByName(organizationName)
_ <- bool2Fox(organization._id == request.identity._organization) ~> FORBIDDEN
_ <- dataSetService.assertValidDataSetName(dataSetName) ?~> "dataSet.name.invalid"
_ <- dataSetService.assertValidDataSetName(dataSetName)
_ <- dataSetService.assertNewDataSetName(dataSetName, organization._id) ?~> "dataSet.name.alreadyTaken"
} yield Ok
}
2 changes: 1 addition & 1 deletion app/controllers/WKRemoteDataStoreController.scala
@@ -55,7 +55,7 @@ class WKRemoteDataStoreController @Inject()(
"organization.notFound",
uploadInfo.organization) ~> NOT_FOUND
_ <- bool2Fox(organization._id == user._organization) ?~> "notAllowed" ~> FORBIDDEN
_ <- dataSetService.assertValidDataSetName(uploadInfo.name) ?~> "dataSet.name.invalid"
_ <- dataSetService.assertValidDataSetName(uploadInfo.name)
_ <- dataSetService.assertNewDataSetName(uploadInfo.name, organization._id) ?~> "dataSet.name.alreadyTaken"
_ <- bool2Fox(dataStore.onlyAllowedOrganization.forall(_ == organization._id)) ?~> "dataSet.upload.Datastore.restricted"
_ <- Fox.serialCombined(uploadInfo.layersToLink.getOrElse(List.empty))(l => validateLayerToLink(l, user)) ?~> "dataSet.upload.invalidLinkedLayers"
6 changes: 6 additions & 0 deletions app/models/annotation/Annotation.scala
@@ -109,6 +109,12 @@ class AnnotationLayerDAO @Inject()(SQLClient: SQLClient)(implicit ec: ExecutionC
sqlu"""insert into webknossos.annotation_layers(_annotation, tracingId, typ, name)
values($annotationId, ${a.tracingId}, '#${a.typ}', ${a.name})"""

def deleteOne(annotationId: ObjectId, layerName: String): Fox[Unit] =
for {
_ <- run(sqlu"""delete from webknossos.annotation_layers where _annotation = $annotationId and
name = $layerName""")
} yield ()

def findAnnotationIdByTracingId(tracingId: String): Fox[ObjectId] =
for {
rList <- run(sql"select _annotation from webknossos.annotation_layers where tracingId = $tracingId".as[String])
5 changes: 5 additions & 0 deletions app/models/annotation/AnnotationService.scala
@@ -193,6 +193,11 @@ class AnnotationService @Inject()(
_ <- annotationLayersDAO.insertForAnnotation(annotation._id, newAnnotationLayers)
} yield ()

def deleteAnnotationLayer(annotation: Annotation, layerName: String): Fox[Unit] =
for {
_ <- annotationLayersDAO.deleteOne(annotation._id, layerName)
} yield ()

private def createTracingsForExplorational(dataSet: DataSet,
dataSource: DataSource,
allAnnotationLayerParameters: List[AnnotationLayerParameters],
17 changes: 5 additions & 12 deletions app/models/annotation/AnnotationSettings.scala
@@ -4,16 +4,15 @@ import com.scalableminds.util.enumeration.ExtendedEnumeration
import com.scalableminds.webknossos.tracingstore.tracings.TracingType
import com.scalableminds.webknossos.tracingstore.tracings.TracingType.TracingType
import com.scalableminds.webknossos.tracingstore.tracings.volume.ResolutionRestrictions
import models.annotation.AnnotationSettings.skeletonModes
import play.api.libs.json._

object TracingMode extends ExtendedEnumeration {
type TracingMode = Value
val orthogonal, oblique, flight, volume = Value
val orthogonal, oblique, flight = Value
}

case class AnnotationSettings(
allowedModes: List[TracingMode.Value] = skeletonModes,
allowedModes: List[TracingMode.Value],
preferredMode: Option[String] = None,
branchPointsAllowed: Boolean = true,
somaClickingAllowed: Boolean = true,
@@ -23,17 +22,11 @@
)

object AnnotationSettings {
private val skeletonModes = List(TracingMode.orthogonal, TracingMode.oblique, TracingMode.flight)
private val volumeModes = List(TracingMode.volume)
private val allModes = skeletonModes ::: volumeModes

def defaultFor(tracingType: TracingType): AnnotationSettings = tracingType match {
case TracingType.skeleton =>
AnnotationSettings(allowedModes = skeletonModes)
case TracingType.volume =>
AnnotationSettings(allowedModes = volumeModes)
case TracingType.hybrid =>
AnnotationSettings(allowedModes = allModes)
AnnotationSettings(allowedModes = List(TracingMode.orthogonal))
case TracingType.skeleton | TracingType.hybrid =>
AnnotationSettings(allowedModes = List(TracingMode.orthogonal, TracingMode.oblique, TracingMode.flight))
}

implicit val jsonFormat: OFormat[AnnotationSettings] = Json.format[AnnotationSettings]
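
The effect of this change can be sketched standalone: the former `volume` tracing mode is gone, volume-only annotations default to orthogonal viewing, and skeleton and hybrid annotations keep all three skeleton modes. The enumerations below are simplified stand-ins for the real `TracingType`/`TracingMode`:

```scala
// Illustrative stand-ins for the enumerations in the webknossos codebase.
object TracingType extends Enumeration { val skeleton, volume, hybrid = Value }
object TracingMode extends Enumeration { val orthogonal, oblique, flight = Value }

// Mirrors the new defaultFor logic above.
def defaultAllowedModes(t: TracingType.Value): List[TracingMode.Value] = t match {
  case TracingType.volume => List(TracingMode.orthogonal)
  case _                  => List(TracingMode.orthogonal, TracingMode.oblique, TracingMode.flight)
}
```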
2 changes: 1 addition & 1 deletion app/models/binary/explore/N5ArrayExplorer.scala
@@ -23,7 +23,7 @@ class N5ArrayExplorer extends RemoteLayerExplorer {
elementClass <- n5Header.elementClass ?~> "failed to read element class from n5 header"
guessedAxisOrder = AxisOrder.asZyxFromRank(n5Header.rank)
boundingBox <- n5Header.boundingBox(guessedAxisOrder) ?~> "failed to read bounding box from zarr header. Make sure data is in (T/C)ZYX format"
magLocator = MagLocator(Vec3Int.ones, Some(remotePath.toString), credentials, Some(guessedAxisOrder))
magLocator = MagLocator(Vec3Int.ones, Some(remotePath.toString), credentials, Some(guessedAxisOrder), None)
layer: N5Layer = if (looksLikeSegmentationLayer(name, elementClass)) {
N5SegmentationLayer(name, boundingBox, elementClass, List(magLocator), largestSegmentId = None)
} else N5DataLayer(name, Category.color, boundingBox, elementClass, List(magLocator))
2 changes: 1 addition & 1 deletion app/models/binary/explore/N5MultiscalesExplorer.scala
@@ -115,7 +115,7 @@ class N5MultiscalesExplorer extends RemoteLayerExplorer with FoxImplicits {
elementClass <- n5Header.elementClass ?~> s"failed to read element class from n5 header at $headerPath"
boundingBox <- n5Header.boundingBox(axisOrder) ?~> s"failed to read bounding box from n5 header at $headerPath"
} yield
MagWithAttributes(MagLocator(mag, Some(magPath.toString), credentials, Some(axisOrder)),
MagWithAttributes(MagLocator(mag, Some(magPath.toString), credentials, Some(axisOrder), None),
magPath,
elementClass,
boundingBox)
66 changes: 47 additions & 19 deletions app/models/binary/explore/NgffExplorer.scala
@@ -25,36 +25,63 @@ class NgffExplorer extends RemoteLayerExplorer {
for {
zattrsPath <- Fox.successful(remotePath.resolve(NgffMetadata.FILENAME_DOT_ZATTRS))
ngffHeader <- parseJsonFromPath[NgffMetadata](zattrsPath) ?~> s"Failed to read OME NGFF header at $zattrsPath"
layers <- Fox.serialCombined(ngffHeader.multiscales)(layerFromNgffMultiscale(_, remotePath, credentials))

layerLists: List[List[(ZarrLayer, Vec3Double)]] <- Fox.serialCombined(ngffHeader.multiscales)(multiScale => {
for {
channelCount: Int <- getNgffMultiscaleChannelCount(multiScale, remotePath)
layers <- layersFromNgffMultiscale(multiScale, remotePath, credentials, channelCount)
} yield layers
})
layers: List[(ZarrLayer, Vec3Double)] = layerLists.flatten
} yield layers

private def layerFromNgffMultiscale(multiscale: NgffMultiscalesItem,
remotePath: Path,
credentials: Option[FileSystemCredentials]): Fox[(ZarrLayer, Vec3Double)] =
private def getNgffMultiscaleChannelCount(multiscale: NgffMultiscalesItem, remotePath: Path): Fox[Int] =
for {
firstDataset <- multiscale.datasets.headOption.toFox
magPath = remotePath.resolve(firstDataset.path)
zarrayPath = magPath.resolve(ZarrHeader.FILENAME_DOT_ZARRAY)
zarrHeader <- parseJsonFromPath[ZarrHeader](zarrayPath) ?~> s"failed to read zarr header at $zarrayPath"
axisOrder <- extractAxisOrder(multiscale.axes) ?~> "Could not extract XYZ axis order mapping. Does the data have x, y and z axes, stated in multiscales metadata?"
channelAxisIndex <- axisOrder.c.toFox
} yield zarrHeader.shape(channelAxisIndex)
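
In other words, the channel count is read straight off the first mag's zarr array shape at the channel axis index. A self-contained sketch, with a simplified stand-in for the webknossos `AxisOrder` type:

```scala
// Stand-in for AxisOrder: indices into the array shape for x, y, z,
// and (optionally) the channel axis c.
case class AxisOrder(x: Int, y: Int, z: Int, c: Option[Int] = None)

// For a (C)ZYX-shaped zarr array, the channel count is shape(c);
// None if the metadata declares no channel axis.
def channelCount(shape: List[Int], axisOrder: AxisOrder): Option[Int] =
  axisOrder.c.map(shape(_))
```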

private def layersFromNgffMultiscale(multiscale: NgffMultiscalesItem,
remotePath: Path,
credentials: Option[FileSystemCredentials],
channelCount: Int): Fox[List[(ZarrLayer, Vec3Double)]] =
for {
axisOrder <- extractAxisOrder(multiscale.axes) ?~> "Could not extract XYZ axis order mapping. Does the data have x, y and z axes, stated in multiscales metadata?"
axisUnitFactors <- extractAxisUnitFactors(multiscale.axes, axisOrder) ?~> "Could not extract axis unit-to-nm factors"
voxelSizeInAxisUnits <- extractVoxelSizeInAxisUnits(
multiscale.datasets.map(_.coordinateTransformations),
axisOrder) ?~> "Could not extract voxel size from scale transforms"
magsWithAttributes <- Fox.serialCombined(multiscale.datasets)(d =>
zarrMagFromNgffDataset(d, remotePath, voxelSizeInAxisUnits, axisOrder, credentials))
_ <- bool2Fox(magsWithAttributes.nonEmpty) ?~> "zero mags in layer"
elementClass <- elementClassFromMags(magsWithAttributes) ?~> "Could not extract element class from mags"
boundingBox = boundingBoxFromMags(magsWithAttributes)
voxelSizeNanometers = voxelSizeInAxisUnits * axisUnitFactors
nameFromPath <- guessNameFromPath(remotePath)
name = multiscale.name.getOrElse(nameFromPath)
voxelSizeNanometers = voxelSizeInAxisUnits * axisUnitFactors
layer: ZarrLayer = if (looksLikeSegmentationLayer(name, elementClass)) {
ZarrSegmentationLayer(name, boundingBox, elementClass, magsWithAttributes.map(_.mag), largestSegmentId = None)
} else ZarrDataLayer(name, Category.color, boundingBox, elementClass, magsWithAttributes.map(_.mag))
} yield (layer, voxelSizeNanometers)
layerTuples <- Fox.serialCombined((0 until channelCount).toList)({ channelIndex: Int =>
for {
magsWithAttributes <- Fox.serialCombined(multiscale.datasets)(d =>
zarrMagFromNgffDataset(d, remotePath, voxelSizeInAxisUnits, axisOrder, credentials, Some(channelIndex)))
_ <- bool2Fox(magsWithAttributes.nonEmpty) ?~> "zero mags in layer"
elementClass <- elementClassFromMags(magsWithAttributes) ?~> "Could not extract element class from mags"
boundingBox = boundingBoxFromMags(magsWithAttributes)
layer: ZarrLayer = if (looksLikeSegmentationLayer(name, elementClass)) {
ZarrSegmentationLayer(name,
boundingBox,
elementClass,
magsWithAttributes.map(_.mag),
largestSegmentId = None)
} else ZarrDataLayer(name, Category.color, boundingBox, elementClass, magsWithAttributes.map(_.mag))
} yield (layer, voxelSizeNanometers)
})
} yield layerTuples

private def zarrMagFromNgffDataset(ngffDataset: NgffDataset,
layerPath: Path,
voxelSizeInAxisUnits: Vec3Double,
axisOrder: AxisOrder,
credentials: Option[FileSystemCredentials]): Fox[MagWithAttributes] =
credentials: Option[FileSystemCredentials],
channelIndex: Option[Int]): Fox[MagWithAttributes] =
for {
mag <- magFromTransforms(ngffDataset.coordinateTransformations, voxelSizeInAxisUnits, axisOrder) ?~> "Could not extract mag from scale transforms"
magPath = layerPath.resolve(ngffDataset.path)
@@ -63,7 +90,7 @@
elementClass <- zarrHeader.elementClass ?~> s"failed to read element class from zarr header at $zarrayPath"
boundingBox <- zarrHeader.boundingBox(axisOrder) ?~> s"failed to read bounding box from zarr header at $zarrayPath"
} yield
MagWithAttributes(MagLocator(mag, Some(magPath.toString), credentials, Some(axisOrder)),
MagWithAttributes(MagLocator(mag, Some(magPath.toString), credentials, Some(axisOrder), channelIndex),
magPath,
elementClass,
boundingBox)
@@ -121,9 +148,10 @@
private def extractAndCombineScaleTransforms(coordinateTransforms: List[NgffCoordinateTransformation],
axisOrder: AxisOrder): Vec3Double = {
val filtered = coordinateTransforms.filter(_.`type` == "scale")
val xFactors = filtered.map(_.scale(axisOrder.x))
val yFactors = filtered.map(_.scale(axisOrder.y))
val zFactors = filtered.map(_.scale(axisOrder.z))
val scalesFromTransforms = filtered.flatMap(_.scale)
val xFactors = scalesFromTransforms.map(_(axisOrder.x))
val yFactors = scalesFromTransforms.map(_(axisOrder.y))
val zFactors = scalesFromTransforms.map(_(axisOrder.z))
Vec3Double(xFactors.product, yFactors.product, zFactors.product)
}
}
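
The #6621 fix above boils down to treating `scale` as optional, since NGFF "translation" transforms carry no scale vector and previously broke parsing. A standalone sketch of the combination logic (the case class is an illustrative stand-in, not the real `NgffCoordinateTransformation`):

```scala
// Stand-in for NGFF coordinateTransformations entries: only "scale"
// transforms carry a scale vector; "translation" transforms do not.
case class CoordinateTransformation(kind: String, scale: Option[List[Double]])

// Multiply all per-axis scale factors together, skipping entries without
// a scale vector (the case that previously caused the parsing failure).
def combinedScale(transforms: List[CoordinateTransformation], axisIndex: Int): Double =
  transforms.filter(_.kind == "scale").flatMap(_.scale).map(_(axisIndex)).product
```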
2 changes: 1 addition & 1 deletion app/models/binary/explore/ZarrArrayExplorer.scala
@@ -29,7 +29,7 @@ class ZarrArrayExplorer extends RemoteLayerExplorer {
elementClass <- zarrHeader.elementClass ?~> "failed to read element class from zarr header"
guessedAxisOrder = AxisOrder.asZyxFromRank(zarrHeader.rank)
boundingBox <- zarrHeader.boundingBox(guessedAxisOrder) ?~> "failed to read bounding box from zarr header. Make sure data is in (T/C)ZYX format"
magLocator = MagLocator(Vec3Int.ones, Some(remotePath.toString), credentials, Some(guessedAxisOrder))
magLocator = MagLocator(Vec3Int.ones, Some(remotePath.toString), credentials, Some(guessedAxisOrder), None)
layer: ZarrLayer = if (looksLikeSegmentationLayer(name, elementClass)) {
ZarrSegmentationLayer(name, boundingBox, elementClass, List(magLocator), largestSegmentId = None)
} else ZarrDataLayer(name, Category.color, boundingBox, elementClass, List(magLocator))