Write wkurl into annotation nml file #6964

Merged
merged 19 commits on Apr 18, 2023
Changes from 8 commits
1 change: 1 addition & 0 deletions CHANGELOG.unreleased.md
@@ -20,6 +20,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
### Changed
- Moved the view mode selection in the toolbar next to the position field. [#6949](https://github.com/scalableminds/webknossos/pull/6949)
- Redesigned welcome toast for new, annonymous users with new branding. [#6961](https://github.com/scalableminds/webknossos/pull/6961)
- When saving annotations, the URL of the webknossos instance is stored in the resulting NML file. [#6964](https://github.com/scalableminds/webknossos/pull/6964)

### Fixed
- Fixed incorrect initial tab when clicking "Show Annotations" for a user in the user list. Also, the datasets tab was removed from that page as it was the same as the datasets table from the main dashboard. [#6957](https://github.com/scalableminds/webknossos/pull/6957)
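
The changelog entry above summarizes the round trip this PR sets up: when an annotation is downloaded, the serving instance writes its own URL into the NML parameters; when the file is uploaded again, that URL is parsed back out and compared against the current instance to produce clearer error messages. As a rough illustration, with placeholder values (the actual attributes are written by `NmlWriter` and `nml_helpers.ts` in the diffs below), the `<experiment>` element of a saved NML would now carry a `wkUrl` attribute:

```scala
import scala.xml.{Elem, XML}

object ExperimentTagExample {
  // Hand-written example of the parameters element a saved NML would now contain.
  // All values are placeholders, not output of the PR's code.
  val experimentXml: String =
    """<experiment name="my_dataset"
      |            organization="sample_organization"
      |            description="example annotation"
      |            wkUrl="https://webknossos.example.org"/>""".stripMargin

  def main(args: Array[String]): Unit = {
    val experiment: Elem = XML.loadString(experimentXml) // parse it back, as an uploader would
    println(experiment \@ "wkUrl")                       // https://webknossos.example.org
  }
}
```
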
58 changes: 40 additions & 18 deletions app/controllers/AnnotationIOController.scala
@@ -43,7 +43,7 @@ import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.Files.{TemporaryFile, TemporaryFileCreator}
import play.api.libs.json.Json
import play.api.mvc.{Action, AnyContent, MultipartFormData}
import utils.ObjectId
import utils.{ObjectId, WkConf}

import scala.concurrent.{ExecutionContext, Future}

@@ -62,6 +62,7 @@ class AnnotationIOController @Inject()(
temporaryFileCreator: TemporaryFileCreator,
annotationService: AnnotationService,
analyticsService: AnalyticsService,
conf: WkConf,
sil: Silhouette[WkEnv],
provider: AnnotationInformationProvider,
annotationUploadService: AnnotationUploadService)(implicit ec: ExecutionContext, val materializer: Materializer)
@@ -102,9 +103,9 @@ Expects:
val attachedFiles = request.body.files.map(f => (f.ref.path.toFile, f.filename))
val parsedFiles =
annotationUploadService.extractFromFiles(attachedFiles, useZipName = true, overwritingDataSetName)
val parsedFilesWraped =
val parsedFilesWrapped =
annotationUploadService.wrapOrPrefixTrees(parsedFiles.parseResults, shouldCreateGroupForEachFile)
val parseResultsFiltered: List[NmlParseResult] = parsedFilesWraped.filter(_.succeeded)
val parseResultsFiltered: List[NmlParseResult] = parsedFilesWrapped.filter(_.succeeded)

if (parseResultsFiltered.isEmpty) {
returnError(parsedFiles)
@@ -113,13 +114,15 @@ Expects:
parseSuccesses <- Fox.serialCombined(parseResultsFiltered)(r => r.toSuccessBox)
name = nameForUploaded(parseResultsFiltered.map(_.fileName))
description = descriptionForNMLs(parseResultsFiltered.map(_.description))
wkUrl = wkUrlsForNMLs(parseResultsFiltered.map(_.wkUrl))
_ <- assertNonEmpty(parseSuccesses)
skeletonTracings = parseSuccesses.flatMap(_.skeletonTracing)
// Create a list of volume layers for each uploaded (non-skeleton-only) annotation.
// This is what determines the merging strategy for volume layers
volumeLayersGroupedRaw = parseSuccesses.map(_.volumeLayers).filter(_.nonEmpty)
dataSet <- findDataSetForUploadedAnnotations(skeletonTracings,
volumeLayersGroupedRaw.flatten.map(_.tracing))
volumeLayersGroupedRaw.flatten.map(_.tracing),
wkUrl)
volumeLayersGrouped <- adaptVolumeTracingsToFallbackLayer(volumeLayersGroupedRaw, dataSet)
tracingStoreClient <- tracingStoreService.clientFor(dataSet)
mergedVolumeLayers <- mergeAndSaveVolumeLayers(volumeLayersGrouped,
@@ -198,19 +201,31 @@ Expects:

private def findDataSetForUploadedAnnotations(
skeletonTracings: List[SkeletonTracing],
volumeTracings: List[VolumeTracing])(implicit mp: MessagesProvider, ctx: DBAccessContext): Fox[DataSet] =
volumeTracings: List[VolumeTracing],
wkUrl: String)(implicit mp: MessagesProvider, ctx: DBAccessContext): Fox[DataSet] =
for {
dataSetName <- assertAllOnSameDataSet(skeletonTracings, volumeTracings) ?~> "nml.file.differentDatasets"
organizationNameOpt <- assertAllOnSameOrganization(skeletonTracings, volumeTracings) ?~> "nml.file.differentDatasets"
organizationIdOpt <- Fox.runOptional(organizationNameOpt) {
organizationDAO.findOneByName(_)(GlobalAccessContext).map(_._id)
} ?~> Messages("organization.notFound", organizationNameOpt.getOrElse("")) ~> NOT_FOUND
} ?~> (if (wkUrl.nonEmpty && conf.Http.uri != wkUrl) {
Messages("organization.notFound.wrongHost", organizationNameOpt.getOrElse(""), wkUrl, conf.Http.uri)
} else { Messages("organization.notFound", organizationNameOpt.getOrElse("")) }) ~>
NOT_FOUND
organizationId <- Fox.fillOption(organizationIdOpt) {
dataSetDAO.getOrganizationForDataSet(dataSetName)(GlobalAccessContext)
} ?~> Messages("dataSet.noAccess", dataSetName) ~> FORBIDDEN
dataSet <- dataSetDAO.findOneByNameAndOrganization(dataSetName, organizationId) ?~> Messages(
"dataSet.noAccess",
dataSetName) ~> FORBIDDEN
dataSet <- dataSetDAO.findOneByNameAndOrganization(dataSetName, organizationId) ?~> (if (wkUrl.nonEmpty && conf.Http.uri != wkUrl) {
Messages(
"dataSet.noAccess.wrongHost",
dataSetName,
wkUrl,
conf.Http.uri)
} else {
Messages(
"dataSet.noAccess",
dataSetName)
}) ~> FORBIDDEN
} yield dataSet

private def nameForUploaded(fileNames: Seq[String]) =
@@ -222,6 +237,9 @@ Expects:
private def descriptionForNMLs(descriptions: Seq[Option[String]]) =
if (descriptions.size == 1) descriptions.headOption.flatten.getOrElse("") else ""

private def wkUrlsForNMLs(wkUrls: Seq[Option[String]]) =
if (wkUrls.toSet.size == 1) wkUrls.headOption.flatten.getOrElse("") else ""

private def returnError(zipParseResult: NmlResults.MultiNmlParseResult)(implicit messagesProvider: MessagesProvider) =
if (zipParseResult.containsFailure) {
val errors = zipParseResult.parseResults.flatMap {
@@ -370,6 +388,7 @@ Expects:
dataSet.scale,
None,
organizationName,
conf.Http.uri,
dataSet.name,
Some(user),
taskOpt)
@@ -395,15 +414,18 @@ Expects:
}
user <- userService.findOneById(annotation._user, useCache = true)
taskOpt <- Fox.runOptional(annotation._task)(taskDAO.findOne)
nmlStream = nmlWriter.toNmlStream(fetchedSkeletonLayers ::: fetchedVolumeLayers,
Some(annotation),
dataset.scale,
None,
organizationName,
dataset.name,
Some(user),
taskOpt,
skipVolumeData)
nmlStream = nmlWriter.toNmlStream(
fetchedSkeletonLayers ::: fetchedVolumeLayers,
Some(annotation),
dataset.scale,
None,
organizationName,
conf.Http.uri,
dataset.name,
Some(user),
taskOpt,
skipVolumeData
)
temporaryFile = temporaryFileCreator.create()
zipper = ZipIO.startZip(new BufferedOutputStream(new FileOutputStream(new File(temporaryFile.path.toString))))
_ <- zipper.addFileFromEnumerator(name + ".nml", nmlStream)
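
Two behaviors introduced in `AnnotationIOController` above are easy to miss: `wkUrlsForNMLs` only keeps a URL when every uploaded NML agrees on one (anything else collapses to the empty string), and the new `*.wrongHost` error messages are only selected when that URL is non-empty and differs from `conf.Http.uri`. A standalone sketch of the same idea, with illustrative names and values rather than the controller's actual API:

```scala
// Sketch of the consensus / mismatch logic used when uploading NMLs.
// Not the controller code itself; helper names and inputs are made up for illustration.
object WkUrlSketch {
  // Keep the URL only if all parsed NMLs carry the same one.
  def wkUrlForNmls(wkUrls: Seq[Option[String]]): String =
    if (wkUrls.toSet.size == 1) wkUrls.headOption.flatten.getOrElse("") else ""

  // The wrongHost variant of an error message only applies to a genuine mismatch.
  def useWrongHostMessage(wkUrl: String, instanceUri: String): Boolean =
    wkUrl.nonEmpty && wkUrl != instanceUri

  def main(args: Array[String]): Unit = {
    val sameOrigin  = Seq(Some("https://a.example.org"), Some("https://a.example.org"))
    val mixedOrigin = Seq(Some("https://a.example.org"), Some("https://b.example.org"))

    println(wkUrlForNmls(sameOrigin))   // https://a.example.org
    println(wkUrlForNmls(mixedOrigin))  // "" (no consensus, so no wrongHost hint)

    println(useWrongHostMessage("https://a.example.org", "https://b.example.org")) // true
    println(useWrongHostMessage("", "https://b.example.org"))                      // false
  }
}
```
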
6 changes: 4 additions & 2 deletions app/models/annotation/AnnotationService.scala
@@ -52,7 +52,7 @@ import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.Files.{TemporaryFile, TemporaryFileCreator}
import play.api.libs.iteratee.Enumerator
import play.api.libs.json.{JsNull, JsObject, JsValue, Json}
import utils.ObjectId
import utils.{ObjectId, WkConf}

import java.io.{BufferedOutputStream, File, FileOutputStream}
import javax.inject.Inject
@@ -104,7 +104,8 @@ class AnnotationService @Inject()(
nmlWriter: NmlWriter,
temporaryFileCreator: TemporaryFileCreator,
meshDAO: MeshDAO,
meshService: MeshService
meshService: MeshService,
conf: WkConf,
)(implicit ec: ExecutionContext, val materializer: Materializer)
extends BoxImplicits
with FoxImplicits
@@ -639,6 +640,7 @@ class AnnotationService @Inject()(
scaleOpt,
Some(name + "_data.zip"),
organizationName,
conf.Http.uri,
datasetName,
Some(user),
taskOpt)
12 changes: 6 additions & 6 deletions app/models/annotation/AnnotationUploadService.scala
@@ -38,8 +38,8 @@ class AnnotationUploadService @Inject()(tempFileService: TempFileService) extend
isTaskUpload: Boolean,
basePath: Option[String] = None)(implicit m: MessagesProvider): NmlParseResult =
NmlParser.parse(name, inputStream, overwritingDataSetName, isTaskUpload, basePath) match {
case Full((skeletonTracing, uploadedVolumeLayers, description)) =>
NmlParseSuccess(name, skeletonTracing, uploadedVolumeLayers, description)
case Full((skeletonTracing, uploadedVolumeLayers, description, wkUrl)) =>
NmlParseSuccess(name, skeletonTracing, uploadedVolumeLayers, description, wkUrl)
case Failure(msg, _, chain) => NmlParseFailure(name, msg + chain.map(_ => formatChain(chain)).getOrElse(""))
case Empty => NmlParseEmpty(name)
}
@@ -82,8 +82,8 @@ class AnnotationUploadService @Inject()(tempFileService: TempFileService) extend

if (parseResults.length > 1) {
parseResults.map {
case NmlParseSuccess(name, Some(skeletonTracing), uploadedVolumeLayers, description) =>
NmlParseSuccess(name, Some(renameTrees(name, skeletonTracing)), uploadedVolumeLayers, description)
case NmlParseSuccess(name, Some(skeletonTracing), uploadedVolumeLayers, description, wkUrl) =>
NmlParseSuccess(name, Some(renameTrees(name, skeletonTracing)), uploadedVolumeLayers, description, wkUrl)
case r => r
}
} else {
@@ -104,8 +104,8 @@ class AnnotationUploadService @Inject()(tempFileService: TempFileService) extend
}

parseResults.map {
case NmlParseSuccess(name, Some(skeletonTracing), uploadedVolumeLayers, description) =>
NmlParseSuccess(name, Some(wrapTreesInGroup(name, skeletonTracing)), uploadedVolumeLayers, description)
case NmlParseSuccess(name, Some(skeletonTracing), uploadedVolumeLayers, description, wkUrl) =>
NmlParseSuccess(name, Some(wrapTreesInGroup(name, skeletonTracing)), uploadedVolumeLayers, description, wkUrl)
case r => r
}
}
8 changes: 6 additions & 2 deletions app/models/annotation/nml/NmlParser.scala
@@ -34,7 +34,7 @@ object NmlParser extends LazyLogging with ProtoGeometryImplicits with ColorGener
overwritingDataSetName: Option[String],
isTaskUpload: Boolean,
basePath: Option[String] = None)(
implicit m: MessagesProvider): Box[(Option[SkeletonTracing], List[UploadedVolumeLayer], String)] =
implicit m: MessagesProvider): Box[(Option[SkeletonTracing], List[UploadedVolumeLayer], String, Option[String])] =
try {
val data = XML.load(nmlInputStream)
for {
@@ -52,6 +52,7 @@ object NmlParser extends LazyLogging with ProtoGeometryImplicits with ColorGener
} yield {
val dataSetName = overwritingDataSetName.getOrElse(parseDataSetName(parameters \ "experiment"))
val description = parseDescription(parameters \ "experiment")
val wkUrl = parseWkUrl(parameters \ "experiment")
val organizationName =
if (overwritingDataSetName.isDefined) None else parseOrganizationName(parameters \ "experiment")
val activeNodeId = parseActiveNode(parameters \ "activeNode")
@@ -115,7 +116,7 @@ object NmlParser extends LazyLogging with ProtoGeometryImplicits with ColorGener
)
)

(skeletonTracingOpt, volumeLayers, description)
(skeletonTracingOpt, volumeLayers, description, wkUrl)
}
} catch {
case e: org.xml.sax.SAXParseException if e.getMessage.startsWith("Premature end of file") =>
@@ -232,6 +233,9 @@ object NmlParser extends LazyLogging with ProtoGeometryImplicits with ColorGener
private def parseDescription(nodes: NodeSeq): String =
nodes.headOption.map(node => getSingleAttribute(node, "description")).getOrElse(DEFAULT_DESCRIPTION)

private def parseWkUrl(nodes: NodeSeq): Option[String] =
nodes.headOption.map(node => getSingleAttribute(node, "wkUrl"))

private def parseOrganizationName(nodes: NodeSeq): Option[String] =
nodes.headOption.flatMap(node => getSingleAttributeOpt(node, "organization"))

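
`parseWkUrl` mirrors the existing `parseDescription`/`parseOrganizationName` helpers and reads the new attribute off the `<experiment>` element. A rough standalone approximation using plain scala.xml (assuming a missing or empty attribute should simply come back as `None`; the PR's `getSingleAttribute` helper may treat the absent case differently):

```scala
import scala.xml.{Node, XML}

object ParseWkUrlSketch {
  // Approximation of parseWkUrl: read the wkUrl attribute from the <experiment> node, if any.
  def parseWkUrl(experiment: Option[Node]): Option[String] =
    experiment.map(node => (node \ "@wkUrl").text).filter(_.nonEmpty)

  def main(args: Array[String]): Unit = {
    val withUrl    = XML.loadString("""<experiment name="ds" wkUrl="https://webknossos.example.org"/>""")
    val withoutUrl = XML.loadString("""<experiment name="ds"/>""")

    println(parseWkUrl(Some(withUrl)))    // Some(https://webknossos.example.org)
    println(parseWkUrl(Some(withoutUrl))) // None
  }
}
```
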
5 changes: 4 additions & 1 deletion app/models/annotation/nml/NmlResults.scala
@@ -14,6 +14,7 @@ object NmlResults extends LazyLogging {
def fileName: String

def description: Option[String] = None
def wkUrl: Option[String] = None

def succeeded: Boolean

@@ -32,11 +33,13 @@ object NmlResults extends LazyLogging {
case class NmlParseSuccess(fileName: String,
skeletonTracing: Option[SkeletonTracing],
volumeLayers: List[UploadedVolumeLayer],
_description: String)
_description: String,
_wkUrl: Option[String])
extends NmlParseResult {
def succeeded = true

override def description: Option[String] = Some(_description)
override def wkUrl: Option[String] = _wkUrl

override def withName(name: String): NmlParseResult = this.copy(fileName = name)
}
9 changes: 9 additions & 0 deletions app/models/annotation/nml/NmlWriter.scala
@@ -22,6 +22,7 @@ case class NmlParameters(
dataSetName: String,
organizationName: String,
description: Option[String],
wkUrl: String,
scale: Option[Vec3Double],
createdTimestamp: Long,
editPosition: Vec3IntProto,
@@ -40,6 +41,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
scale: Option[Vec3Double],
volumeFilename: Option[String],
organizationName: String,
wkUrl: String,
datasetName: String,
annotationOwner: Option[User],
annotationTask: Option[Task],
@@ -53,6 +55,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
scale,
volumeFilename,
organizationName,
wkUrl,
datasetName,
annotationOwner,
annotationTask,
@@ -66,6 +69,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
scale: Option[Vec3Double],
volumeFilename: Option[String],
organizationName: String,
wkUrl: String,
datasetName: String,
annotationOwner: Option[User],
annotationTask: Option[Task],
@@ -82,6 +86,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
volumeLayers,
annotation: Option[Annotation],
organizationName,
wkUrl,
datasetName,
scale)
_ = writeParameters(parameters)
@@ -103,6 +108,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
volumeLayers: List[FetchedAnnotationLayer],
annotation: Option[Annotation],
organizationName: String,
wkUrl: String,
datasetName: String,
scale: Option[Vec3Double]): Fox[NmlParameters] =
for {
@@ -113,6 +119,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
datasetName,
organizationName,
annotation.map(_.description),
wkUrl,
scale,
s.createdTimestamp,
s.editPosition,
@@ -127,6 +134,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
datasetName,
organizationName,
annotation.map(_.description),
wkUrl,
scale,
v.createdTimestamp,
v.editPosition,
@@ -154,6 +162,7 @@ class NmlWriter @Inject()(implicit ec: ExecutionContext) extends FoxImplicits {
writer.writeAttribute("name", parameters.dataSetName)
writer.writeAttribute("organization", parameters.organizationName)
parameters.description.foreach(writer.writeAttribute("description", _))
writer.writeAttribute("wkUrl", parameters.wkUrl)
}
Xml.withinElementSync("scale") {
writer.writeAttribute("x", parameters.scale.map(_.x).getOrElse(-1).toString)
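
On export, `NmlWriter` adds the instance URL as one more attribute on the `<experiment>` element it already writes. A self-contained sketch of that write path with a bare StAX `XMLStreamWriter` and placeholder values; the real writer goes through the project's `Xml.withinElementSync` helper visible in the diff above, which is not reproduced here:

```scala
import java.io.StringWriter
import javax.xml.stream.XMLOutputFactory

object WriteExperimentSketch {
  def main(args: Array[String]): Unit = {
    val out    = new StringWriter()
    val writer = XMLOutputFactory.newInstance().createXMLStreamWriter(out)

    // Placeholder values; in the PR they come from NmlParameters, with wkUrl filled from conf.Http.uri.
    writer.writeStartElement("experiment")
    writer.writeAttribute("name", "my_dataset")
    writer.writeAttribute("organization", "sample_organization")
    writer.writeAttribute("wkUrl", "https://webknossos.example.org")
    writer.writeEndElement()
    writer.flush()

    // Prints something like:
    // <experiment name="my_dataset" organization="sample_organization" wkUrl="https://webknossos.example.org"></experiment>
    println(out.toString)
  }
}
```
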
2 changes: 2 additions & 0 deletions conf/messages
@@ -33,6 +33,7 @@ team.inUse.projects=Team is referenced by {0} projects
organization.create.forbidden=You are not allowed to create a new organization
organization.create.failed=Failed to create a new organization
organization.notFound=Organization {0} could not be found
organization.notFound.wrongHost=Organization {0} could not be found. Please check whether you are on the correct WEBKNOSSOS instance. The uploaded file indicates {1} while this instance is {2}.
organization.list.failed=Failed to retrieve list of organizations.
organization.name.invalid=This organization name contains illegal characters. Please only use letters and numbers.
organization.name.alreadyInUse=This name is already claimed by a different organization and not available anymore. Please choose a different name.
@@ -75,6 +76,7 @@ dataSet.notFound=Dataset {0} does not exist or could not be accessed
dataSet.notFoundConsiderLogin=Dataset {0} does not exist or could not be accessed. You may need to log in.
dataSet.notFoundForAnnotation=The Dataset for this annotation does not exist or could not be accessed.
dataSet.noAccess=Could not access DataSet {0}. Does your team have access?
dataSet.noAccess.wrongHost=Could not access DataSet {0}. Please check whether you are on the correct WEBKNOSSOS instance. The uploaded file indicates {1} while this instance is {2}.
dataSet.noAccessById=Could not access the corresponding DataSet. This is likely because you are not a member of a team that has access to it.
dataSet.notImported=Dataset {0} is not imported
dataSet.name.invalid.characters=Dataset name is invalid. Please use only letters, digits, dots, underscores, hypens.
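
The two new `*.wrongHost` messages take three positional arguments: the dataset or organization name, the wkUrl recorded in the uploaded file, and this instance's `conf.Http.uri`. Play fills the `{0}`/`{1}`/`{2}` placeholders from the arguments passed to `Messages(...)`; a quick standalone approximation with `java.text.MessageFormat` and made-up values, assuming the placeholder semantics match Play's usual i18n formatting:

```scala
import java.text.MessageFormat

object WrongHostMessageSketch {
  // Same pattern as the dataSet.noAccess.wrongHost entry in conf/messages.
  val pattern: String =
    "Could not access DataSet {0}. Please check whether you are on the correct WEBKNOSSOS instance. " +
      "The uploaded file indicates {1} while this instance is {2}."

  def main(args: Array[String]): Unit =
    println(MessageFormat.format(pattern,
      "my_dataset", "https://other.example.org", "https://webknossos.example.org"))
}
```
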
1 change: 1 addition & 0 deletions frontend/javascripts/oxalis/model/helpers/nml_helpers.ts
@@ -224,6 +224,7 @@ function serializeParameters(
name: state.dataset.name,
description: annotation.description,
organization: state.dataset.owningOrganization,
wkUrl: `${location.protocol}//${location.host}`,
}),
serializeTag("scale", {
x: state.dataset.dataSource.scale[0],