Commit
Merge branch 'master' of github.com:scalableminds/webknossos into empty-list-drawings

* 'master' of github.com:scalableminds/webknossos: (23 commits)
  fix scrolling in organization switcher in terms-of-services modal (#7083)
  Auto-Select via SAM (#7051)
  Build STL in chunks when exporting them (#7074)
  Improve performance for large (oversegmentation) meshes (#7077)
  Fix display of used storage in power plan (#7057)
  Fix user limits for invites (#7078)
  adds fileSize to voxelytics workflow list (#7071)
  Improve error logging for bucket requests (#7053)
  Fix zarr streaming datasource-properties.json generation for non-wkw/zarr datasets (#7065)
  Min length for layer names is one (#7064)
  Include voxelytics workflow name in tab title (#7070)
  Fix local to global layer index look up (#7066)
  Merge editable mappings (#7026)
  store correct artifacts for wkorg nightly (#7060)
  Team edit modal (#7043)
  Fix voxel offset in chunk name for unsharded neuroglancer precomputed datasets (#7062)
  Fix offset when loading non-aligned buckets for zarr/n5/precomputed (#7058)
  fixes wallTimes query for workflow reports (#7059)
  Handle Remote Dataset Edge Cases: compressed content-encoding, float voxel size (#7041)
  Release 23.05.2 (#7056)
  ...
hotzenklotz committed May 17, 2023
2 parents c9ee334 + 9d85dbe commit 790a5bf
Showing 148 changed files with 3,812 additions and 1,991 deletions.
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -310,7 +310,7 @@ jobs:
yarn test-wkorg-screenshot
- store_artifacts:
path: frontend/javascripts/test/screenshots
path: frontend/javascripts/test/screenshots-wkorg

workflows:
version: 2
6 changes: 6 additions & 0 deletions CHANGELOG.released.md
@@ -7,6 +7,12 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
For upgrade instructions, please check the [migration guide](MIGRATIONS.released.md).

## [23.05.2](https://github.com/scalableminds/webknossos/releases/tag/23.05.2) - 2023-05-08
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.1...23.05.2)

### Fixed
- Fixed a bug where users could sometimes not access their own time tracking information. [#7055](https://github.com/scalableminds/webknossos/pull/7055)

## [23.05.1](https://github.com/scalableminds/webknossos/releases/tag/23.05.1) - 2023-05-02
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.0...23.05.1)

25 changes: 22 additions & 3 deletions CHANGELOG.unreleased.md
@@ -8,22 +8,41 @@ and this project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
For upgrade instructions, please check the [migration guide](MIGRATIONS.released.md).

## Unreleased
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.1...HEAD)
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.2...HEAD)

### Added
- Added segment groups so that segments can be organized in a hierarchy (similar to skeletons). [#6966](https://github.com/scalableminds/webknossos/pull/6966)
- In addition to drag and drop, the selected tree(s) in the Skeleton tab can also be moved into another group by right-clicking the target group and selecting "Move selected tree(s) here". [#7005](https://github.com/scalableminds/webknossos/pull/7005)
- Added a machine-learning based quick select mode. Activate it via the "AI" button in the toolbar after selecting the quick-select tool. [#7051](https://github.com/scalableminds/webknossos/pull/7051)
- Added support for remote datasets encoded with [brotli](https://datatracker.ietf.org/doc/html/rfc7932). [#7041](https://github.com/scalableminds/webknossos/pull/7041)
- Teams can be edited more straightforwardly in a popup on the team edit page. [#7043](https://github.com/scalableminds/webknossos/pull/7043)
- Annotations with Editable Mappings (a.k.a. Supervoxel Proofreading) can now be merged. [#7026](https://github.com/scalableminds/webknossos/pull/7026)
- The file size and inode count of artifacts are now aggregated and shown in the Voxelytics workflow list. [#7071](https://github.com/scalableminds/webknossos/pull/7071)

### Changed
- Loading of precomputed meshes got significantly faster (especially when using a mesh file for an oversegmentation with an applied agglomerate mapping). [#7001](https://github.com/scalableminds/webknossos/pull/7001)
- Improved speed of proofreading by only reloading affected areas after a split or merge. [#7050](https://github.com/scalableminds/webknossos/pull/7050)
- The minimum length of layer names in datasets was lowered from 3 to 1, enabling single-character layer names. [#7064](https://github.com/scalableminds/webknossos/pull/7064)
- Improved performance for large meshes, especially when loaded from a precomputed oversegmentation mesh file. [#7077](https://github.com/scalableminds/webknossos/pull/7077)

### Fixed
- Fixed that changing a segment color could lead to a crash. [#7000](https://github.com/scalableminds/webknossos/pull/7000)
- The fill_value property of zarr datasets is now used when it is not a number. [#7017](https://github.com/scalableminds/webknossos/pull/7017)
- Fixed a bug that made downloads of public annotations fail occasionally. [#7025](https://github.com/scalableminds/webknossos/pull/7025)
- Fixed the layout of used storage space on the organization page. [#7034](https://github.com/scalableminds/webknossos/pull/7034)
- Fixed a bug where updating skeleton annotation tree group visibility would break the annotation. [#7037](https://github.com/scalableminds/webknossos/pull/7037)
- Fixed importing Neuroglancer Precomputed datasets that have a voxel offset in their header. [#7019](https://github.com/scalableminds/webknossos/pull/7019)
- Fixed an bug where invalid email addresses were not readable in dark mode on the login/signup pages. [#7052](https://github.com/scalableminds/webknossos/pull/7052)
- Fixed importing Neuroglancer Precomputed datasets that have a voxel offset in their header. [#7019](https://github.com/scalableminds/webknossos/pull/7019), [#7062](https://github.com/scalableminds/webknossos/pull/7062)
- Fixed reading Neuroglancer Precomputed datasets with non-integer resolutions. [#7041](https://github.com/scalableminds/webknossos/pull/7041)
- Fixed a bug where invalid email addresses were not readable in dark mode on the login/signup pages. [#7052](https://github.com/scalableminds/webknossos/pull/7052)
- Fixed a bug where users could sometimes not access their own time tracking information. [#7055](https://github.com/scalableminds/webknossos/pull/7055)
- Fixed a bug in the wallTime calculation of the Voxelytics reports. [#7059](https://github.com/scalableminds/webknossos/pull/7059)
- Fixed a bug where thumbnails and raw data requests with non-bucket-aligned positions would show data at slightly wrong positions. [#7058](https://github.com/scalableminds/webknossos/pull/7058)
- Avoid crashes when exporting big STL files. [#7074](https://github.com/scalableminds/webknossos/pull/7074)
- Fixed rare rendering bug for datasets with multiple layers and differing magnifications. [#7066](https://github.com/scalableminds/webknossos/pull/7066)
- Fixed a bug where duplicating annotations with Editable Mappings could lead to a server-side endless loop. [#7026](https://github.com/scalableminds/webknossos/pull/7026)
- Fixed the datasource-properties.json route for zarr streaming export of datasets that are not wkw/zarr. [#7065](https://github.com/scalableminds/webknossos/pull/7065)
- Fixed an issue where you could no longer invite users to your organization even though you had space left. [#7078](https://github.com/scalableminds/webknossos/pull/7078)
- Fixed displayed units of used storage in the organization's overview page. [#7057](https://github.com/scalableminds/webknossos/pull/7057)

### Removed

5 changes: 3 additions & 2 deletions DEV_INSTALL.md
@@ -25,6 +25,7 @@ For non-localhost deployments, check out the installation guide in the documentation.
* [PostgreSQL 10+](https://www.postgresql.org/)
* [Redis 5+](https://redis.io/)
* [Blosc](https://github.com/Blosc/c-blosc)
* [Brotli](https://github.com/google/brotli)
* [node.js 16 or 18](http://nodejs.org/download/)
* [yarn package manager](https://yarnpkg.com/)
* [git](http://git-scm.com/downloads)
@@ -41,7 +42,7 @@ arch -x86_64 /bin/zsh
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install git, node.js, postgres, sbt, gfind, gsed
brew install openjdk@14 openssl git node postgresql sbt findutils coreutils gnu-sed redis yarn
brew install openjdk@14 openssl git node postgresql sbt findutils coreutils gnu-sed redis yarn c-blosc brotli

# Set env variables for openjdk and openssl
# You probably want to add these lines manually to avoid conflicts in your zshrc
@@ -83,7 +84,7 @@ curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list

sudo apt update
sudo apt install -y nodejs git postgresql postgresql-client scala sbt openjdk-11-jdk yarn redis-server build-essential libblosc1
sudo apt install -y nodejs git postgresql postgresql-client scala sbt openjdk-11-jdk yarn redis-server build-essential libblosc1 libbrotli1
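
# Optional: quick check that the Blosc and Brotli shared libraries are present
# (assumes a dpkg-based system such as Ubuntu/Debian)
dpkg -s libblosc1 libbrotli1 | grep -E '^(Package|Status)'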

# Assign a password to PostgreSQL user
sudo -u postgres psql -c "ALTER USER postgres WITH ENCRYPTED PASSWORD 'postgres';"
2 changes: 1 addition & 1 deletion Dockerfile
@@ -2,7 +2,7 @@ FROM eclipse-temurin:11
ARG VERSION_NODE="16.x"

RUN curl -sL "https://deb.nodesource.com/setup_${VERSION_NODE}" | bash - \
&& apt-get -y install libblosc1 postgresql-client git nodejs \
&& apt-get -y install libblosc1 libbrotli1 postgresql-client git nodejs \
&& rm -rf /var/lib/apt/lists/*

RUN mkdir -p /webknossos
6 changes: 6 additions & 0 deletions MIGRATIONS.released.md
@@ -6,6 +6,12 @@ See `MIGRATIONS.unreleased.md` for the changes which are not yet part of an official release.
This project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
User-facing changes are documented in the [changelog](CHANGELOG.released.md).

## [23.05.2](https://github.com/scalableminds/webknossos/releases/tag/23.05.2) - 2023-05-08
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.1...23.05.2)

### Postgres Evolutions:
None.

## [23.05.1](https://github.com/scalableminds/webknossos/releases/tag/23.05.1) - 2023-05-02
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.0...23.05.1)

6 changes: 4 additions & 2 deletions MIGRATIONS.unreleased.md
@@ -6,8 +6,10 @@ This project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
User-facing changes are documented in the [changelog](CHANGELOG.released.md).

## Unreleased
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.1...HEAD)
[Commits](https://github.com/scalableminds/webknossos/compare/23.05.2...HEAD)
- FossilDB needs to be opened with new additional column families editableMappingsInfo, editableMappingsAgglomerateToGraph, editableMappingsSegmentToAgglomerate.
- For instances with existing editable mapping (a.k.a. supervoxel proofreading) annotations: To keep those annotations alive, a Python migration has to be run with access to your tracingstore’s FossilDB. It is recommended to do this during a webknossos downtime to avoid data loss. It needs Python 3.8+ and the pip packages installable by `pip install grpcio-tools grpcio-health-checking`. Run it with `python tools/migrate-editable-mappings/migrate-editable-mappings.py -v -w -o localhost:7155`. Omit -o for a faster migration but no access to older versions of the editable mappings. The migration is idempotent. A sketch of these steps follows below this list.
- The datastore now needs `brotli`. For Debian-based systems, this can be installed with `apt-get install libbrotli1`.
- New FossilDB version 0.1.23 (`master__448` on Dockerhub) is required, compare [FossilDB PR](https://github.com/scalableminds/fossildb/pull/38).
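
A minimal sketch of the migration steps described above, assuming a host with Docker, Python 3.8+, and FossilDB reachable on `localhost:7155`; the `scalableminds/fossildb` image name is an assumption derived from the Dockerhub tag mentioned in the last item, not something stated in this diff.

```bash
# Pull the required FossilDB build (image name assumed from the Dockerhub tag above)
docker pull scalableminds/fossildb:master__448

# Install the Python dependencies needed by the migration script (Python 3.8+)
pip install grpcio-tools grpcio-health-checking

# Run the migration against the tracingstore's FossilDB. The migration is
# idempotent, so it can safely be re-run. Omit -o for a faster run without
# access to older versions of the editable mappings.
python tools/migrate-editable-mappings/migrate-editable-mappings.py -v -w -o localhost:7155
```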

### Postgres Evolutions:
3 changes: 2 additions & 1 deletion app/controllers/AnnotationIOController.scala
@@ -365,14 +365,15 @@ Expects:
} yield result
}

// TODO: select versions per layer
private def downloadExplorational(annotationId: String,
typ: String,
issuingUser: Option[User],
skeletonVersion: Option[Long],
volumeVersion: Option[Long],
skipVolumeData: Boolean)(implicit ctx: DBAccessContext) = {

// Note: volumeVersion cannot currently be supplied per layer, see https://github.com/scalableminds/webknossos/issues/5925

def skeletonToTemporaryFile(dataSet: DataSet,
annotation: Annotation,
organizationName: String): Fox[TemporaryFile] =
62 changes: 58 additions & 4 deletions app/controllers/DataSetController.scala
@@ -5,7 +5,12 @@ import com.scalableminds.util.accesscontext.{DBAccessContext, GlobalAccessContext}
import com.scalableminds.util.geometry.{BoundingBox, Vec3Int}
import com.scalableminds.util.time.Instant
import com.scalableminds.util.tools.{Fox, JsonHelper, Math}
import com.scalableminds.webknossos.datastore.models.datasource.{DataLayer, DataLayerLike, GenericDataSource}
import com.scalableminds.webknossos.datastore.models.datasource.{
DataLayer,
DataLayerLike,
ElementClass,
GenericDataSource
}
import io.swagger.annotations._
import models.analytics.{AnalyticsService, ChangeDatasetSettingsEvent, OpenDatasetEvent}
import models.binary._
@@ -20,7 +25,7 @@ import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.functional.syntax._
import play.api.libs.json._
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
import utils.ObjectId
import utils.{ObjectId, WkConf}

import javax.inject.Inject
import scala.collection.mutable.ListBuffer
@@ -42,17 +47,27 @@ object DatasetUpdateParameters extends TristateOptionJsonHelper {
Json.configured(tristateOptionParsing).format[DatasetUpdateParameters]
}

case class SegmentAnythingEmbeddingParameters(
mag: Vec3Int,
boundingBox: BoundingBox
)

object SegmentAnythingEmbeddingParameters {
implicit val jsonFormat: Format[SegmentAnythingEmbeddingParameters] = Json.format[SegmentAnythingEmbeddingParameters]
}

@Api
class DataSetController @Inject()(userService: UserService,
userDAO: UserDAO,
dataSetService: DataSetService,
dataSetDataLayerDAO: DataSetDataLayerDAO,
dataStoreDAO: DataStoreDAO,
dataSetLastUsedTimesDAO: DataSetLastUsedTimesDAO,
organizationDAO: OrganizationDAO,
teamDAO: TeamDAO,
wKRemoteSegmentAnythingClient: WKRemoteSegmentAnythingClient,
teamService: TeamService,
dataSetDAO: DataSetDAO,
conf: WkConf,
analyticsService: AnalyticsService,
mailchimpClient: MailchimpClient,
exploreRemoteLayerService: ExploreRemoteLayerService,
@@ -161,7 +176,7 @@ class DataSetController @Inject()(userService: UserService,
reportMutable += "Error when exploring as layer set: Resulted in zero layers."
None
case f: Failure =>
reportMutable += s"Error when exploring as layer set: ${exploreRemoteLayerService.formatFailureForReport(f)}"
reportMutable += s"Error when exploring as layer set: ${Fox.failureChainAsString(f)}"
None
case Empty =>
reportMutable += "Error when exploring as layer set: Empty"
@@ -520,4 +535,43 @@ Expects:
case _ => Messages("dataSet.notFoundConsiderLogin", dataSetName)
}

@ApiOperation(hidden = true, value = "")
def segmentAnythingEmbedding(organizationName: String,
dataSetName: String,
dataLayerName: String,
intensityMin: Option[Float],
intensityMax: Option[Float]): Action[SegmentAnythingEmbeddingParameters] =
sil.SecuredAction.async(validateJson[SegmentAnythingEmbeddingParameters]) { implicit request =>
log() {
for {
_ <- bool2Fox(conf.Features.segmentAnythingEnabled) ?~> "segmentAnything.notEnabled"
_ <- bool2Fox(conf.SegmentAnything.uri.nonEmpty) ?~> "segmentAnything.noUri"
dataset <- dataSetDAO.findOneByNameAndOrganizationName(dataSetName, organizationName) ?~> notFoundMessage(
dataSetName) ~> NOT_FOUND
dataSource <- dataSetService.dataSourceFor(dataset) ?~> "dataSource.notFound" ~> NOT_FOUND
usableDataSource <- dataSource.toUsable ?~> "dataSet.notImported"
dataLayer <- usableDataSource.dataLayers.find(_.name == dataLayerName) ?~> "dataSet.noLayers"
datastoreClient <- dataSetService.clientFor(dataset)(GlobalAccessContext)
targetMagBbox: BoundingBox = request.body.boundingBox / request.body.mag
_ <- bool2Fox(targetMagBbox.dimensions.sorted == Vec3Int(1, 1024, 1024)) ?~> s"Target-mag bbox must be sized 1024×1024×1 (or transposed), got ${targetMagBbox.dimensions}"
data <- datastoreClient.getLayerData(organizationName,
dataset,
dataLayer.name,
request.body.boundingBox,
request.body.mag) ?~> "segmentAnything.getData.failed"
_ = logger.debug(
s"Sending ${data.length} bytes to SAM server, element class is ${dataLayer.elementClass}, range: $intensityMin-$intensityMax...")
_ <- bool2Fox(
!(dataLayer.elementClass == ElementClass.float || dataLayer.elementClass == ElementClass.double) || (intensityMin.isDefined && intensityMax.isDefined)) ?~> "For float and double data, a supplied intensity range is required."
embedding <- wKRemoteSegmentAnythingClient.getEmbedding(
data,
dataLayer.elementClass,
intensityMin,
intensityMax) ?~> "segmentAnything.getEmbedding.failed"
_ = logger.debug(
s"Received ${embedding.length} bytes of embedding from SAM server, forwarding to front-end...")
} yield Ok(embedding)
}
}

}
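
For orientation, a hedged sketch of how the new segment-anything embedding endpoint might be called: the request-body fields (`mag`, `boundingBox`) and the `intensityMin`/`intensityMax` query parameters come from the controller above, while the URL path, the JSON encoding of `Vec3Int`/`BoundingBox`, and the auth header are illustrative assumptions that are not part of this diff. The bounding box is chosen so that, divided by the mag, it measures 1024×1024×1 as the controller requires; the intensity range is only mandatory for float/double layers.

```bash
# Illustrative only: route path, JSON field encoding, and auth header are assumptions.
curl -X POST \
  -H "Content-Type: application/json" \
  -H "X-Auth-Token: $WK_TOKEN" \
  -d '{
        "mag": [4, 4, 1],
        "boundingBox": { "topLeft": [2048, 2048, 512], "width": 4096, "height": 4096, "depth": 1 }
      }' \
  "https://<webknossos-host>/api/datasets/<organizationName>/<dataSetName>/layers/<dataLayerName>/segmentAnythingEmbedding?intensityMin=0&intensityMax=255"
```
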
2 changes: 1 addition & 1 deletion app/controllers/TimeController.scala
@@ -57,7 +57,7 @@ class TimeController @Inject()(userService: UserService,
userIdValidated <- ObjectId.fromString(userId)
user <- userService.findOneCached(userIdValidated) ?~> "user.notFound" ~> NOT_FOUND
isTeamManagerOrAdmin <- userService.isTeamManagerOrAdminOf(request.identity, user)
_ <- bool2Fox(isTeamManagerOrAdmin || user == request.identity) ?~> "user.notAuthorised" ~> FORBIDDEN
_ <- bool2Fox(isTeamManagerOrAdmin || user._id == request.identity._id) ?~> "user.notAuthorised" ~> FORBIDDEN
js <- loggedTimeForUserListByTimestamp(user, startDate, endDate)
} yield Ok(js)
}
8 changes: 6 additions & 2 deletions app/models/binary/DataSet.scala
@@ -671,8 +671,12 @@ class DataSetDataLayerDAO @Inject()(

def findAllForDataSet(dataSetId: ObjectId): Fox[List[DataLayer]] =
for {
rows <- run(DatasetLayers.filter(_._Dataset === dataSetId.id).result).map(_.toList)
rowsParsed <- Fox.combined(rows.map(parseRow(_, dataSetId)))
rows <- run(q"""SELECT _dataSet, name, category, elementClass, boundingBox, largestSegmentId, mappings,
defaultViewConfiguration, adminViewConfiguration
FROM webknossos.dataset_layers
WHERE _dataset = $dataSetId
ORDER BY name""".as[DatasetLayersRow])
rowsParsed <- Fox.combined(rows.toList.map(parseRow(_, dataSetId)))
} yield rowsParsed

private def insertLayerQuery(dataSetId: ObjectId, layer: DataLayer): SqlAction[Int, NoStream, Effect] =