Implement Zarr v3 and sharding codec #7079
Conversation
Consider me impressed! This is good stuff; I only added minor comments to the code. Thanks for getting through all of this :-)

Some things I noticed during testing:

- Got an execution exception while exploring https://static.webknossos.org/data/zarr_v3/l4_sample_no_sharding/color (which is not the right uri, it should have been https://static.webknossos.org/data/zarr_v3/l4_sample_no_sharding/color/1), but this should be a Fox.failure so the other explorers can be tried.
- I get different segment ids (pretty large numbers) for https://static.webknossos.org/data/zarr_v3/l4dense_motta_et_al_demo/segmentation/1, as compared to https://webknossos.org/datasets/scalable_minds/l4dense_motta_et_al_demo/view#2786,4254,1706,0,1.074. (Segment positions seem right, but the ids are different. Possibly something with endianness? Or could it be because there is no ZarrV3SegmentationLayer case class yet?) (Edit: looks like this is a problem of the written dataset, not wk reading it.)
- Minor: Explore does not work for https://static.webknossos.org/data/zarr_v3/l4_sample_no_sharding/color/1/zarr.json (I think the other explorers can handle the uri to the json file — there .zarray — being supplied directly?)
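The first point above (an exception should become a Fox.failure so the remaining explorers can be tried) can be sketched as follows, using scala.util.Try as a stand-in for the Fox monad; the function name is hypothetical, not the actual explorer API.

```scala
import scala.util.{Failure, Try}

// Sketch of the intended behavior: an explorer that throws is converted
// into a recoverable failure, so the next explorer in the chain is tried.
def tryExplorers[A](explorers: List[() => A]): Try[A] =
  explorers.foldLeft[Try[A]](Failure(new Exception("no explorer succeeded"))) {
    (acc, explorer) => acc.orElse(Try(explorer()))
  }
```

Because orElse takes its argument by name, later explorers are only invoked if all earlier ones failed.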
Resolved review threads (some on outdated code):

- ...astore/app/com/scalableminds/webknossos/datastore/dataformats/zarr/v3/ZarrV3DataLayers.scala
- ...re/app/com/scalableminds/webknossos/datastore/datareaders/precomputed/PrecomputedArray.scala
- ...tastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/ZarrV3ChunkReader.scala
- ...datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/ZarrArrayHeader.scala (three threads)
- ...sos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/ZarrV3Array.scala
- webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/Codecs.scala
The wrong segment ids are an endian issue. Where would be the best place to switch the endianness?
Thanks for addressing the feedback! As discussed in person, the endianness issue for the segmentation layer may be in the data; we should re-check that. I "unresolved" two conversations above that I think are still open, and turned some notes into checkboxes in my last comment. Could you have a look at those? I'd say this is already very close to complete :)
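As a side note on the endianness question: decoding raw chunk bytes as uint64 segment ids depends entirely on the byte order chosen at read time. The sketch below is illustrative only (the function name is hypothetical, not the actual datastore API), showing how a little-endian id read with the wrong order becomes one of the "pretty large numbers" mentioned above.

```scala
import java.nio.{ByteBuffer, ByteOrder}

// Illustrative sketch (not the actual datastore code): interpret a raw
// byte array as an array of 64-bit segment ids using the given byte order.
def readSegmentIds(bytes: Array[Byte], order: ByteOrder): Array[Long] = {
  val longs = new Array[Long](bytes.length / 8)
  ByteBuffer.wrap(bytes).order(order).asLongBuffer().get(longs)
  longs
}
```

For example, the little-endian encoding of id 1 read as big-endian yields 2^56, which matches the symptom of implausibly large segment ids.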
Niice!
I took the liberty of changing a few lines in the json validation, where I still got bare exceptions. This works for me now :)
If more issues come up with future datasets we can still fix them then.
Good to go!
Merged 'master' of github.com:scalableminds/webknossos into …esign-right-sidebar:

- Create bounding box by dragging with box tool (#7118)
- Prevent 'negative' buckets from being created (#7124)
- Lazy load onnx and canvas2html module (#7121)
- Disable editing of super voxel skeletons in skeleton mode (#7086)
- add missing evolution to migration guide (#7126)
- Change sttp backend to HttpURLConnectionBackend (#7125)
- Implement Zarr v3 and sharding codec (#7079)
- Fix decompression of garbage data after valid gzip data causing decompression to fail (#7119)
- When scanning volume buckets, skip those with unparseable key (#7115)
URL of deployed dev instance (used for testing):
Steps to test:
Questions
- Name classes zarr3 or zarrV3? → zarr3
- Storage transformers: there are currently no storage transformers proposed (or I didn't find them). Should this implementation already allow for these, or is that not relevant? → Not relevant
- Are there any more codecs that work like sharding (wrapping other codecs in its implementation), or is it just that one? If this is the only one (that should be supported anyway), it could be treated as a special case; otherwise, codec parsing and handling probably needs to be overhauled significantly. → Current assumption is to only support one layer of sharding

Todos
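The codec question above (sharding wrapping other codecs, with only one level supported) could be sketched like this. All names are hypothetical and not the actual Codecs.scala API; the point is that the "one layer of sharding" assumption can be enforced at construction time rather than handled by general recursive codec parsing.

```scala
import scala.util.Try

// Hypothetical sketch: a sharding codec wraps a list of inner codecs,
// and nested sharding is rejected up front.
sealed trait Codec {
  def decode(bytes: Array[Byte]): Array[Byte]
}

case object IdentityCodec extends Codec {
  def decode(bytes: Array[Byte]): Array[Byte] = bytes
}

final case class ShardingCodec(inner: List[Codec]) extends Codec {
  require(!inner.exists(_.isInstanceOf[ShardingCodec]),
    "only one level of sharding is supported")
  // Inner codecs are applied innermost-first on decode.
  def decode(bytes: Array[Byte]): Array[Byte] =
    inner.foldRight(bytes)((codec, acc) => codec.decode(acc))
}
```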
Issues:
(Please delete unneeded items, merge only when none are left open)