
Various fixes for composing datasets with landmarks #7992

Merged · 24 commits · Aug 22, 2024

Commits:
10fb4e4
use tps instead of affine in compose
philippotto Aug 14, 2024
d571e52
temporarily disable some CI checks
philippotto Aug 14, 2024
0f7eebb
always sort trees when composing datasets with landmarks
philippotto Aug 14, 2024
509b7db
add UI checkbox for affine vs tps
philippotto Aug 14, 2024
b624b58
fix that p tag cannot contain ul
philippotto Aug 14, 2024
ddf1d26
also sort nodes by id
philippotto Aug 16, 2024
5aae4d9
fix inverted TPS transform creation when loading dataset; add more co…
philippotto Aug 16, 2024
cef63e8
render dataset description as markdown in dashboard; update dataset d…
philippotto Aug 16, 2024
5a690e1
fix mean error formatting in description
philippotto Aug 16, 2024
61b4155
Merge branch 'master' of github.com:scalableminds/webknossos into com…
philippotto Aug 19, 2024
8fcc8ee
disable TPS checkbox for now
philippotto Aug 19, 2024
c6227b1
restore ci
philippotto Aug 19, 2024
7ccf57c
update changelog
philippotto Aug 19, 2024
e74a069
make optInfoOut writing independent from IS_TESTING
philippotto Aug 20, 2024
734d426
ask user for landmark augmentation when affine estimation fails; asse…
philippotto Aug 20, 2024
17f51d8
Merge branch 'master' of github.com:scalableminds/webknossos into com…
philippotto Aug 21, 2024
173e093
move allowThinPlateSplines into wk dev flags
philippotto Aug 21, 2024
c2a9a70
Update frontend/javascripts/admin/dataset/composition_wizard/04_confi…
philippotto Aug 21, 2024
ff5415b
move guardedWithErrorToast fn
philippotto Aug 21, 2024
0f00869
Merge branch 'compose-tps' of github.com:scalableminds/webknossos int…
philippotto Aug 21, 2024
5da255a
Merge branch 'master' into compose-tps
philippotto Aug 21, 2024
6bee6c5
format
philippotto Aug 21, 2024
c977210
Merge branch 'compose-tps' of github.com:scalableminds/webknossos int…
philippotto Aug 21, 2024
4bab1a9
fix cyclic dependency
philippotto Aug 22, 2024
1 change: 1 addition & 0 deletions CHANGELOG.unreleased.md
@@ -54,6 +54,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Fixed that skeleton groups couldn't be collapsed or expanded in locked annotations. [#7988](https://github.com/scalableminds/webknossos/pull/7988)
- Fixed that registering segments for a bounding box did only work if the segmentation had mag 1. [#8009](https://github.com/scalableminds/webknossos/pull/8009)
- Fixed uploading datasets in neuroglancer precomputed and n5 data format. [#8008](https://github.com/scalableminds/webknossos/pull/8008)
- Various fixes for composing datasets with landmarks. Note that the interpretation of correspondence points was inverted for thin plate splines. [#7992](https://github.com/scalableminds/webknossos/pull/7992)

### Removed

@@ -173,14 +173,17 @@ async function parseNmlFiles(fileList: FileList): Promise<Partial<WizardContext>
throw new SoftError("The two NML files should have the same tree count.");
}

for (const [tree1, tree2] of _.zip(Utils.values(trees1), Utils.values(trees2))) {
for (const [tree1, tree2] of _.zip(
Utils.values(trees1).sort((a, b) => a.treeId - b.treeId),
Utils.values(trees2).sort((a, b) => a.treeId - b.treeId),
)) {
if (tree1 == null || tree2 == null) {
// Satisfy TS. This should not happen, as we checked before that both tree collections
// have the same size.
throw new SoftError("A tree was unexpectedly parsed as null. Please try again");
}
const nodes1 = Array.from(tree1.nodes.values());
const nodes2 = Array.from(tree2.nodes.values());
const nodes1 = Array.from(tree1.nodes.values()).sort((a, b) => a.id - b.id);
const nodes2 = Array.from(tree2.nodes.values()).sort((a, b) => a.id - b.id);
for (const [node1, node2] of _.zip(nodes1, nodes2)) {
if ((node1 == null) !== (node2 == null)) {
throw new SoftError(
@@ -194,15 +197,19 @@
}
}

if (sourcePoints.length < 3) {
throw new SoftError("Each file should contain at least 3 nodes.");
}

const datasets = await tryToFetchDatasetsByName(
[datasetName1, datasetName2],
"Could not derive datasets from NML. Please specify these manually.",
);

return {
datasets: datasets || [],
sourcePoints,
targetPoints,
sourcePoints, // The first dataset (will be transformed to match the second later)
targetPoints, // The second dataset (won't be transformed by default)
currentWizardStep: "SelectDatasets",
};
}
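The sorting fixes in this hunk matter because the iteration order of parsed trees follows insertion order, which may differ between the two NML files even when the ids match. A minimal self-contained sketch (hypothetical data, not WEBKNOSSOS code) of the pairing logic:

```typescript
// Hypothetical minimal example of why both tree collections are sorted by id
// before zipping: without sorting, zipping insertion-ordered lists can pair
// mismatched trees and thus produce wrong landmark correspondences.
type Tree = { treeId: number };

function pairTreesById(trees1: Tree[], trees2: Tree[]): Array<[Tree, Tree]> {
  const byId = (a: Tree, b: Tree) => a.treeId - b.treeId;
  const sorted1 = [...trees1].sort(byId);
  const sorted2 = [...trees2].sort(byId);
  return sorted1.map((tree, i) => [tree, sorted2[i]]);
}

// Insertion order differs between the files, but the pairs agree on treeId.
const pairs = pairTreesById(
  [{ treeId: 2 }, { treeId: 1 }],
  [{ treeId: 1 }, { treeId: 2 }],
);
```

The same reasoning applies to the per-tree node lists, which the diff sorts by node id before zipping.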
@@ -48,7 +48,11 @@ export default function SelectDatasets({ wizardContext, setWizardContext }: Wiza

return (
<div>
<p>Select the datasets that you want to combine or doublecheck the pre-selected datasets.</p>
<p>
Select the datasets that you want to combine or doublecheck the pre-selected datasets. Note
that the order of the datasets is important and needs to be equal to the order of the files
from the upload.
</p>
<DatasetSelectionComponent
datasetValues={datasetValues}
setDatasetValues={setDatasetValues}
@@ -1,27 +1,37 @@
import { DeleteOutlined } from "@ant-design/icons";
import { createDatasetComposition } from "admin/admin_rest_api";
import { createDatasetComposition, updateDatasetPartial } from "admin/admin_rest_api";
import {
AllowedTeamsFormItem,
DatasetNameFormItem,
layerNameRules,
} from "admin/dataset/dataset_components";
import { Button, Col, Form, FormInstance, Input, List, Row, Tooltip } from "antd";
import { Button, Checkbox, Col, Form, FormInstance, Input, List, Modal, Row, Tooltip } from "antd";
import { FormItemWithInfo } from "dashboard/dataset/helper_components";
import FolderSelection from "dashboard/folders/folder_selection";
import { estimateAffineMatrix4x4 } from "libs/estimate_affine";
import Toast from "libs/toast";
import Toast, { guardedWithErrorToast } from "libs/toast";
import * as Utils from "libs/utils";
import _ from "lodash";
import messages from "messages";
import { Vector3 } from "oxalis/constants";
import { flatToNestedMatrix } from "oxalis/model/accessors/dataset_accessor";
import { OxalisState } from "oxalis/store";
import React, { useState } from "react";
import { useSelector } from "react-redux";
import { APIDataLayer, APIDataset, APIDatasetId, APITeam, LayerLink } from "types/api_flow_types";
import {
APIDataLayer,
APIDataset,
APIDatasetId,
APITeam,
areDatasetsIdentical,
LayerLink,
} from "types/api_flow_types";
import { syncValidator } from "types/validation";
import { WizardComponentProps } from "./common";
import { useEffectOnlyOnce } from "libs/react_hooks";
import { formatNumber } from "libs/format_utils";
import { checkLandmarksForThinPlateSpline } from "oxalis/model/helpers/transformation_helpers";
import { Vector3 } from "oxalis/constants";
import { WkDevFlags } from "oxalis/api/wk_dev";

const FormItem = Form.Item;

@@ -50,20 +60,9 @@ export function ConfigureNewDataset(props: WizardComponentProps) {
form.setFieldsValue({ layers: newLayers });
};

const handleTransformImport = async (sourcePoints: Vector3[], targetPoints: Vector3[]) => {
const datasets = linkedDatasets;
const transformationArr =
sourcePoints.length > 0 && targetPoints.length > 0
? [
{
type: "affine" as const,
matrix: flatToNestedMatrix(estimateAffineMatrix4x4(sourcePoints, targetPoints)),
},
]
: [];

const handleTransformImport = async () => {
const newLinks: LayerLink[] = (
_.flatMap(datasets, (dataset) =>
_.flatMap(linkedDatasets, (dataset) =>
dataset.dataSource.dataLayers.map((layer) => [dataset, layer]),
) as [APIDataset, APIDataLayer][]
).map(
@@ -74,21 +73,54 @@
},
sourceName: dataLayer.name,
newName: dataLayer.name,
transformations: dataset === datasets[0] ? transformationArr : [],
transformations: [],
}),
);
form.setFieldsValue({ layers: newLinks });
};

useEffectOnlyOnce(() => {
handleTransformImport(wizardContext.sourcePoints, wizardContext.targetPoints);
handleTransformImport();
});

const handleSubmit = async () => {
if (activeUser == null) {
throw new Error("Cannot create dataset without being logged in.");
}
const layers = form.getFieldValue(["layers"]);
const layersWithoutTransforms = form.getFieldValue(["layers"]) as LayerLink[];
const useThinPlateSplines = (form.getFieldValue("useThinPlateSplines") ?? false) as boolean;

const affineMeanError = { meanError: 0 };

function withTransforms(layers: LayerLink[], sourcePoints: Vector3[], targetPoints: Vector3[]) {
if (sourcePoints.length + targetPoints.length === 0) {
return layers;
}

const transformationArr = [
useThinPlateSplines
? {
type: "thin_plate_spline" as const,
correspondences: { source: sourcePoints, target: targetPoints },
}
: {
type: "affine" as const,
matrix: flatToNestedMatrix(
estimateAffineMatrix4x4(sourcePoints, targetPoints, affineMeanError),
),
},
];
if (useThinPlateSplines) {
checkLandmarksForThinPlateSpline(sourcePoints, targetPoints);
}
return layers.map((layer) => ({
...layer,
// The first dataset will be transformed to match the second.
transformations: areDatasetsIdentical(layer.datasetId, linkedDatasets[0])
? transformationArr
: [],
}));
}

const uploadableDatastores = props.datastores.filter((datastore) => datastore.allowsUpload);
const datastoreToUse = uploadableDatastores[0];
@@ -97,6 +129,35 @@
return;
}

let layersWithTransforms;
const { sourcePoints, targetPoints } = wizardContext;
try {
layersWithTransforms = withTransforms(layersWithoutTransforms, sourcePoints, targetPoints);
} catch (exception) {
const tryAugmentation = await new Promise((resolve) => {
Modal.confirm({
title: "Augment landmarks?",
content:
"The provided landmarks can't be used for affine estimation, possibly " +
"due to their planar nature. Should a constant translation in the Z " +
"direction be assumed, and the landmarks adjusted accordingly?",
onOk: () => resolve(true),
onCancel: () => resolve(false),
});
});
const augmentLandmarks = (points: Vector3[]) =>
points.concat(points.map((p) => [p[0], p[1], p[2] + 1]));
if (tryAugmentation) {
layersWithTransforms = withTransforms(
layersWithoutTransforms,
augmentLandmarks(sourcePoints),
augmentLandmarks(targetPoints),
);
} else {
throw exception;
}
}
Comment on lines +137 to +159 (Contributor):

I am afraid that always using z for "extrusion" won't cover all cases. If I annotate all my points in the zy viewport without moving in the x direction, the x-axis value of all points will be equal. In that case, moving the points along z would not fix the problem. If we want to catch all possible planes, we should calculate the orthogonal vector to the plane and move the nodes in that direction.

But I can also understand that this might be a bit too much for this PR and could be solved in a follow-up PR :)
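A rough sketch of the plane-normal approach suggested in this comment (hypothetical helpers, not part of this PR): estimate the normal of the plane spanned by the landmarks and extrude along it, instead of always shifting along z.

```typescript
// Hypothetical sketch: augment planar landmarks along the estimated plane
// normal rather than along the z axis, so any annotation plane is handled.
type Vector3 = [number, number, number];

function subtract(a: Vector3, b: Vector3): Vector3 {
  return [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
}

function cross(a: Vector3, b: Vector3): Vector3 {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

function normalize(v: Vector3): Vector3 {
  const len = Math.hypot(v[0], v[1], v[2]);
  // Fall back to the z axis for fully degenerate input (e.g. collinear points).
  return len === 0 ? [0, 0, 1] : [v[0] / len, v[1] / len, v[2] / len];
}

// Accumulates cross products of edges from the first point, with a
// consistent orientation, to get a stable normal of the landmark plane.
function estimatePlaneNormal(points: Vector3[]): Vector3 {
  const [origin, ...rest] = points;
  let normal: Vector3 = [0, 0, 0];
  for (let i = 0; i + 1 < rest.length; i++) {
    const c = cross(subtract(rest[i], origin), subtract(rest[i + 1], origin));
    const sign =
      normal[0] * c[0] + normal[1] * c[1] + normal[2] * c[2] < 0 ? -1 : 1;
    normal = [
      normal[0] + sign * c[0],
      normal[1] + sign * c[1],
      normal[2] + sign * c[2],
    ];
  }
  return normalize(normal);
}

// Analogous to the PR's augmentLandmarks, but extruding along the normal.
function augmentAlongNormal(points: Vector3[]): Vector3[] {
  const n = estimatePlaneNormal(points);
  return points.concat(
    points.map((p) => [p[0] + n[0], p[1] + n[1], p[2] + n[2]] as Vector3),
  );
}
```

This would cover planes of any orientation, at the cost of a slightly more involved (and itself degenerate-input-sensitive) normal estimation.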


const newDatasetName = form.getFieldValue(["name"]);
setIsLoading(true);
try {
@@ -105,8 +166,38 @@
targetFolderId: form.getFieldValue(["targetFolderId"]),
organizationName: activeUser.organization,
voxelSize: linkedDatasets.slice(-1)[0].dataSource.scale,
layers,
layers: layersWithTransforms,
});

const uniqueDatasets = _.uniqBy(
layersWithoutTransforms.map((layer) => layer.datasetId),
(id) => id.owningOrganization + "-" + id.name,
);
const datasetMarkdownLinks = uniqueDatasets
.map((el) => `- [${el.name}](/datasets/${el.owningOrganization}/${el.name})`)
.join("\n");

await updateDatasetPartial(
{ owningOrganization: activeUser.organization, name: newDatasetName },
{
description: [
"This dataset was composed from:",
datasetMarkdownLinks,
"",
"The layers were combined " +
(sourcePoints.length === 0
? "without any transforms"
: `with ${
useThinPlateSplines
? `Thin-Plate-Splines (${sourcePoints.length} correspondences)`
: `an affine transformation (mean error: ${formatNumber(
affineMeanError.meanError,
)} vx)`
}`) +
".",
].join("\n"),
},
);
} finally {
setIsLoading(false);
}
@@ -118,7 +209,7 @@ export function ConfigureNewDataset(props: WizardComponentProps) {
// Using Forms here only to validate fields and for easy layout
<div style={{ padding: 5 }}>
<p>Please configure the dataset that is about to be created.</p>
<Form form={form} layout="vertical" onFinish={handleSubmit}>
<Form form={form} layout="vertical" onFinish={() => guardedWithErrorToast(handleSubmit)}>
<Row gutter={8}>
<Col span={12}>
<DatasetNameFormItem activeUser={activeUser} />
@@ -180,6 +271,12 @@
);
}}
</Form.Item>
{WkDevFlags.datasetComposition.allowThinPlateSplines &&
wizardContext.sourcePoints.length > 0 && (
<FormItem name={["useThinPlateSplines"]} valuePropName="checked">
<Checkbox>Use Thin-Plate-Splines (Experimental)</Checkbox>
</FormItem>
)}

<FormItem
style={{
4 changes: 2 additions & 2 deletions frontend/javascripts/admin/dataset/dataset_upload_view.tsx
@@ -1011,7 +1011,7 @@ function FileUploadArea({
color: "var(--ant-color-primary)",
}}
/>
<p
<div
style={{
maxWidth: 800,
textAlign: "center",
@@ -1142,7 +1142,7 @@ function FileUploadArea({
</div>
</>
) : null}
</p>
</div>
</div>

{files.length > 0 ? (
@@ -682,7 +682,7 @@ class ExplorativeAnnotationsView extends React.PureComponent<Props, State> {
return (
<>
<div>
<UserOutlined />
<UserOutlined className="icon-margin-right" />
{ownerName}
</div>
<div className="flex-container">
3 changes: 2 additions & 1 deletion frontend/javascripts/dashboard/folders/details_sidebar.tsx
@@ -23,6 +23,7 @@ import { useSelector } from "react-redux";
import { OxalisState } from "oxalis/store";
import { getOrganization } from "admin/admin_rest_api";
import { useQuery } from "@tanstack/react-query";
import Markdown from "libs/markdown_adapter";

export function DetailsSidebar({
selectedDatasets,
@@ -148,7 +149,7 @@ function DatasetDetails({ selectedDataset }: { selectedDataset: APIDatasetCompac

<div style={{ marginBottom: 4 }}>
<div className="sidebar-label">Description</div>
<div>{fullDataset?.description}</div>
<Markdown>{fullDataset?.description}</Markdown>
</div>

<div style={{ marginBottom: 4 }}>
22 changes: 14 additions & 8 deletions frontend/javascripts/libs/estimate_affine.ts
@@ -3,8 +3,12 @@ import { Matrix4x4 } from "mjs";
import { Matrix, solve } from "ml-matrix";
import { Vector3 } from "oxalis/constants";

// Estimates an affine matrix that transforms from source points to target points.
export default function estimateAffine(sourcePoints: Vector3[], targetPoints: Vector3[]) {
export default function estimateAffine(
sourcePoints: Vector3[],
targetPoints: Vector3[],
optInfoOut?: { meanError: number },
) {
/* Estimates an affine matrix that transforms from source points to target points. */
// Number of correspondences
const N = sourcePoints.length;

@@ -30,12 +34,12 @@ export default function estimateAffine(sourcePoints: Vector3[], targetPoints: Ve
const xMatrix = solve(A, b);
const x = xMatrix.to1DArray();
const error = Matrix.sub(b, new Matrix(A).mmul(xMatrix)).to1DArray();
const meanError = _.mean(error.map((el) => Math.abs(el)));
if (optInfoOut) {
optInfoOut.meanError = meanError;
}
if (!process.env.IS_TESTING) {
console.log(
"Affine estimation error: ",
error,
`(mean=${_.mean(error.map((el) => Math.abs(el)))})`,
);
console.log("Affine estimation error: ", error, `(mean=${meanError})`);
}

const affineMatrix = new Matrix([
@@ -51,6 +55,8 @@ export default function estimateAffine(sourcePoints: Vector3[], targetPoints: Ve
export function estimateAffineMatrix4x4(
sourcePoints: Vector3[],
targetPoints: Vector3[],
optInfoOut?: { meanError: number },
): Matrix4x4 {
return estimateAffine(sourcePoints, targetPoints).to1DArray() as any as Matrix4x4;
/* Estimates an affine matrix that transforms from source points to target points. */
return estimateAffine(sourcePoints, targetPoints, optInfoOut).to1DArray() as any as Matrix4x4;
}
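The assembly of the linear system `A`/`b` is collapsed in this diff, but the general shape of the least-squares problem can be sketched standalone (a hypothetical illustration, not the exact `estimate_affine.ts` code): each point correspondence contributes three equations, one per output axis, and the twelve unknowns are the entries of the 3×4 affine matrix, so `solve(A, b)` is an overdetermined least-squares fit for N > 4 correspondences.

```typescript
// Hypothetical sketch of assembling A·x = b for a 3×4 affine transform.
// x holds the 12 unknown matrix entries in row-major order; each
// correspondence (s, t) yields the three equations [sx, sy, sz, 1]·row_k = t[k].
type Vector3 = [number, number, number];

function buildAffineSystem(sourcePoints: Vector3[], targetPoints: Vector3[]) {
  const A: number[][] = [];
  const b: number[] = [];
  sourcePoints.forEach((s, i) => {
    const t = targetPoints[i];
    for (let axis = 0; axis < 3; axis++) {
      // One equation per output axis; only the unknowns of that axis's
      // matrix row get nonzero coefficients.
      const row = new Array(12).fill(0);
      row[4 * axis + 0] = s[0];
      row[4 * axis + 1] = s[1];
      row[4 * axis + 2] = s[2];
      row[4 * axis + 3] = 1; // translation component
      A.push(row);
      b.push(t[axis]);
    }
  });
  return { A, b }; // shapes: (3N × 12) and (3N)
}
```

This also shows why the augmentation question above matters: if all source points lie in one plane, the columns of `A` become linearly dependent and the system has no unique solution.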
13 changes: 13 additions & 0 deletions frontend/javascripts/libs/toast.tsx
@@ -35,6 +35,19 @@ type ToastParams = {
details?: string;
};

export async function guardedWithErrorToast(fn: () => Promise<any>) {
try {
await fn();
} catch (error) {
import("libs/error_handling").then((_ErrorHandling) => {
const ErrorHandling = _ErrorHandling.default;
Toast.error("An unexpected error occurred. Please check the console for details");
console.error(error);
ErrorHandling.notify(error as Error);
});
}
}

const Toast = {
// The notificationAPI is designed to be a singleton spawned by the ToastContextMountRoot
// mounted in the GlobalThemeProvider.