Various fixes for composing datasets with landmarks #7992
Conversation
…escription after composing
```diff
@@ -722,7 +722,7 @@ function _getOriginalTransformsForLayerOrNull(
   } else if (type === "thin_plate_spline") {
     const { source, target } = transformation.correspondences;

-    return createThinPlateSplineTransform(target, source, dataset.dataSource.scale.factor);
+    return createThinPlateSplineTransform(source, target, dataset.dataSource.scale.factor);
```
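A minimal illustration of why swapping the landmark arguments inverts the resulting transform. This is not webKnossos code: the hypothetical 1-D least-squares fit below stands in for the TPS fit, but the argument-order convention is the same — fitting source→target vs. target→source yields mutually inverse mappings.

```typescript
// fitAffine1D(source, target) returns [a, b] such that target ≈ a * source + b,
// i.e. the fitted map takes source coordinates to target coordinates.
function fitAffine1D(source: number[], target: number[]): [number, number] {
  const n = source.length;
  const mx = source.reduce((s, v) => s + v, 0) / n;
  const my = target.reduce((s, v) => s + v, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (source[i] - mx) * (target[i] - my);
    den += (source[i] - mx) ** 2;
  }
  const a = num / den;
  // Swapping the arguments fits the inverse map instead:
  // fitAffine1D(s, t) -> y = 2x + 10, fitAffine1D(t, s) -> x = 0.5y - 5.
  return [a, my - a * mx];
}
```

With `source = [0, 1, 2]` and `target = [10, 12, 14]`, the forward fit is `[2, 10]` and the swapped fit is `[0.5, -5]`, which are inverses of each other — passing the arguments in the wrong order silently produces the inverse transform, exactly the bug fixed in the diff above.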
This was an error before. Since we don't have any TPS transforms in production, there shouldn't be any problems.
The code looks good to me but sadly I found something during testing locally:
When I used the following NMLs, I could not submit the dataset creation (using affine transformations), and the console showed the following error:
Here are the nmls I used:
l4_first_data_types_second.zip
The NML & dataset order was l4_sample first and then data_types.
Creating a TPS dataset with the exact same setup works, but showing the dataset crashes due to the same error.
Ok, with the help of @valentin-pinkau I was able to identify that the nodes must not all be in the same z slice. This fixes the matrix calculation (the error above). However, I am still unable to see anything with the test setup described above, but that might be due to using two completely different datasets 🙈
For the record: it turns out that the bbox of the dataset is the union of both layers, and thus the default view starts at the center of the "too large" bounding box, resulting in no data being shown initially. Using the "find my data" button solved the problem :)
…rt >= landmarks; test tps creation eagerly etc
@MichaelBuessemeyer please have a look at my latest commit :)
I only found minor things, but I also identified that pushing a copy of all points in the z direction does not cover all cases. Feel free to address this in a follow-up PR :)
Will retest now.
```typescript
const tryAugmentation = await new Promise((resolve) => {
  Modal.confirm({
    title: "Augment landmarks?",
    content:
      "The provided landmarks can't be used for affine estimation, possibly " +
      "due to their planar nature. Should a constant translation in the Z " +
      "direction be assumed, and the landmarks adjusted accordingly?",
    onOk: () => resolve(true),
    onCancel: () => resolve(false),
  });
});
const augmentLandmarks = (points: Vector3[]) =>
  points.concat(points.map((p) => [p[0], p[1], p[2] + 1]));
if (tryAugmentation) {
  layersWithTransforms = withTransforms(
    layersWithoutTransforms,
    augmentLandmarks(sourcePoints),
    augmentLandmarks(targetPoints),
  );
} else {
  throw exception;
}
```
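The modal in this snippet is only shown after the affine estimation has already thrown. A possible refinement (a hypothetical helper, not part of this PR; `Vector3` matches the tuple type used above) would be to detect the degenerate configuration up front by checking whether all landmarks are coplanar, using the scalar triple product of edge vectors:

```typescript
type Vector3 = [number, number, number];

const sub = (a: Vector3, b: Vector3): Vector3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

// Scalar triple product u · (v × w): the signed volume spanned by three edge vectors.
const tripleProduct = (u: Vector3, v: Vector3, w: Vector3): number =>
  u[0] * (v[1] * w[2] - v[2] * w[1]) +
  u[1] * (v[2] * w[0] - v[0] * w[2]) +
  u[2] * (v[0] * w[1] - v[1] * w[0]);

// Landmarks are degenerate for affine estimation if no four of them span a volume,
// i.e. every triple of edge vectors from the first point has (near-)zero volume.
function areLandmarksCoplanar(points: Vector3[], eps = 1e-6): boolean {
  if (points.length < 4) return true;
  const [origin, ...rest] = points;
  const edges = rest.map((p) => sub(p, origin));
  for (let i = 0; i < edges.length; i++)
    for (let j = i + 1; j < edges.length; j++)
      for (let k = j + 1; k < edges.length; k++)
        if (Math.abs(tripleProduct(edges[i], edges[j], edges[k])) > eps) return false;
  return true;
}
```

Calling such a check before `withTransforms` would let the wizard ask the user about augmentation proactively instead of reacting to the thrown exception.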
I am afraid that always using z for "extrusion" won't cover all cases. If I annotate all my points in the zy viewport without moving in the x direction, the x-axis value of all points will be equal. Thus, moving the points along z would not fix the problem here. If we want to catch all possible planes, we should calculate the orthogonal vector to the plane and move the nodes in that direction.
But I can also understand that this might be a bit too much for this PR and can be solved in a follow-up PR :)
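The follow-up idea could be sketched as below. This is hypothetical code, not part of this PR, and it assumes the landmarks are (close to) exactly coplanar: it derives the plane normal from a cross product of edge vectors and extrudes the duplicated landmarks along that normal, falling back to the z axis when no well-defined normal exists.

```typescript
type Vector3 = [number, number, number];

const sub = (a: Vector3, b: Vector3): Vector3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

const cross = (a: Vector3, b: Vector3): Vector3 => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];

// Duplicate the landmarks, shifted by one unit along the plane normal
// (instead of hard-coding the z direction as augmentLandmarks does).
function augmentAlongNormal(points: Vector3[]): Vector3[] {
  let normal: Vector3 = [0, 0, 1]; // fallback: plain z extrusion
  if (points.length >= 3) {
    // Among all edge-vector pairs from the first point, pick the cross
    // product with the largest magnitude for a numerically stable normal.
    let best = 0;
    for (let i = 1; i < points.length; i++) {
      for (let j = i + 1; j < points.length; j++) {
        const n = cross(sub(points[i], points[0]), sub(points[j], points[0]));
        const len = Math.hypot(n[0], n[1], n[2]);
        if (len > best) {
          best = len;
          normal = [n[0] / len, n[1] / len, n[2] / len];
        }
      }
    }
  }
  return points.concat(
    points.map((p): Vector3 => [p[0] + normal[0], p[1] + normal[1], p[2] + normal[2]]),
  );
}
```

For points annotated in the zy viewport (constant x), this shifts the copies along x rather than z, which is exactly the case the z-only extrusion misses.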
Testing went well :)
Feel free to merge; the z-only extrusion direction can be handled in another PR.
…gure_new_dataset.tsx Co-authored-by: MichaelBuessemeyer <[email protected]>
Thanks for the review + test!
Yes, I think this is fine, because the user is asked whether a constant translation in Z should be assumed. If this is not the case, they should decline and fix it themselves.
* use tps instead of affine in compose
* temporarily disable some CI checks
* always sort trees when composing datasets with landmarks
* add UI checkbox for affine vs tps
* fix that p tag cannot contain ul
* also sort nodes by id
* fix inverted TPS transform creation when loading dataset; add more comments
* render dataset description as markdown in dashboard; update dataset description after composing
* fix mean error formatting in description
* disable TPS checkbox for now
* restore ci
* update changelog
* make optInfoOut writing independent from IS_TESTING
* ask user for landmark augmentation when affine estimation fails; assert >= landmarks; test tps creation eagerly etc
* move allowThinPlateSplines into wk dev flags
* Update frontend/javascripts/admin/dataset/composition_wizard/04_configure_new_dataset.tsx Co-authored-by: MichaelBuessemeyer <[email protected]>
* move guardedWithErrorToast fn
* format
* fix cyclic dependency

Co-authored-by: MichaelBuessemeyer <[email protected]>