Absolute distances from registration warp files? #346
Comments
note: the true distance is the composition of the affine + the warp, so you would have to compose them both if that is your case. see the antsApplyTransforms help. also see https://github.com/stnava/isa which has some other information that may be relevant |
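As a rough illustration of that composition step in ANTsR (a sketch only: the bundled r16/r64 sample images and the /tmp prefix are assumptions, and the reading of the `compose` argument is based on the antsApplyTransforms help):

```r
library(ANTsR)

# small 2D sample images shipped with ANTsR, used purely for illustration
fi <- antsImageRead(getANTsRData("r16"))
mi <- antsImageRead(getANTsRData("r64"))

# affine + deformable (SyN) registration
reg <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyN")

# compose the affine and the warp into a single displacement field;
# 'compose' writes the composed field to disk and returns its filename
composedFile <- antsApplyTransforms(fixed = fi, moving = mi,
                                    transformlist = reg$fwdtransforms,
                                    compose = "/tmp/fullWarp")
composedField <- antsImageRead(composedFile)
```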
Thanks! I'll look at the isa stuff too. Some basic questions about splitChannels:
...where: affine_plus_warp is a composition for a particular registration. Do I have the x, y, and z correctly labeled?
...where affine_plus_warp_absWarpDistances has the euclidean distances between corresponding points indicated by the registration, in original units (in my case, mm)? Thanks for any help pointing me in the right direction... -Tom |
yes - it is as simple as:
in 2D. same style for 3D. |
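A sketch of what such a 2D example might look like, using ANTsR's bundled r16/r64 sample slices (the data, output prefix, and variable names here are illustrative assumptions, not necessarily the code originally posted):

```r
library(ANTsR)

fi <- antsImageRead(getANTsRData("r16"))
mi <- antsImageRead(getANTsRData("r64"))
reg <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyN")

# compose affine + warp into one displacement field and read it back
composedFile <- antsApplyTransforms(fixed = fi, moving = mi,
                                    transformlist = reg$fwdtransforms,
                                    compose = "/tmp/fullWarp")
wobj <- antsImageRead(composedFile)

# split the vector field into its x and y displacement components
wchan <- splitChannels(wobj)
xdisp <- as.array(wchan[[1]])
ydisp <- as.array(wchan[[2]])

# per-voxel euclidean displacement in physical units (mm if spacing is in mm)
eucDist <- sqrt(xdisp^2 + ydisp^2)
mean(eucDist)
```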
I see that you wrote the composed warp file to /tmp, then read it back in with antsImageRead. Does this mean that I can't access the composed warp directly from within R, with something like (using your notation):
etc.? |
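If the antsApplyTransforms help is read that way, `compose` returns the filename of the composed transform it writes rather than an in-memory image, so the read-back appears to be the expected pattern. A small sketch, reusing fi, mi, and reg from the sketch above (the tempfile prefix is illustrative):

```r
# 'compose' hands back a character filename, not an antsImage,
# so antsImageRead is needed to get the composed field into R
composedFile <- antsApplyTransforms(fixed = fi, moving = mi,
                                    transformlist = reg$fwdtransforms,
                                    compose = tempfile("fullWarp"))
class(composedFile)                 # "character" - a path, not an image
composedField <- antsImageRead(composedFile)
```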
I believe (but could be wrong) that the calculation is not right... the numbers look very odd:
Given that the images have 1x1 spacing, these seem much too large to me: an average deviation of over 11 mm? Maybe my intuition is wrong? I tried the equivalent with a large 3D image, and got numbers - for the surface only - of 283.429 (presumably mm). The image is only 157x157x159 (voxel spacing of 1x1x1), so that value cannot be right. I'm wondering if the step of squaring and/or adding the squared images is not doing the right thing. I.e., this step:
|
The resulting affine transform shows that the translation in x and y is (4.5, 2.2), which probably explains the bulk of your results. If you look at just the deformable part, you get something that is probably closer to what you're seeing intuitively. |
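In ANTsR one might inspect both parts along these lines (a sketch: the indexing assumes the usual SyN output ordering, where fwdtransforms[1] is the warp field and fwdtransforms[2] the affine .mat, and the sample images are illustrative):

```r
library(ANTsR)

fi <- antsImageRead(getANTsRData("r16"))
mi <- antsImageRead(getANTsRData("r64"))
reg <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyN")

# inspect the affine part: for a 2D affine the last two parameters
# are the x and y translation
aff <- readAntsrTransform(reg$fwdtransforms[2], dimension = 2)
getAntsrTransformParameters(aff)

# distances from the deformable part only: the warp file is itself a
# displacement field, so it can be read and split on its own
warpField <- antsImageRead(reg$fwdtransforms[1])
wchan <- splitChannels(warpField)
defDist <- sqrt(as.array(wchan[[1]])^2 + as.array(wchan[[2]])^2)
mean(defDist)
```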
it's more likely that the registration result is not correct than that the code/core tools are getting distances wrong. anyway, as nick points out, it's important to decide what distances you actually want, e.g. with or without translation, rotation, global scaling, affine, etc. |
OK, so doing a rigid registration first, and using that as the starting point for the deformable registration, resulted in values that make sense. Thanks for all your input on this. Two related questions:
|
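For reference, a sketch of how the rigid-then-deformable staging might look in ANTsR (the "SyNOnly" / initialTransform pairing and the sample images are assumptions of this sketch, not taken from the comment above):

```r
library(ANTsR)

fi <- antsImageRead(getANTsRData("r16"))
mi <- antsImageRead(getANTsRData("r64"))

# stage 1: rigid alignment only
rig <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "Rigid")

# stage 2: deformable-only registration seeded with the rigid result
def <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyNOnly",
                        initialTransform = rig$fwdtransforms[1])

# def$fwdtransforms then holds the transform(s) from the deformable stage,
# which can be composed and measured as in the earlier sketches
```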
In case it might be useful to someone in the future (it's a pretty obscure question): I think I have figured out a way to address my second question, which was whether there is a way to differentiate (euclidean) distances that reflect moving points that were inside the fixed image space vs. moving points that were outside the fixed image space. The idea is to use the Maurer distance map (an option in the iMath function, which assigns negative values to internal voxels and positive values to external voxels of a binary image) applied to the fixed image. If we use antsApplyTransforms to warp this distance map into the fixed image space - as if it actually were in the moving image space (it is of course actually already in the fixed image space) - then the voxels of this image that 'end up' on the fixed image surface after warping will be negative if the corresponding moving image point was inside the fixed image space, but positive if the corresponding moving image point was outside the fixed image space. If we then replace all of these Maurer distances with -1 (if they are negative) or +1 (if they are positive) and multiply this 'mask' by the euclidean distance map image, all the euclidean distances that correspond to moving points that were inside the fixed image will be negative, and the euclidean distances that correspond to moving points that were outside the fixed image will be positive. |
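A sketch of that procedure in ANTsR (the binarization via getMask, the sample images, the /tmp prefix, and all variable names are illustrative assumptions):

```r
library(ANTsR)

fi <- antsImageRead(getANTsRData("r16"))
mi <- antsImageRead(getANTsRData("r64"))
reg <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyN")

# euclidean distance image from the composed (affine + warp) field, as earlier
composedFile <- antsApplyTransforms(fixed = fi, moving = mi,
                                    transformlist = reg$fwdtransforms,
                                    compose = "/tmp/fullWarp")
wchan <- splitChannels(antsImageRead(composedFile))
eucDist <- sqrt(as.array(wchan[[1]])^2 + as.array(wchan[[2]])^2)

# signed Maurer distance map of the fixed object: negative inside, positive outside
fixedMask <- getMask(fi)
maurer <- iMath(fixedMask, "MaurerDistance")

# warp the signed distance map as if it lived in the moving image space
warpedMaurer <- antsApplyTransforms(fixed = fi, moving = maurer,
                                    transformlist = reg$fwdtransforms)

# collapse the sign to -1/+1 and multiply by the euclidean distances, so
# distances from moving points inside the fixed object come out negative
# and those from points outside come out positive
signMask <- ifelse(as.array(warpedMaurer) < 0, -1, 1)
signedDist <- signMask * eucDist
```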
Once I have calculated a registration between two images, I know I can calculate Jacobians of the transformation at each voxel. However, I'd like to also calculate absolute distances for each voxel that these registrations map. I believe (from here: https://sourceforge.net/p/advants/discussion/840261/thread/da89aaf8/) that ANTs ConvertImage will spit out the x, y, and z values for each voxel into separate files, and from these I should be able to calculate euclidean distances for each voxel (if I understand correctly). Is there an equivalent to ConvertImage in ANTsR? Or another way to calculate distances from the warp files within ANTsR? Or will I need to spit out the warp file, use ConvertImage to separate it into x, y, and z files, and do basic math on these?
My apologies if I've missed a thread already discussing this...
-Tom
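For the in-R route discussed in the comments above, splitChannels plays the role that ConvertImage's component split plays on the command line; a minimal 3D-style sketch, with an illustrative filename:

```r
library(ANTsR)

# a composed displacement field previously written by
# antsApplyTransforms(..., compose = ...); the path is illustrative
warp <- antsImageRead("/tmp/composedWarp.nii.gz")

# one image per displacement component (x, y, z for a 3D field)
comp <- splitChannels(warp)

# per-voxel euclidean displacement in physical units (mm here)
d <- sqrt(as.array(comp[[1]])^2 + as.array(comp[[2]])^2 + as.array(comp[[3]])^2)
summary(as.vector(d))
```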