
Absolute distances from registration warp files? #346

Open
ptsii opened this issue Mar 30, 2021 · 10 comments

@ptsii
ptsii commented Mar 30, 2021

Once I have calculated a registration between two images, I know I can calculate Jacobians of the transformation at each voxel. However, I'd like to also calculate absolute distances for each voxel that these registrations map. I believe (from here: https://sourceforge.net/p/advants/discussion/840261/thread/da89aaf8/) that ANTs ConvertImage will spit out x, y, and z values for each voxel in separate files, and from these I should be able to calculate Euclidean distances for each voxel (if I understand correctly). However, is there an equivalent to ConvertImage in ANTsR? Or another way to calculate distances from the warp files within ANTsR? Or will I need to spit out the warp file, use ConvertImage to separate it into x, y, and z files, and do basic math on these?

My apologies if I've missed a thread already discussing this.

-Tom

@stnava
Member

stnava commented Mar 30, 2021

warp=antsImageRead("/tmp//RtmpmMgvkd/file5e4c44312a4b1Warp.nii.gz")
splitChannels(warp)

note: the true distance is the composition of the affine + the warp, so you would have to compose them both if that is your case. see antsApplyTransforms help.

also see https://github.com/stnava/isa which has some other information that may be relevant
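
For concreteness, a minimal sketch of that composition in ANTsR, assuming fi and mi are the fixed and moving antsImages (hypothetical names); the compose argument of antsApplyTransforms is used the same way in the examples further down this thread:

mytx <- antsRegistration(fixed = fi, moving = mi, typeofTransform = "SyN")
# compose all forward transforms (warp + affine) into a single displacement field
composedFile <- antsApplyTransforms(fi, mi, mytx$fwdtransforms, compose = "/tmp/composed.nii.gz")
components <- splitChannels(antsImageRead(composedFile))   # per-axis displacements, in physical units (mm)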

@ptsii
Author

ptsii commented Mar 31, 2021

Thanks! I'll look at the isa stuff too.

Some basic questions about splitChannels:

  • I see I can do something like:
affine_plus_warp_xvec <- splitChannels(affine_plus_warp)[[1]]
affine_plus_warp_yvec <- splitChannels(affine_plus_warp)[[2]]
affine_plus_warp_zvec <- splitChannels(affine_plus_warp)[[3]]

...where: affine_plus_warp is a composition for a particular registration. Do I have the x, y, and z correctly labeled?

  • Assuming I got that right, am I correct that the warp files identify vectors in the original units, such that:

affine_plus_warp_absWarpDistances <- sqrt(affine_plus_warp_xvec^2 + affine_plus_warp_yvec^2 + affine_plus_warp_zvec^2)

...where affine_plus_warp_absWarpDistances contains the Euclidean distances between corresponding points indicated by the registration, in original units (in my case, mm)?

Thanks for any help pointing me in the right direction...

-Tom

@stnava
Member

stnava commented Mar 31, 2021

yes - it is as simple as:

mytx <- antsRegistration(fixed=ri(1), moving=ri(2), typeofTransform = c('SyN') )
# fi / mi below are the fixed and moving images (here, ri(1) and ri(2))
fi = ri(1); mi = ri(2)
aa=antsApplyTransforms(fi,mi,mytx$fwdtransforms,compose="/tmp/temp.nii.gz" )
warp=splitChannels(antsImageRead(aa))
mag=sqrt( warp[[1]]^2 + warp[[2]]^2 )

in 2D. same style for 3D.
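
To spell out the 3-D case, a hedged sketch along the same lines, assuming fi3d and mi3d are 3-D antsImages (hypothetical names):

mytx3 <- antsRegistration(fixed = fi3d, moving = mi3d, typeofTransform = "SyN")
aa3 <- antsApplyTransforms(fi3d, mi3d, mytx3$fwdtransforms, compose = "/tmp/temp3d.nii.gz")
warp3 <- splitChannels(antsImageRead(aa3))
# per-voxel Euclidean displacement magnitude from the three components
mag3 <- sqrt(warp3[[1]]^2 + warp3[[2]]^2 + warp3[[3]]^2)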

@ptsii
Author

ptsii commented Mar 31, 2021

I see that you wrote the composed warp file to /tmp, then read it back in with antsImageRead. Does this mean that I can't access the composed warp directly from within R, with something like (using your notation):

aa=antsApplyTransforms(fi,mi,mytx$fwdtransforms,compose= 1 )
warp=splitChannels(aa)

etc.?

@ptsii
Author

ptsii commented Apr 1, 2021

I believe (but could be wrong) that the calculation is not right... the numbers look very odd:

> mytx <- antsRegistration(fixed=ri(1), moving=ri(2), typeofTransform = c('SyN') )
> aa=antsApplyTransforms(ri(1), ri(2),mytx$fwdtransforms,compose="/tmp/temp.nii.gz" )
> warp=splitChannels(antsImageRead(aa))
> mag=sqrt( warp[[1]]^2 + warp[[2]]^2 )
> mean(mag)
[1] 11.79986
> range(mag)
[1]  0.09793366 29.24368286
> sd(mag)
[1] 6.006566

Given that the voxel spacing of the images is 1x1 (mm), these seem much too large to me: an average displacement of over 11 mm? Maybe my intuition is wrong?

I tried the equivalent with a large 3D image, and got a value - for the surface only - of 283.429 (presumably mm). The image is only 157x157x159 voxels (voxel spacing of 1x1x1), so that value cannot be right. I'm wondering if the step of squaring and/or adding the squared images is not doing the right thing, i.e., this step:

> mag=sqrt( warp[[1]]^2 + warp[[2]]^2 )

@ptsii
Author

ptsii commented Apr 1, 2021

Here is what the 2 images look like, plus the mag results:

[image: r16_r27_mag]

I have a hard time believing the mean displacement is over 11 mm...

@ntustison
Member

The resulting affine transform

#Insight Transform File V1.0
#Transform 0
Transform: AffineTransform_double_2_2
Parameters: 0.951479434967041 0.05314630642533302 -0.15686210989952087 1.020974040031433 4.509864330291748 2.1678619384765625
FixedParameters: 125.27064514160156 129.2100067138672

shows that the translation in x and y is (4.5, 2.2), which probably explains the bulk of your results.

If you look at just the deformable part:

> warp=splitChannels(antsImageRead(mytx$fwdtransforms[1]))
> mag=sqrt( warp[[1]]^2 + warp[[2]]^2 )
> mean( mag )
[1] 0.8930398

you get something that is probably closer to what you're seeing intuitively.
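
A hedged way to check that translation programmatically (this assumes the second forward transform is the 2-D affine .mat file, as in the SyN output above, and that the readAntsrTransform / getAntsrTransformParameters helpers are available in your ANTsR version):

aff <- readAntsrTransform(mytx$fwdtransforms[2], dimension = 2)
params <- getAntsrTransformParameters(aff)
# ITK stores the 2x2 matrix entries first and the translation last, so the last
# two values are the (x, y) translation shown in the transform file above
translation <- tail(params, 2)
translation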

@stnava
Member

stnava commented Apr 1, 2021

it's more likely that the registration result is not correct than that the code/core tools are getting distances wrong.

anyway, as nick points out, it's important to decide what distances you actually want, e.g. with or without translation, rotation, global scaling, affine, etc.
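
As one illustration of that choice, here is a hedged sketch of computing the magnitude without the global translation: zero out the translation part of the affine, then compose it with the warp. It assumes the 2-D SyN outputs used above and the ANTsR transform helpers (readAntsrTransform, setAntsrTransformParameters, writeAntsrTransform):

aff <- readAntsrTransform(mytx$fwdtransforms[2], dimension = 2)
p <- getAntsrTransformParameters(aff)
p[5:6] <- 0                              # zero the 2-D translation component
setAntsrTransformParameters(aff, p)      # assumes this updates the transform in place
noTransFile <- tempfile(fileext = ".mat")
writeAntsrTransform(aff, noTransFile)
# compose the warp with the translation-free affine and take the per-voxel magnitude
aa2 <- antsApplyTransforms(ri(1), ri(2), c(mytx$fwdtransforms[1], noTransFile), compose = "/tmp/tempNoTrans.nii.gz")
w2 <- splitChannels(antsImageRead(aa2))
magNoTrans <- sqrt(w2[[1]]^2 + w2[[2]]^2)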

@ptsii
Author

ptsii commented Apr 15, 2021

OK, so doing a rigid registration first, and using that as the starting point for the deformation registration, resulted in values that make sense. Thanks for all your input on this.

Two related questions:

  1. Is there a way to extract the rigid registration part from a deformable registration without having to do it in two steps (rigid first, create the rotated image, then do a deformable registration)? In other words, is it possible to go straight to a deformable registration (e.g., antsRegistration using SyNRA) but then compose only the warp and affine components (and not any rigid part)? Just curious.

  2. Is there a way to differentiate distances where the moving image point was inside the fixed image from distances where the moving image point was outside the fixed image? For example, label all of the first type with negative values (distances) and all of the second type with positive values (or vice versa)? I'm interested in visualizing where the two images differ and in which direction.

@ptsii
Author

ptsii commented Apr 28, 2021

In case it might be useful to someone in the future (it's a pretty obscure question): I think I have figured out a way to address my second question, which was whether there is a way to differentiate (Euclidean) distances that reflect moving points that were inside the fixed image space from those that reflect moving points that were outside it.

The idea is to apply the Maurer distance map (an option in the iMath function, which assigns negative values to internal voxels and positive values to external voxels of a binary image) to the fixed image. If we use antsApplyTransforms to warp this distance map into the fixed image space - as if it actually were in the moving image space (it is, of course, actually already in the fixed image space) - then the voxels of this image that 'end up' on the fixed image surface after warping will be negative if the corresponding moving image point was inside the fixed image space, but positive if it was outside. Replacing these Maurer distances with -1 (if negative) or +1 (if positive), and then multiplying this 'mask' by the Euclidean distance map image, makes all the Euclidean distances that correspond to moving points inside the fixed image negative, and those that correspond to moving points outside the fixed image positive. A sketch of this in code follows below.
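
A hedged sketch of this idea in ANTsR, assuming fixedMask is a binary mask of the fixed image, mag is the Euclidean-distance image computed earlier in the thread, and the iMath "MaurerDistance" option behaves as described (negative inside, positive outside):

maurer <- iMath(fixedMask, "MaurerDistance")      # signed distance map of the fixed mask
# warp the fixed-space distance map through the registration, treating it as if it
# were a moving image, so its (signed) values land on the fixed image surface
maurerWarped <- antsApplyTransforms(fixedMask, maurer, mytx$fwdtransforms)
outside <- thresholdImage(maurerWarped, 0, max(maurerWarped))   # 1 where the moving point was outside
insideOutside <- outside * 2 - 1                  # +1 outside, -1 inside
signedMag <- mag * insideOutside                  # signed Euclidean distances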
