# Getting started
This tutorial illustrates the basics of using the COMMIT framework to evaluate the evidence of a tractogram.
Download and extract the example dataset from the following ZIP archive, which contains the following files:

- `DWI.nii.gz`: a diffusion MRI dataset with 100 measurements distributed over 2 shells, at b=700 s/mm^2 and b=2000 s/mm^2;
- `bvals.txt` and `bvecs.txt`: the corresponding acquisition scheme;
- `peaks.nii.gz`: the main diffusion orientations estimated with CSD;
- `WM.nii.gz`: a white-matter mask extracted from an anatomical T1-weighted image.
Also, download the precomputed tractogram from here and put it in the same folder as the extracted data, e.g. `demo01_data`.
The file `demo01_fibers.tck` contains about 280K fibers estimated using a streamline-based algorithm.
Open the Python interpreter and go to the folder obtained after extracting the ZIP archive, e.g. `demo01_data`.
This step precomputes the rotation matrices used internally by COMMIT to create the lookup table for the response functions:
```python
import commit
commit.core.setup()
```
Now, run the following commands:
```python
from commit import trk2dictionary

trk2dictionary.run(
    filename_tractogram = 'demo01_fibers.tck',
    filename_peaks      = 'peaks.nii.gz',
    filename_mask       = 'WM.nii.gz',
    fiber_shift         = 0.5,
    peaks_use_affine    = True
)
```
The output should look like this:
```
-> Creating the dictionary from tractogram:
   * Configuration:
     - Segment position = COMPUTE INTERSECTIONS
     - Fiber shift X    = 0.500 (voxel-size units)
     - Fiber shift Y    = 0.500 (voxel-size units)
     - Fiber shift Z    = 0.500 (voxel-size units)
     - Points to skip   = 0
     - Min segment len  = 0.001 mm
     - Min fiber len    = 0.00 mm
     - Max fiber len    = 250.00 mm
     - Do not blur fibers
     - Output written to "./COMMIT"
   * Loading data:
     - Tractogram
       - geometry taken from "peaks.nii.gz"
       - 106 x 106 x 60
       - 2.0000 x 2.0000 x 2.0000
       - 283522 fibers
     - Filtering mask
       - 106 x 106 x 60
       - 2.0000 x 2.0000 x 2.0000
     - EC orientations
       - 106 x 106 x 60 x 9
       - 2.0000 x 2.0000 x 2.0000
       - ignoring peaks < 0.10 * MaxPeak
       - using affine matrix
       - flipping axes : [ x=False, y=False, z=False ]
   * Exporting IC compartments:
     [ 283522 fibers kept, 16664340 segments in total ]
   * Exporting EC compartments:
     [ 53008 voxels, 145434 segments ]
   [ 14.3 seconds ]
```
Please note that, in this particular example, a translation of half a voxel had to be applied to the fibers in order to bring all the data into the same reference system, as can be seen in the following figure:
First, we need to convert the bvals/bvecs pair to a single scheme file, using the `fsl2scheme` function of AMICO:
```python
import amico
amico.util.fsl2scheme( 'bvals.txt', 'bvecs.txt', 'DWI.scheme' )
```
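For reference, the conversion performed by `fsl2scheme` is conceptually simple: each gradient direction is stacked with its b-value into one `x y z b` row, preceded by a version header. The following self-contained sketch illustrates the idea with toy data; the file name `toy.scheme` and the exact header string are illustrative assumptions, not part of the tutorial.

```python
import numpy as np

# Toy acquisition: one b=0 plus three diffusion-weighted directions
bvals = np.array([0.0, 700.0, 700.0, 2000.0])
bvecs = np.array([[0., 1., 0., 0.],      # FSL layout: 3 x N,
                  [0., 0., 1., 0.],      # one row per axis
                  [0., 0., 0., 1.]])

# One "x y z b" row per measurement
scheme = np.column_stack((bvecs.T, bvals))

with open('toy.scheme', 'w') as f:
    f.write('VERSION: BVECTOR\n')        # header format assumed here
    np.savetxt(f, scheme, fmt='%.6f')
```

The real `fsl2scheme` handles details omitted here (e.g. rounding b-values into shells), so use it rather than this sketch for actual data.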
Now, load the data (see our conventions on the folder structure):
```python
mit = commit.Evaluation( '.', '.' )
mit.load_data( 'DWI.nii.gz', 'DWI.scheme' )
```
The output should be something like:
```
-> Loading data:
   * DWI signal:
     - dim    : 106 x 106 x 60 x 100
     - pixdim : 2.000 x 2.000 x 2.000
     - values : min=0.00, max=4095.00, mean=54.18
   * Acquisition scheme:
     - 100 samples, 2 shells
     - 10 @ b=0, 30 @ b=700.0, 60 @ b=2000.0
   [ 1.2 seconds ]

-> Preprocessing:
   * Normalizing to b0... [ min=0.00, max=290.00, mean=0.65 ]
   * Keeping all b0 volume(s)... [ 106 x 106 x 60 x 100 ]
   [ 0.1 seconds ]
```
For this example we made use of the `Stick-Zeppelin-Ball` model described in (Panagiotaki et al., NeuroImage, 2012):

- the contributions of the tracts are modeled as "sticks", i.e. tensors with a given axial diffusivity (in this case, 1.7*10^-3 mm^2/s) but null perpendicular diffusivity;
- extra-cellular contributions are modeled as tensors with the same axial diffusivity as the sticks and given perpendicular diffusivities (in this case we set only one, i.e. 0.51*10^-3 mm^2/s);
- isotropic contributions are modeled as tensors with given diffusivities (in this case we set two, i.e. 1.7*10^-3 mm^2/s and 3.0*10^-3 mm^2/s).
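The three compartment signals can be written down explicitly with the standard single-tensor expressions. The following self-contained numpy sketch evaluates them for one gradient direction; it is a simplified illustration of the model equations, not COMMIT's actual kernel code.

```python
import numpy as np

b = 2000.0                       # b-value [s/mm^2]
g = np.array([1.0, 0.0, 0.0])    # gradient direction (unit vector)
n = np.array([0.0, 0.0, 1.0])    # stick/zeppelin orientation (unit vector)

d_par, d_perp, d_iso = 1.7e-3, 0.51e-3, 3.0e-3   # diffusivities [mm^2/s]

cos2 = np.dot(g, n) ** 2         # squared cosine of the angle between g and n

# Stick: diffusion only along n, null perpendicular diffusivity
S_stick = np.exp(-b * d_par * cos2)              # = 1 here, since g is perpendicular to n

# Zeppelin: axially symmetric tensor, d_par along n and d_perp across it
S_zeppelin = np.exp(-b * (d_perp + (d_par - d_perp) * cos2))

# Ball: isotropic tensor
S_ball = np.exp(-b * d_iso)
```

COMMIT samples these responses over many orientations to build its lookup tables; the sketch just shows the per-compartment signal equations.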
Setup the parameters of the model and generate the lookup tables:
```python
mit.set_model( 'StickZeppelinBall' )

d_par   = 1.7E-3               # Parallel diffusivity [mm^2/s]
d_perps = [ 0.51E-3 ]          # Perpendicular diffusivity(ies) [mm^2/s]
d_isos  = [ 1.7E-3, 3.0E-3 ]   # Isotropic diffusivity(ies) [mm^2/s]
mit.model.set( d_par, d_perps, d_isos )

mit.generate_kernels( regenerate=True )
mit.load_kernels()
```
and the output should look like:
```
-> Simulating with "Stick-Zeppelin-Ball" model:
   [==================================================] 100.0%
   [ 2.7 seconds ]

-> Resampling LUT for subject ".":
   * Keeping all b0 volume(s)...
   [==================================================] 100.0%
   [ OK ]
   * Normalizing... [ OK ]
   [ 1.0 seconds ]
```
Load into memory the sparse data structure previously created with `trk2dictionary.run()`:
```python
mit.load_dictionary( 'COMMIT' )
```
The output should show that around 280K fibers have been loaded, in addition to 145K segments for the extra-cellular contributions in the 53K voxels of the white matter:
```
-> Loading the dictionary:
   * Segments from the tracts... [ 283522 fibers and 16664340 segments ]
   * Segments from the peaks...  [ 145434 segments ]
   * Isotropic contributions...  [ 53008 voxels ]
   * Post-processing...          [ OK ]
   [ 3.0 seconds ]
```
Now it's time to build the linear operator A, which computes the matrix-vector multiplications needed to solve the linear system. This operator uses the information from the segments loaded in the previous step and the lookup tables for the response functions; it also splits the workload across different threads (if possible) during the multiplications. To this end, run the following commands:
```python
mit.set_threads()
mit.build_operator()
```
The output should be similar to this:
```
-> Distributing workload to different threads:
   * Number of threads : 4
   * A  operator... [ OK ]
   * A' operator... [ OK ]
   [ 1.5 seconds ]

-> Building linear operator A:
   [ 1.0 seconds ]
```
NB: the number of threads is automatically set to the maximum number found in the system (4 in this example), but this value can be set manually, e.g. `mit.set_threads( 1 )`.
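Conceptually, the operator A horizontally concatenates one block of columns per compartment type: intra-cellular (one column per streamline), extra-cellular (one per peak orientation) and isotropic (one per voxel compartment). The following toy numpy sketch, with dense matrices and made-up sizes (unlike COMMIT's threaded sparse implementation), illustrates the structure used in the products computed by `A` and `A'`:

```python
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_ic, n_ec, n_iso = 10, 5, 3, 2       # toy sizes, not the real ones

# One block of columns per compartment type
A_ic  = rng.random((n_meas, n_ic))            # intra-cellular contributions
A_ec  = rng.random((n_meas, n_ec))            # extra-cellular contributions
A_iso = rng.random((n_meas, n_iso))           # isotropic contributions
A = np.hstack((A_ic, A_ec, A_iso))            # A = [ A_ic | A_ec | A_iso ]

x = rng.random(n_ic + n_ec + n_iso)           # non-negative coefficients
y = A @ x                                     # forward model: predicted signal
z = A.T @ y                                   # adjoint A', used by the solver
```

In COMMIT the matrix is never stored densely; its entries are generated on the fly from the segment dictionary and the response-function lookup tables.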
To fit the model (`Stick-Zeppelin-Ball` in this case) to the data, simply run:
```python
mit.fit( tol_fun=1e-3, max_iter=1000 )
```
The optimization progress is displayed by default:
```
-> Fit model using "nnls":
      |  1/2||Ax-y||^2       Omega      |  Cost function    Abs error      Rel error   |      Abs x          Rel x
------|--------------------------------|-----------------------------------------------|------------------------------
    1 |  2.8824892e+05  0.0000000e+00  |  2.8824892e+05  4.0306793e+05  1.3983328e+00 |  5.4057490e+01  1.0000000e+00
    2 |  2.3378806e+05  0.0000000e+00  |  2.3378806e+05  5.4460861e+04  2.3294972e-01 |  1.6251658e+01  2.6647239e-01
    3 |  1.9746841e+05  0.0000000e+00  |  1.9746841e+05  3.6319650e+04  1.8392638e-01 |  1.4524881e+01  2.0469624e-01
  ...
  ...
  138 |  1.0231110e+04  0.0000000e+00  |  1.0231110e+04  1.0601612e+01  1.0362132e-03 |  1.5289716e+00  4.1011283e-03
  139 |  1.0220811e+04  0.0000000e+00  |  1.0220811e+04  1.0299621e+01  1.0077107e-03 |  1.5192500e+00  4.0673395e-03
  140 |  1.0210809e+04  0.0000000e+00  |  1.0210809e+04  1.0001744e+01  9.7952516e-04 |  1.5095913e+00  4.0339229e-03
< Stopping criterion: REL_OBJ >
[ 00h 03m 07s ]
```
where the columns report, among other information, the iteration number, the current cost function and its relative change between two consecutive iterations.
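The `REL_OBJ` stopping criterion simply compares the relative decrease of the cost function against `tol_fun`. A minimal sketch of that check (a hypothetical helper, not COMMIT's actual code), using the last two cost values from the log above:

```python
def rel_obj(f_prev, f_curr):
    """Relative decrease of the cost between two consecutive iterations."""
    return abs(f_prev - f_curr) / f_curr

# Cost values at iterations 139 and 140 of the log above
f_prev, f_curr = 1.0220811e+04, 1.0210809e+04
tol_fun = 1e-3

stop = rel_obj(f_prev, f_curr) < tol_fun   # True: matches the 9.8e-04 Rel error shown
```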
The estimated streamline weights, i.e. `streamline_weights.txt`, and the voxelwise output maps, e.g. `fit_RMSE.nii.gz`, can be stored to files as follows:
```python
mit.save_results()
```
As shown in the output, the results are saved in the folder `Results_StickZeppelinBall`:
```
-> Saving results to "Results_StickZeppelinBall/*":
   * Fitting errors:
     - RMSE...  [ 0.059 +/- 0.018 ]
     - NRMSE... [ 0.118 +/- 0.037 ]
   * Voxelwise contributions:
     - Intra-axonal... [ OK ]
     - Extra-axonal... [ OK ]
     - Isotropic...    [ OK ]
   * Configuration and results:
     - streamline_weights.txt... [ OK ]
     - results.pickle...         [ OK ]
   [ 1.4 seconds ]
```
The streamline weights estimated by COMMIT, i.e. one coefficient for each streamline in the input tractogram `demo01_fibers.tck`, can be found in the file `streamline_weights.txt`:
```
0.00000e+00
0.00000e+00
0.00000e+00
0.00000e+00
1.47401e-03
3.36807e-03
2.32047e-03
1.02605e-05
8.35506e-06
0.00000e+00
...
...
```
If you need the estimated coefficients for all the compartments, you can access these values with:
```python
x_ic, x_ec, x_iso = mit.get_coeffs()
print( x_ic.size )   # 283522, i.e. one coefficient per streamline
```
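Since `streamline_weights.txt` stores one value per line, it can be inspected with plain numpy. The following self-contained sketch writes and reads a toy file in the same format and counts the streamlines with non-zero weight; the toy file name is an illustrative assumption.

```python
import numpy as np

# Toy weights mimicking the format of streamline_weights.txt (one value per line)
weights = np.array([0.0, 0.0, 1.47401e-03, 3.36807e-03, 0.0, 1.02605e-05])
np.savetxt('toy_weights.txt', weights, fmt='%.5e')

w = np.loadtxt('toy_weights.txt')
kept = np.count_nonzero(w)          # streamlines that keep a non-zero contribution
print(f'{kept}/{w.size} streamlines have non-zero weight')   # prints "3/6 ..."
```

A zero weight means COMMIT found no evidence for that streamline in the data, which is how the tractogram gets effectively filtered.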
The following figure shows the density of the tracts (Calamante et al., NeuroImage, 2010) of the original tractogram (left) and of its optimized version (right):
It is also possible to visualize voxelwise maps of the corresponding contributions of the extra-cellular space (left) and other isotropic contaminations (right):
Finally, the fitting error in each voxel can also be inspected:
Diffusion Imaging and Connectivity Estimation (DICE) lab