
FSL based DTI Processing

edickie edited this page Mar 9, 2017 · 3 revisions

DTI (pre)processing is an opinionated business, with many opinions on exactly what to do. Here are some ideas.

Preprocessing with FSL

Certainly look to the FDT user guide.

If you're using eddy_correct and dtifit to correct your images and fit the tensor, we have a simple script that runs the appropriate steps for an image: dtifit.sh

Rotate your bvecs?

When you run eddy_correct on your image, you end up rotating your volumes to align them in some way. It's an open question as to whether you should also then rotate the associated bvecs. See our discussion on the topic. FSL comes with a script to do this rotation based on the log files that eddy_correct makes: fdt_rotate_bvecs.
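If you do decide to rotate, the call looks something like this (a sketch only; the filenames are made up, and the `.ecclog` file is the rotation log that eddy_correct writes beside its output):

```shell
# Sketch: assumes FSL is on your PATH and eddy_correct has already run,
# producing dwi_ec.nii.gz and its log dwi_ec.ecclog (filenames hypothetical).
# Usage: fdt_rotate_bvecs <original bvecs> <rotated bvecs out> <ecclog>
fdt_rotate_bvecs bvecs bvecs_rotated dwi_ec.ecclog
```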

FSL dtifit and ENIGMA DTI as they currently exist on our file system

Here is an example of the scripts that are run to do a simple FSL DTI preprocessing stream, then run the ENIGMA DTI ROI extraction for the TIGRlab data.

on the SCC (as we now do it)

  export PROJECTDIR=/external/rprshnas01/tigrlab/archive/data-2.0/<project_name>
  module load /KIMEL/quarantine/modules/quarantine
  module load datman/latest
  module load FSL/5.0.9 R/3.2.5 ENIGMA-DTI/2015.01

  dm-proc-dtifit.py \
    --inputdir ${PROJECTDIR}/data/nii \
    --outputdir ${PROJECTDIR}/pipelines/dtifit

  dtifit-qc.py \
    ${PROJECTDIR}/pipelines/dtifit/

  dm-proc-enigmadti.py \
    --calc-all \
    --QC-transfer ${PROJECTDIR}/metadata/checklist.csv \
    ${PROJECTDIR}/pipelines/dtifit \
    ${PROJECTDIR}/pipelines/enigmaDTI

on the kimel lab system (you should only need to change the modules)

  export PROJECTDIR=/archive/data-2.0/<project_name>
  module load /archive/code/datman.module
  module load FSL/5.0.7 R/3.1.1 ENIGMA-DTI/2015.01

  dm-proc-dtifit.py \
    --inputdir ${PROJECTDIR}/data/nii \
    --outputdir ${PROJECTDIR}/pipelines/dtifit

  dtifit-qc.py \
    ${PROJECTDIR}/pipelines/dtifit/

  dm-proc-enigmadti.py \
    --calc-all \
    --QC-transfer ${PROJECTDIR}/metadata/checklist.csv \
    ${PROJECTDIR}/pipelines/dtifit \
    ${PROJECTDIR}/pipelines/enigmaDTI

Here is what those pieces do...

dm-proc-dtifit.py

Looks for DTI files in the input folder structure. When it finds them, it runs dtifit.sh on that DWI input. dtifit.sh does all the real work of:

  1. Running FSL's BET on the DWI file
  2. Running eddy current correction (eddy_correct)
  3. Running FSL's dtifit, which calculates FA, MD, and all those eigenvectors and things
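The three steps above amount to something like the following (a hand-written sketch, not the script itself; the filenames are made up, and FSL must be on your PATH):

```shell
# 1. Skull-strip the DWI; -m also writes a binary mask (dwi_bet_mask)
bet dwi.nii.gz dwi_bet -m -f 0.3

# 2. Eddy-current correction, registering all volumes to volume 0
eddy_correct dwi.nii.gz dwi_ec 0

# 3. Fit the tensor; writes dti_FA, dti_MD, dti_V1, etc.
dtifit -k dwi_ec -m dwi_bet_mask -r bvecs -b bvals -o dti
```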

For more info on this little script, read what Jon Pipitone wrote here. And read FSL's documentation here. dtifit.sh is sitting in the assets folder of the datman project...or can be downloaded from here.

Note: dm-proc-dtifit.py's magic DTI-file-finding powers rely on two rules:

  1. The DWI images are sitting in a nested folder structure inside a subject/session folder
  2. The DWI images are NIfTI images with "DTI" in their file names

For example:

${PROJECTDIR}/data/nii
├── subject_01
│   ├── subject_01_DTI60-1000_04_Ax-DTI60plus5-20iso.bval
│   ├── subject_01_DTI60-1000_04_Ax-DTI60plus5-20iso.bvec
│   ├── subject_01_DTI60-1000_04_Ax-DTI60plus5-20iso.nii.gz
│   └── subject_01_T1_02_SagT1Bravo-09mm.nii.gz
├── subject_02
│   ├── subject_02_DTI60-1000_04_Ax-DTI60plus5-20iso.bval
│   ├── subject_02_DTI60-1000_04_Ax-DTI60plus5-20iso.bvec
│   ├── subject_02_DTI60-1000_04_Ax-DTI60plus5-20iso.nii.gz
│   └── subject_02_T1_02_SagT1Bravo-09mm.nii.gz
...
└── subject_n
    ├── subject_n_DTI60-1000_04_Ax-DTI60plus5-20iso.bval
    ├── subject_n_DTI60-1000_04_Ax-DTI60plus5-20iso.bvec
    ├── subject_n_DTI60-1000_04_Ax-DTI60plus5-20iso.nii.gz
    └── subject_n_T1_02_SagT1Bravo-09mm.nii.gz
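As a quick, self-contained illustration of those two rules (the toy filenames below are made up, and the `find` pattern only mimics what dm-proc-dtifit.py is described as matching; it is not the script's actual code):

```shell
# Build a toy nii tree in a temp dir, then list the files a
# "*DTI*.nii.gz" pattern would pick up: one DWI per subject, T1s ignored.
demo=$(mktemp -d)
mkdir -p "$demo/subject_01" "$demo/subject_02"
touch "$demo/subject_01/subject_01_DTI60-1000_04.nii.gz" \
      "$demo/subject_01/subject_01_T1_02.nii.gz" \
      "$demo/subject_02/subject_02_DTI60-1000_04.nii.gz" \
      "$demo/subject_02/subject_02_T1_02.nii.gz"

find "$demo" -name '*DTI*.nii.gz' | sort
n_dti=$(find "$demo" -name '*DTI*.nii.gz' | wc -l)
```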

dtifit-qc.py

dtifit-qc.py is a little script that loops over the dtifit outputs and generates two types of snapshots:

  1. Snapshots of the BET brain-mask outline on the DWI images
  2. Some colorful views of all DTI directions

This should just work on any dtifit output folder structure.
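A brain-mask snapshot of this kind can also be made by hand with FSL's slicer utility, which overlays the edges of a second image on the first (a sketch; the filenames are hypothetical and FSL must be on your PATH):

```shell
# Overlay the BET mask edges on a b0 image; -a saves mid-sagittal,
# mid-coronal and mid-axial slices into one snapshot image.
slicer dwi_b0.nii.gz dwi_bet_mask.nii.gz -a mask_qc.png
```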

dm-proc-enigmadti.py

dm-proc-enigmadti.py looks for dtifit output directories and extracts ROI values following the ENIGMA-JHU atlas protocol (check out the ENIGMA protocols site for more info about that). Also, cite this paper if you use this method.

dm-proc-enigmadti.py does not actually run the enigma-dti pipeline itself but instead writes smaller scripts to run three other functions:

  1. doInd-enigma-dti.py : runs enigma-DTI for one person
  2. dm-proc-enigma-concat.py : concatenates all the ROI values into one csv file
  3. dm-qc-enigma.py: creates qc snapshots of the skeleton for visual inspection.
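The concatenation step (2) is essentially "stack the per-subject CSVs, keeping one header". A self-contained illustration with toy data (this is not the actual dm-proc-enigma-concat.py code, just the idea):

```shell
# Two toy per-subject ROI files with identical headers...
work=$(mktemp -d)
printf 'subject,FA_average\nsubject_01,0.45\n' > "$work/subject_01_roi.csv"
printf 'subject,FA_average\nsubject_02,0.47\n' > "$work/subject_02_roi.csv"

# ...concatenated into one file: header from the first file,
# then the data rows (everything after line 1) from all of them.
head -n 1 "$work/subject_01_roi.csv" > "$work/all_rois.csv"
tail -q -n +2 "$work"/subject_*_roi.csv >> "$work/all_rois.csv"

cat "$work/all_rois.csv"
nlines=$(wc -l < "$work/all_rois.csv")
```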