=====
Usage
=====

``heudiconv`` processes DICOM files and converts them into user-defined
output paths.

Command-Line Arguments
======================

.. argparse::
   :ref: heudiconv.cli.run.get_parser
   :prog: heudiconv
   :nodefault:
   :nodefaultconst:


Support
=======

All bugs, concerns, and enhancement requests for this software can be submitted here:
https://github.com/nipy/heudiconv/issues.

If you have a problem or would like to ask a question about how to use ``heudiconv``,
please submit a question to `NeuroStars.org <http://neurostars.org/tags/heudiconv>`_ with a ``heudiconv`` tag.
NeuroStars.org is a platform similar to StackOverflow but dedicated to neuroinformatics.

All previous ``heudiconv`` questions are available here:
http://neurostars.org/tags/heudiconv/


Batch jobs
==========

``heudiconv`` can natively handle multi-subject, multi-session conversions,
although it will process these linearly. To speed this up, multiple ``heudiconv``
processes can be spawned concurrently, each converting a different subject and/or
session.
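As an illustrative sketch (not part of ``heudiconv`` itself), per-subject
conversions can also be run concurrently on a single machine with ``xargs``.
The subject IDs and paths below are hypothetical, and ``echo`` stands in for
the real ``heudiconv`` call so the sketch is harmless to run as-is.

.. code:: shell

    #!/bin/bash
    # Run up to two "conversions" at a time, one per subject ID.
    # Replace "echo heudiconv" with "heudiconv" and substitute real paths.
    printf '%s\n' 01 02 03 | xargs -P 2 -I{} \
        echo heudiconv --files "/dicoms/sub-{}" -o /converted -f reproin -c dcm2niix -b
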

The following example uses SLURM and Singularity to submit each subject's
DICOMs as an independent ``heudiconv`` execution.

The first script aggregates the DICOM directories and submits them to
``run_heudiconv.sh`` with SLURM as a job array.

If using BIDS, the ``notop`` option to ``--bids`` suppresses creation of
top-level files in the BIDS directory (e.g.,
``dataset_description.json``) to avoid possible race conditions between
concurrent jobs. These files may be generated later with
``populate_templates.sh`` below (except for ``participants.tsv``, which
must be created manually).

.. code:: shell

    #!/bin/bash

    set -eu

    # where the DICOMs are located
    DCMROOT=/dicom/storage/voice
    # where we want to output the data
    OUTPUT=/converted/data/voice

    # find all DICOM directories that start with "voice"
    DCMDIRS=($(find "${DCMROOT}" -maxdepth 1 -name "voice*" -type d))

    # submit to another script as a job array on SLURM
    sbatch --array=0-$(( ${#DCMDIRS[@]} - 1 )) run_heudiconv.sh "${OUTPUT}" "${DCMDIRS[@]}"

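The arithmetic in the ``sbatch`` call is worth spelling out: with ``N``
directories, the job array must span task IDs ``0`` through ``N - 1``.
A runnable illustration (the directory names are made up):

.. code:: shell

    #!/bin/bash
    # With four DICOM directories, the array indices run 0..3.
    DCMDIRS=(/dcm/voice01 /dcm/voice02 /dcm/voice03 /dcm/voice04)
    LAST=$(( ${#DCMDIRS[@]} - 1 ))
    echo "--array=0-${LAST}"   # prints: --array=0-3
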

The second script processes a DICOM directory with ``heudiconv`` using the built-in
``reproin`` heuristic.

.. code:: shell

    #!/bin/bash
    set -eu

    OUTDIR=${1}
    # receive all directories, and index them per job array
    DCMDIRS=("${@:2}")
    DCMDIR=${DCMDIRS[${SLURM_ARRAY_TASK_ID}]}
    echo "Submitted directory: ${DCMDIR}"

    IMG="/singularity-images/heudiconv-0.9.0-dev.sif"
    CMD="singularity run -B ${DCMDIR}:/dicoms:ro -B ${OUTDIR}:/output -e ${IMG} --files /dicoms/ -o /output -f reproin -c dcm2niix -b notop --minmeta -l ."

    printf "Command:\n%s\n" "${CMD}"
    ${CMD}
    echo "Successful process"

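The indexing in the script above can be demonstrated in isolation: SLURM
exports ``SLURM_ARRAY_TASK_ID`` for each task, and the script uses it to
select one directory from the arguments that follow ``OUTDIR``. The values
below are simulated stand-ins:

.. code:: shell

    #!/bin/bash
    # Simulate the worker's argument handling: $1 is the output directory,
    # the rest are DICOM directories; the task ID picks one of them.
    set -- /out /dcm/voice01 /dcm/voice02 /dcm/voice03   # fake "$@"
    DCMDIRS=("${@:2}")
    SLURM_ARRAY_TASK_ID=1      # normally set by SLURM per task
    echo "${DCMDIRS[${SLURM_ARRAY_TASK_ID}]}"   # prints: /dcm/voice02
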
This script creates the top-level BIDS files (e.g.,
``dataset_description.json``).

.. code:: shell

    #!/bin/bash
    set -eu

    OUTDIR=${1}
    IMG="/singularity-images/heudiconv-0.9.0-dev.sif"
    CMD="singularity run -B ${OUTDIR}:/output -e ${IMG} --files /output -f reproin --command populate-templates"

    printf "Command:\n%s\n" "${CMD}"
    ${CMD}
    echo "Successful process"