Add custom censoring options #1159

Draft: wants to merge 32 commits into base: main

Changes from all commits (32 commits)
ad31b12
Add custom censoring options.
tsalo May 1, 2024
0b2e668
Add docstring.
tsalo May 1, 2024
26c353b
Improve figure.
tsalo May 1, 2024
8fe59d2
Keep working on censoring.
tsalo May 1, 2024
5389011
Run isort.
tsalo May 1, 2024
aafe452
Improve things.
tsalo May 1, 2024
de68f6a
Account for other threshes in flag_bad_run.
tsalo May 1, 2024
4bb97be
Update base.py
tsalo May 1, 2024
3b55511
Split up confound generation.
tsalo May 2, 2024
93a46b6
Keep working on it.
tsalo May 2, 2024
14185c7
Fix style.
tsalo May 2, 2024
be51236
More modifications.
tsalo May 2, 2024
df787ca
Update test_interfaces_censoring.py
tsalo May 2, 2024
e014daf
Update
tsalo May 2, 2024
ef142bb
Fix connections.
tsalo May 2, 2024
3a2149b
Fix more connections.
tsalo May 2, 2024
6504776
Whoops.
tsalo May 2, 2024
6a1ba08
Fix connection.
tsalo May 2, 2024
2aba326
Fix up some tests.
tsalo May 2, 2024
58f36ad
Improve tests.
tsalo May 2, 2024
20254e9
Work on tests.
tsalo May 2, 2024
045a4b8
Fix up tests more.
tsalo May 2, 2024
cbd5ba5
Update test_interfaces_nilearn.py
tsalo May 2, 2024
6234238
Fix test.
tsalo May 2, 2024
5c99b9e
Update test_workflows_connectivity.py
tsalo May 3, 2024
709e2cc
Update test_workflows_connectivity.py
tsalo May 3, 2024
1512189
Fix dataframe.
tsalo May 3, 2024
405d3da
Merge branch 'main' into custom-censoring
tsalo May 6, 2024
97126d4
Merge branch 'main' into custom-censoring
tsalo May 7, 2024
fb823a9
Merge remote-tracking branch 'upstream/main' into custom-censoring
tsalo May 31, 2024
8ecbb17
Try fixing.
tsalo May 31, 2024
4ae43ee
Revert "Try fixing."
tsalo May 31, 2024
18 changes: 9 additions & 9 deletions docs/workflows.rst
@@ -47,7 +47,7 @@ then fsnative surface files from the preprocessing derivatives will be warped to
Identification of high-motion outlier volumes
=============================================
:func:`~xcp_d.workflows.postprocessing.init_prepare_confounds_wf`,
:class:`~xcp_d.interfaces.censoring.GenerateConfounds`
:class:`~xcp_d.interfaces.censoring.GenerateTemporalMask`

XCP-D uses framewise displacement to identify high-motion outlier volumes.
These outlier volumes are removed from the BOLD data prior to denoising.
@@ -62,7 +62,7 @@ The threshold used to identify outlier volumes can be set with the ``--fd-thresh
Motion parameter filtering [OPTIONAL]
-------------------------------------
:func:`~xcp_d.workflows.postprocessing.init_prepare_confounds_wf`,
:class:`~xcp_d.interfaces.censoring.GenerateConfounds`,
:class:`~xcp_d.interfaces.censoring.ModifyConfounds`,
:func:`~xcp_d.utils.confounds.load_motion`

Motion parameters may be contaminated with respiratory effects :footcite:p:`power2019distinctions`.
@@ -116,7 +116,7 @@ per :footcite:t:`gratton2020removal`.
Framewise displacement calculation and thresholding
---------------------------------------------------
:func:`~xcp_d.workflows.postprocessing.init_prepare_confounds_wf`,
:class:`~xcp_d.interfaces.censoring.GenerateConfounds`,
:class:`~xcp_d.interfaces.censoring.GenerateTemporalMask`,
:func:`~xcp_d.utils.modified_data.compute_fd`

Framewise displacement is then calculated according to the formula from :footcite:t:`power_fd_dvars`.
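The Power et al. formula referenced here sums the absolute backward differences of the six motion parameters, converting rotations to arc length via the head radius. A minimal sketch of that calculation (the function name, argument layout, and column ordering are illustrative assumptions; XCP-D's actual implementation is `xcp_d.utils.modified_data.compute_fd`):

```python
import numpy as np

def compute_fd_sketch(motion, head_radius=50.0):
    """Framewise displacement per Power et al.

    motion : (n_vols, 6) array, assumed ordered as
             [trans_x, trans_y, trans_z, rot_x, rot_y, rot_z],
             with rotations in radians.
    head_radius : radius in mm used to convert rotations to arc length.
    """
    motion = np.asarray(motion, dtype=float).copy()
    motion[:, 3:] *= head_radius                # radians -> mm of arc
    deltas = np.abs(np.diff(motion, axis=0))    # backward differences
    # The first volume has no predecessor, so its FD is defined as 0.
    return np.concatenate([[0.0], deltas.sum(axis=1)])
```

Volumes whose FD exceeds the chosen threshold are the ones flagged as outliers in the temporal mask.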
@@ -137,7 +137,7 @@ These volumes will later be removed from the denoised data.
Confound regressor selection
============================
:func:`~xcp_d.workflows.postprocessing.init_prepare_confounds_wf`,
:func:`~xcp_d.interfaces.censoring.GenerateConfounds`
:func:`~xcp_d.interfaces.censoring.GenerateDesignMatrix`

The confound regressor configurations in the table below are implemented in XCP-D,
with ``36P`` as the default.
@@ -507,13 +507,13 @@ ReHo
:func:`~xcp_d.workflows.restingstate.init_reho_nifti_wf`,
:func:`~xcp_d.workflows.restingstate.init_reho_cifti_wf`

Regional Homogeneity (ReHo) is a measure of local temporal uniformity in the BOLD signal computed at each voxel of the processed image.
Greater ReHo values correspond to greater synchrony among BOLD activity patterns measured in a local neighborhood of voxels, with neighborhood size determined by a user-specified radius of voxels.
ReHo is calculated as the coefficient of concordance among all voxels in a sphere centered on the target voxel.

For NIfTIs, ReHo is always calculated via AFNI’s 3dReho with 27 voxels in each neighborhood, using Kendall's coefficient of concordance (KCC).
For CIFTIs, the left and right hemispheres are extracted into GIFTI format via Connectome Workbench’s CIFTISeparateMetric. Next, the mesh adjacency matrix is obtained, and Kendall's coefficient of concordance (KCC) is calculated, with each vertex having four neighbors.
For subcortical voxels in the CIFTIs, 3dReho is used with the same parameters that are used for NIfTIs.
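The KCC at the heart of ReHo can be sketched as follows. This is a standalone illustration of Kendall's W over a neighborhood of time series, not the 3dReHo or Connectome Workbench implementation, and the simple argsort-based ranking assumes no tied values:

```python
import numpy as np

def kendalls_w(neighborhood):
    """Kendall's coefficient of concordance (KCC) across time series.

    neighborhood : (n_voxels, n_timepoints) array of BOLD time series.
    Returns a value in [0, 1]; 1 means identical rank orderings.
    """
    m, n = neighborhood.shape
    # Rank each voxel's time series; assumes no ties
    # (use scipy.stats.rankdata for proper tie handling).
    ranks = neighborhood.argsort(axis=1).argsort(axis=1) + 1
    rank_sums = ranks.sum(axis=0)               # per-timepoint rank totals
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m**2 * (n**3 - n))
```

Perfectly synchronous time series yield W = 1, while rank orderings that cancel out yield W near 0.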


Parcellation and functional connectivity estimation [OPTIONAL]
85 changes: 75 additions & 10 deletions xcp_d/cli/parser.py
@@ -8,6 +8,7 @@
import os
import sys

import xcp_d.cli.parser_utils as types
from xcp_d import config


@@ -19,7 +20,6 @@ def _build_parser():

from packaging.version import Version

from xcp_d.cli.parser_utils import _float_or_auto, _int_or_auto, _restricted_float
from xcp_d.cli.version import check_latest, is_flagged
from xcp_d.utils.atlas import select_atlases

@@ -257,7 +257,7 @@ def _bids_filter(value, parser):
"--dummy_scans",
dest="dummy_scans",
default=0,
type=_int_or_auto,
type=types._int_or_auto,
metavar="{{auto,INT}}",
help=(
"Number of volumes to remove from the beginning of each run. "
@@ -399,7 +399,7 @@ def _bids_filter(value, parser):
"--head-radius",
"--head_radius",
default=50,
type=_float_or_auto,
type=types._float_or_auto,
help=(
"Head radius used to calculate framewise displacement, in mm. "
"The default value is 50 mm, which is recommended for adults. "
@@ -412,15 +412,67 @@ def _bids_filter(value, parser):
"-f",
"--fd-thresh",
"--fd_thresh",
default=0.3,
default=[0.3, 0],
metavar="FLOAT",
type=float,
nargs="+",
help=(
"Framewise displacement threshold for censoring. "
"Any volumes with an FD value greater than the threshold will be removed from the "
"denoised BOLD data. "
"Framewise displacement thresholds for censoring. "
"This may be a single value or a pair of values. "
"If two values are provided, the first one will determine which volumes are "
"removed from the denoised BOLD data, and the second one will determine which "
"volumes are removed from the interpolated BOLD data. "
"A threshold of <=0 will disable censoring completely."
),
)
g_censor.add_argument(
"--dvars-thresh",
"--dvars_thresh",
default=[0, 0],
metavar="FLOAT",
type=float,
nargs="+",
help=(
"DVARS threshold for censoring. "
"This may be a single value or a pair of values. "
"If two values are provided, the first one will determine which volumes are "
"removed from the denoised BOLD data, and the second one will determine which "
"volumes are removed from the interpolated BOLD data. "
"A threshold of <=0 will disable censoring completely."
),
)
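The help text above describes a pair of thresholds, one for the denoised outputs and one for the interpolated outputs, with a value of zero or below disabling censoring. A minimal sketch of that behavior (names and return layout are illustrative assumptions, not the package's censoring interface):

```python
import numpy as np

def dual_threshold_mask(metric, thresholds=(0.3, 0.0)):
    """Build censoring masks for denoised and interpolated data.

    metric : per-volume FD (or DVARS) values.
    thresholds : (denoised, interpolated) pair; a threshold <= 0
        disables censoring for that output.
    """
    metric = np.asarray(metric, dtype=float)
    masks = []
    for thresh in thresholds:
        if thresh > 0:
            masks.append(metric > thresh)       # True = censored volume
        else:
            masks.append(np.zeros(metric.shape, dtype=bool))  # disabled
    return masks  # [denoised_mask, interpolated_mask]
```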
g_censor.add_argument(
"--censor-before",
"--censor_before",
default=[0, 0],
metavar="INT",
nargs="+",
type=int,
help="The number of volumes to remove before any outlier volumes.",
)
g_censor.add_argument(
"--censor-after",
"--censor_after",
default=[0, 0],
metavar="INT",
nargs="+",
type=int,
help="The number of volumes to remove after any outlier volumes.",
)
g_censor.add_argument(
"--censor-between",
"--censor_between",
default=[0, 0],
metavar="INT",
nargs="+",
type=int,
help=(
"If any short sets of contiguous non-outliers are found between outliers, "
"this parameter will remove them. "
"For example, if the value is set to 1, then any cases where only one non-outlier "
"volume exists between two outlier volumes will be censored."
),
)
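The ``--censor-before``, ``--censor-after``, and ``--censor-between`` options described above expand the outlier mask around flagged volumes. As an illustrative sketch of that expansion for a single mask (scalar parameters here, whereas the options accept one value per mask; not the package's implementation):

```python
import numpy as np

def expand_temporal_mask(outliers, n_before=0, n_after=0, n_between=0):
    """Expand a boolean outlier mask (True = censored volume)."""
    mask = np.asarray(outliers, dtype=bool).copy()
    for i in np.where(mask)[0]:
        mask[max(0, i - n_before):i] = True      # volumes before the outlier
        mask[i + 1:i + 1 + n_after] = True       # volumes after the outlier
    if n_between > 0:
        retained = np.where(~mask)[0]
        if retained.size:
            # Split retained indices into contiguous runs.
            runs = np.split(retained, np.where(np.diff(retained) > 1)[0] + 1)
            for run in runs:
                # Only censor short runs bounded by outliers on both sides.
                bounded = run[0] > 0 and run[-1] < mask.size - 1
                if bounded and run.size <= n_between:
                    mask[run[0]:run[-1] + 1] = True
    return mask
```

For example, with ``n_between=1``, a lone retained volume sandwiched between two outliers is also censored.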
g_censor.add_argument(
"--min-time",
"--min_time",
@@ -519,7 +571,7 @@ def _bids_filter(value, parser):
"--min_coverage",
required=False,
default=0.5,
type=_restricted_float,
type=types._restricted_float,
help=(
"Coverage threshold to apply to parcels in each atlas. "
"Any parcels with lower coverage than the threshold will be replaced with NaNs. "
@@ -896,10 +948,23 @@ def _validate_parameters(opts, build_log, parser):
build_log.warning("Bandpass filtering is disabled. ALFF outputs will not be generated.")

# Scrubbing parameters
if opts.fd_thresh <= 0 and opts.min_time > 0:
opts.fd_thresh = types._check_censoring_thresholds(opts.fd_thresh, parser, "--fd-thresh")
opts.dvars_thresh = types._check_censoring_thresholds(
opts.dvars_thresh, parser, "--dvars-thresh"
)
opts.censor_before = types._check_censoring_numbers(
opts.censor_before, parser, "--censor-before"
)
opts.censor_after = types._check_censoring_numbers(opts.censor_after, parser, "--censor-after")
opts.censor_between = types._check_censoring_numbers(
opts.censor_between, parser, "--censor-between"
)

nocensor = all(t <= 0 for t in opts.fd_thresh + opts.dvars_thresh)
if nocensor and opts.min_time > 0:
ignored_params = "\n\t".join(["--min-time"])
build_log.warning(
"Framewise displacement-based scrubbing is disabled. "
"Censoring is disabled. "
f"The following parameters will have no effect:\n\t{ignored_params}"
)
opts.min_time = 0
75 changes: 75 additions & 0 deletions xcp_d/cli/parser_utils.py
@@ -73,6 +73,52 @@ def _float_or_auto(string, is_parser=True):
return floatarg


def _one_or_two_ints(string, is_parser=True):
"""Check if argument is one or two integers >= 0."""
error = argparse.ArgumentTypeError if is_parser else ValueError
try:
ints = [int(i) for i in string.split()]
except ValueError:
msg = "Argument must be one or two integers."
raise error(msg)

if len(ints) == 1:
ints.append(0)
elif len(ints) != 2:
raise error("Argument must be one or two integers.")

if ints[0] < 0 or ints[1] < 0:
raise error(
"Int arguments must be nonnegative. "
"If you wish to disable censoring, set the value to 0."
)

return ints


def _one_or_two_floats(string, is_parser=True):
"""Check if argument is one or two floats >= 0."""
error = argparse.ArgumentTypeError if is_parser else ValueError
try:
floats = [float(i) for i in string.split()]
except ValueError:
msg = "Argument must be one or two floats."
raise error(msg)

if len(floats) == 1:
floats.append(0.0)
elif len(floats) != 2:
raise error("Argument must be one or two floats.")

if floats[0] < 0 or floats[1] < 0:
raise error(
"Float arguments must be nonnegative. "
"If you wish to disable censoring, set the value to 0."
)

return floats


def _restricted_float(x):
"""From https://stackoverflow.com/a/12117065/2589328."""
try:
@@ -101,3 +147,32 @@ def __call__(self, parser, namespace, values, option_string=None):  # noqa: U100
f"{self.__version__}. "
)
setattr(namespace, self.dest, values)


def _check_censoring_thresholds(values, parser, arg_name):
if len(values) == 1:
values = [values[0], values[0]]
elif len(values) > 2:
parser.error(
f"Invalid number of values for '{arg_name}': {len(values)}. "
"Please provide either one or two values."
)

if values[0] > 0 and values[1] > values[0]:
parser.error(
f"The second value ({values[1]}) for '{arg_name}' must be less than or equal to the "
f"first value ({values[0]})."
)

return values


def _check_censoring_numbers(values, parser, arg_name):
if len(values) == 1:
values = [values[0], 0]
elif len(values) > 2:
parser.error(
f"Invalid number of values for '{arg_name}': {len(values)}. "
"Please provide either one or two values."
)
return values
8 changes: 8 additions & 0 deletions xcp_d/config.py
@@ -548,6 +548,14 @@ class workflow(_Config):
"""Radius of the head in mm."""
fd_thresh = None
"""Framewise displacement threshold for censoring."""
dvars_thresh = None
"""DVARS threshold for censoring."""
censor_before = None
"""Number of volumes to remove before FD or DVARS outliers."""
censor_after = None
"""Number of volumes to remove after FD or DVARS outliers."""
censor_between = None
"""Number of volumes to remove between any FD or DVARS outliers."""
min_time = None
"""Post-scrubbing threshold to apply to individual runs in the dataset."""
bandpass_filter = True
4 changes: 2 additions & 2 deletions xcp_d/data/xcp_d_bids_config.json
@@ -37,7 +37,7 @@
"sub-{subject}[/ses-{session}]/{datatype<anat|func>|func}/sub-{subject}[_ses-{session}][_task-{task}][_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_seg-{segmentation}][_res-{res}][_den-{den}]_stat-{statistic}[_desc-{desc}]_{suffix<bold|morph>}{extension<.tsv|.json>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_desc-{desc}]_{suffix<design>|design}{extension<.tsv|.json>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_seg-{segmentation}][_res-{res}][_den-{den}]_stat-{statistic}[_desc-{desc}]_{suffix<timeseries>}{extension<.tsv|.json>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_desc-{desc}]_{suffix<motion|outliers>}{extension<.json|.tsv>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_desc-{desc}]_{suffix<timeseries|outliers>}{extension<.json|.tsv>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_res-{res}][_den-{den}][_desc-{desc}]_{suffix<qc>}{extension<.tsv|.json>|.tsv}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_desc-{desc}]_{suffix<qc>}{extension<.hdf5>|.hdf5}",
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_den-{den}][_hemi-{hemi<L|R>}][_desc-{desc}]_{suffix<bold>}{extension<.dtseries.nii|.json>}",
@@ -47,7 +47,7 @@
"sub-{subject}[/ses-{session}]/{datatype<func>|func}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}]_seg-{segmentation}[_den-{den}][_hemi-{hemi<L|R>}]_stat-{statistic}[_desc-{desc}]_{suffix<boldmap>}{extension<.pconn.nii|.json>}",
"sub-{subject}/{datatype<figures>}/sub-{subject}[_ses-{session}][_acq-{acquisition}][_ce-{ceagent}][_rec-{reconstruction}][_run-{run}][_space-{space}][_cohort-{cohort}][_seg-{segmentation}][_desc-{desc}]_{suffix<T1w|T2w|T1rho|T1map|T2map|T2star|FLAIR|FLASH|PDmap|PD|PDT2|inplaneT[12]|angio|dseg|mask|dwi|epiref|fieldmap>}{extension<.html|.svg|.png>}",
"sub-{subject}/{datatype<figures>}/sub-{subject}[_ses-{session}][_acq-{acquisition}][_ce-{ceagent}][_rec-{reconstruction}][_run-{run}][_space-{space}][_cohort-{cohort}][_seg-{segmentation}][_desc-{desc}]_{suffix<dseg|mask|dwi|epiref|fieldmap>}{extension<.html|.svg|.png>}",
"sub-{subject}/{datatype<figures>}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_desc-{desc}]_{suffix<bold|motion>}{extension<.html|.svg|.png>}",
"sub-{subject}/{datatype<figures>}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_space-{space}][_cohort-{cohort}][_desc-{desc}]_{suffix<bold|timeseries|motion>}{extension<.html|.svg|.png>}",
"sub-{subject}/{datatype<figures>}/sub-{subject}[_ses-{session}]_task-{task}[_acq-{acquisition}][_ce-{ceagent}][_dir-{direction}][_rec-{reconstruction}][_run-{run}][_echo-{echo}][_desc-{desc}]_{suffix<design>|design}{extension<.html|.svg|.png>}"
]
}