Calculate smoothed ALFF from smoothed BOLD #1124

Open · wants to merge 19 commits into main
33 changes: 18 additions & 15 deletions docs/workflows.rst
@@ -477,43 +477,46 @@ Amplitude of low-frequency fluctuation (ALFF) is a measure that ostensibly local
spontaneous neural activity in resting-state BOLD data.
It is calculated by the following:

-1. The ``denoised, interpolated BOLD`` is passed along to the ALFF workflow.
-2. If censoring+interpolation was performed, then the interpolated time series is censored at this
-   point.
-3. Voxel-wise BOLD time series are normalized (mean-centered and scaled to unit standard deviation)
+1. The ``denoised BOLD`` is passed along to the ALFF workflow.
+2. Voxel-wise BOLD time series are normalized (mean-centered and scaled to unit standard deviation)
over time. This will ensure that the power spectra from ``periodogram`` and ``lombscargle``
are roughly equivalent.
-4. The power spectrum and associated frequencies are estimated from the BOLD data.
+3. The power spectrum and associated frequencies are estimated from the BOLD data.

- If censoring+interpolation was not performed, then this uses :func:`scipy.signal.periodogram`.
- If censoring+interpolation was performed, then this uses :func:`scipy.signal.lombscargle`.

-5. The square root of the power spectrum is calculated.
-6. The power spectrum values corresponding to the frequency range retained by the
+4. The square root of the power spectrum is calculated.
+5. The power spectrum values corresponding to the frequency range retained by the
temporal filtering step are extracted from the full power spectrum.
-7. The mean of the within-band power spectrum is calculated and multiplied by 2.
-8. The ALFF value is multiplied by the standard deviation of the voxel-wise
+6. The mean of the within-band power spectrum is calculated and multiplied by 2.
+7. The ALFF value is multiplied by the standard deviation of the voxel-wise
``denoised, interpolated BOLD`` time series.
This brings ALFF back to its original scale, as if the time series was not normalized.

ALFF will only be calculated if the bandpass filter is enabled
(i.e., if the ``--disable-bandpass-filter`` flag is not used).

-Smoothed ALFF derivatives will also be generated if the ``--smoothing`` flag is used.
+ALFF will also be calculated from the smoothed, denoised BOLD if the ``--smoothing`` flag is used.
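To make the recipe concrete, here is a minimal, self-contained sketch of the updated steps for a single voxel's time series, assuming the default 0.01-0.08 Hz band. The function name ``alff_sketch``, the toy defaults, and the exact ``scipy`` call options are illustrative only and are not xcp_d's actual implementation.

```python
# Minimal sketch of the ALFF steps above (updated numbering), for one voxel.
# Assumes voxel_data is already censored when sample_mask is provided; names,
# defaults, and toy values are illustrative, not xcp_d's API.
import numpy as np
from scipy import signal


def alff_sketch(voxel_data, TR, low_pass=0.08, high_pass=0.01, sample_mask=None):
    fs = 1.0 / TR
    n_volumes = voxel_data.size

    # Step 2: mean-center and scale to unit variance; keep the SD for rescaling later.
    sd_scale = np.std(voxel_data)
    data = (voxel_data - np.mean(voxel_data)) / np.std(voxel_data)

    # Step 3: estimate the power spectrum.
    if sample_mask is None:
        freqs, power = signal.periodogram(data, fs, scaling="spectrum")
    else:
        # Censored data: evaluate a Lomb-Scargle periodogram at the retained scan times.
        keep = sample_mask.astype(bool)
        time_arr = np.arange(0, keep.size * TR, TR)[keep]
        freqs = np.linspace(0, 0.5 * fs, (n_volumes // 2) + 1)[1:]
        power = signal.lombscargle(time_arr, data, 2 * np.pi * freqs, normalize=True)

    # Steps 4-6: square root, restrict to the passband, then take the mean and double it.
    amplitude = np.sqrt(power)
    in_band = (freqs >= high_pass) & (freqs <= low_pass)
    alff = 2 * np.mean(amplitude[in_band])

    # Step 7: undo the unit-variance scaling.
    return alff * sd_scale
```

The Lomb-Scargle branch exists because, after censoring, the retained volumes are no longer evenly spaced in time, which an ordinary periodogram assumes.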


ReHo
----
:func:`~xcp_d.workflows.restingstate.init_reho_nifti_wf`,
:func:`~xcp_d.workflows.restingstate.init_reho_cifti_wf`

-Regional Homogeneity (ReHo) is a measure of local temporal uniformity in the BOLD signal computed at each voxel of the processed image.
-Greater ReHo values correspond to greater synchrony among BOLD activity patterns measured in a local neighborhood of voxels, with neighborhood size determined by a user-specified radius of voxels.
+Regional Homogeneity (ReHo) is a measure of local temporal uniformity in the BOLD signal computed at each voxel of the processed image.
+Greater ReHo values correspond to greater synchrony among BOLD activity patterns measured in a local neighborhood of voxels,
+with neighborhood size determined by a user-specified radius of voxels.
ReHo is calculated as the coefficient of concordance among all voxels in a sphere centered on the target voxel.

-For NIfTIs, ReHo is always calculated via AFNI’s 3dReho with 27 voxels in each neighborhood, using Kendall's coefficient of concordance (KCC).
-For CIFTIs, the left and right hemisphere are extracted into GIFTI format via Connectome Workbench’s CIFTISeparateMetric. Next, the mesh adjacency matrix is obtained,and Kendall's coefficient of concordance (KCC) is calculated, with each vertex having four neighbors.
-For subcortical voxels in the CIFTIs, 3dReho is used with the same parameters that are used for NIfTIs.
+For NIfTIs, ReHo is always calculated via AFNI's 3dReho with 27 voxels in each neighborhood,
+using Kendall's coefficient of concordance (KCC).
+For CIFTIs, the left and right hemisphere are extracted into GIFTI format via
+Connectome Workbench's CIFTISeparateMetric.
+Next, the mesh adjacency matrix is obtained, and Kendall's coefficient of concordance (KCC) is calculated,
+with each vertex having 5-6 neighbors.
+For subcortical voxels in the CIFTIs, 3dReho is used with the same parameters that are used for NIfTIs.
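For intuition, Kendall's coefficient of concordance over one neighborhood of time series can be sketched as follows. This is a plain illustration of the statistic (it ignores the tie correction), not the AFNI 3dReho or Connectome Workbench implementation, and the function name is hypothetical.

```python
# Sketch of Kendall's coefficient of concordance (KCC, Kendall's W) for one
# neighborhood: rows are voxel/vertex time series, columns are timepoints.
# Illustrative only; ignores the tie correction used in full implementations.
import numpy as np
from scipy import stats


def kendalls_w(neighborhood):
    m, n = neighborhood.shape  # m time series, n timepoints
    # Rank each time series over time, then sum ranks across series at each timepoint.
    ranks = np.vstack([stats.rankdata(ts) for ts in neighborhood])
    rank_sums = ranks.sum(axis=0)
    # Squared deviation of the rank sums from their mean, then the usual normalization.
    s = np.sum((rank_sums - rank_sums.mean()) ** 2)
    return 12 * s / (m**2 * (n**3 - n))  # 1 = perfectly concordant, 0 = no concordance
```

In practice the neighborhood is the 27-voxel cube (NIfTI) or the mesh-adjacency neighbors of a vertex (CIFTI), as described above.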


Parcellation and functional connectivity estimation [OPTIONAL]
1 change: 1 addition & 0 deletions xcp_d/tests/test_cli.py
@@ -349,6 +349,7 @@ def test_nibabies(data_dir, output_dir, working_dir):
"--head_radius=auto",
"--smoothing=0",
"--fd-thresh=0",
"--mem-gb=1",
]
_run_and_generate(
test_name=test_name,
10 changes: 6 additions & 4 deletions xcp_d/tests/test_workflows_restingstate.py
@@ -51,12 +51,13 @@ def test_nifti_alff(ds001419_data, tmp_path_factory):
alff_wf.base_dir = tempdir
alff_wf.inputs.inputnode.bold_mask = bold_mask
alff_wf.inputs.inputnode.denoised_bold = bold_file
+alff_wf.inputs.inputnode.smoothed_bold = bold_file
compute_alff_res = alff_wf.run()

nodes = get_nodes(compute_alff_res)

# Let's get the mean of the ALFF for later comparison
original_alff = nodes["alff_wf.alff_compt"].get_output("alff")
original_alff = nodes["alff_wf.compute_alff"].get_output("alff")
original_alff_data_mean = nb.load(original_alff).get_fdata().mean()

# Now let's do an FFT
@@ -92,11 +93,12 @@ def test_nifti_alff(ds001419_data, tmp_path_factory):
alff_wf.base_dir = tempdir
alff_wf.inputs.inputnode.bold_mask = bold_mask
alff_wf.inputs.inputnode.denoised_bold = filename
+alff_wf.inputs.inputnode.smoothed_bold = filename
compute_alff_res = alff_wf.run()
nodes = get_nodes(compute_alff_res)

# Let's get the new ALFF mean
new_alff = nodes["alff_wf.alff_compt"].get_output("alff")
new_alff = nodes["alff_wf.compute_alff"].get_output("alff")
assert os.path.isfile(new_alff)
new_alff_data_mean = nb.load(new_alff).get_fdata().mean()

@@ -142,7 +144,7 @@ def test_cifti_alff(ds001419_data, tmp_path_factory):
nodes = get_nodes(compute_alff_res)

# Let's get the mean of the data for later comparison
original_alff = nodes["alff_wf.alff_compt"].get_output("alff")
original_alff = nodes["alff_wf.compute_alff"].get_output("alff")
original_alff_data_mean = nb.load(original_alff).get_fdata().mean()

# Now let's do an FFT
@@ -172,7 +174,7 @@ def test_cifti_alff(ds001419_data, tmp_path_factory):
nodes = get_nodes(compute_alff_res)

# Let's get the new ALFF mean
new_alff = nodes["alff_wf.alff_compt"].get_output("alff")
new_alff = nodes["alff_wf.compute_alff"].get_output("alff")
assert os.path.isfile(new_alff)
new_alff_data_mean = nb.load(new_alff).get_fdata().mean()

1 change: 0 additions & 1 deletion xcp_d/utils/doc.py
@@ -145,7 +145,6 @@
smoothing : :obj:`float`
The full width at half maximum (FWHM), in millimeters,
of the Gaussian smoothing kernel that will be applied to the post-processed and denoised data.
-ALFF and ReHo outputs will also be smoothing with this kernel.
"""

docdict[
21 changes: 9 additions & 12 deletions xcp_d/utils/restingstate.py
@@ -107,7 +107,7 @@ def compute_alff(data_matrix, low_pass, high_pass, TR, sample_mask=None):
Parameters
----------
data_matrix : numpy.ndarray
-data matrix points by timepoints
+data matrix points by timepoints, after optional censoring
low_pass : float
low pass frequency in Hz
high_pass : float
@@ -140,7 +140,11 @@ def compute_alff(data_matrix, low_pass, high_pass, TR, sample_mask=None):

if sample_mask is not None:
sample_mask = sample_mask.astype(bool)
-assert sample_mask.size == n_volumes, f"{sample_mask.size} != {n_volumes}"
+assert sample_mask.sum() == n_volumes, f"{sample_mask.sum()} != {n_volumes}"

+time_arr = np.arange(0, sample_mask.size * TR, TR)
+time_arr = time_arr[sample_mask]
+assert time_arr.size == n_volumes, f"{time_arr.size} != {n_volumes}"

alff = np.zeros(n_voxels)
for i_voxel in range(n_voxels):
@@ -156,26 +160,19 @@ def compute_alff(data_matrix, low_pass, high_pass, TR, sample_mask=None):
# have the same scale.
# However, this also changes ALFF's scale, so we retain the SD to rescale ALFF.
sd_scale = np.std(voxel_data)
+voxel_data -= np.mean(voxel_data)
+voxel_data /= np.std(voxel_data)

if sample_mask is not None:
-voxel_data_censored = voxel_data[sample_mask]
-voxel_data_censored -= np.mean(voxel_data_censored)
-voxel_data_censored /= np.std(voxel_data_censored)

-time_arr = np.arange(0, n_volumes * TR, TR)
-assert sample_mask.size == time_arr.size, f"{sample_mask.size} != {time_arr.size}"
-time_arr = time_arr[sample_mask]
frequencies_hz = np.linspace(0, 0.5 * fs, (n_volumes // 2) + 1)[1:]
angular_frequencies = 2 * np.pi * frequencies_hz
power_spectrum = signal.lombscargle(
time_arr,
-voxel_data_censored,
+voxel_data,
angular_frequencies,
normalize=True,
)
else:
-voxel_data -= np.mean(voxel_data)
-voxel_data /= np.std(voxel_data)
# get array of sample frequencies + power spectrum density
frequencies_hz, power_spectrum = signal.periodogram(
voxel_data,
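The revised convention can be illustrated with a hypothetical call: ``data_matrix`` now holds the already-censored time series (voxels by retained volumes), while ``sample_mask`` keeps the original run length and marks which volumes survive, so ``lombscargle`` can be evaluated at the retained acquisition times. The toy shapes and values are illustrative, and the assumption that the function returns the per-voxel ALFF array follows from the ``alff = np.zeros(n_voxels)`` initialization above.

```python
# Hypothetical usage of compute_alff under the new convention (toy data; the
# return value is assumed to be the per-voxel ALFF array).
import numpy as np
from xcp_d.utils.restingstate import compute_alff

n_voxels, n_volumes_orig = 100, 200
sample_mask = np.ones(n_volumes_orig, dtype=bool)
sample_mask[[10, 11, 57]] = False  # volumes flagged by motion censoring

# The BOLD matrix is already censored: one column per *retained* volume.
rng = np.random.default_rng(0)
censored_bold = rng.standard_normal((n_voxels, int(sample_mask.sum())))

alff = compute_alff(
    data_matrix=censored_bold,
    low_pass=0.08,
    high_pass=0.01,
    TR=2.0,
    sample_mask=sample_mask,
)
```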
3 changes: 2 additions & 1 deletion xcp_d/workflows/bold.py
@@ -296,7 +296,8 @@ def init_postprocess_nifti_wf(
("outputnode.temporal_mask", "inputnode.temporal_mask"),
]),
(denoise_bold_wf, alff_wf, [
("outputnode.denoised_interpolated_bold", "inputnode.denoised_bold"),
("outputnode.censored_denoised_bold", "inputnode.denoised_bold"),
("outputnode.smoothed_denoised_bold", "inputnode.smoothed_bold"),
]),
]) # fmt:skip

3 changes: 2 additions & 1 deletion xcp_d/workflows/cifti.py
@@ -280,7 +280,8 @@ def init_postprocess_cifti_wf(
("outputnode.temporal_mask", "inputnode.temporal_mask"),
]),
(denoise_bold_wf, alff_wf, [
("outputnode.denoised_interpolated_bold", "inputnode.denoised_bold"),
("outputnode.censored_denoised_bold", "inputnode.denoised_bold"),
("outputnode.smoothed_denoised_bold", "inputnode.smoothed_bold"),
]),
]) # fmt:skip

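For readers less familiar with Nipype, the connection tuples changed in ``bold.py`` and ``cifti.py`` follow the standard ``Workflow.connect`` pattern. Below is a self-contained sketch using placeholder ``IdentityInterface`` nodes with hypothetical names; it is not the actual xcp_d workflow construction.

```python
# Schematic of the Nipype connection pattern used in the two hunks above, with
# hypothetical IdentityInterface placeholders instead of the real xcp_d nodes.
from nipype.interfaces import utility as niu
from nipype.pipeline import engine as pe

wf = pe.Workflow(name="sketch_wf")
denoise_out = pe.Node(
    niu.IdentityInterface(fields=["censored_denoised_bold", "smoothed_denoised_bold"]),
    name="denoise_outputnode",
)
alff_in = pe.Node(
    niu.IdentityInterface(fields=["denoised_bold", "smoothed_bold"]),
    name="alff_inputnode",
)
# Each tuple maps an output field of the upstream node to an input field of the
# downstream node; this mirrors the new ALFF wiring.
wf.connect([
    (denoise_out, alff_in, [
        ("censored_denoised_bold", "denoised_bold"),
        ("smoothed_denoised_bold", "smoothed_bold"),
    ]),
])
```

The substantive change is simply which fields are wired up: the ALFF workflow now receives the censored, denoised BOLD rather than the interpolated series, along with the smoothed, denoised BOLD.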