
Antipodally reflected bvecs for sagittal DWIs #863

Closed
Lestropie opened this issue Sep 21, 2024 · 25 comments
@Lestropie

Describe the bug

Where DWIs are acquired with sagittal slice orientation, the saved bvecs are antipodally reflected with respect to the diffusion sensitisation gradients that the sequence was instructed to apply. This happens irrespective of phase encoding direction or slice order.

To reproduce

Build and run the Docker image at: https://github.com/Lestropie/DWI_metadata

Expected behavior

The gradient table utilised during acquisition contains three fiducials at the head, applying diffusion sensitisation gradients along the three axes of the Device Coordinate System. The software tool reads these three columns from bvecs, transforms them to RAS real space according to my understanding of the bvecs format (which is seemingly confirmed correct via other tests), and compares them to the fiducials. If these correspond, no warning will be issued.

Version

The Docker image builds the most recent tag, v1.0.20240202, from source.


PS. In case it's a clue: I found that with Eigen used by MRtrix3, when trying to access the 3x3 portion of an affine transform, .rotation() gives a different result to .linear() only for sagittal acquisition, as the former actually invokes a decomposition.

@neurolabusc
Collaborator

@Lestropie I did create a validation dataset and minimal validation pipeline as dcm_qa_sag. It looks to me like the issue is with Siemens data where sagittal data is saved as slices (either enhanced or separate files) rather than mosaics.

Series 4 (from this VE11 scanner) saves the sagittals as mosaics and the direction looks fine:
4_DWI_SagAPmosaic

The same sequence, but saving the data as one 2D slice per file does not look correct:
6_DWI_SagAP

I suspect this shares the same root cause as issue #797.

@Lestropie
Author

Visualisation of peak orientations does not discriminate between the presence and absence of this particular bug. The reported gradient directions for the sagittal acquisitions are specifically the antipodal reflection of what was actually applied. This is inconsequential to typical fibre orientation estimation, which assumes the diffusion-weighted signal to be antipodally symmetric.

I am assuming that the error is in the dcm2niix output rather than in the DICOMs themselves. With the rectifications put in place in MRtrix3/mrtrix3#3011, which result in code that is both simpler and has a stronger expectation of being correct from first principles, no such reflection is detected. This holds not only for DICOM load but also for an insane barrage of other tests, reading/writing various formats with various configurations. So while it's not inconceivable that there could be adjoint bugs in the DICOM data and the updated MRtrix3 implementation that cancel one another out, leading me to erroneously attribute fault to dcm2niix, I would consider that scenario far less likely.

neurolabusc added a commit to neurolabusc/dcm_qa_sag that referenced this issue Oct 11, 2024
@neurolabusc
Collaborator

OK. Now I understand. b-vectors with polar-opposite directions have identical sensitivity to fiber direction, but the eddy-current distortions they induce have opposite direction and identical magnitude. Therefore, getting all the bvec directions 180 degrees off will produce identical tensors. I also believe that tools like Eddy will not be influenced by this, as they will undistort to the spatial position between the two images. Given that our tools are so tolerant of this polarity reversal, what is the best way to objectively test it? For example, you and I both acquired E11 mosaic data, and I also acquired E11 non-mosaic images. We also want to make sure our solution works for XA data. Maybe you can expand on the consequences of this error, which could help mitigate data acquired before this issue was identified. Can you provide the MRtrix version that fixes this, and also the canonical way to convert DICOM data with MRtrix to NIfTI (rather than mif) format that provides a solution? I am eager to solve this, but want to make sure we have a comprehensive solution that improves dcm2niix and other tools (e.g. dicm2nii).

As an aside, I fixed my validation scripts so fsleyes generates line images for each dataset. These do suggest that the line directions are correct for all the data (although this is expected even when the direction is off by 180 degrees):

Series 4 (Sag AP mosaic):
4_DWI_SagAPmosaic

Series 5 (Sag HF mosaic):

5_DWI_SagHFmosaic

Series 6 (Sag AP not mosaic):
6_DWI_SagAP

Series 7 (Sag HF not mosaic):
7_DWI_SagHF

@Lestropie
Author

I also believe that tools like Eddy will not be influenced by this, as it will undistort to the spatial position between the two images.

I suspect you're right, but not with 100% certainty. All depends on whether the estimation of eddy current distortions makes any explicit assumptions about how eddy currents will manifest for a given diffusion gradient direction and sign, or whether it just finds a low-rank deformation that corrects the distortion relative to some reference.

Given the fact that our tools are so tolerant to this polarity reversal, what is the best way to objectively test this?

What I did in this experiment is create a gradient table for the scanner where the first three DWI volumes are acquired with diffusion sensitisation directions along +X, +Y and +Z of the Device Coordinate System. Those then act as fiducials. I don't have to look at a map of estimated fibre orientations to try to tell if they form smooth trajectories in space, and I don't have to worry about how any software responsible for fibre orientation estimation works (both of those I have downstream tests for, but I treat them separately). I just read the first three bvecs, transform from that coordinate system to RAS realspace based on the image header transform (and hopefully a proper understanding of the bvecs convention), and make sure that those match the fiducials.
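The fiducial check described above can be sketched as follows. This is a minimal illustration rather than the actual test code: `bvecs_to_scanner` is a hypothetical helper, and the fiducials are assumed to already be expressed in RAS world coordinates (the real test must additionally map from the Device Coordinate System).

```python
import numpy as np

def bvecs_to_scanner(bvecs, affine):
    """Transform FSL-style bvecs (3 x N) into world (RAS) space.

    Per the FSL convention, the first bvec component is negated when the
    image has a positive determinant; the column-normalized 3x3 part of
    the voxel-to-world affine then rotates the result into world space.
    """
    R = affine[:3, :3]
    bvecs = np.asarray(bvecs, dtype=float).copy()
    if np.linalg.det(R) > 0:
        bvecs[0, :] *= -1.0
    Rn = R / np.linalg.norm(R, axis=0)  # strip voxel-size scaling
    return Rn @ bvecs

# The first three volumes were acquired along known fiducial directions,
# so the transformed bvecs should match them; an antipodal reflection of
# the kind reported here would flip the sign of every column.
fiducials = np.eye(3)                      # assumed +X, +Y, +Z in world space
affine = np.diag([-1.0, 1.0, 1.0, 1.0])    # a negative-determinant header
bvecs = np.array([[-1.0, 0.0, 0.0],
                  [ 0.0, 1.0, 0.0],
                  [ 0.0, 0.0, 1.0]])       # 3 x 3, one column per volume
world = bvecs_to_scanner(bvecs, affine)
ok = np.allclose(world, fiducials)
```

The helper intentionally performs only the two steps the format demands: the conditional first-component flip, then rotation by the normalized header transform.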

So you'd need a tailored acquisition using this or a similar gradient table if you wanted to extend this test to a broader spectrum of data. I will at least try to do this on our XA50 Vida at some point.

Can you provide the mrtrix version that fixes this and also the canonical way to convert DICOM data with mrtrix to NIfTI (rather than mif) format that provides a solution?

I had believed that MRtrix3/mrtrix3#3011 fully rectified the problem. However, while it passes my metadata tests, it fails some pre-existing MRtrix3 CI tests. So I need to fully resolve that before I can definitively say that the fixed MRtrix3 implementation is correct.

@neurolabusc
Collaborator

Closing for the upcoming stable release; happy to reopen once everything is fully resolved. It is my belief that all of our current tools generate identical results whether or not this error is present. The fact that the FSL bvec format requires flipping the first component if the image has a positive determinant is bizarre, and I lobbied against it when the BIDS standard was created. I fully endorse your efforts to ensure compliant and consistent behavior across manufacturers and tools, and have tremendous appreciation for your work on this.

@mharms
Collaborator

mharms commented Oct 30, 2024

Curious why you call it bizarre to flip the polarity of the first bvec component. Isn't that done just so that the coordinate system of the bvecs matches the orientation of the image?

@neurolabusc
Collaborator

neurolabusc commented Oct 30, 2024

The confusion with the FSL bvec format is that you flip the first component if the image has a positive determinant, but not if it has a negative determinant. Different FSL tools are also inconsistent regarding whether the first dimension is mirrored for positive determinants: for example, fslmaths will mirror the image while fslstats does not. Consider an image in RAS orientation, where the command fslstats img -x will give coordinates that are incompatible with fslmaths's tfceS function. This feature of the bvec format has historically caused issues with tools, and it is not clearly defined in the BIDS specification.

@mharms
Collaborator

mharms commented Oct 30, 2024

I'm not aware of the behavior of the other FSL tools, but as regards the bvecs, it would seem more "bizarre" to me if the coordinate system of the bvecs and the orientation of the image were not consistent with each other.

@mharms
Collaborator

mharms commented Oct 30, 2024

Just to clarify, when you refer to flipping the sign of the first component of the bvec based on the determinant/orientation of the image, you're referring to what dcm2niix needs to do internally prior to writing out the bvec file, right (if the output NIfTI has the typical left-to-right, posterior-to-anterior, inferior-to-superior orientation)?

@neurolabusc
Collaborator

@mharms the FSL/BIDS bvec format uses image space. Consider the vector [+1 0 0]. This means the vector points in the ascending order of columns as saved to disk (i+) if the image has a negative determinant, but in descending order (i-) if the image has a positive determinant. You could transform a NIfTI image from a positive to a negative determinant by flipping any single dimension: column [i], row [j], or slice [k]. Likewise, you could invert the determinant by mirroring all three at once [i,j,k]. The FSL convention of flipping just i is arbitrary. Of the 48 possible combinations, 24 will have the column order mirrored (the sign patterns +++, +--, -+-, --+). For brevity, consider these six examples that show how each spatial dimension can be mirrored in order to enforce a negative determinant:

disk -> fsl internal
R×A×S -> L×A×S
A×R×S -> P×R×S
S×A×R -> I×A×R
L×A×I -> R×A×I
P×R×I -> A×R×I
I×R×P -> S×R×P
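The claim that 24 of the 48 axis orderings are right-handed (and so get the first axis flipped under this convention) can be verified by brute force; a sketch assuming unit direction vectors:

```python
import itertools
import numpy as np

# Enumerate all 6 axis permutations x 8 sign combinations = 48 orientations,
# build the 3x3 direction matrix for each, and count positive determinants.
count = 0
for perm in itertools.permutations(range(3)):           # axis orderings
    for signs in itertools.product([1, -1], repeat=3):  # axis polarities
        R = np.zeros((3, 3))
        for axis, (world, sign) in enumerate(zip(perm, signs)):
            R[world, axis] = sign
        if np.linalg.det(R) > 0:
            count += 1
print(count)  # 24: exactly half of the orientations are right-handed
```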

Internally, many (but not all) FSL tools mirror the columns when they load an image with a positive determinant. This enforces a left-handed coordinate system (negative determinant). From its inception, dcm2niix has supported this convention for bvecs, allowing users to validate vectors with FSL tools. However, other tools such as MRtrix (which uses a more elegant format internally) have had trouble supporting the FSL convention.

While I believe that dcm2niix is robust, it requires a lot of unusual logic. For example, the flipY option not only mirrors rows [j] but also inverts the determinant (so the bvec polarity for both the rows and columns must be flipped [i,j]). Likewise, you will see that the FreeSurfer team has added a lot of logic behind USING_DCM2NIIXFSWRAPPER for these situations.

@mharms
Collaborator

mharms commented Oct 30, 2024

Perhaps I'm confused about the relationship between the determinant (which is not something I typically think about) and the orientation (as reported by fslhd). I've always assumed that for any given reported orientation (which I define here by the {x,y,z}orient fields returned by fslhd -- not the RADIOLOGICAL/NEUROLOGICAL distinction as returned by fslorient), the sign of the associated determinant follows deterministically, and that there is a one-to-one correspondence between the reported orientation and the order and direction in which the data are (assumed to be) physically saved to disk. Is that not the case? [I realize that there may be some language confusion here, because to me, "orientation" is the LAS, RAS, PRI (etc.) stuff; not the distinction between a positive (right-handed) or negative (left-handed) determinant, which is what fslorient reports as NEUROLOGICAL or RADIOLOGICAL, respectively.]

So, if I understand correctly, you're saying that the FSL/BIDS bvec format uses "image space", which to me is defined by the reported orientation. Keeping those in sync seems good to me (although I don't know what format MRtrix uses internally, for comparison). Or stated differently, to give a concrete example: if one took the usual LAS-oriented NIfTI (by which I mean the fslhd-reported xorient is "Right-to-Left"), which fslorient reports as "RADIOLOGICAL", and converted that to an RAS-oriented ("NEUROLOGICAL") NIfTI, one would want to concurrently invert the sign of the first component of the bvec file, right? [OR, are you saying that in that situation you should actually leave the polarity of the first component of the bvec file unchanged, since the FSL bvec convention itself is strictly "radiological" REGARDLESS of the determinant (radiological vs. neurological) "orientation" of its associated NIfTI?]

@Lestropie
Author

Firstly, I need to note that the issue of conditional flipping of the first bvecs element is somewhat distinct from the specific issue reported here. The DWI NIfTIs generated by dcm2niix all have negative determinants, presumably because it means that there would be no consequence for any downstream software that were to get this particular detail wrong. That is distinct from the antipodal reflection between applied and reported gradient direction reported here. If there is more to be done on documentation of that feature of the bvecs format, it might be better discussed in a separate Issue.

Despite that, I will however add to that conversation, specifically in the way in which it relates to the tool that justifies this bug report, since it may provide insight into bvecs over and above the antipodal reflection issue.

I acquired DWI data where I provided the MRI scanner with a bespoke gradient table. The contents of that table commence as follows:

[directions=27]
CoordinateSystem = xyz
Normalisation = none
Vector[0]  = (  1.000000000000000,  0.000000000000000,  0.000000000000000 )
Vector[1]  = (  0.000000000000000,  1.000000000000000,  0.000000000000000 )
Vector[2]  = (  0.000000000000000,  0.000000000000000,  1.000000000000000 )
...

So for those first three volumes, I know the direction in which the diffusion sensitisation directions were applied. These are based on "+X" / "+Y" / "+Z" in the Device Coordinate System (DCS). Therefore, I can perform any sequence of image manipulation operations, using any file format, and at the end I can read the diffusion gradient information and refer back to these three 3-vectors to see if there has been any corruption or misinterpretation.

For validation of operations involving the bvecs format, I need to be able to interpret the contents of the bvecs file directly; I don't want my validation to be tied to any specific software tool. I need to appropriately convert the contents of the bvecs file, making use of the corresponding image header transform, into 3-vectors in "real" / "scanner" space, so that I can compare the first three diffusion sensitisation directions against these fiducials, which are a genuine ground truth as I know that's what the scanner was instructed to do. That check should be faithful to the description of the format from FSL, but also verified using as many independent checks as possible.

Code for that check itself is here. That block of code can essentially serve as a reference implementation for how data stored in a bvecs file should be interpreted.

The key here is that it is not as simple as saying "the bvecs are defined with respect to image space". That is only true if the corresponding image has a negative determinant. If it has a positive determinant, then the sign of the first element needs to be flipped before transforming from image space to real / scanner space. So the language here needs to be more careful. One could perhaps say something like "the bvecs are defined with respect to the neurological view of that image" (assuming that the "neurological vs. radiological" and "negative vs. positive determinant" are exchangeable); sometimes that "view" is exactly identical to the image, sometimes it is not.

Now one could hypothesize that maybe my understanding of the bvecs format is incorrect. To address that requires ensuring that my interpretation is congruent with that of FSL itself. Here's how I've tried to address that. I acquired 24 DWI series, with every possible combination of slice direction / slice order / phase encoding direction. That results in 24 DICOM series, with every possible relationship between row / column / slice and anatomical orientations. I convert to NIfTI & bvec / bval using multiple different software tools & settings, and validate the bvecs using the approach above. I then run FSL's own tools for diffusion model fitting (both dtifit and bedpostx). If my interpretation is correct, then the resulting fibre orientations will form long and smooth trajectories through the white matter. I can test this by performing tractography, and ensuring that long streamlines are produced; if I can find any alternative interpretation of those fibre orientations---whether it be flipping / permuting axes / angles---that produces longer streamlines (through a modified version of this published method; code), then I suspect that I've given that FSL tool bvecs information that is incommensurate with the way in which FSL interprets such data.

Having fixed various MRtrix3 issues in this context---some known a priori, others not---this all reports total congruence; except for the dcm2niix outputs, which report antipodal flipping of gradient vectors of sagittal acquisitions.


Regarding the specific questions from @mharms, I'm hesitant to try to answer any specific one, because in my experience it only requires a lack of mutual consensus understanding of any one term ("neurological / radiological", "strides", "orientation", "coordinate system", "RAS" etc., "image space / real space") or assumption (eg. doing manual manipulation of individual files vs. processing the fileset as a whole), or an incomplete description, and the whole conversation is led astray. All I can really say is that if you are performing DWI manipulations, do whatever is necessary from first principles to not have an invalid gradient table. If unsure for N=1, check the data; if writing software to do such manipulations, it should be exhaustively checked against these data using these tools.

@mharms
Collaborator

mharms commented Oct 31, 2024

Firstly, I need to note that the issue of conditional flipping of the first bvecs element is somewhat distinct from the specific issue reported here. The DWI NIfTIs generated by dcm2niix all have negative determinants, presumably because it means that there would be no consequence for any downstream software that were to get this particular detail wrong. That is distinct from the antipodal reflection between applied and reported gradient direction reported here. If there is more to be done on documentation of that feature of the bvecs format, it might be better discussed in a separate Issue.

Yes, I admittedly went off on a tangent with my question, prompted by @neurolabusc's comment about the "bizarre" nature of the bvec format if the image has a positive determinant.

The key here is that it is not as simple as saying "the bvecs are defined with respect to image space". That is only true if the corresponding image has a negative determinant. If it has a positive determinant, then the sign of the first element needs to be flipped before transforming from image space to real / scanner space.

That sounds like my "[OR, are you saying ..." interpretation above. i.e., If you for some reason changed the orientation of an existing NIFTI from +LAS (negative determinant) to +RAS (positive determinant), you would NOT simultaneously change the sign of the first element of the corresponding existing bvec. I had always, apparently incorrectly, assumed that the FSL bvecs were (strictly) "defined with respect to image space". I didn't know that was only true if the q/sform has a negative determinant. Thanks for the information!

@neurolabusc
Collaborator

neurolabusc commented Oct 31, 2024

@mharms this is out of scope for this topic, but for completeness I will describe why determinants matter for 3D space. Feel free to suggest a better place to document this. A simple way to think about the determinant is that it tells you whether the object has positive or negative volume.

Consider a NIfTI image where the columns (i), rows (j), and slices (k) map to the world-space directions Right, Anterior and Superior (aka RAS). In this example, the columns are 2 mm between voxel centers, the rows 3 mm and the slices 4 mm, so the voxel volume is 24 mm³. The volume is positive (it has a positive determinant):

X Y Z
I 2 0 0
J 0 3 0
K 0 0 4

If we save the voxels to disk in reverse column order (mirroring the image), we can losslessly store this as LAS. However, note that the volume of this object is now negative 24 mm³ (it has a negative determinant):

X Y Z
I -2 0 0
J 0 3 0
K 0 0 4
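The two direction matrices above can be checked numerically; this is just the standard signed-volume interpretation of the determinant, sketched with numpy:

```python
import numpy as np

# Direction matrix for the RAS example: 2/3/4 mm spacing along
# columns (i), rows (j) and slices (k).
ras = np.diag([2.0, 3.0, 4.0])
print(np.linalg.det(ras))       # ≈ 24.0: positive volume, right-handed

# Mirror the column direction (RAS -> LAS): the signed volume flips.
las = np.diag([-2.0, 3.0, 4.0])
print(np.linalg.det(las))       # ≈ -24.0: negative volume, left-handed
```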

The important aspect is that we can not apply any combination of rigid body transforms (translations and rotations) to align the second image with the first image. To convert between the two images, we need to add a negative polarity zoom.

This is why we say the handedness has been reversed. With my thumb, index finger and middle finger fixed orthogonal to each other, I can rotate my right hand so that the thumb faces left, right, up or down, but I cannot match the alignment of my left hand (I would have to reverse the polarity of my thumb direction).

[figure: left-handed vs right-handed coordinate systems]

You could think of reversing the determinant as turning an object inside out. This is certainly the effect for triangulated meshes, where the winding order is used to discriminate the front face from the back face. For example, in the mesh image below, the viewer-relative back faces appear gray while the front faces are colored:


So some FSL tools swap column order when loading images, but they do not apply the same change to the bvec. As you note, if you mirror the columns of a NIfTI image (e.g. going from RAS to LAS in the example above), you would not change the bvec file (as FSL will load both as LAS). On the other hand, if you reversed the row order (e.g. going from LAS to LPS), you would need to flip the sign of both the first and second components of the bvec file. You flip the first component since FSL will load the image as RPS, and you flip the second component because you have reversed row order in image space. While this is the convention BIDS has adopted, I do think it is not intuitive, and pragmatically it has led to a lot of unintended consequences and kludges in our code bases.
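The LAS-to-LPS example in the preceding paragraph can be sketched as follows (hypothetical helper name; bvecs are assumed to be a 3 x N array, one column per volume):

```python
import numpy as np

def bvecs_after_row_reversal(bvecs):
    """Adjust FSL-style bvecs after reversing the row (j) order of an
    LAS image (giving LPS).

    Reversing j flips the second bvec component directly; it also inverts
    the image determinant, so under the FSL convention the first component
    must be flipped as well. The slice component is untouched.
    """
    return np.asarray(bvecs, dtype=float) * np.array([[-1.0], [-1.0], [1.0]])

print(bvecs_after_row_reversal([[1.0], [1.0], [1.0]]).ravel())
```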

@neurolabusc
Collaborator

@Lestropie I notice that all your examples are stored as mosaics. Are you able to tell if the same issue applies to data saved as slices? In the validation dataset dcm_qa_sag I provide the same DWI sequences saved as mosaics (series 4 and 5) and slices (6 and 7). I note that dcm2niix saves the mosaics to disk as PSL, while the slices are saved as PSR, so the slice order is different.

@mharms
Collaborator

mharms commented Oct 31, 2024

It's not so much the notion of the determinant per se that I was wondering about (although I appreciate the pictures), but whether the sign/handedness of the determinant is somehow independent of the LAS, RAS, etc. "orientation". (Answer: no, it is not -- once you know the orientation, you know the sign/handedness.)

So some FSL tools swap column order when loading images, but they do not apply the same change to the bvec. As you note, if you mirror the columns in a NIfTI image (e.g. going from RAS to LAS in the example above), you would not change the bvec file (as FSL will load both as LAS).

Makes sense, although I suspect you meant to write "(e.g. going from LAS to RAS in the example above)", which was my example per se; although for any orientation change between LAS and RAS (in either direction) the same principle would apply -- i.e., you wouldn't make any change to the bvec in either situation.

On the other hand, if you reversed the row order (e.g. going from LAS to LPS), you would need to flip the sign of both the first and second components of the bvec file. You flip the first component since FSL will load the image as RPS, and you flip the second component because you have reversed row order in image space.

Also makes sense. Thanks. Learned something new!

@mharms
Collaborator

mharms commented Oct 31, 2024

FWIW, where things are still murky to me is how FSL sets up its internal space for an arbitrary NIfTI orientation -- e.g., what does it do if the first dimension is not the L/R axis, such as a "PLS" (left-handed) or "ALS" (right-handed) NIfTI? But that's indeed out of scope, so feel free to ignore. :)

@mharms
Collaborator

mharms commented Oct 31, 2024

What is perhaps relevant to this discussion is that I think another way to summarize the above is that the FSL dMRI tools simply load the bvec and assume that those values apply "as is" to whatever internal space FSL happens to load a particular NIfTI into (which is controlled not by the dMRI tools themselves, but by their general libraries for loading NIfTI). I.e., the dMRI tools are not aware of, nor do they attempt to account for, any change of handedness (or potentially even reordering of axes) that may occur during the loading of the image data itself. Correct?

@neurolabusc
Collaborator

FSL tools that expect a negative determinant always flip the first (column, i) dimension. See the table I posted above, where R×A×S -> L×A×S while A×R×S -> P×R×S. You can see this by loading the 48 possible combinations from NIfTIspace into FSLeyes and exploring the relationship between the voxel coordinates and the world-space coordinates. Note that most raw images also include angulation, so they are not orthogonally aligned in world space.

A tool that wants to support this bvec format must always inspect the input image determinant. The tool can either:

  1. Flip the first image dimension if the determinant is positive
  2. Invert the polarity of the first bvec dimension if the determinant is positive
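A minimal sketch of option 1 under stated assumptions (hypothetical names; `img` is a 3-D numpy array and `affine` its 4x4 voxel-to-world matrix): flip the first image dimension and update the affine so that world coordinates are unchanged, leaving a negative determinant.

```python
import numpy as np

def flip_first_dim(img, affine):
    """Mirror the first (i) dimension and compensate in the affine.

    The new voxel index i' maps to the old index (n - 1) - i', so the
    affine is post-multiplied by that index mapping. World positions of
    all voxels are preserved while the determinant changes sign.
    """
    n = img.shape[0]
    M = np.eye(4)
    M[0, 0] = -1.0
    M[0, 3] = n - 1
    return img[::-1, ...], affine @ M
```

Option 2 is simpler still: leave the image alone and negate the first row of the bvecs whenever `np.linalg.det(affine[:3, :3]) > 0`.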

@mharms
Collaborator

mharms commented Oct 31, 2024

@Lestropie, sorry but lest we confuse someone else regarding your earlier comment:

The key here is that it is not as simple as saying "the bvecs are defined with respect to image space". That is only true if the corresponding image has a negative determinant. If it has a positive determinant, then the sign of the first element needs to be flipped before transforming from image space to real / scanner space. So the language here needs to be more careful. One could perhaps say something like "the bvecs are defined with respect to the neurological view of that image" (assuming that the "neurological vs. radiological" and "negative vs. positive determinant" are exchangeable); sometimes that "view" is exactly identical to the image, sometimes it is not.

I know you said the language about this can be confusing, so maybe this is a case of that, but did you inadvertently swap your use of "neurological" and "radiological" in that paragraph? ("radiological" is the negative determinant, while "neurological" is the positive determinant). i.e., based on my current understanding, did you mean to write the following?:

... So the language here needs to be more careful. One could perhaps say something like "the bvecs are defined with respect to the radiological view of that image" (assuming that the "radiological vs. neurological" and "negative vs. positive determinant" are exchangeable);

@neurolabusc
Collaborator

I use the term Radiological and Neurological to refer to differences in viewer location rather than spatial coordinates. Neurologists and radiologists agree on what is the patient's left-right, anterior-posterior and inferior-superior direction. They show volume renderings of images in the same way. The difference is that when looking at 2D slices the depth direction is reversed. The radiologist imagines themselves looking at a patient from their feet (red arrow), while the neurologist imagines themselves looking from a vantage point superior to the head (green arrow):


@mharms
Collaborator

mharms commented Oct 31, 2024

Yes, there's "Radiological" and "Neurological" in the sense of the viewer, but I don't think that's the sense in which those terms have been used here. Any given viewer can choose to display images any way it wants (or even provide an option to swap between viewing conventions -- L hemi on the right side of the display for "radiological", or L hemi on the left side of the display for "neurological"), independently of the determinant of the underlying NIFTI, since that is merely a display issue.

In the context of this page, it seems we have been using those terms as proxies for the sign of the determinant (handedness of the NIFTI orientation), not the convention used for viewing them.

@Lestropie
Author

Yes, I admittedly went off on a tangent with my question

All good, no stress; happens all the time, myself included. It does tend to make such discussions easier to find in the future if they're attached to an issue with corresponding description.

Though one issue here is where such content should actually reside. The FSL documentation is perhaps still a little brief in this regard. My project at https://github.com/Lestropie/DWI_metadata might actually be a somewhat sensible place to try to give a better description / figures to point to for comprehending bvecs.

If you for some reason changed the orientation of an existing NIFTI from +LAS (negative determinant) to +RAS (positive determinant), you would NOT simultaneously change the sign of the first element of the corresponding existing bvec.

Questions like this are always tricky because I never quite know, for instance, if:

  • You are suggesting strictly a modification of the header transform and not touching the image data, or are making corresponding changes to both transformation and ordering of image data
  • You are suggesting modifying the NIfTI both in-place and in isolation, and therefore require determining the manual corresponding change to the bvecs. For instance, in MRtrix3 if you import the bvecs for the input image and then write bvecs for the output image the requisite modifications to bvecs will be done internally for you, and therefore any subsequent manual manipulation would be erroneous.

So the whole thing's a minefield 😬

Are you able to tell if the same issue applies to data saved as slices?

I can see if it's possible to load the existing DICOMs back onto the scanner and re-save without mosaic. Otherwise I'd need to re-acquire. Which can be done if need be, I need to do similar for XA anyways.

I think another way to summarize above is that the FSL dMRI tools simply load the bvec and assumes that those values apply "as is" to whatever internal space FSL happens to load a particular NIFTI into

I think it's something like that. I don't have any insight into exactly which components of their API are responsible for what; but I do know that this negative-determinant convention applies to FSL tools beyond the DWI context, e.g. FLIRT transforms.

Yes, there's "Radiological" and "Neurological" in the sense of the viewer, but I don't think that's the sense in which those terms have been used here.

I use the term Radiological and Neurological to refer to differences in viewer location rather than spatial coordinates.

I agree that this is the best usage; I have however seen some use these terms to refer to the sign of the determinant. Ideally any new documentation about bvecs would be very clear about the distinction. In retrospect I probably leant on this the wrong way in trying to land on a one-liner. So I would retract that definition. "The bvecs are defined with respect to the first-axis-flipped-if-positive-determinant view of that image" doesn't quite roll off the tongue... 🙃

@mharms
Collaborator

mharms commented Nov 6, 2024

FWIW: The reason that I defined "radiological" and "neurological" in terms of the sign of the determinant, and not how those terms get interpreted in the sense of viewing, is that it is my understanding that is precisely what FSL is doing when it returns either RADIOLOGICAL or NEUROLOGICAL as the output of fslorient. (i.e., FSL's use of those terms in the context of fslorient is based entirely on the sign of the determinant, and has nothing to do with any viewer).

So I would retract that definition. "The bvecs are defined with respect to the first-axis-flipped-if-positive-determinant view of that image" doesn't quite roll off the tongue...

Again, I don't see how the "view of the image" comes into play here, which seems to me to be a completely separate and independent thing related to how any given viewer chooses to display the underlying NIFTI.

@Lestropie
Author

Again, I don't see how the "view of the image" comes into play here

I think you're interpreting my use of "view" here specifically as visualisation of a two-dimensional slice on a plane. I had intended it to be more like "I can apply some projection to the data to look at it in a different way, without modifying the underlying image intensities". It's probably going to take a few attempts to get these descriptions right for public consumption... 😬
