Antipodally reflected bvecs for sagittal DWIs #863
Comments
@Lestropie I did create a validation dataset and minimal validation pipeline as dcm_qa_sag. It looks to me like the issue is with Siemens data where sagittal data is saved as slices (either enhanced or separate files) rather than mosaics. Series 4 (from this VE11 scanner) saves the sagittals as mosaics and the direction looks fine. The same sequence, but saving the data as one 2D slice per file, does not look correct. I suspect this shares the same root cause as issue 797.
Visualisation of peak orientations does not discriminate between the presence and absence of this particular bug. The reported gradient directions for the sagittal acquisitions are specifically the antipodal reflection of what was actually applied. This is inconsequential to typical fibre orientation estimation, which assumes the diffusion-weighted signal to be antipodally symmetric. I am assuming that the error is in the
OK. Now I understand. b-vectors that have polar opposite directions have identical sensitivity to fiber direction, but their eddy-current distortions have opposite direction and identical magnitude. Therefore, getting all the bvec directions 180 degrees off will create identical tensors. I also believe that tools like Eddy will not be influenced by this, as it will undistort to the spatial position between the two images. Given the fact that our tools are so tolerant to this polarity reversal, what is the best way to objectively test this? For example, you and I both acquired E11 mosaic data, and I also acquired E11 non-mosaic images. We also want to make sure our solution works for XA data. Maybe you can expand on the consequences of this error, which could help mitigate the impact on data acquired before this issue was identified. Can you provide the MRtrix version that fixes this, and also the canonical way to convert DICOM data with MRtrix to NIfTI (rather than mif) format that provides a solution? I am eager to solve this, but want to make sure we have a comprehensive solution that improves dcm2niix and other tools (e.g. dicm2nii). As an aside, I did fix my validation scripts so that fsleyes generates line images for each dataset. This does suggest that the line directions are correct for all the data (although this is expected when the direction is off by 180 degrees). Series 5 (Sag HF mosaic):
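One way to see why the polarity reversal yields identical tensors: in the tensor model, each gradient direction g enters the fit only through products of pairs of its components (gx², 2·gx·gy, …), all of which are unchanged when g is negated. A quick illustrative sketch (the helper below is hypothetical, not taken from any of the tools discussed here):

```python
# Why an antipodal flip is invisible to tensor fitting (sketch): the
# log-signal model depends on g only via gᵀ D g, i.e. via pairwise
# products of components, so negating g leaves every term unchanged.

def design_row(g):
    """Row of a tensor-fit design matrix for unit gradient g:
    [gx², gy², gz², 2·gx·gy, 2·gx·gz, 2·gy·gz]."""
    gx, gy, gz = g
    return [gx * gx, gy * gy, gz * gz,
            2 * gx * gy, 2 * gx * gz, 2 * gy * gz]
```

Because every entry is even in g, `design_row(g)` and `design_row(-g)` are identical, which is exactly why downstream tensor estimates cannot reveal this bug.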
I suspect you're right, but not with 100% certainty. It all depends on whether the estimation of eddy current distortions makes any explicit assumptions about how eddy currents will manifest for a given diffusion gradient direction and sign, or whether it just finds a low-rank deformation that corrects the distortion relative to some reference.
What I did in this experiment is create a gradient table for the scanner where the first three DWI volumes are acquired with diffusion sensitisation directions along +X, +Y and +Z of the Device Coordinate System. Those then act as fiducials. I don't have to look at a map of estimated fibre orientations to try to tell if they form smooth trajectories in space, and I don't have to worry about how any software responsible for fibre orientation estimation works (both of those I have downstream tests for, but I treat them separately). I just read the first three bvecs, transform from that coordinate system to RAS realspace based on the image header transform (and hopefully a proper understanding of the bvecs convention), and make sure that those match the fiducials. So you'd need a tailored acquisition using this or a similar gradient table if you wanted to extend this test to a broader spectrum of data. I will at least try to do this on our XA50 Vida at some point.
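The fiducial check described above can be sketched as follows. All names and values here are hypothetical (not the actual DWI_metadata code), and the FSL conditional first-component flip is deliberately ignored for clarity:

```python
# Sketch of the fiducial check: the first three gradient directions are
# known a priori (+X, +Y, +Z), so after any conversion we can map the
# stored vectors to world space and compare against them.

def matvec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def close(a, b, tol=1e-6):
    return all(abs(x - y) < tol for x, y in zip(a, b))

# Hypothetical rotation part of the voxel-to-world transform
# (identity: image axes already aligned with world axes).
rotation = [[1.0, 0.0, 0.0],
            [0.0, 1.0, 0.0],
            [0.0, 0.0, 1.0]]

# Fiducials: sensitisation applied along +X, +Y, +Z in world space.
fiducials = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def check_fiducials(bvecs, rotation, fiducials):
    """True if the first three stored vectors, mapped to world space by
    the image rotation, match the known fiducial directions."""
    return all(close(matvec(rotation, b), f)
               for b, f in zip(bvecs[:3], fiducials))
```

An antipodally reflected table (e.g. the first vector stored as [-1, 0, 0]) fails this check even though fibre orientation estimation downstream would look perfectly normal.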
I had believed that MRtrix3/mrtrix3#3011 fully rectified this. However, while it passes my metadata tests, it fails some pre-existing MRtrix3 CI tests. So I need to fully resolve that before I can definitively say that the fixed MRtrix3 implementation is correct.
Closing for the upcoming stable release. Happy to reopen once everything is fully resolved. It is my belief that all of our current tools generate identical results even if this error exists. Happy to revisit once everything is fully rectified. The fact that the FSL bvec format requires flipping the first component if the image has a positive determinant is bizarre, and I lobbied against it when the BIDS standard was created. I fully endorse your efforts to ensure compliant and consistent behavior across manufacturers and tools. I have tremendous appreciation for your efforts on this.
Curious why you call it bizarre to flip the polarity of the first bvec component. Isn't that done just so that the coordinate system of the bvecs matches the orientation of the image?
The confusion with the FSL bvec is that you flip the first component if the image has a positive determinant, but not if it has a negative determinant. Different FSL tools are inconsistent regarding whether the first dimension is mirrored for positive determinants. For example, fslmaths will mirror the image while fslstats does not: consider an image in RAS orientation, where the command fslstats img -x will give coordinates that are incompatible with fslmaths' tfceS function. This feature of the bvec format has historically caused issues with tools, and it is not clearly defined in the BIDS specification.
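The conditional flip under discussion can be made concrete with a small sketch. This is illustrative only; the function names are hypothetical and this is not dcm2niix's or FSL's actual code:

```python
# Illustration of the FSL/BIDS bvec convention: when the image
# voxel-to-world matrix has a positive determinant, the first (i)
# component of each gradient vector is negated; otherwise it is not.

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def to_fsl_bvec(vec, voxel_to_world):
    """Apply the conditional first-component flip."""
    if det3(voxel_to_world) > 0:        # positive determinant: flip i
        return [-vec[0], vec[1], vec[2]]
    return list(vec)                    # negative determinant: unchanged

# Hypothetical 3x3 blocks: RAS (positive det) vs LAS (negative det).
ras = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]]
las = [[-2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]]
```

The same gradient direction is thus stored with a different first-component sign depending solely on the handedness of the image it accompanies, which is the behaviour being called "bizarre" here.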
I'm not aware of the behavior of the other FSL tools, but as regards the bvecs, it would seem more "bizarre" to me if the coordinate system of the bvecs and the orientation of the image were not consistent with each other.
Just to clarify, when you refer to flipping the sign of the first component of the bvec based on the determinant/orientation of the image, you're referring to what
@mharms the FSL/BIDS bvec format uses image space. Consider the vector [+1 0 0]. This means that the vector is pointing in the ascending order of columns as saved to disk (i+) if the image has a negative determinant, but in descending order (i-) if the image has a positive determinant. You could transform a NIfTI image from a positive to a negative determinant by flipping any single dimension: column [i], row [j], or slice [k]. Likewise, you could invert the determinant by mirroring all three at once [i,j,k]. The FSL convention of flipping just i is arbitrary. Of the 48 possible combinations, 24 will have the column order mirrored (the columns +++, +--, -+-, --+). For brevity, consider these six examples that show how each spatial dimension can be mirrored in order to enforce a negative determinant:
Internally, many (but not all) FSL tools mirror the columns when they load an image with a positive determinant. This enforces a left-handed coordinate system (negative determinant). From inception, dcm2niix has supported this convention for bvecs, allowing users to validate vectors with FSL tools. However, other tools such as MRtrix (which uses a more elegant format internally) have had trouble supporting FSL tools. While I believe that dcm2niix is robust, it requires a lot of unusual logic. For example, the flipY option not only mirrors rows [j] but also inverts the determinant (so the polarity of the bvec for both the rows and columns must be flipped [i,j]). Likewise, you will see that the FreeSurfer team has added a lot of logic
Perhaps I'm confused on the relationship between the determinant (which is not something I typically think about) and the orientation (as reported by

So, if I understand correctly, you're saying that the FSL/BIDS bvec format uses "image space", which to me is defined by the reported orientation. Keeping those in sync seems good to me (although I don't know what format MRtrix uses internally, for comparison). Or stated differently, to give a concrete example, if one took the usual LAS oriented NIFTI (by which I mean the
Firstly, I need to note that the issue of conditional flipping of the first

Despite that, I will however add to that conversation, specifically in the way in which it relates to the tool that justifies this bug report, since it may provide insight into

I acquired DWI data where I provided the MRI scanner with a bespoke gradient table. The contents of that table commence as follows:
So for those first three volumes, I know the direction in which the diffusion sensitisation directions were applied. These are based on "+X" / "+Y" / "+Z" in the Device Coordinate System (DCS). Therefore, I can perform any sequence of image manipulation operations, using any file format, and at the end I can read the diffusion gradient information and refer back to these three 3-vectors to see if there has been any corruption or misinterpretation. For validation of operations involving the

Code for that check itself is here. That block of code can essentially serve as a reference implementation for how data stored in a

The key here is that it is not as simple as saying "the

Now one could hypothesize that maybe my understanding of the

Having fixed various MRtrix3 issues in this context---some known a priori, others not---this all reports total congruence; except for the

Regarding the specific questions from @mharms, I'm hesitant to try to answer any specific one, because in my experience it only requires a lack of mutual consensus understanding of any one term ("neurological / radiological", "strides", "orientation", "coordinate system", "RAS" etc., "image space / real space") or assumption (eg. doing manual manipulation of individual files vs. processing the fileset as a whole), or an incomplete description, and the whole conversation is led astray. All I can really say is that if you are performing DWI manipulations, do whatever is necessary from first principles to not have an invalid gradient table. If unsure for N=1, check the data; if writing software to do such manipulations, it should be exhaustively checked against these data using these tools.
Yes, I admittedly went off on a tangent with my question, prompted by @neurolabusc's comment about the "bizarre" nature of the
That sounds like my "[OR, are you saying ..." interpretation above. i.e., If you for some reason changed the orientation of an existing NIFTI from +LAS (negative determinant) to +RAS (positive determinant), you would NOT simultaneously change the sign of the first element of the corresponding existing
@mharms this is out of scope for this topic, but for completeness I will describe why determinants matter for 3D space. Feel free to suggest a better place to document this. A simple way to think about a determinant is whether the object has positive or negative volume. Consider a NIfTI image where the columns (i), rows (j), and slices (k) map to the world space Right, Anterior and Superior directions (aka RAS). In this example, the columns are 2mm between voxel centers, the rows 3mm and the slices 4mm, so the voxel volume is 24mm³. The volume is positive (it has a positive determinant):
If we save the voxels to disk with the reverse column order (mirroring the images) we could losslessly store this as LAS. However, note that the volume of this object is now negative 24mm³ (it has a negative determinant).
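The signed-volume idea can be checked numerically: the determinant of the 3×3 voxel-to-world block is the signed voxel volume. A sketch using the 2/3/4 mm example above (hypothetical, orthogonally aligned matrices with no angulation):

```python
# The determinant of the 3x3 voxel-to-world block equals the signed
# voxel volume; reversing one axis reverses its sign (handedness).

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# Columns 2 mm (R), rows 3 mm (A), slices 4 mm (S): RAS, +24 mm³.
ras = [[2.0, 0.0, 0.0],
       [0.0, 3.0, 0.0],
       [0.0, 0.0, 4.0]]

# Same voxels stored with reversed column order: LAS, -24 mm³.
las = [[-2.0, 0.0, 0.0],
       [ 0.0, 3.0, 0.0],
       [ 0.0, 0.0, 4.0]]
```

No rotation or translation can change the sign of this determinant, which is the algebraic form of the left-hand/right-hand argument.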
The important aspect is that we can not apply any combination of rigid body transforms (translations and rotations) to align the second image with the first image. To convert between the two images, we need to add a negative-polarity zoom. This is why we say that the handedness has been reversed. When my thumb, index finger and middle finger are fixed orthogonal to each other, I can rotate my right hand so that the thumb faces left, right, up or down, but I can not match the alignment of my left hand (I would have to reverse the polarity of my thumb direction). You could think of reversing the determinant as turning an object inside out. This is certainly the effect for triangulated meshes, where the winding order is used to discriminate the front face from the back face. For example, in the mesh image below, the viewer-relative back faces appear gray while the front faces are colored.

So some FSL tools swap column order when loading images, but they do not apply the same change to the bvec. As you note, if you mirror the columns in a NIfTI image (e.g. going from RAS to LAS in the example above), you would not change the bvec file (as FSL will load both as LAS). On the other hand, if you swapped row order (e.g. going from LAS to LPS), you will need to flip the sign of the first and second components of the bvec file. You flip the first component since FSL will load the image as RPS, and you flip the second component because you have reversed row order in image space. While this is the convention BIDS has adopted, I do think it is not intuitive, and pragmatically it has led to a lot of unintended consequences and kludges in our code bases.
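This bookkeeping can be condensed into one rule. A sketch (hypothetical helper name), assuming the input vector already follows the FSL convention for its image:

```python
# Updating an FSL-convention bvec after mirroring one image axis:
# flip that component (the image-space change), and also flip the first
# component (because mirroring any single axis reverses the
# determinant, which toggles FSL's conditional i-flip).

def mirror_axis_bvec(vec, axis):
    """Return the FSL-convention bvec after image axis `axis`
    (0=columns/i, 1=rows/j, 2=slices/k) has been mirrored."""
    out = list(vec)
    out[axis] = -out[axis]   # the axis actually mirrored
    out[0] = -out[0]         # determinant sign changed: toggle i-flip
    return out
```

For axis 0 the two flips cancel, matching the RAS-to-LAS case above where the bvec file is left unchanged; for axis 1 both the first and second components change sign, matching the LAS-to-LPS case.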
@Lestropie I notice that all your examples are stored as mosaics. Are you able to tell if the same issue applies to data saved as slices? In the validation dataset dcm_qa_sag I provide the same DWI sequences saved as mosaics (series 4 and 5) and slices (6 and 7). I note that dcm2niix saves the mosaics to disk as PSL, while the slices are saved as PSR, so the slice order is different.
It's not so much the notion of the determinant per se that I was wondering about (although I appreciate the pictures) but whether the sign/handedness of the determinant is somehow an independent entity from the LAS, RAS, etc. "orientation". (Answer: No, it is not -- once you know the orientation, you know the sign/handedness.)
Makes sense, although I suspect you meant to write "(e.g. going from LAS to RAS in the example above)", which was my example per se; but for any orientation change to/from LAS/RAS (or vice-versa), the same principle would apply -- i.e., you wouldn't make any change to the bvec in either situation.
Also makes sense. Thanks. Learned something new!
FWIW, where things are still murky to me is how does FSL set up its internal space for any arbitrary NIFTI orientation -- e.g., what does it do if the first dimension is not the L/R axis, such as a "PLS" (left-handed) or "ALS" (right-handed) NIFTI. But that's indeed out of scope, so feel free to ignore. :)
What is perhaps relevant to this discussion is that I think another way to summarize the above is that the FSL dMRI tools simply load the bvec and assume that those values apply "as is" to whatever internal space FSL happens to load a particular NIFTI into (which is controlled not by the dMRI tools themselves, but by their general libraries for loading NIFTI). i.e., the dMRI tools are not aware of, nor do they attempt to account for, any change of the handedness (or potentially even reordering of axes) that may or may not occur during the loading of the image data itself. Correct?
FSL tools that expect a negative determinant always flip the first (columns, i) dimension. See the table I posted above, where R×A×S -> L×A×S while A×R×S -> P×R×S. You can see this by loading the 48 possible combinations from NIfTIspace into FSLeyes and exploring the relationship between the voxel coordinates and the world space coordinates. Note that most raw images also include angulation, so they are not orthogonally aligned in world space. A tool that wants to support this bvec format must always inspect the input image determinant. The tool can either
@Lestropie, sorry but lest we confuse someone else regarding your earlier comment:
I know you said the language about this can be confusing, so maybe this is a case of that, but did you inadvertently swap your use of "neurological" and "radiological" in that paragraph? ("radiological" is the negative determinant, while "neurological" is the positive determinant). i.e., based on my current understanding, did you mean to write the following?: ... So the language here needs to be more careful. One could perhaps say something like "the
I use the term
Yes, there's "Radiological" and "Neurological" in the sense of the viewer, but I don't think that's the sense in which those terms have been used here. Any given viewer can choose to display images any way it wants (or even provide an option to swap between viewing conventions -- L hemi on the right side of the display for "radiological", or L hemi on the left side of the display for "neurological"), independently of the determinant of the underlying NIFTI, since that is merely a display issue. In the context of this page, it seems we have been using those terms as proxies for the sign of the determinant (handedness of the NIFTI orientation), not the convention used for viewing them. |
All good, no stress; happens all the time, myself included. It does tend to make such discussions easier to find in the future if they're attached to an issue with corresponding description. Though one issue here is where such content should actually reside. The FSL documentation is perhaps still a little brief in this regard. My project at https://github.com/Lestropie/DWI_metadata might actually be a somewhat sensible place to try to give a better description / figures to point to for comprehending
Questions like this are always tricky because I never quite know, for instance, if:
So the whole thing's a minefield 😬
I can see if it's possible to load the existing DICOMs back onto the scanner and re-save without mosaic. Otherwise I'd need to re-acquire. Which can be done if need be, I need to do similar for XA anyways.
I think it's something like that. I don't have any insight into exactly what components of their API are responsible for what; but I do know that this negative-determinant applies to FSL tools beyond the DWI context, eg. FLIRT transforms.
I agree that this is the best usage; I have however seen some use these terms to refer to the sign of the determinant. Ideally any new documentation about
FWIW: The reason that I defined "radiological" and "neurological" in terms of the sign of the determinant, and not how those terms get interpreted in the sense of viewing, is that it is my understanding that that is precisely what FSL is doing when it returns either
Again, I don't see how the "view of the image" comes into play here, which seems to me to be a completely separate and independent thing related to how any given viewer chooses to display the underlying NIFTI.
I think you're interpreting my use of "view" here specifically as visualisation of a two-dimensional slice on a plane. I had intended for it to be moreso something like "I can apply some projection to the data to look at it in a different way but without modifying the underlying image intensities". It's probably going to take a few attempts at getting these descriptions correct for public consumption... 😬
Describe the bug
Where DWIs are acquired with sagittal slice orientation, the saved `bvecs` are antipodally reflected with respect to the diffusion sensitisation gradients that the sequence was instructed to apply. This happens irrespective of phase encoding direction or slice order.

To reproduce
Build and run the Docker image at: https://github.com/Lestropie/DWI_metadata

Expected behavior
The gradient table utilised during acquisition contains three fiducials at the head, applying diffusion sensitisation gradients along the three axes of the Device Coordinate System. The software tool reads these three columns from `bvecs`, transforms them to RAS real space according to my understanding of the `bvecs` format (which is seemingly confirmed correct via other tests), and compares them to the fiducials. If these correspond, no warning will be issued.

Version
The Docker image builds the most recent tag, v1.0.20240202, from source.

PS. In case it's a clue: I found that with Eigen used by MRtrix3, when trying to access the 3x3 portion of an affine transform, `.rotation()` gives a different result to `.linear()` only for sagittal acquisition, as the former actually invokes a decomposition.