first draft of symmetrical 25um blackcap atlas #314
Open: alessandrofelder wants to merge 11 commits into main from package-blackcaps
Changes from 9 of 11 commits:
6ca66b2 first draft of symmetrical 50um blackcap atlas (alessandrofelder)
6492188 switch to 25um data (alessandrofelder)
3101c10 polish blackcap for Oldenburg visit (alessandrofelder)
32792ff polish for new detailed annotations (alessandrofelder)
9375b1e combined magneto+detailed areas, more filtering and polishing (alessandrofelder)
191b19d tweaks for Dominik to double-check (alessandrofelder)
db9caec packaged bird with DH checked annotations (alessandrofelder)
ff7cfee remove validation script (to be integrated in separate PR) (alessandrofelder)
ef1e4b7 clearly label one-off scripts needed for reproducibility (alessandrofelder)
42af7db move blackcap packaging to pooch and GIN (alessandrofelder)
33d3c8c update blackcap minor version (alessandrofelder)
@@ -0,0 +1,80 @@
"""Helper functions to read annotation metadata from common formats."""

from pathlib import Path


def split_label_text(name: str) -> tuple[str, str]:
    """Split a label like 'Hyperpallium (H)' into name and acronym."""
    if name.endswith(")"):
        name, acronym = name.split("(")
        name = name[:-1]  # drop trailing space
        acronym = acronym[:-1]  # drop trailing )
    else:
        acronym = name[0]  # fall back to the first character
    return name, acronym


def read_itk_labels(path: Path) -> list[dict]:
    """Read labels from an ITK-SNAP label description file."""
    labels = []
    with open(path) as labels_file:
        for line in labels_file:
            if not line.startswith("#"):
                raw_values = line.split(maxsplit=7)
                id = int(raw_values[0])
                rgb = tuple(int(r) for r in raw_values[1:4])
                if raw_values[7][-1] == "\n":
                    raw_values[7] = raw_values[7][:-1]
                label_text = raw_values[7][1:-1]  # strip surrounding quotes
                if label_text != "Clear Label":
                    name, acronym = split_label_text(label_text)
                    labels.append(
                        {
                            "id": id,
                            "name": name,
                            "rgb_triplet": rgb,
                            "acronym": acronym,
                        }
                    )
    return labels


ITK_SNAP_HEADER = """################################################
# ITK-SnAP Label Description File
# File format:
# IDX -R- -G- -B- -A-- VIS MSH LABEL
# Fields:
# IDX: Zero-based index
# -R-: Red color component (0..255)
# -G-: Green color component (0..255)
# -B-: Blue color component (0..255)
# -A-: Label transparency (0.00 .. 1.00)
# VIS: Label visibility (0 or 1)
# MSH: Label mesh visibility (0 or 1)
# LABEL: Label description
################################################
"""

ITK_CLEAR_LABEL = '0 0 0 0 0 0 0 "Clear Label"\n'


def write_itk_labels(path: Path, labels):
    """Write labels to an ITK-SNAP label description file."""
    with open(path, "w") as labels_file:
        labels_file.write(ITK_SNAP_HEADER)
        labels_file.write(ITK_CLEAR_LABEL)
        for label in labels:
            labels_file.write(
                f"{label['id']} "
                f"{label['rgb_triplet'][0]} "
                f"{label['rgb_triplet'][1]} "
                f"{label['rgb_triplet'][2]} 1.00 1 1 "
                f"\"{label['name'] + ' (' + label['acronym'] + ')'}\"\n"
            )


# TODO turn into test
if __name__ == "__main__":
    path = Path.home() / "Downloads" / "corrected_LabelMainBrainAreas_SW.txt"
    labels = read_itk_labels(path)
    [print(label) for label in labels]
    write_itk_labels(
        Path.home() / "Downloads" / "test-writing.txt", labels=labels
    )
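A minimal sketch of the line format `read_itk_labels` expects, parsing a single ITK-SNAP label line the same way the helper does. The sample line (label text, colours) is made up for illustration, not taken from the atlas:

```python
# Parse one ITK-SNAP label line: IDX R G B A VIS MSH "LABEL"
# (the sample line below is hypothetical)
line = '1 255 0 0 1.00 1 1 "Hyperpallium (H)"\n'

# maxsplit=7 keeps the quoted label (with its internal space) in one piece
raw_values = line.split(maxsplit=7)
label_id = int(raw_values[0])
rgb = tuple(int(r) for r in raw_values[1:4])
label_text = raw_values[7].rstrip("\n")[1:-1]  # strip surrounding quotes

# split 'Name (ACRO)' into its parts, as split_label_text does
name, acronym = label_text[:-1].split(" (")

print(label_id, rgb, name, acronym)  # 1 (255, 0, 0) Hyperpallium H
```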
brainglobe_atlasapi/atlas_generation/atlas_scripts/blackcap.py (221 additions, 0 deletions)
@@ -0,0 +1,221 @@
__version__ = "0"

import csv
import os
import time
from pathlib import Path

import numpy as np
from brainglobe_utils.IO.image import load_nii
from rich.progress import track

from brainglobe_atlasapi.atlas_generation.annotation_utils import (
    read_itk_labels,
)
from brainglobe_atlasapi.atlas_generation.mesh_utils import (
    Region,
    create_region_mesh,
)
from brainglobe_atlasapi.atlas_generation.wrapup import wrapup_atlas_from_data
from brainglobe_atlasapi.structure_tree_util import get_structures_tree


def create_atlas(working_dir, resolution):
    ATLAS_NAME = "oldenburg_blackcap"
    SPECIES = "Sylvia atricapilla"
    ATLAS_LINK = "https://uol.de/en/ibu/animal-navigation"
    CITATION = "unpublished"
    ATLAS_FILE_URL = "https://uol.de/en/ibu/animal-navigation"  # noqa
    ORIENTATION = "asr"
    ROOT_ID = 999
    ATLAS_PACKAGER = "BrainGlobe Developers, [email protected]"
    ADDITIONAL_METADATA = {}

    # set up input and output paths
    atlas_path = Path(
        "/media/ceph/margrie/sweiler/AnalyzedData/"
        "Bird_brain_project/black_cap/DH_checked/"
    )

    structures_file = atlas_path / "labels051224.txt"
    annotations_file = atlas_path / "annotations_051224.nii.gz"
    reference_file = atlas_path / "reference_res-25_hemi-right_IMAGE.nii.gz"
    meshes_dir_path = Path.home() / "blackcap-meshes"

    try:
        os.mkdir(meshes_dir_path)
    except FileExistsError:
        pass  # mesh folder already exists

    print("Reading structures files")
    hierarchy_path = atlas_path / "combined_structures.csv"
    structure_to_parent_map = {}
    with open(hierarchy_path, mode="r") as file:
        reader = csv.reader(file)
        next(reader)  # skip the header
        for row in reader:
            structure_to_parent_map[int(row[1])] = [
                int(parent) for parent in row[2].split(",")
            ]

    structure_data_list = read_itk_labels(structures_file)
    for structure in structure_data_list:
        structure_id = structure["id"]
        structure["structure_id_path"] = structure_to_parent_map[structure_id]

    # append root and pallium structures, which don't have their own voxels
    # and are therefore not in the itk file
    structure_data_list.append(
        {
            "id": 1,
            "name": "Pallium",
            "acronym": "P",
            "structure_id_path": [999, 1],
            "rgb_triplet": [0, 200, 100],
        }
    )
    structure_data_list.append(
        {
            "id": 999,
            "name": "root",
            "acronym": "root",
            "structure_id_path": [999],
            "rgb_triplet": [255, 255, 255],
        }
    )

    tree = get_structures_tree(structure_data_list)
    print(
        f"Number of brain regions: {tree.size()}, "
        f"max tree depth: {tree.depth()}"
    )
    print(tree)
    print(f"Saving atlas data at {atlas_path}")

    # read the annotation volume
    annotated_volume = load_nii(annotations_file, as_array=True).astype(
        np.uint16
    )
    # append a mirror image of the annotation volume along axis 2
    mirrored_volume = np.flip(annotated_volume, axis=2)
    annotated_volume = np.concatenate(
        (annotated_volume, mirrored_volume), axis=2
    )

    # rescale reference volume into uint16 range
    reference_volume = load_nii(reference_file, as_array=True)
    dmin = np.min(reference_volume)
    dmax = np.max(reference_volume)
    drange = dmax - dmin
    dscale = (2**16 - 1) / drange
    reference_volume = (reference_volume - dmin) * dscale
    reference_volume = reference_volume.astype(np.uint16)
    # append a mirror image of the reference volume along axis 2
    mirrored_volume = np.flip(reference_volume, axis=2)
    reference_volume = np.concatenate(
        (reference_volume, mirrored_volume), axis=2
    )

    # mask the reference to annotated voxels only
    has_label = annotated_volume > 0
    reference_volume *= has_label

    # flag the regions that need a mesh
    labels = np.unique(annotated_volume).astype(np.int_)
    for key, node in tree.nodes.items():
        # Pallium == 1 needs a mesh but has no voxels of its own
        is_label = key in labels or key == 1
        node.data = Region(is_label)

    # mesh creation
    closing_n_iters = 1
    decimate_fraction = 0.6  # higher = more triangles
    smooth = False

    start = time.time()

    for node in track(
        tree.nodes.values(),
        total=tree.size(),
        description="Creating meshes",
    ):
        create_region_mesh(
            (
                meshes_dir_path,
                node,
                tree,
                labels,
                annotated_volume,
                ROOT_ID,
                closing_n_iters,
                decimate_fraction,
                smooth,
            )
        )

    print(
        "Finished mesh extraction in: ",
        round((time.time() - start) / 60, 2),
        " minutes",
    )

    # create meshes dict
    meshes_dict = dict()
    structures_with_mesh = []
    for s in structure_data_list:
        # check that a mesh was created and isn't empty
        mesh_path = meshes_dir_path / f"{s['id']}.obj"
        if not mesh_path.exists():
            print(f"No mesh file exists for: {s}, ignoring it.")
            continue
        if mesh_path.stat().st_size < 512:
            print(f"obj file for {s} is too small, ignoring it.")
            continue
        structures_with_mesh.append(s)
        meshes_dict[s["id"]] = mesh_path

    print(
        f"In the end, {len(structures_with_mesh)} "
        "structures with mesh are kept"
    )

    output_filename = wrapup_atlas_from_data(
        atlas_name=ATLAS_NAME,
        atlas_minor_version=__version__,
        citation=CITATION,
        atlas_link=ATLAS_LINK,
        species=SPECIES,
        resolution=resolution,
        orientation=ORIENTATION,
        root_id=ROOT_ID,
        reference_stack=reference_volume,
        annotation_stack=annotated_volume,
        structures_list=structure_data_list,
        meshes_dict=meshes_dict,
        scale_meshes=True,
        working_dir=working_dir,
        hemispheres_stack=None,
        cleanup_files=False,
        compress=True,
        atlas_packager=ATLAS_PACKAGER,
        additional_metadata=ADDITIONAL_METADATA,
    )

    return output_filename


if __name__ == "__main__":
    res = 25, 25, 25
    bg_root_dir = Path.home() / "brainglobe_workingdir"
    bg_root_dir.mkdir(exist_ok=True, parents=True)

    create_atlas(bg_root_dir, res)
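The hemisphere-mirroring and intensity-rescaling steps in create_atlas can be sketched on a toy array; the shape and values below are illustrative only:

```python
import numpy as np

# a tiny stand-in for one hemisphere of the reference image, shape (1, 2, 2)
half = np.array([[[0.0, 0.5], [1.0, 0.25]]])

# rescale into the uint16 range, as done for the reference volume
dmin, dmax = half.min(), half.max()
scaled = ((half - dmin) * ((2**16 - 1) / (dmax - dmin))).astype(np.uint16)

# append a mirror image along axis 2 to obtain a left-right symmetric volume
symmetric = np.concatenate((scaled, np.flip(scaled, axis=2)), axis=2)

print(symmetric.shape)  # (1, 2, 4)
```

By construction the result is its own mirror image along axis 2, which is what makes the packaged atlas symmetric.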
.../atlas_generation/atlas_scripts/blackcap_reproducibility/_combine_blackcap_annotations.py (67 additions, 0 deletions)
@@ -0,0 +1,67 @@
"""One-off script to reproduce how we combined the first version of
the annotations provided by the bird anatomists."""

import csv
from pathlib import Path

import numpy as np
from brainglobe_utils.IO.image import load_nii, save_any

from brainglobe_atlasapi.atlas_generation.annotation_utils import (
    read_itk_labels,
    write_itk_labels,
)

if __name__ == "__main__":
    # set up paths
    annotations_root = (
        Path("/media/ceph-neuroinformatics/neuroinformatics/atlas-forge/BlackCap")
        / "templates/template_sym_res-25um_n-18_average-trimean/for_atlas"
        / "annotations-right-hemisphere"
    )
    main_annotation_path = (
        annotations_root / "corrected_AnnotationsMainBrainAreas_SW.nii.gz"
    )
    small_annotation_path = annotations_root / "corrected_smallareas_SW.nii.gz"
    main_labels_path = annotations_root / "corrected_LabelMainBrainAreas_SW.txt"
    small_labels_path = annotations_root / "corrected_smallareas_SW.txt"
    small_to_main_csv = annotations_root / "hierarchy_annotat1_annotat2.csv"

    # combine annotation images
    main_annotation_image = load_nii(main_annotation_path, as_array=True)
    small_annotation_image = load_nii(small_annotation_path, as_array=True)
    # offset small-area labels to avoid clashes: there are 6 main labels
    small_annotation_image *= 10
    small_annotation_image = small_annotation_image.astype(np.uint8)
    combined_annotation_image = main_annotation_image.copy()

    small_to_main_map = {}
    with open(small_to_main_csv, mode="r") as file:
        reader = csv.reader(file)
        next(reader)  # skip the header
        for row in reader:
            print(row)
            if row[3]:
                small_to_main_map[10 * int(row[3])] = int(row[1])
    # write each small label only where it overlaps its parent main area
    for small, main in small_to_main_map.items():
        is_main_value = main_annotation_image == main
        is_small_value = small_annotation_image == small
        combined_indices = is_main_value * is_small_value
        combined_annotation_image[combined_indices] = small
    save_any(
        combined_annotation_image,
        annotations_root / "combined_annotations.nii.gz",
    )

    # combine label data
    main_annotation_labels = read_itk_labels(main_labels_path)
    small_annotation_labels = read_itk_labels(small_labels_path)
    for label in small_annotation_labels:
        label["id"] = int(10 * label["id"])
    combined_labels = main_annotation_labels + small_annotation_labels
    write_itk_labels(
        annotations_root / "combined_label_descriptions.txt", combined_labels
    )
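The combination rule in the script above (offset small-area labels by a factor of 10, then keep them only where they overlap their parent main area) can be sketched on toy 2x2 arrays; all values here are illustrative:

```python
import numpy as np

# toy main-area and small-area annotation images
main = np.array([[1, 1], [2, 2]], dtype=np.uint8)
small = np.array([[3, 0], [3, 0]], dtype=np.uint8) * 10  # offset: 3 -> 30

small_to_main = {30: 1}  # small label 30 belongs to main area 1

combined = main.copy()
for small_label, main_label in small_to_main.items():
    # overwrite only voxels where the small label sits inside its parent
    overlap = (main == main_label) & (small == small_label)
    combined[overlap] = small_label

print(combined)
```

Note that the small label at position (1, 0) is discarded because it falls inside main area 2, not its declared parent 1; this mirrors the intersection filtering the script performs.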
Review comment: Does this atlas fail our "normal" requirement that all data is available online, and that the script should run without access to any specific filesystem? Should we create a blackcap repo on GIN and upload these files?
Reply: Done, see https://gin.g-node.org/BrainGlobe/blackcap_materials