
Commit

Merging.
ksr committed Nov 13, 2019
2 parents 3d9f0e3 + ee0ad93 commit dda645b
Showing 81 changed files with 2,023 additions and 5,047 deletions.
3 changes: 0 additions & 3 deletions .gitmodules

This file was deleted.

2 changes: 1 addition & 1 deletion .travis.yml
@@ -9,7 +9,7 @@ install:
- export PATH="$HOME/miniconda/bin:$PATH"
- hash -r
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
# - conda update -q conda
- conda info -a
- conda create -q -n test-env python=$TRAVIS_PYTHON_VERSION
- conda activate test-env
8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -1,20 +1,20 @@
Contributing to EEGIO
Contributing to NeuroimgPipe
======================

(adopted from scikit-learn)

The latest contributing guide is available in the repository at
`docs/contributing.rst`, or online at:

There are many ways to contribute to EEGIO, with the most common ones
There are many ways to contribute to NeuroimgPipe, with the most common ones
being contribution of code or documentation to the project. Improving the
documentation is no less important than improving the library itself. If you
find a typo in the documentation, or have made improvements, do not hesitate to
submit a GitHub pull request. Documentation can be found under the
[doc/](https://github.com/adam2392/eegio/tree/master/doc) directory.
[doc/](https://github.com/adam2392/neuroimg_pipeline/tree/master/doc) directory.

But there are many other ways to help. In particular answering queries on the
[issue tracker](https://github.com/adam2392/eegio/issues), and
[issue tracker](https://github.com/adam2392/neuroimg_pipeline/issues), and
investigating bugs are very valuable contributions that decrease the burden on
the project maintainers.

65 changes: 19 additions & 46 deletions README.md
@@ -7,13 +7,11 @@
<a href="https://codeclimate.com/github/adam2392/neuroimg_pipeline/maintainability"><img src="https://api.codeclimate.com/v1/badges/2c7d5910e89350b967c8/maintainability" /></a>
![GitHub repo size](https://img.shields.io/github/repo-size/adam2392/neuroimg_pipeline)

By: Adam Li

Date: 10/4/18

This repo describes Sarma lab effort to pipeline explicitly a neuroimaging data workflow that involves T1 MRI, CT, DTI
This repo describes the Sarma lab's effort to explicitly pipeline a neuroimaging data workflow that involves T1 MRI, CT,
and iEEG data (ECoG or SEEG).

For incorporation of DTI data, see ndmg: https://github.com/neurodata/ndmg

<!-- MarkdownTOC -->

- Setup and Installation
@@ -30,28 +28,27 @@ and iEEG data (ECoG, or SEEG).
- [ ] Add support for MRICloud running using R-script. Possibly convert to Python script.
- [ ] Create Python pipeline for running everything, or submit a PR to Chang Lab's repo to run SEEG.
- [ ] Create unit and integration tests using pytest that test: pipeline in both snakemake and Python
- [ ] Add visualization features for SEEG that can be synced into Chang's lab repo.
- [ ] Add Docker container for project

# Setup and Installation
There are a couple of tools that you need to install on your system before everything works. You are recommended to use a Linux-based OS.
Follow the links and tutorials for each respective tool to install it. Preferably this is done via Docker or Singularity, but if not, then:

0. Anaconda and Python3.6+
* Conda (https://docs.anaconda.com/anaconda/install/)
* This is mainly necessary to run img_pipe (ECoG localization with Chang Lab repo), snakemake, and any Python wrapper code
Anaconda and Python 3.6+:
* Conda (https://docs.anaconda.com/anaconda/install/)
* This is mainly necessary to run img_pipe (ECoG localization with Chang Lab repo), snakemake, and any Python wrapper code

i. Conda env


# probably doesn't work
# create environment
conda create -n neuroimgpipe
conda activate neuroimgpipe

# optionally separate install
conda config --add channels bioconda
conda config --add channels conda-forge
conda install numpy scipy matplotlib pytest scikit-learn pandas seaborn nibabel mne snakemake ipykernel
conda install numpy scipy matplotlib pytest scikit-learn scikit-image pandas seaborn nibabel mne snakemake ipykernel
conda install -c flyem-forge/label/upgrade201904 marching_cubes
conda env export > environment.yml

@@ -63,9 +60,6 @@ Follow links and tutorials on each respective tool to install. Preferably this i
conda install -c conda-forge pyvtk
pip install img-pipe==2019.3.15.2

iii. (Optional and outdated) Update all submodules

git submodule update --init --recursive
## Modules to Install
1. Reconstruction
@@ -100,14 +94,20 @@ Follow links and tutorials on each respective tool to install. Preferably this i
8. (Optional) Nonlinear Registration NDREG:
* NDReg (https://github.com/neurodata/ndreg)

# Data Organization

We use BIDS.
See https://github.com/bids-standard/bids-starter-kit/wiki/The-BIDS-folder-hierarchy

# Running Your Own Image Patients

0. Set up your raw data directory for your patient to be read in:

* /patient_id/
- /premri/
- /postct/
- /dti/ (optional)
- /mri/
- /ct/

For more details, see BIDS: https://bids.neuroimaging.io/
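The raw layout above can be sanity-checked with a small script before running the pipeline. A minimal sketch, assuming the folder names shown (`mri/`, `ct/`, with `dti/` optional); the `check_patient_dir` helper is hypothetical and not part of the pipeline:

```python
import os
import tempfile

def check_patient_dir(root):
    # Return the required subfolders (mri/, ct/) that are missing under root.
    required = ["mri", "ct"]
    return [d for d in required if not os.path.isdir(os.path.join(root, d))]

# Build a matching layout in a temporary directory and verify it.
with tempfile.TemporaryDirectory() as patient_dir:
    for sub in ("mri", "ct", "dti"):  # dti/ is optional but allowed
        os.makedirs(os.path.join(patient_dir, sub))
    print(check_patient_dir(patient_dir))  # → []
```

A non-empty result tells you which scans are missing before FreeSurfer or the CT coregistration steps fail later.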

1. (Optional) Setup data directory for your patient in FreeSurfer format:

@@ -158,12 +158,6 @@ data directories of your data. This is under pipeline/config/localconfig.yaml
<change directories and variables>
<run GUI>

## Docker and Singularity
TBD
1. Freesurfer with FSL
2. MRTrix3
3. NDReG

# Pipeline Description
At a high level, this pipeline takes a patient's neuroimaging data and produces usable data about the brain's geometry, its parcellation into atlas regions, the connectivity between brain regions as measured by white matter tracts, and channel localization in MRI space.
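The four outputs listed above can be sketched as a simple container. This is a hypothetical illustration only; the class and field names below are not the pipeline's actual API:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class PipelineOutput:
    # Hypothetical container mirroring the outputs described above.
    surface_vertices: List[Tuple[float, float, float]]   # brain geometry (mesh points)
    parcellation: Dict[str, int]                         # atlas region name -> label id
    connectivity: List[List[float]]                      # region-by-region white-matter weights
    contacts_mri: Dict[str, Tuple[float, float, float]]  # channel name -> MRI-space (x, y, z)

out = PipelineOutput(
    surface_vertices=[(0.0, 0.0, 0.0)],
    parcellation={"lh-precentral": 1},
    connectivity=[[0.0]],
    contacts_mri={"A1": (10.0, -5.0, 32.0)},
)
print(sorted(out.contacts_mri))  # → ['A1']
```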

@@ -228,8 +222,8 @@ At a high level, this pipeline is taking neuroimaging data of a patient to produ
TBD

# Localizing Electrodes Process
For NCSL specific Readme, check out: [link](docs/contact_localization/localizingelectrodes_instructions.pdf)
For general-purpose Readme, check out: [link](docs/contact_localization/localizingelectrodes_instructions.pdf)
For NCSL specific Readme, check out: [link](docs/extra_docs/contact_localization/localizingelectrodes_instructions.pdf)
For general-purpose Readme, check out: [link](docs/extra_docs/contact_localization/localizingelectrodes_instructions.pdf)

To localize contacts using only the FieldTrip toolbox GUI or the img_pipe GUI, follow these instructions:
1. Install SPM (preferably 12): https://www.fil.ion.ucl.ac.uk/spm/software/spm12/
@@ -266,27 +260,6 @@ language.

sphinx-quickstart


# Testing
Install testing and formatting libs:

conda install sphinx black pytest pytest-cov coverage
pip install coverage-badge

Run tests

black neuroimg/*
black tests/*
pylint ./neuroimg/
pre-commit run black --all-files
pytest --cov-config=.coveragerc --cov=./neuroimg/ tests/
coverage-badge -f -o coverage.svg

Tests are organized into two directories right now:
1. eegio/: consists of all unit tests related to various parts of eegio.
2. api/: consists of various integration tests that test when eegio should work.
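A unit test in the first category might look like the following sketch. The helper under test here is a toy stand-in, not actual eegio code:

```python
def normalize_channel_name(name):
    # Toy helper standing in for library code under test:
    # strip whitespace, uppercase, and drop stray apostrophes.
    return name.strip().upper().replace("'", "")

def test_normalize_channel_name():
    assert normalize_channel_name(" a1' ") == "A1"
    assert normalize_channel_name("POL B2") == "POL B2"

# pytest collects test_* functions automatically; call directly here.
test_normalize_channel_name()
print("ok")  # → ok
```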


### Pipeline Process Visualized
[DAG of Pipeline in Snakemake](neuroimg/pipeline/dag_neuroimaging_pipeline.pdf)

19 changes: 19 additions & 0 deletions docs/Makefile
@@ -0,0 +1,19 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SOURCEDIR = .
BUILDDIR = _build

# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
173 changes: 173 additions & 0 deletions docs/conf.py
@@ -0,0 +1,173 @@
# -*- coding: utf-8 -*-
#
# Configuration file for the Sphinx documentation builder.
#
# This file does only contain a selection of the most common options. For a
# full list see the documentation:
# http://www.sphinx-doc.org/en/master/config

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


# -- Project information -----------------------------------------------------

project = 'neuroimgpipe'
copyright = '2019, Adam Li'
author = 'Adam Li'

# The short X.Y version
version = ''
# The full version, including alpha/beta/rc tags
release = '0.1'


# -- General configuration ---------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = None


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'alabaster'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# The default sidebars (for documents that don't match any pattern) are
# defined by theme itself. Builtin themes are using these templates by
# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
# 'searchbox.html']``.
#
# html_sidebars = {}


# -- Options for HTMLHelp output ---------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'neuroimgpipedoc'


# -- Options for LaTeX output ------------------------------------------------

latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',

# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',

# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',

# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'neuroimgpipe.tex', 'neuroimgpipe Documentation',
'Adam Li', 'manual'),
]


# -- Options for manual page output ------------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'neuroimgpipe', 'neuroimgpipe Documentation',
[author], 1)
]


# -- Options for Texinfo output ----------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'neuroimgpipe', 'neuroimgpipe Documentation',
author, 'neuroimgpipe', 'One line description of project.',
'Miscellaneous'),
]


# -- Options for Epub output -------------------------------------------------

# Bibliographic Dublin Core info.
epub_title = project

# The unique identifier of the text. This can be a ISBN number
# or the project homepage.
#
# epub_identifier = ''

# A unique identification for the text.
#
# epub_uid = ''

# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']
