This repository contains everything you'll need for the pybrain workshop "MRI analysis in Python using Nipype, Nilearn and more" in Cambridge, 9th to 10th of November 2020.
Registration for this workshop can be done via pybrain-workshop.github.io (Update: Course is unfortunately already fully booked!)
This 2-day workshop runs from 09:30 to 15:30 on both days. Lunch breaks are scheduled from 12:00 to 13:00. Individual short breaks can be taken during the multiple independent exercise sessions throughout the workshop.
The schedule might still be subject to some small changes, but the bigger picture looks as follows.
Day 1

- 09:30-10:00: Workshop overview
- 10:00-10:45: Quick recap on Jupyter, Python and more*
- 10:45-12:00: Explore MRI data with Nibabel and Nilearn
- 12:00-13:00: Lunch
- 13:00-13:30: How to set up your system, using Conda and Docker
- 13:30-15:00: Functional connectivity and machine learning
- 15:00-15:30: Innovations in neuroimaging tools (Part 1)
- 15:30-...: Open ended for questions
Day 2

- 09:30-10:00: Introduction to Nipype
- 10:00-11:00: Exploration of Nipype's building blocks
- 11:00-12:00: Creating a Nipype pipeline from A to Z
- 12:00-13:00: Lunch
- 13:00-14:00: PyBIDS and Nistats
- 14:00-15:00: Multivariate pattern analysis using Searchlight and Deep Learning
- 15:00-15:30: Innovations in neuroimaging tools (Part 2)
- 15:30-...: Open ended for questions
There are four ways to profit from this workshop:
- Via Docker (recommended): Full workshop experience, interactive, with all software dependencies.
- Via Conda (Python): Almost full workshop experience, interactive, with only Python-based software dependencies (i.e. no support for FSL, SPM or ANTs).
- Via Mybinder: Full workshop experience, interactive and online, with restricted computational power.
- Via Jupyter NBViewer: Only visual exploration of the workshop content, no interactive aspect.
If you want the full experience, choose this option. This is also the one you should use if you participate in the November 2020 workshop. The only thing you'll need to install for this workshop is Docker Desktop (for more detailed instructions, see here: Mac, Windows and Linux).
Once Docker Desktop is set up, open a (docker) terminal and run the following command to verify that everything is good to go:
docker run hello-world
Now, the only thing that's still missing is the actual workshop content. To get this, open a (docker) terminal again and run the following command:
docker pull miykael/workshop_pybrain
Given that the whole workshop content is more than 8 GB in size, downloading this container image might take a while.
Once the container is pulled and on your system, you're all good to go. To access the workshop and open the Jupyter Notebook environment, please follow these steps:
1. Open a (docker) terminal and run the following command:

   docker run -p 8888:8888 -it --rm miykael/workshop_pybrain

2. Open http://127.0.0.1:8888/?token=pybrain or http://localhost:8888/?token=pybrain in your web browser to access the workshop content.
3. Once Jupyter Notebook is open, double click on the notebook called program.ipynb - et voilà.
Note: Should you by any chance encounter a "Password or token needed" message, use the token pybrain to log in.
Everything you do within this docker container will be reset the moment you terminate the docker run ...
command from step one (or close the terminal). This means that any output file created within the docker container will be deleted, and any notes and changes within the notebooks will be lost. To prevent this from happening, either (1) manually download the changed notebooks (i.e. File > Download As > Notebook (.ipynb)) or (2) share a folder between the container and your system, allowing direct transfer of data, by adding the -v
flag to the docker run ...
command.
For example, something like docker run -p 8888:8888 -it --rm -v /path/to/your/output_folder:/output miykael/workshop_pybrain
, where /path/to/your/output_folder
should be an empty folder on your system, such as /User/neuro/Desktop/output
.
Here's a more detailed explanation of this full command:
docker run \ # start up a container already built or pulled
-p 8888:8888 \ # port used, <local port>:<container port>
-it \ # run Docker interactively
--rm \ # remove the container when it exits
    -v ~/local_folder:/output \ # use local files <local path>:<container path>
miykael/workshop_pybrain # use specified user/project:version container
Note: The path to the folder /path/to/your/output_folder
needs to be an absolute path (i.e. it cannot be relative). So if you're currently in the folder /User/neuro/Desktop/workshop/
and want to give access to a subfolder called results
, you cannot use -v results:/output
or -v ./results:/output
. You either need to use -v /User/neuro/Desktop/workshop/results:/output
or -v ~/Desktop/workshop/results:/output
.
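If you don't want to type the absolute path out by hand, a small shell sketch can build it for you, since $(pwd) always expands to an absolute path (the subfolder name results here is just an example, adjust it to your setup):

```shell
# Build an absolute path for the -v flag from wherever you currently are;
# "results" is an example subfolder name.
MOUNT_SRC="$(pwd)/results"
mkdir -p "$MOUNT_SRC"    # make sure the folder exists before mounting it
echo "$MOUNT_SRC"        # always starts with "/", i.e. an absolute path

# You can then plug it into the command from above:
# docker run -p 8888:8888 -it --rm -v "$MOUNT_SRC":/output miykael/workshop_pybrain
```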
It is possible that you run into MemoryError
messages during the workshop, or that you don't have enough CPUs for parallel processing. This most likely has nothing to do with your system; it is probably due to Docker Desktop and its resource management. By default, Docker Desktop only uses 2 CPUs and 2 GB of RAM. You can change this default setting by opening Docker Desktop and going to Settings/Preferences > Resources. For a more detailed description, see here for Mac and for Windows.
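To check what Docker actually allocated, you can open a terminal inside the running container (in Jupyter: New > Terminal) and query the visible resources directly; this works in any Linux environment:

```shell
# Show the resources visible from inside the (Linux) container; if these
# numbers are lower than expected, raise the limits in Docker Desktop
# under Settings/Preferences > Resources and restart the container.
nproc                          # number of CPUs the container can use
grep MemTotal /proc/meminfo    # total RAM visible to the container
```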
Docker is a great tool to quickly provide an out-of-the-box running computer environment. However, if you're not careful, it can quickly create a lot of unwanted files on your machine. To better understand these footprints and to clean out unwanted files after the workshop, please consider the following commands (run from within a (docker) terminal).
# Show all installed Docker images
docker images
# Remove a docker image
docker rmi -f $IMAGE_ID
# Remove all unused docker objects
docker system prune
# Remove everything (including images you might still need)
docker system prune -a
And should you choose to remove the --rm
flag from the docker run ...
command, you can use the following commands to explore which containers are still around and can potentially be accessed once more (with their changes and additional outputs still present). We advise against this approach, as it clutters your machine even more quickly.
# Show running containers
docker ps
# Show all (also stopped) containers
docker ps -a
# Start a stopped container & attach to it
docker start -ia $CONTAINER_ID
# Remove a container
docker rm $CONTAINER_ID
If you want to do the workshop without installing heavy neuroimaging software packages such as FSL, SPM12 or ANTs, and don't want to install Docker on your system, then you might be interested in this approach. By installing Python on your system (specifically Conda) and setting up the appropriate environment, you will be able to open all the Jupyter Notebooks and go through the whole content of the course. Except for the notebooks that depend on the heavy neuroimaging software packages, you should be able to run everything locally.
To get things up and running, please follow these steps:
- Download and install either Miniconda or Anaconda with a Python version of at least 3.6.
- Download the environment.yml file (e.g. with a right mouse click -> Save As). Make sure that the file ends with .yml and not .txt.
- Open up a conda terminal (or any other terminal) and create a new conda environment with the following command: conda env create -f /path/to/file/environment.yml (for example, conda env create -f ~/Downloads/environment.yml).
- Download the notebooks in this repository via this link and unzip them to your preferred location, e.g. Desktop/pybrain.
- Download the three datasets adhd, ds000114 and ds000228 and put them into the workshop folder as well, e.g. at Desktop/pybrain/adhd and Desktop/pybrain/ds000114.
- Next, open up a conda terminal (or any other terminal) and activate the conda environment with conda activate pybrain (on older conda versions, use source activate pybrain on Mac and Linux, or activate pybrain on Windows).
- Finally, via the terminal, move to the folder where you've put all the unzipped content of this workshop, e.g. with the command cd ~/Desktop/pybrain, and run the command jupyter notebook from the folder that contains the program.ipynb notebook.
Note: This only provides the notebooks from this workshop that are not already part of the nipype_tutorial. You can download the notebooks for the Nipype section of this workshop directly from here.
MyBinder.org is a great service that allows you to run Jupyter notebooks in a Docker or Python environment, directly online and for free. However, this service of course comes with a restricted computational environment (1-2 GB of RAM). This means that many notebooks might be very slow, and some might even crash due to insufficient memory.
You can use this approach to run and test most of the notebooks and to explore the slides. To access the MyBinder instance, use this link.
If you want to go through the content of this workshop without installing anything on your machine, you should choose this approach. To see all the notebooks and slides from this workshop, please use this Jupyter NBviewer link.
Note: For the non-interactive version of the Nipype section of this workshop, please use this link.