uiPSF: Universal inverse modelling of point spread functions for SMLM localization and microscope characterization
The point spread function (PSF) of a microscope describes the image of a point emitter. An accurate PSF model is essential for many imaging tasks, including single molecule localization, aberration correction, and deconvolution.
Here we present uiPSF (universal inverse modelling of Point Spread Functions), a toolbox to infer accurate PSF models either from image stacks of fluorescent beads or directly from images of single blinking fluorophores, the raw data in SMLM. It is a powerful tool to characterize and optimize a microscope as it reports the aberration modes, including field-dependent aberrations. The resulting PSF model enables accurate 3D super-resolution imaging using single molecule localization microscopy. Our modular framework is applicable to a variety of microscope geometries, and the forward model can include system specific characteristics such as the bead size, camera pixel size and transformations among channels. We demonstrate its application in single objective systems with single or multiple channels, 4Pi-SMLM, and lattice light-sheet microscopes.
uiPSF can run on both CPU and GPU; however, we recommend installing the GPU version for faster processing. The GPU version requires a GPU card that supports CUDA 11.2. Refer to Systems tested when selecting your GPU card.
uiPSF is supported on Windows, Linux and macOS. Only the CPU version is supported on macOS.
cudatoolkit (GPU version only)
cudnn (GPU version only)
pip
python
numpy
scipy
matplotlib
tensorflow
tensorflow-probability
scikit-image
tqdm
czifile
hdfdict
dotted_dict
omegaconf
ipykernel
- Windows 11 with RTX 4090, RTX 3080, RTX 3090, RTX 2080
- Windows 10 with RTX 4000, RTX 3090, GTX 1070
- Rocky Linux 8.7 with RTX A6000
- Ubuntu 20.04 with GTX 1070
Installation time for the GPU version is around 10 minutes.
- Install miniconda for Windows.
- Open the Anaconda PowerShell Prompt and clone the uiPSF package:
git clone https://github.com/ries-lab/uiPSF.git
cd uiPSF
- Create a new conda environment for the uiPSF package
- for GPU:
conda env create --name psfinv --file=environment.yml
- for CPU:
conda create --name psfinv python=3.7.10
- Activate the installed environment and install the uiPSF package
conda activate psfinv
pip install -e .
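To confirm that the environment works and, for the GPU version, that TensorFlow can see the GPU, the following optional check can be run inside the activated psfinv environment. This is only a suggested sanity check, not part of the official installation:

```python
# Optional sanity check for the installed environment.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
# For the GPU version this list should contain at least one device;
# for the CPU-only version it will be empty.
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
```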
- Install miniconda for Mac.
- Open Terminal and follow the installation for Windows to install the uiPSF package. Only the CPU version is supported.
- Install miniconda for Linux.
- Install the uiPSF package.
  - For TensorFlow 2.9: follow the installation for Windows to install the uiPSF package.
  - For the latest TensorFlow (note that TensorFlow later than 2.10 is no longer supported on Windows):
a. Modify the version numbers in the environment.yml file as follows:
  - cudatoolkit=11.8
  - cudnn=8.4
  - python=3.9
b. Remove the version numbers of "tensorflow" and "tensorflow-probability" in `install_requires` in the setup.py file.
c. Follow the installation for Windows to install the uiPSF package.
d. If the GPU version is installed, run the following command:
pip install tensorflow[and-cuda]
We used the above procedure to install uiPSF on a Linux computer with an RTX A6000 to fully utilize the compute capability of the GPU.
- If the GPU version is installed, add the cudnn path:
mkdir -p $CONDA_PREFIX/etc/conda/activate.d
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CONDA_PREFIX/lib/' >> $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
- For bead data
- For SMLM data
- Microscope characterization
- Generate PSF model at a given imaging depth
The following run times were obtained on a desktop PC running Windows 11 with an RTX 3080.
PSF type | run time (min) | # of parameters |
---|---|---|
1ch LLS voxel | 1.9 | 31,144 |
1ch zernike_vector | 0.5 | 992 |
2ch zernike_vector | 5.1 | 3,827 |
4pi zernike | 2.8 | 775 |
FD zernike_vector | 16.1 | 98,680 |
1ch in situ | 4.7 | 10,433 |
2ch in situ | 13.1 | 22,404 |
4pi in situ | 35 | 35,189 |
FD in situ | 49.7 | 143,023 |
- 40 nm bead data from single-channel, dual-color ratiometric and 4Pi systems.
- bead data from a single-channel system with a FOV of 177 um x 177 um.
- bead data from a lattice-light sheet microscope.
- SMLM data of Nup96-AF647 from a single-channel system with a FOV of 177 um x 177 um.
- SMLM data of tubulin-AF647 from a single-channel system with astigmatism aberration applied by a deformable mirror.
- SMLM data of TOMM20-AF647 from a single-channel system with Tetrapod PSFs applied by a phase plate.
- SMLM data of Nup96-AF647 and WGA-CF680 from a dual-color ratiometric system.
- SMLM data of Nup96-mMaple from a 4Pi-SMLM system.
Download the example data
- Install uiPSF for your operating system.
- Install Visual Studio Code.
- Open Visual Studio Code (VScode), click Extensions in the sidebar menu, search for `Python`, and install the Python extension for VScode.
- Go to File -> Open Folder and select the uiPSF folder from the git clone.
- Open the file demo/datapath.yaml and change `main_data_dir` to the path of the downloaded example data (a programmatic alternative is sketched after this list).
- Navigate to a demo notebook, e.g. demo/demo_beadPSF_1ch.ipynb.
- Click the run button of the first cell. If running for the first time, a window will pop up asking to install the `ipykernel` package; click install. A dropdown menu will then appear asking to select the kernel; select the conda environment `psfinv` created during the installation.
  - In case no window pops up, an alternative is to install `Jupyter` from Extensions, then click Select Kernel at the upper right corner of the demo notebook and select `psfinv` from the dropdown menu.
- Run subsequent cells sequentially.
- For an explanation of the user defined parameters and details of creating config files, please refer to user defined parameters.
- For an explanation of the output parameters from PSF learning, please refer to output parameters.
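As an alternative to editing demo/datapath.yaml by hand, the file can be updated programmatically with omegaconf, which is already a uiPSF dependency. This is only a sketch; it assumes nothing beyond the `main_data_dir` key mentioned above, and the example path is a placeholder:

```python
# Optional sketch: point demo/datapath.yaml to the downloaded example data.
# Assumes only the `main_data_dir` key described above; the path is a placeholder.
from omegaconf import OmegaConf

cfg = OmegaConf.load("demo/datapath.yaml")
cfg.main_data_dir = "/path/to/downloaded/example_data"  # adjust to your local path
OmegaConf.save(cfg, "demo/datapath.yaml")
```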
uiPSF normally assumes the input data are unprocessed raw data from the camera frames. Except for the 4Pi bead data, all data are saved as a 3D array: `[x, y, z]` for bead data or `[x, y, frames]` for SMLM data.
- A `.mat` file should be saved as MATLAB version 7.3.
- A `.tif` or `.tiff` file should be a multi-page TIFF file.
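For reference, a bead stack held in memory as a NumPy array can be written as a multi-page TIFF matching the requirement above. This is an illustrative sketch, not part of uiPSF; it uses tifffile (installed together with scikit-image) and assumes the z planes map to the TIFF pages:

```python
# Illustrative only: save a placeholder 3D bead stack as a multi-page TIFF.
# tifffile is pulled in as a dependency of scikit-image; each z plane becomes one TIFF page.
import numpy as np
import tifffile

stack = np.random.poisson(100, size=(51, 64, 64)).astype(np.uint16)  # placeholder data, z planes first
tifffile.imwrite("beads.tif", stack)  # writes one page per z plane
```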
- File format: .czi, .tif, .tiff, .mat
- Data dimension: 3D
- File format: .tif, .tiff, .mat
- Data dimension:
  - For a .tif file: 3D, all channels are collected from the same camera. uiPSF will crop the channels from the raw data based on the user defined parameters `mirrortype` and `channel_arrange` (see user defined parameters).
  - For a .mat file: each channel is saved as a 3D array in the same file.
- File format: .mat
- Data dimension:
  - Bead data: each channel is saved as a 4D array, `[x, y, phase, z]`, in the same file.
  - SMLM data: each channel is saved as a 3D array, `[x, y, frames]`, in the same file.
- Please ensure that the computer's current graphics card driver supports CUDA 11.2.
- Don't run two notebooks at the same time; click `Restart` at the top of a notebook to release its memory.
- Although uiPSF usually processes raw data from camera frames, from which it selects multiple candidate emitters, it can also take a small data stack containing only one emitter. In this case, the user needs to set `roi_size` to a value smaller than the FOV of the input data. `roi_size` is normally a two-element vector defining the crop size of one emitter in `[y, x]`. Only for agarose bead data is `roi_size` a three-element vector, defining the crop size of one bead in `[z, y, x]` (see the sketch below).
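To make the `roi_size` convention concrete, the following sketch shows what a two-element `[y, x]` crop means for a single emitter; the array shape and emitter center are hypothetical and the code is not part of uiPSF:

```python
# Hypothetical illustration of the roi_size crop convention (not uiPSF code).
import numpy as np

stack = np.zeros((101, 256, 256))   # placeholder SMLM stack, [frames, y, x]
roi_size = (32, 32)                 # [y, x] crop size around one emitter
yc, xc = 120, 200                   # hypothetical emitter center
ry, rx = roi_size
crop = stack[:, yc - ry // 2:yc + ry // 2, xc - rx // 2:xc + rx // 2]
print(crop.shape)                   # (101, 32, 32)
```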
Tutorials for using the PSF model generated by uiPSF for localization analysis. Use one of the demo notebooks to generate the corresponding PSF model (.h5 file) before using the following tutorials; a quick way to inspect such a model file is sketched after the list below.
- Single channel SMLM imaging.
- Ratiometric dual-color SMLM imaging.
- 4Pi-SMLM imaging.
- Single channel SMLM imaging with large FOV.
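The structure of the exported model file is described on the output parameters page; as a quick check that a model was written, the groups and datasets inside the `.h5` file can be listed with h5py (available through the hdfdict dependency). The file name below is only an example:

```python
# Optional: list the contents of a learned PSF model file (file name is an example).
import h5py

with h5py.File("psfmodel.h5", "r") as f:
    f.visit(print)  # prints the name of every group and dataset in the file
```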
Open an issue here on GitHub, or contact Jonas Ries ([email protected]) or Sheng Liu ([email protected]).