About Conda Environment #2
Comments
@ikalvet That would be very much appreciated on our side as well, as we currently cannot run Apptainer containers on our cluster.
I think I found some good starting instructions in the heme binder design repo https://github.com/ikalvet/heme_binder_diffusion:
Yep, that conda environment setup should work for running RFdiffusionAA (tested on a remote non-UW system).
Can you provide an env.yml or requirements.txt for creating a conda environment? Thanks
You can extract the Singularity "recipe" and conda data from the sif file as listed in baker-laboratory/RoseTTAFold-All-Atom#5 (comment)
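For reference, a hedged sketch of that extraction (the .sif file name is a placeholder, and it assumes Apptainer/Singularity is available and that conda is on the container's PATH):
# Dump the build recipe embedded in the image
apptainer inspect --deffile rf_se3_diffusion.sif > recipe.def
# Export the conda environment baked into the image
apptainer exec rf_se3_diffusion.sif conda env export > container_env.yml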
I thought I'd share my recipe, which may be useful for others: I did some moving around so it need not be run from inside the GitHub repo. Change cuda=11.6 to whichever version fits your CUDA drivers.
export NEW_CONDA_ENV="RFdiffusionAA"
conda create -y -n $NEW_CONDA_ENV python=3.9
conda activate $NEW_CONDA_ENV
conda env config vars set LD_LIBRARY_PATH=$CONDA_PREFIX/lib:$CONDA_PREFIX:/.singularity.d/libs
conda env config vars set PYTHONUSERBASE=$CONDA_PREFIX
conda env config vars set CONDA_OVERRIDE_CUDA="11.6.2";
conda deactivate
conda activate $NEW_CONDA_ENV
# Don't use pip to install pytorch here; the install ends up broken:
#pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu116
conda install -y pytorch torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia
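# (Optional sanity check, my own addition: confirm the CUDA build of PyTorch was picked up)
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"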
conda install -y -c conda-forge omegaconf hydra-core=1.3.2 scipy icecream openbabel assertpy opt_einsum pandas pydantic deepdiff e3nn prody pyparsing=3.1.1
conda install -y dglteam/label/cu116::dgl
conda install -y pytorch::torchdata
conda install -y anaconda::git
git clone https://github.com/baker-laboratory/RoseTTAFold-All-Atom /tmp/RoseTTAFold-AA
pip install --no-cache-dir -r /tmp/RoseTTAFold-AA/rf2aa/SE3Transformer/requirements.txt
pip install /tmp/RoseTTAFold-AA/rf2aa/SE3Transformer
# Oops: the SE3Transformer requirements downgraded e3nn to 0.3.3; restore the newer version
pip install e3nn==0.5.1
# Install RFdiffusion all-atom by copying its modules into site-packages (a poor man's pip install)
export CONDA_SITE_PACKAGES=$(ls -d $CONDA_PREFIX/lib/python*/ 2>/dev/null | head -n 1)site-packages
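# (Optional, my own addition: confirm the path resolved, e.g. .../lib/python3.9/site-packages)
echo "$CONDA_SITE_PACKAGES"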
git clone --recurse-submodules https://github.com/baker-laboratory/rf_diffusion_all_atom.git /tmp/rf_diffusion_all_atom
mv /tmp/rf_diffusion_all_atom/*.py $CONDA_SITE_PACKAGES/
mv /tmp/rf_diffusion_all_atom/inference $CONDA_SITE_PACKAGES/
touch $CONDA_SITE_PACKAGES/inference/__init__.py
mv /tmp/rf_diffusion_all_atom/potentials $CONDA_SITE_PACKAGES/
touch $CONDA_SITE_PACKAGES/potentials/__init__.py
mv /tmp/rf_diffusion_all_atom/lib/rf2aa/rf2aa $CONDA_SITE_PACKAGES/rf2aa
# rf2aa already ships its own __init__.py
# fire is missing from the dependency lists above:
pip install fire
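# (Optional sanity check, my own addition: the copied packages should now be importable)
python -c "import inference, potentials; print('ok')"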
# set weights
export RFAA_WEIGHTS=$HOME2/.cache/RFAA_weights
wget http://files.ipd.uw.edu/pub/RF-All-Atom/weights/RFDiffusionAA_paper_weights.pt -P $RFAA_WEIGHTS
mv /tmp/rf_diffusion_all_atom/config $HOME2/.cache/RFdiffusionAA_config
export RFDIFFUSIONAA_CONFIG=$HOME2/.cache/RFdiffusionAA_config/inference
# Make a nice alias
alias run_inference="$CONDA_PREFIX/bin/python $CONDA_SITE_PACKAGES/run_inference.py --config-path=$RFDIFFUSIONAA_CONFIG inference.ckpt_path=$RFAA_WEIGHTS/RFDiffusionAA_paper_weights.pt"
run_inference hydra.output_subdir=...
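A hedged usage sketch to go with the alias (the input PDB, ligand code, and contig string are placeholders; the hydra keys follow the RFdiffusionAA README):
run_inference inference.input_pdb=input.pdb inference.ligand=LIG "contigmap.contigs=['150-150']" inference.num_designs=1 inference.output_prefix=out/design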
Here is the full yaml I exported from the conda environment that runs RFdiffusionAA. Some package versions may differ slightly from the one extracted from the Apptainer, but it is tested to work on two different systems.
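For anyone reproducing this, a hedged sketch of the round trip (the file name is a placeholder):
conda env export -n RFdiffusionAA > RFdiffusionAA_env.yml
conda env create -n RFdiffusionAA -f RFdiffusionAA_env.yml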
Thanks for all the helpful contributions in this thread; I was able to cobble together a Docker image ❤️ Please note:
FROM pytorch/pytorch:latest
RUN apt-get update
RUN apt-get install -y git wget
RUN apt-get install -y libxrender1
RUN pip install omegaconf hydra-core==1.3.2 scipy icecream assertpy opt_einsum pandas pydantic deepdiff e3nn pyparsing==3.1.1 fire
RUN conda install -y -c conda-forge prody openbabel
RUN conda install -y dglteam::dgl
RUN conda install -y pytorch::torchdata
WORKDIR /
RUN git clone https://github.com/baker-laboratory/rf_diffusion_all_atom.git
WORKDIR rf_diffusion_all_atom
RUN wget http://files.ipd.uw.edu/pub/RF-All-Atom/weights/RFDiffusionAA_paper_weights.pt
RUN git submodule init
RUN git submodule update
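A hedged sketch of building and entering the image (the tag is a placeholder; --gpus all assumes the NVIDIA Container Toolkit is set up on the host):
docker build -t rfdiffusion-aa .
docker run --gpus all -it rfdiffusion-aa bash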
There is no need to use the
Can you provide the configuration file for the conda environment?