BTK Preprocessor stuck on "running HD-Bet" #29

Open
c-gamble opened this issue Jul 23, 2023 · 5 comments

@c-gamble commented Jul 23, 2023

Hello,

There is a closed issue addressing the same problem I am facing, but I figured I would open a new one since the previous reporter no longer seems to be active and the issue was never resolved. When I run BTK single or batch preprocessing, my terminal stays stuck on "running HD-Bet" for more than 8 hours (after which point I simply exited the process). This does not make sense, especially for the single case, considering that HD-BET's documentation estimates no more than ten seconds per MRI scan. I ran
docker run --rm --gpus all nvidia/cuda:11.0-base nvidia-smi
and received the following output.
[Screenshot of the nvidia-smi output, 2023-07-23]
I'm not actually sure what this command does, but it seems like the output may indicate that no containers are active? Though again, I don't know for sure.
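
For reference, a quick way to see whether any containers are actually active, and what a given container is printing, might be a small sketch like this (the container ID at the end is a placeholder to fill in from the listing):

import subprocess

# Equivalent to running `docker ps` in a shell: lists currently running containers.
ps = subprocess.run(
    ["docker", "ps", "--format", "{{.ID}} {{.Image}} {{.Status}}"],
    capture_output=True, text=True, check=True,
)
print(ps.stdout or "no containers are currently running")

# To follow a specific container's output, plug in its ID from the listing above:
# subprocess.run(["docker", "logs", "-f", "<container-id>"])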

Thank you in advance for any guidance!

@neuronflow (Owner)

Hi,
Thanks for your interest in BTK and for reporting the problems here. The HD-BET step should take a few minutes max.

I'm not actually sure what this command does, but it seems like the output may indicate that no containers are active? Though again, I don't know for sure.

The output indicates that Docker is installed correctly and can communicate with your GPUs; that setup is a common source of problems for many users, but most likely not the cause in your case.

Any idea how to replicate your issue? Which mode are you using, "gpu" or "cpu"? Did you try both modes?

@c-gamble (Author) commented Jul 26, 2023

Thank you for your reply! Below is the exact Python file I'm running:

from brats_toolkit.preprocessor import Preprocessor

preprocessor = Preprocessor()

exam_name = "BraTS-GLI-[CENSORED]"
t1_file = f"/newresearch/research/projects/Cooper/BraTS_2023/BraTS-Toolkit/preprocessing/input_data/{exam_name}/{exam_name}-t1.nii.gz"
t1c_file = f"/newresearch/research/projects/Cooper/BraTS_2023/BraTS-Toolkit/preprocessing/input_data/{exam_name}/{exam_name}-t1c.nii.gz"
t2_file = f"/newresearch/research/projects/Cooper/BraTS_2023/BraTS-Toolkit/preprocessing/input_data/{exam_name}/{exam_name}-t2.nii.gz"
fla_file = f"/newresearch/research/projects/Cooper/BraTS_2023/BraTS-Toolkit/preprocessing/input_data/{exam_name}/{exam_name}-fla.nii.gz"

output_dir = f"/newresearch/research/projects/Cooper/BraTS_2023/BraTS-Toolkit/segmentation/input_data/{exam_name}"

preprocessor.single_preprocess(
    t1File=t1_file,
    t1cFile=t1c_file,
    t2File=t2_file,
    flaFile=fla_file,
    outputFolder=output_dir,
    mode="gpu",
    confirm=True,
    skipUpdate=False,
    gpuid="2"
)

I've just attempted running the above file, this time removing the "gpuid" line and changing mode to "cpu", and the program now seems to be stuck at "png slice extraction", as mentioned in #14.
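
For clarity, the cpu attempt only changes the final call, roughly like this (same paths and preprocessor object as in the script above):

preprocessor.single_preprocess(
    t1File=t1_file,
    t1cFile=t1c_file,
    t2File=t2_file,
    flaFile=fla_file,
    outputFolder=output_dir,
    mode="cpu",       # switched from "gpu"
    confirm=True,
    skipUpdate=False,
    # gpuid removed for the cpu run
)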

@neuronflow (Owner)

Okay, maybe your GPUs are too new for HD-BET. Unfortunately, we don't have docker on the cluster where we have the same GPUs, so I have no machine to test and replicate the issue.

Can you double-check the permissions, especially on your output folder?
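
A quick way to sanity-check this from Python would be a sketch like the one below (output_dir is the same path as in the script above; the note about the container user is an assumption about how the Dockerized steps write their results):

import os

output_dir = "/path/to/output_folder"  # substitute the output_dir from the script above

# The folder should exist and be writable by the user running the preprocessor;
# since the processing steps run in Docker containers, it may also need to be
# writable from inside the container.
print("exists:  ", os.path.isdir(output_dir))
print("writable:", os.access(output_dir, os.W_OK))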

neuronflow self-assigned this Oct 18, 2023
@neuronflow (Owner)

@c-gamble I have an alternative pipeline that you could beta test if you want to :)

neuronflow added the "question" label Oct 30, 2023
neuronflow added the "invalid" label and removed the "question" label Nov 7, 2023
neuronflow reopened this Nov 7, 2023
neuronflow added the "question" label and removed the "invalid" label Nov 7, 2023
@neuronflow (Owner) commented Nov 7, 2023

PS: did you see this post: #14 (comment)?
