Hello,
I ran the newer version of the atac-seq-pipeline four times with the same JSON file and the same command line, because I wanted to test the consistency of the output. I am mainly interested in IDR analysis on already-filtered BAM files. I find that the pipeline sometimes runs successfully and other times fails with different errors. My commands are below; attached are the corresponding stdout messages and the common JSON file (renamed to my.json.txt so that it can be attached here) used for all four runs. Please let me know if I am doing anything wrong.
#Command 1: ran successfully with non-empty output
nohup java -jar -Dconfig.file=backends/backend.conf cromwell-34.jar run atac.wdl -i my.json >& my.log &
#Command 2: terminated with error type 1
nohup java -jar -Dconfig.file=backends/backend.conf cromwell-34.jar run atac.wdl -i my.json &> my_reran.log
#Command 3: terminated with error type 2
nohup java -jar -Dconfig.file=backends/backend.conf cromwell-34.jar run atac.wdl -i my.json &>my_reran2.log
#Command 4: ran successfully with non-empty output
nohup java -jar -Dconfig.file=backends/backend.conf cromwell-34.jar run atac.wdl -i my.json &>my_reran3.log
Please let me know if you need any further information. Thank you in advance for your help.
Best,
Ravi
Attachments: my.log, my_reran.log, my_reran2.log, my_reran3.log, my.json.txt
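For the consistency check itself, one minimal approach is to checksum the decompressed IDR peak files from two runs and compare them. In the sketch below, the workflow IDs and the call-idr task directory name are placeholders; the actual names depend on the run directories created under cromwell-executions/atac.
# compare the IDR outputs of two successful runs (WORKFLOW_ID_RUN1/RUN4 are placeholders)
# decompressing with zcat avoids false differences caused by gzip timestamps
zcat cromwell-executions/atac/WORKFLOW_ID_RUN1/call-idr/shard-0/execution/*.gz | md5sum
zcat cromwell-executions/atac/WORKFLOW_ID_RUN4/call-idr/shard-0/execution/*.gz | md5sum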
Please look into the stderr and stdout files in each task directory: cromwell-executions/atac/RANDOM_HASH_STRING/call-TASK_NAME/shard-?/execution/stderr.
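For example, a small loop can gather the tail of every such stderr file in one pass (assuming the pipeline was launched from the directory that contains cromwell-executions):
# print the last lines of every per-task stderr file from the failed run
find cromwell-executions/atac -path '*call-*' -name stderr | while read f; do echo "== $f"; tail -n 20 "$f"; done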
It looks like you ran these pipelines in local mode (without Docker/Singularity).
Did you activate the Conda environment before running them? The pipeline can also fail if you don't have enough resources on your system.
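As a rough pre-flight check, both points can be verified before launching Cromwell; the environment name encode-atac-seq-pipeline below is an assumption based on the installer's default, so substitute whatever name was used during installation:
# activate the pipeline's Conda environment (name is an assumption; use your own)
conda activate encode-atac-seq-pipeline
# confirm free memory and available CPU cores before starting a run
free -h
nproc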
Please post an issue on the new repo; there you may find instructions on how to make a tarball for debugging and upload it.
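As a rough sketch of such a bundle (the exact instructions and file list are in the new repo's documentation), the Cromwell run log plus all per-task stdout/stderr files are the essential pieces:
# collect the run log and per-task logs into one tarball for upload (illustrative file list)
find cromwell-executions/atac -name stderr -o -name stdout | tar czf debug_atac.tar.gz my_reran.log -T -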
The error message in the stderr file is also present in the .log files (towards the end).
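To locate it quickly in the attached logs, a plain grep over the run logs is enough:
# show where errors were recorded in the Cromwell run logs
grep -n -i 'error' my_reran.log my_reran2.log | tail -n 20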
Yes, I am running the pipeline locally with the Conda environment activated. I doubt that resources are the limiting factor, since I am using a server with 128 GB of RAM (50 GB currently free).
I couldn't find a way to post an issue on the new repo page, hence I ended up posting the issue here. Sorry about that.