Installing Firedrake on HPC #3620
-
I have repeatedly gotten this error during the h5py step of the install. At first I tried using my HPC's installation of openmpi and passing …
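For reference, the general pattern for pointing a pip-driven h5py build at a system MPI and a parallel HDF5 uses a few environment variables. This is a hedged sketch of that general pattern only (module name and HDF5 path are placeholders), not necessarily what the Firedrake install script does internally:

```sh
# Sketch only: building h5py against a system MPI and a parallel HDF5.
# The module name and paths below are placeholders for your site's installs.
module load openmpi                      # your site's MPI module
export CC=mpicc                          # compile h5py's C extension with the MPI wrapper
export HDF5_MPI="ON"                     # request the parallel (MPI-enabled) h5py build
export HDF5_DIR=/path/to/parallel-hdf5   # root of an MPI-enabled HDF5 install
pip install --no-binary=h5py h5py
```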
-
@lindsayad can you share the result of running …? @JDBetteridge do you have any ideas?
-
If you're running on HPC you almost certainly don't want to let PETSc build MPI. You want to load your preferred MPI module at installation time and whenever you activate the virtual environment. Also, could you try running the command Connor mentioned above, except you need to add one more bit to the end of the path: …
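(The exact command referred to above was lost from this excerpt.) As a rough illustration of the module-load workflow, with the module name and venv location as site-specific assumptions:

```sh
# Load the system MPI before installing (module name is site-specific).
module load openmpi
python3 firedrake-install        # see firedrake-install --help for MPI-related options

# In every new shell, load the same MPI module before activating the venv
# (the venv path below assumes the default install location).
module load openmpi
source firedrake/bin/activate
```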
I'm not 100% sure why your numpy is linking against MPI in the first place; perhaps the BLAS/LAPACK on the system is linked against MPI.
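One way to check (a sketch, assuming a Linux system with ldd available) is to list the shared libraries that numpy's compiled extensions pull in and see whether any MPI library shows up:

```sh
# Find numpy's install directory inside the active virtual environment.
NUMPY_DIR=$(python -c "import numpy, os; print(os.path.dirname(numpy.__file__))")

# List any MPI libraries linked by numpy's compiled extensions
# (grep prints nothing if there are none).
find "$NUMPY_DIR" -name '*.so' -exec ldd {} \; | grep -i mpi

# numpy can also report which BLAS/LAPACK it was built against.
python -c "import numpy; numpy.show_config()"
```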
-
This is all a little bit complicated, as you can see from the `mpicc` command that PETSc thinks it's using:
We need to find out where … Could you also share the output of …?
I hope and suspect it will contain a line that looks like:
If so, I will share a method for building …
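In the meantime, a generic way to see which mpicc is on the PATH and what it expands to (a sketch; the PETSc paths are the usual defaults and may differ on your build, and this is not necessarily the exact output being requested above):

```sh
# Which MPI compiler wrapper is first on the PATH?
which mpicc

# What compiler and link line does the wrapper expand to?
mpicc -show      # MPICH-style wrappers
mpicc --showme   # Open MPI-style wrappers

# PETSc records the compilers it was configured with in petscvariables.
grep -i mpicc "$PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/petscvariables"
```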
-
When running …
-
Alright, the install completed successfully 🎉 Now at run time there are some module import errors, but I think this is an issue with one of our HPC Python packages, so I'll work that out with them.