When one runtime instance is running, another one cannot be started at the same time.
The error message is:
terminate called after throwing an instance of 'boost::exception_detail::clone_impl<boost::exception_detail::error_info_injector<hpx::exception> >' what(): bind: Address already in use: HPX(network_error)
This seems to indicate that the runtime opens a server socket on a specific port, so every subsequent instance tries to bind the same port and fails.
This issue prevents us from reliably running multiple tests in parallel during development.
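A minimal sketch of the collision, assuming the default TCP parcelport endpoint (127.0.0.1:7910) and the --hpx:hpx command-line option; the test binary name is hypothetical:

  ./my_hpx_test &                          # first instance binds the parcelport port
  ./my_hpx_test                            # second instance fails: bind: Address already in use
  ./my_hpx_test --hpx:hpx=127.0.0.1:7911   # assumption: giving each instance its own port avoids the bind error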
Yeah - not a real solution to the problem mentioned here though.
I think we received this solution via mail at some point, but we use it on our CI system all the time.
You have to disable networking support to do this:
-DHPX_WITH_NETWORKING=OFF
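For illustration, a configure step with this workaround might look like the following (the source path and ctest job count are placeholders, not taken from this thread):

  cmake -DHPX_WITH_NETWORKING=OFF /path/to/hpx
  ctest -j 4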
Not quite correct: -DHPX_WITH_NETWORKING=OFF is considered a workaround only. Once distributed tests are integrated, it is no longer viable. Running tests in parallel isn't really supported, as each individual test already runs in parallel.
The issue has been left open to remind others that invoking ctest in parallel is not really supported.