test harness for unit models of waste treatment #1371
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Coverage diff against `main` (#1371):

| Metric   | main   | #1371  |
|----------|--------|--------|
| Coverage | 94.01% | 94.01% |
| Files    | 335    | 335    |
| Lines    | 35561  | 35561  |
| Hits     | 33431  | 33431  |
| Misses   | 2130   | 2130   |

View full report in Codecov by Sentry.
```python
class TestInitializers:
    @pytest.fixture
    def model(self):
        m = build()
```
This class seems mostly unchanged. Are you able to make use of self.unit_solutions
here instead of the assert statements?
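For context, `self.unit_solutions` refers to the harness-style pattern in which expected values are collected in a single dict and checked in one loop, rather than scattered across individual `assert` statements. Below is a minimal, self-contained sketch of that pattern; the variable names (`fs.unit.volume`, `fs.unit.hydraulic_retention_time`) and values are hypothetical, the "model" is a plain dict standing in for a Pyomo model, and the pytest fixture machinery of the real suite is omitted:

```python
import math


def build():
    # Stand-in for the real flowsheet builder in the PR: the real build()
    # returns a Pyomo model; here the "variables" are plain dict entries.
    return {"fs.unit.volume": 1500.0, "fs.unit.hydraulic_retention_time": 4.0}


class TestHarnessStyle:
    """Sketch of the unit_solutions convention (hypothetical names/values)."""

    def configure(self, model):
        # Expected values live in one dict, harness-style, instead of
        # being scattered across individual assert statements.
        self.unit_solutions = {
            "fs.unit.volume": 1500.0,
            "fs.unit.hydraulic_retention_time": 4.0,
        }
        return model

    def check_solutions(self, model):
        # One loop replaces many asserts; a failure reports which entry broke.
        for name, expected in self.unit_solutions.items():
            assert math.isclose(model[name], expected, rel_tol=1e-3), name
```

The advantage of the dict is that adding a new expected value is a one-line change and the comparison tolerance lives in exactly one place.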
We are testing the initialization methods here, not the solutions.
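To illustrate the distinction being drawn here: an initializer test asserts that the initialization routine *converges* (a status), while the harness checks assert on final *solution values*. The sketch below is self-contained and illustrative only; `InitStatus` and `ToyInitializer` are stand-ins, not the actual IDAES initializer API:

```python
from enum import Enum


class InitStatus(Enum):
    # Illustrative stand-in for an initialization-status enum.
    OK = 1
    FAILED = 2


class ToyInitializer:
    """Stand-in initializer: records whether initialization 'converged'."""

    def initialize(self, model):
        # A real initializer would solve a sequence of subproblems here;
        # this toy just checks that every "variable" has a starting value.
        ok = all(v is not None for v in model.values())
        self.status = InitStatus.OK if ok else InitStatus.FAILED
        return self.status


def test_initializer_converges():
    model = {"fs.unit.volume": 1500.0}
    initializer = ToyInitializer()
    # The assertion is on the initializer's status, not on solution values;
    # value checks belong in the unit_solutions harness tests.
    assert initializer.initialize(model) is InitStatus.OK
```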
If I were to channel my "inner Tim", I'd ask: is testing the initialization methods necessary? If not, we should delete it. If so, we should consider doing something similar for other unit models, or adding it to the harness (which might not be feasible or a good idea).
To add my 2 cents: the Initializers
are not part of WaterTAP and are tested as part of IDAES, so the question becomes twofold:
- do you trust IDAES not to break things on you, and
- what do you do if these tests start failing?
From a software engineering point of view, the "best" answer to this would be to add this case study to the IDAES backward compatibility tests. That way IDAES will see immediately if they break the tests, rather than having to wait for WaterTAP to discover and report it.
LGTM (other than the unrelated test failure). I'll just note here that the team internally discussed the inclusion of the initializer tests in the CSTR test file, and we ultimately decided that it's fine to add additional tests outside of the TestHarness framework.
LGTM
* test harness cstr, thickener
* running black
* cstr_injection and conservation
* running black
* LCOW for CSTR
* removing duplicated class
* deleting extra class

Co-authored-by: Ludovico Bianchi <[email protected]>
Fixes/Resolves:
Addresses #1302 for 3 unit models.
Summary/Motivation:
Adds unit test harness for:
Changes proposed in this PR:
Legal Acknowledgement
By contributing to this software project, I agree to the following terms and conditions for my contribution: