Add extra marks to VertexAI IT tests #28534
Conversation
Run Python 3.8 PostCommit
Codecov Report

```
@@            Coverage Diff             @@
##           master   #28534      +/-   ##
==========================================
+ Coverage   72.33%   72.34%   +0.01%
==========================================
  Files         683      683
  Lines      100709   100814     +105
==========================================
+ Hits        72843    72938      +95
- Misses      26288    26298      +10
  Partials     1578     1578
```

Flags with carried forward coverage won't be shown. See 18 files with indirect coverage changes.
Run Python 3.8 PostCommit
Assigning reviewers. If you would like to opt out of this review, comment `assign to next reviewer`.

R: @damccorm for label python.

The PR bot will only process comments in the main thread (not review comments).
LGTM, let's make sure that the postcommit passes before merging though.
```
@@ -45,6 +47,10 @@

class VertexAIInference(unittest.TestCase):
  @unittest.skipIf(
      tf is None, 'Missing dependencies. '
```
Given that we already have `@pytest.mark.uses_vertex_ai`, I wouldn't do this skip. We might accidentally silently disable the test and won't know it's not running.
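For context, a minimal sketch of why a `skipIf` guard can hide a broken suite. Here `tf` is set to `None` to simulate an environment without tensorflow; the class and test names are illustrative, not the actual IT test:

```python
import unittest

tf = None  # simulate tensorflow failing to import, as on a worker without it

@unittest.skipIf(
    tf is None, 'Missing dependencies. Test depends on tensorflow')
class VertexAIInferenceSketch(unittest.TestCase):
  def test_inference(self):
    self.assertIsNotNone(tf)

# The run reports a skip, not a failure, so CI stays green and nobody
# notices the test never exercised the pipeline.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(VertexAIInferenceSketch)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

By contrast, marker-based selection collects nothing at all when the marker is missing, which is easier to spot in suite-level test counts.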
Do we need to remove `@pytest.mark.it_postcommit` if this test is accidentally included in a suite where it is not supposed to run?
Actually, this is almost the same as the statements in line 32. I guess I have a general question: how do we make sure this test is running in a suite where it is supposed to run? It sounds like this test was passing before, and now it is failing, because TF is no longer installed in Beam containers, and the spirit of this change is to skip the test.
Sounds like something might be misconfigured.
- If it was included in a suite that picks up all it_postcommit tests, not a Vertex AI-specific one - that's not expected - is it because we incorrectly included the it_postcommit annotation? Then we should remove that.
- If the test was included in a special Vertex AI suite, which we run as part of postcommits, and now starts to fail - I'd expect to see changes adding tensorflow to a requirements file, to make the test pass again.
Looks like we have `tensorflow>=2.12.0`
All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.
Disregard that, it's WAI (working as intended); the limited log output just obscured the rest of the suites listed.
All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.
So, if we look at tests that have uses_tf, we have two kinds of tests: unit tests and postcommit IT tests (the latter run a whole pipeline).
Since for Vertex AI you don't have unit tests and only have postcommit tests, I think you can simplify the markers in your suite and only use one marker, uses_vertex_ai, and collect that test in the postcommit suite.
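To illustrate the marker discussion, a toy model of how pytest's `-m <marker>` option filters the collected tests (this is not real pytest internals, and the test names are made up; only the marker names come from this thread):

```python
# Toy model of pytest's "-m <marker>" collection: each suite keeps only
# the tests carrying its marker, so a single uses_vertex_ai marker is
# enough when every Vertex AI test belongs to the postcommit suite.
tests = {
    'test_vertex_ai_inference': {'uses_vertex_ai'},
    'test_tf_model_handler_it': {'uses_tf', 'it_postcommit'},
    'test_tf_unit': {'uses_tf'},
}

def select(marker):
  # Keep the tests whose marker set contains the requested marker.
  return sorted(name for name, marks in tests.items() if marker in marks)

print(select('uses_vertex_ai'))  # only the Vertex AI IT test
print(select('uses_tf'))         # both TF tests, unit and IT
```

In this model, a suite selecting `uses_vertex_ai` never accidentally pulls in TF unit tests, and vice versa.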
What's weird is that we're getting the gradle task invoked separately and not as part of the postcommit.
postCommitIT is one test suite out of many that run in a postcommit Jenkins / GHA suite. The VertexAI suite is another suite running in parallel.
Line 497 in a90879a:
`tasks.register("python38PostCommit") {`
To clarify: the Vertex AI suite is part of the inferencePostCommitIT suite:
`project.tasks.register("inferencePostCommitIT") {`
Tensorflow is getting installed and the test isn't getting skipped for lacking tensorflow, but the test is running on Dataflow where the worker doesn't have tensorflow - https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-19_10_21_28-17008252103869768778?project=apache-beam-testing
```
@unittest.skipIf(
    tf is None, 'Missing dependencies. '
    'Test depends on tensorflow')
@pytest.mark.uses_tf
```
is this necessary?
It's ok to keep. Looks like we already defined this marker and use it in other tests.
Perhaps not, but considering the postcommits went red without a clear reason I'd rather cover all of our bases here. I would have thought that missing TF would have gotten caught in the try-except-skip import block guarding the pipeline code, but it didn't.
Actually, I take it back. I think this decorator is serving a different purpose, see
Line 398 in 3915c85:
`/bin/sh -c 'pytest -o junit_suite_name={envname} --junitxml=pytest_{envname}.xml -n 6 -m uses_tf {posargs}; ret=$?; [ $ret = 5 ] && exit 0 || exit $ret'`
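The trailing `ret=$?; [ $ret = 5 ] && exit 0 || exit $ret` part of that command matters here: pytest exits with status 5 when it collects no tests, and the tox wrapper maps exactly that code to success. A small sketch of the mapping (the `simulate` helper is illustrative, not from the repo):

```shell
# pytest exits with code 5 when it collects no tests; the tox command
# treats that particular code as success so a suite whose marker matches
# nothing doesn't fail CI, while real test failures still propagate.
simulate() {
  ret=$1
  if [ "$ret" -eq 5 ]; then
    echo 0          # "no tests collected" -> success
  else
    echo "$ret"     # any other exit code passes through unchanged
  fi
}
echo "empty suite exit: $(simulate 5)"
echo "real failure exit: $(simulate 1)"
```

This is also why a marker that silently collects nothing can go unnoticed: the suite exits 0 either way.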
Adding this annotation would make tox pick up this test in some other unit test suites, and then skip it because the Vertex AI deps are not installed.
Run Python 3.8 PostCommit

Specifying the requirements file for the IT test seems to have resolved the issue - https://ci-beam.apache.org/job/beam_PostCommit_Python38_PR/777/
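A sketch of what "specifying the requirements file" means here: the IT test ships a requirements file so the Dataflow workers install tensorflow before running inference. The path and file contents below are illustrative, not the actual file used in the PR; the pinned version comes from the `tensorflow>=2.12.0` line mentioned earlier in the thread:

```shell
# Hypothetical: write the requirements file the IT suite would hand to
# the pipeline so the Dataflow worker environment matches the test's
# imports (without it, the worker has no tensorflow and the job fails).
req=/tmp/vertex_ai_it_requirements.txt
printf 'tensorflow>=2.12.0\n' > "$req"
cat "$req"
# the pipeline would then be launched with --requirements_file="$req"
# so workers install tensorflow before running the inference test.
```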
Run Python_Coverage PreCommit

Somehow postcommits are still failing https://ci-beam.apache.org/job/beam_PostCommit_Python38/4612/

Sent #28571
Mark the Vertex AI IT tests as using tensorflow and add an extra check to ensure the dependency has been installed before running the test suite.
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

- Mention the appropriate issue in your description (for example: `addresses #123`), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment `fixes #<ISSUE NUMBER>` instead.
- Update `CHANGES.md` with noteworthy changes.

See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.