Add extra marks to VertexAI IT tests #28534
@@ -26,7 +26,9 @@
from apache_beam.io.filesystems import FileSystems
from apache_beam.testing.test_pipeline import TestPipeline

# pylint: disable=ungrouped-imports
try:
  import tensorflow as tf
  from apache_beam.examples.inference import vertex_ai_image_classification
  from apache_beam.examples.inference import vertex_ai_llm_text_classification
except ImportError as e:
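For readers following along, the guarded-import pattern in this hunk can be sketched in isolation (the class and test names below are hypothetical, not the Beam test itself):

```python
import unittest

# Minimal sketch of the guarded-import pattern: the optional dependency
# is imported in a try/except so module import (and test collection)
# never fails outright, and tests that need it are skipped when absent.
try:
    import tensorflow as tf
except ImportError:
    tf = None


class GuardedImportTest(unittest.TestCase):
    @unittest.skipIf(tf is None, 'Test depends on tensorflow')
    def test_needs_tf(self):
        # Only runs when tensorflow imported successfully.
        self.assertTrue(hasattr(tf, '__version__'))
```

Note that `skipIf` evaluates at collection time, so the suite stays green and the test is reported as skipped rather than silently disappearing.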
@@ -45,6 +47,10 @@

class VertexAIInference(unittest.TestCase):
  @unittest.skipIf(
      tf is None, 'Missing dependencies. '
      'Test depends on tensorflow')
  @pytest.mark.uses_tf
is this necessary?

it's ok to keep. looks like we already defined this marker and use it in other tests.

Perhaps not, but considering the postcommits went red without a clear reason I'd rather cover all of our bases here. I would have thought that missing TF would have gotten caught in the try-except-skip import block getting the pipeline code, but it didn't.

actually, I take it back. I think this decorator is serving a different purpose, see Line 398 in 3915c85.

adding this annotation would make tox pick up this test in some other unit test suites, and then skip it because vertex ai deps are not installed.
  @pytest.mark.uses_vertex_ai
  @pytest.mark.it_postcommit
  def test_vertex_ai_run_flower_image_classification(self):
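The marker mechanics under discussion can be sketched as follows (a hypothetical test name, assuming pytest is available; a suite then selects tests with e.g. `pytest -m "uses_vertex_ai and it_postcommit"`):

```python
import pytest


# Stacked pytest marks on one test: each suite filters by a mark
# expression, which is how the uses_tf mark could cause tox to pick
# this test up in unrelated unit-test suites, as the thread describes.
@pytest.mark.uses_tf
@pytest.mark.uses_vertex_ai
@pytest.mark.it_postcommit
def test_flower_image_classification_sketch():
    # Placeholder body; the real test launches a Dataflow pipeline.
    pass
```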
Given that we already have @pytest.mark.uses_vertex_ai, I wouldn't do this skip. We might accidentally silently disable the test and won't know it's not running.
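One way to address the silent-disable risk this comment raises is to make the skip conditional on the environment, failing loudly in suites where the dependency is supposed to be installed. A hypothetical sketch (`REQUIRE_TF` is an invented variable, not a Beam convention):

```python
import os
import unittest

try:
    import tensorflow as tf
except ImportError:
    tf = None


class LoudSkipTest(unittest.TestCase):
    def setUp(self):
        if tf is None:
            if os.environ.get('REQUIRE_TF'):
                # In suites whose image should include TF, a missing
                # dependency is a real failure, not a quiet skip.
                self.fail('tensorflow required in this suite but not installed')
            self.skipTest('tensorflow not installed')

    def test_pipeline_sketch(self):
        self.assertIsNotNone(tf)
```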
do we need to remove @pytest.mark.it_postcommit if this test is accidentally included into a suite where it is not supposed to run?
actually, this is almost the same as the statements in line 32. I guess I have a general question: how do we make sure this test is running in a suite where it is supposed to run? It sounds like this test was passing before, and now it is failing, b/c TF is no longer installed in Beam containers, and the spirit of this change is to skip the test.
Sounds like something might be misconfigured.
looks like we have beam/sdks/python/apache_beam/ml/inference/vertex_ai_tests_requirements.txt (Line 19 in 3915c85)
All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.
disregard that, it's WAI (working as intended); the limited log output just obscured the rest of the suites listed.
So, if we look at tests that have uses_tf, we have two kinds of tests: unit tests and postcommit IT tests (the latter run a whole pipeline).
Since for Vertex AI you don't have unit tests and only have postcommit tests, I think you can simplify the markers in your suite and only use one marker, uses_vertex_ai, and collect that test in the postcommit suite.
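The single-marker collection suggested here can be sketched as a pytest invocation built programmatically (the target path is hypothetical):

```python
import sys

# Hypothetical command a postcommit suite could run to collect only the
# Vertex AI tests by their single marker; '-m' filters by mark expression.
cmd = [
    sys.executable, '-m', 'pytest',
    '-m', 'uses_vertex_ai',
    'apache_beam/examples/inference/',
]
print(' '.join(cmd))
```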
postCommitIT is one test suite out of many that run in a postcommit Jenkins / GHA suite. The VertexAI suite is another suite running in parallel.
beam/build.gradle.kts
Line 497 in a90879a
to clarify: the vertex ai suite is part of the inferencePostcommitIT suite:
beam/sdks/python/test-suites/dataflow/common.gradle
Line 470 in 6c4bdf1
Tensorflow is getting installed and the test isn't getting skipped for lacking tensorflow, but the test is running on Dataflow where the worker doesn't have tensorflow - https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-19_10_21_28-17008252103869768778?project=apache-beam-testing
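Given that diagnosis (the submission host has TensorFlow, the Dataflow worker does not), one standard Beam remedy, not necessarily what this PR chose, is to stage the dependencies onto the workers, e.g. via --requirements_file. A hypothetical sketch of the pipeline args:

```python
# '--requirements_file' is a standard Beam Python pipeline option that
# pip-installs the listed packages on each Dataflow worker, so a
# worker-side 'import tensorflow' can succeed. The file path here is
# hypothetical.
pipeline_args = [
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--requirements_file=vertex_ai_tests_requirements.txt',
]
print(pipeline_args)
```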