Add extra marks to VertexAI IT tests #28534

Merged (5 commits, Sep 19, 2023)

Changes from 3 commits
@@ -26,7 +26,9 @@
from apache_beam.io.filesystems import FileSystems
from apache_beam.testing.test_pipeline import TestPipeline

# pylint: disable=ungrouped-imports
try:
import tensorflow as tf
from apache_beam.examples.inference import vertex_ai_image_classification
from apache_beam.examples.inference import vertex_ai_llm_text_classification
except ImportError as e:
@@ -45,6 +47,10 @@


class VertexAIInference(unittest.TestCase):
@unittest.skipIf(
tf is None, 'Missing dependencies. '
Contributor:
Given that we already have @pytest.mark.uses_vertex_ai, I wouldn't do this skip. We might accidentally silently disable the test and won't know it's not running.
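For reference, the skip guard under discussion follows the standard unittest pattern: import tensorflow inside a try/except, bind tf to None on failure, then gate the test with skipIf. A minimal standalone sketch (class and test names here are illustrative, not the PR's actual code):

```python
import unittest

# Guarded import: tf becomes None when tensorflow is unavailable,
# mirroring the try/except block at the top of the test module.
try:
    import tensorflow as tf
except ImportError:
    tf = None


class VertexAIInferenceSketch(unittest.TestCase):
    @unittest.skipIf(
        tf is None, 'Missing dependencies. '
        'Test depends on tensorflow')
    def test_flower_image_classification(self):
        # Placeholder standing in for the real pipeline run.
        self.assertIsNotNone(tf)
```

With tensorflow absent, the runner reports the test as skipped rather than failed, which is exactly the silent-disable risk raised above.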

Contributor:
do we need to remove @pytest.mark.it_postcommit if this test is accidentally included in a suite where it is not supposed to run?

Contributor @tvalentyn (Sep 19, 2023):
Actually, this is almost the same as the statements in line 32. I guess I have a general question: how do we make sure this test is running in the suite where it is supposed to run? It sounds like this test was passing before, and now it is failing because TF is no longer installed in Beam containers, and the spirit of this change is to skip the test.

Sounds like something might be misconfigured.

  • If it was included in a suite that picks up all it_postcommit tests, not a VertexAI-specific one, that's not expected. Is it because we incorrectly included the it_postcommit annotation? Then we should remove it.
  • If the test was included in a special Vertex AI suite, which we run as part of postcommits, and it now starts to fail, I'd expect to see changes adding tensorflow to a requirements file, to make the test pass again.

Contributor:
Looks like we have [embedded snippet], so we probably need to remove the it_postcommit annotation from this test to not include it twice.

Contributor Author:
All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.

Contributor Author:
Disregard that, it's WAI; the limited loading of the logs just obscured the rest of the suites listed.

Contributor:
> All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.

So, if we look at tests that have uses_tf, we have two kinds of tests: unit tests and postcommit IT tests (the latter run a whole pipeline).
Since for Vertex AI you don't have unit tests and only have postcommit tests, I think you can simplify the markers in your suite to only use one marker, uses_vertex_ai, and collect that test in the postcommit suite.
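For context, custom markers like uses_vertex_ai and it_postcommit are normally registered in the project's pytest configuration so collection can filter on them. A hypothetical fragment (the descriptions are invented for illustration, not copied from Beam's config):

```ini
[pytest]
markers =
    uses_vertex_ai: enable tests that depend on the Vertex AI SDK
    it_postcommit: integration tests collected by postcommit suites
```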

Contributor @tvalentyn (Sep 19, 2023):
> What's weird is that we're getting the gradle task invoked separately and not as part of the postcommit

postCommitIT is one test suite out of many that run in a postcommit Jenkins / GHA suite. The VertexAI suite is another suite running in parallel.

tasks.register("python38PostCommit") {

Contributor:
to clarify: vertex ai suite is a part of inferencePostcommitIT suite:

project.tasks.register("inferencePostCommitIT") {

Contributor Author:

[image]

Tensorflow is getting installed and the test isn't getting skipped for lacking tensorflow, but the test is running on Dataflow where the worker doesn't have tensorflow - https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-19_10_21_28-17008252103869768778?project=apache-beam-testing

'Test depends on tensorflow')
@pytest.mark.uses_tf
Contributor:

is this necessary?

Contributor:

It's OK to keep. Looks like we already defined this marker and use it in other tests.

Contributor Author:

Perhaps not, but considering the postcommits went red without a clear reason, I'd rather cover all of our bases here. I would have thought that missing TF would have gotten caught in the try-except-skip import block that imports the pipeline code, but it didn't.

Contributor:

Actually, I take it back. I think this decorator is serving a different purpose, see

/bin/sh -c 'pytest -o junit_suite_name={envname} --junitxml=pytest_{envname}.xml -n 6 -m uses_tf {posargs}; ret=$?; [ $ret = 5 ] && exit 0 || exit $ret'
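pytest exits with code 5 when the -m filter collects no tests, which is why the wrapper above converts exit code 5 to 0 instead of failing the suite. A sketch of that exit-code handling in Python (run_suite and the stand-in commands are illustrative, not Beam code):

```python
import subprocess


def run_suite(cmd):
    """Run a test command; treat pytest's exit code 5
    ('no tests collected') as success, like the tox wrapper above."""
    ret = subprocess.call(cmd, shell=True)
    return 0 if ret == 5 else ret
```

Here run_suite('exit 5') returns 0, while a genuine failure such as exit code 1 still propagates to the caller.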

Contributor:

Adding this annotation would make tox pick up this test in some other unit test suites, and then skip it because the Vertex AI deps are not installed.

@pytest.mark.uses_vertex_ai
@pytest.mark.it_postcommit
def test_vertex_ai_run_flower_image_classification(self):