
[test only] Test PostCommit runs for #31116 #31107

Conversation

Abacn
Contributor

@Abacn Abacn commented Apr 25, 2024

Reverts #30770

The newly added test TestFastAvro::test_avro_schema_to_beam_schema_with_nullable_atomic_fields is failing the XVR tests:

#30770 (comment)
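For context, the failing test exercises the mapping of Avro nullable unions (e.g. ["null", "int"]) to nullable Beam schema fields. The sketch below is a minimal, hypothetical illustration of that mapping in plain Python; the function and lookup table are invented for this example and are not Beam's actual conversion code in apache_beam/io/avroio.py.

```python
from typing import Optional

# Hypothetical mapping from Avro primitive type names to Python types.
_AVRO_TO_PY = {
    "int": int,
    "long": int,
    "string": str,
    "boolean": bool,
    "double": float,
}

def avro_field_to_py_type(avro_type):
    """Convert an Avro field type (possibly a ["null", T] union) to a
    Python type annotation, using Optional[T] for nullable fields."""
    if isinstance(avro_type, list):
        # A union containing "null" marks the field as nullable.
        non_null = [t for t in avro_type if t != "null"]
        if len(non_null) == 1 and "null" in avro_type:
            return Optional[_AVRO_TO_PY[non_null[0]]]
        raise ValueError("unsupported union: %r" % (avro_type,))
    return _AVRO_TO_PY[avro_type]
```

Under this sketch, ["null", "int"] maps to Optional[int] while a bare "string" maps to str, which is the nullable-atomic-field behavior the test asserts on.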

@Abacn
Contributor Author

Abacn commented Apr 25, 2024

#30770 bundled several bug fixes together. We can add back, one at a time, the fixes that did not have issues.

@Abacn
Contributor Author

Abacn commented Apr 25, 2024

R: @benkonz @ahmedabu98

@Abacn Abacn closed this Apr 25, 2024
@Abacn Abacn reopened this Apr 25, 2024
Contributor

The Workflow run is cancelling this PR. It is an earlier duplicate of 2083803 run.

Contributor

The Workflow run is cancelling this PR. It is an earlier duplicate of 1729654 run.

Contributor

Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control

@benkonz
Contributor

benkonz commented Apr 25, 2024

Yeah, for some reason, in the Flink runner specifically, after the SqlTransform("SELECT * FROM PCOLLECTION") in the test_avro_schema_to_beam_schema_with_nullable_atomic_fields test, no records are output. I verified that the test works locally when I run pytest -v apache_beam/io/avroio_test.py::TestFastAvro::test_avro_schema_to_beam_schema_with_nullable_atomic_fields, but with Flink it just outputs an empty PCollection. Still looking into this, but feel free to merge the revert. I'll post a new PR once I can get the test working, and I'm open to feedback if anyone knows what's causing this behavior!

@codecov-commenter

codecov-commenter commented Apr 25, 2024

Codecov Report

Attention: Patch coverage is 25.00000%, with 6 lines in your changes missing coverage. Please review.

Project coverage is 71.01%. Comparing base (2db9b80) to head (0dfdcc2).
Report is 8 commits behind head on master.

❗ Current head 0dfdcc2 differs from pull request most recent head 2ed07b8. Consider uploading reports for the commit 2ed07b8 to get more accurate results

Files Patch % Lines
sdks/python/apache_beam/io/avroio.py 25.00% 6 Missing ⚠️
Additional details and impacted files
@@              Coverage Diff              @@
##             master   #31107       +/-   ##
=============================================
+ Coverage     60.36%   71.01%   +10.64%     
  Complexity     2980     2980               
=============================================
  Files           659     1064      +405     
  Lines         66058   133212    +67154     
  Branches       3231     3231               
=============================================
+ Hits          39875    94595    +54720     
- Misses        23083    35517    +12434     
  Partials       3100     3100               
Flag Coverage Δ
python 81.48% <25.00%> (?)

Flags with carried forward coverage won't be shown. Click here to find out more.

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

@benkonz
Contributor

benkonz commented Apr 26, 2024

@Abacn I found the cause of the issue in the test. The Flink runner wasn't able to access the local file-system with avroio.ReadFromAvro(...) and avroio.WriteToAvro(...). I refactored the test in #31116 to create a PCollection from memory, rather than using temp directories, and verified that the previously failing gradle command works (./gradlew :runners:flink:1.15:job-server:validatesCrossLanguageRunnerPythonUsingSql). Could you take a look?
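The fix described above follows a general testing pattern: build the input in memory instead of round-tripping it through the local filesystem, which a remote runner (such as the Flink job server) may not share with the test process. The stdlib-only sketch below contrasts the two fixtures; the helper names and JSON serialization are hypothetical stand-ins for the Beam transforms (WriteToAvro/ReadFromAvro vs. beam.Create), not the actual test code in #31116.

```python
import json
import os
import tempfile

RECORDS = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]

def records_via_tempfile(records):
    """File-based fixture: writes records to a temp file and reads them back.
    This breaks when the runner cannot see the test process's filesystem."""
    fd, path = tempfile.mkstemp(suffix=".json")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(records, f)
        with open(path) as f:
            return json.load(f)
    finally:
        os.remove(path)

def records_in_memory(records):
    """In-memory fixture: analogous to beam.Create(records); runner-agnostic."""
    return list(records)
```

Both fixtures yield the same records locally; only the in-memory version stays correct when the pipeline executes on a remote runner.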

@Abacn
Contributor Author

Abacn commented Apr 26, 2024

@benkonz thanks for the fix. The test wasn't failing only on the Flink runner; it was also failing on the direct runner (and all runners).

See:
https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Spark3.yml?query=event%3Aschedule
https://github.com/apache/beam/actions/workflows/beam_PostCommit_XVR_Samza.yml?query=event%3Aschedule
https://github.com/apache/beam/actions/runs/8847577228/job/24295682944

Let me trigger tests for #31116 to see if the test is fixed on all runners.

@Abacn Abacn force-pushed the revert-30770-python-sdk-avro-beam-schema-conversion-bugfixes branch from 0dfdcc2 to 2ed07b8 Compare April 26, 2024 15:48
@github-actions github-actions bot added the build label Apr 26, 2024
@Abacn Abacn changed the title Revert "python sdk: fix several bugs regarding avto <-> beam schema conversion" [test only] Test PostCommit runs for #31116 Apr 26, 2024
@Abacn Abacn marked this pull request as draft April 26, 2024 15:51
@Abacn Abacn closed this Apr 26, 2024
@Abacn Abacn deleted the revert-30770-python-sdk-avro-beam-schema-conversion-bugfixes branch April 26, 2024 16:59
3 participants