Put an explanatory comment in trigger files that exist #30586

Merged: 1 commit merged on Mar 12, 2024

Commit 97f8dae: Put an explanatory comment in trigger files that exist
GitHub Actions / Test Results failed Mar 12, 2024 in 0s

7 fail, 28 skipped, 147 pass in 11h 55m 37s

182 tests (+157)    147 ✅ passed (+122)    11h 55m 37s ⏱️ run time (+11h 45m 34s)
 45 suites (+28)     28 💤 skipped (+28)
 45 files (+28)       7 ❌ failed (+7)

Results for commit 97f8dae, compared against earlier commit 43b69c3f.

Annotations

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT

github-actions / Test Results

testDirectRead (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT.xml [took 10s]
Raw output
java.lang.RuntimeException: Failed to create a workflow job: Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1462)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:109)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT.testDirectRead(BigQueryDirectReadSchemaTransformProviderIT.java:284)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
POST https://dataflow.googleapis.com/v1b3/projects/parent-project/locations/us-central1/jobs
{
  "code" : 400,
  "details" : [ {
    "@type" : "type.googleapis.com/google.rpc.Help"
  }, {
    "@type" : "type.googleapis.com/google.rpc.ErrorInfo",
    "reason" : "CONSUMER_INVALID"
  } ],
  "errors" : [ {
    "domain" : "global",
    "message" : "Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..",
    "reason" : "badRequest"
  } ],
  "message" : "Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..",
  "status" : "INVALID_ARGUMENT"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:439)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:525)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:466)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:576)
	at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:64)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1448)
	... 52 more
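
The CONSUMER_INVALID response above indicates the job was submitted for the literal project id 'parent-project', which looks like an unsubstituted placeholder rather than a real GCP project. Below is a minimal sketch of how a Dataflow test typically resolves its project from pipeline options; the class name, the placeholder check, and the example option values are assumptions for illustration, not taken from the Beam test code.

```java
// A minimal sketch, assuming the test reads its GCP project from standard
// Dataflow pipeline options. If --project is left at a placeholder such as
// "parent-project", job creation fails with CONSUMER_INVALID, as in the
// trace above. Option values shown are illustrative.
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ProjectOptionCheck {
  public static void main(String[] args) {
    // Example args: --runner=TestDataflowRunner --project=my-gcp-project
    //               --region=us-central1 --tempLocation=gs://my-bucket/tmp
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);

    if ("parent-project".equals(options.getProject())) {
      throw new IllegalArgumentException(
          "GCP project is still the placeholder 'parent-project'; pass a real --project value.");
    }

    Pipeline pipeline = Pipeline.create(options);
    // ... build the pipeline under test, then pipeline.run().waitUntilFinish();
  }
}
```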

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT

github-actions / Test Results

testDirectReadWithSelectedFieldsAndRowRestriction (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT.xml [took 8s]
Raw output
java.lang.RuntimeException: Failed to create a workflow job: Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1462)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:109)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT.testDirectReadWithSelectedFieldsAndRowRestriction(BigQueryDirectReadSchemaTransformProviderIT.java:348)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
POST https://dataflow.googleapis.com/v1b3/projects/parent-project/locations/us-central1/jobs
{
  "code" : 400,
  "details" : [ {
    "@type" : "type.googleapis.com/google.rpc.Help"
  }, {
    "@type" : "type.googleapis.com/google.rpc.ErrorInfo",
    "reason" : "CONSUMER_INVALID"
  } ],
  "errors" : [ {
    "domain" : "global",
    "message" : "Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..",
    "reason" : "badRequest"
  } ],
  "message" : "Consumer 'projects/parent-project' is invalid: Resource projects/parent-project could not be found..",
  "status" : "INVALID_ARGUMENT"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:439)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:525)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:466)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:576)
	at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:64)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1448)
	... 52 more

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT

github-actions / Test Results

testSimpleWrite (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.xml [took 5m 29s]
Raw output
java.lang.RuntimeException: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
Workflow failed. Causes: S02:Create.Values/Read(CreateSource)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/element-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120703-2h2z-harness-crc5,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120703-2h2z-harness-crc5,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120703-2h2z-harness-crc5,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120703-2h2z-harness-crc5
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:149)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.testSimpleWrite(BigQueryStorageWriteApiSchemaTransformProviderIT.java:180)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
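
The repeated NullPointerException at FakeDatasetService.getTableContainer is consistent with a test fake whose in-memory table registry only exists in the JVM that registered the tables; on a remote Dataflow worker the lookup returns null and the subsequent createWriteStream call dereferences it. The sketch below is a hypothetical, simplified model of that failure mode, not Beam's actual FakeDatasetService; the class, field, and method bodies are assumptions for illustration.

```java
// Hypothetical, simplified model of a fake BigQuery dataset service backed by
// static in-memory state. On a remote Dataflow worker this static map is empty,
// so the lookup returns null and the caller throws a NullPointerException,
// matching the pattern in the stack traces above.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class InMemoryFakeDatasetService {

  static class TableContainer {
    // Holds rows and write streams for one table (details omitted).
  }

  // Static state lives only in the JVM where tables were registered (the test JVM).
  private static final Map<String, TableContainer> tables = new ConcurrentHashMap<>();

  public void createTable(String tableSpec) {
    tables.put(tableSpec, new TableContainer());
  }

  private TableContainer getTableContainer(String tableSpec) {
    // Returns null in a worker JVM that never saw createTable(...).
    return tables.get(tableSpec);
  }

  public String createWriteStream(String tableSpec) {
    // NullPointerException here when getTableContainer(...) returned null,
    // analogous to FakeDatasetService.createWriteStream in the logs.
    TableContainer container = getTableContainer(tableSpec);
    return container.toString() + "/streams/stream-1";
  }
}
```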

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT

github-actions / Test Results

testFailedRows (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.xml [took 5m 34s]
Raw output
java.lang.RuntimeException: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
Workflow failed. Causes: S04:Create.Values/Read(CreateSource)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/element-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Construct failed rows and errors/Map+ExtractFailedRows/Map+PAssert$9/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$9/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$9/GroupGlobally/WithKeys/AddKeys/Map+PAssert$9/GroupGlobally/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/error-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Construct failed rows and errors/Map+ExtractFailedRows/Map+PAssert$9/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$9/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$9/GroupGlobally/WithKeys/AddKeys/Map+PAssert$9/GroupGlobally/GroupByKey/Reify+PAssert$9/GroupGlobally/GroupByKey/Session/Flatten+PAssert$9/GroupGlobally/GroupByKey/Write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/error-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120709-lqa7-harness-n6rk,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120709-lqa7-harness-n6rk,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120709-lqa7-harness-n6rk,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120709-lqa7-harness-n6rk
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:149)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.testFailedRows(BigQueryStorageWriteApiSchemaTransformProviderIT.java:298)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT

github-actions / Test Results

testInputElementCount (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.xml [took 5m 12s]
Raw output
java.lang.RuntimeException: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
Workflow failed. Causes: S02:Create.Values/Read(CreateSource)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/element-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120714-lct6-harness-ghqj,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120714-lct6-harness-ghqj,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120714-lct6-harness-ghqj,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120714-lct6-harness-ghqj
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:149)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.testInputElementCount(BigQueryStorageWriteApiSchemaTransformProviderIT.java:231)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT

github-actions / Test Results

testErrorCount (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.xml [took 5m 14s]
Raw output
java.lang.RuntimeException: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
Workflow failed. Causes: S02:Create.Values/Read(CreateSource)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/element-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Construct failed rows and errors/Map+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/error-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Construct failed rows and errors/Map+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/error-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120720-s5de-harness-q77k,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120720-s5de-harness-q77k,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120720-s5de-harness-q77k,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120720-s5de-harness-q77k
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:149)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.testErrorCount(BigQueryStorageWriteApiSchemaTransformProviderIT.java:324)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
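Note: the repeated java.lang.NullPointerException above is thrown inside the test-only FakeDatasetService (org.apache.beam.sdk.io.gcp.testing) while createWriteStream calls getTableContainer, which suggests the write path asked the fake for a table or dataset that was never registered with it. The sketch below is a minimal, hypothetical illustration of that failure mode and of a checked lookup that names the missing piece instead of surfacing a bare NPE; the field and method names are illustrative and are not Beam's actual FakeDatasetService internals.

// Hypothetical sketch (not Beam's FakeDatasetService): a nested-map lookup
// like this throws a bare NullPointerException when the project or dataset
// key is missing, which matches the trace above; the checked variant
// reports which piece is absent instead.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

final class CheckedTableLookup {
  // project -> dataset -> table -> container (Object is a placeholder type)
  private final Map<String, Map<String, Map<String, Object>>> tables = new ConcurrentHashMap<>();

  Object getTableContainerChecked(String project, String dataset, String table) {
    Map<String, Map<String, Object>> datasets = tables.get(project);
    if (datasets == null) {
      throw new IllegalStateException("Project not registered with fake service: " + project);
    }
    Map<String, Object> containers = datasets.get(dataset);
    if (containers == null) {
      throw new IllegalStateException(
          "Dataset not registered with fake service: " + project + "." + dataset);
    }
    // May still be null if the table itself was never created by the test setup.
    return containers.get(table);
  }
}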

Check warning on line 0 in org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT

@github-actions / Test Results

testWriteToDynamicDestinations (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT) failed

runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.xml [took 5m 6s]
Raw output
java.lang.RuntimeException: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.getTableContainer(FakeDatasetService.java:289)
	at org.apache.beam.sdk.io.gcp.testing.FakeDatasetService.createWriteStream(FakeDatasetService.java:575)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.lambda$getOrCreateStreamName$0(StorageApiWriteUnshardedRecords.java:362)
	at org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers.createTableWrapper(CreateTableHelpers.java:70)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.getOrCreateStreamName(StorageApiWriteUnshardedRecords.java:357)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn$DestinationState.flush(StorageApiWriteUnshardedRecords.java:629)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.flushAll(StorageApiWriteUnshardedRecords.java:956)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords$WriteRecordsDoFn.finishBundle(StorageApiWriteUnshardedRecords.java:1113)
Workflow failed. Causes: S02:Create.Values/Read(CreateSource)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/element-count+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/post-write+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/Error on failed inserts+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120725-umnz-harness-p785,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120725-umnz-harness-p785,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120725-umnz-harness-p785,

      Root cause: Work item failed.
      Worker ID: bigquerystoragewriteapisc-03120725-umnz-harness-p785
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:149)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:101)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:56)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:324)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
	at org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT.testWriteToDynamicDestinations(BigQueryStorageWriteApiSchemaTransformProviderIT.java:208)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:323)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:112)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:40)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:60)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:52)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
	at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:113)
	at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:65)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
	at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
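Both failed tests surface the same NullPointerException; the Dataflow-level message then reports that the job was abandoned after the work item failed four times on the same worker. The snippet below is a rough sketch of that stated retry semantics only, illustrative and not Dataflow's actual scheduler code.

// Illustrative only: models the behavior described in the error message
// ("a work item has failed 4 times"), not Dataflow's implementation.
final class WorkItemRetry {
  static final int MAX_ATTEMPTS = 4;

  static void runWithRetries(Runnable workItem) {
    RuntimeException last = null;
    for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
      try {
        workItem.run();
        return; // attempt succeeded, the job continues
      } catch (RuntimeException e) {
        last = e; // each failed attempt is reported as "Root cause: Work item failed."
      }
    }
    // After the fourth failure the whole job fails, as in the log above.
    throw new RuntimeException(
        "The job failed because a work item has failed " + MAX_ATTEMPTS + " times.", last);
  }
}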

Check notice on line 0 in .github

@github-actions / Test Results

28 skipped tests found

There are 28 skipped tests; see "Raw output" for the full list of skipped tests.
Raw output
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithAutoShardingAndCopyJobs[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithFixedShards[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithAutoShardingAndCopyJobs[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithFixedShards[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingValueSchemaUnknownTakeNull
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[3]
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadWithDataBoost

Check notice on line 0 in .github

@github-actions / Test Results

182 tests found

There are 182 tests; see "Raw output" for the full list of tests.
Raw output
org.apache.beam.sdk.io.gcp.bigquery.BigQueryClusteringIT ‑ testE2EBigQueryClusteringNoPartitionDynamicDestinations
org.apache.beam.sdk.io.gcp.bigquery.BigQueryClusteringIT ‑ testE2EBigQueryClusteringNoPartitionTableFunction
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOJsonIT ‑ testFileLoadWriteExportRead
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOJsonIT ‑ testLegacyStreamingWriteDefaultRead
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOJsonIT ‑ testQueryRead
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOJsonIT ‑ testStorageWriteRead
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageQueryIT ‑ testBigQueryStorageQuery1G
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageQueryIT ‑ testBigQueryStorageQueryWithErrorHandling1M
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageRead1GArrow
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageRead1GAvro
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageRead1MErrorHandlingArrow
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageRead1MErrorHandlingAvro
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageReadProjectionPushdown
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageReadWithArrow
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT ‑ testBigQueryStorageReadWithAvro
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageWriteIT ‑ testBigQueryStorageWrite3KProtoALOStreaming
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageWriteIT ‑ testBigQueryStorageWrite3KProtoStreaming
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageWriteIT ‑ testBigQueryStorageWrite3MProto
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageWriteIT ‑ testBigQueryStorageWrite3MProtoALO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryNestedRecordsIT ‑ testNestedRecords
org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT ‑ runWriteTestTempTableAndDynamicDestination
org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT ‑ testAllowFieldAddition
org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT ‑ testAllowFieldRelaxation
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT ‑ testE2EBigQueryClustering
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT ‑ testE2EBigQueryClusteringDynamicDestinations
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT ‑ testE2EBigQueryClusteringTableFunction
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT ‑ testE2EBigQueryTimePartitioning
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT ‑ testLegacyQueryWithoutReshuffle
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT ‑ testNewTypesQueryWithReshuffle
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT ‑ testNewTypesQueryWithoutReshuffle
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT ‑ testStandardQueryWithoutCustom
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithAutoShardingAndCopyJobs[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithAutoShardingAndCopyJobs[1]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithFixedShards[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testDynamicDestinationsWithFixedShards[1]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithAutoShardingAndCopyJobs[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithAutoShardingAndCopyJobs[1]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithFixedShards[0]
org.apache.beam.sdk.io.gcp.bigquery.FileLoadsStreamingIT ‑ testLoadWithFixedShards[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiDirectWriteProtosIT ‑ testDirectWriteProtos[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiDirectWriteProtosIT ‑ testDirectWriteProtos[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiDirectWriteProtosIT ‑ testDirectWriteProtos[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiDirectWriteProtosIT ‑ testDirectWriteProtos[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkCreateIfNeededIT ‑ testCreateManyTables[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkCreateIfNeededIT ‑ testCreateManyTables[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingRequiredValueSchemaKnownTakeDefault
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingRequiredValueSchemaKnownTakeNull
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingRequiredValueSchemaUnknownTakeDefault
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingValueSchemaKnownTakeDefault
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingValueSchemaKnownTakeNull
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingValueSchemaUnknownTakeDefault
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkDefaultValuesIT ‑ testMissingValueSchemaUnknownTakeNull
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testInvalidRowCaughtByBigquery[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testInvalidRowCaughtByBigquery[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testInvalidRowCaughtByBigquery[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testInvalidRowCaughtByBigquery[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testSchemaMismatchCaughtByBeam[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testSchemaMismatchCaughtByBeam[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testSchemaMismatchCaughtByBeam[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT ‑ testSchemaMismatchCaughtByBeam[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkRowUpdateIT ‑ testCdc
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithAutoSchemaUpdate[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnceWithIgnoreUnknownValues[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testAtLeastOnce[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithAutoSchemaUpdate[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnceWithIgnoreUnknownValues[3]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[0]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[1]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[2]
org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkSchemaUpdateIT ‑ testExactlyOnce[3]
org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProtoIT ‑ testBaseTableRow
org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProtoIT ‑ testNestedRichTypesAndNull
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT ‑ testDirectRead
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT ‑ testDirectReadWithSelectedFieldsAndRowRestriction
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProviderIT ‑ testValidateConfig
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testErrorCount
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testFailedRows
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testInputElementCount
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testInvalidConfig
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testSimpleWrite
org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProviderIT ‑ testWriteToDynamicDestinations
org.apache.beam.sdk.io.gcp.bigtable.BigtableReadIT ‑ testE2EBigtableRead
org.apache.beam.sdk.io.gcp.bigtable.BigtableReadIT ‑ testE2EBigtableSegmentRead
org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProviderIT ‑ testInvalidConfigs
org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProviderIT ‑ testRead
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWrite
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWriteWithEmptyMutationFailures
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWriteWithEmptyRowFailures
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWriteWithInvalidColumnFamilyFailures
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWriteWithInvalidTimestampFailures
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT ‑ testE2EBigtableWriteWithOversizedQualifierFailures
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testDeleteCellsFromColumn
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testDeleteCellsFromColumnWithTimestampRange
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testDeleteColumnFamily
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testDeleteRow
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testInvalidConfigs
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testSetMutationNewColumn
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProviderIT ‑ testSetMutationsExistingColumn
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testComplexMutation
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testDeleteCell
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testDeleteColumnFamily
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testDeleteRow
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testLargeMutation
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testManyMutations
org.apache.beam.sdk.io.gcp.bigtable.changestreams.it.BigtableChangeStreamIT ‑ testReadBigtableChangeStream
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT ‑ testSplitQueryFnWithLargeDataset
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT ‑ testSplitQueryFnWithSmallDataset
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT ‑ testE2EV1Read
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT ‑ testE2EV1ReadWithGQLQueryWithLimit
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT ‑ testE2EV1ReadWithGQLQueryWithNoLimit
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT ‑ testDatastoreWriterFnWithDuplicatedEntities
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT ‑ testE2EV1Write
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT ‑ testE2EV1WriteWithLargeEntities
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ batchGet
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ batchWrite_partialFailureOutputsToDeadLetterQueue
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ listCollections
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ listDocuments
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ partitionQuery
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ runQuery
org.apache.beam.sdk.io.gcp.firestore.it.FirestoreV1IT ‑ write
org.apache.beam.sdk.io.gcp.healthcare.FhirIOLROIT ‑ test_FhirIO_deidentify
org.apache.beam.sdk.io.gcp.healthcare.FhirIOLROIT ‑ test_FhirIO_exportFhirResources_BigQuery
org.apache.beam.sdk.io.gcp.healthcare.FhirIOLROIT ‑ test_FhirIO_exportFhirResources_Gcs
org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverythingIT ‑ testFhirIOPatientEverything[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT ‑ testFhirIOSearchWithGenericParameters[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT ‑ testFhirIOSearch[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT ‑ testFhirIOSearch_emptyResult[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle[DSTU2]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle[STU3]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle_parseResponse[DSTU2]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle_parseResponse[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_ExecuteBundle_parseResponse[STU3]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_Import[DSTU2]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_Import[R4]
org.apache.beam.sdk.io.gcp.healthcare.FhirIOWriteIT ‑ testFhirIO_Import[STU3]
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadIT ‑ testHL7v2IO_ListHL7v2Messages
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadIT ‑ testHL7v2IO_ListHL7v2Messages_filtered
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadWriteIT ‑ testHL7v2IOE2E
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadWriteIT ‑ testHL7v2IOGetAllByReadParameterE2E
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadWriteIT ‑ testHL7v2IOGetAllE2E
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOWriteIT ‑ testHL7v2IOWrite
org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIT ‑ testGetSchema
org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIT ‑ testGetSchemaPath
org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteIT ‑ testBoundedWriteLargeMessage
org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteIT ‑ testBoundedWriteMessageWithAttributes
org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteIT ‑ testBoundedWriteSequence
org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteIT ‑ testBoundedWriteSmallMessage
org.apache.beam.sdk.io.gcp.pubsublite.ReadWriteIT ‑ testPubsubLiteWriteReadWithSchemaTransform
org.apache.beam.sdk.io.gcp.pubsublite.ReadWriteIT ‑ testReadWrite
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testQuery
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testQueryWithTimeoutError
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testQueryWithTimeoutErrorPG
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testRead
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadAllRecordsInDb
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadFailsBadSession
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadFailsBadTable
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadWithDataBoost
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT ‑ testReadWithTimeoutError
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testFailFast
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testPgFailFast
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testReportFailures
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testSequentialWrite
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testWrite
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT ‑ testWriteViaSchemaTransform
org.apache.beam.sdk.io.gcp.storage.GcsKmsKeyIT ‑ testGcsWriteWithKmsKey
org.apache.beam.sdk.io.gcp.storage.GcsMatchIT ‑ testGcsMatchContinuously