Merge pull request #32791 from liferoad/beam-summit-2024 #29920

GitHub Actions / Test Results failed Oct 16, 2024 in 0s

1 failed, 22 skipped, 2 passed in 1m 32s

25 tests   1 suite   1 file   ⏱️ 1m 32s
 2 ✅ passed   22 💤 skipped   1 ❌ failed

Results for commit 1487cf5.

Annotations

Check warning on line 0 in apache_beam.transforms.managed_iceberg_it_test.ManagedIcebergIT

github-actions / Test Results

test_write_read_pipeline (apache_beam.transforms.managed_iceberg_it_test.ManagedIcebergIT) failed

sdks/python/pytest_ioCrossLanguage.xml [took 8s]
Raw output
RuntimeError: java.lang.IllegalArgumentException: unable to serialize SchemaCoder<Schema: Fields:
Field{name=tableIdentifierString, description=, type=STRING NOT NULL, options={{}}}
Field{name=serializableDataFile, description=, type=ROW<fileFormat STRING NOT NULL, recordCount INT64 NOT NULL, fileSizeInBytes INT64 NOT NULL, splitOffsets ARRAY<INT64 NOT NULL>, keyMetadata BYTES, partitionPath STRING NOT NULL, partitionSpecId INT32 NOT NULL, columnSizes MAP<INT32 NOT NULL, INT64 NOT NULL>, valueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nullValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nanValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, upperBounds MAP<INT32 NOT NULL, BYTES NOT NULL>, lowerBounds MAP<INT32 NOT NULL, BYTES NOT NULL>, path STRING NOT NULL> NOT NULL, options={{}}}
Encoding positions:
{tableIdentifierString=0, serializableDataFile=1}
Options:{{}}UUID: 9bee3d1d-c675-4960-86f0-c743c0328bea  UUID: 9bee3d1d-c675-4960-86f0-c743c0328bea delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$hT2DejGA@50b7af20
	at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
	at org.apache.beam.sdk.util.construction.CoderTranslation.toCustomCoder(CoderTranslation.java:158)
	at org.apache.beam.sdk.util.construction.CoderTranslation.toProto(CoderTranslation.java:118)
	at org.apache.beam.sdk.util.construction.SdkComponents.registerCoder(SdkComponents.java:284)
	at org.apache.beam.sdk.util.construction.PCollectionTranslation.toProto(PCollectionTranslation.java:35)
	at org.apache.beam.sdk.util.construction.SdkComponents.registerPCollection(SdkComponents.java:239)
	at org.apache.beam.sdk.util.construction.PTransformTranslation.translateAppliedPTransform(PTransformTranslation.java:610)
	at org.apache.beam.sdk.util.construction.ParDoTranslation$ParDoTranslator.translate(ParDoTranslation.java:184)
	at org.apache.beam.sdk.util.construction.PTransformTranslation.toProto(PTransformTranslation.java:277)
	at org.apache.beam.sdk.util.construction.SdkComponents.registerPTransform(SdkComponents.java:183)
	at org.apache.beam.sdk.util.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:96)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:477)
	at org.apache.beam.sdk.util.construction.PipelineTranslation.toProto(PipelineTranslation.java:68)
	at org.apache.beam.sdk.util.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
	at org.apache.beam.sdk.util.construction.PipelineTranslation.toProto(PipelineTranslation.java:52)
	at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:701)
	at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:762)
	at org.apache.beam.model.expansion.v1.ExpansionServiceGrpc$MethodHandlers.invoke(ExpansionServiceGrpc.java:306)
	at org.apache.beam.vendor.grpc.v1p60p1.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)
	at org.apache.beam.vendor.grpc.v1p60p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:351)
	at org.apache.beam.vendor.grpc.v1p60p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:861)
	at org.apache.beam.vendor.grpc.v1p60p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p60p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.NotSerializableException: org.apache.beam.sdk.schemas.utils.ByteBuddyUtils$TransformingMap
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
	at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
	... 34 more


self = <apache_beam.transforms.managed_iceberg_it_test.ManagedIcebergIT testMethod=test_write_read_pipeline>

    def test_write_read_pipeline(self):
      iceberg_config = {
          "table": "test.write_read",
          "catalog_name": "default",
          "catalog_properties": {
              "type": "hadoop",
              "warehouse": f"file://{self.warehouse_path}",
          }
      }
    
      rows = [self._create_row(i) for i in range(100)]
      expected_dicts = [row.as_dict() for row in rows]
    
      with beam.Pipeline() as write_pipeline:
        _ = (
>           write_pipeline
            | beam.Create(rows)
            | beam.managed.Write(beam.managed.ICEBERG, config=iceberg_config))

apache_beam/transforms/managed_iceberg_it_test.py:73: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pvalue.py:138: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:754: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/runner.py:191: in apply
    return self.apply_PTransform(transform, input, options)
apache_beam/runners/runner.py:195: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/managed.py:156: in expand
    return input | SchemaAwareExternalTransform(
apache_beam/pvalue.py:138: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:754: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/runner.py:191: in apply
    return self.apply_PTransform(transform, input, options)
apache_beam/runners/runner.py:195: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/external.py:429: in expand
    return pcolls | self._payload_builder.identifier() >> ExternalTransform(
apache_beam/pvalue.py:138: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:681: in apply
    return self.apply(
apache_beam/pipeline.py:692: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:754: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/runner.py:191: in apply
    return self.apply_PTransform(transform, input, options)
apache_beam/runners/runner.py:195: in apply_PTransform
    return transform.expand(input)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <ExternalTransform(PTransform) label=[ExternalTransform(beam:expansion:payload:schematransform:v1)] at 0x7e838c060ee0>
pvalueish = <PCollection[Create/Map(decode).None] at 0x7e838c176970>

    def expand(self, pvalueish):
      # type: (pvalue.PCollection) -> pvalue.PCollection
      if isinstance(pvalueish, pvalue.PBegin):
        self._inputs = {}
      elif isinstance(pvalueish, (list, tuple)):
        self._inputs = {str(ix): pvalue for ix, pvalue in enumerate(pvalueish)}
      elif isinstance(pvalueish, dict):
        self._inputs = pvalueish
      else:
        self._inputs = {'input': pvalueish}
      pipeline = (
          next(iter(self._inputs.values())).pipeline
          if self._inputs else pvalueish.pipeline)
      context = pipeline_context.PipelineContext(
          component_id_map=pipeline.component_id_map)
      transform_proto = beam_runner_api_pb2.PTransform(
          unique_name=pipeline._current_transform().full_label,
          spec=beam_runner_api_pb2.FunctionSpec(
              urn=self._urn, payload=self._payload))
      for tag, pcoll in self._inputs.items():
        transform_proto.inputs[tag] = context.pcollections.get_id(pcoll)
        # Conversion to/from proto assumes producers.
        # TODO: Possibly loosen this.
        context.transforms.put_proto(
            '%s_%s' % (self._IMPULSE_PREFIX, tag),
            beam_runner_api_pb2.PTransform(
                unique_name='%s_%s' % (self._IMPULSE_PREFIX, tag),
                spec=beam_runner_api_pb2.FunctionSpec(
                    urn=common_urns.primitives.IMPULSE.urn),
                outputs={'out': transform_proto.inputs[tag]}))
      output_coders = None
      if self._type_hints.output_types:
        if self._type_hints.output_types[0]:
          output_coders = dict(
              (str(k), context.coder_id_from_element_type(v))
              for (k, v) in enumerate(self._type_hints.output_types[0]))
        elif self._type_hints.output_types[1]:
          output_coders = {
              k: context.coder_id_from_element_type(v)
              for (k, v) in self._type_hints.output_types[1].items()
          }
      components = context.to_runner_api()
      request = beam_expansion_api_pb2.ExpansionRequest(
          components=components,
          namespace=self._external_namespace,
          transform=transform_proto,
          output_coder_requests=output_coders,
          pipeline_options=pipeline._options.to_runner_api())
    
      expansion_service = _maybe_use_transform_service(
          self._expansion_service, pipeline.options)
    
      with ExternalTransform.service(expansion_service) as service:
        response = service.Expand(request)
        if response.error:
>         raise RuntimeError(_sanitize_java_traceback(response.error))
E         RuntimeError: java.lang.IllegalArgumentException: unable to serialize SchemaCoder<Schema: Fields:
E         Field{name=tableIdentifierString, description=, type=STRING NOT NULL, options={{}}}
E         Field{name=serializableDataFile, description=, type=ROW<fileFormat STRING NOT NULL, recordCount INT64 NOT NULL, fileSizeInBytes INT64 NOT NULL, splitOffsets ARRAY<INT64 NOT NULL>, keyMetadata BYTES, partitionPath STRING NOT NULL, partitionSpecId INT32 NOT NULL, columnSizes MAP<INT32 NOT NULL, INT64 NOT NULL>, valueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nullValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nanValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, upperBounds MAP<INT32 NOT NULL, BYTES NOT NULL>, lowerBounds MAP<INT32 NOT NULL, BYTES NOT NULL>, path STRING NOT NULL> NOT NULL, options={{}}}
E         Encoding positions:
E         {tableIdentifierString=0, serializableDataFile=1}
E         Options:{{}}UUID: 9bee3d1d-c675-4960-86f0-c743c0328bea  UUID: 9bee3d1d-c675-4960-86f0-c743c0328bea delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$hT2DejGA@50b7af20
E         	at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
E         	... (remainder of the Java stack trace is identical to the trace shown above)
E         Caused by: java.io.NotSerializableException: org.apache.beam.sdk.schemas.utils.ByteBuddyUtils$TransformingMap
E         	... 34 more

apache_beam/transforms/external.py:754: RuntimeError
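
The failure originates on the Java side: during cross-language expansion, the expansion service Java-serializes the SchemaCoder built for the (tableIdentifierString, serializableDataFile) row, and the generated delegate coder holds a ByteBuddyUtils$TransformingMap, which does not implement java.io.Serializable (see the "Caused by" above). Nothing in the Python test itself is at fault; any pipeline that expands this transform should fail the same way. Below is a minimal sketch of a standalone reproduction distilled from the test code above, assuming a local directory as the Hadoop-catalog warehouse and a Java runtime available for the expansion service; the table name and row fields are illustrative stand-ins.

# Minimal sketch reproducing the failing write path (assumptions noted above).
import tempfile

import apache_beam as beam

# Hypothetical local warehouse directory; any writable path works for a
# Hadoop-type catalog.
warehouse = tempfile.mkdtemp()
config = {
    "table": "test.write_read",
    "catalog_name": "default",
    "catalog_properties": {
        "type": "hadoop",
        "warehouse": f"file://{warehouse}",
    },
}

with beam.Pipeline() as p:
    _ = (
        p
        # Illustrative rows; the test's _create_row() builds similar
        # beam.Row values with a fixed schema.
        | beam.Create([beam.Row(id=i, name=f"value_{i}") for i in range(100)])
        # Expansion of this transform is the point where the Java expansion
        # service raises the IllegalArgumentException shown in the log.
        | beam.managed.Write(beam.managed.ICEBERG, config=config))

Because the exception is thrown inside ExpansionService.expand, before any data moves, a fix would have to land in the Java SDK's schema-coder generation (making the offending map serializable or keeping it out of the coder), not in this Python test.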

Check notice on line 0 in .github

github-actions / Test Results

22 skipped tests found

There are 22 skipped tests; see "Raw output" below for the full list.
Raw output
apache_beam.examples.ml_transform.ml_transform_it_test
apache_beam.examples.snippets.transforms.elementwise.mltransform_test
apache_beam.examples.snippets.transforms.elementwise.runinference_test
apache_beam.io.external.xlang_kafkaio_it_test.CrossLanguageKafkaIOTest ‑ test_hosted_kafkaio_null_key
apache_beam.io.external.xlang_kafkaio_it_test.CrossLanguageKafkaIOTest ‑ test_hosted_kafkaio_populated_key
apache_beam.ml.inference.huggingface_inference_it_test
apache_beam.ml.inference.huggingface_inference_test
apache_beam.ml.inference.onnx_inference_test
apache_beam.ml.inference.pytorch_inference_test
apache_beam.ml.inference.tensorflow_inference_test
apache_beam.ml.inference.tensorrt_inference_test
apache_beam.ml.inference.vertex_ai_inference_it_test
apache_beam.ml.inference.xgboost_inference_test
apache_beam.ml.transforms.handlers_test
apache_beam.ml.transforms.tft_test
apache_beam.runners.dask.dask_runner_test
apache_beam.testing.analyzers.perf_analysis_test
apache_beam.testing.benchmarks.cloudml.cloudml_benchmark_test
apache_beam.transforms.enrichment_handlers.feast_feature_store_it_test
apache_beam.transforms.enrichment_handlers.feast_feature_store_test
apache_beam.typehints.pytorch_type_compatibility_test
apache_beam.yaml.yaml_ml_test

Check notice on line 0 in .github

github-actions / Test Results

25 tests found

There are 25 tests; see "Raw output" below for the full list.
Raw output
apache_beam.examples.ml_transform.ml_transform_it_test
apache_beam.examples.snippets.transforms.elementwise.mltransform_test
apache_beam.examples.snippets.transforms.elementwise.runinference_test
apache_beam.io.external.xlang_kafkaio_it_test.CrossLanguageKafkaIOTest ‑ test_hosted_kafkaio_null_key
apache_beam.io.external.xlang_kafkaio_it_test.CrossLanguageKafkaIOTest ‑ test_hosted_kafkaio_populated_key
apache_beam.ml.inference.huggingface_inference_it_test
apache_beam.ml.inference.huggingface_inference_test
apache_beam.ml.inference.onnx_inference_test
apache_beam.ml.inference.pytorch_inference_test
apache_beam.ml.inference.tensorflow_inference_test
apache_beam.ml.inference.tensorrt_inference_test
apache_beam.ml.inference.vertex_ai_inference_it_test
apache_beam.ml.inference.xgboost_inference_test
apache_beam.ml.transforms.handlers_test
apache_beam.ml.transforms.tft_test
apache_beam.runners.dask.dask_runner_test
apache_beam.testing.analyzers.perf_analysis_test
apache_beam.testing.benchmarks.cloudml.cloudml_benchmark_test
apache_beam.transforms.enrichment_handlers.feast_feature_store_it_test
apache_beam.transforms.enrichment_handlers.feast_feature_store_test
apache_beam.transforms.external_transform_provider_it_test.ExternalTransformProviderIT ‑ test_generate_sequence_signature_and_doc
apache_beam.transforms.external_transform_provider_it_test.ExternalTransformProviderIT ‑ test_run_generate_sequence
apache_beam.transforms.managed_iceberg_it_test.ManagedIcebergIT ‑ test_write_read_pipeline
apache_beam.typehints.pytorch_type_compatibility_test
apache_beam.yaml.yaml_ml_test