beam
Apache Beam is a unified programming model for Batch and Streaming data processing.
Adds a build & stage step to RC creation that publishes signed, hashed, and zipped prism binaries to a release. Currently requires a release with the RC tag to already exist, actually...
Backlink to global prism tracking issue: #29650

### What needs to happen?

Umbrella task for tracking PRs to improve prism and get it to support non-Go SDKs. More targeted tasks...
Adds workflows for Iceberg integration and load tests. The integration test already exists; the load test will be added in #31392
Adds a load test for IcebergIO. Separates the integration test (from #31220) into its own suite
### What happened?

When running a streaming job (with DirectRunner locally and with DataflowRunner on GCP) that uses the apache_beam.io.kafka.ReadFromKafka connector without `max_num_records`, the job does not process any information...
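A minimal sketch of the reported setup (not taken from the issue itself), assuming a hypothetical local broker at `localhost:9092` and a topic named `my_topic`; with `max_num_records` omitted, `ReadFromKafka` acts as an unbounded source:

```python
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming pipeline reading from Kafka without max_num_records,
# i.e. as an unbounded read. Broker address and topic are assumptions.
options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "localhost:9092"},
            topics=["my_topic"],
        )  # no max_num_records: job should keep consuming records
        | "Log" >> beam.Map(print)
    )
```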
Brings back the changes reverted in #31109. More details in #31061 and #31353
### What happened?

Apache Beam version: 2.55.0
Java version: 17

When attempting to read messages from RabbitMQ with List-type headers (streaming Dataflow), a NotSerializableException error is returned. Example:...
Closes #31112
Fixes #31360. Adds a withoutValidation option to Bigtable change stream IO, aligning with the withoutValidation option on the Read and Write IO. This allows users to create a pipeline without validating the correctness...