beam
Apache Beam is a unified programming model for Batch and Streaming data processing.
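A minimal sketch of what that unified model looks like in the Java SDK: the same pipeline shape runs on bounded (batch) or unbounded (streaming) input, only the source changes. The bucket path is a placeholder, not a real location.

```java
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class MinimalWordCount {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("Read", TextIO.read().from("gs://example-bucket/input.txt")) // placeholder path
        .apply("Split", FlatMapElements
            .into(TypeDescriptors.strings())
            .via(line -> Arrays.asList(line.split("\\s+"))))
        .apply("CountWords", Count.perElement());

    p.run().waitUntilFinish();
  }
}
```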
- Added support for displaying the full transform name with any stack trace for the Samza Runner - Added support for decorating stack traces with additional debugging information by adding a SamzaExceptionListener...
Part of #19815. This adds support for encoding the schema type FLOAT to Python's RowCoder. This is added in a new coder, `SinglePrecisionFloatCoder`, designed to be compatible with Java's `FloatCoder`....
[https://github.com/apache/beam/blob/fd8546355523f67eaddc22249606fdb982fe4938/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ConsumerSpEL.java#L180-L198](https://github.com/apache/beam/blob/fd8546355523f67eaddc22249606fdb982fe4938/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ConsumerSpEL.java#L180-L198) Right now the 'startReadTime' config for KafkaIO.Read looks up, in every topic partition, an offset whose timestamp is newer than or equal to that timestamp. The problem is that if we...
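For context, a hedged usage sketch of the `startReadTime` config this issue is about; the broker, topic, and timestamp are placeholders. KafkaIO resolves the given instant to a starting offset for every partition of the topic before reading.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.joda.time.Instant;

public class StartReadTimeExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(KafkaIO.<String, String>read()
        .withBootstrapServers("broker-1:9092")            // placeholder broker
        .withTopic("events")                              // placeholder topic
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        // Only read records from this instant on; KafkaIO maps the timestamp
        // to a starting offset for every partition of the topic.
        .withStartReadTime(Instant.parse("2022-08-01T00:00:00Z"))
        .withoutMetadata());

    p.run().waitUntilFinish();
  }
}
```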
### What happened? Java uses `FloatCoder` for encoding the FLOAT type, which encodes as a 4 byte single-precision floating point number. Go currently shunts FLOAT through the same path as...
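A quick sketch of the Java-side encoding this issue refers to: `FloatCoder` writes the 4-byte IEEE-754 single-precision representation, so a coder in another SDK that wants cross-language compatibility must produce the same bytes.

```java
import java.io.ByteArrayOutputStream;
import org.apache.beam.sdk.coders.FloatCoder;

public class FloatCoderDemo {
  public static void main(String[] args) throws Exception {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    // Encode a single float; FloatCoder emits 4 big-endian bytes.
    FloatCoder.of().encode(3.14f, out);
    System.out.println("encoded length = " + out.toByteArray().length); // 4
  }
}
```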
### What happened? While performing load tests for the FhirIO [Import](https://github.com/lnogueir/beam/blob/a7abdc2771098576dacae7c71a8ebf32b77c2ed2/sdks/go/pkg/beam/io/fhirio/import.go#L220) transform I implemented in #22460 for the Go SDK, I noticed that after the resources were successfully imported, the...
Integration of TensorRT into Apache Beam.
This PR addresses #21414 with a KafkaSchemaTransformReadConfiguration implementation. Its design goals are to work with a KafkaSchemaTransformReadProvider that extends a [TypedSchemaTransformProvider](https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/transforms/TypedSchemaTransformProvider.java). Subsequent to this PR's approval/merge, the plan is to...
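A hedged sketch (field names are illustrative, not necessarily the PR's) of the shape such a configuration class typically takes: an AutoValue class carrying a Beam schema, so a `TypedSchemaTransformProvider` can infer the configuration Row schema from it.

```java
import com.google.auto.value.AutoValue;
import org.apache.beam.sdk.schemas.AutoValueSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

// Illustrative configuration class; the real PR may expose different fields.
@DefaultSchema(AutoValueSchema.class)
@AutoValue
public abstract class KafkaSchemaTransformReadConfiguration {
  public abstract String getBootstrapServers();

  public abstract String getTopic();

  public static Builder builder() {
    return new AutoValue_KafkaSchemaTransformReadConfiguration.Builder();
  }

  @AutoValue.Builder
  public abstract static class Builder {
    public abstract Builder setBootstrapServers(String bootstrapServers);

    public abstract Builder setTopic(String topic);

    public abstract KafkaSchemaTransformReadConfiguration build();
  }
}
```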
Add streaming test for BigQuery Write API.
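A hedged sketch of the kind of streaming write such a test would exercise: an unbounded source written with the Storage Write API. The table spec, rate, and trigger settings are placeholders, not the actual test's values.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.GenerateSequence;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.joda.time.Duration;

public class StorageWriteApiStreamingSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(GenerateSequence.from(0).withRate(10, Duration.standardSeconds(1))) // unbounded source
        .apply(MapElements.into(TypeDescriptor.of(TableRow.class))
            .via(i -> new TableRow().set("id", i)))
        .apply(BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")               // placeholder table
            .withMethod(Write.Method.STORAGE_WRITE_API)
            .withTriggeringFrequency(Duration.standardSeconds(5))
            .withNumStorageWriteApiStreams(1)
            .withCreateDisposition(Write.CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(Write.WriteDisposition.WRITE_APPEND));

    p.run().waitUntilFinish();
  }
}
```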
### What happened? **Beam: 2.40** While using a custom `DynamicDestination` in `BigQueryIO.Write`, I got the following exception: ``` java.lang.ClassCastException: org.apache.beam.sdk.io.gcp.bigquery.TableDestination cannot be cast to java.lang.String com.king.da.destinations.KingAppDestinations.getTable(KingAppDestinations.java:17) org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination.processElement(UpdateSchemaDestination.java:131) org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination$DoFnInvoker.invokeProcessElement(Unknown Source) ``` Find below...
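A hedged sketch (not the reporter's actual class) of a custom `DynamicDestinations` of the kind involved here: `getTable` receives the value returned by `getDestination`, and the stack trace suggests `UpdateSchemaDestination` instead passes it a `TableDestination`, which fails when the user's destination type is `String`.

```java
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
import org.apache.beam.sdk.values.ValueInSingleWindow;

// Illustrative only: routes rows to a per-user table using a String destination key.
public class PerUserDestinations extends DynamicDestinations<TableRow, String> {
  @Override
  public String getDestination(ValueInSingleWindow<TableRow> element) {
    // The returned String is the destination key for this row.
    return (String) element.getValue().get("user");
  }

  @Override
  public TableDestination getTable(String destination) {
    // Expects the String key produced by getDestination above.
    return new TableDestination("my-project:my_dataset.events_" + destination, null);
  }

  @Override
  public TableSchema getSchema(String destination) {
    return null; // assume the target tables already exist with the right schema
  }
}
```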