
Spark Connector to read and write with Pulsar

Results 39 pulsar-spark issues

**Describe the bug** When writing data to Pulsar using the following code:

```scala
def main(args: Array[String]): Unit = {
  val sparkSession = SparkSession.builder().appName("test-pulsar").master("local").getOrCreate()
  val startingOffsets = topicOffsets(Map("persistent://public/default/my-topic" -> MessageId.fromByteArray(Array(8, 33, 16, 8))))
  import sparkSession.implicits._
  ...
```

type/bug
compute/data-processing

**Describe the bug** While testing our integration with the Pulsar Spark Connector, we found that shutting down `PulsarClientImpl` doesn't shut down the `EventLoopGroup` properly (which seems to be addressed...

type/bug

**Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd...

type/feature
compute/data-processing



**Describe the bug** A running Spark application stopped working after a microbatch execution failed (HTTP 500 error from Pulsar). After the failure, the Spark application seems to try to reconnect...

type/bug
workflow::todo
triage/week-16

Here is the template: https://streamnative.slab.com/posts/repository-release-template-prb0lvyt If you have any questions about the release doc template, please **leave your comment** on it; **do not edit it directly**.

triage/week-50
compute/data-processing

If we have a considerably large backlog for one or more topics read by the connector, the current implementation gives us no way to place an upper limit...
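The request above is for a way to cap how much backlog a single microbatch ingests. For comparison, Spark's Kafka source exposes `maxOffsetsPerTrigger` for exactly this purpose; a Pulsar analogue might look like the sketch below. Note that the option name `maxEntriesPerTrigger` is a hypothetical illustration of the requested feature, not a confirmed pulsar-spark option, and the service/admin URLs are placeholder local addresses:

```scala
// Sketch of a bounded-backlog read, modeled on the Kafka source's
// `maxOffsetsPerTrigger`. The `maxEntriesPerTrigger` option below is an
// assumption illustrating the feature request, not current pulsar-spark API.
import org.apache.spark.sql.SparkSession

object BoundedBacklogRead {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("bounded-backlog-read")
      .master("local[*]")
      .getOrCreate()

    val df = spark.readStream
      .format("pulsar")
      .option("service.url", "pulsar://localhost:6650")
      .option("admin.url", "http://localhost:8080")
      .option("topic", "persistent://public/default/my-topic")
      // Hypothetical: limit each microbatch to at most N entries so a
      // large backlog cannot produce an unbounded first batch.
      .option("maxEntriesPerTrigger", "10000")
      .load()

    df.writeStream.format("console").start().awaitTermination()
  }
}
```

With such a cap, a connector restart against a deep backlog would drain it in bounded increments instead of attempting to read everything in one batch.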

We now publish artifacts to Maven Central instead of Bintray, so remove the Bintray-related documentation.

We already have a Spark Structured Streaming based Pulsar connector. However, I am unable to find any DStream-based Pulsar connector for Spark Streaming. Is it possible today with StreamNative? Is...

type/feature