pulsar-spark

Spark Connector to read and write with Pulsar

39 pulsar-spark issues

### Description A Pulsar message read into a structured stream is incorrectly deserialized if a `string` field has a length over 127 ### Environment `scala 2.11` `spark 2.4.7` `pulsar-spark_2.11 2.4.5` ### Protobuf...

type/bug

**Describe the bug** The spark-submit of a Spark job written with the connector fails if a DataFrame is not created prior to calling readStream and writeStream. **To Reproduce** Steps to...

type/bug
triage/week-36
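For reference, a minimal sketch of how a streaming read from and write to Pulsar is typically wired up with this connector; it is not the reporter's job, the option names follow the connector's documentation, and the URLs, topics, and checkpoint path are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object PulsarReadWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("pulsar-read-write-sketch")
      .getOrCreate()

    // Streaming read of a Pulsar topic into a DataFrame.
    val input = spark.readStream
      .format("pulsar")
      .option("service.url", "pulsar://localhost:6650")
      .option("admin.url", "http://localhost:8080")
      .option("topic", "persistent://public/default/topic-in")
      .load()

    // Streaming write of the value column to another Pulsar topic.
    val query = input
      .selectExpr("CAST(value AS STRING) AS value")
      .writeStream
      .format("pulsar")
      .option("service.url", "pulsar://localhost:6650")
      .option("admin.url", "http://localhost:8080")
      .option("topic", "persistent://public/default/topic-out")
      .option("checkpointLocation", "/tmp/pulsar-checkpoint")
      .start()

    query.awaitTermination()
  }
}
```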

**Describe the bug** Randomly (or at least I haven't found the exact moment when it happens), when the Spark application receives a message from Pulsar I get this exception: ...

type/bug
workflow::todo
triage/week-16

![4011579595142_ pic_hd](https://user-images.githubusercontent.com/1387718/72792975-cfe53780-3c74-11ea-8e3f-81a9efe26999.jpg)

type/bug
workflow::todo
triage/week-16

**Is your feature request related to a problem? Please describe.** When the Spark job starts, it currently uses random subscription names for consuming data. This makes it harder to log...

type/feature
workflow::todo
triage/week-15

Hi, I'm trying to use startingOffsets on my topic, but the message IDs of my topic are instances of BatchMessageIdImpl (with batchIndex set). I have tried with a topic where messages...

workflow::todo
triage/week-15
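For context, a spark-shell style sketch of where `startingOffsets` plugs in (option names as documented for the connector; URLs and topic are placeholders, and the JSON per-topic offset form, where batched message IDs come into play, is only noted in a comment):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("pulsar-starting-offsets-sketch")
  .getOrCreate()

// startingOffsets accepts "earliest" or "latest"; the connector also
// documents a JSON form with explicit per-topic message IDs, which is
// where BatchMessageIdImpl (batchIndex set) becomes relevant.
val df = spark.readStream
  .format("pulsar")
  .option("service.url", "pulsar://localhost:6650")
  .option("admin.url", "http://localhost:8080")
  .option("topic", "persistent://public/default/topic-in")
  .option("startingOffsets", "earliest")
  .load()
```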

**Describe the bug** I just wrote the numbers 1 to 10 into a Pulsar topic through the spark-pulsar connector. When I use Pulsar SQL to read the 10 numbers, the exception info...

type/bug
workflow::backlog
triage/week-3
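A spark-shell style sketch of the write described in that report, i.e. a batch write of the integers 1 to 10 through the connector (URLs and topic name are placeholders; this is an illustration, not the reporter's exact code):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("pulsar-batch-write-sketch")
  .getOrCreate()
import spark.implicits._

// Batch-write the integers 1 to 10 as the value column of a Pulsar topic.
(1 to 10).toDF("value")
  .write
  .format("pulsar")
  .option("service.url", "pulsar://localhost:6650")
  .option("admin.url", "http://localhost:8080")
  .option("topic", "persistent://public/default/numbers")
  .save()
```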

*Motivation* Pulsar supports schema evolution. Each Pulsar message is associated with a schema version. The Pulsar Spark integration should be able to handle schemas properly. Related issue: #2

triage/week-3
workflow::triaged

We are trying to use this adapter with pyspark. One of the challenges we are facing is setting a custom subscription name for the individual readers that are created. The current...

[ Disclaimer - I am fairly new to Pulsar, so I might not understand all the Pulsar details, but I have been using Spark for a while now. ] I...

type/bug