Chandan Bhattad
Looks useful, why not merge after resolving conflicts?
nope, nothing yet
can anyone please share the docker-compose file for Kafka? Even after renaming the service to `kafka-cluster` or `kafkaa` and then running on Kubernetes, I am getting the same error shared above. `org.apache.kafka.common.config.ConfigException: Invalid...
never mind, I solved it by explicitly setting the KAFKA_PORT env variable
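For others hitting this: the likely cause is that Kubernetes injects Docker-link-style service variables (e.g. `KAFKA_PORT=tcp://10.0.0.1:9092` for a service named `kafka`), which some Kafka images misread as broker config. A minimal sketch of the override in a pod spec, with illustrative names and image:

```yaml
# Hypothetical container spec fragment: set KAFKA_PORT explicitly so the
# auto-injected Kubernetes service variable does not clash with the image's
# own configuration parsing.
containers:
  - name: kafka
    image: wurstmeister/kafka   # example image; substitute your own
    env:
      - name: KAFKA_PORT
        value: "9092"
```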
hey - is this done?
I am working on the fix and will soon raise a PR
Link to PR: https://github.com/databricks/spark-redshift/pull/440
@JoshRosen @marmbrus @brkyvz Please take a look
can we merge this please @JoshRosen
@ibnipun10 Casting the null column to a Spark SQL supported type solves the issue. Example: `lit(null).cast(DoubleType)` in Scala and `lit(None).cast(DoubleType())` in Python