Lukasz Jastrzebski
Please see reference: [https://medium.com/@junwan01/a-java-client-for-tensorflow-serving-grpc-api-d37b5ad747aa](https://medium.com/@junwan01/a-java-client-for-tensorflow-serving-grpc-api-d37b5ad747aa)
As far as I can tell from the code of https://github.com/AsyncHttpClient/async-http-client, in the case of the HTTP protocol the following things need to be done: - rewrite the request line (to include the original full address of...
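The request-line rewrite mentioned above can be sketched as follows (a minimal illustration, not async-http-client's actual code): when a plain-HTTP request goes through a forward proxy, the request target becomes the absolute URI rather than the usual origin-form path. The helper name and values below are hypothetical.

```java
// Sketch of the request-line rewrite needed when talking to a forward proxy
// over plain HTTP: the path-only target is replaced with the original full
// address (absolute-form request target). All names here are illustrative.
public class ProxyRequestLine {

    // Hypothetical helper: build the request line a proxy expects.
    static String rewrite(String method, String host, int port, String path) {
        String origin = (port == 80)
                ? "http://" + host
                : "http://" + host + ":" + port;
        return method + " " + origin + path + " HTTP/1.1";
    }

    public static void main(String[] args) {
        // Direct to the server the line would be:  GET /index.html HTTP/1.1
        // Through the proxy it carries the full original address:
        System.out.println(rewrite("GET", "example.com", 80, "/index.html"));
        // prints: GET http://example.com/index.html HTTP/1.1
    }
}
```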
Sorry, I didn't have time to work on this.
I was wondering if anyone has tried it and knows of any workaround.
@JoshRosen we hit the same issue: after upgrading from Spark 2.0.2 to Spark 2.1.0, our pipeline started throwing exceptions with the same cause: ``` Caused by: java.lang.AbstractMethodError: org.apache.spark.sql.execution.datasources.OutputWriterFactory.getFileExtension(Lorg/apache/hadoop/mapreduce/TaskAttemptContext;)Ljava/lang/String; ``` We are...
Found the root cause: Spark 2.1 added a new method to the interface, `org.apache.spark.sql.execution.datasources.OutputWriterFactory#getFileExtension(context: TaskAttemptContext): String`, which is not implemented in spark-avro, hence the AbstractMethodError.
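For illustration, a minimal self-contained analog of this binary-compatibility break (all class names below are stand-ins, not Spark's real classes): a new abstract method is added to a base class, an implementer compiled against the old version has no body for it, so the JVM raises AbstractMethodError the first time it is invoked. The fix is simply to implement the method:

```java
// Hypothetical stand-ins for the real classes in
// org.apache.spark.sql.execution.datasources.
abstract class OutputWriterFactoryLike {
    // New abstract method (the analog of getFileExtension added in Spark 2.1).
    // Implementers compiled against the old version lack a body for it, so
    // calling it on them throws java.lang.AbstractMethodError at runtime.
    abstract String getFileExtension(String taskAttemptContext);
}

// The fix, in analog form: implement the newly added method.
class AvroOutputWriterFactoryLike extends OutputWriterFactoryLike {
    @Override
    String getFileExtension(String taskAttemptContext) {
        return ".avro";
    }
}

public class FactoryDemo {
    public static void main(String[] args) {
        OutputWriterFactoryLike f = new AvroOutputWriterFactoryLike();
        System.out.println(f.getFileExtension("attempt_201701_0001"));
        // prints: .avro
    }
}
```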
@apurva-sharma you can build this patch: https://github.com/databricks/spark-avro/pull/206 and replace the spark-avro dependency with the resulting custom version; at least that worked for us.
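Once the patched build is published locally (e.g. via `sbt publishLocal`), swapping the dependency is a one-line change in build.sbt; the version string below is a placeholder assumption, use whatever version the patched build publishes as:

```scala
// build.sbt: point at the locally published, patched spark-avro.
// "3.2.0-SNAPSHOT" is a placeholder version, not the PR's actual version.
libraryDependencies += "com.databricks" %% "spark-avro" % "3.2.0-SNAPSHOT"
```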
@apurva-sharma +1
I did try this with sbteclipse 2.5.0, configured like this in build.sbt: ``` EclipseKeys.configurations := Set(Compile, Test, Provided) ``` and got the following result: ``` [error] Could not create Eclipse project...
@estebandonato Similar story with 4.0.0-RC2: ``` [error] Could not create Eclipse project files: [error] Undefined setting 'unmanagedSourceDirectories'! [error] Undefined setting 'unmanagedResourceDirectories'! [error] Undefined setting 'managedSourceDirectories'! [error] Undefined setting 'managedResourceDirectories'! [error]...