spark-deep-learning

Deep Learning Pipelines for Apache Spark

Results: 87 spark-deep-learning issues

```
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
in
----> 1 from sparkdl import TFTransformer

/anaconda3/lib/python3.6/site-packages/sparkdl/__init__.py in
     20 from sparkdl.transformers.utils import imageInputPlaceholder
     21 from sparkdl.estimators.text_estimator import TextEstimator, KafkaMockServer
---> 22 from...
```

With the deprecation of the Keras model interfaces, the README says to use Pandas UDFs instead, but I cannot find a reliable article on that. Does anyone have any suggestions?
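For reference, the pattern the README points at looks roughly like the sketch below: a scalar Pandas UDF (Spark 2.3+) that loads the Keras model on the executors and scores one batch of rows at a time. This is only a sketch, not this repo's code; the model path and the `features` column name are assumptions.

```python
# A minimal sketch, assuming Spark >= 2.3, TensorFlow's bundled Keras, and a
# hypothetical saved model at /tmp/model.h5 plus an array<double> "features" column.
import numpy as np
import pandas as pd
from pyspark.sql.functions import pandas_udf, PandasUDFType

@pandas_udf("double", PandasUDFType.SCALAR)
def predict_udf(features):
    # Import and load inside the UDF so the model is created on the executors.
    from tensorflow import keras
    model = keras.models.load_model("/tmp/model.h5")  # hypothetical path
    batch = np.array(features.tolist())               # one row per record
    preds = model.predict(batch)
    return pd.Series(preds.ravel().astype("float64"))

# Usage, assuming a DataFrame `df` with a "features" column:
# scored = df.withColumn("prediction", predict_udf("features"))
```

In practice the model would usually be cached per worker (for example in a module-level variable or a broadcast) rather than reloaded for every batch.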

Bumps [tensorflow](https://github.com/tensorflow/tensorflow) from 1.13.1 to 1.15.2.

Release notes

*Sourced from [tensorflow's releases](https://github.com/tensorflow/tensorflow/releases).*

> ## TensorFlow 1.15.2
> # Release 1.15.2
>
> ## Bug Fixes and Other Changes
> *...

dependencies

Hi, does this library support Scala Spark? Thanks

I am launching the pyspark shell with the command below on an EMR cluster (Spark 2.1.1; tried Python 2.7.12 and Anaconda Python 3.5.4): pyspark --master local[2] --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11,databricks:tensorframes:0.2.9-s_2.11 --jars /home/hadoop/scala-logging-slf4j_2.11-2.1.2.jar Trying...
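For context, the same dependencies can also be supplied when building the session programmatically; the sketch below mirrors the `--packages`/`--jars` flags above and assumes `spark.jars.packages` resolves these spark-packages coordinates the same way the flag does.

```python
# A sketch only: these properties must be set before the SparkSession's JVM starts.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[2]")
    .config("spark.jars.packages",
            "databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11,"
            "databricks:tensorframes:0.2.9-s_2.11")
    .config("spark.jars", "/home/hadoop/scala-logging-slf4j_2.11-2.1.2.jar")
    .getOrCreate()
)
```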

Hello, I want to apply deep learning to images loaded with **pysparkdl** on **Google Colab**, but after running the following commands in Colab I get an error....
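For reference, the README-era image workflow looks roughly like the sketch below. The image directory path is hypothetical, and depending on the Spark/sparkdl versions installed in Colab you may need `ImageSchema.readImages` instead of the built-in `image` data source.

```python
# A minimal sketch, assuming Spark >= 2.4 and sparkdl with its pretrained featurizer.
from pyspark.sql import SparkSession
from sparkdl import DeepImageFeaturizer

spark = SparkSession.builder.getOrCreate()

# Load a directory of images into a DataFrame with an "image" struct column.
images_df = spark.read.format("image").load("/content/images")  # hypothetical path

# Convert each image into a feature vector with a pretrained InceptionV3 network.
featurizer = DeepImageFeaturizer(inputCol="image", outputCol="features",
                                 modelName="InceptionV3")
features_df = featurizer.transform(images_df)
features_df.select("image.origin", "features").show(truncate=False)
```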

I am running sparkdl on a local system; I installed sparkdl with pip, but when trying to run `import sparkdl`, this error was thrown: ``` Using TensorFlow...

sparkdl 0.2.2, available on pip, returns the following error when importing either DeepImageFeaturizer or KerasTransformer. It seems there is an incompatibility in the use of KerasImageFileTransformer, since keras_image.py has been...

Found that the Docker build failed because Spark version 2.4.4 is no longer available. I guess this should address the issue: ENV **SPARK_VERSION 2.4.5**

Fixes the following Exception that occurs when running ```odf = transformer.transform(sdf)``` with Apache Spark 2.3.0, Scala 2.11 from Databricks ```java.lang.Exception: The type of node 'input_tensor' (ScalarDoubleType) is not compatible with...