spark-deep-learning
Deep Learning Pipelines for Apache Spark
Hi, I have installed sparkdl 1.5.0; when I run the example it causes the exception below, please help me, thanks! Py4JJavaError: An error occurred while calling o162.analyze. : org.apache.spark.SparkException: Job aborted...
For example: [H2O.ai has documentation](https://h2o-release.s3.amazonaws.com/sparkling-water/master/284_nightly/doc/deployment/sw_google_cloud_dataproc.html). Here's what I have tried so far. I copied the requirements from this repo's `environment.yml` file. The cluster initialization step fails though. ``` gcloud dataproc...
Hey, I want to load the model from this [repo](https://github.com/qubvel/efficientnet) and use it to predict labels on a very large image dataset I have. The dataset is updated...
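For reference, a minimal sketch of one way to do this with sparkdl's `KerasImageFileTransformer`, assuming the EfficientNet model has been exported to a single Keras `.h5` file; the file path, target size, input scaling, and column names below are hypothetical:

```python
import numpy as np
from keras.preprocessing.image import img_to_array, load_img
from sparkdl import KerasImageFileTransformer

# Hypothetical loader: read an image URI and preprocess it to the shape the saved model expects.
def load_and_preprocess(uri):
    image = img_to_array(load_img(uri, target_size=(224, 224)))  # assumption: 224x224 input
    image = np.expand_dims(image, axis=0)
    return image / 255.0  # assumption: model expects inputs scaled to [0, 1]

transformer = KerasImageFileTransformer(
    inputCol="uri", outputCol="predictions",
    modelFile="/tmp/efficientnet.h5",   # hypothetical path to the exported Keras model
    imageLoader=load_and_preprocess,
    outputMode="vector")

# Hypothetical DataFrame of image URIs; assumes an active SparkSession named `spark`.
uri_df = spark.createDataFrame([("hdfs:///images/img0001.jpg",)], ["uri"])
predictions_df = transformer.transform(uri_df)
```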
Hello. I am using sparkdl in a Spark cluster with YARN integrated with Docker. I am having problems related to the user home directory when the code fetches the preprocessed...
I know TF 2 is still in beta, but is there a plan to support TF 2? If so, is there any timeline?
Problem solved after adding the `--packages` argument:
```
./pyspark --master yarn --packages databricks:spark-deep-learning:1.5.0-spark2.4-s_2.11
```
_Originally posted by @Liangmp in https://github.com/databricks/spark-deep-learning/issues/189#issuecomment-480008670_
I've tried saving an InceptionV3-based model using DeepImageFeaturizer with both MLeap and `model.save`, receiving the following errors: `p_model.serializeToBundle("jar:file:/tmp/Images/ParkingSpaces/Models/psinception.zip", tested_df)` fails with `java.util.NoSuchElementException: key not found: com.databricks.sparkdl.DeepImageFeaturizer`, and `p_model.save('/dbfs://FileStore/psinception.pb') # saves to the...`
`from sparkdl import KerasTransformer` fails with `ImportError: cannot import name 'KerasTransformer'`
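For comparison, the usage documented in this repo's README looks roughly like the sketch below; if the import fails, one possible cause (an assumption, not verified here) is that the installed `sparkdl` build does not match the `spark-deep-learning` Spark package version that ships `KerasTransformer`:

```python
from sparkdl import KerasTransformer

# KerasTransformer applies a saved Keras model to a DataFrame column of 1-d arrays.
transformer = KerasTransformer(inputCol="features", outputCol="predictions",
                               modelFile="/tmp/model.h5")  # hypothetical path to a saved Keras model
predictions_df = transformer.transform(input_df)  # input_df: hypothetical DataFrame with a "features" column
```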
Hi everyone, I am very new to spark-deep-learning and just tried out the transfer learning tutorial. When I tried to use strings as the labels for my classes, I...
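If the problem is that the downstream classifier expects numeric labels, a minimal sketch of the usual workaround is to run `StringIndexer` ahead of the transfer-learning stages; the column names and DataFrame below are hypothetical:

```python
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer
from pyspark.ml.classification import LogisticRegression
from sparkdl import DeepImageFeaturizer

# Map string class names to numeric label indices so the classifier can consume them.
indexer = StringIndexer(inputCol="label_str", outputCol="label")
featurizer = DeepImageFeaturizer(inputCol="image", outputCol="features",
                                 modelName="InceptionV3")
lr = LogisticRegression(maxIter=20, labelCol="label", featuresCol="features")

pipeline = Pipeline(stages=[indexer, featurizer, lr])
model = pipeline.fit(train_df)  # train_df: hypothetical DataFrame with "image" and "label_str" columns
```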
Hi, I am trying to save a model for future use; below is the code:
```python3
from pyspark.ml.image import ImageSchema
from pyspark.sql.functions import *

img_dir = "hdfs:///personalities"
jobs_df = ImageSchema.readImages(img_dir...
```
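For the "save for future use" part, a minimal sketch of the standard Spark ML persistence pattern, assuming the model is a fitted `PipelineModel` whose stages all support ML persistence; the paths and DataFrames below are hypothetical:

```python
from pyspark.ml import PipelineModel

# model: a fitted PipelineModel, e.g. produced by Pipeline.fit(...)
model.write().overwrite().save("hdfs:///models/personalities_pipeline")  # hypothetical path

# Later, in a separate job, reload the fitted model and reuse it.
reloaded = PipelineModel.load("hdfs:///models/personalities_pipeline")
predictions = reloaded.transform(test_df)  # test_df: hypothetical DataFrame of images to score
```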