Comments by Alexander Zwitbaum

@imback82: the Microsoft.Spark.Worker is already deployed on Azure Databricks, and the .NET application works properly when it runs fully in the cluster. But what I mean is something...

@elvaliuliuliu: I have investigated the correct values for PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON and set them as follows, but it does not solve the problem: PYSPARK_PYTHON=python3, PYSPARK_DRIVER_PYTHON=python3. @imback82: what you...
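
For reference, a minimal sketch of those two settings, assuming they are placed in the cmd session that launches spark-submit (the values are taken from the comment above; where the variables actually need to live on the cluster side may differ):

```cmd
REM Values from the comment above; `set` affects only the current cmd session,
REM while `setx` would persist them for processes started afterwards.
set PYSPARK_PYTHON=python3
set PYSPARK_DRIVER_PYTHON=python3
```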

If I change DOTNET_WORKER_DIR on my local Windows machine, the app writes to cmd, e.g.: "Using the environment variable to construct .NET worker path: /usr/bin/Microsoft.Spark.Worker\Microsoft.Spark.Worker.exe", which IMO makes no...

@imback82: If I set DOTNET_WORKER_DIR to the Azure worker path on **my local dev PC**, e.g. `setx DOTNET_WORKER_DIR /usr/bin` (or /usr/local/bin, or /usr/bin/Microsoft.Spark.Worker), `/databricks/spark/python/pyspark/worker.py` still gets launched. Must...

@elvaliuliuliu: DOTNET_WORKER_DIR was already set to the local path of Microsoft.Spark.Worker before I posted the bug (`setx DOTNET_WORKER_DIR "C:\bin\Microsoft.Spark.Worker-0.7.0"`). Unfortunately, it does not help to solve the...
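
One general Windows detail worth noting here (not specific to Microsoft.Spark): `setx` persists the variable for new processes only, so the value is not visible to the cmd window or Visual Studio instance that was already running when it was set:

```cmd
setx DOTNET_WORKER_DIR "C:\bin\Microsoft.Spark.Worker-0.7.0"

REM setx does NOT update the current session; verify the value in a NEW cmd window
REM (and restart Visual Studio / spark-submit so they pick it up):
echo %DOTNET_WORKER_DIR%
```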

Once again, to describe our requirements for .NET Spark software development: ![grafik](https://user-images.githubusercontent.com/6919145/72175008-52e7d180-33db-11ea-959f-98b9853fcaa3.png) We want to develop our .NET Spark application on a local dev PC with Visual Studio and...

@imback82: we develop and debug our application on the local Windows machine exactly as described in the link you've provided: `%SPARK_HOME%\bin\spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --master local microsoft-spark-2.4.x-0.7.0.jar debug` Because we...
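
For completeness, a sketch of that local debug workflow, assuming the worker is extracted to `C:\bin\Microsoft.Spark.Worker-0.7.0` (a placeholder path taken from the earlier comment): start DotnetRunner in debug mode so it waits for the application, then launch the .NET app from Visual Studio.

```cmd
REM Assumed local worker path (from the earlier comment); adjust to your machine.
setx DOTNET_WORKER_DIR "C:\bin\Microsoft.Spark.Worker-0.7.0"

REM Start DotnetRunner in debug mode; it waits for the .NET application to connect.
%SPARK_HOME%\bin\spark-submit ^
  --class org.apache.spark.deploy.dotnet.DotnetRunner ^
  --master local ^
  microsoft-spark-2.4.x-0.7.0.jar ^
  debug

REM Then run/debug the .NET Spark application from Visual Studio (F5).
```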

Any progress on this?