Terry Kim

Results: 101 comments of Terry Kim

I see. Can you try to set the environment variable `DOTNET_WORKER_DIR` to the remote path where worker binaries are installed?

I meant the path on the cluster. So if the worker binaries are installed under `/usr/bin`, you would set `DOTNET_WORKER_DIR` to `/usr/bin` on the driver side. Looking at your error...
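For reference, a minimal sketch of what I mean (the `/usr/bin` path and app name are placeholders; this assumes the driver reads `DOTNET_WORKER_DIR` from its own process environment, and on Databricks you would more typically set it through the cluster's environment variable configuration):

```C#
using System;
using Microsoft.Spark.Sql;

// Placeholder path: use wherever Microsoft.Spark.Worker is installed on the cluster.
Environment.SetEnvironmentVariable("DOTNET_WORKER_DIR", "/usr/bin");

// Create the session only after the variable is set, so the driver sees it.
SparkSession spark = SparkSession
    .Builder()
    .AppName("worker-dir-check")   // placeholder app name
    .GetOrCreate();
```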

After setting the env variable, does `/databricks/spark/python/pyspark/worker.py` still get launched? The worker path gets inserted into the UDF on the driver side if the env variable is set; otherwise, it will look...
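To check what the driver actually sees, you could try a rough sketch like the one below (it assumes a trivial UDF over `SparkSession.Range`; if `DOTNET_WORKER_DIR` is resolved, executing the UDF should launch `Microsoft.Spark.Worker` rather than `worker.py`):

```C#
using System;
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

// Verify what the driver process sees before registering any UDFs.
Console.WriteLine(
    $"DOTNET_WORKER_DIR = {Environment.GetEnvironmentVariable("DOTNET_WORKER_DIR")}");

SparkSession spark = SparkSession.Builder().GetOrCreate();

// Trivial UDF: if the worker dir is resolved correctly, executing this
// should launch the .NET worker instead of pyspark's worker.py.
DataFrame df = spark.Range(0, 5);
Func<Column, Column> addOne = Udf<long, long>(x => x + 1);
df.Select(addOne(df["id"])).Show();
```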

@zwitbaum Before moving further, can you please provide the exact command you use to submit your Spark application to Azure Databricks?

Can't you do the following?

1. Develop/debug your application locally. We have docs on how to debug your application: https://github.com/dotnet/spark/blob/master/docs/developer-guide.md
2. Once you are ready to deploy your app, follow this...

Can you please elaborate on what you are trying to do?

We could expose `LoggerServiceFactory` and `ILoggerService` as public, but let me double-check. Meanwhile, you can use https://github.com/aelij/IgnoresAccessChecksToGenerator to access the internal classes if you are blocked.

Do you set it before executing any Spark-related code? The following works for me:

```C#
public void Run(string[] args)
{
    // Register the logger before any Spark-related code runs.
    LoggerServiceFactory.SetLoggerService(MyLogger.s_instance);
    SparkSession spark = SparkSession
        .Builder()
        .AppName(".NET Spark SQL ...")
        .GetOrCreate();
}
```