Indy
@imback82 thanks a lot for your reply. The idea is to run .NET for Apache Spark in the container and debug the C# app from the outside, as described in...
@imback82 I'm really sorry for the delay. It is still on my to-do list, though.
@imback82 Just built a test image with the two changes, and that seems to work fine:
> [spark/src/csharp/Microsoft.Spark/Network/DefaultSocketWrapper.cs](spark/src/csharp/Microsoft.Spark/Network/DefaultSocketWrapper.cs), line 32: `_innerSocket.Bind(new IPEndPoint(IPAddress.Any, 0));`
> [spark/src/scala/microsoft-spark-2.4.x/src/main/scala/org/apache/spark/api/dotnet/DotnetBackend.scala](spark/src/scala/microsoft-spark-2.4.x/src/main/scala/org/apache/spark/api/dotnet/DotnetBackend.scala), line 60: `channelFuture =...`
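For context, here is a minimal sketch of what the first change amounts to: binding the listener to `IPAddress.Any` (all interfaces) instead of the loopback address, so the socket is reachable from outside the container. This is illustrative only, not the actual `Microsoft.Spark` source:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

// Illustrative sketch: IPAddress.Any exposes the listener on all network
// interfaces, whereas IPAddress.Loopback would only accept connections
// originating inside the container itself.
var listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
listener.Bind(new IPEndPoint(IPAddress.Any, 0)); // port 0: let the OS assign a free port
listener.Listen(backlog: 1);
Console.WriteLine($"Listening on {listener.LocalEndPoint}");
```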
> For work, I'd like to implement a similar scenario to integration test our Spark jobs. To simplify the development setup, we're running all our components (DB servers, Redis caches...
> @indy-3rdman / @moredatapls So, what's the recommended way of doing this for the docker container? Would `socat` be enough, or would allowing users to override the binding address/port be a...
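If `socat` turns out to be the preferred route, it would simply relay traffic from an externally visible port to the loopback-bound backend port inside the container. As a rough illustration of that forwarding idea (not project code; the port numbers are placeholders), a minimal relay could look like this in C#:

```csharp
using System.Net;
using System.Net.Sockets;

// Hypothetical socat-style forwarder: accept on an externally visible port
// and relay bytes to a service that is bound to loopback only.
// 5568 (exposed) and 5567 (loopback backend) are placeholder port numbers.
var listener = new TcpListener(IPAddress.Any, 5568);
listener.Start();
while (true)
{
    TcpClient inbound = await listener.AcceptTcpClientAsync();
    var outbound = new TcpClient();
    await outbound.ConnectAsync(IPAddress.Loopback, 5567);
    // Pump bytes in both directions; tasks are fire-and-forget for brevity.
    _ = inbound.GetStream().CopyToAsync(outbound.GetStream());
    _ = outbound.GetStream().CopyToAsync(inbound.GetStream());
}
```

The `socat` equivalent would be a one-liner along the lines of `socat tcp-listen:<exposed>,fork tcp:127.0.0.1:<backend>`, with no code changes required.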
@MichaelSimons, thank you very much for reviewing the Dockerfile and your valuable comments. I've just updated the PR to reflect the changes required for dotnet.spark version 1.0.0. This should also...
To demonstrate how the dev image could be used, here is a quick tutorial on how to [build .NET for Apache Spark with VS Code in a browser](https://3rdman.de/2020/10/build-net-for-apache-spark-with-vs-code-in-a-browser/).
> When I attempted to run `build.sh`, I was able to successfully finish building the images. However, I encountered this: [screenshot] Is this normal? I've updated...
> The initial commands in the readme did not work for me after building the container - latest wasn't tagged, but when I switched to the actual tag from `docker...
@strtdusty, I've added a new image type [runtime-hadoop](https://github.com/indy-3rdman/docker-dotnet-spark/tree/develop/images/runtime-hadoop) that comes with a full installation of Hadoop. You should be able to try it out via `docker pull 3rdman/dotnet-spark:2.1.1-3.2.1-3.2.3-hadoop`, for example....