[FEATURE REQUEST]: Dotnet Backend IPAddress #419
Comments
@indy-3rdman thanks for the suggestion. I didn't fully follow your scenario: are you trying to run both the C# app and Spark in the container, or do you plan to have Spark running in debug mode in the container and run the C# app outside? Can you try updating src/scala/microsoft-spark-2.4.x/src/main/scala/org/apache/spark/api/dotnet/DotnetBackend.scala (line 60 in ee95ca2), see if it works for your scenario, and let me know?
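For reference, the change being suggested amounts to altering the address the backend's server socket binds to. The actual DotnetBackend is written in Scala on top of Netty, but the idea can be sketched with a plain Java `ServerSocket` (the class and method names here are illustrative, not the real DotnetBackend API):

```java
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class BindSketch {
    // Hard-coded loopback: only clients on the same host can connect.
    // This mirrors the current DotnetBackend behavior described in this issue.
    static ServerSocket bindLoopbackOnly(int port) throws Exception {
        ServerSocket server = new ServerSocket();
        server.bind(new InetSocketAddress(InetAddress.getLoopbackAddress(), port));
        return server;
    }

    // Configurable address: passing "0.0.0.0" listens on all IPv4
    // interfaces, which is what the container scenario needs.
    static ServerSocket bindConfigurable(String host, int port) throws Exception {
        ServerSocket server = new ServerSocket();
        server.bind(new InetSocketAddress(host, port));
        return server;
    }

    public static void main(String[] args) throws Exception {
        // Port 0 asks the OS for any free port.
        ServerSocket s = bindConfigurable("0.0.0.0", 0);
        System.out.println(s.getInetAddress().getHostAddress());
        s.close();
    }
}
```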
@imback82 thanks a lot for your reply. The idea is to run .NET for Apache Spark in the container and debug the C# app from the outside, as described in this post: https://3rdman.de/2019/10/debug-net-for-apache-spark-with-visual-studio-and-docker
Nice blog post. So you already have this running in the container; how would specifying the IP address help further? By the way, were you able to get your scenario working by changing the file above? (You can hard-code the IP and see if it works.)
@indy-3rdman Any update on this?
@imback82 I'm really sorry for the delay. It is still on my to-do list, though.
@imback82 Just built a test image with the two changes, and that seems to work fine.
Not sure where this leaves us, though, as this probably should be configurable, shouldn't it?
For work, I'd like to implement a similar scenario to integration-test our Spark jobs. To simplify the development setup, we run all our components (DB servers, Redis caches, etc.) in a local Docker Compose setup. I'd like to integrate the Dotnet Backend into this setup as well, so that we can easily run the tests in the developers' IDEs as well as in our Azure pipelines without having to go through the hassle of setting up Spark (and all the correct Java versions, etc.) locally. But this is not possible, since the backend only listens on the loopback address. I documented this setup in a repository: https://github.com/moredatapls/dotnet-spark-docker. It also works for me when I implement the changes suggested by @imback82 above.
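The Compose integration described above could look roughly like the fragment below. This is only an illustrative sketch: the service name, image tag, and port (5567 is commonly cited as the backend's default, but verify for your version) are assumptions, not taken from the linked repository.

```yaml
# Illustrative docker-compose fragment (names and port are assumptions).
services:
  spark-dotnet-backend:
    image: 3rdman/dotnet-spark:latest
    ports:
      # Forward the backend port to the host so tests running in the
      # IDE or CI can reach it. This only helps once the backend binds
      # to 0.0.0.0 instead of the loopback address.
      - "5567:5567"
```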
@moredatapls,
@indy-3rdman / @moredatapls So, what's the recommended way of doing this for the Docker container? Would
@imback82, I think that allowing the user to specify the binding address (including Any/0.0.0.0 to listen on all available addresses) would be much nicer than the current
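The configurable-binding-address proposal above could be resolved from the environment, defaulting to loopback so existing setups stay as locked down as they are today. A minimal Java sketch, assuming a hypothetical variable name (the setting actually introduced for this issue / PR #537 may be named differently):

```java
import java.net.InetAddress;

public class BackendBindAddress {
    // Hypothetical environment variable name -- an assumption for this
    // sketch, not the actual setting used by .NET for Apache Spark.
    static final String ENV_VAR = "DOTNETBACKEND_BIND_ADDRESS";

    // Default to loopback for safety; only listen more widely when the
    // user explicitly opts in (e.g. by setting "0.0.0.0").
    static String resolveBindAddress(String envValue) {
        if (envValue == null || envValue.trim().isEmpty()) {
            return InetAddress.getLoopbackAddress().getHostAddress();
        }
        return envValue.trim();
    }

    public static void main(String[] args) {
        System.out.println(resolveBindAddress(System.getenv(ENV_VAR)));
    }
}
```

Defaulting to loopback matters here: binding to 0.0.0.0 exposes the backend to the network, which is presumably the security implication mentioned later in this thread.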
@Niharikadutta Do you want to work on this? |
Yes I'll work on this |
Thanks. I am assigning this to @Niharikadutta. |
@indy-3rdman I agree, letting the user specify the address seems like the much nicer solution. Happy to see the progress here :) |
@indy-3rdman / @moredatapls Thanks for your input. Please feel free to take a look at #537 if you want. Thanks!
Hello. Any updates on this feature request and PR #537? We have some restrictions on our environment setup (dev and test), and this feature would be very helpful.
@Niharikadutta Hello! Any progress here? |
Hi @shamahov, I am looking into this. We need to figure out the security implications first; I'll keep you updated, thanks.
We need this as well. It's super valuable if you are writing tests in Visual Studio on Windows, because you can just bind to the WSL2 address with dynamic port forwarding. However, that only works if the address is bound to 0.0.0.0.
Is your feature request related to a problem? Please describe.
Currently it is possible to specify a custom port for the Dotnet Backend. However, it will only listen on the loopback address.
Describe the solution you'd like
Would it be possible to specify the IP address as well (e.g. 0.0.0.0 to listen on all IPv4 addresses)?
Additional context
This would make things easier when using .NET for Apache Spark with a Docker container (e.g. https://hub.docker.com/r/3rdman/dotnet-spark).