I'd like to be able to run Spark Connect without needing to write any Python code. If I want to use Spark via Java or Scala, I still need to write Python code to start the Connect server.

Describe the solution you'd like

```sh
# Download the binary. This is just an example: most of the time there's an
# install script that selects the correct binary based on the architecture,
# and it could also just be hosted on GitHub artifacts.
curl https://eventualcomputing.com/install-spark-connect.sh | sh

# Start the server
./spark-connect --port 55555
```
Hmmm, would we be bundling Python in? How do you see this working? Or do you want to remove all Python dependencies? The issue I was given with removing all Python dependencies is that it would not work with Ray.
The user will still need Python installed on their system, but I believe pyo3 supports dynamically linking to Python at runtime. Right now we use the extension-module feature, which essentially treats the build as a Python module (.so). If we compile a normal Rust binary without that feature enabled, it should link to the system's Python at runtime.
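As a sketch of the two linking modes that comment describes (the feature names come from pyo3's documentation; the crate version and the idea of a dedicated server binary are illustrative, not something decided in this thread):

```toml
# Cargo.toml (illustrative) for a standalone server binary.

[dependencies]
# Today: build a Python extension module (.so loaded by an interpreter):
# pyo3 = { version = "0.22", features = ["extension-module"] }

# Proposed: build a normal Rust binary instead. Without extension-module,
# pyo3 links against a Python found at build/run time; auto-initialize
# lets the binary start the interpreter itself when it first needs it.
pyo3 = { version = "0.22", features = ["auto-initialize"] }
```

The two features are mutually exclusive in pyo3, which matches the distinction drawn above: either Python hosts the Rust code, or the Rust binary hosts Python.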
Is your feature request related to a problem?

I'd like to be able to run Spark Connect without needing to write any Python code. If I want to use Spark via Java or Scala, I still need to write Python code to start the Connect server.

Describe the solution you'd like

Then I can connect using any language I want.
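A sketch of what connecting to such a standalone server could look like from other languages (the `sc://` address scheme and the `--remote` flag come from the Spark Connect documentation; the port matches the example above, and the running server itself is assumed):

```shell
# Hypothetical client-side usage once the standalone server is up.
SPARK_REMOTE="sc://localhost:55555"

# From the PySpark shell, without writing a script:
#   pyspark --remote "$SPARK_REMOTE"

# From Scala or Java, the Spark Connect JVM client takes the same address:
#   SparkSession.builder().remote("sc://localhost:55555").getOrCreate()

echo "$SPARK_REMOTE"
```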
Describe alternatives you've considered
No response
Additional Context
This follows pretty closely with the official Spark Connect overview.
Would you like to implement a fix?
No