[question] Native kubernetes spark vs spark-on-k8s-operator? #1657
Comments
There is only one native version of Spark on k8s. The Spark operator wraps it and provides a k8s-native interface so that you can run your apps using YAML manifests + kubectl (or another k8s client). But that doesn't make it part of Apache Spark. It's the same way Jupyter notebooks can run Spark code, yet you can't work with a notebook from spark-submit: Jupyter is not part of Spark, and neither is the Spark operator.
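For illustration, here is a minimal sketch of that YAML + kubectl workflow using the operator's SparkApplication CRD. The image, jar path, names, and service account below are placeholders you would replace with your own:

```sh
# Submit a SparkApplication manifest to the operator; the operator
# translates it into a spark-submit against the cluster on your behalf.
kubectl apply -f - <<'EOF'
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi              # placeholder name
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: apache/spark:3.5.0   # placeholder image
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.0.jar  # placeholder jar
  sparkVersion: "3.5.0"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark     # assumes a service account with pod-management RBAC
  executor:
    instances: 2
    cores: 1
    memory: 512m
EOF
```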
As far as I know, the Spark operator lets users do spark-submit commands using

As I answered here: if you want a Spark application on k8s, you can use client mode and configure it for k8s (spark.master, etc.) in the application. PySpark users can use the sparglim package to build client mode applications quickly.
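As a rough sketch of client mode against Kubernetes (the API server address, image, and executor count below are placeholders; the driver runs wherever you launch this, e.g. a notebook pod inside the cluster):

```sh
# Start an interactive PySpark session in client mode; executors are
# launched as pods by Spark's native Kubernetes scheduler backend.
pyspark \
  --master k8s://https://kubernetes.default.svc:443 \
  --conf spark.kubernetes.container.image=apache/spark-py:3.5.0 \
  --conf spark.kubernetes.namespace=default \
  --conf spark.executor.instances=2 \
  --conf spark.driver.host=$(hostname -i)
```

In client mode the executor pods need to reach back to the driver, which is why spark.driver.host matters when the driver runs inside the cluster.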
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.
As far as I can tell there is a native version of spark on kubernetes:
https://spark.apache.org/docs/latest/running-on-kubernetes
And there is the spark-on-k8s-operator:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator
In the native version you can directly submit Spark jobs, while with the operator you would write a SparkApplication YAML file. Now I'm wondering: does the spark-on-k8s-operator make use of the native integration? And if so, would it then be possible to directly submit jobs using spark-submit?
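For comparison, direct submission with the native integration looks roughly like this (in the spirit of the running-on-kubernetes docs; the API server address, image, jar path, and service account are placeholders):

```sh
# Submit directly to Kubernetes in cluster mode, no operator involved;
# Spark creates the driver pod, which in turn creates the executor pods.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=apache/spark:3.5.0 \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.5.0.jar
```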