Driver doesn't restart on failure
What happened?
The driver doesn't restart on failure. I set a restart policy on the SparkApplication kind:
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
name: spark-app
namespace: dev
spec:
type: Python
mode: cluster
image: "registry.gitlab.com"
imagePullPolicy: Always
mainApplicationFile: "local:///opt/spark/work-dir/main.py"
sparkVersion: "3.5.2"
restartPolicy:
type: Always
onFailureRetries: 5
onFailureRetryInterval: 90
onSubmissionFailureRetries: 5
onSubmissionFailureRetryInterval: 90
.......
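For reference, a minimal sketch of a complete SparkApplication manifest that this restart policy would attach to is shown below. The driver and executor sections are hypothetical placeholders added purely for illustration; they are not part of this report, whose remaining sections are elided above.

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-app
  namespace: dev
spec:
  type: Python
  mode: cluster
  image: "registry.gitlab.com"              # image reference as given in the report
  imagePullPolicy: Always
  mainApplicationFile: "local:///opt/spark/work-dir/main.py"
  sparkVersion: "3.5.2"
  restartPolicy:
    type: Always
    onFailureRetries: 5                     # retry counts and intervals as given in the report
    onFailureRetryInterval: 90
    onSubmissionFailureRetries: 5
    onSubmissionFailureRetryInterval: 90
  driver:                                   # assumed values, for illustration only
    cores: 1
    memory: "512m"
    serviceAccount: spark
  executor:                                 # assumed values, for illustration only
    instances: 1
    cores: 1
    memory: "512m"
```

Only the driver and executor blocks are assumptions; everything else mirrors the manifest quoted above.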
Reproduction Code
No response
Expected behavior
No response
Actual behavior
No response
Environment & Versions
Additional context
No response
Impacted by this bug?
Give it a 👍. We prioritize the issues with the most 👍.