python exe_job.py runs a shell script a.sh.
a.sh contains spark-submit --master yarn b.py.
ps -ef | grep python shows both exe_job.py and b.py.
Killing the job only kills the exe_job.py process; the b.py process survives, so the application on YARN keeps running.
The task on YARN also cannot really be monitored.
Is there a solution for this?
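For context, the same orphan-process behavior can be reproduced locally: if the launcher only kills its direct child, the grandchild that the shell spawned lives on. A minimal sketch (illustrative only, not the plugin's actual code) of the usual workaround, killing the whole process group instead of a single PID:

```python
import os
import signal
import subprocess

# preexec_fn=os.setsid puts a.sh and everything it spawns
# (spark-submit, b.py) into a new process group
proc = subprocess.Popen(["sh", "a.sh"], preexec_fn=os.setsid)

try:
    proc.wait()
except KeyboardInterrupt:
    # signal the whole group, not just the direct child,
    # so the YARN client process dies with it
    os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
```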
The current version has no solution for this; I'd suggest taking a look at this answer ~ #15
#15 doesn't cover this problem. Do you have any ideas for a fix? I'm willing to patch the source code myself. I found that if I run the sh script with the commands library, stopping the job does kill the corresponding application on YARN.
But commands can't stream stdout in real time the way paramiko does.
This problem has been bothering me for a long time; without a solution I won't be able to use this plugin.
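For reference, this is roughly what the commands-based approach looks like (a Python 2 sketch; the commands module was removed in Python 3), and why it can't stream output while the job runs:

```python
import commands  # Python 2 only

# getstatusoutput blocks until a.sh (and therefore spark-submit)
# has finished; the full output arrives only at the end, so the
# spark-submit log cannot be streamed to the scheduler in real time
status, output = commands.getstatusoutput("sh a.sh")
print(output)
```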
Hi, I added get_pty=True to the ssh.exec_command() call in exe_job.py, and that solved the problem.
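A minimal sketch of that fix, assuming exe_job.py drives the remote script through paramiko's SSHClient (the host and username below are placeholders, not from the issue):

```python
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("spark-gateway", username="hadoop")  # hypothetical host/user

# get_pty=True is the one-line change: the remote command now runs under a
# pseudo-terminal, so when the channel closes, sshd sends SIGHUP to the
# whole remote process group (a.sh, spark-submit, b.py)
stdin, stdout, stderr = client.exec_command("sh a.sh", get_pty=True)
for line in stdout:       # lines arrive as the job produces them
    print(line, end="")

client.close()            # killing the job closes the channel,
                          # which also terminates the YARN application
```

Without the PTY, sshd has no terminal to hang up, so the remote process tree is left running when the channel closes, which matches the behavior described in the issue.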