Capturing stdout output of jobs #886
Comments
You can. Take a look at the custom worker classes here. You can implement your own output-capturing worker, for example by overriding the execute_job method here. This is trickier, though, as you have to capture the output of a forked process.
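The "capture the output of a forked process" part is the tricky bit, since RQ runs each job in a forked work horse. A minimal stdlib-only sketch of the technique (not RQ's actual API; `run_forked_with_captured_stdout` is a hypothetical helper): redirect the child's stdout file descriptor into a pipe and read it from the parent.

```python
import os
import sys

def run_forked_with_captured_stdout(func):
    """Run func in a forked child (as RQ's work horse does) and return
    whatever the child wrote to stdout as a string.
    Hypothetical helper for illustration only; not part of RQ."""
    read_fd, write_fd = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: point fd 1 (stdout) at the write end of the pipe.
        os.close(read_fd)
        os.dup2(write_fd, sys.stdout.fileno())
        try:
            func()
        finally:
            sys.stdout.flush()
            os._exit(0)
    # Parent: close our write end, then read until the child exits.
    os.close(write_fd)
    with os.fdopen(read_fd, "rb") as pipe:
        data = pipe.read()
    os.waitpid(pid, 0)
    return data.decode()

output = run_forked_with_captured_stdout(lambda: print("hello from the job"))
print(repr(output))
```

A custom worker would apply this same fd-level redirection around the job call, then store the captured text somewhere retrievable (e.g. Redis). Note this is Unix-only, since it relies on `os.fork`.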
@anandsaha Have you found a solution for this?
I agree this would be a great feature. It would streamline the process of integrating RQ into backend applications using existing shell tools, for example.
+1 it would be awesome to have this!
Can't you use logging to redirect your output to stdout?
+1
Yes, I think capturing job outputs in “job.output” is a good idea. I’d welcome a PR for this.
I solved this in a hacky way:
2. Fetch and update job.meta['output'] by job id while the subprocess is running.
3. Define a function that reads job.meta['output'] and call it.
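The workaround above can be sketched as follows. This is a stdlib-only illustration of the idea: a plain dict stands in for `job.meta` (with real RQ you would fetch the Job and call `job.save_meta()` after each update so other clients can poll the stored output).

```python
import subprocess
import sys

def run_and_record(cmd, meta):
    """Stream a subprocess's stdout line by line into meta['output'],
    mimicking periodic job.meta updates. `meta` is a plain dict here;
    with RQ you would use job.meta and call job.save_meta() after each
    update so the output is visible while the job is still running."""
    meta["output"] = ""
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )
    for line in proc.stdout:
        meta["output"] += line   # with RQ: job.meta['output'] += line
        # with RQ: job.save_meta()
    proc.wait()
    return meta["output"]

meta = {}
run_and_record([sys.executable, "-c", "print('step 1'); print('step 2')"], meta)
print(meta["output"])
```

The reader-side function in step 3 would then simply fetch the job by id and return `job.meta.get('output', '')`.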
Using this method, are you still able to watch the output of the currently executing job by issuing 'rq worker'? |
Hi,
I would like to somehow capture the data dumped by my job onto stdout. Is there a way to access it from the Job object?
I am spawning new processes using Popen in my job function.
Thanks,
Anand
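Since the job already spawns its processes with Popen, one simple option (a sketch, not an RQ feature) is to capture the child's output inside the job function and return it, so the text lands in the job's result rather than being lost on the worker's stdout. `my_job` here is a hypothetical job function:

```python
import subprocess
import sys

def my_job():
    """Hypothetical RQ job: run an external command, capture its stdout,
    and return it so it is stored as the job's result."""
    result = subprocess.run(
        [sys.executable, "-c", "print('work done')"],
        capture_output=True,   # capture the child's stdout/stderr
        text=True,             # decode bytes to str
        check=True,            # raise if the command fails
    )
    return result.stdout

print(my_job())
```

After the job finishes, the enqueuing side can read the captured text from the job's return value (e.g. `job.result` in RQ), with no worker changes needed.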