Sending large job data #33
Comments
Your solution should be right, but I need a few days to think about this issue.
Yup. No problem.
Any update on this?
I've made a pull request and merged it. Could you please test it?
Unfortunately this is now causing more problems than it fixed. It worked fine for a few jobs when I tested it, but in production, at higher volume, it eventually hangs on reading from the connection and blocks all incoming jobs. For my use, I will likely just revert this commit and increase the buffer size to fit my needs.
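For reference, a minimal sketch of what the "just increase the buffer size" workaround might look like, assuming the read buffer is governed by the package-level `bufferSize` constant of 1024 mentioned in this issue; the actual name, value, and location in gearman-go may differ:

```go
// Hypothetical patch in a fork of gearman-go: enlarge the fixed read buffer
// so a whole large response fits in a single read. The name bufferSize and
// its original value of 1024 are taken from this issue thread.
package client

const bufferSize = 1 << 20 // was 1024; 1 MiB covers the largest expected job payload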
I reverted de91c99 and it seems to work stably.
I noticed in my application that if I send a large amount of job data, it takes a long time to process while gearman keeps spewing "Not enough data" errors back at me. There are multiple solutions to this problem; one I tested can be found here:
https://github.com/kdar/gearman-go/compare/big-data
It basically reads the entire data upfront, before it ever gets to decodeInPack(), so decodeInPack() won't throw an error. Another solution is for the caller of decodeInPack() to notice when there is not enough data and wait until a sufficient amount has arrived before continuing. You would also need to increase bufferSize, since a size of 1024 is extremely small and would still make processing take a long time.
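For concreteness, here is a hedged sketch of that first approach, assuming the standard Gearman binary wire format (a 12-byte header whose last four bytes are the big-endian payload length). `readFullPacket` is a hypothetical helper, not an existing gearman-go function:

```go
// Sketch: read one complete Gearman binary packet before decoding, so the
// decoder never sees a partial buffer. Assumes the standard wire format:
// 4-byte magic ("\0RES"/"\0REQ"), 4-byte type, 4-byte big-endian data size,
// then the payload.
package gearmanutil

import (
	"encoding/binary"
	"io"
	"net"
)

// readFullPacket blocks until one whole packet (header + payload) has arrived.
func readFullPacket(conn net.Conn) ([]byte, error) {
	header := make([]byte, 12)
	if _, err := io.ReadFull(conn, header); err != nil {
		return nil, err
	}
	size := binary.BigEndian.Uint32(header[8:12]) // payload length from the header
	packet := make([]byte, 12+int(size))
	copy(packet, header)
	if _, err := io.ReadFull(conn, packet[12:]); err != nil {
		return nil, err
	}
	return packet, nil // now safe to hand to decodeInPack-style parsing
}
```

The second approach would instead keep a growable buffer in the caller and only invoke decodeInPack() once at least a full packet's worth of bytes has accumulated.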
Let me know what you think.