Decouple Concurrency and Iterations #88
The goal of the […] In your case, the `Fetch by id` step is processed atomically by the executor.
I think I see. So you're saying that a plan item is sent to an executor, and each plan item is executed serially. And the […] Is there a way to generate an independent executor for each item in a CSV file, so that a large CSV file could be processed quickly? Or how feasible would it be to make a change to support something like that?
If I'm understanding the code correctly, there are 1 to n iterations, and the concurrency setting controls how many iterations can run at once. But within an iteration, each step is evaluated, and in the case of the multi-csv-request a step is created for each row in the CSV file (code here). Each step is executed sequentially, which is controlled here in a simple […]. Would it not be possible to do all the 'benchmark executes' in parallel? It looks like it would 'just' require a flag to control whether the user wants sequential or parallel processing: sequential would execute as today, and parallel would use something like rayon(?) to execute the actions, as sketched below.
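For illustration, here is a minimal sketch of that flag idea, making no claims about drill's actual internals: `Row`, `execute_request`, and the `parallel` flag are hypothetical stand-ins, and the parallel branch uses rayon's parallel iterators.

```rust
use rayon::prelude::*;

struct Row {
    id: String,
}

// Hypothetical stand-in for the real HTTP call made for one CSV row.
fn execute_request(row: &Row) -> Result<u16, String> {
    println!("requesting item {}", row.id);
    Ok(200)
}

fn run(rows: &[Row], parallel: bool) -> Vec<Result<u16, String>> {
    if parallel {
        // Fan the per-row requests out across rayon's thread pool.
        rows.par_iter().map(execute_request).collect()
    } else {
        // Today's behavior: one row at a time, in order.
        rows.iter().map(execute_request).collect()
    }
}

fn main() {
    let rows: Vec<Row> = (0..4).map(|i| Row { id: i.to_string() }).collect();
    // The hypothetical flag: `false` reproduces the current sequential run.
    let results = run(&rows, true);
    println!("{} requests completed", results.len());
}
```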
Sorry for the late answer; I have been really busy these weeks. The problem here is that the benchmark is executed sequentially because actions can have dependencies on previous actions, such as storing and using variables. On the other hand, one thing we can do is to execute all […]
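To make that constraint concrete, here is a hedged sketch of why the steps within one iteration resist parallelization, assuming a drill-like model in which each step reads and writes a shared variable context. All names here (`Step`, `Context`, the `Login` / `Fetch by id` steps) are illustrative, not drill's actual types.

```rust
use std::collections::HashMap;

// Variables stored by earlier steps and consumed by later ones.
type Context = HashMap<String, String>;

struct Step {
    name: &'static str,
    run: fn(&mut Context),
}

fn run_iteration(steps: &[Step]) {
    let mut ctx = Context::new();
    for step in steps {
        // Sequential by necessity: a later step may read a variable
        // that an earlier step stored, so steps cannot be reordered.
        (step.run)(&mut ctx);
        println!("finished step: {}", step.name);
    }
}

fn main() {
    let steps = [
        Step {
            name: "Login",
            run: |ctx| {
                // Stores `id` for later steps.
                ctx.insert("id".to_string(), "42".to_string());
            },
        },
        Step {
            name: "Fetch by id",
            run: |ctx| {
                // Depends on the variable the Login step stored.
                let _id = ctx.get("id").expect("set by the Login step");
            },
        },
    ];
    run_iteration(&steps);
}
```

This is why parallelism is easier to apply across independent units (iterations, or independent CSV rows) than across the steps inside one iteration.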
From #83:
Use case: have a list of 10k parameters to run against a URL to validate they return 200. Each item in the list should run exactly once, with a concurrency of 20.
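One way to picture that use case, as a sketch rather than an existing drill feature: size a rayon thread pool to the desired concurrency, so every item runs exactly once with at most 20 in flight. `check_returns_200` is a hypothetical stand-in for the real HTTP request.

```rust
use rayon::prelude::*;

// Hypothetical stand-in: issue the request for one parameter and report
// whether the response status was 200.
fn check_returns_200(param: &str) -> bool {
    !param.is_empty()
}

fn main() {
    let params: Vec<String> = (0..10_000).map(|i| format!("param-{i}")).collect();

    // A pool with exactly 20 threads caps concurrency at 20, while rayon's
    // work stealing ensures every item is scheduled exactly once.
    let pool = rayon::ThreadPoolBuilder::new()
        .num_threads(20)
        .build()
        .expect("failed to build thread pool");

    let failures: Vec<&String> = pool.install(|| {
        params
            .par_iter()
            .filter(|p| !check_returns_200(p))
            .collect()
    });

    println!("{} of {} items failed", failures.len(), params.len());
}
```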