hi @CacheControl , We are using json-rules-engine in our project to compute outcomes by evaluating around 10k records stored in MongoDB.
I am sharing one rule below for your reference. We have around 10k of these in our project, and evaluating them against the facts takes around 10-15 seconds.
Can you please help me understand why it takes so long to evaluate this type of rule? If possible, please also share suggestions for improving performance.
That's only about 1 ms per rule, right? That doesn't seem excessive from the json-rules-engine standpoint; if anything it's fast. Combined, though, 10-15 seconds is far too long for a user to wait.
I'm thinking some better architecture could improve it.
Could you pre-filter the MongoDB results so that you'd have fewer rules to run?
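That pre-filtering idea could be sketched like this: static `equal` conditions in a rule translate directly into a MongoDB query filter, so the engine only ever sees candidate records. This is a minimal sketch under assumptions, not part of json-rules-engine: it assumes rules follow the shape shown in this issue and that each condition's `fact` name matches a field on the MongoDB documents; `preFilterFromRule` is a hypothetical helper name.

```javascript
// Sketch (assumption): push literal equality conditions down into the
// MongoDB query so fewer records reach the rules engine.
function preFilterFromRule(rule) {
  const filter = {};
  for (const cond of rule.conditions.all || []) {
    // Only literal equality checks are pushed down here; range and date
    // operators could also be translated ($gt/$lt) but are omitted.
    if (cond.operator === 'equal') {
      filter[cond.fact] = cond.value;
    }
  }
  return filter;
}

const rule = {
  conditions: {
    all: [
      { fact: 'customer_tier', operator: 'equal', value: 'gold' },
      { fact: 'order_amount', operator: 'greaterThan', value: 2500 },
      { fact: 'order_state', operator: 'equal', value: 'confirmed' },
    ],
  },
};

console.log(preFilterFromRule(rule));
// The result could then be passed to collection.find(filter) before
// handing the surviving records to the engine.
```
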
@iay25 , as @Ben-CA pointed out, on a per-evaluation basis that's actually not bad. JavaScript isn't well suited to doing a large amount of compute per record over a large volume of data, given the normally single-threaded nature of the runtime.
Node.js supports worker threads (https://nodejs.org/docs/latest-v18.x/api/worker_threads.html). I'm not sure whether there's a decent thread-pool implementation available on npm, but I would look at something like that. You'll need to initialize each thread by setting up an instance of the rules engine, but then you should be able to set up the system so the threads process the records from a shared queue. The other option would be a worker pool built on subprocesses.
For reference, the rule from the original question, pretty-printed:

```json
{
  "conditions": {
    "all": [
      { "fact": "customer_delivery_address", "operator": "equal", "factLabel": "Customer Delivery Address", "value": "GB", "valueSet": [{ "value": "GB", "label": "GB" }] },
      { "fact": "customer_tier", "operator": "equal", "factLabel": "Customer Tier", "value": "gold", "valueSet": [{ "value": "gold", "label": "Gold" }] },
      { "fact": "new_customer", "operator": "isBoolean", "factLabel": "New Customer", "value": true, "valueSet": [{ "value": true, "label": true }] },
      { "fact": "order_amount", "operator": "greaterThan", "factLabel": "Order Amount", "value": 2500, "valueSet": [{ "value": 2500, "label": "2500" }] },
      { "fact": "order_count", "operator": "lessThan", "factLabel": "Order Count", "value": 100, "valueSet": [{ "value": 100, "label": "100" }] },
      { "fact": "order_date", "operator": "isDateGreaterThan", "factLabel": "Order Date", "value": "2024-02-01T05:15:44Z", "valueSet": [{ "value": "2024-02-01T05:15:44Z", "label": "2024-02-01T05:15:44Z" }] },
      { "fact": "order_date", "operator": "isDateLessThan", "factLabel": "Order Date", "value": "2024-03-01T05:10:50Z", "valueSet": [{ "value": "2024-03-01T05:10:50Z", "label": "2024-03-01T05:10:50Z" }] },
      { "fact": "order_state", "operator": "equal", "factLabel": "Order State", "value": "confirmed", "valueSet": [{ "value": "confirmed", "label": "Confirmed" }] },
      { "fact": "payment_state", "operator": "equal", "factLabel": "Payment State", "value": "paid", "valueSet": [{ "value": "paid", "label": "Paid" }] },
      { "fact": "customers", "operator": "equal", "factLabel": "Customers", "value": "[email protected]", "valueSet": [{ "value": "[email protected]", "label": "[email protected]" }] }
    ]
  },
  "event": {
    "type": "categories",
    "params": {
      "label": "Category",
      "value": "Fitness Kit",
      "key": "8173dfd1-d8a1-417d-ab78-07dfa6799f59",
      "operator": "is",
      "source": "resource"
    }
  }
}
```
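As a side note on where the time goes: each rule above is ten cheap comparisons, which lines up with the roughly 1 ms/rule figure mentioned in the replies. Below is a dependency-free sketch of that per-record work. Note that `isBoolean`, `isDateGreaterThan`, and `isDateLessThan` are not built-in json-rules-engine operators, so in the real engine they would have to be registered with `engine.addOperator(...)`; the implementations here are assumptions based only on the operator names.

```javascript
// Dependency-free sketch of what evaluating one rule involves.
// The custom operators (isBoolean, isDateGreaterThan, isDateLessThan) are
// guessed from their names; the real definitions live wherever they were
// registered via engine.addOperator(...).
const operators = {
  equal: (a, b) => a === b,
  greaterThan: (a, b) => a > b,
  lessThan: (a, b) => a < b,
  isBoolean: (a) => typeof a === 'boolean',
  isDateGreaterThan: (a, b) => new Date(a) > new Date(b),
  isDateLessThan: (a, b) => new Date(a) < new Date(b),
};

function evaluate(rule, facts) {
  // "all" semantics: every condition must pass.
  return rule.conditions.all.every(
    (c) => operators[c.operator](facts[c.fact], c.value)
  );
}

// Trimmed-down version of the rule above, for illustration.
const rule = {
  conditions: {
    all: [
      { fact: 'customer_tier', operator: 'equal', value: 'gold' },
      { fact: 'order_amount', operator: 'greaterThan', value: 2500 },
      { fact: 'order_date', operator: 'isDateGreaterThan', value: '2024-02-01T05:15:44Z' },
    ],
  },
};

console.log(evaluate(rule, {
  customer_tier: 'gold',
  order_amount: 3000,
  order_date: '2024-02-15T00:00:00Z',
})); // true
```

Each condition is a single comparison, so per-rule cost is tiny; the total only becomes a problem because it is multiplied by 10k, which is why pre-filtering and parallelism are the levers to pull.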