Replies: 3 comments
-
Hello,
Don't forget that rules have a purpose and a context. I have split some large catalog registry rules because the handling of false positives was awful.
Someone, who has since left, built an in-house PHP application to manage them. We don't take the …
-
We had similar challenges with our Sigma rule management in ASGARD, which prepares the rules for use in the endpoint agents named Aurora.
Regarding False Positive Filtering
At that time, we refactored all rules to avoid conditions like …
Combining multiple rules
As @frack113 already mentioned, the context is highly relevant. You could combine multiple rules into a single rule to improve performance, but you would lose the context. You could, after a match, look up the context information for the matched strings and add it back (see the sketch below), but many of our rules have or will have filtering sections that you would also have to fold into that big rule. I don't know how that would work out in the long run. If I had to reduce impact, I would use only the rules of a certain level, e.g. only …
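A rough sketch of that post-match lookup, assuming PyYAML and a plain clone of the rule repository (the field choice and paths are illustrative only, not how ASGARD actually does it):

```python
# Build a reverse index from every CommandLine|contains value to the metadata
# of the rule it came from, so a hit in a big combined rule can be re-enriched
# with the original title/level. Illustrative sketch only.
import yaml
from pathlib import Path

def build_context_index(rule_dir: str) -> dict:
    index = {}
    for path in Path(rule_dir).glob("**/*.yml"):
        for rule in yaml.safe_load_all(path.read_text(encoding="utf-8")):
            if not rule:
                continue
            meta = {"id": rule.get("id"), "title": rule.get("title"), "level": rule.get("level")}
            for section in (rule.get("detection") or {}).values():
                if not isinstance(section, dict):
                    continue  # skip condition strings and list-style sections
                values = section.get("CommandLine|contains", [])
                for value in ([values] if isinstance(values, str) else values):
                    index[value] = meta
    return index

# After a match on the combined rule:
#   context = build_context_index("rules/")[matched_string]
```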
-
It is still very beta, but I hope it can help: https://github.com/frack113/SigmaDiff
-
Hi everyone!
I would like to share some thinking and reasoning behind a project to redefine the way we manage our ruleset in our organization.
I would love to have your feedback and believe that we could all benefit from each other's experience here.
Background
We have ca. 1000 servers, highly diverse, mostly Windows with Sysmon, but also Linux. Together with other logs, this generates 3-4K EPS on an ELK stack. Our Sigma ruleset of 120 rules yields 140 Elastalert (v1) queries, and we currently manage to run them all in less than 3 minutes over a 5-minute window.
General Problem: Outdated ruleset
Our ruleset was built up over time by cherry-picking elements of Sigma rules we could find here and there and customizing them to our environment (mostly filters) to limit false positives.
We did not have enough time to follow all commits on the Sigma repository, and by now I have the feeling that our ruleset needs a major update. Last year we reviewed the commits we had missed, but we did it manually and it was very tedious.
Today there are 1919 rules in the Sigma repository. We will have to review all the commits again, but this time I want to be sure that we keep our ruleset up to date!
Challenge: Taking advantage of the Sigma repository and remaining ahead
So the idea would be to rebase our ruleset onto the Sigma repository.
The principle: clone the Sigma repository, create our own branch, commit our adaptations to the rules, and tag the rules we want to integrate into our ruleset. Periodically, we would pull from upstream (this GitHub repo) and rebase our branch onto the newest commits on its master. We would then automatically benefit from the latest rule modifications, and we would periodically review the new rules to decide whether or not to integrate them into our ruleset.
A script would then take all tagged rules and create our ruleset for Elastalert.
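A minimal sketch of that script, assuming PyYAML, a made-up `internal.enabled` entry in each rule's `tags` list as the marker, and the legacy `sigmac` elastalert backend for the actual conversion:

```python
# Sketch: after a periodic `git fetch upstream && git rebase upstream/master`,
# stage every rule carrying our (hypothetical) internal tag for conversion.
import shutil
import yaml
from pathlib import Path

RULES_DIR = Path("rules")             # rules/ in our rebased Sigma branch
STAGING_DIR = Path("elastalert-src")  # input directory for the converter
INTERNAL_TAG = "internal.enabled"     # made-up marker in each tagged rule

def collect_tagged_rules() -> int:
    STAGING_DIR.mkdir(exist_ok=True)
    count = 0
    for path in RULES_DIR.glob("**/*.yml"):
        for rule in yaml.safe_load_all(path.read_text(encoding="utf-8")):
            if rule and INTERNAL_TAG in (rule.get("tags") or []):
                shutil.copy(path, STAGING_DIR / path.name)
                count += 1
                break
    return count

if __name__ == "__main__":
    print(f"staged {collect_tagged_rules()} rules")
    # then convert, e.g.:
    #   sigmac -r -t elastalert -c tools/config/winlogbeat.yml elastalert-src/
```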
Merging rules
One approach we follow for our ruleset is to combine several Sigma rules into one.
I've always presumed (but haven't formally tested) that it is more efficient to have 1 rule looking for 10 `CommandLine|contains` elements than 10 rules each looking for 1 `CommandLine|contains` element. This also allows us to centralize more of our filters. E.g., we have 3 Sigma rules (one per criticality) for suspicious process creation, combining some of the 720 suspicious process creation rules for Windows in the Sigma repository.
So to maintain this approach, the script would have to be smart enough to combine the rules by itself, merging selections and filters... More complex, but not impossible. Otherwise we would just explode the number of rules...
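To illustrate what "smart enough" might mean, here is a minimal, untested sketch that only handles rules whose detection uses plain `selection`/`filter` sections; it renames the sections so each rule's filter stays scoped to its own selection (file names and output metadata are placeholders):

```python
# Sketch: merge several Sigma rules into one combined rule while preserving
# per-rule filters in the combined condition.
import yaml
from pathlib import Path

def merge_rules(paths, title):
    detection = {}
    terms = []
    for i, path in enumerate(paths):
        rule = yaml.safe_load(path.read_text(encoding="utf-8"))
        src = rule["detection"]
        detection[f"selection_{i}"] = src["selection"]
        if "filter" in src:
            detection[f"filter_{i}"] = src["filter"]
            terms.append(f"(selection_{i} and not filter_{i})")
        else:
            terms.append(f"selection_{i}")
    detection["condition"] = " or ".join(terms)
    return {
        "title": title,  # placeholder metadata
        "status": "experimental",
        "logsource": {"category": "process_creation", "product": "windows"},
        "detection": detection,
        "level": "high",
    }

if __name__ == "__main__":
    merged = merge_rules(sorted(Path("high").glob("*.yml")),
                         "Merged suspicious process creation (high)")
    print(yaml.dump(merged, sort_keys=False))
```

The same renaming trick extends to any number of named sections; the hard part in practice is rules whose conditions use aggregations or `1 of them`, which cannot be merged mechanically.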
Please tell us...!
Thanks a lot in advance for your contribution :)