
Datajson v2.0 #12175

Open · wants to merge 9 commits into base: master
Conversation

@regit (Contributor) commented Nov 28, 2024

Update of #12102

Contribution style:

Our Contribution agreements:

Changes (if applicable):

Link to ticket: https://redmine.openinfosecfoundation.org/issues/7372

Describe changes:

SV_REPO=OISF/suricata-verify#2123

This patch introduces a new keyword datajson that is similar
to dataset, with a twist. Where dataset allows matching from
sets, datajson allows the same but also adds JSON data to the
alert event. This data comes from the set definition itself.
For example, an ipv4 set will look like:

  10.16.1.11,{"test": "success","context":3}

The syntax is value and json data separated by a comma.
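As a rough sketch (not the PR's actual parser, and the function name is made up), a set line like this can be parsed by splitting on the first comma only, so that commas inside the JSON payload survive:

```python
import json

def parse_datajson_line(line: str):
    """Split a set line into (value, json_data).

    The value and its JSON payload are separated by the first comma;
    any later commas belong to the JSON document itself.
    """
    line = line.strip()
    if "," not in line:
        # Plain value with no JSON payload attached.
        return line, None
    value, raw_json = line.split(",", 1)
    return value, json.loads(raw_json)

value, data = parse_datajson_line('10.16.1.11,{"test": "success","context":3}')
```

Splitting with a maximum of one split is what keeps `{"test": "success","context":3}` intact despite its internal comma.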

The syntax of the keyword is the following:

  datajson:isset,src_ip,type ip,load src.lst,key src_ip;
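For illustration, a complete signature using the keyword could look like the following (the msg, sid and rev values are made up; the option string mirrors the example above):

```
alert ip any any -> any any (msg:"IOC match with context"; \
    datajson:isset,src_ip,type ip,load src.lst,key src_ip; \
    sid:1000001; rev:1;)
```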

Compared to dataset, it just has a supplementary option key
that is used to indicate in which subobject the JSON value
should be added.

The information is added to the event under the alert.extra
subobject:

  "alert": {
    "extra": {
      "src_ip": {
        "test": "success",
        "context": 3
      }
    },
    ...
  }

The main interest of the feature is the ability to contextualize
a match. For example, if you have an IOC source, you can do:

 value1,{"actor":"APT28","Country":"FR"}
 value2,{"actor":"APT32","Country":"NL"}

This way, a single dataset is able to add context to the
event, where previously this was not possible and multiple
signatures had to be used.
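To show what that buys downstream, here is a sketch (not part of the PR) of consuming the enriched eve.json output, assuming the alert.extra layout shown above; the sample lines and field paths are illustrative assumptions:

```python
import json

# Hypothetical eve.json lines carrying the alert.extra enrichment
# described above (one JSON event per line, as Suricata emits them).
eve_lines = [
    '{"event_type": "alert", "alert": {"extra": {"src_ip": {"actor": "APT28", "Country": "FR"}}}}',
    '{"event_type": "alert", "alert": {"extra": {"src_ip": {"actor": "APT32", "Country": "NL"}}}}',
    '{"event_type": "alert", "alert": {"extra": {"src_ip": {"actor": "APT28", "Country": "FR"}}}}',
]

def alerts_by_actor(lines):
    """Count alert events per IOC actor found in alert.extra."""
    counts = {}
    for line in lines:
        event = json.loads(line)
        if event.get("event_type") != "alert":
            continue
        extra = event.get("alert", {}).get("extra", {})
        actor = extra.get("src_ip", {}).get("actor")
        if actor:
            counts[actor] = counts.get(actor, 0) + 1
    return counts
```

With a plain dataset, recovering the actor would have required one signature per actor; here a single rule plus the set's JSON payload carries the context through to the consumer.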

Ticket: OISF#7372
Previous code was using an array, introducing a limit on the
number of datajson keywords that can be used in a signature.

This patch uses a linked list instead to overcome the limit. By
making the first element of the list part of the structure,
we limit the cost of the feature to one structure member added
to the PacketAlert structure. Only the PacketAlertFree function
is impacted, as we need to iterate the list to find potential
allocations.

Ticket: OISF#7372
It was not correctly handling JSON values containing spaces,
as they were seen as multiple arguments.

Ticket: OISF#7372
@regit regit mentioned this pull request Nov 28, 2024
@jasonish (Member) commented:
I still have to play with this a little more, and datasets in general, but a few quick observations; not all are related to this PR, but they make things confusing.

If my IP dataset input file is simply:

10.16.1.11

then datajson crashes.


  • A dataset input line like 10.16.1.10,asdf is parsed as datarep even if dataset is used. It errors out because asdf is not a valid reputation value, but it seems like we already have some overloading going on.
  • A dataset input line like 10.16.1.10,["asdf"] parses the JSON even if dataset is used and not datajson.
  • Scalars like 1 and "one" are valid JSON but not accepted.
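On that last point, a quick check in plain Python (just to illustrate the observation; RFC 8259 allows any value as a top-level JSON document) confirms that bare scalars parse as valid JSON:

```python
import json

# Both a bare number and a quoted string are complete JSON documents.
assert json.loads("1") == 1
assert json.loads('"one"') == "one"
```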

Should datarepjson be a thing for completeness?

@suricata-qa
Information:

ERROR: QA failed on SURI_TLPW2_autofp_suri_time.

ERROR: QA failed on SURI_TLPR1_suri_time.

  field                        baseline   test        %
  SURI_TLPW2_autofp_stats_chk
    .uptime                    138        145         105.07%
  SURI_TLPR1_stats_chk
    .uptime                    624        687         110.1%
    .http.byterange.memuse     99602115   104192025   104.61%

Pipeline 23600

@regit (Contributor, Author) commented Nov 29, 2024

> I still have to play with this a little more, and datasets in general, but a few quick observations; not all are related to this PR, but they make things confusing.
>
> If my IP dataset input file is simply:
>
> 10.16.1.11
>
> then datajson crashes.
>
>   • A dataset input line like 10.16.1.10,asdf is parsed as datarep even if dataset is used. It errors out because asdf is not a valid reputation value, but it seems like we already have some overloading going on.
>   • A dataset input line like 10.16.1.10,["asdf"] parses the JSON even if dataset is used and not datajson.
>   • Scalars like 1 and "one" are valid JSON but not accepted.

OK, I need to fix the parsing for here. :/

> Should datarepjson be a thing for completeness?

I'm more in favor of building the set from the CTI tool at the correct cut before exporting it to Suricata. I think that corresponds to most of the usage I've seen. But potentially we could open an issue and see if someone is interested in it.
