Fix NestedProcessingTransformation #298

Open · wants to merge 1 commit into main

Conversation

@r0ot commented Nov 4, 2024

When using a custom pipeline written in YAML like so:

# pipelines from pySigma-backend-elasticsearch
from sigma.pipelines.elasticsearch import ecs_windows, ecs_zeek_corelight
from sigma.processing.resolver import ProcessingPipelineResolver

pipeline_classes = [ecs_windows(), ecs_zeek_corelight()]
pipeline_names = [i.name for i in pipeline_classes if i.name]
ProcessingPipelineResolver.from_pipeline_list(pipeline_classes).resolve(pipeline_names + ['custom_pipeline.yml'])

and using a NestedProcessingTransformation in custom_pipeline.yml like this:

name: Custom
priority: 100
transformations:
  - type: nest
    items:
      - type: field_name_mapping
        rule_conditions:
          - type: logsource
            service: signinlogs
        mapping:
          ResultType: azure.signinlogs.result_type

the following error is raised:

File .../python3.12/site-packages/sigma/processing/transformations.py:1021, in NestedProcessingTransformation.__post_init__(self)
-> 1021 self._nested_pipeline = ProcessingPipeline(items=self.items)

File .../python3.12/site-packages/sigma/processing/pipeline.py:433, in ProcessingPipeline.__post_init__(self)
    432 if not all((isinstance(item, ProcessingItem) for item in self.items)):
--> 433     raise TypeError(
    434         "Each item in a processing pipeline must be a ProcessingItem - don't use processing classes directly!"
    435     )
    436 if not all(
    437     (isinstance(item, QueryPostprocessingItem) for item in self.postprocessing_items)
    438 ):

TypeError: Each item in a processing pipeline must be a ProcessingItem - don't use processing classes directly!

So I propose the following fix. It appears that NestedProcessingTransformation isn't being instantiated via its from_dict method, which would have ensured that each of its items is a ProcessingItem; remedying that would be an alternative fix for this issue. There's also a chance that the way I'm using my custom YAML pipeline (ProcessingPipelineResolver.from_pipeline_list(...).resolve(...)) is incorrect, in which case please correct me.
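For illustration only (this is not necessarily the exact diff in this PR), a minimal sketch of the alternative fix would be to convert plain dict items in __post_init__ before the nested pipeline is built. It assumes pySigma's ProcessingItem.from_dict classmethod, which NestedProcessingTransformation.from_dict already relies on:

# Sketch only: convert dict items to ProcessingItem before building the
# nested pipeline, so YAML-defined pipelines pass the type check in
# ProcessingPipeline.__post_init__. Assumes ProcessingItem.from_dict exists.
def __post_init__(self):
    from sigma.processing.pipeline import ProcessingItem, ProcessingPipeline

    self.items = [
        item if isinstance(item, ProcessingItem) else ProcessingItem.from_dict(item)
        for item in self.items
    ]
    self._nested_pipeline = ProcessingPipeline(items=self.items)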
