The following is a list of processors available to mutate an event.
A processor that adds entries to an event
To get started, create the following `pipeline.yaml`.
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - add_entries:
        entries:
          - key: "newMessage"
            value: 3
            overwrite_if_key_exists: true
  sink:
    - stdout:
```
Create the following file named `logs_json.log` and replace the `path` in the file source of your `pipeline.yaml` with the path of this file.

```json
{"message": "value"}
```

When run, the processor will parse the message into the following output:

```json
{"message": "value", "newMessage": 3}
```

If `newMessage` had already existed, its existing value would have been overwritten with `3`.
* `entries` - (required) - A list of entries to add to an event
    * `key` - (required) - The key of the new entry to be added
    * `value` - (required) - The value of the new entry to be added. Strings, booleans, numbers, null, nested objects, and arrays containing the aforementioned data types are valid to use
    * `overwrite_if_key_exists` - (optional) - When set to `true`, if `key` already exists in the event, then the existing value will be overwritten. The default is `false`.
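
Because `entries` is a list, a single `add_entries` processor can add several fields at once. The following sketch replaces only the `processor` section of the `pipeline.yaml` above; the key names and values are illustrative, not part of the original example:

```yaml
  processor:
    - add_entries:
        entries:
          - key: "status"        # hypothetical string entry
            value: "processed"
          - key: "attempts"      # hypothetical numeric entry; replaces an existing value if present
            value: 1
            overwrite_if_key_exists: true
```

With the input `{"message": "value"}`, this would emit `{"message": "value", "status": "processed", "attempts": 1}`.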
A processor that copies values within an event
To get started, create the following `pipeline.yaml`.
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - copy_values:
        entries:
          - from_key: "message"
            to_key: "newMessage"
            overwrite_if_to_key_exists: true
  sink:
    - stdout:
```
Create the following file named `logs_json.log` and replace the `path` in the file source of your `pipeline.yaml` with the path of this file.

```json
{"message": "value"}
```

When run, the processor will parse the message into the following output:

```json
{"message": "value", "newMessage": "value"}
```

If `newMessage` had already existed, its existing value would have been overwritten with `value`.
* `entries` - (required) - A list of entries to be copied in an event
    * `from_key` - (required) - The key of the entry to be copied
    * `to_key` - (required) - The key of the new entry to be added
    * `overwrite_if_to_key_exists` - (optional) - When set to `true`, if `to_key` already exists in the event, then the existing value will be overwritten. The default is `false`.
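
As with `add_entries`, the `entries` list lets one `copy_values` processor copy several values in a single pass. A sketch, again substituting only the `processor` section above and using illustrative key names:

```yaml
  processor:
    - copy_values:
        entries:
          - from_key: "message"
            to_key: "messageCopy"   # hypothetical destination key
          - from_key: "message"
            to_key: "rawMessage"    # hypothetical destination key; overwritten if it already exists
            overwrite_if_to_key_exists: true
```

With the input `{"message": "value"}`, the output would be `{"message": "value", "messageCopy": "value", "rawMessage": "value"}`.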
A processor that deletes entries in an event
To get started, create the following `pipeline.yaml`.
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - delete_entries:
        with_keys: ["message"]
  sink:
    - stdout:
```
Create the following file named `logs_json.log` and replace the `path` in the file source of your `pipeline.yaml` with the path of this file.

```json
{"message": "value", "message2": "value2"}
```

When run, the processor will parse the message into the following output:

```json
{"message2": "value2"}
```

If `message` had not existed in the event, then nothing would have happened.
* `with_keys` - (required) - An array of keys of the entries to be deleted
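
Since `with_keys` is an array, one `delete_entries` processor can drop several entries at once. A minimal sketch of the `processor` section, assuming the same two-field input file shown above:

```yaml
  processor:
    - delete_entries:
        with_keys: ["message", "message2"]   # both keys are removed; absent keys are ignored
```

With the input `{"message": "value", "message2": "value2"}`, the output would be `{}`.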
A processor that renames keys in an event
To get started, create the following `pipeline.yaml`.
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - rename_keys:
        entries:
          - from_key: "message"
            to_key: "newMessage"
            overwrite_if_to_key_exists: true
  sink:
    - stdout:
```
Create the following file named `logs_json.log` and replace the `path` in the file source of your `pipeline.yaml` with the path of this file.

```json
{"message": "value"}
```

When run, the processor will parse the message into the following output:

```json
{"newMessage": "value"}
```

If `newMessage` had already existed, its existing value would have been overwritten with `value`.
* `entries` - (required) - A list of entries to rename in an event
    * `from_key` - (required) - The key of the entry to be renamed
    * `to_key` - (required) - The new key of the entry
    * `overwrite_if_to_key_exists` - (optional) - When set to `true`, if `to_key` already exists in the event, then the existing value will be overwritten. The default is `false`.
The renaming operation occurs in the order defined. This means that chaining is implicit with the RenameKeyProcessor. Take the following `pipeline.yaml` for example:
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - rename_keys:
        entries:
          - from_key: "message"
            to_key: "message2"
          - from_key: "message2"
            to_key: "message3"
  sink:
    - stdout:
```
Let the contents of `logs_json.log` be the following:

```json
{"message": "value"}
```

After the processor runs, this will be the output:

```json
{"message3": "value"}
```
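
These processors can also be combined in one pipeline, since the `processor` list is applied in the order it is defined, just like the chained rename entries above. A sketch that first adds a field and then renames the original one (the key names are illustrative):

```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - add_entries:
        entries:
          - key: "newMessage"       # added first
            value: 3
    - rename_keys:
        entries:
          - from_key: "message"     # then the original field is renamed
            to_key: "originalMessage"
  sink:
    - stdout:
```

With the input `{"message": "value"}`, the output would be `{"newMessage": 3, "originalMessage": "value"}`.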
This plugin is compatible with Java 14.