diff --git a/_data-prepper/pipelines/configuration/processors/parse-json.md b/_data-prepper/pipelines/configuration/processors/parse-json.md
index e32b4e5110..306327a8d7 100644
--- a/_data-prepper/pipelines/configuration/processors/parse-json.md
+++ b/_data-prepper/pipelines/configuration/processors/parse-json.md
@@ -1,27 +1,79 @@
 ---
 layout: default
-title: Parse JSON
+title: parse_json
 parent: Processors
 grand_parent: Pipelines
 nav_order: 45
 ---
 
-# Parse JSON
+# parse_json
 
-## Overview
+The `parse_json` processor parses JSON data for an event, including any nested fields. The processor extracts the JSON pointer data and adds the extracted fields to the input event.
 
-The `parse_json` processor parses JSON data for an event, including any nested fields. The following table describes several optional parameters you can configure in the `parse_json` processor.
-
-Option | Required | Type | Description
-:--- | :--- | :--- | :---
-source | No | String | The field in the `Event` that will be parsed. Default value is `message`.
-destination | No | String | The destination field of the parsed JSON. Defaults to the root of the `Event`. Cannot be `""`, `/`, or any whitespace-only `String` because these are not valid `Event` fields.
-pointer | No | String | A JSON Pointer to the field to be parsed. There is no `pointer` by default, meaning the entire `source` is parsed. The `pointer` can access JSON Array indices as well. If the JSON Pointer is invalid then the entire `source` data is parsed into the outgoing `Event`. If the pointed-to key already exists in the `Event` and the `destination` is the root, then the pointer uses the entire path of the key.
+## Configuration
 
-
\ No newline at end of file
+To get started, create the following `pipeline.yaml` file:
+
+```yaml
+parse-json-pipeline:
+  source:
+    stdin:
+  processor:
+    - parse_json:
+  sink:
+    - stdout:
+```
+
+### Basic example
+
+To test the `parse_json` processor with the previous configuration, run the pipeline, paste the following line into your console, and then enter `exit` on a new line:
+
+```
+{"outer_key": {"inner_key": "inner_value"}}
+```
+{% include copy.html %}
+
+The `parse_json` processor parses the message into the following format:
+
+```
+{"message": "{\"outer_key\": {\"inner_key\": \"inner_value\"}}", "outer_key": {"inner_key": "inner_value"}}
+```
+
+### Example with a JSON pointer
+
+You can use a JSON pointer to parse a selection of the JSON data by specifying the `pointer` option in the configuration. To get started, create the following `pipeline.yaml` file:
+
+```yaml
+parse-json-pipeline:
+  source:
+    stdin:
+  processor:
+    - parse_json:
+        pointer: "outer_key/inner_key"
+  sink:
+    - stdout:
+```
+
+To test the `parse_json` processor with the pointer option, run the pipeline, paste the following line into your console, and then enter `exit` on a new line:
+
+```
+{"outer_key": {"inner_key": "inner_value"}}
+```
+{% include copy.html %}
+
+The processor parses the message into the following format:
+
+```
+{"message": "{\"outer_key\": {\"inner_key\": \"inner_value\"}}", "inner_key": "inner_value"}
+```
\ No newline at end of file
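
The options table removed above also documents `source` and `destination`, which the new examples do not exercise. The following is a minimal sketch of a pipeline that uses both, assuming they keep the semantics described in that table: `source` selects the field to parse (default `message`) and `destination` selects where the parsed keys are written (default is the event root). The field names `raw_payload` and `parsed` are hypothetical and used only for illustration.

```yaml
parse-json-pipeline:
  source:
    stdin:
  processor:
    - parse_json:
        # Hypothetical field names for illustration only.
        # "source" defaults to "message"; "destination" defaults to the event root.
        source: "raw_payload"
        destination: "parsed"
  sink:
    - stdout:
```

With this configuration, an event whose `raw_payload` field holds a JSON string would get the parsed keys nested under `parsed` rather than merged into the event root.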