[DOCS] Update ES ingest pipeline refs (#28239)
## What does this PR do?

Changes several 'ingest node' references in the docs to 'ingest pipeline.'

## Why is it important?

While a node with the `ingest` role is required to use Elasticsearch ingest pipelines, we no longer include 'ingest node' in the feature name.

There are no official plans, but the Elasticsearch team has discussed removing the `ingest` role in the future.
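For context, ingest pipelines are created and managed through the `_ingest` API regardless of how the nodes running them are named; a minimal sketch (the pipeline name and processor choice here are illustrative only, not part of this change):

```console
PUT _ingest/pipeline/my-pipeline
{
  "description": "Illustrative pipeline; runs on any node with the ingest role",
  "processors": [
    { "lowercase": { "field": "message" } }
  ]
}
```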

## Related issues

- elastic/elasticsearch#70253
- elastic/elasticsearch#78633

Co-authored-by: dedemorton <[email protected]>
jrodewig and dedemorton authored Oct 5, 2021
1 parent e0cc99f commit c3cb365
Showing 38 changed files with 75 additions and 75 deletions.
2 changes: 1 addition & 1 deletion auditbeat/auditbeat.reference.yml
@@ -479,7 +479,7 @@ output.elasticsearch:
# In case you modify this pattern you must update setup.template.name and setup.template.pattern accordingly.
#index: "auditbeat-%{[agent.version]}-%{+yyyy.MM.dd}"

-# Optional ingest node pipeline. By default no pipeline will be used.
+# Optional ingest pipeline. By default no pipeline will be used.
#pipeline: ""

# Optional HTTP path
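For reference, a sketch of this commented-out option enabled in a Beats Elasticsearch output (the host and pipeline ID are placeholders, not part of this change):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]
  # Route all events through this ingest pipeline before indexing.
  pipeline: my-pipeline
```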
2 changes: 1 addition & 1 deletion auditbeat/docs/fields.asciidoc
@@ -2358,7 +2358,7 @@ alias to: error.message
[float]
=== geoip
-The geoip fields are defined as a convenience in case you decide to enrich the data using a geoip filter in Logstash or Ingest Node.
+The geoip fields are defined as a convenience in case you decide to enrich the data using a geoip filter in Logstash or an Elasticsearch geoip ingest processor.
3 changes: 2 additions & 1 deletion auditbeat/module/auditd/_meta/fields.yml
@@ -858,7 +858,8 @@
type: group
description: >
The geoip fields are defined as a convenience in case you decide to
-enrich the data using a geoip filter in Logstash or Ingest Node.
+enrich the data using a geoip filter in Logstash or an Elasticsearch geoip
+ingest processor.
fields:
- name: continent_name
type: keyword
2 changes: 1 addition & 1 deletion auditbeat/module/auditd/fields.go

Some generated files are not rendered by default.

13 changes: 6 additions & 7 deletions docs/devguide/modules-dev-guide.asciidoc
@@ -304,10 +304,9 @@ variables to dynamically switch between configurations.
[float]
==== ingest/*.json

-The `ingest/` folder contains Elasticsearch
-{ref}/ingest.html[Ingest Node] pipeline configurations. The Ingest
-Node pipelines are responsible for parsing the log lines and doing other
-manipulations on the data.
+The `ingest/` folder contains {es} {ref}/ingest.html[ingest pipeline]
+configurations. Ingest pipelines are responsible for parsing the log lines and
+doing other manipulations on the data.

The files in this folder are JSON or YAML documents representing
{ref}/pipeline.html[pipeline definitions]. Just like with the `config/`
@@ -344,10 +343,10 @@ on_failure:
----

From here, you would typically add processors to the `processors` array to do
-the actual parsing. For details on how to use ingest node processors, see the
-{ref}/ingest-processors.html[ingest node documentation]. In
+the actual parsing. For information about available ingest processors, see the
+{ref}/processors.html[processor reference documentation]. In
particular, you will likely find the
-{ref}/grok-processor.html[Grok processor] to be useful for parsing.
+{ref}/grok-processor.html[grok processor] to be useful for parsing.
Here is an example for parsing the Nginx access logs.

[source,json]
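The module's Nginx example is collapsed in this diff; as a sketch of the shape such an `ingest/*.json` definition takes, a minimal grok-based pipeline might look like the following (the pattern and field names are assumptions for illustration, not the actual module pipeline):

```json
{
  "description": "Illustrative pipeline: grok-parse a simple access log line",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{IPORHOST:source.address} %{WORD:http.request.method} %{NUMBER:http.response.status_code:int}"
        ]
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}
```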
4 changes: 2 additions & 2 deletions filebeat/_meta/config/filebeat.inputs.reference.yml.tmpl
@@ -168,7 +168,7 @@ filebeat.inputs:
# this can mean that the first entries of a new file are skipped.
#tail_files: false

-# The Ingest Node pipeline ID associated with this input. If this is set, it
+# The ingest pipeline ID associated with this input. If this is set, it
# overwrites the pipeline option from the Elasticsearch output.
#pipeline:

@@ -351,7 +351,7 @@ filebeat.inputs:
# carriage_return, carriage_return_line_feed, next_line, line_separator, paragraph_separator.
#line_terminator: auto

-# The Ingest Node pipeline ID associated with this input. If this is set, it
+# The ingest pipeline ID associated with this input. If this is set, it
# overwrites the pipeline option from the Elasticsearch output.
#pipeline:

4 changes: 2 additions & 2 deletions filebeat/beater/filebeat.go
@@ -58,9 +58,9 @@ import (
_ "github.com/elastic/beats/v7/filebeat/autodiscover"
)

-const pipelinesWarning = "Filebeat is unable to load the Ingest Node pipelines for the configured" +
+const pipelinesWarning = "Filebeat is unable to load the ingest pipelines for the configured" +
" modules because the Elasticsearch output is not configured/enabled. If you have" +
-" already loaded the Ingest Node pipelines or are using Logstash pipelines, you" +
+" already loaded the ingest pipelines or are using Logstash pipelines, you" +
" can ignore this warning."

var (
2 changes: 1 addition & 1 deletion filebeat/docs/filebeat-modules-options.asciidoc
@@ -13,7 +13,7 @@ a log type that isn't supported, or you want to use a different setup.

{beatname_uc} <<{beatname_lc}-modules,modules>> provide a quick way to
get started processing common log formats. They contain default configurations,
-{es} ingest node pipeline definitions, and {kib} dashboards to help you
+{es} ingest pipeline definitions, and {kib} dashboards to help you
implement and deploy a log monitoring solution.

You can configure modules in the `modules.d` directory (recommended), or in the
4 changes: 2 additions & 2 deletions filebeat/docs/include/what-happens.asciidoc
@@ -5,8 +5,8 @@ defaults)
* Makes sure each multiline log event gets sent as a single event
-* Uses ingest node to parse and process the log lines, shaping the data into a structure suitable
-for visualizing in Kibana
+* Uses an {es} ingest pipeline to parse and process the log lines, shaping the
+data into a structure suitable for visualizing in Kibana
ifeval::["{has-dashboards}"=="true"]
* Deploys dashboards for visualizing the log data
2 changes: 1 addition & 1 deletion filebeat/docs/inputs/input-common-options.asciidoc
@@ -77,7 +77,7 @@ processors in your config.
[float]
===== `pipeline`

-The Ingest Node pipeline ID to set for the events generated by this input.
+The ingest pipeline ID to set for the events generated by this input.

NOTE: The pipeline ID can also be configured in the Elasticsearch output, but
this option usually results in simpler configuration files. If the pipeline is
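For reference, a sketch of the `pipeline` option set on an input (the paths and pipeline ID are hypothetical):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log
    # Ingest pipeline ID; overrides the pipeline set on the Elasticsearch output.
    pipeline: my-app-pipeline
```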
2 changes: 1 addition & 1 deletion filebeat/docs/modules-overview.asciidoc
@@ -13,7 +13,7 @@ the following:
The {beatname_uc} configuration is also responsible for stitching together
multiline events when needed.

-* {es} {ref}/ingest.html[Ingest Node] pipeline definition,
+* {es} {ref}/ingest.html[ingest pipeline] definition,
which is used to parse the log lines.

* Fields definitions, which are used to configure {es} with the
4 changes: 2 additions & 2 deletions filebeat/docs/modules/iptables.asciidoc
@@ -22,8 +22,8 @@ When you run the module, it performs a few tasks under the hood:
* Sets the default input to `syslog` and binds to `localhost` port `9001`
(but don’t worry, you can override the defaults).

-* Uses ingest node to parse and process the log lines, shaping the data into
-a structure suitable for visualizing in Kibana.
+* Uses an ingest pipeline to parse and process the log lines, shaping the data
+into a structure suitable for visualizing in Kibana.

* Deploys dashboards for visualizing the log data.

2 changes: 1 addition & 1 deletion filebeat/docs/modules/netflow.asciidoc
@@ -18,7 +18,7 @@ NetFlow versions older than 9, fields are mapped automatically to NetFlow v9.

This module wraps the <<filebeat-input-netflow,netflow input>> to enrich the
flow records with geolocation information about the IP endpoints by using
-Elasticsearch Ingest Node.
+an {es} ingest pipeline.

include::../include/gs-link.asciidoc[]

6 changes: 3 additions & 3 deletions filebeat/filebeat.reference.yml
@@ -575,7 +575,7 @@ filebeat.inputs:
# this can mean that the first entries of a new file are skipped.
#tail_files: false

-# The Ingest Node pipeline ID associated with this input. If this is set, it
+# The ingest pipeline ID associated with this input. If this is set, it
# overwrites the pipeline option from the Elasticsearch output.
#pipeline:

@@ -758,7 +758,7 @@ filebeat.inputs:
# carriage_return, carriage_return_line_feed, next_line, line_separator, paragraph_separator.
#line_terminator: auto

-# The Ingest Node pipeline ID associated with this input. If this is set, it
+# The ingest pipeline ID associated with this input. If this is set, it
# overwrites the pipeline option from the Elasticsearch output.
#pipeline:

@@ -1391,7 +1391,7 @@ output.elasticsearch:
# In case you modify this pattern you must update setup.template.name and setup.template.pattern accordingly.
#index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"

-# Optional ingest node pipeline. By default no pipeline will be used.
+# Optional ingest pipeline. By default no pipeline will be used.
#pipeline: ""

# Optional HTTP path