Rename 'Ingest Node Pipelines' to 'Ingest Pipelines'
While Elasticsearch ingest pipelines require a node with the `ingest`
role, we don't need to include `ingest node` in the feature name.

There are no official plans, but the Elasticsearch team has discussed removing
the `ingest` role in the future. This also better aligns the Kibana UI with the
Elasticsearch docs.

Relates to elastic/elasticsearch#70253.
jrodewig committed Oct 4, 2021
1 parent ba7bea4 commit 7197fef
Showing 16 changed files with 40 additions and 38 deletions.
25 changes: 13 additions & 12 deletions docs/dev-tools/grokdebugger/index.asciidoc
@@ -9,21 +9,22 @@ structure it. Grok is good for parsing syslog, apache, and other
webserver logs, mysql logs, and in general, any log format that is
written for human consumption.

-Grok patterns are supported in the ingest node
-{ref}/grok-processor.html[grok processor] and the Logstash
-{logstash-ref}/plugins-filters-grok.html[grok filter]. See
-{logstash-ref}/plugins-filters-grok.html#_grok_basics[grok basics]
-for more information on the syntax for a grok pattern.
-
-The Elastic Stack ships
-with more than 120 reusable grok patterns. See
-https://github.com/elastic/elasticsearch/tree/master/libs/grok/src/main/resources/patterns[Ingest node grok patterns] and https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[Logstash grok patterns]
-for the complete list of patterns.
+Grok patterns are supported in {ref}/runtime.html[{es} runtime fields], the {es}
+{ref}/grok-processor.html[grok ingest processor], and the {ls}
+{logstash-ref}/plugins-filters-grok.html[grok filter]. For syntax, see
+{ref}/grok.html[Grokking grok].
+
+The {stack} ships with more than 120 reusable grok patterns. For a complete
+list of patterns, see
+https://github.com/elastic/elasticsearch/tree/master/libs/grok/src/main/resources/patterns[{es}
+grok patterns] and
+https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[{ls}
+grok patterns].

Because
-ingest node and Logstash share the same grok implementation and pattern
+{es} and {ls} share the same grok implementation and pattern
libraries, any grok pattern that you create in the *Grok Debugger* will work
-in ingest node and Logstash.
+in both {es} and {ls}.
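
As an aside (not part of this commit), the grok ingest processor mentioned above can be exercised directly from Console with the simulate API. A minimal sketch, assuming a running {es} cluster; the pattern and sample message are illustrative:

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{IP:client} %{WORD:method} %{URIPATHPARAM:request}"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "55.3.244.1 GET /index.html" } }
  ]
}
```

The response echoes each document with the extracted `client`, `method`, and `request` fields added, which is the same behavior the *Grok Debugger* exposes interactively.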

[float]
[[grokdebugger-getting-started]]
2 changes: 1 addition & 1 deletion docs/developer/plugin-list.asciidoc
@@ -452,7 +452,7 @@ the infrastructure monitoring use-case within Kibana.
|{kib-repo}blob/{branch}/x-pack/plugins/ingest_pipelines/README.md[ingestPipelines]
-|The ingest_pipelines plugin provides Kibana support for Elasticsearch's ingest nodes. Please refer to the Elasticsearch documentation for more details.
+|The ingest_pipelines plugin provides Kibana support for Elasticsearch's ingest pipelines.
|{kib-repo}blob/{branch}/x-pack/plugins/lens/readme.md[lens]
2 changes: 1 addition & 1 deletion docs/redirects.asciidoc
@@ -293,7 +293,7 @@
This content has moved. Refer to <<dashboard, **Dashboard**>>.

[role="exclude",id="ingest-node-pipelines"]
-== Ingest Node Pipelines
+== Ingest Pipelines

This content has moved. Refer to {ref}/ingest.html[Ingest pipelines].

5 changes: 3 additions & 2 deletions docs/user/monitoring/monitoring-metricbeat.asciidoc
@@ -189,8 +189,9 @@ If you configured the monitoring cluster to use encrypted communications, you
must access it via HTTPS. For example, use a `hosts` setting like
`https://es-mon-1:9200`.

-IMPORTANT: The {es} {monitor-features} use ingest pipelines, therefore the
-cluster that stores the monitoring data must have at least one ingest node.
+IMPORTANT: The {es} {monitor-features} use ingest pipelines. The
+cluster that stores the monitoring data must have at least one node with the
+`ingest` role.
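
For reference (an illustration, not part of this diff): a node's roles are controlled by the `node.roles` setting in `elasticsearch.yml`, and a node with no `node.roles` entry has every role, including `ingest`, by default. A minimal sketch for one node of the monitoring cluster:

```yaml
# elasticsearch.yml — example only.
# Keep the ingest role on at least one node so monitoring pipelines can run;
# omitting node.roles entirely also works, since all roles are then assigned.
node.roles: [ data, ingest ]
```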

If the {es} {security-features} are enabled on the monitoring cluster, you
must provide a valid user ID and password so that {metricbeat} can send metrics
@@ -86,7 +86,7 @@ function TutorialFleetInstructions({ http, basePath, isDarkTheme }: Props) {
'xpack.apm.tutorial.apmServer.fleet.message',
{
defaultMessage:
-'The APM integration installs Elasticsearch templates and Ingest Node pipelines for APM data.',
+'The APM integration installs Elasticsearch templates and ingest pipelines for APM data.',
}
)}
footer={
12 changes: 6 additions & 6 deletions x-pack/plugins/ingest_pipelines/README.md
@@ -1,17 +1,17 @@
-# Ingest Node Pipelines UI
+# Ingest Pipelines UI

## Summary
-The `ingest_pipelines` plugin provides Kibana support for [Elasticsearch's ingest nodes](https://www.elastic.co/guide/en/elasticsearch/reference/master/ingest.html). Please refer to the Elasticsearch documentation for more details.
+The `ingest_pipelines` plugin provides Kibana support for [Elasticsearch's ingest pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/master/ingest.html).

-This plugin allows Kibana to create, edit, clone and delete ingest node pipelines. It also provides support to simulate a pipeline.
+This plugin allows Kibana to create, edit, clone and delete ingest pipelines. It also provides support to simulate a pipeline.

It requires a Basic license and the following cluster privileges: `manage_pipeline` and `cluster:monitor/nodes/info`.

---

## Development

-A new app called Ingest Node Pipelines is registered in the Management section and follows a typical CRUD UI pattern. The client-side portion of this app lives in [public/application](public/application) and uses endpoints registered in [server/routes/api](server/routes/api). For more information on the pipeline processors editor component, check out the [component readme](public/application/components/pipeline_processors_editor/README.md).
+A new app called Ingest Pipelines is registered in the Management section and follows a typical CRUD UI pattern. The client-side portion of this app lives in [public/application](public/application) and uses endpoints registered in [server/routes/api](server/routes/api). For more information on the pipeline processors editor component, check out the [component readme](public/application/components/pipeline_processors_editor/README.md).

See the [kibana contributing guide](https://github.com/elastic/kibana/blob/master/CONTRIBUTING.md) for instructions on setting up your development environment.

@@ -25,7 +25,7 @@ The app has the following test coverage:

### Quick steps for manual testing

-You can run the following request in Console to create an ingest node pipeline:
+You can run the following request in Console to create an ingest pipeline:

```
PUT _ingest/pipeline/test_pipeline
@@ -73,7 +73,7 @@ PUT _ingest/pipeline/test_pipeline
}
```

-Then, go to the Ingest Node Pipelines UI to edit, delete, clone, or view details of the pipeline.
+Then, go to the Ingest Pipelines UI to edit, delete, clone, or view details of the pipeline.
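
Pipelines created this way can also be exercised without the UI through the simulate API. A hedged sketch against the `test_pipeline` created above — the sample document body is illustrative and should match whatever fields the pipeline's processors expect:

```
POST _ingest/pipeline/test_pipeline/_simulate
{
  "docs": [
    { "_source": { "message": "sample value" } }
  ]
}
```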

To simulate a pipeline, go to the "Edit" page of your pipeline. Click the "Add documents" link under the "Processors" section. You may add the following sample documents to test the pipeline:

@@ -52,11 +52,11 @@ describe('<PipelinesList />', () => {

// Verify app title
expect(exists('appTitle')).toBe(true);
-expect(find('appTitle').text()).toEqual('Ingest Node Pipelines');
+expect(find('appTitle').text()).toEqual('Ingest Pipelines');

// Verify documentation link
expect(exists('documentationLink')).toBe(true);
-expect(find('documentationLink').text()).toBe('Ingest Node Pipelines docs');
+expect(find('documentationLink').text()).toBe('Ingest Pipelines docs');

// Verify create button exists
expect(exists('createPipelineButton')).toBe(true);
@@ -557,7 +557,7 @@ export const mapProcessorTypeToDescriptor: MapProcessorTypeToDescriptor = {
defaultMessage: 'Pipeline',
}),
typeDescription: i18n.translate('xpack.ingestPipelines.processors.description.pipeline', {
-defaultMessage: 'Runs another ingest node pipeline.',
+defaultMessage: 'Runs another ingest pipeline.',
}),
getDefaultDescription: ({ name }) =>
i18n.translate('xpack.ingestPipelines.processors.defaultDescription.pipeline', {
@@ -153,7 +153,7 @@ export const PipelinesList: React.FunctionComponent<RouteComponentProps> = ({
<span data-test-subj="appTitle">
<FormattedMessage
id="xpack.ingestPipelines.list.listTitle"
-defaultMessage="Ingest Node Pipelines"
+defaultMessage="Ingest Pipelines"
/>
</span>
}
@@ -172,7 +172,7 @@ export const PipelinesList: React.FunctionComponent<RouteComponentProps> = ({
>
<FormattedMessage
id="xpack.ingestPipelines.list.pipelinesDocsLinkText"
-defaultMessage="Ingest Node Pipelines docs"
+defaultMessage="Ingest Pipelines docs"
/>
</EuiButtonEmpty>,
]}
@@ -11,7 +11,7 @@ import { ManagementAppMountParams } from '../../../../../../src/plugins/manageme
type SetBreadcrumbs = ManagementAppMountParams['setBreadcrumbs'];

const homeBreadcrumbText = i18n.translate('xpack.ingestPipelines.breadcrumb.pipelinesLabel', {
-defaultMessage: 'Ingest Node Pipelines',
+defaultMessage: 'Ingest Pipelines',
});

export class BreadcrumbService {
2 changes: 1 addition & 1 deletion x-pack/plugins/ingest_pipelines/public/plugin.ts
@@ -25,7 +25,7 @@ export class IngestPipelinesPlugin
apiService.setup(http, uiMetricService);

const pluginName = i18n.translate('xpack.ingestPipelines.appTitle', {
-defaultMessage: 'Ingest Node Pipelines',
+defaultMessage: 'Ingest Pipelines',
});

management.sections.section.ingest.registerApp({
4 changes: 2 additions & 2 deletions x-pack/test/accessibility/apps/ingest_node_pipelines.ts
@@ -14,14 +14,14 @@ export default function ({ getService, getPageObjects }: any) {
const log = getService('log');
const a11y = getService('a11y'); /* this is the wrapping service around axe */

-describe('Ingest Node Pipelines', async () => {
+describe('Ingest Pipelines', async () => {
before(async () => {
await putSamplePipeline(esClient);
await common.navigateToApp('ingestPipelines');
});

it('List View', async () => {
-await retry.waitFor('Ingest Node Pipelines page to be visible', async () => {
+await retry.waitFor('Ingest Pipelines page to be visible', async () => {
await common.navigateToApp('ingestPipelines');
return testSubjects.exists('pipelineDetailsLink') ? true : false;
});
@@ -8,7 +8,7 @@
import { FtrProviderContext } from '../../../ftr_provider_context';

export default function ({ loadTestFile }: FtrProviderContext) {
-describe('Ingest Node Pipelines', () => {
+describe('Ingest Pipelines', () => {
loadTestFile(require.resolve('./ingest_pipelines'));
});
}
@@ -145,7 +145,7 @@ export default function ({ getService }: FtrProviderContext) {
await createPipeline({ body: PIPELINE, id: PIPELINE_ID }, true);
} catch (err) {
// eslint-disable-next-line no-console
-console.log('[Setup error] Error creating ingest node pipeline');
+console.log('[Setup error] Error creating ingest pipeline');
throw err;
}
});
@@ -225,7 +225,7 @@ export default function ({ getService }: FtrProviderContext) {
await createPipeline({ body: PIPELINE, id: PIPELINE_ID }, true);
} catch (err) {
// eslint-disable-next-line no-console
-console.log('[Setup error] Error creating ingest node pipeline');
+console.log('[Setup error] Error creating ingest pipeline');
throw err;
}
});
@@ -60,7 +60,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
it('should render the "Ingest" section with ingest pipelines', async () => {
await PageObjects.common.navigateToApp('management');
const sections = await managementMenu.getSections();
-// We gave the ingest node pipelines user access to advanced settings to allow them to use ingest node pipelines.
+// We gave the ingest pipelines user access to advanced settings to allow them to use ingest pipelines.
// See https://github.com/elastic/kibana/pull/102409/
expect(sections).to.have.length(2);
expect(sections[0]).to.eql({
@@ -29,10 +29,10 @@ export default ({ getPageObjects, getService }: FtrProviderContext) => {
});

it('Loads the app', async () => {
-log.debug('Checking for section heading to say Ingest Node Pipelines.');
+log.debug('Checking for section heading to say Ingest Pipelines.');

const headingText = await pageObjects.ingestPipelines.sectionHeadingText();
-expect(headingText).to.be('Ingest Node Pipelines');
+expect(headingText).to.be('Ingest Pipelines');
});

it('Creates a pipeline', async () => {
